How to configure P6Spy on Quarkus to log SQL results - java

I have configured P6Spy in my Quarkus project successfully. But now I want
it to log SQL results. How can I make P6Spy log the result of a query? This is my code:
<dependency>
    <groupId>p6spy</groupId>
    <artifactId>p6spy</artifactId>
    <version>1.3</version>
</dependency>
application.properties
quarkus.datasource.jdbc.driver=com.p6spy.engine.spy.P6SpyDriver
spy.properties
module.log=com.p6spy.engine.logging.P6LogFactory
realdriver=org.postgresql.Driver
deregisterdrivers=false
outagedetection=false
filter=false
autoflush=true
#excludecategories=info,debug,batch,statement,commit,rollback,outage
logfile=quarkus.log
reloadproperties=false
reloadpropertiesinterval=60
useprefix=false
appender=com.p6spy.engine.logging.appender.FileLogger
append=true
log4j.appender.STDOUT=org.apache.log4j.ConsoleAppender
log4j.appender.STDOUT.layout=org.apache.log4j.PatternLayout
log4j.appender.STDOUT.layout.ConversionPattern=p6spy - %m%n
log4j.logger.p6spy=INFO,STDOUT
dateformat=yyyy-MM-dd hh:mm:ss a
This is quarkus.log:
2022-12-28 03:37:06 PM|45|0|statement|select customer0_.id as id1_5_, customer0_.CREATE_TIME as create_t2_5_, customer0_.CREATE_TIME_UNIX as create_t3_5_, customer0_.UPDATE_TIME as update_t4_5_, customer0_.UPDATE_TIME_UNIX as update_t5_5_, customer0_.business_registration_number as business6_5_, customer0_.company as company7_5_, customer0_.country as country8_5_, customer0_.currency as currency9_5_, customer0_.description as descrip10_5_, customer0_.email as email11_5_, customer0_.mobile as mobile12_5_, customer0_.name as name13_5_, customer0_.status as status14_5_, customer0_.telephone as telepho15_5_ from customer customer0_ where customer0_.email=? limit ?|select customer0_.id as id1_5_, customer0_.CREATE_TIME as create_t2_5_, customer0_.CREATE_TIME_UNIX as create_t3_5_, customer0_.UPDATE_TIME as update_t4_5_, customer0_.UPDATE_TIME_UNIX as update_t5_5_, customer0_.business_registration_number as business6_5_, customer0_.company as company7_5_, customer0_.country as country8_5_, customer0_.currency as currency9_5_, customer0_.description as descrip10_5_, customer0_.email as email11_5_, customer0_.mobile as mobile12_5_, customer0_.name as name13_5_, customer0_.status as status14_5_, customer0_.telephone as telepho15_5_ from customer customer0_ where customer0_.email='hoatest14#gmail.com' limit 1
2022-12-28 03:37:06 PM|0|0|result|select customer0_.id as id1_5_, customer0_.CREATE_TIME as create_t2_5_, customer0_.CREATE_TIME_UNIX as create_t3_5_, customer0_.UPDATE_TIME as update_t4_5_, customer0_.UPDATE_TIME_UNIX as update_t5_5_, customer0_.business_registration_number as business6_5_, customer0_.company as company7_5_, customer0_.country as country8_5_, customer0_.currency as currency9_5_, customer0_.description as descrip10_5_, customer0_.email as email11_5_, customer0_.mobile as mobile12_5_, customer0_.name as name13_5_, customer0_.status as status14_5_, customer0_.telephone as telepho15_5_ from customer customer0_ where customer0_.email=? limit ?|select customer0_.id as id1_5_, customer0_.CREATE_TIME as create_t2_5_, customer0_.CREATE_TIME_UNIX as create_t3_5_, customer0_.UPDATE_TIME as update_t4_5_, customer0_.UPDATE_TIME_UNIX as update_t5_5_, customer0_.business_registration_number as business6_5_, customer0_.company as company7_5_, customer0_.country as country8_5_, customer0_.currency as currency9_5_, customer0_.description as descrip10_5_, customer0_.email as email11_5_, customer0_.mobile as mobile12_5_, customer0_.name as name13_5_, customer0_.status as status14_5_, customer0_.telephone as telepho15_5_ from customer customer0_ where customer0_.email='hoatest14#gmail.com' limit 1
2022-12-28 03:37:06 PM|52|0|commit||
Expected: the query result should show up in the log, e.g.:
{
    "createTime": "2022-09-27 08:27:42:705",
    "createTimeUnix": 1664292462705,
    "id": 3787,
    "updateTime": "2022-09-27 08:27:42:705",
    "updateTimeUnix": 1664292462705,
    "company": "IMIP",
    "country": "VietNam",
    "currency": "KRW",
    "description": "",
    "email": "hoatest14#gmail.com",
    "name": "Tong Thi Hoa 14",
    "status": "Active"
}

By default p6spy does not log results - please read about the excludecategories parameter and its default values:
#list of categories to exclude: error, info, batch, debug, statement,
#commit, rollback, result and resultset are valid values
# (default is info,debug,result,resultset,batch)
#excludecategories=info,debug,result,resultset,batch
In your case I believe it should be something like:
excludecategories=info,debug,result,batch
FYI, the sole author of p6spy has abandoned the project - I do not see any reason to use p6spy in new projects; moreover, Quarkus recommends using OpenTracing.
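For reference, a minimal sketch of how the relevant lines of spy.properties could look with that change applied (category names per the p6spy defaults quoted above; keep the rest of your file as-is):

```properties
module.log=com.p6spy.engine.logging.P6LogFactory
realdriver=org.postgresql.Driver
# default excludes are info,debug,result,resultset,batch;
# leaving resultset off this list lets p6spy log the fetched result rows
excludecategories=info,debug,result,batch
appender=com.p6spy.engine.logging.appender.FileLogger
logfile=quarkus.log
```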

Related

How to add more data to an SQL column of type JSON

In my table I have one column of JSON type that is currently null,
so I used this command to set its value:
update myTable
set columnJson = '[{"id" : "someId1" , "name": "someNamme2"}
,{"id" : "someId2", "name": "someNamme2"}]'
where id = "rowID1";
This works fine, and I now have two elements.
Now I want to add one more element to that array,
using the same command:
update myTable
set columnJson = '[{"id" : "someId3", "name": "someNamme3"}]'
where id = "rowID1";
But the previous value gets overwritten. Is there any way I can append any number of values? I am doing this in Java.
You need JSON functions like JSON_ARRAY_APPEND; see the MySQL documentation for more functions to manipulate JSON.
JSON needs special functions that have to be learned. We usually recommend not using JSON columns, because in a normalized table you can use all the SQL functionality that exists;
JSON always takes a moderate learning effort.
update myTable
set columnJson = '[{"id" : "someId1" , "name": "someNamme2"}
,{"id" : "someId2", "name": "someNamme2"}]'
where id = "rowID1";
Rows matched: 1 Changed: 1 Warnings: 0
update myTable
set columnJson = JSON_ARRAY_APPEND(columnJson, '$[0]', '{"id" : "someId3", "name": "someNamme3"}')
Rows matched: 1 Changed: 1 Warnings: 0
SELECT * FROM myTable

id     | columnJson
rowID1 | [[{"id": "someId1", "name": "someNamme2"}, "{"id" : "someId3", "name": "someNamme3"}"], {"id": "someId2", "name": "someNamme2"}]
fiddle
And if you want another position, you change the path that says where it should be inserted:
update myTable
set columnJson = '[{"id" : "someId1" , "name": "someNamme2"}
,{"id" : "someId2", "name": "someNamme2"}]'
where id = "rowID1";
Rows matched: 1 Changed: 1 Warnings: 0
update myTable
set columnJson = JSON_ARRAY_APPEND(columnJson, '$[1]', '{"id" : "someId3", "name": "someNamme3"}')
Rows matched: 1 Changed: 1 Warnings: 0
SELECT * FROM myTable

id     | columnJson
rowID1 | [{"id": "someId1", "name": "someNamme2"}, [{"id": "someId2", "name": "someNamme2"}, "{"id" : "someId3", "name": "someNamme3"}"]]
fiddle
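One detail worth noting in the fiddle output above: the appended element shows up wrapped in quotes, because a plain string argument is appended as a JSON string, not an object. A sketch of appending an object at the end of the top-level array instead (path '$'), assuming MySQL 5.7+:

```sql
UPDATE myTable
SET columnJson = JSON_ARRAY_APPEND(
    columnJson,
    '$',  -- '$' appends at the end of the top-level array
    CAST('{"id": "someId3", "name": "someNamme3"}' AS JSON)  -- CAST stores it as an object, not a quoted string
)
WHERE id = "rowID1";
```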

Creating Query like "A is "b" and C is '' " (Spring Boot/Spring Data w/Mongo)

I'm making a simple application with Spring Boot (2.3.4) using MongoDB with Spring Data for MongoDB. I usually create queries for the app using the @Query annotation, and it works very well. But for an aggregation I want to use, I built a query with the Criteria class. The criteria I need is like
where("primary").is(value).and("secondary").is("").
I need all entries where primary is equal to 'value' and secondary is empty. The query entered in MongoDB Compass
{ $and: [ { primary: 'value' }, { secondary: ''} ] }
works as expected, but when I try to use the Criteria with Spring, it looks like the and part with the secondary is completely dropped: I get all results with 'value' in primary and anything at all in secondary - an empty field or anything else. Replacing the .is("") part with .regex("^$") didn't help.
This looks pretty basic to me, so what am I missing here? I don't want to replace the empty secondary with an "empty flag", because that feels wrong.
Update:
This is the code in question
Criteria crit;
if (!primary.equals(secondary)) {
    crit = where("primary").is(primary.name()).and("secondary").is(secondary.name());
} else {
    crit = where("primary").is(primary.name()).and("secondary").is("");
}
MatchOperation matchStage = Aggregation.match(crit);
GroupOperation groupStage = Aggregation.group("grouping").count().as("sum");
SortOperation sortStage = new SortOperation(Sort.by("_id"));
Aggregation aggregation = Aggregation.newAggregation(matchStage, groupStage, sortStage);
AggregationResults<TypePerGroup> results = mongoTemplate.aggregate(aggregation, "dataCollection", TypePerGroup.class);
This works with MongoDB - I'm not sure what abstraction Compass adds. The two queries don't generate the same JSON, but they are equivalent.
Generated query
where("primary").is(value).and("secondary").is("")
is
{"primary":value, "secondary": ""}
Perhaps Compass doesn't like this variant?
Anyway, to generate a query similar to what you entered in Compass, you can use the code below:
Criteria criteria = new Criteria();
criteria.andOperator(Criteria.where("primary").is("hello"), Criteria.where("secondary").is(""));
Query query = Query.query(criteria);
You are not missing anything. where("primary").is(value).and("secondary").is("") is correct and is functionally equivalent to { $and: [ { primary: 'value' }, { secondary: ''} ] }. You should turn on debug level logging for MongoTemplate to see the generated query.
I have connected to Atlas using MongoDB Compass and added 4 documents to the collection:
[{
"primary": "A",
"secondary": "A"
},{
"primary": "A",
"secondary": ""
},{
"primary": "B",
"secondary": "B"
},{
"primary": "B",
"secondary": ""
}]
Both queries:
List<Data> firstResults = mongoTemplate.query(Data.class)
.matching(Query.query(Criteria.where("primary").is("B").and("secondary").is("")))
.all();
System.out.println(firstResults);
Criteria criteria = new Criteria();
criteria.andOperator(Criteria.where("primary").is("B"), Criteria.where("secondary").is(""));
List<Data> secondResults = mongoTemplate.query(Data.class)
.matching(Query.query(criteria))
.all();
System.out.println(secondResults);
gave the same result:
[Data{primary='B', secondary=''}]
Campfire, can you please provide an example of your code to analyze?

Java: How to assign or read JSON data returned from a PostgreSQL query into a JPA repository method [query: SELECT row_to_json(...)]

I have the code below in the JpaRepository of a Spring Boot project; the query returns JSON structured as GeoJSON (see below).
I am getting the error: Resolved [org.springframework.orm.jpa.JpaSystemException: No Dialect mapping for JDBC type: 1111; nested exception is org.hibernate.MappingException: No Dialect mapping for JDBC type: 1111]
String queryString = "SELECT row_to_json(fc)\n" +
" FROM ( SELECT 'FeatureCollection' As type, array_to_json(array_agg(f)) As features\n" +
" FROM (SELECT 'Feature' As type\n" +
" , ST_AsGeoJSON(lg.column1) As geometry\n" +
" , row_to_json((column2, column3)) As properties\n" +
" FROM public.table As lg ) As f ) As fc;";
@Query(value=queryString, nativeQuery=true)
Json getData(@Param("id") Long id);
In the application.properties file I set my dialect to:
spring.jpa.database-platform=org.hibernate.spatial.dialect.postgis.PostgisDialect
The query returns a GeoJSON FeatureCollection object, and I want to be able to read it into Java.
Please bear with the way I asked the question - I am new here.
The returned GeoJSON would look something like this:
{"type" : "FeatureCollection", "features" : [{"type": "Feature", "geometry": {"type":"Point","coordinates":[1,1]}, "properties": {"id": 1, "name": "one"}}, {"type": "Feature", "geometry": {"type":"Point","coordinates":[2,2]}, "properties": {"id": 2, "name": "two"}}, {"type": "Feature", "geometry": {"type":"Point","coordinates":[3,3]}, "properties": {"id": 3, "name": "three"}}]}
I don't know exactly what you are trying to achieve here, but I guess you need to use String as the return type and parse that JSON string in another method.
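One common way to do that (a sketch under assumptions, not taken from the question - the repository and entity names here are made up) is to cast the JSON to text on the PostgreSQL side, so the column comes back as VARCHAR instead of the unmapped JDBC type 1111, and return it as a String:

```java
public interface GeoRepository extends JpaRepository<GeoEntity, Long> {

    // row_to_json(fc)::text makes the driver report VARCHAR instead of the
    // unknown JDBC type 1111, so no dialect mapping is needed; parse the
    // returned String with any JSON library afterwards.
    @Query(value = "SELECT row_to_json(fc)::text FROM ( /* same subquery as above */ ) AS fc",
           nativeQuery = true)
    String getDataAsJson();
}
```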

Spring - mongodb - aggregation - The 'cursor' option is required

Executing the following aggregation pipeline:
public void getMostLikedItems() {
    UnwindOperation unwind = Aggregation.unwind("favoriteItems");
    GroupOperation group = Aggregation.group("favoriteItems").count().as("likes");
    SortOperation sort = Aggregation.sort(Sort.Direction.DESC, "likes");
    Aggregation aggregation = newAggregation(unwind, group, sort);
    DBObject result = mongoTemplate.aggregate(aggregation, "users", LikedItem.class).getRawResults();
}
throws the following exception:
com.mongodb.MongoCommandException: Command failed with error 9: 'The 'cursor' option is required, except for aggregate with the explain argument' on server localhost:27017. The full response is { "ok" : 0.0, "errmsg" : "The 'cursor' option is required, except for aggregate with the explain argument", "code" : 9, "codeName" : "FailedToParse" }
I don't understand what is meant by cursor option here. Where should this option be configured?
EDIT: Here is a sample user document:
{
    "_id": "5a6df13552f42a34dcca9aa6",
    "username": "user1",
    "password": "$2a$10$p0OXq5PPa41j1e4iPcGZHuWjoKJ983sieS/ovFI.cVX5Whwj21WYi",
    "favoriteItems": [
        {
            "_id": "5a0c6b2dfd3eb67969316d6d",
            "name": "item1",
            "city": "Rabat"
        },
        {
            "_id": "5a0c680afd3eb67969316d0b",
            "name": "item2",
            "city": "Rabat"
        }
    ]
}
From the docs:
MongoDB 3.4 deprecates the use of the aggregate command without the cursor
option, unless the pipeline includes the explain option. When returning
aggregation results inline using the aggregate command, specify the cursor
option using the default batch size cursor: {} or specify the batch size in
the cursor option cursor: { batchSize: <num> }.
You can pass batchSize with AggregationOptions in the Spring Mongo 2.x versions:
Aggregation aggregation = newAggregation(unwind, group).withOptions(newAggregationOptions().cursorBatchSize(100).build());
With default batch size
Aggregation aggregation = newAggregation(unwind, group).withOptions(newAggregationOptions().cursor(new Document()).build());
'The 'cursor' option is required, except for aggregate with the explain argument'
This type of error is raised in Spring Data when you are using incompatible versions of MongoDB and Spring Data MongoDB.
You can still get raw results by passing the explain and cursor arguments:
Aggregation aggregation = Aggregation.newAggregation(group).withOptions( new AggregationOptions(allowDiskUse, explain, cursor));
//try with .withOptions( new AggregationOptions(true,false,new Document()));
Passing the commented arguments, you will get a result in rawResults, but it will not be mapped to the given outType class.
To get a mapped result you have to use the spring-data version that matches your MongoDB version.
EDIT: I have used Spring version 5.0.3 and Spring Data MongoDB version 2.0.3, and it is working fine.
You can provide the output mode as cursor, since providing a cursor is mandatory:
List<DBObject> list = new ArrayList<DBObject>();
list.add(unwind.toDBObject(Aggregation.DEFAULT_CONTEXT));
list.add(group.toDBObject(Aggregation.DEFAULT_CONTEXT));
list.add(sort.toDBObject(Aggregation.DEFAULT_CONTEXT));

DBCollection col = mongoTemplate.getCollection("users");
Cursor cursor = col.aggregate(list, AggregationOptions.builder().allowDiskUse(true).outputMode(OutputMode.CURSOR).build());

List<AggregationResultVO> result = new ArrayList<AggregationResultVO>();
while (cursor.hasNext()) {
    DBObject object = cursor.next();
    result.add(new AggregationResultVO(object.get("aggregationResultId").toString()));
}

Solr DataImport Error reading data from database (MySQL)

I've installed Solr, as well as the DataImport handler/plugin, and after some trial and error I was able to get Solr to connect to a DB.
Unfortunately, I don't know Java, and the error response below is stumping me...
{
"responseHeader": {
"status": 0,
"QTime": 2682
},
"initArgs": [
"defaults",
[
"config",
"data-config.xml"
]
],
"command": "full-import",
"mode": "debug",
"documents": [],
"verbose-output": [
"entity:product",
[
"document#1",
[
"query",
"SELECT A.product_attribute_id, A.product_id, A.model, A.sku_code, A.status, A.image_filename, A.local_filename, A.width, A.width_final, A.length, A.length_final, A.msrp, A.map_price, A.sku_upc_code, A.url, A.discontinued, A.sq_foot, A.weight, A.rolled_length, A.rolled_width, A.rolled_height, A.lifestyle, A.detailed_color, A.bullet_point1, A.bullet_point2, A.bullet_point3, A.on_hand, A.eta_date, A.pile_height, A.eta_quantity, DATE_FORMAT(FROM_UNIXTIME(A.last_updated), '%e %b %Y') AS last_updated, P.sku_description, P.url as product_url, P.design, C.collection_id, C.collection_name, C.url as collection_url, M.manufacturer_name, M.url as manufacturer_url, O.origin_name, T.type_name, CO.content_name, S.style_name, COL.color_name, COL.color_id, PS.parentstyle_name, PC.parentcolor_name, SHA.shape_name, SA.sale_amount, DATE_FORMAT(FROM_UNIXTIME(SA.sale_start), '%e %b %Y') AS sale_start, DATE_FORMAT(FROM_UNIXTIME(SA.sale_end), '%e %b %Y') AS sale_end FROM tr_product_attributes as A INNER JOIN tr_products as P ON A.product_id = P.product_id INNER JOIN tr_collections as C ON P.collection = C.collection_id INNER JOIN tr_manufacturers as M ON P.manufacturer = M.manufacturer_id INNER JOIN tr_origins as O ON P.origin = O.origin_id INNER JOIN tr_types as T ON P.type = T.type_id INNER JOIN tr_contents as CO ON P.content = CO.content_id INNER JOIN tr_styles as S ON A.style = S.style_id INNER JOIN tr_parentstyles as PS ON A.parentstyle = PS.parentstyle_id INNER JOIN tr_colors as COL ON A.color = COL.color_id INNER JOIN tr_parentcolors as PC ON A.parentcolor = PC.parentcolor_id INNER JOIN tr_shapes as SHA ON A.shape = SHA.shape_id LEFT JOIN tr_sales as SA ON P.collection = SA.sale_collection_id WHERE A.status=1 AND A.url NOT LIKE '%http://www.XXXXXXX.com/rugs/rug/-///%' AND PC.parentcolor_name!='NA' AND A.active='Y' AND A.discontinued=0 AND A.local_filename LIKE '%jpg%' AND P.status=1 AND P.discontinued=0 AND C.status=1 AND C.discontinued=0 AND M.status=1",
"time-taken",
"0:0:2.652",
"EXCEPTION",
"org.apache.solr.handler.dataimport.DataImportHandlerException: Error reading data from database Processing Document # 1\n\tat org.apache.solr.handler.dataimport.DataImportHandlerException.wrapAndThrow(DataImportHandlerException.java:70)\n\tat org.apache.solr.handler.dataimport.JdbcDataSource$ResultSetIterator.getARow(JdbcDataSource.java:398)\n\tat org.apache.solr.handler.dataimport.JdbcDataSource$ResultSetIterator.access$600(JdbcDataSource.java:296)\n\tat org.apache.solr.handler.dataimport.JdbcDataSource$ResultSetIterator$1.next(JdbcDataSource.java:336)\n\tat org.apache.solr.handler.dataimport.JdbcDataSource$ResultSetIterator$1.next(JdbcDataSource.java:328)\n\tat org.apache.solr.handler.dataimport.EntityProcessorBase.getNext(EntityProcessorBase.java:133)\n\tat org.apache.solr.handler.dataimport.SqlEntityProcessor.nextRow(SqlEntityProcessor.java:74)\n\tat org.apache.solr.handler.dataimport.EntityProcessorWrapper.nextRow(EntityProcessorWrapper.java:243)\n\tat org.apache.solr.handler.dataimport.DocBuilder.buildDocument(DocBuilder.java:475)\n\tat org.apache.solr.handler.dataimport.DocBuilder.buildDocument(DocBuilder.java:414)\n\tat org.apache.solr.handler.dataimport.DocBuilder.doFullDump(DocBuilder.java:329)\n\tat org.apache.solr.handler.dataimport.DocBuilder.execute(DocBuilder.java:232)\n\tat org.apache.solr.handler.dataimport.DataImporter.doFullImport(DataImporter.java:416)\n\tat org.apache.solr.handler.dataimport.DataImporter.runCmd(DataImporter.java:480)\n\tat org.apache.solr.handler.dataimport.DataImportHandler.handleRequestBody(DataImportHandler.java:185)\n\tat org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:143)\n\tat org.apache.solr.core.SolrCore.execute(SolrCore.java:2064)\n\tat org.apache.solr.servlet.HttpSolrCall.execute(HttpSolrCall.java:654)\n\tat org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:450)\n\tat org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:227)\n\tat 
org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:196)\n\tat org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1652)\n\tat org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:585)\n\tat org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)\n\tat org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:577)\n\tat org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:223)\n\tat org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1127)\n\tat org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:515)\n\tat org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:185)\n\tat org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1061)\n\tat org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)\n\tat org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:215)\n\tat org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:110)\n\tat org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:97)\n\tat org.eclipse.jetty.server.Server.handle(Server.java:497)\n\tat org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:310)\n\tat org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:257)\n\tat org.eclipse.jetty.io.AbstractConnection$2.run(AbstractConnection.java:540)\n\tat org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:635)\n\tat org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:555)\n\tat java.lang.Thread.run(Thread.java:745)\nCaused by: java.sql.SQLException: Value '\u000535298\u000541486\u0010SE30 Gingerlilly\rSE30R024X036S\u00011E/img/RugImages/Vendors/Colonial Mills/Generation/SE30-gingerlilyl.jpg\u0014SE30-gingerlilyl.jpg\u00052.000\u00052' 0\"\u00053.000\u00053' 
0\"\b119.0000\u000789.0000\f082262203658Zhttp://www.XXXXXXX.com/rugs/rug/colonial-mills-generation-seascape/gingerlilly/41486/35298\u00010\u00010\u00013\u00010\u00014\u00014\u0000\u0000\u0000\u0000\u0000\u00010\n0000-00-00\u00010\u00010\n1 Jan 1970\u0004NULLMhttp://www.XXXXXXX.com/rugs/rug/colonial-mills-generation-seascape-se30/41486\bSeascape\u0003347\u0013Generation-SeascapeOhttp://www.XXXXXXX.com/rugs/brands/colonial-mills/collection/generationseascape\u000eColonial Mills1http://www.XXXXXXX.com/rugs/brands/colonial-mills\u0003USA\u0007Braided\rPolypropylene\u0007Braided\u000bGingerlilly\u0003626\fTransitional\fGold, Yellow\tRectangle���\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u
0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000' can not be represented as java.sql.Date\n\tat com.mysql.jdbc.SQLError.createSQLException(SQLError.java:998)\n\tat com.mysql.jdbc.SQLError.createSQLException(SQLError.java:937)\n\tat com.mysql.jdbc.SQLError.createSQLException(SQLError.java:926)\n\tat com.mysql.jdbc.SQLError.createSQLException(SQLError.java:872)\n\tat com.mysql.jdbc.ResultSetRow.getDateFast(ResultSetRow.java:145)\n\tat com.mysql.jdbc.BufferRow.getDateFast(BufferRow.java:689)\n\tat com.mysql.jdbc.ResultSetImpl.getDate(ResultSetImpl.java:2012)\n\tat com.mysql.jdbc.ResultSetImpl.getDate(ResultSetImpl.java:1975)\n\tat com.mysql.jdbc.ResultSetImpl.getObject(ResultSetImpl.java:4587)\n\tat com.mysql.jdbc.ResultSetImpl.getObject(ResultSetImpl.java:4710)\n\tat org.apache.solr.handler.dataimport.JdbcDataSource$ResultSetIterator.getARow(JdbcDataSource.java:358)\n\t... 39 more\n"
]
]
],
"status": "idle",
"importResponse": "",
"statusMessages": {
"Time Elapsed": "0:0:2.667",
"Total Requests made to DataSource": "1",
"Total Rows Fetched": "0",
"Total Documents Processed": "0",
"Total Documents Skipped": "0",
"Full Dump Started": "2015-10-21 14:05:52",
"Full Import failed": "2015-10-21 14:05:55"
}
}
I think I've narrowed it down to this particular part of the error message...
Caused by: java.sql.SQLException: Value '\u000535298\u000541486\u0010SE30 Gingerlilly\rSE30R024X036S\u00011E/img/RugImages/Vendors/Colonial Mills/Generation/SE30-gingerlilyl.jpg\u0014SE30-gingerlilyl.jpg\u00052.000\u00052' 0\"\u00053.000\u00053' 0\"\b119.0000\u000789.0000\f082262203658Zhttp://www.XXXXXXX.com/rugs/rug/colonial-mills-generation-seascape/gingerlilly/41486/35298\u00010\u00010\u00013\u00010\u00014\u00014\u0000\u0000\u0000\u0000\u0000\u00010\n0000-00-00\u00010\u00010\n1 Jan 1970\u0004NULLMhttp://www.XXXXXXX.com/rugs/rug/colonial-mills-generation-seascape-se30/41486\bSeascape\u0003347\u0013Generation-SeascapeOhttp://www.XXXXXXX.com/rugs/brands/colonial-mills/collection/generationseascape\u000eColonial Mills1http://www.XXXXXXX.com/rugs/brands/colonial-mills\u0003USA\u0007Braided\rPolypropylene\u0007Braided\u000bGingerlilly\u0003626\fTransitional\fGold, Yellow\tRectangle���\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u
0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000' can not be represented as java.sql.Date
The only problem is that there is no "Date" field in this set of data. I have a few "Timestamp" fields, but when I wrote the config file for Solr I represented these as numeric or integer fields, not date fields. In addition, I temporarily changed the "Timestamp" fields to VARCHAR(255) just to see if that would make a difference, and it did not.
Any help is greatly appreciated!
You probably have some error in the Solr mapping. The error says it expected to convert the "blah blah Gingerlilly blah" value read from the database to a Date and failed.
If you can find which column contains the "Gingerlilly" stuff in the select's output, you'll have a good pointer to which configuration to check.
I came across the exact same problem in a dataimport from MySQL. In my case the problem was that I was trying to pass a zero value to a date field.
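The "zero value" mentioned above is MySQL's '0000-00-00' placeholder date (it is even visible inside the dumped row value in the stack trace). Plain Java rejects that date as well, which shows why the driver gives up; a minimal sketch:

```java
import java.sql.Date;

public class ZeroDateDemo {
    // Returns true if the string is accepted as a JDBC escape-format date.
    static boolean isValidSqlDate(String s) {
        try {
            Date.valueOf(s);
            return true;
        } catch (IllegalArgumentException e) {
            // MySQL's zero date '0000-00-00' has month 0, which is rejected
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println("1970-01-01 -> " + isValidSqlDate("1970-01-01"));
        System.out.println("0000-00-00 -> " + isValidSqlDate("0000-00-00"));
    }
}
```

One commonly used workaround (an assumption about your setup, not from the original answer) is to append zeroDateTimeBehavior=convertToNull to the MySQL JDBC URL in data-config.xml, so zero dates come back as NULL instead of failing.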
