I've installed Solr, as well as the DataImport handler/plugin...and after some trial and error I was able to get Solr to connect to a DB.
Unfortunately, I don't know Java, and this error response below is stumping me...
{
"responseHeader": {
"status": 0,
"QTime": 2682
},
"initArgs": [
"defaults",
[
"config",
"data-config.xml"
]
],
"command": "full-import",
"mode": "debug",
"documents": [],
"verbose-output": [
"entity:product",
[
"document#1",
[
"query",
"SELECT A.product_attribute_id, A.product_id, A.model, A.sku_code, A.status, A.image_filename, A.local_filename, A.width, A.width_final, A.length, A.length_final, A.msrp, A.map_price, A.sku_upc_code, A.url, A.discontinued, A.sq_foot, A.weight, A.rolled_length, A.rolled_width, A.rolled_height, A.lifestyle, A.detailed_color, A.bullet_point1, A.bullet_point2, A.bullet_point3, A.on_hand, A.eta_date, A.pile_height, A.eta_quantity, DATE_FORMAT(FROM_UNIXTIME(A.last_updated), '%e %b %Y') AS last_updated, P.sku_description, P.url as product_url, P.design, C.collection_id, C.collection_name, C.url as collection_url, M.manufacturer_name, M.url as manufacturer_url, O.origin_name, T.type_name, CO.content_name, S.style_name, COL.color_name, COL.color_id, PS.parentstyle_name, PC.parentcolor_name, SHA.shape_name, SA.sale_amount, DATE_FORMAT(FROM_UNIXTIME(SA.sale_start), '%e %b %Y') AS sale_start, DATE_FORMAT(FROM_UNIXTIME(SA.sale_end), '%e %b %Y') AS sale_end FROM tr_product_attributes as A INNER JOIN tr_products as P ON A.product_id = P.product_id INNER JOIN tr_collections as C ON P.collection = C.collection_id INNER JOIN tr_manufacturers as M ON P.manufacturer = M.manufacturer_id INNER JOIN tr_origins as O ON P.origin = O.origin_id INNER JOIN tr_types as T ON P.type = T.type_id INNER JOIN tr_contents as CO ON P.content = CO.content_id INNER JOIN tr_styles as S ON A.style = S.style_id INNER JOIN tr_parentstyles as PS ON A.parentstyle = PS.parentstyle_id INNER JOIN tr_colors as COL ON A.color = COL.color_id INNER JOIN tr_parentcolors as PC ON A.parentcolor = PC.parentcolor_id INNER JOIN tr_shapes as SHA ON A.shape = SHA.shape_id LEFT JOIN tr_sales as SA ON P.collection = SA.sale_collection_id WHERE A.status=1 AND A.url NOT LIKE '%http://www.XXXXXXX.com/rugs/rug/-///%' AND PC.parentcolor_name!='NA' AND A.active='Y' AND A.discontinued=0 AND A.local_filename LIKE '%jpg%' AND P.status=1 AND P.discontinued=0 AND C.status=1 AND C.discontinued=0 AND M.status=1",
"time-taken",
"0:0:2.652",
"EXCEPTION",
"org.apache.solr.handler.dataimport.DataImportHandlerException: Error reading data from database Processing Document # 1\n\tat org.apache.solr.handler.dataimport.DataImportHandlerException.wrapAndThrow(DataImportHandlerException.java:70)\n\tat org.apache.solr.handler.dataimport.JdbcDataSource$ResultSetIterator.getARow(JdbcDataSource.java:398)\n\tat org.apache.solr.handler.dataimport.JdbcDataSource$ResultSetIterator.access$600(JdbcDataSource.java:296)\n\tat org.apache.solr.handler.dataimport.JdbcDataSource$ResultSetIterator$1.next(JdbcDataSource.java:336)\n\tat org.apache.solr.handler.dataimport.JdbcDataSource$ResultSetIterator$1.next(JdbcDataSource.java:328)\n\tat org.apache.solr.handler.dataimport.EntityProcessorBase.getNext(EntityProcessorBase.java:133)\n\tat org.apache.solr.handler.dataimport.SqlEntityProcessor.nextRow(SqlEntityProcessor.java:74)\n\tat org.apache.solr.handler.dataimport.EntityProcessorWrapper.nextRow(EntityProcessorWrapper.java:243)\n\tat org.apache.solr.handler.dataimport.DocBuilder.buildDocument(DocBuilder.java:475)\n\tat org.apache.solr.handler.dataimport.DocBuilder.buildDocument(DocBuilder.java:414)\n\tat org.apache.solr.handler.dataimport.DocBuilder.doFullDump(DocBuilder.java:329)\n\tat org.apache.solr.handler.dataimport.DocBuilder.execute(DocBuilder.java:232)\n\tat org.apache.solr.handler.dataimport.DataImporter.doFullImport(DataImporter.java:416)\n\tat org.apache.solr.handler.dataimport.DataImporter.runCmd(DataImporter.java:480)\n\tat org.apache.solr.handler.dataimport.DataImportHandler.handleRequestBody(DataImportHandler.java:185)\n\tat org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:143)\n\tat org.apache.solr.core.SolrCore.execute(SolrCore.java:2064)\n\tat org.apache.solr.servlet.HttpSolrCall.execute(HttpSolrCall.java:654)\n\tat org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:450)\n\tat org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:227)\n\tat org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:196)\n\tat org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1652)\n\tat org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:585)\n\tat org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)\n\tat org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:577)\n\tat org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:223)\n\tat org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1127)\n\tat org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:515)\n\tat org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:185)\n\tat org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1061)\n\tat org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)\n\tat org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:215)\n\tat org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:110)\n\tat org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:97)\n\tat org.eclipse.jetty.server.Server.handle(Server.java:497)\n\tat org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:310)\n\tat org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:257)\n\tat org.eclipse.jetty.io.AbstractConnection$2.run(AbstractConnection.java:540)\n\tat 
org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:635)\n\tat org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:555)\n\tat java.lang.Thread.run(Thread.java:745)\nCaused by: java.sql.SQLException: Value '\u000535298\u000541486\u0010SE30 Gingerlilly\rSE30R024X036S\u00011E/img/RugImages/Vendors/Colonial Mills/Generation/SE30-gingerlilyl.jpg\u0014SE30-gingerlilyl.jpg\u00052.000\u00052' 0\"\u00053.000\u00053' 0\"\b119.0000\u000789.0000\f082262203658Zhttp://www.XXXXXXX.com/rugs/rug/colonial-mills-generation-seascape/gingerlilly/41486/35298\u00010\u00010\u00013\u00010\u00014\u00014\u0000\u0000\u0000\u0000\u0000\u00010\n0000-00-00\u00010\u00010\n1 Jan 1970\u0004NULLMhttp://www.XXXXXXX.com/rugs/rug/colonial-mills-generation-seascape-se30/41486\bSeascape\u0003347\u0013Generation-SeascapeOhttp://www.XXXXXXX.com/rugs/brands/colonial-mills/collection/generationseascape\u000eColonial Mills1http://www.XXXXXXX.com/rugs/brands/colonial-mills\u0003USA\u0007Braided\rPolypropylene\u0007Braided\u000bGingerlilly\u0003626\fTransitional\fGold, Yellow\tRectangle���\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000' can not be represented as java.sql.Date\n\tat com.mysql.jdbc.SQLError.createSQLException(SQLError.java:998)\n\tat com.mysql.jdbc.SQLError.createSQLException(SQLError.java:937)\n\tat com.mysql.jdbc.SQLError.createSQLException(SQLError.java:926)\n\tat com.mysql.jdbc.SQLError.createSQLException(SQLError.java:872)\n\tat 
com.mysql.jdbc.ResultSetRow.getDateFast(ResultSetRow.java:145)\n\tat com.mysql.jdbc.BufferRow.getDateFast(BufferRow.java:689)\n\tat com.mysql.jdbc.ResultSetImpl.getDate(ResultSetImpl.java:2012)\n\tat com.mysql.jdbc.ResultSetImpl.getDate(ResultSetImpl.java:1975)\n\tat com.mysql.jdbc.ResultSetImpl.getObject(ResultSetImpl.java:4587)\n\tat com.mysql.jdbc.ResultSetImpl.getObject(ResultSetImpl.java:4710)\n\tat org.apache.solr.handler.dataimport.JdbcDataSource$ResultSetIterator.getARow(JdbcDataSource.java:358)\n\t... 39 more\n"
]
]
],
"status": "idle",
"importResponse": "",
"statusMessages": {
"Time Elapsed": "0:0:2.667",
"Total Requests made to DataSource": "1",
"Total Rows Fetched": "0",
"Total Documents Processed": "0",
"Total Documents Skipped": "0",
"Full Dump Started": "2015-10-21 14:05:52",
"Full Import failed": "2015-10-21 14:05:55"
}
}
I think I've narrowed it down to this particular part of the error message...
Caused by: java.sql.SQLException: Value '\u000535298\u000541486\u0010SE30 Gingerlilly\rSE30R024X036S\u00011E/img/RugImages/Vendors/Colonial Mills/Generation/SE30-gingerlilyl.jpg\u0014SE30-gingerlilyl.jpg\u00052.000\u00052' 0\"\u00053.000\u00053' 0\"\b119.0000\u000789.0000\f082262203658Zhttp://www.XXXXXXX.com/rugs/rug/colonial-mills-generation-seascape/gingerlilly/41486/35298\u00010\u00010\u00013\u00010\u00014\u00014\u0000\u0000\u0000\u0000\u0000\u00010\n0000-00-00\u00010\u00010\n1 Jan 1970\u0004NULLMhttp://www.XXXXXXX.com/rugs/rug/colonial-mills-generation-seascape-se30/41486\bSeascape\u0003347\u0013Generation-SeascapeOhttp://www.XXXXXXX.com/rugs/brands/colonial-mills/collection/generationseascape\u000eColonial Mills1http://www.XXXXXXX.com/rugs/brands/colonial-mills\u0003USA\u0007Braided\rPolypropylene\u0007Braided\u000bGingerlilly\u0003626\fTransitional\fGold, Yellow\tRectangle���\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000' can not be represented as java.sql.Date
The only problem is that there is no "Date" field in this set of data. I have a few "Timestamp" fields, but when I wrote the Solr config file I represented these as numeric/integer fields, or something like that, not date fields. In addition, I temporarily changed the "Timestamp" fields to VARCHAR(255) just to see if that would make a difference, and it did not.
Any help is greatly appreciated!
You probably have some error in your Solr mapping. The error says it tried to convert the "blah blah Gingerlilly blah" value read from the database to a Date and failed.
If you can find which column contains the "Gingerlilly" stuff in the select's output, you'll have a good pointer to which configuration to check.
I came across the exact same problem in a data import from MySQL. In my case the problem was that a zero date value ('0000-00-00') was being passed to a date field.
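If the same thing is happening here, the giveaway is the 0000-00-00 value visible inside the rejected row in the exception; MySQL stores it as a "zero date", and the driver refuses to turn it into a java.sql.Date. One common workaround is sketched below under a few assumptions (Connector/J 5.x as in the stack trace, made-up connection details, and eta_date as the DATE column holding the zero value): add zeroDateTimeBehavior=convertToNull to the JDBC URL. The same parameter can be appended to the url attribute of the dataSource in data-config.xml, so the DataImportHandler gets NULL instead of an exception.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class ZeroDateCheck {
    public static void main(String[] args) throws Exception {
        // Hypothetical connection details; the important part is zeroDateTimeBehavior=convertToNull,
        // which tells Connector/J 5.x to return NULL for '0000-00-00' dates instead of throwing
        // "Value '...' can not be represented as java.sql.Date".
        String url = "jdbc:mysql://localhost:3306/rugsdb?zeroDateTimeBehavior=convertToNull";

        try (Connection conn = DriverManager.getConnection(url, "dbuser", "dbpassword");
             Statement st = conn.createStatement();
             ResultSet rs = st.executeQuery(
                     "SELECT eta_date FROM tr_product_attributes WHERE product_attribute_id = 35298")) {
            while (rs.next()) {
                // With convertToNull this prints null; without it, getObject() fails exactly
                // the way the DataImportHandler does in the stack trace above.
                System.out.println(rs.getObject("eta_date"));
            }
        }
    }
}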
Related
How can I add another item to a MySQL column of type JSON?
In my table I have one column of JSON type, initially null.
So I used this command to update the value:
update myTable
set columnJson = '[{"id" : "someId1" , "name": "someNamme2"}
,{"id" : "someId2", "name": "someNamme2"}]'
where id = "rowID1";
This is working fine, and I have two items.
Now I want to add one more item to that array.
I am using the same command:
update myTable
set columnJson = '[{"id" : "someId3", "name": "someNamme3"}]'
where id = "rowID1";
But the previous value gets washed away. Is there any way I can add n number of values? I am doing this in Java.
You need JSON functions like JSON_ARRAY_APPEND; see the other JSON functions for more ways to manipulate JSON values.
JSON needs some special functions that have to be learned. We usually recommend not using JSON, because in a normalized table you can use all the SQL functionality that exists;
JSON always takes a moderate learning effort.
update myTable
set columnJson = '[{"id" : "someId1" , "name": "someNamme2"}
,{"id" : "someId2", "name": "someNamme2"}]'
where id = "rowID1";
Rows matched: 1 Changed: 1 Warnings: 0
update myTable
set columnJson = JSON_ARRAY_APPEND(columnJson, '$[0]', '{"id" : "someId3", "name": "someNamme3"}')
Rows matched: 1 Changed: 1 Warnings: 0
SELECT * FROM myTable

id     | columnJson
rowID1 | [[{"id": "someId1", "name": "someNamme2"}, "{"id" : "someId3", "name": "someNamme3"}"], {"id": "someId2", "name": "someNamme2"}]
fiddle
And if you want another position, you change the path that says where it should go:
update myTable
set columnJson = '[{"id" : "someId1" , "name": "someNamme2"}
,{"id" : "someId2", "name": "someNamme2"}]'
where id = "rowID1";
Rows matched: 1 Changed: 1 Warnings: 0
update myTable
set columnJson = JSON_ARRAY_APPEND(columnJson, '$[1]', '{"id" : "someId3", "name": "someNamme3"}')
Rows matched: 1 Changed: 1 Warnings: 0
SELECT * FROM myTable

id     | columnJson
rowID1 | [{"id": "someId1", "name": "someNamme2"}, [{"id": "someId2", "name": "someNamme2"}, "{"id" : "someId3", "name": "someNamme3"}"]]
fiddle
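Since the question mentions doing this from Java, below is a small JDBC sketch of the same idea; the connection details are made up, while the table and values are the ones from the question. It appends the new object to the end of the top-level array by using the path '$', and wraps the parameter in CAST(... AS JSON) so it is stored as a JSON object rather than as a quoted string (the '$[0]' / '$[1]' paths above append inside a specific element instead).

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class AppendJsonItem {
    public static void main(String[] args) throws Exception {
        // Hypothetical connection details.
        String url = "jdbc:mysql://localhost:3306/mydb";

        String sql = "UPDATE myTable "
                + "SET columnJson = JSON_ARRAY_APPEND(columnJson, '$', CAST(? AS JSON)) "
                + "WHERE id = ?";

        try (Connection conn = DriverManager.getConnection(url, "dbuser", "dbpassword");
             PreparedStatement ps = conn.prepareStatement(sql)) {
            // Appends {"id": "someId3", "name": "someNamme3"} after the existing two items.
            ps.setString(1, "{\"id\": \"someId3\", \"name\": \"someNamme3\"}");
            ps.setString(2, "rowID1");
            ps.executeUpdate();
        }
    }
}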
I'm making a simple application with Spring Boot (2.3.4) using MongoDB with Spring Data for MongoDB. I usually create queries for the app using the @Query annotation and that works fine. But for an aggregation I want to use, I built a query with the Criteria class. The criteria I need is like
where("primary").is(value).and("secondary").is("").
I need all entries where primary is equal to 'value' and secondary is empty. The query entered in MongoDB Compass
{ $and: [ { primary: 'value' }, { secondary: ''} ] }
works as expected, but when I try to use the Criteria with Spring, it looks like the and part with secondary is completely dropped. I get results with 'value' in primary and anything at all in secondary, whether an empty field or anything else. Replacing the .is("") part with .regex("^$") didn't help.
This looks pretty basic to me, so what am I missing here? I don't want to replace the empty secondary with an "empty flag", because that feels wrong.
Update:
This is the code in question
Criteria crit;
if (!primary.equals(secondary)) {
    crit = where("primary").is(primary.name()).and("secondary").is(secondary.name());
} else {
    crit = where("primary").is(primary.name()).and("secondary").is("");
}
MatchOperation matchStage = Aggregation.match(crit);
GroupOperation groupStage = Aggregation.group("grouping").count().as("sum");
SortOperation sortStage = new SortOperation(Sort.by("_id"));
Aggregation aggregation = Aggregation.newAggregation(matchStage, groupStage, sortStage);
AggregationResults<TypePerGroup> results = mongoTemplate.aggregate(aggregation, "dataCollection", TypePerGroup.class);
This works with MongoDB - not sure what abstraction Compass adds. The two queries don't generate the same JSON, but they are equivalent.
The generated query for
where("primary").is(value).and("secondary").is("")
is
{"primary": value, "secondary": ""}
Perhaps Compass doesn't like this variant?
Anyway, to generate a query similar to what you entered in Compass, you can use the code below:
Criteria criteria = new Criteria();
criteria.andOperator(Criteria.where("primary").is("hello"), Criteria.where("secondary").is(""));
Query query = Query.query(criteria);
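If you want to see what Spring Data actually builds from the Criteria without turning on logging or connecting to a database, you can print the query Document directly. A quick sketch with a placeholder value:

import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;

public class CriteriaDebug {
    public static void main(String[] args) {
        // Build the criteria exactly as in the question.
        Criteria crit = Criteria.where("primary").is("value").and("secondary").is("");

        // Prints {"primary": "value", "secondary": ""} - the flattened but equivalent
        // form of { $and: [ { primary: 'value' }, { secondary: '' } ] }.
        System.out.println(Query.query(crit).getQueryObject().toJson());
    }
}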
You are not missing anything. where("primary").is(value).and("secondary").is("") is correct and is functionally equivalent to { $and: [ { primary: 'value' }, { secondary: ''} ] }. You should turn on debug level logging for MongoTemplate to see the generated query.
I have connected to Atlas using MongoDB Compass and added 4 records to the collection:
[{
"primary": "A",
"secondary": "A"
},{
"primary": "A",
"secondary": ""
},{
"primary": "B",
"secondary": "B"
},{
"primary": "B",
"secondary": ""
}]
both queries:
List<Data> firstResults = mongoTemplate.query(Data.class)
.matching(Query.query(Criteria.where("primary").is("B").and("secondary").is("")))
.all();
System.out.println(firstResults);
Criteria criteria = new Criteria();
criteria.andOperator(Criteria.where("primary").is("B"), Criteria.where("secondary").is(""));
List<Data> secondResults = mongoTemplate.query(Data.class)
.matching(Query.query(criteria))
.all();
System.out.println(secondResults);
gave the same result:
[Data{primary='B', secondary=''}]
Campfire, can you please provide an example of your code to analyze?
I have the below code in a JpaRepository in a Spring Boot project; the query returns JSON structured as GeoJSON, as shown below.
I am getting the error: Resolved [org.springframework.orm.jpa.JpaSystemException: No Dialect mapping for JDBC type: 1111; nested exception is org.hibernate.MappingException: No Dialect mapping for JDBC type: 1111]
String queryString = "SELECT row_to_json(fc)\n" +
" FROM ( SELECT 'FeatureCollection' As type, array_to_json(array_agg(f)) As features\n" +
" FROM (SELECT 'Feature' As type\n" +
" , ST_AsGeoJSON(lg.column1) As geometry\n" +
" , row_to_json((column2, column3)) As properties\n" +
" FROM public.table As lg ) As f ) As fc;";
@Query(value=queryString, nativeQuery=true)
Json getData(@Param("id") Long id);
In the application.properties file I set my dialect as:
spring.jpa.database-platform=org.hibernate.spatial.dialect.postgis.PostgisDialect
The query returns a GeoJSON FeatureCollection object, and I want to be able to read it into Java.
Please bear with the way I asked the question; I am new here.
The returned GeoJSON would look something like this:
{"type" : "FeatureCollection", "features" : [{"type": "Feature", "geometry": {"type":"Point","coordinates":[1,1]}, "properties": {"id": 1, "name": "one"}}, {"type": "Feature", "geometry": {"type":"Point","coordinates":[2,2]}, "properties": {"id": 2, "name": "two"}}, {"type": "Feature", "geometry": {"type":"Point","coordinates":[3,3]}, "properties": {"id": 3, "name": "three"}}]}
I don't know exactly what you are trying to achieve here, but I guess you need to use String as the return type and parse that JSON string in another method.
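A rough sketch of that suggestion follows; the entity and repository names are made up. Casting the json result to text in the native query (CAST(row_to_json(fc) AS text)) makes the column come back as a plain VARCHAR instead of the unmapped JDBC type 1111, so it can be returned as a String and parsed afterwards, for example with Jackson's ObjectMapper.readTree(...).

import javax.persistence.Entity;
import javax.persistence.Id;

import org.springframework.data.jpa.repository.JpaRepository;
import org.springframework.data.jpa.repository.Query;

// Hypothetical placeholder entity so the repository has something to manage;
// the native query below does not actually use it.
@Entity
class GeoEntity {
    @Id
    Long id;
}

public interface GeoDataRepository extends JpaRepository<GeoEntity, Long> {

    // Casting row_to_json(fc) to text is what avoids "No Dialect mapping for JDBC type: 1111".
    @Query(value = "SELECT CAST(row_to_json(fc) AS text) "
            + "FROM (SELECT 'FeatureCollection' AS type, array_to_json(array_agg(f)) AS features "
            + "      FROM (SELECT 'Feature' AS type, "
            + "                   ST_AsGeoJSON(lg.column1) AS geometry, "
            + "                   row_to_json((column2, column3)) AS properties "
            + "            FROM public.table AS lg) AS f) AS fc",
            nativeQuery = true)
    String getFeatureCollectionJson();
}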
I have the sampleMeasurement1 given below; I want to update the values of the respective columns in InfluxDB. How do I update those values?
SELECT * FROM sampleMeasurement1;
{ "results": [ { "series": [ { "name": "sampleMeasurement1", "columns": [ "time", "disk_type", "field1", "field2", "hostname" ], "values": [ [ 1520315870774000000, null, 12212, 22.44, "server001" ], [ 1520315870843000000, "HDD", 112, 21.44, "localhost" ] ] } ] } ] }
We can't change tag values via InfluxDB commands. We can, however, write a client script that changes the value of a tag by inserting "duplicate" points into the measurement with the same timestamp, field set and tag set, except that the desired tag has its value changed.
Point with wrong tag ( https://docs.influxdata.com/influxdb/v1.4/write_protocols/line_protocol_reference/#syntax ):
cpu,hostname=machine.lan cpu=50 1514970123
After running
INSERT cpu,hostname=machine.mydomain.com cpu=50 1514970123
a SELECT * FROM cpu would include
cpu,hostname=machine.lan cpu=50 1514970123
cpu,hostname=machine.mydomain.com cpu=50 1514970123
After the script runs all the INSERT commands, you'll need to drop the obsolete series of points with the old tag value:
DROP SERIES FROM cpu WHERE hostname='machine.lan'
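Putting both steps together, a rough sketch of such a client script using the influxdb-java client could look like this; the URL, credentials, database name and retention policy are assumptions, and the timestamp is treated as seconds to match the example above.

import java.util.concurrent.TimeUnit;

import org.influxdb.InfluxDB;
import org.influxdb.InfluxDBFactory;
import org.influxdb.dto.Point;
import org.influxdb.dto.Query;

public class RetagCpuPoints {
    public static void main(String[] args) {
        // Hypothetical connection details.
        InfluxDB influxDB = InfluxDBFactory.connect("http://localhost:8086", "admin", "password");
        String database = "mydb";

        // Re-insert the point with the corrected tag value but the same timestamp and fields.
        Point fixed = Point.measurement("cpu")
                .time(1514970123L, TimeUnit.SECONDS)
                .tag("hostname", "machine.mydomain.com")
                .addField("cpu", 50L)
                .build();
        influxDB.write(database, "autogen", fixed);

        // Once every point has been rewritten, drop the series with the obsolete tag value.
        influxDB.query(new Query("DROP SERIES FROM cpu WHERE hostname = 'machine.lan'", database));

        influxDB.close();
    }
}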
Change tag value in InfluxDB
This is an extension of this question here:
Getting Facebook Events using fql
I am running into a problem when there is no location/venue set for the events.
Here is the query I am using:
{\"events\":
\"SELECT eid, name,description, start_time, end_time, pic_small,pic_big, eid,venue,location
from event WHERE eid in (SELECT eid FROM event_member WHERE uid = me()
and start_time > %s)\" ,
\"locations\":
\"select page_id,location from page where page_id IN
(select venue.id from #events)\" ,
\"rsvpStatus\":
\"select eid, rsvp_status from event_member where eid IN
(select eid from #events) AND uid = me()\" }
My observation: when there is a location set for the event, the venue attribute is a JSON object with an id property. Please refer to the JSON below.
{
"eid": 34343434322,
"name": "Another yo you",
"description": "",
"start_time": 1347699600,
"end_time": 1347786000,
"pic_small": "http:\/\/profile.ak.fbcdn.net\/static-ak\/rsrc.php\/v2\/yy\/r\/XcB-JGXohjk.png",
"pic_big": "http:\/\/profile.ak.fbcdn.net\/static-ak\/rsrc.php\/v2\/yn\/r\/5uwzdFmIMKQ.png",
"venue": {
"id": 34243242
},
"location": "Airbiquity"
}
When there is no location set, it is an empty array. Please refer to the JSON below.
{
"eid": 441320552577649,
"name": "Meeting",
"description": "",
"start_time": 1347491700,
"end_time": 1347502500,
"pic_small": "http:\/\/profile.ak.fbcdn.net\/static-ak\/rsrc.php\/v2\/yy\/r\/XcB-JGXohjk.png",
"pic_big": "http:\/\/profile.ak.fbcdn.net\/static-ak\/rsrc.php\/v2\/yn\/r\/5uwzdFmIMKQ.png",
"venue": [
],
"location": null
}
It's weird that the type of the venue attribute changes. I am using GSON to convert this response to a Java object, and it's throwing errors due to the unexpected structure of venue.
Can someone help me with this issue?
Facebook's treatment of venues changed this past May. There are several different situations I've seen returned by the API, depending on when and how the event was created:
The id of a known venue is returned.
The venue field is blank and an arbitrary location string is returned.
Venue contains an array of information about the event venue, but no id.
Unfortunately, your script has to deal with all of these cases and be ready for whatever future cases the Facebook engineers may come up with.
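On the GSON side, one way to cope with the changing shape is a custom deserializer for the venue field that accepts the object form and treats anything else (such as the empty array) as "no venue". This is only a sketch; the Venue class and its id field are assumptions modelled on the JSON in the question.

import java.lang.reflect.Type;

import com.google.gson.Gson;
import com.google.gson.GsonBuilder;
import com.google.gson.JsonDeserializationContext;
import com.google.gson.JsonDeserializer;
import com.google.gson.JsonElement;
import com.google.gson.JsonParseException;

// Hypothetical venue model based on the JSON in the question.
class Venue {
    Long id;
}

class VenueDeserializer implements JsonDeserializer<Venue> {
    @Override
    public Venue deserialize(JsonElement json, Type typeOfT, JsonDeserializationContext context)
            throws JsonParseException {
        // Only the object form ({"id": 34243242}) carries a usable venue.
        if (json.isJsonObject() && json.getAsJsonObject().has("id")) {
            Venue venue = new Venue();
            venue.id = json.getAsJsonObject().get("id").getAsLong();
            return venue;
        }
        // The empty-array form ("venue": []) and anything else mean "no venue set".
        return null;
    }
}

public class EventParsing {
    public static void main(String[] args) {
        Gson gson = new GsonBuilder()
                .registerTypeAdapter(Venue.class, new VenueDeserializer())
                .create();
        // gson.fromJson(eventJson, Event.class) will now tolerate both shapes of "venue",
        // assuming the Event class declares a field "Venue venue".
    }
}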