BSON Message To Map in Java

We are currently sending messages to a Redis queue, which is being picked up by our Java application.
Does anyone have an idea how to convert the BSON message to a Map in Java?
Here is an example BSON message we pop from the Redis queue:
\x16\x00\x00\x00\x02hello\x00\x06\x00\x00\x00world\x00\x00

You can use the MongoDB driver.
Parse your BSON data like this:
RawDBObject obj = new RawDBObject(buf); // buf is a ByteBuffer holding the raw BSON
Map map = obj.toMap();
Done.
https://github.com/mongodb/mongo-java-driver/blob/master/src/main/com/mongodb/RawDBObject.java
Or the official BSON site may help:
http://bsonspec.org/#/implementation
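Alternatively, the driver's org.bson classes can decode the raw bytes directly. A minimal, self-contained sketch using BasicBSONDecoder; the byte array below is the example message from the question:

import java.util.Map;
import org.bson.BSONObject;
import org.bson.BasicBSONDecoder;

public class BsonToMap {
    public static void main(String[] args) {
        // {"hello": "world"} encoded as BSON, per the question's example
        byte[] bytes = {
            0x16, 0x00, 0x00, 0x00,            // total document length: 22 bytes
            0x02,                               // element type 0x02: UTF-8 string
            'h', 'e', 'l', 'l', 'o', 0x00,      // element name, NUL-terminated
            0x06, 0x00, 0x00, 0x00,             // string length, including NUL
            'w', 'o', 'r', 'l', 'd', 0x00,      // string value
            0x00                                // end of document
        };
        BSONObject obj = new BasicBSONDecoder().readObject(bytes);
        Map<?, ?> map = obj.toMap();
        System.out.println(map); // {hello=world}
    }
}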

You can use a BSON parser to parse your BSON input. A quick search turns up bson4jackson, but I have never tried it myself.
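If you go that route, decoding to a Map would presumably look something like this (an untested sketch; bsonBytes is assumed to hold the raw bytes popped from Redis):

import java.util.Map;
import com.fasterxml.jackson.databind.ObjectMapper;
import de.undercouch.bson4jackson.BsonFactory;

// A Jackson ObjectMapper backed by a BSON factory reads BSON instead of JSON
ObjectMapper mapper = new ObjectMapper(new BsonFactory());
Map<?, ?> map = mapper.readValue(bsonBytes, Map.class);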

Related

What is the equivalent of cts:element-query in the MarkLogic Java API

I have a MarkLogic query written in XQuery and I would like to convert it to the Java API using StructuredQueryBuilder. Unfortunately, I can't find a Java equivalent for cts:element-query. Can you please show me how to implement it in Java?
The query that I want to convert:
cts:element-query(fn:QName("http://www.example.com/2009/pfi2","content"), cts:word-query("florists", ("case-insensitive","lang=en"), 4.5), ())
The StructuredQueryBuilder.containerQuery() method constructs a search:container-query in the Search API. On the e-node, the REST API converts the search:container-query to cts:element-query() or cts:json-property-query() or cts:json-property-scope-query() as appropriate.
For more detail, see:
http://docs.marklogic.com/javadoc/client/com/marklogic/client/query/StructuredQueryBuilder.html#containerQuery-com.marklogic.client.query.StructuredQueryBuilder.ContainerIndex-com.marklogic.client.query.StructuredQueryDefinition-
http://docs.marklogic.com/guide/search-dev/structured-query#id_87231
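For illustration, a minimal sketch of the containerQuery() approach for the query above; the word-query options ("case-insensitive", "lang=en") are left at their defaults here, so this is an approximation rather than an exact equivalent:

import javax.xml.namespace.QName;
import com.marklogic.client.query.StructuredQueryBuilder;
import com.marklogic.client.query.StructuredQueryDefinition;

StructuredQueryBuilder qb = new StructuredQueryBuilder();
// Container query on the pfi2:content element, wrapping a weighted term query
StructuredQueryDefinition query = qb.containerQuery(
        qb.element(new QName("http://www.example.com/2009/pfi2", "content")),
        qb.term(4.5, "florists"));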
The other way to provide the query in the Java API is to serialize the cts:element-query() as JSON or XML to learn the query structure, then use a DOM to construct the query and pass it as a RawCtsQueryDefinition payload.
For that approach, see:
http://docs.marklogic.com/guide/java/searches#id_45762
http://docs.marklogic.com/javadoc/client/com/marklogic/client/query/RawCtsQueryDefinition.html
Hoping that helps,

How can I append timestamp to rdd and push to elasticsearch

I am new to Spark Streaming and Elasticsearch. I am trying to read data from a Kafka topic using Spark and store it as an RDD. As soon as new data arrives, I want to append a timestamp to it and then push it to Elasticsearch.
lines.foreachRDD(rdd -> {
    if (!rdd.isEmpty()) {
        // rdd.collect().forEach(System.out::println);
        String timeStamp = new SimpleDateFormat("yyyy::MM::dd::HH::mm::ss").format(new Date());
        List<String> myList = new ArrayList<String>(Arrays.asList(timeStamp.split("\\s+")));
        List<String> f = rdd.collect();
        Map<List<String>, ?> rddMaps = ImmutableMap.of(f, 1);
        Map<List<String>, ?> myListrdd = ImmutableMap.of(myList, 1);
        JavaRDD<Map<List<String>, ?>> javaRDD = sc.parallelize(ImmutableList.of(rddMaps));
        JavaEsSpark.saveToEs(javaRDD, "sample/docs");
    }
});
Spark?
As far as I understand, Spark Streaming is for real-time streaming computation, such as map, reduce, join, and window. There seems to be no need for such a powerful tool when all we need is to add a timestamp to each event.
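That said, if you do stay with Spark, the timestamp can be attached to each record with a plain map instead of collecting the RDD to the driver. A rough sketch (the "message" and "timestamp" field names are assumptions):

import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.Map;
import com.google.common.collect.ImmutableMap;
import org.apache.spark.api.java.JavaRDD;
import org.elasticsearch.spark.rdd.api.java.JavaEsSpark;

// Inside the streaming job, with lines as in the question:
lines.foreachRDD(rdd -> {
    if (!rdd.isEmpty()) {
        String ts = new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss").format(new Date());
        // Tag every record with the ingest timestamp, then index the whole RDD
        JavaRDD<Map<String, String>> docs =
                rdd.map(line -> ImmutableMap.of("message", line, "timestamp", ts));
        JavaEsSpark.saveToEs(docs, "sample/docs");
    }
});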
Logstash?
If that is the situation, Logstash may be more suitable for our case.
Logstash records a timestamp when an event arrives, and it also has a persistent queue and dead letter queues that ensure data resiliency. It has native support for pushing data to Elasticsearch (after all, they belong to the same family of products), which makes it very easy to do.
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "logstash-%{type}-%{+YYYY.MM.dd}"
  }
}
More
For more about Logstash, here is an introduction.
Here is a sample Logstash config file.
Hope this is helpful.
Ref
Deploying and Scaling Logstash
If all you're using Spark Streaming for is getting the data from Kafka to Elasticsearch, a neater way, and one that needs no coding, would be to use Kafka Connect.
There is an Elasticsearch Kafka Connect sink. Depending on what you want to do with a timestamp (e.g. for index routing, or to add as a field), you can use Single Message Transforms (there's an example of them here).
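For example, a hypothetical sink configuration that uses the InsertField transform to add the Kafka record timestamp as a field (the connector name, topic, and field name below are made up for illustration):

# Elasticsearch sink with a Single Message Transform adding the record timestamp
name=es-sink
connector.class=io.confluent.connect.elasticsearch.ElasticsearchSinkConnector
topics=sample
connection.url=http://localhost:9200
type.name=docs
transforms=addTs
transforms.addTs.type=org.apache.kafka.connect.transforms.InsertField$Value
transforms.addTs.timestamp.field=timestamp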

Flink Table API not able to convert DataSet to DataStream

I am using the Flink Table API with Java, and I want to convert a DataSet to a DataStream. Following is my code:
TableEnvironment tableEnvironment = new TableEnvironment();
Table tab1 = table.where("related_value < 2014").select("related_value,ref_id");
DataSet<MyClass> ds2 = tableEnvironment.toDataSet(tab1, MyClass.class);
DataStream<MyClass> d = tableEnvironment.toDataStream(tab1, MyClass.class);
But when I try to execute this program, it throws the following exception:
org.apache.flink.api.table.ExpressionException: Invalid Root for JavaStreamingTranslator: Root(ArraySeq((related_value,Double), (ref_id,String))). Did you try converting a Table based on a DataSet to a DataStream or vice-versa?
I want to know how we can convert a DataSet to a DataStream using the Flink Table API.
Another thing I want to know: for pattern matching, there is the Flink CEP library available, but is it feasible to use the Flink Table API for pattern matching?
Flink's Table API was not designed to convert a DataSet into a DataStream and vice versa. It is not possible to do that with the Table API and there is also no other way to do it with Flink at the moment.
Unifying the DataStream and DataSet APIs (handling batch processing as a special case of streaming, i.e., as bounded streams) is on the long-term roadmap of Flink.
You cannot convert to the DataStream API when using TableEnvironment; you must create a StreamTableEnvironment to convert from a Table to a DataStream, something like this:
final EnvironmentSettings fsSettings = EnvironmentSettings.newInstance().useBlinkPlanner().inStreamingMode().build();
final StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
final StreamTableEnvironment fsTableEnv = StreamTableEnvironment.create(env, fsSettings);
DataStream<MyClass> finalRes = fsTableEnv.toAppendStream(tableNameHere, MyClass.class);
Hope this helps in some way.
Kind regards!

MongoDB Java Driver

Given some JSON value and a query in MongoDB format, I want to filter the JSON entities the same way that MongoDB does, without going to MongoDB.
For example, I have:
JSON Value: [{qty: 10}, {qty: 30}, {qty: 50}]
Query in MongoDB format: { qty: { $gt: 20 } }
Result: [{qty: 30}, {qty: 50}]
I want to do that without going to the Mongo database, for example by calling some method inside some JAR that receives the JSON value and a JSON query string in Mongo format.
Thanks!
I want that without going to Mongo database
Parse JSON using Jackson and create a Query Object and a Collection containing the target objects.
Use a collections framework such as Guava or GS-Collections and filter.
The 'Jackson' library offers JSON parsing and generation in Java. Once you've parsed, you can filter values/data structures using Java code to your heart's content.
Java obviously has no direct implementation of the Mongo query language; you can implement the Java code yourself as desired.
See:
http://jackson.codehaus.org/
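As a minimal illustration of the Jackson approach, here is a sketch that handles only the { field: { $gt: value } } shape from the question; a general solution would need to interpret the full query language:

import java.util.ArrayList;
import java.util.List;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

public class MongoStyleFilter {
    public static void main(String[] args) throws Exception {
        ObjectMapper mapper = new ObjectMapper();
        JsonNode docs = mapper.readTree("[{\"qty\": 10}, {\"qty\": 30}, {\"qty\": 50}]");
        JsonNode query = mapper.readTree("{\"qty\": {\"$gt\": 20}}");

        // Pull the single field and its $gt bound out of the query document
        String field = query.fieldNames().next();
        double bound = query.get(field).get("$gt").asDouble();

        List<JsonNode> matches = new ArrayList<>();
        for (JsonNode doc : docs) {
            if (doc.has(field) && doc.get(field).asDouble() > bound) {
                matches.add(doc);
            }
        }
        System.out.println(matches); // [{"qty":30}, {"qty":50}]
    }
}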

Converting JSON to entities and storing in mongoDB using morphia

I have a JSON string being sent from a client (browser). I want to save it to my MongoDB database, which already has some collections defined by the user. I was able to successfully save objects using Morphia. But how can I do the same if I already have the JSON string returned from the client that I want to put in the "bands" collection?
Mongo mongo = new Mongo("localhost");
Datastore datastore = new Morphia().createDatastore(mongo, "bandmanager");

Band band = new Band();
band.setName("Punjabi band");
band.getMembers().add("Lucky1");
band.getMembers().add("Lucky2");
band.getMembers().add("Lucky3");
band.getMembers().add("Lucky4");
band.getMembers().add("Lucky5");
band.getMembers().add("Lucky6");
band.setGenre("Punjabi");
datastore.save(band);
Did you annotate Band with @Entity("bands")? I'm not sure what you're asking... Are you asking how to convert that JSON string into a Band object? If so, look into Jackson.
If you already have a JSON object, you don't really need Morphia. You can simply do the following with the Java driver:
DBObject dbObject = (DBObject) JSON.parse(yourJsonString);
For a full blog post on this see http://www.mkyong.com/mongodb/java-mongodb-convert-json-data-to-dbobject/
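Building on that, a sketch that inserts the parsed object straight into the "bands" collection (the database and collection names are taken from the question; yourJsonString is assumed to hold the client's JSON):

import com.mongodb.DBObject;
import com.mongodb.Mongo;
import com.mongodb.util.JSON;

Mongo mongo = new Mongo("localhost");
DBObject dbObject = (DBObject) JSON.parse(yourJsonString);
// Insert directly into the "bands" collection of the "bandmanager" database
mongo.getDB("bandmanager").getCollection("bands").insert(dbObject);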
PS: Do not forget to sanitize the JSON you get from the client!
