Import nested JSON into Cassandra - Java

I have a list of nested JSON objects which I want to save into Cassandra (1.2.15).
However, the constraint I have is that I do not know the column family's column data types beforehand, i.e. each JSON object has a different structure with fields of different data types.
So I am planning to use a dynamic composite type when creating the column family.
I would like to know if there is an API for this, or to hear some ideas on how to save such a JSON object list into Cassandra.
Thanks

If you don't need to query individual items from the JSON structure, just store the whole serialized string in one column.
If you do need to query individual items, I suggest using one of the collection types: list, set, or map. As far as typing goes, I would leave the value as text or blob and rely on JSON to handle the typing. In other words, JSON-encode the values before inserting and JSON-decode them when reading.
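As a sketch of that encode-before-insert approach, the snippet below hand-rolls a minimal JSON encoder for scalar values so that typing survives a text column; in practice you would use a library such as Gson or Jackson. The commented CQL (with a hypothetical `items` table and `attrs map<text, text>` column) shows where the encoded text would go.

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class JsonEncodeSketch {
    // Encode a scalar value as a JSON literal so the type can be recovered on read.
    static String jsonEncode(Object v) {
        if (v == null) return "null";
        if (v instanceof Number || v instanceof Boolean) return v.toString();
        // Strings are quoted, with backslashes and quotes escaped.
        return "\"" + v.toString().replace("\\", "\\\\").replace("\"", "\\\"") + "\"";
    }

    public static void main(String[] args) {
        Map<String, Object> fields = new LinkedHashMap<>();
        fields.put("name", "widget");
        fields.put("count", 42);
        fields.put("enabled", true);

        Map<String, String> encoded = new LinkedHashMap<>();
        for (Map.Entry<String, Object> e : fields.entrySet()) {
            encoded.put(e.getKey(), jsonEncode(e.getValue()));
        }
        // With a CQL driver you might then run something like (hypothetical schema):
        // UPDATE items SET attrs = attrs + ? WHERE id = ?   -- attrs is map<text, text>
        System.out.println(encoded);
    }
}
```

On read, each value can be JSON-decoded back to its original type; the map column keeps items individually addressable by key.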

Related

What is the best possible structure for a db to store configuration data

I have a requirement where I need to store different application config data, so I wanted to know the best possible way to store it.
Currently I am using the fields below for my db:
id, name, category, city, value (text field), int_value, float_value, string_value, date_value, bool_value
The value field will be used to store complex data, which in turn is a JSON object capable of storing different key-value pairs.
For example:
value:
{
  "is_enabled": true,
  "list_of_applicable_ids": ["123", "345", "567", "890"]
}
The reason I have added the different typed value fields (int_value, float_value, etc.) is that it is easy to query on those fields, and an index makes this even faster.
So I wanted to know the better approach to store this kind of data in a db.
Will using only the value field be enough for my requirements?
The frequency of config changes is very low (once or twice a month).
Take a look at document-based databases.
An example would be MongoDB.
Mongo uses a data format that closely resembles JSON (called BSON), and could in your case store the entire config document without the need to predefine its fields.

DynamoDB query on a subfield of a JSON object

Is it possible to search for a subfield of a JSON object in a DynamoDB table?
My table:
Item: "item name",
Location: {...},
ItemInformation: {
  ItemName: "itemName",
  ProductLine: {
    Brand: "Razer",
    ManufacturerSource: "Razer"
  }
}
Originally, ItemInformation would be a key: to search for an object, we would construct the JSON for the item information and then query with that JSON string as the key. Now we need to implement searching by subfields of that object, which can contain different fields each time, e.g. isDigital: "true".
I noticed the question:
DynamoDB advanced scan - JAVA
The answer would seem to be no, and I would have to separate out the fields. But I am curious why, and how the PHP library can query subfields of a JSON object in DynamoDB. Is there really no better solution than to store the column as separate fields and then add an index on all fields?
After looking through the documentation, it is not feasible to implement the search fields as I originally intended. The problem is that while the values are JSON, they are stored as string literals, so I would have to refactor to start storing them as JSON objects. Additionally, I cannot add columns and an index, because the search could operate on any number of fields, and different items can have different fields, e.g. an item can have Brand, BatteryInformation, Name. Given that the requirement is that any of these subfields should be searchable, it is better to do this in CloudSearch or Elasticsearch, where I can index and search on arbitrary fields and values within an object.
Since this is a DynamoDB table, I am going to use CloudSearch, since it offers easier indexing options and integration for the data.
At the moment, the only solution available is to store the column as separate fields. AWS may come up with a solution in future releases.

How to persist a sub-document of arbitrary data to MongoDB using Spring Data?

I am trying to insert a document (JSON string) into a MongoDB. One of its keys, "profile", has a value which is itself a JSON string, so it is basically a nested JSON structure. I know it is possible to insert nested JSON by abusing collection refs / one-to-many relationships in the document class.
The issue I am facing here is that the JSON structure of the nested part is not fixed, and hence it cannot be mapped to a Java class, as it is custom JSON data fetched from social networking APIs. Defining "profile" as a Java String inserts the profile data with slashes, escaping the double quotes, curly brackets, etc. in the JSON data.
Is there any other way without casting it to another object?
The way to go is probably to make profile a Map<String, Object> in the containing class. This way, you can store arbitrary data within it.
class MyDocument {
    Map<String, Object> profile;
}
The answer of using a Map was a little unclear to me: how do you convert an arbitrary JSON string to a Map in a way that Spring Data will persist it as-is?
I found that using a property of type com.mongodb.DBObject works. Then set your property using JSON.parse:
profile = (DBObject) JSON.parse(arbitraryJSONString);

Java - Store HashSet in MySQL

How would I go about storing a HashSet of unknown size in a MySQL table? I know I can loop through it and store it in a longtext field.
However, when I retrieve that field and hold it in a string temporarily, it uses extra memory to store that huge string.
Is there an easy way to store a HashSet?
A SQL table with indexed columns is basically a hash set.
You shouldn't try to store (persist) a binary representation of a HashSet in a table. You should store (persist) the data of your HashSet as rows and columns and then read that data into your HashSet on the Java side.
In other words, you should be using a database to store your data directly rather than saving a serialized representation of a Java collection that holds your data. That's what a database is meant to do... store/persist data in a structured, consistent manner. If all you really need to do is to serialize a HashSet then why bother with a database at all?
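A minimal sketch of the rows-and-columns approach over plain JDBC, assuming a hypothetical tags table (owner_id INT, tag VARCHAR(255)); the connection setup is left out, and all table and column names are made up for illustration:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.util.HashSet;
import java.util.Set;

public class HashSetRows {
    // Hypothetical schema: CREATE TABLE tags (owner_id INT, tag VARCHAR(255))
    static final String INSERT_SQL = "INSERT INTO tags (owner_id, tag) VALUES (?, ?)";
    static final String SELECT_SQL = "SELECT tag FROM tags WHERE owner_id = ?";

    // One row per element instead of one serialized blob.
    static void persist(Connection conn, int ownerId, Set<String> tags) throws SQLException {
        try (PreparedStatement ps = conn.prepareStatement(INSERT_SQL)) {
            for (String tag : tags) {
                ps.setInt(1, ownerId);
                ps.setString(2, tag);
                ps.addBatch();
            }
            ps.executeBatch();
        }
    }

    // Rebuild the HashSet from the rows on read.
    static Set<String> load(Connection conn, int ownerId) throws SQLException {
        Set<String> tags = new HashSet<>();
        try (PreparedStatement ps = conn.prepareStatement(SELECT_SQL)) {
            ps.setInt(1, ownerId);
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) tags.add(rs.getString(1));
            }
        }
        return tags;
    }
}
```

Because each element is its own row, you can also query, index, and update individual elements without rewriting the whole set.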
I would turn it into JSON and store it as text.
There are libraries out there that convert to and from JSON with ease, e.g. Google's Gson.
Yes: store each element as a separate record in a table with a foreign key to the 'parent' table. That is how JPA handles it.
You can serialize it and save it as a BLOB, because HashSet implements Serializable. http://download.oracle.com/javase/1.4.2/docs/api/java/util/HashSet.html
http://download.oracle.com/javase/1.4.2/docs/api/serialized-form.html#java.util.HashSet
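A small sketch of that serialization round-trip; the byte array produced here could be written to a BLOB column with PreparedStatement.setBytes and read back with ResultSet.getBytes (the surrounding JDBC code is omitted):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.UncheckedIOException;
import java.util.HashSet;

public class HashSetBlob {
    // Serialize the HashSet to bytes, suitable for a BLOB column.
    static byte[] toBytes(HashSet<String> set) {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(buf)) {
            out.writeObject(set);
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
        return buf.toByteArray();
    }

    // Deserialize the bytes read back from the BLOB column.
    @SuppressWarnings("unchecked")
    static HashSet<String> fromBytes(byte[] bytes) {
        try (ObjectInputStream in = new ObjectInputStream(new ByteArrayInputStream(bytes))) {
            return (HashSet<String>) in.readObject();
        } catch (IOException | ClassNotFoundException e) {
            throw new RuntimeException(e);
        }
    }
}
```

Note the trade-off the other answers point out: a BLOB is opaque to the database, so you lose the ability to query individual elements.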

How to dynamically specify data types while creating a table in MS Access using Java

My application extracts data from an Excel sheet. I store the value and the type of each piece of data from the sheet in an ArrayList, i.e. if my Excel sheet contains employee data, I will retrieve [Employee name, String], [Employee id, number] and so on. So I have to create a table with these names and their respective data types. How could I dynamically specify the data types for the attributes in the table? I am using JDBC and MS Access.
Well, you read your data into a String and, for every value, use String.matches(regex) to find out the data type. For example, do value.matches("\\d+"); if it matches, instantiate an Integer with new Integer(value). Now you should be able to add this Integer object to your list.
I hope you will be able to see how to go further. Check with instanceof (or something similar) while creating the table in the database.
For each different data type, use a different class. You can either use the default Java classes like String or Integer, or make your own depending on your further requirements. You can store all these objects in your ArrayList.
When retrieving your data, check which class is used and handle it appropriately.
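As a sketch of the regex-based approach, the helper below guesses a SQL type from a sample cell value and builds a CREATE TABLE statement; the type mapping and table name are illustrative assumptions, not a fixed MS Access convention:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class DynamicTableSketch {
    // Guess a SQL type from a sample cell value (hypothetical mapping).
    static String sqlType(String sample) {
        if (sample.matches("-?\\d+")) return "INTEGER";
        if (sample.matches("-?\\d*\\.\\d+")) return "DOUBLE";
        return "VARCHAR(255)";
    }

    // Build a CREATE TABLE statement from column names and sample values.
    static String createTableSql(String table, Map<String, String> samples) {
        StringBuilder sql = new StringBuilder("CREATE TABLE " + table + " (");
        boolean first = true;
        for (Map.Entry<String, String> col : samples.entrySet()) {
            if (!first) sql.append(", ");
            sql.append(col.getKey()).append(' ').append(sqlType(col.getValue()));
            first = false;
        }
        return sql.append(')').toString();
    }

    public static void main(String[] args) {
        Map<String, String> samples = new LinkedHashMap<>();
        samples.put("employee_name", "Alice");
        samples.put("employee_id", "1001");
        System.out.println(createTableSql("employees", samples));
        // The statement can then be executed through JDBC with Statement.executeUpdate.
    }
}
```

Column names coming from a spreadsheet should be sanitized before being spliced into DDL, since CREATE TABLE cannot take them as bind parameters.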
