Migrate postgresql JSON array of IDs to Mongodb ObjectID - java

I'm trying to migrate from a Postgres database to MongoDB. I have a JSON field in a Postgres table that holds key/value pairs. I need to fetch the values for a given JSON key (the values are Postgres IDs), convert them to Mongo ObjectIDs, and then map them onto the Mongo document.
The JSON field looks like this:
{"B":["956c5b0a5341d14c23ceed071bf136f8"]}
I've written a Java function that converts a Postgres ID column to a Mongo ID:
public String convertIDToMongo(String colName) {
    // ... encoding logic ...
}
This works when the field is explicitly available in the table and its datatype is not an array. For a JSON field holding an array, is there a way to fetch the set of values and convert each one into a Mongo ObjectID?
"Select json::json->'A' from table" ```
gives me the value ["06992a6fef0bcfc1ede998ac7da8f08b","7d1d55e1078191e53244c41d6b55b3a3","31571835c6600517f4e4c15b5144959b"], but I need some help converting this list of IDs to Mongo ObjectIDs. Below is how I convert a non-JSON field holding a single Postgres ID to a Mongo ID:
"Select " + convertIDToMongo('colName') + " as "_id", \n" +
Any help is greatly appreciated. Thanks.

You need to parse the JSON array, then call your converter for each item in the array. There are many JSON parsing options available.
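A minimal sketch of that approach, assuming Jackson is on the classpath and that convertIDToMongo can be applied to a single ID value (not only to a column name):
import com.fasterxml.jackson.databind.ObjectMapper;
import java.util.ArrayList;
import java.util.List;

// Takes the value returned by "Select json::json->'A' from table",
// e.g. ["06992a6f...","7d1d55e1...","31571835..."], and converts each entry.
public List<String> convertJsonIdsToMongo(String jsonArray) throws Exception {
    ObjectMapper mapper = new ObjectMapper();
    String[] postgresIds = mapper.readValue(jsonArray, String[].class);

    List<String> mongoIds = new ArrayList<>();
    for (String id : postgresIds) {
        mongoIds.add(convertIDToMongo(id)); // reuse the existing converter per element
    }
    return mongoIds;
}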

Related

Athena APIs Java Datum

I'm querying a table in Athena using the Athena APIs.
The table has a field of type struct.
How can I map this field into a Map<String, String> record?
All records come back as the Datum type, and the only way to get the values is getVarCharValue():
Datum record_value_datum = row_data.get(i);
String value = record_value_datum.getVarCharValue();
The values are not quoted, so the code above gives me something in this format:
key1=value1, key2=value2, key3=value3
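A rough sketch of one way to split that unquoted key=value string into a Map<String, String>, assuming the values themselves contain neither "=" nor ", ":
import java.util.HashMap;
import java.util.Map;

// Parses a struct rendered as "key1=value1, key2=value2, key3=value3"
// (optionally wrapped in braces) into a map.
public static Map<String, String> parseStruct(String raw) {
    Map<String, String> result = new HashMap<>();
    String body = raw.replaceAll("^\\{|\\}$", ""); // strip surrounding braces if present
    for (String pair : body.split(",\\s*")) {
        String[] kv = pair.split("=", 2);
        if (kv.length == 2) {
            result.put(kv[0].trim(), kv[1].trim());
        }
    }
    return result;
}

// Usage:
Map<String, String> record = parseStruct(record_value_datum.getVarCharValue());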

How to query this JSONB column in postgres?

So I have a table
Node_Mapping(location_id:UUID, node_ids: jsonb)
The corresponding POJO for this is
class NodeMapping{
UUID locationId;
Set<String> nodeIds;
}
Example data in table is
UUID1 : ['uuid100', 'uuid101']
UUID2 : ['uuid103', 'uuid101']
I want to write a query that finds all the locationIds whose node_ids contains 'uuid101'.
Please help me form the query.
You can use the ? operator, which checks whether a string exists as a top-level element of the array:
select *
from node_mapping
where node_ids ? 'uuid101';
This assumes that the column actually stores a valid JSON array, e.g. ["uuid100", "uuid101"], and not invalid JSON like UUID1 : ['uuid100', 'uuid101'].
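If you need to run this from Java, here is a minimal JDBC sketch, assuming connection is an existing java.sql.Connection from the standard PostgreSQL driver. It uses jsonb_exists, the function form of the ? operator, which avoids clashing with JDBC's ? parameter placeholders:
String sql = "select location_id from node_mapping where jsonb_exists(node_ids, ?)";
try (PreparedStatement ps = connection.prepareStatement(sql)) {
    ps.setString(1, "uuid101");
    try (ResultSet rs = ps.executeQuery()) {
        while (rs.next()) {
            // pgjdbc maps uuid columns to java.util.UUID
            UUID locationId = (UUID) rs.getObject("location_id");
            // collect locationId ...
        }
    }
}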

Cassandra Java Driver converting field names to lower case

I am using cassandra-driver-core version 3.5.1.
I have a table in Cassandra. All fields in the table are camel-cased and were created with double quotes. These fields need to stay camel-cased because my Solr schema uses camel casing and we have around 80-120 fields.
But when I insert my JSON documents into this table using the code below:
//jsonData is json document in String
Insert insertQuery = QueryBuilder.insertInto(keySpace, table).json(jsonData);
ResultSet resultSet = session.execute(insertQuery.toString());
Generated insert query:
INSERT INTO asda_search.GROCERIES JSON '{"catalogName":"some data", .....);
the Cassandra driver converts the field names in the insert statement to lower case, leading to the exception below:
Caused by: com.datastax.driver.core.exceptions.InvalidQueryException: JSON values map contains unrecognized column: catalogname
In my table the field name is catalogName.
What should I do so that the driver does not lower-case my fields?
Update:
I know I can create Query as below:
QueryBuilder.insertInto(keySpace, table).values(fieldNameList, fieldValueList)
While creating fieldNameList I can add quotes to the field names.
Is there any other solution?
While creating the JSON, add escaped double quotes around such field names.
Example: "\"catalogName\"".
If using Jackson: @JsonProperty("\"catalogName\"")
This worked for me.
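A sketch of what that looks like with Jackson (class and variable names are illustrative): the embedded quotes survive serialization, so the INSERT ... JSON statement sees the case-sensitive column name "catalogName".
public class Grocery {
    @JsonProperty("\"catalogName\"")
    private String catalogName;
    // getters/setters and the remaining camel-cased fields ...
}

// Serializes to {"\"catalogName\"":"some data", ...}
String jsonData = new ObjectMapper().writeValueAsString(grocery);
Insert insertQuery = QueryBuilder.insertInto(keySpace, table).json(jsonData);
session.execute(insertQuery.toString());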

Map JSONObject to SQL server

I know the title is quite cliched, but it is not about storing JSON data in SQL Server.
I have a JSONArray with JSONObjects with keys that match the column names in SQL Server 2012. I want to save the data to the database into the proper columns.
I know the obvious way to do this is to iterate through the JSONArray and save the values with individual insert commands. I was wondering if there is another way to do this.
I don't want to use T-SQL. I want to handle this from Java only.
Here is an example data that matches the format of my JSONArray:
[
{
"FEATURE":"A",
"OPTION":"92384",
"ERROR_TYPE":"MISSING",
"DESCRIPTION":"Feature A is missing the option 92384",
"SERIAL_NUMBER":"249752-23894"
},
{
"FEATURE":"B",
"OPTION":"0288394",
"ERROR_TYPE":"MISSING",
"DESCRIPTION":"Feature B is missing the option 0288394",
"SERIAL_NUMBER":"Y2394-20392Q"
}
]
My SQL Server table has columns matching these JSON keys.
What would be the best way to achieve this without looping through each element of the JSONArray?
Since you have added the java tag, I would convert the JSON to Java objects and save them with Hibernate. Here are two useful links on how to do that:
Json to Java
Hibernate example
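A hedged sketch of that approach with Jackson plus a Spring Data repository (the class, field, and repository names are illustrative, based on the JSON keys shown above):
public class FeatureError {
    @JsonProperty("FEATURE")       private String feature;
    @JsonProperty("OPTION")        private String option;
    @JsonProperty("ERROR_TYPE")    private String errorType;
    @JsonProperty("DESCRIPTION")   private String description;
    @JsonProperty("SERIAL_NUMBER") private String serialNumber;
    // JPA annotations (@Entity, @Id, @Column) and getters/setters omitted
}

// Deserialize the whole array in one call, then persist the list.
List<FeatureError> errors = new ObjectMapper()
        .readValue(jsonArrayString, new TypeReference<List<FeatureError>>() {});
featureErrorRepository.saveAll(errors); // hypothetical Spring Data repository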
DECLARE @testJson NVARCHAR(4000) = N'[{"id":2,"name":"n2"},{"id":1,"name":"n1"}]';

INSERT INTO test_table
SELECT *
FROM OPENJSON(@testJson) WITH (
    id   INT          N'$.id',
    name VARCHAR(200) N'$.name'
);
Update
Use Java. Convert jsonStr -> jsonObject and get its keys (any JSON library works; Gson is used here):
String yourJson = "{\"id\":0, \"name\":\"n0\"}";
JsonParser parser = new JsonParser();
JsonObject jsonObject = parser.parse(yourJson).getAsJsonObject();
Set<String> keys = jsonObject.keySet();
Then use NamedParameterJdbcTemplate and MapSqlParameterSource (spring-jdbc) to create the SQL and set the parameters dynamically.
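A rough sketch of that spring-jdbc step, continuing from the jsonObject and keys above (variable names are illustrative; the keys are concatenated into the SQL string, so they must be trusted):
String columns = String.join(", ", keys);
String placeholders = keys.stream()
        .map(k -> ":" + k)
        .collect(Collectors.joining(", "));
String sql = "INSERT INTO test_table (" + columns + ") VALUES (" + placeholders + ")";

MapSqlParameterSource params = new MapSqlParameterSource();
for (String key : keys) {
    params.addValue(key, jsonObject.get(key).getAsString());
}
new NamedParameterJdbcTemplate(dataSource).update(sql, params);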
PS. I hate ORM in java.

Spring Data mongo to insert null values to DB

I am using Spring Data Mongo to insert a record into MongoDB.
Here is my code:
mongoTemplate.save(person,"personCollection");
Here is my Person object:
public class Person implements Serializable {
int age;
String address;
String name;
//Getters and setters including hashcode and equals
}
address is null here; after inserting the record, the document in the collection is populated with only age and name.
I know that MongoDB treats a null value and a missing key as the same thing, but my requirement is to populate address:null as well, so the schema stays consistent. How do I do this with Spring Data Mongo?
current o/p: {"age":21,"name":"john Doe"}
expected o/p: {"age":21,"name":"john Doe","address":null}
NoSQL databases work differently from an RDBMS.
The document {"age":21,"name":"john Doe"} is treated the same as {"age":21,"name":"john Doe","address":null}.
Instead of storing keys with null values, it is better not to store the key at all; this improves the performance of your reads/updates against the DB.
However, if your use case still demands storing null for whatever reason, convert your POJO to a BSONObject and then persist the BSONObject in MongoDB.
Here is an example (it is only a workaround to get things going):
BSONObject personBsonObj = BasicDBObjectBuilder.start()
.add("name","John Doe")
.add("age",21)
.add("address",null).get();
If you are using Spring Data Mongo, use
mongoTemplate.insert(personBsonObj,"personCollection");
Document in the db:
db.personCollection.findOne().pretty();
{"age":21,"name":"John Doe","address":null}
I've solved this problem using the code below:
final Document document = new Document();
document.put("test1", "test1");
document.put("test2", null);
document.put("test3", "test3");
mongoTemplate.getCollection("your-collection-name").insertOne(document);
Here, instead of using BSONObject, I used a Document object and it worked fine.
Document inserted in DB
{
"_id" : ObjectId("some-id"),
"test1" : "test1",
"test2" : null,
"test3" : "test3"
}
