How do I add another entry to a MySQL column of type JSON?
In my table I have one column of JSON type that is NULL.
I am using this command to update the value:
update myTable
set columnJson = '[{"id" : "someId1" , "name": "someNamme2"}
,{"id" : "someId2", "name": "someNamme2"}]'
where id = "rowID1";
This is working fine, and I have two entries.
Now I want to add one more entry to that array.
I am using the same command:
update myTable
set columnJson = '[{"id" : "someId3", "name": "someNamme3"}]'
where id = "rowID1";
But the previous value is getting washed away. Is there any way I can add n number of values? I am doing this in Java.
You need JSON functions like JSON_ARRAY_APPEND; see the MySQL documentation for more functions to manipulate JSON.
JSON needs some special functions which have to be learned. We usually recommend not using JSON, because in a normalized table you can use all the SQL functionality that exists; JSON always takes a moderate learning effort.
update myTable
set columnJson = '[{"id" : "someId1" , "name": "someNamme2"}
,{"id" : "someId2", "name": "someNamme2"}]'
where id = "rowID1";
Rows matched: 1 Changed: 1 Warnings: 0
update myTable
set columnJson = JSON_ARRAY_APPEND(columnJson, '$[0]', '{"id" : "someId3", "name": "someNamme3"}')
Rows matched: 1 Changed: 1 Warnings: 0
SELECT * FROM myTable

id     | columnJson
rowID1 | [[{"id": "someId1", "name": "someNamme2"}, "{\"id\" : \"someId3\", \"name\" : \"someNamme3\"}"], {"id": "someId2", "name": "someNamme2"}]

fiddle
And if you want another position, you change the path that specifies where it should be inserted:
update myTable
set columnJson = '[{"id" : "someId1" , "name": "someNamme2"}
,{"id" : "someId2", "name": "someNamme2"}]'
where id = "rowID1";
Rows matched: 1 Changed: 1 Warnings: 0
update myTable
set columnJson = JSON_ARRAY_APPEND(columnJson, '$[1]', '{"id" : "someId3", "name": "someNamme3"}')
Rows matched: 1 Changed: 1 Warnings: 0
SELECT * FROM myTable

id     | columnJson
rowID1 | [{"id": "someId1", "name": "someNamme2"}, [{"id": "someId2", "name": "someNamme2"}, "{\"id\" : \"someId3\", \"name\" : \"someNamme3\"}"]]

fiddle
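Since the asker is working from Java, the same append can also be done client-side with a read-modify-write cycle. A minimal sketch of that logic (shown in Python for brevity, using the sample values from the question):

```python
import json

# Column value as stored after the first UPDATE (sample data from the question).
column_json = '[{"id": "someId1", "name": "someNamme2"}, {"id": "someId2", "name": "someNamme2"}]'

# Read-modify-write: parse the array, append the new element, serialize back.
items = json.loads(column_json)
items.append({"id": "someId3", "name": "someNamme3"})
new_value = json.dumps(items)

# new_value would then be written back with a parameterized statement, e.g.:
#   UPDATE myTable SET columnJson = ? WHERE id = 'rowID1'
print(new_value)
```

This keeps the top-level array flat, at the cost of an extra round trip compared to doing the append in SQL.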
I have a table of media files:
id title date
The date is in epoch format, and for example, I have 3 photos:
1 xxxxx 1644777107349(6:31:47.349)
2 wwwww 1644777110138(6:31:50.138)
3 zzzzz 1644777117453(6:31:57.453)
I want to get the count of photos that were taken with a gap of 5 to 30 seconds between them. For example, photos 2 and 3 were taken with a gap of 7 seconds, so I want to get 2. I want to compare every row with the consecutive row.
This is the SQL query:
SELECT Count(*)
FROM (SELECT A._id
FROM media_files A
INNER JOIN media_files B
ON( Abs(A.added_time - B.added_time) >= 5000 )
AND A._id != B._id
AND A.type = 'image'
AND B.type = 'image'
WHERE Abs(A.added_time - B.added_time) <= 30000
GROUP BY A._id)
And for the above example, I always get 3 instead of 2, any idea what is the problem?
Here is a test setup and a query which returns only the images taken between 5 and 30 seconds after the previous one. NB: I'm working in MySQL, so where I have put 5 and 30 seconds, in SQLite you may need to put 5000 and 30000 (milliseconds).
CREATE TABLE media_files (
id INT,
title VARCHAR(25),
added_date DATETIME,
type VARCHAR(25)
);
INSERT INTO media_files VALUES ( 1, 'xxxxx', '2022-02-13 6:30:47.349','image');
INSERT INTO media_files VALUES ( 1, 'xxxxx', '2022-02-13 6:31:27.349','image');
INSERT INTO media_files VALUES ( 1, 'xxxxx', '2022-02-13 6:31:47.349','image');
INSERT INTO media_files VALUES ( 2, 'wwwww', '2022-02-13 6:31:50.138','image');
INSERT INTO media_files VALUES ( 3, 'zzzzz', '2022-02-13 6:31:57.453','image');
SELECT id,
added_date ad,
added_date -
( SELECT MAX(added_date)
FROM media_files m
WHERE m.added_date < mf.added_date ) DIF
FROM media_files mf
WHERE ( added_date -
( SELECT MAX(added_date)
FROM media_files m
WHERE m.added_date < mf.added_date )) > 5
AND ( added_date -
( SELECT MAX(added_date)
FROM media_files m
WHERE m.added_date < mf.added_date )) < 30;
id | ad | DIF
-: | :------------------ | --:
1 | 2022-02-13 06:31:47 | 20
3 | 2022-02-13 06:31:57 | 7
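The same "gap from the previous row" logic can be sanity-checked outside SQL. A small sketch using the epoch-millisecond timestamps from the question, counting photos taken 5 to 30 seconds after their predecessor (the same rule the query above applies):

```python
# Epoch-millisecond timestamps of the three photos from the question.
added_times = [1644777107349, 1644777110138, 1644777117453]

# Count photos taken between 5 and 30 seconds after the previous photo.
ordered = sorted(added_times)
count = sum(1 for prev, cur in zip(ordered, ordered[1:])
            if 5000 <= cur - prev <= 30000)
# Photo 3 follows photo 2 by ~7.3 s (counted); photo 2 follows photo 1
# by only ~2.8 s (not counted).
print(count)  # -> 1
```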
I have a database with the following schema, using PostgreSQL and the PostGIS extension:
table_id | id | mag | time | felt | tsunami | geom
I have the following SQL to select some rows and return those columns as JSON:
SELECT ROW_TO_JSON(t) as properties
FROM (
SELECT id, mag, time, felt, tsunami FROM earthquakes
) t
I would like to create an SQL statement that returns the table_id, properties and geom, like:
SELECT table_id, properties, GeometryType(geom)
from earthquakes
How can I return the table_id and geom with properties as a JSON?
Edit:
I've created this SQL:
SELECT table_id,
row_to_json((SELECT d FROM (SELECT id, mag, time, felt, tsunami ) d)) AS properties,
GeometryType(geom)
FROM earthquakes ORDER BY table_id ASC;
But when I do a request with postman, it returns this:
[
{
"table_id": 1,
"properties": {
"type": "json",
"value": "{\"id\" : \"ak16994521\", \"mag\" : 2.3}"
}
},
...
]
How can I return the values as an object?
My expected result should be:
[
{
"table_id": 1,
"properties": {"id" : "ak16994521", "mag" : 2.3}
},
...
]
Java Method:
public List<Map<String, Object>> readTable(String nameTable) {
try {
String SQL = "SELECT table_id, GeometryType(geom) FROM " + nameTable + " ORDER BY table_id ASC;";
return jdbcTemplate.queryForList(SQL);
} catch( BadSqlGrammarException error) {
log.info("ERROR READING TABLE: " + nameTable);
return null;
}
}
With this code it returns this JSON:
[
{
"table_id": 1,
"geometrytype": "POINT"
},
{
"table_id": 2,
"geometrytype": "POINT"
},
....
]
My expected result should be:
[
{
"table_id": 1,
"properties": {"id" : "ak16994521", "mag" : 2.3, "time": 1507425650893, "felt": "null", "tsunami": 0 },
"geometrytype": "POINT"
},
...
]
Why not do the whole conversion to JSON on the database?
SELECT json_agg(x) as json_values FROM (
SELECT
table_id,
row_to_json((select d from (select id, mag, time, felt, tsunami) d)) as properties,
GeometryType(geom)
FROM earthquakes
ORDER BY table_id ASC
) x;
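If the driver still hands the properties column back as a JSON string rather than an object (as in the Postman output in the question), it can be parsed before the response is serialized. A minimal sketch of that fix-up (in Python; the row values follow the question):

```python
import json

# Row as a driver might return it: the JSON column arrives as a string.
row = {"table_id": 1,
       "properties": '{"id" : "ak16994521", "mag" : 2.3}',
       "geometrytype": "POINT"}

# Parse the stringified JSON into a real object before serializing the response.
if isinstance(row["properties"], str):
    row["properties"] = json.loads(row["properties"])
print(json.dumps(row))
```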
I found this SQL that creates GeoJSON, so when I call the method it returns perfectly as a serialized string.
SELECT row_to_json(fc) FROM (
SELECT 'FeatureCollection' As type,
array_to_json(array_agg(f)) As features FROM
(SELECT 'Feature' As type,
ST_AsGeoJSON(lg.geom)::json As geometry,
row_to_json((SELECT l FROM (SELECT id, mag, time, felt, tsunami) As l )) As properties FROM terremotos As lg ) As f ) As fc;
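For reference, the FeatureCollection shape that query produces can also be assembled client-side. A sketch (field names follow the question; the geometry value here is a hypothetical point, standing in for what ST_AsGeoJSON would return):

```python
import json

# One hypothetical row; geom is already GeoJSON, as ST_AsGeoJSON returns it.
rows = [{"id": "ak16994521", "mag": 2.3, "time": 1507425650893, "felt": None,
         "tsunami": 0,
         "geom": {"type": "Point", "coordinates": [-147.4, 61.5]}}]

# Assemble the same structure the SQL builds: FeatureCollection -> Features.
features = [{"type": "Feature",
             "geometry": r["geom"],
             "properties": {k: r[k] for k in ("id", "mag", "time", "felt", "tsunami")}}
            for r in rows]
feature_collection = {"type": "FeatureCollection", "features": features}
print(json.dumps(feature_collection))
```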
I am using couchbase java-client version 2.5.4 and JDK 1.8. A Couchbase SearchQuery doesn't return values when the match phrase is "all".
users:
firstname lastname age gender
Ani Mary 12 Female
John William 16 Male
Rohan all 12 Male
John Sam 13 Male
When I run the following query in the Couchbase cluster UI,
select * from users where lastname ="all" and firstname="rohan";
it returns a row, but it doesn't return a value when I use a MatchPhrase query:
SearchQueryResult result = bucket.query(new SearchQuery("users_fts",
SearchQuery.matchPhrase("all")).fields("lastname"));
The expected result is the following:
firstname lastname age gender
Rohan all 12 Male
FTS index:
{
"type": "fulltext-index",
"name": "users_fts",
"uuid": "4dcbdd4c3399403a",
"sourceType": "couchbase",
"sourceName": "users",
"sourceUUID": "199658ae21c89cd0c33add8325ca88aa",
"planParams": {
"maxPartitionsPerPIndex": 171
},
"params": {
"doc_config": {
"mode": "type_field",
"type_field": "type"
},
"mapping": {
"default_analyzer": "standard",
"default_datetime_parser": "dateTimeOptional",
"default_field": "_all",
"default_mapping": {
"dynamic": true,
"enabled": true
},
"default_type": "_default",
"index_dynamic": true,
"store_dynamic": false
},
"store": {
"kvStoreName": "mossStore"
}
},
"sourceParams": {}
}
How can I write code to get this result for the string "all" with a Couchbase SearchQuery?
I need to save JSON data to an Oracle database. The JSON looks like this (see below), but it doesn't stay in the same format: I might add additional nodes or modify existing ones. So is it possible to create or modify Oracle tables dynamically to add more columns? I was going to do that with Java: create a Java class matching the JSON, convert the JSON to a Java object, and persist it to the table. But how can I modify the Java class dynamically? Or would it be a better idea to do that with PL/SQL? The JSON comes from a mobile device to a REST web service.
{"menu": {
"id": "file",
"value": "File",
"popup": {
"menuitem": [
{"value": "New", "onclick": "CreateNewDoc()"},
{"value": "Open", "onclick": "OpenDoc()"},
{"value": "Close", "onclick": "CloseDoc()"}
]
}
}}
I would suggest that you avoid creating new columns, and instead create a new table that will contain one entry for each of what would have been the new columns. I'm assuming here that the new columns would be menu items. So you would have a "menu" table with these columns:
id value
and you would have a "menuitem" table which would contain one entry for each of your menu items:
id value onclick
So instead of adding columns dynamically, you would be adding records.
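That normalization can be sketched with SQLite (table and column names are illustrative, following the menu/menuitem split described above):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# "menu" holds the scalar fields; "menuitem" holds one row per item.
conn.execute("CREATE TABLE menu (id TEXT PRIMARY KEY, value TEXT)")
conn.execute("CREATE TABLE menuitem (menu_id TEXT, value TEXT, onclick TEXT)")
conn.execute("INSERT INTO menu VALUES ('file', 'File')")
conn.executemany("INSERT INTO menuitem VALUES ('file', ?, ?)",
                 [("New", "CreateNewDoc()"),
                  ("Open", "OpenDoc()"),
                  ("Close", "CloseDoc()")])
# A new menu item in the incoming JSON is an INSERT, not an ALTER TABLE.
items = conn.execute(
    "SELECT value, onclick FROM menuitem WHERE menu_id = 'file'").fetchall()
print(items)
```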
I suggested in the comments changing your approach to a NoSQL database like MongoDB. However, if you still feel you need to use a relational database, maybe the EAV model can point you in the right direction.
In summary, you would have a "helper" table that stores which columns an entity has and their types.
You cannot modify a Java class at runtime, but you can define the class around a Map and implement the logic to add the desired columns.
Magento, a PHP product, uses EAV in its database.
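A minimal EAV sketch in SQLite (all names here are illustrative): each attribute becomes a row, so a "new column" is just a new row.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE entity (id INTEGER PRIMARY KEY, kind TEXT)")
conn.execute("CREATE TABLE attribute (entity_id INTEGER, name TEXT, value TEXT)")
conn.execute("INSERT INTO entity VALUES (1, 'menu')")
conn.executemany("INSERT INTO attribute VALUES (1, ?, ?)",
                 [("id", "file"), ("value", "File")])
# A node added to the JSON later is just another attribute row -
# no ALTER TABLE and no Java class change needed.
conn.execute("INSERT INTO attribute VALUES (1, 'popup', '{...}')")
attrs = dict(conn.execute(
    "SELECT name, value FROM attribute WHERE entity_id = 1").fetchall())
print(attrs)
```

The trade-off is that queries against EAV data require pivoting rows back into columns, which is why the normalized-table advice above is usually preferred when the schema is stable.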
MongoDB may be your best choice, or you could have a large TEXT field and only extract the columns you are likely to search on.
However, you can CREATE TABLE for additional normalised data and ALTER TABLE to add a column. The latter can be particularly expensive.
Use https://github.com/zolekode/json-to-tables/.
Here you go:
import json
from core.extent_table import ExtentTable
from core.table_maker import TableMaker
menu = {
"id": "file",
"value": "File",
"popup": {
"menuitem": [
{"value": "New", "onclick": "CreateNewDoc()"},
{"value": "Open", "onclick": "OpenDoc()"},
{"value": "Close", "onclick": "CloseDoc()"}
]
}
}
menu = json.dumps(menu)
menu = json.loads(menu)
extent_table = ExtentTable()
table_maker = TableMaker(extent_table)
table_maker.convert_json_object_to_table(menu, "menu")
table_maker.show_tables(8)
table_maker.save_tables("menu", export_as="sql", sql_connection="your_connection")
Output:
SHOWING TABLES :D
menu
ID id value popup
0 0 file File 0
1 1 None None None
____________________________________________________
popup
ID
0 0
1 1
____________________________________________________
popup_?_menuitem
ID PARENT_ID is_scalar scalar
0 0 0 False None
1 1 0 False None
2 2 0 False None
____________________________________________________
popup_?_menuitem_$_onclick
ID value onclick PARENT_ID
0 0 New CreateNewDoc() 0
1 1 Open OpenDoc() 1
2 2 Close CloseDoc() 2
3 3 None None None
____________________________________________________
This can be done in SQL Server:
This code takes a JSON input string and automatically generates
SQL Server CREATE TABLE statements to make it easier
to convert serialized data into a database schema.
It is not perfect, but should provide a decent starting point when starting
to work with new JSON files.
SET NOCOUNT ON;
DECLARE
@JsonData nvarchar(max) = '
{
"Id" : 1,
"IsActive":true,
"Ratio": 1.25,
"ActivityArray":[true,false,true],
"People" : ["Jim","Joan","John","Jeff"],
"Places" : [{"State":"Connecticut", "Capitol":"Hartford", "IsExpensive":true},{"State":"Ohio","Capitol":"Columbus","MajorCities":["Cleveland","Cincinnati"]}],
"Thing" : { "Type":"Foo", "Value" : "Bar" },
"Created_At":"2018-04-18T21:25:48Z"
}',
@RootTableName nvarchar(4000) = N'AppInstance',
@Schema nvarchar(128) = N'dbo',
@DefaultStringPadding smallint = 20;
DROP TABLE IF EXISTS ##parsedJson;
WITH jsonRoot AS (
SELECT
0 as parentLevel,
CONVERT(nvarchar(4000),NULL) COLLATE Latin1_General_BIN2 as parentTableName,
0 AS [level],
[type] ,
#RootTableName COLLATE Latin1_General_BIN2 AS TableName,
[key] COLLATE Latin1_General_BIN2 as ColumnName,
[value],
ROW_NUMBER() OVER (ORDER BY (SELECT 1)) AS ColumnSequence
FROM
OPENJSON(@JsonData, '$')
UNION ALL
SELECT
jsonRoot.[level] as parentLevel,
CONVERT(nvarchar(4000),jsonRoot.TableName) COLLATE Latin1_General_BIN2,
jsonRoot.[level]+1,
d.[type],
CASE WHEN jsonRoot.[type] IN (4,5) THEN CONVERT(nvarchar(4000),jsonRoot.ColumnName) ELSE jsonRoot.TableName END COLLATE Latin1_General_BIN2,
CASE WHEN jsonRoot.[type] IN (4) THEN jsonRoot.ColumnName ELSE d.[key] END,
d.[value],
ROW_NUMBER() OVER (ORDER BY (SELECT 1)) AS ColumnSequence
FROM
jsonRoot
CROSS APPLY OPENJSON(jsonRoot.[value], '$') d
WHERE
jsonRoot.[type] IN (4,5)
), IdRows AS (
SELECT
-2 as parentLevel,
null as parentTableName,
-1 as [level],
null as [type],
TableName as Tablename,
TableName+'Id' as columnName,
null as [value],
0 as columnsequence
FROM
(SELECT DISTINCT tablename FROM jsonRoot) j
), FKRows AS (
SELECT
DISTINCT -1 as parentLevel,
null as parentTableName,
-1 as [level],
null as [type],
TableName as Tablename,
parentTableName+'Id' as columnName,
null as [value],
0 as columnsequence
FROM
(SELECT DISTINCT tableName,parentTableName FROM jsonRoot) j
WHERE
parentTableName is not null
)
SELECT
*,
CASE [type]
WHEN 1 THEN
CASE WHEN TRY_CONVERT(datetime2, [value], 127) IS NULL THEN 'nvarchar' ELSE 'datetime2' END
WHEN 2 THEN
CASE WHEN TRY_CONVERT(int, [value]) IS NULL THEN 'float' ELSE 'int' END
WHEN 3 THEN
'bit'
END COLLATE Latin1_General_BIN2 AS DataType,
CASE [type]
WHEN 1 THEN
CASE WHEN TRY_CONVERT(datetime2, [value], 127) IS NULL THEN MAX(LEN([value])) OVER (PARTITION BY TableName, ColumnName) + @DefaultStringPadding ELSE NULL END
WHEN 2 THEN
NULL
WHEN 3 THEN
NULL
END AS DataTypePrecision
INTO ##parsedJson
FROM jsonRoot
WHERE
[type] in (1,2,3)
UNION ALL SELECT IdRows.parentLevel, IdRows.parentTableName, IdRows.[level], IdRows.[type], IdRows.TableName, IdRows.ColumnName, IdRows.[value], -10 AS ColumnSequence, 'int IDENTITY(1,1) PRIMARY KEY' as datatype, null as datatypeprecision FROM IdRows
UNION ALL SELECT FKRows.parentLevel, FKRows.parentTableName, FKRows.[level], FKRows.[type], FKRows.TableName, FKRows.ColumnName, FKRows.[value], -9 AS ColumnSequence, 'int' as datatype, null as datatypeprecision FROM FKRows
-- For debugging:
-- SELECT * FROM ##parsedJson ORDER BY ParentLevel, level, tablename, columnsequence
DECLARE @CreateStatements nvarchar(max);
SELECT
@CreateStatements = COALESCE(@CreateStatements + CHAR(13) + CHAR(13), '') +
'CREATE TABLE ' + @Schema + '.' + TableName + CHAR(13) + '(' + CHAR(13) +
STRING_AGG( ColumnName + ' ' + DataType + ISNULL('('+CAST(DataTypePrecision AS nvarchar(20))+')','') + CASE WHEN DataType like '%PRIMARY KEY%' THEN '' ELSE ' NULL' END, ','+CHAR(13)) WITHIN GROUP (ORDER BY ColumnSequence)
+ CHAR(13)+')'
FROM
(SELECT DISTINCT
j.TableName,
j.ColumnName,
MAX(j.ColumnSequence) AS ColumnSequence,
j.DataType,
j.DataTypePrecision,
j.[level]
FROM
##parsedJson j
CROSS APPLY (SELECT TOP 1 ParentTableName + 'Id' AS ColumnName FROM ##parsedJson p WHERE j.TableName = p.TableName ) p
GROUP BY
j.TableName, j.ColumnName,p.ColumnName, j.DataType, j.DataTypePrecision, j.[level]
) j
GROUP BY
TableName
PRINT @CreateStatements;
You can find the solution on https://bertwagner.com/posts/converting-json-to-sql-server-create-table-statements/
Also, JSON can be converted to a POJO class in Java:
package com.cooltrickshome;

import java.io.File;
import java.io.IOException;
import java.net.URL;

import org.jsonschema2pojo.DefaultGenerationConfig;
import org.jsonschema2pojo.GenerationConfig;
import org.jsonschema2pojo.Jackson2Annotator;
import org.jsonschema2pojo.SchemaGenerator;
import org.jsonschema2pojo.SchemaMapper;
import org.jsonschema2pojo.SchemaStore;
import org.jsonschema2pojo.SourceType;
import org.jsonschema2pojo.rules.RuleFactory;

import com.sun.codemodel.JCodeModel;

public class JsonToPojo {

    /**
     * @param args
     */
    public static void main(String[] args) {
        String packageName = "com.cooltrickshome";
        File inputJson = new File("." + File.separator + "input.json");
        File outputPojoDirectory = new File("." + File.separator + "convertedPojo");
        outputPojoDirectory.mkdirs();
        try {
            new JsonToPojo().convert2JSON(inputJson.toURI().toURL(), outputPojoDirectory,
                    packageName, inputJson.getName().replace(".json", ""));
        } catch (IOException e) {
            System.out.println("Encountered issue while converting to pojo: " + e.getMessage());
            e.printStackTrace();
        }
    }

    public void convert2JSON(URL inputJson, File outputPojoDirectory, String packageName,
            String className) throws IOException {
        JCodeModel codeModel = new JCodeModel();
        URL source = inputJson;
        GenerationConfig config = new DefaultGenerationConfig() {
            @Override
            public boolean isGenerateBuilders() { // set config option by overriding method
                return true;
            }

            @Override
            public SourceType getSourceType() {
                return SourceType.JSON;
            }
        };
        SchemaMapper mapper = new SchemaMapper(
                new RuleFactory(config, new Jackson2Annotator(config), new SchemaStore()),
                new SchemaGenerator());
        mapper.generate(codeModel, className, packageName, source);
        codeModel.build(outputPojoDirectory);
    }
}