Unable to store complete JSON field in MySQL table field - Java
I am getting large JSON data in an API response and am trying to store this JSON response in a local MySQL table, but I am unable to store the complete response. Please find the JSON info below.
Sample API JSON response:
{
"responseCode": 200,
"date": "2020-06-03",
"message": "Success",
"couponDetails": {
"total": 14949,
"codes": "35033769,35441136,35803675,34407176,34717909,34950692,35059148,35452352,35688911,35904465,35904658,35904753,35904824,35904942,35905306,35905318,35905434,35905673,35906615,35907029,35907154,35907222,35907345,35907592,35907683,35907951,35908161,35908194,35908206,34664348,34664436,34665057,34665072,34665768,34665950,34666051,34666110,34666879,34667228,34668101,34670133,34670162,34670259,34670661,34670687,34670994,34671179,34671296,34672207,34672276,34672631,34672747,34673619,34673709,34675355,34676588,34677690,34678019,34679260,34679468,34680550,34680694,34680838,34683321,34684752,34684796,34685198,34685826,34686220,34686276,34351922,34352193,34352369,34352553,34353629,34353971,34355064,34355541,34355625,34356802,34357668,34357869,34357922,34360451,34360500,34360764,34361049,34361174,34361315,34362337,34362412,34363370,34364187,34365025,34365188,34365415,34365904,34366777,34366877,34367361,34368025,34368078,35542974,35543013,35543084,35268238,35268397,35268774,35269689,35269933,35270038,35250597,35063719,35064231,35064237,35270577,35270705,35270969,35064514,35064963,35065129,35251645,35251660,35251798,35253022,35253300,35272389,35272446,35272519,35272640,35272641,35273596,35273716,35423127,35423184,35423372,35424244,35425607,35485524,35486647,35486711,35486970,35487111,35470199,35470485,35488099,35488145,35488270,35490204,35534378,35535484,35535520,35535559,35535601,35535818,21979363,21508096,26237385,24734847,22263784,26889428,29292212,20415646,21836743,20300178,21831783,21198543,23739734,29773862,20715551,25488915,28894112,26536357,26695866,27133857,29133336,28763373,21850298,21990790,27757421,2421785723"
}
}
In my local DB table I am only able to insert the information below, which is not complete:
{
"responseCode": 200,
"date": "2020-06-03",
"message": "Success",
"couponDetails": {
"total": 14949,
"codes": "35033769,35441136,35803675,34407176,34717909,34950692,35059148,35452352,35688911,35904465,35904658,35904753,35904824,35904942,35905306,
MySQL table structure:
CREATE TABLE `bookdata_codeinfo_history` (
`generated_date` date DEFAULT NULL,
`book_code` longtext,
`service` varchar(45) DEFAULT NULL
) ENGINE=InnoDB DEFAULT CHARSET=utf8;
I want to store the JSON in the 'book_code' field, but only part of the information is being stored. I am using MySQL version 5.7.13.
Please tell me how to resolve this issue.
I think that rather than attempting to store the book codes as a single string, a better way to model the table would be to break the book_code string into individual rows.
This would help with searching the data and with future extensibility of the data model.
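For example, a minimal sketch of that normalized design in plain JDBC (the table book_code_rows and its columns are hypothetical, not from the original post):

import java.sql.Connection;
import java.sql.Date;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class CodeRowInserter {
    // Splits the comma-separated "codes" string and batch-inserts one row per code.
    public static void insertCodes(Connection conn, Date generatedDate, String codes)
            throws SQLException {
        String sql = "INSERT INTO book_code_rows (generated_date, code) VALUES (?, ?)";
        try (PreparedStatement ps = conn.prepareStatement(sql)) {
            for (String code : codes.split(",")) {
                ps.setDate(1, generatedDate);
                ps.setString(2, code.trim());
                ps.addBatch();
            }
            ps.executeBatch(); // one round trip for all rows
        }
    }
}

With one code per row, a question like "does coupon 35033769 exist?" becomes a simple indexed lookup instead of a substring search over a huge text blob.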
That's really strange: a LONGTEXT column can hold up to 4,294,967,295 bytes (~4 GB), so kindly recheck your table structure.
Or
Put these JSONs in files and save the file path in the database.
Or
Split the codes field into multiple rows, something like idrequest and code, where every row contains one code value.
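Before restructuring anything, it may be worth confirming whether the value is actually truncated in the table or only in the client tool used to view it (many GUI clients cut off long text for display). A rough JDBC check, with placeholder connection details and a hypothetical helper for fetching the payload:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.Statement;

public class TruncationCheck {
    public static void main(String[] args) throws Exception {
        String json = readApiResponseSomehow(); // hypothetical: the full JSON payload
        try (Connection conn = DriverManager.getConnection(
                "jdbc:mysql://localhost:3306/mydb", "user", "pass")) {
            try (PreparedStatement ps = conn.prepareStatement(
                    "INSERT INTO bookdata_codeinfo_history (generated_date, book_code, service) "
                            + "VALUES (CURDATE(), ?, ?)")) {
                ps.setString(1, json);
                ps.setString(2, "couponService");
                ps.executeUpdate();
            }
            try (Statement st = conn.createStatement();
                 ResultSet rs = st.executeQuery(
                         "SELECT LENGTH(book_code) FROM bookdata_codeinfo_history "
                                 + "ORDER BY generated_date DESC LIMIT 1")) {
                if (rs.next()) {
                    // If the stored byte length matches the sent string, the column is
                    // fine and the truncation is happening in the viewer or in the code
                    // that builds the string before the insert.
                    System.out.println("chars sent: " + json.length()
                            + ", bytes stored: " + rs.getLong(1));
                }
            }
        }
    }

    private static String readApiResponseSomehow() {
        throw new UnsupportedOperationException("fetch the API response here");
    }
}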
Related
How to save JSON as a CLOB or JSON type in database
In my application, I would like to save JSON, for example like the sample below, in the database without specific fields in a model. The posted JSON should be saved as a CLOB or in a JSON format. Can you briefly explain how to do it? What should my @RestController and Model look like? { "color": "Blue", "miles": 100, "vin": "1234" }
In PostgreSQL you can use the jsonb data type to store it in the DB. In Oracle you have a number of options, such as BLOB, CLOB, or VARCHAR2. https://blogs.oracle.com/jsondb/storing-json-in-the-oracle-database
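As a rough Spring Boot illustration (entity, repository, and endpoint names are invented for this sketch), the incoming JSON can be taken as a raw string and mapped with @Lob, letting the JPA dialect pick CLOB on Oracle or TEXT on PostgreSQL:

import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.GenerationType;
import javax.persistence.Id;
import javax.persistence.Lob;
import org.springframework.data.jpa.repository.JpaRepository;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

@Entity
public class CarDocument {
    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    private Long id;

    @Lob // maps to CLOB on Oracle, TEXT on PostgreSQL by default
    private String payload; // the raw JSON, stored as-is

    public Long getId() { return id; }
    public void setPayload(String payload) { this.payload = payload; }
}

interface CarDocumentRepository extends JpaRepository<CarDocument, Long> {}

@RestController
@RequestMapping("/cars")
class CarDocumentController {
    private final CarDocumentRepository repository;

    CarDocumentController(CarDocumentRepository repository) {
        this.repository = repository;
    }

    @PostMapping
    public Long save(@RequestBody String json) { // the body is kept as a raw string
        CarDocument doc = new CarDocument();
        doc.setPayload(json);
        return repository.save(doc).getId();
    }
}

If the target column should be PostgreSQL's jsonb instead, the column definition can be overridden (with a suitable type mapping), but the raw-string @Lob approach above is the most portable.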
How to delete a full column from a spreadsheet using Java?
I am building an application using Google Spreadsheets to store data. I am adding and updating columns dynamically, but I cannot figure out how to delete a complete column at once. Could anyone please help me delete a column from a spreadsheet? I can delete a cell, but I want to delete all cells that fall under a particular column, so that deleting column 'A' makes the neighboring column 'B' take its place, like the 'Delete column' option you get by right-clicking a column in a Drive spreadsheet. Can anyone help me do this? Any API or link?
You may want to check rows and columns operations, wherein the Sheets API allows you to insert, remove, and manipulate rows and columns in sheets. For deleting rows and columns, you may try the sample spreadsheets.batchUpdate request, wherein the second request deletes columns B:D. The request protocol is shown below:

POST https://sheets.googleapis.com/v4/spreadsheets/spreadsheetId:batchUpdate

{
  "requests": [
    {
      "deleteDimension": {
        "range": {
          "sheetId": sheetId,
          "dimension": "ROWS",
          "startIndex": 0,
          "endIndex": 3
        }
      }
    },
    {
      "deleteDimension": {
        "range": {
          "sheetId": sheetId,
          "dimension": "COLUMNS",
          "startIndex": 1,
          "endIndex": 4
        }
      }
    }
  ]
}

You may also check the Updating Spreadsheets guide for more information on how to implement a batch update in different languages using the Google API client libraries, including Java. Hope that helps!
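For a Java client specifically, here is a minimal sketch of the same column deletion with the Sheets v4 client library (assuming sheetsService is an already-authorized com.google.api.services.sheets.v4.Sheets instance):

import com.google.api.services.sheets.v4.Sheets;
import com.google.api.services.sheets.v4.model.BatchUpdateSpreadsheetRequest;
import com.google.api.services.sheets.v4.model.DeleteDimensionRequest;
import com.google.api.services.sheets.v4.model.DimensionRange;
import com.google.api.services.sheets.v4.model.Request;
import java.io.IOException;
import java.util.Collections;

public class DeleteColumnExample {
    // Deletes column A (startIndex 0, endIndex 1; the end index is exclusive).
    // Columns to the right shift left to fill the gap, just like the
    // "Delete column" action in the spreadsheet UI.
    public static void deleteColumnA(Sheets sheetsService, String spreadsheetId,
                                     int sheetId) throws IOException {
        Request request = new Request().setDeleteDimension(new DeleteDimensionRequest()
                .setRange(new DimensionRange()
                        .setSheetId(sheetId)
                        .setDimension("COLUMNS")
                        .setStartIndex(0)
                        .setEndIndex(1)));
        BatchUpdateSpreadsheetRequest body = new BatchUpdateSpreadsheetRequest()
                .setRequests(Collections.singletonList(request));
        sheetsService.spreadsheets().batchUpdate(spreadsheetId, body).execute();
    }
}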
Load a big json file in Mysql or Oracle database
At work, we supply files to other services. Their sizes range between 5 MB and 500 MB. We want to use JSON instead of XML, but I am wondering how our customers could load those files into their database, Oracle or MySQL, in an easy way. I can't find any API, function, or tool on the web for doing that in MySQL or Oracle. I know that it's easy to work datum by datum to load a small JSON file, decoding each object or array and putting it in the right place in the database. But is there another way to do this, like SQL*Loader in Oracle? And if so, aren't our files too big to produce as JSON, in Java for example? I guess it might be difficult to do this load job automatically, especially because of arrays like this:

{"employees":[
  {"firstName":"John", "lastName":"Doe", "salaryHistory":[1000,2000,3000]},
  {"firstName":"Anna", "lastName":"Smith", "salaryHistory":[500,800]},
  {"firstName":"Peter", "lastName":"Jones", "salaryHistory":[400]}
]}

where salaryHistory must cause problems because the array sizes differ, and the data does not necessarily belong in the same table. Any ideas or help would be welcome!

Edit: I am looking for a solution that puts each datum in the right column of a table; I don't need to store a JSON structure in a single column of a simple table. Like this: table employees with columns id, firstName, lastName, and table salaryHistory with columns id, order, salary. Each datum must go in the right column, like "John" in firstName, "Doe" in lastName, then "1000" in a new row of salaryHistory, "2000" in another new row of salaryHistory, and so on.
Starting with MySQL 5.7 there is a new native data type: JSON. Take a look here for more details.

Example for Oracle 12c:

create table transactions (
  trans_id   number not null primary key,
  trans_date timestamp,
  trans_msg  clob,
  constraint check_json check (trans_msg is json)
);

Regular insert (the key value here is an illustrative literal; in practice it would come from a sequence):

insert into transactions values (
  1,
  systimestamp,
  '{
     "TransId"         : 3,
     "TransDate"       : "01-JAN-2015",
     "TransTime"       : "10:05:00",
     "TransType"       : "Deposit",
     "AccountNumber"   : 125,
     "AccountName"     : "Smith, Jane",
     "TransAmount"     : 300.00,
     "Location"        : "website",
     "CashierId"       : null,
     "ATMDetails"      : null,
     "WebDetails"      : { "URL" : "www.proligence.com/acme/dep.htm" },
     "Source"          : "Transfer",
     "TransferDetails" : {
       "FromBankRouting" : "012345678",
       "FromAccountNo"   : "1234567890",
       "FromAccountType" : "Checking"
     }
   }'
);

SQL*Loader control file and data file:

load data
into table transactions
fields terminated by ','
(
  trans_id   sequence(max,1),
  fname      filler char(80),
  trans_msg  lobfile(fname) terminated by EOF
)
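Since the question's edit asks for each value to land in its own column, here is a rough Java sketch using Jackson and JDBC. The table and column names follow the question, except that the position column is named salary_order because ORDER is a reserved word in SQL; the id generation is simplified for the sketch:

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import java.io.File;
import java.sql.Connection;
import java.sql.PreparedStatement;

public class EmployeeLoader {
    public static void load(Connection conn, File jsonFile) throws Exception {
        JsonNode root = new ObjectMapper().readTree(jsonFile);
        try (PreparedStatement empStmt = conn.prepareStatement(
                 "INSERT INTO employees (id, firstName, lastName) VALUES (?, ?, ?)");
             PreparedStatement salStmt = conn.prepareStatement(
                 "INSERT INTO salaryHistory (id, salary_order, salary) VALUES (?, ?, ?)")) {
            int id = 0;
            for (JsonNode emp : root.get("employees")) {
                id++; // simplistic key generation, for the sketch only
                empStmt.setInt(1, id);
                empStmt.setString(2, emp.get("firstName").asText());
                empStmt.setString(3, emp.get("lastName").asText());
                empStmt.executeUpdate();
                int order = 0;
                for (JsonNode salary : emp.get("salaryHistory")) {
                    salStmt.setInt(1, id);
                    salStmt.setInt(2, ++order);
                    salStmt.setInt(3, salary.asInt());
                    salStmt.addBatch(); // variable-length arrays are fine: one row each
                }
            }
            salStmt.executeBatch();
        }
    }
}

For files in the 500 MB range, readTree would hold the whole document in memory; Jackson's streaming JsonParser would be the safer choice there, processing one employee object at a time.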
KairosDB and Elasticsearch integration
I'm using KairosDB as my primary DB. Now I want to integrate Elasticsearch functionality with my data inside KairosDB. As stated in the docs, I have to duplicate all entries of my primary DB inside the Elasticsearch database.
Update: What I mean is that if I want to index something inside Elasticsearch, I have to, for example, retrieve data from KairosDB, e.g. the JSON {"name": "hi","value": "6","tags"}, and then put it inside Elasticsearch:

curl -XPUT 'http://localhost:9200/firstIndex/test/1' -d '{"name": "hi","value": "6","tags"}'

If I want to search, I have to do this:

curl 'http://localhost:9200/_search?q=name:hi&pretty=true'

I'm wondering if it is possible to avoid duplicating my data inside Elasticsearch, in a way that still lets me:
get data from KairosDB
index it using Elasticsearch without duplicating the data.
How can I go about that?
It sounds like you're hoping to use Elasticsearch as a secondary (and external) fulltext index for your primary datastore (KairosDB). Since KairosDB remains your primary datastore, each record you load into Elasticsearch needs two pieces of information (at minimum):

The primary key field(s) for locating the corresponding KairosDB record(s). In the mapping, make sure to set "store": true, "index": "not_analyzed".
Any fields which you wish to be searchable (in your example, only name is searched): "store": false, "index": "analyzed".

If you want to reduce your index size further, consider disabling the _source field.

Then your search workflow becomes a two-step process:

Query Elasticsearch for name:hi and retrieve the KairosDB primary key field(s) for each matching record.
Query/return KairosDB time-series data using the key fields returned from Elasticsearch.

But to be clear: you don't need an exact duplicate of each KairosDB record loaded into Elasticsearch, just the searchable fields, along with a means to locate the original record in KairosDB.
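To make the two steps concrete, here is a rough, version-agnostic sketch using only the JDK's HTTP support (the stored key field name kairos_key is hypothetical; adapt it to whatever locator you index):

import java.io.InputStream;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.Scanner;

public class TwoStepSearch {
    // Step 1: ask Elasticsearch which records match, requesting only the
    // stored key field rather than the full source document.
    public static String searchKeys() throws Exception {
        URL url = new URL("http://localhost:9200/firstIndex/_search");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Content-Type", "application/json");
        conn.setDoOutput(true);
        String query = "{ \"query\": { \"match\": { \"name\": \"hi\" } },"
                + " \"fields\": [\"kairos_key\"] }";
        try (OutputStream os = conn.getOutputStream()) {
            os.write(query.getBytes(StandardCharsets.UTF_8));
        }
        try (InputStream is = conn.getInputStream();
             Scanner s = new Scanner(is, "UTF-8").useDelimiter("\\A")) {
            return s.hasNext() ? s.next() : "";
        }
    }
    // Step 2 would parse the kairos_key values out of the hits and query
    // KairosDB's REST API (POST /api/v1/datapoints/query) with them.
}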
Get column name (metadata) in Talend
I'm trying to export data and metadata from a MySQL database to JSON. My JSON output needs to have this structure:

{ "classifier":[
  { "name":"Frequency", "value":"75 kHz" },
  { "name":"depth",     "value":"100 m" }
]}

Frequency represents a column name, and 75 kHz is the value of that column for a specific row. I'm using Talend Data Integration to do this, and I can get the data, but I can't figure out how to get the metadata. Do I have to enter it myself, or is there an easier way to do this?
You cannot export the metadata of a JSON file from MySQL directly, because MySQL provides structured data; you have to define the JSON structure independently, either from an existing file or manually. The easiest way is to create a sample file like the one used in your question. See Talend Help.
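That said, if your job can include a custom Java step (for example a tJavaRow, or a small standalone utility), the column names can also be read at runtime from JDBC metadata instead of being typed by hand. A rough sketch with Jackson (the query and id column are illustrative):

import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.node.ArrayNode;
import com.fasterxml.jackson.databind.node.ObjectNode;
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.ResultSetMetaData;

public class ClassifierExporter {
    // Builds {"classifier":[{"name": <column>, "value": <cell>}, ...]} for one row,
    // taking the "name" entries from ResultSetMetaData instead of hard-coding them.
    public static String toClassifierJson(Connection conn, String table, int rowId)
            throws Exception {
        ObjectMapper mapper = new ObjectMapper();
        ObjectNode root = mapper.createObjectNode();
        ArrayNode classifier = root.putArray("classifier");
        String sql = "SELECT * FROM " + table + " WHERE id = ?"; // illustrative query
        try (PreparedStatement ps = conn.prepareStatement(sql)) {
            ps.setInt(1, rowId);
            try (ResultSet rs = ps.executeQuery()) {
                ResultSetMetaData meta = rs.getMetaData();
                if (rs.next()) {
                    for (int i = 1; i <= meta.getColumnCount(); i++) {
                        ObjectNode entry = classifier.addObject();
                        entry.put("name", meta.getColumnName(i));
                        entry.put("value", rs.getString(i));
                    }
                }
            }
        }
        return mapper.writerWithDefaultPrettyPrinter().writeValueAsString(root);
    }
}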