I'm trying to export data and metadata from a MySQL database to JSON.
My JSON output needs to have this structure:
{ "classifier":[
{
"name":"Frequency",
"value":"75 kHz"
},
{
"name":"depth",
"value":"100 m"
} ]}
Frequency, for me, represents a column name, and 75 kHz is the value of that column for a specific row.
I'm using Talend Data Integration to do this, and I can get the data, but I can't figure out how to get the metadata. Do I have to enter it myself, or is there an easier way to do this?
You cannot export JSON metadata from MySQL, because MySQL provides structured data; the JSON structure has to be created independently, either from an existing file or manually. The easiest way is to create a sample file like the one used in your question. See Talend Help.
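That said, if you are open to dropping into Java (for example a tJavaRow component in Talend, or a standalone program), the column names are available through JDBC's ResultSetMetaData, so the classifier pairs can be built generically. A minimal sketch, assuming a hypothetical sensors table and placeholder connection details:

import java.sql.*;

public class ClassifierJson {
    public static void main(String[] args) throws Exception {
        try (Connection con = DriverManager.getConnection(
                "jdbc:mysql://localhost:3306/mydb", "user", "pass");
             Statement st = con.createStatement();
             ResultSet rs = st.executeQuery("SELECT * FROM sensors WHERE id = 1")) {
            // Column names (the metadata) come from ResultSetMetaData
            ResultSetMetaData md = rs.getMetaData();
            StringBuilder json = new StringBuilder("{ \"classifier\":[");
            if (rs.next()) {
                for (int i = 1; i <= md.getColumnCount(); i++) {
                    if (i > 1) json.append(',');
                    json.append("{\"name\":\"").append(md.getColumnName(i))
                        .append("\",\"value\":\"").append(rs.getString(i)).append("\"}");
                }
            }
            json.append("]}");
            System.out.println(json);
        }
    }
}

In real code you would build the output with a JSON library such as Jackson rather than string concatenation, since column values may need escaping.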
The following diagram depicts the simplified ingestion flow we are building to ingest data from different RDBMSs into Hive.
Step 1: Using a JDBC connection to the data source, source data is streamed and saved in a CSV file on HDFS using the HDFS Java API.
Basically, we execute a 'SELECT *' query and save each row to the CSV until the ResultSet is exhausted.
Step 2: Using the LOAD DATA INPATH command, the Hive table is populated from the CSV file created in Step 1.
We use JDBC ResultSet.getString() to get column data.
This works fine for non-binary data.
But for BLOB and CLOB type columns, we cannot write the column data into a text/CSV file.
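For context, here is a minimal sketch of the Step 1 loop described above; the table name, connection details, and local output file are placeholders (the real flow writes to HDFS via the HDFS Java API):

import java.io.BufferedWriter;
import java.io.FileWriter;
import java.sql.*;

public class TableToCsv {
    public static void main(String[] args) throws Exception {
        try (Connection con = DriverManager.getConnection(
                "jdbc:mysql://localhost:3306/mydb", "user", "pass");
             Statement st = con.createStatement();
             ResultSet rs = st.executeQuery("SELECT * FROM mytable");
             BufferedWriter out = new BufferedWriter(new FileWriter("mytable.csv"))) {
            int cols = rs.getMetaData().getColumnCount();
            // Stream row by row until the ResultSet is exhausted
            while (rs.next()) {
                StringBuilder row = new StringBuilder();
                for (int i = 1; i <= cols; i++) {
                    if (i > 1) row.append(',');
                    row.append(rs.getString(i)); // this is what breaks for BLOB/CLOB columns
                }
                out.write(row.toString());
                out.newLine();
            }
        }
    }
}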
My question: is it possible to use the ORC or Avro format to handle binary columns? Do these formats support writing row by row?
(Update: We are aware of Sqoop, NiFi, and similar technologies; the reasons for implementing our own custom ingestion flow are beyond the scope of this question.)
My requirement is as follows: I have a JSON file with data like
{key1: value1, key2: value2, ....} and file name example.json
It should be inserted into SQL Server as:
Table name: example
Data as below:
Id    key1      key2
1     value1    value2
I'm searching for a solution such that there won't be any Java-layer complexities, like first converting this data into Java beans and then inserting into SQL using Java database drivers.
The implementation should be language- and database-independent. If I change my database server in the future, the change should be minimal.
The coupling should be very loose: if I introduce a new key in the JSON file, the solution should need only a minimal change.
Tia
Use a library that can make HTTP requests to the server. For example, if you want to insert your JSON data into a server from an Android app, use a simple library like Volley or OkHttp.
The problem with your question is that we don't know which server you are using, what application you are building, or which programming language you want to use, so I'm assuming Java on Android.
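As an illustration, a minimal OkHttp sketch that POSTs the JSON to a server; the endpoint URL is a placeholder, and this assumes some server-side code exists that writes the payload into SQL Server:

import okhttp3.MediaType;
import okhttp3.OkHttpClient;
import okhttp3.Request;
import okhttp3.RequestBody;
import okhttp3.Response;

public class JsonPoster {
    private static final MediaType JSON = MediaType.parse("application/json; charset=utf-8");

    // POSTs a JSON string to the given URL and returns the response body
    public static String post(String url, String json) throws Exception {
        OkHttpClient client = new OkHttpClient();
        RequestBody body = RequestBody.create(JSON, json);
        Request request = new Request.Builder().url(url).post(body).build();
        try (Response response = client.newCall(request).execute()) {
            return response.body().string();
        }
    }

    public static void main(String[] args) throws Exception {
        // Placeholder endpoint; replace with your server's ingest URL
        System.out.println(post("https://example.com/api/ingest",
                "{\"key1\":\"value1\",\"key2\":\"value2\"}"));
    }
}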
Currently I'm trying to import an SQL INSERT dump from a PostgreSQL database into my Derby development/testing database using Eclipse's Data Tools SQL Scratchpad. The export created a lot of data that looked like the following:
CREATE TABLE mytable ( testfield BLOB );
INSERT INTO mytable ( testfield ) VALUES ('\x0123456789ABCDEF');
Executing it in Eclipse's SQL Scratchpad results in (translated from German):
Columns of type 'BLOB' shall not contain values of type 'CHAR'.
The problem seems to be that the PostgreSQL admin tool exported the BLOB data in a format like '\x0123456789ABCDEF', which is not recognized by Derby (Embedded).
Changing this to X'0123456789ABCDEF' or simply '0123456789ABCDEF' did not work either.
The only thing that worked was CAST (X'123456789ABCDEF' AS BLOB), but I'm not yet sure whether this results in the correct binary data when read back in Java, and whether X'0123456789ABCDEF' is 100% portable.
CAST (...whatever... AS BLOB) doesn't work in Java DB / Apache Derby!
You must use the built-in system procedure
SYSCS_UTIL.SYSCS_IMPORT_DATA_LOBS_FROM_EXTFILE. I do not think there is any other way. For instance:
CALL SYSCS_UTIL.SYSCS_IMPORT_DATA_LOBS_FROM_EXTFILE (
'MYSCHEMA', 'MYTABLE',
'MY_KEY, MY_VARCHAR, MY_INT, MY_BLOB_DATA',
'1,3,4,2',
'c:\tmp\import.txt', ',' , '"' , 'UTF-8',
0);
where the referenced "import.txt" file is CSV-like (as specified by the ',' and '"' arguments above) and contains as its 2nd field (I deliberately scrambled the CSV field order versus the DB column order to illustrate) a file name pointing to the binary data in the proper format for the BLOBs. For instance, "import.txt" looks like:
"A001","c:\tmp\blobimport.dat.0.55/","TEST",123
where the supplied BLOB data file name bears the convention "filepath.offset.length/"
Actually, you can first export your table with
CALL SYSCS_UTIL.SYSCS_EXPORT_TABLE_LOBS_TO_EXTFILE(
'MYSCHEMA', 'MYTABLE', 'c:\tmp\export.txt', ',' ,'"',
'UTF-8', 'c:\tmp\blobexport.dat');
to generate sample files with the syntax to reuse on import.
Hi, I am parsing an XML file with JAXB and saving the data in a database table, and I am able to do this correctly. My question: if the XML file returns empty data for a particular field, it should be stored as === in the database table. How can I do this while processing the XML file?
The XML file has two nodes, abc and xyz, and the file should contain exactly one of them. There are two columns available in the database, say name and version. These two columns are derived by comparing the abc and xyz nodes of the XML file against the database using a common id, fetching the values for name and title. Can someone please help me understand how to handle this while processing the XML file?
I wish I could post the code, but it is too huge to post.
What you are asking sounds weird, but assuming your data is a String, you could do this:
if (data == null || data.isEmpty())
{
data = "===";
}
From the tags on your question you appear to have two steps in your processing:
XML to object using JAXB
object to database using ?
I would put the logic for storing === in the database as part of the object-to-database processing. If you are using JPA 2.1 (part of Java EE 7), you could look at JPA converters to encapsulate this logic.
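A minimal sketch of such a converter, assuming the field in question is a String (the class name and marker handling are illustrative):

import javax.persistence.AttributeConverter;
import javax.persistence.Converter;

// Stores "===" in the database when the unmarshalled value is missing or empty
@Converter
public class EmptyToMarkerConverter implements AttributeConverter<String, String> {

    @Override
    public String convertToDatabaseColumn(String attribute) {
        return (attribute == null || attribute.isEmpty()) ? "===" : attribute;
    }

    @Override
    public String convertToEntityAttribute(String dbData) {
        // Map the marker back to an empty value when reading
        return "===".equals(dbData) ? "" : dbData;
    }
}

You would then annotate the entity field with @Convert(converter = EmptyToMarkerConverter.class), which keeps the substitution out of both the JAXB mapping and the business code.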
I want to insert XML file data into a MySQL table, choosing which columns to insert into, using Java. How can this be done?
It really depends on the format of your XML file. If your XML file is a direct export from MySQL, please refer to this question.
If your XML is in some other format, then I would probably use JAXB to parse the XML into a POJO, then write some logic to map the POJO onto the database table.
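A minimal sketch of that approach, assuming a hypothetical <record> XML with name and version elements, and placeholder connection details:

import java.io.File;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import javax.xml.bind.JAXBContext;
import javax.xml.bind.annotation.XmlRootElement;

// Hypothetical POJO for an XML file like <record><name>..</name><version>..</version></record>
@XmlRootElement(name = "record")
class Record {
    public String name;
    public String version;
}

public class XmlToMySql {
    public static void main(String[] args) throws Exception {
        // Unmarshal the XML file into the POJO
        Record r = (Record) JAXBContext.newInstance(Record.class)
                .createUnmarshaller()
                .unmarshal(new File("record.xml"));

        // Insert only the chosen columns; connection details are placeholders
        try (Connection con = DriverManager.getConnection(
                "jdbc:mysql://localhost:3306/mydb", "user", "pass");
             PreparedStatement ps = con.prepareStatement(
                     "INSERT INTO mytable (name, version) VALUES (?, ?)")) {
            ps.setString(1, r.name);
            ps.setString(2, r.version);
            ps.executeUpdate();
        }
    }
}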