insert Xml file data into MySQL table - java

I want to insert XML file data into a MySQL table, choosing which columns to insert into, using Java. How can this be done?

It really depends on the format of your XML file. If your XML file is a direct export from MySQL, please refer to this question.
If your XML is in some other format, then I would probably use JAXB to parse the XML into POJOs, then write some logic to map the POJOs onto the database table.
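As an illustrative sketch of that approach (using the JDK's built-in DOM parser rather than JAXB so it stays dependency-free; the users table and its name/email columns are hypothetical), you could extract the chosen elements and build a parameterized INSERT for JDBC:

```java
import java.io.StringReader;
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;
import org.xml.sax.InputSource;

public class XmlToMySql {

    // Parse the XML and pull out the values of the chosen columns, in order.
    static List<String[]> extractRows(String xml, String rowTag, String... columns) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new InputSource(new StringReader(xml)));
        List<String[]> rows = new ArrayList<>();
        NodeList nodes = doc.getElementsByTagName(rowTag);
        for (int i = 0; i < nodes.getLength(); i++) {
            Element row = (Element) nodes.item(i);
            String[] values = new String[columns.length];
            for (int c = 0; c < columns.length; c++) {
                values[c] = row.getElementsByTagName(columns[c]).item(0).getTextContent();
            }
            rows.add(values);
        }
        return rows;
    }

    // Build a parameterized INSERT for the chosen columns (values bound via JDBC).
    static String buildInsertSql(String table, String... columns) {
        return "INSERT INTO " + table + " (" + String.join(", ", columns)
                + ") VALUES (" + String.join(", ", Collections.nCopies(columns.length, "?")) + ")";
    }

    // Bind and execute one INSERT per extracted row.
    static void insertAll(Connection conn, String table, List<String[]> rows, String... columns) throws Exception {
        try (PreparedStatement ps = conn.prepareStatement(buildInsertSql(table, columns))) {
            for (String[] row : rows) {
                for (int c = 0; c < row.length; c++) {
                    ps.setString(c + 1, row[c]);
                }
                ps.addBatch();
            }
            ps.executeBatch();
        }
    }
}
```

The PreparedStatement keeps the column choice flexible and avoids building SQL out of raw XML text.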

Related

How to import CSV file data into a PostgreSQL table via java code?

I have a CSV file which I want to import into a table in Postgres.
The table contains 3 fields (id text, name text, geo geometry).
The CSV file is in the same format (3 values separated by commas).
I want to use Java code to import the file (input.csv) into the table (tbl).
How can I do it?
Is there a query to which I can pass the file path so the DB loads the file?
You can use OpenCSV to read the CSV file into Java objects (here is an example: https://www.geeksforgeeks.org/mapping-csv-to-javabeans-using-opencsv/), and then use a Spring Data JPA repository to insert the data into the database (here is an example: https://www.baeldung.com/spring-data-crud-repository-save).
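A minimal plain-JDBC sketch of the same idea (splitting on commas by hand, which assumes no quoted commas in the data; OpenCSV's readers handle quoting properly; ST_GeomFromText assumes the geometry column comes from PostGIS and the CSV carries WKT):

```java
import java.io.BufferedReader;
import java.io.Reader;
import java.sql.Connection;
import java.sql.PreparedStatement;

public class CsvToPostgres {

    // Split one CSV line into its three fields (naive: assumes no quoted commas).
    static String[] parseLine(String line) {
        return line.split(",", -1);
    }

    // Read the CSV and batch-insert each row into tbl(id, name, geo).
    static void importCsv(Connection conn, Reader csv) throws Exception {
        String sql = "INSERT INTO tbl (id, name, geo) VALUES (?, ?, ST_GeomFromText(?))";
        try (BufferedReader in = new BufferedReader(csv);
             PreparedStatement ps = conn.prepareStatement(sql)) {
            String line;
            while ((line = in.readLine()) != null) {
                String[] f = parseLine(line);
                ps.setString(1, f[0]);
                ps.setString(2, f[1]);
                ps.setString(3, f[2]); // e.g. WKT such as "POINT(30 10)"
                ps.addBatch();
            }
            ps.executeBatch();
        }
    }
}
```

For very large files, the PostgreSQL JDBC driver also exposes the server-side COPY command through its CopyManager API, which is considerably faster than row-by-row inserts.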

Ingest data from JDBC connections to Hive : Handling binary columns

The following diagram depicts the simplified ingestion flow we are building to ingest data from different RDBMSs into Hive.
Step 1: Using a JDBC connection to the data source, the source data is streamed and saved in a CSV file on HDFS using the HDFS Java API.
Basically, we execute a 'SELECT *' query and each row is saved to the CSV file until the ResultSet is exhausted.
Step 2: Using the LOAD DATA INPATH command, the Hive table is populated from the CSV file created in Step 1.
We use JDBC ResultSet.getString() to get column data.
This works fine for non-binary data.
But for BLOB and CLOB type columns, we cannot write the column data into a text/CSV file.
My question: is it possible to use the ORC or Avro format to handle binary columns? Do these formats support writing row by row?
(Update: we are aware of Sqoop, NiFi, etc.; the reason for implementing our own ingestion flow is beyond the scope of this question.)
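For the text path in Step 1, the ResultSet-to-CSV loop might look like the sketch below (csvEscape quotes fields RFC 4180 style; as noted, this only works for non-binary columns). On the question itself: to my knowledge both formats can be written incrementally; Avro's DataFileWriter appends one record at a time, while ORC writers consume row batches that you fill row by row.

```java
import java.io.Writer;
import java.sql.ResultSet;
import java.sql.ResultSetMetaData;

public class ResultSetToCsv {

    // Quote a field if it contains a comma, quote, or newline (RFC 4180 style).
    static String csvEscape(String field) {
        if (field == null) return "";
        if (field.contains(",") || field.contains("\"") || field.contains("\n")) {
            return "\"" + field.replace("\"", "\"\"") + "\"";
        }
        return field;
    }

    // Stream every row of the ResultSet to the writer as one CSV line.
    static void writeCsv(ResultSet rs, Writer out) throws Exception {
        ResultSetMetaData meta = rs.getMetaData();
        int cols = meta.getColumnCount();
        while (rs.next()) {
            StringBuilder line = new StringBuilder();
            for (int i = 1; i <= cols; i++) {
                if (i > 1) line.append(',');
                line.append(csvEscape(rs.getString(i)));
            }
            out.write(line.append('\n').toString());
        }
    }
}
```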

Read multiple CSV files with CsvJdbc

I need to bind a group of CSV files named in the format "YYYY-MM-DD hh:mm:ss.csv", all present in the same folder, to a unique table that contains all the data present in all the files.
I need to read the data from a Java EE application, so I would like to create a connection pool inside the application server. I found the CsvJdbc driver, which allows reading multiple files as a single entity. A good starting point was this page, in particular this paragraph:
To read several files (for example, daily log files) as a single table, set the database connection property indexedFiles. The following example demonstrates how to do this.
The example would be fine for me, but the problem is that I do not have a header word in the filename string. So the corresponding table name becomes an empty string, which obviously makes it impossible to query the table.
How can I tell the driver to map the pattern to a table whose name has no header part?
P.S. I already tried to use HSQLDB as a frontend to the CSV files, but it does not support multiple files.
Set up CsvJdbc to read several files as described in http://csvjdbc.sourceforge.net/doc.html and then use an empty table name in the SQL query, because your CSV filenames do not have any header before the fileTailPattern regular expression. For example:
props.put("fileTailPattern", "(\\d+)-(\\d+)-(\\d+) (\\d+):(\\d+):(\\d+)");
props.put("fileTailParts", "Year,Month,Day,Hour,Minutes,Seconds");
...
ResultSet results = stmt.executeQuery("SELECT * FROM \"\" AS T1");
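As a sanity check before wiring the pattern into CsvJdbc, you can verify that your filenames match the fileTailPattern with plain java.util.regex (the sample filename in the test is made up):

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class FileTailPatternCheck {

    // The same regex passed to CsvJdbc as the fileTailPattern property.
    static final Pattern TAIL =
            Pattern.compile("(\\d+)-(\\d+)-(\\d+) (\\d+):(\\d+):(\\d+)");

    // Returns the captured parts (Year..Seconds) if the name matches, else null.
    static String[] parts(String filenameWithoutExtension) {
        Matcher m = TAIL.matcher(filenameWithoutExtension);
        if (!m.matches()) return null;
        String[] p = new String[m.groupCount()];
        for (int i = 0; i < p.length; i++) {
            p[i] = m.group(i + 1);
        }
        return p;
    }
}
```

If parts() returns null for one of your real filenames, CsvJdbc will not pick that file up either, so this catches pattern typos early.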

store === characters in database for empty data while processing xml file

Hi, I am parsing an XML file with JAXB and saving the data in a database table, and I am able to do this correctly. My question: if the XML file returns empty data for a particular field, it should be stored as === in the database table. How can I do this while processing the XML file?
The XML file has two nodes, abc and xyz, and the file should contain exactly one of these. There are two columns available in the database, say name and version. These two columns are derived by comparing the abc and xyz nodes of the XML file against the database using a common id, fetching the values for name and title. Can someone please help me understand how to handle this while processing the XML file?
I wish I could post the code, but it is too large to post.
What you are asking for is unusual, but assuming your data is a String, you could do this:
if (data == null || data.isEmpty())
{
data = "===";
}
From the tags on your question you appear to have two steps in your processing:
XML to object using JAXB
object to database using ?
I would put the logic for storing === in the database as part of the object-to-database processing. If you are using JPA 2.1 (part of Java EE 7) you could look at JPA converters to encapsulate this logic.
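A minimal sketch of that substitution, centralized in one helper (the class and method names are made up for illustration; with JPA 2.1 the same logic would live in an AttributeConverter's convertToDatabaseColumn method):

```java
public class EmptyFieldMarker {

    static final String EMPTY_MARKER = "===";

    // Replace a missing or empty XML value with the === marker before it is stored.
    static String toDbValue(String data) {
        return (data == null || data.isEmpty()) ? EMPTY_MARKER : data;
    }
}
```

Keeping the rule in one place means the JAXB objects stay a faithful image of the XML, and only the persistence layer knows about the === convention.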

data parsing from a file into java and then into a mysql database

I have a .data file given in the above format. I am writing a program in Java that will take the values from the .data file and put them in a buffer. My Java program is connected to MySQL (Windows) via JDBC. So I need to read the values from the file given in the above format and put them in the buffer like
Insert Into building values ("--", "---", ----)
In this way I store these values, and JDBC will populate the database tables on MySQL (Windows). Please tell me the best way.
Check out the answers to this question for reading file lines and splitting them into chunks. I know the question says Groovy, but most of the answers are Java. Then insert the values you retrieved via JDBC.
Actually, since your data file is obviously CSV, you could also use a CSV library like OpenCSV to read the values.
The data is in CSV format, so use a CSV library to parse the file and then just add some JDBC code to insert this into database.
Or just call MySQL's mysqlimport command from Java:
try {
    // Run mysqlimport; substitute [options], db_name and the text file names
    String command = "mysqlimport [options] db_name textfile1 [textfile2 ...]";
    Process child = Runtime.getRuntime().exec(command);
    child.waitFor();  // wait for the import to finish so you can check its exit code
} catch (IOException | InterruptedException e) {
    e.printStackTrace();  // at minimum, do not swallow the exception silently
}
This is the fourth question for the same task... If your data file is well formatted, like in the example you provided, then you don't even have to split each line into values:
Source: "AAH196","Austin","TX","Virginia Beach","VA"
Target: INSERT INTO BUILDING VALUES("AAH196","Austin","TX","Virginia Beach","VA");
<=> "INSERT INTO BUILDING VALUES(" + Source + ");"
Just take a complete row from your CSV file and concatenate it into a SQL expression.
(See my answer to question 1 of 4. BTW, if SQL injection is a potential problem, splitting a line into values is not a solution either.)
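A sketch of that concatenation (the BUILDING table comes from the example above; as the answer warns, this is open to SQL injection if the file contents are untrusted, so prefer a PreparedStatement for anything but trusted input):

```java
public class CsvLineToInsert {

    // Wrap a well-formed CSV row (values already quoted) in an INSERT statement.
    static String toInsert(String csvLine) {
        return "INSERT INTO BUILDING VALUES(" + csvLine + ");";
    }
}
```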
You can bind your CSV to Java beans using OpenCSV:
http://opencsv.sourceforge.net/
You can then make these beans persistent using an ORM framework such as Hibernate or Cayenne, or with JPA, which is annotation-based and maps your fields to tables easily without hand-writing any SQL statements.
This would be a perfect job for Groovy. Here's a gist with a small skeleton script to build upon.
