How to insert a file into AS400 through SQL statement - java

I would like to insert a file that has no extension. It is a text file without a .txt extension. This is my code:
public boolean setData(List<String> data) {
    SABConnection connection = new SABConnection();
    boolean bool = false;
    try {
        PreparedStatement ps = connection.connectToSAB()
                .prepareStatement("INSERT INTO AS400.ZXMTR03 VALUES (?)");
        if (!data.isEmpty()) {
            for (String file : data) {
                File fi = new File(file);
                FileInputStream fis = new FileInputStream(file);
                ps.setAsciiStream(1, fis);
                int done = ps.executeUpdate();
                if (done > 0) {
                    System.out.println("File: " + fi.getName() + " Inserted successfully");
                } else {
                    System.out.println("Insertion of File: " + fi.getName() + " failed");
                }
            }
            bool = true;
        } else {
            System.out.println("The directory is empty");
        }
    } catch (Exception e) {
        System.out.println("error caused by: " + e.getMessage());
    }
    return bool;
}
I keep getting a data truncation error.
PS:
The file ZXMTR03 doesn't have columns.
To insert such a thing manually into the AS400, I write this statement: insert into ZXMTR03 (select * from n.niama/ZXMTR02) and it works. When I write insert into ZXMTR03 values ('12345') it works too.
I'm using the JT400 library.

You can't insert a stream file into a database table like that.
Assuming your text file has EOL indicators, you'd need to split it into rows to insert one row at time; or insert some distinct number of rows at a time using a multi-row insert.
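For example, a minimal sketch of that approach (connectToSAB() is reused from the question; the single-column layout of ZXMTR03 and the batch size of 500 are assumptions):
// Sketch only: read the stream file line by line and insert each line as
// one row, batching to cut down on round trips.
void insertFileAsRows(SABConnection connection, String file) throws Exception {
    try (BufferedReader reader = new BufferedReader(new FileReader(file));
         PreparedStatement ps = connection.connectToSAB()
                 .prepareStatement("INSERT INTO AS400.ZXMTR03 VALUES (?)")) {
        String line;
        int count = 0;
        while ((line = reader.readLine()) != null) {
            ps.setString(1, line);    // one row per line of the text file
            ps.addBatch();
            if (++count % 500 == 0) {
                ps.executeBatch();    // multi-row insert, 500 rows at a time
            }
        }
        ps.executeBatch();            // flush the remainder
    }
}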
Also, you're wrong in thinking ZXMTR03 doesn't have columns; every DB table on the IBM i has at least one column.
Alternatively, you could copy the text file to the Integrated File System (IFS), which supports stream files, using FTP, SMB, NFS, etc., or even the JT400 IFSFile classes. Then make use of the Copy From Import File (CPYFRMIMPF) command or perhaps the IFS Copy (CPY) command. If on a current version of the OS, you might want to check out the QSYS2.IFS_READ() table functions.
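If you go the IFS route, here is a rough sketch with JT400 (host, credentials, and paths are placeholders, and the CPYFRMIMPF parameters would need tailoring to ZXMTR03's actual layout):
// Sketch using classes from com.ibm.as400.access: copy the local stream
// file to the IFS, then run CPYFRMIMPF to load it into the physical file.
void copyToIfsAndImport(String localPath) throws Exception {
    AS400 system = new AS400("myhost", "myuser", "mypassword");
    try (FileInputStream in = new FileInputStream(localPath);
         IFSFileOutputStream out = new IFSFileOutputStream(system, "/home/myuser/zxmtr03.txt")) {
        byte[] buf = new byte[8192];
        int n;
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n);   // stream the local file up to the IFS
        }
    }
    CommandCall cmd = new CommandCall(system);
    // Load the IFS stream file into the physical file member.
    if (!cmd.run("CPYFRMIMPF FROMSTMF('/home/myuser/zxmtr03.txt') "
            + "TOFILE(AS400/ZXMTR03) MBROPT(*REPLACE)")) {
        throw new IllegalStateException("CPYFRMIMPF failed");
    }
    system.disconnectAllServices();
}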

Related

How to convert Java byte[] to MySQL BLOB's in SQL script file created by Java code

I have a Java program that creates an SQL script that will later be imported as a file into MySQL. The Java program cannot directly access the MySQL database but has to generate an SQL file with all the insert commands to be ingested by MySQL. Without getting into too many details, we can ignore any security concerns because the data is used once and then the database is deleted.
The Java code does something like this:
String sql = "INSERT INTO myTable (column1, column2) VALUES (1, 'hello world');";
BufferedWriter bwr = new BufferedWriter(new FileWriter(new File("output.sql")));
bwr.write(sql);
// then flushes the stream, etc.
The issue I have is that I now need to include a byte[] array as the third column. Therefore I want to do:
byte[] data = getDataFromSomewhere();
// Convert byte[] to String and replace all single quote with an escape character for the import
String dataString = new String(data, StandardCharsets.UTF_8).replaceAll("'", "\\\\'");
String sql = "INSERT INTO myTable (column1, column2, column3) VALUES (1, 'hello world', '" + dataString + "');";
BufferedWriter bwr = new BufferedWriter(new FileWriter(new File("output.sql")));
bwr.write(sql);
// then flushes the stream, etc.
The problem is that when I load the file on the other computer, I get the following error:
You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ''????' at line 1
The core of the code to load the SQL file is simply:
try (Stream<String> stream = Files.lines(Paths.get(IMPORT_SQL_FILE))) {
    stream.forEach(line -> {
        try {
            // Yes it could be batched but I omitted it for simplicity
            executeStatement(connection, line);
        } catch (Exception e) {
            e.printStackTrace();
            System.exit(-1);
        }
    });
}
And if I load the file directly in MySQL I get the following error message:
1 row(s) affected, 1 warning(s): 1300 Invalid utf8 character string: 'F18E91'
Therefore my question is how can I generate an SQL file with binary data from Java?
Inline your BLOB data into a hexadecimal literal:
StringBuilder sb = new StringBuilder(data.length * 2);
for (byte b : data) {
    sb.append(String.format("%02x", b));
}
String sql = "INSERT INTO myTable (column1, column2, column3) "
        + "VALUES (1, 'hello world', x'" + sb.toString() + "');";

HSQLDB not saving updates made through Java

I am trying to add records to a table in an HSQL database through Java.
I have an HSQL database I made through OpenOffice; I renamed the .odb file to .zip and extracted the SCRIPT and PROPERTIES files (it has no data in it at the moment) to a folder "\database" in my Java project folder.
The table looks like this in the SCRIPT file
CREATE CACHED TABLE PUBLIC."Season"("SeasonID" INTEGER GENERATED BY DEFAULT AS IDENTITY(START WITH 0) NOT NULL PRIMARY KEY,"Year" VARCHAR(50))
All fine so far, the database connects just fine in Java with this code:
public void connect() {
    try {
        String dbName = "database\\db";
        con = DriverManager.getConnection("jdbc:hsqldb:file:" + dbName, // filenames prefix
                "sa", // user
                ""); // pass
    } catch (Exception e) {
        e.printStackTrace();
    }
}
I have the following code to insert a record into "Season".
public void addSeason(String year) {
    int result = 0;
    try {
        stmt = con.createStatement();
        result = stmt.executeUpdate("INSERT INTO \"Season\"(\"Year\") VALUES ('" + year + "')");
        con.commit();
        stmt.close();
    } catch (Exception e) {
        e.printStackTrace();
    }
    System.out.println(result + " rows affected");
}
I have a final function called printTables():
private void printTables() {
    try {
        stmt = con.createStatement();
        ResultSet rs = stmt.executeQuery("SELECT * FROM \"Season\"");
        System.out.println("SeasonID\tYear");
        while (rs.next()) {
            System.out.println(rs.getInt("SeasonID") + "\t\t" + rs.getString("Year"));
        }
    } catch (Exception e) {
        e.printStackTrace(System.out);
    }
}
Now if I run this sequence of functions:
connect();
printTables();
addSeason("2010");
printTables();
I get this output:
SeasonID Year
1 rows affected
SeasonID Year
0 2010
Now when I close the program and start it again I get exactly the same output. So the change made during the first run hasn't been saved to the database. Is there something I'm missing?
This is caused by the write delay setting in HSQLDB: by default there is a 500 ms delay before changes are synched from memory to the files.
The problem is solved by setting it to false:
statement.execute("SET FILES WRITE DELAY FALSE");
or set it as you like based on your app's behaviour.
So my workaround is to close the connection after every update, then open a new connection any time I want to do something else.
This is pretty unsatisfactory and I'm sure it will cause problems later on if I want to perform queries mid-update. It also wastes time.
If I could find a way to ensure that con.close() was called whenever the program was killed, that would be fine...
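A sketch of one way to get that with a JVM shutdown hook (it assumes con is the connection field from the code above, and note a hook cannot fire on a hard kill such as kill -9):
// Register a JVM shutdown hook so the database is checkpointed and the
// connection closed on normal termination or Ctrl+C.
Runtime.getRuntime().addShutdownHook(new Thread(() -> {
    try {
        if (con != null && !con.isClosed()) {
            con.createStatement().execute("SHUTDOWN"); // flushes HSQLDB files
            con.close();
        }
    } catch (Exception e) {
        e.printStackTrace();
    }
}));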

Append data to a DB2 blob

In my DB2 database, I have a table with a Blob:
CREATE TABLE FILE_STORAGE (
    FILE_STORAGE_ID integer,
    DATA blob(2147483647),
    CONSTRAINT PK_FILE_STORAGE PRIMARY KEY (FILE_STORAGE_ID));
Using the db2jcc JDBC driver (db2jcc4-9.7.jar), I can read and write data in this table without any problems.
Now I need to be able to append data to existing rows, but DB2 gives the cryptic error
Invalid operation: setBinaryStream is not allowed on a locator or a reference. ERRORCODE=-4474, SQLSTATE=null
I use the following code to append my data:
String selectQuery = "SELECT DATA FROM FILE_STORAGE WHERE FILE_STORAGE_ID = ?";
try (PreparedStatement ps = conn.prepareStatement(selectQuery,
        ResultSet.TYPE_FORWARD_ONLY, ResultSet.CONCUR_UPDATABLE)) {
    ps.setInt(1, fileStorageID);
    try (ResultSet rs = ps.executeQuery()) {
        if (rs.next()) {
            Blob existing = rs.getBlob(1);
            try {
                // The following line throws the exception:
                try (OutputStream output = existing.setBinaryStream(existing.length() + 1)) {
                    // append the new data to the output:
                    writeData(output);
                } catch (IOException e) {
                    throw new IllegalStateException("Error writing output stream to blob", e);
                }
                rs.updateBlob(1, existing);
                rs.updateRow();
            } finally {
                existing.free();
            }
        } else {
            throw new IllegalStateException("No row found for file storage ID: " + fileStorageID);
        }
    }
}
My code is using the methods as suggested in OutputStream to the BLOB column of a DB2 database table. There also seem to be other people who have the same problem: Update lob columns using lob locator.
As a workaround, I currently read all the existing data into memory, append the new data in memory, and then write the complete data back into the blob. This works, but it's very slow and obviously it will take longer if there's more data in the blob, getting slower with each update.
I do need to use Java to update the data, but apart from switching away from the JVM, I am happy to try any possible alternatives at all, I just need to append the data somehow.
Thanks in advance for any ideas!
If you only need to append data to the end of a BLOB column and don't want to read the entire value into your program, a simple UPDATE statement will be faster and more straightforward.
Your Java program could run something like this via executeUpdate():
UPDATE file_storage SET data = data || BLOB(?) WHERE file_storage_id = ?
The parameter markers for this would be populated by setBlob(1, dataToAppend) and setInt(2, fileStorageID).
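A minimal sketch of the Java side (variable names are placeholders; setBytes is used here for brevity, where the setBlob call above works the same way):
// Sketch: append newBytes to the existing BLOB entirely on the server,
// so the old value never has to travel to the client.
String sql = "UPDATE file_storage SET data = data || BLOB(?) "
        + "WHERE file_storage_id = ?";
try (PreparedStatement ps = conn.prepareStatement(sql)) {
    ps.setBytes(1, newBytes);   // or setBlob/setBinaryStream for large data
    ps.setInt(2, fileStorageID);
    if (ps.executeUpdate() == 0) {
        throw new IllegalStateException("No row found for file storage ID: " + fileStorageID);
    }
}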

How to load a csv file in oracle using sql loader in java

I want to load data from a CSV file into an Oracle database. Here is my code:
void importData(Connection conn) {
    Statement stmt;
    String query;
    String filename = "C:/CSVData/Student.csv";
    try {
        stmt = conn.createStatement(
                ResultSet.TYPE_SCROLL_SENSITIVE,
                ResultSet.CONCUR_UPDATABLE);
        query = "LOAD DATA INFILE '" + filename + "' INTO TABLE Student FIELDS terminated by ',' ;";
        System.out.println(query);
        stmt.executeQuery(query);
    } catch (Exception e) {
        e.printStackTrace();
        stmt = null;
    }
}
This code runs perfectly and loads the data in MySQL, but now I want to load the data into Oracle. What change do I have to make to the query? Please help me. Thank you in advance.
First, you need to write a control file.
Control file example FYI:
Load data
infile "D:/Viki/test.CSV"          -- the input file(s) to import
truncate                           -- load option (truncate, append, insert, replace; insert by default)
into table vk_recon_China_201409_i -- table to insert into
fields terminated by ","
trailing nullcols
(
  col_a filler
, col_b "Trim(:col_b)"
, col_c "To_Date(:col_c,'yyyy/mm/dd hh24:mi:ss')"
, seqno sequence(Max,1)
)
Then call the sqlldr command via Runtime.exec or ProcessBuilder.start():
public void startUp() {
    StringBuffer sb = new StringBuffer();
    // user/password@sid are placeholders for your real credentials and TNS alias
    String path = "sqlldr user/password@sid readsize=10485760 bindsize=10485760 "
            + "rows=1000 control=controlFileName.ctl log=controlFileName.log direct=true";
    try {
        Process pro = Runtime.getRuntime().exec(path);
        // Read sqlldr's console output so the process buffer cannot block
        BufferedReader br = new BufferedReader(new InputStreamReader(pro.getInputStream()), 4096);
        String line = null;
        int i = 0;
        while ((line = br.readLine()) != null) {
            if (0 != i)
                sb.append("\r\n");
            i++;
            sb.append(line);
        }
    } catch (Exception e) {
        sb.append(e.getMessage());
    }
}
Try creating an external table. You can create an external table over your CSV file using the ORACLE_LOADER driver and then update your existing table with the data in the external table using DML (MERGE, for example); see the sketch below.
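A rough sketch of that route (the directory object, column names, and types are assumptions; the external table DDL only needs to run once):
// Hypothetical sketch: stage the CSV through an external table, then
// MERGE it into Student. Assumes CREATE DIRECTORY csv_dir AS 'C:\CSVData'
// was run by a DBA and that Student has columns (id, name).
try (Statement stmt = conn.createStatement()) {
    stmt.execute("CREATE TABLE student_ext (id NUMBER, name VARCHAR2(100)) "
            + "ORGANIZATION EXTERNAL ("
            + "  TYPE ORACLE_LOADER DEFAULT DIRECTORY csv_dir"
            + "  ACCESS PARAMETERS (RECORDS DELIMITED BY NEWLINE"
            + "                     FIELDS TERMINATED BY ',')"
            + "  LOCATION ('Student.csv'))");
    // Upsert the staged rows into the real table.
    stmt.executeUpdate("MERGE INTO Student s USING student_ext e ON (s.id = e.id) "
            + "WHEN MATCHED THEN UPDATE SET s.name = e.name "
            + "WHEN NOT MATCHED THEN INSERT (id, name) VALUES (e.id, e.name)");
}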
I think the query below should work:
query = "LOAD DATA INFILE '" + filename + "' APPEND INTO TABLE Student FIELDS terminated by ',' ;";
For more info:
http://docs.oracle.com/cd/E11882_01/server.112/e16536/ldr_control_file.htm#SUTIL005

MySQL, Most efficient Way to Load Data from a parsed file

My File has the following format:
Table1; Info
rec_x11;rec_x21;rec_x31;rec_x41
rec_x12;rec_x22;rec_x32;rec_x42
...
\n
Table2; Info
rec_x11;rec_x21;rec_x31;rec_x41
rec_x12;rec_x22;rec_x32;rec_x42
...
\n
Table3; Info
rec_x11;rec_x21;rec_x31;rec_x41
rec_x12;rec_x22;rec_x32;rec_x42
...
Each batch of records, starting on the line after the TableX header and ending at the empty-line delimiter, is about 700-800 lines long.
Each such batch of lines (rec_xyz...) needs to be imported into the relevant MyISAM table named in the header of the batch (TableX).
I am familiar with the option to pipeline the stream into the LOAD DATA command using shell commands.
I am interested in a simple Java snippet that will parse this file and execute LOAD DATA for a single batch of records each time (in a for loop, maybe using a seek command).
For now I am trying to use IGNORE LINES to jump over processed records, but is there an option to ignore lines from BELOW?
Is there a more efficient way to parse and load this type of file into the DB?
EDIT
I have read that JDBC supports an input stream to LOAD DATA starting from 5.1.3; can I use it to iterate over the file with an input stream and change the LOAD DATA statement each time?
I am attaching my code as a solution.
This solution is based on the additional functionality (setLocalInfileInputStream) added in MySQL Connector/J 5.1.3 and later.
I am pipelining an input stream into the LOAD DATA INFILE statement instead of using a direct file URL.
Additional info: I am using BoneCP as a connection pool.
public final void readFile(final String path)
        throws IOException, SQLException, InterruptedException {
    File file = new File(path);
    final Connection connection = getSqlDataSource().getConnection();
    Statement statement = SqlDataSource.getInternalStatement(connection.createStatement());
    try {
        Scanner fileScanner = new Scanner(file);
        // An empty line separates the batches.
        fileScanner.useDelimiter(Pattern.compile("^$", Pattern.MULTILINE));
        while (fileScanner.hasNext()) {
            // Skip blank lines until the "TableX; Info" header line.
            String line;
            while ((line = fileScanner.nextLine()).isEmpty());
            // Stream the batch body into LOAD DATA instead of a file URL.
            InputStream is = new ByteArrayInputStream(fileScanner.next().getBytes("UTF-8"));
            String[] tableName = line.split(getSeparator());
            setTable((tableName[0] + "_" + tableName[1]).replace('-', '_'));
            String sql = "LOAD DATA LOCAL INFILE '" + SingleCsvImportBean.getOsDependantFileName(file) + "' "
                    + "INTO TABLE " + SqlUtils.escape(getTable())
                    + " FIELDS TERMINATED BY '" + getSeparator()
                    + "' ESCAPED BY '' LINES TERMINATED BY '" + getLinefeed() + "' ";
            sql += "(" + implodeStringArray(getFields(), ", ") + ")";
            sql += getSetClause();
            ((com.mysql.jdbc.Statement) statement).setLocalInfileInputStream(is);
            statement.execute(sql);
        }
    } finally {
        statement.close();
        connection.close();
    }
}
