Error while importing data to Postgres with Java

Well, my problem is that I need to import data from a CSV file into a Postgres table through Java. When I import the data with pgAdmin 4 it is imported without errors; however, when I execute this SQL statement from Java:
COPY account FROM 'path/account.csv' DELIMITER ',' QUOTE '"' HEADER CSV
an exception occurs stating this:
was aborted: ERROR: date/time field value out of range: "24/3/1995"
Hint: Perhaps you need a different "datestyle" setting.
I have checked the datestyle and it is DMY.
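For reference, a minimal JDBC sketch (connection URL, credentials, and file path are placeholders) that sets the session datestyle to DMY before issuing the server-side COPY; a pgAdmin session may carry a DMY datestyle while a fresh JDBC session uses the database default:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class CopyAccountCsv {
    // Builds the COPY statement for a given server-side CSV path.
    static String copyStatement(String csvPath) {
        return "COPY account FROM '" + csvPath
                + "' DELIMITER ',' QUOTE '\"' HEADER CSV";
    }

    public static void main(String[] args) throws Exception {
        // Placeholder URL/credentials -- adjust for your environment.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:postgresql://localhost:5432/mydb", "user", "secret");
             Statement st = conn.createStatement()) {
            // Make this session parse "24/3/1995" as day/month/year.
            st.execute("SET datestyle TO 'ISO, DMY'");
            st.execute(copyStatement("/path/account.csv"));
        }
    }
}
```

Note that COPY ... FROM 'file' reads the file on the database server; for a client-side file, pgjdbc's CopyManager API is the usual alternative.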

Related

Getting error on server side while executing query

We have a Java application with an ETL process. When the ETL process executes, a zip file containing many CSV files is unzipped, and those CSV files are used to load data into external tables. While loading data into an external table we get the error below. The external table is created as:
CREATE TABLE "EXTERNAL_TABLE"
( "column1" VARCHAR2(10 BYTE),
"column2" VARCHAR2(40 BYTE),
"column3" VARCHAR2(64 BYTE)
)
ORGANIZATION EXTERNAL
( TYPE ORACLE_LOADER
DEFAULT DIRECTORY "EXT_DIRECTORY"
ACCESS PARAMETERS
( RECORDS DELIMITED BY '\n'
BADFILE EXT_DIRECTORY:'test.bad'
DISCARDFILE EXT_DIRECTORY:'test.dsc'
SKIP 1
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"' AND '"'
LRTRIM
MISSING FIELD VALUES ARE NULL (
column1 CHAR(4000), column2 CHAR(4000), column3 CHAR(4000) )
)
LOCATION
( 'test.csv'
)
)
REJECT LIMIT UNLIMITED ;
Error:
Caused by: java.sql.BatchUpdateException: error occurred during batching: ORA-29913: error in executing ODCIEXTTABLEOPEN callout
ORA-29400: data cartridge error
KUP-04040: file test.csv in TEST_TBL not found
at oracle.jdbc.driver.OracleStatement.executeBatch(OracleStatement.java:4615)
at oracle.jdbc.driver.OracleStatementWrapper.executeBatch(OracleStatementWrapper.java:230)
This exception is raised when the database attempts to access an external table and the file which is called for in the external table definition doesn't exist or is inaccessible. In this case the file it's looking for is test.csv, which is supposed to exist in a directory whose name in the database is TEST_TBL. You might try writing a test procedure to see if you can open and read this file using the UTL_FILE package.
Due to lack of details it's difficult to say if any of the other answers on this site which refer to similar problems apply in your case, but you might look at the following:
ORA-29913: error in executing ODCIEXTTABLEOPEN callout
sqlplus error on select from external table: ORA-29913: error in executing ODCIEXTTABLEOPEN callout
Error in executing ODCIEXTTABLEOPEN callout
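Complementing the UTL_FILE suggestion above, a quick Java-side sanity check can rule out the simplest cause: the CSV never landing where the directory object points. This sketch assumes the application host can see the same OS path the Oracle directory object maps to (the path below is a placeholder):

```java
import java.nio.file.Files;
import java.nio.file.Path;

public class ExternalTableFileCheck {
    // True if the file the external table definition references
    // exists and is readable from this JVM's point of view.
    static boolean csvPresent(String directory, String fileName) {
        Path p = Path.of(directory, fileName);
        return Files.isReadable(p);
    }

    public static void main(String[] args) {
        // Placeholder: the OS path the directory object maps to.
        if (!csvPresent("/data/ext_dir", "test.csv")) {
            System.err.println("test.csv missing -- selecting from the "
                    + "external table will fail with KUP-04040");
        }
    }
}
```

Also note the error message names the directory TEST_TBL while the DDL uses EXT_DIRECTORY; that mismatch is itself worth checking (ALL_DIRECTORIES shows the OS path behind each directory object).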

Migrate h2 database to a postgresql docker in java spring

As titled, I am trying to migrate my h2 database to a postgres docker.
I made a dump of my h2 database using the command
SCRIPT TO 'fileName'
I then copied the file to my docker through the docker cp command. After that, I created a new database, and I tried launching
psql realdb < myfile.sql
which gave me an enormous series of errors, including:
ERROR: syntax error at or near "."
LINE 1: ...TER TABLE PUBLIC.FILE_RECORD ADD CONSTRAINT PUBLIC.CONSTRAIN...
^
ERROR: syntax error at or near "CACHED"
LINE 1: CREATE CACHED TABLE PUBLIC.NODE_EVENT(
^
ERROR: syntax error at or near "."
LINE 1: ...LTER TABLE PUBLIC.NODE_EVENT ADD CONSTRAINT PUBLIC.CONSTRAIN...
^
ERROR: relation "public.node_event" does not exist
LINE 1: INSERT INTO PUBLIC.NODE_EVENT(ID, DAY, HOUR, MINUTE, MONTH, ...
^
NOTICE: table "system_lob_stream" does not exist, skipping
DROP TABLE
ERROR: syntax error at or near "CALL"
LINE 1: CALL SYSTEM_COMBINE_BLOB(-1);
So I decided I would try dumping the database as CSV files instead, which is perhaps a more standard approach, using the command
call CSVWRITE( '/home/lee/test.csv' , 'SELECT * FROM PORT' )
which seems to work for just one table at a time. Is there a way to export all tables at once? How would I import them into the postgres docker?
Is there a better way to do all of this?
This is my application.config
spring.datasource.url=jdbc:h2:/tmp/ossdb:testdb;MODE=PostgreSQL
spring.datasource.username=admin
spring.datasource.password=admin
spring.datasource.platform=postgresql
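To answer the "all tables at once" part: H2's CSVWRITE takes a single query, but you can drive it from INFORMATION_SCHEMA. A rough sketch (H2 driver on the classpath; the JDBC URL, credentials, and output directory are taken from the question or are placeholders), dumping each PUBLIC table to its own CSV that psql's \copy can then load:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;
import java.util.ArrayList;
import java.util.List;

public class ExportAllH2Tables {
    // Builds the CSVWRITE call for one table.
    static String csvWriteCall(String outDir, String table) {
        return "CALL CSVWRITE('" + outDir + "/" + table + ".csv', "
                + "'SELECT * FROM \"" + table + "\"')";
    }

    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection(
                "jdbc:h2:/tmp/ossdb:testdb", "admin", "admin");
             Statement st = conn.createStatement()) {
            // Collect the table names first: issuing another statement
            // on the same Statement would close the open ResultSet.
            List<String> tables = new ArrayList<>();
            try (ResultSet rs = st.executeQuery(
                    "SELECT TABLE_NAME FROM INFORMATION_SCHEMA.TABLES "
                            + "WHERE TABLE_SCHEMA = 'PUBLIC'")) {
                while (rs.next()) tables.add(rs.getString(1));
            }
            for (String t : tables) {
                st.execute(csvWriteCall("/tmp/export", t));
            }
        }
    }
}
```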

Invalid utf8 character string when importing csv file into MySQL database

I use the following command to import data from a .csv file into a MySQL database table like so:
String loadQuery = "LOAD DATA LOCAL INFILE '" + file + "' INTO TABLE source_data_android_cell FIELDS TERMINATED BY ','" + " ENCLOSED BY '\"'"
+ " LINES TERMINATED BY '\n' " + "IGNORE 1 LINES(.....)" + " SET test_date = STR_TO_DATE(@var1, '%d/%m/%Y %k:%i')";
However, one of the columns in the source file contains some really screwy data, namely viva Y31L.RastaMod䋢_Version, and the program refuses to import the data into MySQL, throwing this error:
java.sql.SQLException: Invalid utf8 character string: 'viva
Y31L.RastaMod'
I read up on this but can't really understand what exactly the error is, other than that the input format of the string "viva Y31L.RastaMod䋢_Version" was wrong and didn't fit the utf8 format used in the MySQL database.
However, I have already run SET NAMES UTF8MB4 in my MySQL DB, since other questions suggested that UTF8MB4 is more flexible in accepting unusual characters.
I explored this further by manually inserting the weird data into the MySQL table from the command prompt, which worked fine. In fact, the table displayed almost the full entry: viva Y31L.RastaMod?ã¢_Version. But if I run my program from the IDE, the file gets rejected.
Would appreciate any explanations.
A second, minor question related to the CSV import process into MySQL:
I noticed that I couldn't import a copy of the same file into the MySQL database; the errors thrown said the data was a duplicate. Is that because MySQL rejects duplicate column data? When I changed all the data in one column of the copied file, leaving the rest the same, it was imported correctly. Why is that?
I don't think this immediate error has to do with the destination of the data not being able to cope with UTF-8 characters, but rather the way you are using LOAD DATA. You can try specifying the character set which should be used when loading the data. Consider the following LOAD DATA command, which is what you had originally but slightly modified:
LOAD DATA LOCAL INFILE path/to/file INTO TABLE source_data_android_cell
CHARACTER SET utf8
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES(.....)
SET test_date = STR_TO_DATE(@var1, '%d/%m/%Y %k:%i')
This being said, you should also make sure that the target table uses a character set which supports the data you are trying to load into it.
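As a sketch of how the corrected statement might be assembled from Java (the "(.....)" column list is left elided as in the question; utf8mb4 is used here because the asker's server is already configured for it, but swap in whichever character set matches the file's actual encoding):

```java
public class LoadDataStatement {
    // Assembles the LOAD DATA statement with an explicit CHARACTER SET
    // clause; "(.....)" stands in for the question's elided column list.
    static String build(String file, String charset) {
        return "LOAD DATA LOCAL INFILE '" + file + "'"
                + " INTO TABLE source_data_android_cell"
                + " CHARACTER SET " + charset
                + " FIELDS TERMINATED BY ',' ENCLOSED BY '\"'"
                + " LINES TERMINATED BY '\n'"
                + " IGNORE 1 LINES(.....)"
                + " SET test_date = STR_TO_DATE(@var1, '%d/%m/%Y %k:%i')";
    }
}
```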

Insert BLOB in Derby Database using SQL

Currently I'm trying to import an SQL-INSERT dump from a postgres database into my Derby development/testing database using the Eclipse's Data Tools SQL scratchpad. The export created a lot of data that looked like the following:
CREATE TABLE mytable ( testfield BLOB );
INSERT INTO mytable ( testfield ) VALUES ('\x0123456789ABCDEF');
Executing it in Eclipse's SQL Scratchpad results in (translated from German):
Columns of type 'BLOB' shall not contain values of type 'CHAR'.
The problem seems, that the PostgreSQL admin tool exported BLOB data in a format like '\x0123456789ABCDEF' which is not recognized by Derby (Embedded).
Changing this to X'0123456789ABCDEF' or simply '0123456789ABCDEF' did not work either.
The only thing that worked was CAST (X'0123456789ABCDEF' AS BLOB), but I'm not yet sure whether this results in the correct binary data when read back in Java, and whether the X'0123456789ABCDEF' notation is 100% portable.
CAST (...whatever... AS BLOB) doesn't work in Java DB / Apache Derby!
One must use the built-in system procedure
SYSCS_UTIL.SYSCS_IMPORT_DATA_LOBS_FROM_EXTFILE. I do not think there is any other way. For instance:
CALL SYSCS_UTIL.SYSCS_IMPORT_DATA_LOBS_FROM_EXTFILE (
'MYSCHEMA', 'MYTABLE',
'MY_KEY, MY_VARCHAR, MY_INT, MY_BLOB_DATA',
'1,3,4,2',
'c:\tmp\import.txt', ',' , '"' , 'UTF-8',
0);
where the referenced "import.txt" file will be CSV-like (as specified by the ',' and '"' arguments above) and will contain as its 2nd field (I scrambled the CSV field order versus the DB column order on purpose, to illustrate) a file name that points to the binary data for the BLOBs. For instance, "import.txt" looks like:
"A001","c:\tmp\blobimport.dat.0.55/","TEST",123
where the supplied BLOB data file name bears the convention "filepath.offset.length/"
Actually, you can first export your table with
CALL SYSCS_UTIL.SYSCS_EXPORT_TABLE_LOBS_TO_EXTFILE(
'MYSCHEMA', 'MYTABLE', 'c:\tmp\export.txt', ',' ,'"',
'UTF-8', 'c:\tmp\blobexport.dat');
to generate sample files with the syntax to reuse on import.
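From Java, the import procedure is just a CallableStatement; the fiddly part is the "filepath.offset.length/" reference convention for the external LOB file, sketched here (schema, table, and paths reuse the answer's examples; the Derby URL is a placeholder):

```java
import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.DriverManager;

public class DerbyLobImport {
    // Formats an external LOB reference the way
    // SYSCS_IMPORT_DATA_LOBS_FROM_EXTFILE expects: filepath.offset.length/
    static String lobRef(String path, long offset, long length) {
        return path + "." + offset + "." + length + "/";
    }

    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection(
                "jdbc:derby:mydb");  // placeholder embedded-Derby URL
             CallableStatement cs = conn.prepareCall(
                "CALL SYSCS_UTIL.SYSCS_IMPORT_DATA_LOBS_FROM_EXTFILE"
                        + " (?, ?, ?, ?, ?, ?, ?, ?, ?)")) {
            cs.setString(1, "MYSCHEMA");
            cs.setString(2, "MYTABLE");
            cs.setString(3, "MY_KEY, MY_VARCHAR, MY_INT, MY_BLOB_DATA");
            cs.setString(4, "1,3,4,2");    // CSV field -> column mapping
            cs.setString(5, "c:\\tmp\\import.txt");
            cs.setString(6, ",");
            cs.setString(7, "\"");
            cs.setString(8, "UTF-8");
            cs.setShort(9, (short) 0);     // 0 = append, non-zero = replace
            cs.execute();
        }
    }
}
```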

Create Stored Procedure from File

I am trying to execute a whole directory of .SQL files in Java.
I'm struggling to execute a stored procedure. I have found this so far (the most helpful), which sadly includes a dead link. I have also downloaded Liquibase, but I cannot figure out how to use it for this purpose.
In my current code, I split the files including procedures into different statements:
(Statements are split into a Vector<String> and executed in a for loop.)
Example:
//File f;
//Statement st;
Vector<String> vProcedure = getProcedureStatements(f, Charset.defaultCharset(), "//");
for (Iterator<String> itr = vProcedure.iterator(); itr.hasNext();)
st.execute(itr.next());
System.out.println(f.getName() + " - done executing.");
The Vector contains the four elements (see SQL-Code #SPLIT x).
DROP PROCEDURE IF EXISTS `Add_Position`; #SPLIT 1
DELIMITER // #SPLIT 2
CREATE PROCEDURE `Add_Position`
(
IN iO_ID INT,
IN iCID INT,
IN iWID INT,
IN iA INT
)
BEGIN
#statements;
END
// #SPLIT 3
DELIMITER ; #SPLIT 4
Result when trying to execute #SPLIT 2:
com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near 'DELIMITER //' at line 1
Q: Could anyone tell me if there's an external library I could use, or how Liquibase works? I can't get it to work the plain-JDBC way.
The statement DELIMITER // is not actually SQL; it is a command interpreted by the mysql command-line tool (and possibly also their GUI tool), but if you pass it straight to the execution engine, you will get this syntax error.
Unfortunately, Liquibase uses ; as the default statement separator, so it is difficult to get this sort of thing to work using SQL files and Liquibase. I have put in a pull request to allow setting the delimiter from the Liquibase command line - see https://github.com/liquibase/liquibase/pull/361.
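If you stay with plain JDBC, one workaround is to interpret DELIMITER yourself before sending anything to the server. A rough splitter sketch (not a full SQL parser: it will misfire if the active delimiter sequence appears inside string literals or comments):

```java
import java.util.ArrayList;
import java.util.List;

public class DelimiterAwareSplitter {
    // Splits a MySQL script while honoring client-side DELIMITER
    // commands: DELIMITER lines are consumed (never sent to the server)
    // and the currently active delimiter is used to cut statements.
    static List<String> split(String script) {
        List<String> out = new ArrayList<>();
        String delim = ";";
        StringBuilder cur = new StringBuilder();
        for (String line : script.split("\n")) {
            String trimmed = line.trim();
            if (trimmed.toUpperCase().startsWith("DELIMITER ")) {
                flush(out, cur);                     // close any open statement
                delim = trimmed.substring(10).trim(); // switch delimiter
                continue;
            }
            cur.append(line).append('\n');
            int idx = cur.lastIndexOf(delim);
            // Statement is complete when the buffer ends with the delimiter.
            if (idx >= 0 && cur.substring(idx).trim().equals(delim)) {
                cur.setLength(idx);
                flush(out, cur);
            }
        }
        flush(out, cur);
        return out;
    }

    private static void flush(List<String> out, StringBuilder cur) {
        String s = cur.toString().trim();
        if (!s.isEmpty()) out.add(s);
        cur.setLength(0);
    }
}
```

For the example file above this yields two executable statements, the DROP PROCEDURE and the whole CREATE PROCEDURE body, with both DELIMITER lines consumed.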
