We have a Java application with an ETL process. When we execute the ETL process, a zip file containing many CSV files is unzipped, and these CSV files are used to load data into external tables. While loading data into an external table defined as follows, we get the error shown below.
CREATE TABLE "EXTERNAL_TABLE"
( "column1" VARCHAR2(10 BYTE),
"column2" VARCHAR2(40 BYTE),
"column3" VARCHAR2(64 BYTE)
)
ORGANIZATION EXTERNAL
( TYPE ORACLE_LOADER
DEFAULT DIRECTORY "EXT_DIRECTORY"
ACCESS PARAMETERS
( RECORDS DELIMITED BY '\n'
BADFILE EXT_DIRECTORY:'test.bad'
DISCARDFILE EXT_DIRECTORY:'test.dsc'
SKIP 1
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"' AND '"'
LRTRIM
MISSING FIELD VALUES ARE NULL (
column1 CHAR(4000), column2 CHAR(4000), column3 CHAR(4000) )
)
LOCATION
( 'test.csv'
)
)
REJECT LIMIT UNLIMITED ;
Error:
Caused by: java.sql.BatchUpdateException: error occurred during batching: ORA-29913: error in executing ODCIEXTTABLEOPEN callout
ORA-29400: data cartridge error
KUP-04040: file test.csv in TEST_TBL not found
at oracle.jdbc.driver.OracleStatement.executeBatch(OracleStatement.java:4615)
at oracle.jdbc.driver.OracleStatementWrapper.executeBatch(OracleStatementWrapper.java:230)
This exception is raised when the database attempts to access an external table and the file named in the external table definition doesn't exist or is inaccessible. In this case the file it's looking for is test.csv, which is supposed to exist in a directory whose name in the database is TEST_TBL (note that the error reports TEST_TBL even though the DDL above says EXT_DIRECTORY, so check which directory object the table actually references). You might try writing a test procedure to see whether you can open and read this file using the UTL_FILE package.
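A minimal sketch of such a check from Java, using the directory object and file name reported in the error; the connection URL and credentials are placeholders (needs java.sql.Connection, DriverManager, CallableStatement):

// Open and read one line of test.csv through the TEST_TBL directory object.
// An ORA-29283 here means the database itself cannot read the file.
String plsql =
    "DECLARE f UTL_FILE.FILE_TYPE; line VARCHAR2(32767); " +
    "BEGIN " +
    "  f := UTL_FILE.FOPEN('TEST_TBL', 'test.csv', 'r'); " +
    "  UTL_FILE.GET_LINE(f, line); " +
    "  UTL_FILE.FCLOSE(f); " +
    "END;";
try (Connection con = DriverManager.getConnection(
         "jdbc:oracle:thin:@//dbhost:1521/ORCL", "user", "pass"); // placeholders
     CallableStatement cs = con.prepareCall(plsql)) {
    cs.execute();
    System.out.println("test.csv opened and read successfully");
}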
Given the lack of detail it's difficult to say whether any of the other answers on this site that refer to similar problems apply in your case, but you might look at the following:
ORA-29913: error in executing ODCIEXTTABLEOPEN callout
sqlplus error on select from external table: ORA-29913: error in executing ODCIEXTTABLEOPEN callout
Error in executing ODCIEXTTABLEOPEN callout
Related
My problem is that I need to import data into a Postgres table from a CSV file through Java. When I import the data with pgAdmin 4 it is imported without errors; however, when I execute this SQL statement from Java:
COPY account FROM 'path/account.csv' DELIMITER ',' QUOTE '"' HEADER CSV
an exception occurs stating:
was aborted: ERROR: date/time field value out of range: "24/3/1995"
Hint: Perhaps you need a different "datestyle" setting.
I have checked the dates and they are in DMY order.
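The hint in the error points at a session setting; a minimal sketch, assuming the server's DateStyle is what rejects "24/3/1995" (connection details are placeholders; needs java.sql.Connection, DriverManager, Statement):

// Tell the session to parse ambiguous dates as day/month/year before COPY runs.
try (Connection con = DriverManager.getConnection(
         "jdbc:postgresql://localhost:5432/mydb", "user", "pass"); // placeholders
     Statement st = con.createStatement()) {
    st.execute("SET datestyle = 'ISO, DMY'");
    st.execute("COPY account FROM 'path/account.csv' DELIMITER ',' QUOTE '\"' HEADER CSV");
}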
I use the following command to import data from a .csv file into a MySQL database table:
String loadQuery = "LOAD DATA LOCAL INFILE '" + file + "' INTO TABLE source_data_android_cell FIELDS TERMINATED BY ',' " + "ENCLOSED BY '\"'"
+ " LINES TERMINATED BY '\n' " + "IGNORE 1 LINES(.....) " + "SET test_date = STR_TO_DATE(@var1, '%d/%m/%Y %k:%i')";
However, one of the columns in the source file contains some really screwy data, namely viva Y31L.RastaMod䋢_Version, and the program refuses to import the data into MySQL, throwing this error:
java.sql.SQLException: Invalid utf8 character string: 'viva
Y31L.RastaMod'
I read up on this but can't really understand what exactly the error means, other than that the input string "viva Y31L.RastaMod䋢_Version" didn't fit the utf8 character set used in the MySQL database.
However, I already ran SET NAMES UTF8MB4 in my MySQL database, since other questions suggested that utf8mb4 is more flexible in accepting unusual characters.
I explored this further by manually inserting the weird data into the MySQL table from the command prompt, which worked fine. In fact, the table displayed almost the full entry: viva Y31L.RastaMod?ã¢_Version. But when I run my program from the IDE, the file gets rejected.
Would appreciate any explanations.
A second, minor question about importing CSV files into MySQL:
I noticed that I couldn't import a copy of the same file into the MySQL database; the errors said the data was a duplicate. Is that because MySQL rejects duplicate column data? Yet when I changed all the data in one column of the copied file, leaving the rest the same, it was imported correctly. Why is that?
I don't think this immediate error has to do with the destination of the data not being able to cope with UTF-8 characters, but rather the way you are using LOAD DATA. You can try specifying the character set which should be used when loading the data. Consider the following LOAD DATA command, which is what you had originally but slightly modified:
LOAD DATA LOCAL INFILE path/to/file INTO TABLE source_data_android_cell
CHARACTER SET utf8
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES(.....)
SET test_date = STR_TO_DATE(@var1, '%d/%m/%Y %k:%i')
This being said, you should also make sure that the target table uses a character set which supports the data you are trying to load into it.
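Translated back into the Java string from the question (the column list elided as "(.....)" stays elided; utf8mb4 here is an assumption, chosen to match the SET NAMES UTF8MB4 mentioned above):

String loadQuery =
    "LOAD DATA LOCAL INFILE '" + file + "' INTO TABLE source_data_android_cell "
    + "CHARACTER SET utf8mb4 "   // assumption: match the table/connection charset
    + "FIELDS TERMINATED BY ',' ENCLOSED BY '\"' "
    + "LINES TERMINATED BY '\\n' "
    + "IGNORE 1 LINES(.....) "
    + "SET test_date = STR_TO_DATE(@var1, '%d/%m/%Y %k:%i')";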
Currently I'm trying to import an SQL INSERT dump from a Postgres database into my Derby development/testing database using Eclipse's Data Tools SQL Scratchpad. The export produced a lot of data that looks like the following:
CREATE TABLE mytable ( testfield BLOB );
INSERT INTO mytable ( testfield ) VALUES ('\x0123456789ABCDEF');
Executing it in Eclipse's SQL Scratchpad results in (translated from German):
Columns of type 'BLOB' shall not contain values of type 'CHAR'.
The problem seems to be that the PostgreSQL admin tool exported BLOB data in a format like '\x0123456789ABCDEF', which is not recognized by Derby (Embedded).
Changing this to X'0123456789ABCDEF' or simply '0123456789ABCDEF' did not work either.
The only thing that worked was CAST (X'0123456789ABCDEF' AS BLOB), but I'm not yet sure whether this yields the correct binary data when read back in Java, and whether X'0123456789ABCDEF' is 100% portable.
CAST (...whatever... AS BLOB) doesn't work in Java DB / Apache Derby!
One must use the built-in system procedure SYSCS_UTIL.SYSCS_IMPORT_DATA_LOBS_FROM_EXTFILE; I do not think there is any other way. For instance:
CALL SYSCS_UTIL.SYSCS_IMPORT_DATA_LOBS_FROM_EXTFILE (
'MYSCHEMA', 'MYTABLE',
'MY_KEY, MY_VARCHAR, MY_INT, MY_BLOB_DATA',
'1,3,4,2',
'c:\tmp\import.txt', ',' , '"' , 'UTF-8',
0);
where the referenced "import.txt" file will be CSV-like (as specified by the ',' and '"' arguments above) and will contain as its 2nd field (I deliberately scrambled the CSV field order relative to the DB column order to illustrate) a file reference that locates the binary data for the BLOBs. For instance, "import.txt" looks like:
"A001","c:\tmp\blobimport.dat.0.55/","TEST",123
where the supplied BLOB data file name follows the convention "filepath.offset.length/".
Actually, you can first export your table with
CALL SYSCS_UTIL.SYSCS_EXPORT_TABLE_LOBS_TO_EXTFILE(
'MYSCHEMA', 'MYTABLE', 'c:\tmp\export.txt', ',' ,'"',
'UTF-8', 'c:\tmp\blobexport.dat');
to generate sample files with the syntax to reuse on import.
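If it helps, both procedures can also be invoked from Java over JDBC; a sketch of the export call with the same arguments as above (the database name in the URL is a placeholder; needs java.sql.Connection, DriverManager, CallableStatement):

try (Connection con = DriverManager.getConnection("jdbc:derby:mydb"); // placeholder DB
     CallableStatement cs = con.prepareCall(
         "CALL SYSCS_UTIL.SYSCS_EXPORT_TABLE_LOBS_TO_EXTFILE(?, ?, ?, ?, ?, ?, ?)")) {
    cs.setString(1, "MYSCHEMA");                // schema
    cs.setString(2, "MYTABLE");                 // table
    cs.setString(3, "c:\\tmp\\export.txt");     // main export file
    cs.setString(4, ",");                       // column delimiter
    cs.setString(5, "\"");                      // character delimiter
    cs.setString(6, "UTF-8");                   // code set
    cs.setString(7, "c:\\tmp\\blobexport.dat"); // external LOB data file
    cs.execute();
}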
My issue is related to a Hive UDF.
I have created a UDF which converts a string date to a Julian date. It works fine when I execute a SELECT query, but it throws an error when used in a CREATE TABLE ... AS statement.
CREATE FUNCTION convertToJulian AS 'com.convertToJulian'
USING JAR 'hdfs:/user/hive/udf.jar';
Executing the SELECT query:
SELECT name, date FROM custTable
WHERE name IS NOT NULL AND convertToJulian(date) < convertToJulian(to_date(from_unixtime(unix_timestamp())));
Output:
converting to local hdfs:/user/hive/udf.jar
Added [/usr/local/hivetmp/amit.pathak/9381feb3-6c5f-469b-b6b1-
9af55abbdabd/udf.jar] to class path
Added resources: [hdfs:/user/hive/udf.jar]
It works fine and gives me exactly the data I need.
Now, as a second step, I want to put this data into a new table, so I ran:
CREATE TABLE trop AS
SELECT name, date FROM custTable
WHERE name IS NOT NULL AND convertToJulian(date) < convertToJulian(to_date(from_unixtime(unix_timestamp())));
Output:
java.io.FileNotFoundException: File does not exist: hdfs://localhost:54310/usr/local/hivetmp/amit.pathak/9381feb3-6c5f-469b-b6b1-9af55abbdabd/udf.jar
at org.apache.hadoop.hdfs.DistributedFileSystem$18.doCall(DistributedFileSystem.java:1122)
at org.apache.hadoop.hdfs.DistributedFileSystem$18.doCall(DistributedFileSystem.java:1114)
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1114)
at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.getFileStatus(ClientDistributedCacheManager.java:288)
I am not able to find out why it is fetching the jar from the HDFS location
hdfs://localhost:54310/usr/local/hivetmp/amit.pathak/9381feb3-6c5f-469b-b6b1-9af55abbdabd/udf.jar
I also tried several things, such as adding the jar manually in HDFS, but Hive generates a random session ID and creates a folder named after that session ID.
I was able to fix the above issue. There are two ways to fix it; I have added my findings to my blog, which you can refer to here: Hive udf exception Fix
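One hedged guess at the general shape of a fix (not necessarily what the blog describes): register the function with a fully qualified HDFS URI, scheme and port included, so the jar path cannot be resolved against the local session directory. From Java over Hive JDBC that would look roughly like (needs java.sql.Connection, DriverManager, Statement):

try (Connection con = DriverManager.getConnection(
         "jdbc:hive2://localhost:10000/default", "hive", ""); // placeholder HiveServer2 URL
     Statement st = con.createStatement()) {
    st.execute("DROP FUNCTION IF EXISTS convertToJulian");
    st.execute("CREATE FUNCTION convertToJulian AS 'com.convertToJulian' "
             + "USING JAR 'hdfs://localhost:54310/user/hive/udf.jar'"); // full URI, assumption
}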
I am using Spring's JdbcDaoSupport class with a DriverManagerDataSource using the MySQL Connector/J 5.0 driver (driverClassName=com.mysql.jdbc.Driver). allowMultiQueries is set to true in the URL.
My application is an in-house tool we recently developed that executes SQL scripts in a directory one by one (it allows us to re-create our schema and reference table data for a given date, etc., but I digress). The SQL scripts sometimes contain multiple statements (hence allowMultiQueries), so one script can create a table, add indexes for that table, and so on.
The problem happens when including a statement to add a foreign key constraint in one of these files. If I have a file that looks like...
--(column/constraint names are examples)
CREATE TABLE myTable (
fk1 BIGINT(19) NOT NULL,
fk2 BIGINT(19) NOT NULL,
PRIMARY KEY (fk1, fk2)
);
ALTER TABLE myTable ADD CONSTRAINT myTable_fk1
FOREIGN KEY (fk1)
REFERENCES myOtherTable (id)
;
ALTER TABLE myTable ADD CONSTRAINT myTable_fk2
FOREIGN KEY (fk2)
REFERENCES myOtherOtherTable (id)
;
then JdbcTemplate.execute throws an UncategorizedSqlException with the following error message and stack trace:
Exception in thread "main" org.springframework.jdbc.UncategorizedSQLException: StatementCallback; uncategorized SQLException for SQL [ THE SQL YOU SEE ABOVE LISTED HERE ];
SQL state [HY000]; error code [1005]; Can't create table 'myDatabase.myTable' (errno: 150); nested exception is java.sql.SQLException: Can't create table 'myDatabase.myTable' (errno: 150)
at org.springframework.jdbc.support.AbstractFallbackSQLExceptionTranslator.translate(AbstractFallbackSQLExceptionTranslator.java:83)
at org.springframework.jdbc.support.AbstractFallbackSQLExceptionTranslator.translate(AbstractFallbackSQLExceptionTranslator.java:80)
at org.springframework.jdbc.support.AbstractFallbackSQLExceptionTranslator.translate(AbstractFallbackSQLExceptionTranslator.java:80)
and the table and foreign keys are not created.
Also, especially weird: if I take the foreign key statements out of the script I showed above and then place them in their own script that executes after (so I now have 1 script with just the create table statement, and 1 script with the add foreign key statements that executes after that) then what happens is:
tool executes create table script, works fine, table is created
tool executes add fk script, throws the same exception as seen above (except errno=121 this time), but the FKs actually get added (!!!)
In other words, when the create table/FK statements are in the same script then the exception is thrown and nothing is created, but when they are different scripts a nearly identical exception is thrown but both things get created.
Any help on this would be greatly appreciated. Please let me know if you'd like me to clarify anything more.
Some more info:
1) This only happens on my box. My coworker does not get the same problem.
2) The script that forces the tool to error works fine when executed from the mysql command line using the "script" command
My God.
http://bugs.mysql.com/bug.php?id=41635
and
[2nd link removed because spam filter isn't letting me add 2 links. Search Google for "mysql connector / j errno 150" and it's the 3rd result]
...
Looks like MySQL 5.1 has a bug in its JDBC connector where it bombs when an ALTER statement adding a FK is in a script with any other statement.
When I broke my 3 statements out into 3 scripts, it worked (what I tried before, with the 2 FK statements in their own script, still bombed because they were sharing a script!!). Also, my coworker is using MySQL 5.0, so it didn't affect him.
Holy Cow, that was a fun 5 hours.
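For anyone who hits this: a minimal workaround sketch, splitting each script and executing its statements one at a time instead of as one multi-statement batch. The naive split on ';' is an assumption and breaks on semicolons inside string literals or stored routines; getJdbcTemplate() is available inside a DAO extending JdbcDaoSupport:

// Run each statement separately so no batch mixes CREATE TABLE with
// ALTER TABLE ... ADD CONSTRAINT (the combination the bug chokes on).
for (String sql : script.split(";")) {
    if (!sql.trim().isEmpty()) {
        getJdbcTemplate().execute(sql.trim());
    }
}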