How do I catch specific exceptions in JDBC, for example a primary key violation or a foreign key violation?
The best and DB-independent way to handle an SQLException more specifically is to check the SQL state code, which can be obtained via SQLException#getSQLState(). The SQLState is a five-character code: the first two characters are common to all databases, while the last three may differ depending on the database and/or the specific condition. Here's an extract from the spec:
02: no data
07: dynamic SQL error
08: connection exception
0A: feature not supported
21: cardinality violation
22: data exception
23: integrity constraint violation
24: invalid cursor state
25: invalid transaction state
26: invalid SQL statement name
28: invalid authorization specification
2B: dependent privilege descriptors still exist
2C: invalid character set name
2D: invalid transaction termination
2E: invalid connection name
33: invalid SQL descriptor name
34: invalid cursor name
35: invalid condition number
3C: ambiguous cursor name
3D: invalid catalog name
3F: invalid schema name
So to determine whether the SQL Exception is caused by a constraint violation, you can just do the following in a (fictive) SQLUtil class:
public static boolean isConstraintViolation(SQLException e) {
    // SQLState class 23 = integrity constraint violation (getSQLState() may be null for some drivers)
    String state = e.getSQLState();
    return state != null && state.startsWith("23");
}
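Usage could then look something like this (a rough sketch only; the connection variable, the INSERT statement and the users table are made-up assumptions for illustration):

try (PreparedStatement ps = connection.prepareStatement(
        "INSERT INTO users (id, name) VALUES (?, ?)")) {
    ps.setLong(1, 1L);
    ps.setString(2, "john");
    ps.executeUpdate();
} catch (SQLException e) {
    if (SQLUtil.isConstraintViolation(e)) {
        // Duplicate primary key, foreign key violation, NOT NULL violation, ...
    } else {
        throw e;
    }
}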
SQLException contains some database-specific information related to the exception. From the Javadoc:

Each SQLException provides several kinds of information:

1) a string describing the error. This is used as the Java Exception message, available via the method getMessage.
2) a "SQLstate" string, which follows either the XOPEN SQLstate conventions or the SQL 99 conventions. The values of the SQLState string are described in the appropriate spec. The DatabaseMetaData method getSQLStateType can be used to discover whether the driver returns the XOPEN type or the SQL 99 type.
3) an integer error code that is specific to each vendor. Normally this will be the actual error code returned by the underlying database.
4) a chain to a next Exception. This can be used to provide additional error information.
Brian's right, an SQLException will be thrown for just about ANY JDBC problem. This is partially why JDBC is so annoying. The Spring JDBC helpers provide an exception translator that looks at the SQL error code, SQLState, etc., and throws the appropriate DataAccessException. There are many of these exception classes, and they give you a better idea of what went wrong, with names such as DataIntegrityViolationException, DataSourceLookupFailureException, PermissionDeniedDataAccessException, and others.
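As a rough sketch of what that translation looks like in code (the task description passed to the translator is just a placeholder):

import java.sql.SQLException;

import org.springframework.dao.DataAccessException;
import org.springframework.dao.DataIntegrityViolationException;
import org.springframework.jdbc.support.SQLStateSQLExceptionTranslator;

public class SpringTranslationExample {
    public static void handle(SQLException ex) {
        // Translate the raw SQLException into Spring's DataAccessException hierarchy,
        // here based on the SQLState (Spring also ships an error-code based translator).
        DataAccessException dae = new SQLStateSQLExceptionTranslator()
                .translate("inserting a row", null, ex);

        if (dae instanceof DataIntegrityViolationException) {
            // Constraint violation: primary key, foreign key, NOT NULL, ...
        }
    }
}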
Following up on BalusC's answer, here's a more complete and recent list of all the classes and subclasses as specified by the SQL:2011 standard. I recently assembled this list for the Javadoc of jOOQ's SQLStateSubclass.
+----+-----------------------------------------------------------+-----+--------------------------------------------------------------+
| Class | Class description | Subclass | Subclass description |
+----+-----------------------------------------------------------+-----+--------------------------------------------------------------+
| 00 | Successful completion | 000 | No subclass |
| 01 | Warning | 000 | No subclass |
| 01 | Warning | 001 | Cursor operation conflict |
| 01 | Warning | 002 | Disconnect error |
| 01 | Warning | 003 | Null value eliminated in set function |
| 01 | Warning | 004 | String data, right truncation |
| 01 | Warning | 005 | Insufficient item descriptor areas |
| 01 | Warning | 006 | Privilege not revoked |
| 01 | Warning | 007 | Privilege not granted |
| 01 | Warning | 009 | Search condition too long for information schema |
| 01 | Warning | 00A | Query expression too long for information schema |
| 01 | Warning | 00B | Default value too long for information schema |
| 01 | Warning | 00C | Result sets returned |
| 01 | Warning | 00D | Additional result sets returned |
| 01 | Warning | 00E | Attempt to return too many result sets |
| 01 | Warning | 00F | Statement too long for information schema |
| 01 | Warning | 012 | Invalid number of conditions |
| 01 | Warning | 02F | Array data, right truncation |
| 02 | No data | 000 | No subclass |
| 02 | No data | 001 | No additional result sets returned |
| 07 | Dynamic SQL Error | 000 | No subclass |
| 07 | Dynamic SQL Error | 001 | Using clause does not match dynamic parameter specifications |
| 07 | Dynamic SQL Error | 002 | Using clause does not match target specifications |
| 07 | Dynamic SQL Error | 003 | Cursor specification cannot be executed |
| 07 | Dynamic SQL Error | 004 | Using clause required for dynamic parameters |
| 07 | Dynamic SQL Error | 005 | Prepared statement not a cursor specification |
| 07 | Dynamic SQL Error | 006 | Restricted data type attribute violation |
| 07 | Dynamic SQL Error | 007 | Using clause required for result fields |
| 07 | Dynamic SQL Error | 008 | Invalid descriptor count |
| 07 | Dynamic SQL Error | 009 | Invalid descriptor index |
| 07 | Dynamic SQL Error | 00B | Data type transform function violation |
| 07 | Dynamic SQL Error | 00C | Undefined DATA value |
| 07 | Dynamic SQL Error | 00D | Invalid DATA target |
| 07 | Dynamic SQL Error | 00E | Invalid LEVEL value |
| 07 | Dynamic SQL Error | 00F | Invalid DATETIME_INTERVAL_CODE |
| 08 | Connection exception | 000 | No subclass |
| 08 | Connection exception | 001 | SQL-client unable to establish SQL-connection |
| 08 | Connection exception | 002 | Connection name in use |
| 08 | Connection exception | 003 | Connection does not exist |
| 08 | Connection exception | 004 | SQL-server rejected establishment of SQL-connection |
| 08 | Connection exception | 006 | Connection failure |
| 08 | Connection exception | 007 | Transaction resolution unknown |
| 09 | Triggered action exception | 000 | No subclass |
| 0A | Feature not supported | 000 | No subclass |
| 0A | Feature not supported | 001 | Multiple server transactions |
| 0D | Invalid target type specification | 000 | No subclass |
| 0E | Invalid schema name list specification | 000 | No subclass |
| 0F | Locator exception | 000 | No subclass |
| 0F | Locator exception | 001 | Invalid specification |
| 0L | Invalid grantor | 000 | No subclass |
| 0M | Invalid SQL-invoked procedure reference | 000 | No subclass |
| 0P | Invalid role specification | 000 | No subclass |
| 0S | Invalid transform group name specification | 000 | No subclass |
| 0T | Target table disagrees with cursor specification | 000 | No subclass |
| 0U | Attempt to assign to non-updatable column | 000 | No subclass |
| 0V | Attempt to assign to ordering column | 000 | No subclass |
| 0W | Prohibited statement encountered during trigger execution | 000 | No subclass |
| 0W | Prohibited statement encountered during trigger execution | 001 | Modify table modified by data change delta table |
| 0Z | Diagnostics exception | 000 | No subclass |
| 0Z | Diagnostics exception | 001 | Maximum number of stacked diagnostics areas exceeded |
| 21 | Cardinality violation | 000 | No subclass |
| 22 | Data exception | 000 | No subclass |
| 22 | Data exception | 001 | String data, right truncation |
| 22 | Data exception | 002 | Null value, no indicator parameter |
| 22 | Data exception | 003 | Numeric value out of range |
| 22 | Data exception | 004 | Null value not allowed |
| 22 | Data exception | 005 | Error in assignment |
| 22 | Data exception | 006 | Invalid interval format |
| 22 | Data exception | 007 | Invalid datetime format |
| 22 | Data exception | 008 | Datetime field overflow |
| 22 | Data exception | 009 | Invalid time zone displacement value |
| 22 | Data exception | 00B | Escape character conflict |
| 22 | Data exception | 00C | Invalid use of escape character |
| 22 | Data exception | 00D | Invalid escape octet |
| 22 | Data exception | 00E | Null value in array target |
| 22 | Data exception | 00F | Zero-length character string |
| 22 | Data exception | 00G | Most specific type mismatch |
| 22 | Data exception | 00H | Sequence generator limit exceeded |
| 22 | Data exception | 00P | Interval value out of range |
| 22 | Data exception | 00Q | Multiset value overflow |
| 22 | Data exception | 010 | Invalid indicator parameter value |
| 22 | Data exception | 011 | Substring error |
| 22 | Data exception | 012 | Division by zero |
| 22 | Data exception | 013 | Invalid preceding or following size in window function |
| 22 | Data exception | 014 | Invalid argument for NTILE function |
| 22 | Data exception | 015 | Interval field overflow |
| 22 | Data exception | 016 | Invalid argument for NTH_VALUE function |
| 22 | Data exception | 018 | Invalid character value for cast |
| 22 | Data exception | 019 | Invalid escape character |
| 22 | Data exception | 01B | Invalid regular expression |
| 22 | Data exception | 01C | Null row not permitted in table |
| 22 | Data exception | 01E | Invalid argument for natural logarithm |
| 22 | Data exception | 01F | Invalid argument for power function |
| 22 | Data exception | 01G | Invalid argument for width bucket function |
| 22 | Data exception | 01H | Invalid row version |
| 22 | Data exception | 01S | Invalid XQuery regular expression |
| 22 | Data exception | 01T | Invalid XQuery option flag |
| 22 | Data exception | 01U | Attempt to replace a zero-length string |
| 22 | Data exception | 01V | Invalid XQuery replacement string |
| 22 | Data exception | 01W | Invalid row count in fetch first clause |
| 22 | Data exception | 01X | Invalid row count in result offset clause |
| 22 | Data exception | 020 | Invalid period value |
| 22 | Data exception | 021 | Character not in repertoire |
| 22 | Data exception | 022 | Indicator overflow |
| 22 | Data exception | 023 | Invalid parameter value |
| 22 | Data exception | 024 | Unterminated C string |
| 22 | Data exception | 025 | Invalid escape sequence |
| 22 | Data exception | 026 | String data, length mismatch |
| 22 | Data exception | 027 | Trim error |
| 22 | Data exception | 029 | Noncharacter in UCS string |
| 22 | Data exception | 02D | Null value substituted for mutator subject parameter |
| 22 | Data exception | 02E | Array element error |
| 22 | Data exception | 02F | Array data, right truncation |
| 22 | Data exception | 02G | Invalid repeat argument in sample clause |
| 22 | Data exception | 02H | Invalid sample size |
| 23 | Integrity constraint violation | 000 | No subclass |
| 23 | Integrity constraint violation | 001 | Restrict violation |
| 24 | Invalid cursor state | 000 | No subclass |
| 25 | Invalid transaction state | 000 | No subclass |
| 25 | Invalid transaction state | 001 | Active SQL-transaction |
| 25 | Invalid transaction state | 002 | Branch transaction already active |
| 25 | Invalid transaction state | 003 | Inappropriate access mode for branch transaction |
| 25 | Invalid transaction state | 004 | Inappropriate isolation level for branch transaction |
| 25 | Invalid transaction state | 005 | No active SQL-transaction for branch transaction |
| 25 | Invalid transaction state | 006 | Read-only SQL-transaction |
| 25 | Invalid transaction state | 007 | Schema and data statement mixing not supported |
| 25 | Invalid transaction state | 008 | Held cursor requires same isolation level |
| 26 | Invalid SQL statement name | 000 | No subclass |
| 27 | Triggered data change violation | 000 | No subclass |
| 27 | Triggered data change violation | 001 | Modify table modified by data change delta table |
| 28 | Invalid authorization specification | 000 | No subclass |
| 2B | Dependent privilege descriptors still exist | 000 | No subclass |
| 2C | Invalid character set name | 000 | No subclass |
| 2C | Invalid character set name | 001 | Cannot drop SQL-session default character set |
| 2D | Invalid transaction termination | 000 | No subclass |
| 2E | Invalid connection name | 000 | No subclass |
| 2F | SQL routine exception | 000 | No subclass |
| 2F | SQL routine exception | 002 | Modifying SQL-data not permitted |
| 2F | SQL routine exception | 003 | Prohibited SQL-statement attempted |
| 2F | SQL routine exception | 004 | Reading SQL-data not permitted |
| 2F | SQL routine exception | 005 | Function executed no return statement |
| 2H | Invalid collation name | 000 | No subclass |
| 30 | Invalid SQL statement identifier | 000 | No subclass |
| 33 | Invalid SQL descriptor name | 000 | No subclass |
| 34 | Invalid cursor name | 000 | No subclass |
| 35 | Invalid condition number | 000 | No subclass |
| 36 | Cursor sensitivity exception | 000 | No subclass |
| 36 | Cursor sensitivity exception | 001 | Request rejected |
| 36 | Cursor sensitivity exception | 002 | Request failed |
| 38 | External routine exception | 000 | No subclass |
| 38 | External routine exception | 001 | Containing SQL not permitted |
| 38 | External routine exception | 002 | Modifying SQL-data not permitted |
| 38 | External routine exception | 003 | Prohibited SQL-statement attempted |
| 38 | External routine exception | 004 | Reading SQL-data not permitted |
| 39 | External routine invocation exception | 000 | No subclass |
| 39 | External routine invocation exception | 004 | Null value not allowed |
| 3B | Savepoint exception | 000 | No subclass |
| 3B | Savepoint exception | 001 | Invalid specification |
| 3B | Savepoint exception | 002 | Too many |
| 3C | Ambiguous cursor name | 000 | No subclass |
| 3D | Invalid catalog name | 000 | No subclass |
| 3F | Invalid schema name | 000 | No subclass |
| 40 | Transaction rollback | 000 | No subclass |
| 40 | Transaction rollback | 001 | Serialization failure |
| 40 | Transaction rollback | 002 | Integrity constraint violation |
| 40 | Transaction rollback | 003 | Statement completion unknown |
| 40 | Transaction rollback | 004 | Triggered action exception |
| 42 | Syntax error or access rule violation | 000 | No subclass |
| 44 | With check option violation | 000 | No subclass |
| HZ | Remote database access | 000 | No subclass |
+----+-----------------------------------------------------------+-----+--------------------------------------------------------------+
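If you only need the class and subclass at runtime, you can also split the five-character SQLState yourself; the helper below is only a sketch (not jOOQ's actual API):

public static String describeSqlState(SQLException e) {
    String state = e.getSQLState();             // may be null for some drivers
    if (state == null || state.length() < 5) {
        return "unknown SQLState";
    }
    // First two characters = class, last three = subclass (see the table above)
    return "class " + state.substring(0, 2) + ", subclass " + state.substring(2);
}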
You can also use the getErrorCode() method to handle exceptions; this is especially useful when you work with stored procedures or functions and have defined your own custom error codes.
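For instance (a sketch only; the callableStatement variable and the error code 20001 are made-up placeholders for whatever code your procedure raises):

try {
    callableStatement.execute();
} catch (SQLException e) {
    // getErrorCode() returns the vendor-specific (or user-defined) error number
    if (e.getErrorCode() == 20001) {
        // Handle the custom error raised by the stored procedure
    } else {
        throw e;
    }
}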
This may help someone with a similar use case. In the catch clause you can check for a more specific exception type:
try {
    // Your code here
} catch (SQLException ex) {
    if (ex instanceof SQLIntegrityConstraintViolationException) {
        // Handle the constraint violation here
    }
}
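Note that since JDBC 4.0 (Java 6) you can also catch java.sql.SQLIntegrityConstraintViolationException directly in its own catch block, before the generic SQLException, instead of using instanceof.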
Related
I am trying to write a Cucumber feature file using data tables. The object that I need to build from the DataTable has a field which itself consists of two fields (a key and a value). Example:
| Name | Owner | Properties.Key | Properties.value |
| Name1 | myself | someKey1 | someValue1 |
| Name2 | robins | someKey2 | someValue2 |
I was wondering if, instead of writing it this way, there's a better way to express the nested objects using DataTables, something more like SpecFlow. Example:
| Name | Owner | Properties |
| name1 | myself | {nested} |
| name2 | robins | {nested} |
| key | value |
| someKey1 | someValue1 |
| someKey2 | someValue2 |
Or is there any other way to write a nested DataTable?
Also, what would the step definitions for such a table look like in Java?
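For context, this is roughly how I read the flat (dot-notation) form in a step definition today (MyObject is just a placeholder for my domain class, and the step text is only an example):

import java.util.List;
import java.util.Map;

import io.cucumber.datatable.DataTable;
import io.cucumber.java.en.Given;

public class NestedObjectSteps {

    @Given("the following objects exist")
    public void theFollowingObjectsExist(DataTable dataTable) {
        // Each row becomes a Map keyed by the header cells
        List<Map<String, String>> rows = dataTable.asMaps(String.class, String.class);
        for (Map<String, String> row : rows) {
            String name = row.get("Name");
            String owner = row.get("Owner");
            String key = row.get("Properties.Key");
            String value = row.get("Properties.value");
            // e.g. build the domain object: new MyObject(name, owner, Map.of(key, value))
        }
    }
}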
I am trying to export a table from HDFS to MySQL using Sqoop, but I am getting Java exceptions.
The command I'm using is as follows:
sqoop export --connect jdbc:mysql://172.31.54.174/Database --driver com.mysql.jdbc.Driver --username user --password userpassword --table accounts --export-dir /user/pri/accounts
While executing, this command gives me the error below:
17/03/29 07:54:26 INFO mapreduce.Job: map 0% reduce 0%
17/03/29 07:54:30 INFO mapreduce.Job: Task Id : attempt_1489328678238_4886_m_000002_0, Status : FAILED
Error: java.io.IOException: Can't export data, please check failed map task logs
at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:112)
at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:39)
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:146)
at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
Caused by: java.lang.RuntimeException: Can't parse input data: '\N'
at accounts.__loadFromFields(accounts.java:691)
at accounts.parse(accounts.java:584)
at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:83)
... 10 more
Caused by: java.lang.IllegalArgumentException: Timestamp format must be yyyy-mm-dd hh:mm:ss[.fffffffff]
at java.sql.Timestamp.valueOf(Timestamp.java:204)
at accounts.__loadFromFields(accounts.java:643)
... 12 more
The file that I am exporting contains data as below:
1,2008-10-23 16:05:05.0,\N,Donald,Becton,2275 Washburn Street,Oakland,CA,94660,5100032418,2014-03-18 13:29:47.0,2014-03-18 13:29:47.0
2,2008-11-12 03:00:01.0,\N,Donna,Jones,3885 Elliott Street,San Francisco,CA,94171,4150835799,2014-03-18 13:29:47.0,2014-03-18 13:29:47.0
I have also created the table accounts and its structure is as follows:
+----------------+-------------+------+-----+---------+-------+
| Field | Type | Null | Key | Default | Extra |
+----------------+-------------+------+-----+---------+-------+
| acct_num | varchar(20) | NO | PRI | | |
| acct_create_dt | datetime | NO | | NULL | |
| acc_close_dt | datetime | YES | | NULL | |
| first_name | varchar(20) | NO | | NULL | |
| last_name | varchar(20) | NO | | NULL | |
| address | varchar(30) | NO | | NULL | |
| city | varchar(20) | NO | | NULL | |
| state | varchar(20) | NO | | NULL | |
| zipcode | varchar(20) | NO | | NULL | |
| phone_number | varchar(20) | YES | | NULL | |
| created | datetime | NO | | NULL | |
| modified | datetime | NO | | NULL | |
+----------------+-------------+------+-----+---------+-------+
I am also attaching a screenshot of the error.
As you can see from your logs, '\N' (the placeholder for NULL in the exported file) cannot be parsed into your varchar and datetime columns, and I don't understand why it is being added. The timestamp format issue is also indicated. Also check whether any column you are using as a primary key contains duplicate values in your existing data.
Add --input-null-string '\\N' --input-null-non-string '\\N' to your sqoop export command so that Sqoop interprets \N as SQL NULL instead of trying to parse it as data.
So I have the following table that I must map to Java objects:
+---------+-----------+---------------------+---------------------+--------+
| task_id | attribute | lastModified | activity | row_id |
+---------+-----------+---------------------+---------------------+--------+
| 1 | 1 | 2016-08-23 21:05:09 | first activity | 1 |
| 1 | 3 | 2016-08-23 21:08:28 | connect to db | 2 |
| 1 | 3 | 2016-08-23 21:08:56 | create web services | 3 |
| 1 | 4 | 2016-08-23 21:08:56 | data dump | 4 |
| 1 | 5 | 2016-08-23 21:08:56 | test cases | 5 |
| 1 | 6 | 2016-08-23 21:08:57 | dao object | 6 |
| 1 | 7 | 2016-08-23 21:08:57 | buy streetfood | 7 |
| 2 | 6 | 2016-08-23 21:08:57 | drink coke | 8 |
| 2 | 6 | 2016-08-23 21:09:00 | drink tea | 9 |
| 2 | 1 | 2016-08-23 21:12:30 | make tea | 10 |
| 2 | 2 | 2016-08-23 21:13:32 | charge phone | 11 |
| 2 | 3 | 2016-08-23 21:13:32 | shower | 12 |
| 2 | 4 | 2016-08-23 21:13:32 | sleep | 13 |
+---------+-----------+---------------------+---------------------+--------+
Here, each Task object (identified by the task_id column) has multiple Attribute objects. These Attribute objects have the lastModified and activity fields. So far my approach has been to map each row of the table to a Row object via MyBatis and then do some Java-side processing to sort everything out. Is there a way to map this table directly via MyBatis annotations and/or XML so that the two Task objects are created, each with a list of populated Attribute objects inside?
Here is the MyBatis documentation: http://www.mybatis.org/mybatis-3/sqlmap-xml.html. You may be able to use a MyBatis collection mapping to solve your problem.
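For example, an annotation-based mapper along these lines might do it (only a sketch: the table name task_activity is an assumption, and Task/Attribute are the domain classes from the question, with Task holding a List<Attribute>):

import java.util.List;

import org.apache.ibatis.annotations.Many;
import org.apache.ibatis.annotations.Result;
import org.apache.ibatis.annotations.Results;
import org.apache.ibatis.annotations.Select;

public interface TaskMapper {

    // One Task per distinct task_id, with its attributes fetched as a nested collection
    @Select("SELECT DISTINCT task_id FROM task_activity")
    @Results({
        @Result(property = "taskId", column = "task_id", id = true),
        @Result(property = "attributes", column = "task_id",
                many = @Many(select = "selectAttributesForTask"))
    })
    List<Task> selectTasks();

    @Select("SELECT attribute, lastModified, activity, row_id "
          + "FROM task_activity WHERE task_id = #{taskId}")
    List<Attribute> selectAttributesForTask(long taskId);
}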
I have the tables accounts and action. accounts needs to be modified according to the instructions stored in action.
In action, each row contains an account id, an action (i=insert, u=update, d=delete, x=invalid operation) and an amount by which to update the account.
On an insert, if the account already exists, an update should be done instead.
On an update, if the account does not exist, it is created by an insert.
On a delete, if the row does not exist, no action is taken.
Input
accounts:
+---id----value--+
| 1 | 1000 |
| 2 | 2000 |
| 3 | 1500 |
| 4 | 6500 |
| 5 | 500 |
+----------------+
action:
+---account_id---o---new_value---status---+
| 3 | u | 599 | |
| 6 | i | 2099 | |
| 5 | d | | |
| 7 | u | 1599 | |
| 1 | i | 399 | |
| 9 | d | | |
| 10 | x | | |
+-----------------------------------------+
Output
accounts:
+---id----value--+
| 1 | 399 |
| 2 | 800 |
| 3 | 599 |
| 4 | 1400 |
| 6 | 20099 |
| 7 | 1599 |
+----------------+
action:
+---account_id---o---new_value-------------------status----------------+
| 3 | u | 599 | Update: Success |
| 6 | i | 20099 | Update: Success |
| 5 | d | | Delete: Success |
| 7 | u | 1599 | Update: ID not found. Value inserted |
| 1 | i | 399 | Insert: Acc exists. Updated instead |
| 9 | d | | Delete: ID not found |
| 10 | x | | Invalid operation: No action taken |
+----------------------------------------------------------------------+
I am experienced with Java and JDBC, but unfortunately I just don't know how to start here.
Do I need an additional table? Do I have to use triggers?
I've seen two techniques for an upsert. With the first technique, within a transaction, you first test whether the row exists and use the result to decide whether to perform an insert or an update. With the second technique, you try the update and check the number of rows affected (JDBC gives you this): if it's zero, you do an insert; if it's one, you're done.
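A minimal JDBC sketch of that second technique, assuming an accounts(id, value) table and an open Connection (run it inside a transaction if concurrent writers are possible):

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class AccountUpsert {

    public static void upsert(Connection conn, long id, long value) throws SQLException {
        try (PreparedStatement update = conn.prepareStatement(
                "UPDATE accounts SET value = ? WHERE id = ?")) {
            update.setLong(1, value);
            update.setLong(2, id);
            // executeUpdate() reports how many rows were affected
            if (update.executeUpdate() == 0) {
                try (PreparedStatement insert = conn.prepareStatement(
                        "INSERT INTO accounts (id, value) VALUES (?, ?)")) {
                    insert.setLong(1, id);
                    insert.setLong(2, value);
                    insert.executeUpdate();
                }
            }
        }
    }
}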
I want to store millions of time series entries (long time, double value) with Java. (Our monitoring system currently stores every entry in a large MySQL table, but performance is very bad.)
Are there any time series databases implemented in Java?
Check out http://opentsdb.net/, as used by StumbleUpon.
Check out http://square.github.com/cube/, as used by Square.
I hope to see additional suggestions in this thread.
The performance was bad because of a poor database design. I am using MySQL, and the table had this layout:
+-------------+--------------------------------------+------+-----+-------------------+-----------------------------+
| Field | Type | Null | Key | Default | Extra |
+-------------+--------------------------------------+------+-----+-------------------+-----------------------------+
| fk_category | smallint(6) | NO | PRI | NULL | |
| method | enum('min','max','avg','sum','none') | NO | PRI | none | |
| time | timestamp | NO | PRI | CURRENT_TIMESTAMP | on update CURRENT_TIMESTAMP |
| value | float | NO | | NULL | |
| accuracy | tinyint(1) | NO | | 0 | |
+-------------+--------------------------------------+------+-----+-------------------+-----------------------------+
My fault was an inappropriate index. After adding a multi-column primary key, all my queries are lightning fast:
+-------+------------+----------+--------------+-------------+-----------+-------------+----------+--------+------+------------+---------+---------------+
| Table | Non_unique | Key_name | Seq_in_index | Column_name | Collation | Cardinality | Sub_part | Packed | Null | Index_type | Comment | Index_comment |
+-------+------------+----------+--------------+-------------+-----------+-------------+----------+--------+------+------------+---------+---------------+
| job | 0 | PRIMARY | 1 | fk_category | A | 18 | NULL | NULL | | BTREE | | |
| job | 0 | PRIMARY | 2 | method | A | 18 | NULL | NULL | | BTREE | | |
| job | 0 | PRIMARY | 3 | time | A | 452509710 | NULL | NULL | | BTREE | | |
+-------+------------+----------+--------------+-------------+-----------+-------------+----------+--------+------+------------+---------+---------------+
Thanks for all your answers!
You can take a look at KDB. It's primarily used by financial companies to fetch market time series data.
What do you need to do with the data and when?
If you are just saving the values for later, a plain text file might do nicely; you can load it into a database later.
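For instance, an append-only log of (time, value) pairs can be as simple as this sketch (the file name is arbitrary):

import java.io.FileWriter;
import java.io.IOException;

public class TimeSeriesLog {

    // Appends one "timestampMillis,value" line per entry
    public static void append(long timeMillis, double value) throws IOException {
        try (FileWriter out = new FileWriter("timeseries.csv", true)) {
            out.write(timeMillis + "," + value + System.lineSeparator());
        }
    }
}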