database.com JPQL query exception when searching for NULL date fields - java

I'm trying to convert an old database system to Salesforce and have decided to try out the Database.com Java SDK.
I have recently hit a problem that I can't seem to work around, related to a JPQL query that searches for NULL or empty dates.
E.g.
select t from Table t where t.expiryDate is NULL or t.expiryDate = :today
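For reference, this is roughly how the query is issued through the SDK's JPA EntityManager (a sketch only; the entity class name "Table" is assumed to match the JPQL above):

import java.util.Date;
import java.util.List;
import javax.persistence.EntityManager;
import javax.persistence.TemporalType;
import javax.persistence.TypedQuery;

// Sketch only: "Table" stands in for the JPA entity mapped to Table__c.
static List<Table> findExpiring(EntityManager em) {
    TypedQuery<Table> query = em.createQuery(
        "select t from Table t where t.expiryDate is NULL or t.expiryDate = :today",
        Table.class);
    query.setParameter("today", new Date(), TemporalType.DATE);
    return query.getResultList();
}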
This causes the following exception:
Caused by: [InvalidFieldFault [ApiQueryFault [ApiFault exceptionCode='INVALID_FIELD' exceptionMessage='
from Table__c p where (( p.Expiry_Date__c = 'NULL' ) OR (
^
ERROR at Row:1:Column:158
value of filter criterion for field 'Expiry_Date__c' must be of type date and should not be enclosed in quotes'
]
row='1'
column='158'
]
]
I'm assuming this is a bug in the beta release of the SDK, as I don't believe it should be converting the NULL into a string, but please let me know otherwise. Does anyone know a workaround?

Related

ERROR: column 'nameofcolumn' does not exist, unable to update field

I'm trying to update a field so I can test that Liquibase is working at my job. I'm using this syntax:
UPDATE "Country" SET "name" = 'Perúpe' WHERE "id" = 10;
But it will throw an error that says:
Caused by: org.postgresql.util.PSQLException: ERROR: column "Perúpe" does not exist
'Perúpe' is not a column; it is a value that I'm trying to insert so I know Liquibase is working.
It is working with a DB that was created outside Liquibase, and it has only 3 Liquibase entries. Those are OK. When I try to enter a new one, it crashes and stops the app until I erase that test.
I think my syntax could be wrong. The error has moved from the table ("relation does not exist") to the value of my entry. What could I be doing wrong?
This has been asked and answered previously: Error: Column does not exist in postgresql for update.
As Adrian wrote in their comment above, the reported solution is to use single quotes around the value instead of double quotes.
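For completeness, here is a sketch of the same update issued through JDBC with a bind parameter (the Connection is assumed to come from wherever the app already obtains one). In PostgreSQL, double quotes are for identifiers and single quotes are for string literals; a bind parameter sidesteps the quoting question entirely:

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

static void renameCountry(Connection conn) throws SQLException {
    // Identifiers stay in double quotes; the value is bound, never parsed as a column.
    String sql = "UPDATE \"Country\" SET \"name\" = ? WHERE \"id\" = ?";
    try (PreparedStatement ps = conn.prepareStatement(sql)) {
        ps.setString(1, "Perúpe");
        ps.setInt(2, 10);
        ps.executeUpdate();
    }
}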

IncorrectResultSizeDataAccessException when no entity is found. First returned when multiple are found

I'm trying to find the reason for some strange behavior. Let's say I have a Spring Data repository that contains a few methods. One of them has a signature like this:
public Entity findByEntityKeyAndValidToDateAfterAndCorrectionDateAfter(BigInteger entityKey, Timestamp validToDate, Timestamp correctionDate);
I receive an issue whose stack trace contains the following error:
org.springframework.dao.IncorrectResultSizeDataAccessException
I decided to add two records that should both be returned when I execute this query.
After calling this method, instead of throwing the exception, the query returns only one of the results. This was strange to me, so I copied the query from the log and executed it in SQL Developer with the same parameters: the query returned both results. I then tried changing the method signature to return a list of entities:
public List<Entity> findByEntityKeyAndValidToDateAfterAndCorrectionDateAfter(BigInteger entityKey, Timestamp validToDate, Timestamp correctionDate);
In this case, the query returns two results.
Any idea why this happens, and why the single-result query does not throw the exception?
The Spring Data version is 1.10.1.RELEASE.
When using a find method returning a singular type, the following semantics should apply:
0 elements found => return null or Optional.empty if applicable.
1 element found => return that element.
more than 1 element found => an exception should be thrown.
Since this is not what you seem to be seeing, please upgrade to at least the current minor version (1.10.11), but preferably to the current GA release (1.11.7). If the problem still persists, please create an issue at https://jira.spring.io/browse/DATAJPA/?selectedTab=com.atlassian.jira.jira-projects-plugin:summary-panel
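As an illustration of those semantics, a minimal repository sketch based on the signatures in the question (the findAllBy... name and the BigInteger ID type are my assumptions, not taken from the question):

import java.math.BigInteger;
import java.sql.Timestamp;
import java.util.List;
import org.springframework.data.repository.Repository;

public interface EntityRepository extends Repository<Entity, BigInteger> {

    // 0 rows -> null, 1 row -> that entity, more than 1 row -> IncorrectResultSizeDataAccessException
    Entity findByEntityKeyAndValidToDateAfterAndCorrectionDateAfter(
            BigInteger entityKey, Timestamp validToDate, Timestamp correctionDate);

    // Same derived query, but any number of rows is acceptable
    List<Entity> findAllByEntityKeyAndValidToDateAfterAndCorrectionDateAfter(
            BigInteger entityKey, Timestamp validToDate, Timestamp correctionDate);
}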

Error inserting null or empty value in DATE column mapped to LocalDate type in java

I am using MS SQL as my DB and I have a DATE column called 'START_DATE' in one of my tables. This is a non-mandatory column.
In my Java layer I have mapped this to LocalDate. When I don't have any value for START_DATE, I set it to null or leave it empty.
In both cases I get the following error:
Caused by: com.microsoft.sqlserver.jdbc.SQLServerException: Implicit conversion from data type varbinary to date is not allowed. Use the CONVERT function to run this query.
How do I fix this? Please advise.
Are you using the 2012 version or newer? The problem is in the way SQL Server 2012 interprets null values for a datetime type. The insert query from Java should use an explicit convert function to force the value in as a date and avoid this unexpected error.
Something like:
convert(DATETIME,START_DATE,21)
More info
I had the same issue with SQL Server. I fixed it by changing the insert value from just myValue to
CONVERT(DATE, myValue)
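Pulling the two suggestions together, here is a sketch of what the insert could look like over plain JDBC (the table name MY_TABLE is a placeholder; only the START_DATE column comes from the question):

import java.sql.Connection;
import java.sql.Date;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.sql.Types;
import java.time.LocalDate;

static void insertStartDate(Connection conn, LocalDate startDate) throws SQLException {
    // CONVERT(DATE, ?) tells SQL Server the parameter is a date even when it is null.
    String sql = "INSERT INTO MY_TABLE (START_DATE) VALUES (CONVERT(DATE, ?))";
    try (PreparedStatement ps = conn.prepareStatement(sql)) {
        if (startDate == null) {
            ps.setNull(1, Types.DATE); // send a typed NULL instead of an untyped one
        } else {
            ps.setDate(1, Date.valueOf(startDate));
        }
        ps.executeUpdate();
    }
}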

DBUnit: dataset with MySQL YEAR type column

Hi folks! I'm having a problem using DBUnit:
In my test class, when I call DatabaseOperation.INSERT.execute(connection, dataSet) using a FlatXmlDataSet that references a table containing a column of MySQL type YEAR(4), I get the following:
(...)
Caused by: org.dbunit.dataset.datatype.TypeCastException: Error casting value for table 'Vehicle' and column 'LaunchYear'
at org.dbunit.operation.AbstractBatchOperation.execute(AbstractBatchOperation.java:210)
at net.carroramafleet.ws.utils.DbUnitHelper.execute(DbUnitHelper.java:57)
... 33 more
Caused by: org.dbunit.dataset.datatype.TypeCastException: Unable to typecast value <2010> of type <java.lang.String> to DATE
at org.dbunit.dataset.datatype.DateDataType.typeCast(DateDataType.java:110)
at org.dbunit.dataset.datatype.DateDataType.setSqlValue(DateDataType.java:141)
at org.dbunit.database.statement.SimplePreparedStatement.addValue(SimplePreparedStatement.java:73)
at org.dbunit.database.statement.AutomaticPreparedBatchStatement.addValue(AutomaticPreparedBatchStatement.java:63)
at org.dbunit.operation.AbstractBatchOperation.execute(AbstractBatchOperation.java:200)
... 34 more
Here's my dataset:
<dataset>
<Vehicle
ID="999"
LaunchYear="2010" />
</dataset>
As I mentioned above, I have a YEAR(4) column, LaunchYear, in the table Vehicle, and DBUnit can't insert this row because the value can't be converted correctly.
I've already tried replacing the value using DBUnit's ReplacementDataSet, but I still get the TypeCastException. I can't seem to supply the value in any YEAR format that works.
Could somebody help me?
Thanks,
Jeff
This question is a bit old, but I thought I would reply as I just ran into this same issue today.
I believe this is a bug in DbUnit. BTW, I'm using 2.4.9 but did check the release notes for later releases to see if this is mentioned as a bug fix.
The YEAR column is being converted into a java.sql.Date object. The initial bug is that there is no conversion from a simple string "2016" to a java.sql.Date, which leads to the TypeCastException. Changing this field to something like "2016-08-10" gets you past this initial error, but leads to an SQLException when MySQL attempts to truncate the Date into an integer or short.
The only way I have been able to work around this is to add specific code in the setUp()/@Before methods to populate the table with the initial data, roughly as sketched below.
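For example, something along these lines, assuming the test already holds the same JDBC connection that DBUnit wraps (table and column names are taken from the dataset above; the jdbcConnection field is an assumption):

import java.sql.Connection;
import java.sql.PreparedStatement;
import org.junit.Before;

private Connection jdbcConnection; // assumed to be initialised elsewhere in the test setup

@Before
public void insertVehicleFixture() throws Exception {
    // Bypass the FlatXmlDataSet for the YEAR(4) column and insert the row directly.
    try (PreparedStatement ps = jdbcConnection.prepareStatement(
            "INSERT INTO Vehicle (ID, LaunchYear) VALUES (?, ?)")) {
        ps.setInt(1, 999);
        ps.setInt(2, 2010); // MySQL accepts a plain integer for YEAR(4)
        ps.executeUpdate();
    }
}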

Hibernate expected NUMBER got BINARY on null field

I can't understand why Hibernate is trying to set this field as a VARBINARY when it is null.
The Java data type is BigDecimal
The Oracle data type is Float
It is being set like this:
entityManager.createNamedQuery("blah").setParameter("target_field", object.sourceValue)
Again, sourceValue is a BigDecimal, and when I debug it the value is null.
When it tries to execute this update, I get the Oracle error:
ORA-00932: inconsistent datatypes: expected NUMBER got BINARY
This is what shows up for this property in the console log:
o.h.type.descriptor.sql.BasicBinder : binding parameter [8] as [VARBINARY] - [null]
If I do this silly hack in Java before I execute the update:
if (object.sourceValue == null) object.sourceValue = new java.math.BigDecimal(0);
Then it runs fine, so the cause of this error is definitely Hibernate doing something wrong when the field is null, rather than anything else.
How do I fix this so I can set the field to null without Hibernate mishandling it?
In the DB, the field is nullable.
It looks like the issue has not been resolved yet, even with the newer Hibernate version 5.2.17.Final with JPA.
The null parameter is being sent as VARBINARY, which causes the error:
ERROR [org.hibernate.engine.jdbc.spi.SqlExceptionHelper] - ERROR: cannot cast type bytea to bigint
One workaround is to use the org.hibernate.Query that is wrapped inside the org.hibernate.ejb.QueryImpl, like this:
QueryImpl q = (QueryImpl) this.entityManager.createNamedQuery("myNamed.query");
q.getHibernateQuery().setParameter("name", value, org.hibernate.Hibernate.BIG_DECIMAL);
q.getResultList();
When value is null, Hibernate still knows which type the parameter must be, so it doesn't translate the null into a binary value.
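On Hibernate 5.x the equivalent (my assumption, not something the answer above states) would be to unwrap the JPA query and bind the parameter with an explicit type, for example:

import java.math.BigDecimal;
import java.util.List;
import javax.persistence.EntityManager;
import org.hibernate.query.Query;
import org.hibernate.type.StandardBasicTypes;

static List<?> runWithTypedNull(EntityManager entityManager, BigDecimal value) {
    Query<?> q = entityManager.createNamedQuery("myNamed.query").unwrap(Query.class);
    // A typed bind: even when value is null, Hibernate knows it must be a NUMBER.
    q.setParameter("name", value, StandardBasicTypes.BIG_DECIMAL);
    return q.getResultList();
}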
I have solved this problem by adding 2 annotations:
@DynamicUpdate(value=true)
@DynamicInsert(value=true)
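For context, these annotations sit on the entity class itself, roughly like this (the class name is a placeholder):

import javax.persistence.Entity;
import org.hibernate.annotations.DynamicInsert;
import org.hibernate.annotations.DynamicUpdate;

@Entity
@DynamicInsert(value = true)
@DynamicUpdate(value = true)
public class MyEntity {
    // fields omitted; with these annotations Hibernate only includes the columns
    // that actually have values in the generated INSERT/UPDATE statements
}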
Also, and most important, check the mappings on your entity class for getters annotated with things like @OneToMany and @ManyToOne.
All the best.
