I am using the executeScalar() method from the sql2o library. When I call .executeScalar(Employee.class) with my Employee POJO, I get the following exception:
org.sql2o.converters.ConverterException: No converter registered for class: com.mack.sales.employees.Employee
I cannot find anything to help resolve this issue; any help is appreciated.
Converters are what sql2o uses to convert database values to Java values. For instance, if one of the properties in your POJO is an integer, sql2o uses its integer converter to convert any compatible database data type (int, number, etc.) to an integer.
The executeScalar method fetches only one value (first column, first row) from the database and converts it to a Java value. It is meant to be used with single-value queries, for instance a select count(*) from table.
To fetch multiple columns and map their values to a POJO, you can use the executeAndFetchFirst() method.
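A minimal sketch of the difference, assuming an employees table whose column names match the Employee fields; the JDBC URL and credentials are placeholders:

import org.sql2o.Connection;
import org.sql2o.Sql2o;

public class EmployeeDao {

    private final Sql2o sql2o = new Sql2o("jdbc:postgresql://localhost/test", "user", "secret");

    // executeScalar: a single value (first column of the first row)
    public Long countEmployees() {
        try (Connection con = sql2o.open()) {
            return con.createQuery("select count(*) from employees")
                      .executeScalar(Long.class);
        }
    }

    // executeAndFetchFirst: maps a whole row onto the POJO
    public Employee findEmployee(int id) {
        try (Connection con = sql2o.open()) {
            return con.createQuery("select * from employees where id = :id")
                      .addParameter("id", id)
                      .executeAndFetchFirst(Employee.class);
        }
    }
}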
Using Spring Boot and Spring Data JPA, I have a table which represents an enum value, and a corresponding entity where I want one of its fields to be 'all the possible values the enum can have', i.e. a field possibleValues that is effectively a select-all on the other table. Preferably I don't want a relationship like @ManyToMany since:
It will always be a select all, I don't want to save to the database all the options and update them each time the enum values table changes.
I am going to have several enums, so creating another many-to-many for each enum is less than ideal.
I've tried to find something like @Formula that will let me select all the values from another table, but it doesn't seem to work:
@Transient
@Formula("select e.name from EnumTable e")
private List<String> possibleValues;
results in possibleValues always being null, and if I remove the @Transient I have to define the relationship between the two entities.
For enum values in the database, I recommend storing them as strings (for objects, you can convert to JSON before storing and convert back after retrieving).
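A minimal sketch of what storing an enum as a string looks like with plain JPA, assuming a hypothetical Status enum and a status column:

import javax.persistence.*;

@Entity
public class Task {

    public enum Status { OPEN, IN_PROGRESS, DONE }

    @Id
    @GeneratedValue
    private Long id;

    // Persisted as the constant's name ("OPEN", "DONE", ...), which,
    // unlike EnumType.ORDINAL, survives reordering of the constants.
    @Enumerated(EnumType.STRING)
    @Column(name = "status")
    private Status status;
}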
In my database I have some columns which contain null values, and I have generated classes from the tables, so there are int attributes for the columns containing integer data. Now the problem is that when I query the database I get the error "Null value was assigned to a property of primitive type setter" for the columns which have null values.
I have also read many solutions that say to convert int to Integer, but I can't do that. There are many int columns with the same problem, and if I have to generate the entity classes from the tables through JPA again, JPA will again generate int attributes.
I am asking for a solution at query time that checks whether an attribute value is null (for example person.salary IS NULL) and, if so, returns 0 for person.salary.
My query right now looks like this:
"Select per From Person per where id=1"
It is a Java Persistence Query Language (JPQL) query, not a native SQL query.
Any help will be highly appreciated. I can't change the queries to native SQL either. I have been stuck on this for two days.
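For reference, JPQL in JPA 2.0 does support COALESCE, so a query along these lines can substitute 0 when the column is null without regenerating the entities or switching to native SQL. This is only a sketch; selecting the scalar value rather than the whole Person entity is what keeps the primitive setter from ever being called:

import javax.persistence.EntityManager;

public class PersonQueries {

    // COALESCE replaces a null salary with 0 at query time; selecting the
    // scalar value (not the whole entity) means the primitive int setter
    // on Person is never invoked.
    public int salaryOrZero(EntityManager em, long id) {
        Number salary = em.createQuery(
                "select coalesce(per.salary, 0) from Person per where per.id = :id",
                Number.class)
            .setParameter("id", id)
            .getSingleResult();
        return salary.intValue();
    }
}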
I'm using JPA 2.0. Is it possible to map an Entity's attribute (column) to the result of a function and then control the way it's persisted too?
Note: This is related to PostGIS. Since there is no native support for Geometry types in JPA/EclipseLink, I might do something like this:
@Entity
public class Foo {
    @Column(name="geom", appendBeforeChange( ST_FromKml ))
    public String kmlString;
}
In other words, I could store the geometry as a String in Java, but when EclipseLink writes it to the database, it should first call the database function ST_FromKml and provide the String, which will convert it to the Geometry type in the database...
I know it's a stretch, but figured I would ask...
I actually found a workaround with help from this post:
Are JPA (EclipseLink) custom types possible?
In Postgres, I create an implicit cast from String to Geometry:
CREATE OR REPLACE FUNCTION geom_in_text(varchar) RETURNS geometry AS $$
SELECT ST_GeomFromText($1::varchar);
$$ LANGUAGE SQL IMMUTABLE;
CREATE CAST (varchar AS geometry) WITH FUNCTION geom_in_text(varchar) AS IMPLICIT;
And now in Java, I just keep my Entity member variable as a String in WKT format. When Postgres sees the INSERT, it will recognize the available implicit cast and not complain.
Note - I store WKT in my Entity class, but I could theoretically store KML and then update my function to call ST_GeomFromKml instead...
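A minimal sketch of the entity side under this workaround (table and column names are assumptions): the Java field is plain WKT text, and the implicit varchar-to-geometry cast does the conversion on INSERT.

import javax.persistence.*;

@Entity
@Table(name = "foo")
public class Foo {

    @Id
    @GeneratedValue
    private Long id;

    // Kept as WKT on the Java side, e.g. "POINT(30 10)"; the implicit
    // cast created above turns it into a geometry when Postgres sees the INSERT.
    @Column(name = "geom")
    private String geom;
}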
We are migrating from Oracle to PostgreSQL Enterprise DB, and we have a stored procedure that accepts a table-like structure. We used STRUCT with Oracle and it worked very well. Now this table is configured in PostgreSQL as a UDT containing a table of other UDTs, and it works when invoked from the PostgreSQL layer.
I can't find proper support or documentation on how to pass a composite type as a collection of composite types or an array of composite types using the PostgreSQL JDBC driver.
I was able to invoke the procedure by passing my implementation of PGobject.
My getValue() method converts the object data to a string of the following format:
{row(key,1596156006),row(dataHeader2TDataHeader,2782),row(identifier01,20373833),
...}
But I get the error "malformed array literal" from PostgreSQL.
The PostgreSQL version is 9.3.1, JDK 1.6, JBoss 3.2.8
You basically have a few options in PostgreSQL:
Create your own string representation and write a parser in C, pl/perl, pl/pgsql, or the like.
Use ROW() and ARRAY[] constructors.
Use the raw data formats.
Use JSON (however this has limits!).
If you are going to write your own string representation, you need to pay close attention to how it is parsed on input and output. You could go so far as to make your own types and not use composite types. This has some useful benefits in terms of ensuring that you always know your input and output formats.
ROW() and ARRAY[] constructors can be used together. In this case you might write:
ARRAY[ROW(1, 'foo')::bar, ROW(2, 'baz')::bar, ROW(3, 'foobar,')::bar]
Finally, there are the raw data formats. These use nested CSV (including double quoting), with {csv} representing an array and (csv) representing a row. The above would be:
'{"(1,foo)","(2,baz)","(3,""foobar,"")"}'
My application extracts data from an Excel sheet. I store the value and the type of the data from the sheet in an ArrayList, i.e., if my Excel sheet contains employee data, I will retrieve [Employee name, String], [Employee id, number], and so on. I then have to create a table with these names and their respective data types. So how can I dynamically specify the data types for the attributes in the table? I am using JDBC and MS Access.
Well, you read your data in as a String, and for every value call String.matches(regex) to find out the data type. For example, call value.matches("\\d+"); if it matches, instantiate an Integer, e.g. new Integer(value). Now you can add this Integer object to your List.
I hope you will be able to see how to go further from here: check with instanceof (or something similar) while creating the table in the database.
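A rough sketch of that idea, assuming the cell values arrive as strings and the goal is a simple CREATE TABLE statement (the table name, column names, and the int/varchar mapping are assumptions):

import java.util.ArrayList;
import java.util.List;

public class DynamicTableSketch {

    // Guess a Java type for each raw cell value using a regex.
    static Object parse(String raw) {
        if (raw.matches("\\d+")) {
            return Integer.valueOf(raw);   // whole numbers become Integer
        }
        return raw;                        // everything else stays a String
    }

    // Map the detected Java type to a SQL column type via instanceof.
    static String sqlType(Object value) {
        if (value instanceof Integer) {
            return "INTEGER";
        }
        return "VARCHAR(255)";
    }

    public static void main(String[] args) {
        String[] names = {"employee_name", "employee_id"};
        String[] cells = {"Alice", "42"};

        List<Object> values = new ArrayList<Object>();
        for (String cell : cells) {
            values.add(parse(cell));
        }

        StringBuilder ddl = new StringBuilder("CREATE TABLE employee (");
        for (int i = 0; i < names.length; i++) {
            if (i > 0) ddl.append(", ");
            ddl.append(names[i]).append(' ').append(sqlType(values.get(i)));
        }
        ddl.append(")");

        // Execute ddl.toString() through a JDBC Statement against MS Access.
        System.out.println(ddl);
    }
}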
For each different data type, use a different class. You can either use the default Java classes like String or Integer, or make your own depending on your further requirements. You can store all of these objects in your ArrayList.
When retrieving your data, check which class is used and handle it appropriately.