JDBI: Inserting Dates as longs (milliseconds) - java

In my application I store dates as milliseconds:
public class Model {
    private long date = System.currentTimeMillis();

    public void setDate(long date) {
        this.date = date;
    }

    public long getDate() {
        return date;
    }
}
I have a JDBI Data Access Object which looks like:
public interface ModelDAO {
    @SqlBatch("REPLACE INTO model (date) VALUES (:date)")
    @BatchChunkSize(1000)
    void insertModels(@BindBean List<Model> models);

    @SqlQuery("SELECT * FROM model ORDER BY date DESC")
    List<Model> getModels();
}
However when I try to insert I get:
org.skife.jdbi.v2.exceptions.UnableToExecuteStatementException:
java.sql.BatchUpdateException: Data truncation: Incorrect datetime
value: '1430262000000' for column 'date'
Is there a way I can tell JDBI how to convert this without requiring something like the below for all of my classes that contain dates?
@BindingAnnotation(BindModel.ModelBindingFactor.class)
@Retention(RetentionPolicy.RUNTIME)
@Target({ElementType.PARAMETER})
public @interface BindModel {
    public static class ModelBindingFactor implements BinderFactory {
        public Binder build(Annotation annotation) {
            return new Binder<BindModel, Model>() {
                public void bind(SQLStatement q, BindModel bind, Model model) {
                    q.bind("date", new Timestamp(model.getDate()));
                }
            };
        }
    }
}
I would be willing to switch my models to use a DateTime object if it makes things cleaner.

It will be easier and cleaner if you use DateTime to store date values. You need an ArgumentFactory to convert DateTime to an SQL timestamp. You can use the following one:
public class DateTimeAsSqlDateArgument implements ArgumentFactory<DateTime> {

    @Override
    public boolean accepts(Class<?> expectedType, Object value, StatementContext ctx) {
        return value != null && DateTime.class.isAssignableFrom(value.getClass());
    }

    @Override
    public Argument build(Class<?> expectedType, final DateTime value, StatementContext ctx) {
        return new Argument() {
            @Override
            public void apply(int position, PreparedStatement statement, StatementContext ctx) throws SQLException {
                statement.setTimestamp(position, new Timestamp(value.getMillis()));
            }
        };
    }
}
Register this argument factory with the DBI instance; that's all you need. JDBI will use this factory whenever it sees a DateTime object.
dbi.registerArgumentFactory(new DateTimeAsSqlDateArgument());
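Since you mentioned being willing to switch to DateTime, here is a minimal sketch of what the model could then look like (assuming Joda-Time's org.joda.time.DateTime, which is what the factory above expects). The DAO from the question should be able to stay unchanged, because @BindBean binds the getter's value through the registered argument factory:
import org.joda.time.DateTime;

// Hypothetical DateTime-based variant of the Model class from the question.
public class Model {
    private DateTime date = DateTime.now();

    public DateTime getDate() {
        return date;
    }

    public void setDate(DateTime date) {
        this.date = date;
    }
}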

Related

JDBCIO Calling Postgres Routine (Stored Proc) which takes a Custom Object Type as parameter

I'm trying to call a Postgres routine which takes a custom Object type as a parameter.
create type person_type as
(
first varchar,
second varchar,
is_real boolean
);
My routine (stored proc):
create function person_routine(person person_type)
returns void
language plpgsql
as $$
BEGIN
INSERT INTO person(first, second, is_real) VALUES
(person.first,person.second,person.is_real);
END;
$$;
Then I attempt creating a Java class to represent the custom type:
import java.sql.SQLData;
import java.sql.SQLException;
import java.sql.SQLInput;
import java.sql.SQLOutput;

public class PersonType implements SQLData {
    public String first;
    public String second;
    public boolean is_real;
    private String sql_type;

    @Override
    public String getSQLTypeName() throws SQLException {
        return sql_type;
    }

    @Override
    public void readSQL(SQLInput stream, String typeName) throws SQLException {
        sql_type = typeName;
        second = stream.readString();
        first = stream.readString();
        is_real = stream.readBoolean();
    }

    @Override
    public void writeSQL(SQLOutput stream) throws SQLException {
        stream.writeString(first);
        stream.writeBoolean(is_real);
        stream.writeString(second);
    }
}
Then I attempted to execute the code like this:
.apply(JdbcIO.<Person>write()
        .withDataSourceConfiguration(JdbcIO.DataSourceConfiguration.create(
                "org.postgresql.Driver", configuration.GetValue("postgres_host"))
                .withUsername(configuration.GetValue("postgres_username"))
                .withPassword(configuration.GetValue("postgres_password")))
        .withStatement("SELECT person_routine(?)")
        .withPreparedStatementSetter(new JdbcIO.PreparedStatementSetter<Person>() {
            public void setParameters(Person element, PreparedStatement query)
                    throws SQLException {
                PersonType dto = new PersonType();
                dto.first = element.first;
                dto.second = element.second;
                dto.is_real = element.is_real;
                query.setObject(1, dto);
            }
        })
);
Unfortunately that gives me an exception:
java.lang.RuntimeException: org.postgresql.util.PSQLException: Can't infer the SQL type to use for an instance of dto.PersonType. Use setObject() with an explicit Types value to specify the type to use.
Any help would be great.
The solution is to use PGobject. This is how I achieved it, and it seems to work really well:
PGobject person = new PGobject();
person.setType("person_type");
person.setValue(String.format("(%s,%s,%s)","something","something","FALSE"));
query.setObject(1, person);
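Putting that together with the JdbcIO setter from the question, a rough sketch could look like the following (the field values come straight from element; note that values containing commas, quotes or parentheses would need escaping in the row literal, which this sketch does not handle):
.withPreparedStatementSetter(new JdbcIO.PreparedStatementSetter<Person>() {
    public void setParameters(Person element, PreparedStatement query)
            throws SQLException {
        // Build a PostgreSQL composite-type literal: (first,second,is_real)
        PGobject person = new PGobject();
        person.setType("person_type");
        person.setValue(String.format("(%s,%s,%s)",
                element.first, element.second, element.is_real));
        query.setObject(1, person);
    }
})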

@DefaultValue in JAX-RS for dates: now() and MAX

I have a query parameter as follows:
@GET
public Response myFunction(@QueryParam("start") final LocalDate start,
                           @QueryParam("end") final LocalDate end) { ... }
For this, I created a ParamConverter<LocalDate> which converts a string to a date, and vice versa.
Now I want to use the @DefaultValue annotation to declare a default value. I have two special default values:
today (for start)
the maximum value/infinity (for end)
Is it possible to use the @DefaultValue annotation for this? How?
Yes, @DefaultValue can be used in this situation: when the query parameter is absent, JAX-RS passes the default string ("today" or "max") to your ParamConverter, which can then map those sentinel values to real dates:
@GET
public Response foo(@QueryParam("start") @DefaultValue("today") LocalDate start,
                    @QueryParam("end") @DefaultValue("max") LocalDate end) {
    ...
}
Your ParamConverterProvider and ParamConverter implementations can look like this:
@Provider
public class LocalDateParamConverterProvider implements ParamConverterProvider {

    @Override
    public <T> ParamConverter<T> getConverter(Class<T> rawType, Type genericType,
                                              Annotation[] annotations) {
        if (rawType.getName().equals(LocalDate.class.getName())) {
            return new ParamConverter<T>() {
                @Override
                public T fromString(String value) {
                    return parseString(value, rawType);
                }

                @Override
                public String toString(T value) {
                    return ((LocalDate) value)
                            .format(DateTimeFormatter.ISO_LOCAL_DATE);
                }
            };
        }
        return null;
    }

    private <T> T parseString(String value, Class<T> rawType) {
        if (value == null) {
            return null;
        }
        if ("today".equalsIgnoreCase(value)) {
            return rawType.cast(LocalDate.now());
        }
        if ("max".equalsIgnoreCase(value)) {
            return rawType.cast(LocalDate.of(9999, 12, 31));
        }
        try {
            return rawType.cast(LocalDate.parse(value,
                    DateTimeFormatter.ISO_LOCAL_DATE));
        } catch (Exception e) {
            throw new BadRequestException(e);
        }
    }
}
If, for some reason, you need the parameter name, you can get it from the Annotation array:
Optional<Annotation> optional = Arrays.stream(annotations)
        .filter(annotation -> annotation.annotationType().equals(QueryParam.class))
        .findFirst();
if (optional.isPresent()) {
    QueryParam queryParam = (QueryParam) optional.get();
    String parameterName = queryParam.value();
}

jOOQ converter implicitly set a value when writing a record

In a table of my PostgreSQL database, I have two columns with default values.
I defined a converter for one of them in the jOOQ generator configuration.
When I call record.store(), the default value is correctly used for the field without a converter, but not for the other one, which becomes null.
I never explicitly set those fields, but I guess that record.changed(MY_OBJECT.FIELD) == true after the converter runs.
I'm not sure this is the expected behavior. Is it a bug? Is there a workaround?
Edit: Here is the code used.
TimestampBinding.java
public class TimestampBinding implements Binding<Timestamp, Instant> {
    private static final Converter<Timestamp, Instant> converter = new TimestampConverter();
    private final DefaultBinding<Timestamp, Instant> delegate = new DefaultBinding<>(converter());

    @Override
    public Converter<Timestamp, Instant> converter() { return converter; }

    @Override
    public void sql(BindingSQLContext<Instant> ctx) throws SQLException {
        delegate.sql(ctx);
    }

    @Override
    public void register(BindingRegisterContext<Instant> ctx) throws SQLException {
        delegate.register(ctx);
    }

    @Override
    public void set(BindingSetStatementContext<Instant> ctx) throws SQLException {
        delegate.set(ctx);
    }

    @Override
    public void set(BindingSetSQLOutputContext<Instant> ctx) throws SQLException {
        delegate.set(ctx);
    }

    @Override
    public void get(BindingGetResultSetContext<Instant> ctx) throws SQLException {
        delegate.get(ctx);
    }

    @Override
    public void get(BindingGetStatementContext<Instant> ctx) throws SQLException {
        delegate.get(ctx);
    }

    @Override
    public void get(BindingGetSQLInputContext<Instant> ctx) throws SQLException {
        delegate.get(ctx);
    }
}
TimestampConverter.java
public class TimestampConverter implements Converter<Timestamp, Instant> {
    @Override
    public Instant from(Timestamp ts) {
        return ts == null ? null : ts.toInstant();
    }

    @Override
    public Timestamp to(Instant instant) {
        return instant == null ? null : Timestamp.from(instant);
    }

    @Override
    public Class<Timestamp> fromType() { return Timestamp.class; }

    @Override
    public Class<Instant> toType() { return Instant.class; }
}
sql
CREATE TABLE user (
id uuid PRIMARY KEY,
active boolean NOT NULL DEFAULT false,
created_at timestamptz DEFAULT now()
);
store record
user.setId(UUID.randomUUID());
UserRecord userRecord = DSL.using(conn, SQLDialect.POSTGRES_9_3)
.newRecord(userTable.USER, user);
userRecord.store();
This behaviour is "expected" and has a long history in jOOQ. The short explanation is here:
Your column active is NOT NULL, thus the Java value null is interpreted as being the SQL value DEFAULT, regardless of the Record.changed() flag.
Your column created_at is nullable, thus the Java value null is interpreted
as being the SQL value DEFAULT if Record.changed(CREATED_AT) == false
as being the SQL value NULL if Record.changed(CREATED_AT) == true
With your call to DSLContext.newRecord(), you are copying all values from your user POJO into your userRecord, including all the null values. jOOQ cannot distinguish between null values that are uninitialised and null values that are explicitly... null.
In your particular case, the best solution would be to declare all columns NOT NULL (which is better on a conceptual level anyway):
CREATE TABLE user (
id uuid PRIMARY KEY,
active boolean NOT NULL DEFAULT false,
created_at timestamptz NOT NULL DEFAULT now()
);
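If changing the schema is not an option, a possible workaround (my suggestion, not part of the original answer) is to reset the changed flag on the nullable column before storing, so that jOOQ omits it from the INSERT and the database-side DEFAULT applies. The field reference name is assumed from the generated code:
UserRecord userRecord = DSL.using(conn, SQLDialect.POSTGRES_9_3)
        .newRecord(userTable.USER, user);

// Mark created_at as "not changed" so jOOQ leaves it out of the generated
// INSERT statement and PostgreSQL falls back to DEFAULT now().
userRecord.changed(userTable.USER.CREATED_AT, false);
userRecord.store();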

Need help storing nested json in postgresql using jdbi

I am attempting to store JSON in a PostgreSQL 9.4 database using the JSONB datatype with Dropwizard and JDBI. I am able to store the data, but if my JSON goes any deeper than a single level, it gets turned into a string instead of nested JSON.
For instance, the following json
{
"type":"unit",
"nested": {
"key":"embedded"
}
}
actually gets stored as
{
"type":"unit",
"nested":"{key=embedded}"
}
The method signature in my DAO is
#SqlUpdate("insert into entity_json(id, content) values(:id, :content\\:\\:jsonb)")
protected abstract void createJson(#Bind("id") String id, #Bind("content") Map content);
I obviously have something wrong, but I can't seem to figure out the correct way to store this nested data.
You can use PGobject to build a JSONB data type in Java. This way you can avoid any special handling as part of the SQL:
PGobject dataObject = new PGobject();
dataObject.setType("jsonb");
dataObject.setValue(value.toString());
A full example including converting an object to a tree, and using an ArgumentFactory to convert it to a PGobject could look like this:
public class JsonbTest {

    @Test
    public void tryoutjson() throws Exception {
        final DBI dbi = new DBI("jdbc:postgresql://localhost:5432/sighting", "postgres", "admin");
        dbi.registerArgumentFactory(new ObjectNodeArgumentFactor());
        Sample sample = dbi.onDemand(Sample.class);

        ObjectMapper mapper = new ObjectMapper();
        int id = 2;
        User user = new User();
        user.emailaddress = "me@home.com";
        user.posts = 123;
        user.username = "test";
        sample.insert(id, mapper.valueToTree(user));
    }

    public static class User {
        public String username, emailaddress;
        public long posts;
    }

    public interface Sample {
        @SqlUpdate("INSERT INTO sample (id, data) VALUES (:id, :data)")
        int insert(@Bind("id") long id, @Bind("data") TreeNode data);
    }

    public static class ObjectNodeArgumentFactor implements ArgumentFactory<TreeNode> {

        private static class ObjectNodeArgument implements Argument {
            private final PGobject value;

            private ObjectNodeArgument(PGobject value) {
                this.value = value;
            }

            @Override
            public void apply(int position,
                              PreparedStatement statement,
                              StatementContext ctx) throws SQLException {
                statement.setObject(position, value);
            }
        }

        @Override
        public boolean accepts(Class<?> expectedType, Object value, StatementContext ctx) {
            return value instanceof TreeNode;
        }

        @Override
        public Argument build(Class<?> expectedType, TreeNode value, StatementContext ctx) {
            try {
                PGobject dataObject = new PGobject();
                dataObject.setType("jsonb");
                dataObject.setValue(value.toString());
                return new ObjectNodeArgument(dataObject);
            } catch (SQLException e) {
                throw new RuntimeException(e);
            }
        }
    }
}
I was able to solve this by passing in a string obtained by calling writeValueAsString(Map) on a Jackson ObjectMapper. My createJson method turned into:
#SqlUpdate("insert into entity_json(id, content) values(:id, :content\\:\\:jsonb)")
public abstract void createJson(#Bind("id")String id, #Bind("content")String content);
and I obtained the string to pass in by creating a mapper:
private ObjectMapper mapper = Jackson.newObjectMapper();
and then calling:
mapper.writeValueAsString(map);
This gave me the nested json I was looking for.
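Put together, the call site could look roughly like this (a sketch; the id value and map contents are illustrative, and the surrounding method would need to handle JsonProcessingException):
ObjectMapper mapper = Jackson.newObjectMapper();

Map<String, Object> nested = new HashMap<>();
nested.put("key", "embedded");
Map<String, Object> content = new HashMap<>();
content.put("type", "unit");
content.put("nested", nested);

// writeValueAsString produces properly nested JSON text; the ::jsonb cast in
// the SQL statement then converts that string into a real jsonb value.
dao.createJson("some-id", mapper.writeValueAsString(content));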

Jackson JsonDeserialize not being called for @QueryParam

I have mapped a custom deserializer to convert Strings in dd/MM/yyyy format to LocalDate so I can call my services with a more readable signature.
This is my DTO class, used as a Jersey @BeanParam to transport data between layers:
public class ProdutoFilterDto implements FilterDto {

    private static final long serialVersionUID = -4998167328470565406L;

    @QueryParam("dataInicial")
    @JsonDeserialize(using = CustomLocalDateDeserializer.class)
    private LocalDate dataInicial;

    @QueryParam("dataInicial")
    @JsonDeserialize(using = CustomLocalDateDeserializer.class)
    private LocalDate dataFinal;

    public LocalDate getDataInicial() {
        return dataInicial;
    }

    public void setDataInicial(LocalDate dataInicial) {
        this.dataInicial = dataInicial;
    }

    public LocalDate getDataFinal() {
        return dataFinal;
    }

    public void setDataFinal(LocalDate dataFinal) {
        this.dataFinal = dataFinal;
    }
}
and this is my custom deserializer:
public class CustomLocalDateDeserializer extends JsonDeserializer<LocalDate> {
    @Override
    public LocalDate deserialize(JsonParser p, DeserializationContext ctxt) throws IOException, JsonProcessingException {
        final DateTimeFormatter formatter = DateTimeFormatter.ofPattern("dd/MM/yyyy");
        final String data = p.getValueAsString();
        return (LocalDate) formatter.parse(data);
    }
}
It's being used in this Jersey service:
#Path("produto")
#Produces(MediaType.APPLICATION_JSON)
#Consumes(MediaType.APPLICATION_JSON)
public class ProdutoService {
...
#GET
#Path("query")
#Override
public Response query(
#QueryParam("offset") #DefaultValue(value = "0") Integer offSet,
#QueryParam("limit") #DefaultValue(value = "10") Integer limit,
#BeanParam ProdutoFilterDto filter) { ... }
...
}
I am calling it like this:
${host goes here}/produto/query?dataInicial=11/09/1992
The problem is that the deserializer method is never called, and the bean param variable remains null.
MessageBodyReaders aren't used for @QueryParam. You seem to be expecting the Jackson MessageBodyReader to handle this deserialization, but it doesn't work like that.
Instead you will want to use a ParamConverter, which will need to be registered through a ParamConverterProvider. For example:
@Provider
public class LocalDateParamConverterProvider implements ParamConverterProvider {

    final DateTimeFormatter formatter = DateTimeFormatter.ofPattern("dd/MM/yyyy");

    @Override
    public <T> ParamConverter<T> getConverter(
            Class<T> rawType, Type genericType, Annotation[] antns) {
        if (LocalDate.class == rawType) {
            return new ParamConverter<T>() {
                @Override
                public T fromString(String string) {
                    try {
                        LocalDate localDate = LocalDate.parse(string, formatter);
                        return rawType.cast(localDate);
                    } catch (Exception ex) {
                        throw new BadRequestException(ex);
                    }
                }

                @Override
                public String toString(T t) {
                    LocalDate localDate = (LocalDate) t;
                    return formatter.format(localDate);
                }
            };
        }
        return null;
    }
}
Now LocalDate will work with @QueryParam and the other @XxxParam annotations as well.
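If your setup does not pick up @Provider classes automatically via classpath scanning, you can register the provider explicitly; a minimal sketch assuming a Jersey 2 ResourceConfig-based application (the application class name is illustrative):
// Hypothetical Jersey application class; the key line is the provider registration.
public class ProdutoApplication extends ResourceConfig {
    public ProdutoApplication() {
        register(LocalDateParamConverterProvider.class);
        register(ProdutoService.class);
    }
}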
Some things to note:
If your goal is to parse both your @XxxParams and your JSON body into the same bean, this will not work. I'm not sure how that would work, but I'm sure it would involve a lot of hacking, and I wouldn't recommend it.
Your cast to (LocalDate) won't work: formatter.parse(data) returns a java.time.format.Parsed instance, so the cast fails at runtime. See the correct approach in the code example above.
Related to the above point: I was pulling my hair out for a good hour trying to figure out why I was getting a 404 when using your parse code. With a 404, the last place I thought to look was the ParamConverter, but it turns out that any uncaught exception thrown in a ParamConverter causes a 404. Doesn't make much sense, right? The head pounding eventually led me to the relevant part of the JAX-RS specification, which seems like a poor choice:
"if the field or property is annotated with @MatrixParam, @QueryParam or @PathParam then an implementation MUST generate an instance of NotFoundException (404 status) that wraps the thrown exception and no entity"
Moral of the story: make sure to catch any possible exceptions in the ParamConverter!
