Need help storing nested JSON in PostgreSQL using JDBI - Java

I am attempting to store JSON in a PostgreSQL 9.4 database using the JSONB datatype, with Dropwizard and JDBI. I am able to store the data, but if my JSON goes any deeper than a single level, the nested objects get stored as strings instead of nested JSON.
For instance, the following JSON
{
  "type": "unit",
  "nested": {
    "key": "embedded"
  }
}
actually gets stored as
{
  "type": "unit",
  "nested": "{key=embedded}"
}
The method signature in my DAO is
@SqlUpdate("insert into entity_json(id, content) values(:id, :content\\:\\:jsonb)")
protected abstract void createJson(@Bind("id") String id, @Bind("content") Map content);
I obviously have something wrong, but I can't seem to figure out the correct way to store this nested data.

You can use PGobject to build a JSONB data type in Java. This way you can avoid any special handling as part of the SQL:
PGobject dataObject = new PGobject();
dataObject.setType("jsonb");
dataObject.setValue(value.toString());
A full example, including converting an object to a tree and using an ArgumentFactory to convert it to a PGobject, could look like this:
public class JsonbTest {

    @Test
    public void tryoutjson() throws Exception {
        final DBI dbi = new DBI("jdbc:postgresql://localhost:5432/sighting", "postgres", "admin");
        dbi.registerArgumentFactory(new ObjectNodeArgumentFactor());
        Sample sample = dbi.onDemand(Sample.class);

        ObjectMapper mapper = new ObjectMapper();
        int id = 2;
        User user = new User();
        user.emailaddress = "me@home.com";
        user.posts = 123;
        user.username = "test";
        sample.insert(id, mapper.valueToTree(user));
    }

    public static class User {
        public String username, emailaddress;
        public long posts;
    }

    public interface Sample {
        @SqlUpdate("INSERT INTO sample (id, data) VALUES (:id, :data)")
        int insert(@Bind("id") long id, @Bind("data") TreeNode data);
    }

    public static class ObjectNodeArgumentFactor implements ArgumentFactory<TreeNode> {

        private static class ObjectNodeArgument implements Argument {
            private final PGobject value;

            private ObjectNodeArgument(PGobject value) {
                this.value = value;
            }

            @Override
            public void apply(int position,
                              PreparedStatement statement,
                              StatementContext ctx) throws SQLException {
                statement.setObject(position, value);
            }
        }

        @Override
        public boolean accepts(Class<?> expectedType, Object value, StatementContext ctx) {
            return value instanceof TreeNode;
        }

        @Override
        public Argument build(Class<?> expectedType, TreeNode value, StatementContext ctx) {
            try {
                PGobject dataObject = new PGobject();
                dataObject.setType("jsonb");
                dataObject.setValue(value.toString());
                return new ObjectNodeArgument(dataObject);
            } catch (SQLException e) {
                throw new RuntimeException(e);
            }
        }
    }
}

I was able to solve this by passing in a string obtained by calling writeValueAsString(Map) on a Jackson ObjectMapper. My createJson method turned into:
#SqlUpdate("insert into entity_json(id, content) values(:id, :content\\:\\:jsonb)")
public abstract void createJson(#Bind("id")String id, #Bind("content")String content);
and I obtained the string to pass in by creating a mapper:
private ObjectMapper mapper = Jackson.newObjectMapper();
and then calling:
mapper.writeValueAsString(map);
This gave me the nested json I was looking for.
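Putting the pieces together, a minimal usage sketch might look like this (the dao handle and the map contents are illustrative, not from the original code):
// Build the nested structure from the question and serialize it
// to a JSON string before handing it to the DAO.
Map<String, Object> nested = new HashMap<>();
nested.put("key", "embedded");

Map<String, Object> content = new HashMap<>();
content.put("type", "unit");
content.put("nested", nested);

ObjectMapper mapper = Jackson.newObjectMapper();
dao.createJson("some-id", mapper.writeValueAsString(content));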

Related

JDBCIO Calling Postgres Routine (Stored Proc) which takes a Custom Object Type as parameter

I'm trying to call a Postgres routine which takes a custom Object type as a parameter.
create type person_type as
(
    first   varchar,
    second  varchar,
    is_real boolean
);
My routine (stored proc):
create function person_routine(person person_type)
    returns void
    language plpgsql
as $$
BEGIN
    INSERT INTO person(first, second, is_real)
    VALUES (person.first, person.second, person.is_real);
END;
$$;
Then I attempt creating a Java class to represent the custom type:
import java.sql.SQLData;
import java.sql.SQLException;
import java.sql.SQLInput;
import java.sql.SQLOutput;

public class PersonType implements SQLData {
    public String first;
    public String second;
    public boolean is_real;

    private String sql_type;

    @Override
    public String getSQLTypeName() throws SQLException {
        return sql_type;
    }

    @Override
    public void readSQL(SQLInput stream, String typeName) throws SQLException {
        sql_type = typeName;
        second = stream.readString();
        first = stream.readString();
        is_real = stream.readBoolean();
    }

    @Override
    public void writeSQL(SQLOutput stream) throws SQLException {
        stream.writeString(first);
        stream.writeBoolean(is_real);
        stream.writeString(second);
    }
}
Then I attempted to execute the code like this:
.apply(JdbcIO.<Person>write()
    .withDataSourceConfiguration(JdbcIO.DataSourceConfiguration.create(
            "org.postgresql.Driver", configuration.GetValue("postgres_host"))
        .withUsername(configuration.GetValue("postgres_username"))
        .withPassword(configuration.GetValue("postgres_password")))
    .withStatement("SELECT person_routine(?)")
    .withPreparedStatementSetter(new JdbcIO.PreparedStatementSetter<Person>() {
        public void setParameters(Person element, PreparedStatement query)
                throws SQLException {
            PersonType dto = new PersonType();
            dto.first = element.first;
            dto.second = element.second;
            dto.is_real = element.is_real;
            query.setObject(1, dto);
        }
    })
);
Unfortunately that gives me an exception:
java.lang.RuntimeException: org.postgresql.util.PSQLException: Can't infer the SQL type to use for an instance of dto.PersonType. Use setObject() with an explicit Types value to specify the type to use.
Any help would be great.
So, it's about using PGobject. This is how I achieved it, and it seems to work really well.
PGobject person = new PGobject();
person.setType("person_type");
person.setValue(String.format("(%s,%s,%s)", "something", "something", "FALSE"));
query.setObject(1, person);
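Applied to the JdbcIO pipeline from the question, the PreparedStatementSetter could build the composite literal from each element. This is only a sketch under the question's assumptions; note that bare values in the composite literal only work while they contain no commas, quotes, or parentheses:
.withPreparedStatementSetter(new JdbcIO.PreparedStatementSetter<Person>() {
    public void setParameters(Person element, PreparedStatement query)
            throws SQLException {
        PGobject person = new PGobject();
        person.setType("person_type");
        // Values with special characters would need quoting/escaping.
        person.setValue(String.format("(%s,%s,%s)",
                element.first, element.second, element.is_real));
        query.setObject(1, person);
    }
})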

Spring batch - use JdbcCursorItemReader to read from a DataSource and work with the ResultSet

I'm trying to use Spring Batch to create a Job that uses a DataSource (configured beforehand) and runs a query.
I want to be able to iterate through the returned ResultSet to create a new table with the returned data.
I can create a reader like this, but I don't know how to iterate over the results.
@Bean
public JdbcCursorItemReader<ResultSet> reader(String query, DataSource dataSource) {
    JdbcCursorItemReader<ResultSet> itemReader = new JdbcCursorItemReader<>();
    itemReader.setDataSource(dataSource);
    itemReader.setSql(query);
    return itemReader;
}
What should my ItemProcessor receive? This?
public class ExtractionItemProcessor implements ItemProcessor<ResultSet, String> {
    @Override
    public String process(ResultSet item) throws Exception {
        // transform the resultSet into a SQL INSERT
    }
}
EDIT: the only way I know the shape of the query results is through the ResultSet metadata, so I can't create a POJO and set its properties.
Create a POJO class to represent a record, such as Foo:
public class Foo {
    private final long id;
    // more properties

    public Foo(long id /* , ... */) {
        this.id = id;
        // set other properties
    }

    public long getId() {
        return id;
    }
}
so the reader will be a JdbcCursorItemReader<Foo>.
And create a RowMapper<Foo> such as:
public class FooRowMapper implements RowMapper<Foo> {
    @Override
    public Foo mapRow(ResultSet rs, int rowNum) throws SQLException {
        long id = rs.getLong("id");
        // more properties
        return new Foo(id /* , more properties */);
    }
}
and set it on the reader: itemReader.setRowMapper(new FooRowMapper()).
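Putting it together, the reader bean from the question would then be typed to Foo. A sketch:
@Bean
public JdbcCursorItemReader<Foo> reader(String query, DataSource dataSource) {
    JdbcCursorItemReader<Foo> itemReader = new JdbcCursorItemReader<>();
    itemReader.setDataSource(dataSource);
    itemReader.setSql(query);
    itemReader.setRowMapper(new FooRowMapper()); // map each row to a Foo
    return itemReader;
}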
The ExtractionItemProcessor will receive a Foo, and would look something like:
public class ExtractionItemProcessor implements ItemProcessor<Foo, String> {
    @Override
    public String process(Foo item) throws Exception {
        int someValue = transform(item);
        return "INSERT INTO blah blah blah..." + someValue + "..."; // potentially dangerous - see SQL injection attack
    }
}
To take it further, perhaps ExtractionItemProcessor's transformation of a Foo should create a Bar. Then it would look like:
public class ExtractionItemProcessor implements ItemProcessor<Foo, Bar> {
    @Override
    public Bar process(Foo item) throws Exception {
        int someValue = transform(item);
        // more values....
        return new Bar(someValue /* , more values */);
    }
}
So then the ItemWriter implementation will take a List<Bar> and will know how to safely insert Bars into another table.
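A minimal writer sketch (assuming a JdbcTemplate, a hypothetical bar table with a getSomeValue() accessor on Bar, and the List-based ItemWriter contract used before Spring Batch 5):
public class BarItemWriter implements ItemWriter<Bar> {
    private final JdbcTemplate jdbcTemplate;

    public BarItemWriter(JdbcTemplate jdbcTemplate) {
        this.jdbcTemplate = jdbcTemplate;
    }

    @Override
    public void write(List<? extends Bar> items) throws Exception {
        for (Bar bar : items) {
            // Parameterized SQL avoids the injection risk noted above.
            jdbcTemplate.update("INSERT INTO bar (some_value) VALUES (?)", bar.getSomeValue());
        }
    }
}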

A way to bind Java Map<String, Object> to sql varchar in JDBI INSERT statement

Is there a way to bind a Java Map<String, Object> to a varchar using the JDBI @BindBean annotation?
So for example I have a class Something.class and I create a
@SqlBatch("INSERT INTO Something (name, payload) VALUES(:name, :payload)").
Now in my Java class, name is of type String and payload is of type Map<String, Object>, while in the DB table both columns are varchar(...). I want the Map object to be inserted into the column as a JSON object. Is that somehow achievable without creating my own complex Binder as defined in http://jdbi.org/sql_object_api_argument_binding/, and other than making my payload be of type String in Java?
public class MapArgument implements Argument {
    private Map<String, Object> payload;
    private ObjectMapper objectMapper;

    public MapArgument(ObjectMapper objectMapper, Map<String, Object> payload) {
        this.objectMapper = objectMapper;
        this.payload = payload;
    }

    @Override
    public void apply(int i, PreparedStatement statement, StatementContext statementContext) throws SQLException {
        try {
            statement.setString(i, objectMapper.writeValueAsString(payload));
        } catch (JsonProcessingException e) {
            log.info("Failed to serialize payload ", e);
        }
    }
}
public class MapArgumentFactory implements ArgumentFactory<Map<String, Object>> {
    private ObjectMapper mapper;

    public MapArgumentFactory(ObjectMapper mapper) {
        this.mapper = mapper;
    }

    @Override
    public boolean accepts(Class<?> type, Object value, StatementContext statementContext) {
        return value instanceof Map;
    }

    @Override
    public Argument build(Class<?> type, Map<String, Object> map, StatementContext statementContext) {
        return new MapArgument(mapper, map);
    }
}
I fixed my problem by creating an ArgumentFactory binder, as suggested in this post.
What I needed was to create a class that just contained one field of type NotificationPayload and implemented the Argument interface from org.skife.jdbi.v2.tweak, so I ended up with the following:
public class NotificationPayloadArgument implements Argument {
    private NotificationPayload payload;

    NotificationPayloadArgument(NotificationPayload payload) {
        this.payload = payload;
    }

    @Override
    public void apply(int i, PreparedStatement preparedStatement, StatementContext statementContext)
            throws SQLException {
        preparedStatement.setString(i, toString());
    }

    @Override
    public String toString() {
        return new JSONObject(payload).toString();
    }
}
To make this work I of course needed to implement a Factory class which implements the org.skife.jdbi.v2.tweak.ArgumentFactory<T> interface with my newly created type, so the factory ended up as such:
public class NotificationPayloadFactory implements ArgumentFactory<NotificationPayload> {
    @Override
    public boolean accepts(Class<?> expectedType, Object value, StatementContext ctx) {
        return value instanceof NotificationPayload;
    }

    @Override
    public Argument build(Class<?> expectedType, NotificationPayload value, StatementContext ctx) {
        return new NotificationPayloadArgument(value);
    }
}
And of course, lastly, as mentioned in Does JDBI accept UUID parameters?, I had to register my factory:
jdbi.registerArgumentFactory(new NotificationPayloadFactory());
I tried to do this with just a factory for a Map<String, Object>, but I could not make it work that way, and there was a risk of NPEs.
EDIT
The reason I am overriding toString() in NotificationPayloadArgument is that I need the payload in JSON format in some places where I need a String object. Otherwise it can be removed, and new JSONObject(payload).toString() can be used directly in the preparedStatement.setString() call where toString() is currently called.
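For completeness, a hypothetical DAO sketch using the registered factory (the interface and method names are illustrative, not from the original post):
public interface SomethingDao {
    @SqlUpdate("INSERT INTO Something (name, payload) VALUES (:name, :payload)")
    void insert(@Bind("name") String name, @Bind("payload") NotificationPayload payload);
}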
The Kotlin way of using JDBI to insert a data class as JSONB: just create an ArgumentFactory for that data type and register the factory with JDBI. It is as simple as these few lines:
internal class CvMetadataArgumentFactory : AbstractArgumentFactory<CvMetadata?>(Types.OTHER) {
    override fun build(value: CvMetadata?, config: ConfigRegistry): Argument {
        return Argument { position, statement, _ ->
            statement.setString(position, value?.toString() ?: "")
        }
    }
}
Later, register this factory:
jdbi.registerArgument(CvMetadataArgumentFactory())
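For reference, a roughly equivalent Jdbi 3 factory in Java could look like this sketch (CvMetadata stands in for whatever data class is being serialized):
class CvMetadataArgumentFactory extends AbstractArgumentFactory<CvMetadata> {

    CvMetadataArgumentFactory() {
        super(Types.OTHER); // bind as an "other" SQL type, which PostgreSQL accepts for jsonb
    }

    @Override
    protected Argument build(CvMetadata value, ConfigRegistry config) {
        return (position, statement, ctx) ->
                statement.setString(position, value != null ? value.toString() : "");
    }
}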

Jackson - Transform field value on serialization

In a Spring Boot application with an AngularJS frontend, a "Pin" field value has to be blacked out on serialization: if the Pin field value is null in the POJO, the corresponding JSON field has to remain blank; if the field contains data, it has to be replaced with the string "***".
Does Jackson provide a feature to get this done?
You can do it easily like the following, without any custom serializer:
public class Pojo {
    @JsonIgnore
    private String pin;

    @JsonProperty("pin")
    public String getPin() {
        if (pin == null) {
            return "";
        } else {
            return "***";
        }
    }

    @JsonProperty("pin")
    public void setPin(String pin) {
        this.pin = pin;
    }

    @JsonIgnore
    public String getPinValue() {
        return pin;
    }
}
You can use Pojo.getPinValue() to get the exact value.
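A quick usage sketch to illustrate the masking (hypothetical values):
ObjectMapper mapper = new ObjectMapper();

Pojo pojo = new Pojo();
pojo.setPin("1234");
System.out.println(mapper.writeValueAsString(pojo)); // {"pin":"***"}

pojo.setPin(null);
System.out.println(mapper.writeValueAsString(pojo)); // {"pin":""}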
Try the following example.
public class Card {
    public int id;
    public String pin;

    public Card(int id, String pin) { this.id = id; this.pin = pin; }
}
public class CardSerializer extends StdSerializer<Card> {

    public CardSerializer() {
        this(null);
    }

    public CardSerializer(Class<Card> t) {
        super(t);
    }

    @Override
    public void serialize(Card value, JsonGenerator jgen, SerializerProvider provider)
            throws IOException, JsonProcessingException {
        jgen.writeStartObject();
        jgen.writeNumberField("id", value.id);
        jgen.writeStringField("pin", "****");
        jgen.writeEndObject();
    }
}
Then you need to register your custom serializer with the ObjectMapper:
Card card = new Card(1, "12345");
ObjectMapper mapper = new ObjectMapper();

SimpleModule module = new SimpleModule();
module.addSerializer(Card.class, new CardSerializer());
mapper.registerModule(module);

String serialized = mapper.writeValueAsString(card);
There are some improvements you can make here, like registering the serializer directly on the class; you can read more about it in Section 4 of http://www.baeldung.com/jackson-custom-serialization.
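For instance, registering the serializer directly on the class (per Section 4 of the linked article) would look like this sketch:
// The annotation replaces the manual SimpleModule registration above.
@JsonSerialize(using = CardSerializer.class)
public class Card {
    public int id;
    public String pin;
}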

How to specify object custom serialization in ORMLite?

I would like to store a field of my ParentClass as a JSON string in my database. I don't want to use the Serializable interface and DataType.SERIALIZABLE because it ties the stored data to the full class name of the serialized class.
So I'm using the following code:
class ParentClass {
    @DatabaseField(persisterClass = MyFieldClassPersister.class)
    private MyFieldClass myField;
}
where the persister class is something like:
public class MyFieldClassPersister extends StringType {

    private static final MyFieldClassPersister singleTon = new MyFieldClassPersister();

    public static MyFieldClassPersister getSingleton() {
        return singleTon;
    }

    protected MyFieldClassPersister() {
        super(SqlType.STRING, new Class<?>[0]);
    }

    @Override
    public Object parseDefaultString(FieldType fieldType, String defaultStr) {
        return jsonStringToObject(defaultStr);
    }

    @Override
    public Object resultToSqlArg(FieldType fieldType, DatabaseResults results, int columnPos) throws SQLException {
        String string = results.getString(columnPos);
        return jsonStringToObject(string);
    }

    private static MyFieldClass jsonStringToObject(String string) {
        // json to object conversion logic
    }
}
Here are two issues I've run into:
I didn't figure out how to specify a custom conversion from the object to a string. It seems that ORMLite calls Object.toString() to get the string representation of the object. It would be great to have some method in the persister where I could specify how to convert the object to a string (JSON in my case). Yes, I can override the toString() method in MyFieldClass, but it is more convenient to perform the conversion in the persister. Is there any method I can override to specify the conversion from the model object to the DB value?
If I mark my custom field type as String type:
class ParentClass {
    @DatabaseField(dataType = DataType.STRING, persisterClass = MyFieldClassPersister.class)
    private MyFieldClass myField;
}
then ORMLite crashes when saving the object, with the following message:
java.lang.IllegalArgumentException: Field class com.myapp.venue.MyFieldClass for
field FieldType:name=myField,class=ParentClass is not valid for type
com.j256.ormlite.field.types.StringType@272ed83b, maybe should be
class java.lang.String
It doesn't crash if I omit the dataType specification. Can I avoid this crash in some way? It seems to me that it's better to specify types explicitly.
So basically your persister should be implemented as follows:
public class MyFieldClassPersister extends StringType {

    private static final MyFieldClassPersister INSTANCE = new MyFieldClassPersister();

    private MyFieldClassPersister() {
        super(SqlType.STRING, new Class<?>[] { MyFieldClass.class });
    }

    public static MyFieldClassPersister getSingleton() {
        return INSTANCE;
    }

    @Override
    public Object javaToSqlArg(FieldType fieldType, Object javaObject) {
        MyFieldClass myFieldClass = (MyFieldClass) javaObject;
        return myFieldClass != null ? getJsonFromMyFieldClass(myFieldClass) : null;
    }

    @Override
    public Object sqlArgToJava(FieldType fieldType, Object sqlArg, int columnPos) {
        return sqlArg != null ? getMyFieldClassFromJson((String) sqlArg) : null;
    }

    private String getJsonFromMyFieldClass(MyFieldClass myFieldClass) {
        // logic here
    }

    private MyFieldClass getMyFieldClassFromJson(String json) {
        // logic here
    }
}
You should register it in the onCreate method of your OrmLiteSqliteOpenHelper class:
@Override
public void onCreate(SQLiteDatabaseHolder holder, ConnectionSource connectionSource) {
    try {
        //...
        DataPersisterManager
                .registerDataPersisters(MyFieldClassPersister.getSingleton());
    } catch (SQLException e) {
        // log exception
    }
}
And then you can use it in your model like this:
@DatabaseField(persisterClass = MyFieldClassPersister.class, columnName = "column_name")
protected MyFieldClass myFieldClass;
Don't register the persister adapter in the onCreate() method. This method only gets called when your database is first created. You should add this somewhere else, like your constructor or onOpen() method.
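For example, a sketch of registering in the helper's constructor instead (the helper class name and database details are assumed):
public class MyDatabaseHelper extends OrmLiteSqliteOpenHelper {

    public MyDatabaseHelper(Context context) {
        super(context, "mydb.sqlite", null, 1);
        // Runs every time the helper is constructed,
        // not only when the database is first created.
        DataPersisterManager.registerDataPersisters(MyFieldClassPersister.getSingleton());
    }

    // onCreate / onUpgrade implementations omitted
}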
