Store 3rd party library object in database - java

Dear Stackoverflow community,
I'm developing a Spring Boot application which uses a 3rd party library (https://github.com/goldmansachs/jdmn).
There is an object called TDefinitions which holds information about a .dmn diagram.
Somehow I need to persist this object, or all the information it contains, in my database using a Spring Data JPA repository, so that I can reuse it whenever I run my application again.
Could you please advise on ways to achieve that? Should I store it as a Blob, or maybe serialize it to JSON somehow?
The problem is that it's not a simple object. Here's part of its definition:
@JsonPropertyOrder({...})
public class TDefinitions extends TNamedElement implements Visitable {
    @JsonProperty("import")
    private List<TImport> _import;
    private List<TItemDefinition> itemDefinition;
    private List<TDRGElement> drgElement;
    private List<TArtifact> artifact;
    private List<TElementCollection> elementCollection;
    private List<TBusinessContextElement> businessContextElement;
    private DMNDI dmndi;
    private String expressionLanguage;
    private String typeLanguage;
    [...]
How should I approach this problem from an architectural perspective? How should I design my app? Should I provide interfaces/classes that extend elements from this library?

I'd prefer a NoSQL store such as MongoDB for these kinds of objects. If you have to use SQL for some reason, then go with JSON.
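If you end up on the SQL/JPA route, a minimal sketch of the JSON idea could be a JPA AttributeConverter backed by Jackson. This assumes TDefinitions round-trips cleanly through Jackson (it carries Jackson annotations, but verify that against the jDMN version you use); the entity, column and class names here are hypothetical.
import java.io.IOException;
import javax.persistence.*;
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.ObjectMapper;
// plus the TDefinitions import from the jDMN artifact you depend on

@Converter
public class TDefinitionsConverter implements AttributeConverter<TDefinitions, String> {
    // One mapper instance to (de)serialize the whole DMN definitions tree as JSON text.
    private static final ObjectMapper MAPPER = new ObjectMapper();

    @Override
    public String convertToDatabaseColumn(TDefinitions definitions) {
        try {
            return definitions == null ? null : MAPPER.writeValueAsString(definitions);
        } catch (JsonProcessingException e) {
            throw new IllegalStateException("Could not serialize TDefinitions", e);
        }
    }

    @Override
    public TDefinitions convertToEntityAttribute(String json) {
        try {
            return json == null ? null : MAPPER.readValue(json, TDefinitions.class);
        } catch (IOException e) {
            throw new IllegalStateException("Could not deserialize TDefinitions", e);
        }
    }
}

// DmnDiagram.java (separate file) - hypothetical wrapper entity
@Entity
public class DmnDiagram {
    @Id
    @GeneratedValue
    private Long id;

    @Lob // stored as a large text/JSON column
    @Convert(converter = TDefinitionsConverter.class)
    private TDefinitions definitions;
}
With that in place, a plain Spring Data JpaRepository<DmnDiagram, Long> is enough to save and reload the diagram.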

Related

How to use multiple Java annotations with the same value?

I'm using Retrofit along with Gson to retrieve data from an API and deserialize it into Java objects using Gson's @SerializedName annotation, like below:
public class MyApiObject {
    @SerializedName("apiJsonKey")
    private String myValue;
    ...
}
It works fine, but I need to send MyApiObject instances to a Firebase database, and for that the object needs to be serialized back to JSON. Firebase's Java API does this automatically, but it generates the keys from the instance variables' names (myValue) rather than from the serialized name ("apiJsonKey").
I know I can use Firebase's @PropertyName annotation, but that would require using two annotations with the same values, which is redundant and error-prone.
Is there a better way to do this?
The usual approach in these cases is to define a constant and use it in both annotations.
public class MyApiObject {
    private static final String MY_VALUE_NAME = "apiJsonKey";

    @SerializedName(MY_VALUE_NAME)
    @PropertyName(MY_VALUE_NAME)
    private String myValue;
    ...
}
This is fairly common with sequence annotations in JPA, for example.
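For illustration, the JPA analogy could look like this (entity and sequence names are hypothetical):
import javax.persistence.*;

@Entity
public class Invoice {
    // One constant shared by both annotations, so the generator name can never drift apart.
    private static final String SEQ = "invoice_seq";

    @Id
    @SequenceGenerator(name = SEQ, sequenceName = SEQ)
    @GeneratedValue(strategy = GenerationType.SEQUENCE, generator = SEQ)
    private Long id;
}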

DataTableRepository in Spring data Elasticsearch

Currently we are using Spring Data JPA with a MySQL database and DataTablesRepository, which works well with JPA. Now we are moving our data to Spring Data Elasticsearch, but DataTablesRepository does not work with that. Is there an alternative, or how can I implement a custom repository for it?
spring-data-jpa-datatables does not implement support for ElasticsearchRepository, as you say, and it uses the Specification API, which is not implemented by Spring Data Elasticsearch, so extending it would take some work.
What you need to do is create your own ElasticsearchRepositoryFactoryBean (e.g. ElasticsearchDataTablesRepositoryFactoryBean) and your own implementation of AbstractElasticsearchRepository that implements the specifics of spring-data-jpa-datatables, just like DataTablesRepositoryImpl does. You should also define your own DataTablesRepository (an ElasticsearchDataTablesRepository that extends ElasticsearchRepository) with the same methods.
The org.springframework.data.jpa.datatables.mapping classes can be reused, but you'll have to recreate the logic found in SpecificationFactory for Elasticsearch using QueryBuilders, which I imagine will be the most time-consuming part.
When you're done, you can use @EnableElasticsearchRepositories just as described by spring-data-jpa-datatables, i.e.:
@EnableElasticsearchRepositories(repositoryFactoryBeanClass = ElasticsearchDataTablesRepositoryFactoryBean.class)
Then extend your repositories with your ElasticsearchDataTablesRepository interface and you're good to go.
For reference, look at SpecificationFactory and AbstractElasticsearchRepository (the search method) and get familiar with the Elasticsearch QueryBuilders.
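As a rough sketch of the interface described above (DataTablesInput/DataTablesOutput are the reusable classes from org.springframework.data.jpa.datatables.mapping; the interface name is one you would create yourself):
import java.io.Serializable;
import org.springframework.data.elasticsearch.repository.ElasticsearchRepository;
import org.springframework.data.jpa.datatables.mapping.DataTablesInput;
import org.springframework.data.jpa.datatables.mapping.DataTablesOutput;
import org.springframework.data.repository.NoRepositoryBean;

// Counterpart of spring-data-jpa-datatables' DataTablesRepository, but built on
// ElasticsearchRepository; the query translation itself lives in your
// AbstractElasticsearchRepository subclass.
@NoRepositoryBean
public interface ElasticsearchDataTablesRepository<T, ID extends Serializable>
        extends ElasticsearchRepository<T, ID> {

    DataTablesOutput<T> findAll(DataTablesInput input);
}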

Take advantage of JSON flexibility in Java components

I am just trying to find a way to be more flexible.
I have a Java web app that connects to a NoSQL DB (Couchbase). In order to map the stored JSON documents, I have created a jar which contains the Java classes for all those JSONs.
The Rex JSON document:
{
"age":15
}
Mapping the JSON structure to a Java class:
public class Dog{
private int age;
// getters+setters
}
The problem I am facing is this:
Whenever I update the JSON structure in the DB (because JSON is flexible), I also have to update the Java classes, recompile a new jar version of the classes, and update the web app's dependency to the new jar version.
A newly needed update for the Rex JSON:
{
"dob":"1999/01/25",
"name":"Rex"
}
I need an update to the Dog class looking like this:
public class Dog{
private String dob;
private String name;
// getters+setters
}
How can I create the Java classes to be flexible and not need a new recompilation of the classes jar?
My main objective is to avoid updating/redeploying the web apps that connect to the NoSQL store whenever the JSON structure changes.
Hoping this is not a dumb question, I thank you,
Georgian
You can use a Map<String, String> to store these values.
How can I create the Java classes to be flexible and not need a new recompilation of the classes jar?
I assume that you need to read and write JSON using these classes ...
If so, you need to choose between:
a JSON <-> Java binding (like you are currently using) where you will need to generate new Java classes when your JSON structures change, OR
a JSON binding that binds JSON "objects" to a Map type, OR
something like the old "org.json" library where you load JSON into a JSONObject.
There is no way to get a statically typed API (like your Dog class) that automagically deals with changing JSON structures.
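As a minimal sketch of the second option, assuming Jackson is on the classpath (Couchbase's own JsonObject type would work in much the same way):
import java.io.IOException;
import java.util.Map;
import com.fasterxml.jackson.databind.ObjectMapper;

public class FlexibleDogReader {
    private static final ObjectMapper MAPPER = new ObjectMapper();

    // No Dog class to recompile: new JSON fields simply show up as new map entries.
    @SuppressWarnings("unchecked")
    public Map<String, Object> read(String json) throws IOException {
        return MAPPER.readValue(json, Map.class);
    }
}
For example, read("{\"dob\":\"1999/01/25\",\"name\":\"Rex\"}").get("name") returns "Rex" without touching the jar.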

Google App Engine class on client side

I am developing an Android app using GAE on Eclipse.
On one of the EndPoint classes I have a method which returns a "Bla"-type object:
public Bla foo()
{
return new Bla();
}
This "Bla" object holds a "Bla2"-type object:
public class Bla {
private Bla2 bla = new Bla2();
public Bla2 getBla() {
return bla;
}
public void setBla(Bla2 bla) {
this.bla = bla;
}
}
Now, my problem is that I can't access the "Bla2" class from the client side. (Even the method getBla() doesn't exist.)
I managed to work around it by creating a second method on the endpoint class which returns a "Bla2" object:
public Bla2 foo2()
{
return new Bla2();
}
Now I can use the "Bla2" class on the client side, but the Bla.getBla() method still doesn't exist. Is there a right way to do this?
This isn't the 'right' way, but keep in mind that just because you are using endpoints, you don't have to stick to the endpoints way of doing things for all of your entities.
Like you, I'm using GAE/J and Cloud Endpoints and have an Android client. It's great running Java on both the client and the server because I can share code between all my projects.
Some of my entities are communicated and shared the normal 'endpoints way', as you are doing. But for other entities I still use JSON, but just stick them in a string, send them through a generic endpoint, and deserialize them on the other side, which is easy because the entity class is in the shared code.
This allows me to send 50 different entity types through a single endpoint, and it makes it easy for me to customize the JSON serializing/deserializing for those entities.
Of course, this solution gets you in trouble if you decide to add an iOS or web client (unless you use GWT), but maybe that isn't important to you.
(edit - added some impl. detail)
Serializing your Java objects (or entities) to/from JSON is very easy, but the details depend on the JSON library you use. Endpoints can use either Jackson or GSON on the client. For my own JSON handling I used json.org, which is built into Android and was easy to download and add to my GAE project.
Here's a tutorial that someone just published:
http://www.survivingwithandroid.com/2013/10/android-json-tutorial-create-and-parse.html
Then I added an endpoint like this:
@ApiMethod(name = "sendData")
public void sendData( @Named("clientId") String clientId, String jsonObject )
(Or use a parameter class that includes a List of Strings so you can send multiple entities in one request.)
Then put an element into your JSON which tells the server which entity class the JSON should be deserialized into.
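For example, a hypothetical helper on the Android side could build such an envelope with org.json (the "type" and "data" field names are only illustrative):
import org.json.JSONException;
import org.json.JSONObject;

public class GenericPayloads {
    // Wraps an entity's JSON together with a discriminator the server can use
    // to decide which class to deserialize the payload into.
    public static String wrap(String entityType, JSONObject entity) throws JSONException {
        JSONObject envelope = new JSONObject();
        envelope.put("type", entityType);
        envelope.put("data", entity);
        return envelope.toString();
    }
}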
Try using @ApiResourceProperty on the field.
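A hedged sketch of what that could look like (the name attribute is assumed from the Cloud Endpoints annotation; adjust to your actual classes):
import com.google.api.server.spi.config.ApiResourceProperty;

public class Bla {
    // Exposes the nested object in the generated client library under an explicit name.
    @ApiResourceProperty(name = "bla")
    private Bla2 bla = new Bla2();

    public Bla2 getBla() { return bla; }
    public void setBla(Bla2 bla) { this.bla = bla; }
}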

Restful architecture issue with (too many) complex objects

Alright, I've been put in charge of both the server and the client (used internally) sides of this RESTful architecture (using Restlet).
We've got a resource that exposes the Post operation. Here's a simplified version:
public class UserResource {
    @Post
    public Representation create(UserRegistration registration) {
        SomeService.getInstance().createUser(registration);
        return new XstreamRepresentation(new RegistrationResponse(registration.getUniqueCode()));
    }
}
For a few months, we've been the only ones using these services, so domain objects were shared across client and server sides... and it's been working just fine.
Now that we have to document these resources and let other clients use them, some "issues" have arisen that make me think this API might be a little too complicated.
This Post service, for example.
The internal method accepts complex type UserRegistration
public class UserRegistration implements Serializable {
private Profile profile;
private Boolean someBooleanProperty;
public UserRegistration(Profile profile) {
this(profile, true);
}
public Profile getProfile() {
return profile;
}
public boolean isSomeBooleanProperty() {
return someBooleanProperty;
}
}
which, in turn, uses another complex object (Profile)
public class Profile {
private String nickname;
private String email;
private String password;
private String firstname;
private String lastname;
private Date birthDate;
private String phone;
private Address address;
private GenderType gender;
private String subscriptionSite;
private Date privacyAcceptanceDate;
private Date subscriptionDate;
private String activationCode;
private String socialSecurityNumber;
...
which in turn uses a lot of other complex types, and so on.
This use of complex types is what really bugs me.
I either don't know how to document this (apart from making a long, long list of these complex objects' inner properties) or I'm just lost.
My questions are:
Do I have to simplify?
Is this architecture very bad-designed?
Would a few builder methods do the trick?
By sharing domain entity types between the client and the server, you (not saying you specifically) have completely defeated the point of REST. RESTful systems are supposed to share only media types and link relations. Sharing types like you are doing is much easier with SOAP because WSDL allows toolkits to take care of the details of keeping the client and server types in sync.
REST is all about reducing the coupling between client and server to allow them to evolve independently. Obviously, if you have a large set of shared types, that is going to be difficult, which is why you currently have this bad feeling.
The solution I have taken to this problem is to define two media types. One is sort of a generic entity data container. Let's call it BusinessDocument, and the other is called BusinessLayout. The client uses the BusinessDocument to retrieve all the data from the server and the BusinessLayout provides "data binding" information so the client knows where in my UI to display the different pieces of business data.
By doing this I am able to build a client that really doesn't understand the specifics of the data it is dealing with, it just knows how to display it on the UI for the user to interact with. By doing this, I am able to use a single media type, to describe hundreds of different business entities.
There's no need to give the Java client to external consumers. Your API should be able to answer any HTTP client. The fact that there is a Java client that shares the objects can depend on different factors, but it should not influence how you expose your REST API to third-party consumers.
So I'd suggest starting with a pure HTTP client, using Apache Commons HttpClient, to see how your REST API behaves.
The fact that the server objects are complex should also not be of any interest to the API. If the old system was designed by modeling objects around data, which I consider a bad idea, that's something you have to deal with.
From the REST API you always receive just text, XML or JSON, and you eventually have to parse it into your Java objects if you have, for example, an ORM + RDBMS backed system. If you can store JSON, as in a document DB, you don't have this problem, but again, this is of no concern to the REST API per se; you just need a layer that transforms JSON into Java objects.
Restlet helps you with this; of course, such a complicated object is not easy to convert automagically.
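As a minimal sketch of the "pure HTTP client" suggestion, assuming the Apache HttpClient 4.x API and a hypothetical /users endpoint URL:
import org.apache.http.client.methods.HttpPost;
import org.apache.http.entity.ContentType;
import org.apache.http.entity.StringEntity;
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.HttpClients;
import org.apache.http.util.EntityUtils;

public class RegistrationClient {
    public static void main(String[] args) throws Exception {
        // Hand-written JSON, so the test does not depend on the shared domain jar.
        String json = "{\"profile\":{\"nickname\":\"john\",\"email\":\"john@example.com\"},"
                + "\"someBooleanProperty\":true}";
        try (CloseableHttpClient client = HttpClients.createDefault()) {
            HttpPost post = new HttpPost("http://localhost:8080/users"); // hypothetical URL
            post.setEntity(new StringEntity(json, ContentType.APPLICATION_JSON));
            System.out.println(EntityUtils.toString(client.execute(post).getEntity()));
        }
    }
}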
