Schema-less in all layers - Java

I have a use case where the schema of my entities keeps changing again and again.
Based on that, I have to change or add new business rules, which is not scalable for me.
I am trying to go schema-less by using JSON documents in my data, service, and UI layers.
I want to avoid creating DAOs (Data Access Objects), DTOs (Data Transfer Objects, for sending data), and view model objects, which are the major obstacle for me in changing the schema and redeploying.
Are there any good frameworks that can help with this?

Have you tried the Java API for JSON Processing?
http://www.oracle.com/technetwork/articles/java/json-1973242.html
https://jcp.org/en/jsr/detail?id=353
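Independent of the JSON library you pick, the schema-less idea is to pass one generic document type through every layer instead of per-entity DAO/DTO/view-model classes. A minimal sketch, using a plain `Map` to stand in for a parsed JSON object (JSON-P's `JsonObject` would play the same role); the class and method names are illustrative only:

```java
import java.util.HashMap;
import java.util.Map;

// A minimal sketch of the schema-less idea: every layer works with one
// generic document type instead of per-entity DAO/DTO/view-model classes.
// Here a Map stands in for a parsed JSON object (e.g. JSON-P's JsonObject).
public class SchemalessSketch {

    // "Service layer": business rules read fields by name, so adding a new
    // field to the schema needs no new Java class and no redeploy of DTOs.
    static Map<String, Object> enrich(Map<String, Object> doc) {
        Map<String, Object> out = new HashMap<>(doc);
        out.put("validated", true);
        return out;
    }

    public static void main(String[] args) {
        Map<String, Object> doc = new HashMap<>();
        doc.put("name", "alice");
        doc.put("newField", 42); // unknown fields flow through untouched
        Map<String, Object> result = enrich(doc);
        System.out.println(result.get("validated")); // true
        System.out.println(result.get("newField"));  // 42
    }
}
```

The trade-off is that you give up compile-time checking of field names, so validation has to happen at runtime.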

Related

Proper way to handle schema changes in MongoDB with java driver

I have an application which stores data in a cloud instance of MongoDB. To explain the requirement further, the data is currently organized at the collection level like below.
collection_1 : [{doc_1}, {doc_2}, ... , {doc_n}]
collection_2 : [{doc_1}, {doc_2}, ... , {doc_n}]
...
...
collection_n : [{doc_1}, {doc_2}, ... , {doc_n}]
Note: my collection names are unique IDs; in this explanation I'm using collection_1, collection_2, ... to represent those IDs.
So I want to change this data model to a single collection model as below. The collection ID will be embedded into document to uniquely identify the data.
global_collection: [{doc_x, collection_id : collection_1}, {doc_y, collection_id : collection_1}, ...]
The data access layer (insert, delete, update, and read operations) for this application is written in a Java backend.
Additionally, the entire application is deployed on k8s cluster.
My requirement is to do this migration (the data access layer change and the existing data migration) with zero downtime and without impacting any operation in the application. Assume that my application is heavily used and has high concurrent traffic.
What is the proper way to handle this? Experts, please provide me guidance.
For example, for the backend (data access layer) change, I might use temporary code in Java to support both models and do the migration using an external client. If so, what is the proper way to implement the change, and is there a specific design pattern for this?
A complete explanation would be highly appreciated.
I think you have honestly already hinted at the simplest answer.
First, update your data access layer to handle both the new and old schemas: inserts and updates should write to both in order to keep them in sync. Queries should only look at the old schema, as it's the source of record at this point.
Then copy all data from the old to the new schema.
Then update the data access layer to query the new data. This keeps the old data updated, but allows full testing of the new data before making any changes that would result in the two sets of data being out of sync. It also facilitates rolling updates (i.e. applications with both new and old data access code will still function at the same time).
Finally, update the data access layer to only access the new schema and then delete the old data.
Except for this final stage, you can always roll back to the previous version should you encounter problems.
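The dual-write phase described above can be sketched as a wrapper around a common store interface. This is a sketch, not MongoDB driver code: the in-memory maps stand in for the old (per-collection) and new (single-collection) schemas, and names like `DocumentStore` are illustrative, not from any driver API:

```java
import java.util.HashMap;
import java.util.Map;

// A sketch of the dual-write step: the two stores stand in for the old
// (per-collection) and new (single-collection) MongoDB schemas.
interface DocumentStore {
    void upsert(String id, String doc);
    String find(String id);
}

class InMemoryStore implements DocumentStore {
    private final Map<String, String> docs = new HashMap<>();
    public void upsert(String id, String doc) { docs.put(id, doc); }
    public String find(String id) { return docs.get(id); }
}

// Phase 1 of the migration: write to both schemas, read only from the old
// one (the source of record). Later phases flip reads to the new store,
// and finally drop the old store entirely.
class DualWriteStore implements DocumentStore {
    private final DocumentStore oldStore;
    private final DocumentStore newStore;

    DualWriteStore(DocumentStore oldStore, DocumentStore newStore) {
        this.oldStore = oldStore;
        this.newStore = newStore;
    }

    public void upsert(String id, String doc) {
        oldStore.upsert(id, doc); // keep both schemas in sync
        newStore.upsert(id, doc);
    }

    public String find(String id) {
        return oldStore.find(id); // old schema remains authoritative
    }
}
```

Flipping reads to the new store is then a one-line change in `find`, which is what makes each phase independently deployable and rollback-safe.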

Store session (data) into database

I am working with Java EE and I am trying to create a web application. I want to store the session data in the database.
The approaches I have considered are:
Create tables to store the data (I did not like this approach, because every web app would need different tables in the database, and if you have complex session data this will be painful, with all the relations etc.).
Create a Java class to hold the data, and store its JSON representation in the database. When you retrieve the session data, you convert it back to a Java object, with Jackson for instance.
Store the serialized Java object, and deserialize it when you need it.
I think approaches 2 and 3 are more generic and don't need too much effort.
Are these good approaches? Are there better ones?
What do you suggest?
You could use the Hibernate framework. It will automatically reduce a lot of your work.
See https://www.javatpoint.com/hibernate-tutorial.
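Approach 3 from the question (store the serialized object, deserialize it later) can be sketched with Java's built-in serialization; in a real application the resulting `byte[]` would go into a BLOB column. A minimal sketch, with persistence details omitted and class names chosen for illustration:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;

// A sketch of approach 3: serialize the session object to bytes (which
// could then be stored in a BLOB column) and deserialize it back on load.
public class SessionBlob {

    static byte[] serialize(Serializable session) throws IOException {
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(bytes)) {
            out.writeObject(session);
        }
        return bytes.toByteArray();
    }

    static Object deserialize(byte[] blob) throws IOException, ClassNotFoundException {
        try (ObjectInputStream in = new ObjectInputStream(new ByteArrayInputStream(blob))) {
            return in.readObject();
        }
    }

    public static void main(String[] args) throws Exception {
        java.util.HashMap<String, String> session = new java.util.HashMap<>();
        session.put("user", "alice");
        byte[] blob = serialize(session);    // store this in the DB
        Object restored = deserialize(blob); // load it back later
        System.out.println(restored);
    }
}
```

One caveat of approach 3 versus the JSON route: changing the session class later can break deserialization of old rows, whereas JSON tolerates added or removed fields more gracefully.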

Best way to share a data structure between different instances of the same java application?

Currently, I'm developing an application based on Spring Boot. One of the requirements is that the application be real-time, and I need a unique data structure based on an InvertedRadixTree (not exactly this, but the data structure uses the tree to answer queries). I developed an admin UI for CRUD operations. The number of CRUD operations is not large, and they will basically be done by ops employees.

The data structure I developed is thread-safe and is synchronized with the database (which is MongoDB), and since this is the only app using this database, I'm not worried about other apps messing with the data. The only problem I have is that if we have multiple instances of this app and one of them does a CRUD operation on MongoDB, the data structure of that instance will get updated, but the other instances will not be. I created a scheduler to update the data structure from the database every 12 hours, but I'm looking for another solution, like sharing the data structure between all the instances. I really appreciate any suggestions.
EDIT: After searching around, I found that updating the whole data structure doesn't take too long. I wrote some test cases, put around a million records of my class inside MongoDB, and fetched the whole collection. Fetching and data structure creation took less than a second. So I ended up using this method instead of a more sophisticated method for synchronizing memory and the database.
One suggestion is to use a shared database. Every time there is an update by any of the apps, it should be written to the database, and every time you need the data, you load fresh data from the database. This is the easiest way, as far as I can tell.
I would use something like Redis (http://redis.io/topics/pubsub): listen for an event fired by the instance that makes the change, and use a local cache on every instance if the data is not frequently updated.
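The rebuild-and-swap approach from the edit above can be sketched with an `AtomicReference`: rebuild the whole structure off to the side, then publish it in one atomic step so readers never see a half-built tree. In this sketch a plain `Map` stands in for the radix tree, and the `Supplier` stands in for "fetch everything from MongoDB and build the tree":

```java
import java.util.Map;
import java.util.concurrent.atomic.AtomicReference;
import java.util.function.Supplier;

// A sketch of the rebuild-and-swap approach: rebuild the whole in-memory
// structure from the database on a schedule, then publish it atomically
// so readers never observe a half-built structure.
public class SwappableIndex {
    private final AtomicReference<Map<String, String>> current = new AtomicReference<>();
    private final Supplier<Map<String, String>> loader; // "fetch all + build tree"

    public SwappableIndex(Supplier<Map<String, String>> loader) {
        this.loader = loader;
        refresh();
    }

    // Called periodically, e.g. by a ScheduledExecutorService.
    public void refresh() {
        Map<String, String> fresh = loader.get(); // full rebuild off to the side
        current.set(fresh);                       // one atomic swap
    }

    public String lookup(String key) {
        return current.get().get(key); // reads are lock-free
    }
}
```

The swap means readers need no locks at all; the only cost is briefly holding two copies of the structure in memory during a refresh.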

POJO or DTO approach

I am developing a new web application with Struts2, Spring and Hibernate as its core building blocks.
We have created POJO classes corresponding to the Hibernate mapping files. Some user input needs to be written to the underlying database,
e.g. registration or updates.
We have a few options: create new POJOs/DTOs for the action classes, which Struts2 will populate, and then pass them to the service layer, where we convert those DTOs to the corresponding Hibernate POJOs; or expose the same POJOs to Struts2, so that the framework fills them with the user input and we don't need to do the conversion or create an extra set of classes.
The application will not be big; it falls in the medium-size category.
My question is: what is the best way to transfer this user input to the underlying Hibernate layer to perform the database-specific work?
Thanks in advance.
I'd prefer the "DTO" approach in this case since you then can validate the input first and trigger updates only when wanted.
However, you could use detached entities as DTOs and reattach them when you want to create or update them. If you don't want the web part of your application to depend on Hibernate and/or JPA, you may need to create another set of classes (unless you don't use a single annotation).
You'll get both answers on this.
With Struts 2 I tend to use normal S2 action properties to gather form values etc., and use BeanUtils to copy them to the Hibernate objects. The problem with exposing Hibernate objects to the form, as with ModelDriven etc., is that you need to define whitelists/blacklists if you have columns that should not be set directly by the user (or handle the problem in a different way).
That said, I'm not fundamentally opposed to the idea like a lot of people are, and they're arguably correct.
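The copy that BeanUtils automates in the answer above can also be written by hand, which makes the whitelist explicit: only the fields you copy can cross from the form into the entity. A sketch with illustrative class and field names (not from any real mapping):

```java
// A hand-written version of the DTO-to-entity copy that BeanUtils automates.
// Class and field names are illustrative only.
class RegistrationForm {       // populated by Struts2 from user input
    String username;
    String email;
    String role = "admin";     // a field the user should NOT control
}

class UserEntity {             // the Hibernate-mapped POJO
    String username;
    String email;
    String role;
}

public class DtoMapping {
    // Copying explicitly acts as the whitelist: only safe fields cross over.
    static UserEntity toEntity(RegistrationForm form) {
        UserEntity e = new UserEntity();
        e.username = form.username;
        e.email = form.email;
        e.role = "user";       // the server decides the role, not the form
        return e;
    }

    public static void main(String[] args) {
        RegistrationForm form = new RegistrationForm();
        form.username = "alice";
        form.email = "a@example.com";
        UserEntity e = toEntity(form);
        System.out.println(e.role); // user
    }
}
```

This is the same idea behind the whitelist/blacklist concern with ModelDriven: with a DTO, a hostile form parameter like `role=admin` simply has nowhere to land.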

Re-using results from webservice

I am using Java and SOAP.
I have two webservices: one (A) which generates some data, and one (B) which updates that data given specific parameters.
My question is: How can I save the data after it is generated from A for B to use?
I have read that stateful webservices are not preferable. Instead, can I just write the XML response to a file and have B open and parse that file? This seems like a lot of work. What would be the 'normal' method to use here?
Thank you!
The usual enterprisey thing to do is to have a persistence layer (e.g. database) to save the data. You would map the XML to a relational model and store that, then regenerate the XML when B requires it.
Saving a file directly is pretty simple and might be the best solution - you'll need to manage locking etc. yourself. Or you could do a very simple DB with the XML in a column.
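If you go with the simple file route, the main locking concern can be reduced by writing to a temporary file and atomically moving it into place, so B never reads a half-written response. A sketch under the assumption that both files live on the same filesystem (required for the atomic move); class and method names are illustrative:

```java
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

// A sketch of the file-based handoff: service A writes its XML response
// to a temp file and atomically moves it into place, so service B never
// observes a half-written file.
public class XmlHandoff {

    static void save(Path target, String xml) throws IOException {
        Path tmp = Files.createTempFile(target.getParent(), "resp", ".tmp");
        Files.write(tmp, xml.getBytes(StandardCharsets.UTF_8));
        Files.move(tmp, target,
                   StandardCopyOption.REPLACE_EXISTING,
                   StandardCopyOption.ATOMIC_MOVE);
    }

    static String load(Path target) throws IOException {
        return new String(Files.readAllBytes(target), StandardCharsets.UTF_8);
    }
}
```

Note that `ATOMIC_MOVE` can throw on filesystems that don't support it, which is one more reason the database column approach is often the more robust choice.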
