I want to make my web application work offline, and as soon as it comes back online or gets connected again, it should transfer the modifications the user made while offline.
Google Gears looked like an ideal solution to my problem, but it is now deprecated and no longer recommended.
What is a good way to make my application work offline, both in terms of technology to use and application design?
Gears is deprecated because the HTML5 standard allows for equivalent features to be present in compliant browsers.
For your problem of handling offline web application access, look into HTML5's support for offline web applications: client-side SQL database access and the client-side application HTTP cache.
The two features have to be used in conjunction: the client-side database allows data generated while the application is offline to be stored in a structured format, while the offline application cache allows HTTP responses from the server to be cached; you should not cache responses that are dynamic in nature and depend on user-provided input.
The details of the proposed APIs can be found in the W3C HTML5 specification, which is still a draft at the moment, although certain user agents have already implemented this feature.
Firstly, you will need some form of offline storage. HTML5's capabilities are the successor to Google Gears, as stated on the Google Gears developer blog; essentially, the purpose of Google Gears was to push the development and subsequent adoption of HTML5 features.
Specifically, you should be looking at the HTML5 offline APIs (here's a tutorial), and the Storage APIs may also come in handy (relevant tutorial).
With regard to design, you will essentially need to maintain your complete web application state client-side, and then send over the differences (i.e. update the server-side state) as soon as the connection to the server is available again.
Off the top of my head, there are two simple ways to design this:
Explicitly maintain separate application states for the client and server. Essentially, when the user takes an action, it's applied to the client application state first, and then at specified intervals (and/or triggers, e.g. the user clicks the save button), the client sends over the differences between the last known state of the server and the current state of the client. This is probably best suited to highly interactive web applications, and I suspect Google Docs works on this kind of design. Depending on your application (if "conflicting changes" can occur), you'll need to also account for merging application state: do you override with the last received client state, or do you intelligently try to merge? (you'll have to decide which makes more sense for your particular application.)
Record user actions while offline, and replay them once the connection becomes available again. You essentially implement the Command design pattern, and have both your client-side code and server-side code able to handle each command. The client-side code always handles each command, and while the connection to the server is available, your client-side code also sends the commands off to the server. You'll probably want to implement some batching, to avoid continual requests to the server, and also some roll-back functionality for when requests to the server fail (e.g. conflicting changes). This ends up looking more or less like GMail's main email management user interface, where you can undo operations.
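As a concrete illustration of the second option (and since the rest of this thread is Java-flavoured, the sketch is in Java, even though your real client side would be JavaScript), a minimal command queue might look like the following; all type names here are made up for illustration only:

    import java.util.ArrayList;
    import java.util.List;

    // Hypothetical names throughout: ClientState and ServerGateway stand in for your own
    // client-side model and HTTP layer.
    interface ClientState { void rename(String documentId, String newTitle); }
    interface ServerGateway { boolean isOnline(); void sendBatch(List<Command> commands); }

    interface Command {
        void applyLocally(ClientState state);   // always applied to the client-side state
    }

    class RenameCommand implements Command {
        private final String documentId;
        private final String newTitle;

        RenameCommand(String documentId, String newTitle) {
            this.documentId = documentId;
            this.newTitle = newTitle;
        }

        public void applyLocally(ClientState state) {
            state.rename(documentId, newTitle);
        }
    }

    class CommandQueue {
        private final List<Command> pending = new ArrayList<>();

        // Every user action goes through here: apply locally first, queue, flush when online.
        void execute(Command command, ClientState state, ServerGateway server) {
            command.applyLocally(state);
            pending.add(command);
            if (server.isOnline()) {
                flush(server);
            }
        }

        // Batched replay; on failure you would roll back or merge before clearing the queue.
        void flush(ServerGateway server) {
            server.sendBatch(new ArrayList<>(pending));
            pending.clear();
        }
    }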
This has not much to do with J2EE, but rather with how you code your web client. One possible solution would be to use a JavaScript client that saves the data in the local storage introduced with HTML5 (see http://diveintohtml5.ep.io/storage.html). That is also basically the reason why Google Gears was discontinued.
We have a 13-year-old monolithic Java application using
Struts 2 for handling UI calls
JDBC/Spring JDBC Template for db calls
Spring DI
Tiles/JSP/jQuery for UI
Two deployables are created out of this single source code.
WAR for online application
JAR for running back-end jobs
The current UI is pretty old. Our goal is to redesign the application using microservices. We have identified modules which can run as separate microservices.
We have the following questions in mind:
Which UI framework should we go for (Angular/React or a home-grown one)? Angular seems to be very slow, and we need better performance as far as page loading is concerned.
Should the UI/JavaScript make calls to backend web services directly, or should there be a Spring controller proxy in the deployed WAR which forwards UI calls to the APIs? This would also help if a single UI call requires getting/updating data from different microservices.
How should we cover the microservice security aspect?
Which load balancer should we go for if we want to have multiple instances of the same microservice?
Since it's a banking application, our organization does not allow using Elasticsearch/Lucene for searching. So we need suggestions for reporting using Oracle alone.
How should we run backend jobs?
There will also be a main payment microservice which will create payments. Since payment volume is huge, it will require multiple instances. How will we manage the user's logged-in session? Should we go for an in-memory distributed session store (maybe Memcached)?
This is a very broad question. You need to get a consultant architect to understand your application in depth, because it is unlikely you will get meaningful in-depth answers here.
However as a rough guideline here are some brief answers:
Which UI framework should we go for (Angular/React or a home grown one). Angular seems to be very slow and we need better performance as far as page loading is concerned.
That depends on what the application actually needs to do. Angular is one of the leading frameworks, and is usually not slow at all. You might be doing something wrong (are you making too many granular calls? is your backend slow?). React is also a strong contender, but seems to be losing popularity, although that is just a subjective opinion and could be wrong. Angular is a more feature-complete framework, while React is more of a combination of tools. You would be crazy to think you can build a home-grown one and bring it to the same maturity as these ready-made tools.
Should UI/Javascript make call to backend web services directly or
should there be a spring controller proxy in deployed WAR which kind
of forwards UI calls to APIs. This will also help if a single UI calls
requires getting/updating data from different microservice.
Larger microservice architectures often involve an API gateway. Then again, it depends on your use case. You might also have an issue with CORS, so centralising calls through a proxy / API gateway, even if it is a simple reverse proxy (you don't need to develop it yourself), might be a good idea.
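As a rough sketch of what such a gateway route could look like, assuming you stayed in the Spring ecosystem and used Spring Cloud Gateway (the service names, paths, and ports below are placeholders):

    import org.springframework.cloud.gateway.route.RouteLocator;
    import org.springframework.cloud.gateway.route.builder.RouteLocatorBuilder;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;

    @Configuration
    public class GatewayRoutes {

        // Minimal sketch: UI calls hit the gateway, which forwards them to the right microservice.
        @Bean
        public RouteLocator routes(RouteLocatorBuilder builder) {
            return builder.routes()
                    .route("payments", r -> r.path("/api/payments/**")
                                             .uri("http://payments-service:8080"))
                    .route("customers", r -> r.path("/api/customers/**")
                                             .uri("http://customer-service:8080"))
                    .build();
        }
    }

The gateway also gives you one place to handle CORS, authentication, and aggregation of calls that span several services.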
How should we cover microservice security aspect.
Again, no idea what your setup looks like. JWT is a common approach. I presume the authentication process itself uses some centralised LDAP / Exchange or similar process. Once you authenticate, you can sign a token which you give to the client, and which is then passed to the respective microservices in the HTTP Authorization header.
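As an illustration only, issuing and verifying such a token might look roughly like this with the jjwt library (0.9.x API shown; the secret and the expiry are placeholders you would replace):

    import java.nio.charset.StandardCharsets;
    import java.util.Date;

    import io.jsonwebtoken.Claims;
    import io.jsonwebtoken.Jwts;
    import io.jsonwebtoken.SignatureAlgorithm;

    public class TokenService {

        // Placeholder secret: in practice use a long random key shared via a vault/key store.
        private static final byte[] SECRET =
                "replace-with-a-long-random-secret".getBytes(StandardCharsets.UTF_8);

        // Issued once, after the user authenticates against your central LDAP (or similar).
        public String issueToken(String username) {
            return Jwts.builder()
                    .setSubject(username)
                    .setExpiration(new Date(System.currentTimeMillis() + 30 * 60 * 1000)) // 30 minutes
                    .signWith(SignatureAlgorithm.HS256, SECRET)
                    .compact();
        }

        // Each microservice verifies the token it finds in the "Authorization: Bearer ..." header.
        public String verifyAndGetUser(String token) {
            Claims claims = Jwts.parser()
                    .setSigningKey(SECRET)
                    .parseClaimsJws(token)
                    .getBody();
            return claims.getSubject();
        }
    }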
Which load balancer should we go for if we want to have multiple
instance of same microservice.
Depends on what you want. Are you deploying on a cloud-based solution like AWS (in which case load balancing is provided by the infrastructure)? Are you going to deploy on a Kubernetes setup, where load balancing and scaling are handled as part of its deployment fabric? Do you want client-side load balancing (which comes as part of Spring Cloud)?
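If you do go the Spring Cloud client-side route, the core of it is the sketch below; the service id in the usage comment is a placeholder, and the instances are resolved from whatever service registry you configure:

    import org.springframework.cloud.client.loadbalancer.LoadBalanced;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.web.client.RestTemplate;

    @Configuration
    public class LoadBalancingConfig {

        // A @LoadBalanced RestTemplate resolves logical service names against the registry
        // and spreads calls across the registered instances of that service.
        @Bean
        @LoadBalanced
        public RestTemplate restTemplate() {
            return new RestTemplate();
        }
    }

    // Usage elsewhere: the hostname is the registered service id, not a physical host.
    // String status = restTemplate.getForObject("http://payment-service/payments/42/status", String.class);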
Since its a banking application, our organization does not allow using
Elastic Search/Lucene for searching. So need suggestion for reporting
using Oracle alone.
Without knowing what the data in Oracle looks like and what the reporting requirements are, all solutions are possible.
How should we run backend jobs?
Depends on the infrastructure you choose. Everything is possible, from simple cron jobs, to cloud scheduling services, or integrated Java scheduling mechanisms like Quartz.
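For instance, a Quartz-based job might be wired up roughly like this (the job class, identities, and cron expression are placeholders for whatever your back-end JAR currently does):

    import org.quartz.CronScheduleBuilder;
    import org.quartz.Job;
    import org.quartz.JobBuilder;
    import org.quartz.JobDetail;
    import org.quartz.JobExecutionContext;
    import org.quartz.Scheduler;
    import org.quartz.Trigger;
    import org.quartz.TriggerBuilder;
    import org.quartz.impl.StdSchedulerFactory;

    public class NightlyJobSetup {

        // The job itself: whatever the existing back-end jobs do can go here.
        public static class SettlementJob implements Job {
            @Override
            public void execute(JobExecutionContext context) {
                System.out.println("Running nightly settlement job...");
            }
        }

        public static void main(String[] args) throws Exception {
            JobDetail job = JobBuilder.newJob(SettlementJob.class)
                    .withIdentity("settlementJob")
                    .build();

            // Placeholder cron expression: every day at 02:00.
            Trigger trigger = TriggerBuilder.newTrigger()
                    .withIdentity("settlementTrigger")
                    .withSchedule(CronScheduleBuilder.cronSchedule("0 0 2 * * ?"))
                    .build();

            Scheduler scheduler = StdSchedulerFactory.getDefaultScheduler();
            scheduler.start();
            scheduler.scheduleJob(job, trigger);
        }
    }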
There will also be a main payment microservice which will create
payments. Since payments volume is huge hence it will require
multiple instances. How will we manage user logged-in session. Should
we go for in-memory distributed session store (may be memcache)
Not really. That would defeat the whole purpose of microservices. JWT tokens are managed by the client's browser and expire automatically. You don't need to manage logged-in user sessions in such an architecture.
As you have mentioned, it's a banking site, so security will be the first priority. Here are a few suggestions for the front end (FE) and back end (BE).
FE: You could go with Preact; it's a React-like library but much lighter and faster than React. For the UI you can go with styled-components instead of some heavy third-party library. This will also improve performance, and obviously use CDNs for images and big files.
BE: Depending on your needs, a hybrid solution might work well; Node could be a good option, e.g. for sessions.
Set up an auth server and have your services validate users against it; it can also be reused in the future for any kind of service, e.g. if you expose client APIs.
Use case for auth: you can use Redis for session info. Have the user validated by the auth server and add the info to Redis; later, check from Redis whether the user is logged in. This will reduce the load on the auth server. (I have used the same strategy for a crypto exchange and it went pretty well.)
Load balancer: I'm not very familiar with Java, but for Node.js, PM2 will do that for you; it's not a big deal, just one command and it will start multiple instances and balance between them on its own.
In case you have enormous traffic, you might go with a messaging service like RabbitMQ; this will reduce server costs by saving you from having to scale out your servers.
BE jobs: I have done that with Node for intensive tasks and it went quite well; you can use forking or spawning, which starts a new instance for a particular job and kills it after completion, and you can easily generate logs along with that.
For further clarification I'm here :)
Frequently, I write small apps that rely on a web service that provides JSON at various points (login, configuration, request info, etc). For UI development/testing, I usually just keep a few dummy JSON files in the app bundle that can be read locally. Is this the most common practice, and are there any better ways to do it?
Well,
If by "offline" you mean you absoluteky can't have any internet access, I guess you don't have any choice. Although, this will mean that you will not be able to test your web services calls.
Otherwise, if you are using API calls, you can use free API servers such as http://myjson.com/api. These are simple JSON store for your web or mobile app.
I often use it during development phase.
Hope it may help.
I have a requirement where a Groovy application is supposed to send event notifications to another Java web application, which will then display that data on its web interface.
I don't want to use queues like ActiveMQ or RabbitMQ because they would introduce an extra layer; I will only use one if no other solution exists.
An idea that has been shared with me is that I should expose a web service from my Java application, which will be consumed by the first application; the data sent to the web service will then be received by the second application and somehow displayed on its web interface.
I am not sure how this will work, i.e. how the data received by the second application's web service will be displayed on its web interface.
Kindly help me figure out the right solution for this task.
Your problem is actually "how to send notifications, issued by another application, from the server to a browser/mobile client".
If you have very strict latency requirements, then I would suggest using https://github.com/OpenHFT/Chronicle-Queue
It was created by HFT people to process 6 million messages per second in a single thread.
To display events on the user's screen, consider mechanisms like WebSockets, Server-Sent Events, push notifications, or long polling, depending on your requirements (like browser support).
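As a deliberately simplified sketch of the Server-Sent Events option, assuming the javax.servlet API (a real implementation would keep the connection open, e.g. via an async servlet, and write one "data:" line per incoming event):

    import java.io.IOException;
    import java.io.PrintWriter;

    import javax.servlet.annotation.WebServlet;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    @WebServlet("/notifications/stream")
    public class NotificationStreamServlet extends HttpServlet {

        @Override
        protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
            resp.setContentType("text/event-stream");
            resp.setCharacterEncoding("UTF-8");

            PrintWriter out = resp.getWriter();
            // SSE wire format: a "data:" line followed by a blank line per event.
            out.write("data: {\"event\":\"example-notification\"}\n\n");
            out.flush();
        }
    }

On the browser side, new EventSource("/notifications/stream") is enough to consume such a stream.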
Actually, in most cases it doesn't matter what transport you are using. Unless you have super strict non-functional requirements like sub-microsecond latency, you're free to choose any mechanism, e.g. HTTP or JMS.
Try not to over-engineer; design your software based on your actual requirements, not on Stack Overflow answers.
Cheers!
I would suggest you create an XML representation of the data you wish to transfer to the Java web app. On the Java web app, if you are using simple servlets, create a new servlet to which you can post this XML. The servlet could then persist it to a database, from which it can be retrieved when a user logs in to the web app at some point.
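A minimal sketch of such a receiving servlet, assuming the plain javax.servlet API (the EventDao below is a made-up placeholder for whatever persistence code your web app already has):

    import java.io.BufferedReader;
    import java.io.IOException;
    import java.util.stream.Collectors;

    import javax.servlet.annotation.WebServlet;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    @WebServlet("/events")
    public class EventReceiverServlet extends HttpServlet {

        @Override
        protected void doPost(HttpServletRequest req, HttpServletResponse resp) throws IOException {
            // Read the raw XML body posted by the Groovy application.
            try (BufferedReader reader = req.getReader()) {
                String xml = reader.lines().collect(Collectors.joining("\n"));
                EventDao.save(xml); // hypothetical persistence call; replace with your JDBC/JPA code
            }
            resp.setStatus(HttpServletResponse.SC_NO_CONTENT);
        }

        // Placeholder DAO so the sketch is self-contained; persist to a real database here.
        static class EventDao {
            static void save(String xml) {
                System.out.println("Persisting event: " + xml);
            }
        }
    }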
Let me know if you need any more help. I could only answer so much based on the question. Some more light on the framework the Java web app is using and the data you wish to transfer might make it easier to add more info.
I need to implement a proof-of-concept Swing application where there's a server holding a list of users, and several clients which connect to the server and do CRUD operations on the database and hence on the list of users.
I have an obvious synchronization dilemma: keeping all clients' lists updated, so that if one client removes a user, another client who still has it in its list cannot change its name.
Now, I know that a protocol in which the client asks the server whether a user still exists before updating it would work.
However, this is just a simple example; in the real application I might have junction tables and complex references between objects which always need to be kept consistent, and I don't want to reinvent the wheel.
What I'm wondering is whether there are some ready-made solutions or a library which does this job without requiring me to change the database or load extremely complex dependencies.
I did some research on Google but nothing seems to fit, and the most similar example of client-server synchronization I found was chat programs. However, chat programs are inherently simple because a message is never modified or deleted, and all you have to keep consistent is the chronological order. I would need something more involved than that, or some useful hints on the subject.
What you need is some sort of messaging between server and clients. Clients will either long-poll the server, asking for the updates, or subscribe to some streaming endpoint on your server.
You can take a look at the Comet model, long polling itself, WebSockets, etc.
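A bare-bones long-polling loop on the Swing client could look like the following sketch (the URL, the versioning scheme, and the response handling are placeholders; a real version would run on a background thread and apply updates to the Swing model on the EDT):

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.nio.charset.StandardCharsets;

    public class ChangePoller {

        public static void main(String[] args) throws Exception {
            long lastVersion = 0;

            while (true) {
                // The server holds this request open until something changed after lastVersion
                // (or until the timeout expires), which is the essence of long polling.
                URL url = new URL("http://localhost:8080/api/changes?since=" + lastVersion);
                HttpURLConnection conn = (HttpURLConnection) url.openConnection();
                conn.setReadTimeout(35_000);

                if (conn.getResponseCode() == 200) {
                    try (BufferedReader in = new BufferedReader(
                            new InputStreamReader(conn.getInputStream(), StandardCharsets.UTF_8))) {
                        String body = in.readLine(); // placeholder: parse the changes and apply them locally
                        System.out.println("Received changes: " + body);
                        lastVersion++; // placeholder: in reality, take the new version from the response
                    }
                }
                conn.disconnect();
            }
        }
    }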
Alternatively, there are a couple of data management services - BlazeDS, GraniteDS, or any similar-purpose solution. You can integrate one of those into your application and use it for the complete data management cycle.
Here's a scenario:
I have a Java front-end (RCP/SWT) app which currently has no authentication support.
However, I have to add security to this application so that it can be deployed in different enterprise environments. I have a few approaches which I thought I would share with you all here to get your input. Please note that there are no strict requirements yet, so I would like you to consider both typical and non-typical enterprise network security models.
Approach 1
Create a 'Security' web service that the thick client would invoke on startup.
The client queries the security service for the current authentication mode and receives the implementation class for that authentication as a SOAP attachment. The class received will not contain the authentication logic; rather, it would just describe the UI and the events on the UI. (The client could make use of a GUI toolkit such as Thinlet?)
Once the class is loaded, a UI relating to the currently set authentication method is displayed to the end user.
Advantages:
This approach lets me handle different authentication schemes. For instance, if the app has to authenticate against user names and passwords stored in a database, a screen with username and password fields would suffice. However, say the user were to do a network logon that involved typing in the network name; the UI would then contain three fields. If the security model at the client network dictates NTLM/SSO-based authentication, the user won't see a UI at all. This also leaves scope for future authentication methods - for instance, supporting a CAPTCHA-specific logon screen, biometrics, or whatever.
Approach 2
KISS (Keeping it Simple):
Aren't a username and password usually the only two credentials required by all of the known authentication mechanisms?
Have the thick client query the web service and let the web service handle the entire authentication process.
I am not sure how realistic/feasible/commonly used the above mentioned approaches are. Appreciate your help.
I'd certainly not recommend transmitting class definitions as SOAP attachments. A network classloader would make more sense, but is still not needed in your situation.
Put in the client what belongs there - the UI. Have the multiple screen types ready (i.e. defined as classes) on the client and activate each of them depending on a single value passed by the server. For example, if AuthenticationType.CREDENTIALS is passed, go for username/password. If AuthenticationType.SMART_CARD is passed, go for the smart card.
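Concretely, the client-side dispatch can be as simple as an enum and a switch; everything below is a made-up illustration, not a prescribed API:

    public class LoginScreenSelector {

        // Hypothetical enum shared between client and server (e.g. in a common JAR).
        public enum AuthenticationType {
            CREDENTIALS,     // username + password
            NETWORK_LOGON,   // username + password + network/domain name
            SMART_CARD,
            SSO              // no UI at all
        }

        // The server returns only this single value; all screens are already part of the client.
        public void showLoginUi(AuthenticationType type) {
            switch (type) {
                case CREDENTIALS:
                    System.out.println("Show username/password dialog");
                    break;
                case NETWORK_LOGON:
                    System.out.println("Show username/password/domain dialog");
                    break;
                case SMART_CARD:
                    System.out.println("Prompt for smart card");
                    break;
                case SSO:
                    System.out.println("No UI: authenticate silently");
                    break;
            }
        }
    }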
If you want to distribute the application and later implement different auth screens, then use Java Web Start. Thus all clients will be guaranteed to be running the latest version.
Now that I know your requirements impose some limitations, take a look at this article on network classloaders.