What options do you have to communicate between the WARs in an EAR?
We have several WARs providing different web services deployed within one EAR. To do their work, they need to communicate with the other WARs. Of course they could communicate using web services. What other, perhaps more efficient, options are there?
EDIT: The reason for the communication is that the modules use some shared functionality, and we want to locate this functionality in only one place, since it requires a significant amount of resources. Also, this requires synchronous communication.
First, you should be clear about what it is that you are sharing. You should differentiate between a service and a library.
A library lets you share common functionality; this is what you achieve when you use the log4j library, for example. In that case, you set up log4j in each project that uses it.
On the other hand, you could have a centralized logging service that has its own logging configuration and lets you manage it in a single place. In this case, you need to share the service.
You can share a library by placing the jar inside each WAR or inside the EAR.
You can share a service by being its client. So, your web services can use another service; in that case, one web service is a client of another, achieving service composition (a common pattern in enterprise development).
If both the service client and the service itself reside inside the same EAR, then you might avoid some overhead by calling the service "directly", for example using Spring's parent context feature:
http://springtips.blogspot.com/2007/06/using-shared-parent-application-context.html
but I would advise against flattening the service, because you will lose the benefits that having a service in the first place provides, like governance, manageability, etc.
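For illustration, this is roughly what the shared-parent-context wiring looks like in classic Spring versions. The context parameter names below are the standard ContextLoader ones; the beanRefContext.xml selector, the "sharedContext" key, and shared-services.xml are assumptions for this sketch, and the Spring jars plus the selector file must sit on a class loader visible to both WARs (e.g. the EAR lib):

```xml
<!-- web.xml of each WAR: attach a shared parent application context -->
<context-param>
    <param-name>locatorFactorySelector</param-name>
    <param-value>classpath*:beanRefContext.xml</param-value>
</context-param>
<context-param>
    <param-name>parentContextKey</param-name>
    <param-value>sharedContext</param-value>
</context-param>
<listener>
    <listener-class>org.springframework.web.context.ContextLoaderListener</listener-class>
</listener>

<!-- beanRefContext.xml (packaged in a shared jar): defines the parent context,
     keyed by the name used in parentContextKey above -->
<bean id="sharedContext"
      class="org.springframework.context.support.ClassPathXmlApplicationContext">
    <constructor-arg value="shared-services.xml"/>
</bean>
```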
Your edit seems to imply that communication is not actually required between the WARs, but that both need to access the same shared resource. The simplest solution would be to put the jars for this resource in the EAR and add a dependency on those jars to both web projects, so they are using the shared resource.
If there is stateful code in both web projects that needs to be updated, then your only option is to make a call to the servlet for that web project (assuming the stateful code is contained within the web project).
Just remember that the shared resource must be thread-safe.
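As a rough sketch of that layout (the EAR and jar names are made up): in a Java EE 5+ EAR, anything under lib/ is visible to every module, so one copy of the shared code serves both WARs:

```
shop.ear
├── lib/
│   └── shared-resource.jar    <- the resource-heavy shared functionality, one copy
├── orders-ws.war              <- both WARs compile against shared-resource.jar
└── billing-ws.war                (e.g. Maven scope "provided", so it is not duplicated in WEB-INF/lib)
```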
Similar question here.
Why not put the common classes into a JAR and work with them directly? Or, slightly more heavyweight, make the common classes session beans?
Three things come to mind:
1. JMS for sending signals.
2. An EJB that records shared information.
3. A lib jar in the EAR's lib directory.
If you just need shared methods, option 3 is what you want. But your edit tells me you've got shared functionality that operates on shared data. For example, you've got user records that both WARs access and update. If so, you want an EJB, as sketched below.
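A minimal sketch of that option, assuming a Java EE 6+ container; the bean and its user map are hypothetical names. The bean lives in an EJB module inside the EAR, and each WAR injects it with @EJB and calls it in-process:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import javax.ejb.LocalBean;
import javax.ejb.Singleton;

// Hypothetical shared component: one instance per EAR, usable from every WAR.
@Singleton
@LocalBean
public class UserDirectory {

    // Shared, mutable state must be thread-safe; the container's default
    // write locking on a @Singleton also serializes access.
    private final Map<String, String> emailByUserId = new ConcurrentHashMap<>();

    public String findEmail(String userId) {
        return emailByUserId.get(userId);
    }

    public void updateEmail(String userId, String email) {
        emailByUserId.put(userId, email);
    }
}
```

In either WAR, a servlet would then simply declare `@EJB private UserDirectory users;` and call it directly, with no remote or HTTP hop.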
I have a requirement where a front-end application (written in Spring MVC) needs to communicate with a backend application. Both applications will be WARs running within the same Tomcat instance. For understanding purposes, let's name them frontend.war and backend.war.
I have gone through many posts across various forums and found several different strategies, some of which are as below:
1) Using EJB - Ruled out. EJBs are a maintenance overhead, and we have no plan to create a dedicated EAR to accomplish this, because we plan to add more front-end WARs (application modules) which will communicate with the same backend.war.
2) Using JNDI - Looks promising, but it requires one WAR to know about the interface exposed by the second WAR and its signature, which makes them tightly coupled to each other. A future change in the service contract can become a nightmare.
3) Using a REST API - This looks like an ideal approach, with the one caveat that the communication is an HTTP call and hence could be slow.
Other approaches, like a common parent context (in Spring) or context switching within the application, have their own issues.
I am inclined to use the REST API approach for this solution, as it is cleaner and easier to maintain. Furthermore, HTTP is a mature protocol with lots of know-how available for future development.
My query:
A) Is it possible to make Tomcat aware that a particular web service call is in fact a call to an application running in the same JVM/server (a kind of 'internal' call), rather than an 'external' web service call?
B) If I use a URL like 'http://localhost:8080/rest/...' (note that backend.war is not intended for the external world, so a domain name is not needed), will it do the trick?
I am looking for an approach which gives me the performance of JNDI (communication within the same JVM) and the flexibility of REST (you can change anything, anytime, as long as the public URLs are intact).
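For concreteness, the kind of call I have in mind from frontend.war is something like the following sketch; the /backend/rest/orders path is a placeholder, not a real endpoint, and the response is returned as raw JSON just to keep the example self-contained:

```java
import org.springframework.web.client.RestTemplate;

public class BackendClient {

    // backend.war is only reachable on the same Tomcat instance,
    // so the loopback address is used instead of a public domain name.
    private static final String BASE_URL = "http://localhost:8080/backend/rest";

    private final RestTemplate restTemplate = new RestTemplate();

    public String findOrderJson(long orderId) {
        // Even though both WARs run in the same JVM, this call still goes
        // through the full HTTP stack (serialization, sockets, servlet container).
        return restTemplate.getForObject(BASE_URL + "/orders/{id}", String.class, orderId);
    }
}
```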
If you have thousands of WARs, maybe try the enterprise service bus approach. WSO2 would be a good candidate. You can always change your entry point definition while keeping the backend intact.
Added benefit: your WARs can be deployed on multiple servers and/or moved, but you keep a single entry point; there is only one address to change.
Create a jar file of the common functions and package it as a dependency of both projects - a service layer!
Alternatively, use REST and put them on different Tomcat instances/servers - microservices!
I would use a "remote invocation" approach like Java RMI or CORBA. The latter also applies outside the Java world. These have some benefits over the other options: they use TCP rather than HTTP and are therefore lighter, and they serialize objects instead of creating new representations (like JSON or others). Additionally, I think RMI is simple to understand and quick to pick up.
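A minimal RMI sketch, assuming a hypothetical QuoteService and that the registry host and port are reachable between the applications:

```java
import java.rmi.Remote;
import java.rmi.RemoteException;
import java.rmi.registry.LocateRegistry;
import java.rmi.registry.Registry;
import java.rmi.server.UnicastRemoteObject;

// Hypothetical remote contract, shared by both applications.
interface QuoteService extends Remote {
    String quoteOfTheDay() throws RemoteException;
}

public class RmiSketch {

    static class QuoteServiceImpl implements QuoteService {
        @Override
        public String quoteOfTheDay() {
            return "Keep it simple.";
        }
    }

    // Keep a strong reference so the exported object is not garbage collected.
    private static QuoteServiceImpl serverInstance;

    // Server side: export the implementation and register it under a name.
    static void startServer() throws Exception {
        serverInstance = new QuoteServiceImpl();
        QuoteService stub = (QuoteService) UnicastRemoteObject.exportObject(serverInstance, 0);
        Registry registry = LocateRegistry.createRegistry(1099);
        registry.rebind("QuoteService", stub);
    }

    // Client side: look up the stub and call it like a local object.
    static void callService() throws Exception {
        Registry registry = LocateRegistry.getRegistry("localhost", 1099);
        QuoteService quotes = (QuoteService) registry.lookup("QuoteService");
        System.out.println(quotes.quoteOfTheDay());
    }

    public static void main(String[] args) throws Exception {
        startServer();
        callService();
    }
}
```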
I'm working on a web application which is in the form of some WAR files for the JSF 2 front end, some jar files for services, and some shared jars to call the services from the front end, e.g.
webLayer.war -> serviceUtilities.jar -> service.jar
My problem is that when I try to call a service from the front end using my service utilities (which use reflection to call a service class in a separate jar), I get a ClassNotFoundException.
I have worked around the problem by changing the scope of the dependencies in my web project pom.xml from provided to compile, but this is not ideal because I have to build all the relevant projects and then build them into my web layer as libraries.
What I want is to have my web layer wars and service layer jars completely separate so that if I change my service I only need to compile that project. But obviously I still want to be able to access the jars from my war.
Can this be done, and if so how?
Thanks in advance
I believe you are trying to achieve total decoupling between your projects, but in an unorthodox way.
First of all, using provided means that while the library is present at compile time, it will not be added to your deployable (the WAR in this case), because it is assumed to be present on the platform where you deploy. Think of the similar case of the Servlet API: you do not need it as an explicit dependency at runtime because your servlet container provides it. So provided does not give you decoupling, but a way to prevent dependency duplication. In your case, the web application and serviceUtilities.jar know nothing about your service.jar at runtime, while they do at compile time.
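In pom.xml terms (the coordinates here are placeholders), the difference is only the scope element; the two declarations below are alternatives, shown side by side for comparison:

```xml
<!-- Available when compiling the web project, but NOT packaged into the WAR;
     the runtime platform is expected to provide it. -->
<dependency>
    <groupId>com.example</groupId>
    <artifactId>service</artifactId>
    <version>1.0.0</version>
    <scope>provided</scope>
</dependency>

<!-- Packaged into WEB-INF/lib of the WAR, which is what switching to the
     (default) compile scope effectively does. -->
<dependency>
    <groupId>com.example</groupId>
    <artifactId>service</artifactId>
    <version>1.0.0</version>
    <scope>compile</scope>
</dependency>
```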
It is very difficult to access the jars directly as you describe (even using reflection, it is still a binary dependency) while keeping the projects' lifecycles independent.
I think the best way to get total decoupling between layers is to use REST + JSON for communication. You may even be able to deploy your layers on separate nodes. As long as the service contract does not change (the REST URLs in this case), you are safe to switch between different implementations or even expose multiple front ends.
Your serviceUtilities on the web side would then be replaced by code that uses some REST client library, and your service layer would use some REST provider to listen on your REST URLs. Which technology you use depends on your preference and technology stack. Jersey is the reference implementation of JAX-RS, and you can also easily use Spring controllers on the provider side.
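A minimal JAX-RS 2.x sketch of both sides; the resource path, the base URL, and the "greeting" example are assumptions rather than your actual services:

```java
import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.PathParam;
import javax.ws.rs.Produces;
import javax.ws.rs.client.Client;
import javax.ws.rs.client.ClientBuilder;
import javax.ws.rs.core.MediaType;

// Provider side (service layer): a resource listening on /greetings/{name}.
@Path("/greetings")
public class GreetingResource {

    @GET
    @Path("/{name}")
    @Produces(MediaType.APPLICATION_JSON)
    public String greet(@PathParam("name") String name) {
        return "{\"message\": \"Hello, " + name + "\"}";
    }
}

// Consumer side (replaces serviceUtilities.jar in the web layer).
class GreetingClient {

    private final Client client = ClientBuilder.newClient();

    public String fetchGreeting(String name) {
        // Base URL is a placeholder; only this URL contract couples the two layers.
        return client.target("http://localhost:8080/services/api")
                     .path("greetings").path(name)
                     .request(MediaType.APPLICATION_JSON)
                     .get(String.class);
    }
}
```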
As shown in the above pic, I have an EJB 3 enterprise application (EAR file) which acts as a portal and holds 3 web applications (WAR files) that communicate and transact with the same datastore. These 3 webapps are not portlet implementations, but normal webapps which interact with the datastore through the enterprise app's persistence layer. These webapps are developed independently, and so some of them use web services from the enterprise app and some of them use EJB clients.
Also, there is another option: replacing these webapps (Web App1, Web App2, and Web App3) with independent enterprise apps that communicate and transact with the database, as shown below:
Now, my questions are:
1) Which is the better of the two options listed above?
2) What is the effect of replacing those webapps acting as clients to the enterprise app with independent enterprise apps (EAR files)?
3) Which is the better model for transaction handling, SSO functionality, scalability, and other factors?
4) Are there any other better models?
EDIT:
1) In the first model, which is the preferred way to interact with the EAR file - web services or an EJB client jar/library (interfaces and utility classes)?
2) How do the two models differ in memory usage (server RAM) and performance? Is there any considerable difference?
Since you are being so abstract, I will be as well. If we remove all the buzzwords like "portal", "enterprise apps" and so on, what we have in the end is three web apps and a common library or framework (the enterprise app).
Seeing the app as simply as possible: you have three developers who need to develop three web apps, and you will provide some common code useful for building their apps. The model you should use depends on what kind of code you will provide them:
1. You will only provide some utilities and common business code. Maybe the classical library fits your needs. (In Java EE environments you must take into account how you can take advantage of the level 2 persistence cache by sharing a session factory for a single datastore.)
2. You will provide shared services such as persistence, caching, security, auditing, and so on. Here you will need a service layer, not just a library; you will have shared state, so you need only one instance.
3. The most common case is both: you provide a business API plus a service layer for the common services.
You aren't indicating any requirement that forces you to use a more complex solution for your scenario.
EDIT:
As to whether RMI (the EJB client) or web services are preferred: I always use RMI to connect applications that are geographically close. It is simple to use and the protocol is much faster than web services (you can find plenty of comparisons on this topic by searching for "rmi webservices performance" on Google).
On the other hand, RMI is more sensitive to network latency, requires special firewall configuration, and is more tightly coupled than web services. So if I intend to offer services to a third party or to connect geographically dispersed servers, I prefer web services or even REST.
About the last question: initially there is hardly any difference between deploying one or ten applications on the same server. The deployment cost will be insignificant compared to the overhead of actually using the applications. Of course, take this as a general assumption; obviously the size of your applications and how you deploy them will have an impact on memory consumption and other factors.
You must take into account that these decisions can easily be changed later as needed. So, as I said, you could start with the simple solution, and if you run into problems deploying your applications you can restructure your EARs easily.
I'm inclined to agree with Fedox. If there is no reason for choosing one solution over the other (a business reason, technical reason, etc.), then you might as well choose the path of least resistance. To my mind that would be the first solution.
In general terms start simple and add complexity as you need to. Your solutions have no meaning without context. A banking app needs different considerations to a blog.
Hope this helps
There is a platform called Vitria BusinessWare; it's a very successful product worth millions.
Now let's see how it works and what it does, so that we can do the same in theory:
It interconnects projects with their databases, web services with their EJBs, etc.
From its concept we can learn the following:
Create a main stateless EJB (an API) whose job is to pass messages:
- from web services to other web services
- from web services to webapps
- from webapps to web services
The purpose of this EJB is to first perform validations against the main database and then pass the calls on to the other modules.
- Only this EJB has access to the DB, which keeps the connections more secure.
- This EJB queues messages until the destination modules are free to accept them.
- This EJB controls all the processes in the DB.
- This EJB decides where to send the messages.
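A very rough sketch of such a mediator using Java EE 7 APIs. Everything specific here - the queue name, the Sender entity, the validation query, the String payload - is an assumption for illustration, since the description above is conceptual:

```java
import javax.annotation.Resource;
import javax.ejb.Stateless;
import javax.inject.Inject;
import javax.jms.JMSContext;
import javax.jms.Queue;
import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;

// Hypothetical central gateway: validates against the main database,
// then queues the message for the destination module.
@Stateless
public class MessageGatewayBean {

    @PersistenceContext
    private EntityManager em;           // only this bean touches the DB

    @Inject
    private JMSContext jms;

    @Resource(lookup = "java:/jms/queue/webapp1")   // destination name is made up
    private Queue webApp1Queue;

    public void forwardToWebApp1(String senderId, String payload) {
        // Validation step: e.g. check that the sender is registered
        // ("Sender" is a hypothetical entity).
        Long known = em.createQuery(
                "select count(s) from Sender s where s.id = :id", Long.class)
                .setParameter("id", senderId)
                .getSingleResult();
        if (known == 0) {
            throw new IllegalArgumentException("Unknown sender: " + senderId);
        }
        // The broker holds the message until the destination module consumes it.
        jms.createProducer().send(webApp1Queue, payload);
    }
}
```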
If I have a hosted web application, is it good practice to split the web front end and the API web service into 2 different projects/hosted applications in Tomcat?
I can see that if people try to abuse the API, it will affect the performance of the web application.
If I were to create 2 separate projects (or, if not initially, at least build for the potential to split things off later), could I somehow share my Hibernate data layer between the 2 projects?
I'm using IntelliJ; how can I do this? Would it be a matter of creating a separate module for Hibernate (domain entities, DAO, and service classes)?
I wouldn't say it is good practice in general, but it may be a good idea for some scenarios.
In a service-oriented architecture, a service layer is consumed not only by the web layer but potentially by other clients. In this case it is probably a good idea to build the web and service layers on separate servers.
Another case would be when you want to perform separate deployments, because, e.g., work on the two layers is done by different teams or in separate workstreams - though I would question whether this is good practice, as opposed to teams working on vertical features rather than on layers.
You can create your service layer in many different ways:
As web services, when you need interoperability.
As remote EJBs (this is possible in TomEE), when interoperability is not necessary.
You can also create a combination of both of the above; they are not mutually exclusive.
In terms of splitting the projects, you could create:
A set of domain objects in a jar module that is to be shared between your web and service layers.
A war module for your web layer.
A jar module for your service layer interfaces that is a dependency for your web layer.
A jar/war module for your service layer containing services and DAOs.
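If you use Maven, that split maps naturally onto a multi-module build; a sketch of the parent pom.xml (all module names are placeholders):

```xml
<project xmlns="http://maven.apache.org/POM/4.0.0">
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.example</groupId>
    <artifactId>app-parent</artifactId>
    <version>1.0.0</version>
    <packaging>pom</packaging>

    <modules>
        <module>domain</module>        <!-- shared domain objects (jar) -->
        <module>service-api</module>   <!-- service layer interfaces (jar) -->
        <module>service-impl</module>  <!-- services + Hibernate DAOs (jar/war) -->
        <module>web</module>           <!-- web layer (war), depends on domain + service-api -->
    </modules>
</project>
```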
What's the difference between what you call the "web" and the "API web service" from the client's perspective? A programmatic client can "abuse" either of them, so I'm not sure it makes sense to split them for that reason. You can use a load balancer to scale out.
You could make an internal API that the web interface consumes, and a web api that consumes the internal API.
I need to perform pre- and post-processing of all incoming requests to a web server. The functionality includes URL-level access restriction and language translation, but also other special cases that need to be handled globally.
Typically this can be achieved with servlet filters, but as the number of web applications grows it becomes desirable not to bundle the filters with every application, since all applications would need to be rebuilt and redeployed whenever a filter changes.
Instead I would like to install the filters globally on the server, and I have found two possible solutions, neither of which I am satisfied with.
On Tomcat it is possible to deploy server-wide filters in the "lib" directory and configure the server's web.xml to map them to incoming requests. The problem I see is that any filter dependencies also need to be deployed globally in the lib directory. From what I understand this can cause hard-to-solve dependency conflicts with installed applications. (Does Tomcat load the same library file into memory twice if it is present in two web apps?)
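To be concrete about what I mean by this first option (filter name and class are placeholders): the filter jar and its dependencies go into $CATALINA_BASE/lib, and the mapping goes into Tomcat's conf/web.xml, whose contents serve as defaults for every deployed application:

```xml
<!-- $CATALINA_BASE/conf/web.xml - applies to every web application -->
<filter>
    <filter-name>globalAccessFilter</filter-name>
    <filter-class>com.example.filters.GlobalAccessFilter</filter-class>
</filter>
<filter-mapping>
    <filter-name>globalAccessFilter</filter-name>
    <url-pattern>/*</url-pattern>
</filter-mapping>
```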
Deploying the filters in a simple web application that mainly acts as a proxy would at least bundle the filters with their corresponding dependencies. This application can then be deployed on the server and take all incoming requests before forwarding them to the target application using the crossContext config parameter (RequestDispatcher forward between Tomcat instances). However, this requires fiddling with the URLs so that all links point to the "proxy".
Neither of these solutions seems satisfactory. They are both platform dependent, since they rely on Tomcat, and they both seem to have potential problems and require special handling of dependencies.
What is the best practice for implementing server-wide functionality?
This is just an untested thought (so not a best practice), a variation of option 2 in your list.
You could use SiteMesh (which is actually meant for decorating multiple web apps with a common header/footer - but in this case, don't use the header/footer).
Host SiteMesh as a separate web app with crossContext = true.
SiteMesh will be invoked as a filter for each web app, so the URLs that the end user sees will not change at all, but you will have to define the decorators.xml for each web app.
You can write your actual filter processor and chain it after the SiteMesh filter. All requests will go to the SiteMesh app first, then to your filter, and then to the individual servlet within the web app.
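In each web app's web.xml that chain would look roughly like the sketch below (the SiteMesh 2.x PageFilter class is assumed; your own filter name and class are placeholders). Filters mapped to the same URL pattern run in the order their filter-mappings are declared:

```xml
<!-- SiteMesh first... -->
<filter>
    <filter-name>sitemesh</filter-name>
    <filter-class>com.opensymphony.module.sitemesh.filter.PageFilter</filter-class>
</filter>
<!-- ...then your own pre/post-processing filter -->
<filter>
    <filter-name>preAndPostProcessing</filter-name>
    <filter-class>com.example.filters.PreAndPostProcessingFilter</filter-class>
</filter>

<filter-mapping>
    <filter-name>sitemesh</filter-name>
    <url-pattern>/*</url-pattern>
</filter-mapping>
<filter-mapping>
    <filter-name>preAndPostProcessing</filter-name>
    <url-pattern>/*</url-pattern>
</filter-mapping>
```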