I have a large-scale project I am working on at the moment using Eclipse. For a one-man team these problems would not be an issue, but since our team is more than one person we need to be able to break the project into pieces that individual team members can work on.
To keep it simple, let's say I have two layers to separate:
1. Each DAO is a separate Java project, to be worked upon individually
2. The web-tier service layer contains all of our service endpoints and must be able to reference all of the DAOs. This layer runs on Tomcat as a dynamic web project, and utilizes Adobe LiveCycle Data Services as the piece that handles creation and management of endpoints.
Now, the issue we are running into is that when we create a DAO and unit test it individually, it runs great. But when we reference it from our service project and try to run it, we start hitting all kinds of problems caused by having two different versions of certain jars on the classpath, and the server throws errors as a result.
We know we can solve the immediate issue by pulling the problem jars and making sure it doesn't happen again, but as I said this is a large-scale project with multiple people working on it, and we don't want to spend our time weeding out dependency issues when we're under the gun.
We are looking for recommendations on alternative solutions. Our team is new to Java EE, so we don't have much of a bearing on what we can use to tie everything together, or whether it is even a viable option. Should we be looking at turning our DAOs into EJBs and deploying them in an EAR? If so, where would our service layer live, and would it still be able to reference the DAO classes, given that (from what we have read) an EJB maintains its own classpath? Are we looking down the wrong path, or is our current understanding of Java EE simply wrong?
Any assistance is greatly appreciated. We are still in the framework stage of this project and we want to be sure that we will be able to maintain it in the long run.
I second the Maven recommendation. That can add all sorts of sanity to your project structure.
Maven can even generate Eclipse workspaces via mvn eclipse:eclipse
An important clarification on the EJB note: as of Java EE 6 you no longer need to separate EJBs from servlets; they can live together in the very same jar inside the war file.
So understand that whether or not you use EJBs no longer has the impact on packaging and classloaders that it once did. These are now separate decisions. EARs and classloader separation should now be viewed as a feature you opt into when you actually want classloader separation and the complexity it brings. Most applications simply do not need that and are more than fine with just a war file containing servlets, EJBs, JPA entities, CDI beans, JAX-RS services and whatever else you need. You are free to decide how you want to separate them, or whether to bother separating them at all.
EJBs do make great DAOs thanks to container-managed transactions, something you don't get from plain Tomcat but which can be added to Tomcat via TomEE, and that works fine in Eclipse. You should consider EJBs for that reason, not for dependency reasons.
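To make that concrete, here is a minimal, hypothetical sketch of an EJB used as a DAO; the Customer entity and the method names are made up for illustration, and each public method runs in a container-managed transaction by default:

    import javax.ejb.Stateless;
    import javax.persistence.Entity;
    import javax.persistence.EntityManager;
    import javax.persistence.GeneratedValue;
    import javax.persistence.Id;
    import javax.persistence.PersistenceContext;

    // Hypothetical entity, shown inline only to keep the example self-contained.
    @Entity
    class Customer {
        @Id @GeneratedValue Long id;
        String name;
    }

    // A DAO as a stateless session bean: no manual transaction handling needed,
    // because the container wraps each business method in a transaction.
    @Stateless
    public class CustomerDao {

        @PersistenceContext
        private EntityManager em;

        public Customer find(Long id) {
            return em.find(Customer.class, id);
        }

        public Customer save(Customer customer) {
            return em.merge(customer);
        }

        public void delete(Long id) {
            Customer managed = em.find(Customer.class, id);
            if (managed != null) {
                em.remove(managed);
            }
        }
    }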
Side note, as you're new to Java EE, you might find this helpful:
http://openejb.apache.org/examples-trunk/index.html
To keep things organized when working with Java EE in teams of more than one person, I would suggest the following:
Use Maven to manage your build process and library dependencies.
Maven has a small learning curve, but once you grasp it you will be grateful. By using Maven you no longer depend on Eclipse to manage your classpath.
One thing about it that I find really helpful when working in teams is the install feature. Suppose you are working on version 1.0 of an EJB module, say core-ejb-module-1.0, and you've got it to a stable state and want everyone working on the project to refer to it from now on.
You then run a maven command like this on it: mvn clean package install
Maven will clean this module, compile it, run tests, create the jar and then install it to a repository that you define. Could be any computer in your company.
Now you can tell the guys working on other projects to update this dependency version in their pom.xml, and in the next build they run, before compiling, Maven will download this library and use it. Really neat. No more classpath hell.
(There are other ways to always automatically refer to the latest library as stated in this post, but there are some caveats. Anyway it's just an example.)
Use JPA/EJB instead of DAO Pattern.
Some people say DAO meaning any sort of data access, others really mean that they use the DAO Pattern to access objects. If that is your case, you no longer need to use it when using JPA. (At least for most common scenarios).
In my case, I have a generic EntityService which is capable of doing CRUD operations on any entity and has centralized query management. Every EJB that has to perform database-related operations can then inject this guy and do its job.
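For illustration only, here is a rough sketch of what such a generic service could look like (this is my own hypothetical version, not a standard API, and it assumes the default JPA entity naming for the query):

    import java.util.List;
    import javax.ejb.Stateless;
    import javax.persistence.EntityManager;
    import javax.persistence.PersistenceContext;

    // Generic CRUD service; other EJBs can inject it with @EJB.
    @Stateless
    public class EntityService {

        @PersistenceContext
        private EntityManager em;

        public <T> T find(Class<T> type, Object id) {
            return em.find(type, id);
        }

        public <T> T save(T entity) {
            return em.merge(entity);
        }

        public void remove(Object entity) {
            em.remove(em.merge(entity));
        }

        public <T> List<T> findAll(Class<T> type) {
            // Assumes the entity name equals the simple class name (the JPA default).
            return em.createQuery(
                    "SELECT e FROM " + type.getSimpleName() + " e", type)
                    .getResultList();
        }
    }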
As a suggestion, with Maven your project could be organized as such:
core project structure
core (The pom root)
core-ejb-module (Includes all generic EJBs, like the EntityService for instance.)
core-jpa-module (Includes all JPA generic definitions, like Interfaces, MappedSuperclasses and such.)
core-jsf-module (Includes all JSF generic definitions, like abstract controllers, generic converters and wrappers for FacesContext, etc..)
Now that you have a core generic module setup, you could create:
an application structure
app (The pom root)
app-ear-module (Includes all other modules in this application. Shared jars go in the EAR /lib folder, so all other modules can reference them.)
app-ejb-module-a (Includes the EJBs for the business layer of your application. It uses the core-ejb-module.)
app-ejb-module-b (You may have lots of EJB modules. You may even have a project that contains only EJB modules. Other apps will declare their dependency on them via Maven.)
app-jpa-module (Contains the definitions of the JPA entities that represent your database tables. Depends on the core-jpa-module.)
app-web-module (Holds the pages, Controllers and Converters for this application.)
I think you got the idea. Things tend to be loosely coupled and you may organize your projects as you like.
This is just a simple example to illustrate. I didn't explain much about Maven, but if you're interested I think it will really help you.
I hope it gives you some ideas and may help you in any way.
Cheers!
If you can run all the sub-components using the same set of dependencies, you may find it helpful to migrate to a Maven build.
With Maven, you can define a top-level project that manages all the 3rd party dependency versions in one place, so all modules are built, tested and deployed against the same library versions. You are also likely to find Maven a good fit for the multi-module approach you have adopted, as it ensures that a project is rebuilt correctly if one of its dependencies changes.
You would still be able to use dynamic web projects as before; Eclipse will automatically deploy the DAOs as part of the service project (IIRC you need to characterise the DAOs as utility modules).
If you do go down the EJB route, you are correct that each EAR will get its own class-loader and can therefore use a different set of dependencies. However, in your position I would look at improving your dependency management first - it'll probably be cheaper and easier.
I have two Java projects with the same domain objects. The first project is the administration side of a webapp, and the second project is the webapp itself.
I've chosen this approach in order to allow deployment of administration without downtime for my webapp.
So both projects use the same database. I'm using Spring Data and marking entities with @Entity.
My question is: is there any way to share the domain objects between the two projects instead of duplicating them in each one?
For example, by creating another Maven module with the domain objects and marking it as a dependency in both projects. (But would @Entity still work in that case?)
The way is just as you said it: create a Maven module (usually called datamodel, infomodel or something along those lines) that contains all of your JPA classes (the @Entity classes).
This module can either be a completely separate third project (more work) or, more likely, you pick one of the two projects as the "owner" of the module and the other project simply lists it as a dependency. In both cases you'll need to think about things like version compatibility (what happens when you update administration but not the webapp and the entities have changed? Who updates the database, and how do you make sure the older code can still read and write it?).
As for working: JPA classes work just fine in their own jar.
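As a purely illustrative sketch (package and class names are made up), an entity in the shared module looks like any other JPA entity; both applications just declare the module as a Maven dependency and make sure the jar is visible to their persistence unit (via scanning or explicit listing):

    package com.example.datamodel; // hypothetical shared-module package

    import javax.persistence.Entity;
    import javax.persistence.GeneratedValue;
    import javax.persistence.Id;

    // Lives in the shared "datamodel" jar used by both the webapp
    // and the administration project.
    @Entity
    public class Product {

        @Id
        @GeneratedValue
        private Long id;

        private String name;

        public Long getId() { return id; }

        public String getName() { return name; }

        public void setName(String name) { this.name = name; }
    }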
What you have raised is a good approach: separating the shared code into a different Maven project.
What you're trying to do is very similar to the structure of Broadleaf Commerce. It is a multi-module project using Maven and Spring, is open source, so you can look at how it is structured to see if it helps.
Here is another guide on how to implement it step by step. Hope it helps.
It seems that you will need at least three modules:
1st - the domain module with the entity-annotated domain classes;
2nd - the application itself, which depends on the domain module;
3rd - the admin module, which also depends on the domain module.
Now that you have a multi-module maven project you should have a 4th project formally listing the other three as its child modules.
P.S.: Resist the temptation to create separate git repositories and evolve the versions of the modules separately. (Just a piece of advice.)
This may be a very rudimentary question, but please help me out if this is well-known and has been solved elsewhere.
I have a multi-WAR setup (all Maven modules), say kilo-webapp1 and kilo-webapp2, as two WARs that I need to deploy on a Tomcat instance. These two webapps both use services from a common service jar, say kilo-common-services.jar. The kilo-common-services.jar has its own Spring context that is loaded by the users of the jar, viz. kilo-webapp1 and kilo-webapp2 in this case. It so happens that the initialization of the services in kilo-common-services takes a long time, so I want it to happen only once (to keep the instance startup time down), which also lets me use it as a second-level cache that is kept current in the JVM instance. To do this, we resorted to the following steps:
1. Modified the catalina.properties of CATALINA_BASE in Tomcat to set shared.loader to ${catalina.base}/shared/lib.
2. Copied kilo-common-services.jar and all of its dependent jars to CATALINA_BASE/shared/lib. [Manual step]
3. Copied the Spring-related jars to the CATALINA_BASE/shared/lib location as well. [Manual step]
4. Created a beanRefContext.xml file in kilo-common-services.jar, defining a new ClassPathXmlApplicationContext whose constructor is given the location of the Spring context file for the common services (see the sketch after this list).
5. Marked kilo-common-services.jar and every other shared dependency (the Spring jars, for example) with provided scope in the kilo-webapp1 and kilo-webapp2 pom files. For Spring this is needed to ensure that classpath scanning is not triggered twice; it also avoids various ClassCastExceptions (for log4j, let's say) that appear if the jars are not excluded via the provided scope.
6. The web.xml of kilo-webapp1 and kilo-webapp2 indicates that their parentContext is the servicesContext defined in kilo-common-services.jar.
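For reference, steps 4 and 6 boil down to Spring's parent-context mechanism. The sketch below only illustrates what happens under the hood (in a webapp you would normally use the locatorFactorySelector/parentContextKey context-params with ContextLoaderListener rather than doing this by hand); the key servicesContext, the file webapp-context.xml and the bean name someSharedService are assumptions:

    import org.springframework.beans.factory.access.BeanFactoryLocator;
    import org.springframework.beans.factory.access.BeanFactoryReference;
    import org.springframework.context.ApplicationContext;
    import org.springframework.context.access.ContextSingletonBeanFactoryLocator;
    import org.springframework.context.support.ClassPathXmlApplicationContext;

    public class ParentContextDemo {
        public static void main(String[] args) {
            // Reads classpath*:beanRefContext.xml (the file created in step 4)
            // and looks up the shared services context by its bean id.
            BeanFactoryLocator locator = ContextSingletonBeanFactoryLocator.getInstance();
            BeanFactoryReference parentRef = locator.useBeanFactory("servicesContext");
            ApplicationContext parent = (ApplicationContext) parentRef.getFactory();

            // A child context (standing in for one webapp's context) sees all
            // beans of the shared parent, so the services are created only once.
            ClassPathXmlApplicationContext child = new ClassPathXmlApplicationContext(
                    new String[] {"webapp-context.xml"}, parent);
            System.out.println(child.containsBean("someSharedService"));
        }
    }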
I was able to verify that only one instance of the kilo-common-services services exists, but the setup, as you might have imagined, is painful. If someone has best practices for such a setup in an IDE like Eclipse, I would really appreciate it. My problems are as follows:
#2 is becoming a challenge. I am currently running mvn dependency:copy-dependencies on kilo-common-services to copy dependent jars from target/dependency to the shared/lib which is a woefully manual step. Time and again, I forget to regenerate dependencies and have to do a redeploy again.
#3 is also not straight-forward as time and again there are newer common dependencies and we always have to remember to copy it to shared lib to avoid ClassCastExceptions
#5 is again a maintenance nightmare.
Also, as time progresses there will be more such disparate common jars that need to be shared, and it would involve pain for each of those jars. Feel free to critique the setup and propose a better one in its place that may be easy to use (from an IDE as well). Would be happy to provide any other details.
Thanks in advance!
The problem is that your architecture is broken (and that's why you're struggling with the solution). You have two solutions:
1) If you want to share a service that takes a long time to initialise between two WAR applications, make it a completely separate service and access it via REST or any other kind of remoting (see the sketch after this list).
2) Merge both webapps into one.
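To illustrate option 1, here is a minimal client-side sketch using Spring's RestTemplate; the URL, path and return type are hypothetical and would depend on how you expose the service:

    import org.springframework.web.client.RestTemplate;

    // The expensive-to-initialise service runs once as its own web application;
    // both WARs call it over HTTP instead of loading it from a shared jar.
    public class KiloServiceClient {

        private final RestTemplate restTemplate = new RestTemplate();

        // Assumed endpoint, for illustration only.
        private static final String SERVICE_URL =
                "http://localhost:8080/kilo-services/api/lookup/{key}";

        public String lookup(String key) {
            return restTemplate.getForObject(SERVICE_URL, String.class, key);
        }
    }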
Having the common library in the shared lib folder is going to bring you lots of headaches, and you'll end up rolling it back.
My (personal) approach would be to merge both applications, but keep the packages separate enough and have separate Spring configurations. That way you at least keep the logical separation of both webapps.
Also since both run on the same container, there's little gain from having 2 separate wars (unless you're planning to move them to different containers very soon).
About the IDE, you can use the maven-cargo-plugin to start up a tomcat with several web applications with (almost) any configuration you want.
We are developing a RESTful SOA with Spring and Tomcat, utilizing Domain-Driven Design (well, that's the plan anyway). There is a migrationProject and an initial basic search service. Two separate WAR files, with two separate POMs. Both use the same domain objects.
So I will have a separate project that is just the domain objects. I will wrap them up into a jar, and then using Maven and/or Jenkins it will deploy automatically whenever I configure it to (for example, when pushed to a specific repository).
Having two copies of the same jar sounds like a much worse idea to me. It's not your architecture that is broken; it's your deployment and development process that needs improvement, IMHO.
(my kind of related question).
Our long term plan is to have one project as the restful interface, with multiple Controllers that have service classes and repositories injected into them from their dependencies.
We are building a small application using different architectural layers such as domain, interface, infrastructure and application. This follows the Onion DDD model. Now I am wondering if there is any benefit in splitting the application into a multimodule maven project. As far as I can see now it seems to make things more difficult than needed. The entire application will be deployed as a single WAR file into a Tomcat container.
Splitting your application makes sense for the following:
When a certain part of the project needs to have new functionality or bug fixes, you can simply focus on that module and run just the tests for it. Compiling a fraction of all the code and running just the related tests speeds up your work.
You can re-use the code from the modules across different projects. Let's assume your project contains some well-written, generic-enough code for mail sending. If you later have another project that needs mail-sending functionality, you can simply re-use your existing module or build upon it (in another module, by adding it as a dependency).
Easier maintainability on the long run. Maybe now it seems like a small project. In a few months things might look different and then you'll need to do more refactoring to split things into logical units (modules).
Conceptual clarity (as added by Adriaan Koster).
Concerning the WAR: You can have an assembly module which puts things together and produces a final WAR file from all the related modules.
Initially, this may seem as more work, but in the long-run, modularized projects are easier to work with and to maintain. Most sane developers would prefer this approach.
Using multiple modules forces you to have a hierarchy of dependencies. You have one module which is standalone and doesn't depend on any other of your modules. You have another which depends only on that one. It might appear harder than allowing anything to depend on anything else, but the anything-goes approach results in a mess of dependencies which is hard to fix later.
If you are trying to follow a layered model I suggest you place each layer in a different module. This will ensure you are not tempted to break the model.
Short answer: today it is small; tomorrow it will be bigger and more complicated to maintain, reuse, extend, integrate with other systems, and so on.
As far as I know, Maven does little to help with WAR dependencies. Since you are talking about a single WAR, this should never be a problem.
You can separate Java classes into several "jar" submodules, but if you split the WAR project into several smaller WARs and use some kind of "overlay" packaging, things get complicated.
Just for information: one of our projects contained too many web pages, so we decided to split it into several WAR submodules. However, the session is not shared between the different deployed WARs, and we were not going to use Kerberos stuff. In the end we modified a lot of the sources of Glassfish, Jetty, MyFaces, etc. to make them resolve web.xml stuff inside JARs, and converted the whole project to Facelets 2.0 (to avoid the dependency on the JDK tools.jar and a custom resource handler); the only reason was to change the WAR submodules into JAR submodules and move all webapp/pages into class resources. So the conclusion: Maven does a great job with JAR dependencies, but not with WARs, so stick to a single WAR.
EDIT: You can put applicationContext.xml in one of the base submodules and import it via classpath:com/example/applicationContext.xml. Also, Spring 3.0 has annotation support, so you can have Spring auto-scan your beans instead of declaring them all in the XML.
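As a small, hypothetical illustration of the annotation route (package and class names are made up): a bean living in the base submodule only needs a stereotype annotation, and the importing module's context picks it up via component scanning pointed at the shared package, with no per-bean XML declaration:

    package com.example.service; // hypothetical package in the base submodule

    import org.springframework.stereotype.Service;

    // Picked up by component scanning from any module that has this jar on
    // its classpath.
    @Service
    public class GreetingService {

        public String greet(String name) {
            return "Hello, " + name;
        }
    }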
Splitting your project into multiple Maven projects is useful if you want to reuse your classes in another project or if your projects are deployed in different configurations.
Think of a web service, for example: if you are hosting the server, you could build a project for your domain classes (models) and your endpoint interfaces that can be used by both server and client. The server would be another project that is built into a WAR.
The first project could also be used to develop further clients.
Use a parent project for dependency management on common projects (like logging) and different profiles and build configurations.
We are developing webapps with Eclipse + the Tomcat plugin. We recently started a new app which will run on Facebook and StudiVZ (an FB competitor in Germany). Since the functionality of the app will be 95% the same, we split the code into separate Eclipse projects (app-core, app-facebook, app-vz). The -core project is source-linked into the -facebook and -vz projects in Eclipse. We are also using Hudson for CI and wrote Ant scripts that import the code from the -core project before building. So basically we tried to inherit at the project level.
The described method has some flaws:
Versioning is complicated
The -core project does not run standalone, which makes automatic testing partly impossible
We need to modify some models that the -core project's classes depend on
Other problems that make me think this is not the best solution
Does anyone have suggestions for a better solution?
There is a wealth of build tools available for Java that address dependency management and versioning specifically. Many of them integrate with Hudson and Eclipse.
I'd suggest looking at Maven and how it does dependency management as a good starting point. Even if you don't use Maven itself, many of the solutions out there build on Maven's dependency management mechanism. Something like Apache Ivy allows you to use Maven dependency management but still keep your own custom Ant scripts, whereas something like Gradle is a wholesale replacement.
You should be able to split your project into three or more parts and then establish the dependencies via the Java Build Path. You need to clean up the dependencies between the projects. If you need to configure your core components depending on whether it is the -facebook or the -vz project, you might need to separate the configuration, maybe even use Spring or a similar dependency injection framework.
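A rough, purely illustrative sketch of that idea (all names are hypothetical): the -core project only knows an interface, and each platform project supplies its own implementation, which the DI container wires into the core services. Each class below would live in its own file and module:

    // app-core: the abstraction the core code programs against.
    public interface FriendProvider {
        java.util.List<String> fetchFriendIds(String userId);
    }

    // app-facebook: platform-specific implementation (app-vz would have its own).
    public class FacebookFriendProvider implements FriendProvider {
        @Override
        public java.util.List<String> fetchFriendIds(String userId) {
            // Call the Facebook API here (omitted in this sketch).
            return java.util.Collections.emptyList();
        }
    }

    // app-core: depends only on the interface, never on a concrete platform.
    public class InviteService {
        private final FriendProvider friendProvider;

        public InviteService(FriendProvider friendProvider) {
            this.friendProvider = friendProvider;
        }

        public int countInvitableFriends(String userId) {
            return friendProvider.fetchFriendIds(userId).size();
        }
    }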
When trying to introduce reuse into web-based Java projects, usually the problems arise in the UI code. Not many frameworks were built with this approach in mind.
I don't use Eclipse (don't like it, in fact), but I can point to how we deal with a similar problem.
We use Maven with IntelliJ. In particular, both of these support modules which have defined internal dependencies. In your case it could be -fb and -vz modules depending on core, or you can split core into smaller parts (such as DAO, business logic, etc.).
When compiling, deliverables of "upper" modules would be used to build "lower" modules.
Let's go over points/flaws you have raised:
Versioning is no longer a problem, as everything sits under the same root in Subversion/Git/the VCS of your choice.
Why is that a problem? It certainly shouldn't be an issue for unit tests, since, as I understand TDD, these should not require complex environments. For automated tests you would have to test the core API (as this is the interface between core and everything else, right?), hence this shouldn't require any frontend stuff.
You would need to explain your other points to say why you don't like the current setup.
It is against the Geneva Convention to ask a developer to use anything other than the IDE of his/her choice.
I have a rather large (several MLOC) application at hand that I'd like to split up into more maintainable separate parts. Currently the product is comprised of about 40 Eclipse projects, many of them having inter-dependencies. This alone makes a continuous build system unfeasible, because it would have to rebuild very much with each checkin.
Is there a "best practice" way of how to
identify parts that can immediately be separated
document inter-dependencies visually
untangle the existing code
handle "patches" we need to apply to libraries (currently handled by putting them in the classpath before the actual library)
If there are (free/open) tools to support this, I'd appreciate pointers.
Even though I do not have any experience with Maven, it seems to force a very modular design. I wonder now whether this is something that can be retrofitted iteratively, or if a project that wanted to use it would have to be laid out with modularity in mind right from the start.
Edit 2009-07-10
We are in the process of splitting out some core modules using Apache Ant/Ivy. A really helpful and well-designed tool that doesn't impose as much on you as Maven does.
I wrote down some more general details and personal opinion about why we are doing that on my blog - too long to post here and maybe not interesting to everyone, so follow at your own discretion: www.danielschneller.com
Using OSGi could be a good fit for you. It would allow you to create modules out of the application. You can also organize dependencies in a better way. If you define the interfaces between the different modules correctly, you can use continuous integration, as you only have to rebuild the module that you affected with a check-in.
The mechanisms provided by OSGi will help you untangle the existing code. Because of the way the classloading works, it also helps you handle the patches in an easier way.
Some concepts of OSGi that seem to be a good match for you, as described on Wikipedia (a small service example follows after the list):
The framework is conceptually divided into the following areas:
Bundles - Bundles are normal jar components with extra manifest headers.
Services - The services layer connects bundles in a dynamic way by offering a publish-find-bind model for plain old Java objects (POJOs).
Services Registry - The API for management services (ServiceRegistration, ServiceTracker and ServiceReference).
Life-Cycle - The API for life cycle management (install, start, stop, update, and uninstall bundles).
Modules - The layer that defines encapsulation and declaration of dependencies (how a bundle can import and export code).
Security - The layer that handles the security aspects by limiting bundle functionality to pre-defined capabilities.
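To give a feel for the Services layer, here is a minimal, hypothetical bundle activator (the GreeterService interface and implementation are made up) that publishes a POJO as an OSGi service on start and removes it on stop:

    import org.osgi.framework.BundleActivator;
    import org.osgi.framework.BundleContext;
    import org.osgi.framework.ServiceRegistration;

    // Hypothetical service interface and implementation, inlined for brevity.
    interface GreeterService {
        String greet(String name);
    }

    class GreeterServiceImpl implements GreeterService {
        public String greet(String name) {
            return "Hello, " + name;
        }
    }

    // Declared as Bundle-Activator in the bundle's manifest; the framework
    // calls start/stop as part of the bundle life cycle.
    public class Activator implements BundleActivator {

        private ServiceRegistration registration;

        public void start(BundleContext context) {
            registration = context.registerService(
                    GreeterService.class.getName(), new GreeterServiceImpl(), null);
        }

        public void stop(BundleContext context) {
            registration.unregister();
        }
    }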
First: good luck & good coffee. You'll need both.
I once had a similar problem: legacy code with awful circular dependencies, even between classes from different packages, like org.example.pkg1.A depending on org.example.pkg2.B and vice versa.
I started with Maven 2 and fresh Eclipse projects. First I tried to identify the most common functionality (logging layer, common interfaces, common services) and created Maven projects for it. Each time I was happy with a part, I deployed the library to our central Nexus repository so that it was almost immediately available for other projects.
So I slowly worked up through the layers. Maven 2 handled the dependencies, and the m2eclipse plugin provided a helpful dependency view. BTW, it's usually not too difficult to convert an Eclipse project into a Maven project. m2eclipse can do it for you, and you just have to create a few new folders (like src/main/java) and adjust the build path for source folders. It takes just a minute or two. But expect more difficulties if your project is an Eclipse plugin or RCP application and you want Maven not only to manage artifacts but also to build and deploy the application.
In my opinion, Eclipse, Maven and Nexus (or any other Maven repository manager) are a good basis to start from. You're lucky if you have good documentation of the system architecture and that architecture is really implemented ;)
I had a similar experience in a small code base (40 kLOC). There are no "rules":
I compiled with and without a "module" in order to see its usage
I started from "leaf modules", modules without other dependencies
I handled cyclic dependencies (this is a very error-prone task)
with Maven there is a great deal of documentation (reports) that can be deployed in your CI process
with Maven you can always see what uses what, both in the generated site and in NetBeans (with a very nice directed graph)
with Maven you can import library code into your codebase, apply source patches and compile it with your products (sometimes this is very easy, sometimes it is very difficult)
Check also Dependency Analyzer (screenshot source: javalobby.org) and the NetBeans dependency graph (screenshot source: zimmer428.net).
Maven is painful to migrate to for an existing system. However it can cope with 100+ module projects without much difficulty.
The first thing you need to decide is what infrastructure you will move to. Should it be a lot of independently maintained modules (which translates to individual Eclipse projects), or will you consider it a single chunk of code which is versioned and deployed as a whole? The first is well suited to migrating to a Maven-like build environment; the latter to having all the source code checked out at once.
In any case you WILL need a continuous integration system running. Your first task is to make the code base build automatically, so you can let your CI system watch over your source repository and rebuild it when you change things. I decided on a non-Maven approach here, and we focus on having an easy Eclipse environment, so I created a build environment using ant4eclipse and Team ProjectSet files (which we use anyway).
The next step would be getting rid of the circular dependencies - this will make your build simpler, get rid of Eclipse warnings, and eventually allow you to get to the "checkout, compile once, run" stage. This might take a while :-( When you migrate methods and classes, do not MOVE them, but extract or delegate them, leave their old name lying around, and mark them deprecated. This will separate your untangling from your refactoring, and allow code "outside" your project to still work with the code inside your project.
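A tiny, purely hypothetical illustration of that extract-and-delegate step (all class names are made up): the old entry point survives as a deprecated delegate, so external callers keep compiling while new code targets the extracted class:

    // Hypothetical types, included only to keep the example self-contained.
    class Order {
        double amount;
    }

    class OrderCalculator {
        double totalOf(Order order) {
            return order.amount;
        }
    }

    public class LegacyOrderUtils {

        /**
         * @deprecated Moved to {@link OrderCalculator#totalOf(Order)}; this
         *             delegate stays until all external callers are migrated.
         */
        @Deprecated
        public static double calculateTotal(Order order) {
            return new OrderCalculator().totalOf(order);
        }
    }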
You WILL benefit from a source repository which allows for moving files, and keeping history. CVS is very weak in this regard.
I wouldn't recommend Maven for a legacy source code base. It could give you many headaches just trying to adapt everything to work with it.
I suppose what you need is to do an architectural layout of your project. A tool might help, but the most important part is to organize a logical view of the modules.
It's not free but Structure101 will give you as good as you will get in terms of tool support for hitting all your bullet points. But for the record I'm biased, so you might want to check out SonarJ and Lattix too. ;-)