I'm trying to use the JSFUnit framework, but I can't understand how to organize projects/files. How many projects/sub-projects should I have, and is it possible to have just ONE project, as is normally the case with JUnit and similar frameworks? It would be nice to see an example.
I still can't understand whether I need to create a separate Maven project for testing purposes (as this page says) or whether I can work in my main project...
The JSFUnit project itself has many examples (as you can see in the JBoss repository), and I suggest getting them from the Subversion repository (check out the whole jboss-jsfunit-examples).
As for whether you need a separate Maven project for testing: JSFUnit tests are in-container tests and need to be packaged and deployed as a WAR. But you obviously don't want them to end up in the "production" WAR, and putting them in a separate project is the obvious way to separate things (they will then be merged with the WAR under test using overlays). So, yes, create a separate project.
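For illustration only, a JSFUnit test class in the separate test project might look roughly like this (JSFUnit 1.x style; the page path, component IDs and managed bean name are made up):

```java
// Rough sketch of a JSFUnit 1.x test living in the separate test project.
// The page path, component IDs and managed bean name are invented.
import org.apache.cactus.ServletTestCase;
import org.jboss.jsfunit.jsfsession.JSFClientSession;
import org.jboss.jsfunit.jsfsession.JSFServerSession;
import org.jboss.jsfunit.jsfsession.JSFSession;

public class LoginPageTest extends ServletTestCase {

    public void testLoginPage() throws Exception {
        // Runs a real JSF request/response cycle against the deployed WAR
        JSFSession jsfSession = new JSFSession("/login.jsf");
        JSFClientSession client = jsfSession.getJSFClientSession();
        JSFServerSession server = jsfSession.getJSFServerSession();

        // Interact with the rendered page and check server-side JSF state
        client.setValue("userName", "admin");
        client.click("submitButton");
        assertEquals("admin", server.getManagedBeanValue("#{loginBean.userName}"));
    }
}
```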
See also
Test, Test, JSF (the JSFUnit blog)
Intro to Exadel RichFaces and JBoss JSFUnit [PDF]
Testing RichFaces Applications with JBoss JSFUnit [PDF]
I'm very new to JUnit. We want to integrate JUnit into our old and big enterprise Java application (which has many projects associated with it) that was developed long ago. We want to do it without touching the Java files, at the framework level. Is this possible? If so, please share links or information on how to do it.
I can't comment (yet), so here is my recommendation as an answer:
"Working Effectively with Legacy Code" by Michael Feathers covers all scenarios of testing / maintaining etc. of old, huge applications in a very readable way.
Why would you need to touch existing Java files to write unit tests (assuming you are willing to leave the non-testable classes in your source as they are)?
Doesn't "integrating JUnit" mean writing unit tests with JUnit for the existing classes (for which unit tests have not been written so far), or do you intend some other meaning?
We keep test classes written with JUnit in a separate source folder so they don't interfere with the existing code.
I don't see any concern here. Just add the JUnit JAR to your project's build path (by including the JAR directly, or by adding a Maven or Gradle dependency), start writing unit tests for your testable classes, and you are done with the integration.
You might choose not to include the test source folder in your deployment build.
Sometimes in old code some classes might not be testable, so you will have to tweak them a little if you wish to cover them too.
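For example, a minimal JUnit 4 test placed in that separate test source folder could look like this (InvoiceCalculator stands in for a hypothetical existing class, not something from your actual application):

```java
import static org.junit.Assert.assertEquals;

import org.junit.Test;

// Lives in the separate test source folder (e.g. src/test/java) and is
// excluded from the deployment build. InvoiceCalculator stands in for an
// existing, testable class from the legacy application.
public class InvoiceCalculatorTest {

    @Test
    public void totalIsSumOfLineItems() {
        InvoiceCalculator calculator = new InvoiceCalculator();
        calculator.addLineItem(10.0);
        calculator.addLineItem(15.5);
        assertEquals(25.5, calculator.getTotal(), 0.001);
    }
}
```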
Hope it helps!
I have a large-scale project I am working on at the moment using Eclipse. Normally, as a one-man team, these problems would not be an issue, but since our team is not one person we need to be able to break up pieces of the project to be worked on by certain team members.
To keep it simple, let's say I have two layers to be separated:
1. Each DAO is a separate Java project, to be worked upon individually
2. The web-tier service layer contains all of our service endpoints and must be able to reference all of the DAOs. This layer runs on Tomcat as a dynamic web project, and utilizes Adobe LiveCycle Data Services as the piece that handles creation and management of endpoints.
Now, the issue we are running into is that when we create a DAO and unit test it individually, it runs great. But when we reference it from our service project and try to run it, we start getting all kinds of issues caused by having two different versions of certain JARs referenced, and as a result we get errors when running the server.
We know we can solve the issue by pulling the problem JARs and ensuring this doesn't happen again in the future, but as I said before, this is a large-scale project with multiple people working on it, and we don't want to spend our time weeding out dependency issues when under the gun.
We are looking for recommendations on alternative solutions. Our team is new to Java EE, so we don't have much of a bearing on what we can use to tie everything together, or whether it is even a viable approach. Should we be looking at turning our DAOs into EJBs and deploying them in an EAR? If so, where would our service layer live, and would it be able to reference the DAO classes, since the EJB maintains its own classpath (from what we have read)? Are we looking down the wrong path, or are we completely wrong in our current understanding of Java EE?
Any assistance is greatly appreciated. We are still in the framework stage of this project and we want to be sure that we will be able to maintain it in the long run.
I second the Maven recommendation. That can add all sorts of sanity to your project structure.
Maven can even generate Eclipse project files via mvn eclipse:eclipse.
An important clarification on the EJB note: as of Java EE 6 you no longer need to separate EJBs from servlets; they can live together in the very same JAR inside the WAR file.
So understand that using EJBs or not no longer has the impact on packaging or class loaders that it once did; these are now separate decisions. EARs and class-loader separation should now be viewed as a feature you use only if you actually want class-loader separation and the complexity it brings. Most applications simply do not need that and are more than fine with just a WAR file containing servlets, EJBs, JPA entities, CDI beans, JAX-RS services and whatever else you need. You are free to decide how you want to separate them, or whether to bother separating them at all.
EJBs do make great DAOs because of container-managed transactions, something you don't get from plain Tomcat but which can be added to Tomcat via TomEE, and that works fine in Eclipse. You should consider EJBs for that reason, not for dependency reasons.
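As a rough sketch of what such an EJB-backed DAO looks like (the Customer entity and method names are invented, not taken from the question):

```java
import javax.ejb.Stateless;
import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;

// Every public method runs in a container-managed transaction by default,
// so there is no manual begin/commit/rollback code. Customer stands in for
// one of your JPA entities.
@Stateless
public class CustomerDao {

    @PersistenceContext
    private EntityManager em;

    public Customer find(Long id) {
        return em.find(Customer.class, id);
    }

    public Customer save(Customer customer) {
        return em.merge(customer);
    }

    public void delete(Customer customer) {
        em.remove(em.merge(customer));
    }
}
```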
Side note, as you're new to Java EE, you might find this helpful:
http://openejb.apache.org/examples-trunk/index.html
In order to keep things organized when working with Java EE in teams of more than one person, I would suggest the following:
Use Maven to manage your build process and library dependencies.
Maven has a small learning curve, but once you grasp it you will be grateful. By using Maven you no longer depend on Eclipse to manage your classpath.
One thing about it that I think is really helpful when working in teams is the install feature. Suppose you are working on version 1.0 of an EJB module, say core-ejb-module-1.0, and you've got it to a stable state and want everyone working on the project to refer to it from now on.
You then run a Maven command like this on it: mvn clean package install
Maven will clean this module, compile it, run the tests, create the JAR and then install it into a repository that you define, which could be any machine in your company.
Now you can tell the people working on the other projects to update this dependency version in their pom.xml, and in the next build they run, Maven will download this library before compiling and then use it. Really neat. No more classpath hell.
(There are other ways to always refer to the latest library automatically, as stated in this post, but there are some caveats. Anyway, it's just an example.)
Use JPA/EJB instead of the DAO pattern.
Some people say DAO to mean any sort of data access; others really mean that they use the DAO pattern to access objects. If the latter is your case, you no longer need it when using JPA (at least for the most common scenarios).
In my case, I have a generic EntityService which is capable of performing CRUD operations on any entity and has centralized query management. Every EJB that needs to perform database-related operations can then inject this service and do its job.
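A stripped-down sketch of what such a generic service could look like (the names and method set are assumptions, not the actual code referred to above):

```java
import java.util.List;

import javax.ejb.Stateless;
import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;

// A generic CRUD service; other EJBs can obtain it via @EJB injection.
@Stateless
public class EntityService {

    @PersistenceContext
    private EntityManager em;

    public <T> T save(T entity) {
        return em.merge(entity);
    }

    public <T> T find(Class<T> type, Object id) {
        return em.find(type, id);
    }

    public <T> void remove(T entity) {
        em.remove(em.merge(entity));
    }

    public <T> List<T> findAll(Class<T> type) {
        return em.createQuery(
                "SELECT e FROM " + type.getSimpleName() + " e", type)
            .getResultList();
    }
}
```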
As a suggestion, with Maven your project could be organized like this:
core project structure
core (The pom root)
core-ejb-module (Includes all generic EJBs, like the EntityService, for instance.)
core-jpa-module (Includes all JPA generic definitions, like Interfaces, MappedSuperclasses and such.)
core-jsf-module (Includes all JSF generic definitions, like abstract controllers, generic converters and wrappers for FacesContext, etc..)
Now that you have a core generic module setup, you could create:
an application structure
app (The pom root)
app-ear-module (Includes all the other modules of this application. Shared JARs go in the EAR's /lib folder, so all other modules can reference them.)
app-ejb-module-a (Includes EJBs for the business layer of your application. It uses the core-ejb-module.)
app-ejb-module-b (You may have lots of EJB modules. You may even have a project that contains only EJB modules; other apps then declare their dependency on them via Maven.)
app-jpa-module (Contains definitions of the JPA entities that represent your database tables. Depends on the core-jpa-module.)
app-web-module (Holds the pages, Controllers and Converters for this application.)
I think you get the idea. Things tend to be loosely coupled and you may organize your projects as you like.
This is just a simple example to illustrate the approach. I didn't explain much about Maven, but if you're interested, I think it will indeed help you.
I hope it gives you some ideas and may help you in any way.
Cheers!
If you can run all the sub-components using the same set of dependencies, you may find it helpful to migrate to a Maven build.
With Maven, you can define a top-level project that manages all the 3rd party dependency versions in one place, so all modules are built, tested and deployed against the same library versions. You are also likely to find Maven a good fit for the multi-module approach you have adopted, as it ensures that a project is rebuilt correctly if one of its dependencies changes.
You would still be able to use dynamic web projects as before; Eclipse will automatically deploy the DAOs as part of the service project (IIRC you need to characterise the DAOs as utility modules).
If you do go down the EJB route, you are correct that each EAR will get its own class loader, and can therefore use a varying set of dependencies. However, in your position I would tend to look at improving your dependency management first - it'll probably be cheaper and easier.
I am a little confused about integration testing of a simple EJB. If I want to test the EJB's local interface/no-interface, do I need to use Arquillian? I stumbled upon Arquillian but I have never used it. I have a Maven directory structure, GlassFish and Eclipse Indigo.
If I want to test the EJB's local interface/no-interface, do I need to use Arquillian?
It is not necessary to use Arquillian, but there are certain things made easier when you do so.
Ordinarily, you would simply use the EJBContainer API available in EJB 3.1 to test EJBs in an embedded container (one that runs in the same JVM as the tests). In the case of embedded GlassFish, this typically results in the deployment of the EJBs found on the application's classpath.
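A bare-bones test against the embeddable container might look like this (GreetingBean is a hypothetical EJB, and the JNDI module name depends on how your classes are packaged):

```java
import static org.junit.Assert.assertNotNull;

import javax.ejb.embeddable.EJBContainer;
import javax.naming.Context;

import org.junit.Test;

public class GreetingBeanEmbeddedTest {

    @Test
    public void lookupAndCall() throws Exception {
        // Boots an embedded EJB container in the same JVM and deploys the
        // EJB modules it finds on the classpath.
        EJBContainer container = EJBContainer.createEJBContainer();
        try {
            Context ctx = container.getContext();
            // Portable global JNDI name; the module part ("classes" here)
            // depends on how the classes are packaged.
            GreetingBean bean = (GreetingBean)
                    ctx.lookup("java:global/classes/GreetingBean");
            assertNotNull(bean.greet("world"));
        } finally {
            container.close();
        }
    }
}
```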
Arquillian allows you to do a lot more than execute tests in a container. It manages the lifecycle of the container, so no code needs to be written beyond setting the properties in the arquillian.xml file. It allows you to manage deployments to a container in a far easier manner; using the ShrinkWrap API, one can programmatically perform different context-sensitive deployments to a container. Furthermore, injection of dependencies into the tests (test enrichment) can also be performed, as long as it is supported by Arquillian.
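For comparison, an Arquillian test with a programmatic ShrinkWrap deployment could look roughly like this (again using the hypothetical GreetingBean):

```java
import static org.junit.Assert.assertEquals;

import javax.ejb.EJB;

import org.jboss.arquillian.container.test.api.Deployment;
import org.jboss.arquillian.junit.Arquillian;
import org.jboss.shrinkwrap.api.ShrinkWrap;
import org.jboss.shrinkwrap.api.spec.JavaArchive;
import org.junit.Test;
import org.junit.runner.RunWith;

@RunWith(Arquillian.class)
public class GreetingBeanArquillianTest {

    // Only what is added here gets deployed, so each test class can define
    // its own minimal, context-sensitive archive.
    @Deployment
    public static JavaArchive createDeployment() {
        return ShrinkWrap.create(JavaArchive.class)
                .addClass(GreetingBean.class);
    }

    // Test enrichment: Arquillian injects the deployed EJB into the test.
    @EJB
    private GreetingBean bean;

    @Test
    public void greets() {
        assertEquals("Hello, world", bean.greet("world"));
    }
}
```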
It suffices to know that Arquillian's embedded GlassFish container support uses the same APIs that are exposed by the embedded GlassFish API; if you use those APIs directly you will usually end up duplicating the work of Arquillian, except in certain unique scenarios.
If you're interested in taking a look at examples using Arquillian, this GitHub project would help.
If you use Java EE 6, you can use EJBContainer to create a full EJB instance.
http://download.oracle.com/javaee/6/api/javax/ejb/embeddable/EJBContainer.html
http://download.oracle.com/javaee/6/tutorial/doc/gkcrr.html
If you are not a fan of mocking (just like me), you could either have a look at ejb3unit (http://ejb3unit.sourceforge.net/) or try Arquillian.
I must say I had very good experiences with "ejb3unit".
But it seems that "EJB3unit" peoject was not maintenance since 2-3 years. But supprisingly, weeks ago, there are again some activities on the ejb3unit site.
Arquillian is not so easy to start with. I would say this is mainly due to the documentation: there are few runnable examples and good tutorials.
But once you have got your first Arquillian test running, Arquillian begins to shine!
At the following link you can find a tutorial series on setting up Arquillian step by step:
http://milestonenext.blogspot.de/2012/12/ejb3-integration-test-with-arquillian.html
We have a Java web app and a number of developers working on it. Every developer works on his/her own feature, in its own branch. When the feature is ready, we want to review it and test it visually (after all unit and integration tests have passed, of course). We want to automate this deployment process. Ideally, we would like to let our developers click just one button somewhere to have the application deployed to http://example.com/staging/branches/foo (where branches/foo is the developer's path in the SVN repository).
Then, the deployment is reviewed (by project sponsors mostly), merged into /trunk, and removed from the staging server.
I think that I'm not the first one who needs to implement such a scenario. What are the tools and technologies that may help me?
Typically, I would use a staging environment to test the "trunk" (i.e. all the individual branches for a release merged together). There are several reasons for this:
Stakeholders and sponsors usually don't have time to test individual branches. They want to test the entire release. It also tends to get very confusing for people outside the immediate team to keep track of different, changing URLs and to understand why feature X works on one URL and not on another. Always keep it simple for your sponsors.
It tends to become very messy and costly to maintain more than one instance of third-party dependencies (databases, service providers etc) for proper stage testing. Bear in mind that you want to maintain realistic test-data at all times.
Until you merge all individual branches together for a release, there will be collisions and integration bugs that will be missed. Assume that automated integration tests won't be perfect.
All that being said, there are lots of good tools for automated build/deploy out there. Not knowing anything about your build setup and deployment environment, a standard setup could consist of a build server, Maven and Tomcat. The build server would execute the build and deploy the resulting application to the test server. If you are using Maven and Tomcat, there is a plugin available for this task (http://mojo.codehaus.org/tomcat-maven-plugin/introduction.html). There are a number of good build servers out there as well, with good support for Maven. TeamCity is popular, as is Hudson CI.
Basically you can use Hudson/Jenkins.
There are ways to manage multiple deployments on one machine with some plugins, as described in the following post on the Jenkins Users list; you'll just have to map those deployments to the branches the developers are working on.
As @pap said, Hudson and other CI software can build, test (if you have any tests) and deploy web apps; you'll just have to configure this procedure. Hope the link is helpful.
We are developing web apps with Eclipse and the Tomcat plugin. We recently started a new app which will run on Facebook and StudiVZ (a FB competitor in Germany). Since the functionality of the app will be 95% the same, we split the code into separate Eclipse projects (app-core, app-facebook, app-vz). The -core project is source-linked into the -facebook and -vz projects in Eclipse. We also use Hudson for CI and wrote Ant scripts that import the code from the -core project before building. So basically we tried to inherit at the project level.
The described method has some flaws:
Versioning is complicated
The -core project does not run standalone, which makes automatic testing partly impossible
We need to modify some models that the -core project's classes depend on
Other problems that make me think this is not the best solution
Does anyone have suggestions for a better solution?
There are a wealth of build tools available for Java that address dependency management and versioning specifically. Many of these integrate with Hudson and Eclipse.
I'd suggest looking at Maven and how it does dependency management as a good starting point. Even if you don't use Maven itself, many of the solutions out there build on Maven's dependency management mechanism. Something like Apache Ivy lets you use Maven dependency management while still using your own custom Ant scripts, whereas something like Gradle is a wholesale replacement.
You should be able to split your project into three or more parts and then establish dependencies via the Java Build Path. You need to clean up the dependencies between the projects. If you need to configure your core components depending on whether they run in the -facebook or the -vz project, you may need to separate the configuration, perhaps even use Spring or a similar dependency injection framework.
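To illustrate that configuration point, here is a sketch of how the core could stay platform-agnostic while each platform project supplies its own implementation (all names are invented):

```java
// --- in app-core (e.g. SocialPlatformClient.java) ---
// The core code depends only on this abstraction.
public interface SocialPlatformClient {

    void postToWall(String userId, String message);
}

// --- in app-facebook (e.g. FacebookClient.java) ---
// Platform-specific implementation, wired up via Spring, CDI or a simple
// factory inside the -facebook project; app-vz supplies its own VzClient
// implementing the same interface.
public class FacebookClient implements SocialPlatformClient {

    @Override
    public void postToWall(String userId, String message) {
        // call the Facebook API here
    }
}
```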
When trying to introduce reuse into web-based Java projects, usually the problems arise in the UI code. Not many frameworks were built with this approach in mind.
I don't use/hate Eclipse [1], but I can point to how we deal with a similar problem.
We use Maven with IntelliJ. In particular, both of these support modules which have defined internal dependencies. In your case it could be -fb and -vz modules depending on core, or you can split core into smaller parts (such as DAO, business logic, etc.).
When compiling, deliverables of "upper" modules would be used to build "lower" modules.
Let's go over the points/flaws you have raised:
Versioning is no longer a problem, as everything sits under the same root of Subversion/Git/the VCS of your choice.
Why is that a problem? Certainly this shouldn't be an issue for unit tests: as I understand TDD, these should not require complex environments. For automated tests, you would have to test the core API (as this is the interface between core and everything else, right?), hence this shouldn't require any frontend stuff.
You need to explain your other points to say why you don't like the current setup.
[1] It is against the Geneva Convention to ask a developer to use anything other than the IDE of his/her choice.