Application client using @EJB annotation and Maven on GlassFish - java

There is an example on the NetBeans site showing how to create an Application Client using simple (non-Maven) projects. Four projects are needed (EJB, EAR, Lib, Program). This tutorial is simple and works perfectly.
I want to ask how to do the same with Maven. I can't manage to get all the dependencies right, so when I try to call an EJB method it gives me a NullPointerException. Can anyone tell me the key steps (preferably using NetBeans) that need to be done? I am confused about how many projects need to be created. I know that I need the Application, EAR and EJB projects - and that's it? What special configuration needs to be written in these projects' pom.xml files?
EDIT1:
I don't want to use explicit JNDI lookups; I want to be able to use @EJB annotations.

Here are the steps:
Create the Java class library that holds the interface class, using the Maven folder of the New Project menu (choose Java Application under the Maven folder).
Create the Enterprise Application following the NB tutorial. The only difference is that you have to use the Maven folder of the New Project menu.
Build the class library.
Ensure that the class library is a dependency of the Enterprise Application.
Run the Enterprise Application. NB will deploy it to the GF server.
Create the Application Client using the Maven folder. Don't use NB's Insert Code feature to inject the Stateless EJB here, because it crashes (at least in my version, NB 7.2). Instead, simply copy and paste the code shown in the tutorial (a minimal sketch of the interface and client code is shown after these steps). You don't need any deployment / ejb descriptor.
Modify the application client's POM to use the maven-assembly-plugin so that you obtain a jar with dependencies. If you don't do this step, the deployment will fail because GF is not able to load the interface class. Add the following lines to the plugins section (change the main class as appropriate):
<plugin>
  <artifactId>maven-assembly-plugin</artifactId>
  <version>2.4</version>
  <configuration>
    <descriptorRefs>
      <descriptorRef>jar-with-dependencies</descriptorRef>
    </descriptorRefs>
    <archive>
      <manifest>
        <mainClass>com.entapp.entappclient.Main</mainClass>
      </manifest>
    </archive>
  </configuration>
  <executions>
    <execution>
      <id>make-assembly</id>
      <phase>package</phase>
      <goals>
        <goal>single</goal>
      </goals>
    </execution>
  </executions>
</plugin>
Build the application client project with NB.
Run the application client using GF's application client command: appclient -jar EntAppClient-1.0-SNAPSHOT-jar-with-dependencies.jar
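Since the tutorial code itself is not reproduced in these steps, here is a minimal sketch of the three pieces involved: the remote interface in the shared class library, the stateless bean in the EJB module, and the application client Main that gets the bean injected via @EJB. Apart from the Main class named in the POM above, the package and class names are hypothetical, not taken from the tutorial.

// Remote business interface - placed in the shared class library
// (hypothetical names; adapt packages/classes to your own project)
package com.entapp.lib;

import javax.ejb.Remote;

@Remote
public interface GreeterRemote {
    String greet(String name);
}

// Stateless session bean - placed in the EJB module of the Enterprise Application
package com.entapp.ejb;

import javax.ejb.Stateless;
import com.entapp.lib.GreeterRemote;

@Stateless
public class GreeterBean implements GreeterRemote {
    @Override
    public String greet(String name) {
        return "Hello, " + name;
    }
}

// Application client Main - the @EJB field is injected by the
// application client container when you run it with the appclient command
package com.entapp.entappclient;

import javax.ejb.EJB;
import com.entapp.lib.GreeterRemote;

public class Main {

    @EJB
    private static GreeterRemote greeter; // must be static in an application client

    public static void main(String[] args) {
        System.out.println(greeter.greet("world"));
    }
}

Note that in an application client the injection target must be a static field of the main class; no JNDI lookup code is needed.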
Useful link: Java EE's Buried Treasure: the Application Client Container by Jason Lee
Important Note
In order to deploy the client to other JVMs you have to install the appclient on each client machine and set the target-server property. The appclient seems to have a very complicated structure, which you cannot reproduce simply by adding these lines (plus the EclipseLink persistence artifacts):
<dependency>
  <groupId>org.glassfish.appclient</groupId>
  <artifactId>gf-client</artifactId>
  <version>3.1.1</version>
  <type>pom</type>
  <scope>compile</scope>
</dependency>
Adding these artifacts to the client makes it compile perfectly, but the jar doesn't work. This is understandable, since the file sun-acc.xml is missing (this file is necessary because it contains the target-server property). Therefore I think the only way is to use the package-appclient script as per the linked documentation.

There is a useful EJB FAQ entry that describes how to use @EJB to access a remote EJB by declaring an ejb-ref together with sun-web.xml (now glassfish-web.xml); see the following link:
What if I have multiple instances of the Appserver running and I want to access a Remote EJB component between them?
If you would like to compare ejb-ref and ejb-local-ref, you can find further information at What is the relationship between @EJB and ejb-ref/ejb-local-ref?
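As an illustration of that approach, here is a minimal sketch of the consuming side (all names below are hypothetical): the reference name used in @EJB matches the <ejb-ref-name> declared in web.xml, and sun-web.xml / glassfish-web.xml maps that name to the JNDI name of the remote bean on the other server instance, as the FAQ describes.

// Hypothetical servlet in the web module; the @EJB name is the logical
// ejb-ref-name that the deployment descriptors map to the remote bean's JNDI name.
package com.entapp.web;

import java.io.IOException;
import javax.ejb.EJB;
import javax.servlet.ServletException;
import javax.servlet.annotation.WebServlet;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import com.entapp.lib.GreeterRemote;

@WebServlet("/greet")
public class GreeterServlet extends HttpServlet {

    @EJB(name = "ejb/GreeterRemote")   // matches the <ejb-ref-name> in web.xml
    private GreeterRemote greeter;

    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        resp.getWriter().println(greeter.greet("world"));
    }
}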
I hope this may help.

Related

Using Kotlin classes from jar built with kotlin-maven-plugin

I've built a Java/Kotlin hybrid lib using kotlin-maven-plugin (v 1.2.21). The lib is actually a Spring Boot app (v 1.5.9).
Now, when I declare it as a dependency in another project I face strange issues:
- My IDE (IntelliJ) is able to find the classes I need: auto-completion suggests the classes, and I see no errors.
- But when I try to build my application, I get compilation errors saying the packages (and therefore the classes) I am trying to import don't exist.
When I expand the jar in the IDE, I can see that the classes exist in the expected package, under a BOOT-INF/classes directory.
So I am a bit confused... is there a specific config to put in kotlin-maven-plugin so that the generated jar can be used by others?
Thanks
OK, so actually, it had nothing to do with Kotlin, but everything to do with Spring Boot...
Because of the change of structure in Spring Boot after version 1.4.0, as explained here, it's now required to add this config if you want to be able to import the jar as a dependency:
<plugin>
  <groupId>org.springframework.boot</groupId>
  <artifactId>spring-boot-maven-plugin</artifactId>
  <configuration>
    <classifier>exec</classifier>
  </configuration>
</plugin>

Maven for static web project

I'm trying to add a wst.web facet to a multi-module Maven project in Eclipse. This wst.web facet is the "Static Web Project" facet.
You might ask "why is he using Maven for this?"
The point is that the whole project is a tree. I have a parent pom at the root of the project, it has some modules, and they have their own modules too - three levels at most, for the purpose of centralising configuration and inheriting plugins and dependencies.
One of these modules contains two static web projects, which are the UI part of other, dynamic modules. These two are meant to be published to Apache (to an HTTP Server within Eclipse). All of this already works with the assembly plugin, which moves the files to target and then packs them into a tar.gz... but without the "Static Web Project" facet Eclipse does not allow me to publish it to the HTTP Server. I can do it by hand, but that's not the point.
My current config to do this is:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-eclipse-plugin</artifactId>
  <version>2.9</version>
  <configuration>
    <additionalProjectFacets>
      <wst.web>1.0</wst.web>
      <wst.jsdt.web>1.0</wst.jsdt.web>
    </additionalProjectFacets>
  </configuration>
</plugin>
This adds the JavaScript 1.0 facet, but not the Static Web Project facet that I need.
Thanks a lot for all your help.
PS: Another reason to use Maven for this is that the produced tar.gz is packed into a deb package, which configures the virtual host for Apache, with all the AJP proxying needed to work with its backend.

Eclipse + Maven + Tomcat: testing web apps when the WAR is built with custom options

I am using Eclipse (Helios), with the "m2eclipse" plugin. I am working on a Maven-based web application project, which I test on a local Tomcat server that is setup inside Eclipse.
Generally, this works more or less great. "m2eclipse" can sometimes be flaky... but for the most part it keeps my POM and my Eclipse project settings in sync, and likewise keeps the deployed code current in Tomcat.
However, recently I've added a wrinkle. I have one JavaScript include file that needs to be different when going from the test environment to the real production environment. The differences are too significant to be cleanly handled by Maven filtering and token substitution. What I needed was to keep two separate files in my project, and only deploy the one appropriate for the build profile.
I accomplished this with some "targetPath" trickery in my Maven POM:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-war-plugin</artifactId>
  <version>2.1.1</version>
  <configuration>
    <webResources>
      <resource>
        <!-- take the stuff from a "prod" or "non-prod" subdirectory and copy it one level up -->
        <directory>src/main/webapp/js/${profile.name}</directory>
        <targetPath>js</targetPath>
      </resource>
      <resource>
        <!-- now don't deploy those profile-specific subdirectories -->
        <directory>src/main/webapp/js</directory>
        <excludes>
          <exclude>prod</exclude>
          <exclude>non-prod</exclude>
        </excludes>
      </resource>
    </webResources>
  </configuration>
</plugin>
This builds a perfect WAR file, which works fine when I deploy it to some external Tomcat server. However, the problem is that Eclipse does NOT use that WAR file when running Tomcat inside of Eclipse. Instead, Eclipse works with an exploded version of the application, deployed to a cryptic directory like this:
<workspace>/.metadata/.plugins/org.eclipse.wst.server.core/tmp1/wtpwebapps/MyApp
... and apparently the app's files are copied to this location PRIOR TO Maven doing the little trickery shown above. Therefore, when testing locally inside of Eclipse, no JavaScript include is present at all in the expected location.
Is there a way around this problem? Maybe an Eclipse change, or a Maven change, to pull the file from Maven's "target" directory to Eclipse's "wtpwebapps" directory? Maybe there's another approach to solving the profile-specific-include-file issue altogether?
Starting from Eclipse 3.6 (Helios), the "Java EE Module Dependencies" option is replaced by the "Deployment Assembly" option. You can configure which files are deployed to the Tomcat running inside Eclipse in this "Deployment Assembly" option (Project Properties ---> Deployment Assembly).
You could deploy to the local server using the Maven Cargo plugin instead of running from Eclipse directly. This would then use your Maven-generated war.
If the tests you are doing can be automated, this could then be incorporated as part of an integration test suite that would be completely run by maven.

Maven project for both WAR and standalone server/client

I have a POM-based Java project. It contains a number of servlets for deployment in a WAR. However, in addition to this, I also have classes that launch the application as a standalone using embedded servlet and database environments (for a turnkey development environment). Additionally, there is also a command-line client for the application.
I would like to have the ability to build the project into both the WAR and two separate executable JARs (one server, one client). I'm not concerned about the JARs/WAR containing some unnecessary code or deps - I just want all three to work.
What's the "correct" way to do this with Maven?
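For context, the standalone launcher described above typically looks something like this - a minimal sketch assuming an embedded Jetty server (the question only says "embedded servlet and database environments", so the container choice, class names and paths here are assumptions):

// Hypothetical standalone entry point that serves the same webapp the WAR contains.
package com.example.standalone;

import org.eclipse.jetty.server.Server;
import org.eclipse.jetty.webapp.WebAppContext;

public class StandaloneServer {
    public static void main(String[] args) throws Exception {
        Server server = new Server(8080);          // embedded servlet container on port 8080

        WebAppContext webapp = new WebAppContext();
        webapp.setContextPath("/");
        webapp.setResourceBase("src/main/webapp"); // same sources the WAR is built from
        server.setHandler(webapp);

        server.start();                            // start the embedded server and block
        server.join();
    }
}

A class like this is what ends up as the main class of the "server" executable JAR, while the servlets themselves stay shared with the WAR packaging.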
Multiple projects is the way to do this. Put the common code in the first project along with the standalone support. Then make a second with war packaging that depends on the first.
You could use the assembly plugin to do this. The assembly plugin can package a zip or tar.gz archive for you, which is a perfect distribution format for standalone applications. When you configure the assembly plugin you can bind it to the package phase, so the application will be packaged in two formats: war and zip.
<plugins>
  <plugin>
    <artifactId>maven-assembly-plugin</artifactId>
    <version>2.2.1</version>
    <!-- add a <configuration> pointing at your assembly descriptor (or a descriptorRef)
         so the plugin knows which zip/tar.gz layout to build -->
    <executions>
      <execution>
        <id>make-assembly</id>
        <phase>package</phase>
        <goals>
          <goal>assembly</goal>
        </goals>
      </execution>
    </executions>
  </plugin>
</plugins>

Release different configurations with Maven

I'm currently migrating our build process to Maven from Ant. Our application is deployed to many different customers, each with a unique set of dependencies and configuration. I can implement different profiles to model these and build the required wars from them. However, this is a process that happens at build time.
Each release is tagged under SVN as well as uploaded to our internal Nexus repository. I want to be able to take a defined release and reconstruct it based on a profile. Is there a way to do something like this? Is there something other than profiles I should be using?
"declare several executions for the war plugin to produce several artifacts (and install/deploy them)" - this sounds like it might be the way forward. How would I go about doing this?
This goes a bit against a Maven golden rule (the one main artifact per module rule) but can be done. The One artifact with multiple configurations in Maven blog post describes one way to implement this approach:
I decided to put all the environment specific configuration in a special source tree, with the following structure:
+-src/
  +-env/
    +-dev/
    +-test/
    +-prod/
Then I configured the maven-war-plugin to have three different executions (the default plus two extra), one for each environment, producing three different war files: beer-1.0-dev.war, beer-1.0-test.war and beer-1.0-prod.war. Each of these configurations used the standard output files from the project and then copied the content from the corresponding src/env/ directory on to the output files, enabling an override file to be placed in the corresponding src/env/ directory. It also supported copying a full tree structure into the output directory. Thus if you for instance wanted to replace the web.xml in test you simply created the following directory:
src/env/test/WEB-INF/
and placed your test specific web.xml in this directory, and if you wanted to override a db.property file placed in the classpath root directory for the test environment you created the following directory:
src/env/test/WEB-INF/classes
and placed your test specific db.property file in this directory.
I kept the src/main directory configured for the development environment. The reason for this was to be able to use the maven-jetty-plugin without any extra configuration.
Configuration
Below you find the maven-war-plugin configuration that I used to accomplish this:
<plugin>
  <artifactId>maven-war-plugin</artifactId>
  <configuration>
    <classifier>prod</classifier>
    <webappDirectory>${project.build.directory}/${project.build.finalName}-prod</webappDirectory>
    <webResources>
      <resource>
        <directory>src/env/prod</directory>
      </resource>
    </webResources>
  </configuration>
  <executions>
    <execution>
      <id>package-test</id>
      <phase>package</phase>
      <configuration>
        <classifier>test</classifier>
        <webappDirectory>${project.build.directory}/${project.build.finalName}-test</webappDirectory>
        <webResources>
          <resource>
            <directory>src/env/test</directory>
          </resource>
        </webResources>
      </configuration>
      <goals>
        <goal>war</goal>
      </goals>
    </execution>
    <execution>
      <id>package-dev</id>
      <phase>package</phase>
      <configuration>
        <classifier>dev</classifier>
        <webappDirectory>${project.build.directory}/${project.build.finalName}-dev</webappDirectory>
        <webResources>
          <resource>
            <directory>src/env/dev</directory>
          </resource>
        </webResources>
      </configuration>
      <goals>
        <goal>war</goal>
      </goals>
    </execution>
  </executions>
</plugin>
(...) I can define each customer project with profiles but I don't know if there's a way to release them to a repository.
You have several options:
use profiles and run the build several times (create artifacts with a classifier and install/deploy them)
declare several executions for the war plugin to produce several artifacts (and install/deploy them)
use different modules (and maybe war overlays to merge a common part with a specific one)
Or at least a way in Maven to automatically build an artifact with a specified profile from say an SVN tag.
Well, this is doable. But without more details about a particular problem, it's hard to be more precise.
I would take a look at your architecture and see if there is a way to split up your project into multiple projects. One would be the main code base. The other projects would depend on the JAR file produced by the main project and add in their own configuration, dependencies, etc to produce your final artifact.
This would let you version customer specific code independently of each other as well as keeping common code in one place and separate from customer specific stuff.
Have you taken a look at the Maven Assembly plugin?
This plugin allows you to customize how your distribution is assembled - i.e. what format (.tar.gz, .zip, etc), directory structure, etc. I think you should be able to bind several instances of the plugin to the package phase to assemble multiple variations of your output (i.e. the packaging for customer 1, customer2, etc, separately).
The deploy plugin should then automatically handle deploying each of your assembled packages in the target directory to the repository.
I ended up doing something slightly different. We're not storing the releases in our internal repository. Instead we're building using Hudson and a multi-configuration project (one configuration/profile for each customer). This way when a release is made the Hudson job is run to build different wars for all customers. They are then stored on the Hudson server instead of Nexus. Builds for specific versions and customers can also be built at any time from the releases in Nexus. – samblake Mar 16 '11 at 12:32
