How does m2eclipse create a new domain with the maven-glassfish-plugin? - java

The version is GlassFish v3.
I want to try the maven-glassfish-plugin, but I don't know how to create a new domain.

You can use the create-domain command. Either pass parameters on the command line or follow the interactive steps:
$ asadmin create-domain
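For example, options such as the admin port can be passed directly on the command line (the domain name here is just illustrative):
$ asadmin create-domain --adminport 4848 mydomain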
But the Maven GlassFish Plugin (once properly configured) can also create a domain with the following goal:
glassfish:create-domain: Creates a new GlassFish domain. (Creating an existing domain will cause it to be deleted and recreated.)
Here is a configuration sample (inspired by the Fairly Complete Configuration Example):
<plugin>
  <groupId>org.glassfish.maven.plugin</groupId>
  <artifactId>maven-glassfish-plugin</artifactId>
  <version>2.2-SNAPSHOT</version>
  <configuration>
    <glassfishDirectory>${glassfish.home}</glassfishDirectory>
    <user>${domain.username}</user>
    <adminPassword>${domain.password}</adminPassword>
    <!-- <passwordFile>path/to/asadmin/passfile</passwordFile> -->
    <autoCreate>true</autoCreate>
    <debug>true</debug>
    <echo>true</echo>
    <skip>${test.int.skip}</skip>
    <domain>
      <name>${project.artifactId}</name>
      <httpPort>8080</httpPort>
      <adminPort>4848</adminPort>
    </domain>
    <components>
      <component>
        <name>${project.artifactId}</name>
        <artifact>${project.build.directory}/${project.build.finalName}.war</artifact>
      </component>
    </components>
  </configuration>
</plugin>
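With a configuration like this in place, the domain can be created from the command line (or from an equivalent m2eclipse run configuration) with:
$ mvn glassfish:create-domain
Since autoCreate is set to true, the plugin should also create the domain automatically when another goal, such as glassfish:start, needs it.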

Related

Wildfly Maven Plugin - commands seem to have no effect

I am using the WildFly Maven Plugin and it is working, in that it starts the server and runs my web application; however, I am having trouble with my custom configuration, namely:
Setting the root logger and console logger to debug mode
Allowing connections from 0.0.0.0:8080, or anything other than localhost
Here is my setup:
<plugin>
  <groupId>org.wildfly.plugins</groupId>
  <artifactId>wildfly-maven-plugin</artifactId>
  <version>2.0.2.Final</version>
  <executions>
    <execution>
      <phase>install</phase>
      <goals>
        <goal>run</goal>
      </goals>
    </execution>
  </executions>
  <configuration>
    <java-opts>
      <java-opt>-agentlib:jdwp=transport=dt_socket,server=y,suspend=n,address=1044</java-opt>
    </java-opts>
    <commands>
      <!-- **These are the commands that aren't going through** -->
      <command>/subsystem=logging/root-logger=ROOT:write-attribute(name="level", value="DEBUG")</command>
      <command>/subsystem=logging/console-handler=CONSOLE:write-attribute(name="level", value="DEBUG")</command>
      <command>/subsystem=logging/file-handler=debug:add(level=DEBUG,autoflush=true,file={"relative-to"=>"jboss.server.log.dir", "path"=>"debug.log"})</command>
      <command>/subsystem=logging/logger=org.jboss.as:add(level=DEBUG,handlers=[debug])</command>
      <command>/subsystem-----Enable Remote access here?</command>
    </commands>
    <add-user>
      <users>
        <user>
          <username>admin</username>
          <password>admin.1234</password>
        </user>
        <user>
          <username>admin-user</username>
          <password>user.1234</password>
          <groups>
            <group>admin</group>
            <group>user</group>
          </groups>
          <application-user>true</application-user>
        </user>
        <user>
          <username>default-user</username>
          <password>user.1234</password>
          <groups>
            <group>user</group>
          </groups>
          <application-user>true</application-user>
        </user>
      </users>
    </add-user>
  </configuration>
</plugin>
I know that when starting from a terminal one would use ./standalone.sh -b 0.0.0.0 -bmanagement 0.0.0.0, but I am running demos straight from Maven and need to access my webapp from a separate machine.
Note: from within the WildFly management console I can manually set the root logger and console logger to debug mode, and the expected debug logs then flow out.
For example, I could go to http://127.0.0.1:9990/console/index.html#logging-configuration and manually change the logging from the default INFO level to DEBUG.
So my question is how to pass both the remote-access setting and the logging-level change as commands to the Maven WildFly plugin.
You'd need to upgrade the plugin version to 2.1.0.Beta1 to get that to work. The 2.0.x versions do not have the ability to execute CLI commands from the run or deploy goals.
If you need to stick with the version you're using, you'd need to define the execute-commands goal. Then you could use the embedded server to configure the server:
<commands>
  <!-- **These are the commands that aren't going through** -->
  <command>embed-server</command>
  <command>/subsystem=logging/root-logger=ROOT:write-attribute(name="level", value="DEBUG")</command>
  <command>/subsystem=logging/console-handler=CONSOLE:write-attribute(name="level", value="DEBUG")</command>
  <command>/subsystem=logging/file-handler=debug:add(level=DEBUG,autoflush=true,file={"relative-to"=>"jboss.server.log.dir", "path"=>"debug.log"})</command>
  <command>/subsystem=logging/logger=org.jboss.as:add(level=DEBUG,handlers=[debug])</command>
  <command>/subsystem-----Enable Remote access here?</command>
  <command>stop-embedded-server</command>
</commands>
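For context, a minimal sketch of how that commands block might be wired to the execute-commands goal on the 2.0.x plugin; the execution id and phase are illustrative assumptions, not taken from the answer, and depending on the plugin version you may also need to run the goal in offline mode for embed-server to work:
<plugin>
  <groupId>org.wildfly.plugins</groupId>
  <artifactId>wildfly-maven-plugin</artifactId>
  <version>2.0.2.Final</version>
  <executions>
    <execution>
      <!-- hypothetical execution id; runs the CLI commands against an embedded server -->
      <id>configure-logging</id>
      <phase>install</phase>
      <goals>
        <goal>execute-commands</goal>
      </goals>
      <configuration>
        <commands>
          <command>embed-server</command>
          <command>/subsystem=logging/root-logger=ROOT:write-attribute(name="level", value="DEBUG")</command>
          <command>stop-embedded-server</command>
        </commands>
      </configuration>
    </execution>
  </executions>
</plugin>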

How to use a custom delimiter when creating filtered fileSets in a Maven archetype?

I'm trying to create a custom Maven archetype that creates a basic Java handler for AWS Lambda. One of the files in my archetype-resources is a serverless.yml file, as we are looking to deploy this handler using the Serverless Framework. I want this file to be part of a filtered="true" fileSet, because I want to pre-populate certain fields based on the project groupId, artifactId, etc. Here's a sample:
service: cmpy-prefix-${groupId}-${artifactId}-service
# exclude the code coverage files and circle ci files
package:
  exclude:
    - coverage/**
    - .circleci/**
...
provider:
  ...
  environment:
    S3_BUCKET_NAME: ${self:provider.stage}-cmpy-bkt
And I add this file to src/main/resources/META-INF/maven/archetype-metadata.xml as follows:
<fileSet encoding="UTF-8" filtered="true" packaged="false">
  <directory></directory>
  <includes>
    <include>serverless.yml</include>
  </includes>
</fileSet>
Now my problem is that the serverless.yml file contains ${self:provider.stage}, which interferes with Velocity when I run archetype:generate for this archetype, and the error I get is:
org.apache.velocity.exception.ParseErrorException: Encountered ":provider.stage}-cmpy-bkt\...
I tried to set the <delimiters> for the maven-resources-plugin in the pom.xml of my archetype, to no avail. Essentially, I added the following to the POM of the archetype project:
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-resources-plugin</artifactId>
      <version>${org.apache.maven.plugins.maven-resources-plugin.version}</version>
      <configuration>
        <addDefaultExcludes>false</addDefaultExcludes>
        <delimiters>$[*]</delimiters>
      </configuration>
    </plugin>
  </plugins>
</build>
But I still face the same problem when I try to generate a new project using this archetype. The maven archetype plugin seems to be ignoring the delimiter.
Any advice/help on how I can fix this will be immensely appreciated.
Found the solution. I had not realised I could add Velocity directives in my archetype files.
See this other Stack Overflow post for hints: Maven archetype strips comments
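For illustration (this snippet is my own sketch, not from the original post): a Velocity #set directive at the top of the template binds a literal dollar sign to a variable, so the Serverless Framework placeholders survive filtering while ${groupId} and ${artifactId} are still substituted:
#set( $D = '$' )
service: cmpy-prefix-${groupId}-${artifactId}-service
environment:
  S3_BUCKET_NAME: ${D}{self:provider.stage}-cmpy-bkt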

Cannot load UIMA PEAR package through the package installer GUI

I have a problem related to installing a UIMA PEAR package containing an Annotator component. I am using the PearPackagingMavenPlugin for the job with the following setup:
<plugin>
  <groupId>org.apache.uima</groupId>
  <artifactId>PearPackagingMavenPlugin</artifactId>
  <version>2.6.0</version>
  <extensions>true</extensions>
  <executions>
    <execution>
      <phase>package</phase>
      <configuration>
        <!-- PEAR file component classpath settings -->
        <classpath>$main_root/bin</classpath>
        <!-- PEAR file main component descriptor -->
        <mainComponentDesc>desc/S4DocumentUimaAnnotator.xml</mainComponentDesc>
        <!-- PEAR file component ID -->
        <componentId>S4DocumentAnnotator</componentId>
        <!-- PEAR file UIMA datapath settings -->
        <datapath>$main_root/resources</datapath>
      </configuration>
      <goals>
        <goal>package</goal>
      </goals>
    </execution>
  </executions>
</plugin>
I have constructed a special Maven profile that builds the project into a bin directory instead of target, so all my compiled classes are there; that is why I have pointed the classpath setting of the plugin at $main_root/bin.
Finally, when I load the built PEAR package I get the following error:
Verification of S4DocumentAnnotator failed =>
org.apache.uima.resource.ResourceInitializationException: The class com.ontotext.s4.api.components.uima.S4DocumentUimaAnnotator is not a valid Analysis Component. You must specify an Annotator, CAS Consumer, Collection Reader, or CAS Multiplier. If you are calling ResourceManager.setExtensionClassPath, this error can also be caused if you have put UIMA framework jar files on the extension classpath, which is not allowed. (Descriptor: file:/home/ceco/s4_stuff/my_pear/S4DocumentAnnotator/desc/S4DocumentUimaAnnotator.xml)
at org.apache.uima.analysis_engine.impl.PrimitiveAnalysisEngine_impl.initializeAnalysisComponent(PrimitiveAnalysisEngine_impl.java:228)
at org.apache.uima.analysis_engine.impl.PrimitiveAnalysisEngine_impl.initialize(PrimitiveAnalysisEngine_impl.java:170)
at org.apache.uima.impl.AnalysisEngineFactory_impl.produceResource(AnalysisEngineFactory_impl.java:94)
at org.apache.uima.impl.CompositeResourceFactory_impl.produceResource(CompositeResourceFactory_impl.java:62)
at org.apache.uima.UIMAFramework.produceResource(UIMAFramework.java:279)
at org.apache.uima.UIMAFramework.produceResource(UIMAFramework.java:331)
at org.apache.uima.UIMAFramework.produceAnalysisEngine(UIMAFramework.java:448)
at org.apache.uima.pear.tools.InstallationTester.testAnalysisEngine(InstallationTester.java:218)
at org.apache.uima.pear.tools.InstallationTester.doTest(InstallationTester.java:113)
at org.apache.uima.pear.tools.InstallationController.verifyComponentInstallation(InstallationController.java:1110)
at org.apache.uima.pear.tools.InstallationController.verifyComponent(InstallationController.java:1993)
at org.apache.uima.tools.pear.install.InstallPear.installPear(InstallPear.java:389)
at org.apache.uima.tools.pear.install.InstallPear.access$000(InstallPear.java:80)
at org.apache.uima.tools.pear.install.InstallPear$RunInstallation.run(InstallPear.java:109)
at java.lang.Thread.run(Thread.java:744)
I do not understand why the UIMA jars are not supposed to be packaged, when the whole idea of the PEAR package is to be self-contained and not depend on the system it is run on.
This is what I would try:
The class com.ontotext.s4.api.components.uima.S4DocumentUimaAnnotator is not a valid Analysis Component
Check that S4DocumentUimaAnnotator is valid. Unzip the PEAR and check the xml.
If you are calling ResourceManager.setExtensionClassPath, this error can also be caused if you have put UIMA framework jar files on the extension classpath, which is not allowed.
Did you try to print the extension classpath?
Else you could try to use a plain java version of the PEAR, meaning: manually unzip it and create a normal java project with it.
One common issue I ran into several times and which creates exactly this error message is that my PEAR contains uimaj-common.jar in the lib folder. Have you checked this?
I know this one is old, but what I think you should do is set the scope of the UIMA dependencies to provided. The PEAR needs only its own dependencies to run; the environment it will be used in should already supply the UIMA dependencies needed to use all the UIMA features and PEARs.
Provided dependencies won't be copied to the lib folder of the PEAR.
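As a sketch of that idea (the uimaj-core coordinates are the standard ones, but the version here is an assumption chosen to match the plugin version above):
<dependency>
  <groupId>org.apache.uima</groupId>
  <artifactId>uimaj-core</artifactId>
  <version>2.6.0</version>
  <!-- provided: available at compile time, but not copied into the PEAR's lib folder -->
  <scope>provided</scope>
</dependency>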
There are a couple of things that could be going on, but going by your error message, the first thing I would determine is whether or not the uimaj-core jar file is being excluded from the build of the PEAR file. It should be. (Take a look at the error message above.) I just ran into this problem myself, and I got around it by adding this to my POM:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-dependency-plugin</artifactId>
  <executions>
    <!-- Copy the dependencies to the lib folder for the PEAR to copy. -->
    <execution>
      <id>copy-dependencies</id>
      <phase>package</phase>
      <goals>
        <goal>copy-dependencies</goal>
      </goals>
      <configuration>
        <stripVersion>true</stripVersion>
        <outputDirectory>${basedir}/lib</outputDirectory>
        <overWriteReleases>false</overWriteReleases>
        <overWriteSnapshots>true</overWriteSnapshots>
        <includeScope>runtime</includeScope>
        <!-- An exception happens when using a PEAR if the archive includes this jar. -->
        <excludeArtifactIds>uimaj-core</excludeArtifactIds>
      </configuration>
    </execution>
  </executions>
</plugin>
I recently posted a parent POM file and a project-specific POM file to my Gist. You're welcome to take a look at what I'm doing. (One of the things I'm doing is copying over my desc directory, which contains my AE's descriptor XML file, to the root of my project directory, so that the maven-pear plugin can copy it into the PEAR archive properly.)
Parent POM:
https://gist.github.com/software-mariodiana/d46e10fca53dc6e6c0f16e20563476b8
Project-specific POM:
https://gist.github.com/software-mariodiana/e9a0f0f03a49d33dcc32655170fd4841
Good luck!

How can I make Tycho load platform specific fragment into the test runtime for any OS?

I'm using Tycho to build and test some eclipse plugins. I have one bundle that has many platform specific fragments. I also have one test bundle that is using tycho-surefire-plugin to test the original bundle that has the platform specific fragments. However, Tycho is not including the current platform's fragment into the test runtime.
All of the platform specific fragments look like the win64 fragment manifest listed below. (There are actually six fragments in total, one for each platform combination I need to support.)
Manifest-Version: 1.0
Bundle-ManifestVersion: 2
Bundle-Name: Liferay AUI Upgrade Tool Win64
Bundle-SymbolicName: com.liferay.laut.win32.win32.x86_64;singleton:=true
Bundle-Version: 1.0.2.qualifier
Bundle-RequiredExecutionEnvironment: JavaSE-1.6
Fragment-Host: com.liferay.ide.alloy.core
Eclipse-BundleShape: dir
Eclipse-PlatformFilter: (& (osgi.ws=win32)(osgi.os=win32)(osgi.arch=x86_64))
Bundle-Vendor: Liferay, Inc.
Example <build> section of the win64 fragment's pom.xml:
<build>
  <plugins>
    <plugin>
      <groupId>org.eclipse.tycho</groupId>
      <artifactId>target-platform-configuration</artifactId>
      <configuration>
        <environments>
          <environment>
            <os>win32</os>
            <ws>win32</ws>
            <arch>x86_64</arch>
          </environment>
        </environments>
      </configuration>
    </plugin>
  </plugins>
</build>
When I try to execute my Tycho build and it runs the surefire test plugin (no matter which OS I try), the correct platform fragment is not added into the runtime.
I've seen various posts on stackoverflow about similar questions but in those cases the fragments loaded into the test runtime were not platform-specific fragments with OS filters.
This is a good question - but if you know the right trick, the solution is fortunately not complicated: Simply configure Tycho to include the feature which contains all fragments into the test runtime.
Create a feature in an eclipse-feature module which includes all the native fragments. Make sure that the platform filter for each plug-in is correct: on the Plug-ins tab of the feature.xml editor, you need to select the correct os/ws/arch that each fragment applies to. This is some manual effort, but you can typically re-use this feature to include your fragments in a p2 repository/update site.
Then include this feature in the test runtime with the following POM configuration:
<plugin>
  <groupId>org.eclipse.tycho</groupId>
  <artifactId>target-platform-configuration</artifactId>
  <version>${tycho-version}</version>
  <configuration>
    <dependency-resolution>
      <extraRequirements>
        <requirement>
          <type>eclipse-feature</type>
          <id>fragment-containing-feature</id>
          <versionRange>0.0.0</versionRange>
        </requirement>
      </extraRequirements>
    </dependency-resolution>
  </configuration>
</plugin>
A potential pitfall is the <environments> configuration of the eclipse-feature module: you don't need anything special for that module; just have it inherit the <environments> configuration from the parent POM. Note that the parent POM should configure all the environments your build supports, and only the fragment modules need to override the global configuration.
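For illustration, a sketch of such a parent-POM configuration (the os/ws/arch combinations below are examples; list whatever platforms your build actually supports):
<plugin>
  <groupId>org.eclipse.tycho</groupId>
  <artifactId>target-platform-configuration</artifactId>
  <version>${tycho-version}</version>
  <configuration>
    <environments>
      <environment>
        <os>win32</os>
        <ws>win32</ws>
        <arch>x86_64</arch>
      </environment>
      <environment>
        <os>linux</os>
        <ws>gtk</ws>
        <arch>x86_64</arch>
      </environment>
      <environment>
        <os>macosx</os>
        <ws>cocoa</ws>
        <arch>x86_64</arch>
      </environment>
    </environments>
  </configuration>
</plugin>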

JAXB XJC Possible to suppress comment creation in generated classes?

Our project uses XJC to generate Java classes from an XSD. I'm using Java EE 6.
When all the XSDs we have are re-generated, the generated classes include this comment at the top of the file:
// Generated on: 2011.02.23 at 02:17:06 PM GMT
Is it possible to suppress this comment? The reason is that we use SVN for version control, and every time we regenerate our classes, every single file shows as being changed in SVN, even though the only thing that differs is this comment. So I'd like to remove the comment altogether if possible.
There is a -no-header directive, but I don't want to remove the entire header, so that future generations know that it's a file generated from a tool, and that modifications will be overwritten. I only want to remove the timestamp. (Or alternatively, I'd remove the inbuilt header and then insert my own header somehow.)
I am using this Maven plugin which replaces the // Generated on: 2011.02.23 at 02:17:06 PM GMT line:
<plugin>
  <groupId>com.google.code.maven-replacer-plugin</groupId>
  <artifactId>maven-replacer-plugin</artifactId>
  <version>1.3.8</version>
  <executions>
    <execution>
      <phase>prepare-package</phase>
      <goals>
        <goal>replace</goal>
      </goals>
    </execution>
  </executions>
  <configuration>
    <includes>
      <include>src/main/java/jaxb/*.java</include>
    </includes>
    <token>^// Generated on.*$</token>
    <value>// Generated on: [TEXT REMOVED by maven-replacer-plugin]</value>
    <regexFlags>
      <regexFlag>MULTILINE</regexFlag>
    </regexFlags>
  </configuration>
</plugin>
I'm late to the party, but since version 2.0 of the jaxb2-maven-plugin, there's a noGeneratedHeaderComments configuration option. (see the JAXB-2 Maven Plugin Docs)
You can use it like this:
...
<plugins>
  <plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>jaxb2-maven-plugin</artifactId>
    <version>2.3.1</version>
    <executions>
      <execution>
        <id>xjc</id>
        <goals>
          <goal>xjc</goal>
        </goals>
      </execution>
    </executions>
    <configuration>
      <target>2.1</target>
      <sources>
        <source>FirstXSD.xsd</source>
        <source>SecondXSD.xsd</source>
      </sources>
      <xjbSources>
        <xjbSource>OptionalBindings.xjb</xjbSource>
      </xjbSources>
      <noGeneratedHeaderComments>true</noGeneratedHeaderComments>
    </configuration>
    <dependencies>
      <dependency>
        <groupId>org.glassfish.jaxb</groupId>
        <artifactId>jaxb-xjc</artifactId>
        <version>${jaxb.version}</version>
      </dependency>
    </dependencies>
  </plugin>
</plugins>
...
So no need for another plugin or script to run.
If you want to keep a disclaimer, you can use one of the techniques already mentioned to inject it where wanted.
If you use ant, the following snippet may be useful for replacing the comments:
<replaceregexp
    match="^// Generated on:.*$"
    replace="// Generated on: [date removed]"
    byline="true">
  <fileset dir="src">
    <include name="**/*.java"/>
  </fileset>
</replaceregexp>
I know this is 2 years after the fact, but because the classes are generated they aren't necessarily needed in SVN. What needs to be in SVN is the schema or whatever file you use for source to generate the classes. As long as you have the source and the tools to generate the classes, the classes in SVN are redundant and as you saw, problematic in SVN or any SCCS. So put the schema file in SVN and avoid the issue altogether.
If it's not possible using an option, you can post-process the generated files yourself.
For a very specific use case we had to do it that way on our project.
We use Maven and execute a specific script after the Java classes have been generated and before we compile and package them into a distributable JAR.
To build on cata's answer (upvoted), the maven-replacer-plugin is the way to go. I've come up with the following, which strips out the entire comment (not just the timestamp) and lets you replace it with your own file comment (license, etc.):
<plugin>
  <groupId>com.google.code.maven-replacer-plugin</groupId>
  <artifactId>maven-replacer-plugin</artifactId>
  <executions>
    <execution>
      <phase>prepare-package</phase>
      <goals>
        <goal>replace</goal>
      </goals>
    </execution>
  </executions>
  <configuration>
    <!-- assumes your xjc is putting source code here -->
    <includes>
      <include>src/main/java/**/*.java</include>
    </includes>
    <regex>true</regex>
    <regexFlags>
      <regexFlag>MULTILINE</regexFlag>
    </regexFlags>
    <replacements>
      <replacement>
        <token>(^//.*\u000a|^\u000a)*^package</token>
        <value>// your new comment
package</value>
      </replacement>
    </replacements>
  </configuration>
</plugin>
The one gotcha to watch out for is that the <value> element treats the text literally. So if you want a line break in your replacement text you need to put a line break in your pom.xml file (as I've demonstrated above).
What you should do:
Generate your classes under target:
${project.build.directory}/generated-sources
If you add target to your SVN ignore list, that's all.
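A sketch of that setup (the outputDirectory parameter name assumes the org.codehaus.mojo jaxb2-maven-plugin; its default already falls under target/generated-sources):
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>jaxb2-maven-plugin</artifactId>
  <configuration>
    <!-- keep generated code out of src/, so SVN never sees the regenerated headers -->
    <outputDirectory>${project.build.directory}/generated-sources/jaxb</outputDirectory>
  </configuration>
</plugin>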
I also want to have a text header warning that the classes were auto-generated and should not be modified manually, but because I place these files into git I do not want the generation date changing on every build.
That header is generated in the com.sun.tools.xjc.Options#getPrologComment method. Essentially it calls:
return Messages.format(
        Messages.FILE_PROLOG_COMMENT,
        dateFormat.format(new Date()));
Messages.FILE_PROLOG_COMMENT is defined as Driver.FilePrologComment. With further debugging I found that it uses the standard Java localization bundles.
So, to change the header format we can simply provide our own properties that override the values from MessageBundle.properties.
We can do this in two ways:
The first is to copy that file (from the repo by the link above, or from the jar of the version you are using) into src/main/resources/com/sun/tools/xjc/MessageBundle.properties in your project and change the Driver.FilePrologComment key as you wish.
But that first option has drawbacks: you copy-paste a lot of content you never change, and you have to update it whenever you update the XJC dependency. So I recommend instead adding the file as src/main/resources/com/sun/tools/xjc/MessageBundle_en.properties (note the _en suffix in the filename) and putting there only the properties you really want to change. Something like:
# We want the header, but NOT the `Generated on: {0}` part, because these files are committed into git!
Driver.FilePrologComment = \
This file was generated by the JavaTM Architecture for XML Binding(JAXB) Reference Implementation, v2.4.0-b180830.0438 \n\
See https://javaee.github.io/jaxb-v2/ \n\
Any modifications to this file will be lost upon recompilation of the source schema. \n
Ensure that the file is on the compiler classpath, especially if you invoke XJC from plugins.
This is the common mechanism for translation. See this related answer: JAXB english comments in generated file
If you are using the maven-jaxb2-plugin, there is a noFileHeader tag; set it to true. It will prevent JAXB from generating the header that includes the date line:
<noFileHeader>true</noFileHeader>
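For context, a sketch of where that tag sits (the groupId/artifactId below assume the org.jvnet.jaxb2.maven2 variant of the plugin; adjust them to the coordinates you actually use):
<plugin>
  <groupId>org.jvnet.jaxb2.maven2</groupId>
  <artifactId>maven-jaxb2-plugin</artifactId>
  <configuration>
    <!-- suppress the generated file header, including the timestamp line -->
    <noFileHeader>true</noFileHeader>
  </configuration>
</plugin>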
