I'm migrating a Maven 1 project to Maven 3. The job is almost done, with one task missing: I need to get all dependency names from the POM file and write them to a configuration file as one string. In Maven 1 the job is done in maven.xml as shown below; see the last few lines, where it writes the names to a file called wrapper.conf.
How can I achieve this with Maven 3? Is there a Maven plugin I can use for this, or do I need to use an Ant script within my pom.xml?
<goal name="service">
  <mkdir dir="${maven.build.dir}/grid" />
  <ant:copy todir="${maven.build.dir}/grid">
    <fileset dir="resources/javaservicewrapper" />
  </ant:copy>
  <j:forEach var="lib" items="${pom.artifacts}">
    <j:set var="dep" value="${lib.dependency}"/>
    <j:if test="${dep.getProperty('service.bundle')=='true'}">
      <ant:copy failonerror="true" todir="${maven.build.dir}/grid/lib">
        <fileset dir="${maven.repo.local}/${dep.groupId}/jars">
          <include name="${dep.artifactId}-${dep.version}.${dep.type}"/>
          <j:set var="SERVCP" value="../lib/${dep.artifactId}-${dep.version}.${dep.type}:${SERVCP}" />
        </fileset>
      </ant:copy>
    </j:if>
  </j:forEach>
  <attainGoal name="jar" />
  <ant:copy file="target/${maven.final.name}.jar" tofile="${maven.build.dir}/grid/lib/grid.jar" />
  <j:set var="SERVCP" value="${SERVCP}../lib/gridcache.jar" />
  <ant:copy todir="${maven.build.dir}/gridcache/conf" file="resources/javaservicewrapper/conf/wrapper.conf" overwrite="true">
    <filterset begintoken="#" endtoken="#">
      <filter token="service.classpath" value="${SERVCP}"/>
    </filterset>
  </ant:copy>
</goal>
EDIT: The solution using build-classpath worked well, but I had other problems specific to the Java Service Wrapper. The best solution I found was to generate the whole script/config file with the appassembler-maven-plugin and let the maven-assembly-plugin copy it to the conf folder.
If you need to create a JSW configuration (wrapper.conf), the simplest solution would be to use the appassembler-maven-plugin, which can generate such files.
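For example, a minimal sketch of such a plugin configuration using its generate-daemons goal; the version, daemon id, and main class below are placeholders, so check the plugin documentation for the exact parameters:
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>appassembler-maven-plugin</artifactId>
  <version>1.10</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>generate-daemons</goal>
      </goals>
      <configuration>
        <daemons>
          <daemon>
            <!-- placeholder id and main class -->
            <id>grid</id>
            <mainClass>com.example.grid.Main</mainClass>
            <platforms>
              <platform>jsw</platform>
            </platforms>
          </daemon>
        </daemons>
      </configuration>
    </execution>
  </executions>
</plugin>
This generates the JSW wrapper.conf and start scripts under the build directory, with the dependency classpath already filled in; you can then copy them to your conf folder (e.g. with the maven-assembly-plugin, as the edit above mentions).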
Have a look at the build-classpath goal of the Maven Dependency Plugin. You can quickly check the result on the command line:
mvn dependency:build-classpath
You can change the path to the dependency files using the prefix (mdep.prefix) property:
mvn -Dmdep.prefix=myLibFolder dependency:build-classpath
You will find many more configuration parameters in the documentation, e.g. the outputFile parameter ;-)
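For instance, a rough sketch of binding the goal into the build so that the classpath string is written to a file; the phase, prefix, and output path below are just examples:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-dependency-plugin</artifactId>
  <executions>
    <execution>
      <id>write-classpath</id>
      <phase>generate-resources</phase>
      <goals>
        <goal>build-classpath</goal>
      </goals>
      <configuration>
        <!-- example values: prefix each entry with ../lib and separate entries with ':' -->
        <prefix>../lib</prefix>
        <pathSeparator>:</pathSeparator>
        <outputFile>${project.build.directory}/grid/conf/classpath.txt</outputFile>
      </configuration>
    </execution>
  </executions>
</plugin>
You can then filter that file into wrapper.conf instead of assembling the string by hand as the old maven.xml did.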
Related
I've built a Swing GUI in Eclipse that is supposed to run a bunch of previously developed code, part of which involves running Ant to build the code. When I run the Ant script outside of the GUI (in the original project), Ant executes correctly and builds the project. However, when I try to run Ant programmatically, it throws errors that look like the project doesn't have the necessary .jars. The code, the top of build.xml, and the errors are listed below.
Code Block
File buildFile = new File("lib\\ePlay\\build.xml");
Project p = new Project();
p.setUserProperty("ant.file", buildFile.getAbsolutePath());
DefaultLogger consoleLogger = new DefaultLogger();
consoleLogger.setErrorPrintStream(System.err);
consoleLogger.setOutputPrintStream(System.out);
consoleLogger.setMessageOutputLevel(Project.MSG_INFO);
p.addBuildListener(consoleLogger);
try {
    p.fireBuildStarted();
    p.init();
    ProjectHelper helper = ProjectHelper.getProjectHelper();
    p.addReference("ant.projectHelper", helper);
    helper.parse(p, buildFile);
    p.executeTarget(p.getDefaultTarget());
    p.fireBuildFinished(null);
} catch (BuildException e) {
    p.fireBuildFinished(e);
}
Build.xml
<?xml version="1.0" encoding="UTF-8"?>
<project name="EPlay" xmlns:ivy="antlib:org.apache.ivy.ant" default="resolve">
  <dirname file="${ant.file}" property="ant.dir" />
  <property name="scm.user" value="scmrel"/>
  <property name="scm.user.key" value="/.ssh/scmrel/id_rsa"/>
  <property name="ivy.jar" value="ivy-2.0.0.jar" />
  <property name="ivy.settings.dir" value="${ant.dir}/ivysettings" />
  <property name="VERSION" value="LATEST" />
  <property name="tasks.dir" value="${ant.dir}/.tasks" />
  <property name="deploy.dir" value="${ant.dir}/deploy" />
  ...
  <!-- retrieve the dependencies using Ivy -->
  <target name="resolve" depends="_loadantcontrib,_getivy" description=" retrieve the dependencies with Ivy">
    <ivy:settings file="${ivy.settings.dir}/ivysettings.xml" />
    <ivy:resolve file="${ant.dir}/ivy.xml" transitive="false" />
    <ivy:retrieve pattern="${deploy.dir}/[conf]/[artifact].[ext]"/>
  </target>
And the error
resolve:
BUILD FAILED
H:\eclipse\CLDeploySwing\lib\ePlay\build.xml:66: Problem: failed to create task or type antlib:org.apache.ivy.ant:settings
Cause: The name is undefined.
Action: Check the spelling.
Action: Check that any custom tasks/types have been declared.
Action: Check that any <presetdef>/<macrodef> declarations have taken place.
No types or tasks have been defined in this namespace yet
This appears to be an antlib declaration.
Action: Check that the implementing library exists in one of:
-ANT_HOME\lib
-the IDE Ant configuration dialogs
Total time: 0 seconds
I've looked through my ant installation and it appears everything is there. Like I said, the original project builds successfully if build.xml is run outside of this application.
I would suggest that this is caused by your Java program not having the same classpath where it is running as the normal Ant build does, and thus the ANT_HOME isn't the right one.
You can make sure that this is correct by passing the right environment variables into the JVM, or simply by calling System.getenv("ANT_HOME") to see where your ANT_HOME actually resides.
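If it helps, another quick check is to echo what the build itself sees and run it once from the command line and once from the GUI; this is just a diagnostic sketch using standard Ant/JVM properties:
<property environment="env"/>
<echo message="ANT_HOME (environment): ${env.ANT_HOME}"/>
<echo message="JVM classpath: ${java.class.path}"/>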
I think ${basedir} for your project is not properly calculated.
Add this line to your build.xml
<echo message="basedir='${basedir}'"/>
Looking at this line
File buildFile = new File("lib\\ePlay\\build.xml");
It may be that it's 2 levels up (I know the documentation says that it should be a parent directory of build.xml, but you are not running from the command line).
Rather than loading the Ivy library using the newer method of declaring the namespace on the project element, go old school on it:
<path id="ivy.lib.path">
<fileset dir="path/to/dir/with/ivy/jar" includes="*.jar"/>
</path>
<taskdef resource="org/apache/ivy/ant/antlib.xml"
uri="antlib:org.apache.ivy.ant" classpathref="ivy.lib.path"/>
I am doing something in Gradle that requires this
ant.taskdef(name: 'ivy-retrieve', classname: 'org.apache.ivy.ant.IvyRetrieve', classpath: '...path to ivy jar.../ivy-2.2.0.jar')
which in ant would be something like
<taskdef name="ivy-retrieve" classname="org.apache.ivy.ant.IvyRetrieve"/>
I know it's clunkier and simply not as nice as including the namespace declaration, but it does remove some of the confusion regarding which libraries on which classpath are being loaded.
I have this in my build.xml:
<target depends="build-jar" name="proguard">
<taskdef resource="proguard/ant/task.properties" classpath="tools/proguard4.6/lib/proguard.jar" />
<proguard configuration="ant/proguard.conf" />
</target>
It works fine.
Inside the configuration file (i.e., "ant/proguard.conf") I'm trying to access properties defined in this build.xml file, but I'm always getting this kind of error:
Value of system property 'jar.final_name' is undefined in '<jar.final_name>' in line 1 of file '.......\ant\proguard.conf'
The error is clear. The question is: how do I do what I'm trying to do?
If I did it the "Embedded ProGuard configuration options" way, I could use these properties like any other property in build.xml, but I'm trying to keep the files separate.
How do I do that then?
By default, Ant doesn't provide a way to set Java system properties for its tasks. You can only specify -D options in the ANT_OPTS environment variable when starting Ant itself.
I'll consider supporting the use of Ant properties in referenced ProGuard configurations (being the developer of ProGuard).
For the time being, an acceptable solution might be to specify input and output jars in Ant's XML-style:
<proguard configuration="ant/proguard.conf">
  <injar name="${injar}" />
  <outjar name="${outjar}" />
  <libraryjar name="${java.home}/lib/rt.jar" />
</proguard>
This part of the configuration is more closely tied to the Ant script anyway.
I need to filter java files before compilation, leaving the original sources unchanged and compiling from filtered ones (basically, I need to set build date and such).
I'm using NetBeans with its great Ant build-files.
So, one day I discovered the need to pre-process my source files before compilation, and ran into a big problem. No, I did not run to SO at once, I did some research, but failed. So, here comes my sad story...
I found the "filter" option of the "copy" task, overrode the macrodef "j2seproject3:javac" in the build-impl.xml file, and added the filter in the middle of it. I got the desired result, yes, but now my tests are not working, since they use that macrodef too.
Next, I tried overriding the "-do-compile" target, copying and filtering files to the directory build/temp-src, and passing the new source directory as an argument to "j2seproject3:javac":
<target depends="init,deps-jar,-pre-pre-compile,-pre-compile, -copy-persistence-xml,
        -compile-depend,-prepare-sources"
        if="have.sources" name="-do-compile">
  <j2seproject3:javac gensrcdir="${build.generated.sources.dir}" srcdir="build/temp-src"/>
  <copy todir="${build.classes.dir}">
    <fileset dir="${src.dir}" excludes="${build.classes.excludes},${excludes}" includes="${includes}"/>
  </copy>
</target>
And now Ant tells me that the macrodef in question does not exist:
The prefix "j2seproject3" for element "j2seproject3:javac" is not bound.
That's strange, since build-impl.xml contains that macrodef, and build-impl.xml is imported into the main build file.
And, by the way, I cannot edit build-impl.xml directly, since NetBeans rewrites it on every other build.
So, my question is: how can I automatically filter sources before compiling in NetBeans without breaking the build process?
Looking at the default build.xml, it contains a comment that reads (in part):
There exist several targets which are by default empty and which can be
used for execution of your tasks. These targets are usually executed
before and after some main targets. They are:
-pre-init: called before initialization of project properties
-post-init: called after initialization of project properties
-pre-compile: called before javac compilation
-post-compile: called after javac compilation
-pre-compile-single: called before javac compilation of single file
-post-compile-single: called after javac compilation of single file
-pre-compile-test: called before javac compilation of JUnit tests
-post-compile-test: called after javac compilation of JUnit tests
-pre-compile-test-single: called before javac compilation of single JUnit test
-post-compile-test-single: called after javac compilation of single JUnit test
-pre-jar: called before JAR building
-post-jar: called after JAR building
-post-clean: called after cleaning build products
So, to inject some pre-compile processing, you would provide a definition for -pre-compile.
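For example, a minimal sketch of such a hook in build.xml; the temp-src location and the build-time token are only illustrations:
<target name="-pre-compile">
  <tstamp>
    <format property="build.time" pattern="yyyy-MM-dd HH:mm:ss"/>
  </tstamp>
  <filter token="build-time" value="${build.time}"/>
  <!-- copy filtered sources to a temporary directory, leaving src untouched -->
  <copy todir="build/temp-src" filtering="true" overwrite="true">
    <fileset dir="${src.dir}" includes="**/*.java"/>
  </copy>
</target>
Note that -pre-compile alone does not change where javac reads its sources from, which is why the solution posted below also overrides -do-compile to point srcdir at the filtered copy.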
FWIW, the error you got is because the j2seprojectX prefix is declared on the project tag of build-impl.xml, and the code you added in build.xml is outside the scope of that declaration.
Since I found an answer, and because it seems that nobody knows the answer, I'll post my solution.
build.xml:
<?xml version="1.0" encoding="UTF-8"?>
<!-- Needed to add xmlns:j2seproject3 attribute, to be able to reference build-impl.xml macrodefs -->
<project name="Parrot" default="default" basedir="." xmlns:j2seproject3="http://www.netbeans.org/ns/j2se-project/3">
  <import file="nbproject/build-impl.xml"/>

  <target depends="init,deps-jar,-pre-pre-compile,-pre-compile, -copy-persistence-xml, -compile-depend,-prepare-sources" if="have.sources" name="-do-compile">
    <j2seproject3:javac gensrcdir="${build.generated.sources.dir}" srcdir="build/temp-src"/>
    <copy todir="${build.classes.dir}">
      <fileset dir="${src.dir}" excludes="${build.classes.excludes},${excludes}" includes="${includes}"/>
    </copy>
  </target>

  <!-- target to alter sources before compilation, you can add any preprocessing actions here -->
  <target name="-filter-sources" description="Filters sources to temp-src, setting the build date">
    <delete dir="build/temp-src" />
    <mkdir dir="build/temp-src" />
    <tstamp>
      <format property="build.time" pattern="yyyy-MM-dd HH:mm:ss"/>
    </tstamp>
    <filter token="build-time" value="${build.time}" />
    <copy todir="build/temp-src" filtering="true">
      <fileset dir="src">
        <filename name="**/*.java" />
      </fileset>
    </copy>
  </target>
</project>
I have a legacy product, spread across dozens of repositories. I'm currently trying to refactor (and understand...) the given build process. First step was moving from the old version control system to mercurial, which was encouraging and easy.
The build process mainly uses Ant build scripts (good news) but has to run 'on' the repository file structure (bad news...) because the Ant scripts take files from all repositories and leave artifacts in... let's say, several places... But that's not a blocker.
Before I trigger the build (now with Hudson CI), I have to make sure that all repositories are updated to the tip or a selected tag. I could write a script/batch file or write a custom Ant task (again), but I don't want to ignore existing features (again):
Is it possible to update a set of Mercurial repositories by using existing Ant tasks or Mercurial features? All repositories are located in one folder (like /repo) and have a common prefix (like SYSTEMNAME_module), so it's easy to locate them/create a fileset.
The simplest solution today is to start the hg executable from Ant using the exec task (note: macrodef is your friend). Just create a "main" build file in one of the projects, change to the parent directory (where you can access all the local copies), and then update each one.
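A minimal sketch of that exec/macrodef approach (the repository paths and target name here are only placeholders):
<macrodef name="hg-update">
  <attribute name="repo"/>
  <sequential>
    <!-- pull and then update the working copy of one repository -->
    <exec executable="hg" dir="@{repo}" failonerror="true">
      <arg value="pull"/>
    </exec>
    <exec executable="hg" dir="@{repo}" failonerror="true">
      <arg value="update"/>
    </exec>
  </sequential>
</macrodef>

<target name="update-all">
  <hg-update repo="/repos/SYSTEMNAME_module1"/>
  <hg-update repo="/repos/SYSTEMNAME_module2"/>
</target>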
I found a different ant based solution, based on the 'apply' task:
<apply executable="hg">
<arg value="update" />
<arg value="--verbose" />
<arg value="--revision" />
<dirset dir="/repos" includes="REPO_SUFFIX*" />
</apply>
Thanks to Aaron; his very helpful answer pointed me in the right direction.
I think this would be a more flexible way to update a Mercurial repo. From a set of directories it chooses which ones are repositories and updates those. Note that you can also do this with the foreach task of ant-contrib, but 'for' seems to consume a lot less memory.
<!-- See if the directory looks like a mercurial repo; if so, update it -->
<target name="-mercurial-update">
<dirset dir="${dir.deps}" id="lib.dirs">
<exclude name="build"/>
<present targetdir="${dir.deps}">
<mapper type="glob" from="*" to="*/.hg" />
</present>
</dirset>
<for param="file">
<path>
<dirset refid="lib.dirs"/>
</path>
<sequential>
<echo>Trying to update mercurial repo @{file}</echo>
<exec executable="hg" dir="@{file}">
<arg line="pull" />
</exec>
<exec executable="hg" dir="@{file}">
<arg line="up" />
</exec>
</sequential>
</for>
</target>
We have an Apache ANT script to build our application, then check in the resulting JAR file into version control (VSS in this case). However, now we have a change that requires us to build 2 JAR files for this project, then check both into VSS.
The current target that checks the original JAR file into VSS discovers the name of the JAR file through some property. Is there an easy way to "generalize" this target so that I can reuse it to check in a JAR file with any name? In a normal language this would obviously call for a function parameter, but to my knowledge there really isn't an equivalent concept in Ant.
I would suggest working with macros over subant/antcall, because the main advantage I found with macros is that you're in complete control over the properties that are passed to the macro (especially if you want to add new properties).
You simply refactor your Ant script starting with your target:
<target name="vss.check">
<vssadd localpath="D:\build\build.00012.zip"
comment="Added by automatic build"/>
</target>
creating a macro (notice the copy/paste, with the path replaced by @{file}):
<macrodef name="private-vssadd">
<attribute name="file"/>
<sequential>
<vssadd localpath="@{file}"
comment="Added by automatic build"/>
</sequential>
</macrodef>
and invoke the macros with your files:
<target name="vss.check">
<private-vssadd file="D:\build\File1.zip"/>
<private-vssadd file="D:\build\File2.zip"/>
</target>
Refactoring, "the Ant way"
It is generally considered a bad idea to version-control your binaries, and I do not recommend doing so. But if you absolutely have to, you can use antcall combined with param to pass parameters and call a target.
<antcall target="reusable">
<param name="some.variable" value="var1"/>
</antcall>
<target name="reusable">
<!-- Do something with ${some.variable} -->
</target>
You can find more information about the antcall task here.
Take a look at Ant macros. They allow you to define reusable "routines" for Ant builds. You can find an example here (item 15).
Also check out the subant task, which lets you call the same target on multiple build files:
<project name="subant" default="subant1">
<property name="build.dir" value="subant.build"/>
<target name="subant1">
<subant target="">
<property name="build.dir" value="subant1.build"/>
<property name="not.overloaded" value="not.overloaded"/>
<fileset dir="." includes="*/build.xml"/>
</subant>
</target>
</project>
You can use Gant to script your build with Groovy to do what you want, or have a look at the Groovy Ant task.