Build multiple Maven pom files and log to a single file - java

I am working on a Java project. My project has multiple dependent projects, so I created a .bat file to build all the projects one by one. Here is the logic I used to achieve this:
set x=proj1-prx
set y=proj2-prx
set z=proj3-prx
set LIST=(%x% %y% %z% )
echo Checkout and deploy started
for %%G in %LIST% do (
set _module=%%G
set _value=!%%G!
echo Checkout module - %%G
svn checkout %SVNHOST%/%%G/%REPO% %WORKSPACE%\%%G\%REPO% --username %USER% --password %PASSWORD%
echo Install module to AEM - %%G
mvn clean install -DskipTests -f %WORKSPACE%\%%G\%REPO%\pom.xml -l output.log
)
echo Checkout and deploy finished
This file executes well and the log file is created, but each time the for loop builds a project, the build result overwrites the log file. I want the build results of all the projects in one file.

You cannot depend on Maven's logger implementation: the -l option overwrites the log file on every invocation. Use the OS stdout/stderr redirection instead.
At the beginning of your batch file, delete the old log content:
type nul > output.log
Then change the Maven call so it appends to the log and captures both streams:
mvn clean install -DskipTests -f %WORKSPACE%\%%G\%REPO%\pom.xml >> output.log 2>&1
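For context, a minimal sketch of the corrected loop, using only the names and paths from the question:
type nul > output.log
echo Checkout and deploy started
for %%G in %LIST% do (
echo Checkout module - %%G
svn checkout %SVNHOST%/%%G/%REPO% %WORKSPACE%\%%G\%REPO% --username %USER% --password %PASSWORD%
echo Install module to AEM - %%G
REM >> appends, so each build's output accumulates; 2>&1 folds stderr into the same file
mvn clean install -DskipTests -f %WORKSPACE%\%%G\%REPO%\pom.xml >> output.log 2>&1
)
echo Checkout and deploy finished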

Related

Maven jar:jar not gathering dependencies

I am trying to use semantic-release within a GitLab CI pipeline. I have the prepare stage working fine, but the publish stage always fails when I use anything other than mvn jar:jar deploy:deploy; when I use those commands, it deploys a jar that is 3 KB instead of 10 MB, so I can only assume that it is not gathering dependencies. There was a WARNING message about no files being marked for inclusion and the jar being empty, so I tried to package the project before calling deploy. It did not work.
The pipeline fails with no reason as to why; it just shows that line as the culprit.
Commands I have tried:
mvn clean install
mvn clean package deploy
mvn jar:jar deploy:deploy
mvn clean deploy:deploy
... you get the idea.
Here is the prepare section that works:
verifyConditions:
  - "@semantic-release/changelog"
  - "@semantic-release/gitlab"
  - "@semantic-release/git"
verifyRelease:
  - path: "@semantic-release/exec"
    cmd: echo -e "VERSION=${nextRelease.version}\nNEW_RELEASE=true" > RELEASE.env
prepare:
  - path: "@semantic-release/exec"
    cmd: if [ ! -d ".m2" ]; then mkdir .m2; cd .m2; touch settings.xml; echo $MVN_SETTINGS | base64 -d > 'settings.xml'; cd ..; fi; mvn versions:set -DnewVersion=${nextRelease.version} -B -gs .m2/settings.xml;
  - "@semantic-release/changelog"
And here is the publish section that only works with jar:jar deploy:deploy but does not create the correct jar.
publish:
  - "@semantic-release/gitlab"
  - path: "@semantic-release/exec"
    cmd: if [ ! -d ".m2" ]; then mkdir .m2; cd .m2; touch settings.xml; echo $MVN_SETTINGS | base64 -d > 'settings.xml'; cd ..; fi; mvn versions:set -DnewVersion=${nextRelease.version} -DremoveSnapshot=true clean deploy -B -gs .m2/settings.xml;
I'm extremely new to this, and I cannot see:
1) why trying clean deploy causes this to fail while jar:jar deploy:deploy doesn't, and
2) how I can get semantic-release to create a jar with all dependencies for upload to our repository.
I should note that both Maven Shade plugin and Maven Deploy plugin are present in my pom.
This is an older run, but they are all formatted like this and tell you nothing about WHY it failed, just that it did:
stderr: '/bin/sh: line 1: 425 Killed mvn clean deploy -B -gs .m2/settings.xml\n',
failed: true,
signal: null,
cmd: '/bin/sh -c mvn $MAVEN_CLI_OPTS versions:set -DremoveSnapshot; mvn clean deploy -B -gs .m2/settings.xml',
timedOut: false,
killed: false,
pluginName: '@semantic-release/exec' }
ERROR: Job failed: command terminated with exit code 1
First of all, for deployment use mvn clean deploy. The other combinations you presented do not produce sensible output.
If you want to package the dependencies into your jar, you need to properly configure the Maven shade plugin (configuring the deploy plugin is usually not necessary). Without your pom.xml, I can only guess, but I would say that the error is probably in that configuration.
BTW: only put the dependencies into the jar if the jar is meant to run standalone. If, on the other hand, you are writing a Java library to be used as a dependency, don't do it.
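For reference, a minimal sketch of a shade plugin configuration; the explicit execution binding to the package phase is the piece that is most often missing (the version is illustrative, adjust to your pom.xml):
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>3.2.4</version>
  <executions>
    <execution>
      <!-- bind shading to the package phase so mvn clean deploy uploads the shaded jar -->
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
    </execution>
  </executions>
</plugin>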

UnsupportedOperationException: No format processor for org.jboss...MavenResolvedArtifact was found

I'm creating an application to automatically generate resources for a launcher. This requires automatically resolving Maven dependencies, but I'm getting an UnsupportedOperationException while running the JBoss ShrinkWrap Resolver.
I'm running this inside a Docker container to avoid local repository caching. It works outside of the container on the host, but I'm unsure what is missing inside the container.
My resolver config is a simple example, converting to a MavenResolvedArtifact
MavenResolvedArtifact[] artifacts = Maven.configureResolver()
        .withMavenCentralRepo(true)
        .withRemoteRepo("internal-nexus", MAVEN_URL, "default")
        .resolve("com.company:application:" + PROJECT_VERSION)
        .withTransitivity()
        .asResolvedArtifact();
My Dockerfile is also relatively simple: it uses openjdk:9 and includes the bootstrapper program, a shell script, and some environment variables.
FROM openjdk:9
COPY bootstrapper-shaded.jar /bootstrapper.jar
COPY docker-run.sh /run.sh
ENV CLONE_URL https://github.com/company/repository.git
ENV NEXUS_BASE https://nexus.company.com/
ENV NEXUS_REPO repository
RUN chmod +x /run.sh
RUN apt-get update && apt-get install -y git software-properties-common maven
ENTRYPOINT ["/run.sh"]
And the run.sh script copies some files (removed for brevity), runs the build, then runs the bootstrapper:
#!/bin/sh
rm -rf /bootstrap/*
cd /bootstrap/
git clone $CLONE_URL work
cd work
chmod +x ./gradlew
./gradlew clean build
NEXUS_URL=`printf $NEXUS_BASE; printf "/repository/"; printf $NEXUS_REPO` ./gradlew upload
java -jar /bootstrapper.jar 0
I expect the output to be the same as on the host machine: an array of MavenResolvedArtifacts. However, the following exception is thrown on the final line of the code snippet, .asResolvedArtifact():
Exception in thread "main" java.lang.UnsupportedOperationException: No format processor for org.jboss.shrinkwrap.resolver.api.maven.MavenResolvedArtifact was found. Supported processors are: class org.jboss.shrinkwrap.resolver.impl.maven.archive.ArchiveFormatProcessor
at org.jboss.shrinkwrap.resolver.spi.format.FormatProcessors.find(FormatProcessors.java:53)
at org.jboss.shrinkwrap.resolver.impl.maven.MavenFormatStageImpl.as(MavenFormatStageImpl.java:84)
at org.jboss.shrinkwrap.resolver.impl.maven.MavenFormatStageImpl.asResolvedArtifact(MavenFormatStageImpl.java:71)
at org.jboss.shrinkwrap.resolver.impl.maven.MavenFormatStageImpl.asResolvedArtifact(MavenFormatStageImpl.java:40)
at com.company.ResolveTask.run(ResolveTask.java:39)
at com.company.Bootstrapper.main(Bootstrapper.java:102)
Apologies for any typos in the stack trace; the VM wouldn't let me copy-paste out of it, so I had to type it out myself.
Update: I haven't found anything else on Google yet, and clean builds have been to no avail.
You need to change the type of artifacts to
org.jboss.shrinkwrap.resolver.impl.maven.archive.ArchiveFormatProcessor
or at least cast it to that type at the declaration.

Can't build Apache Spark with Maven

I have downloaded Apache Spark and am trying to build it with Maven as suggested here: http://spark.apache.org/docs/1.0.0/building-with-maven.html
But I am not able to resolve the error after running the command
build/mvn -DskipTests clean package
The error is: build/mvn: No such file or directory
I checked by running mvn -v, and JAVA_HOME is set to the JDK (screenshot of the command prompt output attached). Please help me resolve the problem.
mvn -DskipTests clean package
You don't need to use build/mvn. I am assuming you have mvn installed somewhere within the system.
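In other words, a minimal sketch, assuming mvn is already on your PATH and that you are in the Spark source root (the directory name is illustrative):
cd spark-1.0.0
mvn -DskipTests clean package
build/mvn is only a convenience wrapper around Maven; if it is missing from your download, the system-wide mvn builds the project just as well.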

Installing and using Gradle in a docker image/container

I am getting this strange error at the end of the process of creating a docker image from a Dockerfile:
/bin/sh: 1: gradle: not found
INFO[0003] The command [/bin/sh -c gradle test jar] returned a non-zero code: 127
The relevant part of the Dockerfile:
FROM debian:jessie
[...]
RUN curl -L https://services.gradle.org/distributions/gradle-2.4-bin.zip -o gradle-2.4-bin.zip
RUN apt-get install -y unzip
RUN unzip gradle-2.4-bin.zip
RUN echo 'export GRADLE_HOME=/app/gradle-2.4' >> $HOME/.bashrc
RUN echo 'export PATH=$PATH:$GRADLE_HOME/bin' >> $HOME/.bashrc
RUN /bin/bash -c "source $HOME/.bashrc"
RUN gradle test jar
[...]
The command I am using is: docker build -t java_i .
The strange thing is that if:
I run a container from the previous image with RUN gradle test jar commented out (command: docker run -d -p 9093:8080 -p 9094:8081 --name java_c -i -t java_i),
then log into that container (command: docker exec -it java_c bash),
then manually check the Gradle environment variables (and find them set),
and then manually run the commented-out command from within the running container (gradle test jar):
I eventually get the expected output (the compiled Java code in the build folder).
I am using Docker version 1.6.2
I solved the problem using the ENV Docker instruction (link to the documentation).
ENV GRADLE_HOME=/app/gradle-2.4
ENV PATH=$PATH:$GRADLE_HOME/bin
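Put together, a minimal sketch of the relevant part of the Dockerfile, assuming the zip is unpacked under /app as in the question:
FROM debian:jessie
RUN apt-get update && apt-get install -y curl unzip
RUN curl -L https://services.gradle.org/distributions/gradle-2.4-bin.zip -o gradle-2.4-bin.zip
RUN unzip gradle-2.4-bin.zip -d /app
# ENV persists across all later build steps and into running containers,
# unlike variables exported from .bashrc inside a RUN step
ENV GRADLE_HOME=/app/gradle-2.4
ENV PATH=$PATH:$GRADLE_HOME/bin
RUN gradle test jar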
The command /bin/bash -c "source $HOME/.bashrc" creates a new non-interactive process and runs a command in it that sets environment variables there, which does not affect the parent process; as soon as the variables are set, the process exits. You can check this by running something like this:
RUN /bin/bash -c "source $HOME/.bashrc; env"
RUN env
What should work is this option:
RUN source ~/.bashrc
And the reason it works when you log in is that the new interactive shell reads the already-updated ~/.bashrc.
I was trying to install the same version with JDK 11.0.7, but gradle-2.4 does not work, and I got the error below:
FAILURE: Build failed with an exception.
* What went wrong:
Could not determine java version from '11.0.7'.
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output.
I installed a later version to fix the above issue.
Posting as an answer since it might help someone else.
FROM openjdk:11.0.7-jdk
RUN apt-get update && apt-get install -y unzip
WORKDIR /gradle
RUN curl -L https://services.gradle.org/distributions/gradle-6.5.1-bin.zip -o gradle-6.5.1-bin.zip
RUN unzip gradle-6.5.1-bin.zip
ENV GRADLE_HOME=/gradle/gradle-6.5.1
ENV PATH=$PATH:$GRADLE_HOME/bin
RUN gradle --version
You can use multi-stage builds and the Gradle Docker image (no need to install Gradle...) to build the application, then use the result in the runtime container:
# Build
FROM gradle AS build
WORKDIR /appbuild
COPY . /appbuild
RUN gradle --version
# here goes your build code
Once the Gradle build is done, switch to the runtime container:
# Runtime
FROM openjdk:8-jre-alpine
# more stuff here...
COPY --from=0 appbuild/<somepath>/some.jar application.jar
# more stuff here...
The COPY command copies the build artifacts from the build phase to the runtime container (in this case a jar file).
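Since the build stage is named (FROM gradle AS build), the copy can also reference the stage by name instead of by index, which is less brittle if stages are added or reordered:
COPY --from=build appbuild/<somepath>/some.jar application.jar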

Maven error while assembling hadoop in stand alone mode

I am new to Hadoop and Maven. I would like to compile Hadoop 2.0.3 from source and install it. I am following the instructions from
http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/SingleCluster.html
So far, I have managed to download the Hadoop source code, and from the source directory I issued mvn clean install -Pnative.
Next I tried to execute mvn assembly:assembly, but I get the following error:
Failed to execute goal org.apache.maven.plugins:maven-assembly-plugin:2.3:assembly (default-cli) on project hadoop-main: Error reading assemblies: No assembly descriptors found. -> [Help 1]
Please help so that I can move forward.
Also, the above-mentioned install link does not mention what the values of $HADOOP_COMMON_HOME and $HADOOP_HDFS_HOME should be.
I compiled 1.0.4 just as an academic exercise; I'm not sure if it will be valid for 2.0.3.
This should be done (on Ubuntu) before you start compilation to make sure that all needed stuff is there:
sudo apt-get -y install maven build-essential protobuf-compiler autoconf automake libtool cmake zlib1g-dev pkg-config libssl-dev
I did not have Subversion, so I did this too:
sudo apt-get install subversion
After that I checked out the code:
svn checkout http://svn.apache.org/repos/asf/hadoop/common/tags/release-1.0.4/ hadoop-common-1.0.4
Then I went to the newly created folder "hadoop-common-1.0.4" and ran:
ant clean package
You can refer to my blog for the whole story:
http://hadoopmagic.wordpress.com/
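For the 2.x line the packaging step is different from 1.0.4's ant build; a sketch worth trying for 2.0.3, based on the BUILDING.txt conventions of Hadoop 2.x (verify against your source tree):
# assemble the binary distribution with native code and a tarball
mvn package -Pdist,native -DskipTests -Dtar
# the assembled distribution lands under hadoop-dist/target/,
# which is typically what HADOOP_COMMON_HOME and HADOOP_HDFS_HOME point at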
