Jenkins ERROR: Failed to create /usr/share/tomcat7/.m2 on Maven project - java

I am running Jenkins ver. 2.60.2 and it doesn't seem possible, within a Maven Job, to define a local repository anywhere other than /usr/share/tomcat7/.m2.
Here are my attempts:
I created a Global Maven settings.xml and a Settings file with the Config File Management Plugin, both of which contain:
<settings>
<localRepository>/srv/maven/.m2/repository</localRepository>
...
</settings>
I created a new Maven Project and tried to make the Job see that file by attempting all of the following (a sketch of these attempts follows below):
a) Defining either the Settings file or the Global settings file (I created two identical files) within the build step.
b) Adding a Pre-step Provide Configuration files, and then using the variable MY_SETTINGS either in the Goals and options or in MAVEN_OPTS.
c) Using Provide Configuration files within the build environment (and using MY_SETTINGS in the same way as in the previous step).
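For reference, this is roughly the shape of the overrides attempted in (b) and (c), assuming the Provide Configuration files step exposes the managed file's path in MY_SETTINGS (the -s flag and the maven.repo.local property are standard Maven CLI options, nothing Jenkins-specific):
# Goals and options field of the Maven build step:
clean install -s $MY_SETTINGS
# or overriding just the local repository location directly:
clean install -Dmaven.repo.local=/srv/maven/.m2/repository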
However, none of these seems to work. The job always fails, trying to use the default Maven repository location (/usr/share/tomcat7/.m2), which I have no idea how to re-define:
provisioning config files...
copy managed file [MYFILE settings] to file:/srv/webapps/jenkins/jobs/testJob/workspace@tmp/config3408982272576109420tmp
provisioning config files...
copy managed file [MYFILE settings] to file:/srv/webapps/jenkins/jobs/testJob/workspace@tmp/config2203063037747373567tmp
Parsing POMs
using global settings config with name MYFILE settings
Replacing all maven server entries not found in credentials list is true
Deleting 1 temporary files
ERROR: Failed to create /usr/share/tomcat7/.m2
Finished: FAILURE
Do you know how to make this work within a Maven Job type in Jenkins?

Related

PropertiesLauncher command line arguments not working with Spring Boot executable Jar

So I have a basic multi-module Spring Boot project. The goal I had was to build an executable jar and pass additional properties with the help of -Dloader.path=....
For some reason (if I understand the purpose of this argument) loader.path is being ignored completely.
My project structure is following:
\-
|--conf
|---default
|--pets-api
|--pets-app (this module contains the Main-Class)
|--pets-domain
|--pets-infrastructure
Since no custom active profile is being passed, it uses "default". The jar contains an application-default.properties file, which has a single configuration entry: server.servlet.context-path=/v1.
The /conf/default location has 2 properties files:
application.properties
random.properties - this is bound to @ConfigurationProperties(prefix = "...") inside the application
When I run it normally (java -jar pets-app-0.0.1-SNAPSHOT.jar), everything is fine. It just uses the application-default.properties file and that is it.
Now when I try to utilize the -Dloader.path argument, as in java -Dloader.path=PATH/TO/conf/default -jar pets-app-0.0.1-SNAPSHOT.jar, it starts the application the same as before, as if I were not adding 2 more files to the classpath.
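For context, -Dloader.path is only read by Spring Boot's PropertiesLauncher; the default executable jar uses JarLauncher, which ignores it. If that is the cause here, the Spring Boot Gradle plugin documents switching launchers by overriding the Main-Class manifest attribute, roughly like this (a sketch; the exact DSL can differ between plugin versions):
bootJar {
    manifest {
        // PropertiesLauncher honors loader.path; JarLauncher ignores it
        attributes 'Main-Class': 'org.springframework.boot.loader.PropertiesLauncher'
    }
}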
What is used:
Java 17
Spring Boot 2.6.12
Gradle
Did anyone come across this as well? Any suggestion on how to resolve it?
PS. If there is need to see the code, I can upload it to GitHub.

Ivy Install task fails with JSCH SFTP error 4 first time, but is successful on subsequent attempts

I am trying to use the Ant Ivy install task to copy a library from one repository to another.
Some example code within my Ant target:
<ivy:install organisation="testOrg" module="testModuleName" revision="1.2.3" from="fromRepo" to="toRepo"/>
The fromRepo and toRepo are defined in a local ivysettings.xml file.
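For illustration, the resolver definitions in ivysettings.xml would have roughly this shape, assuming an <sftp> resolver for the destination (host and user are placeholders; the actual patterns are shown further down):
<ivysettings>
    <resolvers>
        <sftp name="toRepo" host="some-host" user="some-user">
            <!-- <ivy pattern="..."/> and <artifact pattern="..."/> go here -->
        </sftp>
    </resolvers>
</ivysettings>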
The resolve (from fromRepo) of the library is successful but the install to toRepo fails, with an SFTP Code 4 error.
impossible to install testOrg#testModuleName;1.2.3: java.io.IOException: Failure
at org.apache.ivy.plugins.repository.sftp.SFTPRepository.put(SFTPRepository.java:164)
at org.apache.ivy.plugins.repository.AbstractRepository.put(AbstractRepository.java:130)
at org.apache.ivy.plugins.resolver.RepositoryResolver.put(RepositoryResolver.java:234)
at org.apache.ivy.plugins.resolver.RepositoryResolver.publish(RepositoryResolver.java:215)
at org.apache.ivy.core.install.InstallEngine.install(InstallEngine.java:150)
at org.apache.ivy.Ivy.install(Ivy.java:537)
at org.apache.ivy.ant.IvyInstall.doExecute(IvyInstall.java:102)
at org.apache.ivy.ant.IvyTask.execute(IvyTask.java:271)
...
Caused by: 4: Failure
at com.jcraft.jsch.ChannelSftp.throwStatusError(ChannelSftp.java:2833)
at com.jcraft.jsch.ChannelSftp.mkdir(ChannelSftp.java:2142)
at org.apache.ivy.plugins.repository.sftp.SFTPRepository.mkdirs(SFTPRepository.java:186)
at org.apache.ivy.plugins.repository.sftp.SFTPRepository.mkdirs(SFTPRepository.java:184)
at org.apache.ivy.plugins.repository.sftp.SFTPRepository.put(SFTPRepository.java:160)
... 37 more
However if I simply run the same target again, the install completes successfully!
It seems to be some issue with creating a directory, from com.jcraft.jsch.ChannelSftp.mkdir(ChannelSftp.java:2142) in the stacktrace.
After running the 1st time, the testOrg/testModuleName directory exists (only testOrg having previously existed).
The 2nd time running the testOrg/testModuleName/1.2.3 directory is created (along with the library artifacts).
If after running the 1st time I delete the testOrg/testModuleName directory it created, it will continue to return the code 4 error.
My Ant library directory contains jsch-0.1.50.jar, which I assume it is using to upload to the destination Ivy server.
In addition I am using:
Ant 1.8.4
Ivy 2.4.0
Java 1.7.0_80
By debugging the Ivy SFTP source code that creates the new directories on the destination toRepo repository, I was able to see why this was happening.
The code is in the method SFTPRepository.mkdirs(), which recursively calls itself to create each directory in the path if it does not exist.
For my example the directory being uploaded was:
/toRepo/testOrg/testModuleName//1.2.3/
You can see the double slash: // in the middle of the path.
This meant that the mkdirs() method tried to create the testModuleName directory twice. The 2nd attempt failed, which caused the code 4 error.
The reason there is a double slash in the path is because there is no branch for this artifact.
Within my ivy settings file the sftp resolver (for my toRepo repository) artifact patterns were configured to:
<ivy pattern="/toRepo/[organisation]/[module]/[branch]/[revision]/ivy-[revision].xml"/>
<artifact pattern="/toRepo/[organisation]/[module]/[branch]/[revision]/[artifact]-[revision].[ext]"/>
The /[branch]/ part of the pattern is what was generating the // in the path.
There are 2 configurations, one for the ivy.xml file itself and the other for all other artifacts.
Ivy patterns allow the use of parenthesis for optional parts of the pattern.
So changing my configuration to:
<ivy pattern="/toRepo/[organisation]/[module](/[branch])/[revision]/ivy-[revision].xml"/>
<artifact pattern="/toRepo/[organisation]/[module](/[branch])/[revision]/[artifact]-[revision].[ext]"/>
fixed the issue and the Ivy install functioned as expected.
This means that for artifacts with no branch defined (like 3rd-party artifacts), the branch directory will not be included in the path.
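To make the difference concrete, with organisation testOrg, module testModuleName, no branch, and revision 1.2.3, the ivy pattern resolves as follows:
/toRepo/testOrg/testModuleName//1.2.3/ivy-1.2.3.xml   (mandatory /[branch]/ produces the double slash)
/toRepo/testOrg/testModuleName/1.2.3/ivy-1.2.3.xml    (optional (/[branch]) drops out cleanly)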

Kafka-Connect add SQL JAR file to classpath

I am trying to deploy a connect-standalone job to stream from an MSSQL server; however, I am facing an issue (Kafka Connect is part of my Ambari deployment, not Docker). This is the properties file I am using:
name=JdbcSourceConnector
connector.class=io.confluent.connect.jdbc.JdbcSourceConnector
connection.user=ue
connection.password=pw
tasks.max=1
connection.url=jdbc:sqlserver://servername
topic.prefix=iblog
query=SELECT * FROM IB_WEBLOG_DUMMY_small
value.converter=org.apache.kafka.connect.json.JsonConverter
key.converter=org.apache.kafka.connect.json.JsonConverter
poll.interval.ms=5000
table.poll.interval.ms=120000
mode=incrementing
incrementing.column.name=ID
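For context, this connector properties file gets passed to connect-standalone alongside a worker properties file, roughly like this (file names and paths are examples; the script is connect-standalone.sh in plain Apache Kafka distributions):
connect-standalone /path/to/worker.properties /path/to/jdbc-source.properties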
I have added the jar file sqljdbc42.jar to /usr/share/java
and have run export CLASSPATH=/usr/share/java/*
However, I still run into the error: Failed to find any class that implements Connector and which name matches io.confluent.connect.jdbc.JdbcSourceConnector
Am I doing anything wrong or can I check something else?
Kafka-Connect is part of my Ambari deployment
That would imply you are using a Hortonworks installation.
You need to:
git clone https://github.com/confluentinc/kafka-connect-jdbc/
Check out a release branch that ideally matches your Kafka version. For example, branch v3.1.2 is Kafka 0.10.1.1.
Run mvn clean package, which will generate some folders in target/ of that project.
SCP those files to all Kafka Connect workers in your cluster into /usr/hdp/current/kafka/.../share/java/kafka-connect-jdbc (create this directory if it does not exist).
Restart the Kafka processes to pick up the new CLASSPATH settings.
You may need some extra Confluent packages that JDBC Connect depends on. In shell form, those steps look roughly like this:
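# a sketch; the checkout branch is an example, and the elided HDP path
# from the SCP step above depends on your installation layout
git clone https://github.com/confluentinc/kafka-connect-jdbc/
cd kafka-connect-jdbc
git checkout v3.1.2       # pick the branch matching your Kafka version
mvn clean package         # builds the jars under target/
scp target/*.jar worker:/usr/hdp/current/kafka/.../share/java/kafka-connect-jdbc/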
You need to include the kafka-connect-jdbc jar file, which contains the io.confluent.connect.jdbc.JdbcSourceConnector class.
If you are using maven, you can add it as a dependency:
Add the following repository to your project if you haven't done so yet:
<repository>
<id>confluent</id>
<url>http://packages.confluent.io/maven/</url>
</repository>
After this, add the following dependency:
<dependency>
<groupId>io.confluent</groupId>
<artifactId>kafka-connect-jdbc</artifactId>
<version>3.3.0</version> <!-- or whatever version you want -->
</dependency>
https://github.com/confluentinc/kafka-connect-jdbc/issues/356
I too had the same problem, with a Couchbase connector not found:
ERROR Stopping after connector error (org.apache.kafka.connect.cli.ConnectStandalone:113) java.util.concurrent.ExecutionException: org.apache.kafka.connect.errors.ConnectException: Failed to find any class that implements Connector and which name matches com.couchbase.connect.kafka.CouchbaseSourceConnector
Setting CLASSPATH replaced the existing classpath, and I couldn't get appending to it to work.
I moved the required jar files from kafka-connect-couchbase/*.jar to /path/kafka_version/libs/
(libs is the folder where all the jar files are stored).
I met the same issue and resolved it by running connect-standalone from the root folder of the Confluent installation; in my case this was /opt/confluent-5.0.1.

JAVA SPRING - Using Maven, steps to create 2 war files (quality & production)

Can anyone please help me to create 2 war files using Maven and Java Spring?
Requirement: I need 4 war files.
For that, first create 2 war files (then make another 2 copies of these with different names for OAuth).
The database name is the only difference between the staging and production wars.
staging (http://10.19:3006/imdesk_imapi_staging) SQL datasource, for staging:
1) api war
2) oauth staging war - copy
production (http://10.19:3006/imdesk_imapi_production) SQL datasource, for production:
1) api war
2) oauth war - copy
Work with Maven profiles, so you can create different artifacts for different stages:
http://maven.apache.org/guides/introduction/introduction-to-profiles.html
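A minimal sketch of what that could look like in the pom.xml (the profile ids and the property name are illustrative; the datasource URLs are copied from the question, and the property would be substituted into the datasource configuration via resource filtering):
<profiles>
    <profile>
        <id>staging</id>
        <properties>
            <datasource.url>http://10.19:3006/imdesk_imapi_staging</datasource.url>
        </properties>
    </profile>
    <profile>
        <id>production</id>
        <properties>
            <datasource.url>http://10.19:3006/imdesk_imapi_production</datasource.url>
        </properties>
    </profile>
</profiles>
Each war can then be built with mvn package -P staging or mvn package -P production.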
I see two paths you could take to solve the issue:
1. Use Maven profiles.
http://maven.apache.org/guides/introduction/introduction-to-profiles.html
How can a profile be triggered? How does this vary according to the type of profile being used?
A profile can be triggered/activated in several ways:
Explicitly
Through Maven settings
Based on environment variables
OS settings
Present or missing files
Details on profile activation
Profiles can be explicitly specified using the -P CLI option.
This option takes an argument that is a comma-delimited list of profile-ids to use. When this option is specified, the profile(s) specified in the option argument will be activated in addition to any profiles which are activated by their activation configuration or the <activeProfiles> section in settings.xml.
mvn groupId:artifactId:goal -P profile-1,profile-2
2. Use Spring profiles.
https://docs.spring.io/spring-boot/docs/current/reference/html/boot-features-profiles.html
Spring Profiles provide a way to segregate parts of your application configuration and make it only available in certain environments. Any @Component or @Configuration can be marked with @Profile to limit when it is loaded:
@Configuration
@Profile("production")
public class ProductionConfiguration {
    // ...
}
In the normal Spring way, you can use a spring.profiles.active Environment property to specify which profiles are active. You can specify the property in any of the usual ways; for example, you could include it in your application.properties:
spring.profiles.active=dev,hsqldb
or specify it on the command line using the switch --spring.profiles.active=dev,hsqldb.

How to publish CruiseControl LATEST build artifacts to a static URL

I have a Java multi-module Maven project for which I want to build a Maven site and javadocs, and have CruiseControl publish the latest daily build to a configured static location.
The trouble is that the CruiseControl artifactPublisher lets you specify a dest directory, but the actual destination is timestamped with the time of the last build. I want to be able to publish to a location that gets overwritten on each build, such as:
http://cc-buildserver/cruisecontrol/artifacts/gameplatform-documentation/
artifactPublisher documentation:
dir - will copy all files from this directory
dest - parent directory of the actual destination directory; the actual destination directory name will be the build timestamp
subdirectory - subdirectory under the unique (timestamp) directory to contain artifacts
For example if I have a CruiseControl project called gameplatform-documentation and I configure my artifactPublisher as such:
<project name="gameplatform-documentation" forceOnly="true" requireModification="false" forceBuildNewProject="false" buildafterfailed="false">
...
<schedule>
<composite time="2300">
<maven2
mvnhome="${mvn.home}"
pomfile="${dev.root}/gameplatform-parent/pom.xml"
goal="site" />
</composite>
</schedule>
<publishers>
<artifactspublisher
dir="${dev.root}/gameplatform-parent/target/site"
dest="artifacts/gameplatform-documentation" />
</publishers>
</project>
I end up with my Maven-generated site and javadocs in a different directory each build:
http://cc-buildserver/cruisecontrol/cruisecontrol/artifacts/gameplatform-documentation/20091110130202/
Maybe I need to use a custom AntPublisher or FTPPublisher and create another webserver to host the published docs. I could also use CC source control tools to check the documentation into our SVN server and use that to serve it.
How can this be accomplished?
We ended up using Maven's site deploy plugin to publish the documentation artifacts through SCP (using a Cygwin SSHD server set up on a Windows server) to our CruiseControl server's "artifact" folder:
<distributionManagement>
<site>
<id>dev.website</id>
<url>scp://user@buildserver/cygdrive/c/Users/user/servers/cruisecontrol-project-2.8.3/artifacts/documentation/project/gameplatform</url>
</site>
</distributionManagement>
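With that in place, the CruiseControl schedule just needs to run the site deployment rather than plain site generation; a sketch, reusing the <maven2> builder from the question (site:deploy is the maven-site-plugin goal that pushes the generated site to the distributionManagement URL):
<maven2
    mvnhome="${mvn.home}"
    pomfile="${dev.root}/gameplatform-parent/pom.xml"
    goal="site site:deploy" />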
Then we're able to access the nightly built documentation by visiting:
http://buildserver:8081/cruisecontrol/artifacts/documentation/project/gameplatform
