I have a Maven OSGi multi-module project. The project runs perfectly well when the OSGi framework picks the module jars up from the individual project modules (see 1.1.B below).
However, with a second approach, bundle.getRegisteredServices() (see 1.1.A below) returns null whenever I use the bundles deposited into a central folder (D:/parent/provider/target/modules) by the maven-assembly-plugin (version 2.6):
framework.getBundleContext().installBundle("file:D:/parent/provider/target/modules/OSGiDmHelloWorldProvider-1.0.jar");
framework.getBundleContext().installBundle("file:D:/parent/provider/target/modules/OSGiDmHelloWorldConsumer-1.0.jar");
See 1.1.C below for the console output of the second approach.
1.1.A
if (bundle.getRegisteredServices() != null) {
for (ServiceReference<?> serviceReference : bundle.getRegisteredServices())
System.out.println("\tRegistered service: " + serviceReference);
}
Why can't I access the bundles with the second approach?
GitHub
I have an SSCCE on GitHub HERE. Running the main class will show my predicament.
Thank you all in advance.
1.1.B
package main;
import java.net.URISyntaxException;
import java.net.URL;
import java.util.HashMap;
import java.util.Map;
import java.util.ServiceLoader;
import org.osgi.framework.Bundle;
import org.osgi.framework.BundleException;
import org.osgi.framework.Constants;
import org.osgi.framework.ServiceReference;
import org.osgi.framework.launch.Framework;
import org.osgi.framework.launch.FrameworkFactory;
public class App {
public static void main(String[] args) throws BundleException, URISyntaxException {
App app = new App();
app.initialize();
}
private void initialize() throws BundleException, URISyntaxException {
Map<String, String> map = new HashMap<String, String>();
// make sure the cache is cleaned
map.put(Constants.FRAMEWORK_STORAGE_CLEAN, Constants.FRAMEWORK_STORAGE_CLEAN_ONFIRSTINIT);
map.put("ds.showtrace", "true");
map.put("ds.showerrors", "true");
FrameworkFactory frameworkFactory = ServiceLoader.load(FrameworkFactory.class).iterator().next();
Framework framework = frameworkFactory.newFramework(map);
System.out.println("Starting OSGi Framework");
framework.init();
loadScrBundle(framework);
framework.getBundleContext().installBundle("file:D:/parent/provider/target/OSGiDmHelloWorldProvider-1.0.jar");
framework.getBundleContext().installBundle("file:D:/parent/consumer/target/OSGiDmHelloWorldConsumer-1.0.jar");
for (Bundle bundle : framework.getBundleContext().getBundles()) {
bundle.start();
System.out.println("Bundle: " + bundle.getSymbolicName());
if (bundle.getRegisteredServices() != null) {
for (ServiceReference<?> serviceReference : bundle.getRegisteredServices())
System.out.println("\tRegistered service: " + serviceReference);
}
}
}
private void loadScrBundle(Framework framework) throws URISyntaxException, BundleException {
URL url = getClass().getClassLoader().getResource("org/apache/felix/scr/ScrService.class");
if (url == null)
throw new RuntimeException("Could not find the class org.apache.felix.scr.ScrService");
String jarPath = url.toURI().getSchemeSpecificPart().replaceAll("!.*", "");
System.out.println("Found declarative services implementation: " + jarPath);
framework.getBundleContext().installBundle(jarPath).start();
}
}
1.1.C
Starting OSGi Framework
Found declarative services implementation: file:/C:/Users/Revilo/.m2/repository/org/apache/felix/org.apache.felix.scr/1.6.2/org.apache.felix.scr-1.6.2.jar
INFO : org.apache.felix.scr (1): Version = 1.6.2
DEBUG: Starting ComponentActorThread
Bundle: org.apache.felix.framework
Registered service: [org.osgi.service.resolver.Resolver]
Registered service: [org.osgi.service.packageadmin.PackageAdmin]
Registered service: [org.osgi.service.startlevel.StartLevel]
Bundle: org.apache.felix.scr
Registered service: [org.apache.felix.scr.ScrService]
Registered service: [org.osgi.service.cm.ManagedService]
Registered service: [org.apache.felix.scr.impl.ScrGogoCommand]
Bundle: null
Bundle: null
I had to do a lot to get your sample to duplicate the question.
First off, your reactor order is wrong in the parent. That is why you have to run mvn install all the time.
<modules>
<module>OSGiDmHelloWorldProvider</module>
<module>OSGiDmHelloWorldConsumer</module>
<module>main</module>
<module>dist</module>
</modules>
Next, if you define a dependency (e.g. JUnit) in the parent you don't need to redefine it in the children.
Next, it is conventional to put the parent tag at the top of the pom.
I don't see a reason for your child modules to have a different version from the parent, so I removed the version tag so they all inherit 1.0-SNAPSHOT from the parent.
Next, you have the wrong group id in the OSGiDmHelloWorldProvider dependency (it should be rev).
<dependency>
<groupId>rev</groupId>
<artifactId>OSGiDmHelloWorldProvider</artifactId>
<version>1.0-SNAPSHOT</version>
</dependency>
In the main module you have a dependency that isn't in the reactor. I am guessing this is just an oversight of the sample.
<dependency>
<groupId>rev</groupId>
<artifactId>core</artifactId>
<version>1.0-SNAPSHOT</version>
</dependency>
After all that, mvn clean package -DskipTests=true works.
You have a hard-coded string in your Main class that obviously doesn't work for me. (You also might want to look at the free IDEA Community instead of Eclipse!)
String baseDir = "D:/standAloneDev/java/workingDir/Sample Projects/Eclipse/Gen/OSGiDmHelloWorld/dist/target/dist-1.0-SNAPSHOT-bin/plugins/";
You should make this relative. e.g.
File baseDir = new File("dist/target/dist-1.0-SNAPSHOT-bin/plugins/");
String baseDirPath = baseDir.getAbsolutePath();
loadScrBundle(framework);
File provider = new File(baseDirPath, "OSGiDmHelloWorldProvider-1.0-SNAPSHOT.jar");
File consumer = new File(baseDirPath, "OSGiDmHelloWorldConsumer-1.0-SNAPSHOT.jar");
framework.getBundleContext().installBundle(provider.toURI().toString());
framework.getBundleContext().installBundle(consumer.toURI().toString());
Anyway, after getting it going I noticed the following javadoc on bundle.getSymbolicName().
Returns the symbolic name of this bundle as specified by its Bundle-SymbolicName manifest header. The bundle symbolic name should be based on the reverse domain name naming convention like that used for java packages.
So in the MANIFEST.MF of org.apache.felix.scr-1.6.2.jar you have
Bundle-Name: Apache Felix Declarative Services
Bundle-SymbolicName: org.apache.felix.scr
You don't have this in your jars because you are not generating a manifest and adding it to the jar.
You need to add a manifest execution to the maven-bundle-plugin and tell the maven-jar-plugin to use that generated manifest:
<plugin>
<artifactId>maven-jar-plugin</artifactId>
<configuration>
<archive>
<manifestFile>${project.build.outputDirectory}/META-INF/MANIFEST.MF</manifestFile>
</archive>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.felix</groupId>
<artifactId>maven-bundle-plugin</artifactId>
<executions>
<execution>
<id>bundle-manifest</id>
<phase>process-classes</phase>
<goals>
<goal>manifest</goal>
</goals>
</execution>
</executions>
<extensions>true</extensions>
<configuration>
<instructions>
<Bundle-SymbolicName>OSGiDmHelloWorldProvider</Bundle-SymbolicName>
<Export-Package>com.bw.osgi.provider.able</Export-Package>
<Bundle-Activator>com.bw.osgi.provider.ProviderActivator</Bundle-Activator>
<Bundle-Vendor>Baptiste Wicht</Bundle-Vendor>
</instructions>
</configuration>
</plugin>
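Once both modules are rebuilt with that configuration, it is worth confirming that the manifest really ends up inside the jars before handing them to the framework. A minimal check, assuming a jar path like the relative one used above (adjust it to wherever your assembly actually puts the bundles):
import java.util.jar.JarFile;
import java.util.jar.Manifest;

public class ManifestCheck {
    public static void main(String[] args) throws Exception {
        // Hypothetical path: point this at one of the jars your build produces
        try (JarFile jar = new JarFile("dist/target/dist-1.0-SNAPSHOT-bin/plugins/OSGiDmHelloWorldProvider-1.0-SNAPSHOT.jar")) {
            Manifest mf = jar.getManifest();
            // With the maven-bundle-plugin wired in, Bundle-SymbolicName should no longer be null
            System.out.println("Bundle-SymbolicName: "
                    + (mf == null ? "<no manifest>" : mf.getMainAttributes().getValue("Bundle-SymbolicName")));
        }
    }
}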
Related
I'm developing a custom Maven plugin. I want my plugin to add a new dependency to a project.
I have the following code:
@Mojo(name = "generate-model", defaultPhase = LifecyclePhase.GENERATE_SOURCES)
public class ModelGeneratorMojo extends AbstractMojo {
@Parameter(defaultValue = "${project}", required = true, readonly = true)
MavenProject project;
@Override
public void execute() throws MojoExecutionException {
Dependency connectivity = new Dependency();
connectivity.setGroupId("groupid");
connectivity.setArtifactId("artifactid");
connectivity.setVersion("1.0");
//noinspection unchecked
project.getDependencies().add(connectivity);
}
}
It seems to have no effect, because when I compile a project containing this plugin, I get an unresolved symbol error.
I'm sure that plugin is executed because I see code generated by it (code generation is omitted in my example) in target folder.
I think you should bind the goal in your plugin to the initialize phase of the Maven build so that the dependency is included very early in the build process.
Something along these lines:
@Mojo(name = "generate-model", defaultPhase = LifecyclePhase.INITIALIZE)
public class ModelGeneratorMojo extends AbstractMojo {
@Parameter(defaultValue = "${project}", required = true, readonly = true)
MavenProject project;
@Parameter(defaultValue = "${session}", required = true)
MavenSession session;
@Override
public void execute() throws MojoExecutionException {
Dependency connectivity = new Dependency();
connectivity.setGroupId("groupid");
connectivity.setArtifactId("artifactid");
connectivity.setVersion("1.0");
project.getDependencies().add(connectivity);
session.setCurrentProject(project);
}
}
<plugin>
<groupId>com.maventest</groupId>
<artifactId>maven-generate-model</artifactId>
<version>1.0-SNAPSHOT</version>
<executions>
<execution>
<goals>
<goal>generate-model</goal>
</goals>
</execution>
</executions>
</plugin>
I tried multiple approaches to accomplish this, but it does not seem possible with a Maven plugin; a Maven extension is probably a better choice here.
The reason is that dependency resolution is one of the first steps in the Maven lifecycle, and you need that extra dependency to compile the application.
With Maven extensions, you can extend the Maven lifecycle.
https://maven.apache.org/examples/maven-3-lifecycle-extensions.html
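For what it's worth, a rough sketch of such a lifecycle extension, assuming that adding the dependency in afterProjectsRead (which runs before dependency resolution) is early enough; the class name and coordinates are illustrative only, and the extension still has to be registered as described in the linked example:
import org.apache.maven.AbstractMavenLifecycleParticipant;
import org.apache.maven.MavenExecutionException;
import org.apache.maven.execution.MavenSession;
import org.apache.maven.model.Dependency;
import org.apache.maven.project.MavenProject;
import org.codehaus.plexus.component.annotations.Component;

// Hypothetical extension: injects an extra dependency into every project
// in the session before the build (and thus dependency resolution) starts.
@Component(role = AbstractMavenLifecycleParticipant.class, hint = "add-model-dependency")
public class ModelDependencyLifecycleParticipant extends AbstractMavenLifecycleParticipant {

    @Override
    public void afterProjectsRead(MavenSession session) throws MavenExecutionException {
        for (MavenProject project : session.getProjects()) {
            Dependency dep = new Dependency();
            dep.setGroupId("groupid");        // placeholder coordinates from the question
            dep.setArtifactId("artifactid");
            dep.setVersion("1.0");
            project.getDependencies().add(dep);
        }
    }
}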
I'm trying to build an app that reads info from SuccessFactors (SFSF). For this, I'm using the Virtual Data Model generator tool (the Maven plugin) with the SFSF OData metadata to be able to access the system. I'm following these steps:
Get a project via archetype (with PowerShell):
mvn archetype:generate "-DarchetypeGroupId=com.sap.cloud.sdk.archetypes" "-DarchetypeArtifactId=scp-cf-tomee" "-DarchetypeVersion=RELEASE"
Add the following to the application\pom.xml
In dependencies:
<dependency>
<groupId>org.projectlombok</groupId>
<artifactId>lombok</artifactId>
<scope>provided</scope>
</dependency>
In plugins:
<plugin>
<groupId>com.sap.cloud.sdk.datamodel</groupId>
<artifactId>odata-generator-maven-plugin</artifactId>
<version>3.13.0</version>
<executions>
<execution>
<id>generate-consumption</id>
<phase>generate-sources</phase>
<goals>
<goal>generate</goal>
</goals>
<configuration>
<inputDirectory>${project.basedir}/edmx</inputDirectory>
<outputDirectory>${project.build.directory}/vdm</outputDirectory>
<defaultBasePath>/odata/v2</defaultBasePath>
</configuration>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>build-helper-maven-plugin</artifactId>
<version>3.0.0</version>
<executions>
<execution>
<phase>generate-sources</phase>
<goals>
<goal>add-source</goal>
</goals>
<configuration>
<sources>
<source>${project.basedir}/vdm</source>
</sources>
</configuration>
</execution>
</executions>
</plugin>
Get the OData metadata file from https://apisalesdemo2.successfactors.eu/odata/v2/JobRequisition/$metadata and place it in ./application/edmx
Create a destination service (my-destination) and add a destination there pointing to my SFSF instance with basic auth (with user@companyId, the connection is 200:OK)
Add the destination service in the manifest.yml
Create a java class to call the destination and get the data:
package com.sap.sdk;
import com.google.gson.Gson;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import javax.servlet.ServletException;
import javax.servlet.annotation.WebServlet;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import java.io.IOException;
import java.util.List;
import com.sap.cloud.sdk.cloudplatform.connectivity.DestinationAccessor;
import com.sap.cloud.sdk.odatav2.connectivity.ODataException;
import com.sap.cloud.sdk.s4hana.connectivity.DefaultErpHttpDestination;
import com.sap.cloud.sdk.s4hana.connectivity.ErpHttpDestination;
import com.sap.cloud.sdk.s4hana.datamodel.odata.namespaces.rcmjobrequisition.JobRequisition;
import com.sap.cloud.sdk.s4hana.datamodel.odata.services.DefaultRCMJobRequisitionService;
@WebServlet("/req")
public class JobReqServlet extends HttpServlet {
private static final long serialVersionUID = 1L;
private static final Logger logger = LoggerFactory.getLogger(JobReqServlet.class);
private final ErpHttpDestination destination = DestinationAccessor.getDestination("sfsf-sdk-dest").asHttp()
.decorate(DefaultErpHttpDestination::new);
@Override
protected void doGet(final HttpServletRequest request, final HttpServletResponse response)
throws ServletException, IOException {
try {
final List<JobRequisition> jobReqs = new DefaultRCMJobRequisitionService()
.getAllJobRequisition()
.execute(destination);
response.setContentType("application/json");
response.getWriter().write(new Gson().toJson(jobReqs));
} catch (final ODataException e) {
logger.error(e.getMessage(), e);
response.setStatus(HttpServletResponse.SC_INTERNAL_SERVER_ERROR);
response.getWriter().write(e.getMessage());
}
}
}
With all this (I think I'm not missing anything), I do:
mvn clean install
and:
cf push
Everything works well, the hello world servlet works, but when I try to access /req, I get a:
Unable to execute metadata request.
However, I can see that the app is hitting SFSF because if I play with the base path of the service (in the pom.xml) I get 404's coming from SFSF.
Checking everything, I see this when the VDM generator is running:
1. This is the base path I'm giving in the pom:
<defaultBasePath>/odata/v2</defaultBasePath>
I can see the generator picking that path correctly:
[main] INFO com.sap.cloud.sdk.datamodel.odata.generator.DataModelGenerator - Default base path: /odata/v2/
But this is what the generator processes:
[main] INFO com.sap.cloud.sdk.datamodel.odata.generator.ODataToVdmGenerator - Title: RCMJobRequisition
[main] INFO com.sap.cloud.sdk.datamodel.odata.generator.ODataToVdmGenerator - Raw URL: /odata/v2/SFODataSet
[main] INFO com.sap.cloud.sdk.datamodel.odata.generator.ODataToVdmGenerator - Java Package Name: rcmjobrequisition
[main] INFO com.sap.cloud.sdk.datamodel.odata.generator.ODataToVdmGenerator - Java Class Name: RCMJobRequisition
Clearly, that SFODataSet in the URL is not correct. When the app runs, it's trying to get the metadata from .../odata/v2/SFODataSet/$metadata, and that's why it's not finding it.
That SFODataSet is coming from the SFSF metadata:
<Schema Namespace="SFODataSet" xmlns="http://schemas.microsoft.com/ado/2008/09/edm" xmlns:sf="http://www.successfactors.com/edm/sf" xmlns:sap="http://www.sap.com/Protocols/SAPData">
<EntityContainer Name="EntityContainer" m:IsDefaultEntityContainer="true">
<EntitySet Name="JobOfferTemplate_Standard_Offer_Details" EntityType="SFOData.JobOfferTemplate_Standard_Offer_Details" sap:label="JobOfferTemplate_Standard_Offer_Details" sap:creatable="false" sap:updatable="false" sap:upsertable="false" sap:deletable="false">
<Documentation>
<Summary>Job Requisition Template</Summary>
<LongDescription>These entities represent the job requisition template as defined in provisioning.</LongDescription>
<sap:tagcollection>
<sap:tag>Recruiting (RCM)</sap:tag>
<sap:tag>RCM - Job Requisition</sap:tag>
</sap:tagcollection>
</Documentation>
</EntitySet>
<EntitySet Name="JobRequisitionLocale" EntityType="SFOData.JobRequisitionLocale" sap:label="JobRequisitionLocale" sap:creatable="false" sap:updatable="false" sap:upsertable="false" sap:deletable="false">
<Documentation>
...
I can't find a way to make this work. Can you help me find the issue here?
I'm using:
Apache Maven 3.6.2
SAP Cloud SDK 3.13.0
Edit:
SFSF metadata files are available in https://api.sap.com/
The one I'm using for this app is for SFSF - Job Requisition, available here:
https://api.sap.com/api/RCMJobRequisition/overview
From there, you can download the EDMX specification. These are "mock" APIs, not connected to a real SFSF instance, but the problem is the same.
To do this I'm mainly following two blogs:
https://blogs.sap.com/2018/04/30/deep-dive-10-with-sap-s4hana-cloud-sdk-generating-java-vdm-for-s4hana-custom-odata-service/
https://blogs.sap.com/2019/05/06/create-an-application-with-sap-cloud-sdk-to-integrate-with-sap-successfactors/
Also, I removed the last part as I will open a separate question:
SFSF OData call: Failed to convert response into ODataFeed: An 'EdmSimpleTypeException' occurred
Thanks,
kepair
I will start off with a partial answer and edit in more information later if needed.
Regarding the URL:
The behaviour you observe is intentional. The full URL of a request will be assembled as follows: Destination URL + service path + service name + entity + '?' + query parameters. So in your case that might be:
https://my.host.domain/odata/v2/JobRequisitions/MyEntity
Destination: https://my.host.domain
Service Path: /odata/v2
Service name: JobRequisitions
Entity: MyEntity
The generator assembles the default base path from service path + service name. The service name will actually be pulled from the namespace of the EDMX. That is why the URL of your service is being generated the way it is.
The reason for this is simple: one might want to generate a VDM for multiple services at the same time. All of these services are exposed under the same endpoint except for the service name itself. In order to generate all the VDMs with one configuration we can specify the "service path" in the generator, and the generator pulls the service name from the EDMX itself.
So that means that your approach of overwriting the generated base path should work:
final List<JobRequisition> jobReqs = new DefaultRCMJobRequisitionService()
.withServicePath("odata/v2/JobRequisition")
.getAllJobRequisition()
.execute(destination);
The error message at the very end of your question looks a bit like a problem with parsing to me. But in order to tackle that one further we would need the full stack trace and the HTTP log output. Also, we can only reproduce the problem if we have access to the metadata. The link you provided requires authorization through username/password.
Since your question above is already quite comprehensive I would recommend that you separate these two problems and create a new question, if this really turns out to be an independent problem. This will also make both questions more relevant for others.
Update for the impatient: it's simple, use package.- for sub-package scanning instead of package.*, as-per martoe's answer below!
I cannot seem to get onlyAnalyze working for my multi-module project: regardless of what package (or pattern) I set, maven-findbugs-plugin doesn't evaluate sub-packages as I'd expect from passing it packagename.*.
To prove either myself or the plugin at fault (though I always assume it's the former!), I set up a small Maven project with the following structure:
pom.xml
src/
main/java/acme/App.java
main/java/acme/moo/App.java
main/java/no_detect/App.java
which is very simple!
The POM has the following findbugs configuration:
<build>
<plugins>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>findbugs-maven-plugin</artifactId>
<version>2.4.0</version>
<executions>
<execution>
<phase>verify</phase>
<goals><goal>findbugs</goal><goal>check</goal></goals>
</execution>
</executions>
<configuration>
<debug>true</debug>
<effort>Max</effort>
<threshold>Low</threshold>
<onlyAnalyze>acme.*</onlyAnalyze>
</configuration>
</plugin>
</plugins>
</build>
and every App.java has the following code with two obvious violations:
package acme;
import java.io.Serializable;
public class App implements Serializable
{
private static final class NotSer {
private String meh = "meh";
}
private static final NotSer ns = new NotSer();// Violation: not serializable field
public static void main( String[] args )
{
ns.meh = "hehehe";// Violation: unused
System.out.println( "Hello World!" );
}
}
Note that no_detect.App has the same content as above, but my expectation is that it wouldn't be evaluated by findbugs because I have the "onlyAnalyze" option set to acme.* which I assume would evaluate acme.App and acme.moo.App and nothing else.
I now execute mvn clean install to clean, build, test, run FindBugs, package, and install, which produces the following FindBugs report (snipped for brevity) and results in a build failure, which is expected because acme.App and acme.moo.App contain violations:
<BugInstance category='BAD_PRACTICE' type='SE_NO_SERIALVERSIONID' instanceOccurrenceMax='0'>
<ShortMessage>Class is Serializable, but doesn't define serialVersionUID</ShortMessage>
<LongMessage>acme.App is Serializable; consider declaring a serialVersionUID</LongMessage>
<Details>
<p> This field is never read. Consider removing it from the class.</p>
</Details>
<BugPattern category='BAD_PRACTICE' abbrev='SnVI' type='SE_NO_SERIALVERSIONID'><ShortDescription>Class is Serializable, but doesn't define serialVersionUID</ShortDescription><Details>
<BugCode abbrev='UrF'><Description>Unread field</Description></BugCode><BugCode abbrev='SnVI'><Description>Serializable class with no Version ID</Description></BugCode>
To summarise: only acme.App is analysed, acme.moo.App isn't (bad) and neither is no_detect.App (good).
I tried with two wildcards in the onlyAnalyze option, but that produces a successful build with a FindBugs error (Dangling meta character '*', etc.).
I tried with onlyAnalyze set to acme.*,acme.moo.*, which analyzes all the expected classes (acme.App and acme.moo.App). That means it "works", but not as I expect; i.e. I have to explicitly declare all parent packages for the classes I want to analyze, which could get large and difficult to maintain in a multi-module project!
Do I have to define every package I want analyzed, or can I declare a wildcard/regex pattern that will do what I want?
I'd rather not use the inclusion/exclusion XML because that requires far more setup and reasoning that I don't currently have time for...
To cite the Findbugs manual: "Replace .* with .- to also analyze all subpackages"
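Applied to the configuration above, that means changing only the onlyAnalyze value (a sketch of just the affected element):
<configuration>
  <debug>true</debug>
  <effort>Max</effort>
  <threshold>Low</threshold>
  <!-- ".-" matches acme and all of its sub-packages; ".*" only matches classes directly in acme -->
  <onlyAnalyze>acme.-</onlyAnalyze>
</configuration>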
I have a java project with tests written in groovy.
I use TestNG as unit testing framework.
I also have several tests written in java.
After the Maven test-compile phase, all tests (both Groovy and Java) are compiled and placed in the same folder structure inside target/test-classes/.
When I want to run tests with maven, only java tests are run.
When I tried to run groovy test from the IDE (IntelliJ IDEA), it runs perfectly.
I decompiled the Groovy test and here is what I have:
package mypackage.core;
import groovy.lang.GroovyObject;
import groovy.lang.MetaClass;
import org.codehaus.groovy.runtime.ScriptBytecodeAdapter;
import org.codehaus.groovy.runtime.callsite.CallSite;
import org.testng.annotations.Test;
@Test
public class Sample
implements GroovyObject
{
public Sample()
{
Sample this;
CallSite[] arrayOfCallSite = $getCallSiteArray();
this.metaClass = $getStaticMetaClass();
MetaClass tmp20_17 = this.metaClass;
this.metaClass = ((MetaClass)ScriptBytecodeAdapter.castToType(tmp20_17, $get$$class$groovy$lang$MetaClass()));
tmp20_17;
while (true)
return;
}
@Test
public void testSomething()
{
CallSite[] arrayOfCallSite = $getCallSiteArray(); Registry registry = arrayOfCallSite[0].callConstructor($get$$class$mypackage$core$internal$Registry());
arrayOfCallSite[1].call(registry, null); for (return; ; return);
}
static
{
tmp10_7 = new Long(0L);
__timeStamp__239_neverHappen1314379332415 = (Long)tmp10_7;
tmp10_7;
tmp28_25 = new Long(1314379332415L);
__timeStamp = (Long)tmp28_25;
tmp28_25;
Class tmp48_45 = ((Class)ScriptBytecodeAdapter.castToType($get$$class$mypackage$core$Sample(), $get$$class$java$lang$Class()));
$ownClass = (Class)tmp48_45;
tmp48_45;
return;
while (true)
return;
}
}
Has anyone met a similar issue? What can be wrong here?
Can it be connected with the fact that class Sample implements GroovyObject? Can it be connected with a bad Surefire version?
Thanks!
UPD:
Groovy-related settings in pom.xml:
<dependency>
<groupId>org.codehaus.groovy.maven.runtime</groupId>
<artifactId>gmaven-runtime-1.6</artifactId>
<version>1.0</version>
<scope>test</scope>
</dependency>
...
<plugin>
<groupId>org.codehaus.groovy.maven</groupId>
<artifactId>gmaven-plugin</artifactId>
<version>1.0</version>
<executions>
<execution>
<goals>
<goal>generateTestStubs</goal>
<goal>testCompile</goal>
</goals>
</execution>
</executions>
</plugin>
Groovy tests are placed in mymodule/src/test/groovy/.., java tests are placed in mymodule/src/test/java/...
After test-compile phase they both are in mymodule/target/test-classes/...
I don't have a special section for Surefire in my pom.xml, but from looking at the local .m2 repository I can say that Surefire plugin version 2.4.3 is being used.
Test class names must end with "Test" in order to be selected by the Maven Surefire plugin's default includes during the test phase. Just rename the class to SampleTest.
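If renaming is not an option, Surefire's includes can be widened instead; a sketch, with the plugin version omitted and the patterns being assumptions for the layout described above (the patterns are matched against the compiled classes, so they also pick up Groovy tests):
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <configuration>
    <includes>
      <!-- keep the default *Test convention -->
      <include>**/*Test.java</include>
      <!-- and explicitly pick up the Groovy test class from the question -->
      <include>**/Sample.java</include>
    </includes>
  </configuration>
</plugin>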
We have an internal Artifactory repository. At the moment all snapshots are deployed there. We also want to have a different server with a web interface, and want to copy the created artifacts to it.
For our builds we use Hudson, but the post-build action "Deploy artifacts to Maven repository" together with scp doesn't work. So there is the question of doing it in some other elegant way. Why isn't Maven able to have several distribution repositories? Any ideas?
The nicest thing would be if Artifactory supported an (automatic!) incremental export to a standard Maven repository after each new deployment.
I don't think Maven supports deploying to multiple repositories for a single profile, but perhaps profiles could change the id and URL of the repository.
<distributionManagement>
<repository>
<id>${repo-id}</id>
<name>${repo-name}</name>
<url>${repo-url}</url>
</repository>
</distributionManagement>
Maven Deployment
Then use profiles to pick which repo to deploy to:
<profiles>
<profile>
<id>repo1</id>
<activation>
<activeByDefault>true</activeByDefault>
</activation>
<properties>
<repo-id>repo1</repo-id>
<repo-name>Repo1 Name </repo-name>
<repo-url>http://url.com/maven2</repo-url>
</properties>
</profile>
<profile>
<id>repo2</id>
<properties>
<repo-id>repo2</repo-id>
<repo-name>Repo2 Name </repo-name>
<repo-url>http://url2.com/maven2</repo-url>
</properties>
</profile>
</profiles>
Maven profiles
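With these profiles in place, a plain mvn deploy publishes to repo1 (the default), while mvn deploy -Prepo2 switches the deployment to the second repository.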
If you are willing to use a custom plugin, you can configure Maven to deploy to a list of "mirror" locations at the same time as the standard deployment. I'd recommend defining this in a profile so you can control what deployments are mirrored (it might not be appropriate to do this on every build).
To define a new plugin you need to create a new Maven project and specify in the POM that it has packaging maven-plugin:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>name.seller.rich</groupId>
<artifactId>maven-mirror-plugin</artifactId>
<packaging>maven-plugin</packaging>
<version>0.0.1</version>
<dependencies>
<dependency>
<groupId>org.apache.maven</groupId>
<artifactId>maven-plugin-api</artifactId>
<version>2.2.0</version>
</dependency>
<dependency>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-deploy-plugin</artifactId>
<version>2.4</version>
</dependency>
</dependencies>
</project>
In src/main/java define a Mojo. The code below declares a "mirror" goal; it takes a list of mirrorRepository items (each containing a repositoryId and url) to mirror the artifact deployment to. The plugin uses the same approach to deployment as the maven-deploy-plugin, and takes most of the same parameters.
Note that you still need to define a server in your settings.xml for each repository with appropriate permissions to do the deployment or the build will fail!
package name.seller.rich;
import java.io.File;
import java.util.Iterator;
import java.util.List;
import java.util.Map;
import org.apache.maven.artifact.Artifact;
import org.apache.maven.artifact.deployer.ArtifactDeployer;
import org.apache.maven.artifact.deployer.ArtifactDeploymentException;
import org.apache.maven.artifact.metadata.ArtifactMetadata;
import org.apache.maven.artifact.repository.ArtifactRepository;
import org.apache.maven.artifact.repository.ArtifactRepositoryFactory;
import org.apache.maven.artifact.repository.layout.ArtifactRepositoryLayout;
import org.apache.maven.plugin.AbstractMojo;
import org.apache.maven.plugin.MojoExecutionException;
import org.apache.maven.plugin.MojoFailureException;
import org.apache.maven.project.MavenProject;
import org.apache.maven.project.artifact.ProjectArtifactMetadata;
/**
* @goal mirror
* @phase deploy
*/
public class MirrorMojo extends AbstractMojo {
/**
* @parameter expression=
* "${component.org.apache.maven.artifact.deployer.ArtifactDeployer}"
* @required
* @readonly
*/
private ArtifactDeployer deployer;
/**
* Map that contains the layouts
*
* @component role=
* "org.apache.maven.artifact.repository.layout.ArtifactRepositoryLayout"
*/
private Map repositoryLayouts;
/**
* Component used to create a repository
*
* @component
*/
private ArtifactRepositoryFactory repositoryFactory;
/**
* The type of remote repository layout to deploy to. Try <i>legacy</i> for
* a Maven 1.x-style repository layout.
*
* @parameter expression="${repositoryLayout}" default-value="default"
* @required
*/
private String repositoryLayout;
/**
* Parameter used to update the metadata to make the artifact as release.
*
* @parameter expression="${updateReleaseInfo}" default-value="false"
*/
private boolean updateReleaseInfo;
/**
* Whether to deploy snapshots with a unique version or not.
*
* @parameter expression="${uniqueVersion}" default-value="true"
*/
private boolean uniqueVersion;
/**
* @parameter expression="${mirrorRepositories}"
* @required
*/
private MirrorRepository[] mirrorRepositories;
/**
* @parameter expression="${localRepository}"
* @required
* @readonly
*/
private ArtifactRepository localRepository;
/**
* @parameter expression="${project}"
* @required
* @readonly
*/
private MavenProject project;
/**
* Deploy all artifacts for the project to each mirror repository.
*/
public void execute() throws MojoExecutionException, MojoFailureException {
ArtifactRepositoryLayout layout;
layout = (ArtifactRepositoryLayout) repositoryLayouts
.get(repositoryLayout);
for (int i = 0; i < mirrorRepositories.length; i++) {
MirrorRepository mirrorRepository = mirrorRepositories[i];
ArtifactRepository deploymentRepository = repositoryFactory
.createDeploymentArtifactRepository(mirrorRepository
.getRepositoryId(), mirrorRepository.getUrl(),
layout, uniqueVersion);
String protocol = deploymentRepository.getProtocol();
if ("".equals(protocol) || protocol == null) {
throw new MojoExecutionException("No transfer protocol found.");
}
deployToRepository(deploymentRepository);
}
}
/**
* Deploy all artifacts to the passed repository.
*/
private void deployToRepository(ArtifactRepository repo)
throws MojoExecutionException {
String protocol = repo.getProtocol();
if (protocol.equalsIgnoreCase("scp")) {
File sshFile = new File(System.getProperty("user.home"), ".ssh");
if (!sshFile.exists()) {
sshFile.mkdirs();
}
}
File pomFile = project.getFile();
Artifact artifact = project.getArtifact();
// Deploy the POM
boolean isPomArtifact = "pom".equals(project.getPackaging());
if (!isPomArtifact) {
ArtifactMetadata metadata = new ProjectArtifactMetadata(artifact,
pomFile);
artifact.addMetadata(metadata);
}
if (updateReleaseInfo) {
artifact.setRelease(true);
}
try {
List attachedArtifacts = project.getAttachedArtifacts();
if (isPomArtifact) {
deployer.deploy(pomFile, artifact, repo, localRepository);
} else {
File file = artifact.getFile();
if (file != null && !file.isDirectory()) {
deployer.deploy(file, artifact, repo, localRepository);
} else if (!attachedArtifacts.isEmpty()) {
getLog()
.info(
"No primary artifact to deploy, deploy attached artifacts instead.");
} else {
String message = "The packaging for this project did not assign a file to the build artifact";
throw new MojoExecutionException(message);
}
}
for (Iterator i = attachedArtifacts.iterator(); i.hasNext();) {
Artifact attached = (Artifact) i.next();
deployer.deploy(attached.getFile(), attached, repo,
localRepository);
}
} catch (ArtifactDeploymentException e) {
throw new MojoExecutionException(e.getMessage(), e);
}
}
}
The mojo references a MirrorRepository type to encapsulate the repositoryId and url; it is a simple bean:
package name.seller.rich;
public class MirrorRepository {
private String repositoryId;
private String url;
public String getRepositoryId() {
return repositoryId;
}
public void setRepositoryId(String repositoryId) {
this.repositoryId = repositoryId;
}
public String getUrl() {
return url;
}
public void setUrl(String url) {
this.url = url;
}
}
Here's an example configuration using the plugin. Note all the deploy formats are supported (http, scp, ftp):
<plugin>
<groupId>name.seller.rich</groupId>
<artifactId>maven-mirror-plugin</artifactId>
<executions>
<execution>
<id>mirror</id>
<phase>deploy</phase>
<goals>
<goal>mirror</goal>
</goals>
</execution>
</executions>
<configuration>
<mirrorRepositories>
<mirrorRepository>
<repositoryId>mirror</repositoryId>
<url>http://path/to/mirror</url>
</mirrorRepository>
</mirrorRepositories>
<!--any other deploy configuration needed-->
</configuration>
</plugin>
Artifactory does have an automatic export feature. From the documentation:
You can automatically and periodically back up the whole Artifactory system.
The backup process creates a timestamped directory (or zip file) in the target backup dir, and is basically identical to running full system export with metadata. [...] Each backup can have its own schedule and excluded certain repositories [...]
The content of the backup (when extracted) is in standard Maven format and can be loaded into any external Maven repository
[...]
Artifactory supports backing up incrementally to the same target directory (named "current") in the target backup dir. This kind of backup is only writing deltas to the output dir, resulting in extremely fast backups.
Isn't that exactly what you need?
To transfer the files, you can either mount a shared directory to the remote server and do the backup there, or do the backup locally and then rsync it.
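For example (host and paths hypothetical): rsync -az /var/backup/artifactory/current/ deploy@mirror:/var/www/maven-repo/ after each backup run.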
I think Artifactory, by default, maintains different logical repositories for snapshots and non-snapshot (release) artifacts. Using permissions, you can make the snapshot repository visible only to some users.
If that is not sufficient, another solution that works with Artifactory 2.0 is to have Artifactory use a MySQL database that does asynchronous replication to another MySQL database, which in turn is being read by a separate installation of Artifactory. If that's too real time, you can simply have two different installations that do updates based on business rules.