Introduction
I am writing a custom plugin for Gradle to allow users to publish artifacts generated by their builds to a corporate Maven repository. I want users to be able to use the standard maven-publish plugin to upload their artifacts, but without them knowing the repository's details, i.e. authentication.
Users may run local builds, but they cannot publish anything to the repository; the same user-provided build scripts, when running on a build server, will be able to publish. Users may download the plugin and use it locally, but it won't work there: it will either fail or do a mock no-op upload to a local/dummy repository (the exact behavior is not important right now). When running on the build server, the plugin will just "know it" and work properly.
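The "will just know it" check can be sketched without any Gradle API at all; the environment-variable name BUILD_SERVER_HOME below is an invented placeholder for whatever marker your build server actually exports (Jenkins, for example, exports BUILD_URL):

```java
import java.util.Collections;
import java.util.Map;

// Minimal sketch of build-server detection; BUILD_SERVER_HOME is an assumed
// marker variable, not a real convention.
public final class BuildEnvironment {

    private BuildEnvironment() { }

    // Testable core: the environment is passed in as a parameter.
    static boolean isBuildServer(Map<String, String> env) {
        return env.containsKey("BUILD_SERVER_HOME");
    }

    // What the plugin's apply() could call.
    public static boolean isBuildServer() {
        return isBuildServer(System.getenv());
    }

    public static void main(String[] args) {
        System.out.println(isBuildServer(Collections.singletonMap("BUILD_SERVER_HOME", "/opt/ci")));
        System.out.println(isBuildServer(Collections.<String, String>emptyMap()));
    }
}
```

The plugin can then branch on this result: configure the real repository on the server, and the mock/no-op behavior everywhere else.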
As I said, I want users to use the standard maven-publish plugin. I would like something like this for the users' build.gradle:
repositories {
    mavenCentral()
    Organization.mavenCorporate() /* this would be it */
}
(The Organization part is not required; it's just how I have seen Gradle's project extensions work, and extensions seem to be currently favored over conventions.)
Current status
So far I have been able to get the above piece of the script working to some extent. When I run a build which uses my plugin, it does:
not fail at syntax check/compile time, so I know Gradle understands this piece of code.
print some info when the plugin is applied, so I know Gradle is really using it.
print my (currently hard-coded) repository's name when I do println Organization.mavenCorporate().name inside the build script, so I know the extension is working.
print my repository's name when I do println project.repositories[0].name inside the build script, so I know the project knows the repository. The name is the same as the previous print and contains a System.nanoTime() to be sure it is not being generated twice. Also, if I print the default object's toString() instead of its name, both identifiers are equal.
However, when the build tries to resolve actual dependencies, it fails with the following message:
Execution failed for task ':myproject:compileJava'.
> Could not resolve all dependencies for configuration ':myproject:compileClasspath'.
> Cannot resolve external dependency org.apache.logging.log4j:log4j-api:2.3 because no repositories are defined.
Required by:
project :myproject
> Cannot resolve external dependency org.apache.logging.log4j:log4j-web:2.3 because no repositories are defined.
Required by:
project :myproject
[...]
What bothers me is the 'because no repositories are defined' part. I would expect it to fail for a lot of different reasons, but not this one. When I add another repository which does not contain the needed dependencies, it fails with another kind of error.
The code
My plugin is written in Java. I have seen a plugin from the Gradle Plugin Portal which has a very similar purpose, but it's written in Groovy. It simply adds some methods to project.repositories.metaClass. I don't know Groovy, but that looks like a quick-and-dirty solution to avoid dealing with Gradle's API in a proper way.
As of now, I have the following:
The main plugin class:
public class MyPlugin implements Plugin<Project> {
    @Override
    public void apply(Project project) {
        _printVersionInfo(project);
        // this allows the use of Organization methods as Gradle DSL
        project.getExtensions().create("Organization", Organization.class, project);
        // this should add the repo to project's RepositoryManager
        project.getRepositories().add(CustomRepository.getInstance(project));
    }
    [unrelated methods omitted]
}
The custom repository class:
public class CustomRepository implements MavenArtifactRepository {
    private static MavenArtifactRepository repo;

    public static MavenArtifactRepository getInstance(Project project) {
        if (repo == null) {
            DefaultRepositoryHandler drh = (DefaultRepositoryHandler) project.getRepositories();
            repo = drh.maven(new CustomRepoAction());
        }
        return repo;
    }
    [implemented methods from MavenArtifactRepository omitted for brevity; for debugging purposes, every method except getName() and getUrl() currently throws an UnsupportedOperationException]
}
The extension Organization class. Contains only the mavenCorporate() DSL method, which returns a MavenArtifactRepository obtained with the same call to CustomRepository.getInstance(project) as in the main plugin class.
The CustomRepoAction class, which implements Action<MavenArtifactRepository>. It contains only the implementation of the execute method, which sets the name and URL of the MavenArtifactRepository passed as the parameter.
I am using Java 8 and Gradle 3.4. I am relatively new to Gradle as the user, and not really familiar with the new features from Java 7 and 8.
The question
Am I missing something? Why doesn't Gradle recognize the MavenArtifactRepository returned by Organization.mavenCorporate() as a repository? Do I need to bind the repository to the project, the repository handler, or some kind of manager object that I am not aware of?
EDIT: Thanks to @lukegv's comment, I now realize I cannot force my custom DSL into DefaultRepositoryHandler. I tried something like this:
repositories {
    mavenCentral()
    maven Organization.mavenCorporateAction() /* instance of CustomRepoAction */
}
and it works: it adds a MavenArtifactRepository to the project's repositories. But that doesn't keep users from accessing the repository's internals, e.g. the password. However, RepositoryHandler.maven(Action<? super MavenArtifactRepository> action) doesn't allow me to add a subclass of MavenArtifactRepository in which I could control this behaviour.
Do I need to implement a RepositoryHandler and make it the default repository handler for projects? Is that even possible without implementing my own custom Project class and messing with Gradle's internals? Is there a clean way of achieving this:
I want users to be able to use the standard maven-publish plugin to
upload their artifacts, but without them knowing the repository's
details, i.e. authentication.
with decoration, or anything implying minimal or no changes to the default behavior of everything else?
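One approach that stays close to "minimal or no changes to the default behavior" is to stop subclassing MavenArtifactRepository altogether and let the plugin's apply() register the repository through the standard RepositoryHandler.maven(Action), attaching credentials only when the build-server check passes. This is a hedged sketch against the Gradle 3.x API; the URL and environment-variable names are placeholders, not real values:

```java
import org.gradle.api.Plugin;
import org.gradle.api.Project;

public class CorporateRepoPlugin implements Plugin<Project> {
    @Override
    public void apply(Project project) {
        // Let Gradle create its own (decorated) MavenArtifactRepository
        // instance instead of adding a hand-rolled implementation.
        project.getRepositories().maven(repo -> {
            repo.setName("corporate");
            repo.setUrl("https://repo.example.org/releases"); // placeholder URL
            if (System.getenv("BUILD_SERVER_HOME") != null) { // assumed CI marker
                repo.credentials(creds -> {
                    creds.setUsername(System.getenv("CORP_REPO_USER")); // assumed name
                    creds.setPassword(System.getenv("CORP_REPO_PASS")); // assumed name
                });
            }
        });
    }
}
```

Because the credentials only ever live in the build server's environment, user build scripts never see them; locally the repository simply has no credentials, which matches the desired local behavior.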
Related
I have two versions of the same Java class (same name / methods). Since it's Java, both .java files have the same name. I want to configure gradle in such a way that I can build a "debug" version of my application that pulls in one of these files, and a "production" version of my application that pulls in the other one. How would I go about doing this?
This class has only static methods. I don't ever want to make an instance of this class. I additionally don't want to add the overhead of an if statement in each of the methods on this class to check which version I'm in.
Following @JFabianMeier's answer, you could use 4 projects:
1. one with the production version of the class
2. one with the debug version of the class
3. one with the code that uses either of the two, parameterized according to Migrating Maven profiles ... → Example 6. Mimicking the behavior of Maven profiles in Gradle. (I'm also a Maven guy and therefore can't tell you exactly how to do it in Gradle.)
4. a multi-project build with 1./2./3. as subprojects for building all of them in one go, maybe parameterized to build just 1.+3. or 2.+3.
Have you tried creating production and debug source directories/sets? According to the docs you can use multiple directories when specifying source directories in your source set. Try dropping the different versions of your class in their respective production/debug source directories.
I haven't tested this myself (I'm not sure about the syntax), but this is based on Gradle's Java compilation documentation.
sourceSets {
    // Could also name this production
    main {
        java {
            srcDirs = ['src/main/java', 'src/prod/java']
        }
    }
    debug {
        java {
            srcDirs = ['src/main/java', 'src/debug/java']
        }
    }
}
You could do the following:
Put the class into a separate project (so generate a separate jar from it)
Then you can have two different jars, for production and debugging
Then you can pull in either one jar or the other in Gradle, depending on a parameter.
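The "depending on a parameter" step could then look like this in the consuming project's build.gradle (a sketch only; the subproject paths and the debugBuild property name are invented):

```groovy
dependencies {
    // Invoke as: gradle build -PdebugBuild
    if (project.hasProperty('debugBuild')) {
        compile project(':debug-impl')   // Gradle 3.x configuration name
    } else {
        compile project(':prod-impl')
    }
}
```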
Alternatively, you could look into template engines like Velocity which allow you to generate source code during the build depending on variables and then compile it.
Android has a neat feature called Product Flavors. It lets you swap classes at compile time effortlessly and keep your project clean.
This post is very good to get a taste of it: https://android-developers.googleblog.com/2015/12/leveraging-product-flavors-in-android.html
And here is the full documentation: https://developer.android.com/studio/build/build-variants#product-flavors
Problem
In Java, I have a "Util" project that uses a "Mock" project for its unit tests.
My problem is that the "Mock" project also uses the "Util" project to build some of the mock objects.
When I use Maven to build my projects, I can't build either one: each project is missing the jar from the other.
Example
As you can see in the example below, it makes sense that both projects need each other, and each piece of code is located in the right project: what is "Mock" is in the "Mock" project, and what is "Util" is in the "Util" project.
public class TestProjectUtil
{
    @Test
    public void myMethod()
    {
        //some code
        GeneratedEntity obj = ProjectMockUtil.generateEntity();
    }
}
public class ProjectMockUtil
{
    public static EntityObj generateEntity()
    {
        //Some code
        EntityObj obj = new EntityObj();
        MethodList names = ProjectUtil.Reflection.getMethodList(obj);
        //Some code
        return obj;
    }
}
Question
How should you deal with this type of situation? I tried to force Maven to build my project anyway and ignore the failure, but as soon as one class fails to compile, the generated jar does not include any classes at all, so the jar is empty.
I also do not believe that refactoring is ultimately needed in my case: the different classes are in the right projects, and I do not want to duplicate my code just to have the same class in both projects to satisfy Maven and make it work.
What might be the best approach?
Option 1
One way to do it is to build the first project as a jar WITHOUT MAVEN; in this case your jar is usable by the second project when you run Maven for the first time (the jar will need to be added as a dependency in the POM).
After that you can build the first project normally with Maven, then rebuild the second project again, but this time change the jar reference in the POM to use the dependency from the local repository (the one you just built from the first project).
For all future builds, always use Maven as before; it will work properly.
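This step maps to Maven's install:install-file goal plus a normal dependency entry; a sketch, where the coordinates and file name are placeholders:

```xml
<!-- mock/pom.xml: after installing the hand-built util jar into the local
     repository with
       mvn install:install-file -Dfile=util-bootstrap.jar \
           -DgroupId=dev.helper -DartifactId=util -Dversion=1.0 -Dpackaging=jar
     the dependency can be declared normally: -->
<dependency>
  <groupId>dev.helper</groupId>
  <artifactId>util</artifactId>
  <version>1.0</version>
</dependency>
```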
Option 2
Another way is to merge the 2 projects together. It's not always logical to do so, but in my case it could make sense to merge the two util projects and create a separation via package names, for instance the first project under dev.helper.helperjse and the second project under dev.helper.helpermock.
In this case we don't have the issue with the circular reference, since within a single project circular references are accepted and normal.
Option 3
Another way is to change the arguments of the Maven compiler plugin to force the compiled .class files to be added to the jar and to not fail the build on error. (I have not yet found what those arguments are; I'd be happy if someone knows.)
First of all sorry if I've missed this in the documentation somewhere.
I'm trying to write a Gradle plugin in Java.
More specifically I need to do this:
public class MyPlugin implements Plugin<Project>{
    @Override
    public void apply(Project project) {
        project.getSourceDir();
    }
}
In other words, how can I, if possible, get the directory where the sources of the project my plugin is run over reside?
Am I correct in my assumption that when the apply method gets called Gradle will already know where the sources are and this location can not change further on?
Thanks.
Yes, because the plugin is applied to a Project, as your parameter shows. If you "apply" it to two projects, Gradle creates two instances of the plugin and passes each project object to its apply method.
Look through the Gradle user guide.
A project has a list of "sourceSet" objects, which you get to with "project.sourceSets". Each sourceSet has a set of properties.
It's good to browse through the DSL reference to see what properties are available. Look for "SourceSet".
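For Gradle 3.x, the source directories hang off the java plugin's convention object rather than directly off Project; here is a sketch of reading them from a plugin (it assumes the java plugin is applied before yours):

```java
import java.io.File;
import java.util.Set;

import org.gradle.api.Plugin;
import org.gradle.api.Project;
import org.gradle.api.plugins.JavaPluginConvention;

public class SourceDirPlugin implements Plugin<Project> {
    @Override
    public void apply(Project project) {
        // Throws if the java plugin is not applied; apply it first, or guard
        // with project.getPlugins().withType(JavaPlugin.class, ...).
        JavaPluginConvention java =
                project.getConvention().getPlugin(JavaPluginConvention.class);
        // "main" is the conventional production source set.
        Set<File> srcDirs =
                java.getSourceSets().getByName("main").getJava().getSrcDirs();
        srcDirs.forEach(dir -> project.getLogger().lifecycle("source dir: " + dir));
    }
}
```

Note that these directories can still be changed later, because build-script configuration runs after apply(); if you need the final values, read them lazily (e.g. in project.afterEvaluate or at task execution time).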
I am writing a Maven plugin and I would like to automatically resolve specific dependencies and add them as dependencies to the project based on the parameters given to the plugin.
I have been able to successfully resolve dependencies through aether, but there seems to be a disconnect between aether and the MavenProject.
There is a method, MavenProject#addAttachedArtifact, which I'm guessing is what I want to use. However, it takes an org.apache.maven.artifact.Artifact, while the one retrieved from Aether is an org.sonatype.aether.artifact.Artifact. I found a plugin that has a conversion method between the two, but I figure there ought to be a more standard approach.
I have also tried using DefaultArtifactFactory to create an org.apache.maven.artifact.Artifact, but I get a NullPointerException when it tries to get an ArtifactHandler.
code:
DefaultArtifactFactory factory = new DefaultArtifactFactory();
Artifact mavenArtifact = factory.createBuildArtifact("com.beust", "jcommander", "1.27", "jar");
result:
Caused by: java.lang.NullPointerException
at org.apache.maven.artifact.factory.DefaultArtifactFactory.createArtifact(DefaultArtifactFactory.java:155)
at org.apache.maven.artifact.factory.DefaultArtifactFactory.createArtifact(DefaultArtifactFactory.java:117)
at org.apache.maven.artifact.factory.DefaultArtifactFactory.createArtifact(DefaultArtifactFactory.java:111)
at org.apache.maven.artifact.factory.DefaultArtifactFactory.createBuildArtifact(DefaultArtifactFactory.java:75)
at com.foo.bar.module.IncludeModuleFrontEndMojo.execute(IncludeModuleFrontEndMojo.java:165)
at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:101)
... 20 more
So really, these are the things I've tried, a resolution to these issues would be great, but I'm really after the right way to do this. Any ideas?
UPDATE
I wrote my own conversion method between the two classes:
private static org.apache.maven.artifact.Artifact aetherToMavenArtifactBasic(Artifact artifact, String scope, ArtifactHandler artifactHandler) {
DefaultArtifact mavenArtifact = new DefaultArtifact(artifact.getGroupId(), artifact.getArtifactId(), artifact.getVersion(), scope, artifact.getExtension(), artifact.getClassifier(), artifactHandler);
mavenArtifact.setFile(artifact.getFile());
mavenArtifact.setResolved(true);
return mavenArtifact;
}
and found that the MavenProject#addAttachedArtifact method is for attaching an artifact to an existing artifact (i.e. attaching sources/javadoc jars to an artifact), which is not my goal. Instead I got the artifacts from the Maven project and added my artifact:
project.getArtifacts().add(mavenArtifact);
which adds my artifact to the project (my artifact is then shown when I call the project's getArtifactMap() and getCompileClasspathElements()). However, this change does not persist. This is the problem I was really worried about. So the question has evolved into:
Can I make changes to the MavenProject and have it persist?
I don't think this is possible, and for my purposes I decided instead to require the user to add the dependency in the project's pom file (and to error out if they don't have it).
It seems to be by design that a plugin isn't allowed to muck with the project configuration to the point where it could break the build. I found a good post on advanced mojo development here. A quote from it:
If this parameter could be specified separately from the main
dependencies section, users could easily break their builds –
particularly if the mojo in question compiled project source code. In
this case, direct configuration could result in a dependency being
present for compilation, but being unavailable for testing. Therefore,
the @readonly annotation functions to force users to configure the
POM, rather than configuring a specific plugin only.
I am using Maven with Eclipse (using M2E) to build a project that relies on java.util.ServiceLoader to dynamically load some factory classes. It works fine when I run it in Maven, but when I run a test using the inbuilt Eclipse JUnit4 Runner, it fails to pick up the services and the tests do not succeed.
Is there something I need to do to manually add META-INF/services to the JUnit build path? I couldn't make it work in either src/main/resources/META-INF/services or src/main/java/META-INF/services. Is it related to the way M2E sets up the build path? I set up the test in a completely new project and it still failed.
The basic code that is failing is:
public class TestServiceLoader<S>
{
    public TestServiceLoader(final Class<S> serviceClass)
    {
        ServiceLoader<S> serviceLoader =
            java.util.ServiceLoader.load(serviceClass, serviceClass.getClassLoader());
        Iterator<S> services = serviceLoader.iterator();
        if (!services.hasNext())
        {
            throw new RuntimeException("Failed to get any services for this class");
        }
    }
}
The test looks like:
@Test
public final void testTestServiceLoader()
{
    TestServiceLoader<TestFactory> serviceLoader = new TestServiceLoader<TestFactory>(TestFactory.class);
}
TestFactory is defined as:
public interface TestFactory
{
}
TestFactoryImpl is defined as:
public class TestFactoryImpl implements TestFactory
{
}
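For reference, the wiring ServiceLoader looks for is a provider-configuration file on the classpath whose name is the interface's fully qualified name and whose content lists implementation classes, one per line (the package name com.example below is assumed):

```text
# src/main/resources/META-INF/services/com.example.TestFactory
com.example.TestFactoryImpl
```

When the resources folder is on the runtime classpath, serviceLoader.iterator() will yield a TestFactoryImpl instance; the Eclipse failure described below suggests exactly this folder is missing from the JUnit run's classpath.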
I was originally using the MetaInfServices generator from http://weblogs.java.net/blog/kohsuke/archive/2009/03/my_project_of_t.html but when I removed that and created the files by hand, it still failed in the same way in Eclipse while succeeding when run using the Maven Surefire plugin.
The M2E developers appear to believe that resource folders do not need to be on the build path, even in a case like META-INF/services/, where a resource is a functional part of the application:
"Actually project resource folder doesn’t really need to be added to the buildpath (Maven Builder is going to work without it), but it been considered convenient and look better in the Package Explorer and other Eclipse views."
M2E FAQ
If you want to hack around it, you can apparently encode a special M2E profile into your pom.xml files to make the M2E incremental build filter and copy resources; see the Sonatype M2Eclipse Wiki.