Gradle Local Repository to share jars easily? - java

So we are considering Gradle instead of Maven. What I can't figure out is how to EASILY share built dependencies between completely separate projects at a local level, in a way that works anywhere.
So imagine I have 2 projects, DBService and Middleware. Middleware depends on DBService, but they are completely separate (not subprojects, not modules of a multi-module build, etc.).
I make changes to DBService, and in Maven I can push those changes to the local Maven repo (not a global one, as I need to test them first) using:
mvn clean install
I then start coding on Middleware and pick up the changes by running the same command as usual (nice!). The Maven POM requires nothing special at all (besides the standard dependency declaration). No path to a local directory containing the jar in a lib folder (nasty IMHO) or anything else. It just works (out of the box).
How can I achieve this very simple and very common scenario in Gradle? I know I can use a local Maven repo, but is this a first-class citizen in Gradle? Does it require special/ad-hoc, or even worse, environment-specific setup (UGHH) in Gradle?
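For reference, a rough sketch of how this workflow looks in Gradle, assuming a reasonably recent Gradle with the built-in maven-publish plugin (the coordinates below are placeholders):

// DBService/build.gradle
plugins {
    id 'java-library'
    id 'maven-publish'
}
group = 'com.example'            // placeholder coordinates
version = '1.0-SNAPSHOT'
publishing {
    publications {
        mavenJava(MavenPublication) {
            from components.java // publish the built jar plus a generated POM
        }
    }
}

Running ./gradlew publishToMavenLocal then plays the role of mvn clean install. The consuming project only needs mavenLocal() in its repositories:

// Middleware/build.gradle
repositories {
    mavenLocal()                 // resolve from ~/.m2/repository first
    mavenCentral()
}
dependencies {
    implementation 'com.example:DBService:1.0-SNAPSHOT'
}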

Related

Is there a way to list projects using a specific maven package?

Background:
I have multiple projects that use the same business logic, so I decided to split out the shared part and reference it from both projects.
Example:
Suppose there is an HR application and an Accounting application that require some shared business logic; in this case, let's say counting hours, which lives in a calculations class (let's call it calc).
Both applications (HR & Accounting) have their own repositories on GitLab, are deployed independently, and use the calc package, which is uploaded to a Nexus repository.
Question:
I would like to know if there is a way to find the projects that use the calc package, by its package name, across the repositories.
If you have the projects that you want checked out locally, you could generate the dependency tree for each project:
mvn dependency:tree
This gives you all the (transitive) dependencies used by the projects. If you do this for each project and then grep for your package name, it should find all projects that use it, or at least have a dependency on it.
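As a rough illustration, assuming the projects are checked out side by side and calc has the coordinates com.example:calc (both are placeholders here), a small shell loop does the job:

for project in hr-app accounting-app; do
  # print the full tree once per project and look for the shared artifact
  (cd "$project" && mvn dependency:tree) | grep -q "com.example:calc" \
    && echo "$project depends on calc"
done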
I am not sure how to calculate dependencies through package names. But:
If you have access to the source you can scan your projects for pom.xml and find the dependencies
If you have access to the runtime libraries you can scan these jars for /META-INF/maven/*/*/pom.xml and check the dependencies
Either way you should be able to find jar files making use of Calc.
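A hedged sketch of the second approach, scanning a directory of runtime jars for embedded POMs that mention the artifact (directory and artifactId are placeholders):

for jar in /path/to/libs/*.jar; do
  # a Maven-built jar carries its own pom.xml under META-INF/maven
  if unzip -p "$jar" 'META-INF/maven/*/*/pom.xml' 2>/dev/null | grep -q '<artifactId>calc</artifactId>'; then
    echo "$jar references (or is) calc"
  fi
done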
There is no way to ask Calc where it got used from.

How to add 70 local jars to a Maven project?

Why use Maven when you have such a quantity of local jars?
So we have a client that has a lot of private and custom jars.
For example commons-langMyCompanyCustom.jar, which is commons-lang.jar with 10 more classes in it.
So in their environment we use 100% Maven without local dependencies.
But on our site we have the jars for development in Eclipse, and a Maven build with the public ones, but we do not have permission to add their jars to our organizational repository.
So we want to use the Maven good things - compile, test, build an uber-jar, add static code analysis, generate javadoc and sources jars, etc. - and not do these things one by one with the help of Eclipse.
So we have 70 jars, some of them public. If I get the effective POM in their environment, I can find 50 of them in Maven Central, but the other 20 are what I call "custom" jars. I searched for a solution, of course, but only found this:
<dependency>
<groupId>sample</groupId>
<artifactId>com.sample</artifactId>
<version>1.0</version>
<scope>system</scope>
<systemPath>${project.basedir}/src/main/resources/yourJar.jar</systemPath>
</dependency>
So for all 20 of them I have to add this in the development Maven profile?
Is there an easy way, like in Gradle, where you can add a whole folder of jars to the existing dependencies?
Also, installing them one by one into every developer's repo is not acceptable.
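For comparison, the Gradle idiom the question alludes to is roughly this (assuming the jars sit in a libs folder next to the build script):

dependencies {
    implementation fileTree(dir: 'libs', include: '*.jar')   // pulls in every jar in the folder
}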
Please forget the system scope as mentioned before! Too problematic...
Ideally:
Ideally, all your developers have access to a repository manager in your or their organization (if possible).
A central environment for your System Integration Testing, maybe?
Alternatively, you may have a central environment for testing where all the dependencies are provided. This can be used to simulate how the compilation would work as if it were in your client's environment. Plus, you only set up the jars once.
So on their environment we use 100% Maven without local dependencies. But on our site we have the jars for development in Eclipse and have Maven build with the public ones, but we do not have permission to add their jars in our organizational repository.
According to what you're saying in the above-quoted excerpt, I believe you want to have the provided scope set in your build's pom.xml, assuming that in the client setup the dependencies will be present.
Especially as you indicate that the organization doesn't give you permission to add their jars to your repository, I would use the provided scope.
As stated in the Maven docs, the definition of a provided dependency is as follows:
This is much like compile, but indicates you expect the JDK or a container to provide the dependency at runtime. For example, when building a web application for the Java Enterprise Edition, you would set the dependency on the Servlet API and related Java EE APIs to scope provided because the web container provides those classes. This scope is only available on the compilation and test classpath, and is not transitive.
So basically you assume that these dependencies will be present in your client's setup. However, this has some limitations: you can build the solution independently, but you cannot test it locally because you won't have the dependencies on your workstation.
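A minimal sketch of such a declaration, using one of the custom jars mentioned above as an example (the groupId and version are assumptions):

<dependency>
  <groupId>com.client.custom</groupId>
  <artifactId>commons-langMyCompanyCustom</artifactId>
  <version>1.0</version>
  <scope>provided</scope>
</dependency>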
If you won't even have access to the jars to configure your central environment, ask if your client can provide a DEV/SIT environment.
None of the above? Inherit a parent pom.
To avoid the constant copy-paste process for every single (related) project, Maven has tools to centralize dependency and plugin configuration; one of them is inheriting the configuration of a parent POM. As explained in the documentation, it is quite simple:
First you create a project with just a pom.xml in which you define everything you wish to centralize (watch out, certain items have slight differences in their constructs);
Use pom as the value of the packaging tag: <packaging>pom</packaging>;
In the POMs that have to inherit these configurations, set the parent configuration tags in <parent> ... </parent> (the documentation is very clear about this);
Now every time you update any "global" POM configuration, only the parent version has to be updated in every project. As a result, you only need to configure everything once.
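A bare-bones sketch of the two sides of that relationship (coordinates are placeholders):

<!-- parent project: a pom.xml containing nothing but centralized configuration -->
<groupId>com.example</groupId>
<artifactId>company-parent</artifactId>
<version>1.0.0</version>
<packaging>pom</packaging>
<dependencyManagement>
  <dependencies>
    <!-- shared dependency versions go here -->
  </dependencies>
</dependencyManagement>

<!-- each inheriting project: -->
<parent>
  <groupId>com.example</groupId>
  <artifactId>company-parent</artifactId>
  <version>1.0.0</version>
</parent>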
You can also combine this with the above-mentioned solutions to find the mix that best fits your needs.
But there is a big Maven world out there, so I advise a good read of its docs to get a fuller picture of your possibilities. I remembered these situations because I've been in a situation similar to the one you seem to be in now.
Good luck!
Another alternative is the project RepoTree.
This one creates a Maven repository directory (not a server) from another directory which contains just the .jars. In other words, it creates the necessary .pom files and directory structure. It takes into account only the precise information from metadata contained in the archives (MANIFEST.MF, pom.xml).
Utility to recursively install artifacts from a directory into a local
Maven repository Based on Aether 1.7
This is 5 years old, but still should work fine.
TL;DR: MavenHoe creates a Maven repository server (not a directory) which serves the artefacts from a directory, guessing what you ask for if needed. The purpose is to avoid complicated version synchronization - it simply takes whatever is closest to the requested G:A:V.
I have moved the MavenHoe project, which almost got lost with the decline of Google Code, to GitHub. Therefore I put it here, in the form of a full answer, for availability:
One of the options you have when dealing with conditions like that is to take whatever comes, in the form of a directory of .jars, and treat it as a repository.
Some time ago I wrote a tool for that purpose. My situation was that we were building JBoss EAP and recompiled every single dependency.
That resulted in thousands of .jars which were most often the same as their Central counterpart (plus security and bug fixes).
I needed the tests to run against these artifacts rather than the Central ones. However, the Maven coordinates were the same.
Therefore, I wrote this "Maven repository/proxy" which provided the artifact if it found something that could be it, and if not, it proxied the request to Central.
It can derive the G:A:V from three sources:
MANIFEST.MF
META-INF/.../pom.xml
Location of the file in the directory, in combination with a configuration file like this:
jboss-managed.jar org/jboss/man/ jboss-managed 2.1.0.SP1 jboss-managed-2.1.0.SP1.jar
getopt.jar gnu-getopt/ getopt 1.0.12-brew getopt-1.0.12-brew.jar
jboss-kernel.jar org/jboss/microcontainer/ jboss-kernel 2.0.6.GA jboss-kernel-2.0.6.GA.jar
jboss-logging-spi.jar org/jboss/logging/ jboss-logging-spi 2.1.0.GA jboss-logging-spi-2.1.0.GA.jar
...
The first column is the file name in the .zip; then the groupId (with either slashes or dots), artifactId, version, and artifact file name, respectively.
Your 70 files would be listed in this file.
See more information at this page:
https://rawgit.com/OndraZizka/MavenHoe/master/docs/README.html
The project is available here.
Feel free to fork and push further, if you don't find anything better.

Maven, m2e, not building local dependencies

We have a setup of Maven modules like this:
Parent
common
webapp
where webapp has a declared dependency on common.
In our Eclipse environment, using m2e, we made a change in common, then ran maven package on webapp, deployed it and tested the web service. There was now a fault indicating that the war we had built did not include the latest changes in common.
We are trying to figure out the best way to use Maven in our day-to-day development. So what's the best practice for handling situations like this?
The way to solve this would have been to make the changes in common, save them, run maven install, make the changes in webapp, run maven package, and use the war file for the tests. But that is a lot of manual steps.
I guess another way would be to run maven package on the parent, but as the parent grows with more modules this will take longer and longer.
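As a side note, a sketch of how those manual steps can be shortened with Maven's reactor options (module names taken from the layout above), run from the parent directory:

mvn -pl webapp -am clean package    # -pl limits the build to webapp, -am also builds its in-reactor dependencies such as common

This builds common and webapp together without building any unrelated modules.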
How do you expect changes to common to be available unless you package and deploy it separately? Maven is not magic and cannot make magic things happen. The solution in this case is either to manually deploy common or, as you suggest, to package and deploy the parent.
Parent 1.0-SNAPSHOT
+common 1.0-SNAPSHOT
+webapp 1.0-SNAPSHOT
You must use a snapshot version in the parent and reuse the same version in both modules.
You must import every module into Eclipse.
You must run mvn clean install from the parent.
If you do all this, Eclipse will recognize the dependency from the 'webapp' project to the 'common' project. Eclipse puts the 'common' project on the build path of the 'webapp' project, so any change in 'common' is seen by 'webapp'.

Using cached artifacts in Maven to avoid redundant builds?

I have a Maven 3 multi-module project (~50 modules) which is stored in Git. Multiple developers are working on this code and building it, and we also have automated build machines that run cold builds on every push.
Most individual changesets alter code in a fairly small number of modules, so it's a waste of time to rebuild the entire source tree with every change. However, I still want the final result of running the parent project build to be the same as if it had built the entire codebase. And I don't want to start manually versioning modules, as this would become a nightmare of criss-crossing version updates.
What I would like to do is add a plugin which intercepts some step in build or install, takes a hash of the module contents (ideally pulled from Git), and then looks in a shared binary repository for an artifact stored under that hash. If one is found, it uses that artifact and doesn't even execute the full build. If it finds nothing in the cache, it performs the build as normal, then stores its artifact in the cache. It would also be good to rebuild any modules whose dependencies (direct or transitive) themselves had a cache miss.
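For illustration only, one way such a per-module content hash could be pulled from Git is the tree hash of the module directory (the path is a placeholder):

git rev-parse HEAD:path/to/module    # object id of the directory's content at HEAD; changes only when the module's files change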
Is there anything out there which does anything like this already? If not, what would be the cleanest way to go about adding it to Maven? It seems like plugins might be able to accomplish it, but for a couple pieces I'm having trouble finding the right way to attach to Maven. Specifically:
How can you intercept the "install" goal to check the cache, and only invoke the module's 'native' install goal on a cache miss?
How should a plugin pass state from one module to another regarding which cache misses have occurred in order to force rebuilds of dependencies with changes?
I'm also open to completely different ways to achieve the same end result (fewer redundant builds) although the more drastic the solution the less value it has for me in the near term.
I have previously implemented a more complicated solution involving artifact version manipulation and deployment to a private Maven repository. However, I think the following will fit your needs better and is somewhat simpler:
Split your build into multiple builds (e.g., a single build per module using Maven's -pl argument).
Setup parent-child relationships between these builds. (Bamboo even has additional support for figuring out Maven dependencies, but I'm not sure how it works.)
Configure Maven's settings.xml to use a different local repository location - specify a new directory inside your build working directory (see the sketch after this list). See docs: https://maven.apache.org/guides/mini/guide-configuring-maven.html
Use the mvn install goal to ensure newly built artifacts are added to that local repository.
Use Bamboo artifact sharing to expose the built artifacts from the local repository - you should probably filter this to include only the package(s) you're interested in.
Set dependent builds to download all artifacts from parent builds and put them into the proper subdirectory of the local repository (which is customized to be in the working directory).
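A minimal sketch for the settings.xml step above (the path is an assumption; point it inside the Bamboo working directory):

<settings>
  <!-- relocate the local repository into the build's working directory -->
  <localRepository>/path/to/build-working-dir/local-repo</localRepository>
</settings>

The same thing can also be done per invocation with mvn -Dmaven.repo.local=/path/to/build-working-dir/local-repo.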
This should even work for feature branch builds thanks to the way Bamboo handles parent-child relations for branch builds.
Note that this implies Maven will re-download all other dependencies, so you should use a private proxy Maven repository on the local network, such as Artifactory or Nexus.
If you want, I can also describe the more complicated scenario I've already implemented that involves modifying artifact versions and deploying to private Maven repository.
The Jenkins plugin allows you to manage/minimize dependent builds
whenever a SNAPSHOT dependency is built (determined by Maven)
after other projects are built (manually via Jenkins jobs)
And if you do a 'mvn deploy' to save the build into your corporate Maven repo, then you don't have to worry about dependencies when builds run on Jenkins slave machines. The result is that no module is ever built unless it or one of its dependencies has changed.
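For deploy to have somewhere to go, the corporate repository has to be declared in the POM's distributionManagement; a hedged sketch with placeholder ids and URLs:

<distributionManagement>
  <repository>
    <id>corporate-releases</id>
    <url>https://nexus.example.com/repository/releases</url>
  </repository>
  <snapshotRepository>
    <id>corporate-snapshots</id>
    <url>https://nexus.example.com/repository/snapshots</url>
  </snapshotRepository>
</distributionManagement>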
Hopefully you can apply these principles to a solution with Bamboo.

Maven compile dependency instead of taking it out of the local repo

I am sorry, I don't know Maven well enough for the complex environment I am currently working in (1k+ applications, most of them Java EE). I'll still give it a try and describe what I want to achieve:
0.) There is a company framework that abstracts the Java EE world a bit and is used in all the Java EE components.
1.) I checked out the Maven project of the Java EE component I am working on.
During the build it downloads the dependencies on other components from the company's repository and stores them in my local repo for compilation. So I can see the jar files of the company's framework inside my local repo.
2.) I now want to change some of the framework's functionality for a local test, so I checked out its sources from another SVN repository. I made the changes and built that framework component with Maven ("clean install").
3.) I rebuilt the component I am working on as well.
Inside Eclipse I can now click on a method of one of the framework's classes and it opens the corresponding source. But this only happens because the local repo also contains source jars for every dependency. So in my editor I can see that this source comes from the jar of the framework in my local repo, and I can't change anything.
Could someone please give me a hint how I can achieve the following:
I can make changes to the framework (and build the framework's jars with "clean install").
I can build my component and have it use the above compiled framework jars rather than the "old" ones from the local repo.
I will now start to read the entire Maven documentation, each and every section (I have been trying to understand dependency management for a year and still don't get it), but I would really appreciate it if you could help me out a bit here.
I don't know how Eclipse manages Maven dependencies, but
in IntelliJ IDEA this is simple - if a Maven dependency is present as a project in the workspace, IDEA uses it instead of the dependency from the local repo.
So if you want to edit the framework source code and use the changes immediately, I think the framework should be in your Eclipse workspace, and your module in Eclipse should reference the framework artifacts directly - not via the Maven dependency mechanism.
I think this is the responsibility of the Eclipse Maven plugin. Do you have any Maven plugin installed for Eclipse (M2Eclipse, for example)?
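If you prefer to stay on the command line instead of relying on the IDE, a hedged sketch of an alternative snapshot round-trip (the version and directory names are placeholders):

cd framework
mvn versions:set -DnewVersion=1.2.3-LOCAL-SNAPSHOT   # or edit the version in the framework POM by hand
mvn clean install                                     # puts the rebuilt framework jars into the local repo

Then point the framework dependency in your component's pom.xml at 1.2.3-LOCAL-SNAPSHOT and run mvn clean package; the component will compile against your locally built jars instead of the released ones.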
