I have a problem with a Maven project in NetBeans. I am new to Maven, so maybe this is a general question. The problem is that every time I run the main class file (Shift+F6), the console shows Maven downloading from the repository. This slows me down, because even if I only change one line I have to wait for Maven to re-download every dependency. Is it possible to run the file without Maven re-downloading all dependencies?
This is not the standard behaviour of Maven. First, check whether Maven is really downloading the jar files and not only metadata or poms. If it is, there are several possible reasons:
You activated update-snapshots in your build (-U). This downloads all SNAPSHOT dependencies in every build (if they changed).
Your update interval for SNAPSHOTs is set to a very low value (the default, which is fine for most purposes, is one day). Same effect as above.
Your local repository (~/.m2/repository) is broken. Delete it and try again.
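If the second cause applies, the snapshot update interval can be relaxed per repository in settings.xml. A minimal sketch (the profile id is made up; the repository shown is Maven Central):

```xml
<settings>
  <profiles>
    <profile>
      <id>relaxed-snapshots</id>
      <repositories>
        <repository>
          <id>central</id>
          <url>https://repo.maven.apache.org/maven2</url>
          <snapshots>
            <enabled>true</enabled>
            <!-- check for new SNAPSHOTs at most once a day (the default) -->
            <updatePolicy>daily</updatePolicy>
          </snapshots>
        </repository>
      </repositories>
    </profile>
  </profiles>
  <activeProfiles>
    <activeProfile>relaxed-snapshots</activeProfile>
  </activeProfiles>
</settings>
```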
Related
I am working on a large project with lots of dependencies, many of them SNAPSHOT dependencies. Because of the snapshot dependencies the build takes a huge amount of time (around an hour): for each one Maven has to check whether it was updated in the remote Artifactory, even if the snapshot version has not changed. Release versions are no problem, since the remote is only checked when the version changes; otherwise the jars already available in .m2 are used.
By comparison, if I build offline with the -o flag while all dependencies are available in my local .m2, the build takes only about 5-10 minutes, saving around 40-45 minutes of build time. But the project is large and many people work on it, so whenever I pull changes there may be code changes that require the latest snapshot, and the offline build breaks: even a single mismatch can break it.
So to solve this, I am thinking about the following approach:
1. Always build offline using -o flag.
2. Create an external script (probably in Node.js) that periodically checks whether the dependencies given in the pom have been updated in the remote Artifactory. If yes, pull them into the local .m2; otherwise do nothing.
Is there any better alternative to this?
Also, once I figure out which artifact to update, how can I force-update only that particular artifact, without its transitive dependencies?
With the above approach there is still a chance of a broken build if any transitive dependency has changed, but that should not happen frequently, and in that case I would run a full online build.
I thought about setting up a local Artifactory proxy server, but it won't help in this scenario: for snapshot resolution it would still go to the remote repository. If I set a longer cache time, there is again a chance of losing changes and breaking the build.
You can control snapshot checking:
Download policy
updatePolicy (String): The frequency for downloading updates. Can be "always", "daily" (default), "interval:XXX" (in minutes) or "never" (only download if it doesn't exist locally).
https://maven.apache.org/ref/3.6.3/maven-settings/settings.html#class_snapshots
But you probably have some other issue, since 45 minutes of build time wasted on downloading artifacts is a lot.
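As for force-updating a single artifact, a sketch using the Maven Dependency Plugin's get goal (coordinates are placeholders; `transitive` is a parameter of `dependency:get`):

```shell
# re-fetch one SNAPSHOT into the local repository, skipping transitive deps
mvn dependency:get \
    -Dartifact=com.example:my-lib:1.2.3-SNAPSHOT \
    -Dtransitive=false -U
```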
Introduction
I have inherited a project that I am able to build using the Maven command mvn clean install -DskipTests. However, I am not able to build it using the IntelliJ Make button. I am able to deploy the project using remote debugging, but I am not able to hot-swap new code in/out because Make is not working.
Errors during make
When I run Make I get a series of errors such as:
Older Maven Version
I have been told to use an older version of Maven, specifically 3.0.3.
I have gone to the settings for the current project and manually set Maven 3.0.3 as the default.
Question 1) Is there a chance this does not apply to the subdirectories? Should I change my system path variable and set the old Maven as the system default?
Red Highlighting in POM.XML
I am seeing that IntelliJ is highlighting a pom.xml in one of the sub-modules for errors. This code was committed by colleagues, so it is strange that it would have errors.
Question 2) Could Maven be the issue here? Or could there legitimately be an error in the POM.xml?
Maven > Reimport does not solve the issue
Additionally, running Maven > re-import does not solve the issue.
Updating Indices
I tried selecting the proposed option to Update Maven Indices. This brought up the following dialogs and started downloading in the background from both the Maven servers and a private Artifactory.
The indices were taking too long to update, so I invalidated the cache, restarted, and will try again as proposed in "IntelliJ IDEA takes forever to update indices".
Summary of Questions
Question 1) Is there a chance this does not apply to the subdirectories? Should I change my system path variable and set the old Maven as the system default?
Question 2) Could Maven be the issue here? Or could there legitimately be an error in the POM.xml?
Update
Indices finished downloading after some time.
I removed some of the problematic entries in the pom.xml, and the project no longer red-underlines the various packages as missing.
I am starting to believe the pom.xml was problematic. However, once someone downloads the dependencies/indices, the problem no longer appears.
Update - Remove Module
I talked with a colleague, and he said the specific modules are no longer used (even though they include faulty pom.xml files). I was told to right-click the module and select "Remove Module". This pretty much stopped the problem.
I have a Maven 3 multi-module project (~50 modules) which is stored in Git. Multiple developers are working on this code and building it, and we also have automated build machines that run cold builds on every push.
Most individual changes alter code in a fairly small number of modules, so it is a waste of time to rebuild the entire source tree with every change. However, I still want the final result of running the parent project build to be the same as if it had built the entire codebase. And I don't want to start manually versioning modules, as this would become a nightmare of criss-crossing version updates.
What I would like to do is add a plugin that intercepts some step in build or install and takes a hash of the module contents (ideally pulled from Git), then looks in a shared binary repository for an artifact stored under that hash. If one is found, it uses that artifact and does not even execute the full build. If it finds nothing in the cache, it performs the build as normal and then stores its artifact in the cache. It would also be good to rebuild any modules whose dependencies (direct or transitive) themselves had a cache miss.
Is there anything out there which does anything like this already? If not, what would be the cleanest way to go about adding it to Maven? It seems like plugins might be able to accomplish it, but for a couple of pieces I'm having trouble finding the right way to attach to Maven. Specifically:
How can you intercept the "install" goal to check the cache, and only invoke the module's 'native' install goal on a cache miss?
How should a plugin pass state from one module to another regarding which cache misses have occurred in order to force rebuilds of dependencies with changes?
I'm also open to completely different ways to achieve the same end result (fewer redundant builds) although the more drastic the solution the less value it has for me in the near term.
I have previously implemented a more complicated solution involving artifact version manipulation and deployment to a private Maven repository. However, I think the following will fit your needs better and is somewhat simpler:
Split your build into multiple builds (e.g., with a single build per module using Maven's -pl argument).
Set up parent-child relationships between these builds. (Bamboo even has additional support for figuring out Maven dependencies, but I'm not sure how it works.)
Configure Maven settings.xml to use a different local repository location - specify a new directory inside your build working directory. See docs: https://maven.apache.org/guides/mini/guide-configuring-maven.html
Use the mvn install goal to ensure newly built artifacts are added to the local repository
Use Bamboo artifact sharing to expose built artifacts from local repository - you should probably filter this to include only the package(s) you're interested in
Set dependent builds to download all artifacts from parent builds and put them into the proper subdirectory of the local repository (which is customized to be in the working directory)
This should even work for feature branch builds thanks to the way Bamboo handles parent-child relations for branch builds.
Note that this implies that Maven will redownload all other dependencies, so you should use a proxy private Maven repository on local network, such as Artifactory or Nexus.
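The third step above can be sketched like this (the environment variable name assumes Bamboo exposes the build working directory as `bamboo_build_working_directory`; verify against your Bamboo version):

```xml
<!-- settings.xml used only by the CI build -->
<settings>
  <!-- keep the local repository inside the build's working directory,
       so Bamboo artifact sharing can pick the artifacts up -->
  <localRepository>${env.bamboo_build_working_directory}/.repository</localRepository>
</settings>
```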
If you want, I can also describe the more complicated scenario I've already implemented that involves modifying artifact versions and deploying to private Maven repository.
The Jenkins Maven plugin allows you to manage/minimize dependent builds, triggering them:
whenever a SNAPSHOT dependency is built (determined by Maven)
after other projects are built (manually via Jenkins jobs)
And if you do a 'mvn deploy' to save the build into your corporate Maven repo, then you don't have to worry about dependencies when builds run on slave Jenkins machines. The result is that no module is ever built unless it or one of its dependencies has changed.
Hopefully you can apply these principles to a solution with Bamboo.
I'm building a Java project that has a dependency on a library. Running mvn.bat clean install on the library produced the target subdirectories as expected, and the outer project then built fine with mvn.bat clean install as well.
What's not expected is that when I deleted the entire directory of the library, the outer project still built fine, even though the library it depends on was gone.
How does this work?
UPDATE: Turns out Maven keeps some sort of cache in %USERPROFILE%\.m2.
You are most likely thinking of your local repository, where everything you install locally (and everything Maven downloads for you from the central repository) is put for later use.
The behavior you describe is intentional, and allows for building A once and then letting B reference it whenever needed, without having to recompile A every time. This is usually very desirable, especially in teams or with large code bases.
Note, that for changing code you should be using -SNAPSHOT artifacts. They are treated slightly differently.
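For reference, a snapshot dependency is simply one whose version ends in -SNAPSHOT (the coordinates here are made up):

```xml
<dependency>
  <groupId>com.example</groupId>
  <artifactId>my-library</artifactId>
  <!-- -SNAPSHOT versions are periodically re-checked against remote repos;
       release versions are downloaded once and then reused from ~/.m2 -->
  <version>1.0-SNAPSHOT</version>
</dependency>
```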
Your dependencies are always downloaded into .m2/repository.
If you want some predictability over the libraries downloaded in your team, you can put a repository manager like Nexus in place: https://repository.apache.org/index.html#welcome
Instead of downloading dependencies from Maven Central, your developers will download their dependencies from this repository manager.
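Pointing Maven at such a manager is typically done with a mirror entry in settings.xml; a sketch (the id and URL are placeholders for your own instance):

```xml
<settings>
  <mirrors>
    <mirror>
      <id>company-repo-manager</id>
      <!-- route all repository traffic through the internal manager -->
      <mirrorOf>*</mirrorOf>
      <url>https://repo.example.com/repository/maven-public/</url>
    </mirror>
  </mirrors>
</settings>
```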
My problem is that I have written a Maven plugin that deploys the artifact to a user-specified location. I'm now trying to write another Maven plugin that takes this deployed artifact, changes some things, and zips it again.
I want to write the second plugin so that it uses the first plugin's information about where the artifact was deployed.
I don't know how to access this information from the first plugin.
I would agree with @Barend that if you can afford to make the changes before deploying, that would be the best strategy.
If you cannot do that, you can follow the strategy of a plugin like the Maven Release plugin. The Release plugin runs in two phases, where the second run needs the output of the first. It manages this by keeping a temporary properties file in the project directory that carries information like the tag name, the SNAPSHOT version name, etc.
You could use the same approach with your plugin. Just remember that your plugin will be transactional in a sense: the second goal expects the first to have run before it can do its work.
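The handoff itself can be as simple as a properties file under target/; a minimal sketch, with the file name and property key made up for illustration (this is not the Release plugin's actual file format):

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.Properties;

public class DeployStateDemo {
    // hypothetical state file shared between the two goals
    static final Path STATE = Paths.get("target", "deploy-state.properties");

    // first plugin's goal: record where the artifact was deployed
    static void writeState(String deployLocation) throws IOException {
        Files.createDirectories(STATE.getParent());
        Properties p = new Properties();
        p.setProperty("deploy.location", deployLocation);
        try (OutputStream out = Files.newOutputStream(STATE)) {
            p.store(out, "written by the deploy goal");
        }
    }

    // second plugin's goal: read the location back
    static String readState() throws IOException {
        Properties p = new Properties();
        try (InputStream in = Files.newInputStream(STATE)) {
            p.load(in);
        }
        return p.getProperty("deploy.location");
    }

    public static void main(String[] args) throws IOException {
        writeState("/opt/artifacts/app.zip");
        System.out.println(readState()); // prints /opt/artifacts/app.zip
    }
}
```

In a real Mojo you would do the same reads and writes inside the two plugins' execute() methods, with the path derived from ${project.build.directory}.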
It seems to me that the easiest workaround is to reverse the order in which the plugins run.
Have Plugin B run first, using the known location under target/ to modify the artifact, and then run Plugin A, deploying the modified artifact to the configured location.
If that's no option, I suggest you simply duplicate the configuration value (so that both plugins are told about the new location in their <configuration> element). This keeps both plugins independent, which is what Maven assumes them to be.
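Duplicating the configuration value is easiest via a shared Maven property, so it is still written only once (the plugin coordinates and parameter names below are hypothetical):

```xml
<project>
  <properties>
    <!-- single source of truth, referenced by both plugins -->
    <custom.deploy.dir>/opt/artifacts</custom.deploy.dir>
  </properties>
  <build>
    <plugins>
      <plugin>
        <groupId>com.example</groupId>
        <artifactId>plugin-a</artifactId>
        <configuration>
          <deployDirectory>${custom.deploy.dir}</deployDirectory>
        </configuration>
      </plugin>
      <plugin>
        <groupId>com.example</groupId>
        <artifactId>plugin-b</artifactId>
        <configuration>
          <inputDirectory>${custom.deploy.dir}</inputDirectory>
        </configuration>
      </plugin>
    </plugins>
  </build>
</project>
```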
A last option is to make Plugin B parse the entire POM and extract the information from Plugin A's <configuration> element, but I really can't recommend this. If you go this way, the two plugins are so closely intertwined that they are really just one plugin. This is poor design, violates the principle of least surprise, and might cause nasty configuration problems down the line.