Using more than one build tool for a single project - java

I always like to use build tools like Maven for my projects; they make my job much easier. However, I wanted to know: is it possible to use more than one build tool for a single workflow?
Like, can I configure the jars from my pom.xml file using my Gradle build?
I tried doing so, but I get an error saying:
"can't define your project".
I tried searching for this but couldn't find an answer. Is it just me, or am I missing something important?

Regarding Maven: it only supports pom.xml and is not capable of reading other build tools' files. I think you should just pick the one tool that best fits the project or the team. Otherwise you need to keep the files for the different tools in sync, and that's a recipe for disaster.
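For clarity, this is roughly where Maven expects jar dependencies to be declared; it's only a minimal sketch and the coordinates are placeholders. Gradle reads its own build.gradle instead and will not parse this file, which is why mixing the two doesn't work out of the box.

<project xmlns="http://maven.apache.org/POM/4.0.0">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.example</groupId>
  <artifactId>my-app</artifactId>
  <version>1.0.0</version>

  <dependencies>
    <!-- Maven resolves jar dependencies from this section of pom.xml only -->
    <dependency>
      <groupId>org.apache.commons</groupId>
      <artifactId>commons-lang3</artifactId>
      <version>3.12.0</version>
    </dependency>
  </dependencies>
</project>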

Related

Who is using my maven artifact?

I have a system consisting of multiple web applications (war) and libraries (jar). All of them are using Maven and are under my control (source code, built artifacts in Nexus, ...). Let's say that application A is using library L1 directly and L2 indirectly (it is used from L1). I can easily check the dependency tree top-down from the application, using Maven's dependency:tree or graph:project plugins. But how can I check who's using my library? From my example, I want to know whether A is the only application (or library) using L1, and whether L2 is used from L1 and from some other application, say B. Is there any plugin for Maven or Nexus, or should I try to write some script for that? What are your suggestions?
If you wish to achieve this on a repository level, Apache Archiva has a "used by" feature listed under project information.
This is similar to what mvnrepository.com lists under its "used by" section of an artifact description.
Unfortunately, Nexus does not seem to provide an equivalent feature.
Now I suppose it would be a hassle to maintain yet another repository just for that, but it would probably be easier than what some other answers suggest, such as writing a plugin for Nexus. I believe Archiva can be configured to proxy other repositories.
Update
In fact, there's also a plugin for Nexus to achieve the "used by" feature.
As far as I know, nothing along these lines exists as an open source tool. You could write a Nexus plugin that traverses a repo and checks for usages of your component in all other components by iterating through all the POMs and analyzing them. This would be a rather heavy task to run, though, since it would have to look at all components and parse all the POMs.
In a similar fashion you could do it on a local repository with some other tool. However it probably makes more sense to parse the contents of a repo manager rather than a local repository.
I don't think there's a Maven way to do this. That being said, there are ways of doing this or similar things. Here's a handful of examples:
Open up your projects in your favorite IDE. For instance, Eclipse will help you with impact analysis on a class level, which most of the time might be good enough.
Use a simple "grep" on your source directory. This sounds a bit brusque (as well as stating the obvious), perhaps, but we've used this a lot.
Use dependency analysis tools such as Sonargraph or Lattix.
I am not aware of any public libraries for this job, so I wrote a customized app which does it for me.
I work with a distribution which involves more than 70 artifacts bundled together. Many times, after modifying an artifact, I want to ensure the changes are backward compatible (i.e. no compilation errors are introduced in dependent artifacts). To achieve this, it was crucial to know all dependents of the modified artifact.
Hence, I wrote an app which scans through all artifacts under a directory (and its subdirectories), extracts their pom.xml and searches the dependency section of the POM for occurrences of the modified artifact.
(I did this in Java, although a shell/Windows script can do this even more compactly.)
I'll be happy to share code on github, if that could be of any help.
One way that might suit your needs is to create a master POM that includes all your Maven projects. Then you run the following command on the master POM:
mvn dependency:tree -DoutputType=graphml -DoutputFile=dependency.graphml
Open the generated file in yEd.
Used the instructions found here:
http://www.summa-tech.com/blog/2011/04/12/a-visual-maven-dependency-tree-view/
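Such a master POM is just an aggregator. A minimal sketch, assuming the projects live in sibling directories (the module names are placeholders borrowed from the question):

<project xmlns="http://maven.apache.org/POM/4.0.0">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.example</groupId>
  <artifactId>master</artifactId>
  <version>1.0.0</version>
  <packaging>pom</packaging>
  <!-- Aggregate every project so dependency:tree runs across all of them in one go -->
  <modules>
    <module>application-a</module>
    <module>library-l1</module>
    <module>library-l2</module>
  </modules>
</project>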
More interesting is probably: what would you do with this information? Inform the developers of A not to use library L1 or L2 anymore, because it has a critical bug?
In my opinion you should be able to create a blacklist of dependencies/parents/plugins on your repository manager. Once a project tries to deploy/upload itself with a blacklisted artifact, it should fail. I'm saying uploading and not downloading, because that might break a lot of projects. As far as I know, this is not yet available for any repository-manager.
One of the ways to approach this problem is outside Java itself: write an OS-level monitoring script that tracks each fopen() on the jar file in question. Assuming this is in a corporate environment, you might have to wait for a few weeks (!) to allow all using processes to access the library at least once!
On Windows, you might use Sysinternals Process Monitor to do this:
http://technet.microsoft.com/en-us/sysinternals/bb896645
On Unix variants, you would use DTrace or strace.
IMHO, and also from my experience, looking for a technical solution to such a problem is often overkill. If the reason you want to know who is using your artifact (library) is that you want to ensure backward compatibility when you change it, or something similar, I think it is best done by communicating your changes through traditional channels and encouraging the other teams who might be using your library to talk about it (project blogs, wiki, email, a well-known location where documentation is kept, jour fixe, etc.).
In theory, you could write a script that crawls through each project in your repository and then parses the Maven pom.xml (assuming they all use Maven) to see whether it declares a dependency on your artifact. If all the projects in your organization follow the standard Maven structure, it should be easy to write such a script (though if any of those projects depend on your artifact via a transitive dependency, things can get a bit trickier).

Best way to manage multi-module projects?

I have a medium-sized project split into 3 modules: core, plugins (in short, an interpretation layer), and implementation. There are a few global dependencies and some module-specific dependencies. There is a custom Ant target for generating Javadoc excluding the implementation (for obvious reasons). This is stored in a public online SVN repository and therefore needs to be independent of any machine, apart from the JRE.
Right now I'm using the built-in NetBeans project management, and it sucks, probably mainly due to the fact that the project management system was not designed for modules. Lack of a global library set (you can import a library specific to your NB installation, but then it doesn't get updated), lack of auto-resolution of library dependencies (a dependency on a project means the project and its dependencies), lack of an independent multi-project formatting style (either tied to profile-specific "Global options" or individually set up and synced module-specific options), and other things make managing my project a pain.
When I was experimenting with IDEA, one of the things I loved was its project management. It was close to what I wanted, but like most things in IDEA it could have been simpler. However, the IDE itself was bad (not up for debate), so I switched back to NetBeans. And Maven looks bad, both from having to traverse its file structure manually and from general opinion.
Are there better options out there that can be stored in a standard SVN repository with limited tooling required, are pretty easy to use for 1-3 developers, and work for 2-5 modules? It must be able to handle Java and (in a perfect world) integrate with NetBeans.
Honestly, Maven is your best bet. I wouldn't knock it if you haven't actually tried it yet. It tends to be a very divisive technology, but those who love it love it for a damn good reason. If you are someone who prefers to keep your hands off the build scripts/files after you initially set them up, and it looks like you are, given you were using NetBeans' built-in projects which generate an Ant build.xml behind the scenes, then you should just try Maven and see what happens.
I'm not sure why you think you need to "traverse the directory structure" with Maven if you are in NetBeans. See this screenshot for an example of what it looks like. You don't ever see src/main/java or target/ or anything on the file system (unless you need to).
[screenshot: a Maven project open in NetBeans - source: netbeans.org]
If you use a Maven multi-module project, you'll get the modularity you are looking for within NetBeans as well. If you want a sample, go check out an open source project that has tons of modules, load it in NetBeans, and play around with it: http://camel.apache.org/source.html

Build system that allows sharing modules amongst different binaries

I'm trying to choose the most appropriate build system to work in enterprise with a common source repository, emphasizing sharing of common code. I'd like the source hierarchy to look something like this:
- src
  - java
    - common
      - net
      - database
    - team1
    - team2
    - team3
      - lib
- tests
  - java
    - common
      - net
      - database
    - team1
    - team2
    - team3
      - lib
The goal is to have a build system where team[1-3] can have independent builds that explicitly specify their dependencies. Dependencies might look like:
- team1
  - common/net
  - team3/lib
- team2
  - common/database
- team3
So, for example, the build for team1 would include everything within the team1, common/net, and team3/lib; but nothing else. Ideally, tests would be integrated in the same fashion (testing team1 would run tests for team1, common/net, and team3/lib).
I'm currently using Ant, but haven't found a sane way to manage a hierarchy like this. I started to look at Maven 2 for its ability to manage a dependency hierarchy, but it seems to want full-fledged projects for each module. That wouldn't be a problem, but it seems to force me into a directory structure that does not map well to the traditional java package hierarchy. It seems like I might be able to do what I want with buildr using an alternative layout, but I'm worried that might prove to be brittle.
Can someone recommend something that might work for me?
I think you actually have three issues here.
How to layout your project so that the artifacts make sense.
How to best handle the sharing of these artifacts for each project.
How to handle the loss in productivity while converting the development team to use the new project structure.
For the first issue, try to use Maven conventions wherever possible and organize the project into multiple artifacts. If the artifacts should be nested under a parent, do so. Start off with the simplest artifact which has no dependencies and work your way through the code.
I'm not sure why you believe the layout won't support the traditional Java hierarchy. It should work, especially if you use parent POMs.
Obviously the second issue can become quite a handful depending upon how you handle the first one. I would err on the side of creating more artifacts instead of fewer and using a repository manager like Nexus or Artifactory to manage them. At least that way, your team's builds can rely on pre-built and tested jars by hitting your repository to pull down the latest SNAPSHOT or RELEASE of the jar they are working with.
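To illustrate that second point, a team's build would then just declare the shared artifact as an ordinary dependency and let the repository manager serve the latest build. This is only a sketch; the coordinates and version are placeholders:

<dependencies>
  <!-- Resolved from the internal repository manager (Nexus/Artifactory),
       so the team never has to build common-net locally -->
  <dependency>
    <groupId>com.example.common</groupId>
    <artifactId>common-net</artifactId>
    <version>1.2.0-SNAPSHOT</version>
  </dependency>
</dependencies>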
For the third, make sure you're using IDEs that have Maven support. If you're stuck using something like Rational Application Developer 7.0.x or an IDE based on something less than Eclipse 3.4, then you won't be able to use the M2Eclipse plugin. Without M2Eclipse, the developers will have to jump through some manual hoops which are not ideal. Netbeans 6.7 and 6.8 have very good Maven support.
As you say, Maven 2 is the preferred option for your case.
The Maven folder structure is not mandatory - it is configurable if you consider it unsuitable. However, I think it is a good structure that you can follow without remorse.
You can use a repository manager so that people who use some dependencies don't necessarily need to checkout the projects they depend on.
I started to look at Maven 2 for its ability to manage a dependency hierarchy, but it seems to want full-fledged projects for each module.
That's one way to do it. Alternatively, a multi-module Maven project can be organized like this:
project
    module-1
        src
            main
                ....
            test
                ....
        pom.xml
    module-2
        src
            main
                ....
            test
                ....
        pom.xml
    ...
    pom.xml
where each pom.xml could also refer to modules defined by other trees. BTW, the Eclipse maven plugin supports this approach as well as the more common one-module-per-project approach.
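For a rough idea, the top-level pom.xml in such a layout simply lists the modules. A minimal sketch (the module names follow the tree above; the relative path shows how a module defined in another tree could be pulled in):

<project xmlns="http://maven.apache.org/POM/4.0.0">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.example</groupId>
  <artifactId>project</artifactId>
  <version>1.0.0</version>
  <packaging>pom</packaging>
  <modules>
    <module>module-1</module>
    <module>module-2</module>
    <!-- a module living in another directory tree, referenced by relative path -->
    <module>../other-tree/module-3</module>
  </modules>
</project>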
I'm currently using Ant, but haven't found a sane way to manage a hierarchy like this.
This is surprising as Ant (+Ivy?) gives you all the flexibility you want.
I started to look at Maven 2 for its ability to manage a dependency hierarchy, but it seems to want full-fledged projects for each module.
If by this you mean one pom.xml per module, then that's correct.
That wouldn't be a problem, but it seems to force me into a directory structure that does not map well to the traditional java package hierarchy.
Yes, Maven comes with some conventions, the project directory structure being one of them. This is (a bit) configurable, though I don't think you'll be able to match the wanted layout (with tests and sources in separate hierarchies). And actually, I would strongly advise using the defaults if you go for Maven; you should adopt its philosophy, as it will save you a lot, really a lot, of pain (not to mention that some plugins might use these defaults in a hard-coded way).
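For completeness, overriding the default directories looks roughly like this in a pom.xml; the paths are only illustrative of the asker's hypothetical layout, and as said above, straying from the defaults is risky:

<build>
  <!-- Overrides Maven's default src/main/java and src/test/java locations;
       the paths below are placeholders for the layout described in the question -->
  <sourceDirectory>../../src/java/team1</sourceDirectory>
  <testSourceDirectory>../../tests/java/team1</testSourceDirectory>
</build>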
To be honest, I don't really understand what you mean by a directory structure that does not map well to the traditional java package hierarchy. First, Maven is perfect for Java, so this doesn't make any sense to me. Second, and this might be more subjective, your layout (with separated tests and sources trees) doesn't look traditional at all to me. Maybe you should clarify what you mean exactly by traditional...
It seems like I might be able to do what I want with buildr using an alternative layout, but I'm worried that might prove to be brittle
I don't know buildr really well so I can't say much about it but I know it is indeed more flexible. That said, if Ant doesn't give you satisfaction in terms of flexibility, then I don't see why buildr would be better.
And don't forget that buildr and Ant+Ivy have much smaller communities compared to Maven. Don't underestimate this, this might become a real concern.
Personally, I would go for Maven and reconsider your layout. But let's say I'm biased.
What you are going for is going to give you lots of trouble in the long term. Each standalone component should really be made into its own project with its own repository; otherwise, you can get into lots of issues with changes in one component breaking the other components, and updates taking excessively long. I strongly recommend that you make each component into its own project and use Maven 2 to build.
You can do it with Buildr. You could live for some time with it.
Of course, like most people on the thread, I would rather not recommend this approach.
You can also use base_dir to change the base directory of the projects.

Best ways to manage generated artifacts for web service/xml bindings in a java webapp/client?

I'm working on a couple of web services that use JAXB bindings for the messages (in JAX-WS or spring-ws). When using these bindings there's always some code that is automatically generated from the WSDL to bind the message objects. I'm struggling to figure out the best way I can make this work so that it's easy to work with, hard to break and integrates nicely with IDEs (mostly using eclipse).
I think there are a couple of ways to go about this. The three main options I see right now are:
Generate code, keep the source artifacts and check them into the repository. Pros: integrates easily with IDEs (source highlighting etc), works within the build system. Cons: generated code changes each time you regenerate it, possibly creating noisy commits. It's also redundant since the WSDL file is already checked in, usually.
Generate code as part of the build process. Don't keep source artifacts or only keep them in output directories. Pros: fixes all the cons from the previous one. Cons: harder to integrate with IDE, though maybe this build step can be run automatically? I currently use this on one of my projects but the first time I checkout the project it appears broken, which is a minor nuisance.
Keep generated bindings in separate libraries (jars) included with maven or manually updated jars, depending on your build process. I got the idea from a thread on java.net. This seems more stable and uses explicit versioning but seems a bit heavyweight.
Which one of these options would you implement and how? We're currently using maven and eclipse, so any ideas in that regard would be great. I think this problem generalises to most other build systems and IDE combinations though, even other languages perhaps.
I went for option 3. If you already host your own repository (and optionally CI), it's not that heavyweight. All it takes is a simple POM. It's even possible to include some utility/wrapper/builder classes (that often make life easier with generated classes) and use them in several projects.
I'd go for option 2 and generate code in the "standard" ${project.build.directory}/generated-sources/<toolname> location as part of the build process. Using generated sources is well supported by m2eclipse (use Maven > Update Project Configuration once sources have been generated) and, if I remember well, by the maven eclipse plugin as well (i.e. the folder will be added to the Java Build Path). Actually, I think NetBeans also handle this fine. Not sure for Idea.
For the generation itself, you may need the maven-jaxb2-plugin if I understood correctly.
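As a rough sketch of that setup, the plugin configuration in the pom.xml might look something like this. The coordinates are those of the jvnet JAXB2 plugin; the version and schema directory are assumptions to adapt to your project:

<build>
  <plugins>
    <plugin>
      <groupId>org.jvnet.jaxb2.maven2</groupId>
      <artifactId>maven-jaxb2-plugin</artifactId>
      <!-- version is an assumption; use whatever your project standardizes on -->
      <version>0.13.1</version>
      <executions>
        <execution>
          <goals>
            <goal>generate</goal>
          </goals>
        </execution>
      </executions>
      <configuration>
        <!-- hypothetical location of the schemas to generate bindings from -->
        <schemaDirectory>src/main/resources/schemas</schemaDirectory>
      </configuration>
    </plugin>
  </plugins>
</build>

The generated sources then land under target/generated-sources, which is the location m2eclipse and the Maven Eclipse plugin pick up as mentioned above.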

Multiple Java projects and refactoring

I have recently joined a project that is using multiple different projects. A lot of these projects depend on each other, using JAR files of the other projects included as libraries, so any time you change one project, you then have to know which other projects use it and update them too. I would like to make this much easier, and was thinking about merging all this Java code into one project in separate packages. Is it possible to do this and then deploy only some of the packages in a jar? I would prefer not to deploy only part of it, but I have been asked if this is possible.
Is there a better way to handle this?
Approach 1: Using Hudson
If you use a continuous integration server like Hudson, then you can configure upstream/downstream projects (see Terminology).
A project can have one or several downstream projects. The downstream projects are added to the build queue if the current project is built successfully. It is possible to configure it so that the downstream project is added to the build queue even if the current project is unstable (the default is off).
What this means is, if someone checks in some code into one project, at least you would get early warning if it broke other builds.
Approach 2: Using Maven
If the projects are not too complex, then perhaps you could create a main project, and make these sub-projects child modules of this project. However, mangling a project into a form that Maven likes can be quite tricky.
If you use Eclipse (or any decent IDE) you can just make one project depend on another, and supply that configuration aspect in your SVN, and assume checkouts in your build scripts.
Note that if one project depends on a certain version of another project, the JAR file is a far simpler way to manage this. A major refactoring could immediately mean lots of work in all the other projects to fix things, whereas you could just drop the new jar into each project as required and do the migration work then.
I guess it probably all depends on the specific project, but I think I would keep all the projects separate. This helps keep the whole system loosely coupled. You can use a tool such as Maven to help manage all the dependencies between the projects. Managing dependencies like this is one of Maven's main strengths.
Using Ant as your build tool, you can package your project any way that you want. However, leaving parts of your code out of the distribution seems like it would be error prone; you might accidentally leave out necessary classes (presumably, all of your classes are necessary).
In relation to keeping your code in different projects, I have a loose guideline. Keep the code that changes together in the same project and package it in its own jar file. This works best when some of your code can be broken out into utility libraries that change less frequently than your main application.
For example, you might have an application where you've generated web service client classes from a web service WSDL (using something like the Axis library). The web service interface will likely change infrequently, so you don't want to have the regeneration step reoccurring all the time in your main application build. Create a separate project for this piece so that you only have to recreate the web service client classes when the WSDL changes. Create a separate jar and use it in your main application. This style also allows other projects to reuse these utility modules.
When following this style, you should place a version number in the jar manifest so that you can keep track of which applications are using which versions of your module. Depending on how far you want to take this, you could also keep a text file in the jar that details the changes that have occurred for each revision (much like an open source library).
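If you build the utility jar with Ant, stamping that version into the manifest is a one-liner inside the jar task. A sketch; the file names and the version property are placeholders:

<jar destfile="dist/webservice-client.jar" basedir="build/classes">
  <manifest>
    <!-- module.version is assumed to be defined elsewhere in the build file -->
    <attribute name="Implementation-Version" value="${module.version}"/>
  </manifest>
</jar>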
It's all possible (we had the same situation some years ago). How hard or easy it'll be depends on your IDE (refactoring, merging, organizing the new project) and your build tool (deploying). We used IDEA as the IDE and Ant as the build tool, and it wasn't too hard. One Sunday (nobody working or committing), 2 people on one computer.
I'm not sure what you mean by
"deploy only some of the packages in a jar"
I think you will need all of them at runtime, won't you? As I understand it, they depend on each other.
