Resolve jar dependencies automatically - java

A project was using various libraries, e.g. a.jar, b.jar, c.jar, d.jar, etc.
Some of the jars have since been refactored and are now ab.jar, cd.jar, etc.
What I need is an automatic way to find which jars in my installation are now obsolete, so that I can delete them.
Is this possible?

With LooseJar you can detect unused jar files by adding:
-javaagent:loosejar.jar
to your java command when you invoke it from the command line (or as a VM option in Eclipse). I guess this isn't technically automatic, because lines of code that dynamically load classes at runtime will need to be invoked in order for LooseJar to know that the class, and therefore the jar, is needed. A good method might be to invoke your unit tests with this Java agent (assuming your unit tests have good code coverage).

The best way is to use Maven. If your dependencies are defined in Maven, you can just run mvn dependency:tree to retrieve the needed information. Please refer to this article for details.
If you do not use Maven, you probably have to use tools like JDepend. But be careful: such tools cannot really retrieve all dependencies. It is impossible to detect a dependency on a dynamically loaded class, or on an API called via reflection, using static analysis only. A full solution can only be achieved by running your application, testing it with all possible scenarios, and checking which classes are loaded by the class loader. If you have 100% test coverage, you can run your application with the option -verbose:class and then run all unit tests against it. You will get a list of all loaded classes. Save this list to a file and write a shell script that analyses the class list and transforms it into a jar list.
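The class-list-to-jar-list step can be sketched as a short pipeline. The paths, class names, and log lines below are fabricated placeholders; the log format shown is the one HotSpot prints for -verbose:class.

```shell
# Sketch: suppose you have already run your application and tests with
#   java -verbose:class -cp 'lib/*' com.example.Main > class-log.txt
# (class name and paths are placeholders). On HotSpot each loaded class is
# logged as "[Loaded com.example.Foo from file:/path/to/some.jar]".
# The log below is a fabricated example of that format.
cat > class-log.txt <<'EOF'
[Loaded com.example.Foo from file:/home/me/lib/ab.jar]
[Loaded com.example.Bar from file:/home/me/lib/ab.jar]
[Loaded org.apache.log4j.Logger from file:/home/me/lib/log4j.jar]
EOF

# Reduce the class list to the unique set of jars classes were loaded from;
# any jar in lib/ that is missing from this list is a removal candidate.
grep -o 'from file:[^]]*\.jar' class-log.txt | sed 's/^from file://' | sort -u
```

Note that newer JVMs (9+) use -Xlog:class+load instead, with a slightly different line format, so the grep pattern would need adjusting there.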

Related

How to run job-dsl-plugin locally with additional plugins

We are using the Jenkins Job-DSL plugin for creating a number of jobs and having the actual job-configuration as part of the version-controlled source-code.
In order to test the resulting XML files locally, I currently use something like the following:
java -jar /opt/job-dsl-plugin/job-dsl-core/build/libs/job-dsl-core-1.78-SNAPSHOT-standalone.jar create_jobs.groovy
This allows me to look at the resulting XML while making changes.
However some DSL elements are failing in the local build, but still work on the actual Jenkins installation.
E.g. "batchFile", "pullRequestBuildTrigger" and a few others.
As far as I understand these are separate Jenkins plugins which contribute some additional elements to the DSL, so the core job-dsl-plugin does not know about them.
I tried various ways of adding the code from these plugins to the job-dsl-plugin so that I can run the local transformation, but I could not find a way that actually works. Adding the plugins to the job-dsl-plugin, to the classpath, ... nothing fixed it.
I also looked at How to import and run 3rd party Jenkins Plugin's extension DSL (githubPullRequest) with Gradle tool locally?, but the suggestions there did not work for me as I do not want to run a local Jenkins instance here.
So how can I run the job-dsl-plugin manually with DSL from additional plugins being available?

How to test compiled JAR file using Gradle

I had code that worked correctly when executed during standard unit testing, but didn't work once it was compiled into a jar and added as a dependency of another project.
It wasn't an issue to find the root cause and fix it, but it made me think about how I can test a freshly built jar artifact before deploying it anywhere, to make sure that it will work for end users and other projects. I googled this topic for several hours, but didn't find anything close to it.
Maybe I'm totally wrong and trying to achieve something weird, but I cannot figure out another way to verify compiled packages and be confident that they will work for others.
Some details about the project: a simple Java library with a few classes, using Gradle 5.5 as the build system and Travis CI as the CI/CD tool; for testing I'm using TestNG, but I can easily switch to JUnit if required.
If you're curious about the code that did not work when compiled into the package, here is a simplified version:
public String readResourceByURI() throws IOException, URISyntaxException
{
    return new String(Files.readAllBytes(Paths.get(ClassLoader.getSystemClassLoader().getResource("resource.txt").toURI())));
}
This function will throw java.nio.file.FileSystemNotFoundException when the resource is packaged inside a jar file. But as I said, the problem is not with the code...
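For reference, the jar-safe variant reads the resource as a stream instead of converting its URL to a Path. This is a sketch (class and method names are made up), with the classloader passed in as a parameter so it is easy to exercise in a test:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

public class ResourceReader {
    // Streams work both when the resource is a plain file on the classpath
    // and when it lives inside a jar; Paths.get(url.toURI()) only works for
    // the former, hence the FileSystemNotFoundException.
    public static String readResource(ClassLoader loader, String name) throws IOException {
        try (InputStream in = loader.getResourceAsStream(name)) {
            if (in == null) {
                throw new IOException("resource not found: " + name);
            }
            ByteArrayOutputStream buf = new ByteArrayOutputStream();
            byte[] chunk = new byte[8192];
            int n;
            while ((n = in.read(chunk)) != -1) {
                buf.write(chunk, 0, n);
            }
            return new String(buf.toByteArray(), "UTF-8");
        }
    }
}
```

Called as readResource(getClass().getClassLoader(), "resource.txt"), this behaves the same on disk and inside a jar.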
Ideally, I want to create a build pipeline, that will produce jar artifacts, which then will be tested and if tests are successful those jars will be automatically deployed to repository (maven and/or bintray).
At the moment all tests are executed before jar creation and as a result there is chance, that compiled code inside jar package will not work due to packaging.
So, to simplify my question I'm looking for a Gradle configuration that can execute unit tests on a freshly made jar file.
That's what I came up with:
test {
    // Add dependency on the jar task, since it will be the main target for testing
    dependsOn jar
    // Rearrange the test classpath: add the compiled JAR instead of the main classes
    classpath = project.sourceSets.test.output + configurations.testRuntimeClasspath + files(jar.archiveFile)
    useTestNG()
}
Here I'm changing the default classpath for the test task by combining the folder with test classes, the runtime dependencies, and the compiled JAR file. Not sure if it's the correct way to do it...
I don't think there is a good way to detect this kind of problem in a unit test. It is the kind of problem that is normally found in an integration test.
If your artifact / deliverable is a library, integration tests don't normally make a lot of sense. However, you could spend some time to create a sample or test application that uses your library, which you can then write an integration test for.
You would need to ask yourself whether there are enough potential errors of this kind to warrant doing that:
I don't imagine that you will make this particular mistake again soon.
Other problems of this nature might include assumptions in your library about the OS platform or (occasionally) Java versions ... which can only really be tested by running an application on the different platforms.
Maybe the pragmatic answer is to recognize that you cannot (or cannot afford to) test everything.
Having said that, one possible approach might be to choose a free-standing test runner (packaged as a non-GUI Java application). Then get Gradle to run the test runner as a scripted task, with the JARs for your library and the unit tests on the classpath.
In Gradle you can try to execute a scripting task that runs code from your jar. It's a bit convoluted, but it works in a simple POC.
In the main Gradle project, create a subproject 'child'.
Add information about it in settings.gradle:
include 'child'
In build.gradle add this:
task externalTest {
    // Note: this copy runs at configuration time, i.e. before any task
    // executes, so the test sources are in place when the child builds.
    copy {
        from 'src/test'
        into './child/src/test'
    }
}
externalTest.dependsOn(':child:build')
// Run the external test after the jar has been produced.
jar.finalizedBy(externalTest)
And in child/build.gradle add the parent jar as a dependency:
dependencies {
    compile files('../build/libs/parent.jar')
}
Now, when the main project is built, the child project will be built after the jar is created.

Is there a way to prevent developers to use a certain import?

I have an application that uses Jasper to generate reports. In order to encapsulate the complexity and provide a uniform interface with the Jasper API, I have created a "intermediate" interface that wraps the Jasper classes and delegates client calls to them. This will also make it easier to change the report machine in the future - to Crystal Reports, for instance.
The thing is, since the Jasper classes are in the classpath, developers (including myself) can accidentally use some of its classes directly in the business code, and that may pass unnoticed for a long time. I would like to avoid that, or at least be notified when that happens.
The environment is basically eclipse, maven, git, sonar, bamboo ci.
I'm sure this is not an uncommon scenario, so, what is the best way to deal? Design patterns, eclipse/maven plugins, sonar alerts? Or maybe something dead simple that I'm just not seeing?
In Maven you can specify that a library is for runtime only. This allows you to not compile against that library at all. If you don't use Jasper from Maven, you could avoid including it at all. You can force this by adding an <exclusion> if it is a transitive dependency.
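As a sketch of that approach (the coordinates and version below are illustrative, not taken from the question), the dependency declaration in the pom would look like this:

```xml
<dependency>
    <groupId>net.sf.jasperreports</groupId>
    <artifactId>jasperreports</artifactId>
    <version>6.20.0</version>
    <scope>runtime</scope>
</dependency>
```

With runtime scope the Jasper classes are on the runtime and test classpaths but not the compile classpath, so a stray import in business code fails the build immediately instead of going unnoticed.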
You should have two separate eclipse projects: One for the reporting library, one for the rest.
The reporting library project contains your interfaces, the Jasper jar files and the Jasper-specific implementation of the interfaces.
The other project depends on the reporting library project (you can set project dependencies in the projects properties dialog under "Java Build Path" -> "Projects").
As the reporting project only exports the source folder to the other project, the jasper classes are not visible to it at development time.
I haven't used it much myself, but if you ever need more control over your dependencies you could try DCL Suite, an Eclipse plugin. It lets you define constraints between modules, and you can declare the modules to be a class, a set of classes, packages, etc.
That would only be possible if you handled classloading of Jasper and included it as a resource (a jar file) inside your own jar. Then no one would know it was available directly. Here's an example of how you can include jars inside your own jar file -> An embedded jar classloader in under 100 lines.

Jar configurations and their contents

While downloading Google Guice I noticed two main "types" of artifacts available on their downloads page:
guice-3.0.zip; and
guice-3.0-src.zip
Upon downloading them both and inspecting their contents, they seem to be two totally different "perspectives" of the Guice 3.0 release.
The guice-3.0.zip just contains the Guice jar and its dependencies. The guice-3.0-src.zip, however, did not contain the actual Guice jar, but it did contain all sorts of other goodness: javadocs, examples, etc.
So it got me thinking: there must be different "configurations" of jars that get released inside Java projects. Crossing this idea with what little I know from build tools like Ivy (which has the concept of artifact configurations) and Maven (which has the concept of artifact scopes), I am wondering what the relation is between artifact configuration/scope and the end deliverable (the jar).
Let's say I was making a utility jar called my-utils.jar. In its Ivy descriptor, I could cite log4j as a compile-time dependency, and junit as a test dependency. I could then specify which of these two "configurations" to resolve against at buildtime.
What I want to know is: what is the "mapping" between these configurations and the content of the jars that are produced in the end result?
For instance, I might have all of my compile-configuration dependencies wind up in the main my-utils.jar, but would there ever be a reason to package my test dependencies into a my-utils-test.jar? And what kind of dependencies would go in the my-utils-src.jar?
I know these are a lot of tiny questions, so I guess you can sum everything up as follows:
For a major project, what are the typical varieties of jars that get released (such as guice-3.0.zip vs guice-3.0-src.zip, etc.), what are the typical contents of each, and how do they map back to the concept of Ivy configurations or Maven scopes?
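One way to picture the mapping is a hypothetical Ivy descriptor for the my-utils example (all names and revisions below are illustrative): the binary jar is published under a compile conf, the sources jar under a sources conf, and test-only dependencies are resolved at build time but never end up in any published jar.

```xml
<ivy-module version="2.0">
    <info organisation="com.example" module="my-utils"/>
    <configurations>
        <conf name="compile" description="needed to compile against and run my-utils"/>
        <conf name="test" extends="compile" description="only needed to run the test suite"/>
        <conf name="sources" description="source bundle for IDEs and debugging"/>
    </configurations>
    <publications>
        <artifact name="my-utils" type="jar" conf="compile"/>
        <artifact name="my-utils-src" type="source" ext="jar" conf="sources"/>
    </publications>
    <dependencies>
        <dependency org="log4j" name="log4j" rev="1.2.17" conf="compile->default"/>
        <dependency org="junit" name="junit" rev="4.13.2" conf="test->default"/>
    </dependencies>
</ivy-module>
```

In Maven terms the same split falls out of scopes and classifiers: compile-scoped dependencies are what consumers of my-utils.jar resolve transitively, test-scoped ones stay private to the build, and the sources jar is published with a -sources classifier.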
The one you need to run is guice-3.0.zip. It has the .class files in the correct package structure.
The other archive, guice-3.0-src.zip, has the .java source files and other things that you might find useful. A smart IDE, like IntelliJ, can use the source archive to allow you to step into the Guice code with a debugger and see what's going on.
You can also learn a lot by reading the Guice source code. It helps to see how developers who are smarter than you and me write code.
I'd say that the best example I've found is the Efficient Java Matrix Library at Google Code. That has an extensive JUnit test suite that's available along with the source, the docs, and everything else that you need. I think it's most impressive. I'd like to emulate it myself.

testing Java code generated during another test

I want to build a "toJavaCode()" on my model that would generate the required Java source code to reproduce that model (never mind the reasons, whether it should or shouldn't be done, or the compatibility issues that may occur).
I'm at a loss at how to test this. I'm using maven, but generate-sources won't really work for me since my server needs to be up for proper, bulk testing. I do get the server up during the "test" goal, but generate-sources is just too early.
On the other hand, while I can use the built in compiler (from tools.jar in the JDK) to do this, I don't know how I can pack it into the jar for testing (or load that jar).
Any ideas?
You can use the JavaCompiler API to compile your source files and set up a classloader to load the compiled classes in your test (sample code). tools.jar has to be on the classpath; this is the case if a JDK is used.
If your generated code is stable for any given class, you could use an annotation processor to generate the source code and compile it in the same javac run as the annotated class.
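The JavaCompiler route can be sketched with only the JDK. The class and file names below are made up; the generated source is a stand-in for whatever toJavaCode() emits:

```java
import javax.tools.JavaCompiler;
import javax.tools.ToolProvider;
import java.net.URL;
import java.net.URLClassLoader;
import java.nio.file.Files;
import java.nio.file.Path;

public class CompileDemo {
    // Writes a tiny "generated" source file, compiles it with the system
    // compiler, then loads the class back in and exercises it.
    public static String compileAndDescribe() throws Exception {
        Path dir = Files.createTempDirectory("gen");
        Path src = dir.resolve("Generated.java");
        Files.write(src, "public class Generated { public String toString() { return \"ok\"; } }".getBytes("UTF-8"));

        JavaCompiler compiler = ToolProvider.getSystemJavaCompiler();
        if (compiler == null) {
            throw new IllegalStateException("a JDK (tools.jar / jdk.compiler) is required");
        }
        // Class files land next to the source by default.
        int status = compiler.run(null, null, null, src.toString());
        if (status != 0) {
            throw new IllegalStateException("compilation failed");
        }

        try (URLClassLoader loader = new URLClassLoader(new URL[]{dir.toUri().toURL()})) {
            Object generated = loader.loadClass("Generated").getDeclaredConstructor().newInstance();
            return generated.toString();
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(compileAndDescribe());
    }
}
```

In a real test you would feed the string from toJavaCode() into the temp file and assert on the behaviour of the loaded class rather than its toString().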
You can add Ant tasks to your Maven build. That's a way to do something "out of the classical order" during a Maven build, like adding a javac Ant task to Maven's test phase.
(Sorry, I'm as confused as your commenter matt b, but the embedded Ant tasks are your Swiss army knife here.)
