Task: what I have is a large non-Gradle (make :-)) project, which contains many subprojects, each in its own subdirectory. I have to write functional tests for some of these subprojects. The subprojects produce independent results, but with the same structure, so there is a lot of common code for testing them, and I want to share it in some special location.
Restrictions:
As the developers requested, the tests for a subproject should live in that subproject's directory (to be precise, in a subdirectory, for example func_tests).
I have some shared dependencies that I usually use for my test projects, for example Google Guava, TestNG and so on, and also some settings for the test run (excludeGroups 'slow', ...). I would prefer these settings to be common too, but that doesn't matter too much.
Symbolic links are an acceptable approach, if that makes for a good design. :)
If possible, I want IntelliJ IDEA to handle this dependency correctly.
My ideas:
Symlink src/main of every test subproject to some common directory (src/test stays "individual"). This would work well in the IDE, but it would lead to duplicating all the dependencies and preferences. Also, I'm very unsure whether that's the preferred way in Gradle.
Create a common project, which is imported by every subproject. This would save repeating the dependencies (would it?), but I'm not sure IDEA will handle it correctly.
What is the idiomatic way to do this with Gradle?
Look at samples/java/withIntegrationTests in your Gradle installation. This will give you some idea of how to add your tests (there are other ways too). You will want to tweak that setup to make sure that IDEA handles your tests; this is done by customizing idea.module.scopes.
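For example, a minimal sketch of that customization (the funcTest source set and the funcTestCompile configuration are assumed names for your functional tests, not something Gradle defines for you):

    apply plugin: 'idea'

    // Assumed: a dedicated source set for the functional tests living in
    // the subproject's func_tests subdirectory.
    sourceSets {
        funcTest {
            java.srcDir 'func_tests/java'
            resources.srcDir 'func_tests/resources'
        }
    }

    idea {
        module {
            // Let IDEA treat the functional test sources as test sources
            // and pick up their dependencies in the TEST scope.
            testSourceDirs += sourceSets.funcTest.java.srcDirs
            scopes.TEST.plus += [configurations.funcTestCompile]
        }
    }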
Shared code and shared libraries: you can create a map like https://github.com/gradle/gradle/blob/master/gradle/dependencies.gradle and use it in different subprojects. By the way, the Gradle codebase has a lot of integration tests, and you can check how its build is configured to see if you want to borrow some ideas.
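A minimal sketch of such a map (names and versions are illustrative):

    // gradle/dependencies.gradle, applied from the root build.gradle via
    // "apply from: 'gradle/dependencies.gradle'"
    ext.libraries = [
        guava : 'com.google.guava:guava:18.0',
        testng: 'org.testng:testng:6.9.10',
    ]

Each subproject can then declare its test dependencies against the map, e.g. funcTestCompile libraries.guava, so the versions stay defined in one place.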
Related
I have multiple app projects of roughly this layout:
- example app (Java)
  - Java Wrapper with additional functionality
    - C++ + Shallow Java Wrapper
- 2nd example app (Flutter)
  - Flutter wrapper
    - Java Wrapper with additional functionality
      - C++ + Shallow Java Wrapper
- 3rd example app
  - Flutter wrapper
    - Java Wrapper with additional functionality
      - C++ + Shallow Java Wrapper
All apps share the same main dependency (the Java Wrapper with additional functionality) and its dependency tree. I am developing on each app all the way down to the C++ code. The projects are managed as git submodules in their respective parent project.
As there is a high rate of change along the whole chain, I want the final example app to be built for testing from all sources.
I tried several approaches for tying this together into one gradle build:
1. Preferred (but failing) solution: settings.gradle in each project, each project only includes direct dependencies
Now I want this full tree to be managed in one Flutter build. So I added the direct dependencies to each project's settings.gradle, only to learn that Gradle supports just a single top-level settings.gradle. So this does not work. The solutions presented in the aforementioned question mostly try to emulate support for multiple settings.gradle files.
2. Functioning but Ugly: All dependency projects are included in the top-level settings.gradle
Do I really have to include all subprojects manually in the top-level settings.gradle, when each subproject knows its own dependencies perfectly well? Furthermore, since there are multiple projects depending on these, do I have to do this manually for each of them?
(And don't even get me started on Gradle not telling me that I have a wrong projectDir because of a typo on the 100th level of recursive descent!)
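For reference, this is roughly what that approach looks like (project names and paths are made up):

    // Top-level settings.gradle: every transitive subproject is listed by
    // hand, with its location spelled out explicitly.
    include ':app'
    include ':javaWrapper'
    include ':shallowWrapper'

    project(':javaWrapper').projectDir = file('wrapper/java-wrapper')
    project(':shallowWrapper').projectDir = file('wrapper/java-wrapper/cpp/shallow')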
3. Probably Working Solution: Use composite builds
This will trigger the builds, but now I have to resolve the build artifacts instead of the projects, so I end up with the same problem, just with artifacts instead of project references.
4. Probably Working Solution: Publish dependency projects to a Maven (or other) repository and pull them into the app
I did not try this because I find the idea abhorrent: I want to test one small change in the C++ code, and now I have to push it to a repository and potentially do the same for every project above it?
This works for a stable project but not for flexible exploratory development. Sure, I want to publish something at the end but I don't want to publish every little step in between.
This left me wondering: am I doing something unusual? I mean, is there really nobody with the same requirements, which Gradle does not seem able to meet:
live updates from the very bottom of the tree, to quickly test local changes
no repeating of transitive dependencies on the toplevel
What is the common practice in this case?
After Lukas Körfer's comment I took a closer look at composite builds and noticed that I had a misconception about them: I did not understand that their dependency resolution would take care of finding the build artifacts for me.
Now I use composite builds to tie the whole build together, using
implementation 'my.group:project'
to import the code of the subprojects and
includeBuild '../path/to/subproject/'
to pull them in.
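Put together, the setup looks roughly like this (group, names and paths are placeholders):

    // settings.gradle of an app
    rootProject.name = 'example-app'
    includeBuild '../java-wrapper'

    // build.gradle of the same app
    dependencies {
        // Gradle's dependency substitution replaces this binary dependency
        // with the subproject of the included build whose coordinates match.
        implementation 'my.group:java-wrapper'
    }

Each wrapper does the same for its own direct dependency, so no build has to list the whole transitive tree.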
Never found a really satisfactory solution to this. How do you do it? I am looking for inspiration for new approaches.
For context, assume I write a generator that takes a project resource and generates a code file. But it could be any other project support tool - validator, converter, deployer etc. Often manually triggered actions that are not running as part of normal build.
Such tools typically require a few dependencies that are not required by the project itself at runtime.
Strategies that I have applied or considered in the past:
add the tool dependency to the project anyway, and either mark it "provided" or filter it out during the packaging process (this is what I usually do; see the sketch after this list. The danger is that normal project code may accidentally use the tool dependency, resulting in an error that only manifests at runtime)
use a script (trying hard to avoid scripts and their hidden dependencies and complexities)
create separate support projects (trying hard to avoid project explosion, especially for seemingly small tasks that are handled by a few lines of code)
subprojects / modules (only vaguely aware of this option, never really tried it)
Maven plugin that is run with a profile and has separate dependencies (trying to avoid the separate project required to maintain the custom plugin)
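For the first strategy, the "provided" variant would look like this in the pom.xml (the coordinates are made up):

    <!-- The tool dependency is visible at compile time but is not
         packaged into the final artifact. -->
    <dependency>
        <groupId>org.example</groupId>
        <artifactId>code-generator</artifactId>
        <version>1.0</version>
        <scope>provided</scope>
    </dependency>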
Inspiration from answers and comments
separate tools project shared by multiple projects
I just realized that Maven and Eclipse have already solved exactly this problem for a very specific "tool": test code.
Test code often needs additional dependencies not used by the application itself.
People obviously invested quite a bit to keep the "test / tool" infrastructure within the same project, as opposed to creating a separate test-project:
separate source locations (src/main/java, src/test/java)
separate resource locations (src/main/resources, src/test/resources)
a full-blown separate Maven dependency scope "test", complete with transitive resolution
separate compilation phases (compile / test) with separate dependency trees
Eclipse supports special JUnit launch configurations that are able to correctly resolve the test dependencies
probably more stuff that I am not aware of currently
So I am strongly considering programming all my supporting tools as "JUnit test cases".
I am planning to create and commit shared JUnit launch configs for the team that each execute just one specific "test case", which runs the tool logic instead of testing anything.
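A sketch of what such a "test case" could look like (the class and generator names are hypothetical):

    import org.junit.Test;

    // Not a real test: this wraps the tool logic so the IDE's JUnit
    // runner can launch it with the test classpath and test dependencies.
    public class GenerateCodeTool {

        @Test
        public void run() throws Exception {
            // Hypothetical generator call; replace with the actual tool logic.
            new CodeGenerator().generate("src/main/resources/model.xml");
        }
    }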
The problem I have to solve is keeping these dummy tests from running during the normal Maven test phase.
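One way to do that would be a naming convention plus a Surefire exclude (the *Tool suffix is an assumption, not a Maven default); the committed launch configs can still run the excluded classes from the IDE:

    <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-surefire-plugin</artifactId>
        <configuration>
            <excludes>
                <!-- Keep tool wrappers out of the normal test phase. -->
                <exclude>**/*Tool.java</exclude>
            </excludes>
        </configuration>
    </plugin>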
Also, writing this, I realize that there is yet another such system already in place: the Maven plugin infrastructure, which also has a separate dependency resolution mechanism. So far, though, it seems necessary (or at least normal) to create separate projects for plugins. I will look into ways of writing and building project-specific Maven plugins without needing separate projects. I am thinking about generating the pom.xml needed for the plugin compilation on the fly, and including all the test dependencies in it.
I am new to using GitHub and have been trying to figure out this question by looking at other people's repositories, but I cannot figure it out. When people fork/clone repositories on GitHub to their local computers to develop on the project, is it expected that the cloned project is complete (i.e. that it has all of the files it needs to run properly)? For example, if I were to use a third-party library in the form of a .jar file, should I include that .jar file in the repository so that my code is ready to run when someone clones it, or is it better to just make a note that I am using such-and-such third-party libraries, so the user needs to download those libraries themselves before they begin work? I am just trying to figure out the best practices for my code commits.
Thanks!
Basically it is as Chris said.
You should use a build system that has a package manager. This way you specify which dependencies you need, and they are downloaded automatically. Personally I have worked with Maven and Ant, so here is my experience:
Apache Maven:
First, a word about Maven: it is not a package manager, it is a build system. It just includes a package manager, because for Java folks downloading the dependencies is part of the build process.
Maven comes with a nice set of defaults. This means you just use the archetype plugin to create a project ("mvn archetype:create" on the CLI). Think of an archetype as a template for your project. You can choose whatever archetype suits your needs best; in case you use some framework, there is probably an archetype for it. Otherwise the simple-project archetype will be your choice. Afterwards your code goes to src/main/java, your test cases go to src/test/java, and "mvn install" builds everything. Dependencies can be added to the pom in Maven's dependency format. http://search.maven.org/ is the place to look for dependencies. If you find one there, you can simply copy the XML snippet into your pom.xml (which has been created by Maven's archetype system for you).
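For example, a typical snippet as copied into the <dependencies> section of the pom.xml (the version is just an example):

    <dependency>
        <groupId>com.google.guava</groupId>
        <artifactId>guava</artifactId>
        <version>18.0</version>
    </dependency>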
In my experience, Maven is the fastest way to get a project with dependencies and test execution set up. I have also never seen a Maven build that worked on my machine fail somewhere else (except on computers with years-old Java versions). The charm is that Maven's default lifecycle (or build cycle) covers all the usual needs, and there are plugins for almost everything. You do have a big problem, however, if you want to do something that is not covered by Maven's lifecycle. I have only ever encountered that in mixed-language projects: as soon as you need anything but Java, you're screwed.
Apache Ivy:
I've only ever used it together with Apache Ant. Ivy is a package manager; Ant provides the build system. Ivy is integrated into Ant as a plugin. While Maven usually works out of the box, Ant requires you to write your build file manually. This allows for greater flexibility than Maven, but comes at the price of yet another file to write and maintain. Basically, Ant files are as complicated as any source code, which means you should comment and document them; otherwise you will not be able to maintain your build process later on.
Ivy itself is as easy as Maven's dependency system. You have an XML file which defines your dependencies. As with Maven, you can find the appropriate XML snippets on Maven Central (http://search.maven.org/).
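A minimal ivy.xml might look like this (the module info and revisions are illustrative):

    <ivy-module version="2.0">
        <info organisation="com.example" module="my-app"/>
        <dependencies>
            <dependency org="com.google.guava" name="guava" rev="18.0"/>
            <dependency org="junit" name="junit" rev="4.11"/>
        </dependencies>
    </ivy-module>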
In summary, I recommend Maven in case you have a plain Java project. Ant is for cases where you need to do something special in your build.
While downloading Google Guice I noticed two main "types" of artifacts available on their downloads page:
guice-3.0.zip; and
guice-3.0-src.zip
Upon downloading them both and inspecting their contents, they seem to be two totally different "perspectives" of the Guice 3.0 release.
The guice-3.0.zip just contains the Guice jar and its dependencies. The guice-3.0-src.zip, however, did not contain the actual Guice jar, but it did contain all sorts of other goodness: javadocs, examples, etc.
So it got me thinking: there must be different "configurations" of jars that get released inside Java projects. Crossing this idea with what little I know from build tools like Ivy (which has the concept of artifact configurations) and Maven (which has the concept of artifact scopes), I am wondering what the relation is between artifact configuration/scope and the end deliverable (the jar).
Let's say I was making a utility jar called my-utils.jar. In its Ivy descriptor, I could cite log4j as a compile-time dependency, and junit as a test dependency. I could then specify which of these two "configurations" to resolve against at buildtime.
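Concretely, I imagine the descriptor looking something like this (the coordinates are made up):

    <ivy-module version="2.0">
        <info organisation="com.example" module="my-utils"/>
        <configurations>
            <conf name="compile" description="needed to build and run"/>
            <conf name="test" extends="compile" description="needed for tests only"/>
        </configurations>
        <dependencies>
            <!-- Each dependency is mapped onto one of my configurations. -->
            <dependency org="log4j" name="log4j" rev="1.2.17" conf="compile->default"/>
            <dependency org="junit" name="junit" rev="4.11" conf="test->default"/>
        </dependencies>
    </ivy-module>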
What I want to know is: what is the "mapping" between these configurations and the content of the jars that are produced in the end result?
For instance, all of my compile-configuration dependencies might wind up packaged in the main my-utils.jar, but would there ever be a reason to package my test dependencies into a my-utils-test.jar? And what kind of dependencies would go into the my-utils-src.jar?
I know these are a lot of tiny questions, so I guess you can sum everything up as follows:
For a major project, what are the typical varieties of jars that get released (such as guice-3.0.zip vs guice-3.0-src.zip, etc.), what are the typical contents of each, and how do they map back to the concept of Ivy configurations or Maven scopes?
The one you need to run is guice-3.0.zip. It has the .class files in the correct package structure.
The other archive, guice-3.0-src.zip, has the .java source files and other things that you might find useful. A smart IDE, like IntelliJ IDEA, can use the sources to allow you to step into the Guice code with a debugger and see what's going on.
You can also learn a lot by reading the Guice source code. It helps to see how developers who are smarter than you and me write code.
I'd say that the best example I've found is the Efficient Java Matrix Library at Google Code. That has an extensive JUnit test suite that's available along with the source, the docs, and everything else that you need. I think it's most impressive. I'd like to emulate it myself.
I develop a little Java utility library using Maven. Now I'd like to add some demo / sample code to show how to use the library.
Where is the best place to put it?
In a sub-package with the other code. I don't like this since it means the demos will be included in the library jar file.
In a new Maven artifact. That works, but I'd prefer to have the demos more closely connected to the library source.
As a sub-artifact. Haven't tried this yet. Seems to make everything a bit complex for something that should be simple.
Is there any common pattern to do this?
If it's some sample code snippets that run by themselves and just demonstrate how to use the library, then write them as unit tests, in the same module.
If it's more like a separate demo application (that a user might even interact with), then create a separate artifact; that is the standard way of doing it. If you really want to, you could put it in the same module but in a different source directory, but that just makes things harder on yourself.
Your library and your demo should probably share a parent module (of packaging type "pom", not "jar" like the others), giving you a multi-module project. Then you can build both by launching Maven from the parent module.
If you want to release your library and demo together (you can, but you don't have to), you can do that from the parent too.
In other words, it's not because they are separate modules, packaged in different artifacts, that they cannot be closely connected anymore. The different modules of a multi-module project still form one whole project.
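A sketch of such a parent pom.xml (the artifact names are placeholders):

    <project xmlns="http://maven.apache.org/POM/4.0.0">
        <modelVersion>4.0.0</modelVersion>
        <groupId>com.example</groupId>
        <artifactId>my-library-parent</artifactId>
        <version>1.0-SNAPSHOT</version>
        <!-- "pom", not "jar": this module only aggregates the others. -->
        <packaging>pom</packaging>
        <modules>
            <module>my-library</module>
            <module>my-library-demo</module>
        </modules>
    </project>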
You haven't said what kind of utility library it is, but if it's something like Apache Commons, then most of the demos can be written as JUnit tests, which are placed in the same artifact. Well-designed JUnit tests both test your code and provide examples of how to use your utilities.
I prefer a new Maven artifact; it keeps your own artifact clean.
I would recommend creating a Maven multi-module project where one module is the core and one module is the demo code. That way the user can choose whether to build both modules (they become separate artifacts) or just the core.