Remote Java compiler - java

I'm looking for a way to boost my team's productivity, and one way to do that would be to shorten the time it takes to compile, unit test, package, and deploy our Java EE application, which keeps getting bigger.
The trivial solution that I know of is to set up a powerful computer with N processors (N ~= number of developers), a blazingly fast disk system and a lot of memory, run everything on this computer and connect to it remotely via X. It would certainly be much faster than compiling on our laptops, yet still cheaper and easier to maintain than buying each developer his/her own supercomputer.
Is there another way to solve this problem? For example, could we run our IDEs locally and tell them to compile the Java source remotely? Can NetBeans / Eclipse / IntelliJ / etc. do this? Or is there a dedicated tool for remote Java compilation, ideally one that makes use of multiple processors? It need not be free/open source.
Unfortunately our laptops MUST run a (company managed) Windows Vista, so another reason to go for the separate server computer is to let us use linux on it and finally get rid of the annoying managed environment.
EDIT: to sum up the answers so far, one way to shorten build times is to leave compilation to the individual developers (because compiling is supposed to be fast), skip running the unit tests, and hot-deploy (without packaging) to the container.
Then, when the developer decides to check his/her code in, a continuous integration server (such as Hudson) is triggered to clean & build & run tests & package & deploy.
SOLUTION: I've accepted Thorbjørn's answer since I think that's going to be the closest to which way I'm planning to proceed. Although out of curiosity I'm still interested in solving the original problem (=remote Java compiling)...

You essentially need two workflows.
The OFFICIAL build, which checks out the sources, builds the whole thing from scratch, runs all the unit tests, and then builds the bits which will eventually ship to the customer after testing.
Developer hot-deploying after each source code change into the container the IDE knows about.
These two can actually be vastly different!
For the official build, get Jenkins up and running and tell it to watch your source repository and build whenever there is a change (and to notify those who break the build). If you can get the big computer for building, use it for this purpose.
For the developers, look into a suitable container with very good IDE deployment options, and set that up for each and every developer. This will VERY rapidly pay off! JBoss was previously very good for exactly this purpose.
And, no, I don't know of any efficient remote Java compilation option, and I don't think this is what you should pursue for the developers.
See what Joel thinks about Build Servers: http://www.joelonsoftware.com/articles/fog0000000023.html
If you don't like Jenkins, plenty others exist.
(2016 edit: Hudson changed to Jenkins. See https://stackoverflow.com/a/4974032/53897 for the history behind the name change)

It's common to set up a build server, e.g. one running Hudson, to do the compiling/packaging/unit-testing/deploying.
Though you'd likely still need the clients to at least perform a compile. Shifting to a build server may also mean changing your work process if you aren't using one now - e.g. if the goal is to take load off the client machines, developers check code in and the unit tests are run automatically, instead of running the unit tests first and then checking in.

You could mount each developer's working directory on the powerful machine (e.g. via NFS or a Samba share) and then create an External Tool Configuration in Eclipse (GUI access) that triggers the build on the external server.

JavaRebel can also increase productivity. It eliminates the need for redeployments.
You can recompile a single file and see the changes applied directly on the server.
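For the curious, the effect JavaRebel automates can be approximated by hand with a throwaway classloader. A minimal sketch of the underlying idea follows; the classes directory and the com.example.Task class are made up, and real tools do far more bookkeeping (they patch classes in place without discarding state):

```java
import java.io.File;
import java.net.URL;
import java.net.URLClassLoader;

// Rough illustration of hot class reloading: a fresh classloader per
// iteration picks up freshly recompiled .class files without a JVM restart.
// "classes" and "com.example.Task" are hypothetical names.
public class HotReloadDemo {
    public static void main(String[] args) throws Exception {
        URL classesDir = new File("classes").toURI().toURL();
        while (true) {
            try (URLClassLoader loader = new URLClassLoader(new URL[] { classesDir })) {
                // Task must NOT be on the application classpath, so that
                // this child loader (not the parent) defines the class.
                Class<?> taskClass = loader.loadClass("com.example.Task");
                Runnable task = (Runnable) taskClass.getDeclaredConstructor().newInstance();
                task.run(); // runs whatever was most recently compiled
            }
            Thread.sleep(5000); // recompile Task.java between iterations
        }
    }
}
```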

When things start getting too big for efficient builds, it may be time to investigate breaking up your code into modules/JARs (how it breaks apart would depend on many project specifics and how your team tends to work). If you find a good setup, you can get away with less compiling (you don't always need to rebuild the whole project) and more/quicker copying/JARing to get to the point where you can test new code.

What your project needs is a build system to do the building, testing and packaging for you. Hudson is a good example of such a continuous integration build system.

Related

Easy deployment of a JVM-based web server on a remote machine

I wanted to know the easiest way to deploy a web server written in Java or Kotlin. With Node.js, I just keep all the server code on the remote machine and edit it using the sshfs plugin for VSCode. For JVM-based servers this doesn't appear as easy, since IntelliJ doesn't provide remote editing support. Is there a method for JVM-based servers that allows a quick iterative development cycle?
Do you have to keep your server code on remote machine? How about developing and testing it locally, and only when you want to test it on the actual deployment site, then deploy it?
I once tried to use SSH-FS with IntelliJ, and because of the way IntelliJ builds its cache, the performance was terrible. The caching was in progress, but after 15 minutes I gave up. And IntelliJ without its caching and smart hints would be close to a regular editor.
In my professional environment, I also use Unison from time to time: https://www.cis.upenn.edu/~bcpierce/unison/. I have it configured to copy only code, not the generated sources. Most of the time it works pretty well, but it tends to have its quirks, which can make you waste half a day debugging it.
To sum up, I see such options:
Developing and testing locally, and avoiding frequent deployments to the remote machine.
VSCode with the sshfs plugin, because why not, if it's enough for you with Node.js?
A synchronization tool like Unison.
Related answers regarding SSHFS from IntelliJ Support (several years old, but, I believe, still hold true):
https://intellij-support.jetbrains.com/hc/en-us/community/posts/206592225-Indexing-on-a-project-hosted-via-SSHFS-makes-pycharm-unusable-disable-indexing-
https://intellij-support.jetbrains.com/hc/en-us/community/posts/206599275-Working-directly-on-remote-project-via-ssh-
A professional deployment won't keep source code on the remote server, for several reasons:
It's less secure. If you can change your running application by editing source code and recompiling (or even if edits are deployed automatically), it's that much easier for an attacker to do the same.
It's less stable. What happens to users who try to access your application while you are editing source files or recompiling? At best, they get an error page; at worst, they could get a garbage response, or even a leak of customer data.
It's less testable. If you edit your source code and deploy immediately, how do you test to ensure that your application works? Throwing untested buggy code directly at your users is highly unprofessional.
It's less scalable. If you can keep your source code on the server, then by definition you only have one server. (Or, slightly better, a small number of servers that share a common filesystem.) But that's not very scalable: you're clearly hosted in only one geographic location and thus vulnerable to all kinds of single points of failure. A professional web-scale deployment will need to be geographically distributed and redundant at every level of the application.
If you want a "quick iterative development cycle" then the best way to do that is with a local development environment, which may involve a local VM (managed with something like Vagrant) or a local container (managed with something like Docker). VMs and containers both provide mechanisms to map a local directory containing your source code into the running application server.

Technique to detect use of features that are "dangerous" across different Java implementations or OSes?

I am responsible for a number of java application servers, which host apps from different developers from different teams.
A common problem is that an app arrives that does not work as expected, and it turns out that the app was developed on some other platform, e.g. OpenJDK on Windows XP or 7, and then deployed to Linux running the Oracle JDK, or vice versa.
It would be nice to be able to enforce something up front, but this is practically not possible.
Hence, are there any techniques to detect problems upon deployment, i.e. without the source code, by scanning the class files?
If that is not possible, what tool can I send to the developers so they can identify from their source code what incompatibilities they have relied upon?
It's not possible to detect all such problems in a totally automated way. For example, it is extremely difficult to detect hard coded pathnames which are probably the single biggest issue with cross-platform deployments.
My suggestion would be to compensate for this with more automated testing at deployment time.
For example, in a similar environment I used to insist on deployment test suites: typically implemented as a secure web page that I could navigate to as the administrator which would run a suite of tests on the deployed application and display all the results.
If anything failed, it was an immediate rollback.
Some of the tests were the same as the tests used in development, but some were different ones (e.g. checking the expected configuration of the environment being deployed into)
Obviously, this means you have to bundle at least some of your test code into your production application, but I think it is well worth it. As an administrator, you are going to be much more confident pushing a release to production if you've just seen a big screen full of green ticks in your staging environment.
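As a concrete illustration (not the answerer's actual suite), a standalone check runner of this kind might look like the following; the properties and paths tested here are placeholders:

```java
import java.io.File;

// Standalone sketch of a deployment-time smoke test in the spirit described
// above. The checked properties and paths are examples, not real checks.
public class DeploymentChecks {
    private static int failures = 0;

    public static void main(String[] args) {
        check("running on Linux", System.getProperty("os.name").toLowerCase().contains("linux"));
        check("expected Java version", System.getProperty("java.version").startsWith("1.6"));
        // Hard-coded pathnames are the classic cross-platform killer:
        check("config file present", new File("/etc/myapp/app.properties").exists());
        check("temp dir writable", new File(System.getProperty("java.io.tmpdir")).canWrite());
        System.out.println(failures == 0 ? "ALL GREEN" : failures + " check(s) FAILED - roll back");
        System.exit(failures == 0 ? 0 : 1);
    }

    private static void check(String name, boolean ok) {
        System.out.println((ok ? "[PASS] " : "[FAIL] ") + name);
        if (!ok) failures++;
    }
}
```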

Suggestions needed for sharing code onsite/offsite

I am new to a project where developers still share code by sending files by mail.
We are using eclipse and cvs.
Developers from offsite send their code for review to onsite, where other developers take the files one by one from their mail and replace them in Eclipse. That is OK for 2 or 3 files, but as the number of files keeps increasing this task becomes a real pain.
We cannot put the source files into the cvs as untested code from offsite can crash our build server.
Here my question begins:
What can be the better ways to share code?
We don't want to create branches for each change because in that case we will end up with 10-12 branches every day.
Code should be tested via continuous integration, especially in your situation where your programmers are scattered literally across the world. Your offshore people should be using unit/integration testing to ensure that they don't break the build. You should institute a process where, before they finish for the day, they verify the integrity of the build.
If they are not, they are not worth the money you are paying them.
I suggest you give the offsite developers the ability to perform the same test as your build server. There is no reason they should be sending you code which they cannot test (or test that it at least runs without crashing).
Is there any reason they cannot access your systems via VPN? That way they could test the code against your build server (or a second one) and merge the code themselves.

Distributed Java Compiler

Is there a distributed compiler for Java, analogous to distcc for C/C++?
The direct answer to your question is "no". However, it probably would not help you anyway… compiling Java is very fast.
On a small project, compilation is fast enough that you shouldn't really care. On a large project, you would need to deal with sending the file to compile over the network, and potentially also shipping many megabytes of dependencies along with it.
One thing that you can do to improve your compilation speed is to use the Eclipse compiler (ECJ) instead of Sun's javac. The Eclipse compiler is multi-threaded and so will, with luck, use all the cores of your machine.
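If you want to try ECJ outside the IDE, it can also be driven programmatically through its batch API. A hedged sketch, assuming the ecj jar is on the classpath and using example source/output paths:

```java
import java.io.PrintWriter;
import org.eclipse.jdt.core.compiler.batch.BatchCompiler;

// Sketch: invoking the Eclipse batch compiler (ecj) programmatically.
// Requires the ecj jar on the classpath; "src" and "classes" are examples.
public class EcjDemo {
    public static void main(String[] args) {
        boolean ok = BatchCompiler.compile(
                "-source 1.6 -target 1.6 -d classes src",
                new PrintWriter(System.out),   // compiler messages
                new PrintWriter(System.err),   // compiler errors
                null);                         // no progress callback
        System.out.println(ok ? "compiled" : "failed");
    }
}
```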
It is probably also worth mentioning that Apple recently dropped distcc support, since in general, on newer hardware, it cost more time to get the code somewhere else to compile and back than it did to just do the compilation locally. To quote Apple:
The single-computer build performance of Xcode has been improved to the point that distributed building with Distributed Network Builds is slower than local builds in most circumstances.
Maybe Jikes would work for you. You can achieve very similar effects with a clever Ant script and an NFS-like filesystem...
If you're annoyed with waiting a long time for your java compiles, then you might consider one of the following:
break your project up into several different jar files (in a hierarchic dependency). With any luck, your changes will only affect source in one of those jars, while the others can continue to serve as dependencies.
break your project into groups of sources, perhaps by package, and use Apache Ant to coordinate your compiling. I was always too lazy to use it, but you can set up explicit dependency management to avoid re-compiling stuff for which .class files already exist and are newer than the source (see the sketch after this list). The effort that goes into setting this up once can reap dividends within a few days if the project is large and compiles are chewing up a lot of your time.
As opposed to multi-coring, reducing the amount of code that you need to recompile will also reduce your PC's energy consumption and carbon footprint ;)
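To make the second idea concrete: the standard javax.tools API is enough for a crude newer-than check before compiling. A minimal sketch, with src and classes as assumed directory names:

```java
import java.nio.file.*;
import java.util.*;
import java.util.stream.Collectors;
import java.util.stream.Stream;
import javax.tools.JavaCompiler;
import javax.tools.ToolProvider;

// Crude "newer-than" build: recompile only .java files whose .class file is
// missing or older. "src" and "classes" are assumed names. Note that a real
// incremental build must also recompile dependents of changed classes
// (Ant's <depend> task handles that part).
public class StaleCompile {
    public static void main(String[] args) throws Exception {
        Path src = Paths.get("src");
        Path out = Paths.get("classes");
        Files.createDirectories(out); // javac's -d requires the directory to exist
        List<String> stale = new ArrayList<>();
        try (Stream<Path> files = Files.walk(src)) {
            for (Path java : files.filter(p -> p.toString().endsWith(".java"))
                                  .collect(Collectors.toList())) {
                Path clazz = out.resolve(
                        src.relativize(java).toString().replaceAll("\\.java$", ".class"));
                if (!Files.exists(clazz) || Files.getLastModifiedTime(clazz)
                        .compareTo(Files.getLastModifiedTime(java)) < 0) {
                    stale.add(java.toString());
                }
            }
        }
        if (stale.isEmpty()) {
            System.out.println("Everything up to date.");
            return;
        }
        JavaCompiler javac = ToolProvider.getSystemJavaCompiler(); // needs a JDK, not a JRE
        List<String> argv = new ArrayList<>(Arrays.asList("-d", out.toString()));
        argv.addAll(stale);
        int rc = javac.run(null, null, null, argv.toArray(new String[0]));
        System.out.println(rc == 0 ? stale.size() + " file(s) recompiled." : "Compilation failed.");
    }
}
```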
I did write the start of one for Java 6:
http://www.pointdefence.net/jarc/index.html
It distributes work at the Java compiler task level, so it would work well with parallel compilation of independent Maven modules.
I think the parallel compilation of independent Maven modules should be quite easy using some simple scripts - just pull from version control, change dir and run mvn clean compile. Add mvn deploy to get the artifact to your artifact repository.
This should work even with dependent modules, though it will need some work on synchronization.
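A minimal sketch of such a script in Java, for the independent-module case; the module directory names are placeholders, and output from parallel builds will interleave:

```java
import java.io.File;
import java.util.*;
import java.util.concurrent.*;

// Builds independent Maven modules in parallel, one "mvn clean compile"
// per module directory. Module names here are placeholders.
public class ParallelMvn {
    public static void main(String[] args) throws Exception {
        List<String> modules = Arrays.asList("module-core", "module-web", "module-batch");
        ExecutorService pool =
                Executors.newFixedThreadPool(Runtime.getRuntime().availableProcessors());
        List<Future<Integer>> results = new ArrayList<>();
        for (String module : modules) {
            results.add(pool.submit(() -> new ProcessBuilder("mvn", "clean", "compile")
                    .directory(new File(module))
                    .inheritIO() // note: parallel output will interleave
                    .start()
                    .waitFor()));
        }
        for (int i = 0; i < modules.size(); i++) {
            if (results.get(i).get() != 0) {
                System.err.println("Build failed in " + modules.get(i));
            }
        }
        pool.shutdown();
    }
}
```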

Tips for speeding up build time on Linux using ANT, Javacc, JUnit and compiling Java classes

We have a large codebase that takes approx. 12 minutes on the developer machines to auto-generate some Java 5 classes using JavaCC, compile all the classes, and run the unit tests.
The project consists of multiple sub-projects which can be built in groups, but we are aiming for a full build in under 10 minutes.
What tips are there for reducing this build time?
Thanks
One quick fix that might shave some time off is to ensure that you are running Ant using the server JVM (by default it uses the client VM). Set ANT_OPTS to include "-server".
Profile the build process and see where the bottlenecks are. This can give you some ideas as to how to improve the process.
Try building independent projects in parallel on multi-core/CPU machines. As an extension of this idea, you may want to look around for a Java equivalent of distcc (don't know whether it exists) to distribute your build over a number of machines.
Get better machines.
Some tips for reducing build time:
Do less work: e.g. remove unnecessary logging/echoing to files and console.
Make your build incremental: compile only changed classes.
Eliminate duplicated effort: easier said than done, but if you run the build in debug mode ("ant -debug") you can sometimes spot redundant tasks or targets.
Avoid expensive operations: copying files and packaging JARs into WARs are necessary for a release, but signing JARs is expensive and should, if possible, be done only for milestone releases rather than on every build.
Try taking inspiration from The Pragmatic Programmer: compile only what is necessary, and have two or more test suites - one for quick tests, another for full tests. Consider whether each build step really needs to run every time. If necessary, try the Jikes compiler instead of javac; once a project spans several hundred classes I switch to Jikes to improve speed, but be aware of potential incompatibility issues. Don't forget to include one all-in-one target that performs every step with a full rebuild and full test of the project.
Now that you've explained the process in more detail, here are two more options:
A dedicated machine/cluster where the build is performed much quicker than on a normal workstation. The developers would then, before a commit, run a script that builds their code on the dedicated machine/cluster.
Change the partitioning into sub-projects so that it's harder to break one project by modifying another. This should then make it less important to do a full build before every commit. Only commits that are touching sensitive sub-projects, or those spanning multiple projects would then need to be "checked" by means of a full build.
This probably wouldn't help in the very near term, but figured I should throw it out there anyway.
If your project can be broken into smaller projects (a database subsystem or logging, as examples), you may be interested in using something like Maven to handle the build. You can run each smaller bite as a separate project or module, and Maven will keep track of what needs to be rebuilt when changes exist. This way the build can focus on the main portion of your project, and it won't take nearly as long.
What is the breakdown in time spent:
generating the classes
compiling the classes
running the tests
Depending on your project, you may see significant improvements in build time by allocating a larger heap size to javac (memoryMaximumSize) and junit (maxmemory).
Is it very important that the entire build lasts less than 10 minutes? If you make the sub-projects independent from one another, you could work on one sub-project while having already compiled the other ones (think Maven or Ivy to manage the dependencies).
Another solution is to treat your sub-projects as standalone projects: each would then follow its own release cycle and be available from a local Maven/Ivy repository. This of course works well only if at least parts of the project are reasonably stable.
