Are there any benefits to developing Java in a virtual machine?

I'm coming from the .Net camp, where virtualization was much more prevalent due to the need to run server software and deal with system-wide entities such as the GAC.
Now that I'm doing Java development, does it make sense to continue to employ virtualization? We were using VirtualPC which, IMO, wasn't the greatest offering. If we were to move forward, we would, hopefully, be using VMWare.
We are doing web development and wouldn't use virtualization to test different flavors of server deployment.
Pros:
Allows development environments to be identical across the team
Allows for isolation from the host
Cross-platform/browser testing
Cons:
Multiple-monitor support is lacking (or does VMWare support it?)
Performance degradation - mostly I/O
Huge virtual disks

One possible advantage is that you could technically test out the same program on a variety of operating systems and on a variety of JVMs.
Contrary to popular opinion, Java is not 100% portable, and it is very possible to write nonportable code. In addition, there are subtle differences between library versions, and different JVMs can behave differently as well.
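As a contrived illustration of code that compiles everywhere but only behaves as intended on some platforms, consider relying on the platform default charset (the file name is made up):

    import java.io.FileInputStream;
    import java.io.InputStreamReader;
    import java.io.Reader;

    public class EncodingPitfall {
        public static void main(String[] args) throws Exception {
            // Nonportable: decodes with whatever the platform default charset happens to be
            Reader implicit = new InputStreamReader(new FileInputStream("data.txt"));

            // Portable: state the encoding you actually mean
            Reader explicit = new InputStreamReader(new FileInputStream("data.txt"), "UTF-8");

            implicit.close();
            explicit.close();
        }
    }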
However, since Java IDEs are heavyweight, running an IDE within the VM may not be fun.
Java does support some forms of remote deployment; it might be beneficial to explore those while still doing the IDE work locally.

I don't like developing in a VM. The good news, in contrast to what you list as cons, is that multiple monitors are supported by VMWare, and the huge-disk issue isn't really a problem since VMWare runs surprisingly smoothly from USB hard disks.
Running the heavyweight IDEs for Java, as Uri said, won't be much fun in a VM. But then, running Visual Studio in a VM isn't really fun either. So if you were happy with VS in a VM, then give it a try for Java, because the cons aren't as strong as you might think :)

You said you're doing Java web development, so it makes sense to test your application with different web browsers on different operating systems. VMware will be useful for this.
The NetBeans IDE is operating-system independent, so you can have developers working on different operating systems without any trouble.

I run eclipse inside a VirtualBox instance and it works fine. I've used VMWare in the past and that's fine too.
I like having my development environment segmented away from whatever the rest of my PC is doing (playing games, surfing the web, reading email, etc...)
I work from home so virtualization provides necessary separation of work/play. It also allows me to upgrade each environment separately and have much more control over the environment.
Also I can safely try something new and revert if the install goes "wonky". Sorry for the highly technical term. ;-)
Edit: It also allows me to satisfy the corporate VPN access requirements without subjecting my home environment to excessive corporate influence.

If you need a VM to verify that the app will also run on a different OS, you can cover quite a lot of ground by using a continuous integration server and starting/running VM instances on that machine (i.e. Linux / Windows / OS X). Then unpack the latest build and run the unit tests against the delivered classes.
Then run automated integration tests, and report the results back to the CI environment.
If the integration tests are good, this can catch a lot of the common multi-platform mistakes right after they are committed to the SCM.
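As a rough, hypothetical sketch (JUnit 4 assumed), even a trivial test like this exercises platform-specific behaviour such as the temp directory when it runs inside each OS image, and would fail fast if the code under test ever assumed a hard-coded path:

    import static org.junit.Assert.assertTrue;

    import java.io.File;
    import java.io.FileWriter;

    import org.junit.Test;

    // Hypothetical smoke test to run inside each OS image on the CI box.
    public class PlatformSmokeTest {

        @Test
        public void canWriteToThePlatformTempDirectory() throws Exception {
            // Uses the platform's temp dir instead of a hard-coded path like C:\temp
            File tmp = File.createTempFile("ci-smoke", ".txt");
            try {
                FileWriter out = new FileWriter(tmp);
                out.write("hello");
                out.close();
                assertTrue("temp file should not be empty", tmp.length() > 0);
            } finally {
                tmp.delete();
            }
        }
    }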

Technique to detect use of features that are "dangerous" to use across different Java implementations or OSes?

I am responsible for a number of Java application servers, which host apps from different developers on different teams.
A common problem is that an app arrives that does not work as expected, and it turns out that the app was developed on some other platform, e.g. OpenJDK on Windows XP or 7, and then deployed to Linux running the Oracle JDK, or vice versa.
It would be nice to be able to enforce something up front, but this is practically not possible.
Hence, are there any techniques to detect problems upon deployment, I mean without the source code, by scanning the class files?
If that is not possible, is there a tool I can send to the developers so they can identify from their source code what incompatibilities they have relied upon?
It's not possible to detect all such problems in a totally automated way. For example, it is extremely difficult to detect hard-coded pathnames, which are probably the single biggest issue with cross-platform deployments.
My suggestion would be to compensate for this with more automated testing at deployment time.
For example, in a similar environment I used to insist on deployment test suites: typically implemented as a secure web page that I could navigate to as the administrator, which would run a suite of tests on the deployed application and display all the results.
If anything failed, it was an immediate rollback.
Some of the tests were the same as the tests used in development, but some were different (e.g. checking the expected configuration of the environment being deployed into).
Obviously, this means you have to bundle at least some of your test code into your production application, but I think it is well worth it. As an administrator, you are going to be much more confident pushing a release to production if you've just seen a big screen full of green ticks in your staging environment.
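A very rough sketch of what such a page might look like (a plain servlet; the checks and names are made up, and a real suite would reuse your existing test code and sit behind proper authentication):

    import java.io.IOException;
    import java.io.PrintWriter;

    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    // Hypothetical admin-only page that runs post-deployment checks.
    public class DeploymentCheckServlet extends HttpServlet {

        protected void doGet(HttpServletRequest req, HttpServletResponse resp)
                throws IOException {
            resp.setContentType("text/plain");
            PrintWriter out = resp.getWriter();

            report(out, "java.version = " + System.getProperty("java.version"), true);
            report(out, "os.name = " + System.getProperty("os.name"), true);
            // Example environment check: the working directory must be writable
            report(out, "working directory writable",
                   new java.io.File(System.getProperty("user.dir")).canWrite());
            // ...add database connectivity, required config keys, free disk space, etc.
        }

        private void report(PrintWriter out, String name, boolean ok) {
            out.println((ok ? "[PASS] " : "[FAIL] ") + name);
        }
    }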

RPG (iSeries) Modernization using JTOpen - What is possible?

We will in the near future be implementing a solution to modernize our iSeries applications, which are written as RPG programs with some stored procedures, and our preferred approach is to leverage the latest and greatest of what Java has to offer in this space.
From googling and checking other questions here on Stack Overflow, JTOpen seems to be the de facto library/toolset that has worked for most people, and I was encouraged to see that Tomcat runs on an iSeries box without any issues.
With this as the background, I am thinking of the following as the high-level solution architecture:
Install the IBM JRE and use JTOpen's capabilities to invoke RPG programs and, in some cases, directly call the stored procedures running on DB2
Have Tomcat host a modern web application built with Grails and other frameworks (Camel, Smooks) to provide an application-logic layer that handles any mediation and transformation required for the old functionality to be offered to the user from a browser
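For reference, a minimal sketch of what invoking an RPG program through JTOpen might look like; the host, credentials, library, program name and parameter layout below are all made up:

    import com.ibm.as400.access.AS400;
    import com.ibm.as400.access.AS400Message;
    import com.ibm.as400.access.AS400Text;
    import com.ibm.as400.access.ProgramCall;
    import com.ibm.as400.access.ProgramParameter;

    public class RpgCallSketch {
        public static void main(String[] args) throws Exception {
            // Hypothetical host and credentials
            AS400 system = new AS400("MYISERIES", "MYUSER", "MYPWD");

            // Made-up parameter layout: one 10-byte input, one 100-byte output
            AS400Text text10 = new AS400Text(10, system);
            AS400Text text100 = new AS400Text(100, system);
            ProgramParameter[] parms = new ProgramParameter[] {
                new ProgramParameter(text10.toBytes("CUST001")),
                new ProgramParameter(100)
            };

            ProgramCall pgm = new ProgramCall(system, "/QSYS.LIB/MYLIB.LIB/MYRPGPGM.PGM", parms);
            if (pgm.run()) {
                String result = (String) text100.toObject(parms[1].getOutputData());
                System.out.println("Result: " + result.trim());
            } else {
                for (AS400Message msg : pgm.getMessageList()) {
                    System.err.println(msg.getText());
                }
            }
            system.disconnectAllServices();
        }
    }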
Questions:
If any one of you has been involved in such an exercise, please share the pitfalls with this approach
Is there a significant performance drop with respect to response times for the end user?
Would it be better to somehow expose the JT400 code as web services and run the web app on a different machine altogether, consuming those web services?
Be very careful with calling RPG from Java, because RPG is not thread-safe without some changes.
When I was at COMMON, the best product I saw on the market was Profound UI. There are several others from a variety of vendors. Most of these products do not use Java. Java on the i tends to be slow. (There are things that can be done to make it faster, but native is always faster.) You'll pay the price for these products, but just imagine how much time it would take you to do this yourself. For the above, I was quoted in the $20+ thousand range. But like all IBM i products, prices vary greatly based on the system.
To directly answer your questions:
I have been doing research on modernization as time allows; the products weren't quite there yet (at the time I looked, before COMMON 2011) for what we wanted to use them for. Now it looks like it might work.
This really depends on your system. A newer system will have fewer problems than an older one. The web will always be slower than the green screen. Hands down, data-entry people won't like it. Executives and younger people will love it.
Your slow point is running the business logic. It wouldn't matter which server the HTML is coming from.
I've found that, for all practical purposes, an AS/400 behaves like an AIX box as seen from Java code, and you must use jt400 (JTOpen) to communicate with AS/400-specific features like data queues, files, etc. This works pretty well, but the slowness of starting the JVM pushes Java-based solutions toward being long-running.
Note also that QTEMP is generally unavailable as a mechanism to keep state due to the nature of prestarted jobs.
Under V6R1, Java 6 is available and runs pretty well in the "new technology" edition. You can then run almost all Java-based solutions, including web servers like Jetty, in it. Note that Java defaults to code page 819 when accessing IFS files directly. Windows clients using the AS/400 as a network drive use a compatible code page.
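Assuming jt400's IFS classes, reading an IFS file with an explicit character set (code page 819 is ISO-8859-1) rather than trusting defaults might look roughly like this; the host, credentials and path are made up:

    import java.io.BufferedReader;
    import java.io.InputStreamReader;

    import com.ibm.as400.access.AS400;
    import com.ibm.as400.access.IFSFileInputStream;

    public class IfsReadSketch {
        public static void main(String[] args) throws Exception {
            AS400 system = new AS400("MYISERIES", "MYUSER", "MYPWD");  // hypothetical
            // Decode explicitly instead of relying on whatever the default happens to be
            BufferedReader in = new BufferedReader(new InputStreamReader(
                    new IFSFileInputStream(system, "/home/myapp/config.txt"), "ISO-8859-1"));
            try {
                String line;
                while ((line = in.readLine()) != null) {
                    System.out.println(line);
                }
            } finally {
                in.close();
                system.disconnectAllServices();
            }
        }
    }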

Is Java completely Platform Independent?

Is Java completely platform independent?
If not, what care needs to be taken to ensure that your Java code can run on multiple platforms? Basically, it should work on targeted platforms like Windows (various versions), Linux (all flavors), Mac and Solaris.
While in practice most compiled bytecode is platform independent, my 12 years of developing on the Java platform have taught me that there are still idiosyncrasies from platform to platform.
For example, while developing a Java 1.4 Swing application for PC and Mac OS X, the behavior of dialogs differed if the parent frame was null.
Another example is working with the file system and files in general. The Java API has methods to help shield the developer from differences in path separators (/ vs \). When writing to a file, it is important to use the writer APIs (FileWriter, BufferedWriter) as intended so that line terminators and such are generated properly for the platform the file is being written on.
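A small illustration of the line-ending point (the file name is made up): let BufferedWriter.newLine() supply the terminator instead of hard-coding one.

    import java.io.BufferedWriter;
    import java.io.FileWriter;
    import java.io.IOException;

    public class LineEndings {
        public static void main(String[] args) throws IOException {
            BufferedWriter out = new BufferedWriter(new FileWriter("report.txt"));
            try {
                out.write("first line");
                out.newLine();              // platform-appropriate terminator
                out.write("second line");
                out.newLine();
                // out.write("third\r\n");  // hard-coded CRLF: wrong on Unix-like systems
            } finally {
                out.close();
            }
        }
    }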
So while the motto is "write once, run anywhere," my experience for production environments has been "write once, test everywhere."
As a result, having strong unit and integration tests can help with this, as you can execute those tests on the various platforms you want to distribute your software.
Despite some minor issues here and there, it is cool to see your code running on Linux, Unix, Windows and MacOSX (BSD Unix) using the same JARs.
As djacobson pointed out, the answer is a qualified "yes". For the most part, Java developers don't have to worry about platform dependencies. However, you may run into problems when you're dealing with APIs that handle traditional OS and platform functions.
When dealing with File I/O, for example, it's easy to make your code platform dependent by ignoring the differences between file/path separators across platforms (i.e. using '\' rather than File.separator).
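A quick sketch of the separator point (the paths are made up):

    import java.io.File;

    public class Separators {
        public static void main(String[] args) {
            // Nonportable: assumes Windows-style separators
            String bad = "data\\exports\\report.csv";

            // Portable: let the File API insert the right separator
            File good = new File(new File("data", "exports"), "report.csv");

            System.out.println(bad);
            System.out.println(good.getPath());
        }
    }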
For the most part, yes. Because Java compiles to bytecode that's executed by its virtual machine, it can generally be expected to behave the same way regardless of the system sitting under the virtual machine.
However. Not even virtual machines are immune to bugs. A quick Google search turns up the following, for example:
http://www.ibm.com/developerworks/java/library/j-diag0521.html
Differences in behavior can vary from JVM to JVM. Hopefully you won't end up with code that depends on any of these cases... but careful research is worthwhile to know what the limitations of your infrastructure are.
Your problem will not be executing your code, but more likely the assumptions you have to make about file paths, available external commands (if you need them), necessary file permissions and other external factors that don't really fall under the "Java" problem domain. Unless you're planning on using native code (via JNI) extensively, Java will not be your problem; your environment will.
Which brings us back to the old adage: "write once, test everywhere".
Thread priorities are one thing to consider. Other OSes, Solaris for example, have more thread priority levels than Windows. So if you are working heavily with multi-threading, the OS is something that may affect the program's behavior.
The main thing to be concerned with is UI code, to make sure that it is represented properly on all the platforms you will be running on.
Another source of possible issues is deploying to different app servers. There might be incompatibility issues between them.
Other than that, Java is platform independent. This is also one of its weaknesses, since you are coding to a common denominator and many features of each individual OS are not available.
There are very few, and they should be pretty obvious. For example, System.getProperty("os.name") is clearly OS dependent or it wouldn't work. The most common one is Runtime.exec(), as it calls another application on your system; again, you should know whether the application you are calling works the same on every system or not (unlikely).
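A tiny sketch of that sort of OS-dependent branch; the commands are only illustrative:

    public class OpenFolder {
        public static void main(String[] args) throws Exception {
            String os = System.getProperty("os.name").toLowerCase();
            // The command that exists differs per OS, so the code must branch
            String[] cmd = os.contains("win")
                    ? new String[] { "explorer.exe", "." }
                    : new String[] { "xdg-open", "." };   // most Linux desktops; "open" on OS X
            Runtime.getRuntime().exec(cmd);
        }
    }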
Along with the above concerns, the main problem I had was actually building on different platforms, which may not be what you're asking, but may be something to watch out for.
OS X is especially guilty of this when using the Apple distribution of Java (why anyone would want to put out their own packaging of Java I don't know, but that is a separate argument, and on OS X I don't think you have a choice but to use their Java). The libraries that you may or may not be relying on are in completely different directories, e.g. Libraries instead of lib, if my memory serves me correctly. And IBM's Java, I think, packages classes in different JARs in some cases. Ridiculous!
Hope that helps.

Which one of the major operating systems is best suited for a quick boot and startup of a Java application?

I created a Java application which is the only application active on a workstation. (Similar to a kiosk system)
The problem is that the application has to be up and running as fast as possible after starting the computer.
I'm wondering which one of the major operating systems can be configured to provide the shortest startup time?
I'm using 3rd party audio and graphics libraries so my choices are limited to Windows XP/Vista, Linux and Solaris.
Currently, on my dual-boot machine, Fedora takes a little longer than Vista, but on the other hand I don't have much experience with tuning the boot time of Linux. So if someone knows that Linux has a much better chance of a quick startup, then I would put my time in there.
I'd also appreciate general hints on tuning boot times and Java startup times.
I would look at BootChart to optimise your Fedora boot time. If you're running one app, then you can certainly remove a lot of the services that Fedora would normally come configured with.
I would also point out that, for the amount of time you're going to spend optimising this, you may be better off investing in the appropriate hardware (e.g. SSDs and similar, if boot time is governed by your disk). Optimising can be a major time sink.
If you're running your application inside of a kiosk like machine where you don't need any other applications running, and you know which drivers/modules you'll need to load ahead of time, I think your best boot time will come from Linux.
It will just take some time to fine tune your boot process to load all the proper software in the fastest time possible.
For such a task a fine-tuned Linux is best suited. You can take a look at a more customizable distro, where you can control which drivers and applications are included.
Debian is highly modularized and customizable, so you can get really good boot speed.
Another option can be Gentoo - there you can strictly choose what to compile and include.
Linux with SSD drives.
I'd also suggest a Linux distro, e.g. Gentoo with initng (initng.org). Initng parallelizes the startup process. There are other startup systems with which your system will be up in a few seconds.
And of course, fast HDDs and enough RAM for Java ;)
My guess would be Windows XP embedded. I've found that Java apps start up fairly quickly under Windows, particularly if you use a client VM.
It is extremely likely that your 3rd party vendors will support XP embedded (particularly if you are a big customer to them). It is very similar to normal XP, just cut down.
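If JVM startup time itself matters, launch flags along these lines are worth experimenting with (the jar name is a placeholder, and behaviour varies by JVM version, so measure on your own box):

    # Select the HotSpot client VM and allow class data sharing (Java 5/6 era options)
    java -client -Xshare:auto -jar kiosk-app.jar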
If you're making a kiosk type app, why do you care about boot time?
Fedora can be easily optimized if you want to only run a single java application. There are many services which are pre-configured during boot time and they can be omitted. You could also go for SSD drives to improve the boot-time of the system, and at the same time if you spend some time on optimizing the boot chart, it would solve your problem.

Java 1.6 JDK tool, VisualVM

Has anyone used the new Java 1.6 JDK tool, VisualVM, to profile a production application and how does the application perform while being profiled?
The documentation says that it is designed for both production and development use, but based on previous profiling experience with other profiling tools, I am hesitant.
While I haven't personally used VisualVM, I saw a blog post just today that might have some useful information for you. The author talks about profiling a production app using it.
I tried it on a dev box and found that when I turned off profiling it would shut Tomcat down unexpectedly. I'd be very cautious about rolling this out to production; can you simulate load in a staging environment instead? It's not as good as the real thing, but it probably won't get you fired if it goes wrong...
I've used VisualVM before to profile something running locally. A big win was that I just start it up, and it can connect to the running JVM. It's easier to use than other profiling tools I've used before and didn't seem to have as much overhead.
I think it does sampling. The overhead on a CPU intensive application didn't seem significant. I didn't measure anything (I was interested in how my app performed, not how the tool performed), but it definitely didn't have the factor of 10 slowdown I'm used to seeing from profiling.
For just monitoring your application, running VisualVM remotely should not slow it down much. As long as the system is not on the edge of collapsing, I haven't seen any problems. It's basically just reading out information from the coarse-grained built-in instrumentation of the JVM. If you start profiling, however, you'll have the same issues as with other profilers, basically because they all work in almost the same way, often using the support in the JVM.
Many people have problems running VisualVM remotely due to firewall issues, but you can even run VisualVM remotely over SSH, with some system properties set.
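The system properties in question are typically the standard JMX remote ones, set on the target JVM; the port and hostname below are placeholders, and you should only disable SSL/authentication on a trusted network or through an SSH tunnel:

    -Dcom.sun.management.jmxremote.port=9010
    -Dcom.sun.management.jmxremote.authenticate=false
    -Dcom.sun.management.jmxremote.ssl=false
    -Djava.rmi.server.hostname=your.server.example.com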
It is possible to remote connect to your server from a different computer using VisualVM. You just need to right click on the "Remote" node and say "Add Remote Host."
This would at least eliminate the VisualVM overhead (if there is any) from impacting performance while it is running.
This may not eliminate all performance concerns, especially in Production environments, but it will help a little.
I've used the NetBeans profiler, which uses the same underpinnings as VisualVM.
I was working with an older version of WebLogic, which meant using the 1.5 JVM, so I couldn't do a dynamic attach. The application I was profiling had several thousand classes, and my workstation was pretty much unusable while the profiler instrumented them all. Once instrumentation was complete, the system was sluggish but not completely unusable. The amount of slowdown really depends on what you need to capture. The basic CPU metrics are pretty lightweight. Profiling memory allocation slows things down a lot.
I would not use it on a production system. Aside from the potential for slowdown, I eventually ran out of PermGen space because the profiler reinstruments and reloads classes when you change settings. (This may be fixed in the 1.6 agent, I don't know)
I've been using VisualVM a lot since before it was included in the JDK. It has a negligible impact on the performance of the system. I've never noticed it cause a performance problem, but then again, our Java server had enough headroom at the time to support a little extra load. If your server is running at a level where it is completely maxed out and can't handle VisualVM running, then I would say it's more likely that you need to buy another server. Any production server should have some memory headroom; otherwise what you have is a disaster just waiting to happen.
I have used VVM (VavaVoom?) quite extensively; it works like a charm in light mode, i.e. no profiling, just getting the basic data from the VM. But once you start profiling and there are many classes, there is considerable slowdown. I wouldn't profile in a production environment even if you had a 128-core board with 2 TB of memory, purely because the reloading and redefining of classes is tricky, and the server classloaders are another thing; they vary from one server implementation to another, and interfering with them in production is not a very good idea.
