Should I use 32-bit or 64-bit JDK?

Years ago, I tried 64-bit JDK but it was really buggy.
How stable would you say it is now? Would you recommend it? Should I install 64-bit JDK + eclipse or stick to 32-bit? Also, are there any advantages of 64-bit over 32-bit other than bypassing the 4 GB memory limit?

Only bother with the 64-bit JDK if you want to build an application that will use a lot of memory (namely, a heap larger than 2 GB).
Allow me to quote Red Hat:
The real advantage of the 64-bit JVM is that heap sizes much larger than 2 GB can be used. Large page memory with the 64-bit JVM gives further optimizations. The following graph shows the results of running from 4 GB heap sizes, in two-gigabyte increments, up to a 20 GB heap.
That came with a pretty graph, and the takeaway was: no big difference across heap sizes.
See more (more graphs, yay!) in: Java Virtual Machine Tuning

I think the answer is pretty simple.
Answer the question: "Do I need more than 4 GB of RAM?"
A 64-bit JVM is as stable as a 32-bit JVM; there is no difference there. In fact, a Java application running in a 64-bit JVM will consume more RAM than in a 32-bit JVM, because all internal data structures need more memory.
My Eclipse runs in a 64-bit JVM.

Are you going to deploy to a 32 or a 64 bit environment? If you're affinitized to an environment in production then your development environment should use the same environment type.
If you're platform agnostic, go with x64 and don't even think about it. The platform is mature and stable. It gives you tremendous room to scale up as you can just add memory and make your heaps bigger.
Nobody wants to tell a CEO, "Sorry, we chose x86 and can't scale up like we hoped. It's a month-long project to retest and replatform everything for x64."

The only differences between 32-bit and 64-bit builds of any program are the size of machine words, the amount of addressable memory, and the operating-system ABI in use. With Java, the language specification means that the differences in machine word size and OS ABI should not matter at all unless you're using native code as well. (Native code must be built for the same word size as the JVM that will load it; you can't mix 32-bit and 64-bit builds in the same process without very exotic coding indeed, and you shouldn't attempt that in a Java process anyway.)
The 64-bitter uses 64-bit pointers. If you have 4GB+ RAM, and are running Java programs that keep 4GB+ of data structures in memory, the 64-bitter will accommodate that. The big fat pointers can point to any byte in a 4GB+ memory space.
But if your programs use less memory and you run the 64-bit JVM, pointers will still occupy 64 bits (8 bytes) each. This makes data structures bigger, which eats up memory unnecessarily.
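A toy sketch of that pointer-size arithmetic. The 4-vs-8-byte figures come from the discussion here; the compressed-references case reflects the "pointer compression tricks" mentioned elsewhere in this thread, where a 64-bit HotSpot JVM with a smallish heap shrinks references back to 4 bytes:

```java
public class ReferenceSize {
    // Rough width of one object reference, in bytes. On 64-bit HotSpot,
    // compressed oops (enabled by default for heaps under ~32 GB) bring
    // references back down to 4 bytes; without them, references are 8 bytes.
    static int referenceBytes(boolean is64Bit, boolean compressedOops) {
        if (!is64Bit) {
            return 4; // 32-bit JVM: references are always 4 bytes
        }
        return compressedOops ? 4 : 8;
    }

    public static void main(String[] args) {
        System.out.println("64-bit, plain oops:      " + referenceBytes(true, false) + " bytes");
        System.out.println("64-bit, compressed oops: " + referenceBytes(true, true) + " bytes");
        System.out.println("32-bit:                  " + referenceBytes(false, false) + " bytes");
    }
}
```

The exact threshold and defaults vary by JVM version, so treat the numbers as illustrative rather than guaranteed.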

I just compiled a MQTT client in both the 32-bit JDK (jdk-8u281-windows-i586) and the 64-bit JDK (jdk-8u281-windows-x64). The class files produced had matching MD5 checksums.
FYI, it's perfectly safe to have multiple JDKs on your system. But if the version you use is important, you should be comfortable with setting your system path and JAVA_HOME to ensure the correct version is used.
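If you have several JDKs installed and aren't sure which bitness a given java binary is, you can ask from code. A sketch: sun.arch.data.model is a HotSpot-specific property, so the os.arch fallback is only a best-effort guess on other JVMs:

```java
public class JvmBitness {
    // Returns "32" or "64" for the currently running JVM.
    static String bitness() {
        // HotSpot-specific property; reports the JVM's data model directly.
        String model = System.getProperty("sun.arch.data.model");
        if (model != null) {
            return model;
        }
        // Fallback: infer from the architecture name (heuristic).
        return System.getProperty("os.arch", "").contains("64") ? "64" : "32";
    }

    public static void main(String[] args) {
        System.out.println("Running a " + bitness() + "-bit JVM");
    }
}
```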

Related

Java 64 bit uses more memory than a 32 bit version

I am testing my Java application and I noticed that the 64-bit Java version uses much more memory than running the application on a 32-bit Java version.
The servers I tested were a Windows 7 64-bit and a Solaris 64-bit machine, and the same behavior occurred in both cases. By the way, the application runs with default VM parameters, and the Java version used was 8u65.
As my servers are 64-bit, the right choice would seem to be 64-bit Java, but is there any reason for this difference? In what cases is the 32-bit version better than the 64-bit one?
Memory allocated in both:
32-bit: 74 MB
64-bit: 249 MB
It is correct that a 64-bit memory model takes up more memory.
Besides that, I just wanted to mention a Solaris gotcha, so this is not really a full answer to your question; the answer below can more fully explain the difference from 74 MB to 249 MB that you are seeing.
It is correct that there's no longer a 32-bit version of Java for Solaris, as is also the case for Mac OS X. Beware that with Java 7 on Solaris you would always get the 32-bit Java (even if you had installed the 64-bit Java) unless you explicitly requested 64-bit with the -d64 flag. So be sure not to compare apples and oranges here. A lot of people on Solaris thought they had been running the 64-bit Java because they had installed it, unaware that it had to be explicitly requested.
For Java 8 on Solaris there's no point in specifying -d64, as there's only a 64-bit version anyway.
Therefore - simply as a consequence of this - the default values for memory settings have changed. What I'm saying is that solely as a consequence of this (and not the discussion about memory pointers) it will seem as if your Java 8 on Solaris is taking up more memory from the OS. This is in fact a mirage.
Here's a recap from a 16 GB system (the values will change depending on your amount of installed RAM):
Java 7 on Solaris
With Java 7 on Solaris without further command line options you would get a 32-bit memory model (-d32 is implied even with 64-bit version installed) and default values as follows:
memory model: 32 bit
-Xms default : 64 MB
-Xmx default : 1 GB
If you explicitly used -d64 you would get:
memory model: 64 bit
-Xms default : 256 MB
-Xmx default : 4 GB
Java 8 on Solaris
With Java 8 on Solaris without further command line options you would get a 64-bit memory model (-d64 is implied, -d32 is now illegal) and default values as follows:
memory model: 64 bit
-Xms default : 256 MB
-Xmx default : 4 GB
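An easy way to see which default -Xmx your installation actually picked is to ask the runtime itself. A sketch; note that Runtime.maxMemory() is only an approximation of the configured heap ceiling, not an exact echo of the flag:

```java
public class DefaultHeap {
    // Approximate effective -Xmx of the running JVM, in MB.
    static long maxHeapMb() {
        return Runtime.getRuntime().maxMemory() / (1024L * 1024L);
    }

    public static void main(String[] args) {
        // Run this without any -Xmx flag to see your JVM's default.
        System.out.println("Effective max heap: " + maxHeapMb() + " MB");
    }
}
```

Running it under Java 7 with and without -d64 on Solaris would make the 1 GB vs 4 GB default difference described above directly visible.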
As for your comment that you've read that "SPARC is on the order of 10-20% degradation when you move to a 64-bit VM": I doubt it. I can see that you've read it here, but that document applies to Java 1.4 and possibly Java 5. A lot has happened since then.
This is normal behavior for Java (as well as Microsoft .NET), mostly because of their pointer model, and also their garbage collection model.
An object variable is actually a pointer to an object on the heap. In 64-bit versions, this pointer requires twice as much space. So, the pointers stored in containers will require more memory, and so will the pointers that are held by the garbage collector to allow collection. Since objects are mostly made up of pointers to other objects, the difference between 32-bit and 64-bit adds up very fast.
Added to that, the garbage collector has to keep track of all of these objects efficiently, and in 64-bit versions, the collector tends to use a larger minimum allocation size so it doesn't have to keep track of as many slices of memory. This makes sense because objects are bigger anyway. I believe the minimum sizes are typically 16 bytes in 32-bit mode and 32 bytes in 64-bit mode, but those are entirely up to the specific virtual machine you are using, so they will vary.
For example, if you have an object that only requires 12 bytes of heap memory, and you are running on a virtual machine with a 32-byte minimum allocation size, it will use 32 bytes, with 20 of those bytes wasted. If you allocate the same object on a machine with a 16-byte minimum size, it will use 16 bytes, with 4 wasted. The alternative to this is to waste a lot more memory in keeping track of those blocks, so this is actually the best approach, and will keep your application's performance and resource utilization balanced.
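The rounding in that 12-byte example is plain integer arithmetic. A small sketch; the 16- and 32-byte units are the illustrative figures from the paragraph above, not guaranteed values for any particular VM:

```java
public class AllocationPadding {
    // Round a raw object size up to the allocator's minimum/alignment unit.
    static long paddedSize(long rawBytes, long unitBytes) {
        return ((rawBytes + unitBytes - 1) / unitBytes) * unitBytes;
    }

    public static void main(String[] args) {
        // The 12-byte object from the example above:
        System.out.println(paddedSize(12, 32)); // padded to 32 (20 bytes wasted)
        System.out.println(paddedSize(12, 16)); // padded to 16 (4 bytes wasted)
    }
}
```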
Another thing to keep in mind is that the Java runtime allocates blocks of memory from the operating system for its heap, then the program can allocate memory out of those blocks. The runtime tries to stay ahead of your program's memory needs, so it will allocate more than is needed and let your program grow into it. With a higher minimum allocation size, the 64-bit runtime will allocate bigger blocks for its heap, so you will have more unused memory than with a 32-bit runtime.
If memory consumption is a serious constraint for your particular application, you might consider native-compiled C++ (using actual C++ standards, not legacy C with pointers to objects!). Native C++ typically requires 1/5 of the memory of Java to accomplish the same thing, which is the reason that native code tends to be more popular on mobile devices (C++ and Objective C). Of course, C++ has its own issues, so unless you have a desperate need to reduce memory consumption, it is probably best to accept this as normal behavior and keep using 64-bit Java.

Java JVM Heap Size

I have an application that on launch requests a specific amount of RAM using the following command.
java -Xms512m -Xmx985m -jar someJarfile.jar
This command fails to run on my computer with 8.0GB of RAM because it can not create an object heap of the specified size. If I lower the max range to something below 700MB it works fine.
What is even stranger is that even a simple java -Xmx768m -version fails when the -Xmx flag value exceeds 700m. I am trying to run it with Java 1.7u67 32-bit (that is what the jar was built with), and also with newer versions of Java 1.7 and even Java 1.8. I would understand if the max heap were higher and I was using 32-bit, but it is not above the ~1.4 GB cap of 32-bit Java.
Is there a configuration parameter that I am missing somewhere, or some sort of software that may be interfering? It does not make sense to me why I cannot allocate 700 MB of RAM on a machine with 8.0 GB of RAM.
I should also note that there are no other processes running that are taking up all of my RAM. It is a fresh install of Windows 7.
While 700 MB is pretty low, it is not surprising.
The 32-bit emulation layer in 64-bit Windows works the same way as 32-bit Windows XP, with all its limitations. It means you lose 2 GB of the potential 4 GB of address space to the OS. Programs already running use up virtual memory space, and if your program uses shared libraries or off-heap storage such as direct memory and memory-mapped files, that also costs virtual memory that could have gone to the heap. Effectively you are limited to about 1.4 GB of virtual memory for your heap, no matter how much memory you actually have.
The simple way around this is to use the 64-bit JVM, which runs in your 64-bit OS and is also limited, but instead to 192 TB of virtual memory on Windows.
You should try using a 64-bit Java runtime. It is probably the case that there is no 985 MB contiguous chunk of memory free within the 32-bit address space of your process (the 32-bit address space is 4 GB). When you use a 64-bit Java runtime, Java can allocate the memory within the 64-bit address space, where free memory of that size is much more likely to be available.
It doesn't matter that your JAR file was built using a 32 bit version.
The answer to your question may lie in the fact that Windows tries and fails to find a contiguous block of memory that is large enough: see http://javarevisited.blogspot.nl/2013/04/what-is-maximum-heap-size-for-32-bit-64-JVM-Java-memory.html. (Though this suggests that other processes are hogging memory, which seems to be contradicted by your last remark.)

Why is the 64bit JVM faster than the 32bit one?

Recently I've been doing some benchmarking of the write performance of my company's database product, and I've found that simply switching to a 64bit JVM gives a consistent 20-30% performance increase.
I'm not allowed to go into much detail about our product, but basically it's a column-oriented DB, optimised for storing logs. The benchmark involves feeding it a few gigabytes of raw logs and timing how long it takes to analyse them and store them as structured data in the DB. The processing is very heavy on both CPU and I/O, although it's hard to say in what ratio.
A few notes about the setup:
Processor: Xeon E5640 2.66GHz (4 core) x 2
RAM: 24GB
Disk: 7200rpm, no RAID
OS: RHEL 6 64bit
Filesystem: Ext4
JVMs: 1.6.0_21 (32bit), 1.6.0_23 (64bit)
Max heap size (-Xmx): 512 MB (for both 32bit and 64bit JVMs)
Constants for both JVMs:
Same OS (64bit RHEL)
Same hardware (64bit CPU)
Max heap size fixed to 512 MB (so the speed increase is not due to the 64bit JVM using a larger heap)
For simplicity I've turned off all multithreading options in our product, so pretty much all processing is happening in a single-threaded manner. (When I turned on multi-threading, of course the system got faster, but the ratio between 32bit and 64bit performance stayed about the same.)
So, my question is... Why would I see a 20-30% speed improvement when using a 64bit JVM? Has anybody seen similar results before?
My intuition up until now has been as follows:
64-bit pointers are bigger, so the L1 and L2 caches overflow more easily, and performance on the 64-bit JVM is worse.
The JVM uses some fancy pointer compression tricks to alleviate the above problem as much as possible. Details on the Sun site here.
The JVM is allowed to use more registers when running in 64bit mode, which speeds things up slightly.
Given the above three points, I would expect 64bit performance to be slightly slower, or approximately equal to, the 32bit JVM.
Any ideas? Thanks in advance.
Edit: Clarified some points about the benchmark environment.
From: http://www.oracle.com/technetwork/java/hotspotfaq-138619.html#64bit_performance
"Generally, the benefits of being able to address larger amounts of memory come with a small performance loss in 64-bit VMs versus running the same application on a 32-bit VM. This is due to the fact that every native pointer in the system takes up 8 bytes instead of 4. The loading of this extra data has an impact on memory usage which translates to slightly slower execution depending on how many pointers get loaded during the execution of your Java program. The good news is that with AMD64 and EM64T platforms running in 64-bit mode, the Java VM gets some additional registers which it can use to generate more efficient native instruction sequences. These extra registers increase performance to the point where there is often no performance loss at all when comparing 32 to 64-bit execution speed.
The performance difference comparing an application running on a 64-bit platform versus a 32-bit platform on SPARC is on the order of 10-20% degradation when you move to a 64-bit VM. On AMD64 and EM64T platforms this difference ranges from 0-15% depending on the amount of pointer accessing your application performs."
Without knowing your hardware I'm just taking some wild stabs
Your specific CPU may be using microcode to 'emulate' some x86 instructions -- most notably the x87 ISA
x64 uses SSE math instead of x87 math; I've noticed a 10-20% speedup in some math-heavy C++ apps in this case. Math differences could be the real killer if you're using strictfp.
Memory. 64 bits gives you much more address space. Maybe the GC is a little less aggressive in 64-bit mode because you have extra RAM.
Is your OS in 64-bit mode, running a 32-bit JVM via some wrapper utility?
The 64-bit instruction set has 8 more registers; this should make the code faster overall.
But since processors nowadays mostly wait for memory or disk, I suppose that either the memory subsystem or the disk I/O might be more efficient in 64-bit mode.
My best guess, based on a quick Google for 32- vs 64-bit performance charts, is that 64-bit I/O is more efficient. I suppose you do a lot of I/O... If memcpy is involved when moving the data, it's probably more efficient to copy longs than ints.
Realize that the 64-bit JVM is not magic pixie dust that makes Java apps go faster. The 64-bit JVM allows heaps >> 4 GB and, as such, only makes sense for applications which can take advantage of huge memory on systems which have it.
Generally there is either a slight improvement (due to certain hardware optimizations on certain platforms) or a minor degradation (due to increased pointer size). Generally speaking there will be a need for fewer GCs, but when they do occur they will likely be longer.
In-memory databases or search engines that can use the increased memory for caching objects, and thus avoid IPC or disk accesses, will see the biggest application-level improvements. In addition, a 64-bit JVM will also allow you to run many, many more threads than a 32-bit one, because there's more address space for things like thread stacks. The maximum number of threads for a 32-bit JVM is generally ~1,000, but ~100,000 threads are possible with a 64-bit JVM.
Some drawbacks though:
Additional issues with the 64-bit JVM are that certain client-oriented features like the Java Plug-in and Java Web Start are not supported. Also, any native code would need to be compatible (e.g. JNI for things like Type II JDBC drivers). This is a bonus for pure-Java developers, as pure apps should just run out of the box.
More on this Thread at Java.net

How can the JVM use more than 4 GB of memory

I have a request to install a Linux server (preferably Ubuntu 64-bit server) and Java (64-bit) on the following machine:
Intel Core2Quad Q8200 - 2.33 GHz
8gb DDR2 ram
2x 320GB SATA HDD in soft RAID1 mirror (mirror)
The problem is how to configure the system and Java, because I need the JVM to use more than 4 GB of memory. The workload cannot be distributed across many virtual machines: there is a dataset more than 4 GB large, and it has to be in memory because the HDD is slow and performance is critical.
This is a configuration and performance question, and I am interested in comments from anyone with experience.
Thank you very much for helping me with this...
A 64 bit JVM should have no problem at all with giant heaps, certainly much larger than your available RAM. Just increase the heap size when you start the JVM, for example:
java -Xmx6g
You used to have to specify the 64bit flag (with -d64), but I don't think this is necessary any more.
32bit JVMs can manage something like 3GB of heap.
skaffman's answer, which provides the required flag for allocating 6 GB of memory, is correct. If the 'g' does not do the trick, you might want to try 6000m (check the link below for details on flags per platform).
For other options you can find useful information on all available options for the Java HotSpot VM here.
http://java.sun.com/javase/technologies/hotspot/vmoptions.jsp
(Behavioral and performance options are available. Platform specific links also available on this page)
The JVM (especially 64-bit) does not hesitate to take all the memory, and 4 GB is not a problem. Just install 64-bit Ubuntu, and the default-jre package will also be 64-bit.
Also take special care about how your data is stored in memory. Again, the 64-bit JDK is very hungry for memory due to the higher overhead of pointers etc., so if you spread these 4 GB across small chunks in some data structure, 8 GB will not be enough.
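Back-of-the-envelope arithmetic for that "small chunks" warning, using assumed (not guaranteed) figures for a 64-bit HotSpot JVM without compressed oops: roughly 16 bytes of object header, an 8-byte payload for a boxed Long, and an 8-byte reference held by the container:

```java
public class BoxingCost {
    // Assumed figures for a 64-bit JVM without compressed oops:
    // ~16-byte object header + 8-byte long payload = 24 bytes per boxed Long,
    // plus the 8-byte reference to it held by the container.
    static long boxedArrayBytes(long count) {
        return count * (16 + 8 + 8);
    }

    // A primitive long in a long[] is just its 8-byte payload.
    static long primitiveArrayBytes(long count) {
        return count * 8;
    }

    public static void main(String[] args) {
        long n = 1_000_000;
        System.out.println("Long[] (boxed, rough estimate): " + boxedArrayBytes(n) + " bytes");
        System.out.println("long[] (primitive):             " + primitiveArrayBytes(n) + " bytes");
    }
}
```

By this estimate, boxing a million longs costs roughly four times the memory of a primitive array, which is exactly how 4 GB of "useful" data can blow past an 8 GB heap.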
If you install a 64-bit Ubuntu, I believe that
sudo apt-get install sun-java6-jdk
gives you a 64-bit Java.
EDIT: The 64-bit Java can give you as much memory as you need with the appropriate switches. The limit is with the 32-bit JVMs, which cannot go over 2-4 GB depending on the operating system.

Does Java 64 bit perform better than the 32-bit version?

I noticed Sun is providing a 64-bit version of Java. Does it perform better than the 32-bit version?
Almost always, the 64-bit version will be slower.
To quote Sun from the HotSpot FAQ:
The performance difference comparing an application running on a 64-bit platform versus a 32-bit platform on SPARC is on the order of 10-20% degradation when you move to a 64-bit VM. On AMD64 and EM64T platforms this difference ranges from 0-15% depending on the amount of pointer accessing your application performs.
There are more details at the link.
Define your workload and what "perform" means to you.
This is sort of a running annoyance to me, as a performance geek of long standing. Whether or not a particular change "performs better" depends, first and foremost, on the workload, i.e. what you're asking the program to do.
64-bit Java will often perform better on things with heavy computation loads. Java programs, classically, have heavy I/O loads and heavy network loads; there, 64-bit vs 32-bit may not matter, but the operating system usually does.
64-bit performs better if you need much more than 1.2 GB. On some platforms you can get up to 3 GB, but if you want 4-384 GB, for example, 64-bit is your only option.
I believe Azul supports a 384 GB JVM, does anyone know if you can go higher?
I know that this question is quite old and the voted answers were probably correct at the time when they were written. But living in 2018 now, things have changed.
I just had an issue with a Java client application running on Windows 10 64-bit on a Java 8 32-bit JVM. It was reading 174 MB of data from an HttpsURLConnection's InputStream in 26 s, which is awfully slow. The server and network were proven not to be the cause of this.
Thinking "hey, there cannot be a huge difference between a 32-bit and a 64-bit JRE", it took some time until I tried having the very same code executed by a 64-bit JVM. Fortunately, in the end I did: it read the very same 174 MB in 5 s!
I don't know if I could make it even faster, but the key take-away is this:
jre1.8.0_172 32-bit: 6.692 MB/s
jre1.8.0_172 64-bit: 34.8 MB/s
for the very same jar file executed on Windows 10 64-bit.
I have no idea what could be the reason for this, but I can answer this question by "Yes, 64Bit Java is better than 32Bit Java". See also the numbers in the answer of my question regarding this issue.
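A measurement like the one quoted above can be sketched as follows. This demo drains an in-memory stream rather than an HttpsURLConnection (the original setup), so the absolute numbers are not comparable, but the same helper would work on any InputStream:

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;

public class ThroughputProbe {
    // Drain a stream to exhaustion and report throughput in MB/s.
    static double megabytesPerSecond(InputStream in) throws IOException {
        byte[] buf = new byte[64 * 1024];
        long totalBytes = 0;
        long start = System.nanoTime();
        int n;
        while ((n = in.read(buf)) != -1) {
            totalBytes += n;
        }
        double seconds = (System.nanoTime() - start) / 1e9;
        return (totalBytes / (1024.0 * 1024.0)) / seconds;
    }

    // Demo helper: probe an in-memory stream of the given size in MB.
    static double probeInMemory(int megabytes) {
        try {
            return megabytesPerSecond(new ByteArrayInputStream(new byte[megabytes * 1024 * 1024]));
        } catch (IOException e) {
            throw new AssertionError(e); // ByteArrayInputStream never throws
        }
    }

    public static void main(String[] args) {
        System.out.printf("~%.1f MB/s from memory%n", probeInMemory(16));
    }
}
```

Running the same jar under the 32-bit and 64-bit JREs against the same source is how you would reproduce the comparison in the answer above.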
On most CPU architectures 32-bit is faster than 64-bit, all else being equal; 64-bit pointers require twice as much bandwidth to transfer as 32-bit ones. However, the x64 instruction set architecture adds a bit of sanity over x86, so it ends up being faster. The amount of handling of long types is usually small.
Of course it also depends upon the implementation of Java. As well as the compiler, you might find differences in the implementation; for instance, NIO assumes 64-bit pointers. Also note that Sun previously only shipped the faster server HotSpot implementation for x64. This meant that if you specified -d64, you would also switch from client to server HotSpot, IIRC.
Some improvements: operations on doubles at 64-bit compute as fast as operations on floats at 32-bit, and the same holds for long compared to int.
So if you are running code with tons of longs, you might see a real improvement.
My experience differs from the other answers.
Java 64-bit may be faster than 32-bit; at least in my tests it always was! The pointer argument is not valid when less than 4 GB is used, because then the 64-bit VM will also use short pointers internally. You do, however, get the faster instruction set of the 64-bit CPUs!
I tested this with Windows 7 and JDK 1.8.0_144, but maybe the real reason is different internal JVM settings: when you use the 64-bit JVM it starts in "server" mode, while the 32-bit VM starts in "client" mode.
Yes, especially if your code is built to target a 64 bit platform.
