High CPU usage in Eclipse when idle - java

On my multicore machine, Eclipse uses between 100% and 250% CPU, even when idling on a fresh, plain install with an empty workspace. When actually doing things, it becomes slow and unresponsive.
I have tried setting the memory settings as suggested here: Eclipse uses 100 % CPU randomly. That did not help. I also tried different Java versions, namely OpenJDK and Oracle Java 7, and the Eclipse versions Juno and Indigo. I am on Ubuntu 12.04 LTS.
As another, possibly unrelated, issue: when I close Eclipse, the Java process stays alive with over 200% CPU usage and has to be killed manually.

I was having the same problem today, and it turned out to be an indexing thread that was occupying the CPU. I had recently added quite a few files to a project and had forgotten about it. I realize it's not likely that anyone else has this exact problem, but it might be useful to post how I investigated it.
I'm running Ubuntu 12.10 with STS based on Eclipse Juno.
Start Eclipse from the command line and redirect output to a file so we can get a thread dump later.
Allow it to settle for a bit, then get a listing of the CPU usage for each thread: ps -mo 'pid lwp stime time pcpu' -C java. Here's a sample of the output that identified my CPU-hungry thread:
  PID   LWP STIME     TIME %CPU
 6974     - 07:42 00:15:51  133
    -  7067 07:42 00:09:49 86.1
Convert the thread id (in my case 7067) to hex: 0x1b9b (e.g. on the command line using: printf "0x%x\n" 7067).
Do a thread dump of the Java process using kill -3, as in: kill -3 6974. The output is saved in the file to which you redirected stdout when you started Eclipse.
Open the file and look for the hex id of the thread:
"Link Indexer Delayed Write-10" prio=10 tid=0x00007f66b801a800 nid=**0x1b9b** runnable [0x00007f66a9e46000]
java.lang.Thread.State: RUNNABLE
at com.ibm.etools.references.internal.bplustree.db.ExtentManager$WriteBack.r
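For reference, here is the whole investigation as a sequence of shell commands (a rough sketch; the log file name is arbitrary, and the PID 6974, thread id 7067, and hex id 0x1b9b are the values from my case, so substitute your own):
# 1. Start Eclipse with stdout/stderr captured, so the thread dump has somewhere to go
./eclipse > eclipse_stdout.log 2>&1 &
# 2. List per-thread CPU usage of the java process and note the hottest LWP
ps -mo 'pid lwp stime time pcpu' -C java
# 3. Convert that thread id to hex (7067 in my case)
printf "0x%x\n" 7067
# 4. Request a thread dump; it is written to the redirected stdout file
kill -3 6974
# 5. Look up the hex id (the nid= field) in the dump
grep -A 3 '0x1b9b' eclipse_stdout.log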

I've seen such behaviour only when the garbage collector went crazy because the allocated memory had actually reached the configured maximum heap of the VM. If you have a large Eclipse installation, your first step should always be to increase the memory settings in eclipse.ini.
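For example, a bumped-up -vmargs section in eclipse.ini might look like this (just a sketch; the right numbers depend on how much RAM you have and how big your workspace is):
-vmargs
-Xms512m
-Xmx4096m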
Please also activate Window -> Preferences -> General -> Show heap status. It will show you how much memory Eclipse currently uses (in the status line). If that goes up to the allowed maximum and doesn't drop anymore (i.e. the garbage collector cannot clean up unused objects), then that is exactly the indication of what I described above.
Edit: It would also be good to know which Eclipse package you use, as they contain different plugins by default: Classic, Modeling, Java EE Developers, ...?

I've had this problem with plugins, but never with Eclipse itself.
You can try to debug it by going to Help > About Eclipse > Installation details and disabling the plugins one by one.

Uninstalling the Mylyn plugins fixed the issue for me, and the performance boost was so drastic that I am posting it as an answer to a 6-year-old question.
Go to Help->About Eclipse->Installation Details->Installed Software
and uninstall all the plugins that you know you are not using. I uninstalled only the Mylyn plugins and it worked wonders for me.
EDIT:
In Eclipse version 2018-09 (4.9.0), the freeze/unresponsiveness issue can be solved by closing the Package Explorer and Project Explorer views.
I know this may sound like a dumb solution, but I have tested this on about 5 peer machines, multiple times, and believe me when I say this simple solution removed the freeze issue on each of them. As long as the Package/Project Explorer was not reopened, none of them complained about an unresponsive Eclipse.

Problem: Eclipse and the Eclipse indexer take up all my resources / CPU%
Tested in Eclipse IDE for C/C++ Developers Version: 2022-09 (4.25.0) on Linux Ubuntu 18.04.
Quick summary
Solution: decrease the maximum number of threads Eclipse can use to half (or fewer) of the hardware threads your computer has. So, if your computer has 8 "cores" (actually: hyperthreads), decrease the max number of threads Eclipse can use to 4 or fewer, as follows:
In $HOME/eclipse/cpp-2022-09/eclipse/eclipse.ini on Linux Ubuntu, or equivalent for your OS, make this change (reducing from 10 threads max, to 4, in my case):
Change from:
-Declipse.p2.max.threads=10
to:
-Declipse.p2.max.threads=4
Restart Eclipse.
Now, Eclipse can only take up to 4 of my 8 hyperthreads, and my system runs much better!
If on Linux, you should also reduce your "swappiness" setting to improve system performance. See below.
Details and additional improvements to make
I noticed a huge improvement in my ability to use my computer while Eclipse was indexing projects once I made this change. Before, Eclipse used to make my computer almost totally unusable for hours or days at a time while it indexed my huge (multi-GiB) repos.
You should also give Eclipse more RAM, if needed. In that same eclipse.ini file mentioned above, the -Xms setting sets the starting RAM given to Eclipse's Java runtime environment, and the -Xmx setting sets the max RAM given to it. For indexing large projects, ensure it has a large enough max RAM to successfully index the project. The defaults, if I remember correctly, are:
-Xms256m
-Xmx2048m
...which means: starting RAM given to the Eclipse Java runtime environment is 256 MiB, and max it is allowed to grow to if needed is 2048 MiB.
I have 32 GiB of RAM and 64 GiB of swap space, and my indexer was stalling if I gave Eclipse < 12 GiB of max RAM, so I set my settings as follows to start Eclipse with 1 GiB (1024 MiB) of RAM, and allow it up to 12 GiB (12288 MiB) of RAM:
-Xms1024m
-Xmx12288m
So, my total changes were from:
-Declipse.p2.max.threads=10
-Xms256m
-Xmx2048m
...to:
-Declipse.p2.max.threads=4
-Xms1024m
-Xmx12288m
Here is my final /home/gabriel/eclipse/cpp-2022-09/eclipse/eclipse.ini file, with those changes in-place:
-startup
plugins/org.eclipse.equinox.launcher_1.6.400.v20210924-0641.jar
--launcher.library
/home/gabriel/.p2/pool/plugins/org.eclipse.equinox.launcher.gtk.linux.x86_64_1.2.600.v20220720-1916
-product
org.eclipse.epp.package.cpp.product
-showsplash
/home/gabriel/.p2/pool/plugins/org.eclipse.epp.package.common_4.25.0.20220908-1200
--launcher.defaultAction
openFile
--launcher.appendVmargs
-vm
/home/gabriel/.p2/pool/plugins/org.eclipse.justj.openjdk.hotspot.jre.full.linux.x86_64_19.0.1.v20221102-1007/jre/bin
-vmargs
--add-opens=java.base/java.io=ALL-UNNAMED
--add-opens=java.base/sun.nio.ch=ALL-UNNAMED
--add-opens=java.base/java.net=ALL-UNNAMED
--add-opens=java.base/sun.security.ssl=ALL-UNNAMED
-Dosgi.requiredJavaVersion=17
-Dosgi.instance.area.default=#user.home/eclipse-workspace
-Dsun.java.command=Eclipse
-XX:+UseG1GC
-XX:+UseStringDeduplication
--add-modules=ALL-SYSTEM
-Dosgi.requiredJavaVersion=11
-Dosgi.dataAreaRequiresExplicitInit=true
-Dorg.eclipse.swt.graphics.Resource.reportNonDisposed=true
-Xms1024m
-Xmx12288m
--add-modules=ALL-SYSTEM
-Declipse.p2.max.threads=4
-Doomph.update.url=https://download.eclipse.org/oomph/updates/milestone/latest
-Doomph.redirection.index.redirection=index:/->http://git.eclipse.org/c/oomph/org.eclipse.oomph.git/plain/setups/
--add-opens=java.base/java.lang=ALL-UNNAMED
-Djava.security.manager=allow
How to see how many "cores" (again, actually: hyperthreads) you have on your hardware
On Linux Ubuntu, simply open the "System Monitor" app and count the cores; mine shows 8.
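If you prefer the command line (standard Linux tools, nothing Eclipse-specific), you can also count hardware threads with:
nproc
# or, for a breakdown of sockets, cores, and threads per core:
lscpu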
How many threads should I give Eclipse?
A good starting point is to give Eclipse half of your total cores, to keep it from bogging down your system all the time while indexing and refreshing large projects. Since I have 8 cores (hyperthreads), I should give Eclipse 4 of them by setting -Declipse.p2.max.threads=4 in the .ini file.
This may sound counter-intuitive, but the larger your project and the weaker your computer, the fewer threads you should give Eclipse! This is because the larger your project and the weaker your computer, the more your computer will get bogged down when you try to use things like your Chrome web browser alongside it. So, to keep Eclipse from sucking up all your resources and freezing your computer, limit the number of threads it can have even more. If I find Eclipse bogging down my computer again, I'll reduce its threads to 2 or 3 max instead of 4. I previously gave it 7 of my 8 threads, and it was horrible! My computer ran so stinking slow and I could never use things like Chrome or Slack properly!
How much max RAM (-Xmx) should I give Eclipse?
The starting setting of -Xmx2048m (2048 MiB, or 2 GiB) is fine for most users. It handles most normal projects you'll encounter.
As little as -Xmx512m (512 MiB, or 0.5 GiB) or so can probably index the entire Arduino AVR (8-bit MCU) source code just fine.
I need at least -Xmx12288m (12288 MiB, or 12 GiB) for my large mono-repo.
You might need a whopping 32 GiB to 64 GiB (-Xmx32768m to -Xmx65536m) to index the entire C++ Boost library, which is totally nuts. So, in most cases, exclude the Boost library from your indexer. I mention that in my Google document linked-to below.
The rule of thumb is to increase your -Xmx setting a bit whenever you see your indexer struggling or stalled while Eclipse's usage of the available RAM stays maxed out. Here is a screenshot of the bottom of my Eclipse window showing Eclipse currently using 8456 MiB of the 12288 MiB it has allocated on the heap:
If it were rapidly climbing to the max and frequently staying there, I'd need to increase my -Xmx setting further, to let Eclipse grow the heap more.
To turn on showing the heap status at the bottom of the Eclipse window (if it isn’t already on by default):
Window → Preferences → General → check the box for "Show heap status" → click "Apply and Close".
NB: When Eclipse first starts, the right-hand number in the heap status indicator will equal your starting heap allocation, which is defined by the -Xms value. As Eclipse needs more memory, it will allocate more, growing that right-hand number up to the -Xmx value you've defined. Again, if your indexer stalls or freezes because it's out of RAM, increase the -Xmx number to allow Eclipse's indexer to use more heap memory (RAM).
What other options can I pass to Eclipse's underlying Java virtual machine (JVM)?
Eclipse's article, FAQ How do I increase the heap size available to Eclipse?, states (emphasis added):
Some JVMs put restrictions on the total amount of memory available on the heap. If you are getting OutOfMemoryErrors while running Eclipse, the VM can be told to let the heap grow to a larger amount by passing the -vmargs command to the Eclipse launcher. For example, the following command would run Eclipse with a heap size of 2048MB:
eclipse [normal arguments] -vmargs -Xmx2048m [more VM args]
The arguments after -vmargs are directly passed to the VM. Run java -X for the list of options your VM accepts. Options starting with -X are implementation-specific and may not be applicable to all VMs.
You can also put the extra options in eclipse.ini.
So, as it says, run this:
java -X
...for a list of all possible arguments you can pass to the underlying Java virtual machine (JVM). Here are the descriptions from that output for -Xms and -Xmx:
-Xms<size> set initial Java heap size
-Xmx<size> set maximum Java heap size
(For Linux users) reduce your system's "swappiness"
If on Linux, you should also reduce your "swappiness" setting from the default of 60 down to the range of 0 to 10 (I prefer 0) to increase your system's performance and reduce lagging and freezing when you get above about 75% RAM usage.
"Swappiness" describes how likely your system is to move the contents of RAM to your "swap space", or virtual memory, which is on your hard disk drive (HDD) or solid state drive (SSD). Swappiness setting values range from 0 to 200 (see my answer quoting the Linux kernel source code here), where 0 means it will try not to use your swap space until it has to, and 200 means it will favor using your swap space earlier.
The benefit of virtual memory, or swap space, is that it can expand your computer's "RAM-like" memory practically for free, allowing you to run a program or do a heavy task like compiling a large application. Such a heavy process might want 64 GiB of RAM even if you only have 8 GiB of RAM. Normally, your computer would crash and couldn't do it, but with swap space it can, as it treats your swap file or partition like extra RAM. That's pretty amazing. The downside of swap memory, however, is that it's much slower than RAM, even when it is running on a high-speed M.2 SSD.
So, to limit swapping and improve performance, just reduce your swappiness to 0. Follow my instructions here: How do I configure swappiness?.
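In short (a sketch of the usual approach; see the linked answer for the full details), on Ubuntu that looks like:
# check the current value
cat /proc/sys/vm/swappiness
# change it until the next reboot
sudo sysctl vm.swappiness=0
# make it permanent by adding this line to /etc/sysctl.conf:
#   vm.swappiness = 0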
I mentioned and described how decreasing my system's swappiness from 60 to 0 really improved my performance and decreased periodic freezing in these two places here:
https://github.com/ElectricRCAircraftGuy/bug_reports/issues/3#issuecomment-1347864603
Unix & Linux: what is the different between settings swappiness to 0 to swapoff
As an alternative, if you have >= 64 GB of RAM (since that's a large enough amount for me to reasonably consider doing this), you may consider disabling all swap space entirely, and just running on RAM. On my bigger machines with that much RAM, that's what I've done.
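If you do go swapless (again, I would only consider it with plenty of RAM), the usual steps are roughly these (a sketch; back up /etc/fstab before editing it):
# disable all swap for the current session
sudo swapoff -a
# then comment out or remove the swap entry in /etc/fstab so swap
# does not come back after a reboot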
References:
My Google document: Eclipse setup instructions on a new Linux (or other OS) computer
"Troubleshooting" section of that doc
My answer: java.lang.OutOfMemoryError when running bazel build
My answer: Ask Ubuntu: How do I increase the size of swapfile without removing it in the terminal?
I cross-linked back to here from my short answer: Eclipse uses 100 % CPU randomly and on Super User here: High CPU usage and very slow performance with Eclipse
How to view memory usage in eclipse (beginner)
I also put this info. in my Google document linked-to above.
https://wiki.eclipse.org/FAQ_How_do_I_increase_the_heap_size_available_to_Eclipse%3F
My answer: How do I configure swappiness?

Java's multi-threaded garbage collector is garbage.
Add the -XX:-UseLoopPredicate option to the java command line.
See e.g. this bug: https://bugzilla.redhat.com/show_bug.cgi?id=882968
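If you want to try this for Eclipse itself, the equivalent is to add the flag to eclipse.ini after the existing -vmargs line (a sketch; whether it helps at all depends on your JVM version):
-XX:-UseLoopPredicate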

I was facing the same issue. Passing the following VM arguments to Eclipse worked fine for me.
-Xmx1300m
-XX:PermSize=256m
-XX:MaxPermSize=256m
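Note that -XX:PermSize and -XX:MaxPermSize only apply to Java 7 and earlier; PermGen was removed in Java 8, so on newer JVMs the rough equivalent (only needed if you actually want to cap class metadata) would be something like:
-Xmx1300m
-XX:MaxMetaspaceSize=256m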

Related

Java goes OutOfMemory even with enough RAM

I have an app that uses the following jvm options:
-Xmx512m -Xms256m -XX:+UseParNewGC -XX:+UseConcMarkSweepGC
-XX:MaxGCPauseMillis=2 -XX:MaxDirectMemorySize=1G
I run it on Windows 7 x64 with 8 GB of RAM. When Task Manager says that 60% of RAM is in use, it becomes impossible to run my program; Java says "Out of memory", even though in theory I still have almost 3 GB of free RAM left. I profiled my project in NetBeans until it suddenly crashed at a random spot. What could cause these problems? Is my program really so expensive?
You should greedily allocate your minimum required overhead up front. That is, use something like -Xms1g -Xmx1g, so that when your app actually starts running, it has already reserved its maximum heap.
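As a concrete launch line, that idea looks like this (your-app.jar is just a placeholder):
java -Xms1g -Xmx1g -jar your-app.jar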

Configuring eclipse/java to boost the performance to the fullest

I have a Windows PC with a 64-bit OS (Windows 7 Enterprise) and 8 GB of RAM. I want to run a heavy Java program in Eclipse, and I would like to allocate as much of the PC's resources as possible to Eclipse/the JVM to boost performance.
By the way, in two different runs of the same program, it took 33 minutes in one and 15 hours in the other. That's a very big difference, and I do not know what configuration change (if any) caused this deterioration in performance.
Could you please help me configure it properly?
In eclipse.ini (in the main folder of Eclipse) there is a bunch of parameters that let you configure the amount of memory. At the end of the file there should be something like this:
-vmargs
-Xms1024m
-Xmx2048m (max heap)
-Xss1m (stack)
You can add as many parameters as you need. All of them must be after the line
-vmargs
You can find more JVM parameters here
You can also allocate memory for Java this way:
Type Java inside the Search Control Panel box.
Click the Java icon that pops up.
Click View.
Make sure to pick the x64 version if you have a 64-bit OS.
Change the Runtime Parameters, e.g. put '-Xmx1024m'.
Change it depending on how much RAM you have; it is recommended to use 256/512/768/1024/1536/2048.
For 32-bit operating systems 768m is recommended.
If you have 64-bit, or that doesn't work, continue to try the following: 1024m, 1536m, 2048m.
Hope that helps.

Java 6 Update 25 VM crash: insufficient memory

For an update to this question, see below.
I am experiencing a (reproducible, at least for me) JVM crash (not an OutOfMemoryError).
(The application that crashes is Eclipse 3.6.2.)
However, looking at the crash log makes me wonder:
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Native memory allocation (malloc) failed to allocate 65544 bytes for Chunk::new
# Possible reasons:
# The system is out of physical RAM or swap space
# In 32-bit mode, the process size limit was hit
# Possible solutions:
# Reduce memory load on the system
# Increase physical memory or swap space
# Check if swap backing store is full
# Use 64 bit Java on a 64 bit OS
# Decrease Java heap size (-Xmx/-Xms)
# Decrease number of Java threads
# Decrease Java thread stack sizes (-Xss)
# Set larger code cache with -XX:ReservedCodeCacheSize=
# This output file may be truncated or incomplete.
Current thread (0x531d6000): JavaThread "C2 CompilerThread1" daemon
[_thread_in_native, id=7812, stack(0x53af0000,0x53bf0000)]
Stack: [0x53af0000,0x53bf0000], sp=0x53bee860, free space=1018k
Native frames: (J=compiled Java code, j=interpreted, Vv=VM code, C=native code)
V [jvm.dll+0x1484aa]
V [jvm.dll+0x1434fc]
V [jvm.dll+0x5e6fc]
V [jvm.dll+0x5e993]
V [jvm.dll+0x27a571]
V [jvm.dll+0x258672]
V [jvm.dll+0x25ed93]
V [jvm.dll+0x260072]
V [jvm.dll+0x24e59a]
V [jvm.dll+0x47edd]
V [jvm.dll+0x48a6f]
V [jvm.dll+0x12dcd4]
V [jvm.dll+0x155a0c]
C [MSVCR71.dll+0xb381]
C [kernel32.dll+0xb729]
I am using Windows XP 32-bit SP3. I have 4GB RAM.
Before starting the application I had 2 GB free according to the Task Manager (plus 1 GB of system cache which could be freed as well). I definitely have enough free RAM.
From the start till the crash I logged the JVM memory statistics using visualvm and jconsole.
I acquired the memory consumption statistics until the last moments before the crash.
The statistics shows the following allocated memory sizes:
HeapSize: 751 MB (used 248 MB)
Non-HeapSize(PermGen & CodeCache): 150 MB (used 95 MB)
Size of memory management areas (Edenspace, Old-gen etc.): 350 MB
Thread stack sizes: 17 MB (according to Oracle, and given that 51 threads are running)
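As an aside, similar per-generation numbers can also be sampled from the command line with jstat (a sketch; the PID is a placeholder for the Eclipse java process id):
# prints % utilisation of the survivor, eden, old and perm spaces,
# plus GC counts and times, every 1000 ms
jstat -gcutil <pid> 1000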
I am running the application (jre 6 update 25, server vm) using the parameters:
-XX:PermSize=128m
-XX:MaxPermSize=192m
-XX:ReservedCodeCacheSize=96m
-Xms500m
-Xmx1124m
Question:
Why does the JVM crash when there's obviously enough memory on the VM and OS?
With the above settings I think I cannot hit the 2 GB 32-bit limit (1124 MB + 192 MB + 96 MB + thread stacks < 2 GB). In any other case (too much heap allocation), I would rather expect an OutOfMemoryError than a JVM crash.
Who can help me to figure out what is going wrong here?
(Note: I upgraded recently to Eclipse 3.6.2 from Eclipse 3.4.2 and from Java 5 to Java 6. I suspect that there's a connection between the crashes and these changes because I haven't seen these before)
UPDATE
It seems to be a JVM bug introduced in Java 6 Update 25 that has something to do with the new JIT compiler. See also this blog entry.
According to the blog, the fix for this bug should be part of the next Java 6 update.
In the meantime, I got a native stack trace during a crash. I've updated the above crash log.
The proposed workaround, using the VM argument -XX:-DoEscapeAnalysis, works (at least it notably lowers the probability of a crash).
The 2 GB figure for a 32-bit JVM on Windows is incorrect; see https://blogs.sap.com/2019/10/07/does-32-bit-or-64-bit-jvm-matter-anymore/
Since you are on Windows XP you are stuck with a 32-bit JVM.
The max heap is about 1.5 GB on a 32-bit VM on Windows. You are at 1412 MB to begin with, without threads. Did you try decreasing the thread stack size (-Xss), and have you tried eliminating the initially allocated PermGen: -XX:PermSize=128m? This sounds like an Eclipse problem, not a memory problem per se.
Can you move to a newer JVM or a different (64-bit) JVM on a different machine? Even if you are targeting Windows XP, there is no reason to develop on it unless you have to. Eclipse can run, debug and deploy code on remote machines easily.
Eclipse's JVM can be different from the JVM of the things you run in or with Eclipse. Eclipse is a memory pig. You can remove unnecessary Eclipse plug-ins to use less memory; it comes with things out of the box you probably don't need or want.
Try to null out references (to eliminate circularly un-collectible GC objects), re-use allocated memory, use singletons, and profile your memory usage to eliminate unnecessary objects, references and allocations. Additional tips:
Prefer static memory allocation, i.e. allocate once per VM as opposed to dynamically.
Avoid creating temporary objects within functions; consider a reset() method that allows an object to be reused.
Avoid String mutations and mutation of auto-boxed types.
I think that @ggb667 has nailed the reason your JVM is crashing. 32-bit Windows architectural constraints limit the available RAM for a Java application to 1.5 GB [1] ... not 2 GB as you surmised. Also, you have neglected to include the address space occupied by the code segment of the executable, shared libraries, the native heap, and "other things".
Basically, this is not a JVM bug. You are simply running against the limitations of your hardware and operating system.
There is a possible solution in the form of PAE (Physical Address Extension) support in some versions of Windows. According to the link, Windows XP with PAE makes up to 4 GB of usable address space available to user processes. However, there are caveats about device driver support.
Another possible solution is to reduce the max heap size, and do other things to reduce the application's memory utilization; e.g. in Eclipse reduce the number of "open" projects in your workspace.
See also: Java maximum memory on Windows XP
[1] Different sources say different things about the actual limit, but it is significantly less than 2 GB. To be frank, it doesn't matter what the actual limit is.
In an ideal world this question should no longer be of practical interest to anyone. In 2020:
You shouldn't be running Windows XP. It has been end of life since April 2014
You shouldn't be running Java 6. It has been end of life since April 2013
If you are still running Java 6, you should be at the last public patch release: 1.6.0_45. (Or a later 1.6 non-public release if you have / had a support contract.)
Either way, you should not be running Eclipse on this system. Seriously, you can get a new 64-bit machine for a few hundred dollars with more memory, etc that will allow you to run an up-to-date operating system and an up-to-date Java release. You should use that to run Eclipse.
If you really need to do Java development on an old 32-bit machine with an old version of Java (because you can't afford a newer machine) you would be advised to use a simple text editor and the Java 6 JDK command line tools (and a 3rd-party Java build tool like Ant, Maven, Gradle).
Finally, if you are still trying to run / maintain Java software that is stuck on Java 6, you should really be trying to get out of that hole. Life is only going to get harder for you:
If the Java 6 software was developed in-house or you have source code, port it.
If you depend on proprietary software that is stuck on Java 6, look for a new vendor.
If management says no, put it to them that they may need to "turn it off".
You / your organization should have dealt with this issue SEVEN years ago.
I stumbled upon a similar problem at work. We had set -Xmx65536M for our application but kept getting exactly the same kind of errors. The funny thing is that the errors always happened at times when our application was doing pretty lightweight calculations, relatively speaking, and was thus nowhere near this limit.
We found a possible solution for the problem online: http://www.blogsoncloud.com/jsp/techSols/java-lang-OutOfMemoryError-unable-to-create-new-native-thread.jsp , and it seemed to solve our problem. After lowering -Xmx to 50G, we've had none of these issues.
What actually happens in this case is still somewhat unclear to us.
The JVM has its own limits that will stop it long before it hits the physical or virtual memory limits. What you need to adjust is the maximum heap size, which is set with another one of the -X flags. (I think it's something creative like -XHeapSizeLimit but I'll check in a second.)
Here we go:
-Xmsn    Specify the initial size, in bytes, of the memory allocation pool.
         This value must be a multiple of 1024 greater than 1 MB.
         Append the letter k or K to indicate kilobytes, or m or M to indicate megabytes.
         The default value is 2 MB. Examples:
             -Xms6291456
             -Xms6144k
             -Xms6m
-Xmxn    Specify the maximum size, in bytes, of the memory allocation pool.
         This value must be a multiple of 1024 greater than 2 MB.
         Append the letter k or K to indicate kilobytes, or m or M to indicate megabytes.
         The default value is 64 MB. Examples:
             -Xmx83886080
             -Xmx81920k
             -Xmx80m

IBM JRE 1.5 will not start up with the requested 1.5G memory

IBM JRE 5.0 on Windows, when given -Xmx1536m on a laptop with 2 GB of memory, refuses to start up; the error message is below. With -Xmx1000m it does start.
Also, it starts fine with -Xmx1536m on other servers and even laptops, so I think there is more to this than just inadequate memory.
Also, when started from within Eclipse (albeit using the JRE from the IBM 5 JDK in this case) with the same memory parameter, it runs fine.
Any idea what is going on here?
JVMJ9VM015W Initialization error for library j9gc23(2): Failed to instantiate heap. 1536M requested
Could not create the Java virtual machine
Edit:
Does anyone know about the "3GB switch" and whether it is relevant here (beyond the obvious fact that this is a memory-limitation problem)? How can I tell if it is enabled, and what is the most straightforward way to turn it on?
According to IBM DeveloperWorks:
Cause: The system does not have the necessary resources to satisfy the maximum default heap value required to run the JVM.
To resolve, here is what it says:
Resolving the problem: If you receive this error message when starting the JVM, free memory by stopping other applications that might be consuming system resources.
Your JVM doesn't have enough memory resources to create the maximum amount of heap space (1536 MB). Just make sure that you have enough memory to accommodate it.
Also, I believe that on Windows the maximum heap space is about 1000 MB? I'm not sure that's solid, but on Linux/AIX any -Xmx of more than 1 GB works fine.
The JVM requires that it be able to allocate its memory as a single contiguous block. If you are on a 32-bit system, the maximum available is about 1280 MB, more or less. To get more you must run a 64-bit JVM on a 64-bit OS.
You may be able to get a little more by starting the JVM immediately after rebooting.
As to starting OK on other systems: are those 32-bit or 64-bit?
Pretty much the maximum you are guaranteed to get on a Windows platform is 1450 MB. Sometimes Windows/java.exe maps DLLs to addresses in the 1.5-2.0 GB range. This doesn't change even if you use the /3GB trick (or you have an OS that supports it). You have to manually rebase the DLLs to move them higher, towards the 2 GB (or 3 GB) boundary. It's a real pain in the ass, and I've done it before, but the best I've ever been able to get, with or without /3GB, is 1.8 GB on 32-bit Windows.
Best to be done with it and migrate to a 64-bit OS. They're prevalent nowadays.
I had the same issue in an IBM Engineering Lifecycle Management installation:
Problem: JVMJ9VM015W Initialization error for library j9gc26(2): Failed to instantiate heap; Could not create the Java virtual machine.
Solution: this resolved my issue. If you don't have 16 GB of RAM, then please don't change the Jazz server startup file; if you have 8 GB of RAM, just do not increase the memory sizes in the server beyond:
set JAVA_OPTS=%JAVA_OPTS% -Xmx4G
set JAVA_OPTS=%JAVA_OPTS% -Xms4G
set JAVA_OPTS=%JAVA_OPTS% -Xmn1G

Why does Sun Java on Solaris take more than twice the RSS memory?

I've got a problem on my Solaris servers. When I launch a Sun Java process with restricted memory, it takes more than twice the resources.
For example, I have 64 GB of memory on each of my servers. One is on Linux, the others are on Solaris. I run the same software on all servers (only Java).
When the servers start they use between 400 MB and 1.2 GB of RAM. I launch my Java processes (generally between 4 and 16 GB per process) and I can't run more than 32 GB in total as defined by the Xms and Xmx values. I get this kind of error:
> /java -d64 -Xms8G -Xmx8G -version
Error occurred during initialization of VM
Could not reserve enough space for object heap
As you can see here, a lot of memory is reserved, and it is reserved by the Java processes:
> swap -s
total: 22303112k bytes allocated + 33845592k reserved = 56148704k used, 704828k available
As soon as I kill them one by one, I recover my reserved space and can launch others. But in effect I can't use more than half of my memory.
Does anybody know how to resolve this problem?
Thanks
I believe the issue is that Linux overcommits memory allocations, while Solaris makes sure that what you allocate actually fits in virtual memory.
If you think that's a Linux advantage, you might reconsider when the Linux OOM killer randomly kills your mission-critical application at the worst possible moment.
To fix the issue, just add more swap space to Solaris.
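For completeness, adding swap on Solaris looks roughly like this (a sketch; the path and size are placeholders, and the exact steps vary by Solaris release and whether the root pool is ZFS):
# create a 16 GB swap file and activate it
mkfile 16g /export/swapfile
swap -a /export/swapfile
# verify the swap devices/files in use
swap -l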
