I'm simulating an overload of a server and I'm getting this error:
java.lang.OutOfMemoryError: unable to create new native thread
I've read on this page http://activemq.apache.org/javalangoutofmemory.html that I can increase the memory size, but how do I do that? Which file do I need to modify? I tried passing the arguments via the bin/activemq script, but no luck.
Your case corresponds to a massive number of threads.
There are three ways to solve it:
reduce the number of threads (e.g., -Dorg.apache.activemq.UseDedicatedTaskRunner=false, as in the document)
reduce the per-thread stack size with the -Xss option (defaults: 320 KiB for 32-bit Java on Windows/Linux, 1024 KiB for 64-bit Java on Windows/Linux; see the docs)
reduce (don't increase) the heap size with the -Xmx option to make room for per-thread stacks (512 MiB by default in the ActiveMQ script)
Note: if the stack or heap is too small, that can itself cause a different OutOfMemoryError.
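To see why shrinking the heap helps, here is a rough back-of-the-envelope calculation; all figures are illustrative assumptions for a 32-bit JVM, not measurements:

```java
// Thread stacks live in the process address space outside the Java heap,
// so a smaller heap leaves room for more threads. Illustrative numbers only.
public class ThreadBudget {
    public static void main(String[] args) {
        long addressSpaceMiB = 2048; // ~2 GiB usable in a 32-bit process (assumed)
        long heapMiB = 512;          // -Xmx512m, the ActiveMQ script default
        long overheadMiB = 256;      // code cache, perm gen, native libs (rough guess)
        long stackKiB = 320;         // default -Xss for 32-bit Java
        long maxThreads = (addressSpaceMiB - heapMiB - overheadMiB) * 1024 / stackKiB;
        System.out.println("Approx. max threads: " + maxThreads); // 4096
    }
}
```

Halving -Xss or trimming -Xmx raises that ceiling; the point is that the heap and the thread stacks compete for the same address space.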
You can specify them using the ACTIVEMQ_OPTS shell variable (on UNIX).
For example, run ActiveMQ as
ACTIVEMQ_OPTS=-Xss160k bin/activemq
Specify the -Xmx argument to the VM that is running ActiveMQ (Tomcat, for example).
You can give the Java virtual machine more memory using the -Xmx command-line argument.
E.g. java -Xmx512M MyClass
We were running into this issue on a Linux (Red Hat Enterprise 5) system and discovered that on this build the nproc ulimit in /etc/security/limits.conf actually controls the number of threads a user can spawn.
You can view this limit using the ulimit -a command.
Out of the box this was set to a soft limit of 100 and a hard limit of 150, which is woefully short of the number of threads necessary to run a modern App Server.
We removed this limit altogether and it solved this issue for us.
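For reference, the corresponding entries in /etc/security/limits.conf look roughly like this (the username and numbers are examples, not recommendations):

```
appuser  soft  nproc  4096
appuser  hard  nproc  8192
```

The effective limit for the current shell shows up under "max user processes" in ulimit -a (or directly via ulimit -u).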
This doesn't look like you are running out of heap space, so don't increase it (the -Xmx option). Instead, your application is running out of process memory, and decreasing the heap space will free up process memory for native use. The question is why you are using so much process memory. If you don't use JNI, you have probably created too many threads, and habe's post explains how to fix that.
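To confirm that thread count, rather than heap, is the problem, you can dump the live-thread count from inside the JVM; a minimal sketch using the standard ThreadMXBean:

```java
import java.lang.management.ManagementFactory;
import java.lang.management.ThreadMXBean;

public class ThreadCheck {
    public static void main(String[] args) {
        ThreadMXBean threads = ManagementFactory.getThreadMXBean();
        // Live and peak thread counts for this JVM.
        System.out.println("Live threads: " + threads.getThreadCount());
        System.out.println("Peak threads: " + threads.getPeakThreadCount());
        // Max heap for comparison; thread stacks are allocated outside this.
        System.out.println("Max heap (MiB): "
                + Runtime.getRuntime().maxMemory() / (1024 * 1024));
    }
}
```

If the peak count is in the thousands just before the error, the thread count, not the heap, is what needs taming.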
I'm trying to set up a Gridgain cluster with 2 servers:
Load data from a .csv file (1 million to 50 million records) into Gridgain using GridDataLoader.
Find the min, max, average, etc. of the loaded data.
When running as a standalone application in Eclipse, I get correct output.
But when making a cluster (2 nodes on the 2 servers + 1 node inside my Eclipse environment), I get a java.lang.OutOfMemoryError: GC overhead limit exceeded error.
The configuration file I'm using is http://pastebin.com/LUa7gxbe
Changing the -Xmx value in eclipse.ini might solve the problem.
Change it to -Xmx3g.
java.lang.OutOfMemoryError: GC overhead limit exceeded
This error happens when the JVM spends too much time executing garbage collection. There can be multiple causes, and it depends heavily on the details of your environment. I don't know Gridgain. Given how complex your environment is, I would look at VM tuning: if your application waits for the whole memory to fill up before running garbage collection, that is your main problem.
A hint could be the -XX:-UseParallelGC JVM option (some documentation is available here), but that should already be the default configuration in Gridgain. I'm not sure of the proper way to configure VM options in your environment (some options seem to be related to the cache). According to the same doc, a slow network can induce low CPU usage; I would guess heavy network traffic could likewise induce high CPU usage (perhaps related to GC)? To make sure you have an appropriate VM configuration, could you check which options are actually applied at run time?
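One easy way to check which options were actually applied is to print them from inside the running VM; a minimal sketch using the standard RuntimeMXBean:

```java
import java.lang.management.ManagementFactory;

public class ShowVmArgs {
    public static void main(String[] args) {
        // Lists the -X / -XX / -D arguments this JVM was started with.
        for (String arg : ManagementFactory.getRuntimeMXBean().getInputArguments()) {
            System.out.println(arg);
        }
    }
}
```

Run it (or drop the loop into any node) and you can see at a glance whether the GC and heap flags you configured actually reached the JVM.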
Edit the bin/ggstart.sh script and set JVM_OPTS to a higher value.
The default is 1 GB.
Change it to
JVM_OPTS="-Xms2g -Xmx2g -server -XX:+AggressiveOpts -XX:MaxPermSize=256m"
or higher
I start my program by specifying the maximum memory as 128 MB:
java -Xmx128M ...
Then I connect to the jvm instance with jconsole. In the "VM Summary" tab, I found that:
Maximum heap size: 127,808 kbytes
This value is smaller than the one I specified on the command line. Can anyone give me some tips on this?
I checked the value that is reported by Java Visual VM. It is exactly what I configured as JVM argument:
Example
JVM argument: 1024m
Heap Max: 1,073,741,824 B
So I guess jconsole has a special kind of calculation or Java Visual VM adjusts the value to the configured JVM argument - who knows?
I have found the exact memory size to be slightly smaller. How much smaller appears to vary based on the version of Java and the OS. I suspect there are implementation reasons why it is not precisely what you asked for.
BTW: This is only the maximum of the heap size. There are many other memory areas, e.g. thread stacks, direct memory, perm gen, shared libraries, native resources and memory-mapped files, which are not included, so I wouldn't worry too much about it: there is a good margin between this value and the maximum memory the application will use.
BTW2: A maximum of 128m is rather small these days. Is there a good reason for it to be so small?
To quote the following Java bug report:
Not a bug. The interpretation of the -Xmx flag is VM-dependent. Some
VMs, including HotSpot, enforce a lower bound on the effective value
of this option. The CCC proposal should not have mentioned the -Xmx
flag in this way.
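You can also read the enforced limit programmatically and compare it with the -Xmx you passed; a minimal sketch:

```java
public class MaxHeap {
    public static void main(String[] args) {
        // Runtime.maxMemory() reports the limit the VM actually enforces,
        // which may be slightly below the -Xmx value on the command line.
        long max = Runtime.getRuntime().maxMemory();
        System.out.println("Max heap: " + max + " bytes (" + max / 1024 + " KiB)");
    }
}
```

This is the same number jconsole shows, which is why it need not match the flag exactly.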
I have developed a chat application using threads, but when I start it the system becomes very slow and sometimes an exception occurs saying the heap is full. I want to increase the heap size of the Java Virtual Machine. How can I do it?
Just increase the heap size of the JVM. All Java applications, even simple ones, consume a lot of memory. Take a look at this article explaining in detail how to increase the amount of memory available for your application; basically you'll need to pass a couple of extra parameters to the JVM when you invoke the java command, like this:
java -Xms64m -Xmx256m HelloWorld
In the above command, I'm saying that the HelloWorld program should have an initial heap size of 64 MB and a maximum of 256 MB. Try these values and fiddle with them a bit until you find a combination that works for your application.
You can increase heap size, but your larger issue is "Why did I get that exception?" Increasing the heap size will only delay the inevitable if your application is not cleaning up after itself properly.
You need to instrument your application with Visual VM and see what's going on. That will give you more of a path forward than simply increasing the heap size.
Add -Xmx100m to the command when you start your app. This will give you 100 MB heap (you can change the number).
It sounds strange that a chat app would require more than the standard heap size...
Large server applications often experience two problems with these
defaults. One is slow startup, because the initial heap is small and
must be resized over many major collections. A more pressing problem
is that the default maximum heap size is unreasonably small for most
server applications.
You could start your program from the command prompt with these parameters:
java -Xms64m -Xmx256m chat_program
Here -Xms64m = 64 MB initial heap size
and -Xmx256m = 256 MB maximum heap size.
I have a Java program that is launched by a batch file with a line like this:
javaw -Xms64m -Xmx1024m com.acme.MyProgram
However, on some computers the program will not launch and displays the following message:
Could not reserve enough space for object heap. Could not create the Java virtual machine.
The problem seems to be that the maximum size of the memory allocation pool is larger than the computer can handle. Reducing it from 1024m to 512m seems to resolve the problem.
Is there a way I can determine how much memory is available on the computer ahead of time (from within the batch file) and determine whether to use -Xmx1024m or -Xmx512m in the batch file invocation? Note that this batch file only needs to work on Windows.
Actually the Java VM already does something similar. If you do not specify -Xms or -Xmx, then these values are inferred from the amount of physical memory on the machine. Or at least so says this page.
You could set -Xms to the minimum heap size which makes your application useful, and let Java determine a proper value for -Xmx.
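If running a small Java probe before the real launch is acceptable, a tiny helper class (MemProbe is a hypothetical name) can report the machine's physical memory via the HotSpot-specific com.sun.management.OperatingSystemMXBean, and the batch file can branch on its output; a sketch, assuming a Sun/Oracle JVM:

```java
import java.lang.management.ManagementFactory;

public class MemProbe {
    public static void main(String[] args) {
        // HotSpot-specific subinterface that exposes physical memory.
        com.sun.management.OperatingSystemMXBean os =
                (com.sun.management.OperatingSystemMXBean)
                        ManagementFactory.getOperatingSystemMXBean();
        long physicalMiB = os.getTotalPhysicalMemorySize() / (1024 * 1024);
        System.out.println(physicalMiB);
        // The batch file can capture this number and pick
        // -Xmx1024m or -Xmx512m accordingly.
    }
}
```

The cast fails on non-HotSpot VMs, so treat this as a convenience, not a portable solution.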
You could take a look at this page for some answers: Get JVM to grow memory demand as needed up to size of VM limit?
If your program functions correctly with a max heap of 512m, I would use that value.
That said I will also check to see if there is a way to do what you're asking as that is an interesting question.
You could execute from your batch file, check the error level on exit, and restart at a lower memory setting if it failed. I'm not sure the error level would work; if it doesn't, you could also check how long the program took to execute: anything less than 10 seconds would be a giveaway.
Just a couple of comments, though:
If you know it doesn't NEED more than 512, you should run a test to ensure that 1024 actually helps. Larger heaps can often make your GC pauses longer and do little else.
If you're pretty sure you'll use a certain amount of RAM (say, the heap will easily fill the 512 you are allocating), you should probably set the minimum to that number. Setting both the min and max to 512 is good if your program allocates a lot but isn't situational (it always uses about the same amount of RAM).
I'm using ASANT to run an XML file which points to a NARS.jar file (I do not have the project files for NARS.jar).
I'm getting "java.lang.OutOfMemoryError: Java heap space".
I used VisualVM to look at the heap while running NARS.jar, and it says that it uses at most 50 MB of heap space.
I've set the initial and maximum heap size to 512 MB.
Does anyone have an idea of what could be wrong?
I have 1 GB of physical memory and created a 5 GB page file (for testing purposes).
Thanks in advance.
Your app may be trying to allocate memory that exceeds your 512m limit, so you see an OutOfMemoryError even though only 50 MB is in use. To test this, I would set:
-Xms512m -Xmx1024m
And see what happens. I would also try a smaller test file, say 1 GB, and keep reducing the file size until you stop seeing the error. If that succeeds, then the trouble is that what you're trying to do, the way you're trying to do it, takes too much memory; time to look for an alternate approach.
Are you forking the process when running the NARS.jar file? Setting ANT_OPTS will only have effect on the VM running the ant system. If you use the java task to start/fork an additional VM process, the ANT_OPTS settings will not be inherited.
If this is the case, set either fork="false" in the java task (if you are not using any other options, which require fork to be enabled), or set maxmemory="512m".
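In build-file terms (the jar name is taken from the question; everything else is an assumption), that looks something like:

```xml
<!-- Either run in Ant's own VM, so ANT_OPTS applies... -->
<java jar="NARS.jar" fork="false"/>

<!-- ...or fork a child VM and give it its own limit; maxmemory
     is ignored unless fork is enabled. -->
<java jar="NARS.jar" fork="true" maxmemory="512m"/>
```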
XML files are notorious memory hogs since the DOM representation can often require ten times their size on disk.
My guess is that this is where you hit the limit. Is there a stack trace with the out of memory exception?
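If the DOM blow-up turns out to be the cause and you control the parsing code, a streaming parser keeps memory roughly flat regardless of file size. A minimal StAX sketch (the tiny inline document stands in for a large file):

```java
import java.io.StringReader;
import javax.xml.stream.XMLInputFactory;
import javax.xml.stream.XMLStreamConstants;
import javax.xml.stream.XMLStreamReader;

public class StreamingCount {
    public static void main(String[] args) throws Exception {
        String xml = "<root><a/><a/><a/></root>"; // stand-in for a large file
        XMLStreamReader reader = XMLInputFactory.newInstance()
                .createXMLStreamReader(new StringReader(xml));
        int elements = 0;
        // Events are processed one at a time; the whole tree is never in memory.
        while (reader.hasNext()) {
            if (reader.next() == XMLStreamConstants.START_ELEMENT) {
                elements++;
            }
        }
        System.out.println("Elements: " + elements); // 4, counting <root>
    }
}
```

Whether this is an option depends on whether NARS.jar does its own DOM parsing internally, which you can't change without the sources.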