JHAT cannot analyze a memory dump - java

I have a memory dump file, and jhat gives the following messages so I cannot analyze anything (no data is displayed):
Resolving 0 objects...
WARNING: hprof file does not include java.lang.Class!
WARNING: hprof file does not include java.lang.String!
WARNING: hprof file does not include java.lang.ClassLoader!
Does this mean the hprof file is incomplete or corrupt?
I am using the
-XX:+HeapDumpOnOutOfMemoryError
option in my Tomcat startup.
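For context, this flag is typically set via CATALINA_OPTS, e.g. in Tomcat's bin/setenv.sh; the dump path below is only an illustration:
export CATALINA_OPTS="$CATALINA_OPTS -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/path/to/dumps"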

I just came across this same issue with my own heap dump.
jhat shows the warnings you describe, and there's no useful data displayed
Eclipse MAT complains about a NullPointerException
VisualVM can't open the heap dump file at all
It looks like this happens when there is not enough disk space at the time that the heap is dumped, so the file is indeed incomplete/corrupt.
http://forums.oracle.com/forums/thread.jspa?threadID=1175621&tstart=135
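If low disk space is the suspect, a quick sanity check before re-running is to compare free space on the dump partition with the JVM's -Xmx, since the dump can be roughly as large as the used heap (the path below is just a placeholder):
df -h /path/where/dumps/are/written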

I have the same problem today. To clarify, I am using the option heap=sites, which is different from a full heap dump. I also get the same messages from jhat and jvisualvm. It is possible that jhat does not support reading HPROF files created in heap=sites mode.
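For comparison, the HPROF agent's heap=dump mode is the one that produces a full binary heap dump that jhat and MAT can read. Assuming the hprof agent is available in your JDK (it was removed in Java 9), something along these lines should work, with MyApp and heap.hprof as placeholders:
java -agentlib:hprof=heap=dump,format=b,file=heap.hprof MyApp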

Related

Enormous hprof file parsing "Java heap space"

I have to parse an enormous hprof file (54 GB) and I'm out of RAM (given 30 GB). Is there a workaround?
Or maybe there is a way to process the hprof file via the terminal? I could run it on my production server if there is a way.
Edit: I forgot to mention I want to analyse it using Eclipse MAT.
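Since Eclipse MAT is the requirement here, one option worth trying (a suggestion, not something from the question) is MAT's headless command-line parser, which ships with the standalone MAT distribution; the script name can vary by version and platform:
./ParseHeapDump.sh /path/to/the.hprof
Run it on a machine with enough RAM (raise -Xmx in MemoryAnalyzer.ini first); it writes index files next to the dump, which the MAT UI can then open without re-parsing the whole 54 GB.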

How to analyse the heap dump using jmap in java

I am creating heap dump using below command:
jmap -dump:file=DumpFile.txt <process-id>
I have opened the generated file, DumpFile.txt, but it is not in a readable format.
So please let me know how to analyze the data in the generated file.
You should use jmap -heap:format=b <process-id> without any paths. That creates a *.bin file which you can open with jvisualvm.exe (in the same directory as jmap). It's a great tool for opening such dump files.
You can use jhat (Java Heap Analysis Tool) to read the generated file:
jhat [ options ] <heap-dump-file>
The jhat command parses a Java heap dump file and launches a web server. jhat enables you to browse heap dumps using your favorite web browser.
Note that you need output in the HPROF binary format to be able to parse it with jhat. You can use the format=b option to generate the dump in this format:
-dump:format=b,file=<filename>
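Putting it together, a typical pair of commands (the file name and process id are placeholders) looks like this:
jmap -dump:format=b,file=heap_dump.hprof <process-id>
jhat heap_dump.hprof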
Very late to answer this, but it is worth a quick look; just two minutes are needed to understand it in detail.
First, create this Java program:
import java.util.ArrayList;
import java.util.List;

public class GarbageCollectionAnalysisExample {
    public static void main(String[] args) {
        List<String> l = new ArrayList<String>();
        for (int i = 0; i < 100000000; i++) {
            l = new ArrayList<String>(); // keep allocating new lists so the heap churns with garbage
            System.out.println(l);
        }
        System.out.println("Done");
    }
}
Use jps to find the vmid (virtual machine id, i.e. the JVM process id).
Go to CMD and type the command below:
C:\>jps
18588 Jps
17252 GarbageCollectionAnalysisExample
16048
2084 Main
17252 is the vmid which we need.
Now we will learn how to use jmap and jhat
Use jmap - to generate heap dump
From the Java docs about jmap:
“jmap prints shared object memory maps or heap memory details of a given process or core file or a remote debug server”
Use the following command to generate a heap dump:
C:\>jmap -dump:file=E:\heapDump.jmap 17252
Dumping heap to E:\heapDump.jmap ...
Heap dump file created
Where 17252 is the vmid (picked from above).
Heap dump will be generated in E:\heapDump.jmap
Now use jhat
jhat is used for analyzing the heap dump:
C:\>jhat E:\heapDump.jmap
Reading from E:\heapDump.jmap...
Dump file created Mon Nov 07 23:59:19 IST 2016
Snapshot read, resolving...
Resolving 241865 objects...
Chasing references, expect 48 dots................................................
Eliminating duplicate references................................................
Snapshot resolved.
Started HTTP server on port 7000
Server is ready.
By default, it starts an HTTP server on port 7000.
Then go to http://localhost:7000/ in a browser.
Courtesy : JMAP, How to monitor and analyze the garbage collection in 10 ways
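As a side note (not part of the article above), jhat also offers a simple OQL (Object Query Language) console, reachable at http://localhost:7000/oql/ once the server is running. For example, a query like
select l from java.util.ArrayList l
lists the ArrayList instances in the dump, which is handier than the plain object listings for targeted digging.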
If you use Eclipse as your IDE, I would recommend the excellent Eclipse plugin Memory Analyzer (MAT).
Another option is to use JVisualVM, it can read (and create) heap dumps as well, and is shipped with every JDK. You can find it in the bin directory of your JDK.
VisualVM does not come with the Apple JDK. You can use the VisualVM Mac application bundle (dmg) as a separate application to compensate for that.
MAT, JProfiler and jhat are possible options. Since jhat comes with the JDK, you can easily launch it to do some basic analysis.
If you just run jmap -histo:live <pid> or jmap -histo <pid>, it prints the class histogram straight to the console!
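The histogram output looks roughly like this (the numbers below are invented for illustration):
 num     #instances         #bytes  class name
----------------------------------------------
   1:        123456       10485760  [C
   2:         98765        2370360  java.lang.String
i.e. per-class instance counts and shallow sizes, sorted by bytes.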

How to profile a class inside a jar?

If I have a class inside a JAR (compiled with mvn assembly:assembly) which I'm trying to profile, what's the command to get a valid core dump which I can use with jhat or the Eclipse Memory Analyzer?
I tried running this:
java -agentlib:hprof=heap=sites,cpu=samples,file=profile.hprof,format=b -jar the-jar.jar
and the core dump is created when I stop the process.
But neither jhat nor the Eclipse Memory Analyzer recognizes this as a valid dump.
jhat gives me this warning:
Resolving 0 objects...
WARNING: hprof file does not include java.lang.Class!
WARNING: hprof file does not include java.lang.String!
WARNING: hprof file does not include java.lang.ClassLoader!
Also reading through the hprof documentation, I see that I must pass the class name. How do I do that when it's inside the JAR?
The fact that the class was loaded from a JAR file is not relevant for the heap dump.
You can use jmap to get usable HPROF heap dumps without modifying the start command.
jmap -heap:format=b <pid>
where <pid> is the process id, which you can get with the jps command-line utility. Both executables are part of the JDK.
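For example (the pid and file name below are illustrative), and using the -dump spelling that newer JDKs expect:
jps -l
4242 the-jar.jar
jmap -dump:format=b,file=profile.hprof 4242
The resulting profile.hprof should then open in jhat or the Eclipse Memory Analyzer.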

Eclipse Memory Analyser always shows "An internal error occurred"

java.lang.OutOfMemoryError: Java heap space
Dumping heap to java_pid2584.hprof ...
Heap dump file created [106948719 bytes in 4.213 secs]
Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
at java.util.Arrays.copyOf(Arrays.java:2760)
at java.util.Arrays.copyOf(Arrays.java:2734)
at java.util.ArrayList.ensureCapacity(ArrayList.java:167)
at java.util.ArrayList.add(ArrayList.java:351)
at Main.main(Main.java:15)
But when I open the heap dump java_pid2584.hprof via Eclipse Memory Analyser, there is always this message:
An internal error occurred during:
"Parsing heap dump from '**\java_pid6564.hprof'". Java heap space
The problem is that Eclipse Memory Analyser does not have enough heap space to open the Heap dump file.
You can solve the problem as follows:
open the MemoryAnalyzer.ini file
change the default -Xmx1024m to a larger size
Note that on OS X, to increase the memory allocated to MAT, you need to right-click mat.app and show the package contents. The MemoryAnalyzer.ini file is under /Contents/Eclipse.
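For instance, after the edit the relevant part of MemoryAnalyzer.ini might look like this (4g is just an example; size it to your dump):
-vmargs
-Xmx4g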
Solution for same issue for Memory Analyzer plugin in Eclipse in MAC OS X El Capitan.
I was facing the same issue but with the eclipse plugin and I did not have any Memory Analyzer App in Applications Folder. The solution which worked for me was:
Right Click on Eclipse icon and select Show Package Content.
Go to Contents>Eclipse
Open Eclipse.ini
Change value -Xmx1024m to -Xmx2048m
Restart Eclipse
On OS X 10.11 (El Capitan), modifying MemoryAnalyzer.app/Contents/MacOS/MemoryAnalyzer.ini does not work! This is because it's looking for the MemoryAnalyzer.ini in a different place.
On my computer, it was looking for:
MemoryAnalyzer.app/Contents/Eclipse/MemoryAnalyzer.ini but the real .ini file was:
MemoryAnalyzer.app/Contents/MacOS/MemoryAnalyzer.ini.
In order for your changes to take effect, copy the existing .ini file into the new location.
To find where MemoryAnalyzer is looking for the ini file, you can run:
sudo su
cd ...MemoryAnalyzer.app/Contents/MacOS/
dtruss ./MemoryAnalyzer 2>&1 | grep ini
If Memory Analyser is used from Eclipse, then edit your eclipse.ini file to increase the vm argument to -Xmx1024m or higher. This worked for me.
http://wiki.eclipse.org/index.php/MemoryAnalyzer/FAQ#Out_of_Memory_Error_while_Running_the_Memory_Analyzer
As suggested by others, it's a simple two-step process:
Open the MemoryAnalyzer.ini file from your MAT installation directory.
Change the default -Xmx1024m to a larger size; for example, to analyze a 4 GB heap dump you can replace -Xmx1024m with -Xmx5g or -Xmx6g.
For more details refer to:
https://better-coding.com/solved-eclipse-mat-java-heap-space-error/
In my experience, set Xms and Xmx in MemoryAnalyzer.ini as high as your hardware allows. G1GC is faster, -XX:-UseGCOverheadLimit is needed because GC usage can be high and time-consuming, and -XX:+UseStringDeduplication may be the key to consuming less memory:
-vmargs
-Xms8g
-Xmx8g
-XX:-UseGCOverheadLimit
-XX:+UseG1GC
-XX:+UseStringDeduplication
If you are using a Mac, try running the executable inside the mat.app 'folder' with the -data option, which lets you specify a writable path:
cd mat.app/Contents/MacOS
./MemoryAnalyzer -data <writable_path>
I tried all the solutions here as well and was still getting the same error; the reason was that Eclipse was trying to open the .hprof file as a text file because of a wrong or unknown file type / editor association.
Solution: right-click the file, select Open With, then Other, and select Eclipse Memory Analyzer.
This worked with a 700 MB dump, and with a 2 GB dump on an Eclipse heap of about 600 MB.
An internal error has occurred. Java heap space
Answer: go to your project workspace,
open the .settings folder,
and delete all files in the .settings folder.
After that you can compile again,
and there is no longer a heap space error.
Enjoy :)
You may reduce your application's memory limit and then take a dump again. Eclipse Memory Analyser loads the dump file into memory, and I suspect that your Eclipse has less memory available than the limit of the application.
You can also do the opposite and increase the memory limit for Eclipse, but if your application runs on a server it will be hard to match its memory size.

Using HeapDumpOnOutOfMemoryError parameter for heap dump for JBoss

I was told I can add the -XX:+HeapDumpOnOutOfMemoryError parameter to my JVM start up options to my JBoss start up script to get a heap dump when we get an out of memory error in our application. I was wondering where this data gets dumped? Is it just to the console, or to some log file? If it's just to the console, what if I'm not logged into the Unix server through the console?
Here's what Oracle's documentation has to say:
By default the heap dump is created in
a file called java_pid.hprof in the
working directory of the VM, as in the
example above. You can specify an
alternative file name or directory
with the -XX:HeapDumpPath= option. For
example -XX:HeapDumpPath=/disk2/dumps
will cause the heap dump to be
generated in the /disk2/dumps
directory.
You can view this dump from the UNIX console.
The path for the heap dump is provided with the -XX:HeapDumpPath option, placed right after the flag mentioned above.
E.g.:
-XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=${DOMAIN_HOME}/logs/mps
You can view the dump from the console at the mentioned path.
I found it hard to decipher what is meant by "working directory of the VM". In my example, I was using the Java Service Wrapper program to execute a jar - the dump files were created in the directory where I had placed the wrapper program, e.g. c:\myapp\bin. The reason I discovered this is because the files can be quite large and they filled up the hard drive before I discovered their location.
If you are not using the -XX:HeapDumpPath option, then in the case of JBoss EAP/AS the heap dump file will by default be generated in the JBOSS_HOME/bin directory.
If you only configure the -XX:+HeapDumpOnOutOfMemoryError parameter, the heap dump will be generated in the JBOSS_HOME/bin directory for OpenJDK/Oracle JDK. If you are using the IBM JDK, the heap dump will be created under the /tmp directory as a .phd file. The -XX:HeapDumpPath option gives more flexibility for configuring a custom heap dump location (-XX:HeapDumpPath=/my-custom-jboss-server-path/). It is recommended to have these parameters configured in your environment, as they will collect a heap dump on OutOfMemoryError for analyzing a memory leak or checking for large object retention in the application.
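For example, on JBoss EAP 6+/WildFly these flags would typically go into bin/standalone.conf (the dump path below is only illustrative):
JAVA_OPTS="$JAVA_OPTS -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/var/dumps/jboss"
so that dumps land in a known location instead of JBOSS_HOME/bin.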
