How can I read IBM jit dump file - java

I want to look into the jitdump.20160505.165247.149.0004.dmp file.
It was generated by the IBM JVM 1.8 when it crashed. Does anyone know how to read the dmp file?
I tried to use jextract to analyze it, but it complains as follows:
/opt/ibm/ibm-java-x86_64-80/jre/bin/jextract /tmp/jitdump.20160505.165247.149.0004.dmp -v
Loading dump file...
Error. Dump type not recognised, file: /tmp/jitdump.20160505.165247.149.0004.dmp
When trying to open the jitdump file via MAT+DTFJ, here is the error message:
Error opening heap dump 'jitdump.20160505.165247.149.0004.dmp'. Check the error log for further details.
Unable to read dump C:\Users\IBM_ADMIN\Desktop\core_files\opt\ibm\apm\playback\selenium\jitdump.20160505.165247.149.0004.dmp metafile null in DTFJ format DTFJ-J9 (java.io.IOException)
Unable to read dump C:\Users\IBM_ADMIN\Desktop\core_files\opt\ibm\apm\playback\selenium\jitdump.20160505.165247.149.0004.dmp metafile null in DTFJ format DTFJ-J9
No Image sources were found for C:\Users\IBM_ADMIN\Desktop\core_files\opt\ibm\apm\playback\selenium\jitdump.20160505.165247.149.0004.dmp (java.io.IOException)
No Image sources were found for C:\Users\IBM_ADMIN\Desktop\core_files\opt\ibm\apm\playback\selenium\jitdump.20160505.165247.149.0004.dmp

The file is intended for IBM internal analysis only; the contents of JIT dump files are not useful to anyone without an in-depth understanding of the IBM JDK's JIT compiler internals. The existence of JIT dump files does not imply that a JIT problem was encountered: the file is generated to collect data during a JVM crash so that, if the crash is later determined to be a JIT problem, IBM stands a better chance of fixing it without having to ask for more data by recreating the problem several times.

You can try Eclipse MAT with the IBM DTFJ plugin.
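For what it's worth, jextract and DTFJ are aimed at system core dumps rather than JIT dumps. A hedged sketch, assuming the same crash also produced a system core file (the core file name below is hypothetical):

/opt/ibm/ibm-java-x86_64-80/jre/bin/jextract /tmp/core.20160505.165247.149.0002.dmp

This should produce a .zip archive next to the core file, and that archive (unlike the jitdump file) is something Eclipse MAT with the DTFJ plugin can open.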

Related

How to extract JVM heap dump from core dump file?

I am trying to convert a Linux core dump of a Java process into a heap dump file suitable for analysing with Eclipse MAT. Following this blog post, adapted to the newer OpenJDK 12, I create a core dump and then run jhsdb jmap to convert the dump to HPROF format:
>sudo gcore -o dump 24934
[New LWP 24971]
...
[New LWP 17921]
warning: Could not load shared library symbols for /tmp/jffi4106753050390578111.so.
Do you need "set solib-search-path" or "set sysroot"?
[Thread debugging using libthread_db enabled]
Using host libthread_db library "/lib/x86_64-linux-gnu/libthread_db.so.1".
0x00007f94c7e9e98d in pthread_join (threadid=140276994615040, thread_return=0x7ffc716d47a8) at pthread_join.c:90
90 pthread_join.c: No such file or directory.
warning: target file /proc/24934/cmdline contained unexpected null characters
warning: Memory read failed for corefile section, 1048576 bytes at 0x7f93756a6000.
warning: Memory read failed for corefile section, 1048576 bytes at 0x7f9379bec000.
...
warning: Memory read failed for corefile section, 1048576 bytes at 0x7f94c82dd000.
Saved corefile dump.24934
> ls -sh dump.24934
22G dump.24934
> /usr/lib/jvm/zulu-12-amd64/bin/jhsdb jmap --exe /usr/lib/jvm/zulu-12-amd64/bin/java --core dump.24934 --binaryheap --dumpfile jmap-dump.24934
Attaching to core dump.24934 from executable /usr/lib/jvm/zulu-12-amd64/bin/java, please wait...
Debugger attached successfully.
Server compiler detected.
JVM version is 12.0.1+12
null
> ls -sh jmap-dump.24934
3.3M jmap-dump.24934
The core dump file is 22 GB, while the heap dump file is just 3.3 MB, so it is likely that the jhsdb jmap command fails to process the whole core dump. Eclipse MAT also fails to open the heap dump file, with the following message: The HPROF parser encountered a violation of the HPROF specification that it could not safely handle. This could be due to file truncation or a bug in the JVM.
Alex,
There are two possibilities for this.
First, gcore is a convenience script that ships with gdb. The warnings show that it had trouble loading a shared library, so gdb may have produced a broken core file in the first place. Try loading the core file in gdb and see whether it can parse it.
Second, jhsdb parses the core file on its own. You can set the environment variable LIBSAPROC_DEBUG=1 to get its traces, which will help you see what goes wrong during parsing.
Why not dump the Java heap with jmap -dump directly? That skips the core dump file entirely.
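For example, a hedged sketch reusing the paths and PID from the question (the second command assumes process 24934 is still running):

LIBSAPROC_DEBUG=1 /usr/lib/jvm/zulu-12-amd64/bin/jhsdb jmap --exe /usr/lib/jvm/zulu-12-amd64/bin/java --core dump.24934 --binaryheap --dumpfile jmap-dump.24934
/usr/lib/jvm/zulu-12-amd64/bin/jmap -dump:live,format=b,file=heap-live.hprof 24934

The first command prints the serviceability agent's trace while it parses the core file; the second bypasses the core file and writes an HPROF heap dump of the live process directly.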

Java: Using the JVM argument -XX:ErrorFile to append the logs to an existing log file without the PID

I have the following configuration for my service:
exec java -Djava.io.tmpdir=$tmpdir -Djava.library.path="Some_Path"
-Xmx"$heapsize"m -XX:+UseConcMarkSweepGC -XX:OnOutOfMemoryError="Do something, may be restart"
-XX:ErrorFile=/var/log/service/myService/"myServiceCrash".log -jar .jar
I am not able to append the crash logs to the same file; a new file with a new PID is created every time.
Requirement: dump crash logs into the same file.
This is expected behavior. The first time, the JVM writes to the file provided in -XX:ErrorFile=; once the file exists it won't be overwritten, and you will then get the default error file instead.
Ideally there would be some way to show that the file creation failed, but that can't be done as part of the error-handling code.
Please check the evaluation here - https://bugs.openjdk.java.net/browse/JDK-8189672
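As a hedged aside: HotSpot expands %p in -XX:ErrorFile to the PID, so while you cannot append every crash to one fixed file, you can at least keep the crash logs under a predictable name per process, for example:

-XX:ErrorFile=/var/log/service/myService/myServiceCrash.%p.log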

Why do I get: Unable to execute dex: Cannot merge new index 67124 into a non-jumbo instruction

This is as intelligently as I can ask this question right now. I am trying to compile my code and get the following error:
Unable to execute dex: Cannot merge new index 67124 into a non-jumbo instruction!
Conversion to Dalvik format failed: Unable to execute dex: Cannot merge new index 67124 into a non-jumbo instruction!
It started happening after I integrated the PayPal SDK for payments. I see the error on my first attempt to test.
I already tried:
dex.force.jumbo=true
I got the same error and was able to build after several steps. Here they are. Basically, your project is too big for Eclipse/Android to handle, so you have to A) increase the memory sizes and B) reduce the number of methods you have.
Increase Heap Size in eclipse.ini file.
How to find eclipse.ini file.
How do I increase the size of heap by editing eclipse.ini
I used the following values:
-Xms2048m
-Xmx2048m
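Note that these flags only take effect when they appear below the -vmargs line in eclipse.ini; a minimal sketch (the launcher entries above -vmargs are illustrative and differ per installation):

-startup
plugins/org.eclipse.equinox.launcher_<version>.jar
-vmargs
-Xms2048m
-Xmx2048m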
That didn't really solve the problem so I followed the instructions in this link. Specifically:
Add the following
dex.force.jumbo=true
in the first line of project.properties
I then got this error: Unable to execute dex: method ID not in [0, 0xffff]: 65536
That led me to another Stack Overflow answer here. I ended up deleting some .jars I didn't use anymore, to reduce the number of methods my app uses.
I hope that helps!
I had a similar error while compiling one of my Android projects in Eclipse.
I played around with all the settings, and finally changing the Eclipse heap memory in the eclipse.ini file succeeded.
Try to give Eclipse as much memory as possible if you have enough RAM. My settings in the eclipse.ini file are as follows:
-Xms3g
-Xmx3g
-XX:MaxPermSize=3g
Let me know whether this works for you!

Where can I obtain the symbol table for JavaService?

I have a crash dump (Windows) for JavaService (it's basically a wrapper to expose JBoss app server as a service).
I see this error on opening the dump file using windbg :
*** ERROR: Symbol file could not be found.
Where can I find the symbol table file?
Please send your crash dump along with your wrapper.conf and wrapper.log files to Tanuki Software support at support#tanukisoftware.com. We would be happy to look into the problem for you.
Cheers,
Leif
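As an interim sketch (the local cache path C:\symbols is illustrative), you can point WinDbg at the public Microsoft symbol server so that at least the Windows system frames resolve; the .pdb symbols for JavaService itself are unlikely to be on that server and would have to come from whoever built your JavaService binary:

.sympath srv*C:\symbols*https://msdl.microsoft.com/download/symbols
.reload /f
!analyze -v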

eclipse fails to "build workspace" on large android files...?

I have a severe problem with Eclipse, where I need to compile a somewhat larger class. By "larger" I mean the class has about 5000 lines of code...
The problem is that on saving this project, Eclipse takes 30-40 seconds to "build the workspace". To be exact, it says "50%" and stays there for 30-40 seconds. Then it breaks with the following error:
[console]:
[2010-07-09 15:28:39 - Dex Loader] Unable to execute dex: null
[2010-07-09 15:28:39 - myProject] Conversion to Dalvik format failed: Unable to execute dex: null
[problems window]:
Conversion to Dalvik format failed: Unable to execute dex: null
This error is reproducible and keeps popping up until I comment out several thousand LOC, so that the file still has those 5k LOC, but ~2000 of them are comments. THEN it works...
I know that 5000 LOC is not really good programming style, but I need to do it this way for now. (I have to write this many records to a SQLite database, and since SQLite doesn't support multiple SQL queries in one rawQuery() call, I have to execute a single rawQuery() for each and every data record I need to push into the db. Until I write a file reader to read this data from a file (which needs verification etc.), I'm stuck with this solution.)
How do I get Eclipse and the Android SDK to accept files this big?
(System: Ubuntu 10.04 x86, Eclipse 3.6)
I believe the size of a class's bytecode must not exceed 64k in Dalvik. You'll have to split the class into smaller ones.
