I have a Windows crash dump for JavaService (it's basically a wrapper that exposes the JBoss app server as a Windows service).
I see this error when opening the dump file in WinDbg:
*** ERROR: Symbol file could not be found.
Where can I find the symbol file?
Please send your crash dump along with your wrapper.conf and wrapper.log files to Tanuki Software support at support#tanukisoftware.com. We would be happy to look into the problem for you.
Cheers,
Leif
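Separately, the WinDbg error itself usually just means no symbol path is configured. Pointing WinDbg at Microsoft's public symbol server recovers symbols for the OS modules (symbols for JavaService itself would have to come from wherever it was built); a minimal sketch, assuming a local cache directory of C:\symbols:

    $$ cache symbols in C:\symbols, fetching misses from the public server
    .sympath srv*C:\symbols*https://msdl.microsoft.com/download/symbols
    .reload /f
    !analyze -v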
So I was trying to set up a 1.17.1 Minecraft server on my Mac. I couldn't open my 1.17.1_server.jar with Java 8, so I downloaded Java 16.0.2.
Unfortunately, every time I opened the 1.17.1_server.jar file, I got:
"The Java JAR file "1.17.1_server.jar" could not be launched."
I first thought it was because the file was being launched by Java 8 instead of 16.
So I went into the terminal and ran: <path to java> -jar 1.17.1_server.jar
I then got this: Error: Unable to access jarfile 1.17.1_server.jar
Finally I tried putting the full path of the jar file in the command...
So I ran: <path to java> -jar <path to server>
and got this:
[main/ERROR]: Failed to load properties from file: server.properties
[15:57:35] [main/WARN]: Failed to load eula.txt
[15:57:35] [main/INFO]: You need to agree to the EULA in order to run the server. Go to eula.txt for more info.
So why do I have to agree to the EULA if I've never launched it? Does it think that it has already been launched?
As stated in the error message
[15:57:35] [main/INFO]: You need to agree to the EULA in order to run the server. Go to eula.txt for more info.
You have to open the eula.txt file and change eula=false to eula=true.
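On macOS that is a one-liner (editing the file by hand works just as well; the server writes eula.txt with eula=false on its first run):

    sed -i '' 's/eula=false/eula=true/' eula.txt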
Okay, found the solution: my eula.txt and server.properties files were in my user folder. Don't know why (I never moved them).
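That also explains the earlier errors: the server reads and writes server.properties and eula.txt in the current working directory, not next to the jar, so launching from the user folder created them there. A sketch of a launch that keeps everything together (the folder name is just an example):

    cd ~/minecraft-server              # the folder containing 1.17.1_server.jar
    <path to java> -jar 1.17.1_server.jar nogui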
I'm trying to automate the build process of my Scala application to produce an MSI installer.
When doing so, Windows claims that certain files cannot be found. Here is the command that gets run:
[21:39:01.842] Running C:\Program Files (x86)\WiX Toolset v3.11\bin\candle.exe in C:\Users\RUNNER~1\AppData\Local\Temp\jdk.jpackage6105194214114584024\images\win-msi.image\krystalbull
and further down, I start seeing errors like this:
C:\Users\runneradmin\AppData\Local\Temp\jdk.jpackage6105194214114584024\config\bundle.wxf(56) : error LGHT0103 : The system cannot find the file 'C:\Users\RUNNER~1\AppData\Local\Temp\jdk.jpackage6105194214114584024\images\win-msi.image\krystalbull\app\classes\com\krystal\bull\gui\FileMenu$anon$2.class'.
C:\Users\runneradmin\AppData\Local\Temp\jdk.jpackage6105194214114584024\config\bundle.wxf(71) : error LGHT0103 : The system cannot find the file 'C:\Users\RUNNER~1\AppData\Local\Temp\jdk.jpackage6105194214114584024\images\win-msi.image\krystalbull\app\classes\com\krystal\bull\gui\dialog\ViewEventDialog$anon$10$anon$21$anonfun$lessinit$greater$1.class'.
C:\Users\runneradmin\AppData\Local\Temp\jdk.jpackage6105194214114584024\config\bundle.wxf(81) : error LGHT0103 : The system cannot find the file 'C:\Users\RUNNER~1\AppData\Local\Temp\jdk.jpackage6105194214114584024\images\win-msi.image\krystalbull\app\classes\com\krystal\bull\gui\SettingsMenu$anon$5.class'.
You can see more instances of this error in the build log here:
https://github.com/bitcoin-s/krystal-bull/runs/2697961906?check_suite_focus=true#step:8:11
My questions are:
What are the limitations in the Windows file system, or in the program candle.exe, that prevent the pre-processor from reading Scala class files that include things like $anon$ in their names?
Is there a valid syntax guide somewhere?
Will I need to manually rename all of these files?
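One plausible explanation (an assumption on my part, not something the log confirms): NTFS itself allows $ in file names, but in WiX v3 sources $$ is the preprocessor escape for a literal $, and Scala emits anonymous classes with a double dollar, e.g. FileMenu$$anon$2.class. Every failing path in the log shows a single $ where scalac would have written $$, which is exactly what a collapsed escape would look like. A quick way to check (run from the win-msi.image\krystalbull directory; paths as in the log):

    :: list what scalac actually put on disk; expect a double $$ before "anon"
    dir /b app\classes\com\krystal\bull\gui | findstr anon

If the disk shows FileMenu$$anon$2.class while bundle.wxf references FileMenu$anon$2.class, the preprocessor collapsed the escape; the fix would then belong in the generated .wxf (doubling the $ signs) rather than in renaming the class files, since the class names are what the JVM loads.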
We have recently upgraded DataStage from 9.1 to 11.7 on an AIX 7.1 server,
and I'm trying to use the new "File Connector" stage to write to a Parquet file. I created a simple job that takes Teradata as a source and writes to the Parquet file as a target.
[Image of the job]
But I am facing the error below:
File_Connector_20,0: java.lang.NoClassDefFoundError: org.apache.hadoop.fs.FileSystem
at java.lang.J9VMInternals.prepareClassImpl(J9VMInternals.java)
at java.lang.J9VMInternals.prepare(J9VMInternals.java:304)
at java.lang.Class.getConstructor(Class.java:594)
at com.ibm.iis.jis.utilities.dochandler.impl.OutputBuilder.<init>(OutputBuilder.java:80)
at com.ibm.iis.jis.utilities.dochandler.impl.Registrar.getBuilder(Registrar.java:340)
at com.ibm.iis.jis.utilities.dochandler.impl.Registrar.getBuilder(Registrar.java:302)
at com.ibm.iis.cc.filesystem.FileSystem.getBuilder(FileSystem.java:2586)
at com.ibm.iis.cc.filesystem.FileSystem.writeFile(FileSystem.java:1063)
at com.ibm.iis.cc.filesystem.FileSystem.process(FileSystem.java:935)
at com.ibm.is.cc.javastage.connector.CC_JavaAdapter.run(CC_JavaAdapter.java:444)
I followed the steps in the link below:
https://www.ibm.com/support/knowledgecenter/SSZJPZ_11.7.0/com.ibm.swg.im.iis.conn.s3.usage.doc/topics/amaze_file_formats.html
1- I uploaded the jar files into "/ds9/IBM/InformationServer/Server/DSComponents/jars".
2- I added them to the CLASSPATH in agent.sh, then restarted DataStage.
3- I set the environment variable CC_USE_LATEST_FILECC_JARS to the value parquet-1.9.0.jar:orc-2.1.jar.
I also tried adding the CLASSPATH as an environment variable in the job, but that did not work.
Note that I'm using Local as the File System mode.
Any hint is appreciated, as I have been searching for quite a while.
Thanks in advance,
Which File System mode are you using? If you are using Native HDFS as the File System mode, then you will need to configure the CLASSPATH to include some third-party jars.
Perhaps these links will provide you with some guidance.
https://www.ibm.com/support/pages/node/301847
https://www.ibm.com/support/pages/steps-required-configure-file-connector-use-parquet-or-orc-file-format
Note: based on the Hadoop distribution and version you are using, the versions of the jars could differ.
If the above information does not help in resolving the issue, then you may have to reach out to IBM Support to get this addressed.
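For what it's worth, the NoClassDefFoundError in the job log is for org.apache.hadoop.fs.FileSystem, which ships in hadoop-common, so that jar (plus its dependencies) has to be on the CLASSPATH the connector sees. A minimal sketch of the agent.sh change, assuming the jars were copied to the DSComponents/jars directory mentioned above (jar names and versions are examples only; match them to your Hadoop distribution):

    # in ASBNode/bin/agent.sh -- example jars only
    CLASSPATH=$CLASSPATH:/ds9/IBM/InformationServer/Server/DSComponents/jars/hadoop-common-2.7.3.jar
    CLASSPATH=$CLASSPATH:/ds9/IBM/InformationServer/Server/DSComponents/jars/hadoop-auth-2.7.3.jar
    export CLASSPATH
    # restart the ASB agent (e.g. NodeAgents.sh stopagent / startagent) so the change takes effect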
To use the File Connector, there is no need to add a CLASSPATH in agent.sh unless you want to import HDFS files through IMAM.
If your requirement is reading Parquet files, then set
$CC_USE_LATEST_FILECC_JARS=parquet-1.9.0.jar
$FILECC_PARQUET_AVRO_COMPAT_MODE=TRUE
If you are still seeing the issue, then run the job with $CC_MSG_LEVEL=2 and open an IBM support case, attaching the job design, the FULL job log, and the Version.xml file from the Engine tier.
I want to look into the jitdump.20160505.165247.149.0004.dmp file,
which was generated by IBM JVM 1.8 when it crashed. Does anyone know how to read the dmp file?
I tried to use jextract to analyze it, but it complains as follows:
/opt/ibm/ibm-java-x86_64-80/jre/bin/jextract /tmp/jitdump.20160505.165247.149.0004.dmp -v
Loading dump file...
Error. Dump type not recognised, file: /tmp/jitdump.20160505.165247.149.0004.dmp
When trying to open the jitdump file via MAT + DTFJ, here is the error message:
Error opening heap dump 'jitdump.20160505.165247.149.0004.dmp'. Check the error log for further details.
Unable to read dump C:\Users\IBM_ADMIN\Desktop\core_files\opt\ibm\apm\playback\selenium\jitdump.20160505.165247.149.0004.dmp metafile null in DTFJ format DTFJ-J9 (java.io.IOException)
Unable to read dump C:\Users\IBM_ADMIN\Desktop\core_files\opt\ibm\apm\playback\selenium\jitdump.20160505.165247.149.0004.dmp metafile null in DTFJ format DTFJ-J9
No Image sources were found for C:\Users\IBM_ADMIN\Desktop\core_files\opt\ibm\apm\playback\selenium\jitdump.20160505.165247.149.0004.dmp (java.io.IOException)
No Image sources were found for C:\Users\IBM_ADMIN\Desktop\core_files\opt\ibm\apm\playback\selenium\jitdump.20160505.165247.149.0004.dmp
The file is intended for IBM internal analysis only; the contents of JIT dump files are not useful to anyone without an in-depth understanding of the IBM JDK's JIT compiler internals. The existence of JIT dump file(s) does not imply that a JIT problem was encountered: the file is generated to collect data during a JVM crash so that, if the crash is determined to be a JIT problem, IBM stands a better chance of fixing it without having to ask for more data by recreating the problem several times.
You can try Eclipse MAT with the IBM DTFJ plugin.
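Note, though, that jextract and jdmpview operate on system core dumps, not on jitdump files, which is why both jextract and MAT + DTFJ reject this one. If a core dump was produced alongside the jitdump, that is the file to open (the core file name below is a hypothetical example):

    # package the system core dump with jextract, then browse the result with jdmpview
    /opt/ibm/ibm-java-x86_64-80/jre/bin/jextract /tmp/core.20160505.165247.149.0002.dmp
    /opt/ibm/ibm-java-x86_64-80/jre/bin/jdmpview -zip /tmp/core.20160505.165247.149.0002.dmp.zip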
I have been using a home-grown test tool, and after updating to CentOS 6.4 I am no longer able to run this Tcl-based tool. I am getting the following error, and I have no internet access on this server. Kindly advise: how do I solve this problem?
Thanks
"XpUtils::iload -d /usr/local/testtool/repo/package/linux-glibc2.3-x86_64/lib/tcljava1.4.1 tclblend" failed:
couldn't load file "/usr/local/testtool/repo/package/linux-glibc2.3-x86_64/lib/tcljava1.4.1/libtclblend.so": libjava.so: cannot open shared object file: No such file or directory
while executing
"error "\"XpUtils::iload -d $dir tclblend\" failed:\n $errMsg""
(procedure "loadtclblend" line 168)
invoked from within
"loadtclblend /usr/local/testtool/repo/package/linux-glibc2.3-x86_64/lib/tcljava1.4.1"
("package ifneeded java 1.4.1" script)
invoked from within
"package require java"
("eval" body line 1)
invoked from within
"eval package require $pkg"
("foreach" body line 2)
invoked from within
"foreach pkg $pkgList {
set ::${pkg}Version [eval package require $pkg]
}"
(file "/usr/local/testtool/testtool" line 165)
If you read the error message trace, you'll see that this is all caused by:
libjava.so: cannot open shared object file: No such file or directory
The first steps would then be to ensure that you actually have a version of Java installed, to check that it includes the file libjava.so, and to check that the file has been indexed by the system shared library catalog.
It might also be worth checking that all of its dependencies are present, and that the architectures of the Tcl library and the Java library match (e.g., both 32-bit), as mismatches can cause odd failures.
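A few quick checks along those lines (the JVM paths are examples; adjust them to wherever your Java is installed):

    # is there a libjava.so on the box at all?
    find /usr/lib/jvm /usr/java -name libjava.so 2>/dev/null

    # does the dynamic linker cache know about it?
    ldconfig -p | grep libjava

    # do the word sizes match? both files should report the same architecture
    file /usr/local/testtool/repo/package/linux-glibc2.3-x86_64/lib/tcljava1.4.1/libtclblend.so
    file /usr/lib/jvm/java/jre/lib/amd64/libjava.so

    # if the library exists but is not on the loader's search path:
    export LD_LIBRARY_PATH=/usr/lib/jvm/java/jre/lib/amd64:$LD_LIBRARY_PATH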