Background:
I am currently working in a Linux Virtual Machine on some Java plugins that I install into Eclipse by placing them into the /opt/eclipse/dropins folder. My plugins need to support both CentOS 6 and CentOS 7 VMs (shouldn't be a big deal since they are written in Java and both flavors of CentOS have Java 1.8.0 installed). My plugins build and install just fine on both OSs. I see them in Eclipse and can interact with them as expected.
The VMs I need to support can either be local to my machine (opened with VMWare Player/Workstation) or hosted on a cloud server. We use Windows Remote Desktop to get into the cloud VMs through xrdp on the Linux server.
Problem:
One of my plugins needs libnddsjava.so (loaded by the JVM under the name nddsjava) from /opt/rti_connext_dds-5.2.3/lib/x64Linux3gcc4.8.2.
On both local VMs (CentOS 6 and 7), I can just set LD_LIBRARY_PATH in an /etc/profile.d script so that any user who logs in gets the path to the required C++ library.
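A minimal sketch of such a profile.d script (the filename is illustrative; the library path is the one from the question):

```shell
# /etc/profile.d/rti_connext.sh -- filename is illustrative.
# Prepend the RTI Connext native library directory so that every login
# shell (and anything it spawns) can resolve libnddsjava.so and its
# dependencies such as libnddsc.so.
RTI_LIB_DIR=/opt/rti_connext_dds-5.2.3/lib/x64Linux3gcc4.8.2
export LD_LIBRARY_PATH="$RTI_LIB_DIR${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
```

Because the export runs at login, every process spawned from that session, including Eclipse and its JVM, inherits the variable.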
On CentOS 7 cloud VMs, however, when this plugin is invoked, I get java.lang.UnsatisfiedLinkError: no nddsjava in java.library.path. This happens only when opening Eclipse via the Linux menus. If I open a terminal and start Eclipse from there, the plugin can find the C++ library (because my LD_LIBRARY_PATH is set in my /bin/bash terminal). I did a little digging and found out that running chmod g-s /usr/bin/ssh-agent fixes the issue when opening Eclipse from the Linux menus (yes I understand that the chmod opens a security vulnerability. I am willing to look past this).
On CentOS 6 cloud VMs, I have never gotten the plugin to find the C++ library. LD_LIBRARY_PATH seems to get wiped when signing in through xrdp and for whatever reason, CentOS 6 appears to be explicitly not sourcing any of the profile scripts in the main top-level gnome-session process, which means other processes spawned from the GUI will not have LD_LIBRARY_PATH either.
I have also tried adding -Djava.library.path to my eclipse.ini file with no luck. java.library.path only governs where the JVM itself looks for the first library, so it fails on the next C++ library needed, even though it lives in the same directory: java.lang.UnsatisfiedLinkError: /opt/rti_connext_dds-5.2.3/lib/x64Linux3gcc4.8.2/libnddsjava.so: libnddsc.so: cannot open shared object file: No such file or directory
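One workaround worth noting: the dependent libnddsc.so is resolved by the OS dynamic linker, not by java.library.path, and the linker consults LD_LIBRARY_PATH (or an rpath). A sketch of a launcher wrapper that sets the variable regardless of how the desktop session was started (all paths are assumptions taken from the question; point menu shortcuts at this script instead of the Eclipse binary):

```shell
#!/bin/sh
# Hypothetical wrapper for /opt/eclipse/eclipse: export LD_LIBRARY_PATH
# in the launcher's own environment so the dynamic linker can resolve
# libnddsc.so and friends, then hand off to the real Eclipse binary.
export LD_LIBRARY_PATH="/opt/rti_connext_dds-5.2.3/lib/x64Linux3gcc4.8.2${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
ECLIPSE=/opt/eclipse/eclipse
if [ -x "$ECLIPSE" ]; then
    exec "$ECLIPSE" "$@"
fi
```

Since the variable is set inside the wrapper, it does not depend on profile.d scripts being sourced by the xrdp/gnome-session startup chain.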
Question:
Is there a single place that I can set my LD_LIBRARY_PATH for all flavors of Linux I am attempting to support (local CentOS 6, xrdp CentOS 6, local CentOS 7, and xrdp CentOS 7)?
Note: When I say cloud VM below, I mean a server-hosted VM that I Windows-Remote-Desktop into via xrdp.
This answer from Server Fault worked for me on the CentOS 6 cloud VM. The CentOS 7 cloud VM did not have the startwm.sh script in /etc/xrdp. Instead, it lived in /usr/libexec/xrdp and did not contain the xinitrc lines (referenced in the Server Fault link) that caused the issue on the CentOS 6 cloud VM. Something else must be wiping the LD_LIBRARY_PATH variable on the CentOS 7 cloud VM, so I still needed to run chmod g-s /usr/bin/ssh-agent to allow my /etc/profile.d script to be invoked on startup.
I'm trying to install the SAP JCo3 libraries on my work server, which runs Windows Server 2003. The libraries worked fine on my localhost, which runs Windows 7. However, when I performed the same installation procedure on the work server and tried to use the libraries, I got this error:
> java.lang.ExceptionInInitializerError: Error getting the version of
> the native layer:
> java.lang.UnsatisfiedLinkError:***********\sapjco3.dll: This
> application has failed to start because the application configuration
> is incorrect. Reinstalling the application may fix this problem
Can anyone help me with this issue? Thanks in advance.
The issue is caused by the Visual C++ 2005 runtime redistributable. The version required by sapjco3.dll is at least 8.0.50727.4053, but the version on my work server was much lower, which led to this issue. I asked my system admin to update it, and now it works.
The sapjco3 jar depends on the sapjco3.dll native library.
Your local workstation must have that .dll somewhere where java can get at it.
I'd see where the dll is located on your local workstation, and figure out how it's being referenced, then see if you can replicate that on the server.
There are different versions of the .dll for 32-bit and 64-bit windows, so it's possible that you may need a different version on the server than you need on your local workstation.
I have installed Hadoop in pseudo-distributed mode on a VMware VM running SUSE Linux Enterprise Server 11. I am able to run the hello-world examples like word count. I also used WinSCP to connect to that VM and uploaded several XML files into the Hadoop cluster.
My question now is how to configure the Eclipse installation on my local Windows 7 machine to connect to that VM, so I can write Java code (mapper and reducer classes) that works with the data I dumped into the cluster and saves the results back. I have done some work and can get the Map/Reduce perspective in Eclipse, but I cannot figure out how to connect to Hadoop on the VM from my local machine.
If someone can help me with this that will be great. Thanks in advance.
Let me know if more information is needed.
I am using hadoop-0.20.2-cdh3u5 and Eclipse Europa 3.3.1.
I am struggling with this as well at the moment. Maybe you will find these links helpful:
http://www.bigdatafarm.com/?p=4
http://developer.yahoo.com/hadoop/tutorial/module3.html
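For what it's worth, the Eclipse Map/Reduce plugin's "Define Hadoop location" dialog ultimately just has to match the cluster's own configuration; a sketch, where the hostname and the CDH3 default ports (8020/8021) are assumptions for your setup:

```xml
<!-- core-site.xml on the VM; the plugin's "DFS Master" host/port must match -->
<property>
  <name>fs.default.name</name>
  <value>hdfs://your-vm-hostname:8020</value>
</property>

<!-- mapred-site.xml on the VM; the plugin's "Map/Reduce Master" must match -->
<property>
  <name>mapred.job.tracker</name>
  <value>your-vm-hostname:8021</value>
</property>
```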
Cheers
I have already spent a long time loading and testing my application, and now I need to profile it. But unfortunately, VisualVM always says "not supported for this JVM" for my local applications.
The applications were started with the same JVM as VisualVM.
I found out that (at least under Windows) one can easily write small batch files to run VisualVM in combination with specific JVMs, which is important for me, since I have the 32-bit JDK installed alongside the 64-bit JDK (I need both, so this is sensible for me).
I have created two batch files in the folder "S:\applications\visualvm\bin\":
run_32.bat:
@echo off
START "VisualVM 32" visualvm.exe --jdkhome "C:\Program Files (x86)\Java\jdk1.7.0_07"
run_64.bat:
@echo off
START "VisualVM 64" visualvm.exe --jdkhome "C:\Program Files\Java\jdk1.7.0_07"
Obviously, all paths may differ on your system, but the general idea should still work (on all 64-bit versions of Windows). The benefit is that I can use the 32-bit batch file when I want to use VisualVM with Java applications that run on the 32-bit JVM, and likewise for 64-bit.
The only purpose of the "start" command is to let the batch file launch the application without waiting for it to finish, so the command-prompt window closes immediately. This is not a feature of VisualVM, but of the Windows batch interpreter.
In my case, even with the JVMs matching (both 64-Bit), the only way to get things working was sending the argument -Dcom.sun.management.jmxremote to the JVM to be monitored. That also works if you are having problems to connect via Java Mission Control (JMC).
According to JMX's documentation, this what the argument does:
Setting this property registered the Java VM platform's MBeans and published the Remote Method Invocation (RMI) connector via a private interface to allow JMX client applications to monitor a local Java platform, that is, a Java VM running on the same machine as the JMX client.
This was supposed to be enabled automatically, but for some reason it wasn't on my Linux.
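As a sketch, assuming a plain java launch (the jar name is hypothetical; for Tomcat the flag would go into CATALINA_OPTS instead):

```
java -Dcom.sun.management.jmxremote -jar your-app.jar
```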
VisualVM needs to be run with the same JVM - at least Java 6 with the same 32-bit/64-bit size - as the program to be profiled. (You also need to be the same user, but then this message does not apply).
I would triple-check that it is the exact same JVM in your situation.
On Linux:
Make sure that your /etc/hosts correctly maps your hostname to the machine's actual IP address.
It appears that a discrepancy here totally confuses the poor jvisualvm and its programmers.
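A quick way to check (standard Linux tools; nothing assumed beyond getent being available):

```shell
# Show what the hostname currently resolves to; if this does not match
# the machine's real interface address, fix the entry in /etc/hosts.
hostname
getent hosts "$(hostname)" || echo "hostname does not resolve -- add it to /etc/hosts"
```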
I got the same problem with a local Tomcat and searched Stack Overflow for solutions. After some serious debugging, I figured out that VisualGC didn't have permission to get GC information via the tools.jar file.
Following these links:
http://docs.oracle.com/javase/7/docs/technotes/tools/share/jstatd.html#SECURITY
https://stackoverflow.com/a/42107355/3876619
I took the following steps to solve the issue:
1) Create a permission file
vim /tmp/tools.policy
Add
grant codebase "file:${java.home}/../lib/tools.jar" {
permission java.security.AllPermission;
};
save it
2) Now add /tmp/tools.policy to the JVM startup parameters:
-Djava.security.policy=/tmp/tools.policy
(for jstatd, that is jstatd -J-Djava.security.policy=/tmp/tools.policy)
3) Run jvisualvm with sudo.
An issue that I just found, thanks to the hint from @user3356656, is that if you start the program while your machine is on one IP address and then try to connect while it is on a different IP address, it will fail.
I also hit this issue. In my case, on Linux, I started Tomcat as tomcat_user but ran jvisualvm as root. It worked after I started Tomcat as root.
I was having the problem of VisualVM not detecting my local Tomcat installation on Windows 7. I could connect manually, but then things like memory snapshots and the VisualGC plugin were not enabled. I confirmed that I was using the same JVM version, temp-file permissions, etc. That didn't work. Then I found that starting VisualVM first, and then Tomcat, solved the problem.
As you can see, you are running VisualVM on a 32-bit JVM.
You don't need to uninstall the 32-bit JVM; just tell VisualVM to use your 64-bit JVM.
If you want to change this permanently, edit visualvm_13\etc\visualvm.conf and specify the path of the JVM there.
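For reference, the relevant setting in visualvm.conf is visualvm_jdkhome; the JDK path below is an assumption for your system:

```
# visualvm_13\etc\visualvm.conf
visualvm_jdkhome="C:\Program Files\Java\jdk1.7.0_07"
```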
My problem was JVM optimizations: the -XX:+PerfDisableSharedMem flag breaks VisualGC. This is apparent if jps does not show your app in its listing.
I can reproduce the following behaviour.
I have a Java application with a right-click menu item to open jvisualvm. I run this Java application as a standalone setup from a BAT file, which means I modify %PATH% and other needed environment variables, such as the JDK location, to form my environment.
The BAT that starts the application is marked to run as non-admin. The environment points to a 64-bit JDK.
Then I start another Java application as admin. Its VM comes from the same 64-bit JDK.
Then I start jvisualvm from the first app via the right-click menu, i.e. as non-admin. I can see the app in jvisualvm's 'Applications' list, but clicking 'System properties' gives an error: "Not supported for this JVM". The JVM arguments are shown, however.
The solution is as in some of the previous comments: when I start my right-click jvisualvm starter as admin, I can also see "system properties".
Certainly, if one JDK were 32-bit and the other 64-bit, it would not work; been there. I thought this needed to be added here.
In my case, the application was running with admin permissions, so VisualVM also needs to run as admin.
I changed my Windows user name to all lowercase, restarted my PC, and it all works now.
For me, the reason was that I ran jstatd as a different user than the JVM process. I use a special user on Linux to start the JVM process (it is a Tomcat), but I started the jstatd process as root. If you run jps as root, you cannot see any information about JVM processes belonging to other users. That was the trouble.
I killed the jstatd process started by root, su'd to the owner of the JVM process, restarted jstatd, and everything works fine now.
We have a Java process which we run as a Windows service (using srvany). It runs with Java 1.6 (1.6.0.23 at the moment).
In the past (Windows XP), I could connect JConsole to the processes, on Windows 7 I cannot do this anymore.
If I run jconsole <pid> I get “Invalid process id:4488”. The services are running as SYSTEM user.
If I make the service run as my desktop user (using "Log On as This Account"), the services' process IDs appear in JConsole, but they are grayed out and I cannot connect.
Is it impossible to dynamically connect to Java processes when they are running as a Windows 7 service?
Perhaps it is a 64-bit/32-bit problem. I have several applications compiled with a 32-bit JDK that could not be opened with JConsole from a 64-bit JDK on 64-bit Windows 7; after I downloaded the 32-bit JDK, it worked.
Others have been able to run jstack on 2008r2, which may provide some insight on how to get jconsole to connect on Windows 7. As you noted in your comment, the permissions are important. If the service and jconsole can't access the temp directory to write to the appropriate hsperf subdirectory, it won't work. Also important are the location of the temp directory, the user the service is running as, and the user that is running jconsole.
Running SysInternals psexec -s -i <jdk_home>\bin\jconsole <PID> can be used to run jconsole as Local System, the same user that I believe you are using to run your service.
My experience running jconsole from JDK 1.5 in Server 2008 as a system user was unsuccessful. With permissions that I thought should have been sufficient, I got a Could Not Open PerfMemory error. Java 1.6 may be a different story.
In light of all the challenges with running jconsole locally, you would probably have better luck setting it up to accept remote connections. You could set it up for local-only access with your firewall blocking that port from external access.
Add the following to your JVM options (e.g. JAVA_OPTS):
-Dcom.sun.management.jmxremote
-Dcom.sun.management.jmxremote.port=8086
-Dcom.sun.management.jmxremote.ssl=false
-Dcom.sun.management.jmxremote.authenticate=false
Then use JConsole to connect to the remote session:
localhost:8086
I am currently facing the same problem, but on Windows 2003 R2 (SP2). There is an open bug in the Oracle bug database (bug id 6399729):
http://bugs.sun.com/bugdatabase/view_bug.do?bug_id=6399729
There is also a workaround posted towards the end. It talks about installing Java in "install" mode :-), but it didn't work for me on Windows 2003. Your mileage may vary!
Change the TEMP and TMP environment variables to a different folder that you created, like C:\theTemp.
It might be a problem with the folder %TMP%/hsperfdata_{USER_NAME}.
In my case, it worked after I :
close all applications running over the JVM
delete the folder %TMP%/hsperfdata_{USER_NAME} (example: C:/Temp/hsperfdata_MyUser)
reopen JConsole (it recreates the folder)
Hope it helps. You can also take a look at this thread in Oracle community.