I use C# to program and use ReadProcessMemory to read the memory of other processes running on the system. However, I'm unsure how to read the memory of a Java applet that is running inside a browser. Has anyone tackled this before?
Thanks.
Since 6u10 the default Java PlugIn runs outside of the browser process(es). The process should be readily identifiable as a Java executable with PlugIn classes added to the bootstrap class path.
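If you're on a modern JDK, you can sanity-check which process actually hosts the JVM before attaching. This is a minimal sketch using the Java 9+ `ProcessHandle` API (which postdates the old plugin era, so treat it as illustrative) to list processes whose command looks like a Java executable:

```java
public class FindJavaProcesses {
    public static void main(String[] args) {
        // List running processes whose executable path mentions "java";
        // an out-of-process plugin JVM shows up here like any other
        // Java executable, identifiable by its command line.
        ProcessHandle.allProcesses()
            .filter(p -> p.info().command()
                .map(cmd -> cmd.toLowerCase().contains("java")) // java, javaw, ...
                .orElse(false))
            .forEach(p -> System.out.println(
                p.pid() + "  " + p.info().commandLine().orElse("(unknown)")));
    }
}
```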
If the JVM is executing as part of the browser process, I suspect you won't be able to do this easily. The closest you'll be able to get is to measure the browser memory consumption.
However you could measure the memory consumption of the standalone applet viewer whilst running your applet, and then perhaps derive the applet memory consumption from that.
Related
I'm making an application in Java using Eclipse Indigo. When I run it from Eclipse, the Task Manager shows javaw.exe using 50 MB of memory. When I export the application as a runnable .jar and execute it, the Task Manager shows javaw.exe using 500 MB.
Why is this? How could I fix this?
Edit: I'm using Windows 7 64-bit, and my system says I have Java 1.7 installed. Apparently the memory problem is caused by a while loop. I'll study what's inside the while loop that is causing the problem.
Edit: Problem found. At one point in the while loop, new BufferedImage instances were created instead of reusing the same BufferedImage.
Without any additional details about your code, I would suggest using a profiler to analyze the problem. I know YourKit and the one that is available for NetBeans are very good.
Once you run your app from the profiler, you should initially look at the objects and listeners created by your application's packages. If the issue is not there, you can expand your search to other packages until you identify things that are growing out of control, and then look at the code that handles those entities.
If you run certain parts of the code multiple times and still see high memory utilization after that code has stopped running, you might have a leak and may consider nulling out or emptying variables/listeners on exit.
It should be a good starting point, but please report your results back, so we know how it goes. By the way, what operating system are you using and what version of java?
--Luiz
You need to profile your code to get the exact answer, but from my experience, when I see similar things I often attribute it to garbage collection. For example, I ran the same job twice, giving one 10 GB and the other 2 GB. Both ran and completed, but the 10 GB one used more memory (and finished faster), while the 2 GB one, I believe, garbage collected more, so it still completed but took a bit more time with less memory. I'm a bit new to Java, so I may be wrong in assuming it's garbage collection, but I have seen what you are talking about.
You need to profile your code (check out JConsole, which is included with Java, or VisualVM).
That sounds most peculiar.
I can think of two possible explanations:
You looked at the wrong javaw.exe instance. Perhaps you looked at the instance that is running Eclipse ... which is likely to be that big, or bigger.
You have (somehow) managed to configure Java to run with a large heap by default. On Linux you could do this with a wrapper script, a shell function or a shell alias. You can do at least the first of those on Windows.
I don't think it is the JAR file itself. AFAIK, you can't set JVM parameters in a JAR file. It is possible that you've somehow included a different version of something in the JAR file, but that's a bit of a stretch ...
If none of these ideas help, try profiling.
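One cheap way to test the second theory is to print the heap the JVM actually received in both launch environments. A minimal diagnostic sketch; the two environment variables are standard JVM hooks that can silently inject options such as `-Xmx`:

```java
public class HeapCheck {
    public static void main(String[] args) {
        // Print the max heap this JVM actually got. Run this class once
        // from Eclipse and once from the exported JAR and compare.
        long maxMb = Runtime.getRuntime().maxMemory() / (1024 * 1024);
        System.out.println("Max heap: " + maxMb + " MB");

        // These environment variables can inject JVM options without any
        // visible command-line flag, which would explain a silent -Xmx:
        System.out.println("_JAVA_OPTIONS=" + System.getenv("_JAVA_OPTIONS"));
        System.out.println("JAVA_TOOL_OPTIONS=" + System.getenv("JAVA_TOOL_OPTIONS"));
    }
}
```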
Problem found. At one point in the while loop, new BufferedImage instances were created instead of reusing the same BufferedImage.
Ah yes. BufferedImage uses large amounts of out-of-heap memory and that needs to be managed carefully.
But this doesn't explain why your application used more memory when run from the JAR than when launched from Eclipse ... unless you were telling the application to do different things.
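For reference, the fix amounts to hoisting the allocation out of the loop. A minimal sketch of the reuse pattern; the dimensions and the drawing code are placeholders, not from the original application:

```java
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;

public class ImageLoop {
    public static void main(String[] args) {
        // Leaky pattern: "new BufferedImage(...)" inside the loop allocates
        // a fresh image (and its backing native raster) every iteration.
        // Fix: allocate once, then clear and redraw it each time round.
        BufferedImage frame =
            new BufferedImage(640, 480, BufferedImage.TYPE_INT_RGB);
        Graphics2D g = frame.createGraphics();
        for (int i = 0; i < 100; i++) {
            g.clearRect(0, 0, frame.getWidth(), frame.getHeight()); // reuse
            // ... draw the current frame here ...
        }
        g.dispose(); // release the graphics context when done
    }
}
```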
(My platform is Windows XP SP3. My debugger is windbg, but I've tried Immunity and gdb in cygwin as well. They all seem to be affected in the same way. I do not have the source code to either the java applet or the dll in question, so I cannot place debug hooks into the code.)
As per the question title, I am trying to debug a dll which is loaded through java by way of an applet launched within the browser. I am attaching to the java process directly so that I can gain access to the specific dll being loaded. However, after a few seconds of the java process being suspended, it terminates and my debugging session is useless.
What is the cause of this termination? A watchdog process within java itself or the browser? Can it be turned off or tuned or left running?
Watch this DerbyCon video; it explains the Java applet watchdog process and gives some quick tips on getting around it, starting around 15:30.
From what he says: yes, there is a watchdog, and no, there is no way to turn it off. He got around it by patching the binary, either on disk or in memory.
I wrote a wrapper application in C# .NET that runs when the .jar file is running, closes when the .jar file closes, etc. This was basically to allow our web panel to query the executable to find out whether it was actually running or not.
I have seen some other panels specifically intended for this software that have an option to reduce the memory usage of it when no one is connected. The java application (Minecraft) basically scales the RAM usage based on the size of the player world rather than how many players are connected. When no one is connected, it should be perfectly fine to reduce the usage.
So is there any way to reduce the RAM usage of a Java application programmatically from C# .NET?
AFAIK, there is no way to tell a JVM to give regular heap memory back to the operating system, apart from telling it to exit completely.
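A small experiment illustrates this: even after dropping all references and hinting a GC, the committed heap (what Task Manager sees) typically stays where it was. A sketch, assuming default collector behaviour; the allocation sizes are arbitrary:

```java
public class HeapFootprint {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();

        // Grow the heap with 64 blocks of 1 MiB each.
        byte[][] blocks = new byte[64][];
        for (int i = 0; i < blocks.length; i++) {
            blocks[i] = new byte[1 << 20];
        }
        System.out.printf("committed: %d MB%n", rt.totalMemory() >> 20);

        blocks = null;  // drop all references
        System.gc();    // a hint only; the JVM may or may not act on it

        // totalMemory() (the committed heap the OS sees) usually stays
        // high even though freeMemory() rises: the heap is rarely
        // handed back to the operating system.
        System.out.printf("committed after GC: %d MB, free: %d MB%n",
                rt.totalMemory() >> 20, rt.freeMemory() >> 20);
    }
}
```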
No.
Why not? Because you can't control the Java program that way, for two reasons:
You can't control what the JRE does with its memory or how the GC works.
If minecraft.jar requests 512 MiB of RAM, it gets 512 MiB of RAM. You can't just tell an application, "Hey, there's no one connected, so I won't allow you to allocate memory." I mean, you could... but I don't think you want that (it would trigger exceptions and odd side effects).
Edit: The only rather easy way to achieve this behavior would be to change the program. Since Minecraft is not free/open-source software, the only thing you could do is file a bug/feature request. Maybe even with extended information and a layout concept on how to achieve better memory usage.
I mean, I'm pretty sure this could also be achieved with heavy use of reflection from a Java program... but things go downhill pretty fast from there.
I'm currently writing a Java application that needs to look at how "heavily loaded" the machine it's running on is. On *nix, load average divided by number of processors fits the bill perfectly, and we retrieve load average with ManagementFactory.getOperatingSystemMXBean().getSystemLoadAverage(). Unfortunately, this returns -1 on Windows, as the call is apparently too "expensive" to be called frequently. What's the easiest way to retrieve similar Windows metrics such as the processor queue length or CPU utilisation, either in pure Java or via JNI?
You can retrieve the CPU utilization on Windows using WMI. Some code and documentation for accessing WMI from Java appears to be available here.
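As a lighter-weight alternative to a full WMI binding, you can shell out to the `wmic` command (Windows-only, and deprecated on recent Windows releases) and parse its output. A hedged sketch; the parsing helper is my own illustration, not part of any WMI library:

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;

public class CpuLoad {
    // Parse the output of `wmic cpu get loadpercentage`, which looks like:
    //   LoadPercentage
    //   12
    static int parseLoadPercentage(String output) {
        for (String line : output.split("\\R")) {
            line = line.trim();
            if (line.matches("\\d+")) {
                return Integer.parseInt(line);
            }
        }
        return -1; // no numeric line found
    }

    public static void main(String[] args) throws Exception {
        // Windows-only: run wmic and capture its stdout.
        Process p = new ProcessBuilder("wmic", "cpu", "get", "loadpercentage")
                .redirectErrorStream(true)
                .start();
        StringBuilder sb = new StringBuilder();
        try (BufferedReader r = new BufferedReader(
                new InputStreamReader(p.getInputStream()))) {
            String line;
            while ((line = r.readLine()) != null) {
                sb.append(line).append('\n');
            }
        }
        System.out.println("CPU load: " + parseLoadPercentage(sb.toString()) + "%");
    }
}
```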
Try using the free Hyperic SIGAR API. It is a cross-platform API for retrieving system information. It uses JNI for Windows/Linux/Unix/Mac/etc.
http://www.hyperic.com/products/sigar
I wrote a JNLP task manager/information monitor with it and it's a decent API.
http://www.goliathdesigns.com/2009/12/sixth-post/
Source code:
http://sourceforge.net/projects/sysinfomonitor/
You can also perform this task using Eclipse SWT if you are running in a win32 environment:
http://dentrassi.de/2011/02/04/access-to-wmi-in-java-using-eclipse-swt-ole-integration/
On Linux & Mac, is there a way to pre-cache the JVM - either in RAM, or a state of it, so that when I start a Java program, it starts as fast as C/C++ programs?
I am willing to waste memory to make this happen.
No. Unfortunately :(
On second thought: the reason Java programs start faster on Windows these days is that there is a process (Java Quickstart) which aggressively keeps a copy of the runtime library files in the memory cache, which apparently helps immensely. I do not know if this approach has been ported to Linux.
Would that not load the JVM binary and libs into memory so that they can be shared?
Yes, but only within the same JVM instance. So you would have to load your application into that instance, as servlet containers do.
The whole bottleneck of JVM invocation is class loading; that is the reason for the Java Quickstart that Thorbjørn mentioned.
So if you put the class libs on faster media (a RAM disk), that will probably speed up your (first) startup. I once installed NetBeans + the JSDK on a RAM disk and it starts really fast, but once started it runs just as fast as when loaded from disk.
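The class-loading cost is easy to observe: the first reference to a class pays for loading and verification, while later references are essentially free. A small sketch; the class name is just an arbitrary JDK class that is unlikely to be loaded already:

```java
public class ClassLoadTiming {
    public static void main(String[] args) throws Exception {
        // Cold: the first Class.forName triggers loading, verification,
        // and initialization -- this is the startup cost that faster
        // media (or a warm OS file cache) helps with.
        long t0 = System.nanoTime();
        Class.forName("javax.xml.parsers.DocumentBuilderFactory");
        long cold = System.nanoTime() - t0;

        // Warm: the class is already loaded, so this is nearly free.
        long t1 = System.nanoTime();
        Class.forName("javax.xml.parsers.DocumentBuilderFactory");
        long warm = System.nanoTime() - t1;

        System.out.printf("cold: %d us, warm: %d us%n",
                cold / 1000, warm / 1000);
    }
}
```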