How do you optimize an external Java library's resource usage?

I have a utility that I built, and for such a small purpose-built utility I was very surprised when I noticed during testing that it was using 150 MB of memory. I ran it with a heap setting of 1 MB and it still took up more than 50 MB.
After profiling and spending a day trying to figure out where I went wrong, I decided to test a theory. My utility connects to a proprietary application. That connection requires an external library provided by the application vendor.
I wrote a small Hello World using the library and noticed the following:
1) declaring a new object from the vendor's library immediately bumps the memory usage to 50 MB (mostly permgen space)
2) actually trying to connect to an application server bumps the memory usage to 150 MB.
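For reference, the test itself is tiny; a rough sketch of it is below. VendorConnection and connect() are just placeholders for the real vendor API, and the Runtime numbers only cover the heap (permgen still needs a profiler or jstat to see):

    // Rough sketch of the Hello World test described above. "VendorConnection"
    // and connect() are placeholders for the real vendor API, not actual class
    // names. Runtime only reports heap usage; for permgen use a profiler.
    public class VendorMemoryTest {

        public static void main(String[] args) throws Exception {
            printUsedMemory("baseline");

            // 1) merely constructing an object from the vendor jar
            //    (placeholder API, so it is commented out here)
            // VendorConnection conn = new VendorConnection();
            printUsedMemory("after new VendorConnection()");

            // 2) actually connecting to the application server
            // conn.connect("appserver.example.com");
            printUsedMemory("after connect()");
        }

        private static void printUsedMemory(String label) {
            Runtime rt = Runtime.getRuntime();
            System.gc(); // best-effort hint so the numbers are a little less noisy
            long usedMb = (rt.totalMemory() - rt.freeMemory()) / (1024 * 1024);
            System.out.printf("%-35s %d MB used%n", label, usedMb);
        }
    }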
That's just plain silly as far as I'm concerned.
I'm wondering if there is any potential way to tame the beast. Maybe somehow unload classes that I know aren't necessary or won't ever be referenced. The vendor isn't going to change things any time soon.
Or what about loading the vendor's library only when necessary? That way it only eats up all that memory while it is actually communicating with the app server.

The JVM only loads classes you actually use. In fact, it effectively only processes methods you actually use: if you have a byte-code bug in a method you never call, you won't find out. ;)
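You can see this for yourself with a trivial demo; the class names below are made up, and the exact load point is a VM detail, but initialization is guaranteed not to happen before first use:

    // A trivial way to see the lazy loading for yourself (class names here are
    // made up for the demo). The JLS guarantees that Heavy's static initializer
    // only runs at first use; run with -verbose:class and the load of
    // LazyLoadingDemo$Heavy shows up around that same point.
    public class LazyLoadingDemo {

        static class Heavy {
            static {
                System.out.println("Heavy was just loaded/initialized");
            }
            static void touch() { }
        }

        public static void main(String[] args) {
            System.out.println("main started, Heavy not touched yet");
            Heavy.touch(); // first use: Heavy is initialized here, not before
            System.out.println("main finished");
        }
    }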
The best way to reduce the size of the vendor's library is to ask them to improve it, or to write your own.
Unfortunately, a lot of proprietary applications are not very resource friendly. Their first priority is correctness.
BTW: 150 MB of memory costs about $6, so I wouldn't spend much more than that worth of time trying to fix it.

Related

How to find a memory leak in a J2EE application without having the source code

Quite recently I went through the interview process at Adobe Systems. One question they asked me was:
"There is a J2EE application and there is a memory leak in that application, and we don't have the source code of the application. How would you find the memory leak?"
I was clueless at the time, so I said:
"There are many third-party tools, e.g. one that is integrated with Eclipse, and many more. I don't know the mechanics of those tools."
I am still searching for an answer.
Thank you
You are right, there are many tools like VisualVM and JMeter. What they do is hook into a running JVM and collect data, just as you would get thread dumps and heap dumps yourself with the JDK tools (jstack, jmap). The tools are just fancy data analysers that provide visualisation; under the hood everything rests on the heap dump and the thread dump, which can tell you where the memory leak is.
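If you can run even a tiny bit of your own code next to the application (an agent, an admin servlet, a JMX client), you can also trigger a heap dump programmatically. This is only a sketch; it relies on the HotSpot-specific HotSpotDiagnosticMXBean, and the file name is just an example:

    import com.sun.management.HotSpotDiagnosticMXBean;
    import java.lang.management.ManagementFactory;

    // Triggers a heap dump of the current JVM. HotSpot-specific (Oracle/OpenJDK).
    // The resulting .hprof file can then be opened in VisualVM or Eclipse MAT
    // to look for the leak.
    public class HeapDumper {

        public static void dump(String filePath) throws Exception {
            HotSpotDiagnosticMXBean bean = ManagementFactory.newPlatformMXBeanProxy(
                    ManagementFactory.getPlatformMBeanServer(),
                    "com.sun.management:type=HotSpotDiagnostic",
                    HotSpotDiagnosticMXBean.class);
            bean.dumpHeap(filePath, true); // true = only live objects (forces a GC first)
        }

        public static void main(String[] args) throws Exception {
            dump("leak-investigation.hprof");
            System.out.println("Heap dump written to leak-investigation.hprof");
        }
    }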
In your JDK folder, look in /bin and find "jvisualvm.exe" (Java VisualVM) and double-click. It will scan the machine processes for anything Java that is currently running and allow you to start monitoring its memory usage (including itself!). The interface is pretty intuitive so you ought to figure out how to use it fast enough.
Nice to see a free utility ship with a free development kit that isn't all but useless... in fact, this one helped me a lot in tracking down places in one large, data-intensive project codebase where I really needed to execute System.gc() at particular times to keep me from needing >1 GB of memory. (Contrary to religious belief, running System.gc() is in fact a perfectly reasonable thing to do when you need to free up memory sooner rather than later.) Previously, I was blowing the heap space at all the wrong times (and there's no right time to do that), but this utility helped me locate the places in my code most guilty of holding on to memory, so I could stop that from happening.
EDIT: Over two years later I want to add the following. I have not used the JVM switches cited below for tracking down memory leaks myself, but I want to share this info because it may help someone else.
At: http://javarevisited.blogspot.com/2011/11/hotspot-jvm-options-java-examples.html
Quote: '8) JVM parameters to trace classloading and unloading
-XX:+TraceClassLoading and -XX:+TraceClassUnloading are two JVM options which we use to print logging information whenever classes loads into JVM or unloads from JVM. These JVM flags are extremely useful if you have any memory leak related to classloader and or suspecting that classes are not unloading or garbage collected.'

Impact on performance of using external jars in servlets

I am using Eclipse WTP for a project. It requires a few libraries to be used; some are small and some are larger. My question is: what happens when we use external jars in servlets? If I am importing a heavy-weight library in a servlet, does it impact page load time?
Or does Java just compile my program, including the libraries, to give results? I understand a heavy-weight jar will take time to load once, even on my local machine, but I can arrange for it to be initialized only once (by creating a separate class, initializing static variables there, and using them from other classes). But it seems like this can't be done in servlets, and every time a page is loaded the servlet has to load all those heavy jars.
Is it good to use Guava and SolrJ in servlets? Do they slow things down (asking because I feel SolrJ is slowing down page load time)?
Including a jar in and of itself does not slow down servlet run time. However, using a particular tool/class/functionality in a jar may slow down the servlet, depending on what you are trying to do.
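On the "initiate only once" worry in the question: the container creates a single servlet instance and keeps it around, so heavy objects can be built once in init() and reused by every request. A rough sketch, assuming the servlet API is on the classpath; SearchClient is a made-up stand-in for whatever heavy client (SolrJ, for example) you actually use:

    import java.io.IOException;
    import javax.servlet.ServletException;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    // The container creates one instance of this servlet and reuses it for every
    // request, so the expensive client is built once in init(), not per request.
    // SearchClient is a made-up placeholder for whatever heavy client you use.
    public class SearchServlet extends HttpServlet {

        private SearchClient client; // built once, reused for all requests

        @Override
        public void init() throws ServletException {
            // Runs once, when the container loads the servlet.
            client = new SearchClient("http://solr.example.com/collection1");
        }

        @Override
        protected void doGet(HttpServletRequest req, HttpServletResponse resp)
                throws ServletException, IOException {
            // Per-request work reuses the already-initialized client.
            resp.getWriter().println(client.search(req.getParameter("q")));
        }

        // Stub so the sketch compiles on its own.
        static class SearchClient {
            SearchClient(String url) { }
            String search(String q) { return "results for " + q; }
        }
    }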
I recommend using a profiler to analyze your code and actually determine what is causing the slowdown. Here's a quote from Martin Fowler's Refactoring:
I had speculated with various members of the team (Kent and Martin deny participating in the speculation) on what was likely wrong with code we knew very well. We had even sketched some designs for improvements without first measuring what was going on.
We were completely wrong. Aside from having a really interesting conversation, we were doing no good at all.
The lesson is: Even if you know exactly what is going on in your system, measure performance, don't speculate. You'll learn something, and nine times out of ten, it won't be that you were right!

Any good test examples for testing a profiler?

In order to learn more about testing, we're going to use a profiler on a larger project (to actually get some values and measurements), and since we don't have any large project of our own, we're forced to use something else. Any good suggestions? Maybe profiling JUnit itself, perhaps (not profiling with JUnit)?
Edit:
Not looking for any specific data, just... something. The problem is that all of this is so new that it gets kind of confusing. The point is to get slightly accustomed to testing tools such as a profiler. In other words, it shouldn't be necessary to know much about the actual program, since the program doesn't really matter and the data gained isn't too significant either; it's mostly supposed to demonstrate that you can actually get something out of testing. So it's a bit confusing how I should proceed, since I'm not used to big, real programs.
Can I just download ordinary Java source files and run/profile them with NetBeans (or similar) without having to set up or care about a bunch of other stuff?
Well, I've got my standard scenario. It's in C++, but it shouldn't take more than a day or two to recode it in Java.
Caveat: The scenario is not about measuring, per se, but about performance tuning, which is not at all the same thing.
It makes the point that serious code often contains multiple performance problems, and if you're really trying to make it go fast, profilers are not necessarily the best tools.
It depends on what type of data you want to profile. But the best way to get a "larger project" if you don't have one is to find some open-source project on the web that fits what you want.
Edit: I have never profiled with NetBeans, so I can't speak for that tool, but if you don't care which tool you use, you can start with VisualVM (included with the JDK); it's a tool for monitoring the JVM. It's very useful, and if you already run a Java application (like NetBeans) you won't need to download anything extra.
Description of the tool taken from their website: VisualVM monitors application CPU usage, GC activity, heap and permanent generation memory, number of loaded classes and running threads.
VisualVM website
If you really want to profile some source code of your own, a little Java application with a main will do the job, but again it depends on what data, and how much of it, you want to profile. Maybe you can find some "test applications" written in Java on the web.
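If all you need is something to attach the profiler to while you learn the tool, a deliberately wasteful toy program is enough. This is only a sketch with arbitrary numbers, one method burning CPU and one retaining memory:

    import java.util.ArrayList;
    import java.util.List;

    // A deliberately wasteful toy program to point a profiler at: one method
    // burns CPU, the other churns and retains memory. The numbers are arbitrary.
    public class ProfilerPlayground {

        public static void main(String[] args) {
            List<int[]> retained = new ArrayList<int[]>();
            for (int i = 0; i < 1000; i++) {
                burnCpu(300000);               // should show up as the CPU hot spot
                retained.add(allocate(10000)); // should show up as steady heap growth
            }
            System.out.println("done, retained " + retained.size() + " arrays");
        }

        static long burnCpu(int iterations) {
            long acc = 0;
            for (int i = 0; i < iterations; i++) {
                acc += (long) Math.sqrt(i) ^ i;
            }
            return acc;
        }

        static int[] allocate(int size) {
            return new int[size];
        }
    }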

Recording CPU usage of Java applications

At present I have a set of benchmark tests for recording the speed at which a Java application connects, submits and returns data from various RDBMSs housed on various server platforms. The application uses a simple algorithm to record the time taken for each test. The application itself is a simple Java interface that lets a user specify the tests; this seemed easier than hard-coding each test or using an IDE to perform each one (bear in mind that with the combination of RDBMS, server OS and client OS there are in the region of several hundred individual tests). I would like to extend my findings by recording CPU usage and memory usage during these tests on the client side, where the application resides. I could hard-code the algorithm for doing so in my application (my preference) or use third-party software to monitor this (bear in mind it would need to be suitable for cross-platform use: Windows 7, Solaris and Ubuntu).
So my question is: how could I record CPU and memory usage during a test, either by hard-coding it in my Java application or by using third-party software? If you believe a third-party tool is the solution, could you please mention the actual product and how this can be done with it?
Thank you to all who take the time to answer.
Check out VisualVM. It has a lot of features.
I used VisualVM and it helped me a lot in finding memory leaks.
Here is a video that shows the most important VisualVM features.
There are plenty of commercial products for this. JProbe is my favorite these days, but I'm also using YourKit. In the free arena, Eclipse has TPTP (the Test and Performance Tools Platform), but it seems to be a rare person who can actually get the darn thing to work. It never works for me.
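For the hard-coded option, one possible sketch is to sample the JVM's own MXBeans around each test. It assumes a HotSpot JVM (and Java 7 or later for getProcessCpuLoad()); on other VMs you would need a different source for the CPU number:

    import java.lang.management.ManagementFactory;
    import java.lang.management.MemoryMXBean;

    // Samples process CPU load and heap usage from inside the benchmark itself.
    // getProcessCpuLoad() comes from com.sun.management.OperatingSystemMXBean,
    // which is HotSpot-specific (though it works on Windows, Solaris and Linux),
    // so treat that as an assumption rather than a portable standard API.
    public class ResourceSampler {

        private final com.sun.management.OperatingSystemMXBean os =
                (com.sun.management.OperatingSystemMXBean) ManagementFactory.getOperatingSystemMXBean();
        private final MemoryMXBean memory = ManagementFactory.getMemoryMXBean();

        public void sample(String label) {
            double cpu = os.getProcessCpuLoad(); // 0.0 - 1.0, or negative if unavailable
            long heapUsedMb = memory.getHeapMemoryUsage().getUsed() / (1024 * 1024);
            System.out.printf("%s: cpu=%.1f%% heapUsed=%d MB%n", label, cpu * 100, heapUsedMb);
        }

        public static void main(String[] args) throws InterruptedException {
            ResourceSampler sampler = new ResourceSampler();
            for (int i = 0; i < 5; i++) {
                // ... run one of the benchmark tests here ...
                sampler.sample("after test " + i);
                Thread.sleep(1000);
            }
        }
    }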

Excessive memory allocation in Java Sandbox security

Under the Java security model it is possible to block most dangerous actions from untrusted classes, but the last time I checked (a few years ago now) it was still possible for untrusted code to perform a denial-of-service attack by continually allocating memory until the JVM fails with an OutOfMemoryError. Looking now, I can't see any improvement in the situation.
I have a requirement to run untrusted code from 3rd parties inside a Java application and I'd like to know if it is possible to somehow restrict the heap/stack space that a class or thread can allocate in the Java security model. Thus preventing memory allocation based DoS attacks. I know about -Xss, but as I understand it that restricts all threads, most of which need no restriction.
I have also considered creating a container for the untrusted code that would run in its own JVM and communicate with the main app through sockets, or doing some static analysis on the untrusted code. However, both of these sound like more effort than I'd hoped for, although if someone knows of a trick or an open-source library for this I'm interested.
So, is there a way to restrict the amount of memory than a thread can allocate to itself or some other way of preventing memory allocation denial of service attacks in Java?
There is currently no way to do this with standard APIs in Java.
Other people have been interested in this, and there is a JSR underway called the Resource Consumption Management API which may be something to look into.
You will need to run the untrusted code in a separate process. There may still be ways to DoS it; for instance, on old versions of Windows you could easily use up all the GDI resources (I haven't tried recently, not now that we have Swing).
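A rough sketch of that separate-process approach, assuming you can launch a child JVM: cap its heap (and stack) on the command line so a memory bomb only takes down the child. The jar name and main class below are placeholders:

    import java.io.File;

    // Launches the untrusted code in its own JVM with a hard heap cap, so an
    // allocation bomb can only kill the child process. The classpath and main
    // class here are placeholders.
    public class SandboxLauncher {

        public static Process launchUntrusted(String classpath, String mainClass)
                throws Exception {
            String javaBin = System.getProperty("java.home")
                    + File.separator + "bin" + File.separator + "java";
            ProcessBuilder pb = new ProcessBuilder(
                    javaBin,
                    "-Xmx64m",   // hard ceiling on the untrusted heap
                    "-Xss256k",  // stack limit that applies only to the child's threads
                    "-cp", classpath,
                    mainClass);
            pb.redirectErrorStream(true);
            // Talk to it over stdin/stdout or a socket; if it OOMs or hangs,
            // destroy() it and the main application is unaffected.
            return pb.start();
        }

        public static void main(String[] args) throws Exception {
            Process p = launchUntrusted("untrusted.jar", "com.example.UntrustedMain");
            // ... exchange data, enforce a timeout, then p.destroy() if needed ...
            p.waitFor();
        }
    }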
