Automatic Runtime Performance Regression Test in Java

I'm looking for ways to detect changes in the runtime performance of my code in an automatic way. This would act in a similar way to JUnit, but instead of testing the code's functionality it would test for sudden changes in speed. As far as I know there are no tools right now to do this automatically.
So the first question is: Are there any tools available which will do this?
Then the second question is: If there are no tools available and I need to roll my own, what are the issues that need to be addressed?
If the second question is relevant, then here are the issues that I see:
Variability depending on the environment it is run on.
How to detect changes, since microbenchmarks in Java have a large variance.
If Caliper collects the results, how to get the results out of Caliper so that they can be saved in a custom format. Caliper's documentation is lacking.

I have just come across http://scalameter.github.io/ which looks appropriate and works with both Scala and Java.

Take a look at Caliper CI; I put out version 2.0 yesterday as a Jenkins plugin.

I don't know of any separate tools to handle this, but JUnit has an optional parameter called timeout in the @Test annotation:
The second optional parameter, timeout, causes a test to fail if it
takes longer than a specified amount of clock time (measured in
milliseconds). The following test fails:
@Test(timeout=100)
public void infinity() {
    while (true);
}
So, you could write additional unit tests to check that certain parts work "fast enough". Of course, you'd first need to somehow decide the maximum amount of time a particular task should take to run.
-
If the second question is relevant, then here are the issues that I see:
Variability depending on the environment it is run on.
There will always be some variability, but to minimize it I'd use Hudson or a similar automated build & test server to run the tests, so the environment is the same each time (of course, if the server running Hudson also performs all sorts of other tasks, those tasks could still affect the results). You'd need to take this into account when deciding the maximum running time for tests: leave some "head room", so that if a test takes, say, 5% longer to run than usual, it still wouldn't fail straight away.
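For illustration, a minimal sketch of such a head-room check, assuming JUnit 4; ReportGenerator and the baseline figure are placeholders, and in practice the baseline would come from previous measured runs on the same build server:

import static org.junit.Assert.assertTrue;
import org.junit.Test;

public class ResponseTimeTest {
    // Placeholder baseline; in practice, record this from earlier runs.
    private static final long BASELINE_MILLIS = 2000;
    private static final double HEADROOM = 1.05;   // the 5% head room mentioned above

    @Test
    public void completesWithinBaselinePlusHeadroom() {
        long start = System.nanoTime();
        new ReportGenerator().generate();          // hypothetical task under test
        long elapsedMillis = (System.nanoTime() - start) / 1_000_000;
        assertTrue("took " + elapsedMillis + " ms",
                elapsedMillis <= BASELINE_MILLIS * HEADROOM);
    }
}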
How to detect changes, since microbenchmarks in Java have a large variance.
Microbenchmarks in Java are rarely reliable; I'd say test larger chunks with integration tests (such as handling a single HTTP request or whatever you have) and measure the total time. If the test fails due to taking too much time, isolate the problematic code by profiling, or measure and log the running time of the separate parts of the test during the test run to see which part takes the largest amount of time.
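For example, a rough way of logging the separate parts during the test run might look like this; the three phase methods are hypothetical stand-ins for your own test's phases, and the fragment would sit inside the test method:

long t0 = System.nanoTime();
loadFixtures();        // hypothetical: set up test data
long t1 = System.nanoTime();
handleRequest();       // hypothetical: the actual work being measured
long t2 = System.nanoTime();
renderResponse();      // hypothetical: produce the output
long t3 = System.nanoTime();
System.out.printf("fixtures=%d ms, request=%d ms, render=%d ms%n",
        (t1 - t0) / 1_000_000, (t2 - t1) / 1_000_000, (t3 - t2) / 1_000_000);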
If Caliper collects the results, how to get the results out of Caliper so that they can be saved in a custom format. Caliper's documentation is lacking.
Unfortunately, I don't know anything about Caliper.

Related

How long to run through for lower environment performance testing using JMeter on Java APIs

I am running some lower-environment performance testing on my dev setup for Java Vertx APIs. How long should I continue to run this test to get acceptable numbers? Please share any standard guideline documents on this. I am using JMeter for the performance testing. My application stack:
Java, Spring, JDBC, Vertx, Oracle
Most probably your test doesn't make a lot of sense: if you get, say, X requests per second maximum throughput in the "lower" environment, it doesn't mean you will get double the requests per second when you run the same tests against an environment with 2x the RAM and CPUs.
I can think of a couple of things you can still test if you don't have access to a PROD-like environment:
Run a Soak Test; it should allow you to detect memory leaks.
Run some concurrency tests, i.e. X users doing exactly the same action at exactly the same time; this should allow you to detect deadlocks (a minimal sketch follows this list).
Use a profiler tool during your test execution; this way you'll see the largest objects, slowest functions, etc.
Find the slowest/heaviest SQL queries and inspect their plans to see how they can be optimised.
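A minimal sketch of such a concurrency test, assuming plain java.util.concurrent; performUserAction() is a hypothetical stand-in for the API call under test:

import java.util.concurrent.*;

public class ConcurrencySmokeTest {
    public static void main(String[] args) throws Exception {
        int users = 100;                             // placeholder user count
        ExecutorService pool = Executors.newFixedThreadPool(users);
        CountDownLatch ready = new CountDownLatch(users);
        CountDownLatch go = new CountDownLatch(1);
        CountDownLatch done = new CountDownLatch(users);
        for (int i = 0; i < users; i++) {
            pool.submit(() -> {
                try {
                    ready.countDown();
                    go.await();                      // all threads released together
                    performUserAction();             // hypothetical action under test
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                } finally {
                    done.countDown();
                }
            });
        }
        ready.await();
        go.countDown();                              // fire all users at once
        if (!done.await(60, TimeUnit.SECONDS)) {     // a hang here may indicate a deadlock
            System.err.println("Not all users finished - possible deadlock");
        }
        pool.shutdownNow();
    }

    private static void performUserAction() { /* call the API under test */ }
}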
More information: Performance Testing in a Scaled Down Environment. Part Two: 5 Things You Can Test

Using JMH as a framework for performance testing on functional/user level. Is it wrong?

I want to use JMH as a framework for performance testing on functional/user level for web application. Imagine me using JMH to, say, measure how long it takes from the moment when 100 users click "Post Your Question" on this site concurrently, to the moment when user sees their question posted.
Is this entirely wrong? What are the drawbacks of such approach?
I do not expect nanosecond accuracy for these tests: accuracy of half a second to a second is just fine.
I created a first realistic test, and really liked how it looked / worked - exactly what I need. But am I missing some big trouble ahead by using micro-benchmark framework for what it's not intended to do?
Not looking for tool recommendations
Having now used this approach for approximately 6 months, I can say that I still have not seen any drawbacks. A few things I learned:
Even though at the functional/user level the accuracy is lower, it's important to learn how the various configuration parameters work (especially the JVM-related ones, e.g. fork). They may influence how you build your tests, how you run them, and what you measure.
JMH is really lightweight and efficient, so comparing with results obtained using other frameworks may not be valid (we saw roughly a 10-20% performance boost when running with JMH); I had to establish a new baseline.
The JMH Jenkins plug-in helps with visualizing the results.
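For illustration, a functional-level benchmark along the lines of the question's scenario might look roughly like this; the endpoint URL and payload are placeholders, and it assumes Java 11's built-in HttpClient:

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.concurrent.TimeUnit;
import org.openjdk.jmh.annotations.*;

// Measures a whole HTTP round trip rather than a micro-operation.
@State(Scope.Benchmark)
@BenchmarkMode(Mode.SampleTime)
@OutputTimeUnit(TimeUnit.MILLISECONDS)
@Fork(1)                        // see the note about fork above
@Warmup(iterations = 2)
@Measurement(iterations = 5)
public class PostQuestionBenchmark {
    private final HttpClient client = HttpClient.newHttpClient();
    private final HttpRequest request = HttpRequest.newBuilder()
            .uri(URI.create("http://localhost:8080/questions"))   // placeholder URL
            .POST(HttpRequest.BodyPublishers.ofString("{\"title\":\"q\"}"))
            .build();

    @Benchmark
    @Threads(100)               // 100 concurrent "users"
    public int postQuestion() throws Exception {
        HttpResponse<Void> response =
                client.send(request, HttpResponse.BodyHandlers.discarding());
        return response.statusCode();   // return a value so the call isn't optimised away
    }
}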

Java application - which all parts of my code are being fired up in production?

I have a Java web-based application running in production. I need some way to see which parts of the code are actually being used, driven by the actions of end users.
Just to clarify my requirement further.
I do not want a logging-based solution. Any solution that requires me to add logging and analyse the logs is not what I am looking for.
I need a solution that works along similar lines to a unit-test coverage reporter. Just as Cobertura or EMMA reports show me, after running the unit tests, which parts of my code were exercised by those tests, I need something that will listen to the JVM in production and tell me which parts of my code are being exercised in production by the actions of end users.
Why am I trying to do this?
I have inherited this code. It is a big piece - some 25,000 classes. One of the things I need to do is chop off the parts of the application that are not being used much. If I can show management that there are parts of the application that are scarcely used, I can chop those parts out of the product and effectively make it a little more manageable (for example, the manual regression test suite that needs to run every week or so, and takes a couple of days, could be shortened).
I hope there is some ready-made solution for this.
As Joachim Sauer said in the comments below your question: the most straightforward approach is to just use a Code Coverage Tool that you'd use for unit testing and instrument the production code with it.
There's a major catch: overhead. Code Coverage analysis can really slow things down and while an informed user-base will tolerate some temporary performance degradation, the whole thing needs to remain useable.
From my experience JaCoCo is relatively light and doesn't impose much overhead, whereas Cobertura will impose a tremendous slowdown. On the other hand, JaCoCo merely flags "hit or no hit" whereas Cobertura gives you per-line hit counts. This means that JaCoCo will only let you find dead spots, whereas Cobertura will let you find rarely hit spots.
Whichever of these two tools you use (possibly one after the other), you may end up with giant class whitelists and class blacklists to restrict the coverage counting to places where it makes sense to do so, thereby keeping the performance overhead down. For example, if the entire thing has a single front controller Servlet, including that in the analysis will maximize the performance overhead while providing no information of value. This could turn into a lot of work and a lot of application deployments.
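For illustration, restricting the agent to the packages of interest might look roughly like this (option names as documented for the JaCoCo agent; the jar path, package names and servlet class here are placeholders):

java -javaagent:/path/to/jacocoagent.jar=destfile=/var/log/app.exec,includes=com.example.orders.*:com.example.billing.*,excludes=com.example.web.FrontControllerServlet \
     -jar app.jar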
It may actually be quicker and less work to identify bottlenecks/gateways into specific subsystems and slap a counter on each of those (e.g. perf4j or even a full blown Nagios). Queries are another good place to slap a counter on. If you suspect some part of the application is rarely used, put a few counters there and see what happens.
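A minimal sketch of such a counter using perf4j's stop watch; the class, the tag and doCheckout() are placeholders:

import org.perf4j.LoggingStopWatch;
import org.perf4j.StopWatch;

public class CheckoutGateway {
    public void checkout() {
        // "orderService.checkout" is a placeholder tag; perf4j aggregates the
        // logged stop-watch times per tag, so you can see how often (and how
        // slowly) this gateway is hit.
        StopWatch watch = new LoggingStopWatch("orderService.checkout");
        try {
            doCheckout();   // hypothetical subsystem entry point
        } finally {
            watch.stop();
        }
    }

    private void doCheckout() { /* ... */ }
}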

Which tool should be used to measure the performance of a program unit?

I need to measure the performance of my program unit. I am using Hibernate as the ORM tool. I want a tool that can measure the time taken per method invocation, excluding the database load.
Please help
This is what a profiler does. VisualVM is a free one, but if you want more detail as to the timings and behaviour of JDBC queries, I suggest you look at YourKit, which can analyse the queries in more depth.
JConsole is a graphical monitoring tool for the Java Virtual Machine and Java applications, both on local and remote machines.
http://java.sun.com/developer/technicalArticles/J2SE/jconsole.html
For a quick and dirty hack, you can use http://www.clarkware.com/software/JUnitPerf.html, a JUnit-based microbenchmark framework. You can write your code, mock out the database (or even use a real one), and run it to get a benchmark. This is essentially good only for testing a single (or very few) methods in their specific uses, not as a general profiler.
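A rough sketch of what that looks like with JUnitPerf's TimedTest (JUnit 3 style; ExampleTest and the 1000 ms threshold are placeholders):

import com.clarkware.junitperf.TimedTest;
import junit.framework.Test;
import junit.framework.TestSuite;

public class ExampleTimedTest {
    public static Test suite() {
        // Fails if ExampleTest's tests take longer than 1000 ms in total.
        Test timed = new TimedTest(new TestSuite(ExampleTest.class), 1000);
        TestSuite suite = new TestSuite();
        suite.addTest(timed);
        return suite;
    }
}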
Your question isn't quite clear to me. Do you wonder which part of your application takes the time, or do you want to observe the time a certain part of your code takes? In the first case, use a proper profiler. VisualVM and YourKit are both fine profilers; I've used them before and found them very helpful. In the latter case, I would try a tool like Perf4J, which allows you to annotate a method and observe its average runtime, its standard deviation and other things, in real time or afterwards.

Java Tools for combining runtime execution with coverage

Are there any free tools for Java (preferably Eclipse) that can give metrics on how frequently code is executed (based on a recorded run) and show them side by side with coverage? I'm not sure if there is already a metric for measuring this, but it seems it would be an interesting one to see.
Do you mean running the application as in production?
If in a dev environment... not sure if this is what you are looking for: http://insectj.sourceforge.net/
Are you looking for a Java profiler (something that gives you function call counts and elapsed execution times)? If so, you might want to look at the Eclipse Test & Performance Tools Platform. It doesn't look like it will give you a side-by-side comparison of two pieces of code, but it might be worth looking at.
The SD Java Profiler collects profiling data in a way that generalizes test coverage. So if you have its profile data displayed, you have the test coverage data effectively displayed too; there's no need for a side-by-side comparison.
Not free, and doesn't plug into Eclipse yet, but it does use a Java GUI.
