Are there any free tools for Java (preferably Eclipse) that can give metrics on both how frequently code is executed (based on a recorded run) and do a side by side with coverage? I'm not sure if there is already a metric for measuring this but it seems it would be an interesting one to see.
Do you mean running the application in production?
If it's in a dev environment: I'm not sure if this is what you are looking for, but take a look at InsectJ - http://insectj.sourceforge.net/
Are you looking for a Java profiler (something that gives you function call counts and elapsed execution times)? If so, you might want to look at the Eclipse Test & Performance Tools Platform. It doesn't look like it will give you a side-by-side comparison of two pieces of code, but it might be worth looking at.
The SD Java Profiler collects profiling data in a way that generalizes test coverage. So if you have its profile data displayed, you have the test coverage data effectively displayed too; there's no need for a side-by-side comparison.
Not free, and it doesn't plug into Eclipse yet, but it does use a Java GUI.
In order to learn more about testing, we're going to run a profiler on a larger project (to actually get some values and measurements), and since we don't have any large project of our own, we're forced to use something else. Any good suggestions? Maybe profiling JUnit itself, perhaps (not profiling "with" JUnit)?
Edit:
I'm not looking for any specific data, just... something. The problem is that all of this is so new that it gets confusing. The point is to get somewhat accustomed to testing tools such as a profiler. In other words, it shouldn't be necessary to know much about the actual program, since the program doesn't really matter and the data gained isn't too significant either; it's mostly supposed to demonstrate that you can actually get something out of testing. So I'm a bit unsure how to proceed, since I'm not used to big, real programs.
Can I just download some ordinary Java files and run/profile them with NetBeans (or similar) without having to set up or care about a bunch of other stuff?
Well, I've got my standard scenario. It's in C++, but it shouldn't take more than a day or two to recode it in Java.
Caveat: The scenario is not about measuring, per se, but about performance tuning, which is not at all the same thing.
It makes the point that serious code often contains multiple performance problems, and if you're really trying to make it go fast, profilers are not necessarily the best tools.
It depends on what type of data you want to profile. But the best way to get a "larger project", if you don't have one, is to find an open source project on the web that fits what you want.
Edit: I've never profiled with NetBeans, so I can't speak for that tool, but if you don't care which tool you use, you can start with VisualVM (included with the JDK), a tool for monitoring the JVM. It's very useful, and since you already run Java applications (like NetBeans), you won't need to download anything extra.
Description of the tool taken on their website: VisualVM monitors application CPU usage, GC activity, heap and permanent generation memory, number of loaded classes and running threads.
VisualVM website
If you really want to profile some source code, a little Java application with a main will do the job, but again it depends on what data (and how much of it) you want to profile. Maybe you can find some "test applications" written in Java on the web.
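If you'd rather not hunt for something on the web, even a tiny self-contained program gives a profiler something to chew on. Here's a minimal sketch (the class and method names are my own invention) with two methods of deliberately different cost, so a sampling profiler should attribute most of the CPU time to the slow one:

```java
// SmallProfilingTarget.java -- a tiny, self-contained program to practice
// profiling on. The two methods deliberately have very different costs,
// so a profiler should attribute most of the time to slowRecursiveFib.
public class SmallProfilingTarget {

    // Intentionally inefficient: exponential-time recursion.
    static long slowRecursiveFib(int n) {
        return n < 2 ? n : slowRecursiveFib(n - 1) + slowRecursiveFib(n - 2);
    }

    // Linear-time version for comparison.
    static long fastIterativeFib(int n) {
        long a = 0, b = 1;
        for (int i = 0; i < n; i++) {
            long next = a + b;
            a = b;
            b = next;
        }
        return a;
    }

    public static void main(String[] args) {
        // Run long enough for a sampling profiler to collect useful data.
        for (int i = 0; i < 20; i++) {
            slowRecursiveFib(30);
            fastIterativeFib(30);
        }
        System.out.println("done");
    }
}
```

Run it, attach VisualVM, and the CPU sampler should make the cost difference between the two methods obvious.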
I have a Java web-based application running in production. I need some way to see which parts of the code are actually being used, based on the actions of the end users.
Just to clarify my requirement further.
I do not want a logging-based solution. Anything that requires me to add log statements and analyse the logs is not what I am looking for.
I need a solution that works along the same lines as a unit test coverage reporter. Like Cobertura or EMMA reports: after running the unit tests, they show me which parts of my code were exercised by the tests. I need something that will listen to the JVM in production and tell me which parts of my code are being exercised in production by the actions of end users.
Why am I trying to do this?
I have inherited a codebase. It is big: some 25,000 classes. One of the things I need to do is chop off the parts of the application that are not being used much. If I can show management that parts of the application are scarcely used, I can remove them from the product and make it a little more manageable (for instance, the manual regression test suite that needs to run every week or so, and takes a couple of days, could be shortened).
Hope there is some ready solution to this.
As Joachim Sauer said in the comments below your question: the most straightforward approach is to just use a Code Coverage Tool that you'd use for unit testing and instrument the production code with it.
There's a major catch: overhead. Code Coverage analysis can really slow things down and while an informed user-base will tolerate some temporary performance degradation, the whole thing needs to remain useable.
From my experience JaCoCo is relatively light and doesn't impose much overhead, whereas Cobertura will impose a tremendous slowdown. On the other hand, JaCoCo merely flags "hit or no hit" whereas Cobertura gives you per-line hit counts. This means that JaCoCo will only let you find dead spots, whereas Cobertura will let you find rarely hit spots.
Whichever of these two tools you use (possibly one after the other), you may end up with giant class whitelists and class blacklists to restrict the coverage counting to places where it makes sense to do so, thereby keeping the performance overhead down. For example, if the entire thing has a single front controller Servlet, including that in the analysis will maximize the performance overhead while providing no information of value. This could turn into a lot of work and a lot of application deployments.
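For reference, attaching JaCoCo to a JVM is just an extra startup flag, and the includes/excludes filtering mentioned above is expressed as agent options. The paths and package patterns below are placeholders you'd replace with your own:

```shell
# Attach the JaCoCo agent at JVM startup (paths and package patterns are
# placeholders -- adjust them for your deployment). 'includes' restricts
# instrumentation to your own packages, keeping the overhead down.
java -javaagent:/opt/jacoco/jacocoagent.jar=destfile=/var/log/myapp/jacoco.exec,includes=com.mycompany.* \
     -jar myapp.jar
```

The resulting .exec file can then be turned into a report with the JaCoCo CLI or build-tool plugins.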
It may actually be quicker and less work to identify bottlenecks/gateways into specific subsystems and slap a counter on each of those (e.g. perf4j or even a full blown Nagios). Queries are another good place to slap a counter on. If you suspect some part of the application is rarely used, put a few counters there and see what happens.
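If you go the counter route without pulling in perf4j or Nagios, a minimal sketch could be nothing more than a map of AtomicLongs keyed by subsystem name (the class and the subsystem names here are invented for illustration):

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.atomic.AtomicLong;

// A minimal usage counter: drop UsageCounter.hit("orders") at the entry
// point of each subsystem you suspect is rarely used, then dump the
// counts periodically (or on shutdown) to see what actually gets hit.
public class UsageCounter {
    private static final Map<String, AtomicLong> COUNTS = new ConcurrentHashMap<>();

    public static void hit(String subsystem) {
        COUNTS.computeIfAbsent(subsystem, k -> new AtomicLong()).incrementAndGet();
    }

    public static long count(String subsystem) {
        AtomicLong c = COUNTS.get(subsystem);
        return c == null ? 0 : c.get();
    }
}
```

Unlike a coverage agent, this costs almost nothing at runtime; the trade-off is that you only learn about the places you remembered to instrument.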
I'm looking for ways to detect changes in runtime performance of my code in an automatic way. This would act in a similar way that JUnit does, but instead of testing the code's functionality it would test for sudden changes in speed. As far as I know there are no tools right now to do this automatically.
So the first question is: Are there any tools available which will do this?
Then the second questions is: If there are no tools available and I need to roll my own, what are the issues that need to be addressed?
If the second question is relevant, then here are the issues that I see:
Variability depending on the environment it is run on.
How to detect changes, since microbenchmarks in Java have a large variance.
If Caliper collects the results, how to get the results out of Caliper so that they can be saved in a custom format. Caliper's documentation is lacking.
I have just come across http://scalameter.github.io/ which looks appropriate; it works for both Scala and Java.
Take a look at Caliper CI, I put out version 2.0 yesterday as a Jenkins plugin.
I don't know of any separate tools to handle this, but JUnit has an optional timeout parameter on the @Test annotation:

The second optional parameter, timeout, causes a test to fail if it takes longer than a specified amount of clock time (measured in milliseconds). The following test fails:

@Test(timeout = 100)
public void infinity() {
    while (true);
}
So, you could write additional unit-tests to check that certain parts work "fast enough". Of course, you'd need to somehow first decide what is the maximum amount of time a particular task should take to run.
If the second question is relevant, then here are the issues that I see:
Variability depending on the environment it is run on.
There will always be some variability, but to minimize it, I'd use Hudson or a similar automated build-and-test server to run the tests, so the environment would be the same each time (of course, if the server running Hudson also does other tasks, those tasks could still affect the results). You'd need to take this into account when deciding the maximum running time for the tests (leave some headroom, so that if a test takes, say, 5% longer than usual, it still won't fail straight away).
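The headroom idea can be captured directly in the assertion: record a baseline once, then fail only when the current run exceeds the baseline by more than some tolerance. A minimal sketch, where the 20% margin is an arbitrary illustration you'd tune against your own build server's variance:

```java
// Sketch of a "fail only outside the headroom" check. The 20% margin is
// an arbitrary illustration -- tune it against the run-to-run variance
// you actually observe on your build server.
public class HeadroomCheck {
    static final double HEADROOM = 0.20; // allow 20% slack over baseline

    // True if the elapsed time is within the tolerated budget.
    static boolean withinBudget(long baselineMillis, long elapsedMillis) {
        return elapsedMillis <= baselineMillis * (1.0 + HEADROOM);
    }
}
```

A timing test would then measure its workload and assert withinBudget(baseline, elapsed) instead of comparing against a hard-coded limit.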
How to detect changes, since microbenchmarks in Java have a large variance.
Microbenchmarks in Java are rarely reliable; I'd test larger chunks with integration tests (such as handling a single HTTP request, or whatever you have) and measure the total time. If the test fails because it takes too long, isolate the problematic code by profiling, or measure and log the running time of the separate parts of the test during the test run to see which part takes the most time.
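Logging the running time of the separate parts needs nothing more than System.nanoTime around each phase. A sketch, where the phase names and workloads are stand-ins for the real steps of an integration test:

```java
// Times each phase of a larger test separately, so a slow run can be
// narrowed down without a profiler. The phase bodies here are stand-ins
// for real work (parsing, business logic, rendering, ...).
public class PhaseTimer {

    // Runs a phase and returns its elapsed wall-clock time in milliseconds.
    static long timeMillis(Runnable phase) {
        long start = System.nanoTime();
        phase.run();
        return (System.nanoTime() - start) / 1_000_000;
    }

    // Stand-in workload so the example has something to measure.
    static void busyWork(int iterations) {
        long acc = 0;
        for (int i = 0; i < iterations; i++) acc += i;
        if (acc == -1) throw new IllegalStateException(); // keep the loop from being optimized away
    }

    public static void main(String[] args) {
        long parse = timeMillis(() -> busyWork(50_000));
        long logic = timeMillis(() -> busyWork(5_000_000));
        System.out.println("parse=" + parse + "ms logic=" + logic + "ms");
    }
}
```

Logging these per-phase numbers on every CI run gives you a crude but useful history: when the total time jumps, the log already tells you which phase moved.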
If Caliper collects the results, how to get the results out of Caliper so that they can be saved in a custom format. Caliper's documentation is lacking.
Unfortunately, I don't know anything about Caliper.
I need to measure the performance of my program unit. I am using Hibernate as the ORM tool. I want a tool that can measure the time taken per method invoked, excluding the database loads.
Please help
This is what a profiler does. VisualVM is a free one, but if you want more detail on the timings and behaviour of JDBC queries, I suggest you look at YourKit, which can analyse the queries in more depth.
JConsole is a graphical monitoring tool for the Java Virtual Machine and Java applications, on both local and remote machines.
http://java.sun.com/developer/technicalArticles/J2SE/jconsole.html
For a quick and dirty hack, you can use http://www.clarkware.com/software/JUnitPerf.html, a JUnit-based microbenchmark framework. You can write your code, mock out the database (or even use a real one), and run it to get a benchmark. Such a benchmark is really only good for testing a single method (or very few) in a specific use, not as a general profiler.
Your question isn't quite clear to me. Do you want to know which part of your application takes the time, or do you want to observe the time a certain part of your code takes? In the first case, use a proper profiler. VisualVM and YourKit are both fine profilers; I've used them before and found them very helpful. In the latter case, I would try a tool like Perf4J, which allows you to annotate a method and observe its average runtime, its standard deviation and other things, in real time or afterwards.
Can anyone point me to a project out there that I can download and run, which would load/stress test itself and then provide me with reports? I want the project to be as big as possible and to involve as many Java components as possible. It also needs to be free... or else point me to some good, already-published results on the web that I can look at to make a decision. Thanks!
The main issue to benchmark is which would run it faster/better: Solaris or Linux.
Linux and Solaris are not much different from your perspective, and I do not believe that the benchmark you ask for exists. A much better approach is to take the application you want to run (which should hopefully be platform independent already), deploy it to the architectures you want to test, then attach with jvisualvm and apply your standard test suite.
This will give you quite a good look at the performance without skewing with heavy profiling.
My guess is that for identical configurations you will see that Linux is slightly better than Solaris, as the amount of unused memory available for disk caching strongly influences the performance of the system. Note that expert system tuning can also make a big difference, but I believe you are most interested in the "out of the box" performance.
It really depends on the aspects of usage you want to benchmark.
I did this for database applications; in that area TPC could be helpful.
I would recommend googling: benchmark java numeric|transaction|rendering|olap, depending on the characteristics of your use case.
Edit: regarding your comment about a Java app running on an application server: first check from the backend DB server what the maximal throughput is (TPC can help here), then write a multithreaded benchmark client to check the business logic's performance. The last step would be to involve the web servers using Apache JMeter. This procedure allows you to tune all the relevant parameters, from the OS down to DB pool sizes, etc.