I am interested in how the JDK itself is tested and which test engine it uses.
I found some links:
https://github.com/ddopson/openjdk-test - does not look like an official repository
http://openjdk.java.net/jtreg/ - contains regression tests and points to Jonathan Gibbons' blog, but the blog seems to be unavailable
Also, I would like to see how frameworks like Swing and JavaFX are tested.
Are there any manuals or instructions available on how to run or browse the JDK tests?
OpenJDK comes with its own regression test suite.
The tests can be found in the 'test' subdirectory of the individual repositories making up a JDK forest. For example, javax.swing jtreg tests for JDK 9 can be found at http://hg.openjdk.java.net/jdk9/jdk9/jdk/file/tip/test/javax/swing .
These tests are run using the jtreg tool. You can learn more about it here: http://openjdk.java.net/projects/code-tools/jtreg/intro.html
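For example, once you have built the JDK, a single test directory can be run with something roughly like this (the paths are illustrative and depend on where your checkout and build live):
jtreg -verbose:summary -jdk:/path/to/built/jdk jdk/test/javax/swing
jtreg exits with a non-zero status if any test fails, and by default writes its results to JTwork/JTreport directories.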
I think the bulk of the tests are still run through jtreg. The tests themselves are part of the OpenJDK source trees stored in Mercurial.
I'd look at distribution packaging for information on how to run jtreg. I think the distributions run at least a subset of the test suite as part of the build process. I don't know anything in particular about GUI testing; I have never had the need to look at those tests.
I have a Java codebase I need to scan with SonarQube, but when I run the scanner I get:
Please provide compiled classes of your project with sonar.java.binaries property
I don't have the classes; the code I was given wasn't compiled. It's also a pretty complex application and I don't really have time to figure out how to build it myself. Is there a way I can force the analysis to run without any binaries available?
Thanks for any help/ideas!
-Jason
(Also, I ran SonarQube 5.x last year on Java code, and definitely did not have to supply class files for that analysis. I figured this was a new "feature" of version 6, but the documentation says it has been this way since version 4.12 (?!))
You can pass any valid directory as the value of sonar.java.binaries, for example:
mkdir /tmp/empty
mvn sonar:sonar -Dsonar.java.binaries=/tmp/empty
This will bypass the problem raised by the Java analyzer, but keep in mind that the analysis results won't be perfectly accurate.
It's very common to have some false positives when the analyzer doesn't have access to the bytecode binaries.
Good morning everyone,
as you can read in the title, I have some serious performance issues with JUnit 4.12, Ant 1.9.2 and Java 1.6.0_45 (64-bit). I wrote a whole new test framework for my company because the current one is a mess.
We are performing continuous integration tests with Jenkins CI, now using JUnit 4.12, Ant 1.9.2 and the DbUnit 2.5.1-SNAPSHOT library for DB transactions against our Oracle DB (11g). I am tied to these libraries/setups because they are part of my requirements.
So I integrated the new framework into a smaller project (111 tests) and ran the test suite. From the Eclipse IDE (Mars.1) it took 100 s on average.
Now, so I can integrate with Jenkins, I am writing the Ant target. Running the test suite with Ant took much longer (~300 s). Damnit...
I then rewrote the new framework to use TestNG (which the old framework used) and ran the Ant target. This took only half the time.
I researched for hours on Google and found some configuration options for the junit task, like setting forkmode="once" -- nothing changed at all.
I have now given up and am asking more experienced developers for help. Do you have any ideas?
Thanks in advance!
Recently I've upgraded SonarQube from 3.5 to 4.5.4 (LTS) and now a few users are complaining that some reports are missing from their project dashboards. The widgets with missing numbers are: lines of code and complexity. Unit test coverage displays nothing. Other widgets (like technical debt, issues, directory tangle index) display 0, which is also suspicious. The project is in Java and uses the Sonar way profile.
The user does:
mvn clean org.jacoco:jacoco-maven-plugin:prepare-agent install
mvn sonar:sonar -Dsonar.login=login -Dsonar.password=***** -Dcom.sun.jndi.ldap.connect.pool.prefsize=0 -Dcom.sun.jndi.ldap.connect.pool.timeout=3600000
The sonar:sonar step shows "0 files indexed".
The log is huge so I don't want to paste it here, and I could not find anything helpful in it. What do I need to do to get all the reports I used to have?
I have a test project where most of the missing data is displayed "out of the box".
Starting with version 4.3, SonarQube no longer runs automated tests itself. It expects your CI system (Jenkins, for example) to run the tests, generate the JUnit/PMD/JaCoCo/Clover etc. reports, and then tell SonarQube where to find them. (In older versions of SonarQube, this behavior could be achieved by setting the "reuseReports" flag to true.)
If the build is not configured to generate the reports, it will need to be adjusted to do so.
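For illustration, a run that reuses existing reports might look roughly like the following; the property names shown (sonar.junit.reportsPath, sonar.jacoco.reportPath) are the ones used by the SonarQube Java plugin of that era, so double-check them against your plugin version:
mvn sonar:sonar -Dsonar.junit.reportsPath=target/surefire-reports -Dsonar.jacoco.reportPath=target/jacoco.exec
SonarQube then reads whatever report files these properties point to instead of running the tools itself.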
Currently I'm looking at integrating some build processes into my source control (Git hooks specifically). I'm trying to write a pre-commit hook that checks for build errors in my Java project (a medium-large test development project) and rejects commits that would break the build. This is turning out to be rather challenging.
The approach here uses a command-line Eclipse tool to build and output warnings and errors. This does technically work, but it's slow and may cause problems with the Eclipse IDE (I've already had heap allocation errors). I've also looked at solutions using Ant, but those approaches don't seem to offer a simple one-line solution and may still be slow.
My main question: what's the fastest (run-time compilation speed) way to build and validate a Java project, by command line? I'd like a solution that returns 0 with no errors and something else if errors are present, but I'm willing to look at other things.
Let's start with some basics:
Server-side hooks (pre-receive/update) run on the server, not on the client, and there is no working directory by default. You have to make sure that javac is available and is the correct version.
The hook will block the user's terminal until it completes.
Now, how long will it take to check out a fresh copy of your Java project, run Ant, wait for it to compile, and then process the output of the compile? A minute or two? 20 seconds? 10 seconds? Even 10 seconds will feel like forever as you wait for the Git push to complete. And if other users want to push code, they also have to wait.
A better and easier approach is to use a continuous build server like Jenkins. Jenkins is easy to set up (it comes with its own application server built in) and has hundreds of plugins that you can use to help report the health of your project. If a compile fails, Jenkins will email the culprit and whomever else you specify.
We have our Jenkins set up to do Ant builds and Maven builds, and to use either Git or Subversion as the repository (depending upon the project). Jenkins builds the project, keeps the console log, and will fail the build if build.xml fails. At our place, this means I start pestering the developer to fix the problem or to undo their changes. At my last workplace, developers were given 10 minutes to fix the build, or I would undo their changes.
Not only can Jenkins let you know when a build fails, it also has plugins that can report on Java compiler warnings and Javadoc warnings, run FindBugs and PMD, find duplicate lines of code (via CPD, which comes with PMD), and then report everything in a series of graphs. You can also mark builds as unstable (the build completes, but is problematic) or simply fail the build based upon the number of issues found by these tools.
Jenkins can also run unit tests, again graph the results, and then run coverage analysis with JaCoCo, Cobertura, or Emma.
So, take a look at Jenkins. It's easy to set up and will do exactly what you want and more.
Ant. There isn't going to be a "one-line solution". Write an Ant script that compiles the code and fails if there are any errors. It's not easy, but it's the best option. If you do go down this road, see the sketch below.
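A minimal client-side hook sketch could look like the following. It assumes your build.xml already has a compile target; Ant itself exits with a non-zero status when the build fails, so the hook just has to propagate that:
#!/bin/sh
# .git/hooks/pre-commit -- abort the commit if the project does not compile
ant -q compile || {
    echo "Compilation failed - commit aborted" >&2
    exit 1
}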
Out of the choices you mention, Ant is the best. But let's face it, writing XML sucks. My guess is that any build tool will fail and return a non-zero exit code when compilation fails. My favorite is sbt, but there's a bit of a learning curve if you aren't into Scala (and even Scala folks like to complain about sbt). Another great option IMO is Gradle. You write your scripts in Groovy, which is a dynamically typed superset of Java.
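As a quick illustration with Gradle: in a standard Java project, the stock compileJava task from the java plugin already gives you the exit-code behavior you want:
gradle compileJava
echo $?   # 0 when everything compiles, non-zero on compilation errors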
Jenkins may be something you could look at.
I would like to run Atlassian Clover in a production environment (I don't have an issue with the overhead). Does anyone have experience with this, or can you tell me how to do it?
My goal is to get clover reports based on real users actions. I'm using JBoss + JDK 1.5
You can deploy the Clover-instrumented build (along with the coverage.db files generated during instrumentation) to your servers, add a handful of Clover-specific Java options to set it up, and then collect the results, merge them using the Clover merge tool, and generate the reports. See the Clover wiki for detailed instructions.
Please note that by default Clover dumps the coverage data on process termination; you might want to do some research on how to make that happen periodically. Look into the -flushpolicy and -flushinterval options.
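As a rough sketch only (the exact CLI class name and flags differ between Clover versions, so verify against the Clover documentation for your release), the flush behavior is chosen when you instrument the sources:
java -cp clover.jar com.cenqua.clover.CloverInstr -i coverage.db -s src -d build/instr-src --flushpolicy interval --flushinterval 30000
With an interval policy the instrumented code writes coverage data periodically instead of only at JVM shutdown.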
Found the answer finally, thanks all.
After I created the Clover DB (via the command line or the Eclipse integration), I added the following params when starting my app server:
-Djboss.shutdown.forceHalt=false -Dclover.initstring.basedir=/coverage.db
It does the job.
The general strategy would be to use Clover (or Cobertura or a similar tool) when you compile your web application. If you use Maven for your builds, you can use the Cobertura plugin:
http://mojo.codehaus.org/cobertura-maven-plugin/instrument-mojo.html
to add instrumentation easily with the cobertura:instrument goal. You then drop the generated WAR into JBoss just as before.
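A hedged sketch of what that can look like on the command line; whether the instrumented classes actually end up inside the packaged WAR depends on how the plugin is configured, so treat this as a starting point and check the instrument-mojo page linked above:
mvn clean cobertura:instrument package
At runtime the instrumented classes write their coverage data to a cobertura.ser file, which you can later turn into a report.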
You'd probably also be interested in Glassbox:
http://www.glassbox.com/glassbox/Home.html
It doesn't generate code coverage; instead it gives you a high-level report of what's going on and can tell you where you may have bottlenecks.
I haven't used Clover in a long time, but I do use Cobertura (http://cobertura.sourceforge.net/faq.html) for code coverage. Looking at the Cobertura FAQ, it does work with JBoss.