I'm running mvn test and, based on the generated XML results file, I'm preparing a results summary.
Right now I'm only able to handle passing and failing test cases.
If there is an issue in the code itself, mvn test stops with a compilation error.
Can compilation errors be logged in the surefire-reports as well?
Expected outcome: include the compilation-error case in the result below.
-------------------------------------------------------------------------------
Test set: com.example.helloworld.HelloWorldApplicationTests
-------------------------------------------------------------------------------
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.666 s - in com.example.helloworld.HelloWorldApplicationTests
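Note that a compilation error aborts the build during the test-compile phase, before Surefire ever runs, so it never produces a surefire-reports XML file; a summary script can only see actual test results. As a sketch of the summary side (the class name and sample XML below are hypothetical, but the `tests`/`failures`/`errors`/`skipped` attributes are the ones Surefire writes), parsing the report might look like:

```java
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Element;
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;

public class SurefireSummary {
    // Parses the <testsuite> root of a Surefire XML report and returns
    // a one-line summary like the console output quoted above.
    static String summarize(String reportXml) throws Exception {
        Element suite = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new ByteArrayInputStream(reportXml.getBytes(StandardCharsets.UTF_8)))
                .getDocumentElement();
        return String.format("Tests run: %s, Failures: %s, Errors: %s, Skipped: %s",
                suite.getAttribute("tests"), suite.getAttribute("failures"),
                suite.getAttribute("errors"), suite.getAttribute("skipped"));
    }

    public static void main(String[] args) throws Exception {
        // Hypothetical minimal report for demonstration.
        String sample = "<testsuite name=\"com.example.helloworld.HelloWorldApplicationTests\""
                + " tests=\"1\" failures=\"0\" errors=\"0\" skipped=\"0\" time=\"1.666\"/>";
        System.out.println(summarize(sample));
        // prints: Tests run: 1, Failures: 0, Errors: 0, Skipped: 0
    }
}
```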
I am executing some Selenium tests and wanted to add Shutterbug to my project for screenshots. I added the Maven dependency:
<dependency>
<groupId>com.assertthat</groupId>
<artifactId>selenium-shutterbug</artifactId>
<version>1.3</version>
</dependency>
and started to code. It works as expected locally, but when I ran it on Jenkins, it threw a NoClassDefFoundError:
[INFO] Results:
[INFO]
[ERROR] Errors:
[ERROR] SomeTest.test:38 » NoClassDefFound com/assertthat/selenium_shutterbug/util...
[INFO]
[ERROR] Tests run: 1, Failures: 0, Errors: 1, Skipped: 0
I have deleted all the code related to this library and now have only the dependency in pom.xml. It still fails. I found out that the failure occurs when I run it on Jenkins, or when I run it locally in headless mode. How can I get it to work?
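A first diagnostic step (a sketch, not a guaranteed fix) is to confirm on the Jenkins node that the artifact is actually resolved there and lands on the test classpath:

```shell
# Run these in the Jenkins workspace to verify the dependency resolves on that node.
mvn dependency:tree -Dincludes=com.assertthat
# Force a re-download in case a cached artifact in the node's local repository is corrupt.
mvn dependency:resolve -U
```

If the jar shows up in the tree on Jenkins, the difference is more likely environmental (e.g. a provided/test scope mismatch or a plugin forking a JVM with a different classpath) than a missing dependency.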
I followed the EvoSuite Maven documentation to generate unit test cases for my project. Below is the command I used:
mvn -DmemoryInMB=2000 -Dcores=2 evosuite:generate evosuite:export test
The tool takes around 1 hour to generate test cases for my project (about 9700 lines of Java code). However, when it proceeds to the mvn test phase, all test cases fail with the same message:
Tests run: 1, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 0 sec <<< FAILURE! - in com.xxx.yyy.util.Messages_ESTest
com.xxx.yyy.util.Messages_ESTest Time elapsed: 0 sec <<< ERROR!
java.lang.IllegalStateException: Trying to set up the sandbox while executing a test case
at <evosuite>.<evosuite>(<evosuite>)
at <evosuite>.<evosuite>(<evosuite>)
at <evosuite>.<evosuite>(<evosuite>)
When I run the generated test cases in IntelliJ IDEA, most of them pass.
Does anyone have any idea?
This is clearly a bug, which would be best reported at
https://github.com/EvoSuite/evosuite/issues
Furthermore, as of 1.0.3 the tests generated by EvoSuite do not work well with "mvn test". The fix is already in the SNAPSHOT, though, and will be part of the next release. However, the "Trying to set up the sandbox" error might be a new bug, as I haven't seen it before... :(
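If you want to try the SNAPSHOT fix mentioned above before the release, pinning the plugin version in the pom might look like the sketch below; the exact version string is an assumption, so check the EvoSuite documentation for the current coordinates:

```xml
<!-- Sketch: pin the EvoSuite Maven plugin to the SNAPSHOT carrying the fix.
     The version here is hypothetical. -->
<plugin>
  <groupId>org.evosuite.plugins</groupId>
  <artifactId>evosuite-maven-plugin</artifactId>
  <version>1.0.4-SNAPSHOT</version>
</plugin>
```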
My project: testng-surefire-maven.
In one of the modules I run mvn clean install.
When all tests are green, I have a result:
Tests run: 277, Failures: 0, Errors: 0, Skipped: 0
Then I make an intentional mistake in each of the 3 tests I am currently refactoring, one at a time. As a result I get 3 completely different outputs:
Test1>AbstractTestNGSpringContextTests.springTestContextPrepareTestInstance:149 » BeanCreation
Tests run: 344, Failures: 1, Errors: 0, Skipped: 100
Test2>AbstractTestNGSpringContextTests.springTestContextPrepareTestInstance:149 » BeanCreation
Tests run: 282, Failures: 1, Errors: 0, Skipped: 8
Test3>AbstractTestNGSpringContextTests.springTestContextPrepareTestInstance:149 » BeanCreation
Tests run: 416, Failures: 1, Errors: 0, Skipped: 205
How is that possible?
All I did was make a one-line change in one of the test classes, each in turn. I didn't touch testng.xml or pom.xml.
Additionally, if I make a mistake in all 3 of them simultaneously, only one shows up. I didn't set a custom skipAfterFailureCount in Surefire or any other TestNG property. Why doesn't it run through all of them and show me the list of all failing tests at once? All tests are in the same package.
OK, I don't know what influences the test count, but I can answer the second part of my question: why only one failure shows up when all 3 tests are broken simultaneously.
It happens because TestNG has a property, configfailurepolicy: whether TestNG should continue to execute the remaining tests in the suite or skip them if a @Before* method fails. The default behavior is skip.
And that's what happened in my case: I had problems at the test-initialization stage, not in the test method itself.
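To have all configuration failures reported at once instead of skipping the rest of the suite, the policy can be switched in testng.xml; the suite and package names below are placeholders:

```xml
<!-- configfailurepolicy="continue" tells TestNG to keep running the remaining
     tests even when a @Before* configuration method fails. -->
<suite name="MySuite" configfailurepolicy="continue">
  <test name="AllTests">
    <packages>
      <package name="com.example.tests"/>
    </packages>
  </test>
</suite>
```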
I'm applying the BDD methodology using Cucumber, which is GREAT!
The problem is that my test suite keeps getting bigger, and now I get the following exception, which fails my tests for the wrong reason...
I'm using all sorts of Cucumber features, such as Background, Scenario Outline, and simple scenarios.
I run the tests like this:
@RunWith(Cucumber.class)
@Cucumber.Options(features={"...../controller1"})
public class RunCukes1Test {
}
I split my feature files to different directories (controller1, controller2...) and runners (RunCukes1Test, RunCukes2Test...), but this didn't help.
When I run each test by itself everything is OK, but when I run them all through the Maven test lifecycle, it fails. Does anyone know of any best practices in Java Cucumber to avoid problems like this?
Tests run: 5896, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 33.082 sec
Running com.kenshoo.urlbuilder.appservice.controller.RunCukes4Test
Tests run: 11838, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 80.833 sec
Exception in thread "Thread-73" java.lang.OutOfMemoryError: Java heap space
at java.util.Arrays.copyOf(Arrays.java:2882)
at java.lang.AbstractStringBuilder.expandCapacity(AbstractStringBuilder.java:100)
at java.lang.AbstractStringBuilder.append(AbstractStringBuilder.java:515)
at java.lang.StringBuffer.append(StringBuffer.java:306)
at java.io.BufferedReader.readLine(BufferedReader.java:345)
at java.io.BufferedReader.readLine(BufferedReader.java:362)
at org.codehaus.plexus.util.cli.StreamPumper.run(StreamPumper.java:129)
Exception in thread "ThreadedStreamConsumer" java.lang.OutOfMemoryError: Java heap space
at java.util.Arrays.copyOfRange(Arrays.java:3209)
at java.lang.String.<init>(String.java:215)
at java.lang.StringBuffer.toString(StringBuffer.java:585)
at org.apache.maven.surefire.report.PrettyPrintXMLWriter.escapeXml(PrettyPrintXMLWriter.java:167)
at org.apache.maven.surefire.report.PrettyPrintXMLWriter.addAttribute(PrettyPrintXMLWriter.java:178)
at org.apache.maven.surefire.shade.org.codehaus.plexus.util.xml.Xpp3DomWriter.write(Xpp3DomWriter.java:50)
at org.apache.maven.surefire.shade.org.codehaus.plexus.util.xml.Xpp3DomWriter.write(Xpp3DomWriter.java:55)
at org.apache.maven.surefire.shade.org.codehaus.plexus.util.xml.Xpp3DomWriter.write(Xpp3DomWriter.java:39)
at org.apache.maven.surefire.report.XMLReporter.testSetCompleted(XMLReporter.java:128)
at org.apache.maven.surefire.report.MulticastingReporter.testSetCompleted(MulticastingReporter.java:51)
at org.apache.maven.surefire.report.TestSetRunListener.testSetCompleted(TestSetRunListener.java:115)
at org.apache.maven.plugin.surefire.booterclient.output.ForkClient.consumeLine(ForkClient.java:97)
at org.apache.maven.plugin.surefire.booterclient.output.ThreadedStreamConsumer$Pumper.run(ThreadedStreamConsumer.java:67)
at java.lang.Thread.run(Thread.java:662)
Results :
Tests run: 11790, Failures: 0, Errors: 0, Skipped: 0
I got an answer to another Java heap space exception I had after the Cucumber tests ran.
You can see it here - related problem
My theory is that -XX:MaxPermSize is a factor during the Cucumber run, since Cucumber generates test code and PermGen size depends on the amount of loaded code, as described in what is permsize in java.
-Xmx is a factor after the Cucumber run, while the test results are being parsed.
So the solution is to find the balance between the two within the actually available memory.
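That balance can be expressed for the forked Surefire JVM via argLine in the pom; the sizes below are placeholder values to tune against available memory (and -XX:MaxPermSize only applies to pre-Java-8 JVMs, where PermGen still exists):

```xml
<!-- Sketch: give the forked test JVM an explicit heap/PermGen split.
     Values are hypothetical starting points, not recommendations. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <configuration>
    <argLine>-Xmx1024m -XX:MaxPermSize=256m</argLine>
  </configuration>
</plugin>
```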
I've got a Jenkins CI server set up with a Selenium test project running under maven-surefire. I need the project to be a parameterized build, so that I can trigger it via URL with -Dtest as a parameter (and only run the tests I specify in the URL). This works great.
Unfortunately, I've been unable to figure out how to run ALL of the tests while in this parameterized configuration. Since the build is parameterized, I must ALWAYS specify the -Dtest parameter.
Based on the Surefire documentation, it seems like I should be able to wildcard the test names, and everything will be run:
-Dtest=* or -Dtest=Test*
The odd result of running with these parameters is a print statement (that I created) from all 6 tests, showing that they all started:
"Test <test_name> started, click here to see the SauceLabs video"
And then the standard test result (below) for only 4 of the 6 tests:
Running <test_class_path>
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.048 sec
Followed by the summary:
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0
If it matters, the tests are run in parallel using Surefire. One other odd thing: while the individual test results are printed, after the 4th one the 5th result starts printing but never shows a result, and it includes a $1 at the end:
Running <test_class_path>$1
Please let me know if I can clarify anything or answer any questions.
Thanks in advance for any help!
I think that it's a regular expression:
mvn -Dtest=.*
works for me.
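Whether -Dtest accepts a glob pattern or a full regex depends on the Surefire version, so both forms are worth trying; the class-name patterns below are placeholders (and the quotes keep the shell from expanding the wildcard itself):

```shell
mvn test -Dtest='*Test'    # glob: every class ending in Test
mvn test -Dtest='Test*'    # glob: every class starting with Test
```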