Failed TestNG tests with DataProvider in IntelliJ IDEA - java

I recently started playing around with TDD and ran into a problem where I do not understand why one thing works and the other doesn't.
The following code works for me:
public class Ant {
    private Point currentLocation;
    private Point hive;

    public Ant(Point startLocation, Point hive) {
        this.currentLocation = new Point(startLocation);
        this.hive = new Point(hive);
    }

    public Point getCurrentLocation() {
        return currentLocation;
    }

    public void goHome() {
        if (hive.x > currentLocation.x) {
            currentLocation.x++;
        } else if (hive.x < currentLocation.x) {
            currentLocation.x--;
        }
        if (hive.y > currentLocation.y) {
            currentLocation.y++;
        } else if (hive.y < currentLocation.y) {
            currentLocation.y--;
        }
    }
}
The corresponding test:
@DataProvider(name = "goneHome")
public static Object[][] goHome() {
    return new Object[][] {
        {new Point(2, 1), new Point(3, 2), new Point(7, 8)},
        {new Point(20, 1), new Point(19, 2), new Point(7, 8)},
        {new Point(23, 10), new Point(22, 9), new Point(7, 8)},
        {new Point(2, 10), new Point(3, 9), new Point(7, 8)},
        {new Point(2, 8), new Point(3, 8), new Point(7, 8)},
        {new Point(7, 1), new Point(7, 2), new Point(7, 8)}
    };
}
@Test(dataProvider = "goneHome")
public void testGoHome(Point currentPosition, Point nextPosition, Point hive) throws Exception {
    Ant ant = new Ant(currentPosition, hive);
    ant.goHome();
    assertEquals(ant.getCurrentLocation(), nextPosition);
}
The test fails if I change the Ant constructor like this:
public Ant(Point startLocation, Point hive) {
    this.currentLocation = startLocation;
    this.hive = hive;
}
By failing I mean that the tests for the first two data sets from the DataProvider work correctly; the rest fail or never finish.
I am not quite sure what exactly failed, though. If I remove the first two data sets from the DataProvider, again only the first two remaining data sets (which were the 3rd and 4th before) do not fail.
I use IntelliJ, and the symbol beside the "failed" test is still the loading icon.
Debugging each single test case shows that the points are set correctly, and removing the assert from the test does not change anything.
Can someone explain this behavior to me, please?
Thanks in advance
Egon
Edit: corrected the version of the constructor that failed

Maybe it's a bug in IntelliJ IDEA. I sometimes run into this problem as well. Unfortunately, it is still (2014-11-24) unresolved: https://youtrack.jetbrains.com/issue/IDEA-100752
Try running your tests with an alternate runner (as a Maven goal, for instance).
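For what it's worth, and this is my own reading rather than anything confirmed in that issue: the variant that misbehaves is the one where the Ant keeps aliases to the DataProvider's Point objects, so goHome() mutates the very parameters TestNG passed in, and a runner that keys its test-tree nodes off those parameter values can plausibly get confused by that. A minimal, self-contained sketch of the aliasing (assuming Point is java.awt.Point, which matches the copy constructor and public x/y fields used above):

public class AliasingDemo {
    public static void main(String[] args) {
        java.awt.Point start = new java.awt.Point(2, 1);

        // what the non-copying constructor does: keep a reference, not a copy
        java.awt.Point currentLocation = start;

        // what goHome() then does: mutate that shared object in place
        currentLocation.x++;
        currentLocation.y++;

        // the DataProvider's Point has now changed under the test runner's feet
        System.out.println(start); // java.awt.Point[x=3,y=2]
    }
}

Keeping the defensive copies in the constructor (the working version above) avoids that entirely, independent of which runner you use.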

Related

Failed to load the sqljdbc_auth.dll when running multiple JBehave Stories via Maven

I'm working with a Maven project that has three modules:
Database
Module that contains a JBehave Story (Story .java, .story, and steps .java file)
Another module that contains a JBehave Story (Story .java, .story, and steps .java file)
Both of the modules that contain a JBehave Story have the same type of .java file that runs the .story file and steps. Below is the .java file that both modules have (the classes just have different names for test purposes):
public class FirstStories extends ConfigurableEmbedder {
    private SqlDataSourceProvider dataSourceProvider = new SqlDataSourceProvider();
    private final CrossReference xref = new CrossReference();
    private Context context = new Context();
    private Format contextFormat = new ContextOutput(context);
    private ContextView contextView = new JFrameContextView().sized(640, 120);
    private ContextStepMonitor contextStepMonitor = new ContextStepMonitor(context, contextView, xref.getStepMonitor());

    public FirstStories() {
        System.setProperty("jbehave.test", "true");
        configuredEmbedder().embedderControls().doGenerateViewAfterStories(true).doIgnoreFailureInStories(false)
                .doIgnoreFailureInView(true).doVerboseFailures(true).useThreads(1).useStoryTimeouts("5m");
        configuredEmbedder().useEmbedderControls(new PropertyBasedEmbedderControls());
    }

    @Test
    @Override
    public void run() throws Throwable {
        Embedder embedder = configuredEmbedder();
        try {
            embedder.runStoriesAsPaths(storyPaths());
        } finally {
            embedder.generateCrossReference();
        }
    }

    @Override
    public Configuration configuration() {
        Properties viewResources = new Properties();
        viewResources.put("decorateNonHtml", "true");
        viewResources.put("reports", "ftl/jbehave-reports-with-totals.ftl");
        return new MostUsefulConfiguration()
                .useStoryReporterBuilder(
                        new StoryReporterBuilder()
                                .withDefaultFormats() //.withViewResources(viewResources)
                                .withFormats(contextFormat, CONSOLE, TXT, HTML_TEMPLATE, XML_TEMPLATE).withFailureTrace(true)
                                .withFailureTraceCompression(true).withCrossReference(xref))
                .useStepMonitor(contextStepMonitor);
    }

    @Override
    public InjectableStepsFactory stepsFactory() {
        return new InstanceStepsFactory(configuration(), new Steps(dataSourceProvider));
    }

    private List<String> storyPaths() {
        String filter = System.getProperty("story.filter", "**/*.story");
        return new StoryFinder().findPaths(codeLocationFromClass(this.getClass()), filter, "**/failing_before*.story");
    }
}
The .story file is very straightforward and only has one scenario:
Meta:
Narrative:
As a user
I want to perform an action
So that I can achieve a business goal
Scenario: Test scenario
Given nothing
When I do nothing
Then nothing happens
The steps file only contains one no-op method just to get everything working properly.
When running both JBehave tests via Maven, the first story runs just fine. However, when the second story starts up, the following message appears and the test fails shortly afterwards (I can run the second story on its own without issues; it only fails when it runs after the first story):
WARNING: Failed to load the sqljdbc_auth.dll cause : Native Library C:\Windows\System32\sqljdbc_auth.dll already loaded in another classloader
java.sql.SQLException: An attempt by a client to checkout a Connection has timed out.
Is there something I am forgetting to do during my story's run() method to make sure everything is properly destroyed after the story is finished running, so the next story can run correctly without problems?
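One thing worth double-checking, purely as an assumption on my side since the question does not show how SqlDataSourceProvider manages its connections: if each story module builds its own connection pool and never releases it, the second run can hit exactly this checkout timeout. A sketch of shutting the pool down once the stories have finished, where close() is a hypothetical method on the provider:

@Test
@Override
public void run() throws Throwable {
    Embedder embedder = configuredEmbedder();
    try {
        embedder.runStoriesAsPaths(storyPaths());
    } finally {
        embedder.generateCrossReference();
        // hypothetical cleanup hook: release pooled connections so the next
        // module's story does not time out waiting for a connection checkout
        dataSourceProvider.close();
    }
}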

WebDriver datadriven (using TestNG) scripts takes a long time to start

I have extended Selenium using the Java WebDriver library and the TestNG framework. When running test scripts, I notice that a test takes an inordinate amount of time to start executing when it takes its input parameters from an Excel file (using the @DataProvider annotation).
The delay can amount to about 10 minutes, which makes the tests time-consuming to run and debug. Is there a reason for this significant delay?
Yes, it could be because of the way you are reading from Excel (a greedy data provider), and it depends on how big your Excel file is. There is something called a lazy data provider. I found an example of one here. Posting the code from the link.
For a better understanding I would need to see your code.
import java.util.Iterator;

import org.testng.Reporter;
import org.testng.annotations.DataProvider;
import org.testng.annotations.Test;

public class LazyDataProviderExample {
    @Test(dataProvider = "data-source")
    public void myTestMethod(String info) {
        Reporter.log("Data provided was :" + info, true);
    }

    @DataProvider(name = "data-source")
    public Iterator<Object[]> dataOneByOne() {
        return new MyData();
    }

    private static class MyData implements Iterator<Object[]> {
        private String[] data = new String[] { "Java", "TestNG", "JUnit" };
        private int index = 0;

        @Override
        public boolean hasNext() {
            return (index <= (data.length - 1));
        }

        @Override
        public Object[] next() {
            return new Object[] { data[index++] };
        }

        @Override
        public void remove() {
            throw new UnsupportedOperationException("Removal of items is not supported");
        }
    }
}
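If the slow part really is reading the whole spreadsheet up front, the same lazy pattern can be applied to the Excel source. Here is a sketch assuming Apache POI is on the classpath; the file name and column layout are made up for illustration:

import java.io.File;
import java.util.Iterator;

import org.apache.poi.ss.usermodel.Row;
import org.apache.poi.ss.usermodel.Sheet;
import org.apache.poi.ss.usermodel.Workbook;
import org.apache.poi.ss.usermodel.WorkbookFactory;
import org.testng.annotations.DataProvider;

public class LazyExcelDataProvider {

    @DataProvider(name = "excel-rows")
    public Iterator<Object[]> rows() throws Exception {
        // assumed file name and layout, purely for the sketch
        Workbook workbook = WorkbookFactory.create(new File("testdata.xlsx"));
        Sheet sheet = workbook.getSheetAt(0);
        final Iterator<Row> rowIterator = sheet.iterator();

        return new Iterator<Object[]>() {
            @Override
            public boolean hasNext() {
                return rowIterator.hasNext();
            }

            @Override
            public Object[] next() {
                // a row is converted only when TestNG asks for it
                Row row = rowIterator.next();
                return new Object[] { row.getCell(0).getStringCellValue() };
            }

            @Override
            public void remove() {
                throw new UnsupportedOperationException();
            }
        };
    }
}

Skipping a header row and closing the workbook once the iterator is exhausted are left out to keep the sketch short.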
For some reason, this issue was resolved by rebuilding my custom Firefox profile - it may have gotten corrupted.
Just posting this as an answer for reference, in case anyone is bogged down by this issue.

JUnit Test Suite: Way to create a dataset first before tests start running

I want to set up data for my entire test suite before any of the tests start running. I understand Maven runs the tests one by one and not as a suite, so I cannot use @SuiteClasses. Also, I don't want to create the dataset through the dbunit-maven-plugin; the dataset has to be created over REST. Is there a way I can run specific classes as part of Maven's pre-integration-test and post-integration-test phases to set up and clean up?
For example
public class TestInit {
    public void setUp() {
        // Data setup
    }

    public void tearDown() {
        // Data clean up
    }
}
I want setUp to run before the test suite starts and tearDown to run after it ends. Or can I run two separate classes, like TestInitSetup and TestInitTearDown?
Here is a Rule-based solution. It may be useful.
The syntax looks like this:
public class SimpleWayToUseDataSetTest {
    @Rule
    public DataSetRule rule = new DataSetRule(); // <-- this is used to access the testVectors from inside the tests

    public static class MyDataSet extends SimpleTestVectors {
        @Override
        protected Object[][] generateTestVectors() {
            return new Object[][] {
                {true, "alpha", new CustomProductionClass()}, // <-- this is a testVector
                {true, "bravo", new CustomProductionClass()},
                {false, "alpha", new CustomProductionClass()},
                {false, "bravo", new CustomProductionClass()}
            };
        }
    }

    @Test
    @DataSet(testData = MyDataSet.class) // <-- annotate the test with the dataset
    public void testFirst() throws InvalidDataSetException { // <-- any access to testData may result in Exception
        boolean myTextFixture = rule.getBoolean(0); // <-- this is how you access an element of the testVector. Indexing starts with 0
        String myAssertMessage = rule.getString(1); // <-- there are a couple of typed parameter getters
        CustomProductionClass myCustomObject = (CustomProductionClass) rule.getParameter(2); // <-- for other classes you need to cast
        Assert.assertTrue(myAssertMessage, true);
    }
}
If you can't find a solution in JUnit, TestNG supports @BeforeSuite and @AfterSuite, which seem to do what you want.
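A minimal sketch of that TestNG approach (the class and method names are just placeholders for the REST-based setup and cleanup described above):

import org.testng.annotations.AfterSuite;
import org.testng.annotations.BeforeSuite;

public class SuiteDataSetup {

    @BeforeSuite
    public void createDataSet() {
        // runs once before any test in the suite;
        // this is where the dataset would be created over REST
    }

    @AfterSuite
    public void cleanUpDataSet() {
        // runs once after every test in the suite has finished
    }
}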

Junit Dynamically Created Tests Not Working

public class NewTest extends SeleneseTestCase {
    public static Test suite() throws Exception {
        TestSuite suite = new TestSuite();

        TestSuite s = new TestSuite("TestCase Name");
        GeneratedTest t = new GeneratedTest("testName");
        t.setFailure("TestCase Name: testName");
        s.addTest(t);
        t = new GeneratedTest("testAge");
        s.addTest(t);
        suite.addTest(s);

        s = new TestSuite("TestCase Name2");
        t = new GeneratedTest("testOOGABOOGA");
        t.setFailure("TestCase Name2: testOOGABOOGA");
        s.addTest(t);
        suite.addTest(s);

        s = new TestSuite("TestCase Name4");
        t = new GeneratedTest("testName");
        t.setFailure("TestCase Name4: testName");
        s.addTest(t);
        t = new GeneratedTest("testAge");
        s.addTest(t);
        suite.addTest(s);

        s = new TestSuite("TestCase Name3");
        t = new GeneratedTest("testName");
        t.setFailure("TestCase Name3: testName");
        s.addTest(t);
        t = new GeneratedTest("testAge");
        s.addTest(t);
        suite.addTest(s);

        return suite;
    }
}
public class GeneratedTest extends TestCase {
    public String testFailMessage;

    public GeneratedTest(String name) {
        ((TestCase) this).setName(name);
    }

    public void runTest() {
        if (testFailMessage != null) {
            fail(testFailMessage);
        }
    }

    public void setFailure(String msg) {
        testFailMessage = msg;
    }
}
As you can see (or maybe you can't), I'm adding tests to JUnit at runtime. This is all fine and dandy, except that they aren't displayed properly. Here, see what I mean:
click here for image
As you can see, tests with the same name don't even display that they've been run, except for the last test with a duplicate name, and that test has the error messages from all the other tests with the same name.
Is this simply a flaw in the way I'm doing it (JUnit 3 style)? Would I have to change it to use JUnit 4 parameterization to fix it?
I noticed something similar in Eclipse's test runner. For JUnit 3.8 style parameterized tests, the names were not being displayed. Switching to JUnit 4 style solved the problem.
While this isn't exactly your scenario, I think it is something you'll have to live with until you can update the tests to JUnit 4. Eclipse does still run the tests, which is the important thing.
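For reference, a minimal sketch of what the JUnit 4 style looks like; the test data here is made up rather than a port of the generated suite above, and the name = "{0}" attribute needs JUnit 4.11 or newer:

import static org.junit.Assert.assertNotNull;

import java.util.Arrays;
import java.util.Collection;

import org.junit.Test;
import org.junit.runner.RunWith;
import org.junit.runners.Parameterized;
import org.junit.runners.Parameterized.Parameters;

@RunWith(Parameterized.class)
public class GeneratedStyleTest {

    // name = "{0}" makes each data row show up as its own entry in the runner
    @Parameters(name = "{0}")
    public static Collection<Object[]> data() {
        return Arrays.asList(new Object[][] {
            { "testName", "expected name" },
            { "testAge", "expected age" }
        });
    }

    private final String caseName;
    private final String expected;

    public GeneratedStyleTest(String caseName, String expected) {
        this.caseName = caseName;
        this.expected = expected;
    }

    @Test
    public void runGeneratedCase() {
        // placeholder assertion; real checks would go here
        assertNotNull(caseName);
        assertNotNull(expected);
    }
}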

Where should I initialize variables for an OO Recursive Descent Parse Tree?

I'd like to preface this by stating that this is for a class, so please don't solve this for me.
One of the labs for my CSE class is creating an interpreter for a BNF grammar that was provided. I understand most of the concepts, but I'm trying to build up my tree and I'm unsure where to initialize values. I've tried both in the constructor and in the methods, but Eclipse's debugger still only shows the left branch, even though it runs through completely.
Here is my main procedure so you can get an idea of how I'm calling the methods.
public class Parser {
    public static void main(String[] args) throws IOException {
        FileTokenizer instance = FileTokenizer.Instance();
        FileTokenizer.main(args);
        Prog prog = new Prog();
        prog.ParseProg();
        prog.PrintProg();
        prog.ExecProg();
    }
}
Now here is my Prog class:
public class Prog {
    private DeclSeq ds;
    private StmtSeq ss;

    Prog() {
        ds = new DeclSeq();
        ss = new StmtSeq();
    }

    public void ParseProg() {
        FileTokenizer instance = FileTokenizer.Instance();
        instance.skipToken(); // Skips program (1)
        // ds = new DeclSeq();
        ds.ParseDS();
        instance.skipToken(); // Skips begin (2)
        // ss = new StmtSeq();
        ss.ParseSS();
        instance.skipToken();
    }
I've tried having
Prog() {
    ds = null;
    ss = null;
}

public void ParseProg() {
    FileTokenizer instance = FileTokenizer.Instance();
    instance.skipToken(); // Skips program (1)
    ds = new DeclSeq();
    ds.ParseDS();
    ...
But it gave me the same error. I need the parse tree built up so I can do a pretty print and an execute command, but like I said, I only get the left branch.
Any help would be appreciated; explanations of why are appreciated even more.
Thank you,
Vasto
Turns out that my issue was in DeclSeq and StmtSeq.
I was declaring variables inside a while loop, thereby losing them after the loop exited. DOH
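For anyone hitting the same thing: the underlying issue is plain Java scoping, since anything created only inside the loop body is unreachable once the iteration ends, so parsed nodes have to be stored in something that outlives the loop (a field or a collection declared outside it). A tiny self-contained sketch of the difference (the names here are made up, not the lab's classes):

import java.util.ArrayList;
import java.util.List;

public class LoopScopeSketch {

    public static void main(String[] args) {
        // Buggy shape: the container is re-created on every iteration,
        // so only the last iteration's contents survive the loop.
        List<String> lost = null;
        for (int i = 0; i < 3; i++) {
            List<String> nodes = new ArrayList<String>();
            nodes.add("node" + i);
            lost = nodes;
        }
        System.out.println(lost);   // [node2] - earlier nodes are gone

        // Fixed shape: declare the container once, outside the loop,
        // and let each iteration add to it.
        List<String> kept = new ArrayList<String>();
        for (int i = 0; i < 3; i++) {
            kept.add("node" + i);
        }
        System.out.println(kept);   // [node0, node1, node2]
    }
}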
