Cucumber with JUnit 5: per-scenario TestContext on parallel execution - java

Just recently I got to experience scenario-level parallel execution with Cucumber + JUnit 5, which works fine and which I intend to use from now on. Previously, since I also use cucumber-spring, I used Spring to manage a single TestContext instance as a bean annotated with @Component, which I reset before every scenario using @Before. Now that scenarios run in parallel, I naturally need a thread-safe solution. My idea was roughly the following:
/* Store the current Scenario in a ThreadLocal */
private final ThreadLocal<Scenario> currentScenario = new ThreadLocal<>();

@Before
public void setup(final Scenario scenario) {
    currentScenario.set(scenario);
}

/* Look up the TestContext in a context map keyed by the scenario id */
private final Map<String, TestContext> contextMap = new HashMap<>();

public TestContext getContext(final Scenario scenario) {
    return contextMap.get(scenario.getId());
}
The thing is, I don't know whether a scenario starts and ends in a single thread, or whether this proposed solution is safe at all. Is there any other way to get access to the current scenario's Scenario instance? Any other solutions for this problem? Thank you very much.

If you're on a recent version of Cucumber, your step definition classes are scenario scoped by default and should not be annotated with @Component.
Each scenario gets a new instance of the step definition class.
So this is safe, even with parallel execution:
private Scenario currentScenario;

@Before
public void setup(final Scenario scenario) {
    currentScenario = scenario;
}
If you have other classes without step definitions that should have a unique instance per scenario, you can combine @Component with @ScenarioScoped.
https://github.com/cucumber/cucumber-jvm/tree/main/cucumber-spring#sharing-state-between-steps
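For illustration, here is a minimal sketch of such a shared state bean. TestContext is the hypothetical class from the question, and the scope annotation shown is io.cucumber.spring.ScenarioScope; the exact annotation name and package may differ between cucumber-spring versions, so check the link above.

import io.cucumber.spring.ScenarioScope;
import org.springframework.stereotype.Component;

// Hypothetical per-scenario state holder: cucumber-spring creates a fresh
// instance for every scenario, so no manual reset or ThreadLocal is needed.
@Component
@ScenarioScope
public class TestContext {

    private String lastResponse;

    public String getLastResponse() {
        return lastResponse;
    }

    public void setLastResponse(final String lastResponse) {
        this.lastResponse = lastResponse;
    }
}

Step definition classes can then take TestContext as a constructor argument; cucumber-spring injects the same instance into every step definition class for the duration of a single scenario.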

Related

Annotating a class so that its every method can wait for X ms after execution

I am writing some JUnit unit tests to test my DynamoDB data accessor object. Here is one of the tests:
....
private static DynamoDBMapper ddbMapper;

@BeforeClass
public static void setup() {
    ddbMapper = DynamoDBClients.getTestingDynamoDBMapper();
}

@Before
public void setupTestItem() {
    Item item = new Item();
    item.setItemID(TEST_COUTSE_ITEM_ID);
    // Create an item with every field populated
    ddbMapper.save(item);
}

@Test
public void test_update_simple_attribute() {
    Item item = ddbMapper.load(Item.class, TEST_ITEM_ID, TEST_ITEM_VARIANT);
    item.setLanguageTag(TEST_LANGUAGE_TAG + "CHANGED");
    ddbMapper.save(item);
    Item updatedItem = ddbMapper.load(Item.class, TEST_ITEM_ID, TEST_ITEM_VARIANT);
    assertEquals(updatedItem.getLanguageTag(), TEST_LANGUAGE_TAG + "CHANGED"); // This field has been changed
}
I have more tests that update the item, assert whether the update got pushed through to DynamoDB, and then read it back.
However, I noticed that when I run these tests more often, I sometimes run into the issue that DynamoDB has not yet fully written the updated data, and when I load it, it still shows the old data. Rerunning the tests usually solves the issue.
I believe that DynamoDB uses an eventual consistency model for writes, so it makes sense that an update might sometimes take a bit longer than the Java execution speed. One way I could mitigate this is to have the JUnit test suspend for 100 ms or so.
But I would have to include code to suspend execution everywhere I call ddbMapper.save() or ddbMapper.delete(), which doesn't seem feasible for all the tests I have and the tests I will write.
It seems like I could tackle this with an annotation-driven approach, perhaps by implementing a class-level annotation @SuspendAtEndOfMethod that affects all of the class's methods. I am wondering if this is possible?
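For illustration only, a minimal sketch of one way to get this effect, assuming JUnit 4 (which the @BeforeClass/@Before annotations above suggest), is a TestRule that sleeps after every test method rather than a true class-level annotation; the rule name and the delay value are made up for this example:

import org.junit.rules.TestRule;
import org.junit.runner.Description;
import org.junit.runners.model.Statement;

// Hypothetical rule: runs the test method, then pauses to give DynamoDB time to settle.
public class SuspendAfterMethodRule implements TestRule {

    private final long pauseMillis;

    public SuspendAfterMethodRule(long pauseMillis) {
        this.pauseMillis = pauseMillis;
    }

    @Override
    public Statement apply(Statement base, Description description) {
        return new Statement() {
            @Override
            public void evaluate() throws Throwable {
                base.evaluate();           // run the actual test method
                Thread.sleep(pauseMillis); // then suspend before the next test starts
            }
        };
    }
}

Each test class (or a shared base class) would then declare the rule once, e.g. @Rule public SuspendAfterMethodRule pause = new SuspendAfterMethodRule(100);, instead of sprinkling sleeps after every save or delete.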

What is the point in unit testing mock returned data?

Consider the scenario where I am mocking certain service and its method.
Employee emp = mock(Employee.class);
when(emp.getName(1)).thenReturn("Jim");
when(emp.getName(2)).thenReturn("Mark");
//assert
assertEquals("Jim", emp.getName(1));
assertEquals("Mark", emp.getName(2));
In the above code, when emp.getName(1) is called the mock will return Jim, and when emp.getName(2) is called the mock will return Mark. My question is: since I am declaring the behavior of the mock and then checking it with assertEquals, what is the point of having the above (or similar) assert statements? They are obviously going to pass. It is simply like checking 3 == (1 + 2). When will these tests fail (apart from changing the return type and parameter type)?
As you noted, these kinds of tests are pointless (unless you're writing a unit test for Mockito itself, of course :-)).
The point of mocking is to eliminate external dependencies so you can unit-test your code without depending on other classes' code. For example, let's assume you have a class that uses the Employee class you described:
public class EmployeeExaminer {
    public boolean isJim(Employee e, int i) {
        return "Jim".equals(e.getName(i));
    }
}
And you'd like to write a unit test for it. Of course, you could use the actual Employee class, but then your test won't be a unit-test any more - it would depend on Employee's implementation. Here's where mocking comes in handy - it allows you to replace Employee with a predictable behavior so you could write a stable unit test:
// The object under test
EmployeeExaminer ee = new EmployeeExaminer();
// A mock Employee used for tests:
Employee emp = mock(Employee.class);
when(emp.getName(1)).thenReturn("Jim");
when(emp.getName(2)).thenReturn("Mark");
// Assert EmployeeExaminer's behavior:
assertTrue(ee.isJim(emp, 1));
assertFalse(ee.isJim(emp, 2));
In your case you are testing a getter; I don't know why you are testing it, and I have no clue why you would need to mock it. From the code you are providing, this is useless.
There are many scenarios where mocking makes sense. When you write unit tests you have to be pragmatic: you should test behaviors and mock dependencies.
Here you aren't testing behavior, and you are mocking the class under test.
There is no point in that test.
Mocks are only useful for injecting dependencies into classes and testing that a particular behaviour interacts with that dependency correctly, or for allowing you to test some behaviour that requires an interface you don't care about in the test you are writing.
Mocking the class under test means you aren't even really testing that class.
If the emp variable was being injected into another class and then that class was being tested, then I could see some kind of point to it.
The above test case is trying to test a POJO.
Actually, you can skip testing POJOs; in other words, they are automatically tested when you test other basic functionality. (There are also utilities such as MeanBean for testing POJOs.)
The goal of unit testing is to test functionality without connecting to any external systems. If you are connecting to any external system, that is considered integration testing.
Mocking helps in creating mock objects for things that cannot be created during unit testing, and in testing behavior/logic based on the data that the mocked object (or the real object, when connecting to an external system) returns.
Mocks are structures that simulate behaviour of external dependencies that you don't/can't have or which can't operate properly in the context of your test, because they depend on other external systems themselves (e.g. a connection to a server). Therefore a test like you've described is indeed not very helpful, because you basically try to verify the simulated behaviour of your mocks and nothing else.
A better example would be a class EmployeeValidator that depends on another system EmployeeService, which sends a request to an external server. The server might not be available in the current context of your test, so you need to mock the service that makes the request and simulate the behaviour of that.
class EmployeeValidator {

    private final EmployeeService service;

    public EmployeeValidator(EmployeeService service) {
        this.service = service;
    }

    public List<Employee> employeesWithMaxSalary(int maxSalary) {
        List<Employee> allEmployees = service.getAll(); // Possible call to external system via HTTP or so.
        List<Employee> filtered = new LinkedList<>();
        for (Employee e : allEmployees) {
            if (e.getSalary() <= maxSalary) {
                filtered.add(e);
            }
        }
        return filtered;
    }
}
Then you can write a test which mocks the EmployeeService and simulates the call to the external system. Afterwards, you can verify that everything went as planned.
@Test
public void shouldContainAllEmployeesWithSalaryFiveThousand() {
    // Given - Define behaviour
    EmployeeService mockService = mock(EmployeeService.class);
    when(mockService.getAll()).thenReturn(createEmployeeList());

    // When - Operate the system under test
    // Inject the mock
    EmployeeValidator ev = new EmployeeValidator(mockService);
    // System calls EmployeeService#getAll() internally but this is mocked away here
    List<Employee> filtered = ev.employeesWithMaxSalary(5000);

    // Then - Check correct results
    assertThat(filtered.size(), is(3)); // There are only 3 employees with Salary <= 5000
    verify(mockService, times(1)).getAll(); // The service method was called exactly one time.
}

private List<Employee> createEmployeeList() {
    // Create some dummy Employees
}

Cucumber-JVM: Call a Scenario or ScenarioOutline inside a step definition

I have a step definition where I pass the name of a Scenario or Scenario outline:
#When("^I execute the steps of the following scenario or scenario outline: \"([^\"]*)\"$")
public void execute_steps_of_the_scenario_or_scenario_outline(String name){
...
}
My intent is to execute all the steps of the called Scenario/Scenario Outline and append those executed steps to the current scenario. This means that the steps of the called scenario become part of the current scenario.
However I have been unable to figure out a way to do this.
Cucumber (and BDD/TDD in general) is not meant to be used like that. Each scenario/test should be isolated from the others, and it is not good practice to execute steps from one scenario in another.
You can use Background steps in a feature to execute common steps across the scenarios of that feature, or you can use tags to execute a particular set of actions before and/or after a scenario via Hooks.
Having said that, you could write some logic in your glue code to maintain a list of the steps (methods) invoked in each scenario and invoke the same list of steps in a subsequent scenario. This assumes that you can guarantee the execution order of the scenarios, which is (again) against all TDD best practices. Below is a skeleton of the code to achieve what I have just described, followed by a sketch of how the recorded steps could be replayed.
private Scenario scenario;
private Map<String, List<String>> scenarioSteps = new HashMap<>();

@Before
public void setUp(Scenario scenario) {
    this.scenario = scenario;
    scenarioSteps.put(scenario.getName(), new LinkedList<>());
}

@Given("^the first step is executed$")
public void the_first_step_is_executed() {
    final StackTraceElement stackTraceElement = Thread.currentThread().getStackTrace()[1];
    scenarioSteps.get(scenario.getName()).add(stackTraceElement.getClassName() + "." + stackTraceElement.getMethodName());
}
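To replay the recorded steps in a later scenario, one possibility (purely a sketch building on the skeleton above, and assuming the glue classes have no-argument constructors and parameterless step methods) is to invoke the recorded methods reflectively:

@When("^I execute the steps of the following scenario or scenario outline: \"([^\"]*)\"$")
public void execute_steps_of_the_scenario_or_scenario_outline(String name) throws Exception {
    // Look up the steps recorded for the named scenario and invoke them in order.
    List<String> recordedSteps = scenarioSteps.get(name);
    if (recordedSteps == null) {
        throw new IllegalArgumentException("No steps were recorded for scenario: " + name);
    }
    for (String qualifiedMethod : recordedSteps) {
        int lastDot = qualifiedMethod.lastIndexOf('.');
        Class<?> glueClass = Class.forName(qualifiedMethod.substring(0, lastDot));
        Object glueInstance = glueClass.getDeclaredConstructor().newInstance();
        glueClass.getDeclaredMethod(qualifiedMethod.substring(lastDot + 1)).invoke(glueInstance);
    }
}

Note that this creates fresh instances of the glue classes, so any state Cucumber injected into the original instances is lost; it illustrates the idea rather than a recommended practice.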
}

Apache Felix - How to guarantee injecting of dynamic references before an activate method

Here is a snippet of the case I'm interested in:
We have a configuration class that can have multiple instances; suppose we supply several configurations in one bundle. It is all one scope.
@Service
@Component
public class SampleConfigurationImpl implements SampleConfiguration {
    // declaration of some properties, init method, etc.
}
Also we have a service which uses these configurations:
@Service
@Component
public class SampleServiceImpl implements SampleService {

    @Reference(
            referenceInterface = SampleConfiguration.class,
            cardinality = ReferenceCardinality.OPTIONAL_MULTIPLE,
            policy = ReferencePolicy.DYNAMIC)
    private Map<String, SampleConfiguration> sampleConfigurations = new ConcurrentHashMap<>();

    private void bindSampleConfigurations(SampleConfiguration sampleConfiguration) {
        sampleConfigurations.put(sampleConfiguration.getName(), sampleConfiguration);
    }

    private void unbindSampleConfigurations(SampleConfiguration sampleConfiguration) {
        sampleConfigurations.remove(sampleConfiguration.getName());
    }

    @Activate
    private void init() {
        System.out.println(sampleConfigurations.size());
    }
}
So, can I get any guarantee that all configurations are injected (at least those of the current bundle) by the time the init method is invoked? Maybe there is some alternative way to do this. I understand that other bundles can bring new configurations, so it is unrealistic to guarantee those, but I'm interested in the case of just one bundle.
In practice it can happen that only part of the configurations is present in the init method. It gets even more difficult when you have several types of configuration, or when one service uses another that has dynamic references and the first service relies on the fact that everything has been injected.
The most unpleasant part is that configurations can be bound/unbound both before and after the init method.
Maybe there is some way to guarantee that binding always happens after the init method...
I'm interested in any information. It would be great to get an answer to both questions (guarantees before or after). Perhaps someone has experience resolving such a problem and can share it with me.
Thanks.
No, not that I know of. What I usually do in that case (depending on your use case, and on whether your activation code is OK with running multiple times) is to create a 'reallyActivate' method that I call both from the regular activate method and from bindSampleConfigurations (plus setting an isActivated flag in activate). Then I can perform some logic every time a new SampleConfiguration gets bound, even if it happens after activation. Does that help for your case?
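A rough sketch of that pattern applied to the SampleServiceImpl above (the flag and method names are made up for illustration; only the activate and bind methods are shown):

private volatile boolean isActivated;

@Activate
private void init() {
    isActivated = true;
    reallyActivate();
}

private void bindSampleConfigurations(SampleConfiguration sampleConfiguration) {
    sampleConfigurations.put(sampleConfiguration.getName(), sampleConfiguration);
    if (isActivated) {
        // A configuration was bound after activation, so run the activation logic again.
        reallyActivate();
    }
}

private synchronized void reallyActivate() {
    // Idempotent logic that is safe to execute on activation and on every late bind.
    System.out.println("Known configurations: " + sampleConfigurations.size());
}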

Best approach for unit testing with data

I have a lot of DAO classes I need to test on a Spring project.
I am already using DBUnit to mock my database; however, I use the @Before annotation to create objects and compare them after tests of create/update/delete operations.
@DatabaseSetup(value = { "/db_data/dao/common.xml", "/db_data/dao/myDAOCommonTest.xml" })
@DbUnitConfiguration(dataSetLoader = ReplacementDataSetLoader.class)
public class MyDAOImplTest extends AbstractDaoTU {

    @Autowired
    private MyDAO myDAO;

    private Set<ClassNeeded> objectsNeeded = new HashSet<>();
    private ClassOne classOne;
    private ClassTwo classTwo;
    private ClassThree classThree;

    @Override
    public void setUp() throws Exception {
        super.setUp();
        this.objectsNeeded.add(somethingComingFromTheMotherClass);

        this.classOne = new ClassOne();
        this.classOne.setIdClassOne(1L);
        this.classOne.setObjectsNeeded(this.objectsNeeded);
        // ... Many other sets

        this.classTwo = new ClassTwo();
        this.classTwo.setIdClassTwo(1L);
        this.classTwo.setClassOne(this.classOne);
        // ... Many other sets

        // ... Other sets follow for a lot of other objects
    }

    @Test
    public void testOne() {
        // ...
    }

    // ... Other tests follow
}
I am using an ORM (Hibernate in this case), and most objects are inter-dependent. My DAO methods mostly need complete objects as arguments, so I must create the objects before testing.
My questions are the following:
Is there a better approach to unit testing DAOs?
What tools do you know of that make this easier/faster to write? (I am using Maven for packaging.)
Thanks for your help !
DBUnit complicates the maintenance of the tests, as it increases the number of places that need updating when something changes. Additionally, it separates the data preparation from the tests too much, so it's hard to tell which data relates to which test.
Ideally each test prepares data for itself. This removes global state and keeps related things together.
To prepare the data just create entities and save them in the very same test. You can use randomization and transaction rollbacks to isolate the tests. Here is an example from one of my projects:
@Test
public void returnsExperimentAsItWasSaved() {
    Experiment original = Experiment.random();
    experimentRepository.save(original);
    flushToDbAndClearCache();

    Experiment fromDb = experimentRepository.findOne(original.getExperimentId());
    assertReflectionEquals(original, fromDb);
}
Note that the very same DAO class is used to prepare the data.
The best way is to develop your tests as you would develop your code: refactor to minimize duplication, extract reusable services, etc.
So you'll probably create some TestCaseFactory that chains up a whole set of objects and saves them using your actual DAOs. Then you can call it from an @Before as you did. If you need a lot of different sets of objects, you can create different methods, a parameter object, etc.
And clean up all test data in an @After.
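For illustration only, a minimal sketch of that shape (the TestCaseFactory, the entity types, and the DAO calls are all hypothetical) could look like this:

public class TestCaseFactory {

    private final ClassOneDAO classOneDAO; // hypothetical DAOs, wired in like any other bean
    private final ClassTwoDAO classTwoDAO;

    public TestCaseFactory(ClassOneDAO classOneDAO, ClassTwoDAO classTwoDAO) {
        this.classOneDAO = classOneDAO;
        this.classTwoDAO = classTwoDAO;
    }

    /** Builds and persists the object graph most tests need, returning the root object. */
    public ClassTwo createStandardGraph() {
        ClassOne classOne = new ClassOne();
        classOne.setIdClassOne(1L);
        classOneDAO.save(classOne);

        ClassTwo classTwo = new ClassTwo();
        classTwo.setIdClassTwo(1L);
        classTwo.setClassOne(classOne);
        classTwoDAO.save(classTwo);
        return classTwo;
    }

    /** Removes everything this factory created; call it from an @After method. */
    public void deleteAll() {
        classTwoDAO.deleteAll();
        classOneDAO.deleteAll();
    }
}

The test class then calls createStandardGraph() from its @Before and deleteAll() from its @After, keeping the object wiring in one place instead of duplicating it across test classes.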
