I have many boolean methods like boolean isPalindrome(String txt) to test.
At the moment I test each of these methods with two parameterised tests, one for true results and one for false results:
@ParameterizedTest
@ValueSource(strings = { "racecar", "radar", "able was I ere I saw elba" })
void test_isPalindrome_true(String candidate) {
    assertTrue(StringUtils.isPalindrome(candidate));
}

@ParameterizedTest
@ValueSource(strings = { "peter", "paul", "mary is here" })
void test_isPalindrome_false(String candidate) {
    assertFalse(StringUtils.isPalindrome(candidate));
}
Instead I would like to test these in one parameterised method, like this pseudo Java code:
@ParameterizedTest
@ValueSource({ (true, "racecar"), (true, "radar"), (false, "peter") })
void test_isPalindrome(boolean res, String candidate) {
    assertEquals(res, StringUtils.isPalindrome(candidate));
}
Is there a ValueSource for this? Or is there another way to achieve this in a concise manner?
Through the very helpful comment from Dawood ibn Kareem (on the question) I got a solution involving @CsvSource:
@ParameterizedTest
@CsvSource(value = { "racecar,true",
                     "radar,true",
                     "peter,false" })
void test_isPalindrome(String candidate, boolean expected) {
    assertEquals(expected, StringUtils.isPalindrome(candidate));
}
What I quite like: although the code uses strings to express boolean values, it is quite compact and keeps together the things which IMHO belong together.
Read about @CsvSource here.
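If expressing booleans as CSV strings bothers you, a @MethodSource provider is an alternative. A minimal sketch, assuming JUnit 5 with junit-jupiter-params on the classpath (the provider method name is just illustrative):

import static org.junit.jupiter.api.Assertions.assertEquals;

import java.util.stream.Stream;

import org.junit.jupiter.params.ParameterizedTest;
import org.junit.jupiter.params.provider.Arguments;
import org.junit.jupiter.params.provider.MethodSource;

class StringUtilsTest {

    // Each Arguments entry supplies one (candidate, expected) pair.
    static Stream<Arguments> palindromeCases() {
        return Stream.of(
            Arguments.of("racecar", true),
            Arguments.of("radar", true),
            Arguments.of("peter", false));
    }

    @ParameterizedTest
    @MethodSource("palindromeCases")
    void test_isPalindrome(String candidate, boolean expected) {
        assertEquals(expected, StringUtils.isPalindrome(candidate));
    }
}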
Related
I want to write some integration tests to test my whole program (it's a standard command-line Java application with program args).
Basically I have 3 tests: one to create a resource, one to update the resource and finally one to delete it.
I could do something like this:
@Test
public void create_resource() {
    MainApp.main(new String[] {"create", "my_resource_name"});
}

@Test
public void update_resource() {
    MainApp.main(new String[] {"update", "my_resource_name"});
}

@Test
public void delete_resource() {
    MainApp.main(new String[] {"delete", "my_resource_name"});
}
It works... as long as the methods are executed in the correct order. But I've heard that whether a test passes should not depend on the execution order.
It's true that ordering tests is considered a smell. Having said that, there are cases where it makes sense, especially for integration tests.
Your sample code is a little vague since there are no assertions there. But it seems to me you could probably combine the three operations into a single test method. If you can't do that, then you can just run them in order. JUnit 5 supports this using the @Order annotation:
@TestMethodOrder(OrderAnnotation.class)
class OrderedTestsDemo {

    @Test
    @Order(1)
    void nullValues() {
        // perform assertions against null values
    }

    @Test
    @Order(2)
    void emptyValues() {
        // perform assertions against empty values
    }

    @Test
    @Order(3)
    void validValues() {
        // perform assertions against valid values
    }
}
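Applied to the MainApp example from the question, that could look roughly like this (a sketch only; the assertions still need to be filled in with whatever checks make sense for your resource):

import org.junit.jupiter.api.MethodOrderer.OrderAnnotation;
import org.junit.jupiter.api.Order;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.TestMethodOrder;

@TestMethodOrder(OrderAnnotation.class)
class ResourceLifecycleIT {

    @Test
    @Order(1)
    void create_resource() {
        MainApp.main(new String[] {"create", "my_resource_name"});
        // assert that the resource now exists, e.g. by querying it back
    }

    @Test
    @Order(2)
    void update_resource() {
        MainApp.main(new String[] {"update", "my_resource_name"});
        // assert that the update is visible
    }

    @Test
    @Order(3)
    void delete_resource() {
        MainApp.main(new String[] {"delete", "my_resource_name"});
        // assert that the resource is gone
    }
}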
I have a test that runs multiple times using a data-provider and it looks something like:
@DataProvider(name = "data-provider-carmakers")
public Object[][] dataProviderCarMakers() {
    return new Object[][] {
        {CarMaker.Ford},
        {CarMaker.Chevrolet},
        {CarMaker.Renault},
        {CarMaker.Porsche}
    };
}

@Test(dataProvider = "data-provider-carmakers",
      retryAnalyzer = TestRetry.class)
public void validateCarMakerHasElectricModelsLoaded(CarMaker carMaker) {
    validatecarMakerContainsElectricModelsLoadedInDB(carMaker);
}
In another test, I have a dependency on the first:
@Test(dependsOnMethods = { "validateCarMakerHasElectricModelsLoaded" })
public void validateChevroletElectricModelsPowerEfficiency() {
    List<CarModel> electricCarModels = getChevroletCarModels(fuelType.Electric);
    validatePowerEfficiency(electricCarModels);
}
(I know the test doesn't make a lot of sense; in reality the code is far more complex than this and the data provider has far more data, but for the sake of clarity I just went with this example.)
So I want to run validateChevroletElectricModelsPowerEfficiency() only if validateCarMakerHasElectricModelsLoaded()[CarMaker.Chevrolet] was successful.
As the code is now, if the first test runs successfully for Chevrolet but fails for Renault, the second test won't run. Is there a way to make a dependency on just one data set of a test?
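One possible workaround, not from the original post and only a sketch, is to pull the Chevrolet case out of the data provider into its own test method and depend on that method instead of the whole data-driven test:

// Hypothetical sketch: Chevrolet gets a dedicated test so that failures
// for other makers cannot block the dependent test.
@Test(retryAnalyzer = TestRetry.class)
public void validateChevroletHasElectricModelsLoaded() {
    validatecarMakerContainsElectricModelsLoadedInDB(CarMaker.Chevrolet);
}

@Test(dependsOnMethods = { "validateChevroletHasElectricModelsLoaded" })
public void validateChevroletElectricModelsPowerEfficiency() {
    List<CarModel> electricCarModels = getChevroletCarModels(fuelType.Electric);
    validatePowerEfficiency(electricCarModels);
}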
I need to mock the constant variable in order to test one of my methods. How can I do it with Mockito and JUnit?
@Component("mybean")
@org.springframework.context.annotation.Scope(value = "session")
public class MyBean {

    public void methodToBeTested() {
        if (!AppConst.SOME_CONST.equals(obj.getCostCode())) {
            // some logic
        }
    }
}
The AppConst class:
@Configuration
public class AppConst {

    public static String SOME_CONST;
    public static String HOST_URL;

    @PostConstruct
    public void postConstruct() {
        SOME_CONST = "My Code";
        HOST_URL = "Some URL";
    }
}
So, from my JUnit test class, how can I mock AppConst and its variables? Right now, when I run it, I hit a NullPointerException.
Can this be done with PowerMock? If yes, please give a sample.
The Mockito version I use:
compile "org.mockito:mockito-all:1.9.5"
compile "org.powermock:powermock-mockito-release-full:1.6.1"
Instead of mocking, there is another way to make this testable:
public void methodToBeTested(SomeObject obj) {
    performLogic(AppConst.SOME_CONST, obj);
}

boolean performLogic(String check, SomeObject testObj) {
    if (!check.equals(testObj.getCostCode())) {
        // some logic
        return true;
    }
    return false;
}
That way you can test two things, both combined show you that your code works as intended:
public void testMethodToBeTested() {
    MyBean mb = new MyBean() {
        @Override
        boolean performLogic(String check, SomeObject testObj) {
            assertSame("check constant is passed", AppConst.SOME_CONST, check);
            return true;
        }
    };
    mb.methodToBeTested(new SomeObject());

    mb = new MyBean();
    SomeObject so = createSomeTestObject("My Code"); // not the actual constant but an equal String
    assertFalse("check some logic not occurred", mb.performLogic("My Code", so));

    so = createSomeTestObject("Not the constant");
    assertFalse("check some logic not occurred", mb.performLogic("Not the constant", so));
    assertTrue("check some logic occurred", mb.performLogic("My Code", so));

    // additional tests covering the actual logic
}
Another solution could be putting the condition of the if-statement into its own method, e.g. shouldLogicOccur(String check), and testing that method individually.
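A rough sketch of that idea (method and test names are just illustrative, and the extracted method takes the object as well so the check is self-contained):

// Sketch only: the if-condition extracted into its own method so it can be tested directly.
boolean shouldLogicOccur(String check, SomeObject testObj) {
    return !check.equals(testObj.getCostCode());
}

@Test
public void testShouldLogicOccur() {
    MyBean mb = new MyBean();
    // equal cost code: no logic should occur
    assertFalse(mb.shouldLogicOccur("My Code", createSomeTestObject("My Code")));
    // different cost code: logic should occur
    assertTrue(mb.shouldLogicOccur("My Code", createSomeTestObject("Some other code")));
}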
In other words: sometimes it's necessary to refactor your code to make tests easier, or sometimes even possible at all. A good side effect is that the next time you implement something, you already have testability in mind and design your code for it in the first place.
Mocking is a good way to get things under test that use third-party libraries which can't be changed and have too many dependencies to set up, but if you end up needing it for your own code, you've got a design issue.
I read some articles about test naming conventions and decided to use one based on "should". It works pretty well in most cases, like:
shouldAccessDeniedIfWrongPassword
shouldReturnFizzBuzzIfDiv3And5
shouldIncreaseAccountWhenDeposit
But I encountered problems while testing a DecimalRepresentation class, which displays numbers in different numeral systems; just look at the code:
public class DecimalRepresentationTest {

    private DecimalRepresentation decimal;

    @BeforeEach
    void setup() {
        decimal = new DecimalRepresentation();
    }

    @Test
    void shouldReturnZeroIfNumberNotSpecified() {
        assertEquals("0", decimal.toBinary());
    }

    @Test
    void shouldReturn10IfNumber2() {
        decimal.setNumber(2);
        assertEquals("10", decimal.toBinary());
    }

    @Test
    void shouldReturn1111IfNumber15() {
        decimal.setNumber(15);
        assertEquals("1111", decimal.toBinary());
    }
}
Now it's not bad, but in case I'm testing negative inputs it looks terrible:
@Test
void shouldReturn11111111111111111111111110001000IfNumberNegative120() {
    decimal.setNumber(-120);
    assertEquals("11111111111111111111111110001000", decimal.toBinary());
}

@Test
void shouldReturn11111111111111111111111111111111IfNumberNegative1() {
    decimal.setNumber(-1);
    assertEquals("11111111111111111111111111111111", decimal.toBinary());
}
In the examples above I'm testing twice for positive and negative input, to be sure there is no hardcoded result and the algorithm works fine, so I decided to group the tests in nested classes to keep the convention:
@Nested
@DisplayName("Tests for positive numbers")
class PositiveConverter {

    @Test
    void shouldReturn10IfNumber2() {
        decimal.setNumber(2);
        assertEquals("10", decimal.toBinary());
    }

    @Test
    void shouldReturn1111IfNumber15() {
        decimal.setNumber(15);
        assertEquals("1111", decimal.toBinary());
    }
}

@Nested
@DisplayName("Tests for negative numbers")
class NegativeConverter {

    @Test
    void shouldReturn11111111111111111111111110001000IfNumberNegative120() {
        decimal.setNumber(-120);
        assertEquals("11111111111111111111111110001000", decimal.toBinary());
    }

    @Test
    void shouldReturn11111111111111111111111111111111IfNumberNegative1() {
        decimal.setNumber(-1);
        assertEquals("11111111111111111111111111111111", decimal.toBinary());
    }
}
I realize it's overcomplicated because of the convention. If I let the convention lapse, it could look much better:
@Test
void testPositiveConversions() {
    assertAll(
        () -> { decimal.setNumber(2); assertEquals("10", decimal.toBinary()); },
        () -> { decimal.setNumber(15); assertEquals("1111", decimal.toBinary()); }
    );
}

@Test
void testNegativeConversions() {
    assertAll(
        () -> { decimal.setNumber(-120); assertEquals("11111111111111111111111110001000", decimal.toBinary()); },
        () -> { decimal.setNumber(-1); assertEquals("11111111111111111111111111111111", decimal.toBinary()); }
    );
}
Should I break the convention to keep it simple? I have the same naming problem with tests that take Lists of inputs and outputs, or with dynamic tests:
@TestFactory
Stream<DynamicTest> shouldReturnGoodResultsForPositiveNumbers() { // look at method name lol
    List<Integer> inputs = new ArrayList<>(Arrays.asList(2, 15));
    List<String> outputs = new ArrayList<>(Arrays.asList("10", "1111"));
    return inputs.stream().map(number -> DynamicTest.dynamicTest("Test positive " + number, () -> {
        int idx = inputs.indexOf(number);
        decimal.setNumber(inputs.get(idx));
        assertEquals(outputs.get(idx), decimal.toBinary());
    }));
}
Names are supposed to be helpful. Sometimes rules help in finding good names; sometimes they do not. Then the answer is to drop the rule, and maybe go for something completely different, like:
@Test
void testResultForNegativeInput() {
    decimal.setNumber(-120);
    assertEquals("11111111111111111111111110001000", decimal.toBinary());
}
And if you have several of these methods, maybe adding "ForMinus120" or so would be acceptable.
But instead of spending energy on naming here: the real issue is that you are using the wrong kind of test. You have a whole bunch of input data that simply results in different output values to check. All your tests boil down to: one specific input value should lead to a specific output value.
You don't do that with many almost identical test methods - instead you turn to parameterized tests! Meaning: use a table to drive your test. For JUnit 5 and parameterized tests, turn here (thanks to user Sam Brannen).
It is great that you spend time and energy to make your tests easy to read. But in this case, that leads to a lot of code duplication. Instead, put down the input/output values into a table, and have one test to check all entries in that table.
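For the binary-conversion example, such a table-driven test could look roughly like this, assuming JUnit 5 with the junit-jupiter-params artifact on the classpath:

import static org.junit.jupiter.api.Assertions.assertEquals;

import org.junit.jupiter.params.ParameterizedTest;
import org.junit.jupiter.params.provider.CsvSource;

class DecimalRepresentationTest {

    // One row per case: input number and expected binary string.
    @ParameterizedTest
    @CsvSource({
        "2,    10",
        "15,   1111",
        "-120, 11111111111111111111111110001000",
        "-1,   11111111111111111111111111111111"
    })
    void toBinary_convertsNumberToBinaryString(int number, String expected) {
        DecimalRepresentation decimal = new DecimalRepresentation();
        decimal.setNumber(number);
        assertEquals(expected, decimal.toBinary());
    }
}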
I've modeled mine after Roy Osherove's naming method; here's the regex:
^(setup|teardown|([A-Z]{1}[0-9a-z]+)+_([A-Z0-9]+[0-9a-z]+)+_([A-Z0-9]+[0-9a-z]+)+)$
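A couple of illustrative names (not from the original answer) in the UnitOfWork_StateUnderTest_ExpectedBehavior style that this regex accepts:

// Illustrative only: three underscore-separated parts, each in CamelCase.
@Test public void ToBinary_PositiveNumber_ReturnsBinaryString() { /* ... */ }
@Test public void ToBinary_NegativeNumber_ReturnsTwosComplement() { /* ... */ }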
Okay, so I'm building test cases for my project and I'm using JUnit for testing. Now the problem I'm facing is that I need a different set of arguments for different test cases in the same file.
public class ForTesting {

    // Test 1 should run on inputs {1, true} and {2, true}
    @Test
    public void Test1() {
        // Do first test case
    }

    // Test 2 should run on inputs {3, true} and {4, true}
    @Test
    public void Test2() {
        // Do another test case
    }
}
I know I can provide multiple arguments using parameterized tests, but the problem is that the same set of arguments runs for all the test cases. Is there a way to do this?
If you're not looking ONLY for standard JUnit parameterized tests, and depending on your company's legal policies, you can use (at least) the following two libraries, which make things easier (both to implement and to read):
1) JUnitParams (Apache 2)
@RunWith(JUnitParamsRunner.class)
public class PersonTest {

    @Test
    @Parameters({"17, false",
                 "22, true"})
    public void shouldDecideAdulthood(int age, boolean expectedAdulthood) throws Exception {
        assertThat(new Person(age).isAdult(), is(expectedAdulthood));
    }
}
2) Zohhak (LGPL), inspired by JUnitParams but bringing some more sugar to the table (easy separator config, converters, etc.)
@RunWith(ZohhakRunner.class)
public class PersonTest {

    @TestWith({"17, false",
               "22, true"})
    public void shouldDecideAdulthood(int age, boolean expectedAdulthood) throws Exception {
        assertThat(new Person(age).isAdult(), is(expectedAdulthood));
    }
}
Credits: Examples above have been shamelessly copied and adjusted from JUnitParams' readme.
A few options:
Use Theories.
In a @Theory, use Assume.assumeTrue:
@Theory
public void shouldPassForInts1And2(int param) {
    Assume.assumeTrue(param == 1 || param == 2);
    ...
}

@Theory
public void shouldPassForInts3And4(int param) {
    Assume.assumeTrue(param == 3 || param == 4);
    ...
}
Or use @TestedOn.
@Theory
public void shouldPassForInts1And2(@TestedOn(ints = {1, 2}) int param) {
    ...
}

@Theory
public void shouldPassForInts3And4(@TestedOn(ints = {3, 4}) int param) {
    ...
}
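If you are on JUnit 5 instead of JUnit 4 Theories, the same per-test value sets can be expressed with parameterized tests. A minimal sketch, assuming the junit-jupiter-params dependency (class and method names are just illustrative):

import org.junit.jupiter.params.ParameterizedTest;
import org.junit.jupiter.params.provider.ValueSource;

class ForTestingJUnit5 {

    // First test case, run once per value in its own source.
    @ParameterizedTest
    @ValueSource(ints = { 1, 2 })
    void test1(int input) {
        // Do first test case
    }

    // Second test case with a different value set.
    @ParameterizedTest
    @ValueSource(ints = { 3, 4 })
    void test2(int input) {
        // Do another test case
    }
}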