Java: code duplication in classes and their JUnit test cases

I've been writing code that processes certain fields of an object by modifying their values. To test it, I first wrote a JUnit test case that recursively traverses the fields of an object and makes sure they're correctly modified. The CUT (Class Under Test) does something very similar: it recursively traverses the fields of an object and modifies them as required.
So the code to recursively traverse the fields is the same in the test case and the CUT, and is currently duplicated, which violates DRY. So I have two questions:
1) Have you come across such situations in your projects? If yes, did you apply DRY, or let the duplication remain as is?
2) If I put this common code in a util method, I will need to write a test case for it, which would again involve traversing fields recursively. So how can this be solved without adding any duplication?

You have just hit the ugly mirror-testing anti-pattern. If your CUT has a bug, most likely you will copy it into your test case, essentially verifying that the bug is still there.
You should show us some more code, but basically your test case should be much simpler: no for loops, no conditions - just assertions. If your production code does some fancy traversing, reflection, etc. on complicated data structures, create a test Java object and test every field manually in the unit test.
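For illustration, a minimal sketch of that style, assuming a hypothetical FieldMasker CUT and a Customer fixture class (all names below are made up):
@Test
public void maskerObfuscatesEveryStringField() {
    // small hand-built fixture object; no loops, no reflection in the test
    Customer customer = new Customer("Alice", "Main Street 1");

    new FieldMasker().process(customer);

    // every field asserted explicitly, assuming the masker replaces strings with "***"
    assertEquals("***", customer.getName());
    assertEquals("***", customer.getStreet());
}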

Use the Visitor pattern to abstract the traversal, and then build visitors both in the test case and in your production code. And test the visitor infrastructure separately.
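A rough sketch of what that separation could look like; FieldVisitor and FieldTraverser are hypothetical names, and the recursion cut-off (skipping primitives and JDK types) is only an assumption:
import java.lang.reflect.Field;

interface FieldVisitor {
    void visit(Object owner, Field field) throws IllegalAccessException;
}

class FieldTraverser {
    // Walks all declared fields of the target and hands each one to the visitor.
    // Production code plugs in a "modify this field" visitor; the test plugs in
    // an "assert this field was modified" visitor, so only the traverser is shared.
    public void traverse(Object target, FieldVisitor visitor) throws IllegalAccessException {
        if (target == null) {
            return;
        }
        for (Field field : target.getClass().getDeclaredFields()) {
            field.setAccessible(true);
            visitor.visit(target, field);
            Object value = field.get(target);
            // recurse into nested domain objects, but not into primitives or JDK types
            if (value != null && !field.getType().isPrimitive()
                    && !field.getType().getName().startsWith("java.")) {
                traverse(value, visitor);
            }
        }
    }
}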

Related

How to test something like a converter

I have a question regarding testing classes like a converter.
Let's say I have a converter from EntityA to EntityB. The converter looks like this:
public EntityB convert(EntityA entityA) {
    // call internal methods
    return entityB;
}
private Xy internalMethod1(...) {
    // call other internal method
}
private Xy internalMethod2(...) {
    ...
}
private Xy internalMethod3(...) {
    ...
}
private Xy internalMethod4(...) {
    ...
}
The converter has one public method and four internal methods to convert the entity.
How should I test it?
Option1
I only test the public method and cover all cases of the internal methods with different example inputs.
Advantages:
Tests only the "interface" and doesn't depend on the internal structure.
Internal refactoring is very easy and requires no changes to the tests.
Disadvantages:
Really big, possibly unclear tests that cover all cases.
Every input must pass through all the methods.
Option2
I write tests for my public method and my private methods. (Some test frameworks, like PowerMock or Spock (Groovy), can access private methods.)
I test every method on its own and mock every other internal method.
Advantages:
Really small tests that only test the method itself and mock all other methods.
Disadvantages:
I know how it is implemented internally and must change the tests if I refactor a method, rename it, or change the internal calling structure.
Option3
I write new classes that do the internal work and expose public methods.
Advantages:
Tests may be clearer and focused on the extracted classes.
Disadvantages:
More classes for one conversion task.
Please help me figure out the best practice here.
Maybe some good links/hints would help.
Thank you for your time.
The points you make are valid, but I think you might not be estimating their weight correctly.
Writing brittle tests (tests that are coupled to the implementation code) makes for a rigid code base that is hard to change. Since the point of writing tests in the first place is to be able to go fast, this is counterproductive.
This is why you write your tests through the API only - it decouples the tests from the implementation. As you've said, this might make writing the tests a bit harder, but the reward is worth the effort since you'll get safety and be able to refactor easily.
Option 3 comes into play when you see a code smell where some tests cover only some of the code, and other tests only cover the other part of the code. This usually means there's a collaborator that maybe needs to be extracted. This is especially true when some internal functions only use some parameters and others don't. Also, when there's code duplication and the like.
What I would suggest is to write it the way you described in Option 1, and then extract code if needed, during the refactoring stage.
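To make Option 1 concrete, a minimal sketch of a test written only against the public convert() method (the field names and the EntityAToBConverter class are made-up examples):
@Test
public void convertsNameAndStreet() {
    // arrange a representative input covering one of the cases
    EntityA input = new EntityA();
    input.setName("Alice");
    input.setStreet("Main Street 1");

    // act through the public API only
    EntityB result = new EntityAToBConverter().convert(input);

    // assert on the observable output, not on the internal methods
    assertEquals("Alice", result.getFullName());
    assertEquals("Main Street 1", result.getStreet());
}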

How to organise the tests of an expensive object

I have a class A that I want to test with JUnit. Creating of an object of type A involves a lot of IO, so it takes about 5 seconds.
A is mutable and I want to test the different methods that change A. Now I am in some kind of dilemma:
If I create a virgin object A for every test method, it just takes too long.
Creating one huge test method with a lot of asserts seems like a bad idea to isolate the causes of possible errors.
"Repairing" the object of type A after each test methods seems to dangerous as well because if the repairing is not done correctly, other test methods might fail without proper reason.
I could also create a deep copy of an instance of A for every test method, but that means that I have to change the class A only to properly test it.
What would you suggest?
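For illustration only, a rough sketch of the copy-per-test idea from the last option, assuming A could expose a copy constructor (every name below is hypothetical):
import org.junit.Before;
import org.junit.BeforeClass;
import org.junit.Test;

public class ATest {
    private static A template;   // the expensive, IO-heavy instance, built once
    private A a;                  // a fresh, cheap copy for each test method

    @BeforeClass
    public static void buildTemplateOnce() {
        template = new A();       // pays the ~5 second IO cost a single time
    }

    @Before
    public void copyTemplate() {
        a = new A(template);      // hypothetical copy constructor
    }

    @Test
    public void mutatingMethodChangesState() {
        a.mutate();               // hypothetical mutating method under test
        // assertions against the fresh copy only
    }
}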

How to make this unit test independent?

One of the unit testing best practices is to make each test independent of all the others. Let's say I want to test the add() method of a BoundedPriorityBlockingQueue custom class:
public void testAdd() {
    BoundedPriorityBlockingQueue q = new BoundedPriorityBlockingQueue();
    q.add(1);
    assertEquals(1, q.size());
}
As you can see, testAdd() currently uses the size() method, so it depends on it, but I don't want testAdd() to fail when size() is broken. What is the best practice in this situation?
What is the best practice in this situation?
Just suck it up, bearing in mind that tests are meant to serve you, not the other way round.
Will your tests break if something goes horribly wrong? Yes.
Will it be clear where the problem is? Probably, given that anything using size will fail.
Is this test driving you towards a less testable design? No.
Is this the simplest approach to testing add, which is robust in the face of changing implementation details? Probably. (I'd test that you can get the value out again, mind you.)
Yes, it's sort of testing two parts of the same class - but I really don't think that's a problem. I see a lot of dogma around testing ("only ever test the public API", "always use AAA (Arrange-Act-Assert)", etc.) - in my experience you should temper that dogmatism with a healthy dose of pragmatism.
The goal is to make all test methods independent of other test methods, and this method is independent. It will pass or fail based on the operation of the methods in the class under test, regardless of what you do in other test methods.
It's fine for this test to fail if another method from the class under test is broken. If size() is broken you'll have multiple test failures (this one and the one that explicitly tests size()) so it will be obvious where the problem is. If add() is broken, only this test will fail (along with any other methods that rely on add()).
As others have already said, if your size() method is broken the test will fail anyway, so you have a reason to investigate and understand why that is happening.
Anyway, if you are still interested in having such independence between your tests, you could go for a white-box testing strategy: I guess that your BoundedPriorityBlockingQueue internally uses either one of the java.util collections, an array, or a collection implementation from another provider (Guava, Apache Collections, etc.) that you rely on, so you don't need to verify that those structures work as expected.
So, define that internal structure as protected, place your test class in a package with the same name and, instead of relying on the implementation of the size method, go into the guts of the BoundedPriorityBlockingQueue:
BoundedPriorityBlockingQueue q = new BoundedPriorityBlockingQueue();
q.add(1);
assertEquals(1, q.contents.size()); // assuming the `contents` attribute is a collection
The main drawback is that if your internal implementation of the queue changes, you'll need to change the test, whereas with your previous test method you wouldn't.
IMO I would choose your current implementation; it is less coupled and, in the end, meets its goal.
There's nothing wrong with doing such cross-testing - some methods tend to live in pairs (add/remove, enqueue/dequeue, etc.) and it makes little sense to test one without its complementary part.
However, I would give a bit more thought to how the add method will be used by your clients (class users). Most likely they won't call add only to determine whether the size changed, but rather to later retrieve the added item. Perhaps your test should look more like this:
BoundedPriorityBlockingQueue q = new BoundedPriorityBlockingQueue();
QueueItem toAdd = new QueueItem(1);
q.add(toAdd);
QueueItem added = q.dequeue();
assertEquals(toAdd, added);
On top of that you can also add a guard assert to the test above (to ensure the queue doesn't start with some items already added) or, even better, include a separate test that guarantees the initial state of the queue (size is 0, dequeue returning null/throwing).
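For example, the separate initial-state test could look roughly like this (assuming dequeue() returns null on an empty queue, as hinted above):
@Test
public void newQueueStartsEmpty() {
    BoundedPriorityBlockingQueue q = new BoundedPriorityBlockingQueue();
    assertEquals(0, q.size());
    assertNull(q.dequeue());   // assumption: dequeue() returns null when empty
}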

How do I write a TDD test for removing a field from a Java class?

I have a Java class with three fields. I realized I only need two of them due to changes in requirements.
Ideally I'd write a failing test case before modifying code.
Is there a standard way, or should I just ignore TDD for this task?
That's refactoring, so you don't need to start with failing tests.
Find all the methods using the field.
Make sure that they're covered by unit tests.
Refactor the methods so they no longer use the field.
Remove the field.
Ensure that the tests still pass.
Does the drop of this field change the behavior of the class? If not, just drop the field and check if the class still works correctly (aka, passes the tests you should have already written).
The TDD principle is to write code "designed by tests". This may sound silly, but it means that the first class you write should be the test class, testing the behavior of the class under test. You should iterate over a few steps:
Write the test. It should not compile (you don't have the class/classes under test yet).
Make the test compile. It should fail (you just have an empty class which does not satisfy the assertions in the test).
Make the test pass in the simplest way (usually, just making the method you are testing return the expected value).
Refine/refactor/generalize the class under test, and re-run the test (it should still pass). This step should be really fast, usually less than 2 minutes.
Repeat from step 2 until the desired behavior emerges almost naturally.
If you have an exhaustive list of all the fields you need, you can compare against that list of fields using reflection:
yourClassName.getClass().getDeclaredFields() vs your list of fields
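A minimal sketch of that comparison (MyClass and the field names are placeholders):
@Test
public void classHasExactlyTheExpectedFields() {
    Set<String> expected = new HashSet<>(Arrays.asList("keptFieldOne", "keptFieldTwo"));

    Set<String> actual = new HashSet<>();
    for (Field field : MyClass.class.getDeclaredFields()) {
        if (!field.isSynthetic()) {        // ignore compiler-generated fields
            actual.add(field.getName());
        }
    }

    assertEquals(expected, actual);
}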
Write a test for the constructor without the field you want to remove.
Obviously only works if the constructor takes the field's value as a parameter.
Delete all tests covering the removed functionality (this doesn't count as "writing production code" as per the 3 Rules of TDD).
Delete all references to the obsolete field in the remaining tests. If any of them then fails, you are allowed to write the production code required to make it pass.
Once your tests are green again, all subsequent modifications fall into the "refactoring" category. You are allowed to remove your (now unused) field here.

Is there a way to generate all expectations using EasyMock?

I am trying to write an EasyMock JUnit test case for some code which has a lot of extra bits and pieces that I find a little overkill to mock.
Say, for the example given at http://java.dzone.com/articles/easymock-tutorial-%E2%80%93-getting,
the following expectation is set to test
portfolio.getTotalValue()
Expectation
EasyMock.expect(marketMock.getPrice("EBAY")).andReturn(42.00);
EasyMock.replay(marketMock);
Now in my case there are around 30-40 such expectations that I need to set before I can get to the piece of code I want to unit test.
Is there a way to generate the expectations from the code, or to generate them dynamically, so that I don't have to do all this manually just to test my specific piece of code?
No.
Seriously, what would you expect it to do?
You can save some labor over the long run by looking at patterns of expectations across multiple tests, and combining those into reusable methods or "@Before" methods.
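For instance, a sketch of moving shared expectations into an @Before method, loosely based on the tutorial's portfolio example (Market is a placeholder for whatever interface the mock implements, and how the mock is wired into the portfolio is an assumption):
private Market marketMock;
private Portfolio portfolio;

@Before
public void setUpCommonExpectations() {
    marketMock = EasyMock.createMock(Market.class);
    portfolio = new Portfolio();
    portfolio.setMarket(marketMock);   // hypothetical wiring of the mock
    // expectations shared by most tests are recorded once, here
    EasyMock.expect(marketMock.getPrice("EBAY")).andReturn(42.00);
}

@Test
public void totalValueUsesMarketPrices() {
    EasyMock.replay(marketMock);
    assertEquals(42.00, portfolio.getTotalValue(), 0.001);
}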
Actually, it's a code smell: Hard-to-Test Code. Your object might not fulfill the Single Responsibility Principle (SRP).
You can try extracting some expectations into one or more allowXY or createMockedXY helper methods (for example, void allowDownloadDocument(path, name, etc.) or Document createMockedDocument(...)). Eliminating static helper classes could also help.
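A small sketch of such a createMockedXY helper (Document and getName() are illustrative names only):
private Document createMockedDocument(String name) {
    Document doc = EasyMock.createMock(Document.class);
    EasyMock.expect(doc.getName()).andReturn(name).anyTimes();
    EasyMock.replay(doc);   // the mock comes back ready to use
    return doc;
}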
