I have a class A that I want to test with JUnit. Creating an object of type A involves a lot of IO, so it takes about 5 seconds.
A is mutable and I want to test the different methods that change A. Now I am in some kind of dilemma:
If I create a virgin object A for every test method, it just takes too long.
Creating one huge test method with a lot of asserts seems like a bad idea, because it makes it hard to isolate the causes of possible errors.
"Repairing" the object of type A after each test method seems too dangerous as well, because if the repairing is not done correctly, other test methods might fail without proper reason.
I could also create a deep copy of an instance of A for every test method, but that means I would have to change class A just to test it properly.
What would you suggest?
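To make the deep-copy option concrete, here is a minimal sketch of what I mean (the class A below and its copy constructor are made up for illustration): build the expensive instance once, then give each test method a cheap copy.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical stand-in for the expensive class A; imagine ~5 seconds of IO
// happening in the no-arg constructor.
class A {
    final List<String> data;

    A() {                       // expensive: run only once
        data = new ArrayList<>(List.of("initial"));
    }

    A(A other) {                // cheap deep copy, one per test method
        data = new ArrayList<>(other.data);
    }
}

public class CopyPerTestSketch {
    // Built once, e.g. in a @BeforeClass method, paying the IO cost a single time.
    static final A TEMPLATE = new A();

    // Each test would start from a fresh copy, e.g. created in @Before.
    static A freshA() {
        return new A(TEMPLATE);
    }

    public static void main(String[] args) {
        A first = freshA();
        first.data.add("mutated");        // one test mutates its own copy...
        A second = freshA();              // ...the next test still gets a virgin object
        System.out.println(second.data);  // prints [initial]
    }
}
```

The drawback from the last bullet still applies: A grows a copy constructor solely so it can be tested.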
I'm trying to write a unit test for an implementation of a Feign client. I wasn't sure how to go about it so I googled it and came across this answer and the accepted answer is this code snippet:
@Test
public void someTestClient() {
    Person expectedPerson = new Person("name", 12);
    when(mockPersonClient.getPerson()).thenReturn(expectedPerson);
    Person person = mockPersonClient.getPerson();
    assertEquals(expectedPerson, person);
}
I don't understand why this is a useful test or under what circumstance, other than a problem with the Person constructor, this test could ever fail. Isn't this test essentially the equivalent of:
Person person = new Person("a", 1);
Person expectedPerson = new Person("a", 1);
assertEquals(person, expectedPerson);
I understand unit testing should test functionality in isolation. Will this test just ensure that mockPersonClient exists at runtime?
We can configure a mock object to always return a hard-coded, fake object when a method is called on it.
In this example, the OP configured mockPersonClient.getPerson() to return a fake Person, and then wonders what is proved by checking that this fake Person comes back as configured when mockPersonClient.getPerson() is called. I think the code example he showed was just to demonstrate this question; it does not mean he actually wrote that unit test to test some production code.
A test like that doesn't have any value.
Here is a person; I am going to ask this call to return person, and then I will check that I got person when calling that thing. Of course you will get person, you just hard-coded that, so how is it useful?
Unit tests are about functionality, a simple fact which is lost on many.
Any code which changes data, which filters something, which changes something in a controlled way, is a good candidate for a unit test.
People use mocks a little too much, and most of the time for the wrong thing. Yes, we are advised to code against interfaces, but this doesn't mean you should have a very complex system where you pass interfaces all over the place and then your test code tries to mimic that.
When you mock too much, it means the test you are writing is tied too closely to the code it tests; it knows too much about it. Unit tests should not do that, because every time you change the code in some small way, you discover that now you have to change the test: you're no longer mocking 35 interfaces, now you have 47 to mock in a very specific order. That may not be an issue when you have one test, but imagine what happens when you have 1000 tests...
If people tried to code in a more functional way, this would not happen. If you pass data instead of abstractions, you don't have to mock anything.
Instead of mocking a call to a database, isolate it: take the result and pass it to a method. You've just lost an abstraction, and your code does not need to mock anything; you just call the method, pass the data in whatever format you want, and see what happens.
If you want to test against a real database, write an integration test. It's really not that complicated. Mocking should not be the first thing you reach for; do it when it helps and you really must, but most of the time you don't need it, and things are simpler when you don't.
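To illustrate the "pass data, not abstractions" point (all names here are invented): instead of injecting a repository interface and mocking it, let the method take plain values and test it by calling it directly.

```java
import java.util.List;

public class DiscountSketch {
    // Pure function: data in, data out. There is no repository interface
    // here to mock; the caller fetches the prices however it likes.
    static double totalWithDiscount(List<Double> prices, double discount) {
        double sum = prices.stream().mapToDouble(Double::doubleValue).sum();
        return sum * (1.0 - discount);
    }

    public static void main(String[] args) {
        // A "test" is now just a direct call with plain values.
        System.out.println(totalWithDiscount(List.of(10.0, 20.0), 0.1)); // prints 27.0
    }
}
```

The database access moves to the caller, and an integration test can cover that part separately.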
I have a Java class with three fields. I realized I only need two of them due to changes in requirements.
Ideally I'd write a failing test case before modifying code.
Is there a standard way, or should I just ignore TDD for this task?
That's refactoring, so you don't need to start with failing tests.
Find all the methods using the field.
Make sure that they're covered by unit tests.
Refactor the methods so they no longer use the field.
Remove the field.
Ensure that the tests still pass.
Does dropping this field change the behavior of the class? If not, just drop the field and check that the class still works correctly (that is, passes the tests you should already have written).
The TDD principle is to write code "designed by tests". That may sound silly, but it means that the first class you write is the test class, testing the behavior of the class under test. You then iterate over a few steps:
Write the test. It should not compile (you don't have the class/classes under test yet).
Make the test compile. It should fail (you just have an empty class which does not satisfy the assertions in the test).
Make the test pass in the simplest way (usually, just make the method you are testing return the expected value).
Refine/refactor/generalize the class under test and re-run the test (it should still pass). This step should be really fast, usually less than two minutes.
Repeat from step 2 until the desired behavior emerges almost naturally.
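A tiny sketch of one pass through that loop, with an invented Greeter class (step 3 might hard-code the expected value; step 4 then generalizes it):

```java
// Invented example: the "simplest way" of step 3 might hard-code the expected
// value; the refactor/generalize step turns it into the real implementation.
class Greeter {
    String greet(String name) {
        return "Hello, " + name;   // generalized from a hard-coded "Hello, Alice"
    }
}

public class TddLoopSketch {
    public static void main(String[] args) {
        // This assertion existed before Greeter did; at step 1 it did not even compile.
        if (!new Greeter().greet("Alice").equals("Hello, Alice")) {
            throw new AssertionError("red");
        }
        System.out.println("green");   // prints green
    }
}
```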
If you have an exhaustive list of all the fields you need, you can compare that list against the class's actual fields via reflection:
yourObject.getClass().getDeclaredFields() vs your list of fields
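A sketch of that check (class and field names invented):

```java
import java.lang.reflect.Field;
import java.util.Arrays;
import java.util.Set;
import java.util.stream.Collectors;

public class FieldListSketch {
    // Invented class under test: the obsolete third field is already gone.
    static class Config {
        String host;
        int port;
    }

    // Collect the names of the fields the class actually declares.
    static Set<String> declaredFieldNames(Class<?> clazz) {
        return Arrays.stream(clazz.getDeclaredFields())
                     .map(Field::getName)
                     .collect(Collectors.toSet());
    }

    public static void main(String[] args) {
        // The exhaustive list of fields the class is supposed to have.
        Set<String> expected = Set.of("host", "port");
        System.out.println(declaredFieldNames(Config.class).equals(expected)); // prints true
    }
}
```

Written first, such a test fails while the third field still exists and goes green once it is removed.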
Write a test for the constructor without the field you want to remove.
Obviously this only works if the constructor takes the field's value as a parameter.
Delete all tests covering the removed functionality (this doesn't count as "writing production code" as per the 3 Rules of TDD).
Delete all references to the obsolete field in the remaining tests. If any of them then fails, you are allowed to write the production code required to make it pass.
Once your tests are green again, all subsequent modifications fall into the "refactoring" category. You are allowed to remove your (now unused) field here.
Let's say I have a method:
someMethod(X anObject)
where X is a type of object that is extremely complex. By this I mean it is not something one can easily instantiate on the fly. I need to somehow unit test someMethod, but I cannot simply create an X object to pass in as a parameter.
So my first thought is to mock the object, but the problem I run into is that someMethod calls many methods of anObject, meaning the mocked X object has a large number of methods that need to be called, and thus need to be mock-expected. To make things worse, those methods of X return more X objects, so I have to mock objects that expect mock method calls that return yet more mock objects.
Regarding this scenario I have a few questions, as I'm new to the concept of unit testing:
The lengthy unit test method aside, I find that my unit test not only tests whether a method works, but also specifies the implementation (because with the mock-expects I'm basically specifying most of the code that the method calls). Is this a problem (mostly for the concept of unit testing itself)?
Is there any way to get around this, even if only to make my unit test methods be a lot less verbose and more maintainable?
I thought about taking a serialized X object from somewhere else, saving it, and then deserializing it and passing it as the parameter whenever I run my unit test. This is just a random idea off the top of my head; does anyone actually do this?
In case anyone is wondering what exactly I'm doing: I'm using the IDebugContextListener interface to grab debugging information about data in a stack frame at a given step in the Java debugger. The "X" I am referring to are objects defined by that interface, such as IValue, IVariable, and IStackFrame. All these objects are provided to me by the Java debugger at runtime.
The fact that you have this difficulty is a symptom of a design problem. When something is hard to test, refactor until it isn't hard to test.
If one object needs to call too many methods of another, then encapsulation is poor, and responsibilities are poorly placed. Presumably, the Single Responsibility Principle is not being followed. If code calls methods that return objects, and must call methods on those in turn, then the Law of Demeter is not being followed.
Your pain comes from the fact that your method does not comply with the Single Responsibility Principle. Your method does a lot of things with X, and X also sounds too complex. This makes testing very hard, even with mocking.
Break your method down into manageable chunks that each do only one thing.
I've been writing code that processes certain fields of an object by modifying their values. To test it, I first wrote a JUnit test case that recursively traverses the fields of an object and makes sure they're correctly modified. The CUT (Class Under Test) does something very similar: it recursively traverses the fields of an object and modifies them as required.
So the code to recursively traverse the fields remains the same in test case and CUT, and is currently duplicated, which is against DRY. So I have two questions:
1) have you come across such situations in your project? If yes, did you apply DRY, or let such duplication remain as is?
2) if I put this common code in a util method, I will need to write a test case to test that, which would again involve traversing fields recursively. So how can this be solved without adding any duplication?
You have just hit the ugly mirror-testing anti-pattern. If your CUT has a bug, you will most likely copy it into your test case, essentially verifying that the bug is still there.
You would have to show us some more code, but basically your test case should be much simpler: no for loops, no conditions, just assertions. If your production code does some fancy traversing, reflection, etc. on complicated data structures, create a test Java object and test every field manually in the unit test.
Use the visitor pattern to abstract the traversal of the tree, then build visitors both in the test case and in your production code, and test the visitor infrastructure separately.
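A minimal sketch of that idea (all types here are invented): the traversal lives in exactly one place, and the production code and the test each plug in their own visitor.

```java
import java.lang.reflect.Field;

public class TraversalSketch {
    interface FieldVisitor {
        void visit(Field field, Object value);
    }

    // The traversal exists in exactly one place; production code plugs in a
    // visitor that modifies fields, the test plugs in one that asserts on them.
    static void forEachField(Object target, FieldVisitor visitor) {
        for (Field f : target.getClass().getDeclaredFields()) {
            f.setAccessible(true);
            try {
                visitor.visit(f, f.get(target));
            } catch (IllegalAccessException e) {
                throw new RuntimeException(e);
            }
        }
    }

    static class Sample { String s = "x"; int n = 1; }

    public static void main(String[] args) {
        StringBuilder seen = new StringBuilder();
        forEachField(new Sample(),
                (field, value) -> seen.append(field.getName()).append('=').append(value).append(';'));
        System.out.println(seen);
    }
}
```

Only forEachField itself needs a dedicated test; the test case for the CUT then contains plain assertions, with no duplicated traversal logic.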
I have three classes I need to test, let's say Load, Transform and Perform, and they all begin with or work on the same data object: from one data object X, the Load methods do their thing on it, then it is given to Transform, which also does its thing with its methods, and then to Perform, which changes the data object a bit, and then it is ready.
Now I want to write tests for Load, Transform and Perform.
For the test-data object, should I just make a static method in the Load class like
public static TestData makeTestData(...makeit...)
or should I make a TestDataMock or TestDataTest class which can return an example instance, and create a new one in each of the Load, Transform and Perform tests when they need it?
You should always strive to make unit tests independent of each other. For that reason, you should always create any input test data fresh for each test, whenever possible. What you want to test is "given input data X, verify that the output is Y". JUnit has the @Before annotation, which you can use to annotate a method that is run before each test case in that class. Typically, that is where you would put all your set-up code (creating and initializing mock objects, creating or loading test data, etc.).
Alternatively, you could combine your Load, Transform and Perform actions into one test case, but that would be more of an integration test than a unit test.
Sounds like a good example of where test dependencies would be useful, so you don't have to recreate the object every time (or worse, mock it). On top of that, you work with the real output produced by the previous phase, and you don't have to use statics (always a code smell).
JUnit doesn't support dependencies, but TestNG does:
@Test
public void load() { ... }
@Test(dependsOnMethods = "load")
public void transform() { ... }
@Test(dependsOnMethods = "transform")
public void perform() { ... }
If transform() fails, the final report will say "1 Passed (load), 1 Failed (transform) and 1 Skipped (perform)", and you know exactly where to look.
Most test-case classes are written in the style of one test-case class per production class: if you have a class X, it has one corresponding class XTest. But that is not the only way of doing things; if you have a group of classes that cooperate, you could use JUnit for some low-level integration testing of the cooperating classes. You need only think of a suitable name for this test-case class.
However, if you have a group of cooperating classes, consider hiding that fact behind a facade, or even just a single method call on some higher-level class. Then treat that facade or high-level method as the thing to unit-test.
Or are you trying to say that you do not know how to test your three classes in isolation because they are so tightly coupled that the behaviour of one cannot be described without reference to the other two? That suggests you have a poor design: consider a redesign so you can describe the required behaviour of each class in isolation, and therefore test them (at least in part) in isolation.