I know that one way to do it would be:

@Test
public void foo() {
    try {
        // execute code that you expect not to throw Exceptions.
    } catch (Exception e) {
        fail("Should not have thrown any exception");
    }
}

Is there any cleaner way of doing this? (Probably using JUnit's @Rule?)
You're approaching this the wrong way. Just test your functionality: if an exception is thrown the test will automatically fail. If no exception is thrown, your tests will all turn up green.
I have noticed this question garners interest from time to time so I'll expand a little.
Background to unit testing
When you're unit testing it's important to define to yourself what you consider a unit of work. Basically: an extraction of your codebase that may or may not include multiple methods or classes that represents a single piece of functionality.
Or, as defined in The Art of Unit Testing, 2nd Edition by Roy Osherove, page 11:
A unit test is an automated piece of code that invokes the unit of work being tested, and then checks some assumptions about a single end result of that unit. A unit test is almost always written using a unit testing framework. It can be written easily and runs quickly. It's trustworthy, readable, and maintainable. It's consistent in its results as long as production code hasn't changed.
What is important to realize is that one unit of work usually isn't just one method: at the very basic level it is one method, and after that it is encapsulated by other units of work.
Ideally you should have a test method for each separate unit of work so you can always immediately view where things are going wrong. In this example there is a basic method called getUserById() which will return a user, and there are a total of three units of work.
The first unit of work should test whether or not a valid user is being returned in the case of valid and invalid input.
Any exceptions that are being thrown by the datasource have to be handled here: if no user is present, there should be a test that demonstrates that an exception is thrown when the user can't be found. An example of this could be an IllegalArgumentException, which is caught with the @Test(expected = IllegalArgumentException.class) annotation.
Once you have handled all your usecases for this basic unit of work, you move up a level. Here you do exactly the same, but you only handle the exceptions that come from the level right below the current one. This keeps your testing code well structured and allows you to quickly run through the architecture to find where things go wrong, instead of having to hop all over the place.
Handling a test's valid and faulty input
At this point it should be clear how we're going to handle these exceptions. There are 2 types of input: valid input and faulty input (the input is valid in the strict sense, but it's not correct).
When you work with valid input you're setting the implicit expectation that whatever test you write will work.
Such a test method can be named like this: existingUserById_ShouldReturn_UserObject. If this method fails (e.g. an exception is thrown) then you know something went wrong and you can start digging.
By adding another test (nonExistingUserById_ShouldThrow_IllegalArgumentException) that uses the faulty input and expects an exception you can see whether your method does what it is supposed to do with wrong input.
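A minimal sketch of those two tests, assuming a hypothetical service field with a getUserById() method and made-up EXISTING_ID / NON_EXISTING_ID constants:

@Test
public void existingUserById_ShouldReturn_UserObject() {
    // valid input: the implicit expectation is that no exception is thrown
    User user = service.getUserById(EXISTING_ID);
    assertEquals(EXISTING_ID, user.getId());
}

@Test(expected = IllegalArgumentException.class)
public void nonExistingUserById_ShouldThrow_IllegalArgumentException() {
    // faulty input: the test only passes if the expected exception is thrown
    service.getUserById(NON_EXISTING_ID);
}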
TL;DR
You were trying to do two things in your test: check for valid and faulty input. By splitting this into two methods that each do one thing, you will have much clearer tests and a much better overview of where things go wrong.
By keeping the layered units of work in mind you can also reduce the amount of tests you need for a layer that is higher in the hierarchy, because you don't have to account for everything that might have gone wrong in the lower layers: the layers below the current one are a virtual guarantee that your dependencies work, and if something goes wrong, it's in your current layer (assuming the lower layers don't throw any errors themselves).
JUnit 5 (Jupiter) provides three functions to check exception absence/presence:
● assertAll(): asserts that all supplied executables do not throw exceptions.
● assertDoesNotThrow(): asserts that execution of the supplied executable/supplier does not throw any kind of exception. This function is available since JUnit 5.2.0 (29 April 2018).
● assertThrows(): asserts that execution of the supplied executable throws an exception of the expectedType and returns the exception.
Example
package test.mycompany.myapp.mymodule;

import static org.junit.jupiter.api.Assertions.*;

import org.junit.jupiter.api.Test;

class MyClassTest {

    @Test
    void when_string_has_been_constructed_then_myFunction_does_not_throw() {
        String myString = "this string has been constructed";
        assertAll(() -> MyClass.myFunction(myString));
    }

    @Test
    void when_string_has_been_constructed_then_myFunction_does_not_throw__junit_v520() {
        String myString = "this string has been constructed";
        assertDoesNotThrow(() -> MyClass.myFunction(myString));
    }

    @Test
    void when_string_is_null_then_myFunction_throws_IllegalArgumentException() {
        String myString = null;
        assertThrows(
            IllegalArgumentException.class,
            () -> MyClass.myFunction(myString));
    }
}
I stumbled upon this because of SonarQube's rule "squid:S2699": "Add at least one assertion to this test case."
I had a simple test whose only goal was to go through without throwing exceptions.
Consider this simple code:
public class Printer {
    public static void printLine(final String line) {
        System.out.println(line);
    }
}
What kind of assertion can be added to test this method?
Sure, you can make a try-catch around it, but that is only code bloat.
The solution comes from JUnit itself.
In case no exception is thrown and you want to explicitly illustrate this behaviour, simply add expected as in the following example:
@Test(expected = Test.None.class /* no exception expected */)
public void test_printLine() {
    Printer.printLine("line");
}
Test.None.class is the default for the expected value.
If you import org.junit.Test.None, you can then write:
@Test(expected = None.class)
which you might find more readable.
For JUnit versions before 5:
With AssertJ fluent assertions 3.7.0:
Assertions.assertThatCode(() -> toTest.method())
    .doesNotThrowAnyException();
Update:
JUnit 5 introduced assertDoesNotThrow() assertion, so I'd prefer to use it instead of adding an additional dependency to your project. See this answer for details.
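For comparison, the JUnit 5 equivalent of the AssertJ snippet above would be:

// from org.junit.jupiter.api.Assertions
assertDoesNotThrow(() -> toTest.method());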
Java 8 makes this a lot easier, and Kotlin/Scala doubly so.
We can write a little utility class
class MyAssertions {
    public static void assertDoesNotThrow(FailingRunnable action) {
        try {
            action.run();
        } catch (Exception ex) {
            throw new Error("expected action not to throw, but it did!", ex);
        }
    }
}

@FunctionalInterface
interface FailingRunnable {
    void run() throws Exception;
}
and then your code becomes simply:
@Test
public void foo() {
    MyAssertions.assertDoesNotThrow(() -> {
        // execute code that you expect not to throw Exceptions.
    });
}
If you don't have access to Java 8, I would use a painfully old Java facility: arbitrary code blocks and a simple comment
//setup
Component component = new Component();
//act
configure(component);
//assert
/*assert does not throw*/{
component.doSomething();
}
And finally, with Kotlin, a language I've recently fallen in love with:

fun (() -> Any?).shouldNotThrow() =
    try { invoke() } catch (ex: Exception) { throw Error("expected not to throw!", ex) }

@Test fun `when foo happens should not throw`() {
    //...
    { /*code that shouldn't throw*/ }.shouldNotThrow()
}
Though there is a lot of room to fiddle with exactly how you want to express this, I was always a fan of fluent assertions.
Regarding
You're approaching this the wrong way. Just test your functionality: if an exception is thrown the test will automatically fail. If no exception is thrown, your tests will all turn up green.
This is correct in principle but incorrect in conclusion.
Java allows exceptions to be used for flow of control. This is done by the JRE runtime itself in APIs like Double.parseDouble via a NumberFormatException and Paths.get via an InvalidPathException.
Given you've written a component that validates number strings for Double.parseDouble (maybe using a regex, maybe a hand-written parser, or perhaps something that embeds some other domain rules that restrict the range of a double to something specific), how best to test this component? I think an obvious test would be to assert that, when the resulting string is parsed, no exception is thrown. I would write that test using either the above assertDoesNotThrow or the /*comment*/{code} block. Something like:
@Test
public void given_validator_accepts_string_result_should_be_interpretable_by_doubleParseDouble() {
    //setup
    String input = "12.34E+26"; //a string double with domain significance

    //act
    boolean isValid = component.validate(input);

    //assert -- using the library 'assertJ', my personal favourite
    assertThat(isValid).describedAs(input + " was considered valid by component").isTrue();
    assertDoesNotThrow(() -> Double.parseDouble(input));
}
I would also encourage you to parameterize this test on input using Theories or Parameterized so that you can more easily re-use this test for other inputs. Alternatively, if you want to go exotic, you could go for a test-generation tool. TestNG has better support for parameterized tests.
What I find particularly disagreeable is the recommendation of using @Test(expected = IllegalArgumentException.class): this exception is dangerously broad. If your code changes such that the component under test's constructor has if (constructorArgument <= 0) throw new IllegalArgumentException(), and your test was supplying 0 for that argument because it was convenient (and this is very common, because generating good test data is a surprisingly hard problem), then your test will be green-bar even though it tests nothing. Such a test is worse than useless.
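A sketch of a narrower alternative, using AssertJ; the component.validate(null) call and the message are assumptions made up for illustration:

// scope the assertion to the one call under test and pin down which
// IllegalArgumentException is meant, instead of accepting any of them
assertThatThrownBy(() -> component.validate(null))
        .isInstanceOf(IllegalArgumentException.class)
        .hasMessageContaining("input must not be null");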
If you are unlucky enough to catch all errors in your code, you can stupidly do:
class DumpTest {

    Exception ex;

    @Test
    public void testWhatEver() {
        try {
            thisShouldThrowError();
        } catch (Exception e) {
            ex = e;
        }
        assertEquals(null, ex);
    }
}
Although this post is 6 years old now, a lot has changed in the JUnit world. With JUnit 5, you can now use
org.junit.jupiter.api.Assertions.assertDoesNotThrow()
Ex:
public void thisMethodDoesNotThrowException() {
    System.out.println("Hello There");
}

@Test
public void test_thisMethodDoesNotThrowException() {
    org.junit.jupiter.api.Assertions.assertDoesNotThrow(
        () -> thisMethodDoesNotThrowException()
    );
}
Hope it will help people who are using a newer version of JUnit (JUnit 5).
JUnit 5 adds the assertAll() method for this exact purpose.
assertAll( () -> foo() )
source: JUnit 5 API
To test a scenario with a void method like
void testMeWell() throws SomeException {..}
to not throw an exception:
JUnit 5:

assertDoesNotThrow(() -> {
    testMeWell();
});
If you want to test whether your test target consumes the exception, just leave the test as follows (mocking the collaborator using jMock2):
@Test
public void consumesAndLogsExceptions() throws Exception {
    context.checking(new Expectations() {
        {
            oneOf(collaborator).doSth();
            will(throwException(new NullPointerException()));
        }
    });
    target.doSth();
}
The test would pass if your target does consume the exception thrown, otherwise the test would fail.
If you want to test your exception consumption logic, things get more complex. I suggest delegating the consumption to a collaborator which could be mocked. Therefore the test could be:
@Test
public void consumesAndLogsExceptions() throws Exception {
    final Exception e = new NullPointerException();
    context.checking(new Expectations() {
        {
            allowing(collaborator).doSth();
            will(throwException(e));
            oneOf(consumer).consume(e);
        }
    });
    target.doSth();
}
But sometimes it's over-designed if you just want to log the exception. In that case, these articles (http://java.dzone.com/articles/monitoring-declarative-transac, http://blog.novoj.net/2008/09/20/testing-aspect-pointcuts-is-there-an-easy-way/) may help if you insist on TDD in this case.
Use assertNull(...)
@Test
public void foo() {
    try {
        // execute code that you expect not to throw Exceptions.
    } catch (Exception e) {
        assertNull(e);
    }
}
This may not be the best way, but it definitely makes sure that an exception is not thrown from the code block being tested.
import org.assertj.core.api.Assertions;
import org.junit.Test;

public class AssertionExample {

    @Test
    public void testNoException() {
        assertNoException();
    }

    private void assertException() {
        Assertions.assertThatThrownBy(this::doNotThrowException).isInstanceOf(Exception.class);
    }

    private void assertNoException() {
        Assertions.assertThatThrownBy(() -> assertException()).isInstanceOf(AssertionError.class);
    }

    private void doNotThrowException() {
        // This method will never throw an exception
    }
}
I faced the same situation: I needed to check that an exception is thrown when it should be, and only when it should be.
I ended up using the exception handler to my benefit with the following code:
try {
    functionThatMightThrowException();
} catch (Exception e) {
    Assert.fail("should not throw exception");
}
RestOfAssertions();
The main benefit for me is that it is quite straightforward, and checking the other direction of the "if and only if" is really easy in this same structure, as shown below.
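The complementary direction reads the same way; a sketch, using IllegalArgumentException as a placeholder for whatever exception you actually expect:

try {
    functionThatMightThrowException();
    Assert.fail("should have thrown an exception");
} catch (IllegalArgumentException e) {
    // expected: fall through and continue with the remaining assertions
}
RestOfAssertions();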
I ended up doing it like this:

@Test
fun `Should not throw`() {
    whenever(authService.isAdmin()).thenReturn(true)

    assertDoesNotThrow {
        service.throwIfNotAllowed("client")
    }
}
You can declare that no exception is expected by creating a rule:

@Rule
public ExpectedException expectedException = ExpectedException.none();
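A minimal sketch of how that reads in a test class (toTest.method() is a placeholder for the code under test):

public class NoExceptionTest {

    @Rule
    public ExpectedException expectedException = ExpectedException.none();

    // with no expect(...) calls on the rule, the test fails if anything is thrown
    @Test
    public void shouldNotThrowAnyException() {
        toTest.method();
    }
}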
Another option is to use a @Rule and then call the rule's reportMissingExceptionWithMessage method (the original answer showed this in Scala code).
I stumbled over this issue since I created some generic methods like:

@Test
void testSomething() {
    checkGeneric(aComplexObject);
}
In https://newbedev.com/sonarqube-issue-add-at-least-one-assertion-to-this-test-case-for-unit-test-with-assertions some annotation-based workarounds are proposed.
The solution is much simpler: it's enough to rename the "checkGeneric" method to "assertGeneric".
@Test
void testSomething() {
    assertGeneric(aComplexObject);
}
You can create any kind of your own assertions based on the assertions from JUnit, because these are specifically designed for creating user-defined asserts intended to work exactly like the JUnit ones:
static void assertDoesNotThrow(Executable executable) {
    assertDoesNotThrow(executable, "must not throw");
}

static void assertDoesNotThrow(Executable executable, String message) {
    try {
        executable.execute();
    } catch (Throwable err) {
        fail(message);
    }
}
Now we can test the scenario method methodMustNotThrow and log all failures in a JUnit style:
// test and log with default and custom messages
// the following will succeed
assertDoesNotThrow(() -> methodMustNotThrow(1));
assertDoesNotThrow(() -> methodMustNotThrow(1), "custom facepalm");

// the following will fail
assertDoesNotThrow(() -> methodMustNotThrow(2));
assertDoesNotThrow(() -> { throw new Exception("Hello world"); }, "message");

// See the implementation of methodMustNotThrow below
Generally speaking, it is possible to instantly fail the test in any scenario, at any place where it makes sense, by calling fail(someMessage), which is designed exactly for this purpose. For instance, use it in a try/catch block to fail if anything is thrown in the test case:
try{methodMustNotThrow(1);}catch(Throwable e){fail("must not throw");}
try{methodMustNotThrow(1);}catch(Throwable e){Assertions.fail("must not throw");}
This is a sample of the method under test, supposing we have a method that must not fail under specific circumstances, but that can fail:
void methodMustNotThrow(int x) throws Exception {
    if (x == 1) return;
    throw new Exception();
}
The above method is a simple sample, but this works for complex situations where the failure is not so obvious.
These are the imports:
import org.junit.jupiter.api.Assertions;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.function.Executable;
import static org.junit.jupiter.api.Assertions.*;
AssertJ can handle this scenario:
assertThatNoException().isThrownBy(() -> System.out.println("OK"));
Check the doc for more information https://assertj.github.io/doc/#assertj-core-exception-assertions-no-exception
The following fails the test for all exceptions, checked or unchecked:
@Test
public void testMyCode() {
    try {
        runMyTestCode();
    } catch (Throwable t) {
        throw new Error("fail!");
    }
}
I've tried to avoid duplicate code in a JUnit test, but I'm kind of stuck.
This is my first test; the second one has exactly the same methods but a different service (different input): instead of TestCaseResourceTest1 I have TestCaseResourceTest2. Now what would be the proper way to test both? I want to have a separate file for test number 2, so how should I avoid the duplicate code? (e.g. use the beforeFileTest() method)
public class TestCaseResourceTest1 {

    @Mock
    private TestService testService;
    @Mock
    private AreaService areaService;

    private TestCaseService1 testCaseService1; // is changed in test2

    @Before
    public void before() throws Exception {
        testCaseService1 = mock(TestCaseService1.class); // is changed in test2
        MockitoAnnotations.initMocks(this);
        beforeFileTest();
    }

    private void beforeFileTest() throws Exception {
        doReturn(true).when(areaService).chechExists(any(String.class), eq(false));
    }

    @Test
    public void verifyFileExists() throws Exception {
        verifyOtherArea(testCaseService1); // is changed in test2
        doReturn(false).when(areaService).chechExists(any(String.class), eq(false));
    }
}
Just the lines with the comment "is changed in test2" are different.
Thanks
Given this excerpt from your question:
… instead of the TestCaseResourceTest1 I have TestCaseResourceTest2 … I want to have a separate file for test number 2
… the standard ways of sharing code between test cases are:
Create a Test Suite and include the shared code in the test suite (typically in @BeforeClass and @AfterClass methods). This allows you to (1) run setup code once (per suite invocation); (2) encapsulate shared setup/teardown code and (3) easily add more test cases later. For example:
@RunWith(Suite.class)
@Suite.SuiteClasses({
    TestCaseResourceTest1.class,
    TestCaseResourceTest2.class
})
public class TestSuiteClass {

    @BeforeClass
    public static void setup() throws Exception {
        beforeFileTest();
    }

    private static void beforeFileTest() throws Exception {
        // ...
    }
}
Create an abstract class which parents TestCaseResourceTest1 and TestCaseResourceTest2 and let those test cases call the shared code in the parent (typically via super() calls). With this approach you can declare default shared code in the parent while still allowing subclasses to (1) have their own behaviour and (2) selectively override the parent/default behaviour.
Create a custom JUnit runner, define the shared behaviour in this runner and then annotate the relevant test cases with @RunWith(YourCustomRunner.class). More details on this approach here.
Just to reiterate what some of the other posters have said: this is not a common first step, so you may prefer to start simple and only move to suites, abstract classes, or custom runners if your usage provides a compelling reason to do so.
I had such a situation, and it was a sign of a flawed implementation design. We are talking about pure unit tests, where we test exactly what is implemented in the production classes. If we need duplicated tests, it probably means we have duplication in the implementation.
How did I resolve it in my project?
Extracted the common logic into a parent service class and implemented unit tests for it (see the sketch after this list).
For the child services I implemented tests only for the particular code implemented there, nothing more.
Implemented integration tests on a real environment where both services were involved and tested completely.
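A rough sketch of that split, with hypothetical names and logic (the real services will differ):

// common logic lives in the parent and is unit tested once there
public abstract class AbstractCaseService {

    protected boolean checkAreaExists(String area) {
        return area != null && !area.isEmpty();
    }
}

// each child service is only tested for the code it adds itself
public class TestCaseService1 extends AbstractCaseService {

    public String loadCase(String area) {
        return checkAreaExists(area) ? "case found" : "no case";
    }
}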
Assuming you want to have the exact same test run for two different classes (and not mock them as in your example code), you can create an abstract test class that has an abstract method returning an instance of the class to be tested.
Something in the vein of:
public abstract class TestCaseResourceTest {

    protected TestCaseService1 testCaseService1;

    protected abstract TestCaseService1 getServiceToTest();

    @Before
    public void before() throws Exception {
        testCaseService1 = getServiceToTest();
        MockitoAnnotations.initMocks(this);
        beforeFileTest();
    }

    protected void beforeFileTest() throws Exception {
        // shared setup (as in the question) goes here
    }

    @Test
    public void test() {
        // do your test here
    }
}

public class ConcreteTest extends TestCaseResourceTest {
    @Override
    protected TestCaseService1 getServiceToTest() {
        return new TestCaseService1();
    }
}

// assuming DifferentService is a subtype of TestCaseService1
public class ConcreteTest2 extends TestCaseResourceTest {
    @Override
    protected TestCaseService1 getServiceToTest() {
        return new DifferentService();
    }
}
Have you considered using JUnit 5 with its parameterized tests (http://junit.org/junit5/docs/current/user-guide/#writing-tests-parameterized-tests)?
It allows you to re-use your tests with different input. This is an example from the documentation which illustrates what you can do now with JUnit 5:
@ParameterizedTest
@ValueSource(strings = { "Hello", "World" })
void testWithStringParameter(String argument) {
    assertNotNull(argument);
}
But you can also create your methods which return the input data:
@ParameterizedTest
@MethodSource("stringProvider")
void testWithSimpleMethodSource(String argument) {
    assertNotNull(argument);
}

static Stream<String> stringProvider() {
    return Stream.of("foo", "bar");
}
Here I am using just strings, but you can really use any objects.
If you are using Maven, you can add these dependencies to start using JUnit 5:
<dependency>
    <groupId>org.junit.jupiter</groupId>
    <artifactId>junit-jupiter-params</artifactId>
    <version>5.0.0-RC2</version>
    <scope>test</scope>
</dependency>
The only annoying thing about JUnit 5 is that it is not released yet.
When going from one test to two tests, you don't know what will be duplicate code, so I find it useful to put everything into one test method. In this case, start by putting the contents of the @Before and beforeFileTest methods inline in the test.
Then you can see that it is just the service that needs changing, so you can extract everything except that into a helper method that is called from the two tests.
Also, after you have two tests that are calling the same helper method and are happy with that test coverage, you could look into writing parameterized tests. For example with JunitParams: https://github.com/Pragmatists/junitparams/wiki/Quickstart
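For illustration, a minimal JunitParams sketch; the test class name, parameter values, and assertion are placeholders, not taken from the question:

import junitparams.JUnitParamsRunner;
import junitparams.Parameters;
import org.junit.Test;
import org.junit.runner.RunWith;

import static org.junit.Assert.assertNotNull;

@RunWith(JUnitParamsRunner.class)
public class SharedBehaviourTest {

    // each string is one parameter set; the same test body runs for every value
    @Test
    @Parameters({"input for service 1", "input for service 2"})
    public void sharedTest(String input) {
        assertNotNull(input);
    }
}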
I was trying to write unit tests using JMock and JUnit. (My project uses core Java, no frameworks.) I could not write unit tests for some of my classes by mocking external dependencies, because the dependencies were initialized in a no-arg constructor.
As I cannot provide the actual code, I will try to explain the scenario with an example.
public interface Apple {
    String variety();
}

Implementation:

public class MalgovaApple implements Apple {
    @Override
    public String variety() {
        return "Malgova";
    }
}
Class to be tested:

public class VarietyChecker {

    private Apple apple;

    VarietyChecker() {
        this.apple = new MalgovaApple();
        // instead of new, a factory method is used in the actual application
    }

    public String printAppleVariety() {
        String variety = apple.variety();
        if (variety.length() < 3) {
            System.out.println("Do not use code names - use complete names");
            return "bad";
        }
        return "good";
    }
}
JUnit test using JMock:

public class VarietyCheckerUnitTest {

    Mockery context = new JUnit4Mockery();

    @Before
    public void setUp() throws Exception {
    }

    @After
    public void tearDown() throws Exception {
    }

    @Test
    public void test_VarietyChecker() throws Exception {
        final Apple mockapple = context.mock(Apple.class);
        VarietyChecker printer = new VarietyChecker();

        context.checking(new Expectations() {{
            oneOf(mockapple).variety();
            will(returnValue("as"));
        }});

        String varietyNameValid = printer.printAppleVariety();
        assertEquals("bad", varietyNameValid);
    }
}
This test fails: mocking does not work, the value "as" is not injected, and the class under test executes with MalgovaApple.
Now if we add the constructor below to VarietyChecker and use it in the test case, it gives the expected output:
public VarietyChecker(Apple apple) {
    super();
    this.apple = apple;
}
and in the unit test create the object under test like this:
VarietyChecker printer = new VarietyChecker(mockapple);
Exposing a new constructor just for the purpose of testing is not a good idea. After all, it is said that you should not alter the code for testing alone; more than that, I am afraid we have already written "some" (amount of) code...
Am I missing something in JUnit or JMock that can make mocking work even in the case of no-arg constructors? Or is this a limitation of plain JUnit and JMock, and should I migrate to something more powerful like JMockit/PowerMock?
You should consider two choices.
Use a constructor parameter as you describe.
In this case, you're not "exposing a new constructor just for the purpose of testing". You're making your class more flexible by allowing callers to use a different factory implementation (see the sketch at the end of this answer).
Don't mock it.
In this case, you are declaring that it never makes sense to use a different factory. Sometimes this is okay. At that point, the question changes, though. Instead of, "How do I mock this?" your question is now, "What am I gaining from writing this test?" You might not be gaining much of anything, and it might not make much sense to write the test at all.
If you don't mock it and decide a unit test is still worth it, then you should be asserting on other aspects of the code. Either an end state or some output. In this case, the factory call becomes an implementation detail that's not appropriate for mocking.
It's important not to fall for a "unit test everything" mentality. That is a recipe for Test-induced Design Damage. Evaluate your tests on a case-by-case basis, deciding whether they're providing you any real value or not. Not writing a unit test is a valid option and is even appropriate at times, even if it's an option you try very hard to avoid.
Only you can make a determination about which one makes the most sense in this case. From the fact that this is a factory object we're talking about, I'd probably lean toward the former.
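A sketch of the first choice, based on the classes from the question: keep the convenient no-arg constructor for production code, and let tests (or any other caller) supply the Apple:

public class VarietyChecker {

    private final Apple apple;

    // production code keeps its convenient default
    public VarietyChecker() {
        this(new MalgovaApple());
    }

    // tests, or callers that need a different factory, pass their own Apple
    public VarietyChecker(Apple apple) {
        this.apple = apple;
    }

    // printAppleVariety() as before
}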
I've been testing my code's behavior using TestNG and JMockit for a while now and I have had no specific issue with their combination. Today I came across a situation where I needed to mock one of my internal dependencies in the so-called type-wide manner, and I did not need to keep that mock around since none of the test cases dealt with it directly while they counted on the mocked version's functionality. So, naturally, I put the mocking logic in my @BeforeMethod. Here is a sample:
public class SampleTest
{
    @Mocked
    @Cascading
    private InnerDependency dependency;

    @BeforeMethod
    public void beforeMethod()
    {
        new NonStrictExpectations()
        {
            {
                dependency.getOutputStream((String) any);
                result = new Delegate<OutputStream>()
                {
                    public OutputStream getOutputStream(String url)
                    {
                        return null;
                    }
                };
            }
        };
    }

    @Test
    public void testNormalOperation()
    {
        // The test whose desired behavior depends on dependency being mocked out
        // ...
    }
}
But, since my tests do not care about the mocked dependency explicitly, I'm not willing to declare it as a test class field, unlike what is done above. To my knowledge of JMockit, the only options remaining would be:
Declare dependency as a local mock field:
new NonStrictExpectations()
{
    @Cascading
    private InnerDependency dependency;

    {
        //...
    }
};
Declare dependency as an input argument for beforeMethod(), similar to what is done for normal @Test methods:

@BeforeMethod
public void beforeMethod(@Mocked @Cascading final InnerDependency dependency)
{
    // ...
}
I see that JMockit 1.6+ would not like the first option and warns with WARNING: Local mock field "dependency" should be moved to the test class or converted to a parameter of the test method. Hence, to keep everyone happy, I'm ruling this option out.
But for the second option, TestNG (currently 6.8.6) throws an exception when running the test, saying java.lang.IllegalArgumentException: wrong number of arguments. I don't see this behavior with normal @Test cases that are passed @Mocked parameters. Even playing with @Parameters and @Optional will not help (and should not have!).
So, is there any way I could make this work without declaring the unnecessary test class mock field, or am I missing something here?
Thanks
Only test methods (annotated with @Test in JUnit or TestNG) support mock parameters, so the only choice here is to declare a mock field at the test class level.
Even if it is not used in any test method, I think it's better than having it declared in a setup method (using @Before, @BeforeMethod, etc.). Even if that were possible, the mock would still have to apply to all tests, because of the nature of setup methods; having a mock field on the test class makes it clear what the scope of the mock is.
Dynamic partial mocking is one more technique to specify @Mocked dependencies locally. However, it has its limitations (see comments below).
I don't know why, but I have always written my JMock tests like this:
@Test
public void testMyThing() throws Exception {
    mockery.checking(new Expectations() {{
        oneOf(mockObj).foo();
    }});
    testObj.bar(); // calls mockObj.foo()
    mockery.assertIsSatisfied();
}
But when there are many tests, is it better to move assertIsSatisfied to the tear-down?
@After
public void tearDown() throws Exception {
    mockery.assertIsSatisfied();
}
The recommended way to do this is to use the JMock runner. Annotate the class with:

@RunWith(JMock.class)
public class TestClass {

This will call the assertion at the right place in the test lifecycle. Teardown isn't the right place, as a failure might not be reported correctly and might mess up other cleanup.
We also have a mockery rule in the repository that works with the new @Rule infrastructure.
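A sketch of that rule-based approach, using JUnitRuleMockery from jMock's JUnit 4 integration; the collaborator interface here is made up for illustration. The rule verifies the expectations after each test, so no explicit assertIsSatisfied() call is needed:

import org.jmock.Expectations;
import org.jmock.integration.junit4.JUnitRuleMockery;
import org.junit.Rule;
import org.junit.Test;

public class TestClass {

    public interface Collaborator {
        void foo();
    }

    @Rule
    public JUnitRuleMockery mockery = new JUnitRuleMockery();

    private final Collaborator mockObj = mockery.mock(Collaborator.class);

    @Test
    public void testMyThing() {
        mockery.checking(new Expectations() {{
            oneOf(mockObj).foo();
        }});

        mockObj.foo(); // stands in for testObj.bar(), which would call mockObj.foo()

        // no mockery.assertIsSatisfied() needed: the rule verifies after each test
    }
}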
Yes, I tend to do it in teardown. It keeps the focus of the individual test methods on what they're actually testing by moving the boilerplate out into the @After; it's critical to me that tests are as expressive and readable as possible.
In fact, I sometimes take it further than that, and use a JMockSupport base class that handles the Mockery for me (as well as providing convenience implementations of mock(...)). This is just a convenience, of course, and by no means a requirement like it was in JUnit 3.
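A sketch of such a base class; the name JMockSupport comes from the answer above, but the exact shape is an assumption:

import org.jmock.Mockery;
import org.jmock.integration.junit4.JUnit4Mockery;
import org.junit.After;

public abstract class JMockSupport {

    protected final Mockery mockery = new JUnit4Mockery();

    // convenience wrapper so subclasses can simply call mock(SomeType.class)
    protected <T> T mock(Class<T> type) {
        return mockery.mock(type);
    }

    // the verification boilerplate lives here once, instead of in every test class
    @After
    public void verifyMockery() {
        mockery.assertIsSatisfied();
    }
}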