Simple question. If I use spring-data to generate CRUD methods for my DAO layer, should I still write unit tests against the generated methods? Or would that be the equivalent of unit testing library code?
Thanks in advance.
EDIT: To clarify, I'm asking whether or not the unit test needs to be written in addition to a suite of integration tests that get run before a release. For example, a unit test for the findAll() method of the DAO layer would be similar to the following:
class DepartmentDAOTest extends spock.lang.Specification {
    /* ... */
    def "returns all departments"() {
        when:
        List<Department> result = dao.findAll()

        then:
        result.size() == EXPECTED_SIZE
    }
}
An integration test, on the other hand, would probably be run by hand by a test team or a developer, possibly before tagging a new release. It could also be automated using JWebUnit or Geb, and it exercises every component (including the platform) to ensure they work as expected when "integrated."
If I were to write the DAO implementation by hand using JdbcTemplate there would be no question that I should unit test every method. When I unit test the service layer (which makes calls to the DAO layer) I can mock out the DAO layer so I don't test it twice.
If I make a call into a third-party library like pdfbox for generating a PDF, there's an expectation for each method to work (because it is tested as a part of the pdfbox project). I don't test that their drawSquare method really draws a square, but during integration testing I'll see that my export PDF functionality correctly exports a PDF the way we want it to.
So the question should really be re-worded as, "Under which testing phase should I test my usage of spring-data?"
First, there is no code generated at all. We build a query meta-model from the query methods you declare and execute these queries dynamically. The short answer here is: you definitely should test the methods you declare. The reason is as obvious as it is simple: the query method declarations - no matter whether they use derived queries or manually declared ones - interact with the mapping metadata you defined for your entities. Thus, it's definitely reasonable to check the query method execution to make sure you see the expected results. This is then of course more of an integration test and a semantic check of the queries executed, rather than a classical unit test.
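As a rough illustration, here is a minimal sketch of such a query-method integration test using Spring Boot's @DataJpaTest slice. The Department entity, the DepartmentRepository and its derived findByName method are made-up names for the sake of the example:

import static org.junit.Assert.assertEquals;

import java.util.List;

import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.autoconfigure.orm.jpa.DataJpaTest;
import org.springframework.boot.test.autoconfigure.orm.jpa.TestEntityManager;
import org.springframework.test.context.junit4.SpringRunner;

@RunWith(SpringRunner.class)
@DataJpaTest // starts an embedded database and the JPA mapping metadata, not the whole context
public class DepartmentRepositoryIntegrationTest {

    @Autowired
    private TestEntityManager entityManager;

    @Autowired
    private DepartmentRepository repository; // hypothetical Spring Data repository

    @Test
    public void derivedQueryFindsPersistedDepartment() {
        // given a known row in the database
        entityManager.persistAndFlush(new Department("Accounting")); // hypothetical entity

        // when the declared query method is executed
        List<Department> result = repository.findByName("Accounting");

        // then the mapping metadata and the derived query agree on the result
        assertEquals(1, result.size());
    }
}

This checks the semantics of the declared query against real mapping metadata, which is exactly the kind of integration-level verification described above.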
No. As a general rule, don't test the platform.
Related
I came across this question while having some trouble writing unit tests for my Spring application.
Let's take the following example:
@SpringBootTest
@RunWith(SpringRunner.class)
public class AlbumDTOConverterTest {

    @Autowired
    private AlbumDTOConverter albumDTOConverter;

    @Test
    public void ToEntity_ReturnValue_NotNull() {
        AlbumDTO albumDTO = new AlbumDTO("Summer album", new Date(), "This summer, we have made some wonderfull photos. Have a look!", null);
        assertNotNull(albumDTOConverter.toEntity(albumDTO));
    }
}
In order to make @Autowired work properly, I am launching a container by annotating the test class with @SpringBootTest.
The thing is, I think I am doing this wrong. In my opinion, I'd rather just create a new instance of AlbumDTOConverter using the new operator instead of relying on Spring's dependency injection.
What do you guys think about this ?
For unit tests you don't need a whole container to start. By definition, such units should be tested in isolation. Creating an instance of a class with the new keyword is perfectly fine. Even if the class has dependencies on other classes, you can create those manually as well and pass them to an appropriate constructor of the class.
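As a minimal sketch of that idea, the converter test from the question could look like this without any container, assuming AlbumDTOConverter has a no-argument constructor (if it has collaborators, construct them by hand and pass them in the same way):

import static org.junit.Assert.assertNotNull;

import java.util.Date;

import org.junit.Test;

public class AlbumDTOConverterTest {

    // No Spring container: the unit under test is created directly with new.
    private final AlbumDTOConverter albumDTOConverter = new AlbumDTOConverter();

    @Test
    public void toEntity_ReturnValue_NotNull() {
        AlbumDTO albumDTO = new AlbumDTO("Summer album", new Date(), "Some description", null);
        assertNotNull(albumDTOConverter.toEntity(albumDTO));
    }
}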
Note that a unit is not the same as a class. The term "unit" is commonly confused among developers, especially beginners. You don't have to rely on dependency injection in your unit tests. The container will only increase the time needed to execute the tests, and long execution time is the main reason why developers avoid running them often. There is nothing wrong with manually building the dependency tree for a unit under test.
In the long run, creating similar inputs for different tests might lead to duplication in the test code, but fortunately there are best practices for this problem, e.g. shared fixture.
If you are writing unit tests, you should not use @Autowired every time.
The basic idea of unit testing is: "Unit tests are responsible for testing a specific piece of code, just a small piece of functionality (a unit) of the code."
Now the question is: when should you use Spring's capabilities?
Sometimes you'll need to write tests that rely on the Spring framework, such as web service calls, repository calls, etc. For example, if you have a repository with a custom query using the @Query annotation, you might need to test that query. Also, if you are serialising/deserialising objects, you'd want to make sure that your object mapping is working. You might want to test your controllers as well, when you have some parameter validation or error handling. How can you be sure that you are using Spring correctly? In these situations you can take advantage of the newer Spring Boot test annotations.
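For example, here is a hedged sketch of a controller slice test with @WebMvcTest. The AlbumController, its GET /albums endpoint and the required name parameter are assumptions made up for this illustration; if the controller has service dependencies, you would add @MockBean fields for them so the slice can start:

import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.get;
import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.status;

import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.autoconfigure.web.servlet.WebMvcTest;
import org.springframework.test.context.junit4.SpringRunner;
import org.springframework.test.web.servlet.MockMvc;

@RunWith(SpringRunner.class)
@WebMvcTest(AlbumController.class) // loads only the web layer for this controller
public class AlbumControllerTest {

    @Autowired
    private MockMvc mockMvc;

    @Test
    public void missingParameterIsRejected() throws Exception {
        // no "name" request parameter supplied, so parameter validation should reject the call
        mockMvc.perform(get("/albums"))
               .andExpect(status().isBadRequest());
    }
}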
I think this will give you a better idea.
Unit tests have become increasingly important in modern software development, and I'm finding myself lost in the dust. Primarily a Java programmer, I understand the basics of unit tests: have methods in place that test fundamental operations in your application. I can implement this for the simple (and often used as examples) cases:
public int addNumbers(int a, int b) { return a + b; }
//Unit test for above
public boolean testAddNumbers() {return addNumbers(5, 10) == 15;}
What confuses me is how to move this out into practical application. After all, most simple functions are already in APIs or the JDK. A real world situation that I do frequently in my job is data access, i.e. writing DAOs to work with a database. I can't write static tests like the example above, because pulling a record set from an Oracle box can return a whole manner of things. Writing a generalized unit test that just looks for a particular pattern in the return set seems too broad and unhelpful. Instead, I write no unit tests. This is bad.
Another example of a case where I don't know how to approach writing tests is web applications. My web applications are typically built on a J2EE stack, but they don't involve much logic. Generally it's delivering information from databases with little to no manipulation. Are these inappropriate targets for unit tests?
In short, I've found the vast majority of unit test examples to focus on test cases that are too simplistic and not relevant to what I do. I'm looking for any (preferably Java) examples/tips on writing unit tests for applications that move and display data, not perform logic on it.
You generally don't write unit tests for DAOs, but integration tests. These tests basically consist of:
setting the database to a well-known state, suitable for the test
calling the DAO method
verifying that the DAO returns the right data and/or changes the state of the database as expected.
Shameless plug: DbSetup is a good tool for the first part, but other tools exist, like DbUnit.
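Here is a minimal sketch of those three steps using plain JDBC and JUnit. The EmployeeDao, the Employee class, the EMPLOYEE table and the TestDataSources helper are all made up for the example; in practice a tool like DbSetup or DbUnit would replace the hand-written setup:

import static org.junit.Assert.assertEquals;

import java.sql.Connection;
import java.sql.Statement;
import java.util.List;

import javax.sql.DataSource;

import org.junit.Before;
import org.junit.Test;

public class EmployeeDaoIntegrationTest {

    private final DataSource dataSource = TestDataSources.inMemory(); // hypothetical helper returning an embedded DB
    private final EmployeeDao dao = new EmployeeDao(dataSource);      // hypothetical DAO under test

    @Before
    public void populateDatabase() throws Exception {
        // 1. put the database into a well-known state
        try (Connection connection = dataSource.getConnection();
             Statement statement = connection.createStatement()) {
            statement.execute("DELETE FROM EMPLOYEE");
            statement.execute("INSERT INTO EMPLOYEE (ID, NAME) VALUES (1, 'Alice')");
            statement.execute("INSERT INTO EMPLOYEE (ID, NAME) VALUES (2, 'Bob')");
        }
    }

    @Test
    public void findAllReturnsEveryRow() {
        // 2. call the DAO method
        List<Employee> employees = dao.findAll();

        // 3. verify the returned data
        assertEquals(2, employees.size());
    }
}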
To test the business logic of the app (complex or not, that doesn't change much), you typically mock the DAOs using a mocking framework like Mockito:
SomeDao mockDao = mock(SomeDao.class);
when(mockDao.findAllEmployees()).thenReturn(Arrays.asList(e1, e2, e3));
SomeService service = new SomeService(mockDao);
service.increaseSalaryOfAllEmployees(1000);
// todo verify that e1, e2 and e3's salary is 1000 larger than before
I was assigned to an old web application (JSF 1.2 + EclipseLink). There is no middleware like EJB or Spring, and the service layer of the application is composed of POJOs that directly call the EntityManager. The structure of the code is SomeBean (backing bean) -> SomeService (a mix of business logic and data access code), with no separate DAO layer. The code in the service classes usually looks like this (heavily simplified here):
public void someMethod(SomeEntity someEntity, ...) throws SomeServiceException {
    try {
        entityManager.getTransaction().begin();
        // lots of logic here, calling some other private methods
        entityManager.getTransaction().commit();
    } catch (Exception e) {
        log.error("");
        if (entityManager.getTransaction().isActive()) {
            entityManager.getTransaction().rollback();
        }
        throw new SomeServiceException(e);
    }
}
This application has only a few tests, and they were testing almost nothing, so I am trying to cover as much code as possible with unit tests (there are some changes coming to the application that will require a lot of changes in legacy code which is not covered by tests). My question is: how would you unit test code like this? I have three ideas:
1. Refactor to tests. I could introduce a DAO layer and put all the entityManager calls there. But refactoring without tests is always a problem.
2. Mock the EntityManager. I tried this several times with EasyMock; it works and helps me get at least some coverage of the code that requires changes, but it's probably not good style, as you should not mock an API that does not belong to you. Also, preparing EntityManager mocks requires a lot of time and code.
3. Instead of unit testing, do integration testing with HSQLDB or H2 and some dummy test data. This would probably require the most work, and the tests would be slow. Also, I want to cover mostly business logic, not data access.
I would probably first add some integration tests and get coverage on the parts you want to refactor. Then, you can go on refactoring to more isolated units that are individually testable. If refactored correctly, you can then unit test your business logic separately from the storage.
It's always a good idea to have some integration tests, so that would be a good place to start.
In any case, I wouldn't refactor code that's not covered by any tests.
I have a JUnit test that I would like to run from a main method. I would like to retrieve multiple records from a database (within the main method) and pass each record into the JUnit test, using a data object, so that each record can be tested. Can I pass a data object into the run method of JUnit? If not, what is the best way to accomplish this? There are so many different scenarios that I would like to use actual data from the database. There could be as many as 5000 or more records to test.
Thanks
Doug
You are surely looking for a parameterized test case. You can do it easily with JUnit itself instead of using a main() method.
You need the Parameterized runner to run your test.
It will run your test with different parameters, passing the parameters in via the constructor.
Here is an easy article on how to do that. You can also try the example in the documentation to understand how it works.
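Here is a minimal sketch of a JUnit 4 parameterized test. The rows are hard-coded for clarity; the @Parameters method could just as well load them from your database through a DAO (that part is an assumption about your setup):

import static org.junit.Assert.assertTrue;

import java.util.Arrays;
import java.util.Collection;

import org.junit.Test;
import org.junit.runner.RunWith;
import org.junit.runners.Parameterized;
import org.junit.runners.Parameterized.Parameters;

@RunWith(Parameterized.class)
public class RecordTest {

    // Each Object[] becomes one constructor call, and every @Test method runs once per row.
    @Parameters
    public static Collection<Object[]> data() {
        return Arrays.asList(new Object[][] {
            { "record-1", 42 },
            { "record-2", 7 }
        });
    }

    private final String id;
    private final int value;

    public RecordTest(String id, int value) {
        this.id = id;
        this.value = value;
    }

    @Test
    public void valueIsPositive() {
        assertTrue(id + " should have a positive value", value > 0);
    }
}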
You want to use JUnit's Parameterized Tests. There's really no way to run a main method in a JUnit test case.
On top of the docs, here's a blog post which explains it a little better: http://ourcraft.wordpress.com/2008/08/27/writing-a-parameterized-junit-test/
I think that testing your main method is more along the lines of an integration test or a functional test. The same can be said for testing your database data. If you really want a unit test, the first step would be to refactor your main method using Extract Method to pull out the business logic you want to test.
Doing this gives you a few benefits. First, you can test your code in isolation (which is one of the more important properties of a good unit test). If you refactor out the business logic, you'll know that you are only testing that code and that no other code is affecting your test. Second, by having an isolated method, you'll be able to easily mock the test data you are looking at by passing different parameters to the method, and make your assertions based on the known mock data.
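A hedged sketch of that idea, with made-up names: after Extract Method the business logic takes its input as a parameter, so main() keeps only the plumbing and the test feeds in hand-built records:

import static org.junit.Assert.assertEquals;

import java.util.Arrays;
import java.util.List;

import org.junit.Test;

public class RecordProcessorTest {

    // Before the refactoring this loop lived inside main(); extracted, it is trivially testable.
    static int countActiveRecords(List<String> statuses) {
        int active = 0;
        for (String status : statuses) {
            if ("ACTIVE".equals(status)) {
                active++;
            }
        }
        return active;
    }

    @Test
    public void countsOnlyActiveRecords() {
        assertEquals(2, countActiveRecords(Arrays.asList("ACTIVE", "CLOSED", "ACTIVE")));
    }
}

In real code the extracted method would live in a production class, and main() would call it with data loaded from the database.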
I have not used JUnit before and have not done automated unit testing.
Scenario:
We are changing our backend DAOs from SQL Server to Oracle, so on the DB side all the stored procedures were converted to Oracle. Now, when our code calls these new Oracle stored procedures, we want to make sure that the data returned is the same as what the SQL Server stored procedures returned.
So for example I have the following method in a DAO:
// this is the old method; gets data from SQL Server
public IdentifierBean getHeadIdentifiers_old(String head) {
    HashMap parmMap = new HashMap();
    parmMap.put("head", head);
    List result = getSqlMapClientTemplate().queryForList("Income.getIdentifiers", parmMap);
    return (IdentifierBean) result.get(0);
}

// this is the new method; gets data from Oracle
public IdentifierBean getHeadIdentifiers(String head) {
    HashMap parmMap = new HashMap();
    parmMap.put("head", head);
    getSqlMapClientTemplate().queryForObject("Income.getIdentifiers", parmMap);
    return (IdentifierBean) ((List) parmMap.get("Result0")).get(0);
}
Now I want to write a JUnit test method that would first call getHeadIdentifiers_old and then getHeadIdentifiers and compare the objects returned (I will have to override equals and hashCode in IdentifierBean). The test should pass only when both objects are the same.
In the test method I will have to provide a parameter (head in this case) for the two methods; this will be done manually for now. Yes, from the front end the parameters could be different and the stored procedures might not return exactly the same results for those parameters, but I think having these test cases will give us some confidence that they return the same data.
My questions are:
Is this a good approach?
I will have multiple DAOs. Do I write the test methods inside the DAO itself, or should I have a separate JUnit test class for each DAO?
(Might be a n00b question) Will all the test cases be run automatically? I do not want to go to the front end and click a bunch of stuff so that the call to the DAO gets triggered.
When tests are run, will I find out which methods failed? And for the ones that failed, will it tell me the test method that failed?
Lastly, any good starting points? Any tutorials or articles that show how to work with JUnit?
Okay, let's see what can be done...
Is this a good approach?
Not really, since instead of having one obsolete code path with somewhat known functionality, you now have two code paths with unequal and unpredictable functionality. Usually one would create thorough unit tests for the legacy code first and then refactor the original method, to avoid incredibly large amounts of refactoring - what if some part of the jungle of code forming the huge application keeps calling the old method while other parts call the new one?
However working with legacy code is never optimal so what you're thinking may be the best solution.
I will have multiple DAOs. Do I write the test methods inside the DAO itself, or should I have a separate JUnit test class for each DAO?
Assuming you've gone properly OO with your program structure, where each class does one thing and one thing only, yes, you should make another class containing the test cases for that individual class. What you're looking for here are mock objects (search for them on SO and Google in general; lots of info is available), which help you decouple the class under test from other classes. Interestingly, a high number of mocks in a unit test usually means that the class could use some heavy refactoring.
(might be a n00b question) Will all the test cases be run automatically? I do not want to go to the front end and click a bunch of stuff so that the call to the DAO gets triggered.
All IDEs allow you to run all the JUnit tests at the same time; for example, in Eclipse just click the source folder/top package and choose Run -> JUnit Test. Also, when running an individual class, all the unit tests contained within it are run in the proper JUnit flow (setUp() -> testX() -> tearDown()).
When tests are run, will I find out which methods failed? And for the ones that failed, will it tell me the test method that failed?
Yes. Part of Test-Driven Development is the mantra Red-Green-Refactor, which refers to the colored bar shown by IDEs for unit tests. Basically, if any of the tests in the test suite fails, the bar is red; if all pass, it's green. Additionally, for JUnit there's also blue for individual tests to show assertion failures.
Lastly, any good starting points? Any tutorials or articles that show how to work with JUnit?
I'm quite sure there's going to be multiple of these in the answers soon, just hang on :)
You'll write a test class.
public class OracleMatchesSqlServer extends TestCase {

    public void testHeadIdentifiersShouldBeEqual() throws Exception {
        String head = "whatever your head should be";
        IdentifierBean originalBean = YourClass.getHeadIdentifiers_old(head);
        IdentifierBean oracleBean = YourClass.getHeadIdentifiers(head);
        assertEquals(originalBean, oracleBean);
    }
}
You might find you need to parameterize this on head; that's straightforward.
Update: It looks like this:
public class OracleMatchesSqlServer extends TestCase {

    public void testHeadIdentifiersShouldBeEqual() throws Exception {
        compareIdentifiersWithHead("head1");
        compareIdentifiersWithHead("head2");
        compareIdentifiersWithHead("etc");
    }

    private static void compareIdentifiersWithHead(String head) {
        IdentifierBean originalBean = YourClass.getHeadIdentifiers_old(head);
        IdentifierBean oracleBean = YourClass.getHeadIdentifiers(head);
        assertEquals(originalBean, oracleBean);
    }
}
* Is this a good approach?
Sure.
* I will have multiple DAOs. Do I write the test methods inside the DAO itself, or should I have a separate JUnit Test Class for each DAO?
Try it with a separate test class for each DAO; if that gets too tedious, try it the other way and see what you like best. It's probably more helpful to have the fine-grainedness of separate test classes, but your mileage may vary.
* (might be a n00b question) Will all the test cases be run automatically? I do not want to go to the front end and click a bunch of stuff so that the call to the DAO gets triggered.
Depending on your environment, there will be ways to run all the tests automatically.
* When tests are run, will I find out which methods failed? And for the ones that failed, will it tell me the test method that failed?
Yes and yes.
* Lastly, any good starting points? Any tutorials or articles that show how to work with JUnit?
I really like Dave Astels' book.
Another useful introduction to writing and maintaining large unit test suites is this book (which is partially available online):
XUnit Test Patterns, Refactoring Test Code by Gerard Meszaros
The book is organized into three major parts. Part I consists of a series of introductory narratives that describe some aspect of test automation using xUnit. Part II describes a number of "test smells" that are symptoms of problems with how we are automating our tests. Part III contains descriptions of the patterns.
Here's a quick yet fairly thorough intro to JUnit.