Test a DAO class which uses JDBC with Mockito - Java

I am trying to write unit tests for a DAO class using Mockito. I have written some unit tests before, but not for a DAO class that uses a database (in this case JDBC and MySQL).
I decided to start with this simple method, but I do not know what the good practices are, and I do not know how to start.
I do not know if it matters in this case, but the project uses the Spring Framework.
public class UserProfilesDao extends JdbcDaoSupport {

    @Autowired
    private MessageSourceAccessor msa;

    public long getUserId(long userId, int serviceId) {
        String sql = msa.getMessage("sql.select.service_user_id");
        Object[] params = new Object[] { userId, serviceId };
        int[] types = new int[] { Types.INTEGER, Types.INTEGER };
        return getJdbcTemplate().queryForLong(sql, params, types);
    }
}

If you really want to test the DAO, create an in-memory database. Fill it with the expected values, execute the query through the DAO, and check that the result matches the values you previously inserted into the database.
Mocking the Connection, ResultSet, and PreparedStatement is too heavy, and the results are not as meaningful, because you are not accessing a real database.
Note: for this approach your in-memory database should speak the same dialect as your physical database, so don't use functions or syntax specific to the final database; try to follow the SQL standard.
If you use an in-memory database you are "mocking" the whole database, so the resulting test is not a real unit test, but it is not quite an integration test either. Use a tool like DbUnit to easily configure and fill your database if you like this approach.
Consider that mocking the database classes (PreparedStatement, Statement, ResultSet, Connection) is a long process, and you have no guarantee that it works as expected, because you are not checking your SQL against a real SQL engine.
You can also take a look at an article by Lasse Koskela about unit testing DAOs.
To test the DAO you need to:
Empty the database (not necessary for an in-memory db)
Fill the database with example data (automatic with DbUnit, done in the @BeforeClass or @Before method)
Run the test (with JUnit)
If you want to formally separate real unit tests from integration tests, you can move the DAO tests to a separate directory and run them only when needed, as part of the integration tests.
A possible in-memory database with different compatibility modes is H2, which offers the following database compatibility modes:
IBM DB2
Apache Derby
HSQLDB
MS SQL Server
MySQL
Oracle
PostgreSQL
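As a sketch of this approach (assuming H2, Spring's JdbcTemplate, and JUnit 4 are on the test classpath; the table, column names, and values are made up for illustration, since the real SQL lives in the message source):

```java
import static org.junit.Assert.assertEquals;

import org.h2.jdbcx.JdbcDataSource;
import org.junit.Before;
import org.junit.Test;
import org.springframework.jdbc.core.JdbcTemplate;

public class UserProfilesDaoTest {

    private JdbcTemplate jdbcTemplate;

    @Before
    public void setUp() {
        // In-memory H2 database in MySQL compatibility mode
        JdbcDataSource dataSource = new JdbcDataSource();
        dataSource.setURL("jdbc:h2:mem:test;MODE=MySQL;DB_CLOSE_DELAY=-1");
        jdbcTemplate = new JdbcTemplate(dataSource);

        // Create the schema and fill it with the expected values
        jdbcTemplate.execute("DROP TABLE IF EXISTS service_user");
        jdbcTemplate.execute(
                "CREATE TABLE service_user (user_id INT, service_id INT, service_user_id BIGINT)");
        jdbcTemplate.update("INSERT INTO service_user VALUES (?, ?, ?)", 1, 2, 42L);
    }

    @Test
    public void returnsServiceUserIdForUserAndService() {
        // The same kind of query the DAO would load from its message source
        Long id = jdbcTemplate.queryForObject(
                "SELECT service_user_id FROM service_user WHERE user_id = ? AND service_id = ?",
                Long.class, 1, 2);
        assertEquals(Long.valueOf(42L), id);
    }
}
```

In a real test you would wire the same JdbcTemplate into UserProfilesDao and call getUserId directly.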

Related

Mockito QueryDSL

I have a method that uses QueryDSL for query generation.
public List<EntityDAO> getObject() {
    QEntity entity = QEntity.entity;
    JPAQueryFactory queryFactory = getJPAQueryFactory();
    JPAQuery<EntityDAO> query = queryFactory
            .select(Projections.bean(EntityDAO.class,
                    entity.propertyA,
                    entity.propertyB.count().as("count")))
            .from(entity)
            .where(predicateBuilder.build())
            .groupBy(entity.propertyA)
            .orderBy(order)
            .limit(rowCount)
            .offset(pageId * rowCount);
    return query.fetch();
}
How can I test this method using Mockito?
This method is part of the data access layer. Data access code is best tested against an actual DB instance rather than a mock, because in the end you still need to verify that it can really get the data from the actual database correctly, and this is the most appropriate place to do it.
You can look at Testcontainers. It allows you to test against a containerised instance of your DB. After starting this DB container, you simply load some test data into the related tables, call this method, and directly verify the correctness of the returned data.
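A minimal sketch of that setup (assuming Docker is available and the Testcontainers MySQL module plus the MySQL JDBC driver are on the test classpath; the table and data are illustrative, and a real test would call getObject() against this database instead of raw JDBC):

```java
import static org.junit.Assert.assertEquals;

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

import org.junit.Test;
import org.testcontainers.containers.MySQLContainer;

public class RepositoryIT {

    @Test
    public void countsRowsGroupedByPropertyInRealMySql() throws Exception {
        // Start a throwaway MySQL instance in Docker
        try (MySQLContainer<?> mysql = new MySQLContainer<>("mysql:8.0")) {
            mysql.start();
            try (Connection conn = DriverManager.getConnection(
                    mysql.getJdbcUrl(), mysql.getUsername(), mysql.getPassword());
                 Statement st = conn.createStatement()) {

                // Load some test data
                st.execute("CREATE TABLE entity (property_a VARCHAR(10), property_b INT)");
                st.execute("INSERT INTO entity VALUES ('a', 1), ('a', 2), ('b', 3)");

                // Verify the grouped count the QueryDSL projection is expected to produce
                try (ResultSet rs = st.executeQuery(
                        "SELECT COUNT(property_b) FROM entity "
                        + "GROUP BY property_a ORDER BY property_a")) {
                    rs.next();
                    assertEquals(2, rs.getInt(1));
                }
            }
        }
    }
}
```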

Reduce or group Database service calls into batch

I have some functional units of transactional work within my service(s) that involve multiple calls to different DAOs (using JdbcTemplate or NamedParameterJdbcTemplate). Some calls return sequence-generated IDs needed to run the next select/insert; others are plain inserts, run in the order of the calls. Each DAO call represents one SQL statement and therefore one round trip to the database. For example:
@Transactional
public void create() {
    playerService.findByUserName(player.getPlayerId());
    roleService.insertRoleForPlayer(player.getId(), ROLE_MANAGER);
    leagueScoringService.createLeagueScoringForLeague(leagueScoring);
    leagueService.insertPlayerLeague(player, code);
    Standing playerStanding = new Standing();
    playerStanding.setPlayerId(player.getId());
    playerStanding.setPlayerUserName(player.getPlayerId());
    playerStanding.setLeagueId(leagueId);
    standingService.insertStandings(ImmutableList.of(playerStanding));
}
Preferably I want to batch up the calls, or at least ensure they all get executed together on the DB. Note that I do not use Hibernate and I don't want to. What is the best approach for this?
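Statements that depend on each other's generated IDs cannot be batched, but the independent inserts can be grouped into a single JDBC batch with JdbcTemplate.batchUpdate. A sketch (the Standing stand-in, table, and column names are assumptions for illustration):

```java
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.List;

import org.springframework.jdbc.core.BatchPreparedStatementSetter;
import org.springframework.jdbc.core.JdbcTemplate;

// Minimal stand-in for the Standing bean from the question
class Standing {
    private long playerId;
    private long leagueId;

    public long getPlayerId() { return playerId; }
    public void setPlayerId(long playerId) { this.playerId = playerId; }
    public long getLeagueId() { return leagueId; }
    public void setLeagueId(long leagueId) { this.leagueId = leagueId; }
}

public class StandingDao {

    private final JdbcTemplate jdbcTemplate;

    public StandingDao(JdbcTemplate jdbcTemplate) {
        this.jdbcTemplate = jdbcTemplate;
    }

    // One JDBC batch instead of one round trip per row
    public void insertStandings(final List<Standing> standings) {
        jdbcTemplate.batchUpdate(
                "INSERT INTO standing (player_id, league_id) VALUES (?, ?)",
                new BatchPreparedStatementSetter() {
                    @Override
                    public void setValues(PreparedStatement ps, int i) throws SQLException {
                        Standing s = standings.get(i);
                        ps.setLong(1, s.getPlayerId());
                        ps.setLong(2, s.getLeagueId());
                    }

                    @Override
                    public int getBatchSize() {
                        return standings.size();
                    }
                });
    }
}
```

The @Transactional boundary already guarantees the calls commit or roll back together; batching only reduces the number of round trips.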

How to make dynamic queries at run-time in Spring Boot and Data?

I am new to Java and started with Spring Boot and Spring Data JPA, so I know two ways to fetch data:
by the repository layer, with literal method naming: findOneByCity(String city);
by a custom repo, with the @Query annotation: @Query("select * from table where city like ?")
Both ways are defined statically.
What should I do to run a query that I have to build at run time?
What I am trying to achieve is the possibility to create dynamic reports without touching the code. A table would hold records of reports with names and SQL queries with default parameters like begin_date, end_date etc., but with a variety of bodies. Example:
"Sales report by payment method" | select * from sales where met_pay = %pay_method% and date is between %begin_date% and %end_date%;
The Criteria API is designed mainly for that.
It provides an alternative way to define JPA queries.
With it you can build dynamic queries from data provided at runtime.
To use it, you will need to create a custom repository implementation, not only an interface.
You will need to inject an EntityManager to create the objects required to build and execute the CriteriaQuery.
You will of course have to write some boilerplate code to build the query and execute it.
This section explains how to create a custom repository with Spring Boot.
About your edit :
What I am trying to achieve is the possibility to create dynamic
reports without touching the code. A table would have records of
reports with names and SQL queries with default parameters like
begin_date, end_date etc, but with a variety of bodies.
If the queries are written by hand in a plain text file, Criteria will not be the best choice, as JPQL/SQL queries and Criteria queries are really not written the same way.
In that case, mapping the JPQL/SQL queries defined in the plain text file to a Map<String, String> structure in the Java code would be more suitable.
But I have some doubts about the feasibility of what you want to do.
Queries may have specific parameters; in some cases you would have no choice but to modify the code. Specificities in the parameters will make query maintenance very hard and error prone. Personally, I would implement the requirement by allowing the client to select, for each field, whether a condition should be applied.
Then, on the implementation side, I would use this user input to build my CriteriaQuery.
And there Criteria will do an excellent job: less code duplication, more flexibility in building the query, and in addition more type-checking at compile time.
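A sketch of that idea (the Sale entity and its metPay/date fields are hypothetical, made up to match the sales report from the question): predicates are only added for the fields the client actually selected.

```java
import java.time.LocalDate;
import java.util.ArrayList;
import java.util.List;

import javax.persistence.Entity;
import javax.persistence.EntityManager;
import javax.persistence.Id;
import javax.persistence.criteria.CriteriaBuilder;
import javax.persistence.criteria.CriteriaQuery;
import javax.persistence.criteria.Predicate;
import javax.persistence.criteria.Root;

// Hypothetical entity backing the "sales" report from the question
@Entity
class Sale {
    @Id Long id;
    String metPay;
    LocalDate date;
}

public class SaleReportRepository {

    private final EntityManager em;

    public SaleReportRepository(EntityManager em) {
        this.em = em;
    }

    // Each parameter may be null; a predicate is only added when the
    // client actually supplied a value for that field
    public List<Sale> findSales(String payMethod, LocalDate beginDate, LocalDate endDate) {
        CriteriaBuilder cb = em.getCriteriaBuilder();
        CriteriaQuery<Sale> query = cb.createQuery(Sale.class);
        Root<Sale> sale = query.from(Sale.class);

        List<Predicate> predicates = new ArrayList<>();
        if (payMethod != null) {
            predicates.add(cb.equal(sale.get("metPay"), payMethod));
        }
        if (beginDate != null && endDate != null) {
            predicates.add(cb.between(sale.<LocalDate>get("date"), beginDate, endDate));
        }

        query.select(sale).where(predicates.toArray(new Predicate[0]));
        return em.createQuery(query).getResultList();
    }
}
```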
Spring Data repositories use an EntityManager beneath. Repository interfaces are just another layer so the user does not have to worry about the details. But if a user wants to get their hands dirty, then of course Spring wouldn't mind.
That is when you can use the EntityManager directly.
Let us assume you have a repository interface like AbcRepository:
interface AbcRepository extends JpaRepository<Abc, String> {
}
You can create a custom repository fragment like:
interface CustomizedAbcRepository {
void someCustomMethod(User user);
}
The implementation class looks like:
class CustomizedAbcRepositoryImpl implements CustomizedAbcRepository {

    @Autowired
    EntityManager entityManager;

    public void someCustomMethod(User user) {
        // You can build your custom query using Criteria or CriteriaBuilder
        // and then use it in EntityManager methods
    }
}
Just a word of caution: the naming of the customized interface and the customized implementing class is very important. By default, Spring Data looks for an implementation named after the fragment interface with an Impl suffix (here, CustomizedAbcRepositoryImpl).
In recent versions of Spring Data, the ability to use the JPA Criteria API was added. For more information, see the blog post https://jverhoelen.github.io/spring-data-queries-jpa-criteria-api/ .

jOOQ + Liquibase + H2 in unit test results in "schema not found" exception

I'm writing unit tests for some data access code. The key pieces in the setup are:
jOOQ-generated artifacts for CRUD operations
Liquibase to handle schema evolutions
Given as much, I'm trying to set up the tests as follows:
create a java.sql.Connection to initialize an H2 database with the appropriately named schema (it's worth noting that the connection is created with the following URL: jdbc:h2:mem:[schema-name];MODE=MySQL;DB_CLOSE_DELAY=-1)
using the aforementioned connection, invoke Liquibase to run through a change log that creates all the objects in the database schema
using the aforementioned connection, create an org.jooq.DSLContext with which the data access components can be tested
An abstract class encapsulates these three steps in a @Before-annotated method, and test classes extend this abstract class to use the initialized org.jooq.DSLContext instance. Something like this:
abstract class DbTestBase {

    protected lateinit var dslContext: DSLContext
    private lateinit var connection: Connection

    open fun setUp() {
        connection = DriverManager.getConnection("jdbc:h2:mem:foo;MODE=MySQL;DB_CLOSE_DELAY=-1")
        // invoke Liquibase with this connection instance...
        dslContext = DSL.using(connection, SQLDialect.H2)
    }

    open fun tearDown() {
        dslContext.close()
        connection.close()
    }
}

class MyTest : DbTestBase() {

    private lateinit var repository: Repository

    @Before override fun setUp() {
        super.setUp()
        repository = Repository(dslContext)
    }

    @After override fun tearDown() {
        super.tearDown()
    }

    @Test fun something() {
        repository.add(Bar())
    }
}
This results in the following exception:
Caused by: org.h2.jdbc.JdbcSQLException: Schema "foo" not found; SQL statement:
insert into `foo`.`bar` (`id`) values (?) [90079-196]
at org.h2.message.DbException.getJdbcSQLException(DbException.java:345)
at org.h2.message.DbException.get(DbException.java:179)
at org.h2.message.DbException.get(DbException.java:155)
at org.h2.command.Parser.getSchema(Parser.java:688)
at org.h2.command.Parser.getSchema(Parser.java:694)
at org.h2.command.Parser.readTableOrView(Parser.java:5535)
at org.h2.command.Parser.readTableOrView(Parser.java:5529)
at org.h2.command.Parser.parseInsert(Parser.java:1062)
at org.h2.command.Parser.parsePrepared(Parser.java:417)
at org.h2.command.Parser.parse(Parser.java:321)
at org.h2.command.Parser.parse(Parser.java:293)
at org.h2.command.Parser.prepareCommand(Parser.java:258)
at org.h2.engine.Session.prepareLocal(Session.java:578)
at org.h2.engine.Session.prepareCommand(Session.java:519)
at org.h2.jdbc.JdbcConnection.prepareCommand(JdbcConnection.java:1204)
at org.h2.jdbc.JdbcPreparedStatement.<init>(JdbcPreparedStatement.java:73)
at org.h2.jdbc.JdbcConnection.prepareStatement(JdbcConnection.java:288)
at org.jooq.impl.ProviderEnabledConnection.prepareStatement(ProviderEnabledConnection.java:106)
at org.jooq.impl.SettingsEnabledConnection.prepareStatement(SettingsEnabledConnection.java:70)
at org.jooq.impl.AbstractQuery.prepare(AbstractQuery.java:410)
at org.jooq.impl.AbstractDMLQuery.prepare(AbstractDMLQuery.java:342)
at org.jooq.impl.AbstractQuery.execute(AbstractQuery.java:316)
... 25 more
I can see the Liquibase logging at the point where the schema is regenerated. I've since changed the H2 URL to create a file-based database, which I was able to inspect and verify that the schema does indeed exist.
I'd appreciate any help in spotting anything wrong in the approach.
After much trial and error, I was able to work through my issues by resolving two problems:
Lukas's comment differentiating "database" (or "catalog") vs "schema" pushed me in the right direction. Though the terms seem to be used interchangeably in MySQL (my production database), they are not in H2. Seems rather obvious in hindsight, but the remedy was to issue a JDBC call to manually create the schema and then set it as the default, just before invoking Liquibase to reconstruct the schema:
...
connection = DriverManager.getConnection(...)
connection.createStatement().executeUpdate("create schema $schemaName")
connection.schema = schemaName.toUpperCase()
// invoke Liquibase with this connection
...
Despite opening a case-insensitive connection to H2, the toUpperCase() invocation still proved necessary. Not sure why...
jOOQ quotes the names of all schema objects so as to make them case-insensitive. However, as I've come to understand, in H2 quotes are used to enforce case-sensitivity. The presence of the quotes in the generated queries was therefore causing a slew of errors due to objects that could not be found. The remedy was to supply a different RenderNameStyle to the query generator that omits the quotes:
...
val settings = Settings().withRenderNameStyle(RenderNameStyle.AS_IS)
val dslContext = DSL.using(connection, SQLDialect.H2, settings)
...
Hope this can be of help to someone else down the line.

Unit testing after adding database with Hibernate

I've added database functionality using Hibernate to a system which was in-memory up to this point. When all the data was in memory, I was able to use JUnit, restoring the original data after each test.
Is there a way to achieve the same result with the new Hibernate addition?
By "the same result" I mean: start with the database in its original state, run the test, which may alter the database, and then restore the database to its original state.
Up until now, my ideas are:
An in-memory database (which is a Hibernate feature), but that won't allow me to use my actual data.
Add a "testing flag" to my DAO so that it won't commit the changes if set.
I am sure there is a better solution, but I haven't found anything better yet.
You could start the database transaction before each test:
@PersistenceContext
private EntityManager em;

@Before
public void init() {
    em.getTransaction().begin();
}

@After
public void destroy() {
    em.getTransaction().rollback();
}
This way, each test has a transaction running before it starts, and this transaction is rolled back after the test finishes, so you always discard all changes made by the current test.
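If the tests run with Spring's test support, the same rollback behaviour is usually obtained declaratively: the TestContext framework rolls back test-managed transactions by default. A sketch (assuming a test configuration class that provides a DataSource, EntityManager, and transaction manager; TestConfig here is an empty placeholder):

```java
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.context.annotation.Configuration;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringRunner;
import org.springframework.transaction.annotation.Transactional;

@Configuration
class TestConfig {
    // DataSource, EntityManagerFactory, and transaction manager beans would go here
}

@RunWith(SpringRunner.class)
@ContextConfiguration(classes = TestConfig.class)
@Transactional // Spring starts a transaction per test and rolls it back afterwards
public class AccountDaoTest {

    @Test
    public void changesAreDiscardedAfterEachTest() {
        // inserts/updates performed here through the DAO or EntityManager
        // are rolled back automatically when the test method ends
    }
}
```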
I think we should be clear about the definition of a unit test. A unit test must only test a small unit (a public method) of the application.
Assume you have a DAO layer which uses Hibernate to interact with the database. Hibernate uses a SessionFactory that requires a DataSource. The data source for the unit tests should not be the same as the one for your production application.
The idea is to define a test DataSource and use an in-memory DB (HSQLDB or any other). For each test case you can execute some queries on the in-memory DB, using the test DataSource, and clean it up after the unit test has run. For each unit test you should execute the queries that set up the test data for that particular test.
For example, if you want to test the following:
1) Create Account
2) Update Account
3) Delete Account
Then there are three test scenarios, and there can be multiple unit tests for each scenario.
Now, before executing the Create Account test, it is important that the DB doesn't already contain this account. Then you call the createAccount method in the DAO to test it. There is no need to verify whether the result is in the DB or not; just check the return value of your method, and if it is as expected for a successful account creation, your test case should pass.
For Update Account, your setup method should insert one account through a query, and then you must call updateAccount in the DAO for this account id, and so on.
Please stick to the definition of unit tests and do not use them to test more than one piece of functionality at a time.
Hope this helps.
