DbUnit - JdbcSQLException: Function "*" not found - java

I have a user-defined function in MS SQL Server that is called from Java code and appears to be undefined when running integration tests against an H2 database. You can find my code in the previous question.
Test code:
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(classes = {H2Config.class})
@TestExecutionListeners({
        DependencyInjectionTestExecutionListener.class,
        DbUnitTestExecutionListener.class,
        TransactionalTestExecutionListener.class
})
@TransactionConfiguration(defaultRollback = true)
public class TableDaoTest {

    @Autowired
    private TableDao tableDao;

    @Test
    @DatabaseSetup("/datasets/import.xml")
    public void testMethod01() {
        tableDao.getRecordsByGroup();
        ...
The database schema is autogenerated by Hibernate. As you can see, the test data is populated by DbUnit from an XML dataset. This test fails because the function, which exists in the MS SQL Server DB, is undefined in the H2 database.
Application log:
Caused by: org.hibernate.exception.GenericJDBCException: could not prepare statement
...
Caused by: org.h2.jdbc.JdbcSQLException: Function "SAFE_MOD" not found; SQL statement:
select table10_.id, table10_.value, ... from Table1 table10_ where table10_.group1=dbo.safe_mod(?, ?);
...
How can I import or create this function before the DbUnit test runs?

H2 doesn't support user-defined functions written in SQL. However, in this database, Java functions can be used as stored functions and procedures.
@SuppressWarnings("unused")
public class H2Function {
    public static int safeMod(Integer n, Integer divider) {
        if (divider == null) {
            divider = 5000;
        }
        return n % divider;
    }
}
Note that only static Java methods are supported; both the class and the method must be public.
The Java function must be declared (registered in the database) by calling CREATE ALIAS ... FOR before it can be used:
CREATE ALIAS IF NOT EXISTS safe_mod DETERMINISTIC FOR "by.naxa.H2Function.safeMod";
This statement should be executed before any test, so I decided to put it into the connection init SQL:
@Bean
public DataSource dataSource() {
    BasicDataSource dataSource = new BasicDataSource();
    dataSource.setDriverClassName("org.h2.Driver");
    dataSource.setUrl("jdbc:h2:mem:my_db_name");
    dataSource.setUsername("sa");
    dataSource.setPassword("");
    dataSource.setConnectionInitSqls(Collections.singleton(
            "CREATE ALIAS IF NOT EXISTS safe_mod DETERMINISTIC FOR \"by.naxa.H2Function.safeMod\";"));
    return dataSource;
}

Credit to naXa; this solution is based on theirs. It 'stubs' the WORD_SIMILARITY function from Postgres.
@RunWith(SpringRunner.class)
@SpringBootTest
@TestPropertySource(locations = "classpath:application-test.properties")
public class testServiceTests {

    @Autowired
    private MyService myService;

    @Test
    public void someTest() {
    }
}
This should be in your application-test.properties
spring.datasource.driverClassName=org.h2.Driver
spring.datasource.url=jdbc:h2:~/my_database;MODE=PostgreSQL;INIT=CREATE ALIAS IF NOT EXISTS WORD_SIMILARITY DETERMINISTIC FOR "com.example.H2Function.wordSimilarity";TRACE_LEVEL_FILE=0;DB_CLOSE_DELAY=-1;DB_CLOSE_ON_EXIT=FALSE;
spring.datasource.username=postgres
spring.datasource.password=postgres
hibernate.dialect=org.hibernate.dialect.PostgreSQL9Dialect
hibernate.flushMode=FLUSH_AUTO
hibernate.hbm2ddl.auto=create-drop
spring.datasource.data=classpath:V1__base-schema.sql
Here is the H2 function that replaces it:
@SuppressWarnings("unused")
public class H2Function {
    public static double wordSimilarity(String string, String word) {
        if (word == null) {
            return 0;
        }
        return 0.5;
    }
}

The approaches from @naXa and @Alan work for scalar-valued functions. For table-valued functions, use a ResultSet:
package com.example.app;

import java.math.BigDecimal;
import java.sql.Date;
import java.sql.ResultSet;
import java.sql.Timestamp;
import java.sql.Types;
import java.time.LocalDateTime;

import org.h2.tools.SimpleResultSet;

@SuppressWarnings("unused")
public class H2Functions {

    // All function params go here.
    // LocalDate doesn't work here; we have to use java.sql.Date.
    public static ResultSet gePrice(Long recipientId, Long currencyId, Date priceDate) {
        SimpleResultSet rs = new SimpleResultSet();
        rs.addColumn("price", Types.DECIMAL, 10, 0);
        rs.addColumn("priceDate", Types.TIMESTAMP, 10, 0);
        rs.addRow(new BigDecimal("123.23"), Timestamp.valueOf(LocalDateTime.now()));
        return rs;
    }
}
Example application.yml configuration:
spring:
  datasource:
    type: com.zaxxer.hikari.HikariDataSource
    url: jdbc:h2:DB_NAME;DB_CLOSE_DELAY=-1;DB_CLOSE_ON_EXIT=FALSE
    name:
    username:
    password:
    hikari:
      auto-commit: false
      # create alias for functions on db connection initialization
      connection-init-sql: "CREATE ALIAS IF NOT EXISTS GE_PRICE DETERMINISTIC FOR \"com.example.app.H2Functions.gePrice\";"
Then you can map the result set onto your own POJO; a minimal sketch follows below.
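For example, here is a minimal sketch of querying the table function from Java and mapping each row onto a POJO. The Price class, the PriceRepository wrapper and the JdbcTemplate usage are illustrative assumptions rather than part of the original setup; it relies on H2 allowing a function that returns a ResultSet to be queried in the FROM clause (via the GE_PRICE alias registered above).

import java.math.BigDecimal;
import java.sql.Date;
import java.sql.Timestamp;
import java.util.List;

import org.springframework.jdbc.core.JdbcTemplate;

public class PriceRepository {

    // Simple POJO matching the columns produced by H2Functions.gePrice.
    public static class Price {
        public final BigDecimal price;
        public final Timestamp priceDate;

        public Price(BigDecimal price, Timestamp priceDate) {
            this.price = price;
            this.priceDate = priceDate;
        }
    }

    private final JdbcTemplate jdbcTemplate;

    public PriceRepository(JdbcTemplate jdbcTemplate) {
        this.jdbcTemplate = jdbcTemplate;
    }

    public List<Price> findPrices(long recipientId, long currencyId, Date priceDate) {
        // H2 lets a function that returns a ResultSet be queried like a table.
        return jdbcTemplate.query(
                "SELECT price, priceDate FROM GE_PRICE(?, ?, ?)",
                (rs, rowNum) -> new Price(
                        rs.getBigDecimal("price"),
                        rs.getTimestamp("priceDate")),
                recipientId, currencyId, priceDate);
    }
}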
You can find more about user-defined functions in the H2 documentation:
http://www.h2database.com/html/features.html#user_defined_functions

Related

How to properly maintain data for Testcontainers in init script?

I am using Testcontainers to load a Dockerized database for my Spring Boot application's integration tests. I am currently using an initialization script to load all of the data:
CREATE TABLE public.table1 (
...
...
);
CREATE TABLE public.table2 (
...
...
);
This is all working fine. I also have my own manual test data that I insert to test different scenarios:
-- Data for pre-existing quiz
INSERT INTO public.table1 (id, prop1, prop2) values (1, 'a', 'b');
INSERT INTO public.table2 (id, prop1, prop2) values (1, 'c', 'd');
INSERT INTO public.table2 (id, prop1, prop2) values (2, 'e', 'f');
Again, this is all working fine, and I am using a YAML file to mock these objects for my tests:
table1s:
  first:
    id: 1
    prop1: a
    prop2: b
table2s:
  first:
    id: 1
    prop1: c
    prop2: d
  second:
    id: 2
    prop1: e
    prop2: f
I can then load these into a class that reads the YAML file properties, so the data can be used in my test classes:
public class Table1TestData {

    @Autowired
    private Environment env;

    private UUID id;
    private String prop1;
    private String prop2;

    public UUID getId() {
        return id;
    }

    public void setId(UUID id) {
        this.id = id;
    }

    public String getProp1() {
        return prop1;
    }

    public void setProp1(String prop1) {
        this.prop1 = prop1;
    }

    ....

    public Table1TestData getFirstRowData() {
        Table1TestData ret = new Table1TestData();
        ret.setId(UUID.fromString(env.getProperty("table1s.first.id")));
        ret.setProp1(env.getProperty("table1s.first.prop1"));
        ....
        return ret;
    }

    ....
}
And I use this helper as an autowired entity in my tests (especially for my Service classes):
public class Table1ServiceTest {

    @ClassRule
    public static PostgresContainer postgresContainer = PostgresContainer.getInstance();

    @Autowired
    Table1Service table1Service;

    @Autowired
    Table1TestData table1TestData;

    @Autowired
    MockMvc mockMvc;

    @Autowired
    ObjectMapper objectMapper;

    @BeforeAll
    private static void startup() {
        postgresContainer.start();
    }

    @Test
    @DisplayName("Table 1 Service Test")
    @Transactional
    public void findTable1ById() throws Exception {
        Table1TestData testData = table1TestData.getFirstRowData();
        Table1 table1 = table1Service.findTable1ById(testData.getId());
        assertNotNull(table1);
        assertEquals(table1.getId(), testData.getId());
        assertEquals(table1.getProp1(), testData.getProp1());
        ....
    }
}
However, let's say I have to add a new column to Table1 (or any table really) and I put the new schema into the init script. I now have to go through each of these insert statements manually and add the new column with a value (assuming there's no default); the same applies if a column is removed (even if it doesn't affect the classes directly). This ends up being cumbersome.
So my question really is, for someone who is using an init script to populate test data for a containerized DB, what is the best way to go about maintaining this data efficiently without much manual curating?
I think you could take advantage of PostgreSQL's initialization-script feature: just place your SQL scripts under /docker-entrypoint-initdb.d (creating the directory if necessary) and the container will execute them directly, without any programmatic work.
You can check out an example here:
https://github.com/gmunozfe/clustered-ejb-timers-kie-server/blob/master/src/test/java/org/kie/samples/integration/ClusteredEJBTimerSystemTest.java
Define your PostgreSQL container pointing to that directory (a fuller sketch follows below):
.withFileSystemBind("etc/postgresql", "/docker-entrypoint-initdb.d",
        BindMode.READ_ONLY)
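For context, a minimal sketch of such a container definition (image tag, credentials and host path are assumptions); the official postgres image runs every .sql and .sh file it finds in /docker-entrypoint-initdb.d on first startup:

import org.testcontainers.containers.BindMode;
import org.testcontainers.containers.PostgreSQLContainer;

public class PostgresTestContainer {

    // Schema and seed-data scripts live on the host under etc/postgresql and are
    // executed by the image's entrypoint when the container is first created.
    public static final PostgreSQLContainer<?> POSTGRES =
            new PostgreSQLContainer<>("postgres:13-alpine")
                    .withDatabaseName("testdb")
                    .withUsername("test")
                    .withPassword("test")
                    .withFileSystemBind("etc/postgresql", "/docker-entrypoint-initdb.d",
                            BindMode.READ_ONLY);
}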
If you want different scripts per test, have a look at this article:
https://www.baeldung.com/spring-boot-data-sql-and-schema-sql

Create new PostgreSQL schema

What's the easiest way to create a new Postgres schema inside the database at runtime, and also create the tables defined in an SQL file?
This is a Spring Boot application, and the method receives the name of the schema that needs to be created in the DB.
Although it sounds like this would be a case for using Liquibase or Flyway or any other tool, here is a simple (but very hacky) solution/starting point:
(rough) Steps:
create the whole ddl query, which consists of the "create and use schema part" and the content of your SQL file
inject the entity manager
run the whole ddl query as a native query
Example/(hacky) Code:
Here a simple controller class defining a GET method that takes a parameter called "schema":
@Controller
public class FooController {

    private static final String SCHEMA_FORMAT = "create schema %s; set schema %s; ";

    @PersistenceContext
    EntityManager entityManager;

    @Value("classpath:foo.sql")
    Resource fooResource;

    @GetMapping("foo")
    @Transactional
    public ResponseEntity<?> foo(@RequestParam("schema") String schema)
            throws IOException {
        File fooFile = fooResource.getFile();
        String ddl = new String(Files.readAllBytes(fooFile.toPath()));
        String schemaQuery = String.format(SCHEMA_FORMAT, schema, schema);
        String query = String.format("%s %s", schemaQuery, ddl);
        entityManager.createNativeQuery(query).executeUpdate();
        return ResponseEntity.noContent().build();
    }
}
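Purely as an illustration of how the endpoint could be exercised (the test class, the schema name "tenant_42" and the use of TestRestTemplate are assumptions, not part of the original answer):

import static org.junit.jupiter.api.Assertions.assertEquals;

import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.boot.test.web.client.TestRestTemplate;
import org.springframework.http.HttpStatus;
import org.springframework.http.ResponseEntity;

@SpringBootTest(webEnvironment = SpringBootTest.WebEnvironment.RANDOM_PORT)
public class FooControllerTest {

    @Autowired
    private TestRestTemplate restTemplate;

    @Test
    public void createsSchemaFromSqlFile() {
        // Creates the schema "tenant_42" and runs foo.sql inside it.
        ResponseEntity<Void> response =
                restTemplate.getForEntity("/foo?schema=tenant_42", Void.class);
        assertEquals(HttpStatus.NO_CONTENT, response.getStatusCode());
    }
}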

Spring data doesn't increment after DBsetup

I have a question. I'm using DbSetup for Spring Boot tests with a PostgreSQL database. I use DbSetup to insert a user, but when I try to save another user through Spring Data I get the following exception:
Detail: Key (id)=(1) already exists.
org.springframework.dao.DataIntegrityViolationException: could not execute statement; SQL [n/a]; constraint [users_pkey];
This is my test class:
@RunWith(SpringRunner.class)
@SpringBootTest
@TestPropertySource("/application-test.properties")
public class UserRepositoryTest {

    @Autowired
    private ApplicationContext applicationContext;

    @Autowired
    private UserRepository userRepository;

    @Autowired
    private DataSource dataSource;

    @Before
    public void insertData() throws SQLException {
        Operation operation = sequenceOf(CommonOperations.DELETE_ALL, CommonOperations.INSERT_USER);
        DbSetup dbSetup = new DbSetup(new DataSourceDestination(dataSource), operation);
        dbSetup.launch();
    }

    @After
    public void cleanPK() throws SQLException {
        DBUtil.resetAutoIncrementColumns(applicationContext, "user");
    }

    @Test
    public void registerUser() {
        val user = new User(null, "Glass", "123123", "glass999@mail.ru");
        assertEquals(user, userRepository.saveAndFlush(user));
    }
}
Operations for DbSetup:
public class CommonOperations {

    public static final Operation DELETE_ALL = deleteAllFrom("article_tag", "article", "tag", "users");

    public static final Operation INSERT_USER =
            insertInto("users")
                    .columns("id", "email", "password", "username")
                    .values(1, "krikkk998@mail.ru", "123123", "Daimon")
                    .build();
}
Class to reset sequence:
@NoArgsConstructor
public final class DBUtil {

    public static void resetAutoIncrementColumns(ApplicationContext applicationContext,
                                                 String... tableNames) throws SQLException {
        DataSource dataSource = applicationContext.getBean(DataSource.class);
        String resetSqlTemplate = "ALTER SEQUENCE %s RESTART WITH 1;";
        try (Connection dbConnection = dataSource.getConnection()) {
            for (String resetSqlArgument : tableNames) {
                try (Statement statement = dbConnection.createStatement()) {
                    String resetSql = String.format(resetSqlTemplate, resetSqlArgument + "_id_seq");
                    statement.execute(resetSql);
                }
            }
        }
    }
}
Does anyone know how to solve this problem?
One thing to look at:
public static final Operation INSERT_USER =
        insertInto("users")
                .columns("id", "email", "password", "username")
                .values(1, "krikkk998@mail.ru", "123123", "Daimon")
                .build();
Here you are using a hard-coded id, i.e. 1.
Now, when you try to create another user in the test case, you pass the id as null, assuming the value is supposed to be picked from the sequence. That sequence also starts from 1, hence the conflict.
The issue is a constraint violation. One thing you can do is change the 'id' column in your table to auto-increment, so the DB takes care of incrementing this column value automatically.
At any point, if you want to reset this id value, you can call resetAutoIncrementColumns(). Then, in your INSERT SQL, you do not have to specify the 'id' column at all (as sketched below); a unique value will be inserted whenever a new user is saved.
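For instance, a sketch of the same DbSetup insert without the hard-coded id, assuming the users.id column has been switched to an identity/serial column so the database draws the value from its own sequence:

import static com.ninja_squad.dbsetup.Operations.insertInto;

import com.ninja_squad.dbsetup.operation.Operation;

public class CommonOperationsWithoutId {

    // The database assigns the next sequence value itself, so a later
    // saveAndFlush() from Spring Data no longer collides with the test row.
    public static final Operation INSERT_USER =
            insertInto("users")
                    .columns("email", "password", "username")
                    .values("krikkk998@mail.ru", "123123", "Daimon")
                    .build();
}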
Hope this will help you.

JOOQ - column "id" is of type uuid but expression is of type character varying

I'm having an issue related to JOOQ.
---------
This is my "setup" that led to the issue.
Table:
CREATE TABLE "public".xyz
(
    id UUID NOT NULL,
    CONSTRAINT pk_t_xyz PRIMARY KEY(id)
);
The field generated by jOOQ is correct:
public final TableField<XYZRecord, UUID> ID = createField("id", org.jooq.impl.SQLDataType.UUID.nullable(false), this, "comment");
UUID is from java.util.*
My "custom" POJO with UUID from java.util.*:
public class XYZ {

    @NotNull
    private UUID id;

    public XYZ(@NotNull UUID id) {
        this.id = id;
    }

    public UUID getId() {
        return id;
    }
}
DSL configuration:
@Configuration
public class DataSourceConfiguration {

    @Qualifier("dataSource")
    @Autowired
    private DataSource dataSource;

    @Bean
    public DefaultDSLContext dsl() {
        return new DefaultDSLContext(configuration());
    }

    @Bean
    public DataSourceConnectionProvider connectionProvider() {
        return new DataSourceConnectionProvider(
                new TransactionAwareDataSourceProxy(dataSource));
    }

    public DefaultConfiguration configuration() {
        DefaultConfiguration jooqConfiguration = new DefaultConfiguration();
        jooqConfiguration.set(connectionProvider());
        jooqConfiguration.set(new DefaultExecuteListenerProvider(exceptionTransformer()));
        return jooqConfiguration;
    }

    @Bean
    public ExceptionTranslator exceptionTransformer() {
        return new ExceptionTranslator();
    }
}
Datasource in application.yml
spring:
  datasource:
    type: com.zaxxer.hikari.HikariDataSource
    driver-class-name: org.postgresql.Driver
    url: jdbc:postgresql://localhost:5432/xyz-data
The jOOQ version is 3.10.5. I'm using spring-boot-starter-jooq with Spring Boot 2.0.0.RELEASE. The PostgreSQL version is 10.
------------------
When I try to insert data like this:
dslContext.insertInto(XYZ, XYZ.ID)
.values(xyz.getId()).execute();
It does not work for some reason. As I understand from the exception below, jOOQ is casting the UUID to a string and therefore the SQL is invalid. Should I write some kind of converter, or did I define something in the wrong way?
Error:
org.springframework.jdbc.BadSqlGrammarException: Access database using jOOQ; bad SQL grammar [insert into "public"."xyz" ("id") values (?)]; nested exception is org.postgresql.util.PSQLException: ERROR: column "id" is of type uuid but expression is of type character varying
Hint: You will need to rewrite or cast the expression.
Position: 225
[...]
Caused by: org.postgresql.util.PSQLException: ERROR: column "id" is of type uuid but expression is of type character varying
Hint: You will need to rewrite or cast the expression.
Position: 225
at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2422)
at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:2167)
at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:306)
at org.postgresql.jdbc.PgStatement.executeInternal(PgStatement.java:441)
at org.postgresql.jdbc.PgStatement.execute(PgStatement.java:365)
at org.postgresql.jdbc.PgPreparedStatement.executeWithFlags(PgPreparedStatement.java:155)
at org.postgresql.jdbc.PgPreparedStatement.execute(PgPreparedStatement.java:144)
at com.zaxxer.hikari.pool.ProxyPreparedStatement.execute(ProxyPreparedStatement.java:44)
at com.zaxxer.hikari.pool.HikariProxyPreparedStatement.execute(HikariProxyPreparedStatement.java)
at org.jooq.tools.jdbc.DefaultPreparedStatement.execute(DefaultPreparedStatement.java:209)
at org.jooq.impl.AbstractQuery.execute(AbstractQuery.java:429)
at org.jooq.impl.AbstractDMLQuery.execute(AbstractDMLQuery.java:452)
at org.jooq.impl.AbstractQuery.execute(AbstractQuery.java:347)
... 51 more
I had this exact issue with Spring Boot. I could work around the issue by explicitly specifying the Postgres Dialect when setting up the DefaultConfiguration object.
E.g. in your DSL configuration:
public DefaultConfiguration configuration() {
    DefaultConfiguration jooqConfiguration = new DefaultConfiguration();
    // Explicitly set the dialect
    jooqConfiguration.setSQLDialect(SQLDialect.POSTGRES);
    jooqConfiguration.set(connectionProvider());
    jooqConfiguration.set(new DefaultExecuteListenerProvider(exceptionTransformer()));
    return jooqConfiguration;
}
This is a bug in jOOQ: https://github.com/jOOQ/jOOQ/issues/7351
It seems to happen only when binding null values as UUIDs. The workaround is to implement your own data type binding:
https://www.jooq.org/doc/latest/manual/sql-building/queryparts/custom-bindings

JDBC update query not working in a unit test with Derby database and SpringJUnit4ClassRunner

I am having problems trying to do a unit test.
The test class is simple, like this:
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(locations = {"/job-runner-context.xml"})
public class AtualizarDataServiceTest {

    @Autowired
    DataSource testDataSource;

    @Autowired
    Service service;

    @Before
    public void setUp() throws DatabaseUnitException, SQLException {
        service.setDataSource(testDataSource);
    }

    @Test
    public final void testUpdateDate() throws SQLException {
        assertTrue(verifyDate());
        service.updateDate();
        assertFalse(verifyDate()); // Assert breaks here
    }

    private boolean verifyDate() {
        SimpleJdbcTemplate consultaTemplate = new SimpleJdbcTemplate(testDataSource);
        int count = consultaTemplate.queryForInt("SELECT COUNT(*) FROM MY_TABLE");
        return count == 0;
    }
}
The service:
public class Service {

    private DataSource dataSource;

    public void updateDate() {
        SimpleJdbcTemplate jdbcTemplate = new SimpleJdbcTemplate(getDataSource());
        String query = "UPDATE MY_TABLE SET DT_UPDATE_OPERATION = ?";
        jdbcTemplate.update(query, new Object[]{new java.sql.Date(Calendar.getInstance().getTime().getTime())});
    }

    public DataSource getDataSource() {
        return dataSource;
    }

    public void setDataSource(DataSource dataSource) {
        this.dataSource = dataSource;
    }
}
job-runner-context.xml important pieces:
<context:annotation-config/>
<context:component-scan base-package="my.package"/>
<bean class="my.package.Service"/>
<bean id="testDataSource" class="com.read.only.MyBasicDataSource" destroy-method="close" lazy-init="true">
    <property name="jdbcReference" value="derby" />
</bean>
the jdbc connection properties:
<com:jdbcReference name="derby" type="DATABASE">
    <com:credential user="" password="" />
    <com:autocommit arg="false" />
    <com:databaseConfig driverClassName="org.apache.derby.jdbc.EmbeddedDriver"
                        url="jdbc:derby:dbderby;create=true" databaseName="ANY" />
</com:jdbcReference>
At first I thought the problem was related to commit issues, but I tried setting the autocommit property to "true" and also manually calling testDataSource.getConnection().commit(), and it didn't work. The code and methods work fine, but the test isn't updating the Derby database. In other test classes, data is preset with DbUnit in the same database and the code works. This answer gives a general list of possible problems; I checked them, and I am reading and writing to the same tables in the same schemas. Am I missing something?
Try setting <com:autocommit arg="true" />.
Answering my own question: I verified the autocommit setting and whether I was writing to the right database, but didn't check the obvious: you can't update a table with no records! The query UPDATE MY_TABLE SET DT_UPDATE_OPERATION = ? was applied to an empty table, so the count query would always return 0. I just configured the test to make DbUnit import a known state into the database from an XML file (as sketched below). Sorry for the trouble.
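A minimal sketch of that DbUnit setup, with an assumed helper class and dataset path (not from the original post): CLEAN_INSERT loads a known set of MY_TABLE rows before the test, so the UPDATE statement finally has something to act on.

import java.io.InputStream;

import javax.sql.DataSource;

import org.dbunit.database.DatabaseConnection;
import org.dbunit.database.IDatabaseConnection;
import org.dbunit.dataset.IDataSet;
import org.dbunit.dataset.xml.FlatXmlDataSetBuilder;
import org.dbunit.operation.DatabaseOperation;

public class TestDataLoader {

    // Reads a flat XML dataset from the classpath and performs a CLEAN_INSERT,
    // i.e. deletes existing rows and inserts the ones defined in the file.
    public static void loadDataset(DataSource dataSource, String resource) throws Exception {
        try (InputStream xml = TestDataLoader.class.getResourceAsStream(resource)) {
            IDatabaseConnection connection = new DatabaseConnection(dataSource.getConnection());
            try {
                IDataSet dataSet = new FlatXmlDataSetBuilder().build(xml);
                DatabaseOperation.CLEAN_INSERT.execute(connection, dataSet);
            } finally {
                connection.close();
            }
        }
    }
}

It would be called from the @Before method, for example TestDataLoader.loadDataset(testDataSource, "/datasets/my_table.xml"), with the XML file defining at least one MY_TABLE row (the path and file name are illustrative).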
