DbUnit - Warning: AbstractTableMetaData

I am using DbUnit in the latest version, 2.4.8, and I get many warnings in my unit tests with this message:
WARN : org.dbunit.dataset.AbstractTableMetaData -
Potential problem found: The configured data type factory
'class org.dbunit.dataset.datatype.DefaultDataTypeFactory'
might cause problems with the current database 'MySQL' (e.g. some datatypes may
not be supported properly). In rare cases you might see this message because the
list of supported database products is incomplete (list=[derby]). If so please
request a java-class update via the forums.If you are using your own
IDataTypeFactory extending DefaultDataTypeFactory, ensure that you override
getValidDbProducts() to specify the supported database products.
So I thought I'd add this (I use a MySQL database):
protected void setUpDatabaseConfig(DatabaseConfig config) {
    config.setProperty(DatabaseConfig.PROPERTY_DATATYPE_FACTORY, new MySqlDataTypeFactory());
}
But this does not help to avoid these warnings. What's wrong here?
Thank you in advance & Best Regards Tim.

I solved this with info from the DbUnit FAQ. Just setting the data type factory property made the warning go away:
Connection dbConn = template.getDataSource().getConnection();
IDatabaseConnection connection = new DatabaseConnection(dbConn, "UTEST", false);
DatabaseConfig dbConfig = connection.getConfig();
// added this line to get rid of the warning
dbConfig.setProperty(DatabaseConfig.PROPERTY_DATATYPE_FACTORY, new OracleDataTypeFactory());
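The FAQ snippet above happens to target Oracle; for the MySQL database from the question, the same line would presumably use the MySQL factory (from org.dbunit.ext.mysql) instead:
dbConfig.setProperty(DatabaseConfig.PROPERTY_DATATYPE_FACTORY, new MySqlDataTypeFactory());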

With Spring Boot you can use a configuration bean like this (DatabaseConfigBean and DatabaseDataSourceConnectionFactoryBean come from the spring-test-dbunit project):
@Configuration
public class DbUnitConfiguration {

    @Autowired
    private DataSource dataSource;

    @Bean
    public DatabaseDataSourceConnectionFactoryBean dbUnitDatabaseConnection() {
        DatabaseConfigBean bean = new DatabaseConfigBean();
        bean.setDatatypeFactory(new MySqlDataTypeFactory());
        DatabaseDataSourceConnectionFactoryBean dbConnectionFactory =
                new DatabaseDataSourceConnectionFactoryBean(dataSource);
        dbConnectionFactory.setDatabaseConfig(bean);
        return dbConnectionFactory;
    }
}

I know this is an old thread, but all the answers here are more complicated than they need to be.
The simplest way to set the factory on every connection acquisition is to supply an OperationListener and implement its connectionRetrieved method to do what you want. No subclassing is needed; the listener is invoked every time an IDatabaseConnection is acquired.
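For example, a minimal sketch against DbUnit's listener API (the driver class, URL and credentials are placeholders; the MySQL factory is taken from the question):
JdbcDatabaseTester tester = new JdbcDatabaseTester("com.mysql.cj.jdbc.Driver",
        "jdbc:mysql://localhost:3306/utest", "user", "password");
tester.setOperationListener(new DefaultOperationListener() {
    @Override
    public void connectionRetrieved(IDatabaseConnection connection) {
        super.connectionRetrieved(connection); // keep the default behaviour
        // invoked on every connection acquisition, so the factory is always set
        connection.getConfig().setProperty(
                DatabaseConfig.PROPERTY_DATATYPE_FACTORY, new MySqlDataTypeFactory());
    }
});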

I was using the jTDS driver with MS SQL Server 2008. Overriding the following method in my DbUnit test class made the warning disappear:
@Override
protected void setUpDatabaseConfig(DatabaseConfig config) {
    config.setProperty(DatabaseConfig.PROPERTY_DATATYPE_FACTORY, new MsSqlDataTypeFactory());
}

@reassembler's answer is spot on. Just to add that I am testing against different database products, so I now set the data type factory according to the current connection:
private IDatabaseConnection getConnection(Connection jdbcConnection) throws Exception {
    String databaseProductName = jdbcConnection.getMetaData().getDatabaseProductName();
    DatabaseConnection databaseConnection = new DatabaseConnection(jdbcConnection);
    DatabaseConfig dbConfig = databaseConnection.getConfig();
    switch (databaseProductName) {
        case "HSQL Database Engine":
            dbConfig.setProperty(DatabaseConfig.PROPERTY_DATATYPE_FACTORY, new HsqldbDataTypeFactory());
            break;
        case "MySQL":
            dbConfig.setProperty(DatabaseConfig.PROPERTY_DATATYPE_FACTORY, new MySqlDataTypeFactory());
            break;
        default:
            log.warn("No matching database product found when setting DBUnit DATATYPE_FACTORY");
    }
    return databaseConnection;
}
You can obviously add any additional databases to this list.

I am using DbUnit version 2.7.0. In my case, just setting the data type factory property in the @Test doesn't suffice; the warning comes back when DbUnit's JdbcDatabaseTester.onSetup() method is called.
I solved the problem by implementing a MyJdbcDatabaseTester that extends JdbcDatabaseTester and overrides its getConnection() method to configure the data type factory property:
public class MyJdbcDatabaseTester extends JdbcDatabaseTester {

    public MyJdbcDatabaseTester(String driverClass, String connectionUrl, String username,
            String password) throws ClassNotFoundException {
        super(driverClass, connectionUrl, username, password);
    }

    @Override
    public IDatabaseConnection getConnection() throws Exception {
        IDatabaseConnection result = super.getConnection();
        DatabaseConfig dbConfig = result.getConfig();
        // to suppress warnings when accessing the database
        dbConfig.setProperty(DatabaseConfig.PROPERTY_DATATYPE_FACTORY, new MySqlDataTypeFactory());
        return result;
    }
}
Then I use MyJdbcDatabaseTester instead of JdbcDatabaseTester in my tests.
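For example (a sketch; the driver class, URL, credentials and dataSet are placeholders):
IDatabaseTester tester = new MyJdbcDatabaseTester("com.mysql.cj.jdbc.Driver",
        "jdbc:mysql://localhost:3306/utest", "user", "password");
tester.setDataSet(dataSet);
tester.onSetup(); // no warning now, because getConnection() configures the factory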

Related

Specify Oracle Schema SpringBoot 2.x/Apache Camel - Multiple Datasource

I am having a hard time changing the Oracle datasource schema for my Spring Boot app, which will eventually be used by my Camel routes. I am logging in as user readonly, but all the data is in schema mydata; readonly has read rights to the mydata schema.
I have tried running ALTER SESSION SET CURRENT_SCHEMA=mydata against the datasource (by autowiring it and then getting the connection object from the datasource), and it doesn't work, even though I have no issue running selects from statement objects I create off the connection (see code below).
If I create a REST endpoint that executes ALTER SESSION SET CURRENT_SCHEMA=mydata and call it from Postman or a browser, that changes my schema and my other endpoints work, but I would prefer not to do it that way, since I would have to call that endpoint first. I guess I could call it when my Spring Boot app loads, but that just seems like the wrong way to do it.
I also do not want to hardcode/prefix all my tables with the schema name, since different regions have different schema names; I'd like to configure the schema name in the properties file.
Here is my application.properties. I have tried various ways to set the schema in the properties file based on other Stack Overflow posts, and so far none of them work:
spring.datasource.first.url=jdbc:oracle:thin:@myserver:10100:db9
spring.datasource.first.username=readonly
spring.datasource.first.password=readonlypass
## DOESN'T WORK -> spring.datasource.hikari.schema=mydata
## DOESN'T WORK -> spring.datasource.hikari.first.schema=mydata
# sync database
spring.datasource.second.driverClassName=oracle.jdbc.OracleDriver
spring.datasource.second.url=jdbc:oracle:thin:@myserver2:10100:db15
spring.datasource.second.username=eam
spring.datasource.second.password=eampass
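One thing worth checking (an assumption based on the @ConfigurationProperties prefixes in the code below): the custom Hikari pools are built from spring.datasource.first.configuration and spring.datasource.second.configuration, so Hikari-specific keys would have to live under those prefixes rather than spring.datasource.hikari.*. Assuming HikariCP's schema and connection-init-sql settings, something along these lines might work:
## hypothetical keys - they must sit under the prefix the Hikari pool is actually bound to
spring.datasource.first.configuration.schema=mydata
## or run a statement on every new pooled connection
spring.datasource.first.configuration.connection-init-sql=ALTER SESSION SET CURRENT_SCHEMA=mydata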
Here is the code from my springboot application:
/**
 * A spring-boot application that includes a Camel route builder to setup the Camel routes
 */
@SpringBootApplication
@ImportResource({"classpath:spring/camel-context.xml"})
public class Application extends RouteBuilder {

    int workorderSyncFrequency = 5000;

    // Autowired the first datasource in attempts to alter the session to set my schema name.
    @Autowired
    DataSource firstDataSource;

    // must have a main method spring-boot can run
    public static void main(String[] args) {
        SpringApplication.run(Application.class, args);
    }

    // setup first datasource
    @Bean
    @Primary
    @ConfigurationProperties("spring.datasource.first")
    public DataSourceProperties firstDataSourceProperties() {
        return new DataSourceProperties();
    }

    @Bean
    @Primary
    @ConfigurationProperties("spring.datasource.first.configuration")
    public DataSource firstDataSource() {
        return firstDataSourceProperties().initializeDataSourceBuilder()
                .type(HikariDataSource.class).build();
    }

    // setup second data source
    @Bean
    @ConfigurationProperties("spring.datasource.second")
    public DataSourceProperties secondDataSourceProperties() {
        return new DataSourceProperties();
    }

    @Bean
    @ConfigurationProperties("spring.datasource.second.configuration")
    public DataSource secondDataSource() {
        // build the second pool from its own properties
        return secondDataSourceProperties().initializeDataSourceBuilder()
                .type(HikariDataSource.class).build();
    }

    @Override
    public void configure() throws Exception {
        Connection con = DataSourceUtils.getConnection(firstDataSource);
        Statement stmt = con.createStatement();
        ResultSet rs = stmt.executeQuery("select count(*) from mydata.ASSET");
        rs.next();
        // simply testing I am using the correct datasource and can query the mydata schema; this works
        System.out.println("++++++++++++++++++++++ASSET COUNT+++++++++++++++++++" + rs.getInt(1));

        // Tried both of these statements, neither works.
        //stmt.executeQuery("ALTER SESSION SET CURRENT_SCHEMA=mydata");
        //stmt.executeUpdate("ALTER SESSION SET CURRENT_SCHEMA=mydata");

        // Connection is defaulted to autocommit; tried this just in case.
        con.commit();

        // ASSET table doesn't exist on the readonly schema, only on the mydata schema.
        // If I call test3 I will get "table or view does not exist", unless I first call the
        // "schema" endpoint below.
        rest()
            .get("test3")
            .produces(MediaType.APPLICATION_JSON_VALUE)
            .route()
            .to("sql:SELECT * FROM ASSET where rownum < 10"
                + "?dataSource=firstDataSource&outputType=SelectList");

        // This works if I call this route, but it's a weird way to make this work.
        rest()
            .get("schema")
            .produces(MediaType.APPLICATION_JSON_VALUE)
            .route()
            .to("sql:ALTER SESSION SET CURRENT_SCHEMA=mydata"
                + "?dataSource=firstDataSource&outputType=SelectList");
    }
}

SparkJava JOOQ dependency injection with Guice

I am writing a simple CRUD application which fetches person records from a database, using the SparkJava framework. I have working code that fetches the records, but I would like to extract the jOOQ DSLContext setup, inject it as a bean, and initialize it in another class in order to have cleaner code. I'm not sure how to achieve that. Here's the main method, which currently holds everything:
public static void main(String[] args) throws IOException {
    final BasicDataSource ds = new BasicDataSource();
    final Properties properties = new Properties();
    properties.load(BankApiApplication.class.getResourceAsStream("/application.properties"));
    ds.setDriverClassName(properties.getProperty("db.driver"));
    ds.setUrl(properties.getProperty("db.url"));
    ds.setUsername(properties.getProperty("db.username"));
    ds.setPassword(properties.getProperty("db.password"));

    final ConnectionProvider cp = new DataSourceConnectionProvider(ds);
    final Configuration configuration = new DefaultConfiguration()
            .set(cp)
            .set(SQLDialect.H2)
            .set(new ThreadLocalTransactionProvider(cp, true));
    final DSLContext ctx = DSL.using(configuration);
    final JSONFormat format = new JSONFormat().format(true).header(false).recordFormat(JSONFormat.RecordFormat.OBJECT);

    port(8080);
    get("/persons", (request, response) -> {
        return ctx.select().from(Person.PERSON).fetch().formatJSON();
    });
}
How could I extract the code which initializes the DataSource and configures the DSLContext, so that I could just inject a DSLContext (or some kind of DSLContextHolder) and do my querying?
So, in general, you want to inject the highest-level object you can. This is related to the Law of Demeter, which in short says that a component can know about its direct dependencies, but it shouldn't know about those dependencies' dependencies.
In your case, you're really only using DSLContext (ctx). [A note here: your code has a lot of two-letter names - it's pretty hard to follow. It would be easier if you wrote out e.g. ctx -> dslContext, cp -> connectionProvider]. This means you really only want your method to know about the DSLContext, not its dependencies. Therefore, it would be good to pull the following out into a module, then inject just a DSLContext:
Configuration
ConnectionProvider
Properties
BasicDataSource
If all these things are only used in this one main(), you can write a single Provider to return a DSLContext. If some of these are used in multiple places (for more than instantiating this main()'s DSLContext), then they can go in their own Providers. For example, here's what a Provider for a DSLContext would look like, if Configuration was placed in its own Provider:
public class MyModule extends AbstractModule {

    // other providers
    // ...

    @Provides
    @Singleton
    public DSLContext dslContext(Configuration configuration) {
        return DSL.using(configuration);
    }
}
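For completeness, here's a sketch of what the full module might look like with everything pulled out of main() (it assumes commons-dbcp2's BasicDataSource and the property keys from the question; treat it as a starting point, not a definitive implementation):
import java.io.IOException;
import java.util.Properties;
import javax.sql.DataSource;
import org.apache.commons.dbcp2.BasicDataSource;
import org.jooq.Configuration;
import org.jooq.ConnectionProvider;
import org.jooq.DSLContext;
import org.jooq.SQLDialect;
import org.jooq.impl.DSL;
import org.jooq.impl.DataSourceConnectionProvider;
import org.jooq.impl.DefaultConfiguration;
import org.jooq.impl.ThreadLocalTransactionProvider;
import com.google.inject.AbstractModule;
import com.google.inject.Provides;
import com.google.inject.Singleton;

public class MyModule extends AbstractModule {

    @Provides
    @Singleton
    DataSource dataSource() throws IOException {
        // same setup as in the question's main(), just moved behind a provider
        Properties properties = new Properties();
        properties.load(MyModule.class.getResourceAsStream("/application.properties"));
        BasicDataSource dataSource = new BasicDataSource();
        dataSource.setDriverClassName(properties.getProperty("db.driver"));
        dataSource.setUrl(properties.getProperty("db.url"));
        dataSource.setUsername(properties.getProperty("db.username"));
        dataSource.setPassword(properties.getProperty("db.password"));
        return dataSource;
    }

    @Provides
    @Singleton
    ConnectionProvider connectionProvider(DataSource dataSource) {
        return new DataSourceConnectionProvider(dataSource);
    }

    @Provides
    @Singleton
    Configuration jooqConfiguration(ConnectionProvider connectionProvider) {
        return new DefaultConfiguration()
                .set(connectionProvider)
                .set(SQLDialect.H2)
                .set(new ThreadLocalTransactionProvider(connectionProvider, true));
    }

    @Provides
    @Singleton
    DSLContext dslContext(Configuration configuration) {
        return DSL.using(configuration);
    }
}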
Then, in your main(), you would write:
Injector injector = Guice.createInjector(new MyModule());
DSLContext myContext = injector.getInstance(DSLContext.class);
// ... use it
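And rather than calling injector.getInstance() everywhere, you would normally let Guice hand the DSLContext to the classes that need it via constructor injection, e.g. (PersonDao is a hypothetical class):
import com.google.inject.Inject;
import org.jooq.DSLContext;

public class PersonDao {

    private final DSLContext dslContext;

    @Inject
    public PersonDao(DSLContext dslContext) {
        this.dslContext = dslContext;
    }

    public String personsAsJson() {
        // same query as in the original main()
        return dslContext.select().from(Person.PERSON).fetch().formatJSON();
    }
}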

Using POST method for big queries in Spring Data Solr

I am using Spring Data Solr in my project. In some cases the generated Solr queries are too big (e.g. 15Kb+) and cause Solr exceptions. This solution: http://codingtricks.fidibuy.com/participant/join/54fce329b760506d5d9e7db3/Spring-Data-Solr-cannot-handle-long-queries
still fails for some queries.
Since directly sending those queries to Solr via POST works fine, I chose to work in this direction. I couldn't find any way in Spring Data Solr to configure the preferred method (GET/POST) for queries, so I came to the following solution: I extended HttpSolrServer
public class CustomSolrServer extends HttpSolrServer {

    public CustomSolrServer(String home, String core) {
        super(home);
        setCore(core);
    }

    @Override
    public QueryResponse query(SolrParams params) throws SolrServerException {
        METHOD method = METHOD.GET;
        if (isBigQuery(params)) {
            method = METHOD.POST;
        }
        return new QueryRequest(params, method).process(this);
    }
}
(Some details are skipped; setCore() and isBigQuery() are trivial.)
and use it as the SolrServer bean in my SolrConfiguration class:
@Configuration
@EnableSolrRepositories(basePackages = { "com.vvy.repository.solr" }, multicoreSupport = false)
@Import(value = SolrAutoConfiguration.class)
@EnableConfigurationProperties(SolrProperties.class)
public class SolrConfiguration {

    @Autowired
    private SolrProperties solrProperties;

    @Value("${spring.data.solr.core}")
    private String solrCore;

    @Bean
    public SolrServer solrServer() {
        return new CustomSolrServer(solrProperties.getHost(), solrCore);
    }
}
This works OK, but has a couple of drawbacks. I had to set multicoreSupport to false, because when Spring Data Solr implements repositories from the interfaces with multicoreSupport on, it uses a MultiCoreSolrServerFactory and tries to store a server per core, which is done by cloning them into the holding map. Naturally, this crashes on a customized SolrServer, because SolrServerUtils doesn't know how to clone() it. Also, I have to set the core manually instead of letting Spring Data extract it from the @SolrDocument annotation's parameter on the entity class.
Here are the questions
1) the main and general question: is there any reasonable way to solve the problem of too long queries in Spring Data Solr (or, more specifically, to use POST instead of GET)?
2) a minor one: is there a reasonable way to customize SolrServer in Spring Data Solr and yet maintain multiCoreSupport?
Answer for Q1: Yes, you can use POST instead of GET.
Answer for Q2: Yes, you are already halfway there. In addition:
1) You have to rename 'CustomSolrServer' to 'HttpSolrServer'; see the method
org.springframework.data.solr.server.support.SolrServerUtils#clone(T, java.lang.String)
for the reason.
2) You don't have to specify a concrete core name. You can specify the core name with the annotation
org.springframework.data.solr.core.mapping.SolrDocument
on the corresponding Solr model.
3) Set multicoreSupport = true.
According to your sample classes, they should look like the following:
package com.x.x.config;

import org.apache.solr.client.solrj.SolrRequest;
import org.apache.solr.client.solrj.SolrServerException;
import org.apache.solr.client.solrj.request.QueryRequest;
import org.apache.solr.client.solrj.response.QueryResponse;
import org.apache.solr.common.params.SolrParams;

public class HttpSolrServer extends org.apache.solr.client.solrj.impl.HttpSolrServer {

    public HttpSolrServer(String host) {
        super(host);
    }

    @Override
    public QueryResponse query(SolrParams params) throws SolrServerException {
        SolrRequest.METHOD method = SolrRequest.METHOD.POST;
        return new QueryRequest(params, method).process(this);
    }
}
@Configuration
@EnableSolrRepositories(basePackages = { "com.vvy.repository.solr" }, multicoreSupport = true)
@Import(value = SolrAutoConfiguration.class)
@EnableConfigurationProperties(SolrProperties.class)
public class SolrConfiguration {

    @Autowired
    private SolrProperties solrProperties;

    @Bean
    public SolrServer solrServer() {
        return new com.x.x.config.HttpSolrServer(solrProperties.getHost());
    }
}
P.S. The latest spring-data-solr 3.x already supports a custom query request method; see the related issue.

MongoDB multi-tenancy SpEL with @Document

This is related to
MongoDB and SpEL Expressions in #Document annotations
This is the way I am creating my Mongo template:
@Bean
public MongoDbFactory mongoDbFactory() throws UnknownHostException {
    String dbname = getCustid();
    return new SimpleMongoDbFactory(new MongoClient("localhost"), "mydb");
}

@Bean
MongoTemplate mongoTemplate() throws UnknownHostException {
    MappingMongoConverter converter =
            new MappingMongoConverter(mongoDbFactory(), new MongoMappingContext());
    return new MongoTemplate(mongoDbFactory(), converter);
}
I have a tenant provider class
@Component("tenantProvider")
public class TenantProvider {

    public String getTenantId() {
        // custom ThreadLocal logic for getting a name
    }
}
And my domain class
@Document(collection = "#{#tenantProvider.getTenantId()}_device")
public class Device {
    // my fields here
}
As you can see, I have created my MongoTemplate as specified in that post, but I still get the error below:
Exception in thread "main" org.springframework.expression.spel.SpelEvaluationException: EL1057E:(pos 1): No bean resolver registered in the context to resolve access to bean 'tenantProvider'
What am I doing wrong?
I finally figured out why I was getting this issue.
When using Servlet 3 initialization, make sure that you add the application context to the Mongo mapping context, as follows:
@Autowired
private ApplicationContext appContext;

public MongoDbFactory mongoDbFactory() throws UnknownHostException {
    return new SimpleMongoDbFactory(new MongoClient("localhost"), "apollo-mongodb");
}

@Bean
MongoTemplate mongoTemplate() throws UnknownHostException {
    final MongoDbFactory factory = mongoDbFactory();
    final MongoMappingContext mongoMappingContext = new MongoMappingContext();
    mongoMappingContext.setApplicationContext(appContext);

    // Learned from the web: prevents Spring from including the _class attribute
    final MappingMongoConverter converter = new MappingMongoConverter(factory, mongoMappingContext);
    converter.setTypeMapper(new DefaultMongoTypeMapper(null));

    return new MongoTemplate(factory, converter);
}
Check the autowiring of the context, and also:
mongoMappingContext.setApplicationContext(appContext);
With these two lines I was able to get the component wired correctly and use it in multi-tenant mode.
The above answers only worked partially in my case.
I've been struggling with the same problem and finally realized that under some runtime execution paths (when RepositoryFactorySupport relies on AbstractMongoQuery to query MongoDB, instead of SimpleMongoRepository, which as far as I know serves the "out of the box" methods provided by Spring Data), the metadata object of type MongoEntityMetadata that belongs to the MongoQueryMethod used in AbstractMongoQuery is updated only once, in a method named getEntityInformation().
Because the MongoQueryMethod object that holds a reference to this stateful bean seems to be pooled/cached by infrastructure code, @Document annotations with SpEL do not always work.
As far as I know, our only choice as developers is to use MongoOperations directly from a @Repository bean in order to specify the right collection name, evaluated at runtime with SpEL.
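For example, a sketch of that workaround, reusing the tenantProvider and Device class from the question:
@Repository
public class DeviceRepository {

    @Autowired
    private MongoOperations mongoOperations;

    @Autowired
    private TenantProvider tenantProvider;

    public List<Device> findAll() {
        // build the collection name at runtime instead of relying on the @Document SpEL
        return mongoOperations.findAll(Device.class, tenantProvider.getTenantId() + "_device");
    }
}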
I've tried to use AOP to modify this behaviour by setting a null collection name in MongoEntityMetadata, but this doesn't help, because the AbstractMongoQuery inner classes that implement the Execution interface would also need to be changed to check whether the MongoEntityMetadata collection name is null and, in that case, use a different MongoTemplate method signature.
MongoTemplate is smart enough to guess the right collection name by using its private method
private <T> String determineEntityCollectionName(T obj)
I've created a ticket in Spring's JIRA: https://jira.spring.io/browse/DATAMONGO-1043
If you have the mongoTemplate configured as in the related issue, the only thing I can think of is this:
<context:component-scan base-package="com.tenantprovider.package" />
Or if you want to use annotations:
@ComponentScan(basePackages = "com.tenantprovider.package")
You might not be scanning the tenant provider package.
Ex:
@ComponentScan(basePackages = "com.tenantprovider.package")
@Document(collection = "#{#tenantProvider.getTenantId()}_device")
public class Device {
    // my fields here
}

Same hibernate session between JUnit tests causing problems

I'm developing a web app based on Spring 2.5 and Hibernate 3. Recently I've introduced JUnit tests, and I've written some integration tests using the DBUnit framework. DBUnit is supposed to update the database with an XML dataset between one test and another, and as far as I can see it's working well.
However, when I update an element in one test, Hibernate seems to hold on to that information: even when I load the element in the following test, the information is the modified version. If I look at the DB while execution is paused, the database has been properly reset by DBUnit, so I think it must be a Hibernate problem.
Is there a way to tear down between tests, saying I want a new Hibernate session for my Spring context? By the way, I'm not using Spring annotations, and I get the Spring context by code:
String[] contextLocations = new String[2];
contextLocations[0] = "WebContent/WEB-INF/applicationContext.xml";
contextLocations[1] = "src/System_V3/test/applicationContext.xml";
context = new FileSystemXmlApplicationContext(contextLocations);
DBUnit setUp:
@Before
public void setUpBeforeClass() throws Exception {
    handleSetUpOperation();
}

private static void handleSetUpOperation() throws Exception {
    conn = getConnection();
    conn.getConnection().setAutoCommit(false);
    final IDataSet data = getDataSet();
    try {
        DatabaseOperation.REFRESH.execute(conn, data);
    } finally {
        conn.close();
    }
}

private static IDatabaseConnection getConnection()
        throws ClassNotFoundException, SQLException, DatabaseUnitException {
    Class.forName("org.gjt.mm.mysql.Driver");
    return new DatabaseConnection(DriverManager.getConnection(
            "jdbc:mysql://localhost:3306/web_database", "root", "pass"));
}

private static IDataSet getDataSet() throws IOException, DataSetException {
    ClassLoader classLoader = TestPrueba.class.getClassLoader();
    return new FlatXmlDataSetBuilder().build(classLoader
            .getResourceAsStream("System_V3/test/dataset.xml"));
}
Tests are written in JUnit 4 using only @Test annotations, and the test class does not extend any library class.
Any suggestions?
Not sure if this is something that can help you - but just in case...
Try using session.clear() in your teardown method.
Please take a look here http://docs.jboss.org/hibernate/orm/3.5/api/org/hibernate/Session.html#clear()
According to the spec, session.clear():
Completely clear the session. Evict all loaded instances and cancel all pending saves, updates and deletions. Do not close open iterators or instances of ScrollableResults.
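For example, a minimal sketch (the "sessionFactory" bean name is an assumption about your applicationContext.xml):
@After
public void tearDown() {
    // assumes the Spring context created above exposes a "sessionFactory" bean
    SessionFactory sessionFactory = (SessionFactory) context.getBean("sessionFactory");
    sessionFactory.getCurrentSession().clear();
}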
You need to execute your tests within a transaction. This can be achieved by setting SpringJUnit4ClassRunner on your test class. Once this is configured, you can use the @Transactional annotation per test.
With this approach you can also have your beans @Autowired directly into your test.
For instance:
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(locations = { "classpath:context-file.xml" })
public class MyTest {

    @Autowired
    private MyService myService;

    @Transactional
    @Test
    public void myFirstTest() {
        ...
        myService.executeSomething();
        ...
    }
}
And of course, you can set the default rollback behaviour on your test class by annotating it with @TransactionConfiguration(defaultRollback = true/false).
