I have a problem with the JPA Criteria API when using multiple datasources (persistence units) in my project.
There are two persistence units using different datasources:
<persistence-unit name="analysis" transaction-type="RESOURCE_LOCAL">
<provider>org.apache.openjpa.persistence.PersistenceProviderImpl</provider>
<non-jta-data-source>AnalysisDS</non-jta-data-source>
<class>entity1</class>
<class>entity2</class>
<class>entity3</class>
and
<persistence-unit name="reaction" transaction-type="RESOURCE_LOCAL">
<provider>org.apache.openjpa.persistence.PersistenceProviderImpl</provider>
<non-jta-data-source>ReactionDS</non-jta-data-source>
<class>someEntity1</class>
<class>someEntity2</class>
<class>someEntity3</class>
Spring loads them in the applicationContext:
<bean id="defaultAnalysysDataSource"
class="org.springframework.jndi.JndiObjectFactoryBean"
lazy-init="default">
<property name="jndiName" value="AnalysisDS"/>
<property name="lookupOnStartup" value="false"/>
<property name="cache" value="true"/>
<property name="proxyInterface" value="javax.sql.DataSource"/>
</bean>
<bean id="defaultReactionDataSource"
class="org.springframework.jndi.JndiObjectFactoryBean"
lazy-init="default">
<property name="jndiName" value="ReactionDS"/>
<property name="lookupOnStartup" value="false"/>
<property name="cache" value="true"/>
<property name="proxyInterface" value="javax.sql.DataSource"/>
</bean>
In my DAO I can work with these persistence units through an EntityManager; for example, for ReactionDS I'm using:
@PersistenceContext(unitName = "reaction")
private EntityManager entityManager;
And everything works: simple queries and JPQL expressions.
But when I introduce the JPA Criteria API into my DAO, like this:
CriteriaBuilder cb = entityManager.getCriteriaBuilder();
...
I get an exception as soon as getCriteriaBuilder() is called:
Caused by: <openjpa-2.4.0-r422266:1674604 fatal user error> org.apache.openjpa.util.MetaDataException: Errors encountered while resolving metadata. See nested exceptions for details.
at org.apache.openjpa.meta.MetaDataRepository.resolve(MetaDataRepository.java:675)
at org.apache.openjpa.meta.MetaDataRepository.getMetaDataInternal(MetaDataRepository.java:418)
at org.apache.openjpa.meta.MetaDataRepository.getMetaData(MetaDataRepository.java:389)
at org.apache.openjpa.persistence.meta.MetamodelImpl.&lt;init&gt;(MetamodelImpl.java:86)
at org.apache.openjpa.persistence.EntityManagerFactoryImpl.getMetamodel(EntityManagerFactoryImpl.java:348)
at org.apache.openjpa.persistence.EntityManagerFactoryImpl.getCriteriaBuilder(EntityManagerFactoryImpl.java:332)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
...
Caused by: <openjpa-2.4.0-r422266:1674604 fatal user error> org.apache.openjpa.util.MetaDataException: Table "ANALYSIS.ENTITY1" given for "entity1" does not exist.
at org.apache.openjpa.jdbc.meta.MappingInfo.createTable(MappingInfo.java:532)
at org.apache.openjpa.jdbc.meta.ClassMappingInfo.getTable(ClassMappingInfo.java:317)
at org.apache.openjpa.jdbc.meta.ClassMappingInfo.getTable(ClassMappingInfo.java:339)
at org.apache.openjpa.jdbc.meta.strats.FullClassStrategy.map(FullClassStrategy.java:73)
at org.apache.openjpa.jdbc.meta.ClassMapping.setStrategy(ClassMapping.java:392)
at org.apache.openjpa.jdbc.meta.RuntimeStrategyInstaller.installStrategy(RuntimeStrategyInstaller.java:55)
at org.apache.openjpa.jdbc.meta.MappingRepository.prepareMapping(MappingRepository.java:410)
at org.apache.openjpa.meta.MetaDataRepository.preMapping(MetaDataRepository.java:769)
at org.apache.openjpa.meta.MetaDataRepository.resolve(MetaDataRepository.java:658)
... 147 more
The root cause is in JPA: it tries to use tables from Analysis in the Reaction PU, extracting metamodel classes for entities that live in different datasources but accessing them all through one.
When I granted SELECT on Entity1 to ReactionDS, everything worked (because I could then run SELECT * FROM Analysis.Entity1 from Reaction).
The question: how do I make the metamodel classes work only within the datasource specified for the EntityManager (in this example Reaction, not Reaction together with Analysis)?
P.S. The database is Oracle, on WebLogic 12.1.3 with OpenJPA 2.4.
The metamodel is generated automatically with a Maven plugin at compile time:
<plugin>
<groupId>org.bsc.maven</groupId>
<artifactId>maven-processor-plugin</artifactId>
<executions>
<execution>
<id>process</id>
<goals>
<goal>process</goal>
</goals>
<phase>generate-sources</phase>
<configuration>
<processors>
<processor>org.apache.openjpa.persistence.meta.AnnotationProcessor6</processor>
</processors>
<optionMap>
<openjpa.metamodel>true</openjpa.metamodel>
</optionMap>
</configuration>
</execution>
</executions>
<dependencies>
<dependency>
<groupId>org.apache.openjpa</groupId>
<artifactId>openjpa</artifactId>
<version>${openjpa.version}</version>
</dependency>
</dependencies>
</plugin>
I think you may be confused by your Spring Framework datasource declarations.
These beans do not define your datasources; they only provide a way for other Spring components to access the datasources configured in your server. JPA does not use them at all.
Therefore, your problem lies in the datasources that you have defined in your WebLogic server. It looks like you have defined both datasources to reference the same database instance.
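If the two datasources really must point at the same Oracle instance (with separate schemas), one OpenJPA setting that may help is openjpa.jdbc.Schemas, which restricts schema reflection to the listed schemas. A hedged sketch for the reaction unit; the schema name REACTION is an assumption:

```xml
<persistence-unit name="reaction" transaction-type="RESOURCE_LOCAL">
    <provider>org.apache.openjpa.persistence.PersistenceProviderImpl</provider>
    <non-jta-data-source>ReactionDS</non-jta-data-source>
    <properties>
        <!-- limit schema reflection to this unit's own schema;
             REACTION is an assumed schema name -->
        <property name="openjpa.jdbc.Schemas" value="REACTION"/>
    </properties>
</persistence-unit>
```

Whether this prevents the cross-schema table lookup during metadata resolution depends on how the mappings are resolved, so treat it as something to try, not a guaranteed fix.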
I have a Maven project with multiple modules. Module ModuleB uses ModuleA as an internal Maven dependency. In moduleA I have a Spring xml config module.a.xml that loads a module.a.properties file. In the Spring xml config of moduleB I import the module.b.properties file together with the module.a.xml config.
In the end I end up with a Spring xml config with two property file imports. Depending on the order of the imports I can only access properties of one file: either module.a.properties or module.b.properties. How can I use both property files at the same time?
The problem with a solution using the PropertyPlaceholderConfigurer is that the properties files reside in different modules, and moduleB shouldn't have to worry about a properties file of moduleA.
<bean class="org.springframework.beans.factory.config.PropertyPlaceholderConfigurer" id="corePlaceHolder">
<property name="locations">
<list>
<value>classpath:module.a.properties</value>
<value>classpath:module.b.properties</value>
</list>
</property>
</bean>
The problem with using ignore-unresolvable="true" is that a forgotten property can then go unnoticed, and it is also easy to forget to put ignore-unresolvable="true" on each property-placeholder.
<context:property-placeholder location="module.a.properties" order="0" ignore-unresolvable="true"/>
<context:property-placeholder location="module.b.properties" order="1" ignore-unresolvable="true"/>
Not sure this solves your problem, but since you are using a Maven multi-module build, have you considered using a Maven plugin to create a third properties file as a merge of A and B, with a proper override strategy?
Here is a sample using maven-merge-properties-plugin
<plugin>
<groupId>org.beardedgeeks</groupId>
<artifactId>maven-merge-properties-plugin</artifactId>
<version>0.2</version>
<configuration>
<merges>
<merge>
<targetFile>${moduleB.output.dir}/module-final.properties</targetFile>
<propertiesFiles>
<propertiesFile>${moduleB.src.dir}/moduleB.properties</propertiesFile>
<propertiesFile>${moduleA.src.dir}/moduleA.properties</propertiesFile>
</propertiesFiles>
</merge>
</merges>
</configuration>
<executions>
<execution>
<phase>generate-sources</phase>
<goals>
<goal>merge</goal>
</goals>
</execution>
</executions>
</plugin>
This way you get all A and B properties in just one file. If a property exists in both A and B, B wins (check the file order in the configuration).
Since both modules are in the same project, retrieving both files should be quite easy.
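The merged file can then back a single placeholder configurer in the Spring config, for example (assuming module-final.properties ends up on the runtime classpath):

```xml
<!-- one placeholder source instead of two competing ones -->
<context:property-placeholder location="classpath:module-final.properties"/>
```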
You could even use another plugin to unpack just the needed properties files from external jars.
Hope this helps.
How can the Arquillian configuration file Arquillian.xml be shared between projects and team members?
<arquillian xmlns="http://jboss.org/schema/arquillian"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="
http://jboss.org/schema/arquillian
http://jboss.org/schema/arquillian/arquillian_1_0.xsd">
<container qualifier="jbossas-managed-wildfly-8" default="true">
<configuration>
<property name="jbossHome">C:\test\wildfly-8.1.0.Final</property>
<property name="javaVmArguments">-Djboss.socket.binding.port-offset=2 -Xmx512m -XX:MaxPermSize=128m</property>
<property name="managementPort">9992</property>
</configuration>
</container>
The problem is that this points to specific locations on the disk, and different team members have WildFly installed in different locations.
In addition we must duplicate Arquillian.xml for each project that uses it.
We use Arquillian for Maven testing (which could inject the values) and JUnit tests within Eclipse (which cannot inject them).
Any ideas how to do this?
Since you already have Maven support and structure, you can make use of Maven properties and placeholder replacement. It is simple.
I assume your arquillian.xml is under src/test/resources/arquillian.xml, right? Then you can replace the absolute values with properties:
<configuration>
<property name="jbossHome">${jboss.home}</property>
</configuration>
The above property can either be defined in the properties section of your pom or overridden during mvn execution with -Djboss.home=C:\myPath.
For this to work, you want Maven to automatically replace the placeholder ${jboss.home} in arquillian.xml at packaging time for each developer, using the value defined in the properties section or passed on the command line. This is done through the resource filtering functionality:
<build>
<testResources>
<testResource>
<directory>src/test/resources</directory>
<filtering>true</filtering>
</testResource>
</testResources>
</build>
See the simple examples here
I have a project MyProject which has a dependency on configuration in another project, BaseProject. Inside BaseProject I have dependencies on many different projects like ErrorLogging, PerformanceLogging, etc. I want to be able to build the top-level project (MyProject) and have it filter all the Spring xml files in the projects it depends on. I'm not having any luck: I can see the beans, but they are not being filtered. Some of the beans are filtered with default filters defined in their own poms, but none are using the filters from MyProject.
MyProject - This contains the filter files and imports the config from the other projects.
BaseProject - Has spring beans defined which require filtering.
ErrorLogging - Has spring beans defined which require filtering.
When I run a package build from MyProject, all the Spring files are correctly extracted into the jar file, but they still contain the property placeholder values, ${error.logging.host} for example. The beans in MyProject are correctly filtered. The alternative is to define the beans in MyProject, but there are about 10 projects which use BaseProject and its beans, and I do not want to redefine them across 10 separate projects.
If anyone could shed any light on this issue it'd be great. Thanks
Edit :
To make this clearer: I have a Spring beans XML definition inside the ErrorLogging project called errors-config.xml, which defines beans for connecting to databases. It just has placeholders for the connection details, which should be provided by the filter.properties file inside MyProject.
errors-config.xml is imported as a resource into baseproject-config.xml, which sits inside BaseProject. BaseProject and its config file are imported into MyProject.
I then build MyProject with Maven, and I would like the property placeholders inside errors-config.xml to be replaced with the values in MyProject's filter.properties. MyProject successfully filters its own files, but not those of the ErrorLogging project, which seems to pick up filters from its own src/main/resources folder instead of MyProject's.
You can achieve that by unpacking all the dependencies, filtering, and packing again. The whole process depends on the structure of your project; for a basic configuration this may suffice:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-dependency-plugin</artifactId>
<executions>
<execution>
<id>unpack-dependencies</id>
<!--unpack all the dependencies to the target of this project-->
<phase>initialize</phase>
<inherited>false</inherited>
<goals>
<goal>unpack-dependencies</goal>
</goals>
<configuration>
<includeGroupIds>${pom.groupId}</includeGroupIds>
<overWrite>true</overWrite>
<outputDirectory>${project.build.directory}/${artifactId}</outputDirectory>
<includes>**/*.properties,**/*.xml</includes>
</configuration>
</execution>
</executions>
</plugin>
<plugin>
<artifactId>maven-assembly-plugin</artifactId>
<configuration>
<descriptors>
<descriptor>${config.maven.plattform.resources}/assembly/zip.xml</descriptor>
</descriptors>
</configuration>
<executions>
<execution>
<id>zip</id>
<phase>package</phase>
<inherited>true</inherited>
<goals>
<goal>assembly</goal>
</goals>
</execution>
</executions>
</plugin>
This should work as long as you have correctly defined the filtering of the resources (which takes place later and also uses the maven-resources-plugin).
You could use the PropertyOverrideConfigurer to override the initial properties.
For example, if you have the following datasource definition in errors-config.xml:
<bean id="dataSource" class="com.mchange.v2.c3p0.ComboPooledDataSource">
<property name="driverClass" value="${hibernate.driverClassName}" />
<property name="jdbcUrl" value="${hibernate.url}" />
<property name="user" value="${hibernate.username}" />
<property name="password" value="${hibernate.password}" />
</bean>
You can override the database connection properties in the MyProject context like this :
<bean id="propertyOverrideConfigurer" class="org.springframework.beans.factory.config.PropertyOverrideConfigurer">
<property name="locations">
<list>
<value>filter.properties</value>
</list>
</property>
</bean>
And in the filter.properties file you need to specify the bean names and properties you wish to override :
datasource.driverClass = oracle.jdbc.driver.OracleDriver
datasource.jdbcUrl = jdbc:oracle:thin:@localhost:1521:xe
datasource.user = username
datasource.password = password
Hope this helps.
Is there a way to set up a second persistence.xml file in a Maven project such that it is used for testing instead of the normal one that is used for deployment?
I tried putting a persistence.xml into src/test/resources/META-INF, which gets copied into target/test-classes/META-INF, but it seems target/classes/META-INF (the copy from src/main/resources) gets preferred, despite mvn -X test listing the classpath entries in the right order:
[DEBUG] Test Classpath :
[DEBUG] /home/uqpbecke/dev/NetBeansProjects/UserManager/target/test-classes
[DEBUG] /home/uqpbecke/dev/NetBeansProjects/UserManager/target/classes
[DEBUG] /home/uqpbecke/.m2/repository/junit/junit/4.5/junit-4.5.jar
...
I would like to be able to run tests against a simple hsqldb configuration without having to change the deployment version of the JPA configuration, ideally straight after project checkout without any need for local tweaking.
The following will work for Maven 2.1+ (prior to that there wasn't a phase between test and package that you could bind an execution to).
You can use the maven-antrun-plugin to replace the persistence.xml with the test version for the duration of the tests, then restore the proper version before the project is packaged.
This example assumes the production version is src/main/resources/META-INF/persistence.xml and the test version is src/test/resources/META-INF/persistence.xml, so they will be copied to target/classes/META-INF and target/test-classes/META-INF respectively.
It would be more elegant to encapsulate this into a mojo, but as you're only copying a file, it seems like overkill.
<plugin>
<artifactId>maven-antrun-plugin</artifactId>
<version>1.3</version>
<executions>
<execution>
<id>copy-test-persistence</id>
<phase>process-test-resources</phase>
<configuration>
<tasks>
<!--backup the "proper" persistence.xml-->
<copy file="${project.build.outputDirectory}/META-INF/persistence.xml" tofile="${project.build.outputDirectory}/META-INF/persistence.xml.proper"/>
<!--replace the "proper" persistence.xml with the "test" version-->
<copy file="${project.build.testOutputDirectory}/META-INF/persistence.xml" tofile="${project.build.outputDirectory}/META-INF/persistence.xml"/>
</tasks>
</configuration>
<goals>
<goal>run</goal>
</goals>
</execution>
<execution>
<id>restore-persistence</id>
<phase>prepare-package</phase>
<configuration>
<tasks>
<!--restore the "proper" persistence.xml-->
<copy file="${project.build.outputDirectory}/META-INF/persistence.xml.proper" tofile="${project.build.outputDirectory}/META-INF/persistence.xml"/>
</tasks>
</configuration>
<goals>
<goal>run</goal>
</goals>
</execution>
</executions>
</plugin>
In an EE6/CDI/JPA project, a test src/test/resources/META-INF/persistence.xml is picked up just fine without any further configuration.
When using JPA in Spring, the following works in the application context used for testing:
<bean id="entityManagerFactory"
class="org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean">
<property name="dataSource" ref="dataSource" />
<!--
JPA requires META-INF/persistence.xml, but somehow prefers the one
in classes/META-INF over the one in test-classes/META-INF. Spring
to the rescue, as it allows for setting things differently, like by
referring to "classpath:persistence-TEST.xml". Or, simply referring
to "META-INF/persistence.xml" makes JPA use the test version too:
-->
<property name="persistenceXmlLocation" value="META-INF/persistence.xml" />
<!-- As defined in /src/test/resources/META-INF/persistence.xml -->
<property name="persistenceUnitName" value="myTestPersistenceUnit" />
<property name="jpaVendorAdapter">
<bean
class="org.springframework.orm.jpa.vendor.HibernateJpaVendorAdapter">
</bean>
</property>
</bean>
Here, /src/test/resources/META-INF/persistence.xml (copied into target/test-classes) would be preferred over /src/main/resources/META-INF/persistence.xml (copied into target/classes).
Unfortunately, the location of the persistence.xml file also determines the so-called "persistence unit root", which in turn determines which classes are scanned for @Entity annotations. So, using /src/test/resources/META-INF/persistence.xml would scan classes in target/test-classes, not classes in target/classes (where the classes that need to be tested live).
Hence, for testing, one would need to explicitly add <class> entries to persistence.xml, to avoid java.lang.IllegalArgumentException: Not an entity: class .... The need for <class> entries can be avoided by using a different file name, like persistence-TEST.xml, and put that file in the very same folder as the regular persistence.xml file. The Spring context from your test folder can then just refer to <property name="persistenceXmlLocation" value="META-INF/persistence-TEST.xml" />, and Spring will find it for you in src/main.
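A minimal persistence-TEST.xml along those lines, living next to the regular file in src/main/resources/META-INF, might look like this (the unit name and property are placeholders):

```xml
<!-- src/main/resources/META-INF/persistence-TEST.xml -->
<persistence xmlns="http://java.sun.com/xml/ns/persistence" version="2.0">
    <!-- same persistence unit root as persistence.xml, so @Entity
         classes in target/classes are still scanned automatically -->
    <persistence-unit name="myTestPersistenceUnit" transaction-type="RESOURCE_LOCAL">
        <properties>
            <property name="hibernate.hbm2ddl.auto" value="create-drop"/>
        </properties>
    </persistence-unit>
</persistence>
```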
As an alternative, one might be able to keep persistence.xml the same for the actual application and the tests, and only define one in src/main. Most configuration such as the drivers, dialect and optional credentials can be done in the Spring context instead. Also settings such as hibernate.hbm2ddl.auto can be passed in the context:
<bean id="dataSource"
class="org.springframework.jdbc.datasource.DriverManagerDataSource">
<!-- For example: com.mysql.jdbc.Driver or org.h2.Driver -->
<property name="driverClassName" value="#{myConfig['db.driver']}" />
<!-- For example: jdbc:mysql://localhost:3306/myDbName or
jdbc:h2:mem:test;DB_CLOSE_DELAY=-1 -->
<property name="url" value="#{myConfig['db.url']}" />
<!-- Ignored for H2 -->
<property name="username" value="#{myConfig['db.username']}" />
<property name="password" value="#{myConfig['db.password']}" />
</bean>
<bean id="jpaAdaptor"
class="org.springframework.orm.jpa.vendor.HibernateJpaVendorAdapter">
<!-- For example: org.hibernate.dialect.MySQL5Dialect or
org.hibernate.dialect.H2Dialect -->
<property name="databasePlatform" value="#{myConfig['db.dialect']}" />
</bean>
<bean id="entityManagerFactory"
class="org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean">
<property name="dataSource" ref="dataSource" />
<property name="jpaVendorAdapter" ref="jpaAdapter" />
<property name="jpaProperties">
<props>
<!-- For example: validate, update, create or create-drop -->
<prop key="hibernate.hbm2ddl.auto">#{myConfig['db.ddl']}</prop>
<prop key="hibernate.show_sql">#{myConfig['db.showSql']}</prop>
<prop key="hibernate.format_sql">true</prop>
</props>
</property>
</bean>
It seems multiple persistence.xml files are a general problem with JPA, solved only by classloading tricks.
A workaround that works for me is to define multiple persistence units in one persistence.xml file and then make sure that your deployment and test code use a different binding (in Spring you can set the "persistenceUnitName" property on the entity manager factory). It pollutes your deployment file with the test configuration, but if you don't mind that it works ok.
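Sketched, that single file could look like this (the unit names are arbitrary), with the Spring test context selecting the test unit:

```xml
<!-- one persistence.xml containing both units -->
<persistence xmlns="http://java.sun.com/xml/ns/persistence" version="2.0">
    <persistence-unit name="deployPU">
        <!-- production datasource and properties here -->
    </persistence-unit>
    <persistence-unit name="testPU">
        <!-- in-memory/hsqldb properties here -->
    </persistence-unit>
</persistence>
```

```xml
<!-- in the test application context, bind to the test unit -->
<bean id="entityManagerFactory"
      class="org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean">
    <property name="persistenceUnitName" value="testPU"/>
</bean>
```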
Add a persistence.xml for tests: /src/test/resources/META-INF/persistence.xml
As @Arjan said, that changes the persistence unit's root, and entity classes would be scanned in target/test-classes. To handle that, add a jar-file element to this persistence.xml:
/src/test/resources/META-INF/persistence.xml
<persistence xmlns="http://java.sun.com/xml/ns/persistence"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://java.sun.com/xml/ns/persistence http://java.sun.com/xml/ns/persistence/persistence_2_0.xsd"
version="2.0">
<persistence-unit name="com.some.project">
<provider>org.hibernate.jpa.HibernatePersistenceProvider</provider>
<jar-file>${project.basedir}/target/classes</jar-file>
<properties>
<property name="javax.persistence.jdbc.url" value="jdbc:postgresql://localhost:5432/test_database" />
<property name="javax.persistence.jdbc.driver" value="org.postgresql.Driver" />
<property name="javax.persistence.jdbc.user" value="user" />
<property name="javax.persistence.jdbc.password" value="..." />
</properties>
</persistence-unit>
</persistence>
Then, add filtering of test resources to your pom.xml:
<project>
...
<build>
...
<testResources>
<testResource>
<directory>src/test/resources</directory>
<filtering>true</filtering>
</testResource>
</testResources>
...
</build>
...
</project>
This works because jar-file can point to directories, not only to jar files.
I prefer the solution of using a different persistence.xml for testing and production, as in Rich Seller's post (thanks!!).
But you need to change:
<copy file="${project.build.outputDirectory}/META-INF/persistence.xml.proper" tofile="${project.build.outputDirectory}/META-INF/persistence.xml"/>
to:
<move file="${project.build.outputDirectory}/META-INF/persistence.xml.proper" tofile="${project.build.outputDirectory}/META-INF/persistence.xml" overwrite="true"/>
so that persistence.xml.proper does not end up embedded in the .jar file.
I tried the ClassLoaderProxy approach but had the problem that the JPA-annotated classes were not handled as persistent classes by Hibernate.
So I decided to try it without using persistence.xml. The advantage is that the Maven build and the Eclipse JUnit tests work without modifications.
I have a persistence support class for JUnit testing:
public class PersistenceTestSupport {
protected EntityManager em;
protected EntityTransaction et;
/**
* Set up the {@code EntityManager} and {@code EntityTransaction} for
* local JUnit testing.
*/
public void setup() {
Properties props = new Properties();
props.put("hibernate.hbm2ddl.auto", "create-drop");
props.put("hibernate.dialect", "org.hibernate.dialect.MySQLDialect");
props.put("hibernate.connection.url", "jdbc:mysql://localhost/db_name");
props.put("hibernate.connection.driver_class", "com.mysql.jdbc.Driver");
props.put("hibernate.connection.username", "user");
props.put("hibernate.connection.password", "****");
Ejb3Configuration cfg = new Ejb3Configuration();
em = cfg.addProperties(props)
.addAnnotatedClass(Class1.class)
.addAnnotatedClass(Class2.class)
...
.addAnnotatedClass(Classn.class)
.buildEntityManagerFactory()
.createEntityManager();
et = em.getTransaction();
}
}
My test classes just extend PersistenceTestSupport and call setup() in TestCase.setup().
The only drawback is keeping the list of persistent classes up to date, but for JUnit testing this is acceptable to me.
This answer might sound silly, but I was looking for a way to run those tests from Eclipse via Run As -> JUnit Test. This is how I did it:
@BeforeClass
public static void setUp() throws IOException {
Files.copy(new File("target/test-classes/META-INF/persistence.xml"), new File("target/classes/META-INF/persistence.xml"));
// ...
}
I'm just copying the test/persistence.xml to classes/persistence.xml. This works.
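If you'd rather not depend on a helper library for the copy, the same idea can be sketched with java.nio (the class name here is illustrative):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;

public class CopyPersistenceXml {

    // copies src over dst, creating parent directories as needed
    static void copy(Path src, Path dst) throws IOException {
        Files.createDirectories(dst.getParent());
        Files.copy(src, dst, StandardCopyOption.REPLACE_EXISTING);
    }

    public static void main(String[] args) throws IOException {
        Path src = Paths.get("target/test-classes/META-INF/persistence.xml");
        // only copy when the test resource actually exists
        if (Files.exists(src)) {
            copy(src, Paths.get("target/classes/META-INF/persistence.xml"));
        }
    }
}
```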
Keep two copies of the persistence.xml file: one for testing and another for the normal build.
The default lifecycle copies the build persistence.xml to src/test/resources/META-INF.
Create a separate profile which, when run, copies the testing persistence.xml to src/test/resources/META-INF instead.
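A hedged sketch of such a profile using the maven-resources-plugin; the profile id, phase, and source directory are illustrative:

```xml
<profile>
    <id>test-persistence</id>
    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-resources-plugin</artifactId>
                <executions>
                    <execution>
                        <id>copy-test-persistence</id>
                        <phase>process-test-resources</phase>
                        <goals>
                            <goal>copy-resources</goal>
                        </goals>
                        <configuration>
                            <outputDirectory>${project.build.testOutputDirectory}/META-INF</outputDirectory>
                            <resources>
                                <resource>
                                    <!-- assumed location of the testing copy -->
                                    <directory>src/test/persistence</directory>
                                    <includes>
                                        <include>persistence.xml</include>
                                    </includes>
                                </resource>
                            </resources>
                        </configuration>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>
</profile>
```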
persistence.xml is used as the starting point to search for entity classes, unless you list all classes explicitly and add <exclude-unlisted-classes>.
So if you want to override this file with another one, say from src/test/resources, you have to specify every single entity class in this second persistence.xml; otherwise no entity class will be found.
Another solution would be to overwrite the file using the maven-resources-plugin ('copy-resources' goal). But then you have to overwrite it twice: once for testing (e.g. phase process-test-classes) and once for the real packaging (phase 'prepare-package').
This is an extension of Rich Seller's answer, with proper handling of Hibernate finding multiple persistence.xml files on the classpath and restoration of the pre-testing state.
Setup:
Create one persistence file for deployment/packaging and one for testing:
src/main/resources/persistence.xml
src/test/resources/persistence-testing.xml
in your pom.xml add this to the plugins section:
<plugin>
<artifactId>maven-antrun-plugin</artifactId>
<version>1.3</version>
<executions>
<execution>
<id>copy-test-persistence</id>
<phase>process-test-resources</phase>
<configuration>
<tasks>
<echo>renaming deployment persistence.xml</echo>
<move file="${project.build.outputDirectory}/META-INF/persistence.xml" tofile="${project.build.outputDirectory}/META-INF/persistence.xml.proper"/>
<echo>replacing deployment persistence.xml with test version</echo>
<copy file="${project.build.testOutputDirectory}/META-INF/persistence-testing.xml" tofile="${project.build.outputDirectory}/META-INF/persistence.xml" overwrite="true"/>
</tasks>
</configuration>
<goals>
<goal>run</goal>
</goals>
</execution>
<execution>
<id>restore-persistence</id>
<phase>prepare-package</phase>
<configuration>
<tasks>
<echo>restoring the deployment persistence.xml</echo>
<move file="${project.build.outputDirectory}/META-INF/persistence.xml.proper" tofile="${project.build.outputDirectory}/META-INF/persistence.xml" overwrite="true"/>
</tasks>
</configuration>
<goals>
<goal>run</goal>
</goals>
</execution>
</executions>
</plugin>
Advantages over other solutions
No extra Java code required
Only one persistence.xml on classpath
Both building and testing work as expected
Descriptive output on the console (echo)
For packaging the state is 100% restored. No leftover files
I'm trying to do the same thing. I have a solution that works for me - yours may vary (and you might not love the solution... it's a bit low-level).
I came across an article on the net where they used a custom class loader to do something similar, which served as inspiration. Suggestions for improvement are welcome, btw. For deployment I rely on container injection of the EntityManager, but for testing I create it myself with this code:
final Thread currentThread = Thread.currentThread();
final ClassLoader saveClassLoader = currentThread.getContextClassLoader();
currentThread.setContextClassLoader(new ClassLoaderProxy(saveClassLoader));
EntityManagerFactory emFactory = Persistence.createEntityManagerFactory("test");
em = emFactory.createEntityManager();
Then the ClassLoaderProxy is about as minimal as you can get and just redirects requests for META-INF/persistence.xml to META-INF/test-persist.xml:
public class ClassLoaderProxy extends ClassLoader {
public ClassLoaderProxy(final ClassLoader parent) {
super(parent);
}
@Override
public Enumeration<URL> getResources(final String name) throws IOException {
if (!"META-INF/persistence.xml".equals(name)) {
return super.getResources(name);
} else {
System.out.println("Redirecting persistence.xml to test-persist.xml");
return super.getResources("META-INF/test-persist.xml");
}
}
}
Just to explain this a bit more:
There are two persistence.xml files (one named persistence.xml that is used outside testing and one named test-persist.xml that is used for tests).
The custom class loader is only active for unit tests (for deployment everything is normal)
The custom class loader redirects requests for "META-INF/persistence.xml" to the test version ("META-INF/test-persist.xml").
I was originally hitting some problems because Hibernate will revert (somehow) to the classloader that was used to load Hibernate (at least I think that's what was going on). I found that putting the classloader-switching code (the first block) in a static block in your test case gets it loaded before Hibernate, but depending on your unit test structure you may also need to put the same code in other places (yuck).
Another approach is to use a separate persistence.xml for testing (test/../META-INF/persistence.xml) but override the scanner as follows:
The testing persistence.xml needs to contain
<property name="hibernate.ejb.resource_scanner" value = "...TestScanner" />
The code for the new class TestScanner is as follows:
import java.lang.annotation.Annotation;
import java.net.MalformedURLException;
import java.net.URL;
import java.util.Set;
import org.hibernate.ejb.packaging.NamedInputStream;
import org.hibernate.ejb.packaging.NativeScanner;
public class TestScanner extends NativeScanner
{
@Override
public Set <Class <?> >
getClassesInJar (URL jar, Set <Class <? extends Annotation> > annotations)
{ return super.getClassesInJar (getUpdatedURL (jar), annotations); }
@Override
public Set <NamedInputStream>
getFilesInJar (URL jar, Set <String> patterns)
{ return super.getFilesInJar (getUpdatedURL (jar), patterns); }
@Override
public Set <Package>
getPackagesInJar (URL jar, Set <Class <? extends Annotation> > annotations)
{ return super.getPackagesInJar (getUpdatedURL (jar), annotations); }
private URL getUpdatedURL (URL url)
{
String oldURL = url.toExternalForm ();
String newURL = oldURL.replaceAll ("test-classes", "classes");
URL result;
try {
result = newURL.equals (oldURL) ? url : new URL (newURL);
} catch (MalformedURLException e) {
result = url; // fall back to the original URL
}
return result;
}
}
When using OpenEJB, persistence.xml can be overridden with alternate descriptors: http://tomee.apache.org/alternate-descriptors.html
Another option for this use case would be adding multiple persistence units, one for, say, production and another for testing, and injecting the EntityManagerFactory accordingly.
Place both persistence units into the persistence.xml of the actual project and have your test cases inject the correct EntityManager. The example below illustrates how to do that with Guice. Note that I've left in some Mockito mocking for completeness; the Mockito-specific code is marked accordingly and is not required for the injection.
import static org.mockito.Mockito.doNothing;
import static org.mockito.Mockito.spy;
import static org.mockito.Mockito.when;

import javax.persistence.EntityManager;
import javax.persistence.EntityManagerFactory;
import javax.persistence.EntityTransaction;
import javax.persistence.Persistence;
import javax.persistence.criteria.CriteriaBuilder;

import com.google.inject.AbstractModule;
import com.google.inject.Provides;
import com.google.inject.Singleton;

public class HibernateTestDatabaseProvider extends AbstractModule {
    private static final ThreadLocal<EntityManager> ENTITYMANAGER_CACHE = new ThreadLocal<>();

    @Override
    public void configure() {
    }

    @Provides
    @Singleton
    public EntityManagerFactory provideEntityManagerFactory() {
        return Persistence.createEntityManagerFactory("my.test.persistence.unit");
    }

    @Provides
    public CriteriaBuilder provideCriteriaBuilder(EntityManagerFactory entityManagerFactory) {
        return entityManagerFactory.getCriteriaBuilder();
    }

    @Provides
    public EntityManager provideEntityManager(EntityManagerFactory entityManagerFactory) {
        EntityManager entityManager = ENTITYMANAGER_CACHE.get();
        if (entityManager == null) {
            // prevent commits on the database; requires Mockito, not relevant for this answer
            entityManager = spy(entityManagerFactory.createEntityManager());
            EntityTransaction et = spy(entityManager.getTransaction());
            when(entityManager.getTransaction()).thenReturn(et);
            doNothing().when(et).commit();
            ENTITYMANAGER_CACHE.set(entityManager);
        }
        return entityManager;
    }
}
Put the tests in their own Maven project with its own persistence.xml.
I'd suggest using different Maven profiles, where you filter your database.properties files and have one database.properties per profile.
This way you don't have to keep duplicates of any other configuration files except for the .properties file.
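As an illustration, a profile-specific filter file could look like the sketch below (the keys and the H2 values are made up, not taken from the original setup); placeholders such as ${jdbc.url} in the filtered resources are then replaced with the values of the active profile during the build:

```properties
# src/main/resources/META-INF/spring/profiles/integration/database.properties
jdbc.driver=org.h2.Driver
jdbc.url=jdbc:h2:mem:test;DB_CLOSE_DELAY=-1
jdbc.username=sa
jdbc.password=sa
```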
<properties>
<!-- Used to locate the profile specific configuration file. -->
<build.profile.id>default</build.profile.id>
<!-- Only unit tests are run by default. -->
<skip.integration.tests>true</skip.integration.tests>
<skip.unit.tests>false</skip.unit.tests>
<integration.test.files>**/*IT.java</integration.test.files>
</properties>
<profiles>
<profile>
<id>default</id>
<activation>
<activeByDefault>true</activeByDefault>
</activation>
<properties>
<!--
Specifies the build profile id, which is used to find out the correct properties file.
This is not actually necessary for this example, but it can be used for other purposes.
-->
<build.profile.id>default</build.profile.id>
<skip.integration.tests>true</skip.integration.tests>
<skip.unit.tests>false</skip.unit.tests>
</properties>
<build>
<filters>
<!--
Specifies path to the properties file, which contains profile specific
configuration. In this case, the configuration file should be the default spring/database.properties file
-->
<filter>src/main/resources/META-INF/spring/database.properties</filter>
</filters>
<resources>
<!--
Placeholders found from files located in the configured resource directories are replaced
with values found from the profile specific configuration files.
-->
<resource>
<filtering>true</filtering>
<directory>src/main/resources</directory>
<!--
You can also include only specific files found from the configured directory or
exclude files. This can be done by uncommenting following sections and adding
the configuration under includes and excludes tags.
-->
<!--
<includes>
<include></include>
</includes>
<excludes>
<exclude></exclude>
</excludes>
-->
</resource>
</resources>
</build>
</profile>
<profile>
<id>integration</id>
<properties>
<!--
Specifies the build profile id, which is used to find out the correct properties file.
This is not actually necessary for this example, but it can be used for other purposes.
-->
<build.profile.id>integration</build.profile.id>
<skip.integration.tests>false</skip.integration.tests>
<skip.unit.tests>true</skip.unit.tests>
</properties>
<build>
<filters>
<!--
Specifies path to the properties file, which contains profile specific
configuration. In this case, the configuration file is searched
from spring/profiles/it/ directory.
-->
<filter>src/main/resources/META-INF/spring/profiles/${build.profile.id}/database.properties</filter>
</filters>
<resources>
<!--
Placeholders found from files located in the configured resource directories are replaced
with values found from the profile specific configuration files.
-->
<resource>
<filtering>true</filtering>
<directory>src/main/resources</directory>
<!--
You can also include only specific files found from the configured directory or
exclude files. This can be done by uncommenting following sections and adding
the configuration under includes and excludes tags.
-->
<!--
<includes>
<include></include>
</includes>
<excludes>
<exclude></exclude>
</excludes>
-->
</resource>
</resources>
</build>
</profile>
</profiles>
With the help of Surefire for unit tests and Failsafe for integration tests, you're done.
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>2.12</version>
<configuration>
<junitArtifactName>org.junit:com.springsource.org.junit</junitArtifactName>
<!--see: https://issuetracker.springsource.com/browse/EBR-220-->
<printSummary>false</printSummary>
<redirectTestOutputToFile>true</redirectTestOutputToFile>
<!-- Skips unit tests if the value of skip.unit.tests property is true -->
<skipTests>${skip.unit.tests}</skipTests>
<!-- Excludes integration tests when unit tests are run. -->
<excludes>
<exclude>${integration.test.files}</exclude>
</excludes>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-failsafe-plugin</artifactId>
<version>2.12</version>
<configuration>
<!-- Skips integration tests if the value of skip.integration.tests property is true -->
<skipTests>${skip.integration.tests}</skipTests>
<includes>
<include>${integration.test.files}</include>
</includes>
<forkMode>once</forkMode>
<!--
<reuseForks>false</reuseForks>
<forkCount>1</forkCount>
-->
</configuration>
<executions>
<execution>
<id>integration-test</id>
<goals>
<goal>integration-test</goal>
</goals>
</execution>
<execution>
<id>verify</id>
<goals>
<goal>verify</goal>
</goals>
</execution>
</executions>
</plugin>
Now you only need mvn test for your unit tests and mvn verify -Pintegration for your integration tests.
Obviously you should create the database.properties files in the paths specified in the profiles (or elsewhere, adjusting the paths accordingly).
Based on this reference: http://www.petrikainulainen.net/programming/tips-and-tricks/creating-profile-specific-configuration-files-with-maven/
I found two possibilities that don't require changing the classloader, using other Maven plugins/profiles, or copying and overwriting files.
TL;DR: check the provider name.
At first I started to construct the EntityManagerFactory programmatically, as described here: create entity manager programmatically without persistence file.
So I did something very similar:
@BeforeClass
public static void prepare() {
    Map<String, Object> configOverrides = new HashMap<>();
    configOverrides.put("hibernate.connection.driver_class", "org.h2.Driver");
    configOverrides.put("hibernate.connection.url", "jdbc:h2:mem:test;DB_CLOSE_DELAY=-1");
    configOverrides.put("hibernate.connection.username", "sa");
    configOverrides.put("hibernate.connection.password", "sa");
    configOverrides.put("hibernate.dialect", "org.hibernate.dialect.H2Dialect");
    configOverrides.put("hibernate.show_sql", "true");
    configOverrides.put("hibernate.hbm2ddl.auto", "validate");
    factory = new HibernatePersistence().createContainerEntityManagerFactory(
            new CustomPersistenceUnitInfo(), configOverrides
    );
    // factory = Persistence.createEntityManagerFactory("test");
    assertNotNull(factory);
}
...
private static class CustomPersistenceUnitInfo implements PersistenceUnitInfo {
    @Override
    public String getPersistenceUnitName() {
        return "test";
    }

    @Override
    public String getPersistenceProviderClassName() {
        return "org.hibernate.jpa.HibernatePersistenceProvider";
        // <------------ note here: this is wrong!
    }

    @Override
    public PersistenceUnitTransactionType getTransactionType() {
        return PersistenceUnitTransactionType.RESOURCE_LOCAL;
    }

    @Override
    public DataSource getJtaDataSource() {
        return null;
    }

    @Override
    public DataSource getNonJtaDataSource() {
        return null;
    }

    @Override
    public List<String> getMappingFileNames() {
        return Collections.emptyList();
    }

    @Override
    public List<URL> getJarFileUrls() {
        try {
            return Collections.list(this.getClass()
                    .getClassLoader()
                    .getResources(""));
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    @Override
    public URL getPersistenceUnitRootUrl() {
        return null;
    }

    @Override
    public List<String> getManagedClassNames() {
        return Arrays.asList(
                "com.app.Entity1",
                "com.app.Entity2"
        );
    }

    @Override
    public boolean excludeUnlistedClasses() {
        return true;
    }

    @Override
    public SharedCacheMode getSharedCacheMode() {
        return null;
    }

    @Override
    public ValidationMode getValidationMode() {
        return null;
    }

    @Override
    public Properties getProperties() {
        return null;
    }

    @Override
    public String getPersistenceXMLSchemaVersion() {
        return null;
    }

    @Override
    public ClassLoader getClassLoader() {
        return null;
    }

    @Override
    public void addTransformer(final ClassTransformer classTransformer) {
    }

    @Override
    public ClassLoader getNewTempClassLoader() {
        return null;
    }
}
But then I found it still returns null. Why?
Then I found that when the org.hibernate.ejb.HibernatePersistence class is used, the provider should not be org.hibernate.jpa.HibernatePersistenceProvider but org.hibernate.ejb.HibernatePersistence. The HibernatePersistenceProvider class is not even found by IDEA's "Open Class", even though it appears in the main persistence.xml.
In Ejb3Configuration.class I found:
integration = integration != null ? Collections.unmodifiableMap(integration) : CollectionHelper.EMPTY_MAP;
String provider = (String) integration.get("javax.persistence.provider");
if (provider == null) {
    provider = info.getPersistenceProviderClassName();
}
// IMPLEMENTATION_NAME is HibernatePersistence.class.getName(),
// i.e. "org.hibernate.ejb.HibernatePersistence"
if (provider != null && !provider.trim().startsWith(IMPLEMENTATION_NAME)) {
    LOG.requiredDifferentProvider(provider);
    return null;
} else {
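The effect of that check boils down to a simple prefix comparison, sketched below with only the JDK (the class and method names are illustrative; the constant mirrors Hibernate's HibernatePersistence.class.getName()):

```java
public class ProviderCheckSketch {
    // Same comparison Ejb3Configuration performs: the configured provider
    // must start with Hibernate's own implementation name, or be absent.
    static final String IMPLEMENTATION_NAME = "org.hibernate.ejb.HibernatePersistence";

    static boolean accepted(String provider) {
        return provider == null || provider.trim().startsWith(IMPLEMENTATION_NAME);
    }

    public static void main(String[] args) {
        // the ejb provider name passes the check
        System.out.println(accepted("org.hibernate.ejb.HibernatePersistence")); // true
        // the jpa provider name fails it, so Ejb3Configuration returns null
        System.out.println(accepted("org.hibernate.jpa.HibernatePersistenceProvider")); // false
    }
}
```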
So I went back to the first solution with persistence.xml, changed the provider name, and now it works. It seems that even if the provider in the main persistence.xml is the jpa.* one, in the tests it is not accepted.
So, in summary, three things to check:
1. Turn on -X in Maven to check whether maven-resources-plugin really copied your src/test/resources/META-INF/persistence.xml into target/test-classes (I think this never fails).
2. Check whether hibernate-entitymanager is on your classpath (you can check with mvn dependency:tree -Dincludes=org.hibernate:hibernate-entitymanager).
3. Check the provider's name, the most important point. It should be org.hibernate.ejb.HibernatePersistence.
<persistence version="2.0"
xmlns="http://java.sun.com/xml/ns/persistence" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://java.sun.com/xml/ns/persistence http://java.sun.com/xml/ns/persistence/persistence_2_0.xsd">
<persistence-unit name="test" transaction-type="RESOURCE_LOCAL">
<provider>org.hibernate.ejb.HibernatePersistence</provider>
<class>com.app.model.Company</class>
...