Configure HSQLDB with maven - java

I'm developing a Spring application where everything is configured with maven (in pom.xml). My application uses a PostgreSQL database, but unit tests use an in-memory HSQLDB database.
I just ran into an issue with TEXT columns, because they are not natively supported by HSQLDB. In my entity class I have:
@Column(columnDefinition = "text")
private String propertyName;
This works fine with Postgres, but HSQLDB generates the following error: type not found or user lacks privilege: TEXT. The table is not created, and as a result most of my tests fail.
I found that I need to activate PostgreSQL compatibility in order for this to work by setting sql.syntax_pgs to true.
My question is: where do I put this setting? I would like to put it in pom.xml, because everything is configured there, but I don't know where.
For example, I have:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<configuration>
<argLine>-Dspring.profiles.active=test</argLine>
</configuration>
</plugin>
Can I somehow add an <argLine> with this setting?

When you add the HSQLDB dependency, it uses default connection properties. You can override these properties in a properties file or through other configuration, as needed. You can append sql.syntax_pgs=true to the HSQLDB connection URL. For example, with Spring Boot this would look like the following:
<plugin>
<artifactId>maven-surefire-plugin</artifactId>
<configuration>
<argLine>-Dspring.datasource.url=jdbc:hsqldb:mem:PUBLIC;sql.syntax_pgs=true</argLine>
</configuration>
</plugin>
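If you are on Spring Boot and already activate the test profile through surefire, both system properties can go into the same argLine, for example:
<argLine>-Dspring.profiles.active=test -Dspring.datasource.url=jdbc:hsqldb:mem:PUBLIC;sql.syntax_pgs=true</argLine>
This assumes Spring Boot builds the DataSource from the spring.datasource.* properties; if you configure the DataSource bean yourself, set the URL on that bean instead (see the next answer).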

You can also set it in the DataSource configuration, as shown here:
<bean class="org.apache.commons.dbcp.BasicDataSource" destroy-method="close" id="dataSource">
<property name="driverClassName" value="org.hsqldb.jdbcDriver" />
<property name="url" value="jdbc:hsqldb:mem:PUBLIC;sql.syntax_pgs=true" />
<property name="username" value="sa" />
<property name="password" value="" />
</bean>
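If this should apply only to your unit tests, you can put the bean into the test profile that you already activate through surefire. A sketch, assuming XML-based Spring configuration (Spring 3.1 or newer; nested <beans> elements must come at the end of the file):
<beans profile="test">
    <bean class="org.apache.commons.dbcp.BasicDataSource" destroy-method="close" id="dataSource">
        <property name="driverClassName" value="org.hsqldb.jdbcDriver" />
        <property name="url" value="jdbc:hsqldb:mem:PUBLIC;sql.syntax_pgs=true" />
        <property name="username" value="sa" />
        <property name="password" value="" />
    </bean>
</beans>
That way production keeps its PostgreSQL datasource and only the tests see the HSQLDB one.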

Related

Eclipselink static weaving in Java web application - Eclipse

I'm currently working on a project where I develop a Java web application. I use the Eclipse IDE.
As the data layer I use EclipseLink (2.6) JPA. The web application runs on a Tomcat (7) web server. Now I have realized that I need to use LAZY fetching for my entities because of performance issues.
After some research I figured out that I need to use "static weaving"; according to the manual pages, I have 3 possibilities for how to do that: Ant, Maven or the command line.
Since I have no experience with Ant, Maven or the command-line options, I don't know how to continue now. I would like to pick the easiest solution, which is Ant (from my beginner point of view). Can you suggest one?
My project is divided into two projects:
a JPA project, with persistence entities and database operations
a Java application with servlets and JSP; this project contains a link to the JPA project
I run the application on:
a remote Tomcat server - I generate a .war file that contains both projects and upload it to the server
localhost - I run the application directly from Eclipse (Run As -> localhost)
Can somebody please tell me how I should proceed?
Should I set up a two-step Ant build that first creates a .jar from my JPA project, then does the static weaving, and then builds the web project into a .war? So far I have only found Ant builds that generate .jar applications, and I'm not sure how this differs for web applications. Are there any tutorials?
Can somebody please share some hints? I am a complete beginner in this area - perhaps I have missed some easy way. My main goal is to have a .war file that contains statically woven entities; my secondary goal is to automate deployment to the local Tomcat server, as I do it now from the Eclipse IDE.
Thank you.
I can only speak for the Maven side of things, because that's the one I have experience with.
Inside your pom.xml, under plugins, you will have to add the staticweave Maven plugin:
<plugin>
<groupId>de.empulse.eclipselink</groupId>
<artifactId>staticweave-maven-plugin</artifactId>
<executions>
<execution>
<phase>process-classes</phase>
<goals>
<goal>weave</goal>
</goals>
<configuration>
<persistenceXMLLocation>META-INF/persistence.xml</persistenceXMLLocation>
</configuration>
</execution>
</executions>
</plugin>
You may have to adjust the persistence.xml location.
Inside your persistence.xml you will need to activate static weaving:
<properties>
<property name="eclipselink.target-database" value="org.eclipse.persistence.platform.database.H2Platform" />
<property name="eclipselink.weaving" value = "static"/>
<property name="eclipselink.weaving.internal" value="true"/>
<property name="eclipselink.weaving.lazy" value="true" />
<property name="eclipselink.weaving.changetracking" value="true" />
<property name="eclipselink.weaving.fetchgroups" value="true" />
<property name="eclipselink.weaving.eager" value="false" />
<property name="eclipselink.ddl-generation" value="drop-and-create-tables" />
<property name="eclipselink.ddl-generation.output-mode" value="database" />
<property name="eclipselink.logging.level" value="FINEST" />
</properties>
Again, you may have to change your platform, desired logging level and the other parameters.
If you leave all the eclipselink.weaving.* parameters out, they will keep their default values.
The static weaving happens in the process-classes phase of the Maven lifecycle, after the compile phase. If you have the packaging set to war, you will get a single .war file that you can then deploy.
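Since your entities live in a separate JPA project, a common way to map this onto Maven is a two-module build: the JPA module carries the entities, the persistence.xml and the staticweave plugin and is packaged as a jar, while the web module depends on it and is packaged as a war. A sketch (module and artifact names below are made up):
<!-- parent pom.xml -->
<modules>
    <module>myapp-jpa</module>
    <module>myapp-web</module>
</modules>

<!-- in myapp-web/pom.xml (packaging set to war) -->
<dependency>
    <groupId>com.example</groupId>
    <artifactId>myapp-jpa</artifactId>
    <version>${project.version}</version>
</dependency>
Running mvn package on the parent then compiles and weaves the JPA module first and packages the woven jar inside the war.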
If you would really rather do it via an Ant task, you should look at
https://wiki.eclipse.org/EclipseLink/UserGuide/JPA/Advanced_JPA_Development/Performance/Weaving/Static_Weaving#Use_the_weave_Ant_Task
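For reference, the weave task described there looks roughly like this (a sketch only; the jar names and classpath entries are placeholders, so check the linked page for the full set of attributes):
<!-- rough sketch of the EclipseLink static weaving Ant task -->
<taskdef name="weave" classname="org.eclipse.persistence.tools.weaving.jpa.StaticWeaveAntTask"/>
<target name="weave-entities">
    <weave source="myapp-jpa.jar" target="myapp-jpa-woven.jar">
        <classpath>
            <pathelement path="eclipselink.jar"/>
        </classpath>
    </weave>
</target>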
I can't really answer any questions about that though.

Wildfly Data Persistence

I am currently working on a Java EE project and am working with the WildFly server.
I have a web project and an EJB project, which are deployed onto the WildFly server.
I can save a user, for example, but only for as long as the server is running.
No data persists across server restarts.
I have searched through the internet but couldn't find an answer.
My persistence.xml looks like this:
<persistence-unit name="primary">
<!-- If you are running in a production environment, add a managed
data source, this example data source is just for development and testing! -->
<!-- The datasource is deployed as WEB-INF/kitchensink-quickstart-ds.xml, you
can find it in the source at src/main/webapp/WEB-INF/kitchensink-quickstart-ds.xml -->
<jta-data-source>java:jboss/datasources/ExampleDS</jta-data-source>
<properties>
<!-- Properties for Hibernate -->
<property name="hibernate.hbm2ddl.auto" value="create-drop" />
<property name="hibernate.show_sql" value="false" />
</properties>
If I want to persist any information, do I need to reconfigure this file?
I hope you can help me :)
Your problem is this line
<property name="hibernate.hbm2ddl.auto" value="create-drop" />
Every time WildFly starts up, JPA drops and recreates the schema, so you start with an empty database.
Adjust your configuration to use update, which keeps the existing schema and data and only applies incremental changes:
<property name="hibernate.hbm2ddl.auto" value="update" />
You are using "ExampleDS", which is set up as an H2 in-memory database by default. It therefore does not persist data between restarts on purpose (useful for development/testing). Go to WildFly's standalone/configuration/standalone.xml configuration file and search for "ExampleDS" in the "datasources" section. It should show:
<connection-url>jdbc:h2:mem:test;DB_CLOSE_DELAY=-1;DB_CLOSE_ON_EXIT=FALSE</connection-url>
where "mem" means in-memory. You can change "mem:test" to any write path, e.g.
<connection-url>jdbc:h2:~/test;DB_CLOSE_DELAY=-1</connection-url>
to use a H2 file-based database stored as "test" in your home-folder (assuming *nix).
You can also define additional databases (PostgreSQL, Oracle, etc.) in the datasources section.
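As a sketch of what such an additional datasource could look like in standalone.xml (the JNDI name, credentials and driver module below are placeholders, and the JDBC driver has to be installed as a WildFly module or deployment first):
<datasource jndi-name="java:jboss/datasources/MyAppDS" pool-name="MyAppDS" enabled="true">
    <connection-url>jdbc:postgresql://localhost:5432/myapp</connection-url>
    <driver>postgresql</driver>
    <security>
        <user-name>myapp</user-name>
        <password>secret</password>
    </security>
</datasource>
<!-- and, in the existing <drivers> section: -->
<driver name="postgresql" module="org.postgresql"/>
Your persistence.xml would then point its jta-data-source at java:jboss/datasources/MyAppDS instead of the ExampleDS.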

JPA MetaDataException for different datasources

I have a problem with the JPA Criteria API when using different datasources for persistence in my project.
There are two persistence units using different datasources:
<persistence-unit name="analysis" transaction-type="RESOURCE_LOCAL">
<provider>org.apache.openjpa.persistence.PersistenceProviderImpl</provider>
<non-jta-data-source>AnalysisDS</non-jta-data-source>
<class>entity1</class>
<class>entity2</class>
<class>entity3</class>
and
<persistence-unit name="reaction" transaction-type="RESOURCE_LOCAL">
<provider>org.apache.openjpa.persistence.PersistenceProviderImpl</provider>
<non-jta-data-source>ReactionDS</non-jta-data-source>
<class>someEntity1</class>
<class>someEntity2</class>
<class>someEntity3</class>
Spring loads them in the applicationContext:
<bean id="defaultAnalysysDataSource"
class="org.springframework.jndi.JndiObjectFactoryBean"
lazy-init="default">
<property name="jndiName" value="AnalysisDS"/>
<property name="lookupOnStartup" value="false"/>
<property name="cache" value="true"/>
<property name="proxyInterface" value="javax.sql.DataSource"/>
</bean>
<bean id="defaultReactionDataSource"
class="org.springframework.jndi.JndiObjectFactoryBean"
lazy-init="default">
<property name="jndiName" value="ReactionDS"/>
<property name="lookupOnStartup" value="false"/>
<property name="cache" value="true"/>
<property name="proxyInterface" value="javax.sql.DataSource"/>
</bean>
In my DAO I can work with these persistence units through an EntityManager; for example, for ReactionDS I'm using:
@PersistenceContext(unitName = "reaction")
private EntityManager entityManager;
And everything works: simple queries and JPQL expressions.
But when I want to introduce the JPA Criteria API into my DAO, like this:
CriteriaBuilder cb = entityManager.getCriteriaBuilder();
...
I get the following exception when getCriteriaBuilder() is called:
Caused by: <openjpa-2.4.0-r422266:1674604 fatal user error> org.apache.openjpa.util.MetaDataException: Errors encountered while resolving metadata. See nested exceptions for details.
at org.apache.openjpa.meta.MetaDataRepository.resolve(MetaDataRepository.java:675)
at org.apache.openjpa.meta.MetaDataRepository.getMetaDataInternal(MetaDataRepository.java:418)
at org.apache.openjpa.meta.MetaDataRepository.getMetaData(MetaDataRepository.java:389)
at org.apache.openjpa.persistence.meta.MetamodelImpl.<init>(MetamodelImpl.java:86)
at org.apache.openjpa.persistence.EntityManagerFactoryImpl.getMetamodel(EntityManagerFactoryImpl.java:348)
at org.apache.openjpa.persistence.EntityManagerFactoryImpl.getCriteriaBuilder(EntityManagerFactoryImpl.java:332)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
...
Caused by: <openjpa-2.4.0-r422266:1674604 fatal user error> org.apache.openjpa.util.MetaDataException: Table "ANALYSIS.ENTITY1" given for "entity1" does not exist.
at org.apache.openjpa.jdbc.meta.MappingInfo.createTable(MappingInfo.java:532)
at org.apache.openjpa.jdbc.meta.ClassMappingInfo.getTable(ClassMappingInfo.java:317)
at org.apache.openjpa.jdbc.meta.ClassMappingInfo.getTable(ClassMappingInfo.java:339)
at org.apache.openjpa.jdbc.meta.strats.FullClassStrategy.map(FullClassStrategy.java:73)
at org.apache.openjpa.jdbc.meta.ClassMapping.setStrategy(ClassMapping.java:392)
at org.apache.openjpa.jdbc.meta.RuntimeStrategyInstaller.installStrategy(RuntimeStrategyInstaller.java:55)
at org.apache.openjpa.jdbc.meta.MappingRepository.prepareMapping(MappingRepository.java:410)
at org.apache.openjpa.meta.MetaDataRepository.preMapping(MetaDataRepository.java:769)
at org.apache.openjpa.meta.MetaDataRepository.resolve(MetaDataRepository.java:658)
... 147 more
The root cause is in JPA: it tries to use the tables from the analysis schema in the reaction PU, because it resolves metamodel classes for all entities, including those that live in a different datasource, while accessing them all through one.
But when I granted SELECT on Entity1 to ReactionDS, everything worked (because I could then run SELECT * FROM Analysis.Entity1 from the reaction datasource).
The question: how do I make the metamodel classes work only within the datasource of the EntityManager being used (in this example Reaction, not Analysis as well)?
P.S. The database is Oracle, running on WebLogic 12.1.3 with OpenJPA 2.4.
The metamodel is generated automatically with a Maven plugin at compile time:
<plugin>
<groupId>org.bsc.maven</groupId>
<artifactId>maven-processor-plugin</artifactId>
<executions>
<execution>
<id>process</id>
<goals>
<goal>process</goal>
</goals>
<phase>generate-sources</phase>
<configuration>
<processors>
<processor>org.apache.openjpa.persistence.meta.AnnotationProcessor6</processor>
</processors>
<optionMap>
<openjpa.metamodel>true</openjpa.metamodel>
</optionMap>
</configuration>
</execution>
</executions>
<dependencies>
<dependency>
<groupId>org.apache.openjpa</groupId>
<artifactId>openjpa</artifactId>
<version>${openjpa.version}</version>
</dependency>
</dependencies>
</plugin>
I think you may be confused by your Spring Framework datasource declarations.
These beans do not define your datasources; they only provide a way for other Spring components to access the datasources that have been configured in your server. JPA does not use them at all.
Therefore, your problem lies in the datasources that you have defined in your WebLogic server. It looks like you have defined both datasources to reference the same database instance.

How do I make logback read a properties file whose name is a variable?

I am using http://logback.qos.ch/
I am running my Java process with a parameter, for example -Dproperties.url=myappproperties-production.properties or -Dproperties.url=myappproperties-development.properties, depending on the environment it is run in.
Problem: how do I make logback pick up my properties file?
If the properties file name is static I would do (works fine):
<configuration>
<property resource="myappproperties-development.properties" />
(...)
</configuration>
But I need something that is dynamic (this does not work):
<configuration>
<property resource="${properties.url}" />
(...)
</configuration>
The value of the resource attribute can itself be a variable. In other words,
<configuration debug="true">
<property resource="${properties.url}" />
(...)
</configuration>
should work. BTW, set the debug attribute of the <configuration> element to true to see logback's internal messages on the console. Which version of logback are you using?
How would you handle this if the properties URL value were in inverted commas?
e.g.
-Dproperties.url="myappproperties-production.properties"
I just tried this and it didn't work.
The reason I ask is that the latest version of the Amazon AMI in Amazon Elastic Beanstalk adds inverted commas around a property value - even if the user doesn't specify this.
This is the Spring-recommended way (note that <springProfile> sections are only processed when the configuration lives in logback-spring.xml rather than logback.xml, so that the Spring environment is available):
<configuration>
<springProfile name="local">
<!-- configuration to be enabled when the "local" profile is active -->
<property resource="application-local.properties" />
</springProfile>
<springProfile name="dev">
<!-- configuration to be enabled when the "dev" profile is active -->
<property resource="application-dev.properties" />
</springProfile>
</configuration>
Reference docs: https://docs.spring.io/spring-boot/docs/current/reference/html/boot-features-logging.html

How to use HSQLdb with Ibatis

I want to use an in-memory database to query data in my unit tests. My project uses iBatis (with annotations) for querying the actual database, which I want to mimic with HSQLDB.
Please help me with how to configure iBatis with HSQLDB.
Also, is there a better way to unit test code whose functions depend strongly on the database?
You can make an iBatis sqlMappings.xml config file something like this:
<sql-map-config>
<properties resource="configuration.properties" />
<!--The datasource for you application is configured here: -->
<datasource name = "hsql"
factory-class="com.ibatis.db.sqlmap.datasource.SimpleDataSourceFactory"
default="true">
<property name="JDBC.Driver" value=""/>
<property name="JDBC.ConnectionURL" value=""/>
<property name="JDBC.Username" value=""/>
<property name="JDBC.Password" value=""/>
</datasource>
<!--Declare the SQL Maps to be loaded for this application.
Be sure it's in your classpath. -->
<sql-map resource="maps/beanMappings.xml"/>
</sql-map-config>
plus a configuration.properties file like this:
JDBC.Driver=org.hsqldb.jdbcDriver
JDBC.ConnectionURL=jdbc:hsqldb:hsql://localhost/myDb
JDBC.Username=sa
JDBC.Password=
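Note that jdbc:hsqldb:hsql://localhost/myDb connects to a separately running HSQLDB server; for unit tests against a purely in-memory database you would normally use an in-memory URL such as jdbc:hsqldb:mem:testdb instead ("testdb" is just a placeholder name), so the database lives only as long as the test JVM.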
and then use it like this:
String resource = "maps/SqlMapConfig.xml";
Reader reader = Resources.getResourceAsReader(resource);
SqlMap sqlMap = XmlSqlMapBuilder.buildSqlMap(reader);
