jOOQ code gen with multiple XML schema files - java

I am using the jOOQ codegen plugin in Maven to generate code from an XML schema file.
<configuration>
<generator>
<database>
<name>org.jooq.util.xml.XMLDatabase</name>
<properties>
<!-- Use any of the SQLDialect values here -->
<property>
<key>dialect</key>
<value>MYSQL</value>
</property>
<!-- Specify the location of your database file -->
<property>
<key>xml-file</key>
<value>${project.basedir}/src/main/resources/schema.xml</value>
</property>
</properties>
</database>
<generate>
<daos>true</daos>
<pojos>true</pojos>
<records>true</records>
<relations>true</relations>
<globalObjectReferences>false</globalObjectReferences>
</generate>
<target>
<!-- The destination package of your generated classes (within the
destination directory) -->
<packageName>com.generated.classes</packageName>
<!-- The destination directory of your generated classes. Using
Maven directory layout here -->
<directory>${project.basedir}/src/generated/classes</directory>
</target>
</generator>
</configuration>
Is there a solution to generate code from two different schema files, e.g. schema.xml and schema-other.xml?

This is not yet supported by the XMLDatabase metadata source. The pending feature request is: https://github.com/jOOQ/jOOQ/issues/6260
There are workarounds, though:
Using separate configurations
If the two schemas / files are not linked, you can run two independent code generation runs. If you're using Maven, you could do it like this (see also this question):
<plugin>
<groupId>org.jooq</groupId>
<artifactId>jooq-codegen-maven</artifactId>
<version>3.9.4</version>
<executions>
<execution>
<id>first-generation</id>
<phase>generate-sources</phase>
<goals><goal>generate</goal></goals>
<configuration>
<generator>
<database>
<name>org.jooq.util.xml.XMLDatabase</name>
...
<properties>
<property>
<key>xml-file</key>
<value>file1.xml</value>
</property>
</properties>
</database>
...
<target>
<packageName>com.generated.classes.schema1</packageName>
<directory>${project.basedir}/src/generated/classes</directory>
</target>
</generator>
</configuration>
</execution>
<execution>
<id>second-generation</id>
<phase>generate-sources</phase>
<goals><goal>generate</goal></goals>
<configuration>
<!-- jOOQ configuration here -->
</configuration>
</execution>
</executions>
</plugin>
If you're using standalone or programmatic code generation, just configure two separate runs, one per file.
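With programmatic configuration, such a pair of runs could look roughly like the following. This is a minimal sketch, assuming jOOQ 3.9's org.jooq.util.GenerationTool and org.jooq.util.jaxb configuration API (in more recent versions these classes live in org.jooq.codegen and org.jooq.meta.jaxb); file paths and package names are placeholders.
import org.jooq.util.GenerationTool;
import org.jooq.util.jaxb.Configuration;
import org.jooq.util.jaxb.Database;
import org.jooq.util.jaxb.Generator;
import org.jooq.util.jaxb.Property;
import org.jooq.util.jaxb.Target;

public class TwoSchemaCodegen {
    public static void main(String[] args) throws Exception {
        // One independent run per XML schema file, each with its own target package
        generate("src/main/resources/schema.xml", "com.generated.classes.schema1");
        generate("src/main/resources/schema-other.xml", "com.generated.classes.schema2");
    }

    static void generate(String xmlFile, String packageName) throws Exception {
        GenerationTool.generate(new Configuration()
            .withGenerator(new Generator()
                .withDatabase(new Database()
                    .withName("org.jooq.util.xml.XMLDatabase")
                    .withProperties(
                        new Property().withKey("dialect").withValue("MYSQL"),
                        new Property().withKey("xml-file").withValue(xmlFile)))
                .withTarget(new Target()
                    .withPackageName(packageName)
                    .withDirectory("src/generated/classes"))));
    }
}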
Merging the XML files
You could of course merge the two XML files into a single one, e.g. by using XSLT for automatic merging, or by hand.
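If you prefer plain Java over XSLT, a rough sketch of such a merge using the JDK's DOM API might look like this. It assumes both files follow the jOOQ-meta information_schema format, with container elements such as <schemata>, <tables> and <columns> directly under the root element; the file names are placeholders.
import java.io.File;
import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.dom.DOMSource;
import javax.xml.transform.stream.StreamResult;
import org.w3c.dom.Document;
import org.w3c.dom.Node;
import org.w3c.dom.NodeList;

public class MergeSchemaFiles {
    public static void main(String[] args) throws Exception {
        DocumentBuilder builder = DocumentBuilderFactory.newInstance().newDocumentBuilder();
        Document target = builder.parse(new File("schema.xml"));
        Document extra = builder.parse(new File("schema-other.xml"));

        // Copy the children of every container element under the second file's root
        // (e.g. <schemata>, <tables>, <columns>) into the matching container of the first file
        NodeList containers = extra.getDocumentElement().getChildNodes();
        for (int i = 0; i < containers.getLength(); i++) {
            Node container = containers.item(i);
            if (container.getNodeType() != Node.ELEMENT_NODE)
                continue;

            NodeList existing = target.getDocumentElement().getElementsByTagName(container.getNodeName());
            Node targetContainer = existing.getLength() > 0
                ? existing.item(0)
                : target.getDocumentElement().appendChild(target.importNode(container, false));

            NodeList entries = container.getChildNodes();
            for (int j = 0; j < entries.getLength(); j++) {
                Node entry = entries.item(j);
                if (entry.getNodeType() == Node.ELEMENT_NODE)
                    targetContainer.appendChild(target.importNode(entry, true));
            }
        }

        // Write the merged document to a single file used by one code generation run
        TransformerFactory.newInstance().newTransformer()
            .transform(new DOMSource(target), new StreamResult(new File("schema-merged.xml")));
    }
}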

Related

INFORMATION_SCHEMA not getting generated from jOOQ for SQL Server

I am using the jOOQ trial edition to generate code from a SQL Server database as a PoC. I use the configuration below. However, it is not generating the INFORMATION_SCHEMA schema during code generation.
<plugin>
<groupId>org.jooq.trial</groupId>
<artifactId>jooq-codegen-maven</artifactId>
<version>${jooq.version}</version>
<executions>
<execution>
<id>jooq-codegen</id>
<phase>generate-sources</phase>
<goals>
<goal>generate</goal>
</goals>
<configuration>
<skip>${skip.jooq.generation}</skip>
</configuration>
</execution>
</executions>
<configuration>
<jdbc>
<driver>com.microsoft.sqlserver.jdbc.SQLServerDriver</driver>
<url>${database.url}</url>
<user></user>
<password></password>
</jdbc>
<generator>
<name>org.jooq.codegen.JavaGenerator</name>
<database>
<name>org.jooq.meta.sqlserver.SQLServerDatabase</name>
<includes>.*</includes>
<excludes></excludes>
<!--<inputSchema></inputSchema> --> <!-- This will generate all schema of db, better to only generate the one
interested in -->
<inputCatalog>scm</inputCatalog>
<schemata>
<schema>
<inputSchema>dbo</inputSchema>
</schema>
<schema>
<inputSchema>INFORMATION_SCHEMA</inputSchema>
</schema>
</schemata>
</database>
<target>
<packageName>org.blackdread.sqltojava.jooq</packageName>
<directory>target/generated-sources/jooq</directory>
</target>
</generator>
</configuration>
<dependencies>
<dependency>
<groupId>org.jooq.trial</groupId>
<artifactId>jooq-meta</artifactId>
<version>${jooq.version}</version>
</dependency>
<dependency>
<groupId>org.jooq.trial</groupId>
<artifactId>jooq-codegen</artifactId>
<version>${jooq.version}</version>
</dependency>
<dependency>
<groupId>org.jooq.trial</groupId>
<artifactId>jooq</artifactId>
<version>${jooq.version}</version>
</dependency>
</dependencies>
</plugin>
Log:
[INFO] Generation finished: scm.dbo: Total: 1.493s, +0.333ms
[INFO] Excluding empty schema : scm.INFORMATION_SCHEMA
[INFO] Removing excess files
But INFORMATION_SCHEMA is available as views, and it does return the necessary information when I query it directly. I'm using Windows authentication, not sa.
For historic reasons, jOOQ-meta's SQLServerDatabase only queries the sys.objects table, not the sys.all_objects table, to reverse engineer your database. This should be changed, of course. I have created a feature request for this:
https://github.com/jOOQ/jOOQ/issues/8827
It has been implemented in jOOQ 3.12.
Workaround
In the meantime, you have these options:
Extend the SQLServerDatabase to adapt its queries to fetch from all_objects, not from objects (this is a lot simpler with the professional edition than with the free trial, as you get the sources and the right to patch the source code)
Use the JDBCDatabase, which queries JDBC's DatabaseMetaData, instead. This should return content from the sys and INFORMATION_SCHEMA schemas as well (but currently doesn't give access to e.g. stored procedures); a quick way to verify this is shown in the sketch after this list
Use the generated INFORMATION_SCHEMA tables located in the jOOQ-meta module
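To see what the JDBCDatabase would pick up, you can query the JDBC DatabaseMetaData directly. A minimal sketch, assuming the Microsoft JDBC driver and a placeholder connection string; the catalog and schema names follow the configuration above.
import java.sql.Connection;
import java.sql.DatabaseMetaData;
import java.sql.DriverManager;
import java.sql.ResultSet;

public class ListInformationSchemaViews {
    public static void main(String[] args) throws Exception {
        // Placeholder URL; adjust server, database and authentication to your environment
        String url = "jdbc:sqlserver://localhost;databaseName=scm;integratedSecurity=true";

        try (Connection connection = DriverManager.getConnection(url)) {
            DatabaseMetaData meta = connection.getMetaData();

            // JDBC metadata reports the INFORMATION_SCHEMA views as well,
            // which is why the JDBCDatabase workaround can reverse engineer them
            try (ResultSet rs = meta.getTables("scm", "INFORMATION_SCHEMA", "%", new String[] { "VIEW" })) {
                while (rs.next())
                    System.out.println(rs.getString("TABLE_SCHEM") + "." + rs.getString("TABLE_NAME"));
            }
        }
    }
}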

jOOQ generated classes have schema defaulted to PUBLIC

This is a follow-up question to this SO question. I am using jOOQ codegen to auto-generate jOOQ classes from JPA entities. I am planning to use jOOQ as a SQL query builder and execute the actual queries with the JPA EntityManager. But jOOQ is generating the tables from the entities with the schema defaulted to PUBLIC.
For example if my query has to be
select SCHEMA_A.colA, SCHEMA_A.colB from SCHEMA_A.tableA;
jooq is generating
select PUBLIC.colA, PUBLIC.colB from PUBLIC.tableA;
This makes the query fail with an invalid schema error when I execute it as follows:
entityManager.createNativeQuery(sqlString).getResultList();
What configuration do I need to add to make the autogenerated code contain the actual schema name?
Codegen:
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<configuration>
<source>1.8</source>
<target>1.8</target>
</configuration>
</plugin>
<plugin>
<groupId>org.jooq</groupId>
<artifactId>jooq-codegen-maven</artifactId>
<version>3.9.1</version>
<!-- The plugin should hook into the generate goal -->
<executions>
<execution>
<phase>generate-sources</phase>
<goals>
<goal>generate</goal>
</goals>
</execution>
</executions>
<dependencies>
<dependency>
<groupId>org.jooq</groupId>
<artifactId>jooq-meta-extensions</artifactId>
<version>3.9.1</version>
</dependency>
<dependency>
<groupId>com.yaswanth</groupId>
<artifactId>domain</artifactId>
<version>${project.version}</version>
</dependency>
</dependencies>
<configuration>
<!-- Generator parameters -->
<generator>
<database>
<name>org.jooq.util.jpa.JPADatabase</name>
<outputSchema>[SCHEMA_A]</outputSchema>
<properties>
<!-- A comma separated list of Java packages, that contain your entities -->
<property>
<key>packages</key>
<value>com.yaswanth.domain.entity</value>
</property>
</properties>
</database>
<target>
<packageName>com.yaswanth.domain.entity.jooq</packageName>
<directory>target/generated-sources/jooq</directory>
</target>
</generator>
</configuration>
</plugin>
</plugins>
</build>
jOOQ Spring beans config:
<bean id="dslContext" class="org.jooq.impl.DefaultDSLContext">
<constructor-arg ref="jooqConfig" />
</bean>
<bean id="jooqConfig" class="org.jooq.impl.DefaultConfiguration">
<property name="SQLDialect" value="MYSQL" />
<property name="dataSource" ref="myDataSource" />
<property name="settings" ref="settings"/>
</bean>
<bean id="settings" class="org.jooq.conf.Settings">
<property name="renderSchema" value="true" />
</bean>
The <outputSchema/> element cannot stand alone; it must be paired with an <inputSchema/> element. The reason is that if you leave out the <inputSchema/> element, then all schemata in your database are used for code generation, and it would be unclear what the standalone <outputSchema/> element means.
This misconfiguration should probably be reported as a warning in the log files: https://github.com/jOOQ/jOOQ/issues/6186
More information about the code generator's schema mapping feature here:
https://www.jooq.org/doc/latest/manual/code-generation/codegen-advanced/codegen-config-catalog-and-schema-mapping
Note: This is how the jOOQ code generator behaves in general; the fact that the JPADatabase seems to be a bit particular here doesn't matter. Even with JPA annotated entities, you could have a database that includes several schemata if you specify @Table(schema = "..."), for instance.
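For illustration, a hypothetical entity mapped to an explicit schema might look like this; with such an annotation, the JPADatabase reverse engineers the table into SCHEMA_A rather than the default PUBLIC schema (entity and column names are made up).
import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.Id;
import javax.persistence.Table;

// Hypothetical entity: the explicit schema attribute on @Table puts the
// reverse engineered table into SCHEMA_A instead of the default PUBLIC schema
@Entity
@Table(name = "TABLEA", schema = "SCHEMA_A")
public class TableA {

    @Id
    private Long id;

    @Column(name = "COLA")
    private String colA;

    @Column(name = "COLB")
    private String colB;
}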

Maven - xmlbeans: working with multiple schema files to generate a single jar file

I have several service schema files (more than 5) from which I want to generate a single jar file using XMLBeans.
I am using the xmlbeans plugin as follows:
<plugins>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>xmlbeans-maven-plugin</artifactId>
<version>${xmlbeans.version}</version>
<executions>
<execution>
<goals>
<goal>xmlbeans</goal>
</goals>
<phase>compile</phase>
</execution>
</executions>
<inherited>true</inherited>
<configuration>
<download>true</download>
<javaSource>${java.version}</javaSource>
<schemaDirectory>src/main/xsd</schemaDirectory>
<xmlConfigs>
<xmlConfig implementation="java.io.File">src/main/xsdconfig/xsdconfig.xml</xmlConfig>
</xmlConfigs>
</configuration>
</plugin>
</plugins>
I want to have a different package name for each service schema. How do I specify that, and where do I provide the schema path and xsdconfig file to apply the package details?
Please advise.
You need to define a file ending in .xsdconfig (e.g. myConfig.xsdconfig) to map the targetNamespace in each of your schema files to a Java package name. Put this .xsdconfig file in the same directory as the corresponding .xsd file that you are compiling. Suppose for example you have the following .xsd file:
<?xml version="1.0"?>
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema"
targetNamespace="http://your.company.org/dileep">
...
</xs:schema>
Then you would define the corresponding myConfig.xsdconfig file as follows:
<!-- you must use the http://www.bea.com/2002/09/xbean/config namespace here -->
<xb:config xmlns:xb="http://www.bea.com/2002/09/xbean/config">
<xb:namespace uri="http://your.company.org/dileep"> <!-- map this namespace -->
<xb:package>org.company.your.dileep</xb:package> <!-- to this Java package -->
</xb:namespace>
<!-- more namespace mappings can go here ... -->
</xb:config>
It is also possible to control the names of the Java classes generated from each of your schema files.
You can read more about this in the official XMLBeans documentation.

Renaming static files when building WAR file using Maven

I want to rename static files when building my WAR file with the maven-war-plugin, adding a version number. For instance, in my project I have a file:
src/main/webapp/css/my.css
I want it to appear in the WAR file as:
css/my-versionNumber.css
The same question was already asked (see Rename static files on maven war build), but the accepted answer is to rename a directory. I do not want to rename a directory; I want to rename the actual files.
Is this possible?
OK, I found a way; the principles come from Automatic update of generated css files via m2e.
Renaming files
I chose to use the Maven AntRun Plugin because it allows renaming several files using a replacement pattern; see https://stackoverflow.com/a/16092997/1768736 for details.
Alternative solutions were to use:
maven-assembly-plugin: it doesn't support renaming of filesets, only renaming of individual files (see the file assembly descriptor and the comments of this answer: https://stackoverflow.com/a/4019057/1768736)
maven-resources-plugin: it allows copying resources, but not renaming them (see the copy-resources mojo and the comments of this question: Renaming resources in Maven)
Make renamed files available for maven-war-plugin
Copy files: the idea is to copy the renamed files into a temp directory, which maven-war-plugin then adds to the WAR file as web resources. maven-war-plugin builds the WAR during the package phase, so we need to copy the renamed files before that.
Prevent maven-war-plugin from managing the files that maven-antrun-plugin will rename: this is done with the warSourceExcludes parameter.
Make it work from within Eclipse with m2e-wtp
Change the lifecycle mapping: the problem is that m2e by default doesn't execute all plugin executions defined in the POM file (to see which executions are run or ignored, in Eclipse go to your project properties, then Maven > Lifecycle Mapping). So you need to use the fake plugin org.eclipse.m2e:lifecycle-mapping to add the maven-antrun-plugin execution; see the lifecycle mapping documentation.
Change the maven-antrun-plugin output directory: the problem is that m2e-wtp gathers its web resources before any build lifecycle can run, i.e. before maven-antrun-plugin can rename the files. As a workaround, we create a profile, activated only when the project is built by m2e, that changes the property used as the maven-antrun-plugin output directory, so the files are written directly into the m2e-wtp web-resources directory.
That's it! POM snippet:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0
http://maven.apache.org/xsd/maven-4.0.0.xsd">
...
<properties>
<!-- properties used to rename css files with version number. -->
<css.version>${project.version}</css.version>
<!-- Temp directory where to copy renamed files, later added by maven-war-plugin as web-resources. -->
<rename.tmp.directory>${project.build.directory}/rename_tmp</rename.tmp.directory>
</properties>
<!-- There is a problem when running the webapp from Eclipse: m2e-wtp acquires
the web-resources before any lifecycle can be launched, so, before
maven-antrun-plugin can rename the files. We define a profile so that
maven-antrun-plugin copies files directly into the m2e-wtp web-resources directory,
when running from Eclipse. -->
<profiles>
<profile>
<id>m2e</id>
<!-- This profile is only active when the property "m2e.version"
is set, which is the case when building in Eclipse with m2e,
see https://stackoverflow.com/a/21574285/1768736. -->
<activation>
<property>
<name>m2e.version</name>
</property>
</activation>
<properties>
<rename.tmp.directory>${project.build.directory}/m2e-wtp/web-resources/</rename.tmp.directory>
</properties>
</profile>
</profiles>
...
<build>
<plugins>
<plugin>
<artifactId>maven-antrun-plugin</artifactId>
<executions>
<execution>
<goals>
<goal>run</goal>
</goals>
<id>rename-resources</id>
<!-- perform copy before the package phase,
when maven-war-plugin builds the WAR file -->
<phase>process-resources</phase>
<configuration>
<target>
<!-- copy renamed files. -->
<copy todir="${rename.tmp.directory}/css/">
<fileset dir="src/main/webapp/css/">
<include name="**/*.css" />
</fileset>
<!-- See other Mappers available at http://ant.apache.org/manual/Types/mapper.html -->
<mapper type="glob" from="*.css" to="*-${css.version}.css"/>
</copy>
</target>
</configuration>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-war-plugin</artifactId>
<configuration>
<!-- We do no let the maven-war-plugin take care of files that will be renamed.
Paths defined relative to warSourceDirectory (default is ${basedir}/src/main/webapp) -->
<warSourceExcludes>css/</warSourceExcludes>
<webResources>
<!-- include the resources renamed by maven-antrun-plugin,
at the root of the WAR file -->
<resource>
<directory>${rename.tmp.directory}</directory>
<includes>
<include>**/*</include>
</includes>
</resource>
</webResources>
</configuration>
</plugin>
...
</plugins>
<!-- When running server from Eclipse, we need to tell m2e to execute
maven-antrun-plugin to rename files, by default it doesn't. We need to modify the life cycle mapping. -->
<pluginManagement>
<plugins>
<!-- This plugin is not a real one, it is only used by m2e to obtain
config information. This is why it needs to be put in the section
pluginManagement, otherwise Maven would try to download it. -->
<plugin>
<groupId>org.eclipse.m2e</groupId>
<artifactId>lifecycle-mapping</artifactId>
<version>1.0.0</version>
<configuration>
<lifecycleMappingMetadata>
<pluginExecutions>
<pluginExecution>
<pluginExecutionFilter>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-antrun-plugin</artifactId>
<versionRange>[1.0.0,)</versionRange>
<goals>
<goal>run</goal>
</goals>
</pluginExecutionFilter>
<action>
<execute>
<!-- set to true, otherwise changes are not seen,
e.g., to a css file, and you would need to perform
a project update each time. -->
<runOnIncremental>true</runOnIncremental>
</execute>
</action>
</pluginExecution>
</pluginExecutions>
</lifecycleMappingMetadata>
</configuration>
</plugin>
</plugins>
</pluginManagement>
...
</build>
...
</project>
You need to define:
<properties>
<css.version>1.1.0</css.version>
</properties>
After that, reference it via the property (e.g. with Maven resource filtering enabled on the file that references it):
css/my-${css.version}.css

Building for local and production in IntelliJ using Maven

I have a simple spring mvc app, using maven with intellij.
How do you go about creating separate configuration files for production and development?
E.g. say I want to set production and development MySQL connection strings for NHibernate.
How can I have the build pick the correct file to grab configuration information from? (And any advice on naming conventions for the files?)
Using an Ant task is pretty straightforward for this.
First, create a couple of profiles under <project> in your pom:
<profiles>
<profile>
<id>build-dev</id>
<activation>
<!-- <activeByDefault>true</activeByDefault> -->
<property>
<name>env</name>
<value>dev</value>
</property>
</activation>
<properties>
<config.name>config.dev.properties</config.name>
</properties>
</profile>
<profile>
<id>build-prod</id>
<activation>
<property>
<name>env</name>
<value>prod</value>
</property>
</activation>
<properties>
<config.name>config.prod.properties</config.name>
</properties>
</profile>
</profiles>
Then use the maven-antrun-plugin
<plugins>
<plugin>
<artifactId>maven-antrun-plugin</artifactId>
<executions>
<execution>
<phase>prepare-package</phase>
<goals>
<goal>run</goal>
</goals>
<configuration>
<tasks>
<delete file="${project.build.outputDirectory}/config.properties"/>
<copy file="src/main/resources/${config.name}" tofile="${project.build.outputDirectory}/config.properties"/>
<delete file="${project.build.outputDirectory}/config.dev.properties"/>
<delete file="${project.build.outputDirectory}/config.prod.properties"/>
</tasks>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
Now you just specify the environment you want when you run Maven, e.g. mvn package -Denv=prod. If you want a default, uncomment the
<activeByDefault>true</activeByDefault>
element and place it in the profile you want active by default. As it stands, the build will fail on the Ant task if neither profile is activated.
There are a ton of ways to go about this.
In general, things like DB connection strings can go into property files and be substituted into the Spring XML configuration files using a PropertyPlaceholderConfigurer. One common trick is to then create a custom implementation that looks for a -D startup parameter, a user name, a machine name, etc. to decide which property file to actually use.
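As a rough sketch of that trick (the class name, property file names and the -Denv parameter are all illustrative, not an established API):
import org.springframework.beans.factory.config.PropertyPlaceholderConfigurer;
import org.springframework.core.io.ClassPathResource;

// Hypothetical configurer: picks the properties file based on a -Denv=... startup
// parameter, falling back to the development configuration when none is given
public class EnvironmentAwarePlaceholderConfigurer extends PropertyPlaceholderConfigurer {

    public EnvironmentAwarePlaceholderConfigurer() {
        String env = System.getProperty("env", "dev");
        setLocation(new ClassPathResource("config." + env + ".properties"));
    }
}
Declared as a bean in place of a plain PropertyPlaceholderConfigurer, this lets the same artifact resolve its ${...} placeholders from config.dev.properties or config.prod.properties depending on how the JVM was started.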
The same trick can be used for the Spring configuration files as well by creating an implementation of an XmlWebApplicationContext (? I can never remember what to subclass) that adds/modifies the default getConfigLocations to add, say, files prefaced with a user or machine name, -D startup parameter value, and so on.
Btw, you're not using NHibernate if you're using Java, you're using Hibernate.
Edit: Brian's approach is one of those "tons of ways"; I just like to keep it configurable without rebuilding, i.e., dynamic based on arbitrary "local" conditions, so I can swap things out really easily.
