I'm using the Persistence class from javax.persistence to generate SQL scripts from our entities, and it works just fine for the project I'm currently working on. Here is the code:
// Build on the dialect's defaults, then configure the schema export
final Map<Object, Object> properties = new HashMap<>(dialect.getDefaultProperties());
properties.put(AvailableSettings.DIALECT, dialect.getClass().getName());
properties.put(AvailableSettings.CONNECTION_PROVIDER, DriverManagerConnectionProviderImpl.class.getName());
properties.put(AvailableSettings.DEFAULT_SCHEMA, schemaName);
// Write a "create" script derived from the annotated entity metadata
properties.put(AvailableSettings.HBM2DDL_SCRIPTS_ACTION, "create");
properties.put(AvailableSettings.HBM2DDL_CREATE_SOURCE, "metadata");
properties.put(AvailableSettings.HBM2DDL_SCRIPTS_CREATE_TARGET, target.toURI().toURL().toString());
properties.put(AvailableSettings.USE_QUERY_CACHE, "false");
properties.put(AvailableSettings.USE_SECOND_LEVEL_CACHE, "false");
properties.put(AvailableSettings.IMPLICIT_NAMING_STRATEGY, "org.hibernate.boot.model.naming.ImplicitNamingStrategyComponentPathImpl");
properties.put(AvailableSettings.PHYSICAL_NAMING_STRATEGY, SpringPhysicalNamingStrategy.class.getName());
properties.put(AvailableSettings.JPA_VALIDATION_MODE, "ddl, callback");
// Only write the script; don't touch the (in-memory H2) database itself
properties.put(AvailableSettings.HBM2DDL_DATABASE_ACTION, "none");
properties.put(AvailableSettings.DRIVER, "org.h2.Driver");
properties.put(AvailableSettings.URL, "jdbc:h2:mem:export");
properties.put(AvailableSettings.HBM2DDL_DELIMITER, ";");
// Hack: adjust the JVM line separator so each statement ends up formatted as intended
System.setProperty("line.separator", ";\n");
// The first argument to generateSchema is the persistence-unit name
Persistence.generateSchema(schemaName, properties);
Generally it works fine for the entities in the current project, but not for external entities. I have an external Maven module which contains some entities that also need to be considered for our SQL script. Is there a way to declare the external module in the properties, so its entities will be considered too?
Create a maven module for your entities
Execute mvn clean install in this module
Add this module to the dependencies of the project that wants to use this module's entities.
That is all you have to do; I have applied this approach in many projects.
Since you did not show what you have done along these lines, I can't help more than this.
So please try these steps, and if there is a problem, update your question with how you performed them.
P.S.: If this is your first time creating a multi-module project, please google "multi-module Maven project". There are plenty of useful posts out there.
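If the dependency alone is not enough to get the external entities picked up by the persistence unit, one alternative is Hibernate's native bootstrap API, which lets you register entity classes explicitly instead of relying on scanning. A minimal sketch, assuming Hibernate 5 and a purely illustrative external entity class name:
import java.util.EnumSet;
import org.hibernate.boot.MetadataSources;
import org.hibernate.boot.registry.StandardServiceRegistry;
import org.hibernate.boot.registry.StandardServiceRegistryBuilder;
import org.hibernate.cfg.AvailableSettings;
import org.hibernate.tool.hbm2ddl.SchemaExport;
import org.hibernate.tool.schema.TargetType;

// Register every entity class by hand, including the ones from the external module
StandardServiceRegistry registry = new StandardServiceRegistryBuilder()
        .applySetting(AvailableSettings.DIALECT, "org.hibernate.dialect.H2Dialect")
        .build();
MetadataSources sources = new MetadataSources(registry)
        .addAnnotatedClass(com.external.module.ExternalEntity.class); // hypothetical external entity

// Write the script only (no database needed for a SCRIPT-only target)
SchemaExport export = new SchemaExport();
export.setDelimiter(";");
export.setOutputFile("create.sql");
export.createOnly(EnumSet.of(TargetType.SCRIPT), sources.buildMetadata());
This bypasses persistence-unit scanning entirely, so it does not matter which module the entity classes come from, as long as they are on the classpath.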
I have a multi-module Spring Boot Gradle project, with properties in each module's yml file that point to the database: user, password, URL.
This is a working solution, but it makes switching the project's database painful: every time I want to change the database user or URL, I have to edit 10+ yml files.
How can I avoid this?
You could bind the properties in a class (see https://www.baeldung.com/configuration-properties-in-spring-boot) and inject that class wherever it is needed.
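As a minimal sketch of that idea (the app.db prefix and the field names are illustrative), the shared values live in one place and every module injects the same bean:
import org.springframework.boot.context.properties.ConfigurationProperties;

// Binds app.db.url, app.db.user and app.db.password from one shared yml/properties file
@ConfigurationProperties(prefix = "app.db")
public class DatabaseProperties {

    private String url;
    private String user;
    private String password;

    public String getUrl() { return url; }
    public void setUrl(String url) { this.url = url; }
    public String getUser() { return user; }
    public void setUser(String user) { this.user = user; }
    public String getPassword() { return password; }
    public void setPassword(String password) { this.password = password; }
}
Registered once via @EnableConfigurationProperties(DatabaseProperties.class) on a shared configuration class, this leaves a single yml file to edit when the database changes.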
I have a multi-module project like the one below:
DbUtils
Doctor Module Project
Patient Module Project
I have configured Hibernate in the DbUtils project, with a configuration file like this:
LocalSessionFactoryBean sessionFactory = new LocalSessionFactoryBean();
sessionFactory.setDataSource(dataSource());
sessionFactory.setPackagesToScan("com.test.dbutils.pojos"); // varargs, so no explicit array is needed
Now I want to use this session factory in both sub-modules, and I want to set packagesToScan per project,
i.e. com.test.doctor.pojos for the Doctor module and com.test.patient.pojos for the Patient module.
Q.1) How to achieve this?
Q.2) Is there a better approach to creating and using a session factory across multiple modules?
If you are on Hibernate 4+, you may want to go through http://docs.jboss.org/hibernate/orm/4.2/devguide/en-US/html/ch16.html
Otherwise, to my knowledge, you have to tell the JVM which packages to scan ahead of time. If you are running Doctor and Patient as separate Spring Boot applications (while still leveraging the common code in DbUtils), you have the option of passing the packages as a String via an application.properties file and setting the right POJO package in each child module, as in the sketch below.
If you have both Doctor and Patient running in the same application, you can at least try to pass multiple packages: sessionFactory.setPackagesToScan("com.test.doctor.pojos", "com.test.patient.pojos");
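A sketch of the property-driven variant (the entity.packages property name is an assumption, not an existing convention): each module's application.properties lists its own packages, and the shared DbUtils configuration reads them.
import javax.sql.DataSource;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.orm.hibernate5.LocalSessionFactoryBean;

@Configuration
public class DbUtilsConfig {

    // e.g. entity.packages=com.test.doctor.pojos in the Doctor module's application.properties;
    // Spring splits the comma-separated value into a String[]
    @Bean
    public LocalSessionFactoryBean sessionFactory(DataSource dataSource,
            @Value("${entity.packages}") String[] packagesToScan) {
        LocalSessionFactoryBean sessionFactory = new LocalSessionFactoryBean();
        sessionFactory.setDataSource(dataSource);
        sessionFactory.setPackagesToScan(packagesToScan);
        return sessionFactory;
    }
}
This way the session factory code lives once in DbUtils, and each module decides which POJO packages it contributes purely through configuration.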
I am developing a project with the Spring Framework.
I have created about five modules; sometimes one depends on another, but they are all at the top level, and up to this point everything works fine.
Example:
Database module has only external dependencies
Identity module depends on database module
Facebook stuff module depends on identity module
Now I have created a directory in the root of the project called modules and moved all the modules into it (so they all were, and still are, at the same relative distance from each other).
All tests pass, and I can build/compile and inspect classes without any problem.
However, when I now try to run only the identity module (which does not require the Facebook stuff), Spring throws an exception saying that it cannot find the Facebook beans. Of course it cannot, because there is no such dependency, but I do not want to add that dependency. The @Configuration is @Lazy, so there is no point creating such a @Bean anyway.
Code:
new AnnotationConfigApplicationContext(Application.class);
The Application class is a @Lazy @Configuration and does a @ComponentScan over the whole application; as I understand it, this also finds the @Configurations from the other modules and then (I do not know why) tries to create those modules' @Beans, failing as expected.
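For reference, a minimal sketch of the setup as described (package names are illustrative):
import org.springframework.context.annotation.ComponentScan;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Lazy;

@Lazy
@Configuration
// Scanning the shared root package picks up every module's @Configuration on the classpath;
// scanning a narrower package (e.g. "com.example.identity") would only pick up that module's beans
@ComponentScan(basePackages = "com.example")
public class Application {
}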
I have verified with git that the only difference between the working and non-working states is moving those modules into the new folder.
So to clarify, the working/default structure is:
/.gradle
/.idea
/DatabaseModule
/IdentityModule
/FacebookModule
/.out
/.gitignore
and the non-working one is:
/.gradle
/.idea
/modules/DatabaseModule
/modules/IdentityModule
/modules/FacebookModule
/.out
/.gitignore
Code stays the same.
I think that if I add all dependencies to all modules it will work, but for obvious reasons I do not want to do that.
Am I doing something wrong?
Is there any convention, that I am breaking?
Bonus question: how are nested modules different from an ordinary folder containing modules?
EDIT:
I should also note that all tests pass in both scenarios; however, I am not using Spring in the tests (no dependency injection), I just new or Mock() everything.
Is it possible for new Flyway migrations to be generated by JPA/Hibernate's automatic schema generation when a new model, field, etc. is added via Java code?
It would be useful to capture the auto-generated SQL and save it directly to a new Flyway migration, for review / editing / committing to a project repository.
Thank you in advance for any assistance or enlightenment you can offer.
If your IDE of choice is IntelliJ IDEA, I'd recommend using the JPA Buddy plugin to do this. It can generate Flyway migrations by comparing your Java model to the target DB.
You can use it to keep your evolving model and your SQL scripts in sync.
Also, it can create the init script if your DB is empty.
Once you have it installed, and Flyway is among your Maven/Gradle dependencies, you can have the plugin compare your model against the target DB and generate the resulting migration.
Flyway doesn't have built-in support for diffs. I use Liquibase within a Maven Spring Boot project, and changelogs can be created from JPA/Hibernate changes by using:
mvn liquibase:diff
All of the options for liquibase diff are located here:
http://www.liquibase.org/documentation/maven/maven_diff.html
If you want to generate the update SQL automatically, you can ask Hibernate to do so; just add the lines below to your Spring Boot configuration:
spring.jpa.properties.javax.persistence.schema-generation.create-source=metadata
spring.jpa.properties.javax.persistence.schema-generation.scripts.action=update
spring.jpa.properties.javax.persistence.schema-generation.scripts.create-target=update.sql
When you run the application, this will generate a file named update.sql in the root of your project. You can then copy and paste its contents into a new Flyway migration.
This was adapted from this other answer: https://stackoverflow.com/a/36966419/679240 ; it is basically the same logic, except that the original generates a database creation script, while I needed an update script instead.
BTW, if you want to replace the names of the foreign keys on the script with more readable ones, you could use this regex: ^(alter table .*?)(\w+)(\s+add constraint )\w+( foreign key \()(.*?)(\).*) with this replacement: $1$2$3fk_$2__$5$4$5$6; this will change the names of the FKs in the script to fk_name_of_the_table__name_of_the_field.
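A sketch of applying that rename in Java (the file name follows the example above, and the (?m) flag is added so ^ matches at each line start; requires Java 11 for Files.readString):
import java.nio.file.Files;
import java.nio.file.Path;

// Rewrites "add constraint <generated-name> foreign key (field)" into fk_<table>__<field>
String regex = "(?m)^(alter table .*?)(\\w+)(\\s+add constraint )\\w+( foreign key \\()(.*?)(\\).*)";
String script = Files.readString(Path.of("update.sql"));
Files.writeString(Path.of("update.sql"), script.replaceAll(regex, "$1$2$3fk_$2__$5$4$5$6"));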
We have the following scenario with our project:
A core web application packaged as a war file (call it the Core project).
The need to "customize" or "extend" the core app per customer (call it the Customer project). This mostly includes new bean definitions (we're using Spring), i.e. replacing service implementations in core.war with customer-specific implementations.
We want to develop the Core and Customer projects independently.
When the Customer project is developed, we need to be able to run/debug it in Eclipse (on Tomcat) with the Core project as a dependency
When the Customer project is built, the resulting war file "includes" the core and customer projects. So this .war is the customer-specific version of the application
I'm looking for suggestions as to the best way to do this in terms of tooling and project configuration.
We're currently using Ant, but would like to avoid getting buried in more Ant. Has anyone done this with Maven?
I've seen a lot of posts on how to build a web application that depends on a Java application, but nothing on a web application depending on another web app.
Thanks!
Sounds like Maven WAR overlay does what you want.
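For illustration, a minimal sketch of the relevant fragment of the Customer project's pom.xml (the coordinates are made up); the maven-war-plugin overlays any war-type dependency onto the build automatically:
<!-- Customer project's pom.xml: pull in the core war as an overlay (coordinates illustrative) -->
<dependency>
    <groupId>com.mycorp</groupId>
    <artifactId>core-webapp</artifactId>
    <version>1.0.0</version>
    <type>war</type>
</dependency>
Anything the Customer project provides under the same paths (classes, Spring context files, web resources) takes precedence over the core war's copy; an explicit <overlays> section in the maven-war-plugin configuration is only needed to control ordering or excludes.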
In Eclipse there is a "native" WTP way to do this. It mainly uses linked folders and a little hack in the .settings/org.eclipse.wst.common.component file. You can read about it at http://www.informit.com/articles/article.aspx?p=759232&seqNum=3 in the chapter called "Dividing a Web Module into Multiple Projects". The problem is that the linked folder must be relative to a path variable, which can be defined on the Window/Preferences/General/Workspace/Linked Resources tab; otherwise the linked-folder definition (found in the .project file in the project root) will contain a workstation-specific path. The path variable should practically point at the workspace root. This solution works great with WTP; deployment and everything else work as they should.
The second solution is to use Ant for it. Forget it. You will deeply regret it.
The third solution is to use Maven for it. You can forget the comfort of WTP publishing unless you do some tricks. Use WAR overlays as others have suggested. Be sure to install both m2eclipse and the m2eclipse extras. There is an extension plugin released recently that can help you; it is described in a blog post. I did not try it, but it looks OK. Anyway, Maven has nothing to do with linked folders, so I think even the first solution and the Maven overlay can live together if necessary.
As for headless builds, you can use HeadlessEclipse for the first solution. It is dead (by me) now, but still works :). If you use the Maven overlay + Eclipse stuff, headless builds are covered by Maven.
This is a little more involved, but at a high level we do it as follows. We have the core platform UI divided into multiple war modules based on features (login-ui, catalog-mgmt-ui, etc.). Each of these core modules can be customized by the customer-facing team.
We merge all of these modules at build time into one single war module; the merge rules are based on Maven's assembly plugin.
You usually start from the Java source code. WARs don't include the Java source code, just the compiled classes under WEB-INF/classes and JARs under WEB-INF/lib.
What I would do is use Maven and start a brand new empty webapp project with it: http://maven.apache.org/guides/mini/guide-webapp.html
After you have the new empty project structure, copy the Java source code to it (src/main/java) and fill out the dependencies list in pom.xml.
Once you've done all this you can use mvn clean package to create a standard WAR file that you can deploy to Tomcat.
You might want to look into designing your core app with pluggable features based on interfaces.
For example, say your core app has some concept of a User object and needs to provide support for common user-based tasks. Create a UserStore interface:
public interface UserStore
{
public User validateUser(String username, String password) throws InvalidUserException;
public User getUser(String username);
public void addUser(User user);
public void deleteUser(User user);
public void updateUser(User user);
public List<User> listUsers();
}
You can then code your core app (logon logic, registration logic, etc.) against this interface. You might want to provide a default implementation of this interface in your core app, such as a DatabaseUserStore, which would effectively be a DAO.
You then define the UserStore as a Spring bean and inject it where needed:
<bean id="userStore" class="com.mycorp.auth.DatabaseUserStore">
<constructor-arg ref="usersDataSource"/>
</bean>
This allows you to customize or extend the core app depending on a specific customer's needs. If a customer wants to integrate the core app with their Active Directory server, you write an LDAPUserStore class that implements your UserStore interface using LDAP, configure it as a Spring bean, and package the custom class as a dependent jar.
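A rough sketch of such an extension (all LDAP details are elided, and the constructor argument is a placeholder):
import java.util.List;

public class LDAPUserStore implements UserStore
{
    private final String ldapUrl; // e.g. "ldap://directory.customer.example:389" (illustrative)

    public LDAPUserStore(String ldapUrl)
    {
        this.ldapUrl = ldapUrl;
    }

    public User validateUser(String username, String password) throws InvalidUserException
    {
        // Bind against the directory with the supplied credentials;
        // a rejected bind would translate into an InvalidUserException.
        throw new UnsupportedOperationException("LDAP bind not implemented in this sketch");
    }

    public User getUser(String username)
    {
        // Look up the entry by username and map its attributes onto a User.
        throw new UnsupportedOperationException("not implemented in this sketch");
    }

    public void addUser(User user) { /* create a directory entry */ }
    public void deleteUser(User user) { /* remove the entry */ }
    public void updateUser(User user) { /* modify the entry's attributes */ }

    public List<User> listUsers()
    {
        // Search the directory and map each result onto a User.
        throw new UnsupportedOperationException("not implemented in this sketch");
    }
}
The bean definition above would then point at the customer's class instead, e.g. class="com.mycorp.auth.LDAPUserStore", with no change to the core app's code.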
What you are left with is a core app that everyone uses, plus a set of customer-specific extensions that you can provide and sell separately; heck, you can even have the customer write their own extensions.