I have been trying to create a Spring Batch program that reads data from one database table and writes it into another. I don't want the Spring Batch metadata tables to be created in my database. When I tried to avoid them, I was no longer able to perform transactions.
I avoided the metadata tables by extending DefaultBatchConfigurer and overriding it like this:
@Override
public void setDataSource(DataSource dataSource) {
    // Override to avoid setting the datasource even if one exists.
    // initialize() will then use a map-based JobRepository (instead of a database).
}
By doing this I started getting org.springframework.dao.InvalidDataAccessApiUsageException: no transaction is in progress; nested exception is javax.persistence.TransactionRequiredException: no transaction is in progress.
Is there a way by which I can avoid the metadata tables and still use the transactions?
If you are using Spring Boot, you may add the line below to application.properties (or an environment-specific property file) to ensure the Spring Batch metadata tables are not created (in newer Spring Boot versions the equivalent property is spring.batch.initialize-schema=never):
spring.batch.initializer.enabled=false
Also, since you do not need the metadata tables, do not extend the DefaultBatchConfigurer class.
I would only extend this class to set up a persistent JobRepository, i.e. to create the Spring Batch metadata tables, which requires a lot of other configuration that DefaultBatchConfigurer provides by default.
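That said, if you do go the DefaultBatchConfigurer route and hit the "no transaction is in progress" error, it usually comes from the ResourcelessTransactionManager that the configurer falls back to when no DataSource is set. A minimal sketch of one workaround, assuming a JPA setup and the Spring Batch 4.x BatchConfigurer interface (adapt to your version): keep the map-based JobRepository but supply a real JPA transaction manager.

import javax.persistence.EntityManagerFactory;
import javax.sql.DataSource;
import org.springframework.batch.core.configuration.annotation.DefaultBatchConfigurer;
import org.springframework.orm.jpa.JpaTransactionManager;
import org.springframework.stereotype.Component;
import org.springframework.transaction.PlatformTransactionManager;

@Component
public class NoMetadataBatchConfigurer extends DefaultBatchConfigurer {

    private final EntityManagerFactory entityManagerFactory;

    public NoMetadataBatchConfigurer(EntityManagerFactory entityManagerFactory) {
        this.entityManagerFactory = entityManagerFactory;
    }

    @Override
    public void setDataSource(DataSource dataSource) {
        // Deliberately empty: with no DataSource, initialize() builds a
        // map-based JobRepository and no metadata tables are created.
    }

    @Override
    public PlatformTransactionManager getTransactionManager() {
        // Replace the ResourcelessTransactionManager fallback with a JPA-aware
        // one so entity writes inside steps run in a real transaction.
        return new JpaTransactionManager(entityManagerFactory);
    }
}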
I am having trouble finding information about this issue. I am interested in implementing row-level security on my Postgres DB, and I am looking for a way to set Postgres session variables automatically through some form of interceptor. I know that with Hibernate you can do row-level security using @Filter and @FilterDef, but I would additionally like to set policies on the DB itself.
A very simple way of doing this would be to execute the SQL statement SET variable=value prior to every query, though I have not been able to find any information on this.
This is being used in a Spring Boot application, and every request is expected to have access to a request-specific value of the variable.
Since your application uses Spring, you could try accomplishing this in one of a few ways:
Spring AOP
In this approach, you write an advice that you ask Spring to apply to specific methods. If your methods use the @Transactional annotation, you can have the advice applied immediately after the transaction has started, as in the sketch below.
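A minimal sketch of such an advice, assuming the per-request value is exposed through a hypothetical RequestContext thread-local holder (not a Spring API), and that the transaction advice is ordered to run first via @EnableTransactionManagement(order = 0):

import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;
import org.aspectj.lang.annotation.Aspect;
import org.aspectj.lang.annotation.Before;
import org.springframework.core.annotation.Order;
import org.springframework.stereotype.Component;

@Aspect
@Component
@Order(10) // lower precedence than the transaction advice, so we run inside the transaction
public class RowLevelSecurityAspect {

    @PersistenceContext
    private EntityManager entityManager;

    @Before("@annotation(org.springframework.transaction.annotation.Transactional)")
    public void applySessionVariable() {
        // set_config(..., true) scopes the value to the current transaction,
        // so it is cleared automatically on commit or rollback
        entityManager.createNativeQuery("SELECT set_config('app.current_user', ?1, true)")
                .setParameter(1, RequestContext.getCurrentUser()) // hypothetical per-request holder
                .getSingleResult();
    }
}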
Extended TransactionManager Implementation
Let's assume your transactions use JpaTransactionManager.
import javax.persistence.EntityManager;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.orm.jpa.JpaTransactionManager;
import org.springframework.transaction.TransactionDefinition;
import org.springframework.transaction.support.DefaultTransactionStatus;

public class SecurityPolicyInjectingJpaTransactionManager extends JpaTransactionManager {

    @Autowired
    private EntityManager entityManager;

    // constructors

    @Override
    protected void prepareSynchronization(DefaultTransactionStatus status, TransactionDefinition definition) {
        super.prepareSynchronization(status, definition);
        if (status.isNewTransaction()) {
            // Use the EntityManager here to execute your database policy param/values.
            // I would suggest you also register an after-completion synchronization
            // callback that clears all the policy param/values regardless of whether
            // the transaction succeeded or failed, since it runs just before the
            // connection is returned to the pool.
        }
    }
}
Now simply configure your JPA environment to use your custom JpaTransactionManager class, for example with a bean definition like the one below.
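A minimal sketch of that wiring, assuming a standard Spring Java config (bean and method names are illustrative):

import javax.persistence.EntityManagerFactory;
import org.springframework.context.annotation.Bean;
import org.springframework.transaction.PlatformTransactionManager;

@Bean
public PlatformTransactionManager transactionManager(EntityManagerFactory entityManagerFactory) {
    // Substitute the custom subclass for the stock JpaTransactionManager
    SecurityPolicyInjectingJpaTransactionManager txManager =
            new SecurityPolicyInjectingJpaTransactionManager();
    txManager.setEntityManagerFactory(entityManagerFactory);
    return txManager;
}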
There are likely others, but these are the two that come to mind that I've explored.
Is it possible to load initial data in a MongoDB database using src/main/resources/data.sql or by any other file?
I understand that data.sql is used for SQL databases, whereas MongoDB is a NoSQL database. I just wanted to know if there is an equivalent of data.sql for NoSQL databases.
While googling I found this SO link (Spring Boot - Loading Initial Data), which does what I am looking for, but it still isn't a standalone file like data.sql.
To load initial data you can use a DB migration tool like Mongobee.
It's a very useful option for handling data initialization in Java. You just need to configure a @Bean public Mongobee mongobee in your Spring Boot application and set up component scanning for the data ChangeLogs, where the data creation actually happens, as sketched below.
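A minimal sketch of that configuration, assuming Mongobee's standard API; the connection string, database name, and package are placeholders for your own values:

import com.github.mongobee.Mongobee;
import org.springframework.context.annotation.Bean;

@Bean
public Mongobee mongobee() {
    Mongobee runner = new Mongobee("mongodb://localhost:27017/appdb");
    runner.setDbName("appdb");
    runner.setChangeLogsScanPackage("com.example.changelogs"); // where the @ChangeLog classes live
    return runner;
}

A change log class in that package might then look like this:

import com.github.mongobee.changeset.ChangeLog;
import com.github.mongobee.changeset.ChangeSet;
import com.mongodb.client.MongoDatabase;
import org.bson.Document;

@ChangeLog
public class SeedDataChangeLog {

    @ChangeSet(order = "001", id = "seedUsers", author = "dev")
    public void seedUsers(MongoDatabase db) {
        // Insert a raw document into the "users" collection
        db.getCollection("users").insertOne(new Document("name", "alice"));
    }
}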
You can use a repository populator with Spring Data MongoDB. Let me demonstrate this with a code sample in Kotlin:
@Configuration
class TestApplicationConfig {

    @Value("classpath:test_data.json")
    private lateinit var testData: Resource

    @Bean
    @Autowired
    fun repositoryPopulator(objectMapper: ObjectMapper): Jackson2RepositoryPopulatorFactoryBean {
        val factory = Jackson2RepositoryPopulatorFactoryBean()
        // inject your Jackson ObjectMapper if you need to customize it:
        factory.setMapper(objectMapper)
        factory.setResources(arrayOf(testData))
        return factory
    }
}
Put test_data.json in the resources directory.
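The populator infers the target document type from a _class property in the JSON, so a file for a hypothetical com.example.Teacher document might look like this:

[
  {
    "_class": "com.example.Teacher",
    "name": "Alice"
  }
]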
You can define your data in JSON/XML and use the repository populator elements to load the data.
https://docs.spring.io/spring-data/mongodb/docs/2.0.9.RELEASE/reference/html/#core.repository-populators
I need to dynamically create and connect to potentially hundreds of databases from a single ResourceServer (REST server). The REST controller would do something like this:
#RequestMapping("/teachers")
public List<Teacher> teachers(#RequestParam(value="db", defaultValue="db") String db) {
//Look up the correct datasource
DataSource ds = DSSources.get(db);
//Associate the datasource with the repository
...
//Return the teachers from the database using
//the TeacherRepository (Spring Data JPA Repository)
return TeacherRepository.getAllTeachers();
}
I'm thinking that DSSources is a Map<String, DataSource> that contains the DataSource instances. How do I programmatically create the datasources? Once they are created, how are they associated with the Spring Data JPA repositories? All databases will share a common set of repositories.
Storing DataSources in a Map is not a good solution, as you also have to think about datasources failing and recreating them.
What you need is multi-tenancy. If you use Hibernate underneath JPA, you can use Hibernate multi-tenancy; a sketch follows below. I am pretty sure other ORMs also provide multi-tenancy capabilities.
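A minimal sketch of the Hibernate side, assuming DATABASE multi-tenancy and a hypothetical TenantContext thread-local holder that a web filter fills from the db request parameter (neither TenantContext nor the filter is a Hibernate or Spring API):

import org.hibernate.context.spi.CurrentTenantIdentifierResolver;

public class RequestTenantResolver implements CurrentTenantIdentifierResolver {

    @Override
    public String resolveCurrentTenantIdentifier() {
        // TenantContext is a hypothetical per-request holder populated by a filter
        String tenant = TenantContext.getTenant();
        return tenant != null ? tenant : "default";
    }

    @Override
    public boolean validateExistingCurrentSessions() {
        return true;
    }
}

You then register this resolver plus a MultiTenantConnectionProvider (which maps each tenant identifier to its DataSource) via the hibernate.tenant_identifier_resolver, hibernate.multi_tenant_connection_provider, and hibernate.multiTenancy=DATABASE properties; the repositories themselves stay unchanged.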
I come from PHP/Laravel. Whenever I want to seed the database, I only need to run php artisan db:seed. This runs some PHP scripts that insert data into the database.
I want to achieve the same feature using Spring/Hibernate. I know I can add an import.sql file to seed the database after schema creation. However, I want to import these fixtures using Java and the available ORM, so I do not need to maintain an SQL file.
Is there a way?
If not, there should be some configuration to trigger a script that uses the ORM entity manager to persist entities in the database after schema creation.
The main idea is not to have to maintain a big SQL seeder file across schema revisions.
Thanks!
If you're using Spring Data, you can use repository populators.
Otherwise you may register an event listener that fires after the Spring context is loaded:
@Component
public class YourListener {

    // Declare your autowired beans here

    @EventListener
    public void handleContextRefresh(ContextRefreshedEvent event) {
        // Your seeder
        // + You can use all the registered beans (repositories, services...)
    }
}
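For instance, a minimal sketch of such a seeder, assuming a hypothetical User entity and UserRepository Spring Data repository:

import org.springframework.context.event.ContextRefreshedEvent;
import org.springframework.context.event.EventListener;
import org.springframework.stereotype.Component;

@Component
public class DatabaseSeeder {

    private final UserRepository userRepository; // hypothetical repository

    public DatabaseSeeder(UserRepository userRepository) {
        this.userRepository = userRepository;
    }

    @EventListener
    public void seed(ContextRefreshedEvent event) {
        // Guard so re-runs against an existing schema don't duplicate rows
        if (userRepository.count() == 0) {
            userRepository.save(new User("admin", "admin@example.com"));
        }
    }
}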
For more detail check: Better application events in Spring Framework 4.2
My unit tests use Hibernate to connect to an in-memory HSQLDB database. I was hoping there would be a way to clear and recreate the database (the entire database including the schema and all the data) in JUnit's TestCase.setUp() method.
You can set up your Hibernate configuration file to force the database to recreate your tables and schema every time.
<!-- Drop and re-create the database schema on startup -->
<property name="hbm2ddl.auto">create-drop</property>
hibernate.hbm2ddl.auto: automatically validates or exports schema DDL to the database when the SessionFactory is created. With create-drop, the database schema will be dropped when the SessionFactory is closed explicitly.
e.g. validate | update | create | create-drop
If you don't want this setting in your real Hibernate config, you can create a separate Hibernate config for unit-testing purposes.
If you are using Spring, you can use the @Transactional annotation on your unit test; by default, at the end of every unit test all persisted data will be automatically rolled back, so you don't need to worry about dropping the tables every time.
I have walked through an example here: http://automateddeveloper.blogspot.com/2011/05/hibernate-spring-testing-dao-layer-with.html
hibernate.hbm2ddl.auto=create-drop
And bootstrap a new SessionFactory, for example as sketched below.
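A minimal sketch of that in a JUnit 3-style setUp(), assuming a hibernate.cfg.xml on the test classpath:

import junit.framework.TestCase;
import org.hibernate.SessionFactory;
import org.hibernate.cfg.Configuration;

public class DaoTest extends TestCase {

    private SessionFactory sessionFactory;

    @Override
    protected void setUp() {
        // Building a fresh SessionFactory with create-drop re-exports the schema,
        // wiping all tables and data in the in-memory HSQLDB instance
        Configuration configuration = new Configuration().configure();
        configuration.setProperty("hibernate.hbm2ddl.auto", "create-drop");
        sessionFactory = configuration.buildSessionFactory();
    }

    @Override
    protected void tearDown() {
        // create-drop also drops the schema when the factory is closed
        sessionFactory.close();
    }
}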
From a testing perspective, the best practice is to clear the data after every single test. If you use create-drop, it will also drop the table schema, which adds the overhead of recreating the schema every time.
Since you are using HSQLDB, which provides a direct truncate mechanism, that would be the best option in this case.
@After
public void clearDataFromDatabase() {
    // Start transaction, based on your transaction manager
    dao.executeNativeQuery("TRUNCATE SCHEMA PUBLIC AND COMMIT");
    // Commit transaction
}
Be careful with wiping the world and starting fresh each time. Soon you will likely want to start with a "default" set of test data loaded into your system. What you really want, then, is to revert to that base state before each test runs. In that case, you want a transaction that rolls back after each test run.
To accomplish this, annotate your JUnit class:
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(locations={"classpath:/path/to/spring-config.xml"})
@TransactionConfiguration(transactionManager="myTransactionManager", defaultRollback=true)
public class MyUnitTestClass {
    ...
}
And then annotate each of your test methods with @Transactional:
@Transactional
@Test
public void myTest() {
    ...
}