Convert annotation-based configuration to XML (Java)

How do I write this part of an annotation-based configuration in XML?
@Bean
public EventRepository eventRepository() throws Exception {
    return new SolrRepositoryFactory(eventTemplate())
            .getRepository(EventRepository.class, new EventRepositoryImpl(eventTemplate()));
}
Full code of this config:
@Configuration
public class SolrContext {

    @Bean
    public SolrServerFactory solrServerFactory() {
        return new MulticoreSolrServerFactory(new HttpSolrServer("solr.host"));
    }

    @Bean
    public SolrTemplate eventTemplate() throws Exception {
        SolrTemplate solrTemplate = new SolrTemplate(solrServerFactory());
        solrTemplate.setSolrCore("events");
        return solrTemplate;
    }

    @Bean
    public EventRepository eventRepository() throws Exception {
        return new SolrRepositoryFactory(eventTemplate())
                .getRepository(EventRepository.class, new EventRepositoryImpl(eventTemplate()));
    }
}
I got this example from that answer.
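A minimal sketch of equivalent XML bean definitions, derived only from the Java config above. The fully qualified class names are assumptions for Spring Data Solr 1.x, and com.example stands in for the real packages of EventRepository/EventRepositoryImpl, so adjust both to your project:
<!-- solrServerFactory -->
<bean id="solrServerFactory"
      class="org.springframework.data.solr.server.support.MulticoreSolrServerFactory">
    <constructor-arg>
        <bean class="org.apache.solr.client.solrj.impl.HttpSolrServer">
            <constructor-arg value="solr.host"/>
        </bean>
    </constructor-arg>
</bean>

<!-- eventTemplate -->
<bean id="eventTemplate" class="org.springframework.data.solr.core.SolrTemplate">
    <constructor-arg ref="solrServerFactory"/>
    <property name="solrCore" value="events"/>
</bean>

<!-- eventRepository: built by calling getRepository(...) on a SolrRepositoryFactory -->
<bean id="solrRepositoryFactory"
      class="org.springframework.data.solr.repository.support.SolrRepositoryFactory">
    <constructor-arg ref="eventTemplate"/>
</bean>

<bean id="eventRepositoryImpl" class="com.example.EventRepositoryImpl">
    <constructor-arg ref="eventTemplate"/>
</bean>

<bean id="eventRepository"
      factory-bean="solrRepositoryFactory"
      factory-method="getRepository">
    <constructor-arg value="com.example.EventRepository"/>
    <constructor-arg ref="eventRepositoryImpl"/>
</bean>
Alternatively, Spring Data Solr also ships an XML namespace (solr:repositories) that can generate the repository beans for a package, which avoids wiring SolrRepositoryFactory by hand.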

Related

jOOQ with Spring Boot not inserting entity

I'm facing an issue where jOOQ does not insert entities unless the repository is annotated with @Transactional.
Here's my configuration:
@Configuration
@EnableTransactionManagement
@RequiredArgsConstructor
public class PersistenceConfig {

    @Value("${spring.datasource.url}")
    private String dbUrl;

    @Value("${spring.datasource.username}")
    private String dbUser;

    @Value("${spring.datasource.password}")
    private String dbPassword;

    @Bean
    @SneakyThrows
    public DataSource dataSource() {
        HikariConfig config = new HikariConfig();
        // https://mariadb.com/kb/en/about-mariadb-connector-j/
        config.setDriverClassName(DatabaseDriver.MARIADB.getDriverClassName());
        config.setJdbcUrl(dbUrl);
        config.setUsername(dbUser);
        config.setPassword(dbPassword);
        config.setAutoCommit(false);
        // https://github.com/brettwooldridge/HikariCP/wiki/MySQL-Configuration
        config.addDataSourceProperty("cacheServerConfiguration", true);
        config.addDataSourceProperty("useServerPrepStmts", true);
        config.addDataSourceProperty("useLocalSessionState", true);
        config.addDataSourceProperty("cacheResultSetMetadata", true);
        config.addDataSourceProperty("rewriteBatchedStatements", true);
        config.addDataSourceProperty("elideSetAutoCommits", true);
        config.addDataSourceProperty("maintainTimeStats", false);
        config.addDataSourceProperty("cachePrepStmts", true);
        config.addDataSourceProperty("prepStmtCacheSize", 350);
        config.addDataSourceProperty("prepStmtCacheSqlLimit", 2048);
        return new HikariDataSource(config);
    }

    @Bean
    public TransactionAwareDataSourceProxy transactionAwareDataSource() {
        return new TransactionAwareDataSourceProxy(dataSource());
    }

    @Bean
    public DataSourceTransactionManager transactionManager() {
        return new DataSourceTransactionManager(dataSource());
    }

    @Bean
    public DataSourceConnectionProvider connectionProvider() {
        return new DataSourceConnectionProvider(transactionAwareDataSource());
    }

    @Bean
    public ExceptionTranslator exceptionTransformer() {
        return new ExceptionTranslator();
    }

    @Bean
    public SpringTransactionProvider springTransactionProvider() {
        return new SpringTransactionProvider(transactionManager());
    }

    @Bean
    public DefaultConfiguration configuration() {
        DefaultConfiguration jooqConfiguration = new DefaultConfiguration();
        jooqConfiguration.set(connectionProvider());
        jooqConfiguration.set(new DefaultExecuteListenerProvider(exceptionTransformer()));
        jooqConfiguration.set(SQLDialect.MARIADB);
        jooqConfiguration.set(springTransactionProvider());
        return jooqConfiguration;
    }

    @Bean
    public DefaultDSLContext dsl() {
        return new DefaultDSLContext(configuration());
    }

    @Bean
    public TransactionTemplate transactionTemplate() {
        return new TransactionTemplate(transactionManager());
    }

    private static class ExceptionTranslator extends DefaultExecuteListener {
        public void exception(ExecuteContext context) {
            SQLDialect dialect = context.configuration().dialect();
            SQLExceptionTranslator translator
                    = new SQLErrorCodeSQLExceptionTranslator(dialect.name());
            context.exception(translator
                    .translate("Access database using jOOQ", context.sql(), context.sqlException()));
        }
    }
}
Repository:
@Repository
public class UserRepository extends UserDao {

    private final DSLContext dslContext;

    public UserRepository(DSLContext dslContext) {
        super(dslContext.configuration());
        this.dslContext = dslContext;
    }
}
So, calling userRepository.insert(...) doesn't actually insert anything into the database, although the logs show the following:
org.jooq.tools.LoggerListener : Executing query : insert into `user` (...)
org.jooq.tools.LoggerListener : -> with bind values : insert into `user` ...
However, if I override UserDao's insert method and annotate it with @Transactional, it works: the rows actually get inserted. I suppose I have misconfigured something.
Spring Boot with the jOOQ starter is being used.
The problem is actually with setAutoCommit(false).
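Concretely, two ways out are sketched below (the UserService class and its names are illustrative, not from the original code): either stop disabling auto-commit on the Hikari pool, or keep it disabled and make sure every write runs inside a Spring-managed transaction, for example through the TransactionTemplate bean the config already defines.
// Option 1: in dataSource(), drop the line that disables auto-commit.
// config.setAutoCommit(false);

// Option 2: keep auto-commit off and commit explicitly via a Spring transaction.
@Service
@RequiredArgsConstructor
public class UserService {

    private final UserRepository userRepository;
    private final TransactionTemplate transactionTemplate;

    public void createUser(User user) {
        // The insert is committed when the template's transaction completes.
        transactionTemplate.execute(status -> {
            userRepository.insert(user);
            return null;
        });
    }
}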

JDBC Spring Data @Transactional not working

I'm using Spring Boot and spring-data-jdbc.
I wrote this repository class:
@Repository
@Transactional(rollbackFor = Exception.class)
public class RecordRepository {

    public RecordRepository() {}

    public void insert(Record record) throws Exception {
        JDBCConfig jdbcConfig = new JDBCConfig();
        SimpleJdbcInsert messageInsert = new SimpleJdbcInsert(jdbcConfig.postgresDataSource());
        messageInsert.withTableName(record.tableName()).execute(record.content());
        throw new Exception();
    }
}
Then I wrote a client class that invokes the insert method
@EnableJdbcRepositories()
@Configuration
public class RecordClient {

    @Autowired
    private RecordRepository repository;

    public void insert(Record r) throws Exception {
        repository.insert(r);
    }
}
I would expect that no records are inserted into the database when RecordClient's insert() method is invoked, because RecordRepository's insert() throws an Exception. Instead, the record is added anyway.
What am I missing?
EDIT: This is the class where I configure my DataSource:
@Configuration
@EnableTransactionManagement
public class JDBCConfig {

    @Bean
    public DataSource postgresDataSource() {
        DriverManagerDataSource dataSource = new DriverManagerDataSource();
        dataSource.setDriverClassName("org.postgresql.Driver");
        dataSource.setUrl("jdbc:postgresql://localhost:5432/db");
        dataSource.setUsername("postgres");
        dataSource.setPassword("root");
        return dataSource;
    }
}
You have to inject your DataSource instead of creating it manually; @Transactional only works for Spring-managed beans. If you create a DataSource instance by calling the constructor yourself (as in new JDBCConfig().postgresDataSource()), you are creating it manually and it is not a Spring-managed bean.
@Repository
@Transactional(rollbackFor = Exception.class)
public class RecordRepository {

    @Autowired
    DataSource dataSource;

    public RecordRepository() {}

    public void insert(Record record) throws Exception {
        SimpleJdbcInsert messageInsert = new SimpleJdbcInsert(dataSource);
        messageInsert.withTableName(record.tableName()).execute(record.contents());
        throw new Exception();
    }
}

Spring data mongodb override configuration

I have a simple Maven project using Spring Data MongoDB, and I need to specify (override) the database connection details. MongoDB runs on localhost with default settings (port 27017). I am trying to use this AppConfig:
@Configuration
@EnableMongoRepositories
public class AppConfig extends AbstractMongoConfiguration {

    @Override
    protected String getDatabaseName() {
        return "TestDatabase";
    }

    @Override
    public Mongo mongo() throws Exception {
        // wrong port on purpose
        return new MongoClient("127.0.0.1", 27007);
    }
}
My main file looks like this
public class MongoApp {

    private static final Log log = LogFactory.getLog(MongoApp.class);

    public static void main(String[] args) throws Exception {
        MongoOperations mongoOps = new MongoTemplate(new MongoClient(), "database");
        mongoOps.insert(new Person("Joe", 34));
        log.info(mongoOps.findOne(new Query(where("name").is("Joe")), Person.class));
        // mongoOps.dropCollection("person");
    }
}
When I run the project, everything works fine, but it should not with this config (the port is deliberately wrong).
Project structure:
main
-AppConfig.java
-MongoApp.java
-Person.java
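A side note on why it "works": main() builds its own MongoTemplate with new MongoClient() (default port 27017) and never loads AppConfig, so the deliberately wrong port is never applied. A rough sketch (class names taken from the code above) of what main() would have to do for AppConfig to take effect:
public class MongoApp {

    public static void main(String[] args) throws Exception {
        // Bootstrap the Java config so mongo() and getDatabaseName() are actually used.
        ApplicationContext ctx = new AnnotationConfigApplicationContext(AppConfig.class);
        MongoOperations mongoOps = ctx.getBean(MongoOperations.class);
        // Now the wrong port (27007) is in effect and this insert should fail to connect.
        mongoOps.insert(new Person("Joe", 34));
    }
}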
I found the solution.
I used this configuration:
@Configuration
public class AppConfig {

    public @Bean MongoDbFactory mongoDbFactory() throws Exception {
        return new SimpleMongoDbFactory(new MongoClient(), "mydb");
    }

    public @Bean MongoTemplate mongoTemplate() throws Exception {
        MongoTemplate mongoTemplate = new MongoTemplate(mongoDbFactory());
        return mongoTemplate;
    }
}
And the main file with an ApplicationContext:
public class MongoApp {

    private static final Log log = LogFactory.getLog(MongoApp.class);

    public static void main(String[] args) throws Exception {
        ApplicationContext ctx = new AnnotationConfigApplicationContext(AppConfig.class);
        MongoOperations mongoOperation = (MongoOperations) ctx.getBean("mongoTemplate");
        mongoOperation.insert(new Person("Joe", 34));
        log.info(mongoOperation.findOne(new Query(where("name").is("Joe")), Person.class));
        // mongoOperation.dropCollection("person");
    }
}

How to make Spring @Transactional work in a JUnit test

I'm trying to make my test work with Spring's @Transactional annotation.
@ContextConfiguration(classes = SomeTest.SomeTestSpringConfig.class)
@RunWith(SpringJUnit4ClassRunner.class)
public class SomeTest {

    @Autowired
    MyBean some;

    @Autowired
    PlatformTransactionManager transactionManager;

    @Test
    public void testSpring() throws Exception {
        some.method();
        assertTrue(some.isTransactionalWorks);
    }

    @EnableAspectJAutoProxy(proxyTargetClass = true)
    @EnableLoadTimeWeaving
    @EnableTransactionManagement(mode = AdviceMode.ASPECTJ)
    @TransactionConfiguration
    static class SomeTestSpringConfig {

        @Bean
        PlatformTransactionManager transactionManager() {
            return new MyTransactionManager(dataSource());
        }

        @Bean
        MyBean some() {
            return new MyBean();
        }

        @Bean
        DataSource dataSource() {
            return new SimpleDriverDataSource(Driver.load(), "jdbc:h2:mem:unit-test");
        }
    }
}
class MyBean {

    @Autowired
    DataSource dataSource;

    public boolean isTransactionalWorks;

    @Transactional
    private void someInTransaction() {
        try {
            dataSource.getConnection();
        } catch (SQLException e) {
            e.printStackTrace();
        }
        System.out.println("I should be in transaction");
    }

    public void method() {
        someInTransaction();
    }
}
class MyTransactionManager implements PlatformTransactionManager, InitializingBean {

    private final DataSourceTransactionManager base = new DataSourceTransactionManager();

    @Autowired
    MyBean some;

    public MyTransactionManager(DataSource datasource) {
        base.setDataSource(datasource);
    }

    @Override
    public TransactionStatus getTransaction(TransactionDefinition definition) throws TransactionException {
        some.isTransactionalWorks = true;
        return base.getTransaction(definition);
    }

    @Override
    public void commit(TransactionStatus status) throws TransactionException {
        base.commit(status);
    }

    @Override
    public void rollback(TransactionStatus status) throws TransactionException {
        base.rollback(status);
    }

    @Override
    public void afterPropertiesSet() throws Exception {
        base.afterPropertiesSet();
    }
}
Also, I added -javaagent:D:/libs/spring-instrument-4.1.7.RELEASE.jar to the VM options for this test.
But it always fails. What did I miss?
Please check this link; I think it is a similar problem to the one you are facing:
How to configure AspectJ with Load Time Weaving without Interface
In that answer he suggests providing both aspectjweaver.jar and spring-instrument.jar as VM arguments.
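For example, the VM options would look something like this (the paths and the aspectjweaver version are placeholders for your local jars):
-javaagent:D:/libs/spring-instrument-4.1.7.RELEASE.jar
-javaagent:D:/libs/aspectjweaver-1.8.7.jar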
Good to know it worked. :)

Spring Batch @BeforeStep does not work with @StepScope

I'm using Spring Batch version 2.2.4.RELEASE
I tried to write a simple example with stateful ItemReader, ItemProcessor and ItemWriter beans.
public class StatefulItemReader implements ItemReader<String> {

    private List<String> list;

    @BeforeStep
    public void initializeState(StepExecution stepExecution) {
        this.list = new ArrayList<>();
    }

    @AfterStep
    public ExitStatus exploitState(StepExecution stepExecution) {
        System.out.println("******************************");
        System.out.println(" READING RESULTS : " + list.size());
        return stepExecution.getExitStatus();
    }

    @Override
    public String read() throws Exception {
        this.list.add("some stateful reading information");
        if (list.size() < 10) {
            return "value " + list.size();
        }
        return null;
    }
}
In my integration test, I'm declaring my beans in an inner static Java config class like the one below:
@ContextConfiguration
@RunWith(SpringJUnit4ClassRunner.class)
public class SingletonScopedTest {

    @Configuration
    @EnableBatchProcessing
    static class TestConfig {

        @Autowired
        private JobBuilderFactory jobBuilder;

        @Autowired
        private StepBuilderFactory stepBuilder;

        @Bean
        JobLauncherTestUtils jobLauncherTestUtils() {
            return new JobLauncherTestUtils();
        }

        @Bean
        public DataSource dataSource() {
            EmbeddedDatabaseBuilder embeddedDatabaseBuilder = new EmbeddedDatabaseBuilder();
            return embeddedDatabaseBuilder.addScript("classpath:org/springframework/batch/core/schema-drop-hsqldb.sql")
                    .addScript("classpath:org/springframework/batch/core/schema-hsqldb.sql")
                    .setType(EmbeddedDatabaseType.HSQL)
                    .build();
        }

        @Bean
        public Job jobUnderTest() {
            return jobBuilder.get("job-under-test")
                    .start(stepUnderTest())
                    .build();
        }

        @Bean
        public Step stepUnderTest() {
            return stepBuilder.get("step-under-test")
                    .<String, String>chunk(1)
                    .reader(reader())
                    .processor(processor())
                    .writer(writer())
                    .build();
        }

        @Bean
        public ItemReader<String> reader() {
            return new StatefulItemReader();
        }

        @Bean
        public ItemProcessor<String, String> processor() {
            return new StatefulItemProcessor();
        }

        @Bean
        public ItemWriter<String> writer() {
            return new StatefulItemWriter();
        }
    }

    @Autowired
    JobLauncherTestUtils jobLauncherTestUtils;

    @Test
    public void testStepExecution() {
        JobExecution jobExecution = jobLauncherTestUtils.launchStep("step-under-test");
        assertEquals(ExitStatus.COMPLETED, jobExecution.getExitStatus());
    }
}
This test passes.
But as soon as I define my StatefulItemReader as a step-scoped bean (which is better for a stateful reader), the "before step" code is no longer executed:
...
@Bean
@StepScope
public ItemReader<String> reader() {
    return new StatefulItemReader();
}
...
And I notice the same issue with my processor and writer beans.
What's wrong with my code? Is it related to this resolved issue: https://jira.springsource.org/browse/BATCH-1230
My whole Maven project with several JUnit tests can be found on GitHub: https://github.com/galak75/spring-batch-step-scope
Thank you in advance for your answers.
When you configure a bean as follows:
@Bean
@StepScope
public MyInterface myBean() {
    return new MyInterfaceImpl();
}
You are telling Spring to use the proxy mode ScopedProxyMode.TARGET_CLASS. However, by returning MyInterface instead of MyInterfaceImpl, the proxy only has visibility into the methods on MyInterface. This prevents Spring Batch from finding the methods on MyInterfaceImpl that have been annotated with listener annotations like @BeforeStep. The correct way to configure this is to return MyInterfaceImpl from your configuration method, as shown below:
@Bean
@StepScope
public MyInterfaceImpl myBean() {
    return new MyInterfaceImpl();
}
We have added a warning log message on startup that points out that, when we look for the annotated listener methods, if the object is proxied and the target is an interface, we won't be able to find annotated methods on the implementing class.
As suggested by pojo-guy, the solution is to implement StepExecutionListener and override its beforeStep method to capture the StepExecution:
@Override
public void beforeStep(StepExecution stepExecution) {
    this.stepExecution = stepExecution;
}
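Applied to the StatefulItemReader from the question, that approach might look roughly like the sketch below; it swaps the @BeforeStep/@AfterStep annotations for the StepExecutionListener callbacks:
public class StatefulItemReader implements ItemReader<String>, StepExecutionListener {

    private List<String> list;

    @Override
    public void beforeStep(StepExecution stepExecution) {
        // Invoked before the step starts.
        this.list = new ArrayList<>();
    }

    @Override
    public ExitStatus afterStep(StepExecution stepExecution) {
        System.out.println(" READING RESULTS : " + list.size());
        return stepExecution.getExitStatus();
    }

    @Override
    public String read() throws Exception {
        this.list.add("some stateful reading information");
        return list.size() < 10 ? "value " + list.size() : null;
    }
}
If the step does not pick the listener up automatically through the scoped proxy, it can also be registered explicitly on the step builder via its listener(...) method.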
