Equivalent of MyBatis XML multiple environments in MyBatis Guice

I'm writing a service that needs to use a different database depending on context (a simple string label). Each database has exactly the same schema. The list of databases is dynamic.
Looking through the MyBatis-Guice documentation on multiple data sources, the example assumes the list of datasources is known upfront and that each datasource has a different mapper. Similarly, a question found here on SO assumes the same requirements.
As stated, my requirements are much more dynamic and fluid. The idea is to have all the currently known databases (with their connection information) in a config and have that parsed at service startup. Then, dependent upon the context of any incoming requests, the code should pull the SqlSessionFactory for the correct database. All downstream code that uses that SqlSessionFactory is exactly the same - i.e. not dependent on request context. Which means the same mappers are used no matter what database is required.
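The startup flow described above (parse per-database connection settings keyed by a context label, then resolve the right settings for each incoming request's label) can be sketched in plain Java. All names here are hypothetical, not part of MyBatis or Guice:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Properties;

// A minimal registry mapping a context label to that database's connection
// properties. In the real service this would feed the per-label Guice modules.
public class DatasourceRegistry {

    private final Map<String, Properties> byLabel = new HashMap<>();

    public void register(String label, Properties connectionProps) {
        byLabel.put(label, connectionProps);
    }

    public Properties forContext(String label) {
        Properties props = byLabel.get(label);
        if (props == null) {
            throw new IllegalArgumentException("Unknown database label: " + label);
        }
        return props;
    }

    public static void main(String[] args) {
        DatasourceRegistry registry = new DatasourceRegistry();
        Properties tenantA = new Properties();
        tenantA.setProperty("JDBC.host", "db-a.example.com");
        registry.register("tenantA", tenantA);
        // Resolve connection settings from the request context's label
        System.out.println(registry.forContext("tenantA").getProperty("JDBC.host"));
    }
}
```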
My MyBatis and Guice knowledge is admittedly very new and limited. However, I've not been able to google anything that shows the MyBatis-Guice equivalent to the multiple environment approach supported by the XML configuration of MyBatis.

I managed to come up with a solution that works for me, so thought I'd share it here. The decision to use Guice had already been made, so there was no wriggle room there.
First, I wrote a MyBatis Guice module for registering a single datasource. It is a PrivateModule so that all the MyBatis classes that get registered for one datasource do not conflict with other registrations for other datasources. It makes use of an internal MyBatisModule implementation because Java doesn't support multiple inheritance. Meaning we can't do public class MyMyBatisModule extends PrivateModule, MyBatisModule {...}.
public class MyMyBatisModule extends PrivateModule {

    private final String datasourceLabel;
    private final Properties datasourceProperties;
    private List<Key<?>> exposedKeys = new ArrayList<Key<?>>();

    public MyMyBatisModule( String datasourceLabel, Properties datasourceProperties ) {
        this.datasourceLabel = datasourceLabel;
        this.datasourceProperties = datasourceProperties;
    }

    @Override
    protected void configure() {
        install( new InternalMyMyBatisModule() );
        for( Key<?> key : exposedKeys ) {
            expose( key );
        }
    }

    private class InternalMyMyBatisModule extends MyBatisModule {

        @Override
        protected void initialize() {
            environmentId( datasourceLabel );
            Names.bindProperties( binder(), datasourceProperties );
            install( JdbcHelper.MySQL ); // See JdbcHelper commentary below
            bindDataSourceProviderType( C3p0DataSourceProvider.class ); // Choose whichever one you want
            bindTransactionFactoryType( JdbcTransactionFactory.class );

            // Register your mapper classes here. These mapper classes will have their
            // keys exposed from the PrivateModule
            //
            // i.e.
            //
            // exposedKeys.add( registerMapper( FredMapper.class ) );
            // exposedKeys.add( registerMapper( GingerMapper.class ) );
        }

        private <T> Key<T> registerMapper( Class<T> mapperClass ) {
            Key<T> key = Key.get( mapperClass, Names.named( datasourceLabel ) );
            bind( key ).to( mapperClass );
            addMapperClass( mapperClass );
            return key;
        }
    }
}
JdbcHelper.MySQL: I've used JdbcHelper.MySQL as a shortcut to map properties to the connection string, and to use com.mysql.jdbc.Driver as the JDBC driver. It's declared as:
MySQL("jdbc:mysql://${JDBC.host|localhost}:${JDBC.port|3306}/${JDBC.schema}", "com.mysql.jdbc.Driver"),
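The `${key|default}` placeholder style used in that connection-string template can be illustrated with a small stand-alone substitution sketch. This is a simplified stand-in for illustration only, not the actual mybatis-guice JdbcHelper implementation:

```java
import java.util.Properties;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Expands ${key|default} placeholders in a template: use the property value
// if present, otherwise fall back to the default after the '|'.
public class PlaceholderDemo {

    static final Pattern PLACEHOLDER = Pattern.compile("\\$\\{([^}|]+)(?:\\|([^}]*))?\\}");

    static String substitute(String template, Properties props) {
        Matcher m = PLACEHOLDER.matcher(template);
        StringBuffer sb = new StringBuffer();
        while (m.find()) {
            String fallback = m.group(2) == null ? "" : m.group(2);
            String value = props.getProperty(m.group(1), fallback);
            m.appendReplacement(sb, Matcher.quoteReplacement(value));
        }
        m.appendTail(sb);
        return sb.toString();
    }

    public static void main(String[] args) {
        Properties props = new Properties();
        props.setProperty("JDBC.schema", "mydb");
        // JDBC.host and JDBC.port are absent, so their defaults apply
        System.out.println(substitute(
                "jdbc:mysql://${JDBC.host|localhost}:${JDBC.port|3306}/${JDBC.schema}", props));
    }
}
```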
Now it's time to register all your datasources. MyBatisModules handles this for us. It requires a map of datasourceLabel to JDBC properties.
public class MyBatisModules extends AbstractModule {

    private Map<String, Properties> connectionsProperties;

    public MyBatisModules( Map<String, Properties> connectionsProperties ) {
        this.connectionsProperties = connectionsProperties; // consider a deep copy if appropriate
    }

    @Override
    protected void configure() {
        for( Entry<String, Properties> datasourceConnectionProperties : this.connectionsProperties.entrySet() ) {
            install( new MyMyBatisModule( datasourceConnectionProperties.getKey(), datasourceConnectionProperties.getValue() ) );
        }

        bind( MapperRetriever.class ); // See MapperRetriever later

        // bind your DAO classes here. By wrapping MyBatis Mapper use in DAO implementations, theoretically we
        // can fairly easily change from MyBatis to any other database library just by changing the DAO implementation.
        // The rest of our codebase would remain the same.
        //
        // i.e.
        //
        // bind( FredDao.class ).to( FredDaoMyBatis.class );
        // bind( GingerDao.class ).to( GingerDaoMyBatis.class );
    }
}
Now we just need some way of getting the right Mapper class (which itself is associated with the right datasource). To do this, we actually need to call a method on the Guice Injector. I don't really like the idea of passing that around, so I wrapped it in MapperRetriever. You need to implement a retrieval method for each of your Mappers.
public class MapperRetriever {

    private final Injector injector;

    @Inject
    public MapperRetriever( Injector injector ) {
        this.injector = injector;
    }

    // The following two methods use the example Mappers referenced in the MyMyBatisModule implementation above

    public FredMapper getFredMapper( String datasourceLabel ) {
        return this.injector.getInstance( Key.get( FredMapper.class, Names.named( datasourceLabel ) ) );
    }

    public GingerMapper getGingerMapper( String datasourceLabel ) {
        return this.injector.getInstance( Key.get( GingerMapper.class, Names.named( datasourceLabel ) ) );
    }
}
And an example DAO implementation ...
public interface FredDao {
    Fred selectFred( String datasourceLabel, String fredId );
}

public class FredDaoMyBatis implements FredDao {

    private MapperRetriever mapperRetriever;

    @Inject
    public FredDaoMyBatis( MapperRetriever mapperRetriever ) {
        this.mapperRetriever = mapperRetriever;
    }

    @Override
    public Fred selectFred( String datasourceLabel, String fredId ) {
        FredMapper fredMapper = this.mapperRetriever.getFredMapper( datasourceLabel );
        return fredMapper.getFred( fredId );
    }
}

You can also create a custom SqlSessionFactoryProvider which returns a SqlSessionFactory that delegates to the correct DataSource's SqlSessionFactory, using a ThreadLocal to determine the underlying SqlSessionFactory.
public class DelegatingSqlSessionFactory implements SqlSessionFactory {

    private final Map<String, SqlSessionFactory> factories = new HashMap<>();

    public DelegatingSqlSessionFactory(Map<String, DataSource> dataSources) throws ClassNotFoundException {
        dataSources.forEach((key, ds) -> {
            // createSqlSessionFactory(...) builds an ordinary SqlSessionFactory
            // around the given DataSource (implementation not shown here)
            factories.put(key, createSqlSessionFactory(ds));
        });
    }

    private SqlSessionFactory delegate() {
        // Read from a ThreadLocal to determine the correct SqlSessionFactory key
        String key = findKey();
        return factories.get(key);
    }

    @Override
    public SqlSession openSession() {
        return delegate().openSession();
    }

    @Override
    public SqlSession openSession(boolean autoCommit) {
        return delegate().openSession(autoCommit);
    }

    @Override
    public SqlSession openSession(Connection connection) {
        return delegate().openSession(connection);
    }

    @Override
    public SqlSession openSession(TransactionIsolationLevel level) {
        return delegate().openSession(level);
    }

    @Override
    public SqlSession openSession(ExecutorType execType) {
        return delegate().openSession(execType);
    }

    @Override
    public SqlSession openSession(ExecutorType execType, boolean autoCommit) {
        return delegate().openSession(execType, autoCommit);
    }

    @Override
    public SqlSession openSession(ExecutorType execType, TransactionIsolationLevel level) {
        return delegate().openSession(execType, level);
    }

    @Override
    public SqlSession openSession(ExecutorType execType, Connection connection) {
        return delegate().openSession(execType, connection);
    }

    @Override
    public Configuration getConfiguration() {
        return delegate().getConfiguration();
    }
}
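The ThreadLocal lookup that `findKey()` alludes to can be sketched as a small stand-alone demo. Plain strings stand in for the SqlSessionFactory instances, and all names here are hypothetical:

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of ThreadLocal-based delegate selection: each request thread sets
// its database label, and delegate() picks the matching entry from the map.
public class DelegateByThreadLocal {

    // Holds the current request's database label for this thread.
    static final ThreadLocal<String> CURRENT_LABEL = new ThreadLocal<>();

    // Stand-in for Map<String, SqlSessionFactory>.
    static final Map<String, String> FACTORIES = new HashMap<>();

    static String delegate() {
        String key = CURRENT_LABEL.get();
        String factory = FACTORIES.get(key);
        if (factory == null) {
            throw new IllegalStateException("No factory for label: " + key);
        }
        return factory;
    }

    public static void main(String[] args) {
        FACTORIES.put("tenantA", "factory-for-tenantA");
        CURRENT_LABEL.set("tenantA");
        try {
            System.out.println(delegate());
        } finally {
            CURRENT_LABEL.remove(); // avoid leaking state across pooled threads
        }
    }
}
```

The `finally { remove(); }` step matters in a servlet container, where worker threads are pooled and a stale label would otherwise leak into the next request.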

Related

How to register non-standardized SQL functions manually in a Spring Boot application?

I'm using JPA query in my current spring-boot project. How can I add non-standardized SQL functions like GROUP_CONCAT?
Prior to this, my previous problem was:
How to show a column result in a one line comma separated list in JPA query
I found that GROUP_CONCAT is not a registered function in JPA queries, but it can be accessed by registering it manually. I already tried the following links, but they didn't work for me:
How to add non-standardized sql functions in Spring Boot application?
Registering a SQL function with JPA and Hibernate
https://thoughts-on-java.org/database-functions/
https://vladmihalcea.com/hibernate-sql-function-jpql-criteria-api-query/
1.
public class SqlFunctionsMetadataBuilderContributor
        implements MetadataBuilderContributor {

    @Override
    public void contribute(MetadataBuilder metadataBuilder) {
        metadataBuilder.applySqlFunction(
            "group_concat",
            new StandardSQLFunction(
                "group_concat",
                StandardBasicTypes.STRING
            )
        );
    }
}
2.
public String render(Type firstArgumentType, List arguments, SessionFactoryImplementor factory)
        throws QueryException {
    if (arguments.size() < 1) {
        throw new QueryException(new IllegalArgumentException("group_concat should have at least one arg"));
    }
    StringBuilder builder = new StringBuilder();
    if (arguments.size() > 1 && arguments.get(0).equals("'distinct'")) {
        builder.append("distinct ");
        builder.append(arguments.get(1));
    } else {
        builder.append(arguments.get(0));
    }
    return "group_concat(" + builder.toString() + ")";
}
3.
@Configuration
public class DataSource {

    @Bean
    public JpaVendorAdapter jpaVendorAdapter() {
        AbstractJpaVendorAdapter adapter = new HibernateJpaVendorAdapter();
        adapter.setShowSql(true);
        adapter.setDatabase(Database.MYSQL);
        // fully-qualified name of the custom MySQL dialect class
        adapter.setDatabasePlatform("com.myprojet.admin.configuration.RegisterSqlFunction");
        adapter.setGenerateDdl(false);
        return adapter;
    }
}
public RegisterSqlFunction() {
    super();
    registerFunction("group_concat", new StandardSQLFunction("group_concat",
            StandardBasicTypes.STRING));
}
I expect to be able to use group_concat in a JPA query.
You can find a fully functional example in my High-Performance Java Persistence GitHub repository.
In your case, you don't need to customize the JpaPlatform. That should be set to the HibernateJpaPlatform.
You can register the MetadataBuilderContributor either programmatically or via the application.properties configuration file:
hibernate.metadata_builder_contributor=com.vladmihalcea.book.hpjp.SqlFunctionsMetadataBuilderContributor
Create a class and add the MySQL function you need to use in the overridden method:
public class SqlFunctionsMetadataBuilderContributor implements MetadataBuilderContributor {

    @Override
    public void contribute(MetadataBuilder metadataBuilder) {
        metadataBuilder.applySqlFunction(
            "group_concat",
            new StandardSQLFunction(
                "group_concat",
                StandardBasicTypes.STRING
            )
        );
    }
}
After that, provide your metadata_builder_contributor via application.properties:
spring.jpa.properties.hibernate.metadata_builder_contributor = qualifiedClassName
In case someone is having issues when registering this in a SpringBoot app this is the right way:
Create a class that implements: MetadataBuilderContributor interface.
package com.application.config;

public class SqlFunctionsMetadataBuilderContributor implements MetadataBuilderContributor {

    @Override
    public void contribute(MetadataBuilder metadataBuilder) {
        metadataBuilder.applySqlFunction(
            "STRING_AGG",
            new StandardSQLFunction(
                "STRING_AGG",
                StandardBasicTypes.STRING
            )
        );
    }
}
In your application .yml (or .properties) refer to the previously created class in the following properties path: spring.jpa.properties.hibernate.metadata_builder_contributor
spring:
  jpa:
    properties:
      hibernate:
        metadata_builder_contributor: com.application.config.SqlFunctionsMetadataBuilderContributor

Spring Data: default 'not deleted' logic for automatic method-based queries when using soft-delete policy

Let's say we use soft-delete policy: nothing gets deleted from the storage; instead, a 'deleted' attribute/column is set to true on a record/document/whatever to make it 'deleted'. Later, only non-deleted entries should be returned by query methods.
Let's take MongoDB as an example (although JPA is also interesting).
For standard methods defined by MongoRepository, we can extend the default implementation (SimpleMongoRepository), override the methods of interest and make them ignore 'deleted' documents.
But, of course, we'd also like to use custom query methods like
List<Person> findByFirstName(String firstName)
In a soft-delete environment, we are forced to do something like
List<Person> findByFirstNameAndDeletedIsFalse(String firstName)
or write queries manually with @Query (adding the same boilerplate condition about 'not deleted' all the time).
Here comes the question: is it possible to add this 'non-deleted' condition to any generated query automatically? I did not find anything in the documentation.
I'm looking at Spring Data (Mongo and JPA) 2.1.6.
Similar questions
Query interceptor for spring-data-mongodb for soft deletions: here they suggest Hibernate's @Where annotation, which only works for JPA+Hibernate, and it is not clear how to override it if you still need to access deleted items in some queries.
Handling soft-deletes with Spring JPA: here people either suggest the same @Where-based approach, or the solution's applicability is limited to the already-defined standard methods, not the custom ones.
It turns out that for Mongo (at least, for spring-data-mongo 2.1.6) we can hack into standard QueryLookupStrategy implementation to add the desired 'soft-deleted documents are not visible by finders' behavior:
public class SoftDeleteMongoQueryLookupStrategy implements QueryLookupStrategy {

    private final QueryLookupStrategy strategy;
    private final MongoOperations mongoOperations;

    public SoftDeleteMongoQueryLookupStrategy(QueryLookupStrategy strategy,
            MongoOperations mongoOperations) {
        this.strategy = strategy;
        this.mongoOperations = mongoOperations;
    }

    @Override
    public RepositoryQuery resolveQuery(Method method, RepositoryMetadata metadata, ProjectionFactory factory,
            NamedQueries namedQueries) {
        RepositoryQuery repositoryQuery = strategy.resolveQuery(method, metadata, factory, namedQueries);

        // revert to the standard behavior if requested
        if (method.getAnnotation(SeesSoftlyDeletedRecords.class) != null) {
            return repositoryQuery;
        }

        if (!(repositoryQuery instanceof PartTreeMongoQuery)) {
            return repositoryQuery;
        }
        PartTreeMongoQuery partTreeQuery = (PartTreeMongoQuery) repositoryQuery;

        return new SoftDeletePartTreeMongoQuery(partTreeQuery);
    }

    private Criteria notDeleted() {
        return new Criteria().orOperator(
                where("deleted").exists(false),
                where("deleted").is(false)
        );
    }

    private class SoftDeletePartTreeMongoQuery extends PartTreeMongoQuery {

        SoftDeletePartTreeMongoQuery(PartTreeMongoQuery partTreeQuery) {
            super(partTreeQuery.getQueryMethod(), mongoOperations);
        }

        @Override
        protected Query createQuery(ConvertingParameterAccessor accessor) {
            Query query = super.createQuery(accessor);
            return withNotDeleted(query);
        }

        @Override
        protected Query createCountQuery(ConvertingParameterAccessor accessor) {
            Query query = super.createCountQuery(accessor);
            return withNotDeleted(query);
        }

        private Query withNotDeleted(Query query) {
            return query.addCriteria(notDeleted());
        }
    }
}
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.METHOD)
public @interface SeesSoftlyDeletedRecords {
}
We just add an 'and not deleted' condition to all the queries unless @SeesSoftlyDeletedRecords asks us to avoid it.
Then, we need the following infrastructure to plug in our QueryLookupStrategy implementation:
public class SoftDeleteMongoRepositoryFactory extends MongoRepositoryFactory {

    private final MongoOperations mongoOperations;

    public SoftDeleteMongoRepositoryFactory(MongoOperations mongoOperations) {
        super(mongoOperations);
        this.mongoOperations = mongoOperations;
    }

    @Override
    protected Optional<QueryLookupStrategy> getQueryLookupStrategy(QueryLookupStrategy.Key key,
            QueryMethodEvaluationContextProvider evaluationContextProvider) {
        Optional<QueryLookupStrategy> optStrategy = super.getQueryLookupStrategy(key,
                evaluationContextProvider);
        return optStrategy.map(this::createSoftDeleteQueryLookupStrategy);
    }

    private SoftDeleteMongoQueryLookupStrategy createSoftDeleteQueryLookupStrategy(QueryLookupStrategy strategy) {
        return new SoftDeleteMongoQueryLookupStrategy(strategy, mongoOperations);
    }
}

public class SoftDeleteMongoRepositoryFactoryBean<T extends Repository<S, ID>, S, ID extends Serializable>
        extends MongoRepositoryFactoryBean<T, S, ID> {

    public SoftDeleteMongoRepositoryFactoryBean(Class<? extends T> repositoryInterface) {
        super(repositoryInterface);
    }

    @Override
    protected RepositoryFactorySupport getFactoryInstance(MongoOperations operations) {
        return new SoftDeleteMongoRepositoryFactory(operations);
    }
}
Then we just need to reference the factory bean in an @EnableMongoRepositories annotation like this:
@EnableMongoRepositories(repositoryFactoryBeanClass = SoftDeleteMongoRepositoryFactoryBean.class)
If it is required to determine dynamically whether a particular repository needs to be 'soft-delete' or a regular 'hard-delete' repository, we can introspect the repository interface (or the domain class) and decide whether we need to change the QueryLookupStrategy or not.
As for JPA, this approach does not work without rewriting (possibly duplicating) a substantial part of the code in PartTreeJpaQuery.

In Spring Security (Spring Boot 2.x) how do I provide my own implementation for @PreAuthorize/@PostAuthorize and checking roles?

I want to use the built-in Spring @PreAuthorize annotations and hasRole, hasAnyRole, etc. but have the Spring classes call my implementation to determine if it should be true/false. How would I do this?
Is there a configuration in WebSecurityConfigurerAdapter I can override?
Do I need to implement a SecurityExpressionRoot class? And if so, where do I tell it to use mine?
I tried overriding the access decision manager and adding my own voter, but even though it calls my method and I return true (and it's an AffirmativeBased manager), it still goes to SecurityExpressionRoot.hasAnyRole() which then returns false.
public class MyDecisionVoter implements AccessDecisionVoter<Object> {

    @Override
    public boolean supports(ConfigAttribute attribute) {
        // We want to always be called
        return true;
    }

    @Override
    public boolean supports(Class<?> clazz) {
        // We want to always be called
        return true;
    }

    @Override
    public int vote(Authentication authentication, Object object, Collection<ConfigAttribute> attributes) {
        // For testing purposes
        return ACCESS_GRANTED;
    }
}
The manager
public class MyAffirmativeBasedDecisionManager extends AffirmativeBased {

    public MyAffirmativeBasedDecisionManager(List<AccessDecisionVoter<?>> decisionVoters) {
        super( decisionVoters );
    }

    @Override
    public boolean supports(Class<?> clazz) {
        for ( AccessDecisionVoter<?> voter : this.getDecisionVoters() ) {
            if ( voter.supports( clazz ) ) {
                return true;
            }
        }
        return false;
    }
}
Configuring
@EnableWebSecurity
@EnableGlobalMethodSecurity( prePostEnabled = true )
public class MyConfig extends WebSecurityConfigurerAdapter {

    @Override
    protected void configure(HttpSecurity http) throws Exception {
        // Turn on OAuth
        http.authorizeRequests()
            .anyRequest()
            .authenticated()
            .accessDecisionManager( createDecisionManager() );
    }

    private AccessDecisionManager createDecisionManager() {
        List<AccessDecisionVoter<? extends Object>> decisionVoters = new ArrayList<>();
        ExpressionBasedPreInvocationAdvice expressionAdvice = new ExpressionBasedPreInvocationAdvice();
        expressionAdvice.setExpressionHandler( new DefaultMethodSecurityExpressionHandler() );
        decisionVoters.add( new MyDecisionVoter() );
        decisionVoters.add( new PreInvocationAuthorizationAdviceVoter( expressionAdvice ) );
        decisionVoters.add( new RoleVoter() );
        decisionVoters.add( new AuthenticatedVoter() );
        return new MyAffirmativeBasedDecisionManager( decisionVoters );
    }
}
This should let them in, but it fails with a 403:
@GetMapping( "shouldallow" )
@ResponseBody
@PreAuthorize( "hasRole('ROLE_NOT_EXIST')" )
public String shouldAllow() {
    return "should allow";
}
If you want to have @PreAuthorize("hasRole('USER')") call your method instead of the one in SecurityExpressionRoot, then, yes, you'd need to replace that by exposing your own MethodSecurityExpressionHandler as a bean. You'd override its createSecurityExpressionRoot method:
class MyExpressionHandler extends DefaultMethodSecurityExpressionHandler {
    @Override
    protected MethodSecurityExpressionOperations
            createSecurityExpressionRoot(Authentication a, MethodInvocation mi) {
        return new MyRoot(super.createSecurityExpressionRoot(a, mi));
    }
}
@EnableGlobalMethodSecurity(prePostEnabled = true)
class UsingCustomExpressionHandler extends GlobalMethodSecurityConfiguration {
    @Override
    protected MethodSecurityExpressionHandler createExpressionHandler() {
        return new MyExpressionHandler();
    }
}
BUT, there are less invasive things that you can try first.
Using a Bean
For example, you can refer to any of your own beans inside a SpEL expression. So, if you created a @Bean that can perform the evaluation, then you don't need to call hasRole. Instead, you can just do:
@PreAuthorize("@myBean.evaluate(authentication)")
It gives you a lot of flexibility to do whatever you need to do with the Authentication to determine access.
Mapping Authorities
Or, you can consider mapping whatever you have in the way of custom roles into a set of GrantedAuthoritys. Several of the authentication mechanisms in Spring Security come with a way to map custom authority representations.
For example, I noticed your comment // Turn on OAuth. If you want to override hasRole because of OAuth scopes, you can use oauth2ResourceServer() and supply a custom JwtAuthenticationConverter to adapt your custom authorities into GrantedAuthority instances. In that case, hasRole might not need to be overridden at all. (Of course, I don't know what your specific situation is regarding how you are authenticating the user. This is just one example of GrantedAuthority conversion among many.)
It might help:
public class AuthorizationService {
    public boolean hasAccess(Object obj) {
        // your code here
    }
}

@GetMapping( "/someurl" )
@PreAuthorize("@authorizationService.hasAccess(#obj)")
public void dummyMethod(@PathVariable("obj") Object obj) {
}
If you define the permission based on some object, you can pass it directly to the method; otherwise you can ignore the argument in the hasAccess method. You also need to register AuthorizationService as a bean.

SequenceGenerator to SequenceStyleGenerator moving from hibernate 4.2 to 5

I've recently upgraded my project to use Hibernate 5, where it was earlier using Hibernate 4.2. We have database sequences with the naming convention "SEQ_PrimaryKeyName". In Hibernate 4.2 we were using org.hibernate.id.SequenceGenerator to generate sequence values - the code looks like this -
public class PrimaryKeyGenerator extends IdentityGenerator implements Configurable {

    private SequenceGenerator pkGen;

    public PrimaryKeyGenerator() {
        pkGen = new SequenceGenerator();
    }

    // configure the sequence generator
    public void configure(Type type, Properties params, Dialect dialect) throws MappingException {
        if (pkGen instanceof Configurable) {
            String seqName = "SEQ_" + params.getProperty(PersistentIdentifierGenerator.PK);
            params.setProperty(SequenceGenerator.SEQUENCE, seqName);
            ((Configurable) pkGen).configure(type, params, dialect);
        }
    }

    // generate the sequence value
    public Serializable generate(SessionImplementor session, Object obj) throws HibernateException {
        return pkGen.generate(session, obj);
    }
}
SequenceGenerator is deprecated in hibernate 5 and as per javadoc it is recommended to use org.hibernate.id.enhanced.SequenceStyleGenerator.
I modified my existing PrimaryKeyGenerator class to following
public class PrimaryKeyGenerator extends IdentityGenerator implements Configurable {

    private SequenceStyleGenerator pkGen;

    public PrimaryKeyGenerator() {
        pkGen = new SequenceStyleGenerator();
    }

    @Override
    public void configure(Type type, Properties params, ServiceRegistry serviceRegistry) throws MappingException {
        if (pkGen instanceof Configurable) {
            String seqName = "SEQ_" + params.getProperty(PersistentIdentifierGenerator.PK);
            params.setProperty(SequenceStyleGenerator.SEQUENCE_PARAM, seqName);
            ((Configurable) pkGen).configure(type, params, serviceRegistry);
        }
    }

    public Serializable generate(SessionImplementor session, Object obj) throws HibernateException {
        // obtain the org.hibernate.boot.model.relational.Database instance
        Database db = MetadataExtractor.INSTANCE.getDatabase();
        pkGen.registerExportables(db);
        return pkGen.generate(session, obj);
    }
}
With the above changes, sequences are getting generated properly. The one question I have: is it all right to call the registerExportables() method before calling the generate() method? I am not sure, but it looks like registerExportables() should be called only once, and not every time generate() is called. If I don't call registerExportables() explicitly, I get the following exception:
"SequenceStyleGenerator's SequenceStructure was not properly initialized"
Maybe the way I am trying to use SequenceStyleGenerator is not correct.

Making a "TransactionAction" inner class

I've got a Spring + Hibernate + MySQL web application, which is just a hello-world-test-area for now.
One of my Service classes implements this method:
public List<Offerta> tutte() {
    List<Offerta> tutte = null;
    TransactionStatus status = txm.getTransaction( new DefaultTransactionDefinition() );
    try {
        tutte = dao.getAll(Offerta.class);
        txm.commit(status);
    } catch (Exception e) {
        e.printStackTrace();
        txm.rollback(status);
    }
    return tutte;
}
'txm' is an injected PlatformTransactionManager.
What I want now, is to avoid duplicating the "wrapping" transaction code in all my service's methods!
I would like something like this:
someHelperTransactionClass.doThisInTransaction(new TransactionAction() {
    // database code here, e.g.
    List l = dao.queryForSomething();
});
But that's an anonymous inner class: how can I pass data in and out of it? I mean, how can I get the resulting "l" list from that TransactionAction? You could answer this specific case in a number of ways, but what I need is a generic TransactionAction, or a different solution that lets me write the actual database code without writing the same boring boilerplate each time.
Please do not answer "Why don't you use @Transactional annotations or AOP tx:advice configuration?" because I CAN'T!
Why? I am on Google App Engine, and those cool guys are not so cool: they disabled access to the javax.naming package, and something in those great declarative transaction mechanisms touches it. :-\
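The "how do I get data out" problem has a direct answer: make TransactionAction generic so the action returns a value and the helper passes it through after commit. Here is a self-contained sketch; the begin/commit/rollback hooks are stubs standing in for the injected PlatformTransactionManager calls, and the names are hypothetical:

```java
import java.util.Arrays;
import java.util.List;

// Generic execute-around helper: the callback returns T, so callers get
// their query result back without any shared mutable state.
public class TransactionHelper {

    interface TransactionAction<T> {
        T doInTransaction();
    }

    // Stub lifecycle hooks; the real code would call
    // txm.getTransaction(...), txm.commit(status), txm.rollback(status).
    private void begin() { }
    private void commit() { }
    private void rollback() { }

    public <T> T doThisInTransaction(TransactionAction<T> action) {
        begin();
        try {
            T result = action.doInTransaction();
            commit();
            return result;
        } catch (RuntimeException e) {
            rollback();
            throw e;
        }
    }

    public static void main(String[] args) {
        TransactionHelper helper = new TransactionHelper();
        // The "dao.queryForSomething()" call from the question would go here.
        List<String> l = helper.doThisInTransaction(() -> Arrays.asList("a", "b"));
        System.out.println(l);
    }
}
```

This is the same execute-around shape as Spring's own TransactionTemplate/TransactionCallback pair, just without the Spring dependency.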
You can imitate a basic AOP mechanism using Proxy objects, as described at http://www.devx.com/Java/Article/21463/1954.
This is a mock, and I really doubt it plays well with Spring or GAE. Please note that you need to use interfaces for Proxies.
interface Dao {
    List<Foo> getAllFoo();
}

public class MyDao implements Dao {

    public MyDao() {
    }

    public List<Foo> getAllFoo() {
        // .. get list of foo from database. No need to use transactions
    }

    public static void main(String[] args) {
        Dao dao = new MyDao();
        InvocationHandler handler = new TransactionProxyHandler(dao);
        Dao proxy = (Dao) Proxy.newProxyInstance(MyDao.class.getClassLoader(), MyDao.class.getInterfaces(), handler);
        List<Foo> all = proxy.getAllFoo();
    }
}

class TransactionProxyHandler implements InvocationHandler {

    protected Object delegate;
    // stand-in for an injected transaction manager (PlatformTransactionManager
    // is an interface, so real code must inject an implementation)
    PlatformTransactionManager txm;

    public TransactionProxyHandler(Object delegate) {
        this.delegate = delegate;
    }

    public Object invoke(Object proxy, Method method, Object[] args) throws Throwable {
        TransactionStatus status = txm.getTransaction(new DefaultTransactionDefinition());
        Object res = null;
        try {
            res = method.invoke(delegate, args);
            txm.commit(status);
        } catch (Exception e) {
            e.printStackTrace();
            txm.rollback(status);
        }
        return res;
    }
}
