SparkJava JOOQ dependency injection with Guice - java

I am writing a simple CRUD application which fetches person records from a database, using the SparkJava framework. I have working code which fetches the records, but I would like to extract the JOOQ DSLContext setup, initialize it in another class, and inject it as a bean in order to have cleaner code. I'm not sure how to achieve that. Here's the main method, which currently holds everything:
public static void main(String[] args) throws IOException {
    final BasicDataSource ds = new BasicDataSource();
    final Properties properties = new Properties();
    properties.load(BankApiApplication.class.getResourceAsStream("/application.properties"));
    ds.setDriverClassName(properties.getProperty("db.driver"));
    ds.setUrl(properties.getProperty("db.url"));
    ds.setUsername(properties.getProperty("db.username"));
    ds.setPassword(properties.getProperty("db.password"));

    final ConnectionProvider cp = new DataSourceConnectionProvider(ds);
    final Configuration configuration = new DefaultConfiguration()
            .set(cp)
            .set(SQLDialect.H2)
            .set(new ThreadLocalTransactionProvider(cp, true));
    final DSLContext ctx = DSL.using(configuration);

    final JSONFormat format = new JSONFormat().format(true).header(false).recordFormat(JSONFormat.RecordFormat.OBJECT);

    port(8080);

    get("/persons", (request, response) -> {
        return ctx.select().from(Person.PERSON).fetch().formatJSON();
    });
}
How could I extract the code which initializes the DataSource and configures the DSLContext, so that I could just inject a DSLContext (or some kind of DSLContextHolder) and do the querying?

So, in general, you want to inject the highest-level object you can. This is related to the Law of Demeter, which in short says that a component can know about its direct dependencies, but it shouldn't know about those dependencies' dependencies.
In your case, you're really only using DSLContext (ctx). [A note here: your code has a lot of two-letter names - it's pretty hard to follow. It would be easier if you wrote out e.g. ctx -> dslContext, cp -> connectionProvider]. This means you really only want your method to know about the DSLContext, not its dependencies. Therefore, it would be good to pull the following out into a module, then inject just a DSLContext:
Configuration
ConnectionProvider
Properties
BasicDataSource
If all these things are only used in this one main(), you can write a single Provider to return a DSLContext. If some of these are used in multiple places (for more than instantiating this main()'s DSLContext), then they can go in their own Providers. For example, here's what a Provider for a DSLContext would look like, if Configuration was placed in its own Provider:
public class MyModule extends AbstractModule {

    // other providers
    // ...

    @Provides
    @Singleton
    public DSLContext dslContext(Configuration configuration) {
        return DSL.using(configuration);
    }
}
Then, in your main(), you would write:
Injector injector = Guice.createInjector(new MyModule());
DSLContext dslContext = injector.getInstance(DSLContext.class);
// ... use it
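If everything really is only needed to build the DSLContext, a single module with a few @Provides methods is enough. Here's a rough sketch of what that could look like, assuming commons-dbcp2 for BasicDataSource and the property names from the question; PersistenceModule is just a placeholder name:

import java.io.IOException;
import java.io.InputStream;
import java.util.Properties;

import javax.sql.DataSource;

import org.apache.commons.dbcp2.BasicDataSource;
import org.jooq.Configuration;
import org.jooq.ConnectionProvider;
import org.jooq.DSLContext;
import org.jooq.SQLDialect;
import org.jooq.impl.DSL;
import org.jooq.impl.DataSourceConnectionProvider;
import org.jooq.impl.DefaultConfiguration;
import org.jooq.impl.ThreadLocalTransactionProvider;

import com.google.inject.AbstractModule;
import com.google.inject.Provides;
import com.google.inject.Singleton;

public class PersistenceModule extends AbstractModule {

    @Override
    protected void configure() {
        // nothing to bind explicitly; everything comes from the @Provides methods
    }

    @Provides
    @Singleton
    DataSource dataSource() {
        // Loads the same application.properties the original main() used.
        Properties properties = new Properties();
        try (InputStream in = PersistenceModule.class.getResourceAsStream("/application.properties")) {
            properties.load(in);
        } catch (IOException e) {
            throw new IllegalStateException("Could not load application.properties", e);
        }
        BasicDataSource dataSource = new BasicDataSource();
        dataSource.setDriverClassName(properties.getProperty("db.driver"));
        dataSource.setUrl(properties.getProperty("db.url"));
        dataSource.setUsername(properties.getProperty("db.username"));
        dataSource.setPassword(properties.getProperty("db.password"));
        return dataSource;
    }

    @Provides
    @Singleton
    Configuration jooqConfiguration(DataSource dataSource) {
        ConnectionProvider connectionProvider = new DataSourceConnectionProvider(dataSource);
        return new DefaultConfiguration()
                .set(connectionProvider)
                .set(SQLDialect.H2)
                .set(new ThreadLocalTransactionProvider(connectionProvider, true));
    }

    @Provides
    @Singleton
    DSLContext dslContext(Configuration configuration) {
        return DSL.using(configuration);
    }
}

With that in place, the route handler only ever sees the DSLContext, and the datasource details stay inside the module.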

Related

How to access configuration in a Lagom service at startup?

I am migrating my current Spring/J2EE app to Lagom. I am working in Java. I need to read variables from the configuration (application.conf in the resources folder). In the implementation module, I try to inject the configuration as a class variable like this:
@Inject
private Configuration config;
but when I access this config object in the constructor, it gives a NullPointerException.
The whole code is like this:
import play.Configuration;

public class SomeServiceImpl implements SomeService {

    @Inject
    private Configuration config;

    public SomeServiceImpl() {
        // getting configuration from application.conf
        // gives exception as config is null
        String key = config.getString("key");
    }

    @Override
    public ServiceCall<Request, Response> send() {
        // works here, does not give exception
        String key = config.getString("key");
    }
}
Sorry, I should have been clear from the beginning; I have edited the original question. I get a NullPointerException when I try to read from the configuration object in the constructor, but I am able to use it in the service call implementation. I want some way to access the configuration in application.conf at startup and possibly store it in some config class which can be accessed anywhere later.
In Java, when an object is instantiated, the first thing that happens (before anything else can possibly happen) is that the constructor is invoked. After that, frameworks like Guice (which Lagom uses) are free to inject things, but they can't do so until the constructor has been invoked. So all your @Inject-annotated fields will be null when the constructor runs, and there is nothing you can do to work around that.
So don't use field injection; use constructor injection, e.g.:
import play.Configuration;

public class SomeServiceImpl implements SomeService {

    private final Configuration config;

    @Inject
    public SomeServiceImpl(Configuration config) {
        this.config = config;
        String key = config.getString("key");
    }

    @Override
    public ServiceCall<Request, Response> send() {
        String key = config.getString("key");
        // ... build and return the ServiceCall here
    }
}
Constructor injection is not just recommended for this use case; you should be using it everywhere, since it avoids this whole class of problems.
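If you also want a single place that reads the configuration once at startup and can be injected anywhere later, a small constructor-injected singleton does the job. This is only a sketch; AppSettings is a made-up name and "key" is just the entry from the question:

import javax.inject.Inject;
import javax.inject.Singleton;

import play.Configuration;

// Hypothetical settings holder: Guice creates it once, and the values are
// read from application.conf at that point.
@Singleton
public class AppSettings {

    private final String key;

    @Inject
    public AppSettings(Configuration config) {
        this.key = config.getString("key");
    }

    public String key() {
        return key;
    }
}

Then constructor-inject AppSettings into SomeServiceImpl (or anywhere else) instead of pulling values out of Configuration in every class.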

Dependency injection using Guice with the DAO pattern

For a small side project I'm working on I've been trying to implement something of a DAO pattern for my interactions with the DB, and have started using Guice (for the first time) to handle the DI for me. Right now I have this class hierarchy:
DAOImpl takes a reference to a class type so my database client (Mongo/Morphia) can do some initialization work and instantiate a BasicDAO provided by Morphia. Here are snippets of the relevant classes:
public class DAOImpl<T> implements DAO<T> {

    private static final Logger LOG = LoggerFactory.getLogger(DAOImpl.class);
    private static final String ID_KEY = "id";

    private final org.mongodb.morphia.dao.DAO morphiaDAO;

    @Inject
    public DAOImpl(Datastore ds, Class<T> resourceClass) {
        morphiaDAO = new BasicDAO(resourceClass, ds);
        LOG.info("ensuring mongodb indexes for {}", resourceClass);
        morphiaDAO.getDatastore().ensureIndexes(resourceClass);
    }
}
public class UserDAO extends DAOImpl<User> {

    @Inject
    public UserDAO(Datastore ds) {
        super(ds, User.class);
    }

    public User findByEmail(String email) {
        return findOne("email", email);
    }
}
I know that I need to tell Guice to bind the relevant classes for each generic DAOImpl that gets extended, but I'm unsure of how to do it. This looks like it might have been answered but it's not clicking for me. I've tried some of the following:
public class AppInjector extends AbstractModule {

    @Override
    protected void configure() {
        bind(com.wellpass.api.dao.DAO.class).to(DAOImpl.class);
        // bind(new TypeLiteral<SomeInterface<String>>(){}).to(SomeImplementation.class);
        // bind(new TypeLiteral<MyGenericInterface<T>>() {}).to(new TypeLiteral<MyGenericClass<T>>() {});
        // bind(new TypeLiteral<DAO<User>>() {}).to(UserDAO.class);
        bind(new TypeLiteral<DAO<User>>(){}).to(new TypeLiteral<DAOImpl<User>>() {});
    }
}
These are some of the errors I've seen:
com.google.inject.CreationException: Unable to create injector, see the following errors:
1) No implementation for org.mongodb.morphia.Datastore was bound.
while locating org.mongodb.morphia.Datastore
for the 1st parameter of com.wellpass.api.dao.UserDAO.<init>(UserDAO.java:12)
at com.wellpass._inject.AppInjector.configure(AppInjector.java:18)
2) java.lang.Class<T> cannot be used as a key; It is not fully specified.
at com.wellpass.api.dao.DAOImpl.<init>(DAOImpl.java:19)
at com.wellpass._inject.AppInjector.configure(AppInjector.java:14)
Any help would be much appreciated.
If you want an injection site like the following:
@Inject
public DAOConsumer(DAO<User> dao) {
}
to be injected with an instance of your UserDAO class then
bind(new TypeLiteral<DAO<User>>() {}).to(UserDAO.class);
is the correct syntax.
As for your other error:
1) No implementation for org.mongodb.morphia.Datastore was bound.
This is because Datastore is an interface. You need to bind the interface to an implementation, an instance, or a Provider<Datastore>.
To work out how to do this, think of the steps you would need to follow to do it manually, without the extra complication of Guice. Once you 100% understand this, you can try to design an object graph that appropriately reflects the steps in the initialization of Morphia.
To get you started, the morphia quick tour has a guide on how to get an instance of the Datastore object:
final Morphia morphia = new Morphia();
// tell Morphia where to find your classes
// can be called multiple times with different packages or classes
morphia.mapPackage("org.mongodb.morphia.example");
// create the Datastore connecting to the default port on the local host
final Datastore datastore = morphia.createDatastore(new MongoClient(), "morphia_example");
datastore.ensureIndexes();
From their code, you can see that there are at least two dependencies required to get the Datastore:
A singleton Morphia
A singleton MongoClient
You will have to write some code to set this up, possibly using Guice's Provider class.
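As a starting point, a module along these lines could work. This is a sketch only, based on the Morphia 1.x calls shown in the quick tour; the mapped package and database name are placeholders you would need to adjust:

import org.mongodb.morphia.Datastore;
import org.mongodb.morphia.Morphia;

import com.google.inject.AbstractModule;
import com.google.inject.Provides;
import com.google.inject.Singleton;
import com.mongodb.MongoClient;

public class MorphiaModule extends AbstractModule {

    @Override
    protected void configure() {
        // nothing to bind explicitly here
    }

    @Provides
    @Singleton
    MongoClient mongoClient() {
        // Connects to the default port on localhost; adjust as needed.
        return new MongoClient();
    }

    @Provides
    @Singleton
    Morphia morphia() {
        Morphia morphia = new Morphia();
        // Placeholder package: point this at wherever your entities live.
        morphia.mapPackage("com.wellpass.api.model");
        return morphia;
    }

    @Provides
    @Singleton
    Datastore datastore(Morphia morphia, MongoClient mongoClient) {
        // Placeholder database name.
        Datastore datastore = morphia.createDatastore(mongoClient, "morphia_example");
        datastore.ensureIndexes();
        return datastore;
    }
}

You could equally fold these @Provides methods into your existing AppInjector; the point is just that something has to tell Guice how a Datastore comes into existence.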

mongodb multi tenancy SpEL with @Document

This is related to
MongoDB and SpEL Expressions in @Document annotations
This is the way I am creating my mongo template
@Bean
public MongoDbFactory mongoDbFactory() throws UnknownHostException {
    String dbname = getCustid();
    return new SimpleMongoDbFactory(new MongoClient("localhost"), "mydb");
}

@Bean
MongoTemplate mongoTemplate() throws UnknownHostException {
    MappingMongoConverter converter =
            new MappingMongoConverter(mongoDbFactory(), new MongoMappingContext());
    return new MongoTemplate(mongoDbFactory(), converter);
}
I have a tenant provider class
@Component("tenantProvider")
public class TenantProvider {

    public String getTenantId() {
        // custom ThreadLocal logic for getting a name
    }
}
And my domain class
@Document(collection = "#{#tenantProvider.getTenantId()}_device")
public class Device {
    // my fields here
}
As you can see, I have created my MongoTemplate as specified in that post, but I still get the error below:
Exception in thread "main" org.springframework.expression.spel.SpelEvaluationException: EL1057E:(pos 1): No bean resolver registered in the context to resolve access to bean 'tenantProvider'
What am I doing wrong?
Finally figured out why I was getting this issue.
When using Servlet 3 initialization, make sure that you add the application context to the Mongo mapping context as follows:
@Autowired
private ApplicationContext appContext;

public MongoDbFactory mongoDbFactory() throws UnknownHostException {
    return new SimpleMongoDbFactory(new MongoClient("localhost"), "apollo-mongodb");
}

@Bean
MongoTemplate mongoTemplate() throws UnknownHostException {
    final MongoDbFactory factory = mongoDbFactory();
    final MongoMappingContext mongoMappingContext = new MongoMappingContext();
    mongoMappingContext.setApplicationContext(appContext);

    // Learned from web, prevents Spring from including the _class attribute
    final MappingMongoConverter converter = new MappingMongoConverter(factory, mongoMappingContext);
    converter.setTypeMapper(new DefaultMongoTypeMapper(null));

    return new MongoTemplate(factory, converter);
}
Check the autowiring of the context and also the call to
mongoMappingContext.setApplicationContext(appContext);
With these two lines I was able to get the component wired correctly and use it in multi-tenant mode.
The above answers only worked partially in my case.
I've been struggling with the same problem and finally realized that, under some runtime execution paths, the metadata object of type MongoEntityMetadata that belongs to the MongoQueryMethod used by AbstractMongoQuery is updated only once, in a method named getEntityInformation(). This happens when RepositoryFactorySupport relies on AbstractMongoQuery to query MongoDB, instead of SimpleMongoRepository, which as far as I know backs the "out of the box" methods provided by Spring Data.
Because the MongoQueryMethod object that holds a reference to this 'stateful' bean seems to be pooled/cached by the infrastructure code, @Document annotations with SpEL do not always work.
As far as I know, as developers we have just one choice: use MongoOperations directly from your @Repository bean in order to specify the right collection name, evaluated at runtime with SpEL.
I've tried to use AOP to modify this behaviour by setting a null collection name in MongoEntityMetadata, but that does not help, because changes would also be needed in the AbstractMongoQuery inner classes that implement the Execution interface, so that they check whether the MongoEntityMetadata collection name is null and, in that case, call a different MongoTemplate method signature.
MongoTemplate is smart enough to guess the right collection name by using its private method
private <T> String determineEntityCollectionName(T obj)
I've created a ticket in Spring's JIRA: https://jira.spring.io/browse/DATAMONGO-1043
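To illustrate the MongoOperations route, here is a rough sketch; DeviceRepository and its query are made up, the point is only the findAll overload that takes an explicit collection name:

import java.util.List;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.stereotype.Repository;

@Repository
public class DeviceRepository {

    private final MongoOperations mongoOperations;
    private final TenantProvider tenantProvider;

    @Autowired
    public DeviceRepository(MongoOperations mongoOperations, TenantProvider tenantProvider) {
        this.mongoOperations = mongoOperations;
        this.tenantProvider = tenantProvider;
    }

    public List<Device> findAll() {
        // The collection name is computed per call, so it picks up the current tenant.
        return mongoOperations.findAll(Device.class, tenantProvider.getTenantId() + "_device");
    }
}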
If you have the mongoTemplate configured as in the related issue, the only thing I can think of is this:
<context:component-scan base-package="com.tenantprovider.package" />
Or if you want to use annotations:
@ComponentScan(basePackages = "com.tenantprovider.package")
You might not be scanning the tenant provider package.
Ex:
@ComponentScan(basePackages = "com.tenantprovider.package")
@Document(collection = "#{#tenantProvider.getTenantId()}_device")
public class Device {
    // my fields here
}

Access Config.groovy from Java class

When my Grails app starts up, I also start a Spring Integration and Batch process in the background. I want to have some DB connection properties stored in the Config.groovy file, but how do I access them from a Java class used in the Integration/Batch process?
I found this thread:
Converting Java -> Grails ... How do I load these properties?
Which suggests using:
private Map config = ConfigurationHolder.getFlatConfig();
followed by something like:
String driver = (String) config.get("jdbc.driver");
This actually works fine (the properties are loaded correctly from Config.groovy), but the problem is that ConfigurationHolder has been deprecated. And every thread I've found dealing with the issue seems to be Grails-specific and suggests using dependency injection, like in this thread:
How to access Grails configuration in Grails 2.0?
So is there a non-deprecated way to get access to the Config.groovy properties from a Java class file?
Just checked in some of my existing code, and I use this method described by Burt Beckwith
Create a new file: src/groovy/ctx/ApplicationContextHolder.groovy
package ctx
import javax.servlet.ServletContext

import org.codehaus.groovy.grails.commons.GrailsApplication
import org.codehaus.groovy.grails.plugins.GrailsPluginManager
import org.springframework.context.ApplicationContext
import org.springframework.context.ApplicationContextAware

@Singleton
class ApplicationContextHolder implements ApplicationContextAware {

    private ApplicationContext ctx

    private static final Map<String, Object> TEST_BEANS = [:]

    void setApplicationContext(ApplicationContext applicationContext) {
        ctx = applicationContext
    }

    static ApplicationContext getApplicationContext() {
        getInstance().ctx
    }

    static Object getBean(String name) {
        TEST_BEANS[name] ?: getApplicationContext().getBean(name)
    }

    static GrailsApplication getGrailsApplication() {
        getBean('grailsApplication')
    }

    static ConfigObject getConfig() {
        getGrailsApplication().config
    }

    static ServletContext getServletContext() {
        getBean('servletContext')
    }

    static GrailsPluginManager getPluginManager() {
        getBean('pluginManager')
    }

    // For testing
    static void registerTestBean(String name, bean) {
        TEST_BEANS[name] = bean
    }

    // For testing
    static void unregisterTestBeans() {
        TEST_BEANS.clear()
    }
}
Then, edit grails-app/config/spring/resources.groovy to include:
applicationContextHolder(ctx.ApplicationContextHolder) { bean ->
bean.factoryMethod = 'getInstance'
}
Then, in your files inside src/java or src/groovy, you can call:
GrailsApplication app = ApplicationContextHolder.getGrailsApplication() ;
ConfigObject config = app.getConfig() ;
Just to note: in Grails 2.x there's a Holders class that replaces this deprecated holder. You can use it to access grailsApplication in a static context.
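For example, from a Java class under src/java, something like the following should work. This is a sketch, assuming Grails 2.x and grails.util.Holders; jdbc.driver is just the key from the question:

import groovy.util.ConfigObject;

import grails.util.Holders;

public class BatchConfigReader {

    public String jdbcDriver() {
        // Holders exposes the Grails config statically in Grails 2.x.
        ConfigObject config = Holders.getConfig();
        return (String) config.flatten().get("jdbc.driver");
    }
}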
I can't work out why this is not working, but I can suggest an alternative approach entirely. Grails sets up a PropertyPlaceholderConfigurer that takes its values from the grailsApplication.config, so you could declare a
public void setDriver(String driver) { ... }
on your class and then say
<bean class="com.example.MyClass" id="exampleBean">
<property name="driver" value="${jdbc.driver}" />
</bean>
This also works in resources.groovy if you're using the beans DSL, but you must remember to use single quotes rather than double:
exampleBean(MyClass) {
driver = '${jdbc.driver}'
}
Using "${jdbc.driver}" doesn't work because that gets interpreted by Groovy as a GString and (fails to be) resolved when resources.groovy is processed, whereas what you need is to put a literal ${...} expression in as the property value to be resolved later by the placeholder configurer.
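For reference, the bean class wired up above might look roughly like this; com.example.MyClass is just the placeholder from the XML snippet:

package com.example;

public class MyClass {

    private String driver;

    // Called by Spring with the resolved value of ${jdbc.driver}.
    public void setDriver(String driver) {
        this.driver = driver;
    }
}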

Same hibernate session between JUnit tests causing problems

I'm developing a web app based on Spring 2.5 and Hibernate 3. Recently I've introduced JUnit tests and written some integration tests using the DBUnit framework. DBUnit is supposed to refresh the database with an XML dataset between one test and the next, and as far as I can see that is working well.
However, when I update an element in one test, Hibernate seems to cache this information, and even when I load the element in the following test, the data is the modified version. If I look at the DB while execution is paused, the database has been properly reset by DBUnit, so I think it may be a Hibernate problem.
Is there a way to do a tearDown between tests that gives me a fresh Hibernate session for my Spring context? By the way, I'm not using Spring annotations, and I get the Spring context in code:
String[] contextLocations = new String[2];
contextLocations[0] = "WebContent/WEB-INF/applicationContext.xml";
contextLocations[1] = "src/System_V3/test/applicationContext.xml";
context = new FileSystemXmlApplicationContext(contextLocations);
DBUnit setUp:
@Before
public void setUpBeforeClass() throws Exception {
    handleSetUpOperation();
}

private static void handleSetUpOperation() throws Exception {
    conn = getConnection();
    conn.getConnection().setAutoCommit(false);
    final IDataSet data = getDataSet();
    try {
        DatabaseOperation.REFRESH.execute(conn, data);
    } finally {
        conn.close();
    }
}

private static IDatabaseConnection getConnection() throws ClassNotFoundException, SQLException,
        DatabaseUnitException {
    Class.forName("org.gjt.mm.mysql.Driver");
    return new DatabaseConnection(DriverManager.getConnection(
            "jdbc:mysql://localhost:3306/web_database", "root", "pass"));
}

private static IDataSet getDataSet() throws IOException, DataSetException {
    ClassLoader classLoader = TestPrueba.class.getClassLoader();
    return new FlatXmlDataSetBuilder().build(classLoader
            .getResourceAsStream("System_V3/test/dataset.xml"));
}
Tests are done in JUnit 4 using only @Test annotations, and the test class does not extend any library class.
Any suggestions?
Not sure if this is something that can help you - but just in case...
Try calling session.clear() in your teardown method.
Please take a look here: http://docs.jboss.org/hibernate/orm/3.5/api/org/hibernate/Session.html#clear()
According to the spec, session.clear() will:
Completely clear the session. Evict all loaded instances and cancel all pending saves, updates and deletions. Do not close open iterators or instances of ScrollableResults.
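In a JUnit 4 test that builds the context by hand as above, that teardown could look roughly like this. It's only a sketch: it assumes the loaded context defines a bean named sessionFactory and that a Session is bound to the current thread, so adjust it to however your tests obtain their Session:

import org.hibernate.SessionFactory;
import org.junit.After;

// ...

@After
public void tearDown() {
    SessionFactory sessionFactory = (SessionFactory) context.getBean("sessionFactory");
    // Evict everything the previous test left in the first-level cache so the
    // next test reloads fresh state from the database reset by DBUnit.
    sessionFactory.getCurrentSession().clear();
}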
You need to execute your tests within a transaction. This can be achieved by setting SpringJUnit4ClassRunner as the runner for your test. Once that is configured, you can use the @Transactional annotation per test.
With this approach you can also have your beans @Autowired directly into your test.
For instance:
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(locations = { "classpath:context-file.xml" })
public class MyTest {

    @Autowired
    private MyService myService;

    @Transactional
    @Test
    public void myFirstTest() {
        ...
        myService.executeSomething();
        ...
    }
}
And of course, you can set the default behaviour to rollback on your test class by annotating it with @TransactionConfiguration(defaultRollback = true/false).
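For instance, applied at class level on the example above:

@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(locations = { "classpath:context-file.xml" })
@TransactionConfiguration(defaultRollback = true)
public class MyTest {
    // ...
}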
