Hello, I am trying to create an application using the Dropwizard framework. I have DAO implementation classes which need a handle to a connection manager instance, which will then be used to get database connections. I have a multi-tenant database application. This connection manager would be a custom implementation.
The application uses HikariCP as the connection pool and a MySQL database. I want to initialize the datasource and connection pool using Dropwizard's managed object feature. Once the datasource is initialized, I want to inject the connection manager instance into each of the DAO classes using a Guice binding, something like:
bind(ConnectionManager.class).toProvider(ConnectionManagerProvider.class);
Then in each DAO implementation class:
public class UserDAOImpl extends AbstractDAO {

    @Inject
    public UserDAOImpl(ConnectionManager connectionManager) {
        super(connectionManager);
    }
}
I have looked everywhere on the net and there is no particular example for my use case. There is also a lack of documentation at dropwizard.io.
This is more of an architectural design question than a code question.
The datasource module would be a separate module used across many services. I am using Maven as the build tool.
My questions are:
How can I approach this situation? Some class names and implementation guidelines would be very useful.
The application would be handling half a million requests a day, so the solution needs to hold up at that scale.
I look forward to any guidance from the community, or a pointer to some good resources.
NOTE: We won't be using Hibernate for this application; we will be using JDBI.
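To illustrate the lifecycle part, here is a rough sketch of what I have in mind for the managed pool (ManagedHikariPool and ConnectionManager are my own placeholder names; Managed is Dropwizard's lifecycle interface):

// sketch only: ManagedHikariPool is a placeholder name;
// io.dropwizard.lifecycle.Managed gives start/stop hooks tied to the app lifecycle
public class ManagedHikariPool implements Managed {

    private final HikariConfig config;
    private HikariDataSource dataSource;

    public ManagedHikariPool(HikariConfig config) {
        this.config = config;
    }

    @Override
    public void start() throws Exception {
        // the pool (and its connections) is created when Dropwizard starts
        dataSource = new HikariDataSource(config);
    }

    @Override
    public void stop() throws Exception {
        // and closed cleanly on shutdown
        if (dataSource != null) {
            dataSource.close();
        }
    }

    public DataSource getDataSource() {
        return dataSource;
    }
}

It would be registered with environment.lifecycle().manage(...) in run(), and the custom ConnectionManager would wrap the resulting DataSource.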
I prepared a setup similar to the one you described, as follows. It sets up Guice and initializes a DBIFactory (you might need to adapt that part to your scenario). A DBI object is then handed over to a repository implementation that can use it to persist an entity of type Vessel.
(1) Adding guice to the project
<dependency>
    <groupId>com.hubspot.dropwizard</groupId>
    <artifactId>dropwizard-guice</artifactId>
    <version>x.x.x</version>
</dependency>
(2) Set up Guice in initialize():
guiceBundle = GuiceBundle.<YourConfiguration>newBuilder()
        .addModule(new GuiceModule())
        .enableAutoConfig("your.package.name.here")
        .setConfigClass(YourConfiguration.class)
        .build();
// don't forget to register the bundle:
bootstrap.addBundle(guiceBundle);
(3) Guice config for preparing JDBI elements
public class GuiceModule extends AbstractModule {

    private DBI jdbi;

    @Provides
    public DBI prepareJdbi(Environment environment,
            SightingConfiguration configuration) throws ClassNotFoundException {
        // setup DB access including DAOs
        // implementing a singleton pattern here, but avoiding
        // Guice initializing the DB connection too early
        if (jdbi == null) {
            final DBIFactory factory = new DBIFactory();
            jdbi = factory.build(environment, configuration.getDataSourceFactory(), "h2");
        }
        return jdbi;
    }

    @Provides
    public VesselJDBI prepareVesselJdbi(DBI jdbi) {
        return jdbi.onDemand(VesselJDBI.class);
    }

    @Override
    protected void configure() {
        bind(VesselRepository.class).to(VesselRepositoryImpl.class);
        /* ... */
    }
}
(4) Start using it in your classes:
public class VesselRepositoryImpl implements VesselRepository {

    private final VesselJDBI jdbi;

    @Inject
    public VesselRepositoryImpl(VesselJDBI jdbi) {
        this.jdbi = jdbi;
    }

    public Vessel create(Vessel instance) {
        return jdbi.inTransaction((transactional, status) -> {
            /* do several things with jdbi in a transactional way */
        });
    }
}
(Please note: the last code example uses Java 8. To use JDBI with Java 8 on Dropwizard 0.8.1, please use JDBI version 2.62 to avoid the bug https://github.com/jdbi/jdbi/issues/144.)
Please let me know if this helped you.
Best regards,
Alexander
I can't comment, but wanted to add on to Alex's answer:
For the repository implementation, I recommend having the repository be handled by jDBI instead of using Guice. Here's what I did:
In the Guice module, add a provider method:
@Provides
@Singleton
public Repo repository(DBI dbi) {
    // let jDBI build the repository so its annotations work
    return dbi.onDemand(Repo.class);
}
In the repository class, use @CreateSqlObject to have your DAOs available:
public abstract class Repo {

    @CreateSqlObject
    abstract Dao dao(); // will return a jDBI-managed DAO impl

    public void doWhatever() {
        // logic
    }
}
This has the distinct advantage that you can now use jDBI annotations (I have not found a way to use them with Guice directly). This is very nice, for example, if you need to execute DAO code in a transaction. The repository is still handled within Guice, so it can be injected anywhere, but jDBI handles the tricky bits within your DAO/repository code.
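For example, a sketch of such a transactional repository under these assumptions (UserDao and AuditDao are hypothetical jDBI SQL objects with insert/log methods):

public abstract class UserRepo {

    @CreateSqlObject
    abstract UserDao userDao();   // hypothetical jDBI-managed DAO

    @CreateSqlObject
    abstract AuditDao auditDao(); // hypothetical jDBI-managed DAO

    // jDBI's @Transaction runs both DAO calls in a single transaction
    // and rolls back if either of them throws
    @Transaction
    public void createUser(String name) {
        userDao().insert(name);
        auditDao().log("created user " + name);
    }
}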
Hope this helps :)
Artur
I'm trying to do some tests to see if my transactional methods are working fine. However, I do not fully understand whether or not I should mock the database, and how jOOQ comes into this equation. Below is the service class with the transaction for adding a role into the database.
@Service
public class RoleService implements GenericRepository<Role> {

    @Autowired
    private DSLContext roleDSLContext;

    @Override
    @Transactional
    public int add(Role roleEntry) {
        return roleDSLContext.insertInto(Tables.ROLE,
                Tables.ROLE.NAME,
                Tables.ROLE.DESCRIPTION,
                Tables.ROLE.START_DATE,
                Tables.ROLE.END_DATE,
                Tables.ROLE.ID_RISK,
                Tables.ROLE.ID_TYPE,
                Tables.ROLE.ID_CONTAINER)
            .values(roleEntry.getName(),
                roleEntry.getDescription(),
                roleEntry.getStartDate(),
                roleEntry.getEndDate(),
                roleEntry.getIdRisk(),
                roleEntry.getIdType(),
                roleEntry.getIdContainer())
            .execute();
    }
}
I'm using MySQL, and the connection to the database is made using the Spring config file:
spring.datasource.url=jdbc:mysql://localhost:3306/role_management?verifyServerCertificate=false&useSSL=true
spring.datasource.username=root
spring.datasource.password=123456
spring.datasource.driver-class-name=com.mysql.cj.jdbc.Driver
I'm assuming I don't have to reconnect to the database every time I test the transaction, closing the connection after it finishes. I know that there is:
MockDataProvider provider = new MockDataProvider()
but I don't understand how it works.
What is the best way to test the method mentioned above?
Disclaimer
Have you read the big disclaimer in the jOOQ manual regarding mocking of your database?
Disclaimer: The general idea of mocking a JDBC connection with this jOOQ API is to provide quick workarounds, injection points, etc. using a very simple JDBC abstraction. It is NOT RECOMMENDED to emulate an entire database (including complex state transitions, transactions, locking, etc.) using this mock API. Once you have this requirement, please consider using an actual database product instead for integration testing, rather than implementing your test database inside of a MockDataProvider.
It is very much recommended you use something like testcontainers to integration test your application, instead of implementing your own "database product" via the mock SPI of jOOQ (or any other means of mocking).
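For example, a minimal integration-test sketch, assuming Spring Boot with the Testcontainers MySQL and JUnit 5 modules on the test classpath (the class name is illustrative):

// RoleServiceIT is an illustrative name; requires org.testcontainers:mysql
// and org.testcontainers:junit-jupiter as test dependencies
@Testcontainers
@SpringBootTest
class RoleServiceIT {

    @Container
    static final MySQLContainer<?> mysql = new MySQLContainer<>("mysql:8.0");

    // point the Spring datasource at the throwaway container
    @DynamicPropertySource
    static void datasourceProps(DynamicPropertyRegistry registry) {
        registry.add("spring.datasource.url", mysql::getJdbcUrl);
        registry.add("spring.datasource.username", mysql::getUsername);
        registry.add("spring.datasource.password", mysql::getPassword);
    }

    // inject RoleService, call add(...), then query the table back
    // to assert the side effect actually happened
}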
If you must mock
To answer your actual question, you can configure your DSLContext programmatically, e.g. using:
@Bean
public DSLContext getDSLContext() {
    if (testing)
        return // the mocking context
    else
        return // the actual context
}
Now inject some Spring profile value, or whatever, into the above configuration class containing that DSLContext bean configuration, and you're all set.
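For what it's worth, a sketch of what "the mocking context" could look like, using jOOQ's mock SPI (org.jooq.tools.jdbc); this provider simply reports one affected row for any statement, which is all the add() method's execute() call needs:

// MockDataProvider receives the SQL and bind values and returns MockResult[]
MockDataProvider provider = ctx -> {
    // inspect ctx.sql() / ctx.bindings() here if you need to branch per statement
    return new MockResult[] { new MockResult(1, null) };
};
DSLContext mockingContext = DSL.using(new MockConnection(provider), SQLDialect.MYSQL);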
Alternatively, use constructor injection instead of field injection (there are many benefits to that):
@Service
public class RoleService implements GenericRepository<Role> {

    final DSLContext ctx;

    public RoleService(DSLContext ctx) {
        this.ctx = ctx;
    }

    // ...
}
So you can manually construct your service in the test that mocks the database:
RoleService testService = new RoleService(mockingContext);
testService.add(...);
But as you can see, the mocking is completely useless. What you want to test is that there's a side effect in your database (a record has been inserted); to test that side effect, you'll want to query the database again, and unless you mock that as well, or re-implement an entire RDBMS, you won't see that record in the database. So, again, why not just integration test your code instead?
In my Java app based on Spring Boot, I am trying to implement a caching mechanism for the following service method:
@Override
public List<EmployeeDTO> findAllByCountry(Country country) {
    final Map<Pair<UUID, String>, List<CountryTranslatable>> valueList
            = countryRepository...
    // code omitted for brevity
}
After looking at several examples regarding this issue, I decided on the approach mentioned in A Guide To Caching in Spring.
However, I am a little bit confused, as it contains both Spring and Spring Boot implementations and uses different annotation examples. I think I should start from the 3.1. Using Spring Boot section, as I use Spring Boot, but I am not sure which caching annotation I should use (4.1. @Cacheable seems to be OK, but I am not sure).
So, where should I put SimpleCacheCustomizer, and how can I apply that approach to my service method above (findAllByCountry)? Any simple example would really be appreciated, as I am new to Spring.
You don't need any customizations if you are a beginner; if you want only the basics, then do the following:
@Configuration
@EnableCaching
public class CachingConfig {

    @Bean
    public CacheManager cacheManager() {
        return new ConcurrentMapCacheManager();
    }
}
The provided article states return new ConcurrentMapCacheManager("addresses");, but you can use the default constructor, and the relevant cache for addresses will then be created by @Cacheable("addresses"). So there is no need for this to be in the configuration.
You also need:
@Cacheable("employeesList")
@Override
public List<EmployeeDTO> findAllByCountry(Country country) {
    final Map<Pair<UUID, String>, List<CountryTranslatable>> valueList
            = countryRepository...
    // code omitted for brevity
}
Ready to go; that is the basic setup.
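One addition, as a hedged sketch: if employee data can change, you would typically also evict the cache on writes, along these lines (saveEmployee and employeeRepository are hypothetical):

// hypothetical write path: clears the whole employeesList cache whenever an
// employee is saved, so the next findAllByCountry repopulates it from the DB
@CacheEvict(value = "employeesList", allEntries = true)
public EmployeeDTO saveEmployee(EmployeeDTO employee) {
    return employeeRepository.save(employee);
}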
Only if you want to customize the auto-configured CacheManager should you implement the CacheManagerCustomizer interface.
In usual cases you don't need to customize the auto-configured CacheManager; an example is given in the link you attached.
Your understanding of the @Cacheable annotation is also correct, and it should work fine for you.
You can put that component class with your other classes, within the component-scan range.
You should put your SimpleCacheCustomizer along with your other Spring configuration classes. That way, your component will be scanned and loaded by Spring.
@Component
public class SimpleCacheCustomizer
        implements CacheManagerCustomizer<ConcurrentMapCacheManager> {

    @Override
    public void customize(ConcurrentMapCacheManager cacheManager) {
        cacheManager.setCacheNames(asList("employeesList", "otherCacheName"));
    }
}
To use the cache with your service, add the annotation @Cacheable("employeesList"):
@Cacheable("employeesList")
@Override
public List<EmployeeDTO> findAllByCountry(Country country) {
    final Map<Pair<UUID, String>, List<CountryTranslatable>> valueList
            = countryRepository...
    // code omitted for brevity
}
If you want to verify the cache is working, just enable SQL query logging in your Spring configuration (e.g. spring.jpa.show-sql=true) and check that findAllByCountry no longer issues any queries to the DB.
I was wondering if there is a sort of compromise that allows you to emulate/leverage the Google Guice-style EDSL way of writing modules, binding interfaces to implementations, in Spring.
For example, say I had a Google Guice Module that looked like this:
public class BillingModule extends AbstractModule {

    @Override
    protected void configure() {
        bind(BillingService.class).to(RealBillingService.class);
    }
}
This binds the BillingService interface to the RealBillingService implementation.
One way that I think I can do this, utilizing Spring's Java configuration classes, is something that looks like this:
@Configuration
public class BillingConfiguration {

    @Bean
    public BillingService getRealBillingService() {
        return new RealBillingService();
    }
}
I was wondering if there was a better way to do this or if this broke down with increasingly complex usage.
I really like Google Guice and how it does dependency injection, but that's about all it does. Spring does a lot more (yes, its dependency injection mechanism is still not as nice as Guice's), but it undeniably has some great projects that we would like to utilize, like Spring Data, Spring Data REST, etc., which eliminate the need for writing a ton of boilerplate code.
The way to do this is to use @Profile to include different implementations of the same interface.
A simple example would be DataSource. This can however easily be extended to any other interfaces with multiple implementations.
As an example:
@Configuration
@Profile("local") // active only when the "local" profile is selected
public class LocalDataConfig {

    @Bean
    public DataSource dataSource() {
        return new EmbeddedDatabaseBuilder()
                .setType(EmbeddedDatabaseType.HSQL)
                .addScript("classpath:com/bank/config/sql/schema.sql")
                .addScript("classpath:com/bank/config/sql/test-data.sql")
                .build();
    }
}
and then for use in production:
@Configuration
@Profile("production")
public class JndiDataConfig {

    @Bean
    public DataSource dataSource() throws Exception {
        Context ctx = new InitialContext();
        return (DataSource) ctx.lookup("java:comp/env/jdbc/datasource");
    }
}
Then all you need to do is declare which profiles are active when you start your application context, and @Inject/@Autowired a DataSource where you need it.
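For example, a minimal sketch of activating a profile programmatically (setting the spring.profiles.active property or environment variable works just as well):

// select the "production" profile before refreshing the context
AnnotationConfigApplicationContext ctx = new AnnotationConfigApplicationContext();
ctx.getEnvironment().setActiveProfiles("production");
ctx.register(LocalDataConfig.class, JndiDataConfig.class);
ctx.refresh();
DataSource dataSource = ctx.getBean(DataSource.class); // the JNDI one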
I've used both Guice and Spring a fair bit. As far as I know, the Spring usage you show in the question is the only way to achieve the same as the binding in Guice. If you need to inject dependencies, you can always include those as arguments to the @Bean method and have Spring inject them for you.
It's definitely not as clean, but it works the same way. One key thing to watch out for is that the default scope in Spring is singleton, rather than a new instance every time (Spring calls the latter scope prototype).
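For example, a sketch of making the earlier bean behave like Guice's default new-instance-per-injection:

@Configuration
public class BillingConfiguration {

    @Bean
    @Scope(ConfigurableBeanFactory.SCOPE_PROTOTYPE) // fresh instance per injection point
    public BillingService billingService() {
        return new RealBillingService();
    }
}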
I have a standard Magnolia module that I've implemented as a Spring MVC REST client. In this module, I am trying to retrieve a JCR node and use Node2BeanProcessor to transform the Node object into my custom bean. Code below:
@Repository
public class JcrRepo {

    @Inject
    public Node2BeanProcessor node2Bean;

    public MagicWord getMagicWord(String key) throws RepositoryException, Node2BeanException {
        Session session = LifeTimeJCRSessionUtil.getSession("magic");
        Node theNode = session.getNode("/magicWords/" + key);
        return node2Bean.toBean(theNode, MagicWord.class);
    }
}
When I run this, I encounter a NullPointerException for the variable node2Bean, which means it wasn't injected properly. However, I am able to do this:
node2Bean = Components.getComponent(Node2BeanProcessor.class);
The Components.getComponent() javadoc states: "Returns a component from the currently set ComponentProvider. Consider using IoC to inject the component instead." Which is what I'm trying to figure out.
Note that I have not done any Guice configuration as I'm looking for a way to leverage on Magnolia's already initialized Guice context to grab my objects.
Is there a better way to do injection than this, or have I done anything wrong or skipped a step?
Appreciate the help.
P.S. For now I've implemented a hacky way to use this in Spring IoC:
@Bean
public Node2BeanProcessor node2Bean() {
    return Components.getComponent(Node2BeanProcessor.class);
}
(Working with Magnolia 4.5) I use @Inject for Node2BeanProcessor in a class implementing info.magnolia.module.ModuleLifecycle:
public class MyModule implements ModuleLifecycle {

    @Inject
    private Node2BeanProcessor node2BeanProcessor;

    @Override
    public void start(ModuleLifecycleContext moduleLifecycleContext) {
        ...
        node2BeanProcessor.toBean(someNode);
        ...
    }
}
Maybe your NullPointerException comes from theNode? Have you verified that theNode is not null?
Another guess is that it could be a lifecycle issue. From what I remember, Components.getComponent() works in situations where @Inject does not (in Magnolia).
Finally: Your instance variable should definitely be private.
If JcrRepo is not instantiated by Guice, then Guice won't be able to inject the Node2BeanProcessor field either. Mixing Spring and Guice IoC containers can get confusing, so I tend to stick with Guice, as that's what comes with Magnolia.
I have a POJO class with a method annotated with @Transactional:
public class Pojo {

    @Transactional
    public void doInTransaction() {
        ...
    }
}
Spring declarative transaction management is based on AOP, but I don't have any experience with that. My question is:
Is it possible that when invoking (new Pojo()).doInTransaction() on its own, Spring will start a transaction?
"Spring declarative transaction management is based on AOP, but I don't have any experience with that."
I would recommend starting to work with it; you will gain experience using transaction advice with AOP. A good starting point is here.
"Is it possible that when invoking (new Pojo()).doInTransaction() on its own, Spring will start a transaction?"
No, you can't expect Spring to be aware of an object that you instantiated manually. However, it sounds like you want to avoid declarative transaction management and do programmatic transaction management instead. There is a way to do that with Spring, using the TransactionTemplate. Is that what you were looking for?
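For illustration, a minimal TransactionTemplate sketch (it assumes a PlatformTransactionManager is already configured and injected as transactionManager):

// programmatic transaction management: the callback body runs in a transaction
TransactionTemplate txTemplate = new TransactionTemplate(transactionManager);
txTemplate.execute(new TransactionCallbackWithoutResult() {
    @Override
    protected void doInTransactionWithoutResult(TransactionStatus status) {
        // do the transactional work here; a thrown RuntimeException rolls back
    }
});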
It is somewhat possible, but in a cumbersome way: You must use the AutowireCapableBeanFactory mechanism.
Here is a transactional class as an example:
public interface FooBar {
    void fooIze(Object foo);
}

public class FooBarImpl implements FooBar {

    @Transactional
    @Override
    public void fooIze(final Object foo) {
        // do stuff here
    }
}
And here is how we can use it:
public class FooService implements ApplicationContextAware {

    private ApplicationContext applicationContext;

    @Override
    public void setApplicationContext(
            final ApplicationContext applicationContext) {
        this.applicationContext = applicationContext;
    }

    public void serviceMethod() {
        // declare variable as interface, initialize to implementation
        FooBar fooBar = new FooBarImpl();

        // try to use it: won't work, as it's not a proxy yet
        Object target = new Object[0];
        fooBar.fooIze(target); // no transaction

        // now let Spring create the proxy and re-assign the variable
        // to the proxy:
        fooBar = // this is no longer an instance of FooBarImpl!!!
            (FooBar) applicationContext
                .getAutowireCapableBeanFactory()
                .applyBeanPostProcessorsAfterInitialization(fooBar,
                        "someBeanName");
        fooBar.fooIze(fooBar); // this time it should work
    }
}
This is not a best practice. For one thing, it makes your application highly aware of the Spring Framework, and it also violates dependency injection principles. So use this only if there is no other way!
Yes, it is possible. Spring does not require the use of dynamic proxies for @Transactional to work. Instead, you can use "true AOP", as provided by AspectJ.
For the details, see http://static.springsource.org/spring/docs/3.0.x/spring-framework-reference/html/transaction.html#transaction-declarative-aspectj
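With Spring 3.1+ Java config, for instance, this is a matter of switching the advice mode; a sketch (TxConfig is a hypothetical name, and it additionally requires AspectJ compile-time or load-time weaving to be set up):

// hypothetical config class: ASPECTJ mode weaves the transaction advice into
// the class itself, so even `new Pojo()` instances become transactional
@Configuration
@EnableTransactionManagement(mode = AdviceMode.ASPECTJ)
public class TxConfig {
    // a PlatformTransactionManager bean must also be defined here
}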
The way Spring handles transactions through annotations is by using AOP, as you've said.
The AOP part is implemented using dynamic proxies (see the docs).
So you'll need to retrieve an instance of your class (Pojo here) through the Spring container: Spring will return a dynamic proxy over your Pojo that automatically surrounds any annotated method with the transaction management code.
If you simply do:
Pojo p = new Pojo();
p.doInTransaction();
Spring doesn't have any role to play here and your method call won't be inside a transaction.
So what you need to do is something like this:
ApplicationContext springContext = ...;
Pojo p = (Pojo) springContext.getBean("your.pojo.id");
p.doInTransaction();
Note: this is an example; you should prefer dependency injection instead of retrieving your bean manually from the context.
By doing so, with a properly configured Spring context, Spring will have scanned your classes for the transactional annotation and automatically wrapped your beans in annotation-aware dynamic proxy instances. From your point of view that doesn't change anything: you'll still cast your object to your own classes, but if you print out the class name of your Spring-context Pojo bean, you'll get something like Proxy$... and not your original class name.
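A quick way to see this for yourself:

// prints something like "com.sun.proxy.$Proxy12" (JDK proxy) or a
// CGLIB-enhanced name, rather than "Pojo"
System.out.println(p.getClass().getName());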