I am currently working on an application in which an instance of the domain object D is injected into the application. The domain object can contain many classes in different combinations and permutations, as defined by its bean, leading to many different final objects D, which I refer to as different versions of D. For a given version of D, I have to fill in its primitive values and then save it to the database. Saving it is fairly simple using JPA and Hibernate. The problem is filling in the values of D. The values are fetched over the network using SNMP. For each version of D there is a different strategy to follow, since each version of D may have a different MIB. I am currently following the factory pattern: the factory takes a version of D and returns a valueRetriever specific to that version of D, which is then used to fetch the values and fill D.
The other obvious way is to inject a configuration retriever along with D and then use it to retrieve the configuration. But I also need the retriever at runtime to re-fetch the configurations, which would make it necessary to store the retriever in the database as well, hence creating a new table for each retriever, which currently seems like unnecessary overhead.
My question is: given the above scenario, can there be a better way to retrieve the configurations, i.e. to obtain a valueRetriever via dependency injection?
Edit: Can AOP be of any use here?
It seems that some of the objects you need to create have complex creation logic. You may want to look at the Spring FactoryBean interface, since a FactoryBean can fetch all the complex details over the network while still allowing you to create an instance and inject it into other beans.
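For illustration, a minimal sketch of what such a FactoryBean might look like; ValueRetriever, the version string and the SNMP lookup are hypothetical placeholders, not names from your code:

import org.springframework.beans.factory.FactoryBean;

public class ValueRetrieverFactoryBean implements FactoryBean<ValueRetrieverFactoryBean.ValueRetriever> {

    // Hypothetical contract: the real retriever would fill D's primitives over SNMP.
    public interface ValueRetriever {
        void fillValues(Object domainObject);
    }

    private final String version; // which version of D this factory serves

    public ValueRetrieverFactoryBean(String version) {
        this.version = version;
    }

    @Override
    public ValueRetriever getObject() {
        // All version-specific wiring (e.g. picking the right MIB) is hidden here,
        // so consumers simply get a ValueRetriever injected.
        return domainObject -> System.out.println("would fetch SNMP values for version " + version);
    }

    @Override
    public Class<?> getObjectType() {
        return ValueRetriever.class;
    }

    @Override
    public boolean isSingleton() {
        return false; // a fresh retriever per lookup
    }
}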
The basis for Spring's DI is the Bean Factory/Application Context, so it's entirely possible to replace what you're doing.
The difference will be that you'll have to be able to put all your permutations into the Spring configuration and give control over to the application context. If you can't do that, perhaps the solution you've got is preferred.
UPDATE: I'm starting to fear that your Spring solution is adding too many unfamiliar technologies to what might be an overly complicated situation.
Take a breath and think "simple".
I wouldn't worry about the database for now. The Spring application context will be the database if you can get all the combinations you need into the bean factory. I'm assuming these configurations are read-only and not altered once you declare them. If that's not the case all bets are off.
In a project, I have a org.apache.commons.configuration.PropertiesConfiguration object registered as a Bean, to provide configuration values around the application, with hot-reloading capabilities.
Example: I defined a DataSource singleton Bean. I then created a ReloadingDataSource object, which wraps and delegates to the "real" DataSource; each time the configuration file changes, it is able to recreate the underlying DataSource in a thread-safe manner.
I'd like to do something similar for simple properties values.
I'd like to create a simple, autowireable object that delegates retrieval to the Apache PropertiesConfiguration Bean.
The usage should be similar to:
@Property("my.config.database")
private Property<String> database;
And the call site would simply be:
final String databaseValue = database.get();
You'll say, just pass around the PropertiesConfiguration object. Maybe you're right, but I'd like to provide another abstraction over that, a simpler-to-use one.
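For what it's worth, a minimal sketch of such a wrapper (the Property class here is hand-rolled, not an existing Spring type, and the annotation-driven injection shown above would additionally need a custom BeanPostProcessor or resolver, which is omitted):

import java.util.function.Function;
import org.apache.commons.configuration.PropertiesConfiguration;

public class Property<T> {

    private final PropertiesConfiguration configuration;
    private final String key;
    private final Function<String, T> converter;

    public Property(PropertiesConfiguration configuration, String key, Function<String, T> converter) {
        this.configuration = configuration;
        this.key = key;
        this.converter = converter;
    }

    // Delegates on every call, so values picked up by hot-reloading are reflected immediately.
    public T get() {
        return converter.apply(configuration.getString(key));
    }
}

// usage, given the injected PropertiesConfiguration bean:
// Property<String> database = new Property<>(configuration, "my.config.database", Function.identity());
// final String databaseValue = database.get();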
I know that with ProxyFactoryBean it is possible to create an AOP proxy for method calls. Is this the right path, or are there better alternatives? Maybe pure Spring AOP/AspectJ?
I don't want to use Spring Cloud or similar dependencies.
Spring Cloud will recreate the beans, so keep in mind that, whatever solution you come up with, if you have another bean which reads this value only once (for instance when it is initialized), it won't re-initialize itself; that is the problem Spring Cloud Config takes care of.
AOP only works at the method level as I understand it, so you can definitely intercept a call to somebean.getFoo(). But within somebean, there is no way to proxy access to the variable itself: somebean.foo. You would have to reset foo every time your PropertiesConfiguration changed, and again keep in mind that if anything else needs the new value of foo, you would need to handle this yourself or bite the bullet and use Spring Cloud.
The overhead of changing things at run-time to avoid a re-deploy should really be thought about carefully. For Netflix this makes sense because they have thousands and thousands of servers, but for smaller players I can't see the justification: the decision adds a lot of complexity and is a nightmare to test.
Do you test changing your configuration at run-time or accept the risk and assume it works?
Do you test changing from A -> B while under load, with a user performing a transaction against the database?
Do you test other race conditions where foo is changing?
Some things to think about.
Working with Hibernate, I noticed that all of the Java objects going into persistence are defined in a mapping file. Is there a way to depend only on annotations instead of a separate .xml for this? At the time of creation, we do not know what the object to be persisted contains. We know it consists of primitive data types (Strings, ints, floats/doubles), but we do not know how many of each field the object may contain until the moment it needs a table created for it so it can be entered into the DB.
Note that Hibernate is just the first ORM solution that I've looked at. I am not tied to it if there is another ORM solution that solves this problem.
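For the annotations-only part of the question, a minimal JPA/Hibernate example looks roughly like this (the class and its fields are made up); it removes the separate .xml mapping file, but it does assume the shape of the class is known at compile time, which is the part that clashes with the requirement above:

import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.Id;

@Entity
public class Measurement {

    @Id
    @GeneratedValue
    private Long id;       // primary key generated by the provider

    private String name;   // plain fields are mapped to columns automatically
    private int count;
    private double value;

    // getters and setters omitted for brevity
}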
I think in your use case you can use Dozer mapping to manage beans without having explicit definitions of class files, and this can be loaded at runtime using Spring's annotation-based dependency injection.
You may look into JDX ORM for Java. The mapping is defined declaratively in a text file but just a minimal specification is needed for each class - its name and the names of the primary key attributes. Other attributes are automatically picked up by JDX. So you may continue to modify your class without making any further changes to its mapping specification. Disclaimer: I am the architect of JDX ORM.
I'm currently in the process of adding Spring and Hibernate to an existing application, but after reading lots of tutorials there are still a couple of things (aka a lot of things) that either seem strange to me or that I'm missing...
All the tutorials I found are straightforward ones (like most tutorials should be), as seen in Example A: one controller to handle the requests (JSP or WS), with the manager class autowired into it to interact with the DB.
In my case this doesn't apply, since the application has a class to handle the requests, which then instantiates a handler class, which in turn creates a new class to handle something else, which creates a new class to handle (....)* and which finally handles the database connection, as seen in Example B.
My question is: how can I make my business logic class n "Springable", i.e. able to have a Database Manager autowired inside it?
From all the examples that I've seen, I've come up with these alternatives:
Autowire ALL the DbManagers inside the Controller, and then pass them via IoC through all the business classes until they reach business logic class n. This would follow Spring standards, but would imply the most code refactoring
Transform ALL the Business Logic classes into beans
Add SpringBeanAutowiringSupport.processInjectionBasedOnCurrentContext(this); to business logic class n and use @Autowired to access the DbManager
Am I missing something or is there any other alternative?
This is just my opinion, but you may be interested.
The basic philosophy of Spring is that the creation and configuration of objects is handled by the container, not by the business objects; this is known as IoC or Dependency Injection. Based on the configuration, the container creates the objects and associates (injects) them with each other. This allows you to remove from the business classes the code related to instantiation and configuration (which can be quite complex). Your classes become simpler and cleaner, and can focus on the business logic and nothing else.
I believe that business objects do not need to create each other. Let Spring do it. It does it perfectly.
Just mark your business logic classes, depending on their role, with one of the stereotypes @Component, @Service or @Controller (the meaning of the stereotypes can be found here), and inject them with @Autowired. And if you need the Database Manager in these classes, inject it the same way.
So, my choice corresponds to point number two: "2. Transform ALL the Business Logic classes into beans..."
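As a rough illustration of option two (BusinessLogicN and DbManager are made-up names standing in for your real classes):

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;

@Service
public class BusinessLogicN {

    // Placeholder for the real database manager bean.
    public interface DbManager {
        void save(Object entity);
    }

    private final DbManager dbManager;

    @Autowired // constructor injection; Spring supplies the DbManager bean
    public BusinessLogicN(DbManager dbManager) {
        this.dbManager = dbManager;
    }

    public void handle(Object entity) {
        // business logic that reaches the database through the injected manager
        dbManager.save(entity);
    }
}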
You can (and should!) use Spring Stereotypes for this.
Refer to my previous answer to a similar question for details about the proposed application structure.
My application has about 50 entities that are displayed in grid format in the UI. All 50 entities have CRUD operations. Most of the operations have the standard flow
i.e. for get: read entities from the repository, convert them to DTOs and return a list of DTOs;
for create/update/delete: take the DTOs, convert them to entities, use the repository to create/update/delete in the DB, and return the updated DTOs.
Mind you that for SOME entities, there are also some entity specific operations that have to be done.
Currently, we have a get/create/update/delete method for all our entities like
getProducts
createProducts
updateProducts
getCustomers
createCustomers
updateCustomers
In each of these methods, we use the Product/Customer repository to perform the CRUD operation AFTER converting from entity to DTO and vice versa.
I feel there is a lot of code repetition and there must be a way by which we can remove so many of these methods.
Can I use some pattern (e.g. the Command pattern) to do away with the code repetition?
Have a look at the Spring Data JPA project. It does away with boilerplate DAO code.
I believe it basically uses AOP to interpret calls like
findByNameAndPassword(String name, String password)
to derive a query from the parameters passed in and the field names embedded in the method name (you only write an interface).
Being a Spring project, it has very minimal requirements on the Spring libraries.
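A minimal sketch of such a repository (User is a made-up entity with name and password fields; Spring Data generates the implementation at runtime):

import java.util.Optional;
import org.springframework.data.jpa.repository.JpaRepository;

// No implementation class is written by hand; the query is derived from the method name.
public interface UserRepository extends JpaRepository<User, Long> {

    Optional<User> findByNameAndPassword(String name, String password);
}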
Basically, you have 2 ways to do this.
First way: Code generation
Write a class that can generate the code given a database schema.
Note that with this approach you will create basic classes for each entity.
If you have custom code (code specific to certain entities) you can put that in subclasses so that it doesn't get overwritten when you regenerate the basic classes.
Object instantiation should be done via factory methods so that the correct subclass is used.
Make sure you add comments in the generated code that clearly states that the code is generated automatically (so that people don't start editing them directly).
Second way: Reflection
This solution, while being more elegant, is also more complex.
Instead of generating one basic class for each entity, you have one basic class that can handle any entity. That class would use reflection to access the DTOs.
If you have custom code (code specific to certain entities) you can put that in other classes. These other classes would be injected into the generic class.
Using reflection would require a strict naming policy for your DTOs.
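A rough sketch of the reflection idea (copying identically named fields from an entity to a DTO; it assumes matching field names and types, which is the strict naming policy mentioned above):

import java.lang.reflect.Field;

public final class ReflectiveMapper {

    // Creates a DTO of the given type and copies every field of the source
    // object whose name also exists on the target type.
    public static <T> T map(Object source, Class<T> targetType) {
        try {
            T target = targetType.getDeclaredConstructor().newInstance();
            for (Field sourceField : source.getClass().getDeclaredFields()) {
                sourceField.setAccessible(true);
                try {
                    Field targetField = targetType.getDeclaredField(sourceField.getName());
                    targetField.setAccessible(true);
                    targetField.set(target, sourceField.get(source));
                } catch (NoSuchFieldException ignored) {
                    // field not present on the DTO; skip it
                }
            }
            return target;
        } catch (ReflectiveOperationException e) {
            throw new IllegalStateException("Cannot map " + source.getClass() + " to " + targetType, e);
        }
    }
}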
Conclusion
I used the first method in a migration project to generate DTO classes for the service interface between the new application server (running Java) and the fat clients, and it worked quite well. We had more than 100 generated DTO classes. I am aware that what you are attempting is slightly different. Editing database records is a generic problem (all projects need it), but there aren't (m)any frameworks for it.
I have been thinking about creating a generic tool or framework for it but I have never gotten around to it.
I have always used the DAO pattern for CRUD operations, with each DAO in charge of accessing a single datasource at a time, and with generics used to support multiple entities.
Now I require the same, with the following changes:
1.- Datasources will be added/removed dynamically at runtime
2.- A unit of work involves, for instance, reading from datasource A, writing to B, and deleting from A if B succeeded. A and B will be interchangeable, which makes me think of some sort of origin/destination mechanism.
3.- Reads will be done against one datasource only
The entities will be the same in all datasources, so I could add a factory that creates a new DAO whenever a datasource is added, which addresses the first point. But I'm not sure how to address the rest.
Is the DAO pattern still suitable? If it is, what needs to be added? Or is there a different approach to this as a whole?
If Spring is part of your application stack, you can use AbstractRoutingDataSource, which gives you the flexibility to add dynamic datasource mappings. If not, go through its source code and build your own logic along similar lines.
On a quick Google search, I came across http://blog.springsource.org/2007/01/23/dynamic-datasource-routing/, which shows this dynamic routing in action.
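A minimal sketch of the routing idea (the ThreadLocal key and the addDataSource method are illustrative choices, not part of the Spring API beyond AbstractRoutingDataSource itself):

import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import javax.sql.DataSource;
import org.springframework.jdbc.datasource.lookup.AbstractRoutingDataSource;

public class RoutingDataSource extends AbstractRoutingDataSource {

    private static final ThreadLocal<String> CURRENT_KEY = new ThreadLocal<>();
    private final Map<Object, Object> targets = new ConcurrentHashMap<>();

    public RoutingDataSource() {
        setTargetDataSources(targets);
    }

    // Register a datasource added at runtime, then re-resolve the target map.
    public void addDataSource(String key, DataSource dataSource) {
        targets.put(key, dataSource);
        afterPropertiesSet();
    }

    // Each unit of work picks its origin/destination key before using the datasource.
    public static void use(String key) {
        CURRENT_KEY.set(key);
    }

    @Override
    protected Object determineCurrentLookupKey() {
        return CURRENT_KEY.get();
    }
}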
This sounds like a business transaction. You need a business component covering the transaction, which involves multiple DAOs.