Inject Spring Boot Service Into Non-Managed Class (Java)

I have a set of classes that all extend from my ReportConfig abstract class. Once created, they are currently immutable and not managed by Spring.
However, I'm finding that sometimes I need to perform an action that requires the use of a particular service, but I'm struggling to find the best way to inject these services into my non-managed ReportConfig instances. See below:
@Service
public class ReportService {

    @Autowired
    private SchemaService schemaService;

    @Autowired
    private ExecutionService executionService;

    @Autowired
    private ReportConfigFactory reportConfigFactory;

    public Result executeReport(ReportRequest request) {
        ReportConfig reportConfig = reportConfigFactory.getReportConfig(request);
        reportConfig.validateAgainstSchema(schemaService.getSchemaForDataset(reportConfig.getDataset()));
        return executionService.execute(reportConfig.getQuery());
    }
}
Now I have no issue with the executionService.execute() line, but I do have an issue with the reportConfig.validateAgainstSchema() line, and I feel an alternative would be something like:
reportConfig.validateAgainstSchema(schemaService);
But I feel that maybe that represents tight coupling.
I'm imagining there's also a way to inject SchemaService straight into ReportConfigs, but not sure if this defeats the object...
Keen to hear your thoughts.
Thanks

From what I understood of your application's current design as given in the question, and guessing at the functionality, here are some thoughts on an alternative design. Consider this line:
reportConfig.validateAgainstSchema(schemaService.getSchemaForDataset(reportConfig.getDataset()));
First, ask whether ReportConfig is the right place to do validateAgainstSchema; if not, think about moving it.
Is SchemaService the right place to do validateAgainstSchema? (I think not.) If that is also not the right place, then create a SchemaValidator or something similar.
This is what I could think of with the information you have given me.
If validateAgainstSchema is not overridden by any child classes of ReportConfig, and the possibility of that is very rare, then you should definitely think of moving it to a different class. The reason I say this is that if the schema changes, you should not have to change the ReportConfig class.
If it is overridden and there are multiple implementations, then one way would be to use a validator factory to get the validator and perform the validation. I don't know whether this suits your use case, but it is all I can come up with from the information in the question.
Hope it helps you in some way.
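As a hedged sketch of the SchemaValidator idea above (every type name here is an illustrative stand-in, not taken from the question's real code), the validation rule moves into a dedicated collaborator, so ReportConfig can stay a plain immutable value with no service references:

```java
import java.util.List;
import java.util.Set;

public class SchemaValidatorSketch {

    // Minimal stand-ins for the question's types (assumed shapes).
    record Schema(Set<String> allowedColumns) {}
    record ReportConfig(String dataset, List<String> columns) {}

    // The validator owns the validation rule; ReportConfig knows nothing
    // about SchemaService, so the coupling the asker worried about is gone.
    static class SchemaValidator {
        boolean isValid(ReportConfig config, Schema schema) {
            return schema.allowedColumns().containsAll(config.columns());
        }
    }

    public static void main(String[] args) {
        Schema schema = new Schema(Set.of("id", "amount"));
        SchemaValidator validator = new SchemaValidator();
        System.out.println(validator.isValid(
                new ReportConfig("sales", List.of("id")), schema));
        System.out.println(validator.isValid(
                new ReportConfig("sales", List.of("id", "region")), schema));
    }
}
```

In a Spring app the SchemaValidator would typically be a @Component injected into ReportService alongside SchemaService.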

Why do you have to inject services into config classes? Could you not just pass what the config needs directly?
Alternatively, if you must, you can use setter-based injection to wire the service into the ReportConfig abstract class, or you can use method injection.
Something like
Setter based Injection
public abstract class ReportConfig {

    protected SchemaService schemaService;

    protected final void setSchemaService(SchemaService schemaService) {
        this.schemaService = schemaService;
    }
}
Method Injection
// BeanUtility must itself be registered as a Spring bean (e.g. @Component)
// so that the ApplicationContext is actually set.
public class BeanUtility implements ApplicationContextAware {

    private ApplicationContext applicationContext;

    public SchemaService getSchemaService() {
        return this.applicationContext.getBean("schemaService", SchemaService.class);
    }

    @Override
    public void setApplicationContext(ApplicationContext applicationContext) throws BeansException {
        this.applicationContext = applicationContext;
    }
}

Your logic is a bit strange.
The correct approach is for another service to receive the ReportConfig; you are doing the opposite.
You can work with a bean inside an object that is not Spring-managed, but you need to look it up from the context:
YourService container = (YourService) context.getBean("yourService");
container.load();
But that is a workaround. The right way is to change your design: create a generic service with the rules for a specific ReportConfig, autowire that service, pass your report config to it, and work with that specific data inside the service.
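A hedged sketch of this answer's suggestion, inverting the dependency so a managed service receives the non-managed config as plain data (ReportValidationSketch and all nested type shapes are hypothetical; in the real app the service class would carry @Service and get SchemaService via constructor injection):

```java
import java.util.Objects;

public class ReportValidationSketch {

    // Assumed shapes for the question's types.
    interface SchemaService { String getSchemaForDataset(String dataset); }
    interface ReportConfig {
        String getDataset();
        boolean validateAgainstSchema(String schema);
    }

    // In the real app this would be a @Service; Spring would supply
    // SchemaService through the constructor.
    static class ReportValidationService {
        private final SchemaService schemaService;

        ReportValidationService(SchemaService schemaService) {
            this.schemaService = Objects.requireNonNull(schemaService);
        }

        // The non-managed ReportConfig is passed in as data.
        boolean validate(ReportConfig config) {
            String schema = schemaService.getSchemaForDataset(config.getDataset());
            return config.validateAgainstSchema(schema);
        }
    }

    public static void main(String[] args) {
        SchemaService schemas = dataset -> "schema:" + dataset;
        ReportConfig config = new ReportConfig() {
            public String getDataset() { return "sales"; }
            public boolean validateAgainstSchema(String schema) {
                return schema.startsWith("schema:");
            }
        };
        System.out.println(new ReportValidationService(schemas).validate(config));
    }
}
```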

You could think of wrapping your services in static holder classes:
public class SchemaServiceHolder {

    private static final SchemaServiceHolder INSTANCE = new SchemaServiceHolder();

    @Autowired
    private SchemaService schemaService;

    private SchemaServiceHolder() {
    }

    public static SchemaServiceHolder instance() {
        return INSTANCE;
    }

    public SchemaService getSchemaService() {
        return schemaService;
    }
}
You then need to create a bean of this type so Spring can inject the service into the singleton SchemaServiceHolder.INSTANCE
@Configuration
public class AppConfig {

    @Bean
    public SchemaServiceHolder schemaServiceHolder() {
        return SchemaServiceHolder.instance();
    }
}
And finally, your objects can reference the schemaService statically
SchemaService schemaService = SchemaServiceHolder.instance().getSchemaService();

The simplest solution in your case to "also a way to inject SchemaService straight into ReportConfigs, but not sure if this defeats the object..." is to pass the object to the factory.
So basically this line:
reportConfigFactory.getReportConfig(request);
would be changed to:
reportConfigFactory.getReportConfig(request, schemaService);
It is totally fine to pass a Spring-managed bean to the constructor of a non-managed class. In your case it is not the constructor directly but a factory method.
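A hedged sketch of that change (the constructor shape of ReportConfig and the ReportRequest type are assumptions for illustration): the managed ReportService hands its SchemaService to the factory, which passes it into the ReportConfig constructor.

```java
public class FactorySketch {

    // Assumed stand-ins for the question's types.
    interface SchemaService { String schemaFor(String dataset); }
    record ReportRequest(String dataset) {}

    // Non-managed class; receives the managed service via its constructor.
    static class ReportConfig {
        private final ReportRequest request;
        private final SchemaService schemaService;

        ReportConfig(ReportRequest request, SchemaService schemaService) {
            this.request = request;
            this.schemaService = schemaService;
        }

        boolean validateAgainstSchema() {
            return schemaService.schemaFor(request.dataset()) != null;
        }
    }

    // In the real app the factory itself would be a Spring bean.
    static class ReportConfigFactory {
        ReportConfig getReportConfig(ReportRequest request, SchemaService schemaService) {
            return new ReportConfig(request, schemaService);
        }
    }

    public static void main(String[] args) {
        SchemaService service = dataset -> "schema:" + dataset;
        ReportConfig config = new ReportConfigFactory()
                .getReportConfig(new ReportRequest("sales"), service);
        System.out.println(config.validateAgainstSchema());
    }
}
```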

Related

Spring @Service generics: Do I need to create a bean for each type?

I'm trying to create a generic CRUD service that I can use for any type of entity. But when I was trying it, I noticed that even though I autowired several different services like
@Autowired
MyService<Item> itemService;

@Autowired
MyService<Students> studentsService;
When I tried the .findAll() method I noticed it was returning items... on both services! So after doing some debugging I noticed that the instances of itemService and studentsService are the same, which would explain what I just mentioned.
To make sure of this I made a small test as follows:
@Autowired
Foo<String> fooStr;

@Autowired
Foo<Long> fooLong;

static int counter = 0;

@Service
class Foo<T> {
    public Foo() {
        counter++;
    }
}

@EventListener(ApplicationReadyEvent.class)
public void doSomethingAfterStartup() {
    System.out.println("hello world, I have just started up " + counter);
}
Basically I set up a generic Foo service and autowire it under what I assume should be two different instances (Foo<String> and Foo<Long>), with a counter in the constructor to check how many times this class is instantiated. However, instead of counter = 2 (one from fooStr and another from fooLong), it's actually 1, which I guess confirms my previous assumption.
So here's my question: do I actually need to declare a bean myself for each generic type if I use @Service? Isn't there an easier way? I have to make lots of CRUDs for a project that are pretty much the same, so I'd really like to avoid having to declare all the beans for each entity type if possible.
By default a @Service is a singleton. In order to get a new instance each time, use the
@Scope("prototype") annotation on your service class.
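To the asker's main question, one common approach is to declare one bean per concrete generic type. This is a sketch, assuming Spring 4+ (whose autowiring resolves generic type parameters, so each injection point matches its own bean) and assuming a hypothetical MyService constructor that takes the entity class:

```java
@Configuration
public class CrudServiceConfig {

    // One bean definition per concrete generic type; the MyService<Item>
    // injection point will match only this bean.
    @Bean
    public MyService<Item> itemService() {
        return new MyService<>(Item.class);
    }

    @Bean
    public MyService<Students> studentsService() {
        return new MyService<>(Students.class);
    }
}
```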

Equivalent of @Conditional in CDI

I have two classes with post-construct initialization, and I need one of them to be injected based on a VM argument. I have done this kind of conditional injection in Spring using the @Conditional annotation; however, I could not find any equivalent in CDI. Can someone please help me with this?
The code goes something like this,
public class Impl1 {
    @PostConstruct
    public void init() {
        ....
    }
    ....
}

public class Impl2 {
    @PostConstruct
    public void init() {
        ...
    }
    ....
}
If VM argument type=1, Impl1 has to be injected, and if type=2, Impl2 has to be injected.
For a runtime decision (without changing your beans.xml), you basically have two options:
Option 1: use a producer method
@Produces
public MyInterface getImplementation() {
    if (runtimeTestPointsTo1) return new Impl1();
    else return new Impl2();
}
Drawback: you leave the world of bean creation by using new, therefore your Impl1 and Impl2 cannot @Inject dependencies. (Or you inject both variants into the producer bean and return one of them, but this means both types will be initialized.)
Option 2: use a CDI-extension
Basically listen to processAnotated() and veto everything you don't want. Excellent blog-entry here: http://nightspawn.com/rants/cdi-alternatives-at-runtime/
Probably the best way is to use an extension. You will create two classes both of which will have the same type so they are eligible for injection into the same injection point. Then, using the extension, you will disable one of them, leaving only one valid (the other will not become a bean).
Extensions can 'hook into' the container lifecycle and affect it. You will want to leverage the ProcessAnnotatedType<T> lifecycle phase (one of the first phases) to tell CDI that a certain class should be vetoed. That means CDI will ignore it and not turn it into a bean.
Note the type parameter T in ProcessAnnotatedType<T>: replace it with the type of your implementation. Then the observer will only be notified once, when that class is picked up by CDI. Alternatively, you can replace T with some type both impls have in common (typically an interface), and the observer will be notified for both (you then need to add logic to determine which class it was notified for).
Here is a snippet using two observers. Each of them will be notified only once, when CDI picks up that given impl, and if it differs from the VM arg, the class is vetoed:
public class MyExtension implements Extension {

    public void observePAT(@Observes ProcessAnnotatedType<Impl1> pat) {
        // resolve your configuration option; you could alternatively place
        // this logic in a no-args constructor and re-use it
        String vmArgumentType = loadVmArg();
        // if the arg does not equal this impl, we do not want it
        if (!vmArgumentType.equals(Impl1.class.getSimpleName())) {
            pat.veto();
        }
    }

    public void observePAT(@Observes ProcessAnnotatedType<Impl2> pat) {
        // resolve your configuration option; you could alternatively place
        // this logic in a no-args constructor and re-use it
        String vmArgumentType = loadVmArg();
        // if the arg does not equal this impl, we do not want it
        if (!vmArgumentType.equals(Impl2.class.getSimpleName())) {
            pat.veto();
        }
    }
}
Create your own @Qualifier and use it to inject the CDI bean:
public class YourBean {

    @Inject
    @MyOwnQualifier
    private BeanInterface myEJB;
}

@Qualifier
@Retention(RetentionPolicy.RUNTIME)
@Target({ElementType.FIELD, ElementType.METHOD})
public @interface MyOwnQualifier {
    YourCondition condition();
}

What is the difference between using Spring beans or not?

Probably I'll get a lot of downvotes, but all this business of whether to use beans or not is so confusing to me. Let's suppose this example:
interface ICurrency {
    String getSymbol();
}

public class CurrencyProcessor {

    private ICurrency currency;

    public CurrencyProcessor(ICurrency currency) {
        this.currency = currency;
    }

    public void doOperation() {
        String symbol = currency.getSymbol();
        System.out.println("Doing process with " + symbol + " currency");
        // Some process...
    }
}
So, to inject the ICurrency implementation, I think I can do it in two ways:
Way 1: Without Spring beans
public class CurrencyOperator {

    private ICurrency currency;
    private CurrencyProcessor processor;

    public void operateDefault() {
        currency = new USDollarCurrency();
        processor = new CurrencyProcessor(currency);
        this.processor.doOperation();
    }
}
Where USDollarCurrency is an ICurrency interface implementation
Way 2: Using Spring beans
@ContextConfiguration(classes = CurrencyConfig.class)
public class CurrencyOperator {

    @Autowired private ICurrency currency;
    @Autowired private CurrencyProcessor processor;

    public void operateDefault() {
        this.processor.doOperation();
    }
}
@Configuration
public class CurrencyConfig {

    @Bean
    public CurrencyProcessor currencyProcessor() {
        return new CurrencyProcessor(currency());
    }

    @Bean
    public ICurrency currency() {
        return new USDollarCurrency();
    }
}
I really don't understand what the benefits of using Spring's beans would be. I read some things, but what I mostly found was about the benefits of DI, and as I understand it, both ways inject the dependency that CurrencyProcessor requires; what changes is the way I create and use objects. Am I wrong? So, concretely, my questions are:
1. What are the benefits of using beans in this case?
2. Why should I use Spring instead of doing it manually like the first way?
3. Talking about performance, which of these cases is better?
Suppose you have two DAO classes, one for Oracle and the second for MySQL, and both implement a DAO interface. You define an implementation as a bean in the Spring configuration file. The business class has an attribute of type DAO, and in the Spring configuration file you choose the real type, whether Oracle or MySQL, to inject, or you use the Spring annotation @Autowired.
This reduces coupling, and it will be easy to move from Oracle to MySQL.
@Service
public class Business {

    @Autowired
    private Dao daoImpl;

    // Business methods that invoke Dao methods
}
In the Spring configuration file (XML file) you use the following:
<bean id="daoImpl" class="app.com.MySQLDaoImpl OR app.com.OracleDaoImpl"/>
By just changing the class attribute of your bean, you change the whole implementation, without any change to your business class! Good luck.
Your example without Spring doesn't use dependency injection!
With dependency injection, the actual implementation of the interface is determined outside the code itself, in order to reduce coupling.
You should be able to swap in another implementation (for example, switching from one JMS client to another...).
To answer your last question: using Spring is (a very little bit) less performant, but much more flexible.
EDIT:
Spring is not the only tool that can be used for DI, but it is the most popular and contains a lot of features. Note that many Java standards (such as JPA) also use DI.

Strategy for many DAOs in Spring Java

We have many DAOs in an existing project (currently with no interfaces, but that can change). Rather than wiring a Spring-managed bean for each DAO class and injecting them into the service layer, we have a DAO "factory" of sorts that looks like this:
public class DAOFactory {
private static DAOFactory daoFac;
static{
daoFac = new DAOFactory();
}
private DAOFactory(){}
public static DAOFactory getInstance(){
return daoFac;
}
public MyDAO1 getMyDAO1(){
return new MyDAO1();
}
public MyDAO2 getMyDAO2(){
return new MyDAO2();
}
...
(Note that MyDAO1 and MyDAO2 are concrete classes)
This allows us to easily add/call DAO methods within the Service layer, without having to 1.) add a DAO interface as a property to the service class 2.) wire the DAO implementation into service method via configuration. (And we sometimes use multiple DAOs in one service class).
DAOFactory.getInstance().getMyDAO1().doSomething();
This strategy has worked for us so far (we haven't had much need for switching implementations), but I'm wondering if there is a better method if we were able to start new? I looked at autowiring the DAOs as beans, but I'd still need to create properties in each service class to represent those DAOs being used. And in a large project, I'm hesitant to start auto-wiring beans anyway - we need to provide visibility for all developers.
It feels like I'm flip-flopping between a.) being tightly-coupled to an implementation, but less code/config overhead and b.) being loosely coupled to interfaces, but requiring plenty of code/configuration overhead.
Is there a better way I'm missing? Something in-between? Opinions welcomed.
I would have all the DAOs as Spring-managed components and inject them into services for loose coupling. Why do you think autowiring beans is bad in a big project?
Just annotate each DAO class with @Component
and replace MyDao myDao = factory.getMyDao() with:
@Autowired
MyDao myDao;
I don't see much coding/configuration overhead with it.
I've taken a couple of different approaches so far with my projects, and haven't really settled on what's "best." And there may not be a "best" but perhaps a "best for your needs."
First, I went with a base service class.
public abstract class BaseService {

    @Autowired FooDAO fooDao;
    @Autowired BarDAO barDao;
    . . .

    protected FooDAO getFooDAO() {
        return this.fooDao;
    }
}
Then in my service classes, I can write simply
Foo foo = getFooDAO().uniqueById(id);
This works, and it keeps my services tidy of all the autowiring and accessor methods for the DAO instance variables. Problem is, I've now got a copy of this base class in each of my services, which, frankly, meh, not that big of a deal. But it also sort of produces a code smell, because it's not using composition over inheritance, and in essence a reason for DI is to encourage composition.
A coworker suggested a factory such as yours, and called it ServiceProvider. We autowire this into our services.
@Component
public class ServiceProvider {

    @Autowired FooDAO fooDao;

    public FooDAO getFooDAO() {
        return this.fooDao;
    }
    . . .
    // yadda yadda
}
Then we have something like what you have:
Foo foo = getServiceProvider().getFooDAO().uniqueById(id);
And that's pretty darned ugly and really doesn't lend itself to clarity. So, we've dabbled with using just the instance of the provider, and naming it something short and sweet like sp. Then we get
Foo foo = this.sp.getFooDAO().uniqueById(id);
And again, it works. And it's probably a better design. And we only autowire the DAOs in one place, rather than into each Service, even though that's not really much of a problem either way. But it makes me feel better even though Me Feeling Better isn't a project requirement (but don't cha think it oughta be?)
I've been thinking we'd combine the two. We'd change BaseService to autowire the ServiceProvider, and then wrap the ugly calls.
public abstract class BaseService {

    @Autowired ServiceProvider serviceProvider;

    protected FooDAO getFooDAO() {
        return this.serviceProvider.getFooDAO();
    }

    protected BarDAO getBarDAO() {
        return this.serviceProvider.getBarDAO();
    }
}
Makes for nicer shorthand in my services, doesn't require me to autowire each DAO into each service which just gets clunky, in my opinion, but also doesn't have a copy of all those references in each service which, you know, is an utterly ridiculous concern.
The problem that I'm left with is stepping through code in the debugger. Stepping in and out of each of those getWhateverDAO() calls is tedious, and adding a possible step through getServiceProvider() doesn't help things either.
But that's where I'm at with this issue. Frankly, I think I spend so much time thinking about this because it's a great way of avoiding all the truly hard problems our application poses.
Good question.
I think it is a pity that you started to use a DAOFactory. Spring is a super-flexible factory, so I really do not understand why you need another one. Autowiring in Spring has a lot of advantages and does not require interfaces, so you can easily switch to using Spring to access DAOs. IMHO it does not reduce but improves visibility for other developers.
Moreover if you are thinking about refactoring of DAO layer take a look on GenericDAO from google code: http://code.google.com/p/hibernate-generic-dao/
I had a very good experience with this library. It saves you time. You actually do not need many DAOs; you need exactly one. You can obviously wrap the generic DAO from Google Code and add your application-specific terminology and functionality, but do not add entity-specific code there. Entity-specific code belongs in the service layer. No more fragile HQL, and no coupling with Hibernate if you use the Hibernate Criteria API. This library supports both Hibernate and JPA, and its API is very simple and strong.
If you use too many DAOs in one service, you should think about splitting it into several lower-level (fine-grained) services.
If you don't want to annotate your DAO classes or have configuration annotations like @Value polluting them, I see two options:
1. Create a @Configuration with DAO @Beans
@Configuration
public class DaoConfiguration {

    @Value("${db.name}")
    private String dbName;

    @Value("${foo.table}")
    private String fooTable;

    @Value("${bar.table}")
    private String barTable;

    // @Bean methods must not be private in a @Configuration class
    @Bean
    public FooDao fooDao() {
        return new FooDao(dbName, fooTable);
    }

    @Bean
    public BarDao barDao() {
        return new BarDao(dbName, barTable);
    }
}
Then create an @Autowired field for the DAO you need:
@Autowired
private FooDao fooDao;
2. Create a DAO factory @Component
Useful if you need to do some cleanup when the DAOs are destroyed.
@Component
public class DaoFactory {

    @Value("${db.name}")
    private String dbName;

    @Value("${foo.table}")
    private String fooTable;

    @Value("${bar.table}")
    private String barTable;

    private FooDao fooDao;
    private BarDao barDao;

    @PostConstruct
    public void init() {
        fooDao = new FooDao(dbName, fooTable);
        barDao = new BarDao(dbName, barTable);
    }

    @PreDestroy
    public void destroy() {
        try {
            fooDao.close();
        } catch (Exception e) {
            log.error("Failed to clean up FooDao", e);
        }
        try {
            barDao.close();
        } catch (Exception e) {
            log.error("Failed to clean up BarDao", e);
        }
    }

    public FooDao fooDao() {
        return fooDao;
    }

    public BarDao barDao() {
        return barDao;
    }
}
Then create an @Autowired field for the factory in the classes where you need DAOs:
@Autowired
private DaoFactory daoFactory;
And use it as:
daoFactory.barDao().findAll();

Using guice for a framework with injected classes, proper way to initialize?

I'm trying to write a framework where arbitrary bean classes are injected with classes from my API, and they can interact with both those classes as well have triggered callbacks based on defined annotations. Here's an example bean:
@Experiment
static class TestExperiment {

    private final HITWorker worker;
    private final ExperimentLog log;
    private final ExperimentController controller;

    @Inject
    public TestExperiment(
            HITWorker worker,
            ExperimentLog log,
            ExperimentController controller) {
        this.worker = worker;
        this.log = log;
        this.controller = controller;
    }

    @SomeCallback
    void callMeBack() {
        ... do something
        log.print("I did something");
    }
}
I'm trying to use Guice to inject these beans and handle the interdependencies between the injected classes. However, I have two problems:
One of the classes I pass in (HITWorker) is already instantiated. I couldn't see how to move this to a Provider without significantly complicating my code. It is also persistent, but not to the Guice-defined session or request scope, so I am managing it myself for now. (Maybe if the other issues are overcome I can try to put this in a provider.)
More importantly, I need a reference to the other injected classes so I can do appropriate things to them. When Guice injects them, I can't access them because the bean class is arbitrary.
Here's some really bad code for what I basically need to do, which I am sure is violating all the proper dependency injection concepts. Note that hitw is the only instance that I need to pass in, but I'm creating the other dependent objects as well because I need references to them. With this code, I'm basically only using Guice for its reflection code, not its dependency resolution.
private void initExperiment(final HITWorkerImpl hitw, final String expId) {
    final ExperimentLogImpl log = new ExperimentLogImpl();
    final ExperimentControllerImpl cont = new ExperimentControllerImpl(log, expManager);

    // Create an experiment instance with specific binding to this HITWorker
    Injector child = injector.createChildInjector(new AbstractModule() {
        @Override
        protected void configure() {
            bind(HITWorker.class).toInstance(hitw);
            bind(ExperimentLog.class).toInstance(log);
            bind(ExperimentController.class).toInstance(cont);
        }
    });

    Object experimentBean = child.getInstance(expClass);
    expManager.processExperiment(expId, experimentBean);

    // Initialize controller, which also initializes the log
    cont.initialize(expId);
    expManager.triggerStart(expId);
    tracker.newExperimentStarted(expId, hitw, cont.getStartTime());
}
Am I screwed and just have to write my own injection code, or is there a way to do this properly? Also, should I just forget about constructor injection for these bean classes, since I don't know what they contain exactly anyway? Is there any way to get the dependencies if I am asking Guice to inject the bean instead of doing it myself?
For context, I've been reading the Guice docs and looking at examples for several days about this, to no avail. I don't think I'm a complete programming idiot, but I can't figure out how to do this properly!
Your "experiment" seems to be something like a "request" in the sense that it has a defined lifecycle and some associated stuff the experiment can pull in at will.
Therefore I think you should wrap all that into a custom scope as described in the docs about Custom Scopes. This matches your case in several points:
You can "seed" the scope with some objects (your HITWorker)
The lifecycle: do "enter scope" before you setup the experiment and "exit scope" after you finished your work.
Access to "shared" stuff like ExperimentLog and ExperimentController: bind them to the scope. Then both the framework and the experiment instance can simply @Inject them and get the same instance.
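The "seeded scope" idea can be illustrated without Guice itself. This is a plain-Java sketch that mimics the SimpleScope pattern described in the Guice custom-scopes docs (ExperimentScope and all names here are hypothetical; a real implementation would implement Guice's Scope interface and be bound to an @ExperimentScoped annotation):

```java
import java.util.HashMap;
import java.util.Map;

public class ExperimentScopeSketch {

    static class ExperimentScope {
        private Map<Class<?>, Object> values;       // one experiment's objects

        void enter() { values = new HashMap<>(); }  // begin an experiment run
        void exit()  { values = null; }             // end the run

        // Seed the scope with an object built outside injection,
        // like the already-instantiated HITWorker.
        <T> void seed(Class<T> type, T instance) {
            values.put(type, instance);
        }

        @SuppressWarnings("unchecked")
        <T> T get(Class<T> type) {
            return (T) values.get(type);
        }
    }

    record HITWorker(String id) {}

    public static void main(String[] args) {
        ExperimentScope scope = new ExperimentScope();
        HITWorker worker = new HITWorker("w-1");    // pre-built instance

        scope.enter();
        scope.seed(HITWorker.class, worker);
        // Anything resolved while the scope is active sees the seeded worker.
        System.out.println(scope.get(HITWorker.class).id());
        scope.exit();
    }
}
```

With real Guice, the provider for HITWorker would consult the active scope map instead of this direct get() call.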
