Inject a dependency via constructor when multiple constructors exist - Java

public abstract class MainService<T extends Managed> {
    private static final Logger logger = LoggerFactory.getLogger(MainService.class);
    protected final ExecutorService executor;
    private final boolean idValidation;
    @Autowired
    private LockValidation lockValidation;

    public MainService() {
        this(null, true);
    }

    public MainService(boolean idValidation) {
        this(null, idValidation);
    }

    public MainService(final ThreadConfig tpConfig) {
        this(tpConfig, true);
    }

    protected MainService(final ThreadConfig tpConfig, final boolean idValidation) {
        // Some code
    }
}
The code above works fine. But I have to replace the @Autowired annotation and inject the component via the constructor instead. The problem is that when I create this constructor:
public MainService(LockValidation lockValidation) {
    this.lockValidation = lockValidation;
}
Instantly, these fields get errors:
    protected final ExecutorService executor;
    private final boolean idValidation;
Variable 'executor' might not have been initialized
And of course, I guess I need to pass some parameters into the new constructor. The question is: how could I refactor this code to inject the component via a constructor instead of the annotation?

Quick fix:
public MainService(LockValidation lockValidation, ThreadConfig tpConfig, boolean idValidation) {
    this(tpConfig, idValidation);
    this.lockValidation = lockValidation;
}
You are not following the Dependency Inversion principle (the D in SOLID) here, because you initialize executor inside the constructor instead of injecting it. The best approach would be an all-arguments constructor.
protected MainService(ThreadConfig tpConfig, boolean idValidation, ExecutorService executor, LockValidation lockValidation) {
    this.tpConfig = tpConfig;
    this.idValidation = idValidation;
    this.executor = executor;
    this.lockValidation = lockValidation;
    // Some code
}

Finally, I realized that adding a new constructor could become a headache, because it would probably force further refactoring of the other constructors. So I decided to inject the component via a setter method instead, like this:
@Autowired
public void setLockValidation(LockValidation lockValidation) {
    this.lockValidation = lockValidation;
}
It worked fine.
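For completeness, a minimal sketch of a pure constructor-injection version that keeps every field final, assuming the elided "// Some code" builds the executor from the ThreadConfig (buildExecutor and getPoolSize are hypothetical names, not from the original code):
public abstract class MainService<T extends Managed> {
    protected final ExecutorService executor;
    private final boolean idValidation;
    private final LockValidation lockValidation;

    protected MainService(ThreadConfig tpConfig, boolean idValidation, LockValidation lockValidation) {
        // buildExecutor stands in for the elided "// Some code"
        this.executor = buildExecutor(tpConfig);
        this.idValidation = idValidation;
        this.lockValidation = lockValidation;
    }

    private static ExecutorService buildExecutor(ThreadConfig tpConfig) {
        // Fall back to a default pool when no config is given
        return tpConfig == null
                ? Executors.newSingleThreadExecutor()
                : Executors.newFixedThreadPool(tpConfig.getPoolSize()); // getPoolSize() is assumed
    }
}
Concrete subclasses then pass their own dependencies up via super(...), and Spring injects LockValidation (and any other beans) through the subclass constructor; since Spring 4.3 a single constructor does not even need @Autowired.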

Related

Java - Getter/Setter, behavior and Interfaces

I have a question, a little bit theoretical:
Assume I have the following classes:
interface ReportInterface {
    void execute();
}

class Report implements ReportInterface {
    private final Repository rep;

    Report(Repository ref) {
        this.rep = ref;
    }

    public void execute() {
        //do some logic
    }
}

class ReportWithSetter implements ReportInterface {
    private final Repository rep;
    private String release;

    ReportWithSetter(Repository ref) {
        this.rep = ref;
    }

    public void execute() {
        if (release == null) throw new IllegalArgumentException("release is not specified");
        //do some logic
    }

    public void setRelease(String release) {
        this.release = release;
    }
}
The second report needs an additional parameter, release, to work properly, but my interface defines the execute method without parameters, so I work around it with a setter method. It would look like this:
ReportWithSetter rep2 = new ReportWithSetter(rep);
rep2.setRelease("R1.1");
rep2.execute();
I don't like this additional setRelease call. It looks weird and artificial - a user of this class may be confused, and, for example, if I make the class a singleton bean in Spring, it becomes a source of potential errors if it is requested a second time and somebody forgets to call setRelease again. Besides putting the parameter into the constructor (I want to make the class a Spring bean), what would be the best practice for handling this situation?
Assuming you are allowed to change the interface, here are a few solutions I can think of:
Solution #1
    void execute(Optional<String> release);
or
    void execute(@Nullable String release);
and then use them for the Report class as execute(Optional.empty()) or execute(null).
Solution #2
    void execute(String... release);
and then use it for the Report class as execute() and for the ReportWithSetter class as execute("R1.1").
Solution #3
Define both void execute(); and void execute(String release); in the interface. Then, while implementing, throw UnsupportedOperationException in the method you don't need. For example, in the Report class you would do:
public void execute() {
    //do some logic
}

public void execute(String release) {
    throw new UnsupportedOperationException("Use the overloaded method");
}
You can also make both of these methods default in the interface, so your implementation classes don't have to worry about implementing the unsupported method, as in the sketch below.
Use whichever is most readable and maintainable for you.
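For illustration, a minimal sketch of the default-method variant mentioned in Solution #3 (the split of logic between the two methods is only an example):
interface ReportInterface {
    // Implementations that don't need a release override only this method.
    default void execute() {
        throw new UnsupportedOperationException("Use execute(String release)");
    }

    // Implementations that need a release override only this one.
    default void execute(String release) {
        throw new UnsupportedOperationException("Use execute()");
    }
}

class Report implements ReportInterface {
    @Override
    public void execute() {
        //do some logic
    }
}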
Solution 1: Spring Dependency Injection - Field Injection:
Spring's Dependency Injection works with reflection, so Setter methods are not required.
So if you make your Report class a Spring bean and use @Autowired to inject another bean, then the setter method is not required.
It would look like this:
@Component
class ReportWithRelease implements ReportInterface {
    @Autowired private Repository rep;
    @Autowired private Release release;

    public void execute() {
        if (release == null) throw new IllegalArgumentException("release is not specified");
        //do some logic
    }
}
I changed "String release" to "Release release", because making a bean of "String" would be also strange. So the "Release" class would have to contain your "String release".
If "String release" contains only some configured value, which does not change at runtime. Then you can use #Value to read its String value from a properties file.
Solution 2: Spring Constructor Injection:
Constructor injection is another option, which is even more recommended.
Then your Report bean would look like this:
@Component
class ReportWithRelease implements ReportInterface {
    private final Repository rep;
    private final Release release;

    @Autowired
    public ReportWithRelease(Repository rep, Release release) {
        this.rep = rep;
        this.release = release;
    }

    public void execute() {
        if (release == null) throw new IllegalArgumentException("release is not specified");
        //do some logic
    }
}
The factory method pattern is a good fit if you want to create instances of different classes implementing the same interface.
class MyFactory {
    private final Repository rep;

    MyFactory(Repository rep) {
        this.rep = rep;
    }

    ReportInterface createInstance(Class<? extends ReportInterface> clazz, String... args) {
        if (Report.class.equals(clazz)) {
            return new Report(rep);
        }
        if (ReportWithSetter.class.equals(clazz)) {
            ReportWithSetter report = new ReportWithSetter(rep);
            report.setRelease(args[0]);
            return report;
        }
        throw new IllegalArgumentException(clazz.getName());
    }
}
Spring of course offers autowiring, but introducing @Autowired should be done systematically, not as a point fix.
Here you can also use a two-stage approach with a factory:
class ReportFactory /*ReportWithSetter*/ {
    private final Repository rep;
    private final String release;
    private final ReportInterface report = ...;

    ReportFactory(Repository rep, String release) {
        this.rep = rep;
        this.release = release;
    }

    public ReportInterface report() {
        return report;
    }
}

new ReportFactory(rep, release).report().execute();

Using Mock for fields that are instances of one class

I have these classes:
public class Sender {
    private final SomeClass firstField;
    private final SomeClass secondField;

    public Sender(SomeClass firstField, SomeClass secondField) {
        this.firstField = firstField;
        this.secondField = secondField;
    }
}

@RunWith(MockitoJUnitRunner.class)
public class SenderTest {
    @Mock
    private SomeClass firstField;
    @Mock
    private SomeClass secondField;
    @InjectMocks
    private Sender sender;
}
Everything looks fine, but it seems that the same object gets injected into both fields, or something like that. When I try to use when(..).thenReturn() for one field, the stubbing is applied to the other one and vice versa. Strangest of all, it works fine in debug mode. What can you say?
Mockito has some problems with constructor injection of two or more fields of the same type. But it works perfectly if you use setter injection.
So you can refactor the Sender class like this:
public class Sender {
    private SomeClass firstField;
    private SomeClass secondField;

    public void setFirstField(SomeClass firstField) {
        this.firstField = firstField;
    }

    public void setSecondField(SomeClass secondField) {
        this.secondField = secondField;
    }
}
Remember that if the class has both a constructor and setters, Mockito will choose the constructor for injection and completely ignore the setters.
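For illustration, a minimal sketch of the test side with that setter-based Sender; it relies on Mockito's property injection matching mocks to fields by type and, when several fields share a type, by name, which is why the mock field names mirror Sender's field names:
@RunWith(MockitoJUnitRunner.class)
public class SenderTest {
    // Field names deliberately match Sender's field names so Mockito can tell the two SomeClass mocks apart
    @Mock
    private SomeClass firstField;
    @Mock
    private SomeClass secondField;

    @InjectMocks
    private Sender sender; // no constructor on Sender, so Mockito falls back to setter/property injection

    @Test
    public void injectsTwoDistinctMocks() {
        // with setter injection each field should receive its own mock
        Assert.assertNotSame(firstField, secondField);
    }
}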
Edit: if you definitely need to use the constructor for some reason, you can always mock the fields manually instead of using Mockito annotations.
So in your case Sender would stay the same and SenderTest would look like this:
public class SenderTest {
    private SomeClass firstField;
    private SomeClass secondField;
    private Sender sender;

    @Before
    public void setUp() {
        firstField = Mockito.mock(SomeClass.class);
        secondField = Mockito.mock(SomeClass.class);
        sender = new Sender(firstField, secondField);
    }

    @Test
    public void smokeTest() {
    }
}
It depends on what SomeClass itself is. If it is a data (POJO) object, it's worth creating real instances in the test (e.g. filled with randomly generated values).
If it is a service, it can be a sign of an architecture problem: why do you need two copies of the same service? It probably makes sense to do some refactoring.

Constructor DTO container pattern

Has anyone seen a pattern whereby Java constructor parameters are wrapped in a DTO object and then injected into the class using Spring? I believe it is an anti-pattern, but I am curious whether there is a reasonable case where it could be used.
In my mind the ideal thing to do is refactor the code so that the single constructor-parameter object is no longer needed.
What are everyone's thoughts?
My particular example is of the form:
@Service
public class MyClass {
    private MyClassConstructorDto constructorDependencyContainer;

    @Autowired
    public MyClass(MyClassConstructorDto constructorDependencyContainer) {
        this.constructorDependencyContainer = constructorDependencyContainer;
    }

    public void doSomething() {
        constructorDependencyContainer.getCoolDependency().performThing();
    }
}
with the supporting class:
@Component
public class MyClassConstructorDto {
    private CoolDependency coolDependency;

    public MyClassConstructorDto(CoolDependency coolDependency) {
        this.coolDependency = coolDependency;
    }

    public CoolDependency getCoolDependency() {
        return coolDependency;
    }
}
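For comparison, a minimal sketch of the refactoring the question alludes to, where CoolDependency is injected directly and the container DTO disappears (since Spring 4.3 a single constructor needs no @Autowired):
@Service
public class MyClass {
    private final CoolDependency coolDependency;

    // Spring resolves the dependency directly; no MyClassConstructorDto needed
    public MyClass(CoolDependency coolDependency) {
        this.coolDependency = coolDependency;
    }

    public void doSomething() {
        coolDependency.performThing();
    }
}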

A known design pattern for dynamic factory

Does this have a proper name?
public class SomethingFactory {
    private final String someParameter;

    public SomethingFactory(String someParameter) {
        this.someParameter = someParameter;
    }

    public Something create(String anotherParameter) {
        return new Something(someParameter, anotherParameter);
    }
}

public class Something {
    public final String someParameter;
    public final String anotherParameter;

    public Something(String someParameter, String anotherParameter) {
        this.someParameter = someParameter;
        this.anotherParameter = anotherParameter;
    }
}
What's different from a regular factory is that you have to pass a parameter to create() at runtime whenever you need to create an object.
That way you can make a singleton factory within the Spring context, for example, configuring the first half of the parameters there, and then supply the rest of the parameters at runtime when you call create().
Why I need that in the first place, if you're curious:
I used to have regular singleton objects in the Spring context, and that was fine in thread-per-request applications, but now my whole app is non-blocking and I can't use ThreadLocal to keep stuff throughout entire request processing - for example, to keep info on timings with something like Apache StopWatch.
I needed to find a way to implement a "request scope" in a multithreaded, non-blocking environment without having to pass the object representing the scope into every method of my code (that would be silly).
So I thought: let's make every (service) class take this scope object in its constructor and create those classes on every request, but that goes against the singletons. The singletons we're talking about are things like a UserService that logs a user in, or a CryptoService that generates digital signatures. They're configured once in Spring, injected wherever needed, and everything's OK. But now I would need to create those service classes in every method where they're needed, instead of just referencing an injected singleton instance.
So I thought: let's call those singletons "templates", and whenever you need an actual instance you call create(), supplying the said scope object. That way every class has the scope object; you just have to keep supplying it into other template service constructors. The full thing would look like this:
public class UserService {
    private final Scope scope;
    private final Template t;

    private UserService(Template t, Scope scope) {
        this.t = t;
        this.scope = scope;
    }

    public void login(String username) {
        scope.timings.probe("before calling database");
        t.database.doSomething(username);
        scope.timings.probe("after calling database");
    }

    public static class Template { /* The singleton configured in Spring */
        private Database database;

        public void setDatabase(Database database) { /* Injected by Spring */
            this.database = database;
        }

        public UserService create(Scope scope) {
            return new UserService(this, scope);
        }
    }
}

public class LoginHttpHandler { /* Also a Spring singleton */
    private UserService.Template userServiceT;

    public void setUserServiceT(UserService.Template userServiceT) { /* Injected by Spring */
        this.userServiceT = userServiceT;
    }

    public void handle(HttpContext context) { /* Called on every http request */
        userServiceT.create(context.scope).login("billgates");
    }
}
In Spring you'd just declare a UserService.Template bean with the appropriate dependencies it needs and then inject that bean wherever a UserService is needed.
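A minimal Java-config sketch of that wiring; the configuration class name and the Database bean are assumed:
@Configuration
public class ServiceTemplateConfig {

    @Bean
    public UserService.Template userServiceTemplate(Database database) {
        // The template is the singleton; actual UserService instances are created per request via create()
        UserService.Template template = new UserService.Template();
        template.setDatabase(database);
        return template;
    }
}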
I just call that a "template". But like always I feel it's already been done. Does it have any name?
That is almost the example given for Guice's AssistedInject:
public class RealPaymentFactory implements PaymentFactory {
    private final Provider<CreditService> creditServiceProvider;
    private final Provider<AuthService> authServiceProvider;

    @Inject
    public RealPaymentFactory(Provider<CreditService> creditServiceProvider, Provider<AuthService> authServiceProvider) {
        this.creditServiceProvider = creditServiceProvider;
        this.authServiceProvider = authServiceProvider;
    }

    public Payment create(Date startDate, Money amount) {
        return new RealPayment(creditServiceProvider.get(), authServiceProvider.get(), startDate, amount);
    }
}

public class RealPayment implements Payment {
    public RealPayment(
            CreditService creditService, // from the Injector
            AuthService authService,     // from the Injector
            Date startDate,              // from the instance's creator
            Money amount)                // from the instance's creator
    {
        ...
    }
}
Assisted injection is used to "create classes that need extra arguments at construction time".
Also, this is similar to partial application, so you could have a PartialUserService that creates a UserService.
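For comparison, a minimal sketch of what the assisted-inject version of that factory could look like with Guice's FactoryModuleBuilder; the split between injector-provided and caller-provided arguments follows the answer's example:
public interface PaymentFactory {
    Payment create(Date startDate, Money amount);
}

public class RealPayment implements Payment {
    @Inject
    public RealPayment(CreditService creditService,   // from the Injector
                       AuthService authService,        // from the Injector
                       @Assisted Date startDate,       // from the factory caller
                       @Assisted Money amount) {       // from the factory caller
        // ...
    }
}

public class PaymentModule extends AbstractModule {
    @Override
    protected void configure() {
        // Guice generates the PaymentFactory implementation
        install(new FactoryModuleBuilder()
                .implement(Payment.class, RealPayment.class)
                .build(PaymentFactory.class));
    }
}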

DeltaSpike custom ConfigSource with CDI

I am trying to define a custom DeltaSpike ConfigSource. The custom config source will have the highest priority and check the database for the config parameter.
I have a ConfigParameter entity that simply has a key and a value.
@Entity
@Cacheable
public class ConfigParameter ... {
    private String key;
    private String value;
}
I have a @Dependent DAO that finds all config parameters.
What I am trying to do now is define a custom ConfigSource that is able to get the config parameters from the database. Therefore, I want to inject my DAO into the ConfigSource. So basically something like:
@ApplicationScoped
public class DatabaseConfigSource implements ConfigSource {

    @Inject
    private ConfigParameterDao configParameterDao;

    ....
}
However, when registering the ConfigSource via META-INF/services/org.apache.deltaspike.core.spi.config.ConfigSource, the class will be instantiated and CDI will not work.
Is there any way to get CDI working in this case?
Thanks in advance, if you need any further information, please let me know.
The main problem is that the ConfigSource gets instantiated very early on, when the BeanManager is not available yet. Even a JNDI lookup does not work at that point in time. Thus, I need to delay the injection/lookup.
What I did now is add a static boolean to my config source that I set manually. We have an InitializerService that makes sure the system is set up properly. At the end of the initialization process, I call allowInitialization() in order to tell the config source that the bean is injectable now. The next time the ConfigSource is asked for a value, it will be able to inject the bean using BeanProvider.injectFields.
public class DatabaseConfigSource implements ConfigSource {

    private static boolean allowInit;

    @Inject
    private ConfigParameterProvider configParameterProvider;

    @Override
    public int getOrdinal() {
        return 500;
    }

    @Override
    public String getPropertyValue(String key) {
        initIfNecessary();
        if (configParameterProvider == null) {
            return null;
        }
        return configParameterProvider.getProperty(key);
    }

    public static void allowInitialization() {
        allowInit = true;
    }

    private void initIfNecessary() {
        if (allowInit) {
            BeanProvider.injectFields(this);
        }
    }
}
I have a request-scoped bean that holds all my config variables for type-safe access.
@RequestScoped
public class Configuration {

    @Inject
    @ConfigProperty(name = "myProperty")
    private String myProperty;

    @Inject
    @ConfigProperty(name = "myProperty2")
    private String myProperty2;

    ....
}
When the Configuration class is injected into a different bean, each ConfigProperty is resolved. Since my custom DatabaseConfigSource has the highest ordinal (500), it is used for property resolution first. If the property is not found, resolution is delegated to the next ConfigSource.
For each ConfigProperty, the getPropertyValue method of the DatabaseConfigSource is called. Since I do not want to retrieve the parameters from the database for every config property, I moved the config property resolution into a request-scoped bean.
@RequestScoped
public class ConfigParameterProvider {

    @Inject
    private ConfigParameterDao configParameterDao;

    private Map<String, String> configParameters = new HashMap<>();

    @PostConstruct
    public void init() {
        List<ConfigParameter> configParams = configParameterDao.findAll();
        configParameters = configParams.stream()
                .collect(toMap(ConfigParameter::getKey, ConfigParameter::getValue));
    }

    public String getProperty(String key) {
        return configParameters.get(key);
    }
}
I could of course change the request-scoped ConfigParameterProvider to @ApplicationScoped. However, we have a multi-tenant setup and the parameters need to be resolved per request.
As you can see, this is a bit hacky, because we need to explicitly tell the ConfigSource when it is allowed to be instantiated properly (i.e. have the bean injected).
I would prefer a standardized solution from DeltaSpike for using CDI in a ConfigSource. If you have any idea on how to properly realise this, please let me know.
Even though this post has been answered already, I'd like to suggest another possible solution for this problem.
I managed to load properties from my DB service by creating a @Singleton @Startup EJB which extends org.apache.deltaspike.core.impl.config.BaseConfigSource and injects my DAO as a delegate, and which I then registered in org.apache.deltaspike.core.api.config.ConfigResolver.
@Startup
@Singleton
public class DatabaseConfigSourceBean extends BaseConfigSource {

    private static final Logger logger = LoggerFactory.getLogger(DatabaseConfigSourceBean.class);

    private @Inject PropertyService delegateService;

    @PostConstruct
    public void onStartup() {
        ConfigResolver.addConfigSources(Collections.singletonList(this));
        logger.info("Registered the DatabaseConfigSourceBean in the ConfigSourceProvider ...");
    }

    @Override
    public Map<String, String> getProperties() {
        return delegateService.getProperties();
    }

    @Override
    public String getPropertyValue(String key) {
        return delegateService.getPropertyValue(key);
    }

    @Override
    public String getConfigName() {
        return DatabaseConfigSourceBean.class.getSimpleName();
    }

    @Override
    public boolean isScannable() {
        return true;
    }
}
I know that creating an EJB for this purpose adds quite a bit of overhead, but I think it's a cleaner solution than handling this problem with marker booleans and static accessors ...
DeltaSpike uses the Java SE ServiceLoader (SPI) mechanism for this, which is not CDI-injectable. One solution would be to use the BeanProvider to get hold of your DatabaseConfigSource bean and delegate operations to it, as sketched below.
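A minimal sketch of that delegation idea, assuming the actual lookup logic lives in a CDI bean (DatabaseConfigBean is a hypothetical name) that is resolved lazily through BeanProvider:
public class DatabaseConfigSource implements ConfigSource {

    @Override
    public int getOrdinal() {
        return 500;
    }

    @Override
    public String getPropertyValue(String key) {
        // optional = true: return null instead of throwing when the bean cannot be resolved yet
        DatabaseConfigBean bean = BeanProvider.getContextualReference(DatabaseConfigBean.class, true);
        return bean != null ? bean.getProperty(key) : null;
    }

    @Override
    public Map<String, String> getProperties() {
        DatabaseConfigBean bean = BeanProvider.getContextualReference(DatabaseConfigBean.class, true);
        return bean != null ? bean.getProperties() : Collections.emptyMap();
    }

    @Override
    public String getConfigName() {
        return "database-config-source";
    }

    @Override
    public boolean isScannable() {
        return false;
    }
}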
