I have been refactoring a huge method in the project I work on and came up with the idea of creating a validation service like this:
public class TrickyValidation {

    String validationVariable1;
    String validationVariable2;
    String validationVariable3;
    HashMap<String, Object> itemsMap;
    Object dependentObject;

    @Autowired
    SpringService service;

    public static boolean doTrickyValidation(HashMap<String, Object> itemsMap, Object dependentObject) {
        return new TrickyValidation(itemsMap, dependentObject).validate();
    }

    private TrickyValidation(HashMap<String, Object> itemsMap, Object dependentObject) {
        this.itemsMap = itemsMap;
        this.dependentObject = dependentObject;
        init();
    }

    private boolean validate() {
        // loads of validation logic using the validationVariables
        return true;
    }

    private void init() {
        // Some methods to extract these variables from itemsMap, dependentObject etc.
        this.validationVariable1 = service.get(dependentObject);
        this.validationVariable2 = ...;
        this.validationVariable3 = ...;
    }
}
My goal here is to encapsulate everything as much as possible and follow clean code principles.
I feel a bit like I'm fighting the Spring framework here, because I don't want the TrickyValidation class to be a @Service and belong to the Spring container. Will @Autowired even work here?
Is it a good design? Most likely I will use this validation in a loop. I like this solution because when I have to validate things, I simply call the one and only public static method of this class: TrickyValidation.doTrickyValidation(map, obj).
Any suggestions are welcome on how to improve this, or why it's a bad idea.
This code probably won't work, because the init method tries to access service, which is not autowired into this instance. In general, autowiring only works for objects managed (created) by Spring.
In this case you create the TrickyValidation object "manually"...
IMO the better design is to split this into a "Validator" object that can be Spring-managed and the validation itself, which is not Spring-based.
@Component
public class Validator {

    @Autowired
    private Service service;

    public boolean doTrickyValidation(HashMap<String, Object> itemsMap, Object dependentObject) {
        // resolve the validation strategy from the items passed to this method
        TrickyValidation validation = resolveTrickyValidation(itemsMap, dependentObject);
        return validation.validate();
    }

    private TrickyValidation resolveTrickyValidation(...) {
        // construct the proper validation strategy
        // access service if you want
    }
}
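For completeness, here is a minimal sketch of the plain TrickyValidation that resolveTrickyValidation could construct. It is not from the original posts; the constructor parameters are assumptions based on the question's fields, the point being that the value object receives everything it needs and never touches Spring.

// Plain object, not managed by Spring; all inputs are passed in by the Validator.
public class TrickyValidation {

    private final String validationVariable1;
    private final String validationVariable2;

    TrickyValidation(String validationVariable1, String validationVariable2) {
        this.validationVariable1 = validationVariable1;
        this.validationVariable2 = validationVariable2;
    }

    boolean validate() {
        // loads of validation logic using the validationVariables
        return validationVariable1 != null && validationVariable2 != null;
    }
}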
I am trying to define a custom DeltaSpike ConfigSource. The custom config source will have the highest priority and check the database for the config parameter.
I have a ConfigParameter entity, that simply has a key and a value.
@Entity
@Cacheable
public class ConfigParameter ... {
    private String key;
    private String value;
}
I have a @Dependent DAO that finds all config parameters.
What I am trying to do now, is define a custom ConfigSource, that is able to get the config parameter from the database. Therefore, I want to inject my DAO in the ConfigSource. So basically something like
@ApplicationScoped
public class DatabaseConfigSource implements ConfigSource {

    @Inject
    private ConfigParameterDao configParameterDao;
    ....
}
However, when registering the ConfigSource via META-INF/services/org.apache.deltaspike.core.spi.config.ConfigSource, the class will be instantiated and CDI will not work.
Is there any way to get CDI working in this case?
Thanks in advance. If you need any further information, please let me know.
The main problem is that the ConfigSource gets instantiated very early on, when the BeanManager is not yet available. Even a JNDI lookup does not work at that point in time. Thus, I need to delay the injection/lookup.
What I did is add a static boolean flag to my config source that I set manually. We have an InitializerService that makes sure the system is set up properly. At the end of the initialization process, I call allowInitialization() to tell the config source that the bean is now injectable. The next time the ConfigSource is asked for a value, it injects the bean using BeanProvider.injectFields.
public class DatabaseConfigSource implements ConfigSource {

    private static boolean allowInit;

    @Inject
    private ConfigParameterProvider configParameterProvider;

    @Override
    public int getOrdinal() {
        return 500;
    }

    @Override
    public String getPropertyValue(String key) {
        initIfNecessary();
        if (configParameterProvider == null) {
            return null;
        }
        return configParameterProvider.getProperty(key);
    }

    public static void allowInitialization() {
        allowInit = true;
    }

    private void initIfNecessary() {
        if (allowInit) {
            BeanProvider.injectFields(this);
        }
    }
}
I have a request-scoped bean that holds all my config variables for type-safe access.
@RequestScoped
public class Configuration {

    @Inject
    @ConfigProperty(name = "myProperty")
    private String myProperty;

    @Inject
    @ConfigProperty(name = "myProperty2")
    private String myProperty2;
    ....
}
When injecting the Configuration class in a different bean, each ConfigProperty will be resolved. Since my custom DatabaseConfigSource has the highest ordinal (500), it will be used for property resolution first. If the property is not found, it will delegate the resolution to the next ConfigSource.
For each @ConfigProperty, the getPropertyValue function of the DatabaseConfigSource is called. Since I do not want to retrieve the parameters from the database for each config property, I moved the config property resolution to a request-scoped bean.
@RequestScoped
public class ConfigParameterProvider {

    @Inject
    private ConfigParameterDao configParameterDao;

    private Map<String, String> configParameters = new HashMap<>();

    @PostConstruct
    public void init() {
        List<ConfigParameter> configParams = configParameterDao.findAll();
        configParameters = configParams.stream()
                .collect(toMap(ConfigParameter::getKey, ConfigParameter::getValue));
    }

    public String getProperty(String key) {
        return configParameters.get(key);
    }
}
I could of course change the request-scoped ConfigParameterProvider to be application-scoped. However, we have a multi-tenant setup and the parameters need to be resolved per request.
As you can see, this is a bit hacky, because we need to explicitly tell the ConfigSource when it is allowed to be initialized properly (inject the bean).
I would prefer a standardized solution from DeltaSpike for using CDI in a ConfigSource. If you have any idea on how to properly realise this, please let me know.
Even though this post has already been answered, I'd like to suggest another possible solution to this problem.
I managed to load properties from my DB service by creating a @Singleton @Startup EJB which extends org.apache.deltaspike.core.impl.config.BaseConfigSource and injects my DAO as a delegate, and which I then registered with the org.apache.deltaspike.core.api.config.ConfigResolver.
@Startup
@Singleton
public class DatabaseConfigSourceBean extends BaseConfigSource {

    private static final Logger logger = LoggerFactory.getLogger(DatabaseConfigSourceBean.class);

    private @Inject PropertyService delegateService;

    @PostConstruct
    public void onStartup() {
        ConfigResolver.addConfigSources(Collections.singletonList(this));
        logger.info("Registered the DatabaseConfigSourceBean in the ConfigSourceProvider ...");
    }

    @Override
    public Map<String, String> getProperties() {
        return delegateService.getProperties();
    }

    @Override
    public String getPropertyValue(String key) {
        return delegateService.getPropertyValue(key);
    }

    @Override
    public String getConfigName() {
        return DatabaseConfigSourceBean.class.getSimpleName();
    }

    @Override
    public boolean isScannable() {
        return true;
    }
}
I know that creating an EJB for this purpose produces way too much overhead, but I think it's a bit of a cleaner solution than handling this problem with marker booleans and static accessors...
DeltaSpike uses the Java SE SPI mechanism for this, which is not CDI-injectable. One solution would be to use the BeanProvider to get hold of your DatabaseConfigSource and delegate operations to it.
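As a rough sketch of that delegation idea (not from the original posts): the SPI-registered ConfigSource forwards every call to a CDI bean looked up via BeanProvider. DatabaseConfigBean and its methods are illustrative names, and the lookup still has to happen after the CDI container has started, so the early-bootstrap caveat from the accepted answer applies here as well.

// Registered via META-INF/services; instantiated by the SPI, not by CDI.
public class DelegatingConfigSource implements ConfigSource {

    @Override
    public String getPropertyValue(String key) {
        // Look up the CDI-managed bean lazily and delegate to it; "true" means
        // optional, so null is returned if the bean cannot be resolved.
        DatabaseConfigBean bean =
                BeanProvider.getContextualReference(DatabaseConfigBean.class, true);
        return bean != null ? bean.getPropertyValue(key) : null;
    }

    @Override
    public Map<String, String> getProperties() {
        DatabaseConfigBean bean =
                BeanProvider.getContextualReference(DatabaseConfigBean.class, true);
        return bean != null ? bean.getProperties() : Collections.emptyMap();
    }

    @Override
    public int getOrdinal() {
        return 500;
    }

    @Override
    public String getConfigName() {
        return DelegatingConfigSource.class.getSimpleName();
    }

    @Override
    public boolean isScannable() {
        return true;
    }
}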
I am building an HTTP API client that needs to call out to a specific endpoint, like so:
public class MyApiClient {
    private static final String ENDPOINT = "http://myapi....";
}
Here the endpoint won't change, so it's constant. However, I want to be able to override this for testing, so that I can test against a mock HTTP server, for example.
What's the best way to do this? Is it just to make it an instance variable and give it a default value:
private String endpoint = "http://myapi....";

public void setEndpoint(String endpoint) {
    ...
}
Well, there are of course many solutions to this and one way of doing it is to use a system property with a default value:
private static final String DEFAULT_ENDPOINT = "http://myapi....";
private static final String ENDPOINT =
System.getProperty("my.endpoint", DEFAULT_ENDPOINT);
This way you get a configurable way of solving your problem. If you need even more flexibility when initializing your static constants you could also use a static initializer:
private static final String ENDPOINT;

static {
    // do initialization here but do not throw any exceptions (bad practice)
    // you can e.g. read from files etc...
    // Then assign your constant...
    ENDPOINT = ...
}
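For illustration, here is a minimal sketch of such a static initializer that reads the endpoint from a classpath properties file and otherwise keeps a default. The file name config.properties and the key my.endpoint are assumptions made for this example.

private static final String ENDPOINT;

static {
    String value = "http://myapi...."; // fallback default
    try (InputStream in = MyApiClient.class.getResourceAsStream("/config.properties")) {
        if (in != null) {
            Properties props = new Properties();
            props.load(in);
            value = props.getProperty("my.endpoint", value);
        }
    } catch (IOException e) {
        // keep the default; per the advice above, do not throw from the initializer
    }
    ENDPOINT = value;
}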
System properties are passed on the command line as -D parameters e.g:
java -Dmy.endpoint=http://...
But in my opinion, an even better approach is to actually inject the value into the class that is using it:
public class ClassThatIsUsingTheConfig {

    private final String endpoint;

    public ClassThatIsUsingTheConfig(final String endpoint) {
        this.endpoint = endpoint;
    }

    public void someMethod() {
        // use endpoint
    }
}
And then, make the selection of which endpoint to use in the caller class. From a test case, this will be very easy to mock.
public class MyTest {
    @Test
    public void testMethod() {
        ClassThatIsUsingTheConfig var = new ClassThatIsUsingTheConfig(TEST_ENDPOINT);
        var.someMethod();
    }
}

public class MyProdClass {
    public void prodMethod() {
        ClassThatIsUsingTheConfig var = new ClassThatIsUsingTheConfig(PROD_ENDPOINT);
        var.someMethod();
    }
}
You can read more about dependency injection here.
On a side note, if you are using some kind of framework for managing dependencies, such as the Spring Framework or CDI, it is common to be able to inject properties and constants in various ways (e.g. based on which environment is currently running). For example, when using the Spring Framework you can declare all your constants in a property file and inject the property using annotations:
@Autowired
public ClassThatIsUsingTheConfig(@Value("${my.endpoint}") final String endpoint) {
    this.endpoint = endpoint;
}
The property file for prod could be along the lines of:
my.endpoint=http://prodserver...
whereas the property file for test would look like this:
my.endpoint=http://testserver...
The approach of using a dependency injection engine allows for a very flexible way of handling external constants, paths, resources, etc., and simplifies your life when it comes to testing the code.
I am using Spring DI to wire my components and I came across this issue.
I have a BaseService class which has multiple implementations. The layer above it has a builder which calls the service to get data to populate POJOs. The service implementation I need to call (ServiceA, ServiceB) changes according to the type of POJO I need to build.
In such a case, how can I autowire the service, given that it requires late binding? How can I tackle this kind of scenario? (An example in Spring DI would really help.)
I read similar questions but could not find the answer. I also read that SOA patterns such as Service Host provide different solutions to this exact use case.
Please help.
Thanks
How about using a FactoryBean:
public class BuilderFactory implements FactoryBean<Builder> {

    @Autowired
    private ApplicationContext appContext;
    ...

    @Override
    public Builder getObject() {
        Builder builder = new Builder();
        switch (something()) {
            case "foo":
                builder.service = new ServiceA();
                break;
            case "bar":
                builder.service = new ServiceB();
                break;
            ...
            default:
                // handle cases where it's unclear which type to create
        }
        return builder;
    }
}
where Builder instances have a public/package-private field BaseService service that gets used in their getData(), buildPojos() and whatever other methods.
(you could also use static factory methods to instantiate Builder if you want this field to be private)
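As a rough illustration of that Builder shape (a sketch, not the original poster's code; MyPojo, SomeData and fetchData() are placeholders for whatever the real builder works with):

public class Builder {

    // Assigned by the BuilderFactory; package-private so the factory can set it.
    BaseService service;

    public MyPojo buildPojo() {
        // delegate data retrieval to whichever BaseService implementation was injected
        return new MyPojo(getData());
    }

    private SomeData getData() {
        return service.fetchData(); // fetchData() is a hypothetical BaseService method
    }
}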
You can use ServiceLocatorFactoryBean. In your case you would do something like this:
public interface BaseServiceLocator {
    BaseService lookup(String qualifier); // use whatever qualifier type makes sense here
}
<bean id="serviceLocatorFactoryBean"
      class="org.springframework.beans.factory.config.ServiceLocatorFactoryBean">
    <property name="serviceLocatorInterface"
              value="your.package.BaseServiceLocator" />
</bean>
Then your builder would look something like this:
public class Builder {

    @Autowired
    private BaseServiceLocator baseServiceLocator;

    @Override
    public YourReturnType businessMethod() {
        SomeData data = getData();
        BaseService baseService = baseServiceLocator.lookup(data.getType()); // here I am assuming that getType() is a String
        // whatever
    }
}
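With ServiceLocatorFactoryBean, the String passed to lookup() is used as the bean name, so each BaseService implementation must be registered under the name that getType() returns. A small sketch using component annotations; the bean names "typeA" and "typeB" are placeholders:

@Component("typeA")
public class ServiceA implements BaseService {
    // ...
}

@Component("typeB")
public class ServiceB implements BaseService {
    // ...
}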
I had the same requirement in one of my projects. I used reflection to look up the services according to the POJO at hand. This way there are no hard-coded values, and even if you define a new POJO and service in the future, you won't have to change any implementation.
I named my POJOs and services similarly, i.e. POJO name: Pond5DownloadStrategy and service name: Pond5DownloadStrategyService.
I defined all the services in Spring, and I had a DownloadStrategyFactory, also instantiated as a Spring bean, with a single method getService(Object obj).
What the getService method does is get the POJO name as a string using obj.getClass().getSimpleName() and then append "Service" to the end. For example, if I pass a Pond5DownloadStrategy, I do AppContext.getBean("Pond5DownloadStrategyService");
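A small sketch of such a factory, assuming it holds a reference to the ApplicationContext (here via ApplicationContextAware) and that the bean-name convention is the one described above:

public class DownloadStrategyFactory implements ApplicationContextAware {

    private ApplicationContext appContext;

    @Override
    public void setApplicationContext(ApplicationContext appContext) {
        this.appContext = appContext;
    }

    public BaseService getService(Object pojo) {
        // e.g. Pond5DownloadStrategy -> bean named "Pond5DownloadStrategyService"
        String beanName = pojo.getClass().getSimpleName() + "Service";
        return appContext.getBean(beanName, BaseService.class);
    }
}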
Please look at my answer here.
Although it is under the Spring Batch topic, it's actually related to your question and to the Strategy design pattern.
StrategyA and StrategyB are your ServiceA, ServiceB, etc.
You need to use the StrategyLocator in your Builder class (its equivalent in the original answer is MyTaskelt). The lookup will be based on your POJO type.
strategy = strategyLocator.lookup(POJOs.class);
In that answer I suggested a PlugableStrategyMapper, but if you predefine all services you can place them in a Map in the application-context.xml.
For example, for manual binding:
public class Builder {

    @Autowired
    private Map<String, Service> services;

    // Bind pojo classes to bean names.
    private Map<Class<?>, String> binding;

    public Service getService(Object object) {
        return services.get(binding.get(object.getClass()));
    }

    public Map<Class<?>, String> getBinding() {
        return binding;
    }

    public void setBinding(Map<Class<?>, String> binding) {
        this.binding = binding;
    }
}
However, manual binding can be repetitive, so if you don't really need this flexibility, you could use a naming convention (@AmitChotaliya's answer) or enforce the binding via a Service method.
public interface Service {
    Class<?> getTargetType();
}

public class Builder {

    @Autowired
    private Set<Service> services;

    // Bind pojo classes to Services.
    private Map<Class<?>, Service> binding = new ConcurrentHashMap<Class<?>, Service>();

    @PostConstruct
    public void init() {
        for (Service service : services) {
            binding.put(service.getTargetType(), service);
        }
    }

    public Service getService(Object object) {
        return binding.get(object.getClass());
    }
}
I would like to be able to change the Guice injections at runtime to support multiple injections based on user input. This is what I would like to achieve:
public interface IDao {
    public int someMethod();
}

public class DaoEarth implements IDao {
    @Override
    public int someMethod() { ... }
}

public class DaoMars implements IDao {
    @Override
    public int someMethod() { ... }
}
public class MyClass {

    @Inject
    private IDao myDao;

    public int myMethod(String domain) {
        // If domain == Earth, myDao should be of type DaoEarth
        // If domain == Mars, myDao should be of type DaoMars
    }
}
I was thinking of writing my own Provider, but I don't know how to use that provider to change my bindings at runtime. Any input is welcome and appreciated :)!
Update
Here's what I currently came up with; it's not as pretty as I'd like, so I'm still looking for feedback:
public class DomainProvider {

    @Inject @Earth
    private IDaoProvider earthDaoProvider;

    @Inject @Mars
    private IDaoProvider marsDaoProvider;

    public IDaoProvider get(Domain domain) {
        switch (domain) {
            case EARTH:
                return earthDaoProvider;
            case MARS:
                return marsDaoProvider;
            default:
                throw new IllegalArgumentException("Unknown domain: " + domain);
        }
    }

    public IDaoProvider get(String domain) {
        Domain parsedDomain = Domain.valueOf(domain.toUpperCase());
        return get(parsedDomain);
    }
}
// MarsDaoProvider would be equivalent
public class EarthDaoProvider implements IDaoProvider {

    @Inject @Earth
    private IDao earthDao;

    public IDao getDao() {
        return earthDao;
    }
}
// This means that in "MyClass", I can do:
public class MyClass {

    @Inject
    private DomainProvider domainProvider;

    public int myMethod(String domain) {
        IDaoProvider daoProvider = domainProvider.get(domain);
        IDao dao = daoProvider.getDao();
        // Now "dao" will be of the correct type based on the domain
    }
}
//Of course elsewhere I have the bindings set like
bind(IDao.class).annotatedWith(Earth.class).to(EarthDao.class);
Your version is almost perfect as it is: you're going to need to inject some kind of object that returns one or the other based on code you write, and you don't need assisted injection or anything like that. That said, you can skip some of the boilerplate:
public class DomainProvider {

    // Just inject Providers directly without binding them explicitly.
    @Inject @Earth Provider<IDao> earthDaoProvider;
    @Inject @Mars Provider<IDao> marsDaoProvider;

    public Provider<IDao> get(Domain domain) {
        switch (domain) {
            case EARTH:
                return earthDaoProvider;
            case MARS:
                return marsDaoProvider;
            default:
                throw new IllegalArgumentException("Unknown domain: " + domain);
        }
    }

    public Provider<IDao> get(String domain) {
        Domain parsedDomain = Domain.valueOf(domain.toUpperCase());
        return get(parsedDomain);
    }
}
Your MyClass in that case would be exactly identical. Here, Provider is either the one-method generic interface com.google.inject.Provider, or the equivalent builtin javax.inject.Provider that it extends. Read more about Guice Providers on the relevant Guice wiki topic.
bind(IDao.class).annotatedWith(Earth.class).to(EarthDao.class);
// You can now inject "#Earth IDao" and also "#Earth Provider<IDao>".
Basically, if you bind a key Foo (to a class, provider, @Provides method, or instance), you automatically get to inject either a Foo or Provider<Foo> with no additional work. Providers are also a great way to ensure that you get a new instance with every call to get, if that's what you want; with your original, you'll always get the same instance of EarthDao or MarsDao for any given DomainProvider you inject. (If you have a scoped binding like @Singleton, Guice will respect that too; Provider just lets Guice get involved, rather than reusing a plain old Java reference.)
This means you can skip your custom EarthDaoProvider and MarsDaoProvider, unless you really need to perform any external initialization on them—at which point you'd probably be better off calling bind(EarthDao.class).toProvider(EarthDaoProvider.class) so the preparation also happens when injecting EarthDao directly. You could also just have DomainProvider return an IDao instance directly by calling get on the appropriate Provider, and be assured that it'll be a new instance every time.
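For reference, a minimal Guice module wiring up both annotated bindings might look like the sketch below; the annotation and implementation names follow the question, and the module name itself is illustrative. An Injector built from it (Guice.createInjector(new DaoModule())) can then inject MyClass as shown above.

public class DaoModule extends AbstractModule {

    @Override
    protected void configure() {
        // Each binding also makes the corresponding Provider<IDao> injectable.
        bind(IDao.class).annotatedWith(Earth.class).to(DaoEarth.class);
        bind(IDao.class).annotatedWith(Mars.class).to(DaoMars.class);
    }
}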
I want to reinject singleton-scoped dependencies into prototype Spring beans, after they have been deserialized.
Say I've got a Process bean, which depends on a Repository bean. The Repository bean is a scoped as a singleton, but the Process bean is prototype-scoped. Periodically I serialize the Process, and then later deserialize it.
class Process {
    private Repository repository;
    // getters, setters, etc.
}
I don't want to serialize and deserialize the Repository. Nor do I want to put "transient" on the member variable that holds a reference to it in Process, nor a reference to some kind of proxy, or anything other than a plain old member variable declared as a Repository.
What I think I want is for the Process to have its dependency filled with a serializable proxy that points (with a transient reference) to the Repository, and, upon deserialization, can find the Repository again. How could I customize Spring to do that?
I figure I could use a proxy to hold the dependency references, much like . I wish I could use that exact technique. But the proxy I've seen Spring generate isn't serializable, and the docs say that if I use it with a singleton bean, I'll get an exception.
I could use a custom scope, perhaps, on the singleton beans, that would always supply a proxy when asked for a custom-scoped bean. Is that a good idea? Other ideas?
I used this instead, without any proxy:
public class Process implements HttpSessionActivationListener {
    ...
    @Override
    public void sessionDidActivate(HttpSessionEvent e) {
        ServletContext sc = e.getSession().getServletContext();
        WebApplicationContext newContext = WebApplicationContextUtils
                .getRequiredWebApplicationContext(sc);
        // beanName is assumed to be held by the Process (e.g. obtained via BeanNameAware)
        newContext.getAutowireCapableBeanFactory().configureBean(this, beanName);
    }
}
The example is for a web environment when the application server serializes the session, but it should work for any ApplicationContext.
Spring provides a solution for this problem.
Take a look at the spring documentation http://static.springsource.org/spring/docs/3.0.x/spring-framework-reference/html/aop.html#aop-atconfigurable.
7.8.1 Using AspectJ to dependency inject domain objects with Spring
...
The support is intended to be used for objects created outside
of the control of any container. Domain objects often fall into
this category because they are often created programmatically
using the new operator, or by an ORM tool as a result of a database query.
The trick is to use load-time weaving. Just start the JVM with -javaagent:path/to/org.springframework.instrument-{version}.jar. This agent will recognize every object that is instantiated, and if it is annotated with @Configurable it will configure (inject @Autowired or @Resource dependencies into) that object.
Just change the Process class to
@Configurable
class Process {

    @Autowired
    private transient Repository repository;

    // getters, setters, etc.
}
Whenever you create a new instance
Process process = new Process();
Spring will automatically inject the dependencies.
This also works if the Process object is deserialized.
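To enable this, the Spring configuration also has to switch on @Configurable support and load-time weaving. A minimal sketch using Java config; the configuration class name is illustrative, while the annotations are standard Spring ones:

@Configuration
@EnableSpringConfigured   // activates @Configurable processing
@EnableLoadTimeWeaving    // requires the spring-instrument java agent shown above
public class AppConfig {
}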
How about using aspects to add an injection step when you deserialize the object?
You would need AspectJ or similar for this. It would work very similarly to the @Configurable functionality in Spring.
e.g. add some advice around the private void readObject(ObjectInputStream in) throws IOException, ClassNotFoundException method.
This article may also help: http://java.sun.com/developer/technicalArticles/Programming/serialization/
I think the idea of serializing a bean and then forcing a reinjection of dependencies is not the best architecture.
How about having some sort of ProcessWrapper bean instead, which could be a singleton? It would be injected with the Repository and either manage the deserialization of the Process or have a setter for it. When a new Process is set in the wrapper, it would call setRepository() on the Process. The beans that use the Process could either be handed the new one by the wrapper or call the ProcessWrapper, which would delegate to the Process.
class ProcessWrapper {

    private Repository repository;
    private Process process;

    // getters, setters, etc.

    public void execute() { // "do" is a reserved word in Java, so the method needs another name
        process.execute();
    }

    public void setProcess(Process process) {
        this.process = process;
        this.process.setRepository(repository);
    }
}
Answering my own question: the way I've solved the problem so far is to create a base class which serializes and deserializes using a cheap little proxy. The proxy contains only the name of the bean.
You'll note that it uses a global to access the Spring context; a more elegant solution might store the context in a thread-local variable, something like that.
public abstract class CheaplySerializableBase
        implements Serializable, BeanNameAware {

    private String name;

    private static class SerializationProxy implements Serializable {

        private final String name;

        public SerializationProxy(CheaplySerializableBase target) {
            this.name = target.name;
        }

        Object readResolve() throws ObjectStreamException {
            return ContextLoader.globalEvilSpringContext.getBean(name);
        }
    }

    @Override
    public void setBeanName(String name) {
        this.name = name;
    }

    protected Object writeReplace() throws ObjectStreamException {
        if (name != null) {
            return new SerializationProxy(this);
        }
        return this;
    }
}
The resulting serialized object is 150 bytes or so (if I remember correctly).
The method applicationContext.getAutowireCapableBeanFactory().autowireBean(detachedBean); can be used to reconfigure a Spring-managed bean that was serialized and then de-serialized (whose @Autowired fields become null). See the example below. The serialization details are omitted for simplicity.
public class DefaultFooService implements FooService {

    @Autowired
    private ApplicationContext ctx;

    @Override
    public SerializableBean bar() {
        SerializableBean detachedBean = performAction();
        ctx.getAutowireCapableBeanFactory().autowireBean(detachedBean);
        return detachedBean;
    }

    private SerializableBean performAction() {
        SerializableBean outcome = ... // Obtains a deserialized instance, whose @Autowired fields are detached.
        return outcome;
    }
}
public class SerializableBean {

    @Autowired
    private transient BarService barService;

    private int value;

    public void doSomething() {
        barService.doBar(value);
    }
}