I want to stream multiple events, all inheriting from the same base type, from MongoDB using the Spring ReactiveMongoRepository. I then want each event handled differently, so I defined several overloads of a handle method, one per child type. However, the compiler complains that it can't find a suitable method.
The problem is exemplarily shown in this test method:
@Test
public void polymorphismTest() {
    this.createFlux()
        .map(this::polymorphicMethod)
        .subscribe();
}
private Flux<A> createFlux() {
    A1 a1 = new A1();
    a1.string1 = "foo";
    A2 a2 = new A2();
    a2.string2 = "bar";
    return Flux.just(a1, a2);
}

private void polymorphicMethod(A1 a1) {
    System.out.println(a1.string1);
}

private void polymorphicMethod(A2 a2) {
    System.out.println(a2.string2);
}
I somehow understand the issue, since the compiler can't know that I have a suitable method for every inherited class. Still, it would be nice to have a solution close to this approach, since it is (in my eyes) clean and readable.
I know a solution would be to define handle as an abstract method in the base type and implement it in the inherited classes, but this would break the functional style of the rest of the application, and events stored in a database should be plain POJOs.
I also would love to avoid the typical command pattern approach with one huge mapping of types to functions, but if there is no other idea this might be the solution.
You can utilize Spring to provide types to help Reactor sort appropriate "events" into appropriate "event handlers".
For example, you can define the following:
public interface EventListener<E extends EventType> {

    default Class<E> getType() {
        return (Class<E>) GenericTypeResolver.resolveTypeArgument(getClass(), EventListener.class);
    }

    Mono<Void> execute(E event);
}
You can now do this:
EventListener<E> listener = ...
sourceFlux.ofType(listener.getType()).flatMap(listener::execute)
With Spring you can define multiple EventListener instances (either by creating classes that implement it and annotating them with @Component, or by defining an @Configuration class with many @Bean instances of that interface) which you can then collect via @Autowired (for example into a List<EventListener<?>>) to "register" them for these events.
This avoids needing to define huge maps and has about as much code as if you were trying to handle each event type anyway.
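To make that concrete, here is a minimal sketch of how the registration and dispatch could be wired together. The EventDispatcher class, the EventType base type and the eventFlux parameter are illustrative assumptions, not part of the original answer:
@Component
public class EventDispatcher {

    private final List<EventListener<? extends EventType>> listeners;

    public EventDispatcher(List<EventListener<? extends EventType>> listeners) {
        this.listeners = listeners; // Spring injects every EventListener bean it finds
    }

    public Flux<Void> dispatch(Flux<EventType> eventFlux) {
        // Each listener gets the stream filtered down to its own event type.
        // Note that every listener subscribes to eventFlux separately.
        return Flux.fromIterable(listeners)
                   .flatMap(listener -> handle(eventFlux, listener));
    }

    private <E extends EventType> Flux<Void> handle(Flux<EventType> eventFlux,
                                                    EventListener<E> listener) {
        return eventFlux.ofType(listener.getType()).flatMap(listener::execute);
    }
}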
Related
I have a hierarchical list of converters like the following for example:
@Named
public class NameConverter {

    @Inject
    private AddressConverter addressConverter;

    public package2.Name convert(package1.Name source) {
        package2.Name target = new package2.Name();
        target.setFirstName(source.getName());
        target.setLastName(source.getName());
        target.setAddress(addressConverter.convert(source.getAddress()));
        return target;
    }
}
and AddressConverter has ZipCodeConverter and so on ...
In the unit test class, I would:
1) Create a mock for addressConverter with EasyMock.createNiceMock.
2) Set the expectation:
EasyMock.expect(addressConverter.convert(EasyMock.anyObject(package1.Address.class))).andReturn(addressList); // What should this addressList be?
3) Use Whitebox.setInternalState for the private fields.
Question:
Asserting that the first name and last name are equal is straightforward.
But NameConverter is also responsible for setting the converted Address. There is a possibility for NameConverter to change the values of the returned converted Address and the other POJOs inside it.
So how do I ensure, through asserts or something else, that NameConverter just sets the Address (and the POJOs it encapsulates) as it is and does not tamper with the values?
Possible solution: in the EasyMock.expect return, should I create and set values for all POJOs down to the last one in the hierarchy and assert on each of the values?
But that doesn't seem like unit testing!
Please help with how to unit test this converter.
It is unit testing to set the return value of a mock object and to assert that that return value is put in the right place by your NameConverter.
However, perhaps what you're coming across is a failure of appropriate layering. If you have a set of 'converter' classes which then need to be co-ordinated in some fashion, you may want to make each converter independent and move the co-ordination responsibility elsewhere. So your NameConverter should be completely independent of AddressConverter, and you perhaps need a third class which is responsible for calling a set of converters, each of which just does its job.
You could restructure each converter to be given an instance of both its input and output object, and its unit tests then assert that it only acts on known fields within each object. The co-ordinator object doesn't need to know anything about what each converter does; it just needs to locate/create instances of the input and output objects and call each converter in turn. That's very amenable to a unit-testing approach, without resulting in a lot of layering concerns.
Example code:
public interface Converter<S, T> {
    void convert(S source, T target);
}
public class NameConverter implements Converter<p1.Name, p2.Name> {
    @Override
    public void convert(p1.Name source, p2.Name target) {
        target.setFirstName(source.getName());
        target.setLastName(source.getName());
    }
}

public class AddressConverter implements Converter<p1.Name, p2.Name> {
    @Override
    public void convert(p1.Name source, p2.Name target) {
        // more stuff.
    }
}
public class ConversionService {

    private final Set<Converter<p1.Name, p2.Name>> converters;

    @Inject
    public ConversionService(Set<Converter<p1.Name, p2.Name>> converters) {
        this.converters = converters;
    }

    public p2.Name convert(p1.Name source) {
        p2.Name target = new p2.Name();
        converters.forEach(converter -> converter.convert(source, target));
        return target;
    }
}
Then your unit test really just needs to know that all your lower-level converters were called.
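As a rough illustration of such a test (a sketch only: it assumes Mockito, Java 9+ for Set.of, and that p1.Name has a no-arg constructor):
public class ConversionServiceTest {

    @Test
    @SuppressWarnings("unchecked")
    public void callsEveryRegisteredConverter() {
        Converter<p1.Name, p2.Name> nameConverter = Mockito.mock(Converter.class);
        Converter<p1.Name, p2.Name> addressConverter = Mockito.mock(Converter.class);
        ConversionService service =
                new ConversionService(Set.of(nameConverter, addressConverter));

        p1.Name source = new p1.Name();
        p2.Name target = service.convert(source);

        // The co-ordinator's only responsibility: each converter was invoked
        // once with the source and the target that was returned.
        Mockito.verify(nameConverter).convert(source, target);
        Mockito.verify(addressConverter).convert(source, target);
    }
}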
I would suggest three options:
1) Return a new, empty instance of Address from your mock and check that that exact instance is set on the target. Don't test whether the address value is modified; it's OK not to test every single possibility of things going wrong.
2) Return a strict mock of Address without any expectations set. It will throw if there is any attempt to modify it. And again, check for instance equality.
3) Don't use mocks at all. Test the entire hierarchy as a whole. It does not look like unit testing, but it may be a good option. I think mocks are often overused and should be avoided when possible. Please see more here.
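To make options 1 and 2 concrete, here is a hedged EasyMock sketch. It assumes the names from the question (addressConverter is the mock already injected into the tested nameConverter, source is a prepared package1.Name) and that package2.Address is not final, so it can be class-mocked:
@Test
public void setsConvertedAddressUntouched() {
    // Option 2: a strict mock with no expectations fails on any call, so any
    // attempt by NameConverter to modify the returned Address would blow up here.
    package2.Address convertedAddress = EasyMock.createStrictMock(package2.Address.class);
    EasyMock.expect(addressConverter.convert(EasyMock.anyObject(package1.Address.class)))
            .andReturn(convertedAddress);
    EasyMock.replay(addressConverter, convertedAddress);

    package2.Name result = nameConverter.convert(source);

    // Option 1/2: assert it is the very same instance, passed through as-is.
    assertSame(convertedAddress, result.getAddress());
    EasyMock.verify(addressConverter, convertedAddress);
}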
I would recommend the following as a good unit test for NameConverter:
public final class NameConverterTest {

    @SUT
    NameConverter tested;

    @Test
    public void convertNameFromPackage1ToNameFromPackage2() {
        Address address = new Address();
        package1.Name source = new package1.Name("A Name", address);

        package2.Name converted = tested.convert(source);

        assertEquals(source.getName(), converted.getFirstName());
        assertEquals(source.getName(), converted.getLastName());
        assertNotNull(converted.getAddress());
    }
}
According to Martin Fowler's definition, the above is still a unit test for the NameConverter unit, even if it doesn't isolate it from its dependency on AddressConverter (which would have its own unit test).
(For simplicity, I used a hypothetical @SUT annotation which takes care of instantiating the "system under test" with injected dependencies; actual testing libraries that do this exist.)
I am working on a project where I am using MyBatis annotations as the persistence framework. Therefore, I have to create an interface for the 'mapper' and look the mapper up in the service like:
class XYZServiceImpl {
    public XYZMapper getXYZMapper() {
        return SessionUtil.getSqlSession().getMapper(XYZMapper.class);
    }
}
Now, while unit testing the service with Mockito, I am trying to inject a mock for the mapper. But since I am injecting the mock into an instance of XYZService, how can I mock a method of the service itself? In this case getXYZMapper() is what I am trying to stub. I do have one solution: create the XYZMapper instance in the service up front instead of fetching it on demand as the code above does, something like:
class XYZServiceImpl {
    XYZMapper mapper;

    public void useXYZMapper() {
        mapper = SessionUtil.getSqlSession().getMapper(XYZMapper.class);
    }
}
But that would require a lot of code changes (of course I can refactor), so is there a way to achieve this without having to make code changes?
Also, what would be a 'purist' way to have a mapper instance in the class? Is method 1 better than method 2 in terms of performance?
EDIT: Here XYZMapper is an interface. Something like:
public interface XYZMapper {
    @Select("SELECT * FROM someclass WHERE id = #{id}")
    public SomeClass getSomeClass(int id);
}
EDIT: I am facing a similar situation, but with a variation: I have a service that I do want to test, like XYZServiceImpl. It has a method getXYZDetails() with a lot of business logic handled within the service. getXYZDetails looks like the following:
public XYZDetails getXYZDetails(int id) {
    XYZDetails details = new XYZDetails();
    details.set1Details(fetchSet1Details(id));
    // Perform some business logic
    details.set2Details(fetchSet2Details(id));
    if (details.set2Details() != null) {
        for (int i = 0; i < details.set2Details().size(); i++) {
            flushTheseDetails(i);
        }
    }
    ...
}
Kindly note that fetchSet1Details() and fetchSet2Details() are public service methods, while flushTheseDetails() is a private one.
I want to know of a method that can mock/stub these methods while testing getXYZDetails(), thus enabling me to
There are several options you can use.
Inject dependency
This works only for simple methods like getXYZMapper, where the method just returns an external dependency of your object. It may require creating new XYZServiceImpl instances if, for example, the mapper is bound to a connection that is opened per request.
Encapsulate method behavior in object
Another way to achieve a similar result is to use a factory or service locator, like this:
public class XYZServiceImpl {

    private final XYZMapperFactory mapperFactory;

    public XYZServiceImpl(XYZMapperFactory mapperFactory) {
        this.mapperFactory = mapperFactory;
    }

    public XYZMapper getXYZMapper() {
        return mapperFactory.getMapper();
    }
}
This will allow you to easily substitute the factory in your test with an implementation that returns a mock mapper.
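A minimal sketch of what that substitution could look like in a test (assuming Mockito and the hypothetical XYZMapperFactory shown above):
@Test
public void factoryCanBeSubstitutedWithAMock() {
    XYZMapper mockMapper = Mockito.mock(XYZMapper.class);
    XYZMapperFactory factory = Mockito.mock(XYZMapperFactory.class);
    Mockito.when(factory.getMapper()).thenReturn(mockMapper);

    XYZServiceImpl service = new XYZServiceImpl(factory);

    // Every call to getXYZMapper() now yields the mock instead of a mapper
    // obtained from a real SqlSession.
    assertSame(mockMapper, service.getXYZMapper());
}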
A similar approach can be used for the other methods (fetchSet1Details, fetchSet2Details, flushTheseDetails), that is, moving them to another class or classes. If a method contains complex (and perhaps loosely related) logic, it is a good candidate to be moved into a separate class. Think about what these methods do: usually you can move some essential but unrelated part of them to another class or classes, and this makes mocking them much easier.
Subclass
This is not recommended, but in legacy code it is sometimes very helpful as a temporary solution.
In your test, subclass the class under test and override the methods you need:
@Test
public void someTest() {
    XYZServiceImpl sut = new XYZServiceImpl() {
        public XYZMapper getXYZMapper() {
            return mapperMock;
        }

        public Whatever fetchSet1Details() {
            return whateverYouNeedInTest;
        }
    };

    sut.invokeMethodUnderTest();
}
The only thing you may need to do is change the access modifier of the private methods to package-private or protected so that you can override them.
Spying
This method is also discouraged, but you can use Mockito spies:
XYZServiceImpl realService = new XYZServiceImpl();
XYZServiceImpl spy = Mockito.spy(realService);
// Prefer doReturn(...).when(spy) with spies so the real methods are not invoked while stubbing.
doReturn(whateverYouNeed).when(spy).fetchSet1Details();
doReturn(mockMapper).when(spy).getXYZMapper();
spy.methodUnderTest();
I would suggest the "purist" way of doing this is to accept an XYZMapper instance in your constructor and store it in a local field.
In production use, you can pass an e.g. SQLXYZMapper, which will interact with your database. In test use, you can pass in a mocked object that you can verify interactions with.
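A sketch of that constructor-injection shape (the business logic is elided, Mockito is assumed for the test side, and XYZDetails is assumed to have a no-arg constructor):
class XYZServiceImpl {

    private final XYZMapper mapper;

    public XYZServiceImpl(XYZMapper mapper) {
        this.mapper = mapper;
    }

    public XYZDetails getXYZDetails(int id) {
        XYZDetails details = new XYZDetails();
        // ... business logic that uses this.mapper instead of SessionUtil ...
        return details;
    }
}

// Production wiring: pass in the real MyBatis mapper.
XYZServiceImpl service =
        new XYZServiceImpl(SessionUtil.getSqlSession().getMapper(XYZMapper.class));

// Test wiring: pass in a mock whose interactions you can stub and verify.
XYZServiceImpl testedService = new XYZServiceImpl(Mockito.mock(XYZMapper.class));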
I have three objects: a DTO called BalanceDTO, which implements an interface RequestDTO, and a Balance entity. I created the DTO because I can't use the entity directly, for JAXB compliance reasons (legacy code).
The DTO is used in the web service layer, BalanceService, and the entity in the API I integrate with from the web service. Between the web service and the API there is validation: RequestValidation, which has sub-validations for each type of RequestDTO, e.g. BalanceRequestValidation.
The validation component takes a RequestDTO as a parameter and then needs to perform the validation for that specific type. At the point of input the validation component doesn't know which concrete object has been passed to it (e.g. BalanceDTO); it only sees the interface.
I want to avoid using instanceof so I was thinking of using a visitor on the DTO so that it delegates itself to the validation that needs to be performed on it.
But the validation needs more/other components as well, not just the BalanceDTO as an input parameter, and different validations need different input params.
Is there another way to know which object you are working with and the validation to choose without using instanceof? Another design that I can follow?
You are well on the right track here - the Visitor design pattern is often the best way to avoid downcasting.
I am going to suggest a combination of the visitor and delegation design patterns, though let's walk through some alternatives.
Having the object do the validation itself via the RequestDTO interface is not viable since you need different components and the validation is not trivial in nature.
Using instanceof and downcasting looks a little messy, and the compiler won't complain if you add a new validatable class and forget to add the validator - you'll be relying on a runtime error via ...else { throw new IllegalArgumentException("Unknown RequestDTO subtype!"); }
The visitor design pattern is the classic way to avoid downcasting plus it also gives you a compiler error if you add a new class that should be validatable and forget to add the validation.
You can use accept() and visit() methods, or you can use method naming that is closer to your domain, e.g. validate(), like this:
public interface RequestDTO {
    boolean validate(RequestValidation validator);
}

public class BalanceDTO implements RequestDTO {
    // ...
    @Override
    public boolean validate(RequestValidation validator) {
        return validator.validate(this);
    }
}

public class RequestValidation {
    // components...
    public boolean validate(BalanceDTO balanceDTO) {
        return true; // todo...
    }

    public boolean validate(AnotherDTO anotherDTO) {
        return true; // todo...
    }
}
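For example, the caller can then trigger the correct overload without any instanceof check:
RequestDTO request = new BalanceDTO();
RequestValidation validation = new RequestValidation();
// Double dispatch: BalanceDTO.validate(...) calls validator.validate(BalanceDTO).
boolean valid = request.validate(validation);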
If you want to go a bit further you can delegate the validation to specific validation components, like this:
public class RequestValidation {

    BalanceRequestValidation balanceRequestValidation;
    AnotherRequestValidation anotherRequestValidation;

    public boolean validate(BalanceDTO balanceDTO) {
        return balanceRequestValidation.validate(balanceDTO, a, b, c);
    }

    public boolean validate(AnotherDTO anotherDTO) {
        return anotherRequestValidation.validate(anotherDTO, x, y, z);
    }
}
Given I have understood your problem correctly, the visitor design pattern, possibly combined with the delegation design pattern, is indeed a good approach.
I can best explain the question with an example.
I have an interface Model which can be used to access data.
There can be different implementations of Model which represent the data in various formats, say XML, txt, etc. Model is not concerned with the formats.
Let's say one such implementation is myxmlModel.
Now I want to force myxmlModel and every other implementation of Model to follow the singleton pattern. The usual way is to make myxmlModel's constructor private and provide a static factory method that returns an instance of the class. But the problem is that an interface cannot have static method definitions, and as a result I cannot enforce a particular factory method signature on all implementations of Model. So one implementation may end up providing getObject() and another may have getNewModel().
One workaround is to allow package access to myxmlModel's constructor and create a factory class which creates the myxmlModel object and caches it for further use.
I was wondering if there is a better way to achieve the same functionality.
Make a factory that returns instances of your interface, Model.
Make all concrete implementations of the model package-private classes in the same package as your factory.
If your model is to be a singleton, and you are using Java 5+, use an enum instead of a traditional singleton, as it is safer.
public enum MyXMLModel {
    INSTANCE;
    // rest of class
}
EDIT:
Another possibility is to create delegate classes that do all the work and then use an enum to provide all of the Model options.
For instance:
class MyXMLModelDelegate implements Model {
    public void foo() { /* does foo */ }
    ...
}

class MyJSONModelDelegate implements Model {
    public void foo() { /* does foo */ }
    ...
}

public enum Models {
    XML(new MyXMLModelDelegate()),
    JSON(new MyJSONModelDelegate());

    private final Model delegate;

    Models(Model delegate) { this.delegate = delegate; }

    public void foo() { delegate.foo(); }
}
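Usage is then simply through the enum constants, for example:
Models.XML.foo();   // delegates to MyXMLModelDelegate.foo()
Models.JSON.foo();  // delegates to MyJSONModelDelegate.foo()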
You can use reflection. Something like this:
public interface Model {
    class Singleton {
        public static Model instance(Class<? extends Model> modelClass) {
            try {
                java.lang.reflect.Field field = modelClass.getDeclaredField("instance");
                field.setAccessible(true); // the field is private in the implementations
                return (Model) field.get(null);
            } catch (blah-blah) {
                blah-blah
            }
        }
    }
}

public class XmlModel implements Model {

    private static final Model instance = new XmlModel();

    private XmlModel() {
    }
}
usage:
Model.Singleton.instance(XmlModel.class)
Actually, I don't like this code much :). First, it uses reflection, which is very slow; second, there is a possibility of runtime errors if classes are defined incorrectly.
Can you refactor the interface to be an abstract class? This will allow you to force a particular factory method down to all implementing classes.
I used to ask myself the same question. And I proposed the same answer ;-)
Now I normally drop the "forcing" behavior, I rely on documentation.
I found no case where the Singleton aspect was so compelling that it needed to be enforced by all means.
It is just a "best-practice" for the project.
I usually use Spring to instantiate such an object, and it is the Spring configuration that makes it a singleton.
Safe, and so easy... plus additional Spring advantages (such as proxying, or substituting a different object once to run some tests, etc.).
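A minimal sketch of what that looks like with Java-based Spring configuration (the original answer most likely meant XML configuration, and it assumes myxmlModel's constructor is accessible to this configuration class):
@Configuration
public class ModelConfiguration {

    // Spring beans are singleton-scoped by default, so every injection point
    // receives the same myxmlModel instance without the class enforcing it.
    @Bean
    public Model xmlModel() {
        return new myxmlModel();
    }
}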
This is more an answer to your comment/clarification to kts's answer. Is it so, that the real problem is not using the Singleton pattern but instead defining an eclipse (equinox) extension point schema that allows contributing a singleton?
I think this can't be done, because every time you call IConfigurationElement.createExecutableExtension you create a new instance. This is quite incompatible with your singleton requirement, and that is why you need the public default constructor, so that everybody can create instances.
Unless you can change the extension point definition so that plugins contribute a ModelFactory rather than a model, like
public interface ModelFactory {
    public Model getModelInstance();
}
So the extension user will instantiate a ModelFactory and use it to obtain the singleton.
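A contributing plugin could then back its factory with whatever singleton mechanism it likes, for example the enum approach shown earlier (MyXmlModelFactory is an assumed name, and MyXMLModel is assumed to implement Model):
public class MyXmlModelFactory implements ModelFactory {
    // createExecutableExtension may create a fresh factory each time,
    // but the factory always hands out the same Model instance.
    @Override
    public Model getModelInstance() {
        return MyXMLModel.INSTANCE;
    }
}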
If I guessed wrong, leave a comment and I'll delete the answer ;)
Maybe I am just blind, but I do not see how to use Guice (I am just starting with it) to replace the new call in this method:
public boolean myMethod(String anInputValue) {
    Processor proc = new ProcessorImpl(anInputValue);
    return proc.isEnabled();
}
For testing there might be a different implementation of the Processor, so I'd like to avoid the new call and in the course of that get rid of the dependency on the implementation.
If my class could just remember an instance of Processor I could inject it via the constructor, but as the Processors are designed to be immutable I need a new one every time.
How would I go about achieving that with Guice (2.0)?
It has been some time since I used Guice, but I remember something called "assisted injection". It allows you to define a factory method where some parameters are supplied by the caller and others are injected. Instead of injecting the Processor, you inject a processor factory that has a factory method taking the anInputValue parameter.
I point you to the javadoc of the FactoryProvider. I believe it should be usable for you.
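From memory, the shape of that with the assisted-inject extension looks roughly like the sketch below. ProcessorFactory and SomeInjectedDependency are names made up for illustration, and newer Guice versions replace FactoryProvider with FactoryModuleBuilder:
// Factory interface: the caller supplies the input value, Guice supplies the rest.
public interface ProcessorFactory {
    Processor create(String anInputValue);
}

public class ProcessorImpl implements Processor {
    private final SomeInjectedDependency dependency; // hypothetical injected collaborator
    private final String inputValue;

    @Inject
    public ProcessorImpl(SomeInjectedDependency dependency, @Assisted String anInputValue) {
        this.dependency = dependency;
        this.inputValue = anInputValue;
    }

    public boolean isEnabled() {
        // ...
        return true;
    }
}

// Binding (Guice 2.0 style), inside your module's configure():
bind(ProcessorFactory.class).toProvider(
        FactoryProvider.newFactory(ProcessorFactory.class, ProcessorImpl.class));

// Client code:
@Inject
public MyClass(ProcessorFactory processorFactory) {
    this.processorFactory = processorFactory;
}

public boolean myMethod(String anInputValue) {
    return processorFactory.create(anInputValue).isEnabled();
}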
You can get the effect you want by injecting a "Provider", which can be asked at runtime to give you a Processor. Providers provide a way to defer the construction of an object until it is requested.
They're covered in the Guice Docs here and here.
The provider will look something like
public class ProcessorProvider implements Provider<Processor> {
    public Processor get() {
        // construct and return a Processor
    }
}
Since Providers are constructed and injected by Guice, they can themselves have bits injected.
Your code will look something like
@Inject
public MyClass(ProcessorProvider processorProvider) {
    this.processorProvider = processorProvider;
}

public boolean myMethod(String anInputValue) {
    return processorProvider.get().isEnabled(anInputValue);
}
Does your Processor need access to anInputValue for its entire lifecycle? If not, could the value be passed in for the method call you're using, something like:
@Inject
public MyClass(Processor processor) {
    this.processor = processor;
}

public boolean myMethod(String anInputValue) {
    return processor.isEnabled(anInputValue);
}