Java design: calling the associated business logic/service

I have three objects: a DTO called BalanceDTO, which implements an interface RequestDTO, and a Balance entity. I created the DTO because I can't use the entity directly; it isn't JAXB-compliant (legacy code).
The DTO is used in the web service layer, BalanceService, and the entity in the API I integrate with from the web service. Between the web service and the API there is a validation layer: RequestValidation, which has sub-validations for each type of RequestDTO, e.g. BalanceRequestValidation.
The validation component takes a RequestDTO as a parameter and then needs to perform the validation specific to that type. At the point of input the validation component doesn't know which concrete object has been passed to it (e.g. BalanceDTO); it only sees the interface.
I want to avoid using instanceof, so I was thinking of using a visitor on the DTO so that it delegates itself to the validation that needs to be performed on it.
But the validation needs other components as well, not just the BalanceDTO, as input parameters, and different validations need different input parameters.
Is there another way to know which object you are working with, and which validation to choose, without using instanceof? Is there another design I could follow?

You are well on the right track here - the Visitor design pattern is often the best way to avoid downcasting.
I am going to suggest a combination of the visitor and delegation design patterns, though let's walk through some alternatives.
Having the object do the validation itself via the RequestDTO interface is not viable since you need different components and the validation is not trivial in nature.
Using instanceof and downcasting looks a little messy, and the compiler won't complain if you add a new validatable class and forget to add the validator - you'll be relying on a runtime error via ...else { throw new IllegalArgumentException("Unknown RequestDTO subtype!"); }
The visitor design pattern is the classic way to avoid downcasting plus it also gives you a compiler error if you add a new class that should be validatable and forget to add the validation.
You can use accept() and visit() methods, or you can use method naming that is closer to your domain, e.g. validate(), like this:
public interface RequestDTO {
    boolean validate(RequestValidation validator);
}

public class BalanceDTO implements RequestDTO {
    // ...
    @Override
    public boolean validate(RequestValidation validator) {
        return validator.validate(this);
    }
}

public class RequestValidation {
    // components...
    public boolean validate(BalanceDTO balanceDTO) {
        return true; // todo...
    }
    public boolean validate(AnotherDTO anotherDTO) {
        return true; // todo...
    }
}
If you want to go a bit further you can delegate the validation to specific validation components, like this:
public class RequestValidation {
    BalanceRequestValidation balanceRequestValidation;
    AnotherRequestValidation anotherRequestValidation;

    public boolean validate(BalanceDTO balanceDTO) {
        return balanceRequestValidation.validate(balanceDTO, a, b, c);
    }
    public boolean validate(AnotherDTO anotherDTO) {
        return anotherRequestValidation.validate(anotherDTO, x, y, z);
    }
}
Assuming I have understood your problem correctly, the visitor design pattern, possibly combined with the delegation design pattern, is indeed a good approach.
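To make the combined approach concrete, here is a minimal, self-contained sketch of visitor plus delegation. The extra collaborator (a set of known account IDs) and the field names are illustrative assumptions, not from the question:

```java
import java.util.Set;

// Visitor + delegation sketch; names and checks are illustrative assumptions.
interface RequestDTO {
    boolean validate(RequestValidation validator);
}

class BalanceDTO implements RequestDTO {
    final String accountId;
    BalanceDTO(String accountId) { this.accountId = accountId; }

    @Override
    public boolean validate(RequestValidation validator) {
        return validator.validate(this); // double dispatch: picks the BalanceDTO overload
    }
}

// A dedicated validation component that needs extra inputs, not just the DTO.
class BalanceRequestValidation {
    boolean validate(BalanceDTO dto, Set<String> knownAccounts) {
        return dto.accountId != null && knownAccounts.contains(dto.accountId);
    }
}

class RequestValidation {
    private final BalanceRequestValidation balanceRequestValidation = new BalanceRequestValidation();
    // RequestValidation owns the extra inputs each sub-validation needs.
    private final Set<String> knownAccounts = Set.of("ACC-1", "ACC-2");

    public boolean validate(BalanceDTO balanceDTO) {
        return balanceRequestValidation.validate(balanceDTO, knownAccounts);
    }
}

public class VisitorDemo {
    public static void main(String[] args) {
        RequestValidation validator = new RequestValidation();
        RequestDTO dto = new BalanceDTO("ACC-1"); // static type is only the interface
        System.out.println(dto.validate(validator)); // prints: true
    }
}
```

Note that the caller never needs instanceof: the DTO routes itself to the right overload, and RequestValidation supplies whatever extra inputs that particular sub-validation requires.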

Related

Overloading in Flux mapping?

I want to stream multiple events, all inheriting from the same base type, from MongoDB using Spring's ReactiveMongoRepository. I then want each event type handled differently, so I defined several overloads of a handle method, one per child class. However, the compiler complains that it can't find a suitable method.
The problem is shown in this test method:
@Test
public void polymorphismTest() {
    this.createFlux()
        .map(this::polymorphicMethod)
        .subscribe();
}
private Flux<A> createFlux() {
    A1 a1 = new A1();
    a1.string1 = "foo";
    A2 a2 = new A2();
    a2.string2 = "bar";
    return Flux.just(a1, a2);
}

private void polymorphicMethod(A1 a1) {
    System.out.println(a1.string1);
}

private void polymorphicMethod(A2 a2) {
    System.out.println(a2.string2);
}
I somewhat understand the issue, since the compiler can't know that I have a suitable method for every inherited class. However, it would be nice to have a solution similar to my approach, since it is (in my eyes) clean and readable.
I know one solution would be to define handle as an abstract method in the base type and implement it in the inherited classes, but this would break the functional approach of the rest of the application; besides, events in a database should be POJOs.
I would also love to avoid the typical command-pattern approach with one huge mapping of types to functions, but if there is no other idea, that might be the solution.
You can utilize Spring to provide types to help Reactor sort appropriate "events" into appropriate "event handlers".
For example, you can define the following:
public interface EventListener<E extends EventType> {
    default Class<E> getType() {
        return (Class<E>) GenericTypeResolver.resolveTypeArgument(getClass(), EventListener.class);
    }

    Mono<Void> execute(E event);
}
You can now do this:
EventListener<E> listener = ...
sourceFlux.ofType(listener.getType()).flatMap(listener::execute)
With Spring you can define multiple EventListener instances (either by creating classes implementing it and annotating them with @Component, or by defining an @Configuration class with many @Bean instances of that interface), which you can collect via @Autowired to "register" for these events.
This avoids needing to define huge maps and has about as much code as if you were trying to handle each event type anyway.
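For readers without Spring on the classpath, here is a plain-Java sketch of what the getType() trick above relies on: reading the type argument a listener was declared with via reflection. The Listener/StringListener names are illustrative, and this simplified version only works when a class directly implements the generic interface (Spring's GenericTypeResolver also handles deeper hierarchies):

```java
import java.lang.reflect.ParameterizedType;
import java.lang.reflect.Type;

// Illustrative generic listener; E is recoverable from the implementing class.
interface Listener<E> {
    void handle(E event);

    // Finds the concrete type argument this class implemented Listener with.
    @SuppressWarnings("unchecked")
    default Class<E> getType() {
        for (Type t : getClass().getGenericInterfaces()) {
            if (t instanceof ParameterizedType pt && pt.getRawType() == Listener.class) {
                return (Class<E>) pt.getActualTypeArguments()[0];
            }
        }
        throw new IllegalStateException("Listener must be implemented with a concrete type argument");
    }
}

class StringListener implements Listener<String> {
    public void handle(String event) { System.out.println("got: " + event); }
}

public class TypeResolveDemo {
    public static void main(String[] args) {
        Listener<String> listener = new StringListener();
        System.out.println(listener.getType()); // prints: class java.lang.String
    }
}
```

With the type in hand, `sourceFlux.ofType(listener.getType())` can filter the stream down to exactly the events each listener handles.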

Abstraction Layer (Java)

I'm currently working on a project that involves creating an abstraction layer. The goal of the project is to support multiple implementations of server software in the event that I might need to switch over to it. The list of features to be abstracted is rather long, so I'm going to want to look into a rather painless way to do it.
Other applications will be able to interact with my project and make calls that will eventually boil down to being passed to the server I'm using.
Herein lies the problem. I haven't much experience in this area and I'm really not sure how to make this not become a sandwich of death. Here's a chain of roughly what it's supposed to look like (and what I'm trying to accomplish).
/*
Software that is dependent on mine
|
Public API layer (called by other software)
|
Abstraction between API and my own internal code (this is the issue)
|
Internal code (this gets replaced per-implementation, as in, each implementation needs its own layer of this, so it's a different package of entirely different classes for each implementation)
|
The software I'm actually using to write this (which is called by the internal code)
*/
The abstraction layer (the one in the very middle, obviously) is what I'm struggling to put together.
Now, I'm only stuck on one silly aspect. How can I possibly make the abstraction layer something that isn't a series of
public void someMethod() {
    if (Implementation.getCurrentImplementation() == Implementation.TYPE1) {
        // whatever we need to do for this specific implementation
    } else {
        throw new NotImplementedException();
    }
}
(forgive the pseudo-code; also, imagine the same situation but for a switch/case since that's probably better than a chain of if's for each method) for each and every method in each and every abstraction-level class.
This seems very elementary but I can't come up with a logical solution to address this. If I haven't explained my point clearly, please explain with what I need to elaborate on. Maybe I'm thinking about this whole thing wrong?
Why not use inversion of control?
You have your set of abstractions, you create several implementations, and then you configure your public API to use one of the implementations.
Your API is protected by the set of interfaces that the implementations implement. You can add new implementations later without modifying the API code, and you can even switch at runtime.
I no longer remember whether inversion of control IS dependency injection, or whether DI is a form of IoC, but the point is that you remove the responsibility of dependency management from your component.
Here, you are going to have:
API layer (interfaces that the client uses)
implementations (as many as needed)
wrapper (that does the IoC by providing the chosen implementation)
API layer:
// my-api.jar
public interface MyAPI {
    String doSomething();
}

public interface MyAPIFactory {
    MyAPI getImplementationOfMyAPI();
}
implementations:
// red-my-api.jar
public class RedMyAPI implements MyAPI {
    public String doSomething() {
        return "red";
    }
}

// green-my-api.jar
public class GreenMyAPI implements MyAPI {
    public String doSomething() {
        return "green";
    }
}

// black-my-api.jar
public class BlackMyAPI implements MyAPI {
    public String doSomething() {
        return "black";
    }
}
A wrapper provides a way to configure the right implementation. Here, you can hide your switch/case in the factory, or load the implementation from configuration.
// wrapper-my-api.jar
public class NotFunnyMyAPIFactory implements MyAPIFactory {
    private Config config;

    public MyAPI getImplementationOfMyAPI() {
        if (config.implType == GREEN) {
            return new GreenMyAPI();
        } else if (config.implType == BLACK) {
            return new BlackMyAPI();
        } else if (config.implType == RED) {
            return new RedMyAPI();
        } else {
            throw new IllegalArgumentException("Unknown implementation type: " + config.implType);
        }
    }
}

public class ReflectionMyAPIFactory implements MyAPIFactory {
    private Properties prop;

    public MyAPI getImplementationOfMyAPI() {
        try {
            return (MyAPI) Class.forName(prop.getProperty("myApi.implementation.className"))
                    .getDeclaredConstructor()
                    .newInstance();
        } catch (ReflectiveOperationException e) {
            throw new IllegalStateException("Could not load MyAPI implementation", e);
        }
    }
}

// other possible strategies
The factory allows the use of several strategies to load the class. Depending on the strategy, you only have to add a new dependency and change a configuration (and possibly reload the app) to change the implementation.
You might want to test the performance as well.
If you use Spring, you can use only the interface in your code and inject the right implementation from a configuration class (Spring is a DI container). But there is no need to use Spring; you can do the same at the main entry point directly (you inject from as near to your entry point as possible).
The my-api.jar does not have dependencies (or maybe some towards the internal layers).
All the implementation jars depend on my-api.jar and on your internal code.
The wrapper jar depends on my-api.jar and on some of the implementation jars.
So the client loads the jar it wants, uses the factory it wants (or a configuration that injects the implementation), and uses your code. It also depends on how you expose your API.
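Here is a runnable end-to-end sketch of the reflection strategy described above. The Properties key and the single-file layout (everything in the default package so Class.forName finds the simple name) are illustrative assumptions:

```java
import java.util.Properties;

interface MyAPI {
    String doSomething();
}

class RedMyAPI implements MyAPI {
    public String doSomething() { return "red"; }
}

class GreenMyAPI implements MyAPI {
    public String doSomething() { return "green"; }
}

// Loads the implementation named in configuration, so swapping implementations
// is a configuration change, not a code change.
class ReflectionMyAPIFactory {
    private final Properties prop;

    ReflectionMyAPIFactory(Properties prop) { this.prop = prop; }

    MyAPI getImplementationOfMyAPI() {
        try {
            return (MyAPI) Class.forName(prop.getProperty("myApi.implementation.className"))
                    .getDeclaredConstructor()
                    .newInstance();
        } catch (ReflectiveOperationException e) {
            throw new IllegalStateException("Could not load MyAPI implementation", e);
        }
    }
}

public class FactoryDemo {
    public static void main(String[] args) {
        Properties prop = new Properties();
        prop.setProperty("myApi.implementation.className", "RedMyAPI");
        MyAPI api = new ReflectionMyAPIFactory(prop).getImplementationOfMyAPI();
        System.out.println(api.doSomething()); // prints: red
    }
}
```

In a real setup the Properties would come from a file on the classpath, and each implementation would live in its own jar, as described above.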

Factory pattern and complex beans

I'm trying to apply a factory pattern for creating request beans to use on a protocol stack. Now the request beans hold properties that are themselves beans, which should also be part of the factory pattern (as they differ depending on the stack).
Something like:
public interface Factory {
    Request createRequest();
}

public interface Request {
    Details getDetails();
    void setDetails(Details details);
    // ...
}

public interface Details {
    String getSource();
    void setSource(String s);
    // ...
}
My first attempt was to add factory methods for Details as well, but this quickly becomes a hazard, especially when passing arguments to the factory.
Also, the setters become a bit weird, as they actually throw a ClassCastException if you pass a `Details` implementation from another factory.
The main reason for my situation is that I'm sitting on a rather complex third-party request/response/stack implementation which I want to fit under my own bean interfaces. Is there a more sensible way to do this?
Look more closely at your design requirement: which of the two has more variants or implementations? Make that one a factory product and leave the other. In this case it looks to me as if Details could be created using a factory (if Request is not implemented in many different ways).
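An alternative worth considering is the classic abstract-factory shape: one factory creates the whole consistent family (the Request and its Details together), so implementations from different stacks can never be mixed and the ClassCastException in the setter disappears. A minimal sketch, with illustrative StackA names:

```java
// Abstract-factory sketch: the factory builds the nested bean too, so callers
// never wire a Details from one stack into a Request from another.
interface Details {
    String getSource();
}

interface Request {
    Details getDetails();
}

interface Factory {
    Request createRequest(String source);
}

// One protocol stack's consistent family.
class StackADetails implements Details {
    private final String source;
    StackADetails(String source) { this.source = source; }
    public String getSource() { return source; }
}

class StackARequest implements Request {
    private final StackADetails details;
    StackARequest(StackADetails details) { this.details = details; }
    public Details getDetails() { return details; }
}

class StackAFactory implements Factory {
    public Request createRequest(String source) {
        // Matching Details is wired in at construction time; no setter needed.
        return new StackARequest(new StackADetails(source));
    }
}

public class AbstractFactoryDemo {
    public static void main(String[] args) {
        Factory factory = new StackAFactory();
        Request request = factory.createRequest("upstream");
        System.out.println(request.getDetails().getSource()); // prints: upstream
    }
}
```

The trade-off is losing the setDetails mutator; whether that fits depends on how the third-party stack expects the beans to be populated.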

Where to validate String parameter

I have a Student class that has to have a String ID property, which has to be validated. I'm not sure whether to validate it inside the Student class or in the class where I use Student. Does that make sense?
Assuming the ID is final and immutable, one approach is to have the Student constructor throw an exception, probably new IllegalArgumentException("Invalid student ID").
You may additionally provide a static method in the Student class that verifies whether a string is valid, in case you need to check it without creating a Student object.
But the logic for determining whether an ID is valid should live in the Student class, I think.
If there are (or can be in the future) different kinds of student IDs, you could also consider the abstract factory pattern, but that sounds like a bit of an overkill.
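A short sketch of that approach; the ID format (non-empty, digits only) is an illustrative assumption:

```java
// Constructor-time validation plus a static pre-check; the digits-only rule
// is an illustrative assumption about what "valid" means.
public class Student {
    private final String id;

    public Student(String id) {
        if (!isValidId(id)) {
            throw new IllegalArgumentException("Invalid student ID: " + id);
        }
        this.id = id;
    }

    // Usable without creating a Student, e.g. for pre-checking form input.
    public static boolean isValidId(String id) {
        return id != null && !id.isEmpty() && id.chars().allMatch(Character::isDigit);
    }

    public String getId() { return id; }

    public static void main(String[] args) {
        System.out.println(Student.isValidId("12345")); // prints: true
        System.out.println(Student.isValidId("12a45")); // prints: false
    }
}
```

This keeps the invariant inside Student: no code path can ever hold a Student with an invalid ID.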
If Student already has business logic inside, put the validation in the Student class; otherwise use the second option.
Inside Student:
class Student {
    public boolean validate() {
        // some validation logic
        return true;
    }
}
Inside the model, controller, or action:
public boolean validate() {
    // some validation logic
    return true;
}
One approach is to use a validation object. For instance, see the validation approach used in the Spring Framework: you create an object which implements the interface Validator with two methods, one to detect whether the Validator can validate a given instance, and another which validates it.
public class StudentValidator implements Validator {
    public boolean supports(Class<?> clazz) {
        return Student.class.isAssignableFrom(clazz);
    }

    public void validate(Object target, Errors errors) {
        // ...
    }
}
This approach separates the code of the object from the way it is validated, offering more flexibility when combining validators:
you can combine several Validators even if the class hierarchy is not respected (POJO principle).
when you need to validate a field against data from another system (for instance a database), this approach avoids mixing database/persistence code into the POJO domain class.
Please see the documentation of Spring about Validation.

Force Singleton Pattern on a Class implementing an Interface

I can best explain the question with an example.
I have an interface Model which can be used to access data.
There can be different implementations of Model which represent the data in various formats, say XML, txt, etc. Model is not concerned with the formats.
Let's say one such implementation is myxmlModel.
Now I want to force myxmlModel and every other implementation of Model to follow the singleton pattern. The usual way is to make myxmlModel's constructor private and provide a static factory method that returns an instance of the class. But the problem is that an interface cannot have static method definitions, and as a result I cannot enforce a particular factory method signature on all implementations of Model. So one implementation may end up providing getObject() and another may have getNewModel().
One workaround is to allow package access to myxmlModel's constructor and create a factory class which creates the myxmlModel object and caches it for further use.
I was wondering if there is a better way to achieve the same functionality.
Make a factory that returns instances of your interface, Model.
Make all concrete implementations of the model package-private classes in the same package as your factory.
If your model is to be a singleton, and you are using Java 5+, use an enum instead of a traditional singleton, as it is safer.
public enum MyXMLModel {
    INSTANCE;
    // rest of class
}
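To show how the enum approach combines with the interface from the question, here is a runnable sketch (the read() method and its payload are illustrative assumptions):

```java
// An enum can implement the interface, giving a guaranteed singleton per
// implementation; the JVM enforces the single instance, and enums are
// serialization- and reflection-safe, unlike hand-rolled singletons.
interface Model {
    String read();
}

enum MyXmlModel implements Model {
    INSTANCE;

    @Override
    public String read() {
        return "<data/>"; // illustrative payload
    }
}

public class EnumSingletonDemo {
    public static void main(String[] args) {
        Model model = MyXmlModel.INSTANCE; // the only instance that can ever exist
        System.out.println(model.read()); // prints: <data/>
    }
}
```

Callers program against Model as usual; only the wiring site names the enum constant.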
EDIT:
Another possibility is to create delegate classes that do all the work and then use an enum to provide all of the Model options. For instance:
class MyXMLModelDelegate implements Model {
    public void foo() { /* does foo */ }
    // ...
}

class MyJSONModelDelegate implements Model {
    public void foo() { /* does foo */ }
    // ...
}

public enum Models implements Model {
    XML(new MyXMLModelDelegate()),
    JSON(new MyJSONModelDelegate());

    private final Model delegate;

    Models(Model delegate) { this.delegate = delegate; }

    public void foo() { delegate.foo(); }
}
You can use reflection. Something like this:
public interface Model {
    class Singleton {
        public static Model instance(Class<? extends Model> modelClass) {
            try {
                return (Model) modelClass.getField("instance").get(null);
            } catch (ReflectiveOperationException e) {
                throw new IllegalStateException(e);
            }
        }
    }
}

public class XmlModel implements Model {
    public static final Model instance = new XmlModel();

    private XmlModel() {
    }
}
Usage:
Model.Singleton.instance(XmlModel.class)
Actually, I don't like this code much :). First, it uses reflection, which is slow; second, there is the possibility of runtime errors if classes are defined incorrectly (e.g. a missing or non-public instance field only fails at runtime).
Can you refactor the interface to be an abstract class? This will allow you to force a particular factory method down to all implementing classes.
I used to ask myself the same question, and I proposed the same answer ;-)
Now I normally drop the "forcing" behavior and rely on documentation.
I have found no case where the singleton aspect was so compelling that it needed to be enforced by all means.
It is just a "best practice" for the project.
I usually use Spring to instantiate such an object, and it is the Spring configuration that makes it a singleton.
Safe, and so easy... plus additional Spring advantages (such as proxying, or substituting a different object once to make some tests, etc.).
This is more an answer to your comment/clarification on kts's answer. Is it the case that the real problem is not the singleton pattern itself, but rather defining an Eclipse (Equinox) extension point schema that allows contributing a singleton?
I think this can't be done, because every time you call IConfigurationElement.createExecutableExtension you create a new instance. That is quite incompatible with your singleton requirement, and it is why you need the public default constructor so that anybody can create instances.
Unless you can change the extension point definition so that plugins contribute a ModelFactory rather than a Model, like:
public interface ModelFactory {
    Model getModelInstance();
}
Then the extension user will instantiate a ModelFactory and use it to obtain the singleton.
If I guessed wrong, leave a comment and I delete the answer ;)
