CQRS and Commands in Java + Spring, similar to MediatR

While working on a .NET project I came across a library, MediatR, which made CQRS and commands simple to implement. I really like using commands and command handlers, as I've worked on far too many projects that have giant procedural-style service classes that inject way too many dependencies, making unit testing painful. I am looking for something similar to MediatR for Spring + Java. Essentially I would like to inject a single dependency into the controller class and have it delegate commands to the appropriate command handler. I provided a few snippets of what MediatR looks like below. I prefer the way MediatR does it, as injecting the CommandHandlers into the controller class can lead to the same issue of the class having tons of dependencies injected.
I've come across this library, but it seems more like a side project than something production ready: https://github.com/sleroy/spring-cqrs-arch. I am aware of the Axon framework, but I'm not looking to go full-blown event sourcing at this point. Are there any libraries out there already for this that maybe I haven't stumbled across yet? I suppose I could just use the Guava EventBus.
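For reference, a minimal sketch of what the Guava EventBus route would look like (class names mirror the MediatR example below; note EventBus is fire-and-forget, so unlike MediatR's Send there is no response returned to the caller):
import com.google.common.eventbus.EventBus;
import com.google.common.eventbus.Subscribe;

class ChangeNameCommand {
    final String firstName;
    final String lastName;
    ChangeNameCommand(String firstName, String lastName) {
        this.firstName = firstName;
        this.lastName = lastName;
    }
}

class ChangeNameHandler {
    @Subscribe // EventBus routes posted objects of this parameter type here
    public void handle(ChangeNameCommand cmd) {
        System.out.printf("Changing name to %s %s%n", cmd.firstName, cmd.lastName);
    }
}

public class EventBusDemo {
    public static void main(String[] args) {
        EventBus bus = new EventBus();          // register handlers once at startup
        bus.register(new ChangeNameHandler());
        bus.post(new ChangeNameCommand("Jane", "Doe")); // dispatched synchronously
    }
}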
Below is a C# example of what MediatR usage looks like.
Controller
namespace DAB.Controllers
{
    [Route("api/[controller]")]
    [ApiController]
    public class PersonController : ControllerBase
    {
        private readonly IMediator mediator;

        public PersonController(IMediator mediator)
        {
            this.mediator = mediator;
        }

        // PUT api/person/{id}/changename
        [HttpPut("{id}/changename")]
        public async Task<ActionResult> ChangeName([FromBody] ChangeNameCommand command)
        {
            await this.mediator.Send(command);
            return Ok();
        }
    }
}
Command
public class ChangeNameCommand : IRequest<bool>
{
    public string FirstName { get; set; }
    public string LastName { get; set; }
}
CommandHandler
public class ChangeNameHandler : IRequestHandler<ChangeNameCommand, bool>
{
    public Task<bool> Handle(ChangeNameCommand request, CancellationToken cancellationToken)
    {
        Console.WriteLine($"Changing name to {request.FirstName} {request.LastName}");
        return Task.FromResult(true);
    }
}

Check out PipelinR. It's a 15 KB, zero-dependency library with nice Spring and Spring Boot integration.
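To give a feel for it, here is a sketch of the MediatR example above redone with PipelinR and Spring. The package and API names follow PipelinR's README (an.awesome.pipelinr); check the current release before relying on them:
import an.awesome.pipelinr.Command;
import an.awesome.pipelinr.Pipeline;
import an.awesome.pipelinr.Pipelinr;
import org.springframework.beans.factory.ObjectProvider;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.stereotype.Component;
import org.springframework.web.bind.annotation.PutMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

class ChangeNameCommand implements Command<Boolean> {
    public String firstName;
    public String lastName;
}

@Component
class ChangeNameHandler implements Command.Handler<ChangeNameCommand, Boolean> {
    @Override
    public Boolean handle(ChangeNameCommand command) {
        System.out.printf("Changing name to %s %s%n", command.firstName, command.lastName);
        return true;
    }
}

@Configuration
class PipelineConfig {
    // collect every Command.Handler bean in the context into one Pipeline
    @Bean
    Pipeline pipeline(ObjectProvider<Command.Handler> handlers) {
        return new Pipelinr().with(handlers::stream);
    }
}

@RestController
@RequestMapping("/api/person")
class PersonController {
    private final Pipeline pipeline; // the single injected dependency

    PersonController(Pipeline pipeline) {
        this.pipeline = pipeline;
    }

    @PutMapping("/{id}/changename")
    public void changeName(@RequestBody ChangeNameCommand command) {
        pipeline.send(command); // dispatched to ChangeNameHandler
    }
}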

If you are still looking for a similar library, I made one that works similarly to mhinze/ShortBus. You can check it out at https://github.com/kazupooot/shortbus. Currently it supports single-handler request/response messages.

Related

How to provide components from a library, for consumption by multiple DI frameworks

My team owns a library that provides components that must be referencable by code that consumes the library. Some of our consumers use Spring to instantiate their apps; others use Guice. We'd like some feedback on best-practices on how to provide these components. Two options that present themselves are:
Have our library provide a Spring Configuration that consumers can @Import, and a Guice Module that they can install.
Have our library provide a ComponentProvider singleton, which provides methods to fetch the relevant components the library provides.
Quick sketches of what these would look like:
Present in both approaches
// In their code
@AllArgsConstructor(onConstructor = @__(@Inject))
public class ConsumingClass {
    private final FooDependency foo;
    ...
}
First approach
// In our code
@Configuration
public class LibraryConfiguration {
    @Bean public FooDependency foo() {...}
    ...
}
---
public class LibraryModule extends AbstractModule {
    @Provides FooDependency foo() {...}
    ...
}
========================
// In their code
@Configuration
@Import(LibraryConfiguration.class)
public class ConsumerConfiguration {
    // Whatever initiation logic they want - but, crucially, does
    // *not* need to define a FooDependency
    ...
}
---
// *OR*
public class ConsumerModule extends AbstractModule {
    @Override
    public void configure() {
        // Or, simply specify LibraryModule when creating the injector
        install(new LibraryModule());
        ...
        // As above, no requirement to define a FooDependency
    }
}
Second approach
// In our code
public class LibraryProvider {
    public static final LibraryProvider INSTANCE = buildInstance();
    private static LibraryProvider buildInstance() {...}
    public static LibraryProvider getInstance() { return INSTANCE; }
}
========================
// In their code
@Configuration
public class ConsumerConfiguration {
    @Bean public FooDependency foo() {
        return LibraryProvider.getInstance().getFoo();
    }
    ...
}
// or equivalent for Guice
Is there an accepted best practice for this situation? If not, what are some pros and cons of each, or of another option I haven't yet thought of? The first approach has the advantage that consumers don't need to write any code to initialize dependencies, and that DI frameworks can override dependencies (e.g. with mocked dependencies for testing); the second approach has the advantage of being DI-framework agnostic (if a new consumer wanted to use Dagger to instantiate their app, for instance, we wouldn't need to change the library at all).
I think the first option is better. If your library has inter-dependencies between beans, then the consumer-side code (the @Configuration, in the case of Spring, in the second approach) will be:
Fragile (what if the application doesn't know that a certain bean should be created?)
Duplicated - this code will appear in each and every consumer's module
Subject to churn on upgrades - when a new version of your library gets released and a consumer wants to upgrade, there might be changes needed in the consumer's configuration (the lib might expose a new bean, deprecate or even remove some old stuff, etc.)
One small suggestion:
You can use Spring factories, and then in the case of Spring Boot you don't even need an @Import: just add a Maven dependency and it will load the configuration automatically.
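For example (package and class names here are illustrative; the registration key below is the classic pre-2.7 Spring Boot mechanism, while newer versions use META-INF/spring/org.springframework.boot.autoconfigure.AutoConfiguration.imports instead):
// In the library: a plain @Configuration; consumers never @Import it
@Configuration
public class LibraryConfiguration {
    @Bean
    public FooDependency foo() {
        return new FooDependency();
    }
}

// In the library's src/main/resources/META-INF/spring.factories:
// org.springframework.boot.autoconfigure.EnableAutoConfiguration=\
//   com.example.library.LibraryConfiguration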
Now, make sure that you handle dependencies correctly with that approach.
Since your code will include both Spring- and Guice-dependent parts, you'll add dependencies on both to the Maven/Gradle module of the library. This means that a consumer who uses, say, Guice will get all the Spring stuff because of your library. There are many ways to overcome this issue depending on the build system of your choice; I just wanted to bring it up.

How to test NetBeans Platform code which uses Lookups?

TL;DR How does one write unit tests for NetBeans Platform code which uses static methods to look up dependencies?
In a NetBeans platform application I come across code like this:
MyService service = Lookup.getDefault().lookup(MyService.class);
service.doStuff(....);
To me the static access seems like an antipattern and hard to test. When I Google around I only find comments about low coupling and high cohesion, teleinterfaces etc.
Many people seem to think this is a Good Idea but I am wondering how I can write a reasonable unit test for code like this, without resorting to mocking static methods or using the Lookup feature in my unit test.
The first idea that comes to my mind is to refactor the lookup as a regular dependency:
public class MyClass {
    private Lookup lookup = Lookup.getDefault();

    public void myMethod() {
        MyService service = lookup.lookup(MyService.class);
        service.doStuff(....);
    }

    public void setLookup(Lookup lookup) {
        this.lookup = lookup;
    }
}
And then use the setter to provide a mock Lookup for testing.
This would work, but it still causes the tested code to call Lookup.getDefault() before the mock is set. There is no regular dependency injection mechanism provided by the NetBeans Platform, so introducing one like this feels like swimming against the stream.
I get the feeling I am missing something. Is there a standard approach to write unit tests for Netbeans Platform code?
So far I found several ways of solving this.
1 - Publish a test version of the class in Lookup with a higher position
@org.openide.util.lookup.ServiceProvider(service = MyService.class, position = 1)
public class TestService implements MyService {
    public void doStuff(....) {
        // test implementation
    }
}
2 - Use NBJUnit's MockServices
public class MyTest extends NbTestCase {
    public MyTest(String name) { super(name); }

    public void setUp() throws Exception {
        org.netbeans.junit.MockServices.setServices(TestService.class);
    }
}
3 - Register your own Lookup implementation:
static {
    System.setProperty("org.openide.util.Lookup", TestLookup.class.getName());
}

public class TestLookup extends org.openide.util.lookup.AbstractLookup {
    public TestLookup() {
        this(new org.openide.util.lookup.InstanceContent());
    }

    private TestLookup(org.openide.util.lookup.InstanceContent ic) {
        super(ic);
        ic.add(new TestService());
    }
}
Some of these ideas were found here: https://openide.netbeans.org/tutorial/test-patterns.html.
The class TestService must be visible to your test class. (In general, we use Lookup for loose coupling; that's why the interfaces and the implementations live in separate modules.)
Consider adding the module that contains TestService as a dependency of your test module.

Is it a good idea to use static helper classes in Play Framework 2?

I'm developing a Play Framework 2 application in Java, and I'm wondering if I can use static helper classes.
For example, I want to know whether a user is logged in and has completed their profile. This check takes a few lines, may be subject to change, and is used a lot in the application.
So I wrote a class with this check in one method that takes one argument (the Session object), which I use everywhere.
But I have to instantiate the class each time to use the method, which at scale may be inefficient. Is it safe to make it static? If it is, what other Play objects can I safely use as parameters?
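Concretely, the helper looks something like this (the session keys are invented for the example):
import play.mvc.Http.Session;

public final class UserStatus {
    // the check only reads from the Session argument and keeps no state
    public static boolean isLoggedInWithCompleteProfile(Session session) {
        return session.get("userId") != null
            && "true".equals(session.get("profileComplete"));
    }
}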
When you say "test", I assume you mean some checking logic instead of unit tests.
In that case, you can use dependency injection instead of static helpers.
https://www.playframework.com/documentation/2.3.x/JavaInjection
The above link shows an example of how to use Guice to inject your controller when processing requests.
So previously your controller would be:
public class Application extends Controller {
    public static Result index() {
        if (YourStaticHelper.isOk()) {
            return ok("It works!");
        } else {
            return forbidden("NO");
        }
    }
}
Now it would become:
public class Application extends Controller {
    @Inject
    YourStaticHelperInterface checker;

    public Result index() { // no longer static
        if (checker.isOk()) {
            return ok("It works!");
        } else {
            return forbidden("NO");
        }
    }
}
The difference: with the first, if you somehow need a new helper, you have to change the controller code to adapt to it; with the second, you just inject a different implementation at runtime, since isOk() is now a contract on the interface.
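To make that concrete, the runtime wiring could be a minimal Guice module like this (the module and implementation names are made up for the sketch):
import com.google.inject.AbstractModule;

public class CheckerModule extends AbstractModule {
    @Override
    protected void configure() {
        // swap in a different implementation here without touching the controller
        bind(YourStaticHelperInterface.class).to(SessionChecker.class);
    }
}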
The benefit? It depends. If I'm writing something entirely by myself at home, or the controller code really is tightly coupled to the helper, I would choose the first. If I'm working with others in a company, I would pick the second. It's all software engineering trade-offs, but that's how things work.

Is there a Java wrapper annotation?

Trying to find a way to wrap an object that is auto-generated from some model and has lots of getters and setters. For example:
class ObjectToWrap {
    public int getIntA();
    public int getIntB();
    ... // Tons of other getters
}
I have to create a wrapper that wraps this object and use some annotation that generates methods from ObjectToWrap for me. Code looks like the following:
class Wrapper {
    private ObjectToWrap obj;

    public int getIntA() {
        return obj.getIntA();
    }

    public int getIntB() {
        return obj.getIntB();
    }

    ... // Tons of other getters
}
Is there an annotation to do this? I just don't want to make the code look lengthy.
Take a look at Project Lombok, which has a @Delegate annotation that does exactly what you want.
@Delegate documentation
I think you would be able to do this:
// @Delegate lives in lombok.experimental in recent Lombok versions
import lombok.experimental.Delegate;

class Wrapper {
    // the types field in the annotation says to auto-generate
    // delegate methods only for the public methods declared in
    // the ObjectToWrap class itself, and not for any parent
    // classes if ObjectToWrap extended something
    @Delegate(types = ObjectToWrap.class)
    private ObjectToWrap obj;
}
If you are using the Maven build infrastructure with dependency management, you could have a dependent sub-project that collects the generated sources as-is (not as code). Another sub-project could then generate real sources out of them (source code transformation) as a zip, which could then be imported by Maven into the main project as a pre-compile target.
On that basis you could use dynamic proxy classes, or even classes generated on the fly, as sketched below.
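A rough illustration of the dynamic-proxy idea (the WrapApi interface and names are invented for the example, and this only works if the getters are declared on an interface the proxy can implement):
import java.lang.reflect.Proxy;

interface WrapApi {
    int getIntA();
    int getIntB();
}

class ProxyWrapper {
    static WrapApi wrap(ObjectToWrap target) {
        return (WrapApi) Proxy.newProxyInstance(
            WrapApi.class.getClassLoader(),
            new Class<?>[] { WrapApi.class },
            // forward each interface call to the method of the same
            // name and signature on the wrapped object
            (proxy, method, args) -> ObjectToWrap.class
                .getMethod(method.getName(), method.getParameterTypes())
                .invoke(target, args));
    }
}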
The only other alternative would be to use the Java scripting API and do the business logic in JavaScript or the like, losing the type safety of Java and lowering the software quality.
Unfortunately, I cannot consider the alternative of mixing in another JVM language productive; the very nice and powerful Scala is still too wild/complex/ticklish.

How can I exclude annotated definitions from the build in Java?

I am building an Android app. I have source code for API #1, and I need to adapt it for API #2. Then I will publish both versions, for API #1 and API #2, in different packages. I can't use something like values-en, because both versions can be used worldwide. Also, the user may not have a choice.
As the new version will use the same UI and DB logic (and because the code is currently erroneous), I don't want to split the code base. If I were coding in C or C++, I would use #ifdef and a Makefile. In Java, it's possible to run the API-dependent code by determining the package name at runtime, but that's somewhat weird.
I think I can use annotations. What I expect is:
package foo.app;
public class API {
    public boolean prepare() { ... }

    @TargetPlatform(1)
    public void open() { ... }

    @TargetPlatform(2)
    public void open() { ... }
}
and use only one of them. Also, this is good:
package foo.app;
public class R {
    @TargetPlatform(1) com.example.foo.app.R R;
    @TargetPlatform(2) net.example.foo.app.R R;
}
Just defining an annotation is simple. What I don't know is: how can I exclude the unused duplicates from the build, or from execution, or the like? If it can be done this way, I can do anything.
You cannot use annotations for that.
It would be better to hide the implementation-specific classes behind an interface.
public interface Api {
    boolean prepare();
    void open();
}
To create an Api instance, use a factory class:
public class ApiFactory {
    public static Api createApi() {
        if (isTargetPlatform1())
            return new com.example.foo.app.Api();
        else
            return new net.example.foo.app.Api();
    }

    private static boolean isTargetPlatform1() {
        // determine the current platform, e.g. by reading a configuration file
    }
}
In all other places you only refer to the Api interface and ApiFactory class.
Use it like that:
Api api = ApiFactory.createApi();
api.open();
// ...
A more advanced solution would be to use dependency injection.
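As a minimal sketch of that more advanced route (constructor injection; the wiring could be done by Guice, Dagger, or by hand, and the Client class name is invented):
public class Client {
    private final Api api;

    // the platform decision is made once, by whoever constructs this class
    public Client(Api api) {
        this.api = api;
    }

    public void run() {
        api.prepare();
        api.open();
    }
}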
