Fluent Interface in Spring Boot Service

I am building a Spring Boot project for work.
In this project I have a service which is tasked with getting certain Documents from another backend. There are quite a lot of different scenarios where the documents have to meet certain criteria, e.g. be from a certain date, and these criteria can be combined freely. Currently this is accomplished with plain overloaded methods, like so:
@Service
public class DocumentService {

    private final OtherService otherService;

    @Autowired
    public DocumentService(OtherService otherService) {
        this.otherService = otherService;
    }

    public List<Document> getDocuments() {
        ...
    }

    public List<Document> getDocuments(LocalDate date) {
        ...
    }

    public List<Document> getDocuments(String name) {
        ...
    }

    public List<Document> getDocuments(String name, LocalDate date) {
        ...
    }
}
I find this to be a rather bad solution, because every new combination of criteria would require yet another overload.
For that reason I'd like to offer a fluent-style interface instead, something like this:
// Some other class that uses DocumentService
documentService.getDocuments().withDate(date).withName(name).get();
I'm familiar with the Builder pattern and method chaining, but I don't see how to adapt either of them here, since, as far as I understand, @Service classes are singletons in Spring Boot.
Is this at all possible with Spring Boot?

This doesn't have to be a Spring Boot solution; why not just introduce a POJO builder-like nested class:
@Service
public class DocumentService {

    private final OtherService otherService;

    @Autowired
    public DocumentService(OtherService otherService) {
        this.otherService = otherService;
    }

    public Builder documents() {
        return new Builder();
    }

    public class Builder {
        private LocalDate date;
        private String name;

        public Builder withDate(LocalDate date) {
            this.date = date;
            return this;
        }

        // etc.

        public List<String> get() {
            final List<SomeDTO> results = otherService.doQuery(name, date, ...);
            // TODO - transform the DTOs into the List<String> to return
            return list;
        }
    }
}
Obviously, make the builder static if it doesn't need access to the parent component.
You could make the Spring component and the builder the same object, but that does feel contrived; also, I expect you would like to be able to support multiple builders.
I'm also assuming the parent component is genuinely a service, i.e. it doesn't contain any state or mutators; otherwise you are introducing potential synchronization problems.
EDIT: Just for illustration: the builder maintains the arguments to be passed to the otherService and performs any service-like transforms.

If you want to use a fluent interface here, the object returned by your getDocuments() method would have to be the starting point of the method chain. Perhaps create something like a DocumentFilter class to return from there; then you'll end up with something like this:
documentService.getDocuments().withDate(date).withName(name).getDocuments()
In this example your DocumentFilter has withDate(...) and withName(...) methods, and each call applies its criterion on top of all the criteria captured by the preceding DocumentFilter.

Related

How can I use Java Enums with Amazon DynamoDB and AWS SDK v2?

I am trying to implement a simple Java event-handler Lambda for AWS. It receives SQS events and should make the appropriate updates to a DynamoDB table.
One of the attributes in this table is a status field that has four defined states, so I wanted to use a Java enum class and map it to this attribute.
Under AWS SDK v1 I could use the @DynamoDBTypeConvertedEnum annotation, but it does not exist anymore in v2. Instead there is @DynamoDbConvertedBy(), which receives a converter class reference. There is also an EnumAttributeConverter class which should work nicely with it.
But for some reason it does not work. The following is a snippet from my current code:
@Data
@DynamoDbBean
@NoArgsConstructor
public class Task {

    @Getter(onMethod_ = {@DynamoDbPartitionKey})
    String id;

    ...

    @Getter(onMethod_ = {@DynamoDbConvertedBy(EnumAttributeConverter.class)})
    ExportTaskStatus status;
}
The enum looks as follows:
@RequiredArgsConstructor
public enum TaskStatus {
    @JsonProperty("running") PROCESSING(1),
    @JsonProperty("succeeded") COMPLETED(2),
    @JsonProperty("cancelled") CANCELED(3),
    @JsonProperty("failed") FAILED(4);

    private final int order;
}
With this, I get the following exception when launching the application:
Class 'class software.amazon.awssdk.enhanced.dynamodb.internal.converter.attribute.EnumAttributeConverter' appears to have no default constructor thus cannot be used with the BeanTableSchema
For anyone else coming here: it looks to me like just dropping the annotation from the enum attribute altogether works just fine, i.e. the SDK applies the provided attribute converters implicitly. This is also mentioned in this GitHub issue. My own class looks like this (Brand is an enum here), and the enum is converted without any issues when fetching items.
@Value
@Builder(toBuilder = true)
@DynamoDbImmutable(builder = User.UserBuilder.class)
public class User {

    @Getter(onMethod = @__({@DynamoDbPartitionKey}))
    String id;

    Brand brand;

    ...
}
Although the documentation doesn't state it, the DynamoDbConvertedBy annotation requires any AttributeConverter you supply to have a parameterless default constructor.
Unfortunately for you and me, whoever wrote many of the built-in AttributeConverter classes decided to instantiate them through static create() methods instead of constructors (maybe they're singletons under the covers? I don't know). This means anyone who wants to use these helpful constructor-less classes, like InstantAsStringAttributeConverter and EnumAttributeConverter, needs to wrap them in custom classes that simply delegate to an instance obtained via create(). For a non-generic class like InstantAsStringAttributeConverter this is easy: create a wrapper class that delegates to the instance you get from create() and refer to that instead:
import java.time.Instant;

import software.amazon.awssdk.enhanced.dynamodb.AttributeConverter;
import software.amazon.awssdk.enhanced.dynamodb.AttributeValueType;
import software.amazon.awssdk.enhanced.dynamodb.EnhancedType;
import software.amazon.awssdk.enhanced.dynamodb.internal.converter.attribute.InstantAsStringAttributeConverter;
import software.amazon.awssdk.services.dynamodb.model.AttributeValue;

public class InstantAsStringAttributeConverterWithConstructor implements AttributeConverter<Instant> {

    private static final InstantAsStringAttributeConverter CONVERTER =
            InstantAsStringAttributeConverter.create();

    @Override
    public AttributeValue transformFrom(Instant instant) {
        return CONVERTER.transformFrom(instant);
    }

    @Override
    public Instant transformTo(AttributeValue attributeValue) {
        return CONVERTER.transformTo(attributeValue);
    }

    @Override
    public EnhancedType<Instant> type() {
        return CONVERTER.type();
    }

    @Override
    public AttributeValueType attributeValueType() {
        return CONVERTER.attributeValueType();
    }
}
Then you update your annotation to point to that class instead of the actual underlying library class.
But wait: EnumAttributeConverter is a generic class, which means you need to go one step further. First, create a version of the converter that wraps the official one but takes the enum type through a constructor instead of static instantiation:
import software.amazon.awssdk.enhanced.dynamodb.AttributeConverter;
import software.amazon.awssdk.enhanced.dynamodb.AttributeValueType;
import software.amazon.awssdk.enhanced.dynamodb.EnhancedType;
import software.amazon.awssdk.enhanced.dynamodb.internal.converter.attribute.EnumAttributeConverter;
import software.amazon.awssdk.services.dynamodb.model.AttributeValue;

public class EnumAttributeConverterWithConstructor<T extends Enum<T>> implements AttributeConverter<T> {

    private final EnumAttributeConverter<T> converter;

    public EnumAttributeConverterWithConstructor(final Class<T> enumClass) {
        this.converter = EnumAttributeConverter.create(enumClass);
    }

    @Override
    public AttributeValue transformFrom(T t) {
        return this.converter.transformFrom(t);
    }

    @Override
    public T transformTo(AttributeValue attributeValue) {
        return this.converter.transformTo(attributeValue);
    }

    @Override
    public EnhancedType<T> type() {
        return this.converter.type();
    }

    @Override
    public AttributeValueType attributeValueType() {
        return this.converter.attributeValueType();
    }
}
But that only gets us halfway there: now we need a subclass of our custom class for each enum type we want to convert:
public class ExportTaskStatusAttributeConverter extends EnumAttributeConverterWithConstructor<ExportTaskStatus> {
    public ExportTaskStatusAttributeConverter() {
        super(ExportTaskStatus.class);
    }
}

@DynamoDbConvertedBy(ExportTaskStatusAttributeConverter.class)
public ExportTaskStatus getStatus() { return this.status; }
Or the Lombok-y way:
@Getter(onMethod_ = {@DynamoDbConvertedBy(ExportTaskStatusAttributeConverter.class)})
ExportTaskStatus status;
It's a pain. It's a pain that could be solved with a little bit of tweaking and a tiny bit of reflection in the AWS SDK, but it's where we're at right now.
I am thinking that your annotations might actually be the problem here. I would remove all annotations that mention a constructor and instead write out your own constructor(s), for both Task and TaskStatus.
The dynamodb-enhanced SDK does this out of the box.
When you declare a @DynamoDbBean, the DefaultAttributeConverterProvider supplies a long list of converters between Java types, including an EnumAttributeConverter that is used whenever type.rawClass().isEnum() is true. So you don't need to worry about it.
If you ever want to extend the set of converters, add the converterProviders annotation parameter and declare the default provider (or omit it) along with any other providers you want.
Example:
@DynamoDbBean(converterProviders = { DefaultAttributeConverterProvider.class, MyCustomAttributeConverterProvider.class })
Solution based on watkinsmatthewp's answer:
public class TaskStatusConverter implements AttributeConverter<TaskStatus> {

    @Delegate
    private final EnumAttributeConverter<TaskStatus> converter;

    public TaskStatusConverter() {
        converter = EnumAttributeConverter.create(TaskStatus.class);
    }
}
The task status attribute then looks like this:
@Getter(onMethod_ = {@DynamoDbConvertedBy(TaskStatusConverter.class)})
TaskStatus status;

Instance created by new depends on a @Service member operation

First, please let me introduce a minimal demo scene to explain the problem.
Let's say I have a strategy-pattern interface:
public interface CollectAlgorithm<T> {
    List<T> collect();
}
And an implementation of this strategy, ConcreteAlgorithm:
public class ConcreteAlgorithm implements CollectAlgorithm<Integer> {

    @Resource
    QueryService queryService;

    @Override
    public List<Integer> collect() {
        // dummy ...
        return Lists.newArrayList();
    }
}
As you can see, the implementation depends on a query operation provided by a @Service component.
The ConcreteAlgorithm class will be created with new in some places, and then its collect method will be called.
I've read some related links like Spring @Autowired on a class new instance, and I know the code above cannot work, since the instance created by new has a @Resource-annotated member.
I'm new to Spring/Java, and I wonder if there are ways, or a different design, to make a scenario like the above work.
I've thought about using a factory method, but it seems that would involve many unchecked type assignments, since I provided a generic interface.
UPDATE
To make it clearer, I'll add some detail about the problem.
I provide an RPC service for some consumers, with an interface like:
public interface TemplateRecommendService {
    List<Long> recommendTemplate(TemplateRecommendDTO recommendDTO);
}

@Service
public class TemplateRecommendServiceImpl implements TemplateRecommendService {

    @Override
    public List<Long> recommendTemplate(TemplateRecommendDTO recommendDTO) {
        TemplateRecommendContext context = TemplateRecommendContextFactory.getContext(recommendDTO.getBizType());
        return context.process(recommendDTO);
    }
}
As you can see, I create a different context based on a field the user passes, which selects the recommendation strategy. Every context returns List<Long>, but the pipeline inside each context is totally different from the others.
Generally there are three main stages in the context's process pipeline. Each stage's logic might be complicated and varied, so there is another layer of strategy pattern:
public abstract class TemplateRecommendContextImpl<CollectOut, PredictOut> implements TemplateRecommendContext {

    private CollectAlgorithm<CollectOut> collectAlgorithm;
    private PredictAlgorithm<CollectOut, PredictOut> predictAlgorithm;
    private PostProcessRule<PredictOut> postProcessRule;

    protected List<CollectOut> collect(TemplateRecommendDTO recommendDTO) {
        return collectAlgorithm.collect(recommendDTO);
    }

    protected List<PredictOut> predict(TemplateRecommendDTO recommendDTO, List<CollectOut> predictIn) {
        return predictAlgorithm.predict(recommendDTO, predictIn);
    }

    protected List<Long> postProcess(TemplateRecommendDTO recommendDTO, List<PredictOut> postProcessIn) {
        return postProcessRule.postProcess(recommendDTO, postProcessIn);
    }

    public /*final*/ List<Long> process(TemplateRecommendDTO recommendDTO) {
        // pipeline: collect -> CollectOut -> predict -> PredictOut -> postProcess -> final List<Long>
        List<CollectOut> collectOuts = collect(recommendDTO);
        List<PredictOut> predictOuts = predict(recommendDTO, collectOuts);
        return postProcess(recommendDTO, predictOuts);
    }
}
As for one specific recommend context, its creation looks like this:
public class ConcreteContextImpl extends TemplateRecommendContextImpl<GenericTempDO, Long> {
    // CollectOut = GenericTempDO, PredictOut = Long
    ConcreteContextImpl() {
        super();
        setCollectAlgorithm(new ShopDecorateCrowdCollect());
        setPredictAlgorithm(new ShopDecorateCrowdPredict());
        setPostProcessRule(new ShopDecorateCrowdPostProcess());
    }
}
Instead of using field-oriented autowiring, use constructor-oriented injection. That forces the user creating the implementation instance to provide the proper dependency during creation with new:
@Service
public class ConcreteAlgorithm implements CollectAlgorithm<Integer> {

    private final QueryService queryService;

    @Autowired // or @Inject; you cannot use @Resource on a constructor
    public ConcreteAlgorithm(QueryService queryService) {
        this.queryService = queryService;
    }

    @Override
    public List<Integer> collect() {
        // dummy ...
        return Lists.newArrayList();
    }
}
There are 4 (+1 Bonus) possible approaches I can think of, depending on your "taste" and on your requirements.
1. Pass the service in the constructor.
When you create instances of your ConcreteAlgorithm class you provide the instance of the QueryService. Your ConcreteAlgorithm may need to extend a base class.
CollectAlgorithm<Integer> myalg = new ConcreteAlgorithm(queryService);
...
This works when the algorithm is a stateful object that needs to be created every time or, with some variations, when you don't actually know the algorithm at all because it comes from another library (in which case you might have a factory or, in rare cases which most likely don't fit your scenario, create the object through reflection).
2. Turn your algorithm into a @Component
Annotate your ConcreteAlgorithm with the @Component annotation and then reference it wherever you want. Spring will take care of injecting the service dependency when the bean is created.
@Component
public class ConcreteAlgorithm implements CollectAlgorithm<Integer> {

    @Resource
    QueryService queryService;

    ....
}
This is the standard and usually preferred way in Spring. It works when you know ahead of time what all the possible algorithms are and such algorithms are stateless.
This is the typical scenario. I don't know if it fits your needs but I would expect most people to be looking for this particular option.
Note that in the above scenario the recommendation is to use constructor-based injection. In other words, I would modify your implementation as follows:
@Component
public class ConcreteAlgorithm implements CollectAlgorithm<Integer> {

    final QueryService queryService;

    @Autowired
    public ConcreteAlgorithm(QueryService queryService) {
        this.queryService = queryService;
    }

    @Override
    public List<Integer> collect() {
        // dummy ...
        return Lists.newArrayList();
    }
}
On the most recent versions of Spring you can even omit the #Autowired annotation.
3. Implement and call a setter
Add a setter for the QueryService and call it as needed.
CollectAlgorithm<Integer> myalg = new ConcreteAlgorithm();
myalg.setQueryService(queryService);
...
This works in scenarios like those of (1), but frees you from the need to pass parameters to the constructor, which "may" help to get rid of reflection in some cases.
I don't endorse this particular solution, however, as it forces callers to know that setQueryService must be called before invoking any other method. Quite error-prone.
4. Pass the QueryService directly to your collect method.
Possibly the easiest solution.
public interface CollectAlgorithm<T> {
    List<T> collect(QueryService queryService);
}

public class ConcreteAlgorithm implements CollectAlgorithm<Integer> {

    @Override
    public List<Integer> collect(QueryService queryService) {
        // dummy ...
        return Lists.newArrayList();
    }
}
This works well if you want your interface to be a functional one, to be used in collections.
Bonus: Spring's SCOPE_PROTOTYPE
Spring doesn't only allow you to instantiate singleton beans; it also supports prototype beans. This effectively means it will act as a factory for you.
I will leave this to an external example, at the following URL:
https://www.boraji.com/spring-prototype-scope-example-using-scope-annotation
This "can" be useful in specific scenarios but I don't feel comfortable recommending it straight away as it's significantly more cumbersome.

Handling different table names using JDBCTemplate in Spring Application

I am working with Spring Boot on a project. NamedParameterJdbcTemplate is used in my DAOs to access data. I write queries in my DAOs and then map some parameters at runtime to get the correct data.
Now I have to handle retrieving data from multiple identical tables, depending on the request. The retrieval logic is the same; I only need to use different table names. JdbcTemplate does not allow using table names as parameters, and I do not want to use string formatting, as I wish my queries to stay final.
I could create an abstract class with most of the functionality and concrete classes that extend it to handle the differences in table names (basically, a getTableName() method). This works. However, it means I am creating a lot of classes, and I would like to create fewer of them.
Are there better ways to do it?
I was thinking that interfaces for the specific table names would be nice, but I can't wrap my head around how that could work with Spring and autowiring.
UPDATE:
Just giving a sample of what I would like to improve.
So far I have a couple of abstract DAOs like this; they do the database talking.
public abstract class Dao1 {

    private static final String PARAM = "p";
    private final String QUERY1 = "SELECT * FROM " + getTableName() + " WHERE something";

    // here we would also have an autowired NamedParameterJdbcTemplate and maybe some other stuff

    public List<?> getAll() {
        // map parameters, do the query, return the results
    }

    protected abstract String getTableName();
}
Next, I have a couple of DAOs that implement the abstract getTableName() method. So if the table was "Autumn", I would have:
@Component
public class AutumnDao1 extends Dao1 {
    @Override
    protected String getTableName() {
        return "AUTUMN";
    }
}
So from the example above you can see that for each abstract DAO I would have to make a couple of concrete DAOs (autumn, winter, spring, summer). This is acceptable for now, but at some point it might grow into quite a sizeable collection of DAOs.
I would like to know if there is a way to avoid that, for instance by creating just one class or interface per "season"/name and somehow attaching it to Dao1, Dao2, etc., as needed. I only know which name is relevant when the user request arrives.
With @Qualifier("nameOfBean") you can inject exactly the instance you are looking for.
If you have, for instance:
@Component
public class AutumnDao1 extends Dao1 {
    @Override
    protected String getTableName() {
        return "AUTUMN";
    }
}

@Component
public class SummerDao1 extends Dao1 {
    @Override
    protected String getTableName() {
        return "SUMMER";
    }
}
In this case you are creating two beans that can be injected through the parent class Dao1. To inject the right one, you should do:
@Autowired
@Qualifier("autumnDao1")
private Dao1 autumnDao;

@Autowired
@Qualifier("summerDao1")
private Dao1 summerDao;
Try this!

Verify that all getter methods are called

I have the following test, where I need to verify that all getters of the Person class are called. So far I have used Mockito's verify() to make sure that each getter is called. Is there a way to do that via reflection? It could happen that a new getter is added to the Person class but the test misses it.
public class GetterTest {

    class Person {
        private String firstname;
        private String lastname;

        public String getFirstname() {
            return firstname;
        }

        public String getLastname() {
            return lastname;
        }
    }

    @Test
    public void testAllGettersCalled() throws IntrospectionException {
        Person personMock = mock(Person.class);
        personMock.getFirstname();
        personMock.getLastname();
        for (PropertyDescriptor property : Introspector.getBeanInfo(Person.class).getPropertyDescriptors()) {
            verify(personMock, atLeast(1)).getFirstname();
            // **How to verify against any getter method and not just getFirstname()???**
        }
    }
}
Generally, don't mock the class under test. If your test is for a Person, you shouldn't ever see Mockito.mock(Person.class) in it, as that's a pretty clear sign that you're testing the mocking framework instead of the system-under-test.
Instead, you may want to create a spy(new Person()), which will create a real Person implementation using a real constructor and then copy its data to a Mockito-generated proxy. You can use MockingDetails.getInvocations() to reflectively check that every getter was called.
// This code is untested, but should get the point across. Edits welcome.
// 2016-01-20: Integrated feedback from Georgios Stathis. Thanks Georgios!

@Test
public void callAllGetters() throws Exception {
    Person personSpy = spy(new Person());
    personSpy.getFirstname();
    personSpy.getLastname();
    assertAllGettersCalled(personSpy, Person.class);
}

private static void assertAllGettersCalled(Object spy, Class<?> clazz) throws IntrospectionException {
    BeanInfo beanInfo = Introspector.getBeanInfo(clazz);
    Set<Method> setOfDescriptors = Arrays.stream(beanInfo.getPropertyDescriptors())
            .map(PropertyDescriptor::getReadMethod)
            .filter(p -> !p.getName().contains("getClass"))
            .collect(Collectors.toSet());
    MockingDetails details = Mockito.mockingDetails(spy);
    Set<Method> setOfTestedMethods = details.getInvocations()
            .stream()
            .map(InvocationOnMock::getMethod)
            .collect(Collectors.toSet());
    setOfDescriptors.removeAll(setOfTestedMethods);
    // The only remaining descriptors are untested.
    assertThat(setOfDescriptors).isEmpty();
}
There might be a way to call verify and invoke on the Mockito-generated spy, but that seems very fragile, and very dependent on Mockito internals.
As an aside, testing bean-style getters seems like an odd use of time/effort. In general focus on testing implementations that are likely to change or break.
I can think of two solutions for your problem:
1. Generate the bean code programmatically, so you don't need to run tests at all. The Java code is generated by a program and never edited by a user; test the generator instead. Use a text template and build the definitions from a serialized domain model or directly from the compiled Java classes (you'll need a separate module that depends on the beans' module).
2. Write your tests against a proxy library. The problem is that regular proxies can only implement interfaces, not regular classes, and it's very cumbersome to maintain interfaces for JavaBeans. If you choose this route, I'd go with Javassist. I coded a runnable sample and put it on GitHub. The test cases use a proxy factory to instantiate beans (instead of using new):
public class CountingCallsProxyFactory {

    public <T> T proxy(Class<T> classToProxy) throws ReflectiveOperationException {
        ProxyFactory factory = new ProxyFactory();
        factory.setSuperclass(classToProxy);
        Class<?> clazz = factory.createClass();
        @SuppressWarnings("unchecked")
        T instance = (T) clazz.getDeclaredConstructor().newInstance();
        ProxyObject proxy = (ProxyObject) instance;
        MethodCallCounter handler = new MethodCallCounter();
        proxy.setHandler(handler);
        return instance;
    }

    public void verifyAllGettersCalled(Object bean) {
        // Query the counter against the properties in the bean
    }
}
The counter is kept inside the MethodCallCounter class.

Java CDI: Dynamically selecting an implementation based on qualifier?

I'm trying to make an application extensible by using CDI, but it seems like I'm missing a piece of the puzzle.
What I want:
Have a global configuration that defines which implementation of an interface to use. The implementations would have annotations like @ImplDescriptor(type = "type1").
What I tried:
@Produces
public UEInterface createUserExit(@Any Instance<UEInterface> instance, InjectionPoint ip) {
    Annotated annotated = ip.getAnnotated();
    UESelect ueSelect = annotated.getAnnotation(UESelect.class);
    if (ueSelect != null) {
        System.out.println("type = " + ueSelect.type());
    }
    System.out.println("Inject is ambiguous? " + instance.isAmbiguous());
    if (instance.isUnsatisfied()) {
        System.out.println("Inject is unsatisfied!");
        return null;
    }

    // this would be ok, but causes an exception:
    // return instance.select(ueSelect).get();

    // or rather this:
    for (Iterator<UEInterface> it = instance.iterator(); it.hasNext();) {
        // problem: calling next() will trigger instantiation, which will call this method again :(
        UEInterface candidate = it.next();
        System.out.println(candidate.getClass().getName());
    }
    return null;
}
This code is close to an example I've seen: the @Produces method will be used to select and create instances, and the list of candidates is injected as Instance<UEInterface>. If the method simply creates and returns an implementation, it works fine. I just don't know how to examine and select a candidate from the Instance<UEInterface>. The only way of looking at its "contents" seems to be an Iterator<UEInterface>. But as soon as I call next(), it tries to create the implementation... and unfortunately calls my @Produces method for that, creating an infinite recursion. What am I missing? How can I inspect the candidates and select one? Of course I want to instantiate only one of them...
Thanks in advance for any help and hints!
I think the issue is that you are trying to select with the annotation's class rather than using an annotation instance as a qualifier. Using the class directly searches for an implementation that implements that class. You need to create an AnnotationLiteral for the @ImplDescriptor annotation and perform the select using it as a qualifier. Create a class extending AnnotationLiteral like so:
public class ImplDescriptorLiteral extends AnnotationLiteral<ImplDescriptor> implements ImplDescriptor {

    private final String type;

    public ImplDescriptorLiteral(String type) {
        this.type = type;
    }

    @Override
    public String type() {
        return type;
    }
}
Then you can pass an instance of this class to the select method with the type you want:
instance.select(new ImplDescriptorLiteral("type1")).get();
Refer to the Obtaining a contextual instance by programmatic lookup documentation for more information.
Finch, what you have here should work. It assumes, though, that you have an implementation of UEInterface that is annotated with @UESelect, e.g.:
@UESelect("one")
public class UEOne implements UEInterface {
    ...
}
Is this how you're expecting it to work?
