I have a Java Spring Service with a RestController that calls an async method:
@RestController
public class SomeController {
@Autowired
//this is the service that contains the async-method
OtherService otherService;
@GetMapping
public void someFunctionWithinTheMainRequestThread() {
otherService.asyncMethod(RequestContextHolder.getRequestAttributes());
}
}
That async method needs the RequestAttributes because it builds links with linkTo(...). The problem is that no matter how I pass the RequestAttributes to the method, I always get the error
java.lang.IllegalStateException: Cannot ask for request attribute - request is not active anymore!
This is the annotation on the async method:
public class OtherService {
@Async
@Transactional(readOnly = true)
public void asyncMethod(RequestAttributes context) {
RequestContextHolder.setRequestAttributes(context);
//doing a lot of stuff that takes a while
linkTo(methodOn(...)) //-> here the error occurs
}
What I tried:
Passing RequestAttributes manually as a Parameter (as seen in the code-snippets above)
Using the context-aware pool executor described in this answer: How to enable request scope in async task executor - which basically seems to do the same as passing the context as a parameter, only that it is configured globally
Updating the servlet config and setting threadContextInheritable to true (see the sketch after this list)
Assigning the RequestAttributes to a final variable to try to get a copy of the original object which is marked as inactive by the main thread
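For reference, the threadContextInheritable attempt mentioned above looks roughly like this in Java config (a minimal sketch, assuming a Spring Boot style setup; as described, it did not solve the problem because the attributes still become inactive once the request completes):
@Bean
public DispatcherServlet dispatcherServlet() {
    DispatcherServlet servlet = new DispatcherServlet();
    // Child threads spawned by the request thread inherit the RequestContextHolder
    // contents, but the underlying request is still marked inactive on completion.
    servlet.setThreadContextInheritable(true);
    return servlet;
}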
No matter what I do, the request always seems to finish before my async method does, and I apparently never get a deep copy of the attributes, so the main thread marks them as inactive before the async method is finished and then I can't use them anymore; at least that is my understanding of the error.
I just want to be able to get the RequestAttributes needed for the linkTo method in my async method, even after the main thread has finished the request. Can someone point me in the right direction?
I found a solution that does work and removes the error. Since I don't think this is really clean, I am hoping for more answers, but in case it helps someone:
First I added this class. It is a custom and very simple RequestAttributes implementation that lets us keep the attributes active for longer than they normally would be:
import org.springframework.web.context.request.RequestAttributes;
import org.springframework.web.context.request.ServletRequestAttributes;
import javax.servlet.http.HttpServletRequest;
import java.util.HashMap;
import java.util.Map;
public class AsyncRequestScopeAttr extends ServletRequestAttributes {
private Map<String, Object> requestAttributeMap = new HashMap<>();
public AsyncRequestScopeAttr(HttpServletRequest request) {
super(request);
}
@Override
public void requestCompleted() {
//keep the request active; normally this.requestActive would be set to false here -> we do that in the completeRequest() method, which is called manually after the async method is done
}
/**
* This method should be called after your async method is finished. Normally it is called when the
* request completes but since our async method can run longer we call it manually afterwards
*/
public void completeRequest() {
super.requestCompleted();
}
@Override
public Object getAttribute(String name, int scope) {
if (scope == RequestAttributes.SCOPE_REQUEST) {
return this.requestAttributeMap.get(name);
}
return null;
}
@Override
public void setAttribute(String name, Object value, int scope) {
if (scope == RequestAttributes.SCOPE_REQUEST) {
this.requestAttributeMap.put(name, value);
}
}
@Override
public void removeAttribute(String name, int scope) {
if (scope == RequestAttributes.SCOPE_REQUEST) {
this.requestAttributeMap.remove(name);
}
}
@Override
public String[] getAttributeNames(int scope) {
if (scope == RequestAttributes.SCOPE_REQUEST) {
return this.requestAttributeMap.keySet().toArray(new String[0]);
}
return new String[0];
}
@Override
public void registerDestructionCallback(String name, Runnable callback, int scope) {
// Not Supported
}
@Override
public Object resolveReference(String key) {
// Not supported
return null;
}
@Override
public String getSessionId() {
return null;
}
@Override
public Object getSessionMutex() {
return null;
}
@Override
protected void updateAccessedSessionAttributes() {
}
}
Then in the RestController before the async method is called:
@Autowired
//this is the service that contains the async-method
OtherService otherService;
public void someFunctionWithinTheMainRequestThread(){
otherService.asyncMethod(getIndependentRequestAttributesForAsync());
}
private RequestAttributes getIndependentRequestAttributesForAsync(){
RequestAttributes requestAttributes = new AsyncRequestScopeAttr(((ServletRequestAttributes)RequestContextHolder.getRequestAttributes()).getRequest());
for (String attributeName : RequestContextHolder.getRequestAttributes().getAttributeNames(RequestAttributes.SCOPE_REQUEST)) {
requestAttributes.setAttribute(attributeName, RequestContextHolder.getRequestAttributes().getAttribute(attributeName, RequestAttributes.SCOPE_REQUEST), RequestAttributes.SCOPE_REQUEST);
}
return requestAttributes;
}
And then in the async function:
public class OtherService {
@Async
@Transactional(readOnly = true)
public void asyncMethod(RequestAttributes context) {
//set the RequestAttributes for this thread
RequestContextHolder.setRequestAttributes(context);
// do your thing .... linkTo() etc.
//cleanup
((AsyncRequestScopeAttr)context).completeRequest();
RequestContextHolder.resetRequestAttributes();
}
}
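One small refinement worth considering (my addition, not part of the original solution): do the cleanup in a finally block so the attributes are released even if the work in between throws:
@Async
@Transactional(readOnly = true)
public void asyncMethod(RequestAttributes context) {
    RequestContextHolder.setRequestAttributes(context);
    try {
        // do your thing .... linkTo() etc.
    } finally {
        // always complete and detach the attributes, even on failure
        ((AsyncRequestScopeAttr) context).completeRequest();
        RequestContextHolder.resetRequestAttributes();
    }
}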
I need to react asynchronously to the @EventListener, therefore I've created something like this:
#Service
public class AsyncHandler {
private CompletableFuture<My> future;
@Async
public CompletableFuture<My> getMy() {
future = new CompletableFuture<>();
return future;
}
@EventListener
public void processEvent(MyEvent event) {
future.complete(event.my());
}
}
The problem here is that AsyncHandler is now stateful, which is wrong.
And I don't want to use a database, so is there any other way to make the bean stateless while using @EventListener?
You are right, your singleton has state, which is "not good".
One (possible) solution:
refactor the "stateful part" into a "prototype" (or request/session) scoped bean.
make your "singleton" abstract!
inject the "stateful part" via "method injection" (we cannot "autowire" lower (shorter-lived) scope beans into higher ones...)
As code (example):
State holder:
public class MyStateHolder {
// State:
private CompletableFuture<My> future;
@Async // ?
public CompletableFuture<My> getMy() {
future = new CompletableFuture<>();
return future;
}
}
Abstract (no @Service yet, no state!):
public abstract class AsyncHandler {
@EventListener
public void processEvent(MyEvent event) {
// !!
delegate().getMy().complete(event.my());
}
// and now only (abstract!):
public abstract MyStateHolder delegate();
}
Wiring :
@Configuration
class MyConfig {
@Bean
@Scope("prototype") // !
public MyStateHolder stateful() {
return new MyStateHolder();
}
@Bean // singleton/service:
public AsyncHandler asyncHandler() {
return new AsyncHandler() { // !
@Override // !
public MyStateHolder delegate() {
return stateful();// !;)
}
};
}
}
refs: (most of) https://docs.spring.io/spring-framework/docs/current/reference/html/core.html
Especially:
https://docs.spring.io/spring-framework/docs/current/reference/html/core.html#beans-factory-scopes-sing-prot-interaction
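As a side note (my addition, not part of the answer above): Spring can generate that anonymous subclass for you via lookup method injection with @Lookup (from org.springframework.beans.factory.annotation), so the explicit @Configuration wiring is not needed. A minimal sketch, assuming MyStateHolder is still declared as a prototype-scoped bean:
@Service
public class AsyncHandler {

    @EventListener
    public void processEvent(MyEvent event) {
        delegate().getMy().complete(event.my());
    }

    // Spring overrides this method at runtime and returns a fresh prototype-scoped
    // MyStateHolder on every call; the body is only a stub (component scanning
    // ignores abstract classes by default, hence the concrete method).
    @Lookup
    protected MyStateHolder delegate() {
        return null;
    }
}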
I am working within an environment that changes credentials every several minutes. In order for beans that implement clients who depend on these credentials to work, the beans need to be refreshed. I decided that a good approach for that would be implementing a custom scope for it.
After looking around a bit in the documentation, I found that the main method a scope has to implement is the get method:
public class CyberArkScope implements Scope {
private Map<String, Pair<LocalDateTime, Object>> scopedObjects = new ConcurrentHashMap<>();
private Map<String, Runnable> destructionCallbacks = new ConcurrentHashMap<>();
private Integer scopeRefresh;
public CyberArkScope(Integer scopeRefresh) {
this.scopeRefresh = scopeRefresh;
}
@Override
public Object get(String name, ObjectFactory<?> objectFactory) {
if (!scopedObjects.containsKey(name) || scopedObjects.get(name).getKey()
.isBefore(LocalDateTime.now().minusMinutes(scopeRefresh))) {
scopedObjects.put(name, Pair.of(LocalDateTime.now(), objectFactory.getObject()));
}
return scopedObjects.get(name).getValue();
}
@Override
public Object remove(String name) {
destructionCallbacks.remove(name);
return scopedObjects.remove(name);
}
@Override
public void registerDestructionCallback(String name, Runnable runnable) {
destructionCallbacks.put(name, runnable);
}
@Override
public Object resolveContextualObject(String name) {
return null;
}
@Override
public String getConversationId() {
return "CyberArk";
}
}
@Configuration
@Import(CyberArkScopeConfig.class)
public class TestConfig {
@Bean
@Scope(scopeName = "CyberArk")
public String dateString(){
return LocalDateTime.now().toString();
}
}
@RestController
public class HelloWorld {
@Autowired
private String dateString;
@RequestMapping("/")
public String index() {
return dateString;
}
}
When I debug this implementation with a simple String-scoped bean autowired in a controller, I see that the get method is only called once at startup and never again. This means the bean is never refreshed again. Is there something wrong with this behaviour, or is that how the get method is supposed to work?
It seems you also need to define the proxyMode, which injects an AOP proxy instead of a static reference to the string. Note that the bean class can't be final (which is why a non-final wrapper around the String is used below). This solved it:
@Configuration
@Import(CyberArkScopeConfig.class)
public class TestConfig {
@Bean
@Scope(scopeName = "CyberArk", proxyMode = ScopedProxyMode.TARGET_CLASS)
public NonFinalString dateString(){
return new NonFinalString(LocalDateTime.now());
}
}
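For completeness (my addition): the custom scope itself also has to be registered with the bean factory. The question's CyberArkScopeConfig is not shown, but a typical registration, sketched here with an illustrative refresh interval, would look something like this:
@Configuration
public class CyberArkScopeConfig {

    @Bean
    public static CustomScopeConfigurer cyberArkScopeConfigurer() {
        CustomScopeConfigurer configurer = new CustomScopeConfigurer();
        // Register the scope under the name used by @Scope(scopeName = "CyberArk");
        // the 5-minute refresh interval is just an example value.
        configurer.addScope("CyberArk", new CyberArkScope(5));
        return configurer;
    }
}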
Problem statement:
I have to process requests similar to a pipeline.
For example:
When a request comes, it has to undergo a sequence of operations, like (step1,step2,step3...).
So, in order to achieve that, I am using the Template Method design pattern.
Please review and suggest whether I am implementing this correctly, or whether there is a better solution.
I am suspecting my approach will introduce code smells, as I am changing values of objects very frequently.
Also, please suggest whether and how I can use Java 8 to accomplish this.
Thanks.
Code:
package com.example.demo.design;
import java.util.List;
public abstract class Template {
@Autowired
private Step1 step1;
@Autowired
private Step2 step2;
@Autowired
private Save save;
List<String> stepOutput = null;
List<String> stepOutputTwo = null;
List<String> stepOutputThree = null;
public void step1(String action1) {
stepOutput = step1.method(action1);
}
public void step2(String action2) {
stepOutputTwo = step2.method(stepOutput, action2);
}
abstract public void step3();
public void save() {
save.persist(stepOutputThree);
}
final public void run(String action1, String action2) {
step1(action1);
step2(action2);
step3();
save();
}
}
In the Java 8 streams model, that could look like the following:
final public void run(String action1, String action2) {
Stream.of(action1) // Stream<String>
.map(s -> step1.method(s)) // Stream<List<String>>
.map(l -> step2.method(l, action2)) // Stream<List<String>>
.map(l -> step3.method(l)) // Stream<List<String>>
.forEach(l -> save.persist(l));
}
I had the same issue! You can do something like this; the uncheckCall method is for handling exceptions.
final public void run(String action1, String action2) {
//other stuffs
Stream.of(step1.method(action1))
.map(stepOutput->uncheckCall(() ->step2.method(stepOutput,action2)))
.forEach(stepOutputThree -> uncheckCall(()->save.persist(stepOutputThree)));
//.....
}
For the uncheckCall method:
public static <T> T uncheckCall(Callable<T> callable) {
try {
return callable.call();
} catch (RuntimeException e) {
throw e; // or: throw BusinessException.wrap(e);
} catch (Exception e) {
throw new RuntimeException(e); // or: throw BusinessException.wrap(e);
}
}
Well, when there are "pipelines", "sequences of operations", and so on, the first design pattern that comes to mind is Chain of Responsibility. It provides you with these benefits:
allows you to add new handlers when necessary (e.g. at runtime) without modifying other handlers and processing logic (Open/Closed Principle of SOLID)
allows a handler to stop processing a request if necessary
allows you to decouple processing logic of the handlers from each other (Single Responsibility Principle of SOLID)
allows you to define the order of the handlers to process a request outside of the handlers themselves
One example of real-world usage is Servlet filters, where you call doFilter(HttpRequest, HttpResponse, FilterChain) to invoke the next handler:
protected void doFilter(HttpServletRequest req, HttpServletResponse resp, FilterChain chain) {
if (haveToInvokeNextHandler) {
chain.doFilter(req, resp);
}
}
If you use the classical Chain of Responsibility pattern, your processing pipeline may look like the following:
API
public class StepContext {
private Map<String, Object> attributes = new HashMap<>();
public <T> T getAttribute(String name) {
return (T) attributes.get(name);
}
public void setAttribute(String name, Object value) {
attributes.put(name, value);
}
}
public interface Step {
void handle(StepContext ctx);
}
public abstract class AbstractStep implements Step {
private Step next;
public AbstractStep() {
}
public AbstractStep(Step next) {
this.next = next;
}
protected void next(StepContext ctx) {
if (next != null) {
next.handle(ctx);
}
}
}
Implementation
public class Step1 extends AbstractStep {
public Step1(Step next) {
super(next);
}
public void handle(StepContext ctx) {
String action1 = ctx.getAttribute("action1");
List<String> output1 = doSomething(action1);
ctx.setAttribute("output1", output1);
next(ctx); // invoke next step
}
}
public class Step2 extends AbstractStep {
public Step2(Step next) {
super(next);
}
public void handle(StepContext ctx) {
String action2 = ctx.getAttribute("action2");
List<String> output1 = ctx.getAttribute("output1");
List<String> output2 = doSomething(output1, action2);
ctx.setAttribute("output2", output2);
next(ctx); // invoke next step
}
}
public class Step3 extends AbstractStep {
public Step3(Step next) {
super(next);
}
public void handle(StepContext ctx) {
String action2 = ctx.getAttribute("action2");
List<String> output2 = ctx.getAttribute("output2");
persist(output2);
next(ctx); // invoke next step
}
}
Client code
Step step3 = new Step3(null);
Step step2 = new Step2(step3);
Step step1 = new Step1(step2);
StepContext ctx = new StepContext();
ctx.setAttribute("action1", action1);
ctx.setAttribute("action2", action2);
step1.handle(ctx);
All of this can also be simplified into a chain of handlers that are decoupled from each other by removing the next references, in case your processing pipeline always has to invoke all the available steps without letting a step control whether the next one runs:
API
public class StepContext {
private Map<String, Object> attributes = new HashMap<>();
public <T> T getAttribute(String name) {
return (T) attributes.get(name);
}
public void setAttribute(String name, Object value) {
attributes.put(name, value);
}
}
public interface Step {
void handle(StepContext ctx);
}
Implementation
public class Step1 implements Step {
public void handle(StepContext ctx) {
String action1 = ctx.getAttribute("action1");
List<String> output1 = doSomething(action1);
ctx.setAttribute("output1", output1);
}
}
public class Step2 implements Step {
public void handle(StepContext ctx) {
String action2 = ctx.getAttribute("action2");
List<String> output1 = ctx.getAttribute("output1");
List<String> output2 = doSomething(output1, action2);
ctx.setAttribute("output2", output2);
}
}
public class Step3 implements Step {
public void handle(StepContext ctx) {
String action2 = ctx.getAttribute("action2");
List<String> output2 = ctx.getAttribute("output2");
persist(output2);
}
}
Client code
Note that in the case of the Spring framework (I just noticed the @Autowired annotation), the client code can be simplified even further, as the @Autowired annotation can be used to inject all beans of the corresponding type into a collection.
Here is what the documentation states:
Autowiring Arrays, Collections, and Maps
In case of an array, Collection, or Map dependency type, the container autowires all beans matching the declared value type. For such purposes, the map keys must be declared as type String which will be resolved to the corresponding bean names. Such a container-provided collection will be ordered, taking into account Ordered and #Order values of the target components, otherwise following their registration order in the container. Alternatively, a single matching target bean may also be a generally typed Collection or Map itself, getting injected as such.
public class StepsInvoker {
// spring will put all the steps into this collection in order they were declared
// within the spring context (or by means of the `@Order` annotation)
@Autowired
private List<Step> steps;
public void invoke(String action1, String action2) {
StepContext ctx = new StepContext();
ctx.setAttribute("action1", action1);
ctx.setAttribute("action2", action2);
steps.forEach(step -> step.handle(ctx));
}
}
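For the list injection above to work, each Step implementation just needs to be registered as a bean, optionally with an explicit order. A minimal sketch (my addition), reusing Step1 from the example:
@Component
@Order(1) // makes this step come first in the injected List<Step>
public class Step1 implements Step {
    public void handle(StepContext ctx) {
        String action1 = ctx.getAttribute("action1");
        ctx.setAttribute("output1", doSomething(action1));
    }
}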
I want to invoke Lambda function A from another lambda function B with some parameters.
The following is the invoking lambda function.
@SpringBootApplication
public class Application extends SpringBootServletInitializer implements CommandLineRunner {
@Autowired
private ConfigurableApplicationContext context;
public static void main(String[] args) {
SpringApplication.run(Application.class, args);
}
@Override
public void run(String... args) {
DCService dcService = LambdaInvokerFactory.builder().lambdaFunctionNameResolver(
(method, lambdaFunction, lambdaInvokerFactoryConfig) -> "EventPlanDCFunction-Dev")
.build(DCService.class);
log.info("Response from DC service :: {}",dcService.getClass());
String[] params = new String[]{"Subir has invoked"};
dcService.run(params);
SpringApplication.exit(context);
}
}
The following is the code of the DCService.java file.
public interface DCService {
@LambdaFunction(functionName = "DeliveryCycleLambdaHandler",
invocationType = InvocationType.Event)
void run(String... params);
}
The following is the code of the lambda function which I want to invoke.
@SpringBootApplication
public class Application extends SpringBootServletInitializer implements CommandLineRunner {
@Autowired
private ConfigurableApplicationContext context;
@Autowired
private DeliveryCycleService deliveryCycleService;
public static void main(String[] args) {
SpringApplication.run(Application.class, args);
}
@Override
public void run(String... args) {
deliveryCycleService.printMessage(args[0]);
SpringApplication.exit(context);
}
}
As you can see, I tried to pass the parameter by creating an array of Strings in the invoking method, but I am getting an ArrayIndexOutOfBoundsException in the other method, meaning the parameter is not actually reaching the invoked method. If I do not pass a parameter it works fine, but for my use case I need to pass a parameter and invoke the method asynchronously.
NOTE: The Lambda handler code is the same for both of them. The following belongs to one of them.
@Slf4j
public class DCInvokeHandler implements RequestStreamHandler {
private static final Logger LOGGER = LoggerFactory.getLogger(DCInvokeHandler.class);
private volatile SpringBootLambdaContainerHandler<AwsProxyRequest, AwsProxyResponse> handler;
@Override
public void handleRequest(InputStream inputStream, OutputStream outputStream, Context context) throws IOException {
if (handler == null) {
synchronized (this) {
if (handler == null) {
handler = initHandler();
}
}
}
handler.proxyStream(inputStream, outputStream, context);
}
private static SpringBootLambdaContainerHandler<AwsProxyRequest, AwsProxyResponse> initHandler() {
try {
return SpringBootLambdaContainerHandler.getAwsProxyHandler(Application.class, Env.getEnv().name());
} catch (ContainerInitializationException e) {
LOGGER.error("Failed to start spring boot lambda handler", e);
// if we fail here. We re-throw the exception to force another cold start
throw new IllegalStateException("Could not initialize Spring Boot application", e);
}
}
}
This is the basic code to invoke another Lambda from a Lambda function (see the AWS SDK docs):
try {
InvokeRequest invokeRequest = new InvokeRequest();
invokeRequest.setFunctionName(FunctionName);
invokeRequest.setPayload(ipInput);
returnDetails = byteBufferToString(
lambdaClient.invoke(invokeRequest).getPayload(),
Charset.forName("UTF-8"),logger);
} catch (Exception e) {
logger.log(e.getMessage());
}
To invoke the other Lambda function asynchronously, set the InvocationType to Event (see the AWS API docs and the sketch after the list below).
The following are the invocation types: RequestResponse, Event, and DryRun.
RequestResponse (default) - Invoke the function synchronously. Keep the connection open until the function returns a response or times out. The API response includes the function response and additional data.
Event - Invoke the function asynchronously. Send events that fail multiple times to the function's dead-letter queue (if it's configured). The API response only includes a status code.
DryRun - Validate parameter values and verify that the user or role has permission to invoke the function.
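Putting this together, a low-level asynchronous invocation with the AWS SDK for Java v1 could look roughly like this (a sketch; the function name is taken from the question and the payload is an illustrative JSON string that has to match whatever the target handler expects):
AWSLambda lambdaClient = AWSLambdaClientBuilder.defaultClient();

InvokeRequest invokeRequest = new InvokeRequest()
        .withFunctionName("DeliveryCycleLambdaHandler")
        .withInvocationType(InvocationType.Event) // asynchronous invocation
        .withPayload("[\"Subir has invoked\"]");

// Event invocations return no payload; a 202 status code means the event was accepted.
InvokeResult result = lambdaClient.invoke(invokeRequest);
logger.log("Invocation status code: " + result.getStatusCode());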
I have followed a tutorial on dynamic datasource routing in Spring. For that I have to extend AbstractRoutingDataSource to tell Spring which datasource to use, so I do:
public class CustomRouter extends AbstractRoutingDataSource {
@Override
protected Object determineCurrentLookupKey() {
return CustomerContextHolder.getCustomerType();
}
}
Everything goes fine until I get to the class responsible for keeping the value of the customerType (it should be the same during the whole session):
public class CustomerContextHolder {
private static final ThreadLocal<Integer> contextHolder = new ThreadLocal<Integer>();
public static void setCustomerType(Integer customerType) {
contextHolder.set(customerType);
}
public static Integer getCustomerType() {
return (Integer) contextHolder.get();
}
public static void clearCustomerType() {
contextHolder.remove();
}
}
This creates a thread-bound customerType variable, but I have a web application with Spring and JSF, so I think in terms of sessions, not threads. I set the value in the login page on thread A (the view), but then thread B (Hibernate) requests the value to know which datasource to use, and it is null, because that thread has its own value.
Is there any way to do it Session-bounded instead of Thread-bounded?
Things I have tried so far:
Injecting the CustomRouter in the view to set it in the session: not working, it causes a dependency cycle
Replacing the ThreadLocal with an Integer: not working, the value is always overwritten by the last user who logged in
Is FacesContext.getCurrentInstance() working? If so, then you may try this:
public class CustomerContextHolder {
private static HttpSession getCurrentSession(){
HttpServletRequest request = (HttpServletRequest)FacesContext.getCurrentInstance()
.getExternalContext().getRequest();
return request.getSession();
}
public static void setCustomerType(Integer customerType) {
CustomerContextHolder.getCurrentSession().setAttribute("userType", customerType);
}
public static Integer getCustomerType() {
return (Integer) CustomerContextHolder.getCurrentSession().getAttribute("userType");
}
public static void clearCustomerType() {
CustomerContextHolder.getCurrentSession().removeAttribute("userType"); // remove the attribute from the session
}
}