How to persist a Server Sent Event object into my database? - java

I am trying to build a notification system wherein my back-end, upon receiving a webhook from a third party, notifies the front-end of a particular status. I decided to go with the Server-Sent Events implementation in Spring Boot. Having gone through dozens of implementations, I found that the common way of doing it is:
Have your client hit a subscription API on your back-end
Create an SseEmitter object and store it in an in-memory HashMap or ArrayList
Fetch the emitter stored in memory to send a notification when the desired event occurs
However, my issue is that I can't store these SseEmitter objects in-memory, in production-grade code. I have tried serializing the object, but upon deserialization a new SseEmitter object is created and the connection to the client is lost. How do I do this gracefully?
@RestController
public class EventController {

    Map<String, SseEmitter> sseEmitters = new ConcurrentHashMap<>();

    @Autowired
    SerializerUtils serializerUtils;

    String sseEmitterSerialized = null;

    @CrossOrigin
    @GetMapping(value = "/subscribe", consumes = MediaType.ALL_VALUE)
    public SseEmitter subscribe(@RequestParam(value = "tokenId") String tokenId) throws IOException {
        SseEmitter sseEmitter = new SseEmitter(Long.MAX_VALUE);
        sseEmitter.send(SseEmitter.event().name("latestEvent").data("INITIALIZED"));
        // remove by key, not by emitter, so the map entry is actually cleared
        sseEmitter.onCompletion(() -> sseEmitters.remove(tokenId));
        sseEmitters.put(tokenId, sseEmitter);
        return sseEmitter;
    }

    @CrossOrigin
    @PostMapping(value = "/dispatchEvent", produces = MediaType.ALL_VALUE)
    public void dispatchToClients(@RequestParam(value = "freshEvent") String freshEvent,
                                  @RequestParam(value = "tokenId") String tokenId)
            throws IOException, ClassNotFoundException {
        sseEmitters.get(tokenId).send(SseEmitter.event().name("latestEvent").data(freshEvent));
    }
}
I have tried serializing the emitter and converting it to JSON; none of that works.

However my issue is that I can't store these SseEmitter objects in-memory, in production-grade code

There is nothing inherently bad about storing these SSE sessions in a hashmap that maps an id key to the emitter as the value (just make sure to handle disconnections and clean the map up); a sketch follows below.
I don't think it's even possible to persist these sessions for your intended purpose. An SseEmitter wraps a live HTTP connection, and with SSE the server cannot initiate a connection to the client, so a deserialized emitter would have nothing to write to.
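For reference, a minimal sketch of the map-plus-cleanup approach the question already uses, with the leak fixed by always removing the entry by its key. The structure mirrors the question's code; nothing here is persisted, because the emitter wraps a live HTTP connection:

import java.io.IOException;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;
import org.springframework.web.servlet.mvc.method.annotation.SseEmitter;

@RestController
public class SseRegistryController {

    private final Map<String, SseEmitter> sseEmitters = new ConcurrentHashMap<>();

    @GetMapping("/subscribe")
    public SseEmitter subscribe(@RequestParam("tokenId") String tokenId) throws IOException {
        SseEmitter emitter = new SseEmitter(Long.MAX_VALUE);
        // remove by key on every terminal event so the map cannot leak
        emitter.onCompletion(() -> sseEmitters.remove(tokenId));
        emitter.onTimeout(() -> sseEmitters.remove(tokenId));
        emitter.onError(e -> sseEmitters.remove(tokenId));
        sseEmitters.put(tokenId, emitter);
        emitter.send(SseEmitter.event().name("latestEvent").data("INITIALIZED"));
        return emitter;
    }
}

If the emitters must survive a restart or be shared across instances, the usual workaround is not persistence but sticky sessions or a pub/sub backplane (e.g. Redis) that fans events out to whichever instance holds the live connection.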

Related

Track users' activities on a Spring web app

I have a web application written in Java using the Spring framework.
I would like to store the users' activities: page visits, actions, interactions, etc.
I read that usually this is done by creating a table for each tracked aspect. I was wondering if there is a better way to do it using the Spring framework, like a way to intercept all the requests and trigger some actions.
What kind of technology do you recommend for storing all this information? Right now I'm using a MySQL database, interacting with it through JPA. But since I'm really new to this kind of thing, I don't know whether I should go with a NoSQL database or stay with my existing MySQL database. This doubt comes from the idea that this data flow will be much bigger than the normal data flow coming from more traditional actions such as sign-in, creation, deletion, etc.
Hope to have explained myself... if not, just tell me and I'll try to add more details.
[EDIT 1]
The web app is an e-commerce site. So far it does not have so many users, but it will (on the order of thousands).
The goal of the user tracking is just to profile users in order to give them a better and more custom service. For instance, if I see that a user is looking at a lot of items of a particular category, I can show them more items of that kind.
I do not care that much about performance; I mean, it does not have to be that fast.
Right now I have just one database and everything is stored inside it. I don't know if loading it with this kind of data flow would slow down its performance.
The application is running on AWS Elastic Beanstalk and the database is on AWS RDS.
In general it's a very broad topic.
The following considerations come to my mind:
How many requests pass to the microservice per period of time? If it's a small number of users (which translates to the number of records in the database), then it's fine to go with the MySQL approach; the table won't be large. Note, however, that it will still need to be cleaned up periodically.
Is latency important? Sometimes requests have to be served very quickly, and adding a hop to the database to save the user preference can be problematic.
How do you want to consume this information? Are you planning to use dashboards (in this case Micrometer + Prometheus / InfluxDB and Grafana can be a good solution; see the sketch after this list)? Are you planning to actually charge users per number of requests, with the ability to send a monthly bill to their email as a PDF or provide web access to that information (like AWS does, for example)?
What about rate limiting? Are you planning to deny requests that are too frequent and come from the same user?
How many instances will "add" this kind of information? What if you have thousands of microservices that now have to write to MySQL? It might not survive such a load (in addition to the regular load it is set up for).
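For the dashboards bullet above, a hedged sketch of what the Micrometer variant could look like; the metric name and tag are illustrative, and a MeterRegistry is assumed to be auto-configured by Spring Boot Actuator:

import io.micrometer.core.instrument.MeterRegistry;
import org.springframework.stereotype.Component;

@Component
public class PageVisitTracker {

    private final MeterRegistry registry;

    public PageVisitTracker(MeterRegistry registry) {
        this.registry = registry;
    }

    // called from a filter or interceptor on each request
    public void trackVisit(String uri) {
        // one time series per URI; Prometheus/Grafana handle the aggregation,
        // so nothing is written to MySQL at all
        registry.counter("page.visits", "uri", uri).increment();
    }
}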
The range of solutions can vary.
You can batch the requests per user in memory, send a message to Kafka once in a while, and then use Kafka Streams to compute aggregations on it. With this approach you'll minimize the impact of the added processing on the existing solution and can deploy another service that is able to process this rather large amount of data; a rough sketch follows.
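A rough sketch of this batching idea, assuming spring-kafka and @EnableScheduling are configured; the topic name "user-activity" and the flush interval are made-up examples:

import java.util.List;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.CopyOnWriteArrayList;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;

@Component
public class ActivityBatcher {

    private final Map<String, List<String>> buffer = new ConcurrentHashMap<>();
    private final KafkaTemplate<String, String> kafkaTemplate;

    public ActivityBatcher(KafkaTemplate<String, String> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    // called on the request path: cheap in-memory append, no database hop
    public void record(String userId, String action) {
        buffer.computeIfAbsent(userId, k -> new CopyOnWriteArrayList<>()).add(action);
    }

    // once in a while, ship each user's batch as a single Kafka message
    @Scheduled(fixedDelay = 10_000)
    public void flush() {
        for (String userId : buffer.keySet()) {
            List<String> batch = buffer.remove(userId);
            if (batch != null && !batch.isEmpty()) {
                kafkaTemplate.send("user-activity", userId, String.join(",", batch));
            }
        }
    }
}

Keying by userId keeps one user's events in the same partition, which also keeps them ordered for the downstream Kafka Streams aggregation.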
Another option: maybe you can write to an asynchronously populated log file and store the information there. Then you might want to add some "agent" / sidecar container like Logstash and stream the data into some storage. Yet another project that might be relevant in this field is Apache Flume, which lets you construct such a pipeline.
For billing you might use specialized systems; AFAIK Spring doesn't have anything like this. Usually these are ready-made products that you can integrate with.
For rate limiting you might consider Resilience4j, or solve it with Redis; a sketch with Resilience4j follows.
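A sketch of the Resilience4j option, with arbitrary example limits; the registry hands out one limiter per user id:

import java.time.Duration;
import io.github.resilience4j.ratelimiter.RateLimiter;
import io.github.resilience4j.ratelimiter.RateLimiterConfig;
import io.github.resilience4j.ratelimiter.RateLimiterRegistry;

public class UserRateLimiter {

    private final RateLimiterRegistry registry = RateLimiterRegistry.of(
            RateLimiterConfig.custom()
                    .limitForPeriod(100)                       // 100 requests...
                    .limitRefreshPeriod(Duration.ofMinutes(1)) // ...per minute
                    .timeoutDuration(Duration.ZERO)            // fail fast instead of waiting
                    .build());

    // returns false once the user has exhausted their quota for the period
    public boolean tryConsume(String userId) {
        RateLimiter limiter = registry.rateLimiter(userId);
        return limiter.acquirePermission();
    }
}

Note that this limiter is per instance; the Redis option is what makes the limit global across several service instances.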
Yes, that's possible. Below are three approaches with some sample snippets to help with the implementation. Which one fits depends on the data you store when you log the activity, on when you consider the activity data obsolete, and on several other factors that decide your data store.
Approach 1: You can keep track of the logged-in user using Spring Security.
You can write an HttpSessionBindingListener and track the actions something like this:
@Component
public class LoggedUser implements HttpSessionBindingListener {

    private String username;
    private ActiveUserStore activeUserStore;

    public LoggedUser(String username, ActiveUserStore activeUserStore) {
        this.username = username;
        this.activeUserStore = activeUserStore;
    }

    public LoggedUser() {}

    @Override
    public void valueBound(HttpSessionBindingEvent event) {
        List<String> users = activeUserStore.getUsers();
        LoggedUser user = (LoggedUser) event.getValue();
        if (!users.contains(user.getUsername())) {
            users.add(user.getUsername());
        }
    }

    @Override
    public void valueUnbound(HttpSessionBindingEvent event) {
        List<String> users = activeUserStore.getUsers();
        LoggedUser user = (LoggedUser) event.getValue();
        if (users.contains(user.getUsername())) {
            users.remove(user.getUsername());
        }
    }

    // standard getters and setters
}
For login and logout you can track the user with an AuthenticationSuccessHandler:
@Component("myAuthenticationSuccessHandler")
public class MySimpleUrlAuthenticationSuccessHandler implements AuthenticationSuccessHandler {

    @Autowired
    ActiveUserStore activeUserStore;

    @Override
    public void onAuthenticationSuccess(HttpServletRequest request,
            HttpServletResponse response, Authentication authentication)
            throws IOException {
        HttpSession session = request.getSession(false);
        if (session != null) {
            LoggedUser user = new LoggedUser(authentication.getName(), activeUserStore);
            session.setAttribute("user", user);
        }
    }
}
Approach 2: If you want to keep it very simple, you can write a OncePerRequestFilter:
@Component
@Order(Ordered.LOWEST_PRECEDENCE)
public class LogFilter extends OncePerRequestFilter {

    @Override
    protected void doFilterInternal(HttpServletRequest request, HttpServletResponse response, FilterChain filterChain)
            throws ServletException, IOException {
        // Log the info you need
        // ...
        filterChain.doFilter(request, response);
    }
}
Approach 3: Implement it using Spring AOP:
@Aspect
@Component
public class WebMethodAuditor {

    protected final Log logger = LogFactory.getLog(getClass());
    public static final String DATE_FORMAT_NOW = "yyyy-MM-dd HH:mm:ss";

    @Autowired
    AuditRecordDAO auditRecordDAO;

    @Before("execution(* com.mycontrollers.*.*(..))")
    public void beforeWebMethodExecution(JoinPoint joinPoint) {
        Object[] args = joinPoint.getArgs();
        String methodName = joinPoint.getSignature().getName();
        User principal = (User) SecurityContextHolder.getContext().getAuthentication().getPrincipal();
        Timestamp timestamp = new Timestamp(new java.util.Date().getTime());
        // only log those methods called by an end user
        if (principal.getUsername() != null) {
            for (Object o : args) {
                boolean doInspect = true;
                if (o instanceof ServletRequestDataBinder) doInspect = false;
                if (o instanceof ExtendedModelMap) doInspect = false;
                if (doInspect) {
                    if (o instanceof BaseForm) {
                        // only audit form objects
                        AuditRecord ar = new AuditRecord();
                        ar.setUsername(principal.getUsername());
                        ar.setClazz(o.getClass().getCanonicalName());
                        ar.setMethod(methodName);
                        ar.setAsString(o.toString());
                        ar.setAudit_timestamp(timestamp);
                        auditRecordDAO.save(ar);
                    }
                }
            }
        }
    }
}
Sources and more details:
https://www.baeldung.com/spring-security-track-logged-in-users
Spring / AOP: Best way to implement an activities log in the database
What is the best way to log Activity in Spring Boot with Thymeleaf?

Do @Cacheable-annotated methods execute when the actual data is modified?

I am building a RESTful web service that can be consumed by a browser or another web service.
I want to reduce bandwidth through caching; however, I want the method to be executed and the actual data to be sent only if it's different from the last modified cache.
From my understanding of the @Cacheable annotation, the method is only executed once and the output is cached until the cache expires.
Also, @CachePut executes every time and updates the cache, but does it send the cached value again even if it's not updated?
In summary: I need the client to be able to send the last-modified date of its cache and only get new data if it has been modified since.
Also, how does Spring handle client-side caching and If-Modified-Since headers? Do I need to save the last-modified time, or is it handled automatically?
No, you need to do it yourself.
You need to annotate your "fetch" method with @Cacheable (docs) and then annotate your "update" method with @CacheEvict (docs) in order to "drop" your cache. That way, when you fetch your data after a modification, it will be fresh.
Alternatively, you can create another method annotated with @CacheEvict and call it manually from the "update" method; a sketch of this pairing follows.
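A minimal sketch of that fetch/evict pairing; the User entity and UserRepository are hypothetical placeholders:

import org.springframework.cache.annotation.CacheEvict;
import org.springframework.cache.annotation.Cacheable;
import org.springframework.stereotype.Service;

@Service
public class UserService {

    private final UserRepository userRepository; // hypothetical repository

    public UserService(UserRepository userRepository) {
        this.userRepository = userRepository;
    }

    @Cacheable(value = "users", key = "#id")
    public User findUser(String id) {
        return userRepository.findById(id); // hits the database only on a cache miss
    }

    @CacheEvict(value = "users", key = "#user.id")
    public User updateUser(User user) {
        return userRepository.save(user); // drops the stale entry; the next fetch is fresh
    }
}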
The cache-related annotations (@Cacheable, @CacheEvict, etc.) only deal with the cache maintained by the application. Any HTTP response header like Last-Modified has to be managed separately. Spring MVC provides a handy way to deal with it (docs).
The logic to calculate the last-modified time is obviously application-specific.
An example of its usage would be:
@RestController
public class MyController {

    @Autowired
    CacheService cacheService;

    @RequestMapping(value = "/testCache", method = RequestMethod.GET)
    public String myControllerMethod(WebRequest webRequest, Model model, HttpServletResponse response) {
        long lastModified = ...; // calculate as per your logic and add headers to the response
        if (webRequest.checkNotModified(lastModified)) {
            // stop processing; Spring MVC sends 304 Not Modified
            return null;
        } else {
            return cacheService.getData(model);
        }
    }
}

@Component
public class CacheService {

    @Cacheable("users")
    public String getData(Model model) {
        // populate Model
        return "dataview";
    }
}

Server-sent events with Spring, with the request thread being freed

I need to show off an example of server-sent events. I learned about them in a Spring talk where people used WebFlux to demonstrate the reactive principles. I understood the part about how this frees thread resources, because the request thread won't be blocked until the job is done and the server returns the response.
I have an example here, but I don't really know how to make the thread-resource aspect of this example clear enough.
I do not want to use the WebFlux framework here. I just need to know what to put into a separate thread, if that is necessary at all.
As you can see, I have a GetMapping to subscribe to the event stream, and then another GetMapping to fire an event. This example is fast, of course, but it should be thought of as standing in for a heavy database call.
So I actually need the whole logic to be separated into another thread, right? So that the request thread is free as soon as possible?
@RestController
public class EventStreamRequestHandler {

    @Autowired
    ObjectMapper objectMapper;

    SseEmitter sseEmitter = new SseEmitter(1000000L);

    @GetMapping("/get/event/stream")
    public SseEmitter getStream() {
        return this.sseEmitter;
    }

    @GetMapping("/launch/event")
    public void fireEvent() throws IOException {
        Person peter = new Person("Peter", "25");
        String valueAsString = objectMapper.writeValueAsString(peter);
        SseEmitter.SseEventBuilder sseEventBuilder = SseEmitter.event()
                .id("foo")
                .name("awesome-event")
                .data(valueAsString);
        sseEmitter.send(sseEventBuilder);
    }
}
Yes, server-sent events are meant to push messages to the client asynchronously, without the client having to keep polling for them.
The sending of messages from server to client needs to happen asynchronously. With the way you have done it, when a GET request is sent to /get/event/stream an SseEmitter is created, but messages are only sent when a GET request hits /launch/event, and the request thread for /launch/event is the one used to send the message.
Some time back I wrote a post on sending SSE messages from a different thread; a minimal sketch of the same idea is below.
But I don't recommend storing the SseEmitter in an instance variable, as it will be overwritten by concurrent requests. You must at least scope it per client (for example, a map keyed by a client id, as in the first question above).
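A minimal sketch of that separation, with the heavy database call simulated by a sleep; the emitter timeout and executor sizing are arbitrary examples:

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;
import org.springframework.web.servlet.mvc.method.annotation.SseEmitter;

@RestController
public class StreamingController {

    private final ExecutorService executor = Executors.newSingleThreadExecutor();

    @GetMapping("/get/event/stream")
    public SseEmitter getStream() {
        SseEmitter emitter = new SseEmitter(60_000L);
        // the servlet thread returns as soon as the emitter is handed back;
        // this worker thread pushes events whenever they are ready
        executor.execute(() -> {
            try {
                Thread.sleep(2000); // stand-in for the heavy database call
                emitter.send(SseEmitter.event().id("foo").name("awesome-event").data("done"));
                emitter.complete();
            } catch (Exception e) {
                emitter.completeWithError(e);
            }
        });
        return emitter;
    }
}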

Is it possible to return custom data to a Kafka producer?

I am learning Kafka and I want to split my app into two microservices.
The first saves all incoming messages from a KafkaConsumer to the database and selects an entity by a given id.
The second provides a REST API to save and get entities.
They interact through Kafka.
How can the REST API receive the ID stored in the db, via Kafka?
Here is sample code of the producer, which is called on a POST request:
public void sendToKafka(MyObject myobject) throws ExecutionException, InterruptedException {
    LOGGER.info("sending payload='{}' to topic='{}'", myobject, myTopic);
    byte[] bytes = parseObjectToByte(myobject);
    ListenableFuture<SendResult<String, byte[]>> resultFuture = kafkaTemplate.send(myTopic, bytes);
    SendResult<String, byte[]> result = resultFuture.get();
    LOGGER.info(result.toString());
}
and the consumer, which saves myObject to the database:
@KafkaListener(topics = "${kafka.topic.mytopic}")
public void saveMyObject(byte[] value) {
    MyObject myobject = parseToMyObject(value);
    LOGGER.info("received myobject='{}'", myobject);
    MyObject myobjectSaved = myObjectRepository.insert(myobject);
}
I'm using spring-kafka with Spring Boot.
The REST API has two methods:
POST - save myObject
GET - return the saved object by id
Is it possible to do this with Kafka, or must I connect these microservices directly? Thank you.
I'm not sure I fully understand your question, but if you want to send a message to Kafka and wait for it to be consumed and handled by some microservice, which would then return some information (a primary key) to the sender of the message, you cannot do that without adding more to your architecture.
A message sent to Kafka is "fire and forget": from the sender's point of view you know nothing about what will happen with the message (whether, when, how often, and by how many consumers it will be consumed).
In your scenario, the consumer microservice could also publish messages carrying the primary key to another Kafka topic, which you would consume if you need that information; a sketch follows.
Remember that Kafka serves to decouple your architecture and to introduce asynchronous message handling. If you need a synchronous response from a consumer, you are probably using the wrong tool.
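A sketch of that reply-topic idea on the consumer side; the reply topic name and the getId() accessor are assumptions, and parseToMyObject stands in for the question's deserialization helper:

import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Component;

@Component
public class SaveAndReplyListener {

    private final KafkaTemplate<String, String> kafkaTemplate;
    private final MyObjectRepository myObjectRepository;

    public SaveAndReplyListener(KafkaTemplate<String, String> kafkaTemplate,
                                MyObjectRepository myObjectRepository) {
        this.kafkaTemplate = kafkaTemplate;
        this.myObjectRepository = myObjectRepository;
    }

    @KafkaListener(topics = "${kafka.topic.mytopic}")
    public void saveMyObject(byte[] value) {
        MyObject saved = myObjectRepository.insert(parseToMyObject(value));
        // publish the generated id so the REST service can pick it up
        kafkaTemplate.send("myobject-saved-ids", saved.getId());
    }

    private MyObject parseToMyObject(byte[] value) {
        // same deserialization as the question's helper
        throw new UnsupportedOperationException("see parseToMyObject in the question");
    }
}

For a truly synchronous round trip, spring-kafka also offers ReplyingKafkaTemplate, but as noted above, needing that is often a sign that Kafka is the wrong tool for the interaction.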

Asynchronous event in Spring taking a lot of time to execute (get its turn)

I have an application where I need to trigger an email whenever a REST call is made to an endpoint. The design is that whenever the REST endpoint is invoked, I save the data in the db, emit an asynchronous event, and return.
My problem is that, due to the huge number of requests that keep coming, the emitted async events do not get a chance to run for quite a long time. As the server stays up for weeks, the delay keeps increasing.
The scenario:
The server endpoint is invoked
The server saves data to the db and emits a Spring asynchronous event
The endpoint returns
Step 2 is the one getting delayed, as the listener is sometimes invoked quite late.
@RestController
public class DataController {

    @Inject
    DataService dataService;

    @Inject
    ApplicationEventPublisher eventPublisher;

    @RequestMapping(value = "data", method = RequestMethod.POST)
    @ResponseStatus(HttpStatus.NO_CONTENT)
    public void addData(@RequestBody DataDTO data) {
        dataService.addData(data);
        eventPublisher.publishEvent(new DataRequest(data));
    }
}

public class DataRequest extends ApplicationEvent {

    private DataDTO dataDTO;

    public DataRequest(DataDTO dataDTO) {
        super(dataDTO);
        this.dataDTO = dataDTO;
    }
}

@Component
public class DataListener {

    // must be public (Spring cannot proxy a private @Async method),
    // and the parameter type must match the published event type
    @EventListener
    @Async
    public void dataListener(DataRequest dataRequest) {
        // Send email
    }
}
Since it is an async event, the JVM gives dataListener a chance to execute very late, and sometimes events triggered earlier get their chance later than ones triggered after them.
So there are two fundamental problems:
Emails are delayed; the delay can range from 1 minute to 4 hours to 8 days.
If an event is triggered at 12 PM to send an email to xyz@gmail.com, and another at 12:15 PM to send one to abc@gmail.com, then abc@gmail.com may receive its email before xyz@gmail.com.
Appreciate your help.
Spring's asynchronous events are limited by the size of the underlying thread pool; as soon as incoming requests outnumber the available threads, there will be delays (see the executor sketch below).
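For reference, a sketch of the kind of executor that @Async events run on; the sizes are illustrative, and once events arrive faster than these threads drain the queue, listener invocations back up exactly as described:

import java.util.concurrent.Executor;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.scheduling.annotation.EnableAsync;
import org.springframework.scheduling.concurrent.ThreadPoolTaskExecutor;

@Configuration
@EnableAsync
public class AsyncConfig {

    @Bean(name = "taskExecutor")
    public Executor taskExecutor() {
        ThreadPoolTaskExecutor executor = new ThreadPoolTaskExecutor();
        executor.setCorePoolSize(8);
        executor.setMaxPoolSize(16);
        executor.setQueueCapacity(1000); // events wait here while all threads are busy
        executor.initialize();
        return executor;
    }
}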
You need to use a message queue like RabbitMQ, Kafka, etc. Your architecture should change to do the following:
Serialize a JSON message in the REST endpoint with all the information, such as the recipient email address and the database entry data, store that JSON message in the message queue, and return a status code.
Have consumers for the message queue (separate Java applications) that poll or get notified when there is data in the queue.
These consumers deserialize the JSON message, save an entry in the database, and send the email.
With this architecture you can add consumers at times of high load and thus scale as required; a sketch of this shape follows.
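A sketch of that shape using Kafka as the queue; the topic name, the DTO's getEmail() accessor, and the consumer living in a separate application are illustrative assumptions:

import com.fasterxml.jackson.databind.ObjectMapper;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class EnqueueingDataController {

    private final KafkaTemplate<String, String> kafkaTemplate;
    private final ObjectMapper objectMapper;

    public EnqueueingDataController(KafkaTemplate<String, String> kafkaTemplate,
                                    ObjectMapper objectMapper) {
        this.kafkaTemplate = kafkaTemplate;
        this.objectMapper = objectMapper;
    }

    @PostMapping("/data")
    public void addData(@RequestBody DataDTO data) throws Exception {
        // only serialize and enqueue; saving and emailing move to the consumers,
        // keyed by recipient so emails to the same address stay in order
        kafkaTemplate.send("email-requests", data.getEmail(), objectMapper.writeValueAsString(data));
    }
}

// in the separate consumer application:
// @KafkaListener(topics = "email-requests")
// public void handle(String json) throws Exception {
//     DataDTO data = objectMapper.readValue(json, DataDTO.class);
//     dataService.addData(data); // save to db
//     emailService.send(data);   // then send the email
// }

Because Kafka preserves ordering within a partition, keying messages by recipient also addresses the out-of-order delivery (12 PM vs 12:15 PM) described in the question.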
