I am very new to Spring MVC and AngularJS. The application basically syncs selected tables between two DB instances. When a sync is requested, the following service method is called on the server:
#RequestMapping(value = "configversion/{srcConfVersionId}/sync", method = RequestMethod.POST)
#ModelAttribute(DEFAULT_MODEL_ATTR_NAME)
#ResponseBody
public CustomResponse syncConfigurations(#PathVariable Long srcConfVersionId, #RequestBody SyncDTO dto) {
Date processingTime = new Date();
dto.setSrcConfVersionId(srcConfVersionId);
DeferredResult<Object> deferredResult = new DeferredResult<>();
SyncProcessingTask task = new SyncProcessingTask( dataSyncService, deferredResult, srcConfVersionId, dto);
Timer timer = new Timer();
timer.schedule(task, processingTime);
return new CustomResponse(deferredResult);
}
As far as I know, DeferredResult is used for asynchronous processing, where the result can be read at some point in the future.
On the front-end side, once the sync button is clicked the user is taken to a new page that lists the results of the sync. While the server continues in the background, the client shows an "In Progress" status. Now the question is: how can I check from the AngularJS side whether the DeferredResult has changed?
You need to use the callback methods of DeferredResult. Implement the following:
DeferredResult.onCompletion(Runnable callback);
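A minimal sketch of the usual pattern, assuming your SyncProcessingTask calls deferredResult.setResult(...) when it finishes: return the DeferredResult itself from the controller rather than wrapping it in CustomResponse. Spring MVC then holds the HTTP request open and completes the response when the result is set, so on the AngularJS side the $http promise for the sync call simply resolves once processing is done; there is nothing extra to poll. The 30-second timeout below is an arbitrary example value.
@RequestMapping(value = "configversion/{srcConfVersionId}/sync", method = RequestMethod.POST)
@ResponseBody
public DeferredResult<Object> syncConfigurations(@PathVariable Long srcConfVersionId,
                                                 @RequestBody SyncDTO dto) {
    DeferredResult<Object> deferredResult = new DeferredResult<>(30_000L);
    deferredResult.onCompletion(() -> {
        // server-side cleanup hook; runs after the result has been set
    });
    deferredResult.onTimeout(() -> deferredResult.setErrorResult("sync timed out"));
    dto.setSrcConfVersionId(srcConfVersionId);
    // SyncProcessingTask is assumed to call deferredResult.setResult(...) when done
    new Timer().schedule(new SyncProcessingTask(dataSyncService, deferredResult, srcConfVersionId, dto),
            new Date());
    return deferredResult;
}
onCompletion and onTimeout are server-side hooks; the client just sees a normal (late) HTTP response.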
I'm coding a game; when a player ends their turn, I want to notify the opponent that it's their turn to play.
So I'm storing WebSocketSessions in "Player" classes, so I just need to get hold of a player instance to have access to their WebSocketSession.
The problem is that nothing happens when I use the "send" method of a WebSocketSession stored in a "player" instance.
Here is the code I use to store a WebSocketSession in a player object; it receives messages from the front end properly and is able to send a message back, and that works:
@Component("ReactiveWebSocketHandler")
public class ReactiveWebSocketHandler implements WebSocketHandler {

    @Autowired
    private AuthenticationService authenticationService;

    @Override
    public Mono<Void> handle(WebSocketSession webSocketSession) {
        Flux<WebSocketMessage> output = webSocketSession.receive()
                .map(msg -> {
                    String payloadAsText = msg.getPayloadAsText();
                    Account account = authenticationService.getAccountByToken(payloadAsText);
                    Games.getInstance().getGames().get(account.getIdCurrentGame())
                            .getPlayerById(account.getId()).setSession(webSocketSession);
                    return "WebSocketSession id: " + webSocketSession.getId();
                })
                .map(webSocketSession::textMessage);
        return webSocketSession.send(output);
    }
}
And here is the code I use to notify the opponent that it is their turn to play. The opponentSession.send method seems to produce no result: there is no error message, and I receive nothing on the front end. The session has the same ID as in the handle method, so I think the session object is good; the WebSocket session was also open and ready when I ran my tests:
#RequestMapping(value = "/game/endTurn", method = RequestMethod.POST)
GameBean endTurn(
#RequestHeader(value = "token", required = true) String token) {
ObjectMapper mapper = new ObjectMapper();
Account account = authenticationService.getAccountByToken(token);
gameService.endTurn(account);
Game game = gameService.getGameByAccount(account);
//GameBean opponentGameBean = game.getOpponentGameState(account.getId());
//WebSocketMessage webSocketMessage = opponentSession.textMessage(mapper.writeValueAsString(opponentGameBean));
WebSocketSession opponentSession = game.getPlayerById(game.getOpponentId(account.getId())).getSession();
WebSocketMessage webSocketMessage = opponentSession.textMessage("test message");
opponentSession.send(Mono.just(webSocketMessage));
return gameService.getGameStateByAccount(account);
}
}
You can see in the screenshot that the handle method works correctly; I'm able to send and receive messages.
(Screenshot: WebSocket input and output)
Does anyone know how I can make the opponentSession.send method work correctly, so that I receive messages on the front end?
You are using the reactive stack for your WebSocket, and WebSocketSession#send returns a Mono<Void>, but you don't subscribe to this Mono (you just assembled it), so nothing will happen until something subscribes to it.
In your endpoint it doesn't look like you are using WebFlux, so you are in the synchronous world and have no choice but to block:
opponentSession.send(Mono.just(webSocketMessage)).block();
If you are using WebFlux, then you should change your method to return a Mono and do something like:
return opponentSession.send(Mono.just(webSocketMessage)).then(gameService.getGameStateByAccount(account));
If you are not familiar with this, you should look into Project Reactor and WebFlux.
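For illustration, here is a minimal sketch of what the WebFlux variant of the endpoint could look like, keeping the names from the question and assuming the gameService calls stay synchronous:
@RequestMapping(value = "/game/endTurn", method = RequestMethod.POST)
Mono<GameBean> endTurn(@RequestHeader(value = "token", required = true) String token) {
    Account account = authenticationService.getAccountByToken(token);
    gameService.endTurn(account);
    Game game = gameService.getGameByAccount(account);
    WebSocketSession opponentSession = game.getPlayerById(game.getOpponentId(account.getId())).getSession();
    WebSocketMessage webSocketMessage = opponentSession.textMessage("test message");
    // The send Mono is now part of the returned chain, so the framework subscribes to it;
    // the game state is emitted only after the notification has been sent.
    return opponentSession.send(Mono.just(webSocketMessage))
            .then(Mono.fromCallable(() -> gameService.getGameStateByAccount(account)));
}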
Suppose I have some service S that receives requests from a client C.
S cannot respond immediately due to heavy calculations; C cannot wait forever and has its own timeout period.
My idea is to implement the server side as described here:
REST and long running jobs (Farazdagi)
In my ServerController I have a thread pool for deferred calculations and a concurrent map to store responses.
private final int NUMBER_OF_WORKERS = 10;
private Map<String, ResponseEntity<MathResponse>> responseMap = new ConcurrentHashMap<>();
private ExecutorService executorService = Executors.newFixedThreadPool(NUMBER_OF_WORKERS);
My /calculate mapping submits jobs to the thread pool, returns with a 202 (Accepted) HTTP status, and puts the redirect link in the Location header.
#RequestMapping(value = "/calculate", method = RequestMethod.POST)
public ResponseEntity<String> startWorkflow(#RequestBody MathRequest request, UriComponentsBuilder builder) {
UUID uuid = UUID.randomUUID();
executorService.submit(() -> {
// time-consuming calculations here
ResponseEntity<MathResponse>response = HardMath.execute(request)
responseMap.put(uuid.toString(), response);
});
HttpHeaders headers = new HttpHeaders();
UriComponents uriComponents = builder.path("/wf/queue/{id}").buildAndExpand(uuid.toString());
headers.setLocation(uriComponents.toUri());
return new ResponseEntity<>(headers, HttpStatus.ACCEPTED);
}
In the /queue/{id} mapping I return the result if it's in the map:
#RequestMapping(value = "/queue/{id}", method = RequestMethod.GET)
public ResponseEntity<MathResponse> getQueueInfo(#PathVariable("id") String id) {
ResponseEntity<MathResponse> defaultQueueResponse = new ResponseEntity<>(new MathResponse(), HttpStatus.OK);
return responseMap.getOrDefault(id, defaultQueueResponse);
}
I suppose that using low-level constructs like ConcurrentHashMap for this is not a good idea. Are there any options in Spring that I could use instead of reinventing the wheel?
There's also the question of resilience: if the results are local to an instance of S (i.e. in an in-process Map), then when that instance of S crashes or is restarted the results are lost, and C is forced to resubmit its request(s). If the results cache within S were backed by a resilient store, the results could survive a crash/restart of S.
Spring's caching abstraction with a backing store of <insert storage technology here> could help.
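As a sketch only (the cache name "mathResults" and the ResultStore class are invented for illustration; the backing store is whatever CacheManager is configured, e.g. Redis or Hazelcast), the in-process map could be replaced with Spring's Cache API:
import java.util.Optional;
import org.springframework.cache.Cache;
import org.springframework.cache.CacheManager;
import org.springframework.stereotype.Service;

@Service
public class ResultStore {

    private final Cache cache;

    public ResultStore(CacheManager cacheManager) {
        // The backing store is whatever the CacheManager is configured with;
        // the controller code no longer cares.
        this.cache = cacheManager.getCache("mathResults");
    }

    public void put(String id, MathResponse response) {
        cache.put(id, response);
    }

    public Optional<MathResponse> get(String id) {
        // Empty means "still calculating" (or an unknown id)
        return Optional.ofNullable(cache.get(id, MathResponse.class));
    }
}
The controller would then call resultStore.put(...) from the executor task and resultStore.get(id) from the /queue/{id} mapping.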
From a high level, my application flow looks like this:
A REST controller @RequestMapping is triggered by an incoming request. The REST controller calls a method in a Service class.
#RequestMapping(value="/eventreports", method = RequestMethod.POST, produces = "application/json")
public #ResponseBody List<EventReports> addReportIds(#RequestBody List<Integer> reportIds) {
List<EventReports> eventReports = railAgentCollectorServiceImpl.addReportIds(reportIds);
return eventReports;
}
The service method calls a method in a DAO class.
@Override
public List<EventReports> addReportIds(List<Integer> reportIds) {
    List<EventReports> eventReports = eventReportsDAOImpl.listEventReportsInJsonRequest(reportIds);
    return eventReports;
}
The DAO method executes a StoredProcedureQuery against a SQL data source and returns the results as an ArrayList of domain objects. The service class passes this ArrayList back to the REST controller, which returns it as a JSON string.
@Override
public List<EventReports> listEventReportsInJsonRequest(List<Integer> reportIds) {
    ArrayList<EventReports> erArr = new ArrayList<EventReports>();
    try {
        StoredProcedureQuery q = em.createStoredProcedureQuery("sp_get_event_reports", "eventReportsResult");
        q.registerStoredProcedureParameter("reportIds", String.class, ParameterMode.IN);
        // the List<Integer> may need converting to a delimited String to match the registered parameter type
        q.setParameter("reportIds", reportIds);
        boolean isResultSet = q.execute();
        erArr = (ArrayList<EventReports>) q.getResultList();
    } catch (Exception e) {
        System.out.println("No event reports found for list " + reportIds);
    }
    return erArr;
}
I've been investigating integrating Spring Batch processing into the above pattern. I've been looking at the Spring getting-started guide for batch processing here: https://spring.io/guides/gs/batch-processing/, paying particular attention to the source code for BatchConfiguration.java. I'm uncertain whether my application is suited to Spring Batch; maybe my incomplete knowledge of Spring Batch and the various ways it can be implemented is preventing me from conceptualizing it. The BatchConfiguration.java code below suggests to me that Spring Batch may be best suited to iterating through a list of items, reading, processing, and writing them one by one, whereas my service code is based on gathering and writing a list of objects all at once.
@Bean
public FlatFileItemReader<Person> reader() {
    FlatFileItemReader<Person> reader = new FlatFileItemReader<Person>();
    reader.setResource(new ClassPathResource("sample-data.csv"));
    reader.setLineMapper(new DefaultLineMapper<Person>() {{
        setLineTokenizer(new DelimitedLineTokenizer() {{
            setNames(new String[] { "firstName", "lastName" });
        }});
        setFieldSetMapper(new BeanWrapperFieldSetMapper<Person>() {{
            setTargetType(Person.class);
        }});
    }});
    return reader;
}

@Bean
public PersonItemProcessor processor() {
    return new PersonItemProcessor();
}

@Bean
public JdbcBatchItemWriter<Person> writer() {
    JdbcBatchItemWriter<Person> writer = new JdbcBatchItemWriter<Person>();
    writer.setItemSqlParameterSourceProvider(new BeanPropertyItemSqlParameterSourceProvider<Person>());
    writer.setSql("INSERT INTO people (first_name, last_name) VALUES (:firstName, :lastName)");
    writer.setDataSource(dataSource);
    return writer;
}
// end::readerwriterprocessor[]

// tag::jobstep[]
@Bean
public Job importUserJob(JobCompletionNotificationListener listener) {
    return jobBuilderFactory.get("importUserJob")
            .incrementer(new RunIdIncrementer())
            .listener(listener)
            .flow(step1())
            .end()
            .build();
}

@Bean
public Step step1() {
    return stepBuilderFactory.get("step1")
            .<Person, Person> chunk(10)
            .reader(reader())
            .processor(processor())
            .writer(writer())
            .build();
}
Is this true? Could I still take advantage of the resume-ability, scheduling, and synchronization provided by Spring Batch with my existing code? Any suggestions appreciated.
I think the main thing you need to consider is synchronous versus asynchronous behavior. Batch processes are used for long-running tasks, so consider whether your task is long-running. If it is, you can use batch. This will be asynchronous: the request comes in, the task is started, and the response is returned to the user right away.
The batch job will then run to completion and write its result back to the database. The user will either have to poll for the result using AJAX, or you may have to implement a push-notification mechanism to report the state of the task and avoid polling.
It's true that a Spring Batch chunk consisting of reader -> processor -> writer reads one item and processes one item at a time, but writes a whole chunk of items according to the defined chunk size.
So you can send thousands of items to the writer in one go, to write to storage, depending on your defined chunk_size.
Having said that, a reader hands over one item at a time, but it is not required to read only one item from the source (file/DB etc.) at a time. There are readers that fetch a large number of items from the source in one go, hold them in an internal list, and hand them over to the processor one by one until the list is exhausted.
One such reader is JdbcPagingItemReader: it reads, say, a few thousand rows from the database in one go, as per the defined reader page_size (which reduces DB calls significantly), and then keeps handing them over one by one to the processor; the processor's outputs accumulate automatically until chunk_size is reached, and are then handed over to the writer in bulk.
It may simply be that nothing in the API is ready off the shelf for your requirement; in that case, you will have to write your own ItemReader.
Look at the code of JdbcPagingItemReader to get ideas; a configuration sketch follows.
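Purely as an illustration (the table and column names are invented, and EventReports is assumed to map by property names), a JdbcPagingItemReader could be configured along these lines:
@Bean
public JdbcPagingItemReader<EventReports> eventReportsReader(DataSource dataSource) {
    JdbcPagingItemReader<EventReports> reader = new JdbcPagingItemReader<>();
    reader.setDataSource(dataSource);
    reader.setPageSize(1000); // one SELECT fetches 1000 rows, handed to the processor one by one

    // Hypothetical table/columns; pick the QueryProvider matching your database
    MySqlPagingQueryProvider queryProvider = new MySqlPagingQueryProvider();
    queryProvider.setSelectClause("SELECT id, report_data");
    queryProvider.setFromClause("FROM event_reports");
    queryProvider.setSortKeys(Collections.singletonMap("id", Order.ASCENDING));
    reader.setQueryProvider(queryProvider);

    reader.setRowMapper(new BeanPropertyRowMapper<>(EventReports.class));
    return reader;
}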
For your situation, the writer side of Spring Batch doesn't seem to be a problem at all; it already writes in bulk with just a simple configuration. You would have to feed the controller's output to a reader that works along similar lines to JdbcPagingItemReader; see the sketch below.
All I want to say is that in-memory processing is one by one (and that is very fast), but IO can be done in bulk in Spring Batch (if you choose so).
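For instance, here is a hedged sketch using Spring Batch's ListItemReader, which hands an in-memory list to the step one item at a time while the writer still receives whole chunks. The step name, chunk size, loadEventReport lookup, and eventReportsWriter() are assumptions for illustration:
public Step eventReportsStep(List<Integer> reportIds) {
    // ListItemReader hands the controller's in-memory list over one item at a time,
    // much as JdbcPagingItemReader does after fetching a page
    ListItemReader<Integer> reader = new ListItemReader<>(reportIds);
    ItemProcessor<Integer, EventReports> processor = id -> loadEventReport(id); // hypothetical per-id lookup
    return stepBuilderFactory.get("eventReportsStep")
            .<Integer, EventReports>chunk(100) // the writer receives up to 100 items per call
            .reader(reader)
            .processor(processor)
            .writer(eventReportsWriter()) // e.g. a JdbcBatchItemWriter<EventReports>
            .build();
}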
Hope it helps !!
The method in my REST controller keeps looping.
The method is supposed to get a project from the Mongo database, then call another API to fetch some data, and save the project again in Mongo. It works, but it just keeps looping.
I use Retrofit for my API calls, and the application uses Spring Boot.
This is the method that keeps looping.
If you need anything else, just ask.
#RequestMapping(value = "/projects/{id}",method = RequestMethod.PUT)
public void updateCampaign(#PathVariable String id) throws IOException {
Project p = projectRepository.findProjectById(id);
Call<ResponseIndie> get = service.getAllFromCampaign(p.getIndiegogoCampaignId(),API_KEY);
ResponseIndie responseIndie =get.execute().body();
IndiegogoCampaign campaign = responseIndie.getIndiegogoCampaign();
p.setIndiegogoCampaignId(campaign.getId());
p.setIndiegogoCampaign(campaign);
projectRepository.save(p);
logger.info("project saved");
}
I'm trying to implement a notification mechanism where a client connects to the server and receives updates.
Each user connects to a service endpoint like this:
@ManagedService(path = "/chat/{userId}")
When they connect, they are registered in a broadcaster like this:
@Ready
public void onReady(final AtmosphereResource resource) {
    Broadcaster broadcaster = BroadcasterFactory.getDefault().lookup(userId, true);
    broadcaster.addAtmosphereResource(resource);
}
When I want to send a message from a REST endpoint, for example, I do it like this:
#RequestMapping(value = "/ws2/{userId}", method = RequestMethod.GET)
public void test(#PathVariable("userId") String userId) {
Broadcaster broadcaster = BroadcasterFactory.getDefault().lookup(userId,true);
broadcaster.broadcast(new Message(userId, "User id : "));
}
It works very well when I'm using the WebSocket implementation.
When I change to long-polling and call this REST method, only the first message is sent; the others are ignored, with no errors or logs of any kind.
What can I do in this case?