I want to aggregate responses coming from three different endpoints (@ServiceActivator) and persist the aggregated response to the database.
I am getting the following exception:
org.hibernate.LazyInitializationException: failed to lazily initialize a collection of role: c.b.bean.jpa.PersonEntity.listsOfEmails, could not initialize proxy - no Session
How do I make the message flow transaction-aware? Or am I missing something?
Here are the relevant code snippets.
Configuration
@Configuration
@EnableIntegration
@ComponentScan(basePackages = {"integration.endpoint", "integration.sync"})
@IntegrationComponentScan(basePackages = {"integration.gateway"})
public class InfrastructureConfiguration {

    @Bean
    @Description("Entry to the messaging system through the gateway.")
    public MessageChannel requestChannel() {
        return pubSubChannel();
    }

    @Bean
    @Description("Sends transformed message to outbound channel.")
    public MessageChannel invocationChannel() {
        return pubSubChannel();
    }

    @Bean
    @Description("Sends handler message to aggregator channel.")
    public MessageChannel aggregatorChannel() {
        return pubSubChannel();
    }

    @Bean
    @Description("Sends handler message to response channel.")
    public MessageChannel responseChannel() {
        return pubSubChannel();
    }

    private PublishSubscribeChannel pubSubChannel() {
        PublishSubscribeChannel pubSub = new PublishSubscribeChannel(executor());
        pubSub.setApplySequence(true);
        return pubSub;
    }

    private Executor executor() {
        return Executors.newFixedThreadPool(10);
    }
}
Starting Gateway
@MessagingGateway(name = "entryGateway", defaultRequestChannel = "requestChannel")
public interface IntegrationService {
    String initiateSync(AnObject obj);
}
Message builder: it transforms the message by fetching an entity, setting it as a property of the message, and sending the message on to the channel. This entity is later used by @Autowired services in the @ServiceActivator endpoints (the three endpoints below). The entity's associations are lazily initialized.
@Component
public class MessageBuilder {

    private static final Logger LOGGER = LoggerFactory.getLogger(MessageBuilder.class);

    @Autowired
    private ODao dao;

    @Transformer(inputChannel = "requestChannel", outputChannel = "invocationChannel")
    public OMessage buildMessage(Message<AnObject> msg) {
        LOGGER.info("Transforming messages for ID [{}]", msg.getPayload().getId());
        OMessage om = new OMessage(msg.getPayload());
        om.buildMessage(dao);
        return om;
    }
}
Endpoint-1
@Component
public class Handler1 {

    private static final Logger LOGGER = LoggerFactory.getLogger(Handler1.class);

    @Autowired
    private Service1 service1;

    @ServiceActivator(inputChannel = "invocationChannel", outputChannel = "aggregatorChannel")
    public ResponseMessage handle(Message<OMessage> msg) {
        OMessage om = msg.getPayload();
        ResponseMessage rm = null;
        if (map.get("toProceed")) {
            LOGGER.info("Handler1 is called");
            rm = service1.getResponse(om);
        } else {
            LOGGER.info("Handler1 is not called");
        }
        return rm;
    }
}
Endpoint-2
@Component
public class Handler2 {

    private static final Logger LOGGER = LoggerFactory.getLogger(Handler2.class);

    @Autowired
    private Service2 service2;

    @ServiceActivator(inputChannel = "invocationChannel", outputChannel = "aggregatorChannel")
    public ResponseMessage handle(Message<OMessage> msg) {
        OMessage om = msg.getPayload();
        ResponseMessage rm = null;
        if (map.get("toProceed")) {
            LOGGER.info("Handler2 is called");
            rm = service2.getResponse(om);
        } else {
            LOGGER.info("Handler2 is not called");
        }
        return rm;
    }
}
Endpoint-3
@Component
public class Handler3 {

    private static final Logger LOGGER = LoggerFactory.getLogger(Handler3.class);

    @Autowired
    private Service3 service3;

    @ServiceActivator(inputChannel = "invocationChannel", outputChannel = "aggregatorChannel")
    public ResponseMessage handle(Message<OMessage> msg) {
        OMessage om = msg.getPayload();
        ResponseMessage rm = null;
        if (map.get("toProceed")) {
            LOGGER.info("Handler3 is called");
            rm = service3.getResponse(om);
        } else {
            LOGGER.info("Handler3 is not called");
        }
        return rm;
    }
}
Aggregator
@Component
public class MessageAggregator {

    private static final Logger LOGGER = LoggerFactory.getLogger(MessageAggregator.class);

    @Aggregator(inputChannel = "aggregatorChannel", outputChannel = "responseChannel")
    public Response aggregate(List<ResponseMessage> resMsg) {
        LOGGER.info("Aggregating Responses");
        Response res = new Response();
        res.getResponse().addAll(resMsg);
        return res;
    }

    @ReleaseStrategy
    public boolean releaseChecker(List<Message<ResponseMessage>> resMsg) {
        return resMsg.size() == 3;
    }

    @CorrelationStrategy
    public ResponseMessage correlateBy(ResponseMessage resMsg) {
        LOGGER.info("CorrelationStrategy: message payload details {}", resMsg);
        return resMsg;
    }
}
You could fetch the lazily loaded association eagerly inside the DAO layer, so that by the time the entity is used later it is already fully initialized and no proxy needs an open session.
For example, it might look like this:
public List<PersonEntity> fetchPersonsWithMails() {
    return sessionFactory.getCurrentSession()
            .createCriteria(PersonEntity.class)
            .setFetchMode("listsOfEmails", FetchMode.JOIN)
            .list();
}
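If the DAO is JPA-based rather than using the Hibernate Criteria API, a roughly equivalent sketch (assuming an injected EntityManager and that the association is really named listsOfEmails) would use a fetch join:

public List<PersonEntity> fetchPersonsWithMails() {
    // "join fetch" initializes the collection while the persistence context is still open,
    // so no uninitialized lazy proxy is handed to the integration flow threads.
    return entityManager.createQuery(
            "select distinct p from PersonEntity p left join fetch p.listsOfEmails",
            PersonEntity.class)
            .getResultList();
}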
I am trying to obtain the currently authenticated user in the controller for WebSockets. The problem is that I cannot get the user's id through SecurityContextHolder.getContext().getAuthentication().getPrincipal().
I have also tried passing Principal as a parameter to the method, but the principal is null.
Security configuration:
@Configuration
@EnableWebSocketMessageBroker
public class WebSocketConfig implements WebSocketMessageBrokerConfigurer {

    @Override
    public void registerStompEndpoints(StompEndpointRegistry registry) {
        registry.addEndpoint("/connect").setAllowedOrigins("*");
    }

    @Override
    public void configureMessageBroker(MessageBrokerRegistry registry) {
        registry.enableSimpleBroker("/topic/messages");
        registry.setApplicationDestinationPrefixes("/ws");
    }
}
Controller for websocket:
@Controller
public class MessageController {

    @Autowired
    private Consumer consumer;

    @Autowired
    private Utils utils;

    @Autowired
    private PersonService personService;

    @Autowired
    SimpMessagingTemplate simpMessagingTemplate;

    String destination = "/topic/messages";

    ExecutorService executorService = Executors.newFixedThreadPool(1);

    Future<?> submittedTask;

    @MessageMapping("/start")
    public void startTask(Principal principal) {
        // Here, I would like to get the logged in user.
        // If I use the principal like this: principal.getName() => NullPointerException
        if (submittedTask != null) {
            simpMessagingTemplate.convertAndSend(destination, "Task already started");
            return;
        }
        simpMessagingTemplate.convertAndSendToUser(sha.getUser().getName(), destination, "Started task");
        submittedTask = executorService.submit(() -> {
            while (true) {
                // simpMessagingTemplate.convertAndSend(destination,
                //         "The calculated value " + val + " is equal to : " + max);
            }
        });
    }
}
How can I get the authenticated user? I need it to decide when to start the task for the WebSocket.
Try implementing a ChannelInterceptor, which needs to be registered in the configuration class (the class that implements WebSocketMessageBrokerConfigurer):
@Configuration
@EnableWebSocketMessageBroker
public class WebSocketConfig implements WebSocketMessageBrokerConfigurer {

    private final ChannelInterceptor serverPushInBoundInterceptor;

    @Autowired
    public WebSocketConfig(@Qualifier("serverPushInBoundInterceptor") ChannelInterceptor serverPushInBoundInterceptor) {
        this.serverPushInBoundInterceptor = serverPushInBoundInterceptor;
    }

    // ....

    @Override
    public void configureClientInboundChannel(ChannelRegistration registration) {
        registration.interceptors(serverPushInBoundInterceptor);
    }
}
@Component("serverPushInBoundInterceptor")
public class ServerPushInBoundInterceptor implements ChannelInterceptor {

    private static final Logger log = LoggerFactory.getLogger(ServerPushInBoundInterceptor.class);

    @Override
    @SuppressWarnings("NullableProblems")
    public Message<?> postReceive(Message<?> message, MessageChannel channel) {
        StompHeaderAccessor accessor = MessageHeaderAccessor.getAccessor(message, StompHeaderAccessor.class);
        if (StompCommand.CONNECT.equals(Objects.requireNonNull(accessor).getCommand())) {
            List<String> authorization = accessor.getNativeHeader("Authorization");
            if (authorization != null && !authorization.isEmpty()) {
                String auth = authorization.get(0).split(" ")[1];
                System.out.println(auth);
                try {
                    // find Principal
                    Principal principal = ...
                    accessor.setUser(new UsernamePasswordAuthenticationToken(principal, principal.getCredentials(), principal.getAuthorities()));
                } catch (Exception exc) {
                    log.error("preSend", exc);
                }
            }
        }
        return message;
    }
}
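Once the interceptor has attached the user to the CONNECT frame's accessor, Spring should resolve the Principal argument of @MessageMapping methods from that user. A minimal sketch of how the controller could then use it (assuming the interceptor above is registered on the inbound channel):

@MessageMapping("/start")
public void startTask(Principal principal) {
    // principal is now the authentication attached in the interceptor
    String username = principal.getName();
    // ... decide whether to start the task for this user
}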
I have modified my RabbitMQ configuration from a previous post (spring-rabbit JSON deserialization for ArrayList contents) to use a DirectMessageListenerContainer with MessagePostProcessors that gzip and gunzip the message payloads.
However, it doesn't appear to be working: breakpoints in the post-processors are never hit, and my @RabbitListeners no longer receive messages, whereas they did with a SimpleMessageListenerContainer.
Also, it appears that the SimpleMessageListenerContainer(?) is still being used. As a side note, I am autowiring the DirectMessageListenerContainer so I can set the queues it uses dynamically.
spring-rabbit: 2.0.3.RELEASE.
spring-boot: 2.0.1.RELEASE.
RabbitMQ configuration:
@Configuration
@EnableRabbit
public class MessagingConfiguration implements ShutdownListener {

    @Autowired
    private RabbitListenerEndpointRegistry registry;

    @Autowired
    private DirectMessageListenerContainer container;

    @Bean
    public DirectMessageListenerContainer messageListenerContainer(final ConnectionFactory connectionFactory) {
        final DirectMessageListenerContainer listenerContainer = new DirectMessageListenerContainer();
        listenerContainer.setConnectionFactory(connectionFactory);
        listenerContainer.setMessageConverter(jsonConverter()); // i.e. @RabbitListener to use Jackson2JsonMessageConverter
        listenerContainer.setAutoStartup(false);
        // container.setErrorHandler(errorHandler);
        final MessageListenerAdapter messageListener = new MessageListenerAdapter(new Object() {
            @SuppressWarnings("unused")
            public String handleMessage(final String message) {
                return message.toUpperCase();
            }
        });
        messageListener.setBeforeSendReplyPostProcessors(new GZipPostProcessor());
        listenerContainer.setMessageListener(messageListener);
        listenerContainer.setAfterReceivePostProcessors(new GUnzipPostProcessor());
        return listenerContainer;
    }

    @EventListener(ApplicationDatabaseReadyEvent.class)
    public void onApplicationDatabaseReadyEvent() {
        log.info("Starting all RabbitMQ Listeners..."); //$NON-NLS-1$
        for (final MessageListenerContainer listenerContainer : registry.getListenerContainers()) {
            listenerContainer.start();
        }
        log.info("Register is running: {}", registry.isRunning()); //$NON-NLS-1$
        log.info("Started all RabbitMQ Listeners."); //$NON-NLS-1$
    }

    @Bean
    public List<Declarable> bindings() {
        final List<Declarable> declarations = new ArrayList<>();
        final FanoutExchange exchange = new FanoutExchange("fx", true, false);
        final Queue queue = QueueBuilder.durable("orders").build();
        declarations.add(exchange);
        declarations.add(queue);
        declarations.add(BindingBuilder.bind(queue).to(exchange));
        List<String> q = new ArrayList<>();
        q.add(queue.getName());
        container.addQueueNames(q.toArray(new String[q.size()]));
        return declarations;
    }

    @Bean
    public Jackson2JsonMessageConverter jsonConverter() {
        final Jackson2JsonMessageConverter converter = new Jackson2JsonMessageConverter();
        converter.setClassMapper(classMapper());
        return converter;
    }

    private static DefaultJackson2JavaTypeMapper classMapper() {
        final DefaultJackson2JavaTypeMapper classMapper = new DefaultJackson2JavaTypeMapper();
        classMapper.setTrustedPackages("*"); //$NON-NLS-1$ //TODO add trusted packages
        return classMapper;
    }

    @ConditionalOnProperty(name = "consumer", havingValue = "true")
    @Bean
    public ConsumerListener listenerConsumer() {
        return new ConsumerListener();
    }

    @ConditionalOnProperty(name = "producer", havingValue = "true")
    @Bean
    public ProducerListener listenerProducer() {
        return new ProducerListener();
    }

    @Bean
    public RabbitAdmin rabbitAdmin(final CachingConnectionFactory connectionFactory) {
        return new RabbitAdmin(connectionFactory);
    }

    @Bean
    public RabbitTemplate rabbitTemplate(final ConnectionFactory connectionFactory) {
        final RabbitTemplate rabbitTemplate = new RabbitTemplate(connectionFactory);
        rabbitTemplate.setMessageConverter(jsonConverter()); // convert all sent messages to JSON
        rabbitTemplate.setReplyTimeout(TimeUnit.SECONDS.toMillis(3));
        rabbitTemplate.setReceiveTimeout(TimeUnit.SECONDS.toMillis(3));
        return rabbitTemplate;
    }

    @Override
    public void shutdownCompleted(final ShutdownSignalException arg0) {
    }
}
It doesn't work that way: you can't autowire the containers created for @RabbitListeners, because they are not beans. They are created by the container factory and registered in the registry, so you have to retrieve them from the registry (by id) instead.
However, since you have autoStartup set to false, it shouldn't be "stealing" messages from your @RabbitListener.
Generally, DEBUG logging should help.
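For example, a rough sketch of retrieving a listener container from the registry by its @RabbitListener id (the addQueue() helper and the listener id are hypothetical, not part of the original code):

@Autowired
private RabbitListenerEndpointRegistry registry;

public void addQueue(String listenerId, String queueName) {
    // the id must match the @RabbitListener(id = "...") attribute
    AbstractMessageListenerContainer listenerContainer =
            (AbstractMessageListenerContainer) registry.getListenerContainer(listenerId);
    if (listenerContainer != null) {
        listenerContainer.addQueueNames(queueName);
    }
}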
I have the following configuration file for Spring Integration File:
@Configuration
@EnableIntegration
public class MyIntegrationConfiguration {

    private static final String FILE_CHANNEL_PROCESSING = "processingfileChannel";
    private static final String INTERVAL_PROCESSING = "5000";
    private static final String FILE_PATTERN = "*.txt";

    @Value("${import.path.source}")
    private String sourceDir;

    @Value("${import.path.output}")
    private String outputDir;

    @Bean
    @InboundChannelAdapter(value = FILE_CHANNEL_PROCESSING, poller = @Poller(fixedDelay = INTERVAL_PROCESSING))
    public MessageSource<File> sourceFiles() {
        FileReadingMessageSource source = new FileReadingMessageSource();
        source.setAutoCreateDirectory(true);
        source.setDirectory(new File(sourceDir));
        source.setFilter(new SimplePatternFileListFilter(FILE_PATTERN));
        return source;
    }

    @Bean
    @ServiceActivator(inputChannel = FILE_CHANNEL_PROCESSING)
    public MessageHandler processedFiles() {
        FileWritingMessageHandler handler = new FileWritingMessageHandler(new File(outputDir));
        handler.setFileExistsMode(FileExistsMode.REPLACE);
        handler.setDeleteSourceFiles(true);
        handler.setExpectReply(true);
        return handler;
    }

    @Bean
    public IntegrationFlow processFileFlow() {
        return IntegrationFlows
                .from(FILE_CHANNEL_PROCESSING)
                .transform(fileToStringTransformer())
                .handle("fileProcessor", "processFile").get();
    }

    @Bean
    public MessageChannel fileChannel() {
        return new DirectChannel();
    }

    @Bean
    public FileProcessor fileProcessor() {
        return new FileProcessor();
    }

    @Bean
    public FileToStringTransformer fileToStringTransformer() {
        return new FileToStringTransformer();
    }
}
For FileWritingMessageHandler, the documentation says the following about setExpectReply(true):
Specify whether a reply Message is expected. If not, this handler will simply return null for a successful response or throw an Exception for a non-successful response.
My question is: where can I catch these exceptions or where can I retrieve this message/response?
The @ServiceActivator annotation has an outputChannel attribute:
/**
 * Specify the channel to which this service activator will send any replies.
 * @return The channel name.
 */
String outputChannel() default "";
That's for successful replies.
Any exceptions (independently of setExpectReply()) are simply thrown to the caller. In your case the caller is the @InboundChannelAdapter's poller, so the exception is caught, wrapped in an ErrorMessage, and sent to the errorChannel configured on the @Poller. That is the global errorChannel by default: https://docs.spring.io/spring-integration/docs/5.0.4.RELEASE/reference/html/configuration.html#namespace-errorhandler
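So, if you want to see those exceptions, one option (a minimal sketch, assuming the default global errorChannel) is to subscribe a handler to that channel:

@ServiceActivator(inputChannel = "errorChannel")
public void handleError(ErrorMessage errorMessage) {
    // the payload is the exception thrown downstream, wrapped by the poller
    Throwable cause = errorMessage.getPayload();
    if (cause instanceof MessagingException) {
        Message<?> failedMessage = ((MessagingException) cause).getFailedMessage();
        // log or reroute the failed message here
    }
}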
However, there is another problem in your code: there is a second subscriber to FILE_CHANNEL_PROCESSING. If it is not a PublishSubscribeChannel, messages sent to that channel will be distributed round-robin between the subscribers. But that's a different story; please raise it as a separate question.
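If both subscribers really are meant to receive every file message, one possible fix (an assumption about the intent, not part of the answer above) is to declare FILE_CHANNEL_PROCESSING explicitly as a pub-sub channel:

@Bean(name = FILE_CHANNEL_PROCESSING)
public MessageChannel processingFileChannel() {
    // every subscriber receives each message instead of round-robin distribution
    return new PublishSubscribeChannel();
}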
I want my spring-vault configuration, marked with @VaultPropertySource, to be able to retry requests to Vault if they fail.
What should I mark as retryable? I'm using Spring Retry and was looking over http://www.baeldung.com/spring-retry .
There is no obvious method to mark as retryable. Should I change the implementation of the vaultTemplate and mark the VaultOperations as retryable?
ProvisioningSecrets.java
@Configuration
@VaultPropertySource(
        value = "secret/provisioning",
        propertyNamePrefix = "provisioning.",
        renewal = Renewal.RENEW
)
@EnableRetry
@Lazy
@Profile("!test")
public class ProvisioningSecrets {

    private static final Logger logger = LoggerFactory.getLogger(ProvisioningSecrets.class);

    @Autowired
    public void setPassword(@Value("${provisioning.password}") final String password) throws Exception {
        logger.info("We successfully set the provisioning db password.");
        EnvVars.changeSetting(Setting.PROVISIONING_PASS, password);
    }

    @Autowired
    public void setHost(@Value("${provisioning.host}") final String host) throws Exception {
        logger.info("We successfully set the provisioning db host.");
        EnvVars.changeSetting(Setting.PROVISIONING_HOST, host);
    }

    @Autowired
    public void setPort(@Value("${provisioning.port}") final int port) throws Exception {
        logger.info("We successfully set the provisioning db port.");
        EnvVars.changeSetting(Setting.PROVISIONING_PORT, Integer.toString(port));
    }

    @Autowired
    public void setUsername(@Value("${provisioning.username}") final String username) throws Exception {
        logger.info("We successfully set the provisioning db username.");
        EnvVars.changeSetting(Setting.PROVISIONING_USER, username);
    }

    @Autowired
    public void setDbName(@Value("${provisioning.name}") final String name) throws Exception {
        logger.info("We successfully set the provisioning db name.");
        EnvVars.changeSetting(Setting.PROVISIONING_DB_NAME, name);
    }
}
VaultConfiguration.java
@Configuration
@Profile("!test")
public class VaultConfiguration extends AbstractVaultConfiguration {

    private static final Logger logger = LoggerFactory.getLogger(VaultConfiguration.class);

    private URI vaultHost;
    private String vaultToken;

    /**
     * Configure the Client Authentication.
     *
     * @return A configured ClientAuthentication object.
     * @see ClientAuthentication
     */
    @Override
    public ClientAuthentication clientAuthentication() {
        // testing out environment variable value injection
        logger.debug("Vault Token configuration done.");
        return new TokenAuthentication(vaultToken);
    }

    @Override
    @Bean
    @DependsOn("vaultToken")
    public SessionManager sessionManager() {
        return super.sessionManager();
    }

    @Override
    public SslConfiguration sslConfiguration() {
        logger.info("Configuring Vault SSL with NONE.");
        return SslConfiguration.NONE;
    }

    /**
     * Specify an endpoint for connecting to Vault.
     *
     * @return A configured VaultEndpoint.
     * @see VaultEndpoint
     */
    @Override
    public VaultEndpoint vaultEndpoint() {
        logger.debug("Vault Host: " + vaultHost.toString());
        if (vaultHost.toString().isEmpty()) {
            logger.info("Creating default Vault Endpoint.");
            return new VaultEndpoint();
        }
        logger.info("Creating Vault Endpoint based on address: " + vaultHost.toString());
        final VaultEndpoint endpoint = VaultEndpoint.from(vaultHost);
        logger.info("Created Vault Endpoint: " + endpoint.toString());
        return endpoint;
    }

    @Bean("vaultHost")
    public URI vaultHost(@Value("${spring.vault.host}") final URI vaultHost) {
        this.vaultHost = vaultHost;
        return vaultHost;
    }

    @Override
    @Bean
    @DependsOn("vaultHost")
    public VaultTemplate vaultTemplate() {
        return super.vaultTemplate();
    }

    @Bean("vaultToken")
    public String vaultToken(@Value("${spring.vault.token}") final String vaultToken) {
        this.vaultToken = vaultToken;
        return vaultToken;
    }
}
How about creating a custom VaultTemplate subclass that uses a RetryTemplate?
public class RetryableVaultTemplate extends VaultTemplate {

    private final RetryTemplate retryTemplate;

    public RetryableVaultTemplate(VaultEndpointProvider endpointProvider,
            ClientHttpRequestFactory clientHttpRequestFactory,
            SessionManager sessionManager, RetryTemplate retryTemplate) {
        super(endpointProvider, clientHttpRequestFactory, sessionManager);
        this.retryTemplate = retryTemplate;
    }

    @Override
    public VaultResponse read(final String path) {
        return retryTemplate
                .execute(new RetryCallback<VaultResponse, RuntimeException>() {
                    @Override
                    public VaultResponse doWithRetry(RetryContext context) {
                        System.out.println("doWithRetry");
                        return RetryableVaultTemplate.super.read(path);
                    }
                });
    }

    @Override
    public <T> VaultResponseSupport<T> read(final String path, final Class<T> responseType) {
        return retryTemplate
                .execute(new RetryCallback<VaultResponseSupport<T>, RuntimeException>() {
                    @Override
                    public VaultResponseSupport<T> doWithRetry(RetryContext context) {
                        return RetryableVaultTemplate.super.read(path, responseType);
                    }
                });
    }
}
Make sure to register an instance of this class as the vaultTemplate bean instead of the plain VaultTemplate.
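One hedged way to do that, assuming your VaultConfiguration extends AbstractVaultConfiguration (which exposes clientHttpRequestFactoryWrapper() in spring-vault 2.x) as shown above, is to override the vaultTemplate() bean method; the RetryTemplate settings here are just illustrative:

@Override
@Bean
public VaultTemplate vaultTemplate() {
    // returns the retrying variant as the vaultTemplate bean
    return new RetryableVaultTemplate(
            SimpleVaultEndpointProvider.of(vaultEndpoint()),
            clientHttpRequestFactoryWrapper().getClientHttpRequestFactory(),
            sessionManager(),
            retryTemplate());
}

@Bean
public RetryTemplate retryTemplate() {
    RetryTemplate retryTemplate = new RetryTemplate();
    retryTemplate.setRetryPolicy(new SimpleRetryPolicy(3));   // up to 3 attempts
    retryTemplate.setBackOffPolicy(new FixedBackOffPolicy()); // default 1s between attempts
    return retryTemplate;
}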
I am experimenting with Reactor 3 and Spring Integration to create a reactive stream (Flux) from a JMS queue (ActiveMQ), so that clients can receive the JMS messages asynchronously. I believe I have everything hooked up correctly, but the client does not receive any of the JMS messages until the server is stopped; then all of the messages get "pushed" to the client at once.
Any help would be appreciated.
Here is the configuration class I am using to set up the JMS and Integration components and the reactive publisher:
@Configuration
@EnableJms
@EnableIntegration
public class JmsConfiguration {

    @Value("${spring.activemq.broker-url:tcp://localhost:61616}")
    private String defaultBrokerUrl;

    @Value("${queues.patient:patient}")
    private String patientQueue;

    @Autowired
    MessageListenerAdapter messageListenerAdapter;

    @Bean
    public DefaultJmsListenerContainerFactory myFactory(
            DefaultJmsListenerContainerFactoryConfigurer configurer) {
        DefaultJmsListenerContainerFactory factory = new DefaultJmsListenerContainerFactory();
        configurer.configure(factory, jmsConnectionFactory());
        return factory;
    }

    @Bean
    public Queue patientQueue() {
        return new ActiveMQQueue(patientQueue);
    }

    @Bean
    public ActiveMQConnectionFactory jmsConnectionFactory() {
        ActiveMQConnectionFactory connectionFactory = new ActiveMQConnectionFactory();
        connectionFactory.setBrokerURL(defaultBrokerUrl);
        connectionFactory.setTrustedPackages(Arrays.asList("com.sapinero"));
        return connectionFactory;
    }

    // Set the Jackson message converter
    @Bean
    public JmsTemplate jmsTemplate() {
        JmsTemplate template = new JmsTemplate();
        template.setConnectionFactory(jmsConnectionFactory());
        template.setDefaultDestinationName(patientQueue);
        template.setMessageConverter(jacksonJmsMessageConverter());
        return template;
    }

    @Bean
    public MessageListenerAdapter messageListenerAdapter() {
        MessageListenerAdapter messageListenerAdapter = new MessageListenerAdapter();
        messageListenerAdapter.setMessageConverter(jacksonJmsMessageConverter());
        return messageListenerAdapter;
    }

    @Bean
    public AbstractMessageListenerContainer messageListenerContainer() {
        DefaultMessageListenerContainer defaultMessageListenerContainer = new DefaultMessageListenerContainer();
        defaultMessageListenerContainer.setMessageConverter(jacksonJmsMessageConverter());
        defaultMessageListenerContainer.setConnectionFactory(jmsConnectionFactory());
        defaultMessageListenerContainer.setDestinationName(patientQueue);
        defaultMessageListenerContainer.setMessageListener(messageListenerAdapter());
        defaultMessageListenerContainer.setCacheLevel(100);
        defaultMessageListenerContainer.setErrorHandler(new ErrorHandler() {
            @Override
            public void handleError(Throwable t) {
                t.printStackTrace();
            }
        });
        return defaultMessageListenerContainer;
    }

    @Bean // Serialize message content to JSON using TextMessage
    public MessageConverter jacksonJmsMessageConverter() {
        MappingJackson2MessageConverter converter = new MappingJackson2MessageConverter();
        converter.setTargetType(MessageType.TEXT);
        converter.setTypeIdPropertyName("_type");
        return converter;
    }

    @Bean
    public MessageChannel jmsOutboundInboundReplyChannel() {
        return MessageChannels.queue().get();
    }

    @Bean
    public Publisher<Message<String>> pollableReactiveFlow() {
        return IntegrationFlows
                .from(Jms.messageDrivenChannelAdapter(messageListenerContainer()).get())
                .channel(MessageChannels.queue())
                .log(LoggingHandler.Level.DEBUG)
                .log()
                .toReactivePublisher();
    }

    @Bean
    public MessageChannel jmsChannel() {
        return new DirectChannel();
    }
}
The controller that creates the Flux is:
@RestController
@RequestMapping("patients")
public class PatientChangePushController {

    private LocalDateTime lastTimePatientDataRetrieved = LocalDateTime.now();
    private int durationInSeconds = 30;
    private Patient patient;

    AtomicReference<SignalType> checkFinally = new AtomicReference<>();

    @Autowired
    PatientService patientService;

    @Autowired
    @Qualifier("pollableReactiveFlow")
    private Publisher<Message<String>> pollableReactiveFlow;

    @Autowired
    private JmsTemplate jmsTemplate;

    @Autowired
    private Queue patientQueue;

    /**
     * Subscribe to a Flux of a patient that has been updated.
     *
     * @param id
     * @return
     */
    @GetMapping(value = "/{id}/alerts", produces = MediaType.TEXT_EVENT_STREAM_VALUE)
    public Flux<Message<String>> getPatientAlerts(@PathVariable Long id) {
        Flux<Message<String>> messageFlux = Flux.from(pollableReactiveFlow);
        return messageFlux;
    }

    @GetMapping(value = "/generate")
    public void generateJmsMessage() {
        for (long i = 0L; i < 100; i++) {
            Patient patient = new Patient();
            patient.setId(i);
            send(patient);
            System.out.println("Message was sent to the Queue");
        }
    }

    void send(Patient patient) {
        this.jmsTemplate.convertAndSend(this.patientQueue, patient);
    }
}
If anyone can tell me why the messages do not get sent to the client until after the server is killed, I would appreciate it.
Works well for me:
@SpringBootApplication
@RestController
public class SpringIntegrationSseDemoApplication {

    public static void main(String[] args) {
        SpringApplication.run(SpringIntegrationSseDemoApplication.class, args);
    }

    @Autowired
    private ConnectionFactory connectionFactory;

    @Autowired
    private JmsTemplate jmsTemplate;

    @Bean
    public Publisher<Message<String>> jmsReactiveSource() {
        return IntegrationFlows
                .from(Jms.messageDrivenChannelAdapter(this.connectionFactory)
                        .destination("testQueue"))
                .channel(MessageChannels.queue())
                .log(LoggingHandler.Level.DEBUG)
                .log()
                .toReactivePublisher();
    }

    @GetMapping(value = "/events", produces = MediaType.TEXT_EVENT_STREAM_VALUE)
    public Flux<String> getPatientAlerts() {
        return Flux.from(jmsReactiveSource())
                .map(Message::getPayload);
    }

    @GetMapping(value = "/generate")
    public void generateJmsMessage() {
        for (int i = 0; i < 100; i++) {
            this.jmsTemplate.convertAndSend("testQueue", "testMessage #" + (i + 1));
        }
    }
}
In one terminal I run curl http://localhost:8080/events, which waits for SSEs from that Flux.
In another terminal I run curl http://localhost:8080/generate and see in the first one:
data:testMessage #1
data:testMessage #2
data:testMessage #3
data:testMessage #4
I use Spring Boot 2.0.0.BUILD-SNAPSHOT.
Also see here: https://spring.io/blog/2017/03/08/spring-tips-server-sent-events-sse