Spring JDBC Adapter in Cluster mode - java

I am using a Spring JDBC inbound channel adapter in my web application. If I deploy the application in a clustered environment, two or more instances pick up the same job and run it.
Can anybody help me overcome this issue by changing the Spring configuration?
I have attached my Spring configuration:
<int-jdbc:inbound-channel-adapter
        channel="inboundAdhocJobTable"
        data-source="dataSource"
        row-mapper="adhocJobMapper"
        max-rows-per-poll="1"
        query="SELECT JOBID, JOBKEY, JOBPARAM
               FROM BATCHJOB
               WHERE JOBSTATUS = 'A'"
        update="DELETE FROM BATCHJOB WHERE JOBKEY IN (:jobKey)">
    <int:poller fixed-rate="1000">
        <int:advice-chain>
        </int:advice-chain>
    </int:poller>
</int-jdbc:inbound-channel-adapter>

Unfortunately this will not be possible without some sort of synchronization. Additionally, using the database as a message queue is not a good idea (http://mikehadlow.blogspot.de/2012/04/database-as-queue-anti-pattern.html). I'd try one of these approaches instead:
Use some sort of message bus plus a message store to hold the job objects, rather than executing SQL directly. In this case you'll have to change the way jobs are stored: either use a message-store-backed channel (Spring Integration only), or push the jobs to a message queue like RabbitMQ.
I'm not 100% sure, but I remember that Spring Batch offers something similar, such as master/slave job splitting and synchronization. It may be worth a look.
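As a sketch of the message-store-backed-channel idea: a queue channel backed by a JdbcChannelMessageStore, so competing instances hand each message out at most once via the shared database. Bean names here are placeholders, the query provider shown is the Oracle variant (pick the one matching your database), and Spring Integration ships the DDL for the backing INT_CHANNEL_MESSAGE table; treat this as an outline, not a drop-in config.

```xml
<bean id="jdbcChannelMessageStore"
      class="org.springframework.integration.jdbc.store.JdbcChannelMessageStore">
    <property name="dataSource" ref="dataSource"/>
    <property name="channelMessageStoreQueryProvider">
        <bean class="org.springframework.integration.jdbc.store.channel.OracleChannelMessageStoreQueryProvider"/>
    </property>
</bean>

<int:channel id="inboundAdhocJobTable">
    <int:queue message-store="jdbcChannelMessageStore"/>
</int:channel>
```

The jobs would then be sent to this channel as messages instead of being read back out of BATCHJOB by every instance's polling adapter.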

Related

How to send List of Objects to kafka topic using spring-kafka framework?

I am using spring-kafka in a spring-boot application to send data to a topic.
I need to fetch the data from an Oracle table and send it.
I fetch a List from the Oracle table. How do I send it to the topic? That is:
Is there any way to send it as a List? If yes, how?
If yes, how do I deserialize it on the consumer side?
Is it possible to send the data in a streaming fashion using spring-boot and spring-kafka? If yes, any more info or a sample/snippet please...
How do I handle the partition key if I send a List at a time?
Currently I am sending individual Company objects, hence the key is defined as below:
companyKafkaTemplate.send(COMPANY_TOPIC,this.getKey(company), company);
For List serialization and deserialization I would suggest using the JSON support in Spring Kafka: https://docs.spring.io/spring-kafka/docs/2.2.7.RELEASE/reference/html/#serdes
For streaming I would suggest taking a look at the reactive support in Spring Kafka, based on the Reactor Kafka project: https://github.com/reactor/reactor-kafka
For that purpose we provide a ReactiveKafkaProducerTemplate and a ReactiveKafkaConsumerTemplate.
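A minimal sketch of the JSON setup via Spring Boot's spring-kafka auto-configuration properties (the trusted package name is a placeholder for wherever your Company class lives):

```properties
spring.kafka.producer.value-serializer=org.springframework.kafka.support.serializer.JsonSerializer
spring.kafka.consumer.value-deserializer=org.springframework.kafka.support.serializer.JsonDeserializer
# allow the deserializer to instantiate your own classes (package is a placeholder)
spring.kafka.consumer.properties.spring.json.trusted.packages=com.example.model
```

With this in place, kafkaTemplate.send(COMPANY_TOPIC, key, companyList) would publish the whole List as one JSON array; note that getting it back as a List<Company> on the consumer side generally requires configuring the target type on the JsonDeserializer, since the generic element type is not carried by default.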

How to get UpsertResult when using Mongo Outbound channel adapter?

I'm using Spring Integration to store data in a Mongo database. I'm using the Java classes (MongoDbStoringMessageHandler), not the XML configuration, and I can't find a way to get the result when adding data to the database...
Is it possible? How?
The MongoDbStoringMessageHandler is a one-way component and doesn't return anything.
Consider using a MongoDbOutboundGateway instead, with a CollectionCallback injected, where you can perform an updateMany() and get an UpdateResult as the reply from the gateway.
See more info in the Reference Manual: https://docs.spring.io/spring-integration/reference/html/mongodb.html#mongodb-outbound-gateway
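A rough sketch of what that gateway might look like in Java config (channel and collection names are placeholders, and the exact callback signature varies between Spring Integration versions, so verify against your version rather than treating this as a confirmed setup):

```java
@Bean
@ServiceActivator(inputChannel = "storeDataChannel")
public MessageHandler mongoOutboundGateway(MongoTemplate mongoTemplate) {
    MongoDbOutboundGateway gateway = new MongoDbOutboundGateway(mongoTemplate);
    // the callback exposes the driver's collection API directly
    gateway.setCollectionCallback(collection ->
            collection.updateMany(
                    Filters.eq("status", "NEW"),
                    Updates.set("status", "PROCESSED")));
    gateway.setCollectionNameExpression(new LiteralExpression("myCollection"));
    gateway.setOutputChannelName("updateResultChannel"); // receives the UpdateResult
    return gateway;
}
```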
UPDATE
but I don't know what parameter to put for the function to insert the Message payload.... Since there is no reference of the message in the ServiceActivator
Oh! I see. That's a bug: we can't get access to the message from that context. Please raise a JIRA on the matter: https://jira.spring.io/projects/INT/
Meanwhile, as a workaround, I suggest you write a custom POJO with an injected MongoOperations; there you can build any possible logic against the requestMessage.
The JIRA is here: https://jira.spring.io/browse/INT-4570

How to maintain SseEmitters list between multiple instances of a microservice?

Language: Spring Boot, JS
Overview: I am implementing server-sent events functionality in my application, which will be deployed in Cloud Foundry. Based on a new message in a queue (which I have subscribed to in my microservice), I will send an update to my client/browser (which is using EventSource).
For this, I am maintaining an SseEmitter list (holding all the active SseEmitters) on the server side. Once I receive a new message from the queue, based on the id (a field in the queue message), I emit the message to the corresponding client.
PROBLEM: How will the above scenario work when I scale my application by creating multiple instances of it? Since only one instance will receive the new queue message, the active SseEmitter may not be held by that particular instance. How do I solve this?
To solve this problem, the following approaches can be considered.
DNS concept
If you think about it, knowing where your user (SSE emitter) is, is like knowing where some website is. You can use a DNS-look-alike protocol to figure out where your user is. The protocol would be as follows:
When a user lands on any of your instances, associate the user with that instance. The association can be kept in an external component, e.g. Redis, or a distributed map solution like Hazelcast.
Whenever the user disconnects from SSE, remove the association. Sometimes a disconnect is not registered properly with Spring's SseEmitter, so the disassociation can also be done when sending a message fails.
Other parties (microservices) can easily query Redis/Hazelcast to figure out which instance the user is on.
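The association bookkeeping in the steps above can be sketched with a plain in-memory map standing in for Redis/Hazelcast (class and method names are made up for illustration; in production the map must live in the shared external store, not in per-instance memory):

```java
import java.util.Map;
import java.util.Optional;
import java.util.concurrent.ConcurrentHashMap;

/** Illustrative user-to-instance registry; a shared store would back this in production. */
public class EmitterRegistry {
    private final Map<String, String> userToInstance = new ConcurrentHashMap<>();

    /** Called when a user opens an SSE connection on an instance. */
    public void register(String userId, String instanceId) {
        userToInstance.put(userId, instanceId);
    }

    /** Called on disconnect, or when sending to the emitter fails. */
    public void deregister(String userId) {
        userToInstance.remove(userId);
    }

    /** Other services look up which instance holds the user's emitter. */
    public Optional<String> lookup(String userId) {
        return Optional.ofNullable(userToInstance.get(userId));
    }

    public static void main(String[] args) {
        EmitterRegistry registry = new EmitterRegistry();
        registry.register("user-42", "instance-A");
        System.out.println(registry.lookup("user-42").orElse("none")); // prints instance-A
        registry.deregister("user-42");
        System.out.println(registry.lookup("user-42").orElse("none")); // prints none
    }
}
```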
Message routing concept
If you're using messaging middleware for communication between your microservices, you can use the routing feature which the AMQP protocol provides. The protocol would be as follows:
Each SSE instance creates its own queue on boot.
A user lands on any of the SSE instances, and that instance adds an exchange-queue binding with routing key = user uid.
Whenever the user disconnects from SSE, remove the binding. Sometimes a disconnect is not registered properly with Spring's SseEmitter, so the unbinding can also be done when sending a message fails.
Other parties (microservices) just send a message to the exchange with the routing key set; the AMQP broker figures out which queue should receive the message based on the routing key.
Bindings are not resource-intensive on modern AMQP brokers like RabbitMQ.
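A rough Spring AMQP sketch of that protocol (the exchange/queue names and AmqpAdmin wiring are assumptions for illustration, not from the question):

```java
// Sketch only: each instance declares its own auto-delete queue on boot and
// binds/unbinds the user's uid as the routing key as users connect and disconnect.
private final String instanceQueue = "sse-instance-" + UUID.randomUUID();
private final DirectExchange sseExchange = new DirectExchange("sse-events");

@Autowired
private AmqpAdmin amqpAdmin;

@PostConstruct
public void declareInstanceQueue() {
    amqpAdmin.declareExchange(sseExchange);
    // durable=false, exclusive=true, autoDelete=true: the queue dies with the instance
    amqpAdmin.declareQueue(new Queue(instanceQueue, false, true, true));
}

public void onUserConnected(String userUid) {
    amqpAdmin.declareBinding(
            BindingBuilder.bind(new Queue(instanceQueue)).to(sseExchange).with(userUid));
}

public void onUserDisconnected(String userUid) {
    amqpAdmin.removeBinding(
            BindingBuilder.bind(new Queue(instanceQueue)).to(sseExchange).with(userUid));
}
```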
Your question is old, and if you didn't figure this out by now, I hope this helps.

Spring Integration SFTP - Getting configurations from XML

Let say I have these configurations in my xml,
<int-sftp:outbound-channel-adapter id="sftpOutbound"
channel="sftpChannel"
auto-create-directory="true"
remote-directory="/path/to/remote/directory/"
session-factory="cachingSessionFactory">
<int-sftp:request-handler-advice-chain>
<int:retry-advice />
</int-sftp:request-handler-advice-chain>
</int-sftp:outbound-channel-adapter>
How can I retrieve the attributes, e.g. remote-directory, in a Java class?
I tried context.getBean("sftpOutbound"), but it returns an EventDrivenConsumer, which doesn't have methods to get the configuration.
I'm using spring-integration-sftp v 4.0.0.
I am actually more concerned with why you want to access it. The remote directory and other attributes come with the headers of each message, so you will have access to them at the Message level, but not at the level of the EventDrivenConsumer, and that is by design; hence my question.

Configure activemq to be transactional

It's more of a conceptual question: I currently have a working ActiveMQ queue which is consumed by a Java Spring application. Now I want the queue not to permanently delete messages until the Java app tells it that the message has been correctly saved in the DB. After reading the documentation, I gather I have to do it transactionally and use the commit()/rollback() methods. Correct me if I'm wrong here.
My problem is that every example I find on the internet tells me to configure the app to work this or that way, but my nose tells me I should instead be setting up the queue itself to work the way I want. And I can't find a way to do that.
Otherwise, does the queue just work in different ways depending on how the consumer application is configured? What am I getting wrong?
Thanks in advance
The queue itself is not aware of any transactional system, but you can pass true as the first boolean parameter to create a transacted session. However, I propose INDIVIDUAL_ACKNOWLEDGE when creating a session, because it lets you manage messages one by one. It can be set on the Spring JMS DefaultMessageListenerContainer:
ActiveMQSession.INDIVIDUAL_ACKNOWLEDGE
Then call this method to acknowledge a message; until it is called, the message is considered dispatched but not acknowledged:
ActiveMQTextMessage.acknowledge();
UPDATE:
ActiveMQSession.INDIVIDUAL_ACKNOWLEDGE can be used like this:
public void onMessage(ActiveMQTextMessage message) {
    try {
        // do some work in the database
        jdbc.commit(); // unless auto-commit is enabled on the JDBC connection
        message.acknowledge();
    } catch (Exception e) {
        // not acknowledged: the message stays dispatched and can be redelivered
    }
}
There are 2 kinds of transaction support in ActiveMQ.
JMS transactions - the commit() / rollback() methods on a Session (which is like doing commit() / rollback() on a JDBC connection)
XA Transactions - where the XASession acts as an XAResource by communicating with the Message Broker, rather like a JDBC Connection takes place in an XA transaction by communicating with the database.
http://activemq.apache.org/how-do-transactions-work.html
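As an illustration of the first kind, a hedged sketch of a JMS local (session) transaction with plain JMS against ActiveMQ (the broker URL and queue name are placeholders):

```java
import javax.jms.*;
import org.apache.activemq.ActiveMQConnectionFactory;

public class TransactedConsumer {
    public static void main(String[] args) throws JMSException {
        ConnectionFactory cf = new ActiveMQConnectionFactory("tcp://localhost:61616");
        Connection connection = cf.createConnection();
        connection.start();
        // first argument 'true' makes the session transacted
        Session session = connection.createSession(true, Session.SESSION_TRANSACTED);
        MessageConsumer consumer = session.createConsumer(session.createQueue("orders"));
        Message message = consumer.receive(5000);
        try {
            // ... save to the database here ...
            session.commit();   // the message leaves the queue only now
        } catch (Exception e) {
            session.rollback(); // the message will be redelivered
        } finally {
            connection.close();
        }
    }
}
```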
Should I use XA transactions (two phase commit?)
A common use of JMS is to consume messages from a queue or topic, process them using a database or EJB, then acknowledge / commit the message.
If you are using more than one resource; e.g. reading a JMS message and writing to a database, you really should use XA - its purpose is to provide atomic transactions for multiple transactional resources. For example there is a small window from when you complete updating the database and your changes are committed up to the point at which you commit/acknowledge the message; if there is a network/hardware/process failure inside that window, the message will be redelivered and you may end up processing duplicates.
http://activemq.apache.org/should-i-use-xa.html
