Polling SQS using dropwizard - java

What I am trying to achieve:
I want to make a dropwizard client that polls Amazon SQS.
Whenever a message is found in the queue, it is processed and stored.
Some information about the processed messages will be available through an API.
Why I chose Dropwizard:
Seemed like a good choice to make a REST client. I need to have metrics, DB connections and integrate with some Java services.
What I need help with:
It is not clear to me how and where SQS polling fits into a typical Dropwizard application.
Should it be a managed resource? Or a console reporter? Or something else?

You can use com.google.common.util.concurrent.AbstractScheduledService to create a consumer thread and add it to Dropwizard's environment lifecycle as a ManagedTask. The following is pseudocode:
import static com.google.common.util.concurrent.AbstractScheduledService.Scheduler.newFixedRateSchedule;
import static java.util.concurrent.TimeUnit.SECONDS;

import com.google.common.util.concurrent.AbstractScheduledService;

public class YourSQSConsumer extends AbstractScheduledService {

    @Override
    protected void startUp() {
        // maybe print something
    }

    @Override
    protected void shutDown() {
        // maybe print something
    }

    @Override
    protected void runOneIteration() {
        // code to poll SQS
    }

    @Override
    protected Scheduler scheduler() {
        // first poll after 5 seconds, then every 1 second
        return newFixedRateSchedule(5, 1, SECONDS);
    }
}
In Main do this -
YourSQSConsumer consumer = new YourSQSConsumer();
Managed managedTask = new ManagedTask(consumer);
environment.lifecycle().manage(managedTask);
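If a ManagedTask wrapper is not available in your Dropwizard version, a minimal Managed adapter around the Guava service could look like the sketch below (assuming only Dropwizard's io.dropwizard.lifecycle.Managed interface and Guava's Service lifecycle methods):
import com.google.common.util.concurrent.AbstractScheduledService;
import io.dropwizard.lifecycle.Managed;

public class ManagedPoller implements Managed {

    private final AbstractScheduledService service;

    public ManagedPoller(AbstractScheduledService service) {
        this.service = service;
    }

    @Override
    public void start() {
        service.startAsync().awaitRunning();   // start the Guava scheduled service
    }

    @Override
    public void stop() {
        service.stopAsync().awaitTerminated(); // stop it cleanly on shutdown
    }
}
You would then register it the same way: environment.lifecycle().manage(new ManagedPoller(consumer));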

As an alternative to RishikeshDhokare's answer, you can also go with the following code, which does not need an additional jar as a dependency, keeping the uber jar as lightweight as possible.
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

import io.dropwizard.lifecycle.Managed;

public class SQSPoller implements Managed, Runnable {

    private ScheduledExecutorService mainRunner;

    @Override
    public void start() throws Exception {
        mainRunner = Executors.newSingleThreadScheduledExecutor();
        mainRunner.scheduleWithFixedDelay(this, 0, 100, TimeUnit.MILLISECONDS);
    }

    @Override
    public void run() {
        // poll SQS here
    }

    @Override
    public void stop() throws Exception {
        mainRunner.shutdown();
    }
}
And in the run() of your Application class, you can register the above class as follows.
environment.lifecycle().manage(new SQSPoller());
You can use either scheduleWithFixedDelay() or scheduleAtFixedRate() depending upon your use case.
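For reference, the body of run() could poll SQS roughly like the pollOnce() method below. This is only a sketch, assuming the AWS SDK for Java v1 (aws-java-sdk-sqs); the queue URL is a hypothetical placeholder and the System.out call stands in for your real processing and storage step.
import com.amazonaws.services.sqs.AmazonSQS;
import com.amazonaws.services.sqs.AmazonSQSClientBuilder;
import com.amazonaws.services.sqs.model.Message;
import com.amazonaws.services.sqs.model.ReceiveMessageRequest;

public class SqsPollOnce {

    private final AmazonSQS sqs = AmazonSQSClientBuilder.defaultClient();
    private final String queueUrl = "https://sqs.us-east-1.amazonaws.com/123456789012/my-queue"; // hypothetical

    public void pollOnce() {
        ReceiveMessageRequest request = new ReceiveMessageRequest(queueUrl)
                .withMaxNumberOfMessages(10)  // receive up to 10 messages per call
                .withWaitTimeSeconds(20);     // long polling to cut down on empty receives
        for (Message message : sqs.receiveMessage(request).getMessages()) {
            System.out.println("Processing " + message.getBody());   // stand-in for real processing/storage
            sqs.deleteMessage(queueUrl, message.getReceiptHandle()); // delete only after successful handling
        }
    }
}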

Related

How to isolate 2 schedulers?

I have a Java application that runs several schedulers to get data from and provide data to an external application. I will have to add another scheduler to get data from a second external application. It would follow the exact same process as one of the schedulers that already exists for the first application.
However, I have little confidence in the formatting of the data from this second application: I know they do fewer verifications than the first application, and I might receive malformed data. I will obviously put plenty of null/bad-format checks on my side, but I have to make sure that if they ever send me bad data, it doesn't impact my other schedulers.
So roughly it would be something like this:
@EnableScheduling
public class myApp {

    @Scheduled(fixedRate = 1000)
    public void externalApp1() {
        // do stuff...
        commonMethod();
    }

    @Scheduled(fixedRate = 1000)
    public void externalApp2() {
        // do stuff...
        commonMethod();
    }

    public void commonMethod() {
        // do stuff...
    }
}
One of my first ideas was to give each scheduler its own thread, so that if the second application sends bad data and that ends up killing its thread for whatever reason, it only impacts that application's own process and not the schedulers for the first external application. I have done this for now based on what I found; I suppose this should work as intended:
@Configuration
@EnableAsync
@EnableScheduling
public class MyApp {

    @Scheduled(fixedRate = 1000)
    @Async(value = "threadPool1")
    public void externalApp1() {
        // do stuff...
        commonMethod();
    }

    @Scheduled(fixedRate = 1000)
    @Async(value = "threadPool2")
    public void externalApp2() {
        // do stuff...
        commonMethod();
    }

    public void commonMethod() {
        // do stuff...
    }

    @Bean
    public Executor threadPool1() {
        return Executors.newFixedThreadPool(1);
    }

    @Bean
    public Executor threadPool2() {
        return Executors.newFixedThreadPool(1);
    }
}
(actual code would have the beans properly separated from the main class)
But I am wondering if there is any other way to fully ensure the processes are totally independent from one another?
EDIT: To clarify, the data I get from the second application is not used in any process of the first application. It has a process of its own, and no data is shared between these two external applications.
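One extra safeguard worth sketching (not from the question, just a common defensive pattern): catch exceptions at the boundary of each scheduled method, so a malformed payload from the second application can only skip that one iteration. The class name and the SLF4J logger here are assumptions for illustration.
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.scheduling.annotation.Async;
import org.springframework.scheduling.annotation.Scheduled;

public class ExternalApp2Scheduler {

    private static final Logger log = LoggerFactory.getLogger(ExternalApp2Scheduler.class);

    @Scheduled(fixedRate = 1000)
    @Async(value = "threadPool2")
    public void externalApp2() {
        try {
            // fetch, parse and validate the data from the second application
            commonMethod();
        } catch (RuntimeException e) {
            // log and swallow: a malformed payload only skips this iteration
            log.warn("externalApp2 iteration failed", e);
        }
    }

    private void commonMethod() {
        // do stuff...
    }
}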

How to keep apache camel context alive in thread main

I'm trying to make a simple application that will listen to one queue from Artemis, process the messages, and then create a new message in a second queue.
I have created the Camel context in the main method and added a route (it forwards messages to a bean). To test that this route and bean work correctly, I'm sending a few messages to this queue right after the context is started in the main thread:
public static void main(String args[]) throws Exception {
    CamelContext context = new DefaultCamelContext();
    ConnectionFactory connectionFactory = new ActiveMQConnectionFactory("tcp://localhost:61616", "admin", "admin");
    context.addComponent("cmp/q2", JmsComponent.jmsComponentAutoAcknowledge(connectionFactory));
    context.addRoutes(new RouteBuilder() {
        public void configure() {
            from("cmp/q2:cmp/q2").bean(DataRequestor.class, "doSmth(${body}, ${headers})");
        }
    });
    ProducerTemplate template = context.createProducerTemplate();
    context.start();
    for (int i = 0; i < 2; i++) {
        HashMap<String, Object> headers = new HashMap<String, Object>();
        headers.put("header1", "some header info");
        template.sendBodyAndHeaders("cmp/q2:cmp/q2", "Test Message: " + i, headers);
    }
    context.stop();
}
In this case the application works fine, but it stops as soon as the main method completes: it only processes the messages it created itself.
Now that I have tested the bean used in the route, I want to modify the application so that it starts and stays active (keeping the Camel context and routes alive), so that I can create messages manually in the web UI (the ActiveMQ management console).
But I really don't know how.
I have tried an infinite loop with Thread.sleep(5000);
I tried to start one more thread (also with an infinite loop) in the main method.
But it didn't work. (The most suspicious part of the infinite-loop case is that the application is running, but when I create a message in the web UI it just disappears, with no trace in System.out that it was processed by my bean in the route. I would expect it to be processed by my bean or to stay in the queue untouched, but it just disappears.)
I know my question is basic, but I have already wasted 3 days looking for a solution, so any advice, links to tutorials, or other useful information would be appreciated.
PS: I've got one painful restriction: Spring frameworks are not allowed.
I think the simplest solution for running standalone Camel is to start it with Camel's Main class. The Camel online documentation also has an example of using it: http://camel.apache.org/running-camel-standalone-and-have-it-keep-running.html.
I will copy-paste the example code here just in case:
public class MainExample {

    private Main main;

    public static void main(String[] args) throws Exception {
        MainExample example = new MainExample();
        example.boot();
    }

    public void boot() throws Exception {
        // create a Main instance
        main = new Main();
        // bind MyBean into the registry
        main.bind("foo", new MyBean());
        // add routes
        main.addRouteBuilder(new MyRouteBuilder());
        // add event listener
        main.addMainListener(new Events());
        // set the properties from a file
        main.setPropertyPlaceholderLocations("example.properties");
        // run until you terminate the JVM
        System.out.println("Starting Camel. Use ctrl + c to terminate the JVM.\n");
        main.run();
    }

    private static class MyRouteBuilder extends RouteBuilder {
        @Override
        public void configure() throws Exception {
            from("timer:foo?delay={{millisecs}}")
                .process(new Processor() {
                    public void process(Exchange exchange) throws Exception {
                        System.out.println("Invoked timer at " + new Date());
                    }
                })
                .bean("foo");
        }
    }

    public static class MyBean {
        public void callMe() {
            System.out.println("MyBean.callMe method has been called");
        }
    }

    public static class Events extends MainListenerSupport {
        @Override
        public void afterStart(MainSupport main) {
            System.out.println("MainExample with Camel is now started!");
        }

        @Override
        public void beforeStop(MainSupport main) {
            System.out.println("MainExample with Camel is now being stopped!");
        }
    }
}
The route keeps executing until you hit Ctrl+C or stop it in some other way...
If you test this, notice that you need an example.properties file on your classpath, with the property millisecs.
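For instance, a minimal example.properties could contain nothing more than the property referenced by the {{millisecs}} placeholder (the value here is just an illustrative delay):
millisecs=5000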
At the very minimum you need a main thread that kicks off a thread to run the Camel route and then waits for that thread to finish. The simple Java threading approach of having the main loop .wait() and the end of the Camel route thread .notify() when it finishes (or shuts down) would get the job done.
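A minimal sketch of that wait/notify idea, assuming a plain DefaultCamelContext as in the question (component and route setup omitted; here a shutdown hook does the notify):
import org.apache.camel.CamelContext;
import org.apache.camel.impl.DefaultCamelContext;

public class KeepAlive {

    private static final Object LOCK = new Object();

    public static void main(String[] args) throws Exception {
        CamelContext context = new DefaultCamelContext();
        // add your components and routes here, as in the question
        context.start();

        // On ctrl + c (or any JVM shutdown), stop Camel and wake the main thread up.
        Runtime.getRuntime().addShutdownHook(new Thread(() -> {
            try {
                context.stop();
            } catch (Exception e) {
                e.printStackTrace();
            }
            synchronized (LOCK) {
                LOCK.notifyAll();
            }
        }));

        // Block the main thread; the routes keep consuming in the background.
        synchronized (LOCK) {
            LOCK.wait();
        }
    }
}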
From there you can look into an executor service or use a micro-container like Apache Karaf
PS. Props for going Spring-free!
Disclaimer: this is written in Kotlin but it is fairly trivial to port to Java
Disclaimer: this is written for Apache Camel 2.24.2
Disclaimer: I am also learning about Apache Camel. The docs are a little heavy for me
I tried the Main route to set it up but it quickly got a little convoluted. I know that this is a Java question but I'm using Kotlin at the moment; I'll leave most of the types and imports in so it's easier for Java devs.
class Listener
The first thing I had to fight with was understanding the lifecycle of Main. It turns out that there is an interface you can implement to plug in implementations of such lifecycle events. With such an implementation I can hook up any routines that have to be sure that Camel has started (no guessing required).
import org.apache.camel.CamelContext
import org.apache.camel.main.MainListener
import org.apache.camel.main.MainSupport
typealias Action = () -> Unit
class Listener : MainListener {

    private var afterStart: Action? = null

    fun registerOnStart(action: Action) {
        afterStart = action
    }

    override fun configure(context: CamelContext) {}

    override fun afterStop(main: MainSupport?) {}

    override fun afterStart(main: MainSupport?) {
        println("started!")
        afterStart?.also { action ->
            action()
            println("Launched the registered function")
        } ?: println("Nothing registered to start")
    }

    override fun beforeStop(main: MainSupport?) {}

    override fun beforeStart(main: MainSupport?) {}
}
class ApplicationCore
Then I set up the configuration of the context (routes, components, etc.):
import org.apache.camel.CamelContext
import org.apache.camel.impl.DefaultCamelContext
import org.apache.camel.impl.SimpleRegistry
import org.apache.camel.main.Main
class ApplicationCore : Runnable {

    private val main = Main()
    private val registry = SimpleRegistry()
    private val context = DefaultCamelContext(registry)
    private val listener = Listener() // defined above

    // for Java devs: this is more or less a constructor block
    init {
        main.camelContexts.clear()
        listener.registerOnStart({ whateverYouAreDoing().start() }) // <- your stuff should run in its own thread because main will be blocked
        main.camelContexts.add(context)
        main.duration = -1
        context.addComponent("artemis", ...) // <- you need to implement your own
        context.addRoutes(...) // <- you already know how to do this
        ... // <- anything else you could need to initialize
        main.addMainListener(listener)
    }

    override fun run() {
        /* ... add whatever else you need ... */
        // The next line blocks the thread until you close it
        main.run()
    }

    fun whateverYouAreDoing(): Thread {
        return Thread {
            val template = context.createProducerTemplate()
            for (i in 0..1) {
                val headers = HashMap<String, Any>()
                headers["header1"] = "some header info"
                template.sendBodyAndHeaders("cmp/q2:cmp/q2", "Test Message: $i", headers)
            }
            context.stop() // <- this is not good practice here but it's what you seem to want
        }
    }
}
In Kotlin, initialization is rather easy. You can easily translate this into Java because it is quite straightforward:
// top level declaration
fun main(args: Array<String>) = ApplicationCore().run()
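For Java developers, a rough equivalent of that entry point might be the following sketch (assuming the ApplicationCore class above is on the classpath):
public class EntryPoint {
    public static void main(String[] args) {
        // ApplicationCore.run() blocks until Camel's Main is terminated
        new ApplicationCore().run();
    }
}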

Reactor Eventbus example which gives back a response or an error

First things first: I am pretty new to the domain of asynchronous processing. In my current project we are using Spring Boot along with Project Reactor, specifically the EventBus, to do some asynchronous processing. Using the EventBus should, I guess, also make our system more scalable.
So far, the use of the EventBus has been pretty limited: we do some processing in an EventBus consumer which does not return anything. The configuration and an example processor are as follows:
//Config File
@SpringBootApplication
public class Application implements CommandLineRunner {

    @Autowired
    private EventBus eventBus;

    @Autowired
    private BatchProcessor batchProcessor;

    @Override
    public void run(String... arg0) throws Exception {
        eventBus.on("batchProcessor", batchProcessor);
    }
}

//Consumer
@Service
public class BatchProcessor implements Consumer<Event<Request>> {

    @Override
    public void accept(Event<Request> event) {
        // processing goes here
    }
}
Until now this was fine, with the accept method having a void return type. But now I have a scenario where I would like to return a response from the processor method, or, if an error occurs while processing, throw an appropriate exception; in either case the response/exception needs to be returned to the point of invocation.
Can this be done using Reactor? If yes, please provide a simple example. I have read about Promise but cannot find an example similar to my case.
Did you try sendAndReceive? http://projectreactor.io/ext/docs/reference/#bus-request-reply
// $ below is the static import reactor.bus.selector.Selectors.$
EventBus bus;

bus.receive($("job.sink"), (Event<String> ev) -> {
    return ev.getData().toUpperCase();
});

bus.sendAndReceive(
    "job.sink",
    Event.wrap("Hello World!"),
    s -> System.out.printf("Got %s on thread %s%n", s, Thread.currentThread())
);
You can easily register another consumer on the caller side that is notified when the service responds.
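A rough sketch of that caller-side pattern, assuming reactor-bus 2.x (the selector names are illustrative, and $ is again the static import reactor.bus.selector.Selectors.$):
// register a reply handler on the caller side
bus.on($("reply.sink"), (Event<String> ev) ->
    System.out.printf("Got reply %s on thread %s%n", ev.getData(), Thread.currentThread()));

// the service computes a reply from the request
bus.receive($("job.sink"), (Event<String> ev) -> ev.getData().toUpperCase());

// send with an explicit replyTo key; the value returned by receive() is routed there
bus.send("job.sink", Event.wrap("Hello World!", "reply.sink"));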

Was asynchronous jobs removed from the Play framework? What is a better alternative?

I wanted to use Job so I could kick jobs off at the start of the application. Now it seems like it has been removed from Play completely?
I saw some samples where people create a Global class, but I'm not entirely sure if/how I should use that to replace Job.
Any suggestions?
Edit: If you're going to downvote, give a reason. Maybe I'm missing something in the question, maybe this doesn't belong here. At least say something...
The Job class was removed in Play 2.0.
You have some alternatives though depending on your Play version and if you need asynchrony or not:
Akka Actors
For all versions since Play 2.0 you can use Akka actors to schedule an asynchronous task/actor once and execute it on startup via Play's Global class.
public class Global extends GlobalSettings {

    @Override
    public void onStart(Application app) {
        Akka.system().scheduler().scheduleOnce(
            Duration.create(10, TimeUnit.MILLISECONDS),
            new Runnable() {
                public void run() {
                    // Do startup stuff here
                    initializationTask();
                }
            },
            Akka.system().dispatcher()
        );
    }
}
See https://www.playframework.com/documentation/2.3.x/JavaAkka for details.
Eager Singletons
Starting with Play 2.4 you can eagerly bind singletons with Guice
import com.google.inject.AbstractModule;
import com.google.inject.name.Names;

public class StartupConfigurationModule extends AbstractModule {

    protected void configure() {
        bind(StartupConfiguration.class)
            .to(StartupConfigurationImpl.class)
            .asEagerSingleton();
    }
}
The StartupConfigurationImpl would have its work done in the default constructor.
@Singleton
public class StartupConfigurationImpl implements StartupConfiguration {

    @Inject
    private Logger log;

    public StartupConfigurationImpl() {
        init();
    }

    public void init() {
        log.info("init");
    }
}
See https://www.playframework.com/documentation/2.4.x/JavaDependencyInjection#Eager-bindings

how to re-initialize java servlet on text file change

I have a servlet that pulls data from a text file during initialization.
Now I am updating that text file with a cron job (say, every day at 10am) and want to reinitialize the servlet every time this particular file changes.
A second approach I could follow is to sync the reinitialization of the servlet with my cron job.
Kindly suggest how to go about implementing either of the above two approaches.
Thanks.
Don't hold the data as an instance variable of the servlet. Create a ServletContextListener which stores it in the application scope and runs a thread which updates it at every interval with the help of a ScheduledExecutorService.
E.g.
@WebListener
public class Config implements ServletContextListener {

    private ScheduledExecutorService scheduler;

    @Override
    public void contextInitialized(ServletContextEvent event) {
        Data data = new Data(); // Your class which reads and holds data upon construction.
        event.getServletContext().setAttribute("data", data);
        scheduler = Executors.newSingleThreadScheduledExecutor();
        scheduler.scheduleAtFixedRate(new Reloader(data), 0, 1, TimeUnit.DAYS);
    }

    @Override
    public void contextDestroyed(ServletContextEvent event) {
        scheduler.shutdownNow();
    }
}
with this runnable
public class Reloader implements Runnable {

    private Data data;

    public Reloader(Data data) {
        this.data = data;
    }

    @Override
    public void run() {
        data.reload();
    }
}
The data is then accessible in any servlet.
@Override
protected void doGet(HttpServletRequest request, HttpServletResponse response) {
    Data data = (Data) getServletContext().getAttribute("data");
    // ...
}
And even in a random JSP.
${data.something}
Have your servlet occasionally check the file for changes with a timer.
Googling "Java monitor file for changes" will present many examples, one of which you can find here: http://www.devdaily.com/java/jwarehouse/jforum/src/net/jforum/util/FileMonitor.java.shtml
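A minimal sketch of that timer-based approach (the file path, the check interval, and the reload callback are placeholders for your own servlet's logic):
import java.io.File;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class FileChangeWatcher {

    private final File file = new File("/path/to/data.txt"); // hypothetical path
    private final ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
    private volatile long lastModified = 0L;

    public void start(Runnable reload) {
        scheduler.scheduleWithFixedDelay(() -> {
            long current = file.lastModified();
            if (current != lastModified) { // fires once at startup, then whenever the cron job rewrites the file
                lastModified = current;
                reload.run();              // re-read the data
            }
        }, 0, 1, TimeUnit.MINUTES);
    }

    public void stop() {
        scheduler.shutdownNow();
    }
}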
