Spring Batch jobs from a web UI? - java

Has anyone worked with, or had any experience of, executing Spring Batch jobs from a web UI? I have written a few jobs that copy data from CSV files to DB tables; they run fine from the command prompt and in a JUnit test. But now these jobs have to be executed through the web, with JSF being used as the front-controller framework. Any suggestions about best practices in this case would be very helpful.
Thanks!

Spring Batch Admin is a deployable web frontend for your Spring Batch jobs. If all you want is a simple UI for administrators instead of a shell script, take this approach:
http://static.springsource.org/spring-batch-admin/getting-started.html
If you are looking to integrate the job-trigger mechanism with your existing application, look at this approach using Spring's JobLauncher, which can be invoked from a controller/servlet:
http://docs.spring.io/spring-batch/trunk/reference/html/configureJob.html#runningJobsFromWebContainer
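As a minimal sketch of that second approach, a Spring MVC controller can inject a JobLauncher and a Job and start the job per request. The job bean name (importCsvJob), the mapping, and the parameter names below are illustrative assumptions, not taken from the question:

import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobExecution;
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Controller;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.ResponseBody;

@Controller
public class JobLaunchController {

    @Autowired
    private JobLauncher jobLauncher;

    @Autowired
    private Job importCsvJob; // hypothetical name for one of the CSV-to-DB jobs

    @RequestMapping("/launch")
    @ResponseBody
    public String launch(@RequestParam("fileName") String fileName) throws Exception {
        // Unique parameters so the same job can be started more than once
        JobParameters params = new JobParametersBuilder()
                .addString("input.file", fileName)
                .addLong("run.id", System.currentTimeMillis())
                .toJobParameters();
        JobExecution execution = jobLauncher.run(importCsvJob, params);
        return "Job status: " + execution.getStatus();
    }
}

Note that the default SimpleJobLauncher runs the job synchronously in the request thread; for long-running jobs the launcher is usually configured with an asynchronous TaskExecutor so the controller can return immediately, as the linked reference section describes.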

Related

Cron job from application or external server

I am working on an application that is deployed in PCF, and we are using a caching mechanism. We want to implement a batch job that will purge the data from the cache region.
I would like some suggestions on the following:
should I include this batch job in the same application, so that it uses the same server to run the batch jobs?
or should I create a new server for running these batch jobs?
I mainly want to understand how the performance of the current application would be impacted if we ran the batch job on the same server, and what the advantages and disadvantages of each option are.
TIA

Spring Batch without Spring Cloud Data Flow

I have a Spring Boot application that uses Spring Batch. I now want to implement an admin panel to see all job statuses. For this, Spring has "spring-batch-admin", but I see that it was deprecated a long time ago:
The functionality of Spring Batch Admin has been mostly duplicated and expanded upon via Spring Cloud Data Flow and we encourage all users to migrate to that going forward.
But then Spring Cloud Data Flow says:
Pipelines consist of Spring Boot apps, built using the Spring Cloud Stream or Spring Cloud Task microservice frameworks
So in order to use this functionality, do I really need to convert my Spring Boot app into a microservice? Isn't this overkill just to see some batch statuses? Also, I cannot install Docker on my production server (for various reasons). Can I still use Spring Cloud Data Flow without Docker?
Yes, your Spring Boot batch application should be wrapped as a Spring Cloud Task, which should not be too complicated.
If Docker does not suit your needs, Spring Cloud Data Flow can also be deployed locally without it: https://docs.spring.io/spring-cloud-dataflow/docs/current/reference/htmlsingle/#getting-started-local-deploying-spring-cloud-dataflow
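A minimal sketch of the wrapping step, assuming the spring-cloud-starter-task dependency is on the classpath; the existing job beans stay unchanged:

import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.task.configuration.EnableTask;

// @EnableTask records each run in the task repository so that
// Spring Cloud Data Flow can track and display the executions.
@EnableTask
@EnableBatchProcessing
@SpringBootApplication
public class BatchTaskApplication {
    public static void main(String[] args) {
        SpringApplication.run(BatchTaskApplication.class, args);
    }
}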

Overhead in Spring Batch

I have a RESTful microservice that uses Spring Boot and Spring Batch. When I run it locally, it uses an H2 database. Sending requests to the microservice, I see that Spring Batch serializes the context of the job. Using a profiling tool, I can see that the microservice spends too much time in this serialization.
Table used: BATCH_JOB_EXECUTION_CONTEXT.
Call graph: (profiler screenshot not preserved)
Do you have any idea how the serialization of context can be disabled?
Thank you
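For what it's worth, the serializer that writes BATCH_JOB_EXECUTION_CONTEXT is pluggable on the JobRepository rather than removable; a minimal sketch of where it is configured, assuming the repository is built with JobRepositoryFactoryBean (the transaction manager below is for demonstration only):

import javax.sql.DataSource;
import org.springframework.batch.core.repository.JobRepository;
import org.springframework.batch.core.repository.dao.Jackson2ExecutionContextStringSerializer;
import org.springframework.batch.core.repository.support.JobRepositoryFactoryBean;
import org.springframework.batch.support.transaction.ResourcelessTransactionManager;

public class JobRepositoryConfig {
    public JobRepository jobRepository(DataSource dataSource) throws Exception {
        JobRepositoryFactoryBean factory = new JobRepositoryFactoryBean();
        factory.setDataSource(dataSource);
        factory.setTransactionManager(new ResourcelessTransactionManager()); // demo only
        // The serializer can be swapped, not switched off: the JobRepository
        // always persists the execution context through it. Keeping the
        // context small is the main lever against serialization overhead.
        factory.setSerializer(new Jackson2ExecutionContextStringSerializer());
        factory.afterPropertiesSet();
        return factory.getObject();
    }
}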

Managing Quartz jobs from a different web-based application

I have a core Java application which uses Quartz 2.2.1 with a JDBC job store. All the jobs are scheduled in that application.
I am building another Spring-based application using AppFuse, Maven, and Quartz.
I want to reschedule the jobs running in the former application from the Spring application.
While doing that, I get a ClassNotFoundException because I have not added the job classes to the classpath. If I add them, I am able to update the jobs.
Is there any way to manage the jobs from the Spring application without adding the job classes to the classpath?
I do not want to update the Quartz database using JDBC or Hibernate.
Yes, that is a known limitation of the Quartz remote API. It becomes very painful if you have to communicate with or manage multiple Quartz scheduler versions remotely, and even more painful when the management application uses the Quartz API internally (which seems to be your case).
If you take a look at the QuartzDesk project that I founded, you will find that it solves the problem very elegantly by exposing a JAX-WS SOAP interface through which you can communicate with and manage external Quartz scheduler instances. It hides all Quartz scheduler API complexities and version differences behind a simple Quartz-like API.
The JAX-WS interface is described here, and the relevant WSDL file is also available for download.
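To make the limitation concrete: with Quartz's built-in RMI remote API, a client proxy is easy to obtain, but any call that deserializes a JobDetail pulls the job class over to the client. A rough sketch with placeholder host, port, and job key (not taken from the answer above):

import java.util.Properties;
import org.quartz.JobKey;
import org.quartz.Scheduler;
import org.quartz.SchedulerException;
import org.quartz.impl.StdSchedulerFactory;

public class RemoteQuartzClient {
    public static void main(String[] args) throws SchedulerException {
        Properties props = new Properties();
        // instanceName must match the name of the remote scheduler
        props.setProperty("org.quartz.scheduler.instanceName", "RemoteScheduler");
        props.setProperty("org.quartz.scheduler.rmi.proxy", "true");
        props.setProperty("org.quartz.scheduler.rmi.registryHost", "scheduler-host"); // placeholder
        props.setProperty("org.quartz.scheduler.rmi.registryPort", "1099");

        Scheduler scheduler = new StdSchedulerFactory(props).getScheduler();
        // Works without the job classes: only the serializable JobKey travels.
        scheduler.triggerJob(JobKey.jobKey("myJob", "myGroup")); // hypothetical key
        // Fails with ClassNotFoundException unless the job class is local,
        // because the returned JobDetail references the job's Class object:
        // scheduler.getJobDetail(JobKey.jobKey("myJob", "myGroup"));
    }
}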

Execute hadoop jar through Spring MVC using process command

I am new to Java and currently working on a project where a Hadoop job needs to be triggered from a Spring MVC application. The manager asked me to use a "process", about which I have no clue. I have written a shell script to trigger the job, but the client wants it to be triggered directly from the Spring MVC app so that the log can be written to the local file system.
Can anyone help me trigger a Hadoop jar (more specifically, a YARN command with different arguments) on an edge node through a Java process?
You can try using ProcessBuilder.
http://docs.oracle.com/javase/7/docs/api/java/lang/ProcessBuilder.html
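A minimal sketch of that suggestion; the yarn binary, jar path, main class, and arguments are placeholders to adapt, and the output is redirected to a local log file as the client requested:

import java.io.File;
import java.io.IOException;

public class YarnJobLauncher {
    public static void main(String[] args) throws IOException, InterruptedException {
        ProcessBuilder pb = new ProcessBuilder(
                "yarn", "jar", "/path/to/job.jar",  // placeholder jar path
                "com.example.JobDriver",            // hypothetical main class
                "arg1", "arg2");                    // job-specific arguments
        pb.redirectErrorStream(true);               // merge stderr into stdout
        pb.redirectOutput(new File("/var/log/hadoop-job.log")); // local log file
        Process process = pb.start();
        int exitCode = process.waitFor();           // block until the command exits
        System.out.println("yarn exited with code " + exitCode);
    }
}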
