My Spring Boot application, implemented using CommandLineRunner (meaning it is not an API), is executed with java -jar and a bunch of command-line arguments. So far we have been running the application manually on AWS EC2, and now, in an attempt to automate its execution, we have started a PoC using Lambda. Lambda was chosen because the application must be triggered by an SNS event (which fires on a file-upload event in S3).
I have configured a Lambda function with the Java runtime and subscribed it to the SNS topic. The Lambda is triggered successfully and invokes my application JAR, which is uploaded to the Lambda function through S3.
The application's first step is to download the file from S3, so I am implementing the LambdaHandler class against the S3 event, as shown below.
public class LambdaHandler implements RequestHandler<S3Event, String> {
    @Override
    public String handleRequest(S3Event event, Context context) {
        // Fetch the S3 object name here; the application's processing logic needs it further on
        return null;
    }
}
I am unable to figure out how to initialize the Spring Boot batch application, which is implemented using CommandLineRunner, from the handler. Is this even possible?
Or would you recommend an alternate approach (Jenkins connecting to EC2 and running a bash script that downloads the file from S3, then wgets the JAR file from Artifactory and runs java -jar to execute it)?
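For illustration, here is a minimal sketch of how the hand-off could look, under the assumption that the batch JAR is repackaged with a handler class. The real handler would implement RequestHandler<S3Event, String> from aws-lambda-java-events and bootstrap the app with SpringApplication.run(...), which invokes every CommandLineRunner bean with the arguments you pass. So those AWS and Spring dependencies are not needed here, the event is stubbed as a plain record, and the bucket/key names are hypothetical:

```java
public class LambdaHandlerSketch {
    // Stand-in for the real S3 event payload (hypothetical type, for illustration only).
    record S3Record(String bucket, String key) {}

    // In the real handler this logic would live in handleRequest(S3Event, Context).
    static String handleRequest(S3Record event) {
        // Turn the S3 object location into the same command-line arguments
        // the app already accepts when launched with java -jar:
        String[] args = { "--bucket=" + event.bucket(), "--key=" + event.key() };
        // Real code would then call:
        //   SpringApplication.run(BatchApplication.class, args);
        // which runs every CommandLineRunner bean with these args.
        return String.join(" ", args);
    }

    public static void main(String[] ignored) {
        System.out.println(handleRequest(new S3Record("my-bucket", "input/file.csv")));
    }
}
```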
Related
I have two Azure Functions linked together. I want to test them locally using Rest Assured, but to do this I have to run them manually every time before the tests. Is there any way to automate this step, so that when the tests start, the service starts automatically?
You would have to include a step that starts your application (function app) using the Azure Functions Core Tools CLI. If you do not have it already, you have to set it up: https://learn.microsoft.com/en-us/azure/azure-functions/functions-run-local?tabs=v4%2Cwindows%2Ccsharp%2Cportal%2Cbash%2Ckeda.
To start your application from the command line you can use func host start. Posting the documentation for reference: https://learn.microsoft.com/en-us/azure/azure-functions/functions-run-local?tabs=v4%2Cwindows%2Ccsharp%2Cportal%2Cbash%2Ckeda#start.
We build a package that contains around 13 Amazon Lambda handlers [Java].
The package is built using Maven. When the functions are deployed manually, we might upload the same JAR (which includes the dependencies and the handlers) multiple times, specifying just a different handler name. The functions share dependencies, so this makes sense for us.
Can we run the AWS CLI in parallel for faster deployment?
Yes - you should consider writing a bash or python deployment script that calls the AWS CLI upload commands in parallel. Then, you should be able to deploy all thirteen of your Lambda handlers at the same time.
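If you would rather stay in Java than write a bash or Python script, the same fan-out can be sketched with an ExecutorService. This is only a sketch: the handler names below are hypothetical, and `echo` stands in for the real `aws lambda update-function-code` call so it runs anywhere:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class ParallelDeploy {
    // Launches one CLI call per handler in parallel and waits for all of them.
    // Returns the number of successful (exit code 0) deployments.
    static int deployAll(List<String> handlers) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(handlers.size());
        List<Future<Integer>> results = new ArrayList<>();
        for (String h : handlers) {
            // `echo` stands in for the real call, e.g.:
            //   aws lambda update-function-code --function-name <h> --zip-file fileb://app.jar
            Callable<Integer> task = () ->
                new ProcessBuilder("echo", "deploying", h).inheritIO().start().waitFor();
            results.add(pool.submit(task));
        }
        int ok = 0;
        for (Future<Integer> r : results) {
            if (r.get() == 0) ok++;   // get() blocks until that deploy finishes
        }
        pool.shutdown();
        return ok;
    }

    public static void main(String[] args) throws Exception {
        // Hypothetical handler names standing in for the 13 real ones.
        int ok = deployAll(List.of("HandlerA", "HandlerB", "HandlerC"));
        System.out.println(ok + " deploys finished");
    }
}
```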
Perhaps you could upload the JAR to S3 once, and then have a Lambda function written in JavaScript, triggered by that upload, do async code deployments from S3 to your Lambda functions. For example, see the Lambda Auto-Deployer in New Deployment Options for AWS Lambda.
I wrote a Java program and packaged it into an executable JAR file.
What I am looking for now is a way to schedule this JAR file to run daily using some sort of online cron-job service.
I was considering using Amazon AWS. Can it achieve what I want? If yes, what service exactly should I use, and what steps should I follow? If not, what are the alternatives?
I currently host a Java project on an Amazon EC2 instance. You can select the server instance you would like to use, e.g. Ubuntu, Windows Server, etc. Once this is complete, you must configure your security settings so you can connect to your EC2 instance, and then transfer your JAR file to the server using scp or another file-transfer service. I have a repository in an Amazon S3 bucket, and it is very easy to transfer files from S3 to an EC2 instance via the "s3cmd" command (note: I am using Ubuntu for my EC2 server instance). Once the JAR file is hosted on the server, all you need to do is create the cron job, and it will run as scheduled as long as your EC2 instance is running.
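For reference, the scheduling step itself is a single crontab entry (open the table with `crontab -e`; the paths below are hypothetical):

```
# min hour day month weekday: run the JAR daily at 02:30, appending output to a log
30 2 * * * /usr/bin/java -jar /home/ubuntu/app.jar >> /home/ubuntu/app.log 2>&1
```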
I have a Rails app that talks to an API running on the same domain via Ajax calls. I want to test this app using Cucumber. The API is written in Java and packaged as a JAR. How can I mount the JAR when using Cucumber?
There is no way to do it automatically, but you can add a Before hook to env.rb (or put it in a separate file) and, in that hook, start your Java service by issuing a shell command. You can store the process PID in a variable and kill the process in an After hook. You can configure Capybara to start the server on a specific port, and I think you can configure your application to use a specific port too.
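As a sketch of that hook pattern: in env.rb these bodies would sit inside Cucumber's `Before do ... end` and `After do ... end` hooks. The wrappers are shown as comments so the sketch runs standalone, `sleep` stands in for the real `java -jar` command, and the jar name and port are hypothetical:

```ruby
# Before hook body: start the API and remember its pid.
pid = spawn("sleep", "30")  # real code: spawn("java", "-jar", "your_api.jar", "--server.port=9292")

# ... the Cucumber scenarios would run here, hitting the API on its port ...

# After hook body: stop the API and reap the process.
Process.kill("TERM", pid)
Process.wait(pid)
puts "api process stopped"
```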
I am new to Java and currently working on a project where a Hadoop job needs to be triggered from a Spring MVC application. The manager asked me to use a "process", which I have no clue about. I have written a shell script to trigger the job, but the client wants it triggered directly from the Spring MVC app so that the log can be written to the local file system.
Can anyone help me with how to trigger a Hadoop JAR (more specifically, a YARN command with different arguments) on the edge node through a Java Process?
You can try using ProcessBuilder.
http://docs.oracle.com/javase/7/docs/api/java/lang/ProcessBuilder.html
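A minimal sketch of that approach, assuming the command and log path are supplied by the caller. The `yarn jar` invocation in the comment is only an example; `echo` stands in for it so the sketch runs anywhere:

```java
import java.io.File;
import java.io.IOException;

public class JobLauncher {
    // Runs the given command, merging stderr into stdout and appending
    // everything to a local log file; blocks until the process exits.
    static int launch(File log, String... command) throws IOException, InterruptedException {
        ProcessBuilder pb = new ProcessBuilder(command);
        pb.redirectErrorStream(true);                               // merge stderr into stdout
        pb.redirectOutput(ProcessBuilder.Redirect.appendTo(log));   // write to the local file system
        return pb.start().waitFor();                                // exit code of the job
    }

    public static void main(String[] args) throws Exception {
        // A real invocation might look like (paths and class names hypothetical):
        //   launch(new File("/var/log/job.log"), "yarn", "jar", "app.jar", "com.example.Driver", "arg1");
        // `echo` stands in here so the sketch runs anywhere.
        int exit = launch(new File("job.log"), "echo", "hello from the job");
        System.out.println("exit code: " + exit);
    }
}
```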