We build a package that contains around 13 AWS Lambda handlers [Java].
The package is built using Maven. When the functions are deployed manually, we might upload the same jar (which includes the dependencies and handlers) multiple times, specifying just a different handler name. The functions share dependencies, so this makes sense for us.
Can we run the AWS CLI in parallel for faster deployment?
Yes - you should consider writing a bash or Python deployment script that calls the AWS CLI upload commands in parallel. That way you can deploy all thirteen of your Lambda handlers at the same time.
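For example, here is a minimal Python sketch of that idea (the function names and jar path are placeholders you would replace with your own):

import subprocess
from concurrent.futures import ThreadPoolExecutor

JAR_PATH = "target/handlers-with-dependencies.jar"          # hypothetical path to the shaded jar
FUNCTIONS = ["handler-one", "handler-two", "handler-three"]  # ...list all 13 function names here

def deploy(function_name):
    # Upload the same jar to one Lambda function via the AWS CLI.
    subprocess.run(
        ["aws", "lambda", "update-function-code",
         "--function-name", function_name,
         "--zip-file", "fileb://" + JAR_PATH],
        check=True,
    )

with ThreadPoolExecutor(max_workers=len(FUNCTIONS)) as pool:
    # list() forces iteration so any failed deployment raises an exception here.
    list(pool.map(deploy, FUNCTIONS))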
Perhaps you could upload the JAR to S3 once, and then have a Lambda function, triggered by that upload and written in JavaScript, do asynchronous code deployments from S3 to your Lambda functions. For example, see the Lambda Auto-Deployer described in "New Deployment Options for AWS Lambda".
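The same idea, sketched here in Python with boto3 rather than JavaScript (the target function names are placeholders; the bucket and key come from the S3 event that triggers the function):

import boto3

lambda_client = boto3.client("lambda")

# Hypothetical list of the functions that share the jar.
TARGET_FUNCTIONS = ["handler-one", "handler-two", "handler-three"]

def handler(event, context):
    # Triggered by the S3 upload of the jar; push that object to every function.
    record = event["Records"][0]
    bucket = record["s3"]["bucket"]["name"]
    key = record["s3"]["object"]["key"]
    for name in TARGET_FUNCTIONS:
        lambda_client.update_function_code(
            FunctionName=name, S3Bucket=bucket, S3Key=key
        )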
Related
I have two Azure Functions linked together. I want to test them locally using REST Assured, but to do this I have to start them manually every time before the tests. Is there any way to automate this, so that the functions start automatically when the tests start?
You would have to include a step that starts your application (function app) using the Azure Functions Core Tools CLI. If you do not have that set up already, see: https://learn.microsoft.com/en-us/azure/azure-functions/functions-run-local?tabs=v4%2Cwindows%2Ccsharp%2Cportal%2Cbash%2Ckeda.
To start your application from the command line you can use func host start. Documentation for reference: https://learn.microsoft.com/en-us/azure/azure-functions/functions-run-local?tabs=v4%2Cwindows%2Ccsharp%2Cportal%2Cbash%2Ckeda#start.
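If you want to script that startup step around your test run, here is a rough sketch of the idea (shown in Python purely for illustration; the same can be done from a JUnit @BeforeAll or a build step). It assumes the host listens on the default local port 7071, and "path/to/your/function-app" is a placeholder for your project directory:

import socket
import subprocess
import time

# Start the function app with Azure Functions Core Tools.
host = subprocess.Popen(["func", "host", "start"], cwd="path/to/your/function-app")

def wait_for_port(port, timeout=60):
    # Poll until the local Functions host accepts connections.
    deadline = time.time() + timeout
    while time.time() < deadline:
        try:
            with socket.create_connection(("localhost", port), timeout=1):
                return True
        except OSError:
            time.sleep(1)
    return False

if not wait_for_port(7071):   # 7071 is the default local port
    host.terminate()
    raise RuntimeError("Function host did not start in time")

# ...run the REST Assured test suite here, then stop the host:
# host.terminate()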
My Spring Boot application, implemented using CommandLineRunner (meaning it's not an API), is executed using java -jar with a bunch of command line arguments. We have been running the application manually on AWS EC2 so far, and now, in an attempt to automate its execution, we have started a POC using Lambda. Lambda was chosen because the application must be triggered by an SNS event (which fires on a file upload to S3).
I have configured a Lambda function with the Java runtime and subscribed it to the SNS topic. The Lambda is triggered successfully and invokes my application jar, which was uploaded to the Lambda function via S3.
The application's first step is to download the file from S3, so I am implementing the LambdaHandler class against the S3 event, as shown below.
public class LambdaHandler implements RequestHandler<S3Event, String> {
// Code to fetch S3 object name here which is needed in application processing logic further
}
I am unable to figure out how to initialize a Spring Boot batch application that is implemented with CommandLineRunner from this handler - is this even possible?
Or would you recommend an alternative approach (Jenkins connecting to EC2 and running a bash script that downloads the file from S3, pulls the jar file from Artifactory with wget, and runs java -jar to execute it)?
I've been deploying my Java app to AWS by exporting a WAR file from Eclipse. But now I've moved my code to GitHub, and I want to pull it from GitHub onto my AWS server without generating a WAR file.
I've tried pulling it, but it gives me the error "The requested resource is not available."
You can use Jenkins or another CI tool. Jenkins will fetch the code from GitHub, compile it, and push it to AWS; you only need to supply a deploy script.
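As a rough illustration of what such a deploy script might look like (sketched in Python with boto3, assuming the WAR has already been built by Jenkins and you deploy to Elastic Beanstalk; all names below are placeholders):

import boto3

BUCKET = "my-deploy-bucket"      # hypothetical S3 bucket for build artifacts
APP = "my-app"                   # Elastic Beanstalk application name
ENV = "my-app-env"               # Elastic Beanstalk environment name
VERSION = "build-123"            # e.g. the Jenkins build number
WAR_PATH = "target/my-app.war"

s3 = boto3.client("s3")
eb = boto3.client("elasticbeanstalk")

# Upload the WAR to S3, register it as a new application version, then deploy it.
key = APP + "/" + VERSION + ".war"
s3.upload_file(WAR_PATH, BUCKET, key)
eb.create_application_version(
    ApplicationName=APP,
    VersionLabel=VERSION,
    SourceBundle={"S3Bucket": BUCKET, "S3Key": key},
)
eb.update_environment(EnvironmentName=ENV, VersionLabel=VERSION)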
The CI enthusiast within me suggests automating the entire thing, going as far as creating an AWS CodePipeline with three steps (I have three of these pipelines active and am very happy I created them):
1. Listen to the GitHub repository for changes.
2. Pull the latest repo changes and push the code to a Jenkins instance. To save money on a Jenkins instance running 24/7, you could even create AWS Lambda functions to automate starting up and shutting down the Jenkins server (see the sketch after this list).
3. Output the Jenkins artifact (your WAR file) to AWS Elastic Beanstalk.
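A minimal sketch of such start/stop Lambda handlers in Python with boto3 (the instance ID is a placeholder; scheduling, e.g. via EventBridge rules, is up to you):

import boto3

ec2 = boto3.client("ec2")

# Hypothetical instance ID of the Jenkins server.
JENKINS_INSTANCE_ID = "i-0123456789abcdef0"

def start_handler(event, context):
    # Scheduled before working hours: boot the Jenkins EC2 instance.
    ec2.start_instances(InstanceIds=[JENKINS_INSTANCE_ID])

def stop_handler(event, context):
    # Scheduled after working hours: shut it down again.
    ec2.stop_instances(InstanceIds=[JENKINS_INSTANCE_ID])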
The AWS documentation, as with most of its services, is pretty thorough on this subject, as is Jenkins'. You shouldn't have too much trouble setting this up, and it will save you a lot of time since the WAR file transfer between each step happens on AWS's fast internal network.
It will take some time to set up, especially if you're not that familiar with CI and this process, but it is very much worth it to become familiar with these kinds of stacks.
I wrote a Java program and packaged it into an executable Jar file.
What I am looking for now is to schedule this jar file to run daily using some sort of online cron job service.
I was considering using Amazon AWS. Can it achieve what I want? If yes, which service should I use exactly, and what steps should I follow? If not, what are the alternatives?
I currently host a Java project on an Amazon EC2 instance. You can select the server image you would like to use, e.g. Ubuntu, Windows Server, etc. Once this is done, you must configure your security settings so you can connect to your EC2 instance, and then transfer your jar file to the instance using scp or another file transfer method. I keep a repository in an Amazon S3 bucket, and it is very easy to transfer files from S3 to an EC2 instance via the "s3cmd" command (note: I am using Ubuntu for my EC2 server instance). Once the jar file is on the server, all you need to do is create the cron job, and it will run as scheduled as long as your EC2 instance is running.
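For example, a crontab entry along the lines of 0 2 * * * /usr/bin/java -jar /home/ubuntu/myapp.jar >> /home/ubuntu/myapp.log 2>&1 (paths and schedule are placeholders) would run the jar every day at 2 AM; you add it with crontab -e.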
I now have a Java program and an Amazon EC2 instance provided with a key.
I used to run Java on AWS programmatically, but now all I need to do is use this (powerful) instance to run my Java application - but how?
Should I package my Java program as a .jar and upload it to the instance?
And is there documentation about the commands needed?
Thanks.
You upload a file to your EC2 instance with scp:
scp your.jar root@your.ec2:/tmp
You run your file with ssh:
ssh your.ec2 "java -jar /tmp/your.jar"
All of that can easily be automated, since no passwords are required to run scp or ssh; you just need to set up key-based authentication first (exchange SSH public keys).
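For instance, a tiny wrapper that does both steps (sketched in Python; a two-line shell script works just as well - the host and jar name are the same placeholders as above):

import subprocess

HOST = "root@your.ec2"   # placeholder host, as in the scp example above
JAR = "your.jar"

# Copy the jar to the instance, then execute it remotely over ssh.
subprocess.run(["scp", JAR, HOST + ":/tmp/"], check=True)
subprocess.run(["ssh", HOST, "java -jar /tmp/" + JAR], check=True)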
Or did you mean that you wanted to make the Java program part of your image?
If your Java application is web-based, I would encourage you to have a look at Amazon's Elastic Beanstalk:
http://docs.aws.amazon.com/elasticbeanstalk/latest/dg/create_deploy_Java.html
This service allows you to quickly deploy and manage applications in the cloud without worrying about the underlying infrastructure.
Seb