I wrote a Java program and packaged it into an executable Jar file.
What I am looking for now is a way to schedule this jar to run daily using some sort of online cron-job service.
I was considering Amazon AWS. Can it achieve what I want? If yes, which service should I use exactly, and what steps should I follow? If not, what are the alternatives?
I am currently hosting a Java project on an Amazon EC2 instance. You can select the server image you would like to use, e.g. Ubuntu, Windows Server, etc. Once this is done, you must configure your security settings so you can connect to your EC2 instance, then transfer your jar file to the server using scp or another file-transfer tool. I keep a repository in an Amazon S3 bucket, and it is very easy to transfer files from S3 to an EC2 instance via the "s3cmd" command (note: I am using Ubuntu for my EC2 server instance). Once the jar file is on the server, all you need to do is create the cron job, and it will run on schedule as long as your EC2 instance is running.
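For reference, the cron job itself is just one line in the instance's crontab (edit it with `crontab -e`). A minimal sketch, assuming the jar lives at /home/ubuntu/myapp.jar — adjust the path and schedule to your own setup:

```shell
# m h dom mon dow  command  — runs every day at 06:00 (server time)
0 6 * * * /usr/bin/java -jar /home/ubuntu/myapp.jar >> /home/ubuntu/myapp.log 2>&1
```

Redirecting stdout and stderr to a log file makes it much easier to see why a scheduled run failed.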
I've been uploading my Java app to AWS by exporting a WAR file from Eclipse, but now I've moved my code to GitHub and I want to pull it from GitHub onto my AWS server without generating a WAR file. When I try pulling it, I get an error: "The requested resource is not available."
You can use Jenkins or another CI tool. Jenkins will fetch the code from GitHub, compile it, and push it to AWS. You only need to supply a deploy script.
The CI enthusiast in me suggests automating the entire thing, going as far as creating an AWS CodePipeline with three steps (I have three of these pipelines active and am very happy I created them):
1. Listen to the GitHub repository for changes.
2. Pull the latest repo changes and push the code to a Jenkins instance. (To save money on a Jenkins instance running 24/7, you could even create AWS Lambda functions to automate starting up and shutting down the Jenkins server.)
3. Output the Jenkins artifact (your WAR file) to AWS Elastic Beanstalk.
The AWS documentation, as with most of its services, is quite elaborate on this subject, as is Jenkins'. You shouldn't have too much trouble setting this up, and it will save you a lot of time, since the WAR file transfer between each step happens on AWS's fast internal network.
It will take some time to set up, especially if you're not that familiar with CI, but it is very much worth becoming familiar with this type of stack.
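As a sketch of the Lambda idea: the functions ultimately just need to start and stop the Jenkins EC2 instance, which boils down to two AWS CLI (or SDK) calls. This assumes your AWS credentials are already configured; the instance ID is a placeholder:

```shell
# Start the Jenkins instance before a build window (instance ID is a placeholder)
aws ec2 start-instances --instance-ids i-0123456789abcdef0
# ...builds run...
# Stop it again once builds are done to stop paying for idle hours
aws ec2 stop-instances --instance-ids i-0123456789abcdef0
```

Scheduling these two calls (e.g. via CloudWatch Events / EventBridge rules triggering the Lambdas) gives you the on/off automation described above.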
I have an Android client that sends a sentence to the server. The server does some processing and sends the string back to the Android client. I'm using basic client-server communication for this, and it works fine on the local machine. Now I want to deploy my server code somewhere so that I can give that server's address to my client code. The server is not a web app; it's a simple core Java project. As far as I understand, I'll have to deploy it as a runnable jar. But how do I do that using AWS?
Set up Apache Tomcat and use the deployment endpoint, namely /manager/text/deploy?path=/footoo&war=file:/path/to/foo.war, to deploy the WAR.
If you'd rather deploy a runnable jar, set up an instance, install Java on it, and copy the jar onto the instance, then log in and run java -jar runnable-foo.jar in a screen session.
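A hedged sketch of both options. For the first, this assumes a Tomcat user with the manager-script role exists in tomcat-users.xml; the credentials, context path, and file paths below are placeholders:

```shell
# Option 1: deploy a WAR through Tomcat's text manager interface
curl -u admin:secret "http://localhost:8080/manager/text/deploy?path=/footoo&war=file:/path/to/foo.war"

# Option 2: run a runnable jar in a detached screen session named "app"
screen -dmS app java -jar runnable-foo.jar
# Reattach to it later with: screen -r app
```

The screen session keeps the jar running after you disconnect from SSH; a systemd unit would be the more robust long-term choice.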
I'm trying to use the AWS free tier to host a Java web application. I created an EC2 instance, but I can't figure out how to deploy the application to it. I was trying to use the AWS Toolkit for Eclipse to deploy the web site to Elastic Beanstalk, but from there I would need a second tier to deploy the application to production.
My question is: what is a free way to deploy to my EC2 instance, and how?
Thank you!
If you are using Elastic Beanstalk (which I recommend), then you should create the Beanstalk stack manually from the AWS console. Before you do that, I suggest terminating your other instance, because you won't use it.
The Beanstalk stack will create an EC2 instance in the background, and an RDS database too if you ask for one. You pay for the resources (EC2, RDS), but there is no extra cost for the Beanstalk stack itself.
Once you have the Beanstalk stack, you can deploy with the Eclipse plugin, or simply generate the WAR file and upload it via the AWS console. (On the Beanstalk page there is a place to upload a WAR file for deployment.)
Be sure to set the DB connection details to the RDS instance (if you are using one).
Also note that the free tier is nice for warming up, but not recommended for production.
When you create the Beanstalk stack, make sure you create a single-AZ web frontend, so you will have only one web server running.
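If you prefer the command line to the Eclipse plugin or the console upload, the EB CLI can do the same deployment. A sketch, assuming the EB CLI is installed and you run it from your project directory (application, environment, and region names are placeholders):

```shell
eb init my-java-app --platform tomcat --region us-east-1   # one-time project setup
eb create my-env --single    # --single = one instance, no load balancer (free-tier friendly)
eb deploy                    # uploads and deploys the current version
```

The --single flag matches the single-AZ, one-webserver advice above.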
I now have a Java program and an Amazon instance, provided with a key.
I used to run Java on AWS programmatically, but now all I need to do is use this (quite powerful) instance to run my Java application. How?
Should I package my Java program as a .jar and upload it to the instance?
Is there any documentation about the commands needed?
Thanks.
You upload a file to your EC2 instance with scp:
scp your.jar root@your.ec2:/tmp
You run your file with ssh:
ssh your.ec2 "java -jar /tmp/your.jar"
All of that can be easily automated, since no passwords are required to execute scp or ssh; you just need to exchange SSH keys first.
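Setting up that key exchange is a one-time step. A minimal sketch (the host name is a placeholder, and the ssh-copy-id line is commented out since it needs the live instance):

```shell
mkdir -p ~/.ssh
# Generate a key pair with no passphrase, so scripts can use it non-interactively
ssh-keygen -t ed25519 -N "" -f ~/.ssh/ec2_deploy_key
# Install the public key on the instance (run this against your real host):
# ssh-copy-id -i ~/.ssh/ec2_deploy_key.pub ubuntu@your.ec2
```

After that, scp and ssh in the commands above accept `-i ~/.ssh/ec2_deploy_key` and run without prompting. (Note that on stock Amazon images you typically log in as a user such as ubuntu or ec2-user rather than root.)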
Or did you mean that you wanted to make Java program part of your image?
If your Java application is web based, I would encourage you to have a look at Amazon's Elastic Beanstalk
http://docs.aws.amazon.com/elasticbeanstalk/latest/dg/create_deploy_Java.html
This service allows you to quickly deploy and manage applications to the cloud without worrying about the underlying infrastructure.
Seb
I have a project that involves both a client and a server. I developed both parts of the application in Java, and I want to test it on a Hadoop cluster, since the server side is a simulation of a cloud; by using Hadoop I want to give my application a real sense of a cloud environment. I started by creating a multi-node Hadoop cluster, but I don't know what the next step should be.
I would appreciate any enlightenment.
The proper way to accomplish this would be to use a RESTful interface to send the commands.
For instance, on the machine that runs the JobTracker, you could host a Tomcat REST server. (Make sure the Hadoop daemons are running in the background.) You could use a Spring/Hibernate-based servlet to process the requests to the server. Finally, in the servlet, you could either include the Hadoop jars and call Hadoop through the Hadoop API, or call Hadoop from the console (hadoop jar ...).
To upload files to the server, you can use an SFTP interface, or possibly upload files directly to HDFS.
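Concretely, the console route boils down to shelling out to the standard Hadoop CLI. A sketch with placeholder paths and class names:

```shell
# Copy input data from the server's local disk into HDFS
hdfs dfs -put /tmp/input.txt /user/hadoop/input/
# Submit the MapReduce job packaged in your jar
hadoop jar /opt/jobs/myjob.jar com.example.MyJob /user/hadoop/input /user/hadoop/output
```

These are the same two commands the servlet would invoke (or replicate through the Hadoop Java API) for each incoming request.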
If you would like, I can share some code with you, because I have done a similar project.
Also, you can look into Apache Oozie. It hosts a RESTful job-flow API for Hadoop.