We want to start a project using Spring and MongoDB and eventually deploy it on Cloud Foundry. There will be three different applications that should be bound to a MongoDB service, i.e. all three applications need access to the MongoDB database.
Now we want to develop locally (without actually deploying to the cloud) using a local MongoDB and Tomcat, since this way we get better debug capabilities, hot deployment, etc. (is this even possible with Cloud Foundry?). How can I configure my Spring applications so that they use the locally installed MongoDB when deployed locally and are bound to the Cloud Foundry MongoDB service when deployed to Cloud Foundry?
Is this even a good idea to develop/deploy locally and deploy to the cloud at the end?
The envisioned development model for Cloud Foundry-based applications is to use a so-called Micro Cloud Foundry for your local development needs, which currently/usually is a virtual machine (VM) based, single-node, but otherwise complete Cloud Foundry installation 'in a box'.
The only downside of this approach is that (by default) you always need to resort to remote debugging techniques (even when developing locally/offline) rather than, e.g., the change-code-and-reload-browser approach many dynamic-language developers are used to enjoying.
Whether or not this matters largely depends on your development process, but it is probably fair to say that this approach favors more elaborate/mature coding techniques based on unit tests, continuous integration, etc. Given that you specified Java/Spring as your target language/framework, you should be well prepared for this.
Cloud Foundry endpoint targeting
Once you have such a VM in place, you simply switch the target Cloud Foundry endpoint and deploy to either one in an otherwise identical fashion; the Cloud Foundry runtime automatically takes care of binding to the respective services as usual (a sketch of the corresponding Spring configuration follows the commands below). See e.g. Targeting Cloud Foundry:
Target Cloud Foundry in the Cloud:
prompt$ vmc target api.cloudfoundry.com
Target a standalone Micro Cloud Foundry running in your local virtual machine:
prompt$ vmc target api.<domain>.cloudfoundry.com
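On the application side, one common way to handle this is to keep one configuration class per environment and let a Spring profile pick the right MongoDbFactory. The following is only a minimal sketch, assuming Spring Data MongoDB and the Spring Cloud Connectors library are on the classpath and that the cloud profile is activated when running on Cloud Foundry; the database name and port are placeholders.

import com.mongodb.MongoClient;
import org.springframework.cloud.Cloud;
import org.springframework.cloud.CloudFactory;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Profile;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.SimpleMongoDbFactory;

@Configuration
@Profile("default")   // active on the local Tomcat/MongoDB setup
class LocalMongoConfig {

    @Bean
    public MongoDbFactory mongoDbFactory() throws Exception {
        // Plain connection to the locally installed mongod (placeholder database name)
        return new SimpleMongoDbFactory(new MongoClient("localhost", 27017), "mydb");
    }
}

@Configuration
@Profile("cloud")     // typically activated when running on Cloud Foundry
class CloudMongoConfig {

    @Bean
    public MongoDbFactory mongoDbFactory() {
        // Resolve the bound MongoDB service from the Cloud Foundry environment (VCAP_SERVICES)
        Cloud cloud = new CloudFactory().getCloud();
        return cloud.getSingletonServiceConnector(MongoDbFactory.class, null);
    }
}

All three applications then just inject the MongoDbFactory (or a MongoTemplate built from it) and stay unaware of where the database actually lives.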
Micro Cloud Foundry variations
Most Cloud Foundry vendors offer a custom variation of such a Micro Cloud Foundry VM, e.g.:
Micro Cloud Foundry
Stackato Micro Cloud
Micro Iron Foundry
Is there a way I can make my Java application (non-web) run on Google Cloud Platform? I can see that Cloud Run, App Engine, and Cloud Functions work for web applications. My application is a Java application that runs a report using the Google Ad Manager API. Can I run this app on any of the GCP tools?
Cloud Run is predominantly used for stateless operations (similar to Firebase Functions or AWS Lambda), i.e. a request is sent to the instance, which spins up resources, completes the task, then shuts down. It is great for API endpoints that don't store anything in memory.
Another thing to note is that both App Engine and Cloud Run are designed to work with dockerized applications.
From your description, it sounds like you should be using a Compute Engine instance (a virtual machine). You can clone your git repository into the VM and run the application manually. There are also GCP tools that will let you run the Java executable on a timer.
Compute Engine Instances give you the most flexibility to configure the service you're building to your needs.
See the Compute Engine docs here.
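If you go the Compute Engine route and only need the report to run on a schedule, an external scheduler is optional. Here is a minimal sketch using only the JDK, where runAdManagerReport is a placeholder standing in for your existing Ad Manager API reporting code:

import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class ReportScheduler {

    public static void main(String[] args) {
        ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();

        // Run the report immediately, then every 24 hours.
        scheduler.scheduleAtFixedRate(ReportScheduler::runAdManagerReport, 0, 24, TimeUnit.HOURS);
    }

    private static void runAdManagerReport() {
        // Placeholder: call into your existing Ad Manager API reporting code here.
        System.out.println("Running Ad Manager report...");
    }
}

Plain cron on the VM (or a managed scheduler) would be the alternative if you prefer the timer to live outside the JVM.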
I couldn't find any good example of a worker role for Java on the Azure cloud.
I am writing an AMQP/JMS publisher application for Event Hubs to simulate a large amount of data as a stream. I want to run this application in the cloud and scale it to produce data according to changing needs.
As far as I know, the Azure plugin for Eclipse supported Cloud Services features a few years ago, so you can find many resources such as the Channel 9 videos that #Micah_MSFT mentioned. However, after trying to install the plugin in my Eclipse, I found that these Cloud Services features have since been removed.
There are two older blog posts that may be helpful in your scenario.
Deploying Java Applications in Azure
Installing Java Runtime in Azure Cloud Services with Chocolatey
Meanwhile, Microsoft Azure Service Fabric is the next-generation cloud application platform for highly scalable, highly reliable distributed applications and can be used instead of Cloud Services. You can refer to the official document Learn about the differences between Cloud Services and Service Fabric before migrating applications to compare them, and there is also a tutorial for Java.
Just from my experience, as a workaround, there are other, simpler services that are suitable for generating data with Java on the Azure cloud and that can be scaled.
With App Service, continuous WebJobs can be scaled with the number of Web App instances.
Azure Batch lets you run large-scale parallel and high-performance computing (HPC) applications efficiently in the cloud, so you can write a Java application that produces data and run it on the Batch service in parallel. There is an official Java sample you can refer to.
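Whichever Azure service ends up hosting it, the publisher itself can remain a plain JMS client over AMQP. Below is a minimal sketch assuming the Apache Qpid JMS (AMQP 1.0) client and the JMS API are on the classpath; the namespace, SAS credentials, and event hub name are placeholders you would replace with your own values.

import javax.jms.Connection;
import javax.jms.ConnectionFactory;
import javax.jms.Destination;
import javax.jms.MessageProducer;
import javax.jms.Session;
import org.apache.qpid.jms.JmsConnectionFactory;

public class EventHubPublisher {

    public static void main(String[] args) throws Exception {
        // Placeholder AMQP endpoint for an Event Hubs namespace
        ConnectionFactory factory =
                new JmsConnectionFactory("amqps://<namespace>.servicebus.windows.net");

        Connection connection = factory.createConnection("<sas-key-name>", "<sas-key>");
        connection.start();

        Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
        Destination eventHub = session.createQueue("<event-hub-name>");
        MessageProducer producer = session.createProducer(eventHub);

        // Simulate a stream of events
        for (int i = 0; i < 1000; i++) {
            producer.send(session.createTextMessage("{\"event\":" + i + "}"));
        }

        producer.close();
        session.close();
        connection.close();
    }
}

Scaling then mostly becomes a matter of running more copies of this publisher (WebJob instances, Batch tasks, etc.) in parallel.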
I have deployed a few apps on Cloud Foundry and I want to see CPU usage, memory consumption, etc., like we see through JProfiler, JMX, and similar tools.
Can I deploy such a tool as a plugin in my space? If so, what are the steps?
I am not expecting a full-length answer; a blog URL or reference would be fine.
Is deploying a plugin the same as deploying an application?
Is there any other way to see the usage?
The Cloud Foundry provider only shows very basic usage numbers, but what I want is a graphical interface.
Any hint on this would be appreciated.
To monitor Pivotal Cloud Foundry, there are multiple tools available.
For platform monitoring you can use PCF Metrics, Datadog, New Relic, or Prometheus + Grafana.
Most of these do need the JMX Bridge tile installed.
For app performance monitoring you can install or set up tools like New Relic or AppDynamics. You will have to pay for these tools.
Register with network.pivotal.io. Sign in and you will get a list of all tiles from Pivotal and third-party vendors.
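If, in the meantime, you just want the raw CPU/heap numbers from inside the app itself (rather than a full graphical tool), a minimal sketch using the standard JVM management MXBeans looks like this; wiring it into a servlet or REST endpoint is left out:

import java.lang.management.ManagementFactory;
import java.lang.management.MemoryMXBean;
import java.lang.management.MemoryUsage;
import java.lang.management.OperatingSystemMXBean;

public class UsageSnapshot {

    public static void main(String[] args) {
        MemoryMXBean memory = ManagementFactory.getMemoryMXBean();
        OperatingSystemMXBean os = ManagementFactory.getOperatingSystemMXBean();

        MemoryUsage heap = memory.getHeapMemoryUsage();
        System.out.printf("Heap used: %d MB of %d MB%n",
                heap.getUsed() / (1024 * 1024), heap.getMax() / (1024 * 1024));

        // System load average over the last minute (-1.0 if unavailable on this platform)
        System.out.printf("System load average: %.2f%n", os.getSystemLoadAverage());
    }
}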
I have coded a Spring MVC Hibernate application with RabbitMQ as a messaging server and a MySQL DB. I have also used Hazelcast, an in-memory distributed cache, to centralize the state of the application, moving the local Tomcat session to a centralized session and implementing distributed locks.
The app right now is hosted on a single Tomcat server on my local system.
I want to test my application in a multi-JVM, multi-node environment, i.e. the app running on multiple Tomcat servers.
What would be the best approach to test the app?
A few things that come to my mind:
A. Install and configure a load balancer and set up a Tomcat cluster on my local system. This, I believe, is a tedious task and requires much effort.
B. Host the application on a PaaS like OpenShift or Cloud Foundry, but I am not sure if I will be able to test my application on several nodes.
C. Any other way to simulate a clustered environment on my local Windows system?
I would suggest you first understand your application requirements: for the real production/live environment, are you going to use Infrastructure as a Service (IaaS) or PaaS?
If the requirement is IaaS:
I would suggest creating a local cluster environment and using the Tomcat/Spring sticky-session concept. Persist the session in a Hazelcast or Redis server installed on a different node, and configure a load balancer in front of the Tomcat nodes. Two or three VMs would be suitable for testing purposes.
If the requirement is PaaS:
Don't worry about a local environment. Test directly on OpenShift or an AWS free account, and trust me, you will be able to test on the PaaS if everything is set up correctly.
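Independently of which option you pick, the Hazelcast part can be sanity-checked on a single Windows machine by starting two embedded members and letting them form a cluster. A minimal sketch (Hazelcast 3.x API with default discovery; the map name and keys are placeholders):

import com.hazelcast.core.Hazelcast;
import com.hazelcast.core.HazelcastInstance;
import java.util.Map;

public class TwoNodeClusterTest {

    public static void main(String[] args) {
        // Two embedded members in one JVM simulate two Tomcat nodes sharing state
        HazelcastInstance node1 = Hazelcast.newHazelcastInstance();
        HazelcastInstance node2 = Hazelcast.newHazelcastInstance();

        Map<String, String> sessions1 = node1.getMap("sessions");
        Map<String, String> sessions2 = node2.getMap("sessions");

        sessions1.put("JSESSIONID-123", "user-42");

        // The entry written on node1 is visible from node2
        System.out.println("Seen from node2: " + sessions2.get("JSESSIONID-123"));

        Hazelcast.shutdownAll();
    }
}

The same idea works with two separate Tomcat processes on different ports, which gets you much closer to option A without a full load-balancer setup.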
I am working on a Servlet/JSP project and I want to host it on aws.amazon.com. I have already signed up for Amazon Web Services, and after signing in this page opens up and I have no idea what to do or which option to select.
I think AWS provides a lot of customization with a lot of advanced technical options to choose from, but this is difficult for beginners who just want to get their site running.
My project will use:
JSP/Servlets
CSS
MySQL
Struts2
Tomcat WebServer
I would suggest these approaches to study:
Elastic Beanstalk - This is AWS's simple hosting model. If you're not IT savvy, you should pursue this approach.
EC2 with MySQL RDS - In this case you'll create one or more virtual machines (EC2), install Tomcat and other dependencies, and deploy your app. You'll then use RDS to store your data (MySQL as a service); see the JDBC sketch after this list.
EC2 only - You'll do the same as in 2. but install your own instance of MySQL. There may be AMIs offered that you can provision that will meet your application requirements.
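For the second approach, connecting to RDS from the application is no different from connecting to any other MySQL server. A minimal JDBC sketch, assuming the MySQL Connector/J driver is on the classpath; the endpoint, schema, and credentials are placeholders taken from the RDS console:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class RdsConnectionCheck {

    public static void main(String[] args) throws Exception {
        // Placeholder RDS endpoint, schema, and credentials from the RDS console
        String url = "jdbc:mysql://<your-instance>.<region>.rds.amazonaws.com:3306/<schema>";

        try (Connection conn = DriverManager.getConnection(url, "<user>", "<password>");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT 1")) {
            rs.next();
            System.out.println("Connected to RDS, SELECT 1 returned " + rs.getInt(1));
        }
    }
}

In the Struts2/Hibernate app itself you would point the existing data source at the same URL rather than at localhost.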
Other reading:
Route 53 if you're going to use AWS for your domain records
Elastic Load Balancing if you're going to need high availability
Elastic Block Store if you want persistent disks across VMs
Security Groups to secure your VMs (for 1. and 2.)
Virtual Private Cloud for additional security
CloudFormation if you want to automate provisioning
There are many articles on AWS architecture.
There is an Eclipse plugin for Amazon Web Services.
The AWS Toolkit provides an AWS Java web project template for use in Eclipse. The template creates a web tools platform (WTP) dynamic web project that includes the AWS SDK for Java in the project's classpath. Your AWS account credentials and a simple index.jsp file are provided to help you get started. The following instructions assume you have installed both the Eclipse IDE for Java EE Developers and the AWS Toolkit plug-in. For more information, see Setting Up the AWS Toolkit for Eclipse.
Also check this & this
I would recommend the first approach, using Elastic Beanstalk, to deploy your JSP application. There you can leverage all the advantages of AWS like load balancing, auto scaling, DynamoDB and data-warehouse support, and many other technologies. With Beanstalk you set up the dev environment on your local machine and deploy the changes to AWS, and once the setup is done you are done.
Maybe you will need to spend some time migrating from MySQL, but that will be longer-term work once you have a lot of users.