We have a web-based Java application which we are planning to migrate to the cloud, with the intention that multiple clients will use it in a SaaS environment. The current architecture of the application is largely asynchronous. There are 4 different modules, each with a database of its own. When data needs to be exchanged between modules, we push it using Pentaho and use a directory structure to store an interim data file, which is then picked up by the other module to populate its database. Given the nature of our application, this asynchronous communication is very important to us.
Now we are facing a couple of challenges while migrating this application to cloud:
We are planning to use multi-tenancy on our database server, but how do we ensure that the flat files we use for transferring data between the modules are also routed to their respective tenants in the DB?
Since we are planning to host this in the cloud, we would also like your views on whether keeping a text file on a cloud server is safe from a data security perspective.
File storage in the cloud is safe, and you can use IAM roles to control the permissions on a file. Cloud providers like Google (Cloud Storage) and Amazon (AWS S3) provide secure and scalable infrastructure for maintaining files in the cloud.
In a typical setup, cloud storage provides you with buckets that carry a globally unique identifier. For a multi-tenant setup you can create a bucket per tenant and store the necessary data feeds in it. You can then have batch or streaming jobs using Kettle (Pentaho) push the data to the right database based on which bucket it arrived in.
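For illustration, here is a minimal sketch of that routing using the AWS SDK v2 for Java. The one-bucket-per-tenant naming convention (`feeds-<tenantId>`) and the `interim/` prefix are assumptions for the example, not part of your existing setup:

```java
import software.amazon.awssdk.services.s3.S3Client;
import software.amazon.awssdk.services.s3.model.ListObjectsV2Request;
import software.amazon.awssdk.services.s3.model.S3Object;

public class TenantFeedRouter {

    // Hypothetical naming convention: one bucket per tenant.
    private static String bucketFor(String tenantId) {
        return "feeds-" + tenantId;
    }

    public static void main(String[] args) {
        String tenantId = "acme";            // illustrative tenant
        try (S3Client s3 = S3Client.create()) {
            ListObjectsV2Request req = ListObjectsV2Request.builder()
                    .bucket(bucketFor(tenantId))
                    .prefix("interim/")      // interim data files for this tenant
                    .build();
            for (S3Object obj : s3.listObjectsV2(req).contents()) {
                // The bucket name alone tells the job which tenant database
                // the file belongs to, so the load step cannot cross tenants.
                System.out.printf("load %s into database for tenant %s%n",
                        obj.key(), tenantId);
            }
        }
    }
}
```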
Alternatively, as other answers suggest, you can push the data to a messaging setup (ActiveMQ, Kafka, etc.) with tenant-specific topics, and have a streaming service (in Java or Pentaho) ingest the data into the respective database based on the topic.
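A minimal Kafka producer sketch of the tenant-topic idea; the topic naming scheme (`tenant-<id>-feeds`) and the localhost broker are assumptions for the example:

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class TenantFeedProducer {

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // assumed broker address
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        String tenantId = "acme";                         // illustrative tenant
        String topic = "tenant-" + tenantId + "-feeds";   // hypothetical naming scheme

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // One record per row of the interim file; the topic itself carries
            // the tenant identity, so the consumer knows which database to target.
            producer.send(new ProducerRecord<>(topic, "row-key", "row,data,here"));
        }
    }
}
```

A matching consumer subscribed to each tenant topic can then write into that tenant's database only.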
Hope this helps :)
I cannot realistically give any specific advice without knowing more about your system. However, based on my experience, I would recommend switching to message queues; something like Kafka would work nicely.
Yes, cloud providers offer sufficient security for static file storage. You can limit access however you see fit, for example with AWS S3 bucket policies or pre-signed URLs.
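As one concrete illustration, a minimal sketch of handing out time-limited access with the AWS SDK v2 for Java; the bucket and key names are placeholders:

```java
import java.time.Duration;
import software.amazon.awssdk.services.s3.model.GetObjectRequest;
import software.amazon.awssdk.services.s3.presigner.S3Presigner;
import software.amazon.awssdk.services.s3.presigner.model.GetObjectPresignRequest;
import software.amazon.awssdk.services.s3.presigner.model.PresignedGetObjectRequest;

public class PresignExample {

    public static void main(String[] args) {
        try (S3Presigner presigner = S3Presigner.create()) {
            GetObjectRequest get = GetObjectRequest.builder()
                    .bucket("example-bucket")       // placeholder bucket
                    .key("interim/feed.txt")        // placeholder object key
                    .build();
            PresignedGetObjectRequest presigned = presigner.presignGetObject(
                    GetObjectPresignRequest.builder()
                            .signatureDuration(Duration.ofMinutes(10)) // link expires
                            .getObjectRequest(get)
                            .build());
            // Without this URL (or after 10 minutes) the file cannot be read.
            System.out.println(presigned.url());
        }
    }
}
```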
1. Multi-tenancy may create some issues while transferring the files, but from the information you have given, the flat-file movement across the application should not be impacted. You can still consider moving to a message-queue model for passing the data across.
2. From a data security view, AWS provides a lot of features at the access level, MFA, etc. If it needs to be highly secured, I would recommend a private cloud setup such as Amazon VPC, where your resources are isolated from other tenants.
I have 3 different applications
ASP.NET web application
Java Desktop application
Android Studio mobile application
These 3 applications share the same database, and they need to connect to it from anywhere in the world with an internet connection. They share almost all the information, so if you change something in one application, it has to update the information in the other 2 applications.
I have the database on a physical server and I want to know how best to make this connection.
I have searched but couldn't find out whether I should connect directly to the server with SQL Server, use a web service, or something like that.
I hope someone could help.
Thank you.
I believe the best way is to first create a web API layer (REST/SOAP) that performs all the relevant operations on the centralized DB. Once that is set up, any of your applications, written in any language, can use the exposed web API methods to manipulate the data in the same DB.
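A minimal Spring Boot sketch of such a layer; the `Customer` type and the endpoint paths are placeholders standing in for whatever your shared tables actually contain:

```java
import java.util.List;
import org.springframework.web.bind.annotation.*;

// Hypothetical DTO standing in for one of your shared tables.
record Customer(long id, String name) {}

@RestController
@RequestMapping("/api/customers")
public class CustomerController {

    // In a real app this would delegate to a repository/service over the DB.
    @GetMapping
    public List<Customer> all() {
        return List.of(new Customer(1, "Example"));
    }

    @PostMapping
    public Customer create(@RequestBody Customer c) {
        // Insert into the centralized DB; all three clients (ASP.NET, desktop,
        // Android) call this same endpoint instead of opening DB connections.
        return c;
    }
}
```

The ASP.NET, Java desktop, and Android apps then all speak HTTP/JSON to these endpoints rather than each holding a direct SQL connection.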
If you are looking at a global solution - will you have multiple copies of the applications in different parts of the world as well?
In this scenario you should be looking at a cloud-hosted database with some form of geo-replication so that you can keep latency to a minimum.
There are no restrictions on the number of applications that can connect to a specific database - you do not have to create a different database for each and you may be able to reuse Stored Procedures between applications if they perform the same task.
I would, however, look at the concept of schemas: any database objects that are specific to one app should be separated from the others, so put them in a schema for "App1". Shared objects can go in a shared schema.
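A minimal JDBC sketch of that separation against SQL Server; the connection string and table names are placeholders:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class SchemaSetup {

    public static void main(String[] args) throws Exception {
        // Placeholder connection string; adjust host, database and credentials.
        String url = "jdbc:sqlserver://localhost;databaseName=SharedDb;user=app;password=secret";
        try (Connection con = DriverManager.getConnection(url);
             Statement st = con.createStatement()) {
            // App-specific objects live in their own schema...
            st.execute("CREATE SCHEMA App1");
            st.execute("CREATE TABLE App1.WebSettings (Id INT PRIMARY KEY, Val NVARCHAR(200))");
            // ...while objects every client uses go in a shared schema.
            st.execute("CREATE SCHEMA Shared");
            st.execute("CREATE TABLE Shared.Customers (Id INT PRIMARY KEY, Name NVARCHAR(200))");
        }
    }
}
```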
We are trying to get live configuration data from our Kubernetes cluster. Therefore we would like to read the ConfigMaps from each of our services.
Is there a way to extract this data with a Spring microservice which runs alongside the rest of the services?
Or are there other (better?) ways / tools to get this information?
Using Kubernetes APIs you can get the configmaps you need. I am not familiar with the Java client, but here it is:
https://github.com/kubernetes-client/java
You can retrieve a list of ConfigMaps and their contents using these APIs. If you're using RBAC, your application will need a cluster role and a cluster role binding that allow it to read ConfigMap resources.
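For illustration, a minimal sketch using that Java client; note that the signature of `listNamespacedConfigMap` varies between client versions, so treat the argument list here as an assumption to check against the version you pull in:

```java
import io.kubernetes.client.openapi.ApiClient;
import io.kubernetes.client.openapi.Configuration;
import io.kubernetes.client.openapi.apis.CoreV1Api;
import io.kubernetes.client.openapi.models.V1ConfigMap;
import io.kubernetes.client.openapi.models.V1ConfigMapList;
import io.kubernetes.client.util.Config;

public class ConfigMapReader {

    public static void main(String[] args) throws Exception {
        // Inside the cluster this picks up the pod's service account; that
        // account needs a (cluster) role binding granting get/list on configmaps.
        ApiClient client = Config.defaultClient();
        Configuration.setDefaultApiClient(client);

        CoreV1Api api = new CoreV1Api();
        // Classic generated signature with all optional params passed as null;
        // newer client versions differ.
        V1ConfigMapList list = api.listNamespacedConfigMap(
                "default", null, null, null, null, null, null, null, null, null, null);

        for (V1ConfigMap cm : list.getItems()) {
            System.out.println(cm.getMetadata().getName() + " -> " + cm.getData());
        }
    }
}
```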
To extract the information you can just query the Kubernetes API, in your case most likely via the Java Kubernetes client. The biggest issue you will face will probably be ensuring you have read access to the namespace(s) the ConfigMaps live in.
The bigger question about a 'better way' is trying to understand why you want to read all of the ConfigMaps for your applications. The goal you are trying to accomplish will guide the solution.
I have a use case: building a centralized log aggregation tool that works across multiple platforms. The suite of apps in my firm includes an Angular-based UI and an Ionic-based hybrid mobile app, both interacting with a Java Spring Boot RESTful backend, as well as a PHP-based monolithic internal CRM.
I need a way to aggregate logs from all these applications in a centralized location, filtered on severity, and the user should have access to them via a UI where they can further group and filter the logs by app, keywords, etc.
https://dzone.com/articles/distributed-logging-architecture-for-microservices
Will a solution like this work independent of the platform or the tech stacks of the Apps whose logs it is aggregating?
What other options are there?
Generally I'd log to JSON. This is just configuration in your log appender, e.g. Monolog in PHP or Logback in Spring Boot.
Then you can use Filebeat to tail those files and store them in Elasticsearch (so you don't have to do any parsing), visualize and search in Kibana, and you're done.
This is the easiest and probably also the most versatile solution for the elastic-stack tag.
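As a concrete illustration on the Java side, a sketch of emitting JSON log lines with SLF4J plus the logstash-logback-encoder library (an assumption here; Monolog has an equivalent JSON formatter for the PHP apps):

```java
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import static net.logstash.logback.argument.StructuredArguments.kv;

public class OrderService {

    private static final Logger log = LoggerFactory.getLogger(OrderService.class);

    public void placeOrder(String orderId, String userId) {
        // With LogstashEncoder configured as the Logback appender's encoder,
        // this comes out as one JSON object per line, roughly:
        // {"@timestamp":"...","level":"INFO","message":"order placed",
        //  "orderId":"o-42","userId":"u-7"}
        log.info("order placed", kv("orderId", orderId), kv("userId", userId));
    }
}
```

Filebeat then ships those lines as-is, and filtering on severity or app in Kibana is just a field query.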
We are developing a cloud ERP product in Java and want to give users the option to work either with a local database file or with the database on the cloud. For some of our customers the data is very sensitive, and they do not want it stored on a web server; instead they want the database on their own server/PC.
Will this kind of offering be technically viable, secure, and effective to implement and maintain? If so, can anyone recommend the best approach for this kind of architecture, where the application on our cloud server can work seamlessly with the local database?
Many Thanks
LJ
For your customers concerned with security, consider a local datastore such as MySQL running on a local server, together with a local instance of the ERP product, also running on a local server. I would also advise encrypting all sensitive columns using something like AES_ENCRYPT(), even on this local database.
Otherwise I don't know of a way to run a hosted App with a secured local database without introducing all kinds of data vulnerabilities.
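A minimal JDBC sketch of the column-encryption idea with MySQL's AES_ENCRYPT()/AES_DECRYPT(); the table, column, and key handling are placeholders, and in practice the key should come from a secrets store rather than a string literal:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class EncryptedColumnDemo {

    public static void main(String[] args) throws Exception {
        String url = "jdbc:mysql://localhost:3306/erp"; // placeholder connection
        String key = "change-me";                       // load from a secrets store in practice

        try (Connection con = DriverManager.getConnection(url, "erp", "secret")) {
            // Column `ssn` is VARBINARY; MySQL encrypts on the way in...
            try (PreparedStatement ins = con.prepareStatement(
                    "INSERT INTO customers (name, ssn) VALUES (?, AES_ENCRYPT(?, ?))")) {
                ins.setString(1, "Alice");
                ins.setString(2, "123-45-6789");
                ins.setString(3, key);
                ins.executeUpdate();
            }
            // ...and decrypts on the way out.
            try (PreparedStatement sel = con.prepareStatement(
                    "SELECT name, CAST(AES_DECRYPT(ssn, ?) AS CHAR) FROM customers")) {
                sel.setString(1, key);
                try (ResultSet rs = sel.executeQuery()) {
                    while (rs.next()) {
                        System.out.println(rs.getString(1) + " " + rs.getString(2));
                    }
                }
            }
        }
    }
}
```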
If I had an application that stored information in its datastore, is there a way to access that same datastore from a second application?
Yes, you can, with the Remote API.
For example, you can use the Remote API to access a production datastore from an app running on your local machine. You can also use the Remote API to access the datastore of one App Engine app from a different App Engine app.
You need to configure the servlet (see the documentation for that) and add appengine-remote-api.jar to your project (you can find it under ..\appengine-java-sdk\lib\).
Just remember that ancestor queries do not work over the Remote API (see this).
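A minimal client-side sketch of installing the Remote API from another Java app; the host name is a placeholder, and the credential call is an assumption that depends on your SDK version:

```java
import com.google.appengine.api.datastore.DatastoreService;
import com.google.appengine.api.datastore.DatastoreServiceFactory;
import com.google.appengine.api.datastore.Entity;
import com.google.appengine.tools.remoteapi.RemoteApiInstaller;
import com.google.appengine.tools.remoteapi.RemoteApiOptions;

public class RemoteDatastoreDemo {

    public static void main(String[] args) throws Exception {
        RemoteApiOptions options = new RemoteApiOptions()
                .server("your-app.appspot.com", 443)   // placeholder host
                .useApplicationDefaultCredential();    // auth setup is SDK-version dependent
        RemoteApiInstaller installer = new RemoteApiInstaller();
        installer.install(options);
        try {
            // Once installed, the normal datastore API talks to the remote app.
            DatastoreService ds = DatastoreServiceFactory.getDatastoreService();
            ds.put(new Entity("Greeting"));
        } finally {
            installer.uninstall();
        }
    }
}
```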
You didn't mention why you wanted to access the datastore of one application from another, but depending on the nature of your situation, App Engine modules might be a solution. These are structurally similar to separate applications, but they run under the same application "umbrella" and can access a common datastore.
You cannot directly access the datastore of another application. Your application must actively serve that data in order for another application to be able to access it. The easiest way to achieve this is via the Remote API, which needs a piece of code installed in order to serve the data.
If you would like to have two separate code bases (even serving different hostnames/URLs), see the new App Engine Modules. They give you the ability to run totally different code on separate URLs and with different runtime settings (instances), while still being one application sharing all stateful services (datastore, task queue, memcache, ...).