How to use two different GCP project IDs in a Cloud Function (Java)

I have a Cloud Function that connects to a database through secrets stored in GCP.
My secrets live in one project (com-project-data).
My Cloud Function runs in another project (com-project-common).
There is a conflict because each project has a different service account, so I'm asking the following question:
Can I inject two different service accounts into my Cloud Function? If this is possible,
how can I do it?

A service account is one identity. Your function has its own identity, and that identity should access the required resources.
Therefore, grant the Cloud Function's service account permission to access the resources, secrets or whatever else. IAM bindings are set per resource, not per project, so a single service account can be granted roles on resources that live in other projects.
With that concept in mind, you can easily understand that you can't have two identities for the same service (a Cloud Function can't have a split personality!).
Of course, you can do unrecommended things, like using a service account key file for each project and performing two authentications, but that is a terrible pattern and you should avoid it.
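As a sketch of the recommended approach, assuming the function in com-project-common runs as a service account named my-function-sa (a hypothetical name, as is the secret name), you could grant it cross-project access to a secret in com-project-data like this:

```shell
# Grant the function's service account (lives in com-project-common)
# read access to a secret in the other project (com-project-data).
# "my-db-secret" and "my-function-sa" are placeholder names.
gcloud secrets add-iam-policy-binding my-db-secret \
    --project=com-project-data \
    --role=roles/secretmanager.secretAccessor \
    --member="serviceAccount:my-function-sa@com-project-common.iam.gserviceaccount.com"
```

After this, the function authenticates once, as its own identity, and can read the secret in the other project without any key file.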

Related

What is the preferred way to configure GCP credentials in a Spring Boot application

I am in the process of writing a Spring Boot based microservice, which will be deployed on GKE. To configure service account credentials, I see there are multiple options available. What is the most preferred and safest option? I have tried the approaches below; kindly suggest other ways.
The CredentialsProvider interface with spring.cloud.gcp.credentials.location
spring.cloud.gcp.credentials.encoded-key
GCP Secret Manager
In a cloud environment, the safest option with the least administrative overhead is generally to use the corresponding service from the cloud provider - in your case that would be Secret Manager. Of course, if you are planning not to tie your applications to a specific cloud provider, or to cater to on-prem environments as well, then you can use third-party secret management providers like HashiCorp Vault.
However, even with Secret Manager, if you interact with the API directly you will have to provide keys to call the API somewhere, which creates another problem. The generally recommended solution is to have the application authenticate as a service account and interact with Secret Manager directly, as outlined here. Alternatively, there are ways of mounting secrets from Secret Manager on a GKE volume using the CSI driver, as explained here.
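As a sketch of the direct-API approach, assuming the google-cloud-secretmanager client library is on the classpath and the workload runs as a service account with the secretmanager.secretAccessor role (the project and secret names below are placeholders):

```java
import com.google.cloud.secretmanager.v1.AccessSecretVersionResponse;
import com.google.cloud.secretmanager.v1.SecretManagerServiceClient;
import com.google.cloud.secretmanager.v1.SecretVersionName;

public class SecretReader {
    public static String readSecret(String projectId, String secretId) throws Exception {
        // The client picks up Application Default Credentials automatically,
        // i.e. the service account the workload runs as - no key file needed.
        try (SecretManagerServiceClient client = SecretManagerServiceClient.create()) {
            SecretVersionName name = SecretVersionName.of(projectId, secretId, "latest");
            AccessSecretVersionResponse response = client.accessSecretVersion(name);
            return response.getPayload().getData().toStringUtf8();
        }
    }
}
```

Because no key is ever provided in configuration, there is nothing secret-related to manage in the application itself.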
Running a secure cluster and secure container applications is a critical requirement, and here are further recommendations for GKE security hardening, which cover secret management as well. More specifically, you can check the recommendation in the section "CIS GKE Benchmark Recommendation: 6.3.1".
Although @Shailendra gives you a good solution, as you are using GKE you can store sensitive information as Kubernetes Secrets.
Both the Kubernetes and GKE documentation provide several examples of creating secrets.
You can later use the configured secrets in multiple ways; in the simplest use case, as environment variables that can be consumed by the application. Please see this article as well.
The Spring Cloud Kubernetes project provides support for consuming these secrets as property sources.
This approach allows you to test your application deployment locally, with minikube or kind, and later deploy the same artifacts to the cloud provider. In addition, it is cloud provider agnostic, as you are using out-of-the-box Kubernetes artifacts.
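As a minimal sketch of that simple use case (all names below are made up), a secret created with `kubectl create secret generic db-credentials --from-literal=db-password='s3cr3t'` can be consumed as an environment variable in the pod spec:

```yaml
# Pod spec fragment consuming the Kubernetes secret "db-credentials"
# (a placeholder name) as an environment variable.
containers:
  - name: app
    image: my-app:latest        # placeholder image
    env:
      - name: DB_PASSWORD
        valueFrom:
          secretKeyRef:
            name: db-credentials
            key: db-password
```

The application then reads DB_PASSWORD like any other environment variable, both locally and on GKE.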
I am afraid that we were so focused on providing you with further alternatives that, in the end, we did not actually answer your question.
Previously, I gave you the advice to use Kubernetes Secrets, and that advice still holds, but please allow me to come back to it later.
Judging by the different properties you are trying to set, you are trying to configure the credentials with which your application will interact with other services deployed in GCP.
For that purpose the first thing you need is a service account.
In a nutshell, a service account is a software artifact that groups together several permissions.
This service account can later be assigned to a certain GCP resource or service, and it will allow that resource to act with the configured permissions when interacting with other GCP resources and services.
Every service account has an associated set of keys which identify the service account - this is the information you are trying to keep safe.
There are different types of service accounts: mainly, default service accounts, created by GCP when you enable or use some Google Cloud services - one for Compute Engine and one for App Engine - and user-defined ones.
You can modify the permissions associated with these service accounts; the important thing to keep in mind is to always follow the principle of least privilege: grant the service account only the permissions necessary for performing its task, nothing else.
By default, your GKE cluster will use the default Compute Engine service account and the scopes defined for it. These permissions will be inherited by your pods when they contact other services.
As a consequence, one possible option is just configuring an appropriate service account for GKE and use these permissions in your code.
You can use the default Compute Engine service account, but, as indicated in the GCP docs when describing how to harden the cluster security:
Each GKE node has an Identity and Access Management (IAM) Service Account associated with it. By default, nodes are given the Compute Engine default service account, which you can find by navigating to the IAM section of the Cloud Console. This account has broad access by default, making it useful to a wide variety of applications, but it has more permissions than are required to run your Kubernetes Engine cluster. You should create and use a minimally privileged service account to run your GKE cluster instead of using the Compute Engine default service account.
So you will probably need to create a service account with the minimum permissions to run your cluster and application. The aforementioned link provides all the necessary information.
As an alternative, you can configure a different service account for your application and this is where, as a possible alternative, you can use Kubernetes Secrets.
Please:
Do not directly provide your own implementation of CredentialsProvider; I think it will not provide any additional benefit compared with the rest of the solutions.
If you want to use the spring.cloud.gcp.credentials.location configuration property, create a Kubernetes secret, expose it as a file, and set the value of this property to that file location.
In a very similar way, using Kubernetes Secrets, and as exemplified for instance in this article, you can expose the service account credentials under the environment variable GOOGLE_APPLICATION_CREDENTIALS; both Spring Cloud GCP and the different GCP client libraries will look for this variable in order to obtain the required credentials.
I would not use the configuration property spring.cloud.gcp.credentials.encoded-key; in my opinion, this approach leaves the key more exposed to threats - you will probably have to deal with the risk of committing it to version control, etc.
Secret Manager, as I said, is a suitable solution, as indicated by @Shailendra in his answer.
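As a sketch of the GOOGLE_APPLICATION_CREDENTIALS option above (all names are placeholders), the secret holding the key file is mounted as a volume and the environment variable points at the mounted file:

```yaml
# Deployment fragment: mount a secret containing the service account key
# file and point GOOGLE_APPLICATION_CREDENTIALS at it (placeholder names).
containers:
  - name: app
    image: my-app:latest
    env:
      - name: GOOGLE_APPLICATION_CREDENTIALS
        value: /var/secrets/google/key.json
    volumeMounts:
      - name: gcp-key
        mountPath: /var/secrets/google
        readOnly: true
volumes:
  - name: gcp-key
    secret:
      secretName: gcp-sa-key    # secret created from the downloaded key file
```

The client libraries then pick up the key file without any code changes.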
The options provided by Guillaume are very good as well.
The preferred way is hard to answer; it depends on your requirements...
Personally, I prefer to keep a high level of security: this is about service account authentication, and a breach can be a disaster.
Therefore, to keep the secrets secret, I prefer not having secrets at all. Neither in a Kubernetes Secret nor in Secret Manager! Problem solved!!
You can achieve that with ADC (Application Default Credentials). But like that, out of the box, ADC uses the node identity. The problem here is that if you run several pods on the same node, all of them will have the same identity and thus the same permissions.
A cool feature to use is Workload Identity. It allows you to bind a service account to your K8s deployment. The ADC principle is the same, except that it is bound at the pod level and not at the node level (a proxy is created that intercepts the ADC requests).
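As a sketch of the Workload Identity binding (project, namespace, and account names below are placeholders), the Kubernetes service account is allowed to impersonate the Google service account and then annotated with the mapping:

```shell
# Allow the Kubernetes service account "my-ksa" in namespace "default"
# to impersonate the Google service account (placeholder names).
gcloud iam service-accounts add-iam-policy-binding \
    my-gsa@my-project.iam.gserviceaccount.com \
    --role=roles/iam.workloadIdentityUser \
    --member="serviceAccount:my-project.svc.id.goog[default/my-ksa]"

# Annotate the Kubernetes service account so GKE knows the mapping.
kubectl annotate serviceaccount my-ksa \
    iam.gke.io/gcp-service-account=my-gsa@my-project.iam.gserviceaccount.com
```

Pods running under my-ksa then obtain the Google service account's permissions through ADC, with no key file anywhere.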
If your application is running on GCP, the preferred way would be to use the default credentials provided by the Google GCP client libraries. When using the default credentials provider, the clients will use the service account that is associated with your application.

Integrate my module, which has its own DB, into another app

I have a Notes app which contains a DB. I need to use this Notes app as a library or module in my two parent apps. How can I use it in both apps?
AAR:
If I use an .aar, is there a possibility to create or store a database in the aar module?
Sub-module:
If I go for a sub-module, I need to create the tables in each project.
Individual app:
If I go for an individual app, how do I share my DB details?
I guess the best option for you is to use a content provider.
However, content providers are primarily intended to be used by other
applications, which access the provider using a provider client
object. Together, providers and provider clients offer a consistent,
standard interface to data that also handles inter-process
communication and secure data access.
Typically you work with content providers in one of two scenarios; you
may want to implement code to access an existing content provider in
another application, or you may want to create a new content provider
in your application to share data with other applications.
Source: https://developer.android.com/guide/topics/providers/content-provider-basics
After you have created a content provider in your Notes app, other apps can access it through a content resolver. This is exactly the same way you access bookmarks or contacts on Android.
One more suggestion: don't forget to take care of security as well if you want to share only between your own apps.
If you are using a content provider for sharing data between only your
own apps, it is preferable to use the android:protectionLevel
attribute set to signature protection. Signature permissions do not
require user confirmation, so they provide a better user experience
and more controlled access to the content provider data when the apps
accessing the data are signed with the same key.
Source: https://developer.android.com/training/articles/security-tips#ContentProviders
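As a sketch of that advice (the permission and authority names below are made up), the Notes app's manifest could declare a signature-level permission and require it on the provider:

```xml
<!-- In the Notes app's AndroidManifest.xml (placeholder names). -->
<permission
    android:name="com.example.notes.permission.ACCESS_NOTES"
    android:protectionLevel="signature" />

<provider
    android:name=".NotesProvider"
    android:authorities="com.example.notes.provider"
    android:exported="true"
    android:permission="com.example.notes.permission.ACCESS_NOTES" />
```

Each client app, signed with the same key, then declares `<uses-permission android:name="com.example.notes.permission.ACCESS_NOTES" />` and gets access without any user prompt; apps signed with a different key are rejected.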

Client-Server architecture: Login system with JAVA

I have an assignment to build a client-server shop whose checkout system acts like a cashier desk in a supermarket. The main programming language is Java.
Server: I need to deploy a database (SQL) locally using JDBC and phpMyAdmin (already done).
Client: There are two types of users: Cashier and Manager. The Cashier is able to access only the checkOut() function, where he makes queries to the database to carry out the checkout process for customers. The Manager is able to access getData(), to retrieve a table of the data available in the shop, and addItem(), to permit re-stocking of a particular product.
My question is: since I need to create a login system on the client side in Java (using a Java GUI as well) that grants access to different functions for different types of user, what kind of methods could I use to implement it? And do I also need to run another server besides the one that hosts the database?
Maybe you need to create a separate server to provide Identity and Access Management (IAM). You can take a look at spring-cloud-oauth2 with an API gateway in front of it. For the API gateway, spring-cloud-zuul is a good choice.
1) The problem of granting access can be solved in two ways. One is to use a built-in framework like Spring Security or Apache Shiro. The other is to verify the permissions and roles on your own: load the union of the permissions available in the roles assigned to the logged-in user, and check them in the methods you want to place behind authorized access. You can implement the login itself using a token-based (OAuth 2.0) system.
2) The database should be deployed on a server system so that multiple application instances can access it.
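A minimal sketch of the hand-rolled role check described in 1) - all names are made up, and a real system would back this with the database and proper authentication:

```java
import java.util.EnumSet;
import java.util.Set;

public class AccessControl {
    enum Permission { CHECK_OUT, GET_DATA, ADD_ITEM }

    enum Role {
        // Each role carries the union of permissions it grants.
        CASHIER(EnumSet.of(Permission.CHECK_OUT)),
        MANAGER(EnumSet.of(Permission.GET_DATA, Permission.ADD_ITEM));

        private final Set<Permission> permissions;
        Role(Set<Permission> permissions) { this.permissions = permissions; }
    }

    // Called at the top of each protected method with the logged-in user's role.
    static boolean isAllowed(Role role, Permission permission) {
        return role.permissions.contains(permission);
    }

    public static void main(String[] args) {
        System.out.println(isAllowed(Role.CASHIER, Permission.CHECK_OUT)); // true
        System.out.println(isAllowed(Role.CASHIER, Permission.ADD_ITEM));  // false
        System.out.println(isAllowed(Role.MANAGER, Permission.GET_DATA));  // true
    }
}
```

A framework like Spring Security gives you the same check declaratively, plus session and password handling, which is why it is usually the better choice.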

Google Cloud DataStore Clarification

I know the App Engine Datastore, and what Cloud Datastore is, but using Cloud Datastore has always been a point of confusion for me. Following are my questions:
Does Cloud Datastore require Compute Engine?
(In the Google docs I saw something like "enable Compute Engine".)
How do I access Cloud Datastore from an App Engine application? (This is very much needed.)
How do I enable multiple applications to access this common Datastore?
Note: I know how to activate it, but I didn't get clear answers to my questions above.
Your questions:
1) No, Cloud Datastore can also be used by other platforms.
2) App Engine has native support for using the (Cloud) Datastore. Cloud Datastore is based on the App Engine Datastore, to make it available to others.
3) You can share the Cloud Datastore, but see this issue.
At last, I am able to access Cloud Datastore from an App Engine application. Following are my answers.
Answer to question 1:
Cloud Datastore doesn't require a Compute Engine instance.
Answer to question 2:
To access Cloud Datastore, you need either service account credentials, or an access token for an authenticated user with the scopes set to Datastore and user email.
Use the Google API client library, or simply use the ProtoBuf library provided in the documentation.
Answer to question 3:
Simply create credentials from the application you want to access and use them in the other applications.
https://developers.google.com/datastore/docs/getstarted/overview
Thanks!

Can a second GAE application access the datastore of a primary application?

If I had an application that stored information in its datastore. Is there a way to access that same datastore from a second application?
Yes you can, with the Remote APIs.
For example, you can use Remote API to access a production datastore
from an app running on your local machine. You can also use Remote API
to access the datastore of one App Engine app from a different App
Engine app.
You need to configure the servlet (see the documentation for that) and import the appengine-remote-api.jar into your project (you can find it in ..\appengine-java-sdk\lib\).
Just remember that ancestor queries do not work with the Remote API (see this).
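The servlet configuration mentioned above looks roughly like this in the web.xml of the app that owns the datastore (this matches the App Engine docs; double-check the class name against your SDK version):

```xml
<!-- Expose the Remote API endpoint so other apps can reach the datastore. -->
<servlet>
  <servlet-name>RemoteApiServlet</servlet-name>
  <servlet-class>com.google.apphosting.utils.remoteapi.RemoteApiServlet</servlet-class>
</servlet>
<servlet-mapping>
  <servlet-name>RemoteApiServlet</servlet-name>
  <url-pattern>/remote_api</url-pattern>
</servlet-mapping>
```

The second app then connects to /remote_api using the RemoteApiInstaller from appengine-remote-api.jar.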
You didn't mention why you wanted to access the datastore of one application from another, but depending on the nature of your situation, App Engine modules might be a solution. These are structurally similar to separate applications, but they run under the same application "umbrella" and can access a common datastore.
You cannot directly access the datastore of another application. Your application must actively serve that data in order for another application to be able to access it. The easiest way to achieve this is via the Remote API, which needs a piece of code installed in order to serve the data.
If you would like to have two separate code bases (even serving different hostnames/URLs), then see the new App Engine Modules. They give you the ability to run totally different code on separate URLs and with different runtime settings (instances), while still being under one application and sharing all stateful services (datastore, task queue, memcache...).
