Is it possible to set limits for Google API credentials, restricting an API key's or token's access to a specific spreadsheet or folder? That way those credentials couldn't access all the information in the account, only the specified files.
Sure!
The Google Cloud Platform has robust tools to manage access to all sorts of things, including API credential access.
GCP IAM - Cloud Permissions and Access
You can create a cloud service that responds with your key, authorizing only certain services to receive/request the key.
Here's the GCP IAM documentation. Follow their instructions, either via the frontend cloud console or the command line tools, to set a policy for your API key service.
Here's the gist of what you'll do for IAM:
1. Authorize the various Google APIs for your project.
2. Create a service account for the API key service, e.g. my-api-key@see-the-docs-for-google-service-domain.
3. For each of your apps that needs the service key, create another service account, e.g. my-app@see-the-docs-for...
4. Give each app's service account your chosen access level/permission on the service account you created for your API key service; you're authorizing each app to access the api-key-service.
5. Deploy a simple Flask service that sends your API key, running as your api-key-service account.
6. Access the API credentials within your apps, which have been given IAM permissions of their own (remember, you authorized your apps in step 4).
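The Flask service mentioned above can be tiny. A minimal sketch, assuming the key is handed to the service via an environment variable (the variable name API_KEY_VALUE and the /api-key route are made up for illustration; in GCP it's the IAM policy, not this code, that controls which service accounts may reach the endpoint):

```python
import os

from flask import Flask, jsonify

app = Flask(__name__)

# Hypothetical variable name; in a real deployment the key could also be
# fetched from Secret Manager instead of the environment.
API_KEY = os.environ.get("API_KEY_VALUE", "demo-key")


@app.route("/api-key")
def api_key():
    # IAM (e.g. invoker permissions on the service) gates access;
    # the handler itself just returns the key.
    return jsonify({"api_key": API_KEY})

# To run locally: flask --app this_module run
```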
On disk
For credentials stored on disk, it's best to encrypt them and decrypt on demand in the app.
See this SO answer. If you encrypt your keys, go ahead and add them to version control. Otherwise, avoid it.
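As a sketch of the encrypt/decrypt-on-demand pattern, here's one way using the third-party cryptography package's Fernet recipe (my choice of scheme, not necessarily what the linked SO answer uses). The ciphertext is what you'd keep on disk or in version control, while the Fernet key itself lives elsewhere (an env var, KMS, etc.):

```python
from cryptography.fernet import Fernet

# In real use, load this key from an environment variable or a KMS,
# never from the same repo as the encrypted credentials.
key = Fernet.generate_key()
f = Fernet(key)

# Encrypt once; store the ciphertext on disk / in version control.
ciphertext = f.encrypt(b"my-google-api-credentials")

# Decrypt on demand, in app, whenever the credential is needed.
plaintext = f.decrypt(ciphertext)
```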
Secrets Manager or Berglas
However, I recommend you use either the open source Berglas tool or Google's managed Secrets product. You'll essentially give the secrets manager your api key, store it, then fetch it when necessary in-app or at load.
Adapted from the Google Cloud Documentation, almost verbatim:
# Import the Secret Manager client library.
from google.cloud import secretmanager_v1beta1 as sm
# GCP project in which to store secrets in Secret Manager.
project_id = 'YOUR_PROJECT_ID'
# ID of the secret to create.
secret_id = 'YOUR_SECRET_ID'
# Create the Secret Manager client.
client = sm.SecretManagerServiceClient()
# Build the parent name from the project.
parent = client.project_path(project_id)
# Create the parent secret
secret = client.create_secret(parent, secret_id, {
    'replication': {
        'automatic': {},
    },
})
# Add the api key
version = client.add_secret_version(secret.name, {'data': b'my-google-api-credentials'})
# Access the api key
response = client.access_secret_version(version.name)
# Now you have your decoded api credentials you can use for authentication
payload = response.payload.data.decode('UTF-8')
I changed some of the comments above, but be sure to check Google's documentation and their GitHub examples.
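Note that the example above uses the v1beta1 client. With the GA library (google-cloud-secret-manager 2.x), calls take a single request dict instead. Here's a minimal sketch of the access step, written so a client can be injected; the access_secret helper is my own construction for illustration, not something from Google's docs:

```python
def access_secret(name, client=None):
    """Return the payload of a secret version as text.

    `name` looks like: projects/PROJECT/secrets/SECRET/versions/latest
    """
    if client is None:
        # Imported lazily so the helper can also be exercised with a fake client.
        from google.cloud import secretmanager
        client = secretmanager.SecretManagerServiceClient()
    response = client.access_secret_version(request={"name": name})
    return response.payload.data.decode("UTF-8")
```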
If you're more adventurous, the Berglas library is fantastic and I use it directly in several projects, via its Go client locally and its docker image within deployed services.
Related
What's the best practice to grant application code (during local development) access to Google Cloud resources,
without generating and downloading a Service Account JSON key file (and setting the GOOGLE_APPLICATION_CREDENTIALS environment variable),
and without giving my Google user direct access to the resources?
I have the following 2 options I could potentially use:
gcloud auth application-default login together with the --impersonate-service-account=SERVICE_ACCOUNT_EMAILS flag
OR, use gcloud auth application-default login normally, authenticating with MY Google user account to generate application default credentials, and update my application code as in this reference code:
import google.auth
from google.auth import impersonated_credentials
from google.cloud import storage

# google.auth.default() returns a (credentials, project) tuple
source_credentials, _ = google.auth.default()
target_credentials = impersonated_credentials.Credentials(
    source_credentials=source_credentials,
    target_principal='SERVICE_ACCOUNT_EMAILS',
    target_scopes=['https://www.googleapis.com/auth/cloud-platform'])  # target_scopes is required
client = storage.Client(credentials=target_credentials)
Are there any pros and cons to each approach?
In python, I'm trying to call the GMail API via a service account with Delegated domain-wide authority, without using SERVICE_ACCOUNT_FILE.
My objective is to avoid creating a secret Key for the service account. Instead, I gave the Service Account Token Creator role to the process owner (me in local dev, App Engine Service Account in prod).
In the code below I successfully get and use an access token for the service account, without any SERVICE_ACCOUNT_FILE.
from google.cloud.iam_credentials_v1 import IAMCredentialsClient
from google.oauth2.credentials import Credentials
import googleapiclient.discovery
tk = IAMCredentialsClient().generate_access_token(
    name=f'projects/-/serviceAccounts/{service_id}',
    scope=['https://www.googleapis.com/auth/gmail.insert'],
    # subject='admin@my.domain' doesn't work here :'(
)
service = googleapiclient.discovery.build('gmail', 'v1', credentials=Credentials(tk.access_token))
response = service.users().messages().insert(userId='user@my.domain', body=body).execute()
Problem is, after granting permissions to the service account in Google Admin of my.domain, I get the following error:
{'message': 'Precondition check failed.', 'domain': 'global', 'reason': 'failedPrecondition'}
I suspect that what I am missing is the subject, i.e. the email of an admin at my.domain.
I know I can provide the subject by constructing the credentials differently:
from google.oauth2 import service_account
import googleapiclient.discovery

credentials = service_account.Credentials.from_service_account_file(key_file, scopes=scopes)
delegated_credentials = credentials.with_subject('admin@my.domain')  # <- I need this
service = googleapiclient.discovery.build('gmail', 'v1', credentials=delegated_credentials)
But this requires the creation of a secret key, which I'd like to avoid.
Is there a way to pass the subject in the first code example above?
AFAIK this is not possible
Would be happy to be proved wrong on this!
I place this as a disclaimer because you are right: the documentation does not explicitly deny the possibility. It's not generally a requirement for documentation to explicitly list everything you cannot do with it, though I do see the potential for confusion here, so it might be worth using "Send Feedback" on the documentation page.
What the instructions say
Preparing to make an authorized API call
After you obtain the client email address and private key from the API Console ...
source
In no place does it say "this is an optional step". This is also true in other places like the Directory API instructions on domain-wide delegation, or the Reports API
The Recommendation to not use Keys
The following is from the article you linked:
Use service account keys only if there is no viable alternative
A service account key lets an application authenticate as a service
account, similar to how a user might authenticate with a username and
password.
source
When you attach a service account, for example to an App Engine instance, the instance is like a user and the service account serves as its user account, a sort of identity for the instance. More info on that in this other StackOverflow thread.
So the only way for a service account to act as-if it were another account, is with a key. Likewise the only way for a user to act as-if it were a service account is with a key. Therefore, if you are trying to enable domain-wide delegation, which presupposes that you want to have a service account act as-if it were other accounts, having a key is essential, whether the service account is attached to an App Engine instance or not.
Granted, it's not mentioned explicitly that you cannot achieve this without a key, though it doesn't mention that you can either. The tests I have run produced results similar to yours, so it would seem that you just can't. As mentioned in the disclaimer, I would be happy to be proved wrong; however, I suspect that it would only be possible via a vulnerability, not intended behavior.
You haven't mentioned why you need domain-wide delegation of authority, so you may want to evaluate whether you really need it. Usually all that is needed is the OAuth flow, in which an app acts on behalf of a user, not as-if it were the user.
References
Access to Google APIs with Service Accounts
How to Authenticate Service Accounts
Directory API instructions on domain-wide delegation
Reports API instructions on domain-wide delegation
"How to assign multiple service accounts to cloud functions" (you can't).
We're running a server on AWS that will be using a few constants. These constants may be details that are confidential like a few API tokens, Client secrets & even DB credentials. We have been saving these details in one of our files on the server itself (say Credentials.js). So,
What is the best possible way to store these Credentials and in a secure manner.
We were also planning to switch to the AWS SSM Parameter Store. Is it worth considering? It also provides KMS encryption for confidential parameters.
Even if we do switch to the AWS SSM Parameter Store, we will have to call it multiple times when making requests to third-party application servers (as we'll need the API tokens for those apps). Does this justify the cost we'll pay for SSM (considering we take the Standard store with high throughput)?
Also, please let me know if there are alternatives for storing these parameters securely.
Thanks.
Secrets Manager
Secrets Manager enables you to replace hardcoded credentials in your code, including passwords, with an API call to Secrets Manager to retrieve the secret programmatically. This helps ensure the secret can't be compromised by someone examining your code, because the secret no longer exists in the code. Also, you can configure Secrets Manager to automatically rotate the secret for you according to a specified schedule. This enables you to replace long-term secrets with short-term ones, significantly reducing the risk of compromise.
To get an overview of how it looks, see AWS Secrets Manager: Store, Distribute, and Rotate Credentials Securely.
Cost
See Pricing. $0.40 USD per secret per month and $0.05 per 10,000 API calls.
Documents
Tutorials - Start here to get the ideas
secrets_getsecretvalue.js - Example to get secrets in JS
JS SDK for Secret Manager - Look further here to know the JS SDK
CreateSecret API - the AWS API to create a secret, for detailed reference
Create a secret via the AWS console or using SDK. See Creating a secret. A secret is a key/value pair where the value is in JSON format.
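In-app, fetching a secret comes down to a single GetSecretValue call. A Python sketch with boto3 (the get_secret helper and its injectable client are my own illustrative pattern, not an official AWS one; the underlying call is client.get_secret_value(SecretId=...)):

```python
import json


def get_secret(secret_name, client=None):
    """Fetch a secret from AWS Secrets Manager and parse its JSON value."""
    if client is None:
        # Imported lazily; requires boto3 and AWS credentials at runtime.
        import boto3
        client = boto3.client("secretsmanager")
    response = client.get_secret_value(SecretId=secret_name)
    return json.loads(response["SecretString"])

# creds = get_secret("prod/db-credentials")  # hypothetical secret name
```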
Alternatives
Hashicorp Vault
Static Secrets: Key/Value Secrets Engine
Vault JS client
Lambda
Use a Lambda that only accepts access from callers with a specific IAM role/permission attached to the IAM profile of the EC2 instance running your app.
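Sketched as a Lambda handler in Python (the environment variable name APP_API_TOKEN is made up; the actual access control lives in the function's resource policy and IAM, not in this code):

```python
import os


def handler(event, context):
    # IAM decides who may invoke this function; by the time execution
    # reaches here, the caller has already been authorized.
    token = os.environ.get("APP_API_TOKEN", "demo-token")
    return {"statusCode": 200, "body": token}
```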
Others
Just Googling "parameter store for secret management" shows a bunch of articles and how-tos. Please do the research first.
I have a pretty standard application written in Java which also runs queries against a DB. The application resides on GCP and the DB on Atlas.
For understandable reasons, I don't want to keep the username and password for the DB in the code.
So option number 1 that I had in mind, is to pass the username and password as environment variables to the application container in GCP.
Option number 2 is using Secret Manager in GCP and store my username and password there, and pass the GCP Credentials as an environment variable to the application container in GCP.
My question is, what is the added value of option number 2, if any? It seems that option 2 is even worse from a security standpoint, since if a hacker gets the Google credentials, they have access to all of the secrets stored in Secret Manager.
I don't know what the best practices are or what is advised in such cases. Thank you for your help.
Having credentials in GCP Secret Manager helps you keep track of all your secrets and their changes in a centralized location, and access them globally from any of your apps.
For a standard application where a single Java app connects to a DB, it may not add much value.
You may look into Kubernetes Secrets for that reason.
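For example, a Kubernetes Secret holding the DB credentials can be injected into the container as environment variables (all names below are illustrative):

```yaml
apiVersion: v1
kind: Secret
metadata:
  name: db-credentials
type: Opaque
stringData:
  DB_USER: myuser
  DB_PASSWORD: mypassword
---
# In the Deployment's container spec, reference it with:
#   envFrom:
#     - secretRef:
#         name: db-credentials
```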
If your application resides in GCP, you don't need a service account key file (which is your security concern, and you are right. I wrote an article on this)
TL;DR use ADC (Application Default Credential) to automatically get the service account credential provided automatically on Google Cloud Component (look at metadata server for more details).
Then grant this component's identity (default or user-defined, where supported), i.e. the service account email, access to your secrets.
And that's all! You have no secrets in your code or your environment variables: neither a login/password nor a service account key file.
If you have difficulties using ADC in Java, don't hesitate to share your code. I'll be able to help you achieve this.
To use Secret Manager on Google Cloud you need to install the Secret Manager Java SDK libraries. This documentation shows how to get started with the Cloud Client Libraries for the Secret Manager API; you only need to go to the Java section.
These libraries help you access your keys so they can be used by your app.
The following link shows how to get details about a secret by viewing its metadata. Keep in mind that viewing a secret's metadata requires the Secret Viewer role (roles/secretmanager.viewer) on the secret, project, folder, or organization.
I recommend you create a dedicated service account to handle the proper permissions for your app, because if you don't have a SA defined, the default SA is what is going to make the request, and that is not secure. You can learn more about how to create a service account in this link.
On the other hand, the following guide contains a good example of finding your credentials automatically, which is more convenient and secure than manually passing credentials.
I can't find a way to get a working signed URL on the Google App Engine Standard environment with Python 3.7.
I have look at the documentation here :
https://cloud.google.com/storage/docs/access-control/signing-urls-manually
Within a Google App Engine application, you can use the App Engine App Identity service to sign your string.
But the App Engine App Identity service relies on the google.appengine package, which is not available in the Python 3.7 environment, as explained here:
Proprietary App Engine APIs are not available in Python 3. This section lists recommended replacements.
The overall goal is that your app should be fully portable and run in any standard Python environment. You write a standard Python app, not an App Engine Python app. As part of this shift, you are no longer required to use proprietary App Engine APIs and services for your app's core functionality. At this time, App Engine APIs are not available in the Python 3.7 runtime.
All the APIs in the SDK rely on google.appengine and raise an exception in the Python 3.7 environment, EnvironmentError('The App Engine APIs are not available.'), raised here, which relies on the proprietary API:
try:
    from google.appengine.api import app_identity
except ImportError:
    app_identity = None
I know I can use solutions like ServiceAccountCredentials.from_json_keyfile_dict(service_account_dict), but then I'd have to upload a file with credentials directly to App Engine, and I can't do that since the project credentials would be exposed in git or CI.
I really want to rely on the default credentials from App Engine, like other Google Cloud APIs such as storage.Client(), which work out of the box.
Any suggestion ?
For Python interactions with Google Cloud, use the Python Client library, which is supported on the App Engine standard Python 3 runtime.
To access Cloud Storage using google-cloud-storage from App Engine Standard:
Add the dependency to requirements.txt: google-cloud-storage==1.14.0
Use the Storage Client library, authenticating with storage.Client() only.
Depending on what you need to achieve, I would also suggest trying different possible approaches:
Allow anonymous access for public data stored in the bucket.
For a signed URL, call the API method projects.serviceAccounts.signBlob. The documentation includes examples.
It is important to grant the correct permissions for the service account to create tokens.
You can also check how to use the API, as explained on SO.
This example explains how to implement signing of the bucket URL using Python.
It is also possible to sign blobs with appengine api using:
google.appengine.api.app_identity.sign_blob()
This question might be old, but it's one the first ones to show on a Google search, so I thought it might help someone who comes looking in the future to post this here as well.
The answer @guillaume-blaquiere posted here does work, but it requires an additional step not mentioned: adding the Service Account Token Creator role in IAM to your default service account, which allows said default service account to "Impersonate service accounts (create OAuth2 access tokens, sign blobs or JWTs, etc.)".
This allows the default service account to sign blobs, as per the signBlob documentation.
I tried it on AppEngine and it worked perfectly once that permission was given.
import datetime as dt

from google import auth
from google.auth.transport import requests as auth_requests
from google.cloud import storage

# SCOPES = [
#     "https://www.googleapis.com/auth/devstorage.read_only",
#     "https://www.googleapis.com/auth/iam",
# ]

credentials, project = auth.default(
    # scopes=SCOPES
)
credentials.refresh(auth_requests.Request())

expiration_timedelta = dt.timedelta(days=1)

storage_client = storage.Client(credentials=credentials)
bucket = storage_client.get_bucket("bucket_name")
blob = bucket.get_blob("blob_name")

signed_url = blob.generate_signed_url(
    expiration=expiration_timedelta,
    service_account_email=credentials.service_account_email,
    access_token=credentials.token,
)
I downloaded a key for the AppEngine default service account to test locally, and in order to make it work properly outside of the AppEngine environment, I had to add the proper scopes to the credentials, as per the commented lines setting the SCOPES. You can ignore them if running only in AppEngine itself.