Integrate Google Secret Manager with Google App Engine in Python 3

I get an error when I try to access Secret Manager from Google App Engine Standard, but accessing it from my laptop with the JSON key works fine.
requirements.txt:
Flask
google-cloud-storage
google-cloud-secret-manager
psycopg2
app.yaml:
runtime: python39
automatic_scaling:
  max_instances: 2
  max_idle_instances: 1
  target_cpu_utilization: 0.9
  target_throughput_utilization: 0.9
  max_concurrent_requests: 80
Code:
from flask import Flask, render_template, request
from werkzeug.utils import secure_filename
import os

app = Flask(__name__)

@app.route('/s1', methods=['GET'])
def s1():
    from google.cloud import secretmanager
    client = secretmanager.SecretManagerServiceClient()
    return client.access_secret_version(
        request={"name": "projects/363745113141/secrets/API_OCR/versions/1"}
    ).payload.data.decode("UTF-8")

if __name__ == '__main__':
    app.run()
error 1:
grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
at ._end_unary_response_blocking ( /layers/google.python.pip/pip/lib/python3.9/site-packages/grpc/_channel.py:849 )
at .__call__ ( /layers/google.python.pip/pip/lib/python3.9/site-packages/grpc/_channel.py:946 )
at .error_remapped_callable ( /layers/google.python.pip/pip/lib/python3.9/site-packages/google/api_core/grpc_helpers.py:50 )
error 2:
google.api_core.exceptions.PermissionDenied: 403 Permission 'secretmanager.versions.access' denied for resource 'projects/363745113141/secrets/API_OCR/versions/1' (or it may not exist).
at .error_remapped_callable ( /layers/google.python.pip/pip/lib/python3.9/site-packages/google/api_core/grpc_helpers.py:52 )
at .retry_target ( /layers/google.python.pip/pip/lib/python3.9/site-packages/google/api_core/retry.py:190 )
at .retry_wrapped_func ( /layers/google.python.pip/pip/lib/python3.9/site-packages/google/api_core/retry.py:283 )
at .__call__ ( /layers/google.python.pip/pip/lib/python3.9/site-packages/google/api_core/gapic_v1/method.py:154 )
at .access_secret_version ( /layers/google.python.pip/pip/lib/python3.9/site-packages/google/cloud/secretmanager_v1/services/secret_manager_service/client.py:1439 )
at .accessSecret ( /srv/lib/secretManager.py:18 )
at .s1 ( /srv/main.py:39 )
at .dispatch_request ( /layers/google.python.pip/pip/lib/python3.9/site-packages/flask/app.py:1509 )
at .full_dispatch_request ( /layers/google.python.pip/pip/lib/python3.9/site-packages/flask/app.py:1523 )
at .full_dispatch_request ( /layers/google.python.pip/pip/lib/python3.9/site-packages/flask/app.py:1525 )
at .wsgi_app ( /layers/google.python.pip/pip/lib/python3.9/site-packages/flask/app.py:2077 )

In order to have access to a secret version, you must grant access to it by adding the proper role.
As described in the documentation Accessing a secret version:
Accessing a secret version requires the Secret Manager Secret Accessor role (roles/secretmanager.secretAccessor) on the secret, project, folder, or organization.
In this tutorial, Share your secrets between your teams and applications with Secret Manager on Google Cloud Platform, it shows how to give access to a given secret:
If we want to control access to our secrets from the Secret Manager console, we have to select one secret and press the “Show Info Panel” button.
In the info panel, we can review the current permissions and add a new member.
And from the info panel, we can modify the access (only if the role is not inherited).
Here we add the App Engine default service account (or the account you've created to run your App Engine app, if you did so).
In the Viewing the App Engine default service account documentation, you can view your service accounts:
In the Cloud console, go to the Service accounts page.
Go to Service accounts
Select your project.
In the list, locate the email address of the App Engine default service account:
YOUR_PROJECT_ID@appspot.gserviceaccount.com
See also:
Changing service account permissions
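If you prefer the command line to the console steps above, the same grant can be made with gcloud; a sketch, assuming the secret name from the question (replace the member with your own App Engine service account):

```shell
# Grant the App Engine default service account access to this one secret only
gcloud secrets add-iam-policy-binding API_OCR \
    --member="serviceAccount:YOUR_PROJECT_ID@appspot.gserviceaccount.com" \
    --role="roles/secretmanager.secretAccessor"
```

Granting the role on the individual secret, rather than the whole project, keeps the blast radius small if the service account is ever compromised.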
Another approach is to use the account you've been using to do the tests on your laptop (the account used to generate the JSON key) as the account to run your application on App Engine:
By adding the account on your app.yaml file:
service_account: {SERVICE_ACCOUNT_YOU_WANT_TO_USE_TO_ACCESS_APP_ENGINE}
By running the following command to deploy your application:
gcloud beta app deploy --service-account=<your_service_account> app.yaml
The App Engine app's identity is no longer restricted to the App Engine default service account. You can now deploy each App Engine app with a custom service account by following https://cloud.google.com/appengine/docs/standard/python/user-managed-service-accounts#app.yaml.
These approaches are taken from the following question: Custom service account for App Engine.
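Once the role is granted, the handler from the question can be tidied into reusable functions; a minimal sketch using the project number and secret name from the question (`secret_version_name` and `get_secret` are illustrative helper names, not part of the library):

```python
def secret_version_name(project_id, secret_id, version="latest"):
    # Build the fully qualified resource name expected by access_secret_version
    return f"projects/{project_id}/secrets/{secret_id}/versions/{version}"

def get_secret(project_id, secret_id, version="latest"):
    # Lazy import so the module still loads where the library is absent
    from google.cloud import secretmanager
    client = secretmanager.SecretManagerServiceClient()
    name = secret_version_name(project_id, secret_id, version)
    response = client.access_secret_version(request={"name": name})
    return response.payload.data.decode("UTF-8")

# Example (works on App Engine once the role above is granted):
#   get_secret("363745113141", "API_OCR", "1")
```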

Related

Azure App Service to Key Vault - Managed Identity Access Issue using Python

I have a setup as below.
Azure App Service connects to Azure Key Vault using Azure-Identity Python SDK. Azure KV's data plane has access policy enabled to include App Service's managed identity. I have used the following code to connect to KV from app service webjobs.
import os
import json
from azure.identity import DefaultAzureCredential, ManagedIdentityCredential
from azure.keyvault.secrets import SecretClient
keyVaultName = os.getenv('KeyVaultNm')
KVUri = f"https://{keyVaultName}.vault.azure.net"
credential = DefaultAzureCredential()
_sc = SecretClient(vault_url=KVUri, credential=credential)
srv = _sc.get_secret("server-nm").value
print(f'KV Secret: {srv}')
This setup had been working without any issues for almost a year, allowing the App Service to extract secrets from KV, but over the last weekend it started failing with the following error.
[08/29/2022 09:14:14 > 6f77be: ERR ] ManagedIdentityCredential.get_token failed: <urllib3.connection.HTTPConnection object at 0x00000234348D8AC8>: Failed to establish a new connection:
[WinError 10049] The requested address is not valid in its context
[08/29/2022 09:14:14 > 6f77be: ERR ] DefaultAzureCredential failed to retrieve a token from the included credentials.
[08/29/2022 09:14:14 > 6f77be: ERR ] Attempted credentials:
Any idea what this error is about? Could this happen due to a firewall issue?
PS: The same code works well inside a VM using its managed identity and retrieves secrets from KV. The issue happens only when the code is run from Webjobs.
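To narrow down which credential in the chain is failing, it can help to bypass DefaultAzureCredential and use ManagedIdentityCredential directly, with debug logging enabled; a sketch, assuming the same KeyVaultNm environment variable from the question (`vault_url` and `read_secret` are illustrative helpers):

```python
import logging
import os

def vault_url(vault_name):
    # Key Vault data-plane endpoint for a given vault name
    return f"https://{vault_name}.vault.azure.net"

def read_secret(secret_name):
    # Lazy imports so this module loads even without the Azure SDK installed
    from azure.identity import ManagedIdentityCredential
    from azure.keyvault.secrets import SecretClient

    # Surface the token-request details that DefaultAzureCredential hides
    logging.basicConfig(level=logging.DEBUG)

    credential = ManagedIdentityCredential()
    client = SecretClient(vault_url=vault_url(os.environ["KeyVaultNm"]),
                          credential=credential)
    return client.get_secret(secret_name).value

# Example (inside the WebJob): read_secret("server-nm")
```

The WinError 10049 in the log points at the local IMDS/token endpoint being unreachable from the WebJob sandbox, so the debug output of the token request is the most useful thing to capture.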

How to access azure AD using python SDK

I am new to Azure. I want to write a Python function to access Azure AD and list the existing groups there, but I am facing issues logging into Azure. I have been working in AWS, where I use boto3 as the SDK with command-line or programmatic access. Following is the code that I have:
from azure.graphrbac import GraphRbacManagementClient
from azure.common.credentials import UserPassCredentials
# See above for details on creating different types of AAD credentials
credentials = UserPassCredentials(
    'user@domain.com',  # The user id I use to log in to my personal account
    'my_password',      # password of that account
    resource="https://graph.windows.net"
)
tenant_id = "82019-1-some-numbers"
graphrbac_client = GraphRbacManagementClient(
    credentials,
    tenant_id
)
I want to know the professional way of logging into Azure. How do I list the groups present in my Azure AD, and what code changes do I have to make for that?
Snapshot of the API permission
To retrieve the list of Azure AD groups, make sure to grant Directory.Read.All to your application, like below:
Go to Azure Portal -> Azure Active Directory -> App Registrations -> Your App -> API permissions
You can make use of the below script by Krassy, from this SO thread, to get the list of Azure AD groups:
from azure.common.credentials import ServicePrincipalCredentials
from azure.graphrbac import GraphRbacManagementClient
credentials = ServicePrincipalCredentials(
    client_id="Client_ID",
    secret="Secret",
    resource="https://graph.microsoft.com",
    tenant='tenant.onmicrosoft.com'
)
tenant_id = 'tenant_id'
graphrbac_client = GraphRbacManagementClient(credentials, tenant_id)
groups = graphrbac_client.groups.list()
for g in groups:
    print(g.display_name)
For more detail, please refer to the links below:
Azure Python SDK - Interact with Azure AD by Joy Wang
How to access the list of Azure AD Groups using python by Will Shao - MSFT
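Note that GraphRbacManagementClient and azure.common.credentials are deprecated, and the Azure AD Graph endpoint they target has been retired in favour of Microsoft Graph. A sketch of the equivalent call using msal and requests (both assumed to be installed; `authority_url` and `list_group_names` are illustrative helpers, not SDK names):

```python
def authority_url(tenant):
    # Azure AD (Entra ID) token authority for a tenant ID or domain
    return f"https://login.microsoftonline.com/{tenant}"

def list_group_names(tenant_id, client_id, client_secret):
    # Lazy imports so this module loads without the packages installed
    import msal
    import requests

    app = msal.ConfidentialClientApplication(
        client_id,
        authority=authority_url(tenant_id),
        client_credential=client_secret,
    )
    # Client-credentials flow; requires Directory.Read.All (application) consent
    token = app.acquire_token_for_client(
        scopes=["https://graph.microsoft.com/.default"]
    )
    resp = requests.get(
        "https://graph.microsoft.com/v1.0/groups",
        headers={"Authorization": f"Bearer {token['access_token']}"},
    )
    resp.raise_for_status()
    return [g["displayName"] for g in resp.json()["value"]]
```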

how to authenticate to google text-to-speech with service account

I am trying to use Google Text-to-Speech and other translation services in my Node.js app, but when I connect to the Google API I get this error message:
"Your application has authenticated using end user credentials from the Google Cloud SDK or Google Cloud Shell which are not supported by the texttospeech.googleapis.com. We recommend configuring the billing/quota_project setting in gcloud or using a service account through the auth/impersonate_service_account setting. For more information about service accounts and how to use them in your application, see https://cloud.google.com/docs/authentication/. If you are getting this error with curl or similar tools, you may need to specify 'X-Goog-User-Project' HTTP header for quota and billing purposes. For more information regarding 'X-Goog-User-Project' header, please check https://cloud.google.com/apis/docs/system-parameters.",
metadata: Metadata {
  internalRepr: Map(2) {
    'google.rpc.errorinfo-bin' => [Array],
    'grpc-status-details-bin' => [Array]
  },
  options: {}
},
note: 'Exception occurred in retry method that was not classified as transient'
}
So after much research, I tried to verify that I am authenticating using my service account credentials. I ran this command:
gcloud auth activate-service-account --key-file=./auth/service_acct_key.json
and it shows this
Activated service account credentials for: [firebase-adminsdk-uwecx@xxxxx.iam.gserviceaccount.com]
but when I run the server again
node server.js
I still got the error
What is causing this error, and how can I authenticate correctly?
With the gcloud CLI, you have two levels of authentication:
The CLI level
The Google Cloud Auth library level (also named ADC, for Application Default Credentials)
When you run the command gcloud auth ..., you are at the CLI level.
When you run the command gcloud auth application-default ..., you are at the ADC level.
In your case, you only set the authentication at the CLI level, and, of course, that authentication isn't detected in your Node app, which uses the Google Cloud libraries and searches for credentials at the ADC level.
When you use a service account key file (which is a bad practice, but too often proposed and shared in tutorials, even in Google Cloud tutorials), you have to set the environment variable GOOGLE_APPLICATION_CREDENTIALS to the absolute path of your service account key file. Try this:
export GOOGLE_APPLICATION_CREDENTIALS=/path/to/auth/service_acct_key.json
node server.js
It should work.
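Alternatively, to authenticate at the ADC level without exporting a key-file path at all (useful for local development with your own user account):

```shell
# Writes ADC to a well-known location that the client libraries
# check automatically, so no environment variable is needed
gcloud auth application-default login
```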

What credentials to pass, and how, using the Python Client Library for the GCP Compute API

I want to get a list of all instances in a project using the Python Google API client library, google-api-python-client==1.7.11.
I am trying to connect using the method googleapiclient.discovery.build; this method requires credentials as an argument.
I read the documentation but did not find the credential format or which credentials it requires.
Can anyone explain which credentials are needed and how to pass them to make the GCP connection?
The credentials that you need are called "Service Account JSON Key File". These are created in the Google Cloud Console under IAM & Admin / Service Accounts. Create a service account and download the key file. In the example below this is service-account.json.
Example code that uses a service account:
from googleapiclient import discovery
from google.oauth2 import service_account

scopes = ['https://www.googleapis.com/auth/cloud-platform']
sa_file = 'service-account.json'
zone = 'us-central1-a'
project_id = 'my_project_id'  # Project ID, not Project Name

credentials = service_account.Credentials.from_service_account_file(sa_file, scopes=scopes)

# Create the Cloud Compute Engine service object
service = discovery.build('compute', 'v1', credentials=credentials)

request = service.instances().list(project=project_id, zone=zone)
while request is not None:
    response = request.execute()
    for instance in response['items']:
        # TODO: Change code below to process each `instance` resource:
        print(instance)
    request = service.instances().list_next(previous_request=request, previous_response=response)
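The list/list_next pagination loop above can be factored into a small generator; this sketch only reshapes the response dictionaries, so it runs without any credentials (`iter_instances` is an illustrative helper, not part of the library):

```python
def iter_instances(pages):
    # pages: an iterable of instances().list response dicts, e.g. collected
    # by repeatedly calling request.execute() / instances().list_next(...)
    for page in pages:
        # 'items' is absent when a zone has no instances
        for instance in page.get("items", []):
            yield instance
```

With the service object above, `pages` would be produced by the same `execute()`/`list_next()` loop, and the processing code then deals with a flat stream of instances.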
Application Default Credentials are supported by the Google API client libraries automatically. There you can find an example using Python; also check this documentation: Setting Up Authentication for Server to Server Production Applications.
According to GCP's most recent documentation:
we recommend you use Google Cloud Client Libraries for your application. Google Cloud Client Libraries use a library called Application Default Credentials (ADC) to automatically find your service account credentials
In case you still want to set it up manually, you could first create a service account and give it all necessary permissions:
# A name for the service account you are about to create:
export SERVICE_ACCOUNT_NAME=your-service-account-name
# Create service account:
gcloud iam service-accounts create ${SERVICE_ACCOUNT_NAME} --display-name="Service Account for ai-platform-samples repo"
# Grant the required roles:
gcloud projects add-iam-policy-binding ${PROJECT_ID} --member serviceAccount:${SERVICE_ACCOUNT_NAME}@${PROJECT_ID}.iam.gserviceaccount.com --role roles/ml.developer
gcloud projects add-iam-policy-binding ${PROJECT_ID} --member serviceAccount:${SERVICE_ACCOUNT_NAME}@${PROJECT_ID}.iam.gserviceaccount.com --role roles/storage.objectAdmin
# Download the service account key and store it in a file specified by GOOGLE_APPLICATION_CREDENTIALS:
gcloud iam service-accounts keys create ${GOOGLE_APPLICATION_CREDENTIALS} --iam-account ${SERVICE_ACCOUNT_NAME}@${PROJECT_ID}.iam.gserviceaccount.com
Once it's done, check whether the ADC path has been set properly:
echo $GOOGLE_APPLICATION_CREDENTIALS
Having set the ADC path, you don't need to load the service account key from code (which is undesirable), so the code looks as follows:
service = googleapiclient.discovery.build(<API>, <version>,cache_discovery=False)

BigQuery Node.js API Permission Bug

I am building a Node.js server to run queries against BigQuery. For security reasons, I want this server to be read only. For example, if I write a query with a DROP, INSERT, ALTER, etc. statement, my query should get rejected. However, something like SELECT * FROM DATASET.TABLE LIMIT 10 should be allowed.
To solve this problem, I decided to use a service account with "jobUser" level access. According to BQ documentation, that should allow me to run queries, but I shouldn't be able to "modify/delete tables".
So I created such a service account using the Google Cloud Console UI and I pass that file to the BigQuery Client Library (for Node.js) as the keyFilename parameter in the code below.
// Get service account key from .env file
require('dotenv').config()
const BigQuery = require('@google-cloud/bigquery');

// Query goes here
const query = `
  SELECT *
  FROM \`dataset.table0\`
  LIMIT 10
`

// Creates a client
const bigquery = new BigQuery({
  projectId: process.env.BQ_PROJECT,
  keyFilename: process.env.BQ_SERVICE_ACCOUNT
});

// Use standard sql
const query_options = {
  query: query,
  useLegacySql: false,
  useQueryCache: false
}

// Run query and log results
bigquery
  .query(query_options)
  .then(console.log)
  .catch(console.log)
I then ran the above code against my test dataset/table in BigQuery. However, running this code results in the following error message (FYI: exemplary-city-194015 is my project ID for my test account):
{ ApiError: Access Denied: Project exemplary-city-194015: The user test-bq-jobuser@exemplary-city-194015.iam.gserviceaccount.com does not have bigquery.jobs.create permission in project exemplary-city-194015.
What is strange is that my service account (test-bq-jobuser@exemplary-city-194015.iam.gserviceaccount.com) has the 'Job User' role, and the Job User role does contain the bigquery.jobs.create permission. So that error message doesn't make sense.
In fact, I tested out all possible access control levels (dataViewer, dataEditor, ..., admin) and I get error messages for every role except the "admin" role. So either my service account isn't correctly configured or @google-cloud/bigquery has some bug. I don't want to use a service account with 'admin' level access because that allows me to run DROP TABLE-esque queries.
Solution:
I created a service account and assigned it a custom role with bigquery.jobs.create and bigquery.tables.getData permissions. And that seemed to work. I can run basic SELECT queries but DROP TABLE and other write operations fail, which is what I want.
As the error message shows, your service account doesn't have permissions to create BigQuery Job
You need to grant it roles/bigquery.user or roles/bigquery.jobUser access; see BigQuery Access Control Roles. As you can see in that reference, dataViewer and dataEditor don't have Create jobs/queries, but admin does; however, you don't need that.
To grant the required roles, you can follow the instructions in Granting Access to a Service Account for a Resource.
From command line using gcloud, run
gcloud projects add-iam-policy-binding $BQ_PROJECT \
    --member serviceAccount:$SERVICE_ACCOUNT_EMAIL \
    --role roles/bigquery.user
Where BQ_PROJECT is your project ID and SERVICE_ACCOUNT_EMAIL is your service account email/ID
Or, from the Google Cloud Platform console, search for or add your service account email/ID and give it the required roles.
I solved my own problem. To make queries you need both the bigquery.jobs.create and bigquery.tables.getData permissions. The Job User role has the former but not the latter. I created a custom role that has both permissions (and assigned my service account to it), and now it works. I did this using the Google Cloud Console UI (IAM -> Roles -> +Add), then (IAM -> IAM -> set the service account to the custom role).
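The custom-role route described above can also be done from the command line; a sketch (the role ID, project ID, and service account email are placeholders):

```shell
# Create a read-only query role with just the two needed permissions
gcloud iam roles create bqReadOnlyQuery \
    --project=YOUR_PROJECT_ID \
    --title="BigQuery read-only query" \
    --permissions=bigquery.jobs.create,bigquery.tables.getData

# Bind the service account to the new custom role
gcloud projects add-iam-policy-binding YOUR_PROJECT_ID \
    --member=serviceAccount:YOUR_SERVICE_ACCOUNT_EMAIL \
    --role=projects/YOUR_PROJECT_ID/roles/bqReadOnlyQuery
```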
