Connect to GCP SQL using the .json credentials file - python-3.x

I have a PostgreSQL DB at GCP. Right now I can log in using a username and password, e.g.:
import pandas as pd
import pyodbc

conn_str = (
    "DRIVER={PostgreSQL Unicode};"
    "DATABASE=test;"
    "UID=user;"
    "PWD=a_very_strong_password;"
    "SERVER=34.76.yy.xxxx;"
    "PORT=5432;"
)
with pyodbc.connect(conn_str) as con:
    print(pd.read_sql("SELECT * from entries", con=con))
Is there a way to use the .json credentials file that is downloaded when I create my IAM user, instead of hard-coding the credentials like above? I reckon I could use the file to connect to GCP Storage, save my DB credentials there, and then write a script that loads the username, password, etc. from the bucket, but that feels like a rather clunky workaround.
From the guide here it seems you can create IAM roles for this, but access is only granted for an hour at a time, and you need to create a new token pair each time.

Short answer: Yes, you can connect to a Cloud SQL instance using a service account key (the .json file), but only with PostgreSQL, and you need to refresh the token every hour.
Long answer: The .json key is really intended for operations on the instance at the resource level, or for authenticating the Cloud SQL proxy.
For example, when you use the Cloud SQL proxy with a service account you create a "magical bridge" to the instance, but in the end you still authenticate the way you're doing right now, just with SERVER = 127.0.0.1. This is the recommended method in most cases; a minimal sketch follows.
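For illustration, a minimal sketch of the proxy approach, assuming the Cloud SQL Auth proxy was started locally with the service account key (instance name and paths are placeholders); the DB user and password are still required, only the network path changes:
# Start the proxy first, e.g.:
#   ./cloud_sql_proxy -instances=PROJECT:REGION:INSTANCE=tcp:5432 \
#       -credential_file=/path/to/key.json
import pandas as pd
import pyodbc

conn_str = (
    "DRIVER={PostgreSQL Unicode};"
    "DATABASE=test;"
    "UID=user;"
    "PWD=a_very_strong_password;"
    "SERVER=127.0.0.1;"  # the proxy listens locally
    "PORT=5432;"
)
with pyodbc.connect(conn_str) as con:
    print(pd.read_sql("SELECT * from entries", con=con))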
As you've mentioned, IAM authentication can also work, but this approach only lasts for an hour since you depend on token refresh. If you're okay with that, just keep in mind you need to keep refreshing the token.
Another approach I can think of for now is to use Secret Manager. The steps can be as follows:
Create a service account and a key for that.
Create a secret which contains your password.
Grant access to this particular secret to the SA created in step 1:
Go to Secret Manager.
Select the secret and click on Show info panel
Click on Add member and type or paste the email of the SA
Grant Secret Manager Secret Accessor
Click on Save
Now in your code you can get the secret content (which is the password) with sample code like this:
import pandas as pd
import pyodbc
from google.cloud import secretmanager
from google.oauth2 import service_account

# Authenticate to Secret Manager with the service account key file
credentials = service_account.Credentials.from_service_account_file('/path/to/key.json')
client = secretmanager.SecretManagerServiceClient(credentials=credentials)

# Build the resource name of the secret version (fill in your own project_id,
# secret_id and version_id) and fetch the payload
name = f"projects/{project_id}/secrets/{secret_id}/versions/{version_id}"
response = client.access_secret_version(request={"name": name})
secret_password = response.payload.data.decode("UTF-8")

conn_str = (
    "DRIVER={PostgreSQL Unicode};"
    "DATABASE=test;"
    "UID=user;"
    "PWD=" + secret_password + ";"
    "SERVER=34.76.yy.xxxx;"
    "PORT=5432;"
)
with pyodbc.connect(conn_str) as con:
    print(pd.read_sql("SELECT * from entries", con=con))
BTW, you can install the lib using pip install google-cloud-secret-manager.
Finally, you can also use this approach to store the instance IP, user, DB name, etc., by creating more secrets if you prefer.
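A small sketch of that idea, assuming hypothetical secret names (db-name, db-user, db-password, db-host) and reusing the client and project_id from the snippet above:
def get_secret(client, project_id, secret_id, version_id="latest"):
    # Fetch one secret version and return its payload as text
    name = f"projects/{project_id}/secrets/{secret_id}/versions/{version_id}"
    return client.access_secret_version(request={"name": name}).payload.data.decode("UTF-8")

conn_str = (
    "DRIVER={PostgreSQL Unicode};"
    f"DATABASE={get_secret(client, project_id, 'db-name')};"
    f"UID={get_secret(client, project_id, 'db-user')};"
    f"PWD={get_secret(client, project_id, 'db-password')};"
    f"SERVER={get_secret(client, project_id, 'db-host')};"
    "PORT=5432;"
)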

Related

Generating Cloud Storage Signed URL from Google Cloud Function without using explicit key file

I'd like to create a pre-signed upload URL to a storage bucket, and would like to avoid an explicit reference to a json key.
Currently, I'm attempting to do this with the Default App Engine Service Account.
I'm attempting to follow along with this answer but am getting this error:
AttributeError: you need a private key to sign credentials.the
credentials you are currently using <class
'google.auth.compute_engine.credentials.Credentials'> just contains a
token. see
https://googleapis.dev/python/google-api-core/latest/auth.html#setting-up-a-service-account
for more details.
My Cloud Function code looks like this:
from google.cloud import storage
import datetime
import google.auth


def generate_upload_url(blob_name, additional_metadata: dict = {}):
    credentials, project_id = google.auth.default()

    # Perform a refresh request to get the access token of the current credentials (Else, it's None)
    from google.auth.transport import requests
    r = requests.Request()
    credentials.refresh(r)

    client = storage.Client()
    bucket = client.get_bucket("my_bucket")
    blob = bucket.blob(blob_name)

    service_account_email = credentials.service_account_email
    print(f"attempting to create signed url for {service_account_email}")
    url = blob.generate_signed_url(
        version="v4",
        service_account_email=service_account_email,
        access_token=credentials.token,
        # This URL is valid for 120 minutes
        expiration=datetime.timedelta(minutes=120),
        # Allow PUT requests using this URL.
        method="PUT",
        content_type="application/octet-stream",
    )
    return url


def get_upload_url(request):
    blob_name = get_param(request, "blob_name")
    url = generate_upload_url(blob_name)
    return url
When you use the v4 version of signed URLs, the first line of the method calls the ensure_signed_credentials method, which checks whether the current service account can generate a signature on its own (that is, with a private key). That is what breaks the current behavior.
In the comment of the function, it is clearly described that a service account JSON file is required:
If you are on Google Compute Engine, you can't generate a signed URL.
Follow `Issue 922`_ for updates on this. If you'd like to be able to
generate a signed URL from GCE, you can use a standard service account
from a JSON file rather than a GCE service account.
So, use the v2 version instead.
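For reference, a minimal sketch of the same call switched to v2, keeping the rest of the function above unchanged; whether this signs without a local key file depends on your library version, so treat it as an assumption to verify:
# Same call as above, but with version="v2"
url = blob.generate_signed_url(
    version="v2",
    service_account_email=service_account_email,
    access_token=credentials.token,
    expiration=datetime.timedelta(minutes=120),
    method="PUT",
    content_type="application/octet-stream",
)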

Will pyodbc support connecting to an Azure SQL DB using an AD access token instead of user/password?

Currently, I use a device code credential to get access to Azure AD.
device_code_credential = DeviceCodeCredential(
    azure_client_id,
    tenant_id=azure_tenant_id,
    authority=azure_authority_uri)
But I still need to use an Azure account username/password to connect to Azure SQL Server:
driver = 'ODBC Driver 17 for SQL Server'
db_connection_string = f'DRIVER={driver};SERVER={server};' \
                       f'DATABASE={database};UID={user_name};PWD={password};' \
                       f'Authentication=ActiveDirectoryPassword;' \
                       'Encrypt=yes;TrustServerCertificate=no;Connection Timeout=30;'
connector = pyodbc.connect(db_connection_string)
Is there any way in Python under Linux/macOS to use device_code_credential and an access token to connect to Azure SQL Server?
https://github.com/mkleehammer/pyodbc/issues/228
I only got this link and it doesn't seem to work.
Does anyone have a fully working sample?
You could reference this tutorial: AzureAD/azure-activedirectory-library-for-python: Connect to Azure SQL Database.
It is doable to connect to Azure SQL Database by obtaining a token from Azure Active Directory (AAD), via ADAL Python. We do not currently maintain a full sample for it, but this essay outlines some key ingredients.
1. Follow the instructions in Connecting using Access Token to provision your application. There is another similar blog post here.
2. Your SQL admin needs to add permissions for the app registration to the specific database you are trying to access. See details in the blog post Token-based authentication support for Azure SQL DB using Azure AD auth by Mirek H Sztajno.
3. It was not particularly highlighted in either of the documents above, but you need to use https://database.windows.net/ as the resource string. Note that you need to keep the trailing slash, otherwise the token issued will not work.
4. Feed the configuration above into ADAL Python's Client Credentials sample.
5. Once you get the access token, use it as shown below in pyodbc to connect to the SQL Database.
This works with AAD access tokens. Example code to expand the token and prepend the length as described on the page linked above, in Python 2.x:
token = "eyJ0eXAiOi...";
exptoken = "";
for i in token:
exptoken += i;
exptoken += chr(0);
tokenstruct = struct.pack("=i", len(exptoken)) + exptoken;
conn = pyodbc.connect(connstr, attrs_before = { 1256:bytearray(tokenstruct) });
Python 3.x is only slightly more involved due to the char/bytes split:
token = b"eyJ0eXAiOi...";
exptoken = b"";
for i in token:
exptoken += bytes({i});
exptoken += bytes(1);
tokenstruct = struct.pack("=i", len(exptoken)) + exptoken;
conn = pyodbc.connect(connstr, attrs_before = { 1256:tokenstruct });
(SQL_COPT_SS_ACCESS_TOKEN is 1256; it's specific to msodbcsql driver so pyodbc does not have it defined, and likely will not.)
Hope this helps.
You can get a token via
from azure.identity import DeviceCodeCredential
# Recommended to allocate a new ClientID in your tenant.
AZURE_CLI_CLIENT_ID = "04b07795-8ddb-461a-bbee-02f9e1bf7b46"
credential = DeviceCodeCredential(client_id=AZURE_CLI_CLIENT_ID)
databaseToken = credential.get_token('https://database.windows.net/.default')
Then use databaseToken.token as an AAD Access Token as described in Leon Yue's answer.
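Putting the pieces together, a minimal end-to-end sketch (server, database, and client ID are placeholders): get the AAD token with DeviceCodeCredential, expand it as shown above, and hand it to pyodbc via SQL_COPT_SS_ACCESS_TOKEN:
import struct
import pyodbc
from azure.identity import DeviceCodeCredential

credential = DeviceCodeCredential(client_id="YOUR_CLIENT_ID")
token = credential.get_token('https://database.windows.net/.default').token

# UTF-16-LE encoding is equivalent to the byte-by-byte expansion loops above
exptoken = token.encode('utf-16-le')
tokenstruct = struct.pack("=i", len(exptoken)) + exptoken

connstr = (
    'DRIVER={ODBC Driver 17 for SQL Server};'
    'SERVER=your-server.database.windows.net;'
    'DATABASE=your-database;'
    'Encrypt=yes;TrustServerCertificate=no;Connection Timeout=30;'
)
SQL_COPT_SS_ACCESS_TOKEN = 1256  # msodbcsql-specific connection attribute
conn = pyodbc.connect(connstr, attrs_before={SQL_COPT_SS_ACCESS_TOKEN: tokenstruct})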

Authenticate calls to Google Cloud Functions programmatically

I am trying to authenticate to Google Cloud Functions from SAP CPI to fetch some data from a database. To push data we use Pub/Sub with a service account access token, and it works perfectly. But the functions need an identity token instead of an access token. We obtain the access token with a Groovy script (no Jenkins). Is it possible to authenticate to the functions with an access token as well? Or to get the identity token without building a whole IAP layer?
You have to call your Cloud Functions (or Cloud Run, it's the same) with a signed identity token.
So you can use a Groovy script to generate a signed identity token. Here is an example:
import com.google.api.client.http.GenericUrl
import com.google.api.client.http.HttpRequest
import com.google.api.client.http.HttpRequestFactory
import com.google.api.client.http.HttpResponse
import com.google.api.client.http.javanet.NetHttpTransport
import com.google.auth.Credentials
import com.google.auth.http.HttpCredentialsAdapter
import com.google.auth.oauth2.IdTokenCredentials
import com.google.auth.oauth2.IdTokenProvider
import com.google.auth.oauth2.ServiceAccountCredentials
import com.google.common.base.Charsets
import com.google.common.io.CharStreams

String myUri = "YOUR_URL";

// Load the service account key and scope it for GCP
Credentials credentials = ServiceAccountCredentials
        .fromStream(new FileInputStream(new File("YOUR_SERVICE_ACCOUNT_KEY_FILE")))
        .createScoped("https://www.googleapis.com/auth/cloud-platform");

// Generate a signed identity token with the function URL as audience
String token = ((IdTokenProvider) credentials)
        .idTokenWithAudience(myUri, Collections.EMPTY_LIST).getTokenValue();
System.out.println(token);

IdTokenCredentials idTokenCredentials = IdTokenCredentials.newBuilder()
        .setIdTokenProvider((ServiceAccountCredentials) credentials)
        .setTargetAudience(myUri).build();

// Call the function with the identity token attached as a Bearer header
HttpRequestFactory factory = new NetHttpTransport()
        .createRequestFactory(new HttpCredentialsAdapter(idTokenCredentials));
HttpRequest request = factory.buildGetRequest(new GenericUrl(myUri));
HttpResponse httpResponse = request.execute();
System.out.println(CharStreams.toString(new InputStreamReader(httpResponse.getContent(), Charsets.UTF_8)));
A service account key file is required only if you are outside GCP. Otherwise the default service account is enough, but it must be a service account; your personal user account won't work.
Add this dependency (here in Maven):
<dependency>
    <groupId>com.google.auth</groupId>
    <artifactId>google-auth-library-oauth2-http</artifactId>
    <version>0.20.0</version>
</dependency>
Or you can use a tool that I wrote and open sourced. I also wrote a Medium article explaining the use cases.
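As noted above, inside GCP you don't need a key file at all; here is a minimal Python sketch of the same idea, fetching the identity token from the metadata server (the target URL is a placeholder):
import requests

TARGET_URL = "YOUR_URL"  # the audience is the Cloud Function / Cloud Run URL
METADATA_URL = (
    "http://metadata.google.internal/computeMetadata/v1/"
    "instance/service-accounts/default/identity?audience=" + TARGET_URL
)

# Only works from inside GCP (GCE, Cloud Functions, Cloud Run, ...)
id_token = requests.get(METADATA_URL, headers={"Metadata-Flavor": "Google"}).text
resp = requests.get(TARGET_URL, headers={"Authorization": f"Bearer {id_token}"})
print(resp.status_code, resp.text)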
You can only access your secured Cloud Function using an identity token.
1. Create a service account with roles/cloudfunctions.invoker.
2. Create a Cloud Function that allows only authenticated requests:
https://REGION-PROJECT_ID.cloudfunctions.net/FUNCTION_NAME
from google.oauth2 import service_account
from google.auth.transport.requests import AuthorizedSession

target_audience = 'https://REGION-PROJECT_ID.cloudfunctions.net/FUNCTION_NAME'

creds = service_account.IDTokenCredentials.from_service_account_file(
    '/path/to/svc.json', target_audience=target_audience)

authed_session = AuthorizedSession(creds)

# make authenticated request and print the response, status_code
resp = authed_session.get(target_audience)
print(resp.status_code)
print(resp.text)

What credentials to pass, and how, when using the Python Client Library for the GCP Compute API

I want to get a list of all instances in a project using the Python Google client API, google-api-python-client==1.7.11.
I am trying to connect using the method googleapiclient.discovery.build. This method requires credentials as an argument.
I read the documentation but did not understand the credential format and which credential it requires.
Can anyone explain what credentials to use and how to pass them to make the GCP connection?
The credentials that you need are called "Service Account JSON Key File". These are created in the Google Cloud Console under IAM & Admin / Service Accounts. Create a service account and download the key file. In the example below this is service-account.json.
Example code that uses a service account:
from googleapiclient import discovery
from google.oauth2 import service_account

scopes = ['https://www.googleapis.com/auth/cloud-platform']
sa_file = 'service-account.json'
zone = 'us-central1-a'
project_id = 'my_project_id'  # Project ID, not Project Name

credentials = service_account.Credentials.from_service_account_file(sa_file, scopes=scopes)

# Create the Cloud Compute Engine service object
service = discovery.build('compute', 'v1', credentials=credentials)

request = service.instances().list(project=project_id, zone=zone)
while request is not None:
    response = request.execute()
    for instance in response['items']:
        # TODO: Change code below to process each `instance` resource:
        print(instance)
    request = service.instances().list_next(previous_request=request, previous_response=response)
Application Default Credentials are supported by the Google API client libraries automatically. There you can find an example using Python; also check the documentation Setting Up Authentication for Server to Server Production Applications.
According to GCP's most recent documentation:
we recommend you use Google Cloud Client Libraries for your
application. Google Cloud Client Libraries use a library called
Application Default Credentials (ADC) to automatically find your
service account credentials
In case you still want to set it manually, you could first create a service account and grant it all necessary permissions:
# A name for the service account you are about to create:
export SERVICE_ACCOUNT_NAME=your-service-account-name

# Create service account:
gcloud iam service-accounts create ${SERVICE_ACCOUNT_NAME} --display-name="Service Account for ai-platform-samples repo"

# Grant the required roles:
gcloud projects add-iam-policy-binding ${PROJECT_ID} --member serviceAccount:${SERVICE_ACCOUNT_NAME}@${PROJECT_ID}.iam.gserviceaccount.com --role roles/ml.developer
gcloud projects add-iam-policy-binding ${PROJECT_ID} --member serviceAccount:${SERVICE_ACCOUNT_NAME}@${PROJECT_ID}.iam.gserviceaccount.com --role roles/storage.objectAdmin

# Download the service account key and store it in a file specified by GOOGLE_APPLICATION_CREDENTIALS:
gcloud iam service-accounts keys create ${GOOGLE_APPLICATION_CREDENTIALS} --iam-account ${SERVICE_ACCOUNT_NAME}@${PROJECT_ID}.iam.gserviceaccount.com
Once it's done check whether the ADC path has been set properly by checking:
echo $GOOGLE_APPLICATION_CREDENTIALS
Having set the ADC path, you don't need to load the service account key from code (which is undesirable), so the call looks as follows:
service = googleapiclient.discovery.build(<API>, <version>, cache_discovery=False)
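A concrete minimal sketch of that call for Compute Engine, assuming GOOGLE_APPLICATION_CREDENTIALS is exported as above (project and zone are placeholders):
import googleapiclient.discovery

# ADC picks up the key file from GOOGLE_APPLICATION_CREDENTIALS automatically
service = googleapiclient.discovery.build('compute', 'v1', cache_discovery=False)
request = service.instances().list(project='my_project_id', zone='us-central1-a')
response = request.execute()
for instance in response.get('items', []):
    print(instance['name'])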

How to Verify Authentication of Microsoft Azure Storage Accounts When Called with Python SDK

Here is working Python code.
from azure.storage.blob import BlockBlobService
accountName, key='stagingData', 'vZfqyMyHT3A=='
blobService=BlockBlobService(account_name=accountName, account_key=key)
It seems the blobService client object is created even if I pass wrong account credentials. It is not authorised, and the error shows up only later when I try to access some data, possibly from some other file or even when different users try to use it. Is there a way to assert right on the spot whether correct credentials were supplied and halt the execution if not? For reference, I tried dir(blobService) and that displayed 121 methods and attributes. The ones that seemed sensible from the name show similar results whether the account is actually authenticated or not.
Almost every other API call which uses some access token lets you know right on the spot if the token is not valid, by raising some exception. So I hope there is a way to check it for the BlockBlobService class as well.
As you mentioned, the blobService client object doesn't verify the account credentials. For more information, we can look at the Python source code on GitHub.
The following code is a snippet from that source. There is no request to the Azure Storage server side, so it doesn't verify the account credentials.
def create_block_blob_service(self):
    '''
    Creates a BlockBlobService object with the settings specified in the
    CloudStorageAccount.
    :return: A service object.
    :rtype: :class:`~azure.storage.blob.blockblobservice.BlockBlobService`
    '''
    try:
        from azure.storage.blob.blockblobservice import BlockBlobService
        return BlockBlobService(self.account_name, self.account_key,
                                sas_token=self.sas_token,
                                is_emulated=self.is_emulated,
                                endpoint_suffix=self.endpoint_suffix)
    except ImportError:
        raise Exception('The package azure-storage-blob is required. '
                        + 'Please install it using "pip install azure-storage-blob"')
If we want to verify the account credentials, we need to send a request to the Azure Storage server and check the response. If you want to do that, I recommend writing a small test method to implement it yourself.
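A minimal sketch of such a check, assuming the legacy azure-storage SDK from the question; any cheap call that actually hits the service works, here listing a single container:
from azure.common import AzureException
from azure.storage.blob import BlockBlobService

blobService = BlockBlobService(account_name=accountName, account_key=key)
try:
    # Force a real round trip to the service; fails fast on bad credentials
    next(iter(blobService.list_containers(num_results=1)), None)
    print("Credentials look valid.")
except AzureException as e:
    raise SystemExit("Authentication failed: {}".format(e))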
