Cannot create signed URL for a Google Cloud Storage object (Python 3)

I use the code below to generate a signed URL for a file on Google Cloud Storage, but when I click the link I get SignatureDoesNotMatch.
from google.cloud.storage._signing import generate_signed_url
from google.oauth2.service_account import Credentials

signed_url = generate_signed_url(
    credentials=Credentials.from_service_account_file(google_cloud_platform__key_path),
    resource=self.canonicalized_resource,
    expiration=self.expiration,
    content_type=self.content_type,
)
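Note that generate_signed_url here is imported from the private _signing module; the supported entry point is the generate_signed_url method on a Blob. As a hedged point of comparison, a minimal sketch using that public API with the same credentials (the bucket and object names are illustrative):

import datetime

from google.cloud import storage
from google.oauth2.service_account import Credentials

credentials = Credentials.from_service_account_file(google_cloud_platform__key_path)
client = storage.Client(credentials=credentials)
blob = client.bucket("my-bucket").blob("my-object")  # illustrative names

url = blob.generate_signed_url(
    expiration=datetime.timedelta(hours=1),
    method="GET",
    content_type="application/octet-stream",  # omit unless the client sends this exact header
)

Also worth checking: if you sign with a content_type, the eventual request must send exactly that Content-Type header; a plain browser GET sends none, which is a common cause of SignatureDoesNotMatch.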

Related

Azure Python SDK authentication using SAS URL [duplicate]

We get an authentication failure when we try to create an Azure blob client from a connection string, using the Python v12 SDK with Azure Blob Storage v12.5.0 and Azure Core 1.8.2.
I used
azure-storage-blob == 12.5.0
azure-core == 1.8.2
I tried to access my blob storage account using a connection string with the Python v12 SDK and received the error above. The environment I'm running in is a Python venv in a NixShell.
The code calling the blob upload is as follows:
blob_service_client = BlobServiceClient(account_url=<>, credential=<>)
blob_client = blob_service_client.get_blob_client(container=container_name, blob=file)
I printed out blob_client and it looks normal, but the next line, upload_blob, gives an error.
with open(os.path.join(root, file), "rb") as data:
    blob_client.upload_blob(data)
The error message is as follows:
File "<local_address>/.venv/lib/python3.8/site-packages/azure/storage/blob/_upload_helpers.py", in upload_block_blob
    return client.upload(
File "<local_address>/.venv/lib/python3.8/site-packages/azure/storage/blob/_generated/operations/_block_blob_operations.py", in upload
    raise models.StorageErrorException(response, self._deserialize)
azure.storage.blob._generated.models._models_py3.StorageErrorException: Operation returned an invalid status 'Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.'
So I printed out the HTTP PUT request to Azure blob storage, and got a 403 response.
The following code works well for me with the same versions as yours:
from azure.storage.blob import BlobClient

blob = BlobClient.from_connection_string(
    conn_str="your connection string from Access Keys",
    container_name="your-container",  # placeholder names
    blob_name="SampleSource.txt",
)
with open("./SampleSource.txt", "rb") as data:
    blob.upload_blob(data)
Please check your connection string, and also check your PC's clock: a skewed system time is a common cause of these signature authentication failures.
There is a similar issue about the error: AzureStorage Blob Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature
UPDATE:
I tried this code and got the same error:
from azure.storage.blob import BlobServiceClient
from azure.identity import DefaultAzureCredential

token_credential = DefaultAzureCredential()
blob_service_client = BlobServiceClient(
    account_url="https://pamelastorage123.blob.core.windows.net/",
    credential=token_credential,
)
blob_client = blob_service_client.get_blob_client(container="pamelac", blob="New Text Document.txt")
with open("D:/demo/python/New Text Document.txt", "rb") as data:
    blob_client.upload_blob(data)
Then I used AzureCliCredential() instead of DefaultAzureCredential(), authenticating via the Azure CLI with az login, and it worked.
If you use EnvironmentCredential, you need to set the corresponding environment variables. In any case, I recommend using a specific credential type instead of DefaultAzureCredential.
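For illustration, a minimal sketch of the AzureCliCredential variant described above (it reuses the account URL from the snippet and assumes you have already run az login):

from azure.storage.blob import BlobServiceClient
from azure.identity import AzureCliCredential

# Reuses the identity from `az login` instead of probing the whole credential chain
token_credential = AzureCliCredential()
blob_service_client = BlobServiceClient(
    account_url="https://pamelastorage123.blob.core.windows.net/",
    credential=token_credential,
)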
For more details about Azure Identity, see here.

Generating Cloud Storage Signed URL from Google Cloud Function without using explicit key file

I'd like to create a pre-signed upload URL for a storage bucket, and would like to avoid an explicit reference to a JSON key.
Currently, I'm attempting to do this with the default App Engine service account.
I'm attempting to follow along with this answer but am getting this error:
AttributeError: you need a private key to sign credentials.the
credentials you are currently using <class
'google.auth.compute_engine.credentials.Credentials'> just contains a
token. see
https://googleapis.dev/python/google-api-core/latest/auth.html#setting-up-a-service-account
for more details.
My Cloud Function code looks like this:
from google.cloud import storage
import datetime
import google.auth

def generate_upload_url(blob_name, additional_metadata: dict = {}):
    credentials, project_id = google.auth.default()

    # Perform a refresh request to get the access token of the current credentials (else it's None)
    from google.auth.transport import requests
    r = requests.Request()
    credentials.refresh(r)

    client = storage.Client()
    bucket = client.get_bucket("my_bucket")
    blob = bucket.blob(blob_name)

    service_account_email = credentials.service_account_email
    print(f"attempting to create signed url for {service_account_email}")
    url = blob.generate_signed_url(
        version="v4",
        service_account_email=service_account_email,
        access_token=credentials.token,
        # This URL is valid for 120 minutes
        expiration=datetime.timedelta(minutes=120),
        # Allow PUT requests using this URL.
        method="PUT",
        content_type="application/octet-stream",
    )
    return url

def get_upload_url(request):
    blob_name = get_param(request, "blob_name")
    url = generate_upload_url(blob_name)
    return url
When you use the v4 version of signed URLs, the first line of the method calls the ensure_signed_credentials method, which checks whether the current service account can generate a signature in standalone mode (that is, with a private key). That is what breaks the current behavior.
The function's docstring clearly states that a service account JSON file is required:
If you are on Google Compute Engine, you can't generate a signed URL.
Follow `Issue 922`_ for updates on this. If you'd like to be able to
generate a signed URL from GCE, you can use a standard service account
from a JSON file rather than a GCE service account.
So, use the v2 version instead.
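For reference, a minimal sketch of the v2 call under the same setup as the function above; only the version argument changes, and the other parameters are kept from the v4 snippet:

url = blob.generate_signed_url(
    version="v2",
    service_account_email=service_account_email,
    access_token=credentials.token,
    expiration=datetime.timedelta(minutes=120),
    method="PUT",
    content_type="application/octet-stream",
)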

Authenticate calls to Google Cloud Functions programmatically

I am trying to authenticate to Google Cloud Functions from SAP CPI to fetch some data from a database. To push data we use Pub/Sub with a service account access token, and it works perfectly. But the functions need an identity token instead of an access token. We obtain the access token with a Groovy script (no Jenkins). Is it possible to authenticate to the functions with an access token as well? Or to get the identity token without building a whole IAP layer?
You have to call your Cloud Functions (or Cloud Run, it's the same) with a signed identity token.
So you can use a Groovy script to generate a signed identity token. Here is an example:
import com.google.api.client.http.GenericUrl
import com.google.api.client.http.HttpRequest
import com.google.api.client.http.HttpRequestFactory
import com.google.api.client.http.HttpResponse
import com.google.api.client.http.javanet.NetHttpTransport
import com.google.auth.Credentials
import com.google.auth.http.HttpCredentialsAdapter
import com.google.auth.oauth2.IdTokenCredentials
import com.google.auth.oauth2.IdTokenProvider
import com.google.auth.oauth2.ServiceAccountCredentials
import com.google.common.base.Charsets
import com.google.common.io.CharStreams

String myUri = "YOUR_URL";

Credentials credentials = ServiceAccountCredentials
        .fromStream(new FileInputStream(new File("YOUR_SERVICE_ACCOUNT_KEY_FILE")))
        .createScoped("https://www.googleapis.com/auth/cloud-platform");
String token = ((IdTokenProvider) credentials)
        .idTokenWithAudience(myUri, Collections.EMPTY_LIST)
        .getTokenValue();
System.out.println(token);

IdTokenCredentials idTokenCredentials = IdTokenCredentials.newBuilder()
        .setIdTokenProvider((ServiceAccountCredentials) credentials)
        .setTargetAudience(myUri)
        .build();
HttpRequestFactory factory = new NetHttpTransport()
        .createRequestFactory(new HttpCredentialsAdapter(idTokenCredentials));
HttpRequest request = factory.buildGetRequest(new GenericUrl(myUri));
HttpResponse httpResponse = request.execute();
System.out.println(CharStreams.toString(new InputStreamReader(httpResponse.getContent(), Charsets.UTF_8)));
A service account key file is required only if you are outside GCP. Otherwise, the default service account is enough, but it must be a service account; your personal user account won't work.
Add this dependency (here for Maven):
<dependency>
    <groupId>com.google.auth</groupId>
    <artifactId>google-auth-library-oauth2-http</artifactId>
    <version>0.20.0</version>
</dependency>
Or you can use a tool that I wrote and open-sourced. I also wrote a Medium article explaining the use cases.
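Since the rest of this page is Python, here is a hedged sketch of the same identity-token flow with google-auth; fetch_id_token obtains the token from the metadata server on GCP, or from GOOGLE_APPLICATION_CREDENTIALS elsewhere, and the URL placeholder matches the Groovy example:

import google.auth.transport.requests
import requests
from google.oauth2 import id_token

audience = "YOUR_URL"  # the Cloud Function URL, as in the Groovy example

# On GCP this mints a signed identity token for the given audience;
# outside GCP it falls back to GOOGLE_APPLICATION_CREDENTIALS.
auth_request = google.auth.transport.requests.Request()
token = id_token.fetch_id_token(auth_request, audience)

resp = requests.get(audience, headers={"Authorization": f"Bearer {token}"})
print(resp.status_code, resp.text)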
You can only access your secured Cloud Function using an identity token.
1. Create a service account with the roles/cloudfunctions.invoker role.
2. Create a Cloud Function that allows only authenticated requests:
https://REGION-PROJECT_ID.cloudfunctions.net/FUNCTION_NAME
3. Call it with an identity token minted for that URL as the audience:
from google.oauth2 import service_account
from google.auth.transport.requests import AuthorizedSession

target_audience = 'https://REGION-PROJECT_ID.cloudfunctions.net/FUNCTION_NAME'

creds = service_account.IDTokenCredentials.from_service_account_file(
    '/path/to/svc.json', target_audience=target_audience)

authed_session = AuthorizedSession(creds)

# make an authenticated request and print the status code and response
resp = authed_session.get(target_audience)
print(resp.status_code)
print(resp.text)

Python & Firebase - Unable to Read Uploaded File in Storage

I'm trying to upload a file to my Firebase Storage.
Many months ago I could add files using this same method; now the upload still succeeds, but I can't see the file's preview or direct link.
My code is:
import firebase_admin
from firebase_admin import credentials
from firebase_admin import db
from firebase_admin import storage

cred = credentials.Certificate('C:\\Users\\Wicaledon\\PycharmProjects\\ABC\\XXXXXX.json')

# Initialize the app with a service account, granting admin privileges
firebase_admin.initialize_app(cred, {
    'databaseURL': 'https://XXXXXXXXXX.firebaseio.com/',
    'storageBucket': 'XXXXXXXXXX.appspot.com'
})

database_url = 'https://XXXXXXXXXX.firebaseio.com/'

bucket = storage.bucket()
blob2 = bucket.blob('my_image.jpg')
blob2.upload_from_filename('C:\\Users\\Wicaledon\\PycharmProjects\\ABC\\my_image.jpg')
I now see my file listed, but without its preview and link on the right side. If I add the same file manually, however, the preview and link appear.
What is the problem? Can you solve it?

Python: Upload a package to Azure using SAS URI

I am using Microsoft's Hardware Dashboard API to automate the submission of my (.CAB) package for signing. I have followed the steps in this documentation: https://learn.microsoft.com/en-us/windows-hardware/drivers/dashboard/create-a-new-submission-for-a-product
The response of the new submission contains a SAS (Shared Access Signature) URI like this (sig and account name changed for security):
https://accnt_name.blob.core.windows.net/scsjc/cexxxxxxxxxx?sv=2017-04-17&sr=b&sig=xxxxxxxxxxxxxx&se=2019-07-10T18:15:58Z&sp=rwl&rscd=attachment%3B filename%3Dinitial_xxxxxxxx.cab
I need to use this SAS URI to upload my package to Azure blob storage.
The examples in the documentation are in C#/.NET:
string sasUrl =
    "https://productingestionbin1.blob.core.windows.net/ingestion/26920f66-b592-4439-9a9d-fb0f014902ec?sv=2014-02-14&sr=b&sig=usAN0kNFNnYE2tGQBI%2BARQWejX1Guiz7hdFtRhyK%2Bog%3D&se=2016-06-17T20:45:51Z&sp=rwl";
Microsoft.WindowsAzure.Storage.Blob.CloudBlockBlob blockBlob =
    new Microsoft.WindowsAzure.Storage.Blob.CloudBlockBlob(new System.Uri(sasUrl));
await blockBlob.UploadFromStreamAsync(stream);
I want to use the SAS URI obtained from the submission resource JSON response to upload the package.
This link, Download file from AZURE BLOB CONTAINER using SAS URI in PYTHON, suggests that there is no equivalent method in Python and that BlockBlobService can be used:
from azure.storage.blob import BlockBlobService

blobservice = BlockBlobService(
    "storage_account",
    sas_token="?sv=2018-03-28&ss=bfqt&srt=sco&sp=rwdlacup&se=2019-04-24T10:01:58Z&st=2019-04-23T02:01:58Z&spr=https&sig=xxxxxxxxx",
)
blobservice.create_blob_from_path(container_name, local_file_name, full_path_to_file)
However, I am not sure which parts of the SAS URI obtained from the submission resource are the storage account name and the container name.
I have also created a separate Azure storage account and added a new container and blob to it. I have tried passing that container and storage account name together with the SAS token from the SAS URI (obtained from the submission JSON response of the Microsoft Hardware API), but I always get the error below:
AzureHttpError: Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature. ErrorCode: AuthenticationFailed
AuthenticationFailed
Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
RequestId:5463b7d2-901e-0068-6994-36782e000000
Time:2019-07-09T20:23:04.5760736Z
Signature did not match. String to sign used was rwl
2019-07-10T18:15:58Z
/blob/evcertautomation/ev2/initial_1152921504628106590.cab
2017-04-17
attachment; filename=initial_1152921504628106563.cab
Thanks in advance
If you have a blob SAS URI like the one you posted, you can easily upload a file to the blob in Python with requests:
https://accnt_name.blob.core.windows.net/scsjc/cexxxxxxxxxx?sv=2017-04-17&sr=b&sig=xxxxxxxxxxxxxx&se=2019-07-10T18:15:58Z&sp=rwl&rscd=attachment%3B filename%3Dinitial_xxxxxxxx.cab
First, inspect the values of the se and sp parameters. The se parameter is the expiry time of the blob SAS URI, and the sp parameter lists the permitted operations, such as w for blob write permission.
So for your blob SAS URI above, you have write permission to upload a file to this blob until 2019-07-10T18:15:58Z.
Here is my sample code for uploading via a blob SAS URI:
import requests

blob_sas_uri = '<your blob sas uri, which must include `sp=w`; do the write before `se`>'
local_file_name = '<your local file name>'

headers = {
    'x-ms-blob-type': 'BlockBlob'
}
# Read in binary mode: a .cab package is not text
data = open(local_file_name, 'rb').read()

r = requests.put(blob_sas_uri, headers=headers, data=data)
print(r.status_code)
If the result is 201 (Created), the upload succeeded.
For reference, there is a similar official sample, Example: Upload a Blob using a Container's Shared Access Signature, which uses a broader container-level permission.
As per the SAS URI you provided (https://accnt_name.blob.core.windows.net/scsjc/cexxxxxxxxxx?sv=2017-04-17&sr=b&sig=xxxxxxxxxxxxxx&se=2019-07-10T18:15:58Z&sp=rwl&rscd=attachment%3B filename%3Dinitial_xxxxxxxx.cab):
the account name is accnt_name, and the container is scsjc.
So your code should look like this:

from azure.storage.blob import BlockBlobService

storage_account = "accnt_name"
token = "?sv=2018-03-28&ss=bfqt&srt=sco&sp=rwdlacup&se=2019-04-24T10:01:58Z&st=2019-04-23T02:01:58Z&spr=https&sig=xxxxxxxxx"
container = "scsjc"

blobservice = BlockBlobService(storage_account, sas_token=token)
blobservice.create_blob_from_path(container, local_file_name, full_path_to_file)
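If you are on the newer v12 SDK (azure-storage-blob >= 12, where BlockBlobService no longer exists), a rough equivalent is the sketch below; the account, container, and blob names are read off the SAS URI above, and the masked signature stays a placeholder:

from azure.storage.blob import BlobClient

blob = BlobClient(
    account_url="https://accnt_name.blob.core.windows.net",
    container_name="scsjc",
    blob_name="cexxxxxxxxxx",
    credential="?sv=2017-04-17&sr=b&sig=xxxxxxxxxxxxxx&se=2019-07-10T18:15:58Z&sp=rwl",
)
with open(full_path_to_file, "rb") as data:
    blob.upload_blob(data)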
