We get an "Authentication failed" error when we try to create an Azure blob client from a connection string, using the Python v12 SDK with Azure Blob Storage v12.5.0 and Azure Core 1.8.2.
I used
azure-storage-blob == 12.5.0
azure-core == 1.8.2
I tried to access my blob storage account using a connection string with the Python v12 SDK and received the error above. The environment I'm running in is a Python venv inside a NixShell.
The code that calls the blob upload is as follows:
blob_service_client = BlobServiceClient(account_url=<>, credential=<>)
blob_client = blob_service_client.get_blob_client(container=container_name, blob=file)
I printed out blob_client and it looks normal, but the next line, upload_blob, raises the error.
with open(os.path.join(root, file), "rb") as data:
    blob_client.upload_blob(data)
The error message is as follows
File "<local_address>/.venv/lib/python3.8/site-packages/azure/storage/blob/_upload_helpers.py", in upload_block_blob
return client.upload(
File "<local_address>/.venv/lib/python3.8/site-packages/azure/storage/blob/_generated/operations/_block_blob_operations.py", in upload
raise models.StorageErrorException(response, self._deserialize)
azure.storage.blob._generated.models._models_py3.StorageErrorException: Operation returned an invalid status 'Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.'
So I printed out the HTTP PUT request to Azure Blob Storage, and the response status is 403.
The following code works for me with the same package versions as yours.
from azure.storage.blob import BlobServiceClient

blob_service = BlobServiceClient.from_connection_string(conn_str="your connect string in Access Keys")
blob_client = blob_service.get_blob_client(container="your-container-name", blob="SampleSource.txt")
with open("./SampleSource.txt", "rb") as data:
    blob_client.upload_blob(data)
Please check your connection string, and also check that your PC's clock is correct.
There is a similar issue about the error: AzureStorage Blob Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature
UPDATE:
I tried this code and got the same error:
from azure.storage.blob import BlobServiceClient
from azure.identity import DefaultAzureCredential

token_credential = DefaultAzureCredential()
blob_service_client = BlobServiceClient(account_url="https://pamelastorage123.blob.core.windows.net/", credential=token_credential)
blob_client = blob_service_client.get_blob_client(container="pamelac", blob="New Text Document.txt")
with open("D:/demo/python/New Text Document.txt", "rb") as data:
    blob_client.upload_blob(data)
Then I used AzureCliCredential() instead of DefaultAzureCredential(), authenticated via the Azure CLI with az login, and it worked.
If you use EnvironmentCredential, you need to set the environment variables. In any case, I recommend using a specific credential rather than DefaultAzureCredential.
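For example, here is a minimal sketch of using AzureCliCredential explicitly; it assumes you have already run az login, and the account, container, and blob names are placeholders, not values from the question.
from azure.identity import AzureCliCredential
from azure.storage.blob import BlobServiceClient

# Assumes `az login` has already been run; all names below are placeholders.
credential = AzureCliCredential()
blob_service_client = BlobServiceClient(
    account_url="https://<your-account>.blob.core.windows.net/",
    credential=credential,
)
blob_client = blob_service_client.get_blob_client(container="<container>", blob="<blob-name>")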
For more details about Azure Identity, see here.
Related
I'd like to read blobs in a storage container, based on this Quickstart. I've already been assigned the Storage Account Contributor role. Using VS Code, I followed the tutorial by starting with az login, then starting my environment locally, and finally executing the function. I get the following warning:
DefaultAzureCredential failed to retrieve a token from the included credentials.
Attempted credentials:
EnvironmentCredential: EnvironmentCredential authentication unavailable. Environment variables are not fully configured.
Visit https://aka.ms/azsdk/python/identity/environmentcredential/troubleshoot to troubleshoot this issue.
ManagedIdentityCredential: ManagedIdentityCredential authentication unavailable, no response from the IMDS endpoint.
SharedTokenCacheCredential: SharedTokenCacheCredential authentication unavailable. No accounts were found in the cache.
AzureCliCredential: Failed to invoke Azure CLI
To mitigate this issue, please refer to the troubleshooting guidelines here at https://aka.ms/azsdk/python/identity/defaultazurecredential/troubleshoot.
My code, following the tutorial, looks like this:
import azure.functions as func
from azure.storage.blob import BlobServiceClient, BlobClient, ContainerClient
from azure.identity import DefaultAzureCredential

def main(req=None) -> func.HttpResponse:
    account_url = "https://<name>.blob.core.windows.net"
    default_credential = DefaultAzureCredential()
    blob_service_client = BlobServiceClient(account_url, credential=default_credential)
    container_client = blob_service_client.get_container_client('<name>')
    blob_list = container_client.list_blobs()
    for b in blob_list:
        return func.HttpResponse(f'{b.name}')
Not sure what else I should do if I've already done the az login part.
I tried this in my environment and got the results below:
DefaultAzureCredential failed to retrieve a token from the included credentials.
Attempted credentials:
EnvironmentCredential: EnvironmentCredential authentication unavailable. Environment variables are not fully configured.
Visit https://aka.ms/azsdk/python/identity/environmentcredential/troubleshoot to troubleshoot this issue.
ManagedIdentityCredential: ManagedIdentityCredential authentication unavailable, no response from the IMDS endpoint.
SharedTokenCacheCredential: SharedTokenCacheCredential authentication unavailable. No accounts were found in the cache.
AzureCliCredential: Failed to invoke Azure CLI
To mitigate this issue, please refer to the troubleshooting guidelines here at https://aka.ms/azsdk/python/identity/defaultazurecredential/troubleshoot.
The above error suggests that you need to configure at least one of the credentials (EnvironmentCredential, ManagedIdentityCredential, SharedTokenCacheCredential, AzureCliCredential) so that a token can be retrieved.
You can follow this document to retrieve the token with a credential.
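For instance, if you go the EnvironmentCredential route, it expects a service principal configured through environment variables. A minimal sketch is below; the values are placeholders, and in a real Function App you would set these as application settings rather than in code.
import os

# Sketch only: EnvironmentCredential reads these variables for a service principal
# with a client secret. The values here are placeholders, not real credentials.
os.environ["AZURE_TENANT_ID"] = "<tenant-id>"
os.environ["AZURE_CLIENT_ID"] = "<app-registration-client-id>"
os.environ["AZURE_CLIENT_SECRET"] = "<client-secret>"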
I followed the document: in the terminal I signed in with az login, and then used the code below to list the blobs from an HTTP trigger function.
Code:
import azure.functions as func
import logging
from azure.storage.blob import BlobServiceClient, BlobClient, ContainerClient
from azure.identity import DefaultAzureCredential

def main(req=None) -> func.HttpResponse:
    logging.info('Python HTTP trigger function processed a request.')
    account_url = "https://venkat123.blob.core.windows.net"
    default_credential = DefaultAzureCredential()
    blob_service_client = BlobServiceClient(account_url, credential=default_credential)
    container_name = "test"
    container_client = blob_service_client.get_container_client(container_name)
    # blob_list = container_client.list_blobs()
    blobs = []
    for b in container_client.list_blobs():
        blobs.append(b.name)
        logging.info(b.name)
    return func.HttpResponse(f"Listed blobs in container: {blobs}")
The above code executed successfully and listed the blobs in the storage container.
I'd like to create a pre-signed upload URL for a storage bucket, and would like to avoid an explicit reference to a JSON key.
Currently, I'm attempting to do this with the default App Engine service account.
I'm attempting to follow along with this answer but am getting this error:
AttributeError: you need a private key to sign credentials. The credentials you are currently using <class 'google.auth.compute_engine.credentials.Credentials'> just contains a token. See https://googleapis.dev/python/google-api-core/latest/auth.html#setting-up-a-service-account for more details.
My Cloud Function code looks like this:
from google.cloud import storage
import datetime
import google.auth

def generate_upload_url(blob_name, additional_metadata: dict = {}):
    credentials, project_id = google.auth.default()
    # Perform a refresh request to get the access token of the current credentials (else it's None)
    from google.auth.transport import requests
    r = requests.Request()
    credentials.refresh(r)

    client = storage.Client()
    bucket = client.get_bucket("my_bucket")
    blob = bucket.blob(blob_name)
    service_account_email = credentials.service_account_email
    print(f"attempting to create signed url for {service_account_email}")
    url = blob.generate_signed_url(
        version="v4",
        service_account_email=service_account_email,
        access_token=credentials.token,
        # This URL is valid for 120 minutes
        expiration=datetime.timedelta(minutes=120),
        # Allow PUT requests using this URL.
        method="PUT",
        content_type="application/octet-stream",
    )
    return url

def get_upload_url(request):
    blob_name = get_param(request, "blob_name")
    url = generate_upload_url(blob_name)
    return url
When you use version v4 of the signed URL, the first line of the method calls the ensure_signed_credentials method, which checks whether the current service account can generate a signature in standalone mode (i.e. with a private key). That is what breaks the current behavior.
In the comment of the function, it is clearly described that a service account JSON file is required:
If you are on Google Compute Engine, you can't generate a signed URL.
Follow `Issue 922`_ for updates on this. If you'd like to be able to
generate a signed URL from GCE, you can use a standard service account
from a JSON file rather than a GCE service account.
So, use the v2 version instead.
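Here is a minimal sketch of the same function with version="v2", keeping the rest of the setup from the question; the bucket name and expiry are placeholders taken from the question, and this is a sketch of the suggested change rather than tested code.
import datetime
import google.auth
from google.auth.transport import requests
from google.cloud import storage

def generate_upload_url_v2(blob_name):
    # Same flow as the question; only the signed URL version changes to v2.
    credentials, project_id = google.auth.default()
    credentials.refresh(requests.Request())

    blob = storage.Client().get_bucket("my_bucket").blob(blob_name)
    return blob.generate_signed_url(
        version="v2",
        service_account_email=credentials.service_account_email,
        access_token=credentials.token,
        expiration=datetime.timedelta(minutes=120),
        method="PUT",
        content_type="application/octet-stream",
    )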
I have zipped and uploaded the Python library O365 (for accessing the MS Outlook calendar) as an AWS Lambda layer. I'm able to import it, but the problem is the authorization. When I tested it locally, the bearer token was generated and stored in a local txt file using the FileSystemTokenBackend.
But when I load this into AWS Lambda using layers, it again asks me to go through the copy-paste-the-URL process, because it cannot read the token file from the layer.
I have also tried FileSystemTokenBackend, but I failed to configure that successfully as well. I used these Token storage docs locally while testing the functionality.
My question is how to store and authenticate my account using the token file generated locally, because in AWS Lambda the input() call raises an error at runtime. How can I keep that token file inside AWS Lambda and use it without doing the authentication every time?
I have faced the same issue. The Lambda filesystem is ephemeral, so you would need to do the authentication process every time you run the function, and the O365 lib will ask for the URL.
So try saving your token (o365_token.txt) in S3 instead of keeping it in the Lambda filesystem, and then use this token for authentication.
I hope this code will help you:
import boto3
from O365 import Account, FileSystemTokenBackend

bucket_name = 'bucket_name'  # replace with your bucket name
filename_token = 'o365_token.txt'

# replace with your AWS credentials
s3 = boto3.resource('s3', aws_access_key_id='xxxx', aws_secret_access_key='xxxx')

# Download the token from S3 and save it to the /tmp directory in Lambda
s3.Bucket(bucket_name).download_file(filename_token, f'/tmp/{filename_token}')

# Read the token from the /tmp directory
token_backend = FileSystemTokenBackend(token_path='/tmp',
                                       token_filename=filename_token)

# Your Azure credentials
credentials = ('xxxx', 'xxxx')
account = Account(credentials, token_backend=token_backend)

# Then do the normal authentication process and include the refresh token command
if not account.is_authenticated:
    account.authenticate()
account.connection.refresh_token()
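Since /tmp does not survive cold starts, you may also want to push the refreshed token back to S3 so the next invocation can reuse it. A small sketch, assuming the same bucket and filename as above:
# Sketch: persist the refreshed token back to S3 for the next cold start.
s3.Bucket(bucket_name).upload_file(f'/tmp/{filename_token}', filename_token)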
I am using Microsoft's Hardware Dashboard API to automate the submission of my (.CAB) package for signing. I have followed the steps in this documentation: https://learn.microsoft.com/en-us/windows-hardware/drivers/dashboard/create-a-new-submission-for-a-product
The response to a new submission contains a SAS (Shared Access Signature) URI like this (the sig and account name are changed for security):
'''https://accnt_name.blob.core.windows.net/scsjc/cexxxxxxxxxx?sv=2017-04-17&sr=b&sig=xxxxxxxxxxxxxx&se=2019-07-10T18:15:58Z&sp=rwl&rscd=attachment%3B filename%3Dinitial_xxxxxxxx.cab'''
I need to use this SAS URI to upload my package to Azure Blob Storage.
The examples in the documentation show C#/.NET, as follows:
string sasUrl = "https://productingestionbin1.blob.core.windows.net/ingestion/26920f66-b592-4439-9a9d-fb0f014902ec?sv=2014-02-14&sr=b&sig=usAN0kNFNnYE2tGQBI%2BARQWejX1Guiz7hdFtRhyK%2Bog%3D&se=2016-06-17T20:45:51Z&sp=rwl";
Microsoft.WindowsAzure.Storage.Blob.CloudBlockBlob blockBob =
    new Microsoft.WindowsAzure.Storage.Blob.CloudBlockBlob(new System.Uri(sasUrl));
await blockBob.UploadFromStreamAsync(stream);
I want to use the SAS URI obtained from submission resource JSON Response to upload the package.
This link, Download file from AZURE BLOB CONTAINER using SAS URI in PYTHON, suggests that there is no equivalent method in Python and that BlockBlobService can be used.
from azure.storage.blob import BlockBlobService

blobservice = BlockBlobService("storage_account", sas_token="?sv=2018-03-28&ss=bfqt&srt=sco&sp=rwdlacup&se=2019-04-24T10:01:58Z&st=2019-04-23T02:01:58Z&spr=https&sig=xxxxxxxxx")
blobservice.create_blob_from_path(container_name, local_file_name, full_path_to_file)
However, I am not sure what the storage account name and container name are in the SAS URI obtained from the submission resource.
I have also created a separate Azure storage account and added a new container and blob to it. I have tried passing the new container and storage account name with the SAS access token from the SAS URI (obtained from the submission JSON response of the Microsoft hardware API), but I always get the error below:
'''
AzureHttpError: Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature. ErrorCode: AuthenticationFailed
AuthenticationFailedServer failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
RequestId:5463b7d2-901e-0068-6994-36782e000000
Time:2019-07-09T20:23:04.5760736ZSignature did not match. String to sign used was rwl
2019-07-10T18:15:58Z
/blob/evcertautomation/ev2/initial_1152921504628106590.cab
2017-04-17
attachment; filename=initial_1152921504628106563.cab
'''
Thanks in advance
If you have a blob SAS URI like the one posted below, you can easily upload a file to the blob in Python with requests.
https://accnt_name.blob.core.windows.net/scsjc/cexxxxxxxxxx?sv=2017-04-17&sr=b&sig=xxxxxxxxxxxxxx&se=2019-07-10T18:15:58Z&sp=rwl&rscd=attachment%3B filename%3Dinitial_xxxxxxxx.cab
First, you have to inspect the values of the parameters se and sp. The se parameter is the expiry time of the blob SAS URI, and the sp parameter lists the permitted operations on the blob SAS URL, e.g. w for blob write permission.
So for your blob SAS URL above, you have write permission to upload a file to this blob until 2019-07-10T18:15:58Z.
Here is my sample code for uploading via a blob SAS URI.
import requests

blob_sas_uri = '<your blob sas uri, which must include `sp=w`; do the write operation before `se`>'
local_file_name = '<your local file name>'

headers = {
    'x-ms-blob-type': 'BlockBlob'
}
# Open in binary mode so the .cab bytes are sent unmodified
data = open(local_file_name, 'rb').read()
r = requests.put(blob_sas_uri, headers=headers, data=data)
print(r.status_code)
If the result is 201, it worked and the upload succeeded.
For reference, there is a similar official sample, Example: Upload a Blob using a Container's Shared Access Signature, which uses a broader container-level permission.
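If you only have such a container-level SAS rather than a blob SAS, a hedged sketch of the same requests-based upload is to append the blob name to the container URL before the query string; all names below are placeholders, not values from the question.
import requests

# Sketch: upload with a *container* SAS (assumed to grant write/create permission).
container_sas_uri = 'https://<account>.blob.core.windows.net/<container>?<sas-token>'
blob_name = 'initial_package.cab'

base, query = container_sas_uri.split('?', 1)
blob_uri = f'{base}/{blob_name}?{query}'

with open(blob_name, 'rb') as f:
    r = requests.put(blob_uri, headers={'x-ms-blob-type': 'BlockBlob'}, data=f)
print(r.status_code)  # expect 201 on success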
As per the SAS URI you provided: '''https://accnt_name.blob.core.windows.net/scsjc/cexxxxxxxxxx?sv=2017-04-17&sr=b&sig=xxxxxxxxxxxxxx&se=2019-07-10T18:15:58Z&sp=rwl&rscd=attachment%3B filename%3Dinitial_xxxxxxxx.cab'''
The account name should be accnt_name, and the container should be scsjc.
So your code should look like below:
from azure.storage.blob import BlockBlobService

storage_account = "accnt_name"
token = "?sv=2018-03-28&ss=bfqt&srt=sco&sp=rwdlacup&se=2019-04-24T10:01:58Z&st=2019-04-23T02:01:58Z&spr=https&sig=xxxxxxxxx"
container = "scsjc"

blobservice = BlockBlobService(storage_account, sas_token=token)
blobservice.create_blob_from_path(container, local_file_name, full_path_to_file)
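If you prefer not to read those pieces off by eye, here is a small sketch that derives the account, container, blob name, and SAS token from the SAS URI itself; the URI used is the redacted one from the question.
from urllib.parse import urlparse

# Sketch: split a blob SAS URI into the pieces BlockBlobService expects.
sas_uri = 'https://accnt_name.blob.core.windows.net/scsjc/cexxxxxxxxxx?sv=2017-04-17&sr=b&sig=xxxxxxxxxxxxxx&se=2019-07-10T18:15:58Z&sp=rwl'
parsed = urlparse(sas_uri)
storage_account = parsed.netloc.split('.')[0]                  # 'accnt_name'
container, blob_name = parsed.path.lstrip('/').split('/', 1)   # 'scsjc', 'cexxxxxxxxxx'
sas_token = '?' + parsed.query                                 # the token to pass as sas_token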
Here is working Python code:
from azure.storage.blob import BlockBlobService
accountName, key='stagingData', 'vZfqyMyHT3A=='
blobService=BlockBlobService(account_name=accountName, account_key=key)
It seems the blobService client object is created even if I pass wrong account credentials. It is not authorised, but the error shows up only later, when I try to access some data, possibly from some other file, or even when different users try to use it. Is there a way to assert right on the spot whether correct credentials were supplied, and halt the execution if not? For reference, I tried dir(blobService) and that displayed 121 methods and attributes. The ones that seemed sensible from the name show similar results whether the account is actually authenticated or not.
Almost every other API call that uses an access token lets you know right away if the token is not valid, by raising an exception. So I hope there is a way to check this for the BlockBlobService class as well.
As you mentioned, the blobService client object doesn't verify the account credentials. For more information, we can look at the Python source code on GitHub.
The following code is the relevant snippet from the source code. There is no request to the Azure Storage server side, so it does not verify the account credentials.
def create_block_blob_service(self):
    '''
    Creates a BlockBlobService object with the settings specified in the
    CloudStorageAccount.
    :return: A service object.
    :rtype: :class:`~azure.storage.blob.blockblobservice.BlockBlobService`
    '''
    try:
        from azure.storage.blob.blockblobservice import BlockBlobService
        return BlockBlobService(self.account_name, self.account_key,
                                sas_token=self.sas_token,
                                is_emulated=self.is_emulated,
                                endpoint_suffix=self.endpoint_suffix)
    except ImportError:
        raise Exception('The package azure-storage-blob is required. '
                        + 'Please install it using "pip install azure-storage-blob"')
If we want to verify the account credentials, we need to send a request to the Azure Storage server and check the response. If you want to do that, I recommend writing a small test method yourself.
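A minimal sketch of such a check, assuming the legacy azure-storage-blob 2.x package: make one cheap, authenticated request right after constructing the client and halt if it is rejected.
from azure.common import AzureHttpError
from azure.storage.blob import BlockBlobService

def credentials_are_valid(service):
    """Return True if the account credentials are accepted by the service."""
    try:
        # Any request that reaches the server forces signature validation;
        # listing a single container is cheap.
        service.list_containers(num_results=1)
        return True
    except AzureHttpError:
        return False

blobService = BlockBlobService(account_name='stagingData', account_key='vZfqyMyHT3A==')
if not credentials_are_valid(blobService):
    raise SystemExit('Storage credentials were rejected; halting execution.')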