I have successfully retrieved Azure Storage table details using the following code:
HttpWebRequest request = (HttpWebRequest)HttpWebRequest.Create("https://" + storageAccountName + ".table.core.windows.net/" + tableName);
request.Method = "GET";
request.Accept = "application/json";
var date = DateTime.UtcNow.ToString("R", System.Globalization.CultureInfo.InvariantCulture);
request.Headers["x-ms-date"] = date;
request.Headers["x-ms-version"] = "2015-04-05";
string stringToSign = date + "\n/" + storageAccount + "/" + tableName; //Canonicalized Resource
System.Security.Cryptography.HMACSHA256 hasher = new System.Security.Cryptography.HMACSHA256(Convert.FromBase64String("accessKey"));
string strAuthorization = "SharedKeyLite " + storageAccountName + ":" + System.Convert.ToBase64String(hasher.ComputeHash(System.Text.Encoding.UTF8.GetBytes(stringToSign)));
request.Headers["Authorization"] = strAuthorization;
Task<WebResponse> response = request.GetResponseAsync();
HttpWebResponse responseresult = (HttpWebResponse)response.Result;
But when trying to get the list of tables in a storage account using the following REST API, an exception occurred: "The remote server returned an error: (403) Forbidden."
https://myaccount.table.core.windows.net/Tables
I assumed that the Canonicalized Resource should be different for this REST request and went through some Microsoft documentation, but I could not find any reference on how to construct it for the List Tables REST API.
Please help me in retrieving Azure Storage account tables list.
Please change the following line of code:
string stringToSign = date + "\n/" + storageAccount + "/" + tableName;
to
string stringToSign = date + "\n/" + storageAccount + "/Tables";
Also, please note that your request URL will also change to https://storageaccount.table.core.windows.net/Tables.
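For reference, the same SharedKeyLite signature for the table-listing request can be sketched in Python with only the standard library. The account name and key below are dummies; substitute your real values (a real storage key is already a base64 string):

```python
import base64
import hashlib
import hmac

# Hypothetical values for illustration only.
account_name = 'myaccount'
access_key = base64.b64encode(b'dummy-account-key').decode()  # stands in for a real base64 key

date = 'Thu, 04 Mar 2021 12:00:00 GMT'  # must match the x-ms-date header exactly
string_to_sign = date + '\n/' + account_name + '/Tables'

# HMAC-SHA256 over the string-to-sign, keyed with the decoded account key
digest = hmac.new(base64.b64decode(access_key),
                  string_to_sign.encode('utf-8'),
                  hashlib.sha256).digest()
authorization = 'SharedKeyLite ' + account_name + ':' + base64.b64encode(digest).decode()
print(authorization)
```

The request itself would then be a GET to `https://myaccount.table.core.windows.net/Tables` carrying this `Authorization` header along with `x-ms-date` and `x-ms-version`.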
Container name: task
File path examples:
- task/input/data.csv
- task/input/cost.txt
from urllib.parse import urlparse
from azure.storage.blob import BlobClient
sasUrl = "sas url here"
blobNameWithContainer = "file path"
sasUrlParts = urlparse(sasUrl)
accountEndpoint = sasUrlParts.scheme + '://' + sasUrlParts.netloc
sasToken = sasUrlParts.query
blobSasUrl = accountEndpoint + '/' + blobNameWithContainer + '?' + sasToken
blobClient = BlobClient.from_blob_url(blobSasUrl)
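With dummy values substituted, the recombination above behaves like this (the SAS URL and query string here are made up purely for illustration):

```python
from urllib.parse import urlparse

sas_url = "https://myaccount.blob.core.windows.net/?sv=2020-08-04&sig=abc"  # hypothetical SAS URL
blob_name_with_container = "task/input/data.csv"

parts = urlparse(sas_url)
account_endpoint = parts.scheme + '://' + parts.netloc   # https://myaccount.blob.core.windows.net
sas_token = parts.query                                  # sv=2020-08-04&sig=abc

# Splice the container-qualified blob path between the endpoint and the SAS query
blob_sas_url = account_endpoint + '/' + blob_name_with_container + '?' + sas_token
print(blob_sas_url)
```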
Working example to generate a valid url (including signature) for the Huobi API.
In the Huobi API documentation there is no explicit example that allows you to verify your signature creation method step by step.
My intention is to create that here, but I need help, because I haven't managed to get it working yet.
The following is supposed to be the recipe.
Note that once you have this working, substitute valid values for your API key + secret and timestamp:
import hmac
import hashlib
import base64
from urllib.parse import urlencode
API_KEY = 'dummy-key'
API_SECRET = 'dummy-secret'
timestamp = '2021-03-04T11:36:39'
params_dict = {
    'AccessKeyId': API_KEY,
    'SignatureMethod': 'HmacSHA256',
    'SignatureVersion': '2',
    'Timestamp': timestamp
}
params_url_enc = urlencode(sorted(params_dict.items()))
pre_signed = 'GET\n'
pre_signed += 'api.huobi.pro\n'
pre_signed += '/v1/account/accounts\n'
pre_signed += params_url_enc
sig_bytes = hmac.new(
    API_SECRET.encode(),
    pre_signed.encode(),
    hashlib.sha256).hexdigest().encode()
sig_b64_bytes = base64.b64encode(sig_bytes)
sig_b64_str = sig_b64_bytes.decode()
sig_url = urlencode({'Signature': sig_b64_str})
url = 'https://api.huobi.pro/v1/account/accounts?'
url += params_url_enc + '&'
url += sig_url
print('API_KEY={}'.format(API_KEY))
print('API_SECRET={}'.format(API_SECRET))
print('timestamp={}'.format(timestamp))
print('params_dict={}'.format(params_dict))
print('params_url_enc={}'.format(params_url_enc))
print('pre_signed:\n{}'.format(pre_signed))
print('sig_bytes={}'.format(sig_bytes))
print('sig_b64_bytes={}'.format(sig_b64_bytes))
print('sig_b64_str={}'.format(sig_b64_str))
print('sig_url={}'.format(sig_url))
print('url={}'.format(url))
Gives:
API_KEY=dummy-key
API_SECRET=dummy-secret
timestamp=2021-03-04T11:36:39
params_dict={'AccessKeyId': 'dummy-key', 'SignatureMethod': 'HmacSHA256', 'SignatureVersion': '2', 'Timestamp': '2021-03-04T11:36:39'}
params_url_enc=AccessKeyId=dummy-key&SignatureMethod=HmacSHA256&SignatureVersion=2&Timestamp=2021-03-04T11%3A36%3A39
pre_signed:
GET
api.huobi.pro
/v1/account/accounts
AccessKeyId=dummy-key&SignatureMethod=HmacSHA256&SignatureVersion=2&Timestamp=2021-03-04T11%3A36%3A39
sig_bytes=b'1921de9f42284bc0449c5580f52a9f7e7e3a54a6e8befc0d320992e757517a6b'
sig_b64_bytes=b'MTkyMWRlOWY0MjI4NGJjMDQ0OWM1NTgwZjUyYTlmN2U3ZTNhNTRhNmU4YmVmYzBkMzIwOTkyZTc1NzUxN2E2Yg=='
sig_b64_str=MTkyMWRlOWY0MjI4NGJjMDQ0OWM1NTgwZjUyYTlmN2U3ZTNhNTRhNmU4YmVmYzBkMzIwOTkyZTc1NzUxN2E2Yg==
sig_url=Signature=MTkyMWRlOWY0MjI4NGJjMDQ0OWM1NTgwZjUyYTlmN2U3ZTNhNTRhNmU4YmVmYzBkMzIwOTkyZTc1NzUxN2E2Yg%3D%3D
url=https://api.huobi.pro/v1/account/accounts?AccessKeyId=dummy-key&SignatureMethod=HmacSHA256&SignatureVersion=2&Timestamp=2021-03-04T11%3A36%3A39&Signature=MTkyMWRlOWY0MjI4NGJjMDQ0OWM1NTgwZjUyYTlmN2U3ZTNhNTRhNmU4YmVmYzBkMzIwOTkyZTc1NzUxN2E2Yg%3D%3D
Also add this header when sending:
{"Content-Type": "application/x-www-form-urlencoded"}
Unfortunately, when I substitute my own valid API key + secret and a proper UTC time stamp, I invariably receive:
{"status":"error","err-code":"api-signature-not-valid","err-msg":"Signature not valid: Verification failure [校验失败]","data":null}
So what is going wrong here?
The Huobi API documentation is at
https://huobiapi.github.io/docs/spot/v1/en/#introduction
To get all accounts, use the endpoint GET /v1/account/accounts:
from datetime import datetime
import requests
import json
import hmac
import hashlib
import base64
from urllib.parse import urlencode
#Get all Accounts of the Current User
AccessKeyId = 'xxxxx-xxxxx-xxxxx-xxxxx'
SecretKey = 'xxxxx-xxxxx-xxxxx-xxxxx'
timestamp = str(datetime.utcnow().isoformat())[0:19]
params = urlencode({'AccessKeyId': AccessKeyId,
                    'SignatureMethod': 'HmacSHA256',
                    'SignatureVersion': '2',
                    'Timestamp': timestamp
                    })
method = 'GET'
endpoint = '/v1/account/accounts'
base_uri = 'api.huobi.pro'
pre_signed_text = method + '\n' + base_uri + '\n' + endpoint + '\n' + params
hash_code = hmac.new(SecretKey.encode(), pre_signed_text.encode(), hashlib.sha256).digest()
signature = urlencode({'Signature': base64.b64encode(hash_code).decode()})
url = 'https://' + base_uri + endpoint + '?' + params + '&' + signature
response = requests.request(method, url)
accts = json.loads(response.text)
print(accts)
Subsequently, if you need to call another endpoint (note that the timestamp allowance is ±5 minutes),
for example to get an account balance, use GET /v1/account/accounts/{account_id}/balance:
#Get Account Balance of a Specific Account
account_id = accts['data'][0]['id']
method = 'GET'
endpoint = '/v1/account/accounts/{}/balance'.format(account_id)
pre_signed_text = method + '\n' + base_uri + '\n' + endpoint + '\n' + params
hash_code = hmac.new(SecretKey.encode(), pre_signed_text.encode(), hashlib.sha256).digest()
signature = urlencode({'Signature': base64.b64encode(hash_code).decode()})
url = 'https://' + base_uri + endpoint + '?' + params + '&' + signature
response = requests.request(method, url)
r = json.loads(response.text)
print(r)
The mistake was that I took the hexdigest of the hash, whereas the digest was needed.
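The difference is easy to see in isolation: hexdigest() yields 64 hex characters, while digest() yields the 32 raw bytes that should be base64-encoded. The secret and payload below are dummies:

```python
import base64
import hashlib
import hmac

secret = b'dummy-secret'
payload = b'GET\napi.huobi.pro\n/v1/account/accounts\n...'

# Wrong: base64 of the hex string (what the broken recipe produced)
wrong = base64.b64encode(hmac.new(secret, payload, hashlib.sha256).hexdigest().encode())
# Right: base64 of the raw digest bytes
right = base64.b64encode(hmac.new(secret, payload, hashlib.sha256).digest())

print(len(wrong), len(right))  # 88 vs 44 characters
```

A 32-byte digest base64-encodes to 44 characters; the 64-character hex string balloons to 88, which the server rejects.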
Here is a working recipe that you can check numerically to validate your code:
import hmac
import hashlib
import base64
from urllib.parse import urlencode
API_KEY = 'dummy-key'
API_SECRET = 'dummy-secret'
timestamp = '2021-03-04T12:54:56'
params_dict = {
    'AccessKeyId': API_KEY,
    'SignatureMethod': 'HmacSHA256',
    'SignatureVersion': '2',
    'Timestamp': timestamp
}
params_url_enc = urlencode(
    sorted(params_dict.items(), key=lambda tup: tup[0]))
pre_signed = 'GET\n'
pre_signed += 'api.huobi.pro\n'
pre_signed += '/v1/account/accounts\n'
pre_signed += params_url_enc
sig_bin = hmac.new(
    API_SECRET.encode(),
    pre_signed.encode(),
    hashlib.sha256).digest()
sig_b64_bytes = base64.b64encode(sig_bin)
sig_b64_str = sig_b64_bytes.decode()
sig_url = urlencode({'Signature': sig_b64_str})
url = 'https://api.huobi.pro/v1/account/accounts?'
url += params_url_enc + '&'
url += sig_url
print('API_KEY={}'.format(API_KEY))
print('API_SECRET={}'.format(API_SECRET))
print('timestamp={}'.format(timestamp))
print('params_dict={}'.format(params_dict))
print('params_url_enc={}'.format(params_url_enc))
print('pre_signed:\n{}'.format(pre_signed))
print('sig_bin={}'.format(sig_bin))
print('sig_b64_bytes={}'.format(sig_b64_bytes))
print('sig_b64_str={}'.format(sig_b64_str))
print('sig_url={}'.format(sig_url))
print('url={}'.format(url))
Result:
$ python test_huobi_so.py
API_KEY=dummy-key
API_SECRET=dummy-secret
timestamp=2021-03-04T12:54:56
params_dict={'AccessKeyId': 'dummy-key', 'SignatureMethod': 'HmacSHA256', 'SignatureVersion': '2', 'Timestamp': '2021-03-04T12:54:56'}
params_url_enc=AccessKeyId=dummy-key&SignatureMethod=HmacSHA256&SignatureVersion=2&Timestamp=2021-03-04T12%3A54%3A56
pre_signed:
GET
api.huobi.pro
/v1/account/accounts
AccessKeyId=dummy-key&SignatureMethod=HmacSHA256&SignatureVersion=2&Timestamp=2021-03-04T12%3A54%3A56
sig_bin=b'_\xb9k\x82!\xb4B%A\xfe\x0c \xff\x07%JE\xbe\x82\x8b-<^\xb7\xfc\x06\x85G\xb5$\x81\xd7'
sig_b64_bytes=b'X7lrgiG0QiVB/gwg/wclSkW+gostPF63/AaFR7Ukgdc='
sig_b64_str=X7lrgiG0QiVB/gwg/wclSkW+gostPF63/AaFR7Ukgdc=
sig_url=Signature=X7lrgiG0QiVB%2Fgwg%2FwclSkW%2BgostPF63%2FAaFR7Ukgdc%3D
url=https://api.huobi.pro/v1/account/accounts?AccessKeyId=dummy-key&SignatureMethod=HmacSHA256&SignatureVersion=2&Timestamp=2021-03-04T12%3A54%3A56&Signature=X7lrgiG0QiVB%2Fgwg%2FwclSkW%2BgostPF63%2FAaFR7Ukgdc%3D
I am running this code to obtain a Bearer token from the InteractiveBrowserCredential and log in to Azure Blob storage:
cred = InteractiveBrowserCredential(authority="login.microsoftonline.com", tenant_id="**", client_id="**")
token = cred.get_token()
print(token)
blobber = BlobServiceClient(account_url="https://**.blob.core.windows.net", credential=cred)
blobs = blobber.list_containers()
for b in blobs:
    print(b)
This works well.
I am trying to reuse the token in another call, this time a direct rest interaction:
import requests
auth_header = ("Authorization", "Bearer " + "***")
version = ("x-ms-version", "2017-11-09")
response = requests.get("https://***.blob.core.windows.net/?comp=list", headers=dict([auth_header, version]))
I get a 403 response saying:
Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
According to official documentation, this should be working.
What am I missing?
According to my research, when you request an AD access token to call Azure Blob storage, the scope must contain https://storage.azure.com/user_impersonation or https://storage.azure.com/.default. For more details, please refer to the documentation. In other words, the request URL should look like
https://login.microsoftonline.com/<tenant id>/oauth2/v2.0/authorize?client_id=<>
&scope=https://storage.azure.com/user_impersonation
&...
But when I run cred.get_token(), the request URL looks like the one below. The scope does not contain https://storage.azure.com/user_impersonation or https://storage.azure.com/.default, so you cannot call the Azure Blob REST API with that token.
https://login.microsoftonline.com/<tenant id>/oauth2/v2.0/authorize?
client_id=<>
&scope=offline_access openid profile&state=204238ac-4fcd-44f2-9eed-528ab4d9c37
&...
Meanwhile, I did a test: if we run the code blob_service_client = BlobServiceClient(account_url="https://blobstorage0516.blob.core.windows.net/", credential=cred), the request URL is
https://login.microsoftonline.com/<tenant id>/oauth2/v2.0/authorize?
client_id=<>
&scope=https://storage.azure.com/.default offline_access openid profile&state=204238ac-4fcd-44f2-9eed-528ab4d9c37
&...
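As an aside, one way to check which audience a token was actually issued for is to decode its (unverified) JWT payload. The helper and dummy token below are purely illustrative; do not use unverified decoding for anything security-sensitive:

```python
import base64
import json

def jwt_audience(token):
    """Decode the (unverified) payload of a JWT and return its 'aud' claim."""
    payload_b64 = token.split('.')[1]
    payload_b64 += '=' * (-len(payload_b64) % 4)  # restore stripped base64 padding
    return json.loads(base64.urlsafe_b64decode(payload_b64)).get('aud')

# Build a dummy, unsigned token just to demonstrate the decoding.
header = base64.urlsafe_b64encode(b'{"alg":"none"}').decode().rstrip('=')
payload = base64.urlsafe_b64encode(
    json.dumps({'aud': 'https://storage.azure.com'}).encode()).decode().rstrip('=')
dummy_token = header + '.' + payload + '.'

print(jwt_audience(dummy_token))  # https://storage.azure.com
```

If the audience of your real token is not the storage resource, the Blob REST API will reject it with a 403 like the one above.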
Here is my solution:
from azure.identity import DeviceCodeCredential

class InteractiveAuthentication():
    def __init__(self):
        self.tenant_id: str = ""
        self.authority: str = ""
        self.client_id: str = ""
        self.resource_id: str = ""
        self.scope: str = f"{self.resource_id}/.default"
        self.token: str = ""

    def get_access_token(self):
        credential = DeviceCodeCredential(
            exclude_interactive_browser_credential=True,
            disable_automatic_authentication=True,
            tenant_id=self.tenant_id,
            authority=self.authority,
            client_id=self.client_id
        )
        # note: _request_token is a private azure-identity method
        self.token = credential._request_token(self.scope)
        return self.token['access_token']
I am trying to retrieve audit logs from Azure Data Lake Storage Gen2.
So far I have tried using AzCopy and the REST API (unsupported for now in Gen2) to retrieve the audit logs, and I am looking for an alternative solution. AzCopy uses nothing but API-based calls under the hood, and when I tried to retrieve the logs I got an error saying that API calls are not supported for hierarchical namespace accounts. Image added for reference.
Snapshot of AZCOPY error
Is there any workaround for this use case or any other approach which I can try to retrieve logs?
Update:
I can get the file content from ADLS Gen2 with the Read API. I can provide an example written in Python (you can port my code to any other language). With the code below, you can directly get the file content, or get the Authorization header value, which can then be used in Postman.
The Python 3.7 code is below:
import requests
import datetime
import hmac
import hashlib
import base64
storage_account_name = 'xxx'
storage_account_key = 'xxx'
api_version = '2018-11-09'
request_time = datetime.datetime.utcnow().strftime('%a, %d %b %Y %H:%M:%S GMT')
#the file path on adls gen2
FILE_SYSTEM_NAME='dd1/myfile.txt'
string_params = {
    'verb': 'GET',
    'Content-Encoding': '',
    'Content-Language': '',
    'Content-Length': '',
    'Content-MD5': '',
    'Content-Type': '',
    'Date': '',
    'If-Modified-Since': '',
    'If-Match': '',
    'If-None-Match': '',
    'If-Unmodified-Since': '',
    'Range': '',
    'CanonicalizedHeaders': 'x-ms-date:' + request_time + '\nx-ms-version:' + api_version,
    'CanonicalizedResource': '/' + storage_account_name + '/' + FILE_SYSTEM_NAME
}
string_to_sign = (string_params['verb'] + '\n'
                  + string_params['Content-Encoding'] + '\n'
                  + string_params['Content-Language'] + '\n'
                  + string_params['Content-Length'] + '\n'
                  + string_params['Content-MD5'] + '\n'
                  + string_params['Content-Type'] + '\n'
                  + string_params['Date'] + '\n'
                  + string_params['If-Modified-Since'] + '\n'
                  + string_params['If-Match'] + '\n'
                  + string_params['If-None-Match'] + '\n'
                  + string_params['If-Unmodified-Since'] + '\n'
                  + string_params['Range'] + '\n'
                  + string_params['CanonicalizedHeaders'] + '\n'
                  + string_params['CanonicalizedResource'])
signed_string = base64.b64encode(hmac.new(base64.b64decode(storage_account_key), msg=string_to_sign.encode('utf-8'), digestmod=hashlib.sha256).digest()).decode()
#print out the datetime
print(request_time)
#print out the Authorization
print('SharedKey ' + storage_account_name + ':' + signed_string)
headers = {
'x-ms-date' : request_time,
'x-ms-version' : api_version,
'Authorization' : ('SharedKey ' + storage_account_name + ':' + signed_string)
}
url = ('https://' + storage_account_name + '.dfs.core.windows.net/'+FILE_SYSTEM_NAME)
#print out the url
print(url)
r = requests.get(url, headers = headers)
#print out the file content
print(r.text)
After running the code, the file content is fetched:
And you can also use the generated values (such as the authorization and date) from the code above in Postman:
As you may know, the SDK is not yet ready for Azure Data Lake Gen2, so as of now the solution is to use the ADLS Gen2 Read API.
After retrieving the content of the file, you can save it.
You can handle the authentication in your own way. If you have any issues reading with the ADLS Gen2 API, please feel free to let me know.
ADLS Gen2 $logs are now available when you sign up for Multi Protocol Access in ADLS Gen2. A blog describing Multi Protocol Access can be found at http://aka.ms/mpaadls. You can sign up for access here.
Enabling logs in the Azure portal is not currently supported. Here's an example of how to enable the logs by using PowerShell.
$storageAccount = Get-AzStorageAccount -ResourceGroupName <resourceGroup> -Name <storageAccountName>
Set-AzStorageServiceLoggingProperty -Context $storageAccount.Context -ServiceType Blob -LoggingOperations read,write,delete -RetentionDays <days>
To consume logs, you can use AzCopy and SDKs today. You cannot view $logs in Azure Storage Explorer for the time being.
With November 2019 (Version 1.11.1) release of Azure Storage Explorer, it is now possible to view hidden containers such as $logs
I've been struggling with this for about a day now. I am testing direct to Azure Blob storage upload and getting the dreaded CORS issue. "XMLHttpRequest cannot load https://tempodevelop.blob.core.windows.net/tmp/a4d8e867-f13e-343f-c6d3-a603…Ym0PlrBn%2BU/UzUs7QUhQw%3D&sv=2014-02-14&se=2016-10-12T17%3A59%3A26.638531. Response to preflight request doesn't pass access control check: No 'Access-Control-Allow-Origin' header is present on the requested resource. Origin 'http://localhost:8000' is therefore not allowed access. The response had HTTP status code 403."
Things I have already tried:
set the CORS rules to allow all hosts
tried hosting my app locally and on heroku
made sure that I could upload a file using a different tool (Azure Storage Explorer)
configured my AccessPolicy to 'rwdl' and I am definitely getting an access signature (verified in unit tests).
The code as a whole is available here: https://github.com/mikebz/azureupload
But the relevant parts are here, front end upload:
<script>
/*
* not a true GUID, see here: http://stackoverflow.com/questions/105034/create-guid-uuid-in-javascript
*/
function guid() {
    function s4() {
        return Math.floor((1 + Math.random()) * 0x10000)
            .toString(16)
            .substring(1);
    }
    return s4() + s4() + '-' + s4() + '-' + s4() + '-' +
        s4() + '-' + s4() + s4() + s4();
}
function startUpload() {
    var fileName = guid();
    jQuery.getJSON("/formfileupload/signature/" + fileName, function(data) {
        console.log("got a signature: " + data.bloburl);
        uploadFile(data.bloburl, data.signature);
    })
    .fail(function(jqxhr, textStatus, error) {
        console.log("error: " + textStatus + " - " + error);
    })
}
function uploadFile(bloburl, signature) {
    var xhr = new XMLHttpRequest();
    fileData = document.getElementById('fileToUpload').files[0];
    xhr.open("PUT", bloburl + "?" + signature);
    xhr.setRequestHeader('x-ms-blob-type', 'BlockBlob');
    xhr.setRequestHeader('x-ms-blob-content-type', fileData.type);
    result = xhr.send(fileData);
}
</script>
The signature generation code in python is here:
def generate_access_signature(self, filename):
    """
    Calls the Azure web service to generate a temporary access signature.
    """
    blob_service = BlobService(
        account_name=self.account_name,
        account_key=self.account_key
    )
    expire_at = datetime.utcnow() + timedelta(seconds=30)
    access_policy = AccessPolicy(permission="rwdl", expiry=expire_at.isoformat())
    sas_token = blob_service.generate_shared_access_signature(
        container_name="tmp",
        blob_name=filename,
        shared_access_policy=SharedAccessPolicy(access_policy)
    )
    return sas_token
According to the error message [The response had HTTP status code 403], it may be that CORS is not enabled for the service, or that no CORS rule matches the preflight request. For details, please refer to Cross-Origin Resource Sharing (CORS) Support for the Azure Storage Services.
Or the SAS signature may be incorrect.
To troubleshoot, please try the following:
Check the CORS settings in the Azure portal under the Blob service; there are separate settings for the Table, Queue, and File services.
Use a tool such as Azure Storage Explorer to generate a SAS token, then try to debug the request with that generated SAS.
Thanks to Tom and Microsoft's support, the issue has been resolved.
Solution part #1 - make sure you use the Azure Storage Library for Python version 0.33 or later.
Here is my requirements file:
azure-common==1.1.4
azure-nspkg==1.0.0
azure-storage==0.33.0
cffi==1.8.3
cryptography==1.5.2
dj-database-url==0.4.1
Django==1.10.2
enum34==1.1.6
futures==3.0.5
gunicorn==19.6.0
idna==2.1
ipaddress==1.0.17
pep8==1.7.0
psycopg2==2.6.2
pyasn1==0.1.9
pycparser==2.16
python-dateutil==2.5.3
requests==2.11.1
six==1.10.0
whitenoise==3.2.2
The second issue is generating the signature. The code that generates the correct signature is below:
from azure.storage.blob import BlockBlobService, ContainerPermissions
from datetime import datetime, timedelta
class AzureUtils:

    def __init__(self, account_name, account_key):
        if account_name is None:
            raise ValueError("account_name should not be None")
        if account_key is None:
            raise ValueError("account_key should not be None")
        self.account_name = account_name
        self.account_key = account_key

    def generate_access_signature(self, filename):
        """
        Calls the Azure web service to generate a temporary access signature.
        """
        block_blob_service = BlockBlobService(
            account_name=self.account_name,
            account_key=self.account_key
        )
        expire_at = datetime.utcnow() + timedelta(seconds=30)
        permissions = (ContainerPermissions.READ | ContainerPermissions.WRITE
                       | ContainerPermissions.DELETE | ContainerPermissions.LIST)
        sas_token = block_blob_service.generate_container_shared_access_signature(
            "tmp",
            permission=permissions,
            expiry=expire_at
        )
        return sas_token
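Once the container SAS token comes back, the front end simply appends it to the blob URL for the PUT. With made-up placeholder values, the resulting URL has this shape (the token string below is not a real signature):

```python
# All values here are hypothetical placeholders.
account = 'tempodevelop'
container = 'tmp'
blob_name = 'a4d8e867-f13e-343f'
sas_token = 'sv=2015-07-08&sig=abc%3D&sp=rwdl'  # stands in for the returned SAS token

upload_url = 'https://{}.blob.core.windows.net/{}/{}?{}'.format(
    account, container, blob_name, sas_token)
print(upload_url)
```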
The solution can also be retrieved here: https://github.com/mikebz/azureupload