I am trying to get the credential token expiration.
from azure.identity import ClientSecretCredential
from azure.storage.blob import BlobServiceClient

# tenant_id, client_id and client_secret are placeholders for the app registration values.
token_credential = ClientSecretCredential(tenant_id, client_id, client_secret)
# account_url is the blob endpoint, e.g. "https://<account>.blob.core.windows.net"
blob_service_client = BlobServiceClient(account_url=account_url, credential=token_credential)
It seems like the token is encapsulated in ClientSecretCredential, but there is no public API to get it. Is there a way I can see the expiration of the token?
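The credential can hand you a token directly, and the returned token object carries its own expiry. A minimal sketch of reading it (the storage scope below is just an example, not from the original code):
# Request a token explicitly; the returned AccessToken exposes `expires_on` as a Unix timestamp.
import datetime
token = token_credential.get_token("https://storage.azure.com/.default")
print("Token expires at:", datetime.datetime.fromtimestamp(token.expires_on, tz=datetime.timezone.utc))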
I'd like to read blobs in a storage container, based on this Quickstart. I've already been assigned the Storage Account Contributor role. Using VS Code, I followed the tutorial by starting with az login, then starting my environment locally, and finally executing the function. I get the following warning:
DefaultAzureCredential failed to retrieve a token from the included credentials.
Attempted credentials:
EnvironmentCredential: EnvironmentCredential authentication unavailable. Environment variables are not fully configured.
Visit https://aka.ms/azsdk/python/identity/environmentcredential/troubleshoot to troubleshoot this issue.
ManagedIdentityCredential: ManagedIdentityCredential authentication unavailable, no response from the IMDS endpoint.
SharedTokenCacheCredential: SharedTokenCacheCredential authentication unavailable. No accounts were found in the cache.
AzureCliCredential: Failed to invoke Azure CLI
To mitigate this issue, please refer to the troubleshooting guidelines here at https://aka.ms/azsdk/python/identity/defaultazurecredential/troubleshoot.
My code, following the tutorial, looks like:
import azure.functions as func
from azure.storage.blob import BlobServiceClient, BlobClient, ContainerClient
from azure.identity import DefaultAzureCredential
def main(req=None) -> func.HttpResponse:
    account_url = "https://<name>.blob.core.windows.net"
    default_credential = DefaultAzureCredential()
    blob_service_client = BlobServiceClient(account_url, credential=default_credential)
    container_client = blob_service_client.get_container_client('<name>')
    blob_list = container_client.list_blobs()
    for b in blob_list:
        return func.HttpResponse(f'{b.name}')
Not sure what else I should do if I've already done the az login part.
I tried in my environment and got the below results:
DefaultAzureCredential failed to retrieve a token from the included credentials.
Attempted credentials:
EnvironmentCredential: EnvironmentCredential authentication unavailable. Environment variables are not fully configured.
Visit https://aka.ms/azsdk/python/identity/environmentcredential/troubleshoot to troubleshoot this issue.
ManagedIdentityCredential: ManagedIdentityCredential authentication unavailable, no response from the IMDS endpoint.
SharedTokenCacheCredential: SharedTokenCacheCredential authentication unavailable. No accounts were found in the cache.
AzureCliCredential: Failed to invoke Azure CLI
To mitigate this issue, please refer to the troubleshooting guidelines here at https://aka.ms/azsdk/python/identity/defaultazurecredential/troubleshoot.
The above error suggests that you need to configure at least one of the credentials, such as EnvironmentCredential, ManagedIdentityCredential, SharedTokenCacheCredential, or AzureCliCredential, so that a token can be retrieved.
You can follow this document to retrieve the token with a credential.
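As a quick sanity check, you can ask AzureCliCredential for a token directly to confirm that az login is being picked up (a minimal sketch; the storage scope is just an example):
from azure.identity import AzureCliCredential

# If this call fails, the AzureCliCredential step inside DefaultAzureCredential will fail too.
cli_credential = AzureCliCredential()
token = cli_credential.get_token("https://storage.azure.com/.default")
print("Token acquired; expires at (Unix time):", token.expires_on)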
I followed the document: in the terminal I ran az login to sign in, and I used the below code to get the list of blobs using an HTTP trigger function.
Code:
import azure.functions as func
import logging
from azure.storage.blob import BlobServiceClient, BlobClient, ContainerClient
from azure.identity import DefaultAzureCredential
def main(req=None) -> func.HttpResponse:
    logging.info('Python HTTP trigger function processed a request.')
    account_url = "https://venkat123.blob.core.windows.net"
    default_credential = DefaultAzureCredential()
    blob_service_client = BlobServiceClient(account_url, credential=default_credential)
    container_name = "test"
    container_client = blob_service_client.get_container_client(container_name)
    blobs = []
    for b in container_client.list_blobs():
        blobs.append(b.name)
        logging.info(b.name)
    return func.HttpResponse(f"Listed blobs in container: {blobs}")
The above code executed successfully and listed blobs from the storage container.
Console and browser output (screenshots) showed the listed blob names.
Is it possible to copy some files in Azure ADLS Gen2 with the Python SDK?
I want to copy some files from one folder to another.
So far I could only find a rename method.
from azure.storage.filedatalake import DataLakeServiceClient
from azure.identity import ClientSecretCredential

tenant_id = azure_tenant_id
client_id = azure_client_id
client_secret = azure_client_secret
credential = ClientSecretCredential(tenant_id, client_id, client_secret)

service = DataLakeServiceClient(account_url="https://xxxx.dfs.core.windows.net/", credential=credential)
file_system_client = service.get_file_system_client(file_system="file_system_1")
directory_client = file_system_client.get_directory_client("Some/directories")

# rename_directory moves the directory; it does not leave a copy behind.
new_dir_name = "some/new_copied_directories"
directory_client.rename_directory(new_name=directory_client.file_system_name + '/' + new_dir_name)
I found nothing in the documentation, but it's such a basic feature that I'm curious whether there is a way to do it.
I've also tried get_blob_client from BlobServiceClient ("azure.storage.blob import BlobClient, BlobServiceClient"), but it seems I cannot connect with a service principal there.
PS: I need to connect via a service principal; my storage does not have public access.
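One workaround with the Data Lake client is to read the source file and write it to the destination path, which is a client-side copy rather than a server-side one. A minimal sketch, assuming the same service-principal credential as above; source_path and dest_path are placeholders:
# Reuses `file_system_client` from the snippet above.
source_path = "some/directories/file.csv"
dest_path = "some/new_copied_directories/file.csv"

# Download the source bytes, then upload them to the new path.
data = file_system_client.get_file_client(source_path).download_file().readall()
file_system_client.get_file_client(dest_path).upload_data(data, overwrite=True)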
I have common code that is used in several projects that creates BlobServiceClient and BlobContainerClient like the following
BlobServiceClient blobServiceClient = new BlobServiceClient(connectionString);
BlobContainerClient containerClient = blobServiceClient.GetBlobContainerClient(containerName);
It then gets all the blobs in the containerClient using var blobs = containerClient.GetBlobs();
I have an AAD app registration created and given access to the storage account. I'm wondering if I can create the BlobServiceClient using the credentials of that app instead of the connection string.
You can use the following section of code. You will need to include the Azure.Identity NuGet package. Don't forget to set your client_id, client_secret and tenantId.
using Azure.Identity;
using Azure.Storage.Blobs;
var credential = new ClientSecretCredential(tenantId, client_id, client_secret);
Uri accountUri = new Uri("https://<storage_acct_name>.blob.core.windows.net/");
BlobServiceClient client = new BlobServiceClient(accountUri, credential);
BlobContainerClient containerClient = client.GetBlobContainerClient("<container>");
var blobs = containerClient.GetBlobs();
But why are you using a service principal? If you are running your application on Azure App Service, you can use a managed identity instead. It will be less admin overhead in the long term, as there are no credentials to manage. In that case, you will need to switch on the managed identity on the App Service and use the DefaultAzureCredential class.
I am trying to authenticate to Google Cloud Functions from SAP CPI to fetch some data from a database. To push data, we use Pub/Sub with a service account access token, and it works perfectly. But the functions need an identity token instead of an access token. We obtain the access token with a Groovy script (no Jenkins). Is it possible to authenticate to the functions with an access token as well? Or to get the identity token without building a whole IAP layer?
You have to call your Cloud Functions (or Cloud Run, it's the same) with a signed identity token.
So you can use a Groovy script to generate a signed identity token. Here is an example:
import com.google.api.client.http.GenericUrl
import com.google.api.client.http.HttpRequest
import com.google.api.client.http.HttpRequestFactory
import com.google.api.client.http.HttpResponse
import com.google.api.client.http.javanet.NetHttpTransport
import com.google.auth.Credentials
import com.google.auth.http.HttpCredentialsAdapter
import com.google.auth.oauth2.IdTokenCredentials
import com.google.auth.oauth2.IdTokenProvider
import com.google.auth.oauth2.ServiceAccountCredentials
import com.google.common.base.Charsets
import com.google.common.io.CharStreams
String myUri = "YOUR_URL";

Credentials credentials = ServiceAccountCredentials
        .fromStream(new FileInputStream(new File("YOUR_SERVICE_ACCOUNT_KEY_FILE")))
        .createScoped("https://www.googleapis.com/auth/cloud-platform");

// Print a raw identity token for the target audience.
String token = ((IdTokenProvider) credentials).idTokenWithAudience(myUri, Collections.EMPTY_LIST).getTokenValue();
System.out.println(token);

// Build credentials that automatically attach an identity token to outgoing requests.
IdTokenCredentials idTokenCredentials = IdTokenCredentials.newBuilder()
        .setIdTokenProvider((ServiceAccountCredentials) credentials)
        .setTargetAudience(myUri)
        .build();

HttpRequestFactory factory = new NetHttpTransport().createRequestFactory(new HttpCredentialsAdapter(idTokenCredentials));
HttpRequest request = factory.buildGetRequest(new GenericUrl(myUri));
HttpResponse httpResponse = request.execute();
System.out.println(CharStreams.toString(new InputStreamReader(httpResponse.getContent(), Charsets.UTF_8)));
The service account key file is required only if you are outside GCP. Otherwise, the default service account is enough, but it must be a service account; your personal user account won't work.
Add this dependency (here in Maven):
<dependency>
    <groupId>com.google.auth</groupId>
    <artifactId>google-auth-library-oauth2-http</artifactId>
    <version>0.20.0</version>
</dependency>
Or you can use a tool that I wrote and open-sourced. I also wrote a Medium article explaining the use cases.
You can only access your secured Cloud Function using an identity token.
1. Create a service account with the roles/cloudfunctions.invoker role.
2. Create a Cloud Function that allows only authenticated requests.
The function URL will look like: https://REGION-PROJECT_ID.cloudfunctions.net/FUNCTION_NAME
from google.oauth2 import service_account
from google.auth.transport.requests import AuthorizedSession
target_audience = 'https://REGION-PROJECT_ID.cloudfunctions.net/FUNCTION_NAME'
creds = service_account.IDTokenCredentials.from_service_account_file(
    '/path/to/svc.json', target_audience=target_audience)
authed_session = AuthorizedSession(creds)
# make authenticated request and print the response, status_code
resp = authed_session.get(target_audience)
print(resp.status_code)
print(resp.text)
Does anyone have any example or documentation on how to connect a service account to the Google Drive API with PyDrive? I managed to do it with the OAuth2 client.
Apparently this should work, but it does not:
from pydrive.auth import GoogleAuth
from pydrive.drive import GoogleDrive
from oauth2client.service_account import ServiceAccountCredentials

# Path to the service account key file (placeholder).
JSON_FILE = "service_account.json"

gauth = GoogleAuth()
scope = ["https://www.googleapis.com/auth/drive"]
gauth.credentials = ServiceAccountCredentials.from_json_keyfile_name(JSON_FILE, scope)
drive = GoogleDrive(gauth)
UPDATE: the error was caused by some missing permissions in Google IAM.
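Once the IAM permissions are in place, a quick way to confirm the service account connection works is to list the files it can see (a minimal sketch; the query is just an example):
# List files visible to the service account to confirm the connection works.
file_list = drive.ListFile({'q': "trashed=false"}).GetList()
for f in file_list:
    print(f['title'], f['id'])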