Azure Python SDK - start or stop VM in a resource group - python-3.x

I am not able to start an Azure VM using Python code without a client ID and client secret.
Can we start or stop an Azure VM in Python without using client_id and a secret?
Here is the code for reference:
from azure.common.credentials import ServicePrincipalCredentials
from azure.mgmt.compute import ComputeManagementClient, ComputeManagementClientConfiguration

credentials = ServicePrincipalCredentials(
    client_id='<client-id>',
    secret='<key>',
    tenant='<tenant-id>'
)
subscription_id = '<subscription-id>'
compute_config = ComputeManagementClientConfiguration(credentials, subscription_id, api_version='2015-05-01-preview')
compute_client = ComputeManagementClient(compute_config)
resource_group_name = '<resource-group>'
vm_name = '<vm-name>'
result = compute_client.virtual_machines.deallocate(resource_group_name, vm_name)
Here we are using a client ID and related values, but I want to stop my Azure VM without needing an application/client ID.

You can use the azure-identity package and DefaultAzureCredential for this. Note that azure-identity credentials do not work with the old ComputeManagementClientConfiguration; with current azure-mgmt-compute versions you pass the credential and subscription ID directly to the client:

from azure.identity import DefaultAzureCredential
from azure.mgmt.compute import ComputeManagementClient

credentials = DefaultAzureCredential()
compute_client = ComputeManagementClient(credentials, subscription_id)

https://learn.microsoft.com/en-us/python/api/overview/azure/identity-readme?view=azure-python
The main advantage is that you can use managed identity (MSI) authentication, so no client ID or secret has to appear in your code.

Related

'ClientSecretCredential' object has no attribute 'signed_session' error occurs when running the code below to get Key Vault info

import os
from azure.identity import ClientSecretCredential
from azure.mgmt.keyvault import KeyVaultManagementClient

subscription_id = os.environ["AZURE_SUBSCRIPTION_ID"]
tenant_id = os.environ["AZURE_TENANT_ID"]
client_id = os.environ["AZURE_CLIENT_ID"]
client_secret = os.environ["AZURE_CLIENT_SECRET"]
credentials = ClientSecretCredential(tenant_id=tenant_id, client_id=client_id, client_secret=client_secret)
kv_client = KeyVaultManagementClient(credentials, subscription_id)
I tried to authenticate using:
credentials = ServicePrincipalCredentials(client_id=client_id, secret=client_secret, tenant=tenant_id)
as well, but I got the following error:
'ServicePrincipalCredentials' object has no attribute 'get_token'. Did you mean: 'set_token'?
Could you explain the cause of the problem and how it could be resolved?
Thanks in advance.
ServicePrincipalCredentials is the deprecated predecessor of ClientSecretCredential and will fail in the same way, so you need to use ClientSecretCredential instead.
I tested the same from my environment using the below code:
AZURE_TENANT_ID = '<Tenant_Id>'
AZURE_CLIENT_ID = '<App_Id>'
AZURE_CLIENT_SECRET = '<Client_Secret>'
AZURE_SUBSCRIPTION_ID = '<Subscription_Id>'

from azure.identity import ClientSecretCredential
from azure.mgmt.keyvault import KeyVaultManagementClient

credentials = ClientSecretCredential(tenant_id=AZURE_TENANT_ID, client_id=AZURE_CLIENT_ID, client_secret=AZURE_CLIENT_SECRET)
kv_client = KeyVaultManagementClient(credentials, AZURE_SUBSCRIPTION_ID)
kv_list = kv_client.vaults.list()
for item in kv_list:
    print(item.name)
The versions I am using are azure-identity == 1.7.1 and azure-mgmt-keyvault == 9.3.0.

Microsoft cloud, how to authenticate the API

To call a Microsoft cloud API you have to authenticate, and the authentication information has to be created somewhere first. For example:
import os
from azure.common.credentials import ServicePrincipalCredentials
from azure.mgmt.network import NetworkManagementClient
subscription_id = os.environ.get(
    'AZURE_SUBSCRIPTION_ID',
    '11111111-1111-1111-1111-111111111111')  # your Azure Subscription Id
credentials = ServicePrincipalCredentials(
    client_id=os.environ['AZURE_CLIENT_ID'],
    secret=os.environ['AZURE_CLIENT_SECRET'],
    tenant=os.environ['AZURE_TENANT_ID']
)
network_client = NetworkManagementClient(credentials, subscription_id)
Where are the values of AZURE_CLIENT_ID, AZURE_CLIENT_SECRET, and AZURE_TENANT_ID created?

Finding the Azure account key with BlobServiceClient fails (Azure Python SDK)

I am using
Name: azure-mgmt-storage
Version: 16.0.0
Summary: Microsoft Azure Storage Management Client Library for Python
Home-page: https://github.com/Azure/azure-sdk-for-python
to generate a report of storage container sizes.
The snippet of my code is below:
from azure.mgmt.storage import StorageManagementClient
from azure.storage.blob import BlobServiceClient

subscription_client = Subscription(tenant=tenant_id, client_id=client_id, secret=client_secret)  # custom helper class
service_principals = subscription_client.credentials
subscription_id = subscription_client.find_subscription_id()
storage_client = StorageManagementClient(credential=service_principals, subscription_id=subscription_id)
storage_account_list = storage_client.storage_accounts.list()
for storage_account in storage_account_list:
    blob_service_client = BlobServiceClient(account_url=storage_account.primary_endpoints.blob, credential=service_principals)
    account_info = blob_service_client.get_service_properties()
    keys = blob_service_client.credential.keys()
When I evaluate the expression blob_service_client.credential, the value is
<azure.identity._credentials.client_secret.ClientSecretCredential object at 0x05747E98>
and blob_service_client.api_version evaluates to 2020-02-10.
Both blob_service_client.credential.account_key and blob_service_client.credential.account_key() raise AttributeError: 'ClientSecretCredential' object has no attribute 'account_key',
and when I try blob_service_client.credential.keys() I get AttributeError: 'ClientSecretCredential' object has no attribute 'keys'.
Can any Azure expert help me out here? Connection strings are another way to approach this problem, where I could use
BlobServiceClient.from_connection_string(connection_string),
but for that I would need to generate the connection_string dynamically, which I am unable to do.
Since you are already using the client secret credential, you can do your storage operation (calculating the storage container size in this case) with that credential directly. Note that in my code below I already had the subscription ID handy, so I did not use a subscription client, but you can certainly keep that part as in your original code.
from azure.identity import ClientSecretCredential
from azure.mgmt.storage import StorageManagementClient
from azure.storage.blob import BlobServiceClient, ContainerClient

tenant_id = '<tenant id>'
client_id = '<client id>'
client_secret = '<secret>'
subscription_id = '<subscription id>'

credentials = ClientSecretCredential(tenant_id=tenant_id, client_id=client_id, client_secret=client_secret)
storage_client = StorageManagementClient(credential=credentials, subscription_id=subscription_id)
storage_account_list = storage_client.storage_accounts.list()
for storage_account in storage_account_list:
    blob_service_client = BlobServiceClient(account_url=storage_account.primary_endpoints.blob, credential=credentials)
    containers = blob_service_client.list_containers()
    for container in containers:
        container_client = ContainerClient(account_url=storage_account.primary_endpoints.blob, credential=credentials, container_name=container.name)
        blobs = container_client.list_blobs()
        container_size = 0
        for blob in blobs:
            container_size = container_size + blob.size
        print('Storage Account: ' + storage_account.name + ' ; Container: ' + container.name + ' ; Size: ' + str(container_size))

Fail to create a dataset using the Azure SDK for Python for Azure Data Factory

I am trying to create a dataset in ADF using the Azure SDK for Python, but unfortunately I am running into the error message below. I am not sure what is wrong with my code.
dsOut_name = 'POC_DatasetName'
ds_ls ="AzureBlobStorage"
output_blobpath = '/tempdir'
df_name = 'pipeline1'
dsOut_azure_blob = AzureBlobDataset(linked_service_name=ds_ls, folder_path=output_blobpath)
dsOut = adf_client.datasets.create_or_update(rg_name, df_name, dsOut_name, dsOut_azure_blob)
print_item(dsOut)
Error message: SerializationError: Unable to build a model: Unable to deserialize to object: type, AttributeError: 'str' object has no attribute 'get', DeserializationError: Unable to deserialize to object: type, AttributeError: 'str' object has no attribute 'get'
Help, please.
I can reproduce your issue: the line ds_ls = "AzureBlobStorage" is wrong; it should be ds_ls = LinkedServiceReference(reference_name=ls_name).
You could refer to my complete working sample.
Make sure your service principal has an RBAC role (e.g. Owner or Contributor) in the Access control (IAM) of your data factory and that you have completed all the prerequisites.
My package version:
azure-mgmt-datafactory 0.6.0
azure-mgmt-resource 3.1.0
azure-common 1.1.23
Code:
from azure.common.credentials import ServicePrincipalCredentials
from azure.mgmt.resource import ResourceManagementClient
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import *
subscription_id = '<subscription-id>'
ls_name = 'storageLinkedService'
rg_name = '<group-name>'
df_name = '<datafactory-name>'
credentials = ServicePrincipalCredentials(client_id='<client id of the service principal>',
                                          secret='<secret of the service principal>',
                                          tenant='<tenant-id>')
resource_client = ResourceManagementClient(credentials, subscription_id)
adf_client = DataFactoryManagementClient(credentials, subscription_id)
storage_string = SecureString('DefaultEndpointsProtocol=https;AccountName=<storage account name>;AccountKey=<storage account key>')
ls_azure_storage = AzureStorageLinkedService(connection_string=storage_string)
ls = adf_client.linked_services.create_or_update(rg_name, df_name, ls_name, ls_azure_storage)
ds_ls = LinkedServiceReference(reference_name=ls_name)
# Create an Azure blob dataset (output)
dsOut_name = 'ds_out'
output_blobpath = '<container name>/<folder name>'
dsOut_azure_blob = AzureBlobDataset(linked_service_name=ds_ls, folder_path=output_blobpath)
dsOut = adf_client.datasets.create_or_update(rg_name, df_name, dsOut_name, dsOut_azure_blob)
print(dsOut)

ApplicationsOperations object construction

I want to automate application creation in Azure with Python; my goal is to execute it from AWS Lambda.
I have found the ApplicationsOperations class, but I don't understand how to use it.
For the client parameter it's OK with a GraphRbacManagementClient object, but I don't know how to construct the config, serializer and deserializer parameters.
Does anyone have a code sample for ApplicationsOperations?
You don't use it directly; you create a GraphRbac client and use its applications attribute:
https://learn.microsoft.com/en-us/python/api/overview/azure/graph-rbac?view=azure-python
from azure.graphrbac import GraphRbacManagementClient
from azure.common.credentials import UserPassCredentials

credentials = UserPassCredentials(
    'user@domain.com',  # your user
    'my_password',      # your password
    resource="https://graph.windows.net"
)
tenant_id = "myad.onmicrosoft.com"
graphrbac_client = GraphRbacManagementClient(
    credentials,
    tenant_id
)
apps = list(graphrbac_client.applications.list(
    filter="displayName eq 'pytest_app'"
))
