Get Azure VM related details using Microsoft.Azure.Management.Fluent - azure

I am trying to use Microsoft.Azure.Management.Fluent and the related set of packages.
I need the following details about ALL Azure VMs in my subscription:
who created the VM, the region of the VM, the VM size, and the current status of the VM (Stopped / Running / Deallocated, etc.).
I also need
the history of each VM, i.e. how long it was up and running over the last x months/weeks.
Is this possible using the Microsoft.Azure.Management.Fluent packages?

If you want to know the VM start and stop times, you can get them from the Azure activity log. To retrieve the activity log, you can use the Microsoft.Azure.Management.Monitor.Fluent package.
For example
Create a service principal and assign an Azure RBAC role to it (I use the Azure CLI)
az login
# this creates a service principal and assigns the Contributor role to it
az ad sp create-for-rbac -n "jonsp2"
Install package
// for more details about the package, please refer to https://www.nuget.org/packages/Microsoft.Azure.Management.Fluent/
Install-Package Microsoft.Azure.Management.Fluent -Version 1.34.0
Code
AzureCredentials credentials = SdkContext.AzureCredentialsFactory.FromServicePrincipal(
    clientId,     // the sp appId
    clientSecret, // the sp password
    tenantId,     // the sp tenant
    AzureEnvironment.AzureGlobalCloud);
var azure = Microsoft.Azure.Management.Fluent.Azure.Configure()
    .Authenticate(credentials)
    .WithSubscription(subscriptionId);
var vms = await azure.VirtualMachines.ListAsync();
foreach (var vm in vms)
{
    var status = vm.PowerState.Value; // vm power state
    var region = vm.RegionName;       // vm region
    var size = vm.Size.Value;         // vm size
    // query the activity log for this VM's resource id
    var logs = await azure.ActivityLogs.DefineQuery()
        .StartingFrom(DateTime.Now.AddDays(-1))
        .EndsBefore(DateTime.Now)
        .WithAllPropertiesInResponse()
        .FilterByResource(vm.Id)
        .ExecuteAsync();
    List<DateTime?> stopTime = new List<DateTime?>();
    List<DateTime?> startTime = new List<DateTime?>();
    foreach (var log in logs)
    {
        // get stop time
        if ((log.OperationName.LocalizedValue == "Deallocate Virtual Machine") && (log.Status.LocalizedValue == "Succeeded"))
        {
            stopTime.Add(log.EventTimestamp);
        }
        // get start time
        if ((log.OperationName.LocalizedValue == "Start Virtual Machine") && (log.Status.LocalizedValue == "Succeeded"))
        {
            startTime.Add(log.EventTimestamp);
        }
    }
}
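The query above only collects raw start and stop timestamps; to answer the "how long was it up" part of the question you still need to pair them. Below is a minimal sketch of that pairing (my own addition, not part of the Fluent API; it assumes the events pair up chronologically, requires System.Linq, and is meant to sit inside the per-VM loop after the inner foreach):
// Rough uptime estimate from the start/stop lists collected above.
// Assumption: each start is eventually followed by a deallocate; a missing stop means the VM is still running.
startTime.Sort();
stopTime.Sort();
TimeSpan uptime = TimeSpan.Zero;
foreach (var start in startTime)
{
    if (start == null) continue;
    var stop = stopTime.FirstOrDefault(t => t > start); // first stop after this start
    DateTime end = stop ?? DateTime.Now;                // no later stop found: count up to now
    uptime += end - start.Value;
}
Console.WriteLine("Approximate uptime in the query window: " + uptime);
Widen the StartingFrom window (e.g. AddMonths(-x)) to cover the months/weeks you care about.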

Related

What permissions are needed to run the ADF pipeline API

What permissions are needed to run the Azure Data Factory management APIs? For instance, I am trying to execute the Pipeline Runs - Query By Factory API in a Web activity.
Error:
User configuration issue
{"error":{"code":"AuthorizationFailed","message":"The client with object id does not have authorization to perform action 'Microsoft.DataFactory/factories/pipelineruns/read' over scope '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DataFactory/factories/{factoryName}/pipelineruns/{runId}' or the scope is invalid. If access was recently granted, please refresh your credentials."}}
Could you please guide me on how to pass the credentials and get the token for the GET and POST methods?
You need to create an Azure Active Directory application that can access your Data Factory:
Create an Azure Active Directory application. For the sign-on URL, you can provide a dummy URL (https://contoso.org/exampleapp).
Get the values for signing in: note down the application ID and tenant ID, which you will use later.
Under Certificates and secrets, create an authentication key and note down this value as well.
Assign the application to a role: assign it to the Contributor role at the subscription level so that the application can create data factories in the subscription.
After you complete the above steps, create the DataFactoryManagementClient and authenticate your application using the code snippet below:
var context = new AuthenticationContext("https://login.microsoftonline.com/" + tenantID);
ClientCredential cc = new ClientCredential(applicationId, authenticationKey);
AuthenticationResult result = context.AcquireTokenAsync("https://management.azure.com/", cc).Result;
ServiceClientCredentials cred = new TokenCredentials(result.AccessToken);
var client = new DataFactoryManagementClient(cred)
{
    SubscriptionId = subscriptionId
};
Once you've authenticated your application, you can start the Pipeline run using the below code snippet:
// Create a pipeline run
Console.WriteLine("Creating pipeline run...");
CreateRunResponse runResponse = client.Pipelines.CreateRunWithHttpMessagesAsync(
    resourceGroup, dataFactoryName, pipelineName, parameters: parameters
).Result.Body;
Console.WriteLine("Pipeline run ID: " + runResponse.RunId);
Below is the complete code of a console application to run an Azure Data Factory Pipeline:
using Microsoft.Azure.Management.DataFactory;
using Microsoft.Azure.Management.DataFactory.Models;
using Microsoft.IdentityModel.Clients.ActiveDirectory;
using Microsoft.Rest;
using System;

namespace ADF
{
    class Program
    {
        static void Main(string[] args)
        {
            // Set variables
            string tenantID = "<your tenant ID>";
            string applicationId = "<your application ID>";
            string authenticationKey = "<your authentication key for the application>";
            string subscriptionId = "<your subscription ID where the data factory resides>";
            string resourceGroup = "<your resource group where the data factory resides>";
            string dataFactoryName = "<specify the name of the data factory. It must be globally unique.>";
            string pipelineName = "<specify the name of the pipeline to run>";

            // Authenticate and create a data factory management client
            var context = new AuthenticationContext("https://login.microsoftonline.com/" + tenantID);
            ClientCredential cc = new ClientCredential(applicationId, authenticationKey);
            AuthenticationResult result = context.AcquireTokenAsync("https://management.azure.com/", cc).Result;
            ServiceClientCredentials cred = new TokenCredentials(result.AccessToken);
            var client = new DataFactoryManagementClient(cred)
            {
                SubscriptionId = subscriptionId
            };

            // Create a pipeline run
            Console.WriteLine("Creating pipeline run...");
            CreateRunResponse runResponse = client.Pipelines.CreateRunWithHttpMessagesAsync(
                resourceGroup, dataFactoryName, pipelineName
            ).Result.Body;
            Console.WriteLine("Pipeline run ID: " + runResponse.RunId);
        }
    }
}
Don't forget to add the NuGet packages:
Install-Package Microsoft.Azure.Management.DataFactory
Install-Package Microsoft.Azure.Management.ResourceManager -IncludePrerelease
Install-Package Microsoft.IdentityModel.Clients.ActiveDirectory

Access Azure Data Factory V2 programmatically: The Resource Microsoft.DataFactory/dataFactories/ under resource group was not found

I'm trying to access Azure Data Factory V2 programmatically.
First, I created an App Registration in the Azure portal along with a client secret. Then I gave this app registration Contributor permission on the entire subscription, and also on the resource group where my data factory lives.
Using these credentials I'm able to log in and create a DataFactoryManagementClient
private void CreateAdfClient()
{
    var authenticationContext = new AuthenticationContext($"https://login.windows.net/{tenantId}");
    var credential = new ClientCredential(clientId: appRegistrationClientId, clientSecret: appRegistrationClientkey);
    var result = authenticationContext.AcquireTokenAsync(resource: "https://management.core.windows.net/", clientCredential: credential).ConfigureAwait(false).GetAwaiter().GetResult();
    if (result == null)
    {
        throw new InvalidOperationException("Failed to obtain the JWT token");
    }
    var token = result.AccessToken;
    var tokenCloudCredentials = new TokenCloudCredentials(subscriptionId, token);
    datafactoryClient = new DataFactoryManagementClient(tokenCloudCredentials);
}
However, when I try to get my pipeline with
var pipeline = datafactoryClient.Pipelines.Get(resourceGroup, dataFactory, pipelineName);
it throws an error:
System.Private.CoreLib: Exception while executing function: StartRawMeasuresSync. Microsoft.Azure.Management.DataFactories: ResourceNotFound: The Resource 'Microsoft.DataFactory/dataFactories/MyPipeline' under resource group 'MyResGroup' was not found.
I have verified that the resource group, the data factory name, and the pipeline name are correct, but it keeps throwing this error.
I had the same issue, and it was caused by referencing the NuGet package for Azure Data Factory v1 instead of v2.
Version 1: Microsoft.Azure.Management.DataFactories
Version 2: Microsoft.Azure.Management.DataFactory
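The error message itself hints at this: the v1 SDK targets the old Microsoft.DataFactory/dataFactories resource type, which does not exist for a V2 factory. The v2 client also takes ServiceClientCredentials rather than TokenCloudCredentials, so the setup from the question becomes roughly the following sketch (names mirror the question; the datafactoryClient field must be the v2 DataFactoryManagementClient type):
using Microsoft.Azure.Management.DataFactory; // v2 package
using Microsoft.Rest;

// after acquiring the token exactly as in CreateAdfClient() above:
var cred = new TokenCredentials(result.AccessToken);
datafactoryClient = new DataFactoryManagementClient(cred)
{
    SubscriptionId = subscriptionId
};
var pipeline = datafactoryClient.Pipelines.Get(resourceGroup, dataFactory, pipelineName);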

Azure node SDK to get more than 50 virtual machines

I'm using the Azure Node SDK to get all virtual machines for the subscription:
var computeClient = new computeManagementClient.ComputeManagementClient(credentials, subscriptionId);
var clientNetworkManagement = new NetworkManagementClient(credentials, subscriptionId);
computeClient.virtualMachines.listAll(function (err, result) {
    returnResult(result);
});
But I have a subscription with more than 50 VMs, and that call returns at most 50 VMs.
Is it possible to get more than 50 VMs with the computeClient.virtualMachines.listAll function?
https://github.com/Azure-Samples/compute-node-manage-vm
Thx
I tried to reproduce your issue but could not: I can list all VMs with the code below. Before running it, I assigned the Virtual Machine Contributor role (or you can use a higher-level role like Contributor or Owner) to my app registered in Azure AD for my current subscription; you can refer to the official document Manage access to Azure resources using RBAC and the Azure portal for details.
var msRestAzure = require('ms-rest-azure');
var ComputeManagementClient = require('azure-arm-compute');

var clientId = process.env['CLIENT_ID'] || '<your client id>';
var domain = process.env['DOMAIN'] || '<your tenant id>';
var secret = process.env['APPLICATION_SECRET'] || '<your client secret>';
var subscriptionId = process.env['AZURE_SUBSCRIPTION_ID'] || '<your subscription id for listing all VMs in it>';
var computeClient;

msRestAzure.loginWithServicePrincipalSecret(clientId, secret, domain, function (err, credentials, subscriptions) {
    computeClient = new ComputeManagementClient(credentials, subscriptionId);
    computeClient.virtualMachines.listAll(function (err, result) {
        console.log(result.length);
    });
});
On the Azure portal, 155 VMs are listed in my current subscription, as shown in the figures below. However, my code returns only 153 VMs. I don't know why the results differ, but my code's result matches the Azure CLI command az vm list | grep vmId | wc -l.
Fig 1. The number of VMs in my current subscription
Fig 2. The result of my code
Fig 3. The result of Azure CLI command az vm list|grep vmId|wc -l
Based on my experience, I guess your issue was caused by assigning a lower-permission role to your app, so it can only list the VMs it has access to by default.
Any follow-up or update would help me understand what your real issue is; please feel free to let me know.
I don't know if this is the best way to solve the problem, but I found a solution:
msRestAzure.loginWithServicePrincipalSecret(clientId, secret, domain, function (err, credentials, subscriptions) {
    computeClient = new ComputeManagementClient(credentials, subscriptionId);
    computeClient.virtualMachines.listAll(function (err, result, httpRequest, response) {
        let myResult = JSON.parse(response.body);
        console.log(result.length);
        nextLink = myResult.nextLink;
        console.log(nextLink);
        computeClient.virtualMachines.listAllNext(nextLink, function (err, result, request, response) {
            console.log(result.length);
        });
    });
});
The first call (listAll) returns 50 VMs plus a nextLink value.
Then I call listAllNext(nextLink, ...), which returns the remaining 39 VMs. If the subscription has more than 100 VMs, you would keep calling listAllNext while the response still contains a nextLink.

Not able to run Azure Data Factory Pipeline using Visual Studio 2015

I have followed the Azure documentation steps to create a simple Copy Data Factory from Blob to SQL.
Now I want to run the pipeline through code in Visual Studio.
I have checked that the authentication keys and assigned roles are correct.
Below is the code:
var context = new AuthenticationContext("https://login.windows.net/" + tenantID);
ClientCredential cc = new ClientCredential(applicationId, authenticationKey);
AuthenticationResult result = context.AcquireTokenAsync("https://management.azure.com/", cc).Result;
ServiceClientCredentials cred = new TokenCredentials(result.AccessToken);
var client = new DataFactoryManagementClient(cred) { SubscriptionId = subscriptionId };
Console.WriteLine("Creating pipeline run...");
var st = client.Pipelines.Get(resourceGroup, dataFactoryName, pipelineName);
CreateRunResponse runResponse = client.Pipelines.CreateRunWithHttpMessagesAsync(resourceGroup, dataFactoryName, pipelineName).Result.Body;
Console.WriteLine("Pipeline run ID: " + runResponse.RunId);
However, I get a Forbidden error.
The client 'xxxx' with object id 'xxxx' does not have authorization to perform action 'Microsoft.DataFactory/factories/pipelines/read' over scope '/subscriptions/xxxxx/resourceGroups/'
How can I fix this?
The exception message indicates that you haven't assigned the application the corresponding role to access the data factory.
I tested your code with Azure Data Factory (V2) on my side and it works correctly. The following are my detailed steps:
Register an Azure AD web application.
Get the client ID and client secret from the created application.
Assign the application a role so it can access the data factory (see the sketch after these steps).
Test the code.
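As a hedged illustration of the role-assignment step (the role name and scope below are assumptions, not something your error message dictates; the built-in Data Factory Contributor role at resource-group scope covers the Microsoft.DataFactory/factories/pipelines/read action, but a narrower custom role would also work):
az role assignment create --assignee <applicationId> --role "Data Factory Contributor" --scope "/subscriptions/<subscriptionId>/resourceGroups/<resourceGroupName>"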

Upload Azure Batch Job Application Package programmatically

I have found how to upload/manage Azure Batch job Application Packages through the UI:
https://learn.microsoft.com/en-us/azure/batch/batch-application-packages
And how to upload and manage Resource Packages programmatically:
https://github.com/Azure/azure-batch-samples/tree/master/CSharp/GettingStarted/02_PoolsAndResourceFiles
But I can't quite seem to put 2 and 2 together on how to manage Application Packages programmatically. Is there an API endpoint we can call to upload/manage an Application Package when setting up a batch job?
Since this is not quite straightforward, I'll write down my findings.
These are the steps to programmatically upload Application Packages via an application that is unattended - no user input (e.g. Azure credentials) is needed.
In Azure Portal:
Create the Azure Batch application
Create a new Azure AD application (as Application Type use Web app / API)
Follow these steps to create the secret key and assign the role to the Azure Batch account
Note down the following credentials/ids:
Azure AD application id
Azure AD application secret key
Azure AD tenant id
Subscription id
Batch account name
Batch account resource group name
In your code:
Install NuGet packages Microsoft.Azure.Management.Batch, WindowsAzure.Storage and Microsoft.IdentityModel.Clients.ActiveDirectory
Get the access token and create the BatchManagementClient
Call the ApplicationPackageOperationsExtensions.CreateAsync method, which should return an ApplicationPackage
ApplicationPackage contains the StorageUrl which can now be used to upload the Application Package via the storage API
After you have uploaded the ApplicationPackage you have to activate it via ApplicationPackageOperationsExtensions.ActivateAsync
Put together, the whole code looks something like this:
private const string ResourceUri = "https://management.core.windows.net/";
private const string AuthUri = "https://login.microsoftonline.com/" + "{TenantId}";
private const string ApplicationId = "{ApplicationId}";
private const string ApplicationSecretKey = "{ApplicationSecretKey}";
private const string SubscriptionId = "{SubscriptionId}";
private const string ResourceGroupName = "{ResourceGroupName}";
private const string BatchAccountName = "{BatchAccountName}";

private async Task UploadApplicationPackageAsync()
{
    // get the access token
    var authContext = new AuthenticationContext(AuthUri);
    var authResult = await authContext.AcquireTokenAsync(ResourceUri, new ClientCredential(ApplicationId, ApplicationSecretKey)).ConfigureAwait(false);

    // create the BatchManagementClient and set the subscription id
    var bmc = new BatchManagementClient(new TokenCredentials(authResult.AccessToken))
    {
        SubscriptionId = SubscriptionId
    };

    // create the application package
    var createResult = await bmc.ApplicationPackage.CreateWithHttpMessagesAsync(ResourceGroupName, BatchAccountName, "MyPackage", "1.0").ConfigureAwait(false);

    // upload the package to blob storage
    var cloudBlockBlob = new CloudBlockBlob(new Uri(createResult.Body.StorageUrl));
    cloudBlockBlob.Properties.ContentType = "application/x-zip-compressed";
    await cloudBlockBlob.UploadFromFileAsync("myZip.zip").ConfigureAwait(false);

    // activate the application package
    var activateResult = await bmc.ApplicationPackage.ActivateWithHttpMessagesAsync(ResourceGroupName, BatchAccountName, "MyPackage", "1.0", "zip").ConfigureAwait(false);
}
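Once the package is activated, it can be consumed when setting up the pool or job through the data-plane Microsoft.Azure.Batch package. A minimal sketch of that consumption (the batchClient, pool id, VM size and image below are placeholder assumptions, not produced by the management code above):
// requires the Microsoft.Azure.Batch NuGet package (data plane) and an opened BatchClient ('batchClient')
CloudPool pool = batchClient.PoolOperations.CreatePool(
    poolId: "MyPool",
    virtualMachineSize: "standard_d2_v3",
    virtualMachineConfiguration: new VirtualMachineConfiguration(
        new ImageReference(offer: "WindowsServer", publisher: "MicrosoftWindowsServer", sku: "2019-Datacenter", version: "latest"),
        nodeAgentSkuId: "batch.node.windows amd64"),
    targetDedicatedComputeNodes: 1);

// reference the package uploaded and activated above; nodes download and extract it automatically
pool.ApplicationPackageReferences = new List<ApplicationPackageReference>
{
    new ApplicationPackageReference { ApplicationId = "MyPackage", Version = "1.0" }
};
await pool.CommitAsync();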
Azure Batch Application Packages management operations occur on the management plane. The MSDN docs for this namespace are here:
https://learn.microsoft.com/en-us/dotnet/api/microsoft.azure.management.batch
The nuget package for Microsoft.Azure.Management.Batch is here:
https://www.nuget.org/packages/Microsoft.Azure.Management.Batch/
And the following sample shows management plane operations in C#, although it is for non-application package operations:
https://github.com/Azure/azure-batch-samples/tree/master/CSharp/AccountManagement
