Azure Automation Management not able to execute runbook - azure

I am using Azure Automation Management 2.0.1. I am not able to find a Start method for Runbooks to execute a runbook. How do I do this with 2.0.1?
var client = new Microsoft.Azure.Management.Automation.AutomationManagementClient(new CertificateCloudCredentials(subscriptionId, cert));
var ct = new CancellationToken();
var content = await client.Runbooks.ListByNameAsync("MyAutomationAccountName", "MyRunbookName", ct);
var firstOrDefault = content?.Runbooks.FirstOrDefault();
if (firstOrDefault != null)
{
    var operation = client.Runbooks.Start("MyAutomationAccountName", new RunbookStartParameters(firstOrDefault.Id));
}

You need to use the automationManagementClient.Jobs.Create method:
public static JobCreateResponse Create(
    this IJobOperations operations,
    string resourceGroupName,
    string automationAccount,
    JobCreateParameters parameters
)
You can find a full sample here; this would be the relevant part:
private void JobStart_Click(object sender, RoutedEventArgs e)
{
    // Check for runbook name
    if (String.IsNullOrWhiteSpace(RunbookName.Text) || String.IsNullOrWhiteSpace(PublishState.Text)) throw new ArgumentNullException(RunbookName.Text);
    // Create job create parameters
    var jcparam = new JobCreateParameters
    {
        Properties = new JobCreateProperties
        {
            Runbook = new RunbookAssociationProperty
            {
                // associate the runbook name
                Name = RunbookName.Text
            },
            // pass parameters to runbook if any
            Parameters = null
        }
    };
    // create runbook job. This gives back JobId
    var job = automationManagementClient.Jobs.Create(this.automationAccountName, jcparam).Job;
    JobGuid.Text = JobId.Text = job.Properties.JobId.ToString();
    Log.Text += (String.Format("\nJob Started for Runbook {0} , JobId {1}", RunbookName.Text, JobId.Text));
}
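Once the job is created you usually want to wait for it to finish. Here is a minimal polling sketch, assuming the SDK version used in the sample above exposes a matching Jobs.Get(automationAccountName, jobId) overload returning the job with a string Properties.Status; treat the member names as assumptions if your package version differs:
var jobId = job.Properties.JobId;
var status = automationManagementClient.Jobs.Get(this.automationAccountName, jobId).Job.Properties.Status;
while (status != "Completed" && status != "Failed" && status != "Stopped")
{
    System.Threading.Thread.Sleep(TimeSpan.FromSeconds(10)); // simple polling interval
    status = automationManagementClient.Jobs.Get(this.automationAccountName, jobId).Job.Properties.Status;
}
Log.Text += String.Format("\nJob {0} finished with status {1}", jobId, status);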

Related

How to mock Azure blobContainerClient.GetBlobsAsync()

I have an Azure blob container which I am accessing using the code below:
var blobContainerClient = GetBlobContainer(containerName);
if (blobContainerClient != null)
{
    // List all blobs in the container
    await foreach (BlobItem blobItem in blobContainerClient.GetBlobsAsync())
    {
        queuedBlobsList.Add(new QueuedBlobs { BlobName = blobItem.Name, LastModified = blobItem.Properties.LastModified });
    }
}

private BlobContainerClient GetBlobContainer(string containerName)
{
    return gen2StorageClient != null
        ? gen2StorageClient.GetBlobContainerClient(containerName)
        : gen1StorageClient.GetBlobContainerClient(containerName);
}
The clients are initialised in the constructor:
public class BlobService : IBlobService
{
    private readonly BlobServiceClient gen1StorageClient, gen2StorageClient;

    public BlobService(BlobServiceClient defaultClient, IAzureClientFactory<BlobServiceClient> clientFactory)
    {
        gen1StorageClient = defaultClient;
        if (clientFactory != null)
        {
            gen2StorageClient = clientFactory.CreateClient("StorageConnectionString");
        }
    }
}
And my unit test where I am setting up GetBlobsAsync is like this (but I want it to return a list of BlobItems so I can also test the loop over the blobs):
private static Mock<BlobContainerClient> GetBlobContainerClientMockWithListOfBlobs()
{
    var blobContainerClientMock = new Mock<BlobContainerClient>("UseDevelopmentStorage=true", EnvironmentConstants.ParallelUploadContainer);
    var cancellationToken = new CancellationToken();
    var blobs = new List<BlobItem>();
    //AsyncPageable<BlobItem> blobItems = new AsyncPageable<BlobItem>(); -- Not allowing
    blobContainerClientMock.Setup(x => x.GetBlobsAsync(BlobTraits.All, BlobStates.All, null, cancellationToken)).Returns(It.IsAny<AsyncPageable<BlobItem>>());
    return blobContainerClientMock;
}
I came to this question because I also had the same issue.
Based on this article
AsyncPageable<T> and Pageable<T> are classes that represent collections of models returned by the service in pages.
The method GetBlobsAsync returns an AsyncPageable.
To create an AsyncPageable<BlobItem> you first need to create a Page<BlobItem>.
To create a Page<T> instance, use the Page<T>.FromValues method, passing a list of items, a continuation token, and the Response.
So let's start creating the list of items:
var blobList = new BlobItem[]
{
    BlobsModelFactory.BlobItem("Blob1"),
    BlobsModelFactory.BlobItem("Blob2"),
    BlobsModelFactory.BlobItem("Blob3")
};
Note: BlobItem has an internal constructor but I found in this answer that there's a BlobsModelFactory.
After creating the list of blobs, it is time to create a Page<BlobItem>:
Page<BlobItem> page = Page<BlobItem>.FromValues(blobList, null, Mock.Of<Response>());
And finally, create the AsyncPageable<BlobItem>:
AsyncPageable<BlobItem> pageableBlobList = AsyncPageable<BlobItem>.FromPages(new[] { page });
And now you are able to use this to mock the GetBlobsAsync method:
blobContainerClientMock
    .Setup(m => m.GetBlobsAsync(
        It.IsAny<BlobTraits>(),
        It.IsAny<BlobStates>(),
        It.IsAny<string>(),
        It.IsAny<CancellationToken>()))
    .Returns(pageableBlobList);
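For completeness, here is a minimal sketch of consuming the mocked client in a test; the xUnit-style assertion and the variable names are my assumptions, not part of the original answer:
var containerClient = blobContainerClientMock.Object;
var names = new List<string>();
await foreach (BlobItem blobItem in containerClient.GetBlobsAsync())
{
    names.Add(blobItem.Name);
}
Assert.Equal(new[] { "Blob1", "Blob2", "Blob3" }, names); // xUnit assertion; adjust to your test framework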
I hope this helps others with this issue.
André

How to get an Azure user's client secret (without registering an app) or how to generate a bearer access token for the current Azure credential?

I am trying to create a Resource Group and a Virtual Machine (and other components) programmatically using C#. I want to use
SdkContext.AzureCredentialsFactory.FromServicePrincipal(clientId, clientSecret, tenantId, environment);
But I don't know my client secret. How do I get or generate it? Or how do I generate a bearer access token for the current Azure user credential for a REST API call?
The file %userprofile%\.azure\accessTokens.json contains a bearer access token, but it is generated by the Azure CLI. Is there a way to generate the token via C# code?
This article
https://learn.microsoft.com/en-us/azure/active-directory/develop/howto-create-service-principal-portal shows a way, but it requires registering an app and granting it certain permissions (I don't have the permission to grant permissions to an app).
The link you provided is the correct way to use SdkContext.AzureCredentialsFactory.FromServicePrincipal. If you just have a user account rather than a valid service principal, the best way in this case is to use Azure.Identity to authenticate; make sure your user account has permission to create the VM, i.e. an RBAC role at the scope where you want to create the VM.
using Azure.Identity;
using Azure.ResourceManager.Resources;
using Azure.ResourceManager.Resources.Models;
using Azure.ResourceManager.Compute;
using Azure.ResourceManager.Compute.Models;
using Azure.ResourceManager.Network;
using Azure.ResourceManager.Network.Models;
using System;
using System.Collections.Generic;
using System.Threading.Tasks;
using System.Linq;
namespace CreateVMSample
{
    public class Program
    {
        protected static string AdminUsername = "<username>";
        protected static string AdminPassword = "<password>";

        static async Task Main(string[] args)
        {
            var subscriptionId = Environment.GetEnvironmentVariable("AZURE_SUBSCRIPTION_ID");
            var resourceClient = new ResourcesManagementClient(subscriptionId, new DefaultAzureCredential());

            // Create Resource Group
            Console.WriteLine("--------Start create group--------");
            var resourceGroups = resourceClient.ResourceGroups;
            var location = "westus2";
            var resourceGroupName = "QuickStartRG";
            var resourceGroup = new ResourceGroup(location);
            resourceGroup = await resourceGroups.CreateOrUpdateAsync(resourceGroupName, resourceGroup);
            Console.WriteLine("--------Finish create group--------");

            // Create a Virtual Machine
            await Program.CreateVmAsync(subscriptionId, "QuickStartRG", location, "quickstartvm");

            // Delete resource group if necessary
            //Console.WriteLine("--------Start delete group--------");
            //await (await resourceGroups.StartDeleteAsync(resourceGroupName)).WaitForCompletionAsync();
            //Console.WriteLine("--------Finish delete group--------");
            //Console.ReadKey();
        }

        public static async Task CreateVmAsync(
            string subscriptionId,
            string resourceGroupName,
            string location,
            string vmName)
        {
            var computeClient = new ComputeManagementClient(subscriptionId, new DefaultAzureCredential());
            var networkClient = new NetworkManagementClient(subscriptionId, new DefaultAzureCredential());
            var virtualNetworksClient = networkClient.VirtualNetworks;
            var networkInterfaceClient = networkClient.NetworkInterfaces;
            var publicIpAddressClient = networkClient.PublicIPAddresses;
            var availabilitySetsClient = computeClient.AvailabilitySets;
            var virtualMachinesClient = computeClient.VirtualMachines;

            // Create AvailabilitySet
            Console.WriteLine("--------Start create AvailabilitySet--------");
            var availabilitySet = new AvailabilitySet(location)
            {
                PlatformUpdateDomainCount = 5,
                PlatformFaultDomainCount = 2,
                Sku = new Azure.ResourceManager.Compute.Models.Sku() { Name = "Aligned" }
            };
            availabilitySet = await availabilitySetsClient.CreateOrUpdateAsync(resourceGroupName, vmName + "_aSet", availabilitySet);

            // Create IP Address
            Console.WriteLine("--------Start create IP Address--------");
            var ipAddress = new PublicIPAddress()
            {
                PublicIPAddressVersion = Azure.ResourceManager.Network.Models.IPVersion.IPv4,
                PublicIPAllocationMethod = IPAllocationMethod.Dynamic,
                Location = location,
            };
            ipAddress = await publicIpAddressClient.StartCreateOrUpdate(resourceGroupName, vmName + "_ip", ipAddress)
                .WaitForCompletionAsync();

            // Create VNet
            Console.WriteLine("--------Start create VNet--------");
            var vnet = new VirtualNetwork()
            {
                Location = location,
                AddressSpace = new AddressSpace() { AddressPrefixes = new List<string>() { "10.0.0.0/16" } },
                Subnets = new List<Subnet>()
                {
                    new Subnet()
                    {
                        Name = "mySubnet",
                        AddressPrefix = "10.0.0.0/24",
                    }
                },
            };
            vnet = await virtualNetworksClient
                .StartCreateOrUpdate(resourceGroupName, vmName + "_vent", vnet)
                .WaitForCompletionAsync();

            // Create Network Interface
            Console.WriteLine("--------Start create Network Interface--------");
            var nic = new NetworkInterface()
            {
                Location = location,
                IpConfigurations = new List<NetworkInterfaceIPConfiguration>()
                {
                    new NetworkInterfaceIPConfiguration()
                    {
                        Name = "Primary",
                        Primary = true,
                        Subnet = new Subnet() { Id = vnet.Subnets.First().Id },
                        PrivateIPAllocationMethod = IPAllocationMethod.Dynamic,
                        PublicIPAddress = new PublicIPAddress() { Id = ipAddress.Id }
                    }
                }
            };
            nic = await networkInterfaceClient
                .StartCreateOrUpdate(resourceGroupName, vmName + "_nic", nic)
                .WaitForCompletionAsync();

            // Create VM
            Console.WriteLine("--------Start create VM--------");
            var vm = new VirtualMachine(location)
            {
                NetworkProfile = new Azure.ResourceManager.Compute.Models.NetworkProfile { NetworkInterfaces = new[] { new NetworkInterfaceReference() { Id = nic.Id } } },
                OsProfile = new OSProfile
                {
                    ComputerName = vmName,
                    AdminUsername = Program.AdminUsername,
                    AdminPassword = Program.AdminPassword,
                    LinuxConfiguration = new LinuxConfiguration { DisablePasswordAuthentication = false, ProvisionVMAgent = true }
                },
                StorageProfile = new StorageProfile()
                {
                    ImageReference = new ImageReference()
                    {
                        Offer = "UbuntuServer",
                        Publisher = "Canonical",
                        Sku = "18.04-LTS",
                        Version = "latest"
                    },
                    DataDisks = new List<DataDisk>()
                },
                HardwareProfile = new HardwareProfile() { VmSize = VirtualMachineSizeTypes.StandardB1Ms },
                AvailabilitySet = new Azure.ResourceManager.Compute.Models.SubResource() { Id = availabilitySet.Id }
            };
            var operation = await virtualMachinesClient.StartCreateOrUpdateAsync(resourceGroupName, vmName, vm);
            var result = (await operation.WaitForCompletionAsync()).Value;
            Console.WriteLine("VM ID: " + result.Id);
            Console.WriteLine("--------Done create VM--------");
        }
    }
}
Source - https://github.com/Azure-Samples/azure-samples-net-management/blob/master/samples/compute/create-virtual-machine/Program.cs
This sample uses the DefaultAzureCredential from Azure.Identity to authenticate. It will try the credentials listed in this doc in order, one of which is VisualStudioCredential, which means it will use the user account you are logged into Visual Studio with. You can also use that credential directly: replace all the DefaultAzureCredential instances with VisualStudioCredential in the sample code.
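For example, a minimal sketch of that substitution for the resource client (the same swap applies to the compute and network clients in the sample):
var resourceClient = new ResourcesManagementClient(subscriptionId, new VisualStudioCredential());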
Besides, if you still want to get the token to call the REST API manually, you could use the code below.
var tokenCredential = new VisualStudioCredential();
// GetTokenAsync takes a TokenRequestContext (from Azure.Core) with the scopes you need
var accessToken = await tokenCredential.GetTokenAsync(new Azure.Core.TokenRequestContext(new[] { "https://management.azure.com/.default" }), CancellationToken.None);
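If you then want to call the ARM REST API manually with that token, here is a minimal sketch; the subscriptions endpoint and api-version are just an illustrative example, not part of the original answer:
using (var http = new HttpClient())
{
    // Attach the bearer token returned above
    http.DefaultRequestHeaders.Authorization = new System.Net.Http.Headers.AuthenticationHeaderValue("Bearer", accessToken.Token);
    var json = await http.GetStringAsync("https://management.azure.com/subscriptions?api-version=2020-01-01");
    Console.WriteLine(json);
}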

How to update a file on Azure DevOps through a console application

I have a project in Azure DevOps with the underlying repository as git. I have automatically created database documentation which is stored in the project repository. To keep this documentation up to date, I want to schedule an application to push the generated documentation to Azure on a daily basis.
Basically, check out the file, write the new content, and check it in. Can we do this using the Azure DevOps REST APIs? Is there any example code that I can follow?
You can have a scheduled build using Azure Pipelines, and in the build definition you define a PowerShell script that runs the related git commands, as ShaykiAbramczyk suggested.
Pay attention to the points below if you want to run Git commands in a script:
Grant version control permissions to the build service
Allow scripts to access the system token
Merge a feature branch to master
For more details, please refer to the official doc: Run Git commands in a script.
Script snippet:
#Config Set
git config user.email "$(Build.RequestedForEmail)"
git config user.name "$(Build.RequestedFor)"
#Push new Branch
git -c http.extraheader="AUTHORIZATION: bearer $(System.AccessToken)" push origin master:refs/heads/my-branch
#Other command
......
This is how I have implemented a solution to check in content to an Azure DevOps Git repo.
Below is the generic class and the caller method.
class AzureDevops
{
    private readonly Uri uri;
    private readonly string personalAccessToken;

    public AzureDevops(string orgName, string personalAccessToken)
    {
        this.uri = new Uri("https://dev.azure.com/" + orgName);
        this.personalAccessToken = personalAccessToken;
    }

    public T Post<T>(dynamic body, string path)
    {
        if (body == null)
            throw new ArgumentNullException("body");
        if (path == null)
            throw new ArgumentNullException("path");

        T output = default(T);
        using (var client = new HttpClient())
        {
            client.BaseAddress = uri;
            client.DefaultRequestHeaders.Accept.Clear();
            client.DefaultRequestHeaders.Accept.Add(new System.Net.Http.Headers.MediaTypeWithQualityHeaderValue("application/json"));
            client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Basic", personalAccessToken);

            string serl_body = JsonConvert.SerializeObject(body);
            var content = new StringContent(serl_body, Encoding.UTF8, "application/json");
            using (HttpResponseMessage response = client.PostAsync(path, content).Result)
            {
                response.EnsureSuccessStatusCode();
                output = response.Content.ReadAsAsync<T>().Result;
            }
        }
        return output;
    }

    public T Get<T>(string path)
    {
        if (path == null)
            throw new ArgumentNullException("path");

        T output = default(T);
        using (var client = new HttpClient())
        {
            client.BaseAddress = uri;
            client.DefaultRequestHeaders.Accept.Clear();
            client.DefaultRequestHeaders.Accept.Add(new System.Net.Http.Headers.MediaTypeWithQualityHeaderValue("application/json"));
            client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Basic", personalAccessToken);

            HttpResponseMessage response = client.GetAsync(path).Result;
            if (response.IsSuccessStatusCode)
                output = response.Content.ReadAsAsync<T>().Result;
            else
                throw new ApplicationException(string.Format("Response message is not OK. Issues in action: {0}", path));
        }
        return output;
    }
}
public class Main
{
    AzureDevops azureDevops = new AzureDevops("OrgName", "PAT");

    private void AddNewContent()
    {
        ListOfRefResponse.Root listOfRefResponse = azureDevops.Get<ListOfRefResponse.Root>(string.Format("{0}/_apis/git/repositories/{1}/refs?api-version=6.0-preview.1&filter=heads/master", "projectId", "repositoryId"));

        ArrayList contentArray = new ArrayList();
        contentArray.Add(new ChangesBO
        {
            changeType = "add",
            item = new ChangeItemBO { path = string.Concat("/", Constants.BaseAzureFolder, "/", "projectName" + "/" + "filename.md") },
            newContent = new ChangeContent { content = "new text content", contentType = "rawtext" }
        });

        dynamic body = new
        {
            refUpdates = new[] { new { name = Constants.Branch, oldObjectId = listOfRefResponse.value.First().objectId } },
            commits = new[] {
                new {
                    comment = Constants.AppKeysUpdateComment,
                    changes = contentArray.ToArray()
                }}
        };

        CommitSuccessBO.Root commitSuccess = azureDevops.Post<CommitSuccessBO.Root>(body, string.Format("_apis/git/repositories/{0}/pushes?api-version=5.0", "RepositoryId"));
    }
}
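One thing to double-check when wiring this up: the Azure DevOps REST API expects the Basic authorization value to be base64(":" + PAT), i.e. an empty user name followed by the token, so the class above assumes the value handed to its constructor is already encoded. A hedged sketch of that encoding:
// Azure DevOps Basic auth uses an empty user name and the PAT as the password, base64-encoded
string rawPat = "<your PAT>";
string encodedPat = Convert.ToBase64String(Encoding.ASCII.GetBytes(string.Format(":{0}", rawPat)));
var azureDevops = new AzureDevops("OrgName", encodedPat);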

Azure DocumentDB, when uploading and executing - no result?

In my project I am supposed to get data from openweathermap.org and put it in a collection in my DocumentDB database in Azure.
The code below works locally on my development machine, but when I upload the project, it runs and succeeds (says the dashboard) yet no documents are created. I can only create the documents if I run from my local machine.
Why is that?
Here is my code:
public static void Main()
{
    JobHost host = new JobHost();
    // The following code ensures that the WebJob will be running continuously
    host.Call(typeof(Program).GetMethod("saveWeatherDataToAzureDocumentDB"));
}

[NoAutomaticTrigger]
public static async void saveWeatherDataToAzureDocumentDB()
{
    string endpointUrl = ConfigurationManager.AppSettings["EndPointUrl"];
    string authorizationKey = ConfigurationManager.AppSettings["AuthorizationKey"];
    string url = "http://api.openweathermap.org/data/2.5/weather?q=hanstholm,dk&appid=44db6a862fba0b067b1930da0d769e98";

    var request = WebRequest.Create(url);
    string text;
    var response = (HttpWebResponse)request.GetResponse();
    using (var sr = new StreamReader(response.GetResponseStream()))
    {
        text = sr.ReadToEnd();
    }

    // Create a new instance of the DocumentClient
    var client = new DocumentClient(new Uri(endpointUrl), authorizationKey);

    // Check to verify a database with the id=weatherdata does not exist
    Database database = client.CreateDatabaseQuery().Where(db => db.Id == "weatherdata").AsEnumerable().FirstOrDefault();

    // If the database does not exist, create a new database
    if (database == null)
    {
        database = await client.CreateDatabaseAsync(
            new Database
            {
                Id = "weatherdata"
            });
    }

    // Check to verify a document collection with the id=weathercollection does not exist
    DocumentCollection documentCollection = client.CreateDocumentCollectionQuery(database.SelfLink).Where(c => c.Id == "weathercollection").AsEnumerable().FirstOrDefault();

    // If the document collection does not exist, create a new collection
    if (documentCollection == null)
    {
        documentCollection = await client.CreateDocumentCollectionAsync("dbs/" + database.Id,
            new DocumentCollection
            {
                Id = "weathercollection"
            });
    }

    // Deserialize to a dynamic object
    if (text == "")
    {
        mark m = new mark() { name = "Something" };
        await client.CreateDocumentAsync(documentCollection.DocumentsLink, m);
    }
    else
    {
        var json = JsonConvert.DeserializeObject<dynamic>(text);
        json["id"] = json["name"] + "_" + DateTime.Now;
        await client.CreateDocumentAsync(documentCollection.DocumentsLink, json);
    }
}

public sealed class mark
{
    public string name { get; set; }
}
UPDATE - This is what I have in my App.config
<appSettings>
  <!-- Replace the value with the value you copied from the Azure management portal -->
  <add key="EndPointUrl" value="https://<My account>.documents.azure.com:443/"/>
  <!-- Replace the value with the value you copied from the Azure management portal -->
  <add key="AuthorizationKey" value="The secret code from Azure"/>
</appSettings>
Also, at the DocumentDB account I find the connection string like this: AccountEndpoint=https://knoerregaard.documents.azure.com:443/;AccountKey=my secret password
How should I apply this to the WebJob?
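For reference, a minimal sketch (not part of the original post) of how that single connection string maps onto the two settings the WebJob code reads: EndPointUrl corresponds to AccountEndpoint and AuthorizationKey corresponds to AccountKey.
// Illustrative only: split the portal connection string into the two values the code expects
var connectionString = "AccountEndpoint=https://knoerregaard.documents.azure.com:443/;AccountKey=<account key>";
var parts = connectionString
    .Split(new[] { ';' }, StringSplitOptions.RemoveEmptyEntries)
    .Select(p => p.Split(new[] { '=' }, 2))
    .ToDictionary(p => p[0], p => p[1]);
var client = new DocumentClient(new Uri(parts["AccountEndpoint"]), parts["AccountKey"]);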
Appreciate your help!

How to set the EventProcessorHost to read events from now on (UTC)?

We are using the EventProcessorHost to receive events from Azure Event Hubs. I've been unsuccessfully trying to configure it (through EventProcessorOptions.InitialOffsetProvider) to read events from UTC now onwards, but it always reads from the start of the feed. I am not saving checkpoints (and I even deleted the BLOB container that was created).
This is how I am setting it:
DateTime startDate = DateTime.UtcNow;
var epo = new EventProcessorOptions
{
    MaxBatchSize = 100,
    PrefetchCount = 100,
    ReceiveTimeOut = TimeSpan.FromSeconds(120),
    InitialOffsetProvider = (name) => startDate
};
Any guidance would be appreciated.
I think this changed in version 2.0.0 - Rajiv's code would now be:
var eventProcessorOptions = new EventProcessorOptions
{
    InitialOffsetProvider = (partitionId) => EventPosition.FromEnqueuedTime(DateTime.UtcNow)
};
Here is an example block with fully qualified classnames:
private static async Task MainAsync(string[] args)
{
    try
    {
        Console.WriteLine("Registering EventProcessor...");
        string AISEhConnectionStringEndpoint = Configuration["AISEhConnectionStringEndpoint"];
        string AISEhConnectionStringSharedAccessKeyName = Configuration["AISEhConnectionStringSharedAccessKeyName"];
        string AISEhConnectionStringSharedAccessKey = Configuration["AISEhConnectionStringSharedAccessKey"];
        string EhConnectionString = $"Endpoint={AISEhConnectionStringEndpoint};SharedAccessKeyName={AISEhConnectionStringSharedAccessKeyName};SharedAccessKey={AISEhConnectionStringSharedAccessKey}";
        string AISEhEntityPath = Configuration["AISEhEntityPath"];
        string AISEhConsumerGroupName = Configuration["AISEhConsumerGroupName"];
        string AISStorageContainerName = Configuration["AISStorageContainerName"];
        string AISStorageAccountName = Configuration["AISStorageAccountName"];
        string AISStorageAccountKey = Configuration["AISStorageAccountKey"];
        string StorageConnectionString = string.Format("DefaultEndpointsProtocol=https;AccountName={0};AccountKey={1}", AISStorageAccountName, AISStorageAccountKey);

        var eventProcessorHost = new Microsoft.Azure.EventHubs.Processor.EventProcessorHost(
            AISEhEntityPath,
            AISEhConsumerGroupName,
            EhConnectionString,
            StorageConnectionString,
            AISStorageContainerName);

        var options = new Microsoft.Azure.EventHubs.Processor.EventProcessorOptions
        {
            InitialOffsetProvider = (partitionId) => Microsoft.Azure.EventHubs.EventPosition.FromEnqueuedTime(DateTime.UtcNow)
        };

        // Registers the Event Processor Host and starts receiving messages
        await eventProcessorHost.RegisterEventProcessorAsync<GetEvents>(options);
        Thread.Sleep(Timeout.Infinite);

        // Disposes of the Event Processor Host
        await eventProcessorHost.UnregisterEventProcessorAsync();
    }
    catch (Exception ex)
    {
        Console.WriteLine(ex.Message);
        NLog.LogManager.GetCurrentClassLogger().Error(ex);
        throw;
    }
}
And here are my general settings with secrets/exact addresses obscured to help work things out, as I found working this out to be less pleasurable than extracting teeth:
"AISEhConnectionStringEndpoint": "sb://<my bus address>.servicebus.windows.net/",
"AISEhConnectionStringSharedAccessKeyName": "<my key name>",
"AISEhConnectionStringSharedAccessKey": "<yeah nah>",
"AISEhEntityPath": "<Event Hub entity path>",
"AISEhConsumerGroupName": "<consumer group name e.g $Default>",
"AISStorageContainerName": "<storage container name>",
"AISStorageAccountName": "<storage account name>",
"AISStorageAccountKey": "<yeah nah>",
I found that the checkpoint folder in the blob container was still there, and my app was picking it up and ignoring the date I set in EventProcessorOptions. After I deleted the container it started to run as expected (taking the UTC date into account).
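If you prefer to do that cleanup from code rather than the portal, here is a minimal sketch assuming the Azure.Storage.Blobs package and the same storage settings as the example above (this is my addition, not part of the original answer):
// Deleting the lease/checkpoint container forces the host to fall back to InitialOffsetProvider
var leaseContainer = new Azure.Storage.Blobs.BlobContainerClient(StorageConnectionString, AISStorageContainerName);
await leaseContainer.DeleteIfExistsAsync();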
You can use the EventProcessorOptions class for this and provide an offset set to the desired time.
var eventProcessorOptions = new EventProcessorOptions
{
    InitialOffsetProvider = (partitionId) => DateTime.UtcNow
};
You can then use any of the RegisterEventProcessorAsync overloads that accept eventProcessorOptions.
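For example, a minimal sketch of passing the options at registration time (MyEventProcessor stands in for your IEventProcessor implementation and is an assumed name):
// Register the processor with the options so the initial offset provider takes effect
await eventProcessorHost.RegisterEventProcessorAsync<MyEventProcessor>(eventProcessorOptions);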
