Read XML from Azure Blob Storage in Automation PowerShell

I'm trying to figure out how to read an XML file from public Blob Storage.
When the XML is embedded in my PowerShell script, I can reference it like so:
[xml]$xml = #"
<?xml version="1.0"?>
<PublicSubnets>
<Subnet>1.2.3.0/24</Subnet>
<Subnet>2.3.4.0/24</Subnet>
<Subnet>192.168.1.0/24</Subnet>
<Subnet>10.0.0.0/8</Subnet>
</PublicSubnets>
"#
foreach ($subnet in $xml.PublicSubnets.Subnet)
{
$subnet
}
This would print the four subnets.
I'm trying to get that XML into a separate XML document, that's stored in public blob storage.
Example: https://teststorage.blob.core.windows.net/automation/Subnet.xml
So basically I'm looking for the same functionality I had with the XML embedded in the script, but with the XML hosted externally (in the external file I removed the first and last lines of the snippet above, i.e. the [xml]$xml = @" and "@ lines).

Generate a SAS token for the blob, then put the blob URI and the SAS token into the variables $XmlBlobUrl (the URI of the XML blob) and $SASToken (the query string of the SAS token):
$XmlBlobUrl = "https://BlobStorageAccount.blob.core.windows.net/subnets.xml"
$SASToken = "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
$Url = $XmlBlobUrl + $SASToken
$Subnets = @()
[xml]$xml = Invoke-WebRequest $Url | Select-Object -Expand Content
foreach ($subnet in $xml.PublicSubnets.Subnet) {
$Subnets += $Subnet
}
$Subnets
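Since the container in the question is public, a SAS token may not strictly be needed. A minimal sketch, assuming the blob really allows anonymous read access (using the example URL from the question):
# Assumes anonymous (public) read access on the blob
$XmlBlobUrl = "https://teststorage.blob.core.windows.net/automation/Subnet.xml"
[xml]$xml = (Invoke-WebRequest -Uri $XmlBlobUrl -UseBasicParsing).Content
$xml.PublicSubnets.Subnet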

Related

Is there a nicer way to export an Azure DevOps project's templates (fields and states)?

I use the 'witadmin listfields' command for the whole collection, but I'm wondering if I could scope the fields/states to just a single project.
The reason behind this: sometimes I migrate a TFS project into an existing Azure DevOps project, and collecting data about fields takes a lot of manual work, so I'm wondering how to automate this process.
Many thanks!
You can use the REST API to get the fields/states of a project. See below:
Work Item Types Field - List
GET https://{instance}/{collection}/{project}/_apis/wit/workitemtypes/{type}/fields?api-version=4.1
Work Item Type States - List
GET https://{instance}/{collection}/{project}/_apis/wit/workitemtypes/{type}/states?api-version=4.1-preview.1
The example below calls the fields API from a PowerShell script:
[string]$userName = 'domain\username'
[string]$userPassword = 'password'
# Convert to SecureString
[securestring]$secStringPassword = ConvertTo-SecureString $userPassword -AsPlainText -Force
[pscredential]$credOject = New-Object System.Management.Automation.PSCredential ($userName, $secStringPassword)
$uri = "http://{instance}/{collection}/{project}/_apis/wit/workitemtypes/Bug/fields?api-version=4.1"
$invRestMethParams = @{
Credential = $credOject
Uri = $uri
Method = 'Get'
ContentType = 'application/json'
}
Invoke-RestMethod @invRestMethParams
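The states endpoint listed above can be called the same way by reusing the splatted parameters; a sketch (adjust the work item type and the instance/collection/project placeholders to your environment):
# Reuse the same credential and settings, only the Uri changes
$invRestMethParams.Uri = "http://{instance}/{collection}/{project}/_apis/wit/workitemtypes/Bug/states?api-version=4.1-preview.1"
Invoke-RestMethod @invRestMethParams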

Multi-tenancy in Power BI Embedded

I have a multi-tenant web application and I am using a database per tenant approach. The web application will also use Power BI Embedded to show reports based on the data for that particular tenant and all reports for each tenant will have the same format but the data source will be different.
From what I've seen, there is no straightforward way to implement multi-tenancy in Power BI, such as passing the data source as a parameter. I managed to find two ways to make Power BI Embedded multi-tenant. One is to use row-level security, which would mean that I need a single data warehouse for all the tenants' data, and this is not an option for me. The other option would be having a workspace per tenant.
For the second option I would have a template workspace from which a copy will be created for each new tenant. This tutorial here describes how to do it: https://powerbi.microsoft.com/fr-fr/blog/duplicate-workspaces-using-the-power-bi-rest-apis-a-step-by-step-tutorial/ .
Can the same thing be done through the Power BI C# SDK? I would also need to change the data source used per workspace. How can I do this for all reports in my workspace?
Finally, has someone discovered an easier way how to implement multi-tenancy with Power BI embedded or is this it?
It depends on your data source type (SQL Server, SSAS, CSV files, etc.) and data connectivity mode (import, direct query, etc.). If you can use parameters, then one of your options is to allow the newly cloned report to switch its data source itself by using connection-specific parameters. To do this, open Power Query Editor by clicking Edit Queries and in Manage Parameters define two new text parameters; let's name them ServerName and DatabaseName.
Set their current values to point to one of your data sources, e.g. SQLSERVER2016 and AdventureWorks2016. Then right-click your query in the report, open Advanced Editor, and find the server name and database name in the M code (for a SQL Server source this is typically a call like Sql.Database("SQLSERVER2016", "AdventureWorks2016")). Replace those literal values with the parameters defined above, e.g. Sql.Database(ServerName, DatabaseName).
Now you can close and apply the changes and your report should work as before. When you want to change the data source, use Edit Parameters and change the server and/or database name to point to the other data source that you want to use for the report.
After changing the parameter values, Power BI Desktop will ask you to apply the changes and reload the data from the new data source. To change the parameter values (i.e. the data source) of a report published in the Power BI Service, go to the dataset's settings and enter the new server and/or database name (check the gateway settings too, if this is an on-premises data source).
After changing the data source, refresh your dataset to get the data from the new data source. With a Power BI Pro account you can do this 8 times per 24 hours, while if the dataset is in a dedicated capacity, this limit is raised to 48 times per 24 hours.
To do this programmatically, use the Update Parameters / Update Parameters In Group and Refresh Dataset / Refresh Dataset In Group REST API calls. For example, you can do this with PowerShell like this:
Import-Module MicrosoftPowerBIMgmt
Import-Module MicrosoftPowerBIMgmt.Profile
$password = "xxxxx" | ConvertTo-SecureString -asPlainText -Force
$username = "xxxxx#yyyyy.com"
$credential = New-Object System.Management.Automation.PSCredential($username, $password)
Connect-PowerBIServiceAccount -Credential $credential
Invoke-PowerBIRestMethod -Url 'groups/xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx/datasets/xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx/UpdateParameters' -Method Post -Body '{
"updateDetails": [
{
"name": "ServerName",
"newValue": "SQLSERVER2019"
},
{
"name": "DatabaseName",
"newValue": "AdventureWorks2019"
}
]
}'
Invoke-PowerBIRestMethod -Url 'groups/xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx/datasets/xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx/refreshes' -Method Post
Disconnect-PowerBIServiceAccount
If you can't use parameters, e.g. with a Live connection to SSAS, the connection details can be changed using the Update Datasources In Group REST API call. In PowerShell this could be done like this:
Import-Module MicrosoftPowerBIMgmt
Import-Module MicrosoftPowerBIMgmt.Profile
$password = "xxxxx" | ConvertTo-SecureString -asPlainText -Force
$username = "xxxxx#yyyyy.com"
$credential = New-Object System.Management.Automation.PSCredential($username, $password)
Connect-PowerBIServiceAccount -Credential $credential
Invoke-PowerBIRestMethod -Url 'groups/xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx/datasets/xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx/Default.UpdateDatasources' -Method Post -Body '{
"updateDetails": [
{
"datasourceSelector": {
"datasourceType": "AnalysisServices",
"connectionDetails": {
"server": "My-As-Server",
"database": "My-As-Database"
}
},
"connectionDetails": {
"server": "New-As-Server",
"database": "New-As-Database"
}
}
]
}'
Disconnect-PowerBIServiceAccount
Note that you need to provide both the old and the new server and database names.
In C# you can do the same in a very similar way, even without the Power BI client library:
var group_id = "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx";
var dataset_id = "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx";
var client = new HttpClient();
client.DefaultRequestHeaders.Add("Accept", "application/json");
client.DefaultRequestHeaders.Add("Authorization", "Bearer " + accessToken);
var restUrlUpdateParameters = $"https://api.powerbi.com/v1.0/myorg/groups/{group_id}/datasets/{dataset_id}/Default.UpdateParameters";
var postData = new { updateDetails = new[] { new { name = "ServerName", newValue = "NEWSERVER" }, new { name = "DatabaseName", newValue = "Another_AdventureWorks2016" } } };
var responseUpdate = client.PostAsync(restUrlUpdateParameters, new StringContent(JsonConvert.SerializeObject(postData), Encoding.UTF8, "application/json")).Result;
var restUrlRefreshDataset = $"https://api.powerbi.com/v1.0/myorg/groups/{group_id}/datasets/{dataset_id}/refreshes";
var responseRefresh = client.PostAsync(restUrlRefreshDataset, null).Result;
Using the Power BI C# client library can make your life easier, e.g. refreshing the dataset can be done this way:
var group_id = "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx";
var dataset_id = "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx";
var credentials = new TokenCredentials(accessToken, "Bearer");
using (var client = new PowerBIClient(new Uri("https://api.powerbi.com"), credentials))
{
client.Datasets.RefreshDatasetInGroup(group_id, dataset_id);
}
When calling the API, you need to provide an access token. To acquire it use ADAL or MSAL libraries, e.g. with code like this:
private static string resourceUri = "https://analysis.windows.net/powerbi/api";
private static string authorityUri = "https://login.windows.net/common/"; // It was https://login.windows.net/common/oauth2/authorize in prior versions
private static string clientId = "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx"; // Register at https://dev.powerbi.com/apps
private static string groupId = "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx";
private static string reportId = "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx";
private static AuthenticationContext authContext = new AuthenticationContext(authorityUri, new TokenCache());
public string Authenticate()
{
AuthenticationResult authenticationResult = null;
// First check is there token in the cache
try
{
authenticationResult = authContext.AcquireTokenSilentAsync(resourceUri, clientId).Result;
}
catch (AggregateException ex)
{
AdalException ex2 = ex.InnerException as AdalException;
if ((ex2 == null) || (ex2 != null && ex2.ErrorCode != "failed_to_acquire_token_silently"))
{
MessageBox.Show(ex.Message);
return null;
}
}
if (authenticationResult == null)
{
var uc = new UserPasswordCredential("user@example.com", "Strong password");
try
{
authenticationResult = authContext.AcquireTokenAsync(resourceUri, clientId, uc).Result;
}
catch (Exception ex)
{
MessageBox.Show(ex.Message + (ex.InnerException == null ? "" : Environment.NewLine + ex.InnerException.Message));
return null;
}
}
if (authenticationResult == null)
{
MessageBox.Show("Call failed.");
return null;
}
return authenticationResult.AccessToken;
}

How to generate a ClientContext from SiteId in SharePoint Online?

I have a SiteId and I want to generate a ClientContext to fetch all the groups of that particular site. But I am not able to find a way to generate a ClientContext from the SiteId the same way we do in SharePoint on-premises.
Is there a way to generate a ClientContext from a SiteId in SharePoint Online, or do we need the URL only?
I want to achieve something like this:
using (var context = new ClientContext(new Guid(siteId)))
{
//TODO
}
You can get your ClientContext in two steps:
search the site by its ID using the search API
create a client context using the site's URL
Here's some PowerShell doing exactly this. I'm using the PnP Cmdlets out of convenience, similar results can also be achieved using plain CSOM.
# this is your site's ID
$siteId = "a20d2341-1b4f-47ed-8180-24a5c31adfa9"
# basically any known site URL - the root is probably fine
$anySiteUrl = "https://<yourtenant>.sharepoint.com"
$credential = Get-Credential
Connect-PnPOnline -Url $anySiteUrl -Credentials $credential
# search for site by ID
$site = Submit-PnPSearchQuery -Query "SiteID:$siteId AND ContentClass=STS_Site"
if ($site.ResultRows.Count -eq 1)
{
# URL to use for "real" connection
$siteUrl = $site.ResultRows[0].Path
Connect-PnPOnline -Url $siteUrl -Credentials $credential
$currentSite = Get-PnPSite
# and there is your ClientContext
$ctx = Get-PnPContext
$web = $currentSite.RootWeb
$ctx.Load($web)
$ctx.Load($web.SiteGroups)
$ctx.ExecuteQuery()
# here are your groups
$web.SiteGroups
}
(Note: you must install the SharePointPnPPowerShellOnline PowerShell module for this code to run.)
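If the module is not installed yet, it can typically be obtained from the PowerShell Gallery first:
Install-Module SharePointPnPPowerShellOnline -Scope CurrentUser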

Test-AzureRmResourceGroupDeployment cmdlet returns Empty when the validation is successful

I use the Azure Powershell cmdlet below to validate both the ARM template json and ARM template params json files.
$result = Test-AzureRmResourceGroupDeployment -ResourceGroupName TestRG -TemplateFile TestARMTemplate.json -TemplateParameterFile TestARMParams.json
I expect the cmdlet to return true (boolean type) if both input arguments are valid.
However, the result is empty.
The documentation is also not clear on the expected response of this cmdlet.
I would like to know whether the response I got is an expected response or not.
Note: I am using Azure PowerShell version 1.5 (June 2016) on a Windows 10 machine.
Looking at the source code for this cmdlet here, I don't think it returns true or false. It actually returns an object of type List<PSResourceManagerError>. If you do a count on the $result object, it should be zero if everything is OK.
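For example, a simple validity check might look like the sketch below (it assumes the corrected call from the question and uses the Code/Message properties of PSResourceManagerError):
$result = Test-AzureRmResourceGroupDeployment -ResourceGroupName TestRG -TemplateFile TestARMTemplate.json -TemplateParameterFile TestARMParams.json
if ($result.Count -eq 0)
{
# No validation errors returned
Write-Host "Template validation succeeded"
}
else
{
# Each entry describes one validation error
$result | ForEach-Object { Write-Warning "$($_.Code): $($_.Message)" }
}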
Here is a handy function for creating an AggregateException containing all the error information from a PSResourceManagerError:
function New-DeploymentResultException([Microsoft.Azure.Commands.ResourceManager.Cmdlets.SdkModels.PSResourceManagerError]$error)
{
$errorMessage = "$($error.Message) ($($error.Code)) [Target: $($error.Target)]"
if ($error.Details)
{
$innerExceptions = $error.Details | ForEach-Object { New-DeploymentResultException $_ }
return New-Object System.AggregateException $errorMessage, $innerExceptions
}
else
{
return New-Object System.Configuration.ConfigurationErrorsException $errorMessage
}
}
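A possible way to use this helper, assuming $result comes from the Test-AzureRmResourceGroupDeployment call above:
if ($result.Count -gt 0)
{
# Wrap every validation error into one AggregateException and fail fast
$innerExceptions = @($result | ForEach-Object { New-DeploymentResultException $_ })
throw (New-Object System.AggregateException "Template validation failed", $innerExceptions)
}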

Azure storage account backup (tables and blobs)

I need to periodically backup all blobs and tables in an Azure storage account so that we can restore all that data at a later time if we for any reason corrupt our data.
While I trust that data that we store in Azure is durable and recoverable in case of data center failures, we still need data in our storage accounts to be backed up to prevent from accidental overwrites and deletions (the human error factor).
We have implemented a solution for this that periodically lists all blobs and copies them over to a backup storage account. When a blob has been modified or deleted we simply create a snapshot of the old version in the backup account.
This approach has worked OK for us. But it only handles blobs, not table entities. We now need to support backing up table entities too.
Faced with this task now, I'm thinking that someone else has probably had this requirement before and come up with a smart solution. Or maybe there are commercial products that will do this?
It is not a requirement that the backup target is another Azure storage account. All we need is a way to recover all blobs and tables as they were at the time we ran the backup.
Any help is appreciated!
There are a variety of ways this can be handled.
If you want to do this on your own you can use the storage libraries and write code to just run through the table and pull down the data.
There are also a few services that can do this for you (full disclosure: I work for a company that provides this as a service). Here is an article by Troy Hunt talking about our option: http://www.troyhunt.com/2014/01/azure-will-save-you-from-unexpected_28.html. We also have PowerShell cmdlets that can pull table data down for you (cerebrata.com). To be fair, we are not the only players in this space and there are others who have similar services.
Finally, at TechEd it was announced that the AzCopy tool will be updated later this year so that it can pull down entire tables, which is just automating reading through the tables and pulling them down. There is currently no way to "snapshot" a table, so all of the methods above produce a copy made as the data is read; the source table might have changed by the time the copy is completed.
I've recently put together a simple solution to back up table storage. It uses the AzCopy tool and the Storage REST API to pull down a list of all the tables and back them up to JSON.
Hope it's useful!
param(
[parameter(Mandatory=$true)]
[string]$Account,
[parameter(Mandatory=$true)]
[string]$SASToken,
[parameter(Mandatory=$true)]
[string]$OutputDir
)
$ErrorActionPreference = "Stop"
##Example Usage
#.\Backup-TableStorage.ps1 -OutputDir "d:\tablebackup" -Account "examplestorageaccount" -SASToken "?sv=2015-04-05&ss=t&srt=sco&sp=rl&st=2016-04-08T07%3A44%3A00Z&se=2016-04-09T07%3A55%3A00Z&sig=CNotAREALSIGNITUREBUTYOURESWOUDLGOHERE3D"
if (-not (Test-Path "${env:ProgramFiles(x86)}\Microsoft SDKs\Azure\AzCopy\AzCopy.exe"))
{
throw "Azcopy not installed - get it from here: https://azure.microsoft.com/en-gb/documentation/articles/storage-use-azcopy/"
}
Write-host ""
Write-Host "Starting backup for account" -ForegroundColor Yellow
Write-host "--------------------------" -ForegroundColor Yellow
Write-Host " -Account: $Account"
Write-Host " -Token: $SASToken"
$response = Invoke-WebRequest -Uri "https://$Account.table.core.windows.net/Tables/$SASToken"
[xml]$tables = $response.Content
$tableNames = $tables.feed.entry.content.properties.TableName
Write-host ""
Write-host "Found Tables to backup" -ForegroundColor Yellow
Write-host "--------------------------" -ForegroundColor Yellow
foreach ($tableName in $tableNames)
{
Write-Host " -Table: $tableName"
}
foreach ($tableName in $tableNames)
{
$url = "https://$Account.table.core.windows.net/$tableName"
Write-host ""
Write-Host "Backing up Table: $url"-ForegroundColor Yellow
Write-host "--------------------------" -ForegroundColor Yellow
Write-host ""
& "${env:ProgramFiles(x86)}\Microsoft SDKs\Azure\AzCopy\AzCopy.exe" /Source:$url /Dest:$OutputDir\$account\ /SourceSAS:$SASToken /Z:"$env:temp\$([guid]::NewGuid()).azcopyJournal"
Write-host ""
Write-host "Backup completed" -ForegroundColor Green
Write-host ""
Write-host ""
}
For more details on usage have a look here:
https://gripdev.wordpress.com/2016/04/08/backup-azure-table-storage-quick-powershell-script/
You can back up any Azure Table Storage table (not blobs though) with free software like Slazure Light. The following C# code backs up all your Azure tables to JSON files:
Download NuGet packages first:
Install-Package Azure.Storage.Slazure.Light
Create a console application in Visual Studio and add the following code:
using System;
using System.Linq;
using Microsoft.WindowsAzure.Storage.Table;
using Newtonsoft.Json;
using SysSurge.Slazure.AzureTableStorage;
namespace BackupAzureTableStore
{
class Program
{
/// <summary>
/// Usage: BackupAzureTableStore.exe "UseDevelopmentStorage=true"
/// </summary>
/// <param name="args"></param>
static void Main(string[] args)
{
var storage = new DynStorage(args.Length == 0 ? "UseDevelopmentStorage=true" : args[0]);
foreach (var cloudTable in storage.Tables)
{
var tableName = cloudTable.Name;
var fileName = $"{tableName}.json";
using (var file = new System.IO.StreamWriter(fileName))
{
var dynTable = new DynTable(storage.StorageAccount, tableName);
TableContinuationToken token = null; // Continuation token required if > 1,000 rows per table
do
{
var queryResult =
dynTable.TableClient.GetTableReference(tableName)
.ExecuteQuerySegmented(new TableQuery(), token);
file.WriteLine("{{{0} : [", JsonConvert.SerializeObject(tableName));
var cntr = 0;
foreach (var entity in queryResult.Results)
{
var dynEntity = dynTable.Entity(entity.PartitionKey, entity.RowKey);
dynEntity.LoadAll().ToList(); // Force pre-downloading of all properties
file.WriteLine("{0}{1}", cntr++ > 0 ? "," : string.Empty,
JsonConvert.SerializeObject(dynEntity));
}
file.WriteLine("]}");
token = queryResult.ContinuationToken;
} while (token != null);
}
}
Console.WriteLine("Done. Press a key...");
Console.ReadKey();
}
}
}
