I use the 'witadmin listfields' command for the whole collection, but I'm wondering if I could scope the fields/states to just a single project?
The reason behind this: sometimes I migrate a TFS project into an existing Azure DevOps project, and collecting data about the fields takes a lot of manual work, so I'm wondering about automating this process...
Many thanks!
You can use the REST API to get the fields/states of a project. See below:
Work Item Types Field - List
GET https://{instance}/{collection}/{project}/_apis/wit/workitemtypes/{type}/fields?api-version=4.1
Work Item Type States - List
GET https://{instance}/{collection}/{project}/_apis/wit/workitemtypes/{type}/states?api-version=4.1-preview.1
The example below calls the fields API from a PowerShell script:
[string]$userName = 'domain\username'
[string]$userPassword = 'password'
# Convert the plain-text password to a SecureString and build a credential
[securestring]$secStringPassword = ConvertTo-SecureString $userPassword -AsPlainText -Force
[pscredential]$credObject = New-Object System.Management.Automation.PSCredential ($userName, $secStringPassword)
$uri = "http://{instance}/{collection}/{project}/_apis/wit/workitemtypes/Bug/fields?api-version=4.1"
# Splat the request parameters into Invoke-RestMethod
$invRestMethParams = @{
    Credential  = $credObject
    Uri         = $uri
    Method      = 'Get'
    ContentType = 'application/json'
}
Invoke-RestMethod @invRestMethParams
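To pull the states as well, you can reuse the same splatted parameters against the states endpoint listed above (note the preview api-version); a minimal sketch:

# States of the Bug work item type (still a preview API in this version)
$invRestMethParams.Uri = "http://{instance}/{collection}/{project}/_apis/wit/workitemtypes/Bug/states?api-version=4.1-preview.1"
Invoke-RestMethod @invRestMethParams

To cover every work item type in the project, you could first call the Work Item Types - List endpoint (GET https://{instance}/{collection}/{project}/_apis/wit/workitemtypes?api-version=4.1) and loop over the returned types.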
I have a multi-tenant web application and I am using a database-per-tenant approach. The web application will also use Power BI Embedded to show reports based on the data for that particular tenant. All reports will have the same format for every tenant, but the data source will be different.
From what I've seen, there is no straightforward way to implement multi-tenancy in Power BI, such as passing the data source as a parameter. I managed to find two ways to make Power BI Embedded multi-tenant. One is to use row-level security, which would mean keeping a single data warehouse with all the tenants' data, and this is not an option for me. The other is having a workspace per tenant.
For the second option I would have a template workspace from which a copy would be created for each new tenant. This tutorial describes how to do it: https://powerbi.microsoft.com/fr-fr/blog/duplicate-workspaces-using-the-power-bi-rest-apis-a-step-by-step-tutorial/
Can the same thing be done through the Power BI C# SDK? I would also need to change the data source used per workspace. How can I do this for all reports in my workspace?
Finally, has someone discovered an easier way to implement multi-tenancy with Power BI Embedded, or is this it?
It depends on your data source type (SQL Server, SSAS, CSV files, etc.) and data connectivity mode (import, DirectQuery, etc.). If you can use parameters, then one option is to let the newly cloned report switch its data source itself by using connection-specific parameters. To do this, open Power Query Editor by clicking Edit Queries and, in Manage Parameters, define two new text parameters, let's name them ServerName and DatabaseName:
Set their current values to point to one of your data sources, e.g. SQLSERVER2016 and AdventureWorks2016. Then right click your query in the report, open Advanced Editor, and find the server name and database name in the M code so that you can replace them with the parameters defined above, as sketched below.
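For a plain SQL Server import source (an assumption; the exact line depends on your connector), the M code presumably starts out something like this:

Source = Sql.Database("SQLSERVER2016", "AdventureWorks2016")

and after swapping in the two parameters it becomes:

Source = Sql.Database(ServerName, DatabaseName)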
Now you can close and apply the changes, and your report should work as before. When you later want to change the data source, do it using Edit Parameters, changing the server and/or database name to point to the other data source that you want to use for your report.
After changing the parameter values, Power BI Desktop will ask you to apply the changes and reload the data from the new data source. To change the parameter values (i.e. the data source) of a report published in the Power BI Service, go to the dataset's settings and enter the new server and/or database name (check the gateway settings too, if this is an on-premises data source).
After changing the data source, refresh the dataset to get the data from the new source. With a Power BI Pro account you can do this 8 times per 24 hours, while if the dataset is in a dedicated capacity, this limit is raised to 48 times per 24 hours.
To do this programmatically, use the Update Parameters / Update Parameters In Group and Refresh Dataset / Refresh Dataset In Group REST API calls. For example, in PowerShell:
Import-Module MicrosoftPowerBIMgmt
Import-Module MicrosoftPowerBIMgmt.Profile

$password = "xxxxx" | ConvertTo-SecureString -AsPlainText -Force
$username = "xxxxx@yyyyy.com"
$credential = New-Object System.Management.Automation.PSCredential($username, $password)
Connect-PowerBIServiceAccount -Credential $credential

# Update the connection-specific parameters of the dataset
Invoke-PowerBIRestMethod -Url 'groups/xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx/datasets/xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx/Default.UpdateParameters' -Method Post -Body '{
  "updateDetails": [
    {
      "name": "ServerName",
      "newValue": "SQLSERVER2019"
    },
    {
      "name": "DatabaseName",
      "newValue": "AdventureWorks2019"
    }
  ]
}'

# Refresh the dataset so it loads the data from the new source
Invoke-PowerBIRestMethod -Url 'groups/xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx/datasets/xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx/refreshes' -Method Post

Disconnect-PowerBIServiceAccount
If you can't use parameters, e.g. with a live connection to SSAS, the connection string can be changed using the Update Datasources In Group REST API call. In PowerShell this can be done like this:
Import-Module MicrosoftPowerBIMgmt
Import-Module MicrosoftPowerBIMgmt.Profile

$password = "xxxxx" | ConvertTo-SecureString -AsPlainText -Force
$username = "xxxxx@yyyyy.com"
$credential = New-Object System.Management.Automation.PSCredential($username, $password)
Connect-PowerBIServiceAccount -Credential $credential

# Repoint the SSAS live connection from the old server/database to the new one
Invoke-PowerBIRestMethod -Url 'groups/xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx/datasets/xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx/Default.UpdateDatasources' -Method Post -Body '{
  "updateDetails": [
    {
      "datasourceSelector": {
        "datasourceType": "AnalysisServices",
        "connectionDetails": {
          "server": "My-As-Server",
          "database": "My-As-Database"
        }
      },
      "connectionDetails": {
        "server": "New-As-Server",
        "database": "New-As-Database"
      }
    }
  ]
}'

Disconnect-PowerBIServiceAccount
Note that you need to provide both the old and the new server and database names.
In C# you can do the same in a very similar way, even without the Power BI client library:
// Requires references to System.Net.Http and Newtonsoft.Json
var group_id = "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx";
var dataset_id = "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx";

var client = new HttpClient();
client.DefaultRequestHeaders.Add("Accept", "application/json");
client.DefaultRequestHeaders.Add("Authorization", "Bearer " + accessToken);

// Update the connection-specific parameters of the dataset
var restUrlUpdateParameters = $"https://api.powerbi.com/v1.0/myorg/groups/{group_id}/datasets/{dataset_id}/Default.UpdateParameters";
var postData = new { updateDetails = new[] { new { name = "ServerName", newValue = "NEWSERVER" }, new { name = "DatabaseName", newValue = "Another_AdventureWorks2016" } } };
var responseUpdate = client.PostAsync(restUrlUpdateParameters, new StringContent(JsonConvert.SerializeObject(postData), Encoding.UTF8, "application/json")).Result;

// Refresh the dataset to load the data from the new source
var restUrlRefreshDataset = $"https://api.powerbi.com/v1.0/myorg/groups/{group_id}/datasets/{dataset_id}/refreshes";
var responseRefresh = client.PostAsync(restUrlRefreshDataset, null).Result;
Using the Power BI C# client library can make your life easier, e.g. refreshing the dataset can be done this way:
var group_id = "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx";
var dataset_id = "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx";
var credentials = new TokenCredentials(accessToken, "Bearer");
using (var client = new PowerBIClient(new Uri("https://api.powerbi.com"), credentials))
{
    client.Datasets.RefreshDatasetInGroup(group_id, dataset_id);
}
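The parameter update can go through the client as well; a minimal sketch, assuming your version of the Microsoft.PowerBI.Api package exposes Datasets.UpdateParametersInGroup together with the UpdateMashupParametersRequest / UpdateMashupParameterDetails types (signatures have changed between SDK releases, so check yours):

using (var client = new PowerBIClient(new Uri("https://api.powerbi.com"), credentials))
{
    // Same payload as the Default.UpdateParameters REST call shown earlier
    var request = new UpdateMashupParametersRequest(new List<UpdateMashupParameterDetails>
    {
        new UpdateMashupParameterDetails { Name = "ServerName", NewValue = "NEWSERVER" },
        new UpdateMashupParameterDetails { Name = "DatabaseName", NewValue = "Another_AdventureWorks2016" }
    });
    client.Datasets.UpdateParametersInGroup(group_id, dataset_id, request);
}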
When calling the API, you need to provide an access token. To acquire one, use the ADAL or MSAL libraries, e.g. with code like this:
// Requires the ADAL package (Microsoft.IdentityModel.Clients.ActiveDirectory)
private static string resourceUri = "https://analysis.windows.net/powerbi/api";
private static string authorityUri = "https://login.windows.net/common/"; // It was https://login.windows.net/common/oauth2/authorize in prior versions
private static string clientId = "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx"; // Register at https://dev.powerbi.com/apps
private static string groupId = "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx";
private static string reportId = "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx";

private static AuthenticationContext authContext = new AuthenticationContext(authorityUri, new TokenCache());

public string Authenticate()
{
    AuthenticationResult authenticationResult = null;

    // First check whether there is a token in the cache
    try
    {
        authenticationResult = authContext.AcquireTokenSilentAsync(resourceUri, clientId).Result;
    }
    catch (AggregateException ex)
    {
        AdalException ex2 = ex.InnerException as AdalException;
        if (ex2 == null || ex2.ErrorCode != "failed_to_acquire_token_silently")
        {
            MessageBox.Show(ex.Message);
            return null;
        }
    }

    // No cached token - acquire one using username and password
    if (authenticationResult == null)
    {
        var uc = new UserPasswordCredential("user@example.com", "Strong password");
        try
        {
            authenticationResult = authContext.AcquireTokenAsync(resourceUri, clientId, uc).Result;
        }
        catch (Exception ex)
        {
            MessageBox.Show(ex.Message + (ex.InnerException == null ? "" : Environment.NewLine + ex.InnerException.Message));
            return null;
        }
    }

    if (authenticationResult == null)
    {
        MessageBox.Show("Call failed.");
        return null;
    }

    return authenticationResult.AccessToken;
}
Trying to figure out how to read an XML file from public blob storage.
When the XML was inline in my PowerShell script, I was able to reference it like so:
[xml]$xml = #"
<?xml version="1.0"?>
<PublicSubnets>
<Subnet>1.2.3.0/24</Subnet>
<Subnet>2.3.4.0/24</Subnet>
<Subnet>192.168.1.0/24</Subnet>
<Subnet>10.0.0.0/8</Subnet>
</PublicSubnets>
"#
foreach ($subnet in $xml.PublicSubnets.Subnet)
{
    $subnet
}
This would print the four subnets.
I'm trying to move that XML into a separate XML document that's stored in public blob storage.
Example: https://teststorage.blob.core.windows.net/automation/Subnet.xml
So basically I'm looking for the same functionality I had with the XML included in the script, but with the XML hosted externally (in the hosted file I removed the first and last lines, i.e. the [xml]$xml = @" and "@ here-string wrappers).
Generate a SAS token for the blob, then put the blob URI and the SAS token into the variables $XmlBlobUrl (the URI of the XML blob) and $SASToken (the query string of the SAS token):
$XmlBlobUrl = "https://BlobStorageAccount.blob.core.windows.net/subnets.xml"
$SASToken = "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
$Url = $XmlBlobUrl + $SASToken

$Subnets = @()
# Download the blob and cast the response body to XML
[xml]$xml = Invoke-WebRequest $Url | Select-Object -Expand Content
foreach ($subnet in $xml.PublicSubnets.Subnet) {
    $Subnets += $subnet
}
$Subnets
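Since the blob in the question is public, the SAS token shouldn't even be necessary; a minimal sketch using the example URL from the question, assuming the container really allows anonymous read access:

# Anonymous read of a public blob, cast straight to XML
[xml]$xml = (Invoke-WebRequest "https://teststorage.blob.core.windows.net/automation/Subnet.xml").Content
$xml.PublicSubnets.Subnet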
I'm enabling HTTPS on my IIS server where I have SharePoint Services 3.0 installed, and I'd like to programmatically update the default alternate access mappings for a single web application and my Central Administration instance (both on the same machine). The PowerShell code I have so far adds a mapping for HTTPS, but I get an error when trying to remove the original one.
Here's my code:
[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint")
$SPWebServiceCollection = New-Object Microsoft.SharePoint.Administration.SPWebServiceCollection ([Microsoft.SharePoint.Administration.SPFarm]::Local)
foreach ($SPWebService in $SPWebServiceCollection) {
    foreach ($webApplication in $SPWebService.WebApplications) {
        Write-Host ('Updating {0}' -f $webApplication.Name)
        foreach ($alternateUrl in $webApplication.AlternateUrls) {
            $incomingUrl = [System.Uri] $alternateUrl.IncomingUrl
            $newURL = 'https://{0}{1}' -f $incomingUrl.Authority, $incomingUrl.PathAndQuery
            $newAltURL = New-Object Microsoft.SharePoint.Administration.SPAlternateUrl ($newURL, $alternateUrl.UrlZone)
            $webApplication.AlternateUrls.Add($newAltURL)
            $webApplication.AlternateUrls.Update($true)
            $webApplication.AlternateUrls.Remove($alternateUrl) # Throws exception
            $webApplication.AlternateUrls.Update($true)
        }
    }
}
Here is the error I get when I try to remove the original:
Exception calling "Remove" with "1" argument(s): "An object in the SharePoint administrative framework, "SPAlternateUrlCollection Name=SharePoint - 1000 Parent=SPFarm Name=SharePoint_Config_8ddd3701-a332-4e79-98e4-fa11c1b6c17c", could not be deleted because other objects depend on it. Update all of these dependants to point to null or different objects and retry this operation. The dependant objects are as follows:
SPWebApplication Name=SharePoint - 1000 Parent=SPWebService
However, I'm not sure how to do what the exception suggests.
Ah... it looks like you are trying to remove the URL the web service is using...
It turns out there's another method for the existing default entry that I overlooked:
$webApplication.AlternateUrls.SetResponseUrl($newAltURL)
The full script then becomes:

[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint")
$SPWebServiceCollection = New-Object Microsoft.SharePoint.Administration.SPWebServiceCollection ([Microsoft.SharePoint.Administration.SPFarm]::Local)
foreach ($SPWebService in $SPWebServiceCollection) {
    foreach ($webApplication in $SPWebService.WebApplications) {
        Write-Host ('Updating {0}' -f $webApplication.Name)
        foreach ($alternateUrl in $webApplication.AlternateUrls) {
            $incomingUrl = [System.Uri] $alternateUrl.IncomingUrl
            $newURL = 'https://{0}{1}' -f $incomingUrl.Authority, $incomingUrl.PathAndQuery
            $newAltURL = New-Object Microsoft.SharePoint.Administration.SPAlternateUrl ($newURL, $alternateUrl.UrlZone)
            # Replace the response URL for the zone instead of Add + Remove
            $webApplication.AlternateUrls.SetResponseUrl($newAltURL)
            $webApplication.AlternateUrls.Update($true)
        }
    }
}
How do you figure out the current size of a SharePoint web application? Better yet, the size of a site collection or a subsite?
I am planning to move a site collection from one farm to another. I need to plan the storage capacity first.
All content for SharePoint is stored in content databases (unless you are using some sort of 3rd party external BLOB provider).
A site collection (aka top-level site) is stored in a single content database, but each content database can hold multiple site collections.
You can work out the size of the content databases using SQL Management Studio and stored procedures (though beware that these figures can include overhead like log files or allocated but unused space); see the sketch after this list.
You can use the open source SPUsedSpaceInfo utility.
You can use free tools like BLOBulator.
Programmatically, you can loop through the folders and subwebs of an SPWeb and add up the size of all the contents.
These approaches will give slightly different results, e.g. one looks at the size of the documents stored, another at the size of the content database storing those documents. None of them will include the files in C:\Inetpub\wwwroot\wss\VirtualDirectories\80 or C:\Program Files\Common Files\Microsoft Shared\web server extensions\12, but those are nearly always insignificant compared to the size of the documents stored in SharePoint.
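A minimal sketch of the stored-procedure route, assuming the SqlServer PowerShell module is available and a content database named WSS_Content (both the instance and database names here are hypothetical placeholders):

# sp_spaceused reports reserved/data/index/unused figures for the database;
# database_size also counts log files and unallocated space
Invoke-Sqlcmd -ServerInstance "SQLSERVER" -Database "WSS_Content" -Query "EXEC sp_spaceused"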
You can see the size (in bytes) by opening the SharePoint 2010 Management Shell (run it as Administrator) and executing:
> Start-SPAssignment -Global
> (Get-SPSiteAdministration -Identity http://YourSharePointURL/urlToYourSite/).DiskUsed
If you would also like to know the size of each subsite, run the following script under the SharePoint Management Shell:
function GetWebSizes ($StartWeb)
{
    $web = Get-SPWeb $StartWeb
    [long]$total = 0
    $total += GetWebSize -Web $web
    $total += GetSubWebSizes -Web $web
    $totalInMb = ($total/1024)/1024
    $totalInMb = "{0:N2}" -f $totalInMb
    $totalInGb = (($total/1024)/1024)/1024
    $totalInGb = "{0:N2}" -f $totalInGb
    Write-Host "Total size of all sites below" $StartWeb "is" $total "Bytes,"
    Write-Host "which is" $totalInMb "MB or" $totalInGb "GB"
    $web.Dispose()
}

function GetWebSize ($Web)
{
    [long]$subtotal = 0
    foreach ($folder in $Web.Folders)
    {
        $subtotal += GetFolderSize -Folder $folder
    }
    Write-Host "Site" $Web.Title "is" $subtotal "Bytes"
    return $subtotal
}

function GetSubWebSizes ($Web)
{
    [long]$subtotal = 0
    foreach ($subweb in $Web.GetSubwebsForCurrentUser())
    {
        [long]$webtotal = 0
        foreach ($folder in $subweb.Folders)
        {
            $webtotal += GetFolderSize -Folder $folder
        }
        Write-Host "Site" $subweb.Title "is" $webtotal "Bytes"
        $subtotal += $webtotal
        $subtotal += GetSubWebSizes -Web $subweb
    }
    return $subtotal
}

function GetFolderSize ($Folder)
{
    [long]$folderSize = 0
    foreach ($file in $Folder.Files)
    {
        $folderSize += $file.Length
    }
    foreach ($fd in $Folder.SubFolders)
    {
        $folderSize += GetFolderSize -Folder $fd
    }
    return $folderSize
}
Then:
GetWebSizes -StartWeb <startURL>
I hope this will help you... :)
Source: http://get-spscripts.com/2010/08/check-size-of-sharepoint-2010-sites.html