Multi-tenancy in Power BI Embedded

I have a multi-tenant web application and I am using a database per tenant approach. The web application will also use Power BI Embedded to show reports based on the data for that particular tenant and all reports for each tenant will have the same format but the data source will be different.
From what I've seen, there is no straightforward way to implement multi-tenancy in Power BI, such as passing the data source as a parameter. I managed to find two ways to make Power BI Embedded multi-tenant. One is row-level security, which would mean keeping all the tenants' data in a single data warehouse, and that is not an option for me. The other is having a workspace per tenant.
For the second option I would have a template workspace from which a copy would be created for each new tenant. This tutorial describes how to do it: https://powerbi.microsoft.com/fr-fr/blog/duplicate-workspaces-using-the-power-bi-rest-apis-a-step-by-step-tutorial/
Can the same thing be done through the Power BI C# SDK? I would also need to change the data source used per workspace. How can I do this for all reports in my workspace?
Finally, has anyone discovered an easier way to implement multi-tenancy with Power BI Embedded, or is this it?

It depends on your data source type (SQL Server, SSAS, CSV files, etc.) and data connectivity mode (import, DirectQuery, etc.). If you can use parameters, then one option is to let the newly cloned report switch its data source itself by using connection-specific parameters. To do this, open Power Query Editor by clicking Edit Queries, and in Manage Parameters define two new text parameters, let's name them ServerName and DatabaseName:
Set their current values to point to one of your data sources, e.g. SQLSERVER2016 and AdventureWorks2016. Then right-click your query in the report and open Advanced Editor. Find the server name and database name in the M code; for a SQL Server source this will be a line like Source = Sql.Database("SQLSERVER2016", "AdventureWorks2016"). Replace the hardcoded names with the parameters defined above, so the M code becomes Source = Sql.Database(ServerName, DatabaseName).
Now you can close and apply the changes, and your report should work as before. When you want to change the data source, do it using Edit Parameters and change the server and/or database name to point to the other data source you want to use for your report.
After changing the parameter values, Power BI Desktop will ask you to apply the changes and reload the data from the new data source. To change the parameter values (i.e. the data source) of a report published to the Power BI Service, go to the dataset's settings and enter the new server and/or database name (check the gateway settings too, if this is an on-premises data source).
After changing the data source, refresh your dataset to get the data from the new data source. With a Power BI Pro account you can do this 8 times per 24 hours, while if the dataset is in a dedicated capacity, this limit is raised to 48 times per 24 hours.
To do this programmatically, use the Update Parameters / Update Parameters In Group and Refresh Dataset / Refresh Dataset In Group REST API calls. For example, you can do this with PowerShell like this:
Import-Module MicrosoftPowerBIMgmt
Import-Module MicrosoftPowerBIMgmt.Profile

$password = "xxxxx" | ConvertTo-SecureString -AsPlainText -Force
$username = "xxxxx@yyyyy.com"
$credential = New-Object System.Management.Automation.PSCredential($username, $password)
Connect-PowerBIServiceAccount -Credential $credential

Invoke-PowerBIRestMethod -Url 'groups/xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx/datasets/xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx/Default.UpdateParameters' -Method Post -Body '{
    "updateDetails": [
        {
            "name": "ServerName",
            "newValue": "SQLSERVER2019"
        },
        {
            "name": "DatabaseName",
            "newValue": "AdventureWorks2019"
        }
    ]
}'

Invoke-PowerBIRestMethod -Url 'groups/xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx/datasets/xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx/refreshes' -Method Post

Disconnect-PowerBIServiceAccount
If you can't use parameters, e.g. with a Live connection to SSAS, the connection string can be changed using the Update Datasources In Group REST API call. In PowerShell this can be done like this:
Import-Module MicrosoftPowerBIMgmt
Import-Module MicrosoftPowerBIMgmt.Profile

$password = "xxxxx" | ConvertTo-SecureString -AsPlainText -Force
$username = "xxxxx@yyyyy.com"
$credential = New-Object System.Management.Automation.PSCredential($username, $password)
Connect-PowerBIServiceAccount -Credential $credential

Invoke-PowerBIRestMethod -Url 'groups/xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx/datasets/xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx/Default.UpdateDatasources' -Method Post -Body '{
    "updateDetails": [
        {
            "datasourceSelector": {
                "datasourceType": "AnalysisServices",
                "connectionDetails": {
                    "server": "My-As-Server",
                    "database": "My-As-Database"
                }
            },
            "connectionDetails": {
                "server": "New-As-Server",
                "database": "New-As-Database"
            }
        }
    ]
}'

Disconnect-PowerBIServiceAccount
Note that you need to provide both the old and the new server and database names.
In C# you can do the same in a very similar way, even without the Power BI client:
using System.Net.Http;
using System.Text;
using Newtonsoft.Json;

var group_id = "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx";
var dataset_id = "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx";

var client = new HttpClient();
client.DefaultRequestHeaders.Add("Accept", "application/json");
client.DefaultRequestHeaders.Add("Authorization", "Bearer " + accessToken);

// Update the connection parameters of the dataset
var restUrlUpdateParameters = $"https://api.powerbi.com/v1.0/myorg/groups/{group_id}/datasets/{dataset_id}/Default.UpdateParameters";
var postData = new { updateDetails = new[] { new { name = "ServerName", newValue = "NEWSERVER" }, new { name = "DatabaseName", newValue = "Another_AdventureWorks2016" } } };
var responseUpdate = client.PostAsync(restUrlUpdateParameters, new StringContent(JsonConvert.SerializeObject(postData), Encoding.UTF8, "application/json")).Result;

// Refresh the dataset so it loads the data from the new data source
var restUrlRefreshDataset = $"https://api.powerbi.com/v1.0/myorg/groups/{group_id}/datasets/{dataset_id}/refreshes";
var responseRefresh = client.PostAsync(restUrlRefreshDataset, null).Result;
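For the no-parameters case, the Default.UpdateDatasources call from the PowerShell example above can be issued with the same HttpClient; a sketch reusing the client and IDs defined above, with the same JSON body:
// Sketch: change an SSAS connection with Default.UpdateDatasources.
// Both the old and the new server/database names are required.
var restUrlUpdateDatasources = $"https://api.powerbi.com/v1.0/myorg/groups/{group_id}/datasets/{dataset_id}/Default.UpdateDatasources";
var updateBody = @"{
  ""updateDetails"": [
    {
      ""datasourceSelector"": {
        ""datasourceType"": ""AnalysisServices"",
        ""connectionDetails"": { ""server"": ""My-As-Server"", ""database"": ""My-As-Database"" }
      },
      ""connectionDetails"": { ""server"": ""New-As-Server"", ""database"": ""New-As-Database"" }
    }
  ]
}";
var responseDatasources = client.PostAsync(restUrlUpdateDatasources, new StringContent(updateBody, Encoding.UTF8, "application/json")).Result;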
Using the Power BI C# client can make your life easier, e.g. refreshing the dataset can be done this way:
var group_id = "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx";
var dataset_id = "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx";
var credentials = new TokenCredentials(accessToken, "Bearer");
using (var client = new PowerBIClient(new Uri("https://api.powerbi.com"), credentials))
{
    client.Datasets.RefreshDatasetInGroup(group_id, dataset_id);
}
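The parameter update is also wrapped by the client. A minimal sketch, assuming a Microsoft.PowerBI.Api version that exposes the UpdateMashupParametersRequest and UpdateMashupParameterDetails types (names and ID types have varied between SDK releases, so check yours):
var updateRequest = new UpdateMashupParametersRequest(new List<UpdateMashupParameterDetails>
{
    new UpdateMashupParameterDetails { Name = "ServerName", NewValue = "NEWSERVER" },
    new UpdateMashupParameterDetails { Name = "DatabaseName", NewValue = "Another_AdventureWorks2016" }
});

using (var client = new PowerBIClient(new Uri("https://api.powerbi.com"), credentials))
{
    // Update the connection parameters, then refresh so the dataset loads from the new source
    client.Datasets.UpdateParametersInGroup(group_id, dataset_id, updateRequest);
    client.Datasets.RefreshDatasetInGroup(group_id, dataset_id);
}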
When calling the API, you need to provide an access token. To acquire one, use the ADAL or MSAL libraries, e.g. with code like this:
using Microsoft.IdentityModel.Clients.ActiveDirectory;

private static string resourceUri = "https://analysis.windows.net/powerbi/api";
private static string authorityUri = "https://login.windows.net/common/"; // It was https://login.windows.net/common/oauth2/authorize in prior versions
private static string clientId = "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx"; // Register at https://dev.powerbi.com/apps
private static string groupId = "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx";
private static string reportId = "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx";

private static AuthenticationContext authContext = new AuthenticationContext(authorityUri, new TokenCache());

public string Authenticate()
{
    AuthenticationResult authenticationResult = null;

    // First check whether there is a token in the cache
    try
    {
        authenticationResult = authContext.AcquireTokenSilentAsync(resourceUri, clientId).Result;
    }
    catch (AggregateException ex)
    {
        AdalException ex2 = ex.InnerException as AdalException;
        if (ex2 == null || ex2.ErrorCode != "failed_to_acquire_token_silently")
        {
            MessageBox.Show(ex.Message);
            return null;
        }
    }

    // No cached token - authenticate with username and password
    if (authenticationResult == null)
    {
        var uc = new UserPasswordCredential("user@example.com", "Strong password");
        try
        {
            authenticationResult = authContext.AcquireTokenAsync(resourceUri, clientId, uc).Result;
        }
        catch (Exception ex)
        {
            MessageBox.Show(ex.Message + (ex.InnerException == null ? "" : Environment.NewLine + ex.InnerException.Message));
            return null;
        }
    }

    if (authenticationResult == null)
    {
        MessageBox.Show("Call failed.");
        return null;
    }

    return authenticationResult.AccessToken;
}
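Since ADAL is deprecated, here is a rough MSAL (Microsoft.Identity.Client) equivalent of the method above; a sketch assuming the username/password (ROPC) flow is permitted for the account, which is not the case for MFA-enabled users:
using System.Linq;
using System.Threading.Tasks;
using Microsoft.Identity.Client;

public async Task<string> AuthenticateMsal()
{
    // ROPC is not allowed against the /common endpoint, so use /organizations or a tenant ID
    var app = PublicClientApplicationBuilder
        .Create(clientId)
        .WithAuthority("https://login.microsoftonline.com/organizations")
        .Build();
    var scopes = new[] { "https://analysis.windows.net/powerbi/api/.default" };

    // Try the token cache first, then fall back to username/password
    var accounts = await app.GetAccountsAsync();
    try
    {
        var result = await app.AcquireTokenSilent(scopes, accounts.FirstOrDefault()).ExecuteAsync();
        return result.AccessToken;
    }
    catch (MsalUiRequiredException)
    {
        var result = await app.AcquireTokenByUsernamePassword(scopes, "user@example.com", "Strong password")
            .ExecuteAsync();
        return result.AccessToken;
    }
}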

Related

Can someone share datasets in Power BI across different service principal profiles?

We would like to expose some datasets that are common for all customers and could be imported into their Power BI reports across different workspaces.
We have a design where we use one service principal profile per customer. This provides security by isolating the data of each customer. Is there any way (using service principal profiles) to support the sharing of some common data across workspaces?
Yes, you can create a security group in Azure AD, add all the service principals that are mapped to your customers to that group, and then give that security group access by adding it with the required permissions on the workspace where the shared dataset exists. If you want to give access to individual service principals, you can do that too.
Your customers can then use the SP's client ID, tenant ID and secret to access the dataset from their app in the browser, or even by pulling the data via PowerShell. You can share your dataset directly with the security group containing your service principals.
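For instance, a minimal sketch of how a customer's app could acquire such a token with MSAL's client-credentials flow (all IDs below are placeholders):
using Microsoft.Identity.Client;

// Substitute the SP's client ID, tenant ID and secret
var app = ConfidentialClientApplicationBuilder
    .Create("<SP client ID>")
    .WithClientSecret("<SP client secret>")
    .WithAuthority("https://login.microsoftonline.com/<tenant ID>")
    .Build();
var result = await app
    .AcquireTokenForClient(new[] { "https://analysis.windows.net/powerbi/api/.default" })
    .ExecuteAsync();
// Send result.AccessToken as a Bearer token to https://api.powerbi.com/v1.0/myorg/...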
First, create a security group in Azure AD and add your service principals to the group.
Power BI settings:
Go to app.powerbi.com and log in to your Power BI workspace > Settings > Admin portal > Tenant settings > Developer settings > enable "Allow service principals to use Power BI APIs" > and then grant access to your security group.
There are two ways you can share your data with the service principals:
1. Sharing directly
Share the workspace dataset directly with the security group in which the service principals exist.
2. Access permissions on the workspace
Grant access to the workspace so the SPs can reach the dataset directly from their app or call the workspace via PowerShell.
Go to your workspace > select your dataset and click the ... dots > Manage permissions > Grant people access > select your Power BI Embed group.
You can change the read/write permissions later.
Calling the dataset from an application
Now we can call this dataset from our application by authenticating as the service principal and supplying the SP's client ID, tenant ID, client secret, etc. You can refer to this document:
https://learn.microsoft.com/en-us/power-bi/developer/embedded/embed-organization-app
Authentication code:
// This method gets called by the runtime. Use this method to add services to the container.
public void ConfigureServices(IServiceCollection services)
{
    services
        .AddMicrosoftIdentityWebAppAuthentication(Configuration)
        .EnableTokenAcquisitionToCallDownstreamApi(PowerBiServiceApi.RequiredScopes)
        .AddInMemoryTokenCaches();

    services.AddScoped(typeof(PowerBiServiceApi));

    var mvcBuilder = services.AddControllersWithViews(options =>
    {
        var policy = new AuthorizationPolicyBuilder()
            .RequireAuthenticatedUser()
            .Build();
        options.Filters.Add(new AuthorizeFilter(policy));
    });

    mvcBuilder.AddMicrosoftIdentityUI();
    services.AddRazorPages();
}
appsettings.json:
{
    "AzureAd": {
        "Instance": "https://login.microsoftonline.com/",
        "Domain": "xxxx.onmicrosoft.com",
        "TenantId": "xxxxxxxxxxxxx",
        "ClientId": "xxxxxxxxxxxxx",
        "ClientSecret": "xxxxxxxx",
        "CallbackPath": "/signin-oidc",
        "SignedOutCallbackPath": "/signout-callback-oidc"
    },
    "PowerBi": {
        "ServiceRootUrl": "https://api.powerbi.com"
    },
    "Logging": {
        "LogLevel": {
            "Default": "Information",
            "Microsoft": "Warning",
            "Microsoft.Hosting.Lifetime": "Information"
        }
    },
    "AllowedHosts": "*"
}
Controller.cs
public class HomeController : Controller
{
    private PowerBiServiceApi powerBiServiceApi;

    public HomeController(PowerBiServiceApi powerBiServiceApi)
    {
        this.powerBiServiceApi = powerBiServiceApi;
    }

    [AllowAnonymous]
    public IActionResult Index()
    {
        return View();
    }

    public async Task<IActionResult> Embed()
    {
        Guid workspaceId = new Guid("11111111-1111-1111-1111-111111111111");
        Guid reportId = new Guid("22222222-2222-2222-2222-222222222222");
        var viewModel = await powerBiServiceApi.GetReport(workspaceId, reportId);
        return View(viewModel);
    }

    [AllowAnonymous]
    [ResponseCache(Duration = 0, Location = ResponseCacheLocation.None, NoStore = true)]
    public IActionResult Error()
    {
        return View(new ErrorViewModel { RequestId = Activity.Current?.Id ?? HttpContext.TraceIdentifier });
    }
}
Embed your Power BI dataset with JavaScript:
$(function(){
    // 1 - Get DOM object for the div that is the report container
    let reportContainer = document.getElementById("embed-container");

    // 2 - Get report embedding data from the view model
    let reportId = window.viewModel.reportId;
    let embedUrl = window.viewModel.embedUrl;
    let token = window.viewModel.token;

    // 3 - Embed the report using the Power BI JavaScript API
    let models = window['powerbi-client'].models;
    let config = {
        type: 'report',
        id: reportId,
        embedUrl: embedUrl,
        accessToken: token,
        permissions: models.Permissions.All,
        tokenType: models.TokenType.Aad,
        viewMode: models.ViewMode.View,
        settings: {
            panes: {
                filters: { expanded: false, visible: true },
                pageNavigation: { visible: false }
            }
        }
    };

    // Embed the report and display it within the div container
    let report = powerbi.embed(reportContainer, config);
});
Adapt this code to the framework of your customer's app and run the app to access the Power BI data.
Accessing the Power BI workspace with PowerShell
Refer to the document here: https://learn.microsoft.com/en-us/powershell/module/microsoftpowerbimgmt.profile/connect-powerbiserviceaccount?view=powerbi-ps
PowerShell commands:
Install the Power BI PowerShell module:
Install-Module -Name MicrosoftPowerBIMgmt
Connect to Power BI as the service principal:
Connect-PowerBIServiceAccount -ServicePrincipal -Credential (Get-Credential) -Tenant 83331f4e-7f45-4ce4-99ed-af9038592395
For the user name, enter the app ID of the SP, and for the password, enter the secret that was created for the SP during app registration.
Once connected to Power BI successfully, get the workspaces:
Get-PowerBIWorkspace
References:
https://learn.microsoft.com/en-us/power-bi/developer/embedded/embed-service-principal
https://learn.microsoft.com/en-us/power-bi/developer/embedded/embed-organization-app

[Azure Deployments]: Can we filter Azure Resource Group Deployments by name using C#?

I am trying to fetch the Azure Resource Group Deployments whose name starts with "Deploy", but I can't find any documentation on the $filter.
I tried something like this:
try
{
    var credentials = new ClientSecretCredential(tenantId, clientId, clientSecret);
    var resourceClient = new ResourcesManagementClient(subscriptionId, credentials);
    var deployments = resourceClient.Deployments;
    AsyncPageable<DeploymentExtended> rgDeployments = deployments.ListByResourceGroupAsync("myRG", "name eq 'Deploy-20210412184314'");
    await foreach (DeploymentExtended deploymentProperties in rgDeployments)
    {
        Trace.WriteLine(deploymentProperties.Name);
    }
}
catch (Exception e)
{
    Trace.WriteLine(e.Message);
}
But it gives an error saying:
{"error":{"code":"InvalidProvisioningStateFilter","message":"Invalid $filter 'name eq 'Deploy-20210412184314'' specified in the query string."}}
So can we only use filters like provisioningState eq '{state}', and not filter by name?
I am using the Azure Resources Management client library for .NET.
Please refer to this documentation.
I am afraid you cannot use the filter name eq 'Deploy-20210412184314'. In this case, if you already know the name of the deployment and want to get it at the resource group scope, there is no need to use ListByResourceGroupAsync; just use the method DeploymentsOperations.GetAsync(String, String, CancellationToken), pass the resourceGroupName and deploymentName, and you can simply get it.
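A sketch reusing the deployments object from the question: fetch the known deployment directly, or list and filter by name prefix on the client side (the filtering runs locally, since $filter does not support the name property):
// Get a known deployment directly (no $filter needed):
Response<DeploymentExtended> known = await deployments.GetAsync("myRG", "Deploy-20210412184314");
Trace.WriteLine(known.Value.Name);

// Or list all deployments in the resource group and filter by name locally:
await foreach (DeploymentExtended d in deployments.ListByResourceGroupAsync("myRG"))
{
    if (d.Name.StartsWith("Deploy"))
        Trace.WriteLine(d.Name);
}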

Is there a nicer way to export an Azure DevOps project's templates (fields and states)?

I use the 'witadmin listfields' command for the whole collection, but I am wondering if I could scope fields/states to just a single project.
The reason behind this: sometimes I migrate a TFS project to an existing Azure DevOps project, and collecting data about fields takes a lot of manual work. So I am wondering about automating this process...
Many thanks!
You can use the REST API to get the fields/states of a project. See below:
Work Item Types Field - List
GET https://{instance}/{collection}/{project}/_apis/wit/workitemtypes/{type}/fields?api-version=4.1
Work Item Type States - List
GET https://{instance}/{collection}/{project}/_apis/wit/workitemtypes/{type}/states?api-version=4.1-preview.1
The example below calls the above REST APIs from a PowerShell script:
[string]$userName = 'domain\username'
[string]$userPassword = 'password'

# Convert to SecureString
[securestring]$secStringPassword = ConvertTo-SecureString $userPassword -AsPlainText -Force
[pscredential]$credObject = New-Object System.Management.Automation.PSCredential ($userName, $secStringPassword)

$uri = "http://{instance}/{collection}/{project}/_apis/wit/workitemtypes/Bug/fields?api-version=4.1"

$invRestMethParams = @{
    Credential  = $credObject
    Uri         = $uri
    Method      = 'Get'
    ContentType = 'application/json'
}
Invoke-RestMethod @invRestMethParams
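If you prefer C# over PowerShell, a rough sketch of the same GET call; the {instance}/{collection}/{project} placeholders are kept from the URLs above, and the credentials are hypothetical:
using System;
using System.Net;
using System.Net.Http;

// Windows credentials for an on-premises collection; for Azure DevOps Services
// you would use a PAT in a Basic authorization header instead.
var handler = new HttpClientHandler
{
    Credentials = new NetworkCredential("username", "password", "domain")
};
using (var client = new HttpClient(handler))
{
    var url = "http://{instance}/{collection}/{project}/_apis/wit/workitemtypes/Bug/fields?api-version=4.1";
    var json = await client.GetStringAsync(url); // inside an async method
    Console.WriteLine(json);
}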

Kusto data ingestion from an Azure Function App ends with a 403

I am trying to ingest data from an Azure Function app into an ADX database. I followed the instructions found in the article here.
The difference is that I'd like to insert data into the table. I am struggling with a 403 error: "Principal 'aadapp=;' is not authorized to access table".
What I did:
I created an AAD app with the following API permissions configured.
I configured the database via Kusto Explorer:
.add database myDB ingestors ('aadapp=;') 'theAADAppname'
.add table PressureRecords ingestors ('aadapp=;') 'theAADAppname'
.add table TemperatureRecords ingestors ('aadapp=;') 'theAADAppname'
My code:
var kcsbDM = new KustoConnectionStringBuilder($"https://ingest-{serviceNameAndRegion}.kusto.windows.net:443/").WithAadApplicationKeyAuthentication(
    applicationClientId: "<my AD app Id>",
    applicationKey: "<my App Secret from Certificates & secrets>",
    authority: "<my tenant Id>");

using (var ingestClient = KustoIngestFactory.CreateQueuedIngestClient(kcsbDM))
{
    var ingestProps = new KustoQueuedIngestionProperties(databaseName, tableName);
    ingestProps.ReportLevel = IngestionReportLevel.FailuresAndSuccesses;
    ingestProps.ReportMethod = IngestionReportMethod.Queue;
    ingestProps.JSONMappingReference = mappingName;
    ingestProps.Format = DataSourceFormat.json;

    using (var memStream = new MemoryStream())
    using (var writer = new StreamWriter(memStream))
    {
        var messageString = JsonConvert.SerializeObject(myObject); // maps to the table / mapping
        writer.WriteLine(messageString);
        writer.Flush();
        memStream.Seek(0, SeekOrigin.Begin);

        // Post ingestion message
        ingestClient.IngestFromStream(memStream, ingestProps, leaveOpen: true);
    }
}
The issue is that the mapping you are using in this ingestion command does not match the existing table schema (it has additional columns). In these cases Azure Data Explorer (Kusto) attempts to add the additional columns it finds in the mappings. Since the permission that the app has is 'ingestor', it cannot modify the table structure and thus the ingestion fails.
In your specific case, your table has a column that is written in a specific casing and in the ingestion mapping the same column has a different casing (for one character) so it is treated as a new column.
We will look into providing a better error message in this case.
Update: the issue is fixed in the system and now it works as expected.
Avnera, thanks for your hint; potentially it is an issue because of the real vs. double translation. In one of my first tries I used double in the table and that worked. That is no longer possible; it looks like the supported data types have changed.
My current configuration:
.create table PressureRecords ( Timestamp:datetime, DeviceId:guid, Pressure:real )
.create-or-alter table PressureRecords ingestion json mapping "PressureRecords"
'['
'{"column":"TimeStamp","path":"$.DateTime","datatype":"datetime","transform":null},'
'{"column":"DeviceId","path":"$.DeviceId","datatype":"guid","transform":null},'
'{"column":"Pressure","path":"$.Pressure","datatype":"real","transform":null}'
']'
public class PressureRecord
{
    [JsonProperty(PropertyName = "Pressure")]
    public double Pressure { get; set; }

    [JsonProperty(PropertyName = "DateTime")]
    public DateTime DateTime { get; set; } = DateTime.Now;

    [JsonProperty(PropertyName = "DeviceId")]
    [Key]
    public Guid DeviceId { get; set; }
}
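Note that the mapping above still uses "TimeStamp" while the table column is declared as "Timestamp". Per the answer above, a one-character casing difference like this is treated as a new column during ingestion, so the mapping line should read "column":"Timestamp" to match the table schema.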

Project Server Online CSOM - GeneralSecurityAccessDenied while reading TimePhase Assignments

This is my first SO question, so please let me know if this question is not very clear or if I am missing anything.
FYI: SO prevented me from attaching links, so sorry for all the bad formatting.
Overview
I'm trying to read (and write) the "Actual work" for a resource in Project Server Online by using the CSOM library available from Microsoft. Reading and writing the assignments and actual work is working perfectly, as long as I am reading the assignments for the currently authenticated user. If I attempt to read them for another resource, I receive a GeneralSecurityAccessDenied error.
I've done this in the past using impersonation, which is supposed to be invoked transparently in the background if the user has the StatusBrokerPermission, but it doesn't seem to be working for me. Impersonation has been removed in 2013+, so that's no longer an option.
Problem summary
The CSOM is supposed to transparently enable statusing extensions to allow status updates to be made for resources other than the currently authenticated user (as long as the user has the status broker permission). This works fine for adding new assignments, but does not work when trying to update actual TimePhased hours via the TimePhased assignments. The assignments cannot be queried, and thus, we cannot call SubmitAllStatusUpdates to submit the hours.
Research
Usage scenarios for the CSOM: https://msdn.microsoft.com/en-us/library/office/jj163082(v=office.15).aspx#pj15_WhatTheCSOM_UsageScenarios
Impersonation deprecated: https://msdn.microsoft.com/en-us/library/office/ee767690(v=office.15).aspx#pj15_WhatsNew_Deprecated
Picture: Supposed to read on behalf of another user...
People with the same problem # 1: https://social.technet.microsoft.com/Forums/projectserver/en-US/dccdb543-18a1-4a0e-a948-5d861305516e/how-to-get-resource-assignments-summary-view-data-project-server-online-2013?forum=projectonline
People with the same problem # 2: http://uzzai.com/ZB43wp95/ps2013-app-how-to-read-and-update-timephased-data-with-jsom-javascript-csom.html
People with the same problem # 4: https://social.technet.microsoft.com/Forums/Sharepoint/en-US/be27d497-e959-44b6-97cb-8f19fe0278fe/csom-how-to-set-timephase-data-on-an-assignment?forum=project2010custprog
Other things I've tried
Using the CSOM with the MsOnlineClaimsHelper to retrieve the FedAuth cookies for a user (and assigning them using the CookieContainer).
Using the REST/OData API.
a) https://URL.sharepoint.com/sites/pwa/_api/ProjectServer/EnterpriseResources('c39ba8f1-00fe-e311-8894-00155da45f0e')/Assignments/GetTimePhaseByUrl(start='2014-12-09',end='2014-12-09')/assignments
Enabling the "StatusBrokerPermission" for the user
Unchecking the “Only allow task updates via Tasks and Timesheets.” option within the server settings screen (Task settings and display).
Creating a SharePoint-hosted app and using JSOM code equivalent to the CSOM code above.
a) The code we wrote was JavaScript being executed from within SharePoint app, so we did not need to provide authentication. The user who was logged in had the StatusBrokerPermission.
Using a Provider-hosted SharePoint app and using the CSOM code above. We tried using all authentication methods for CSOM above, with an additional test:
a) using Fiddler to view the FedAuth cookies being set by the SharePoint app authentication, and overriding the WebRequest to manually insert the FedAuth/rtFA cookies: webRequestEventArgs.WebRequestExecutor.WebRequest.CookieContainer = getStaticCookieContainer();
Using timesheets to submit time phased data.
a) We can only create a timesheet for the currently-authenticated user, and cannot populate timesheet lines with projects / assignments not available to him (or a GeneralItemDoesNotExist error is thrown).
Manually issuing a “SubmitAllStatusUpdates” CSOM request using Fiddler, as a different user.
a) The purpose of this test was to determine if we can write time phased data, even if we can’t read it.
Making sure the project was checked out to the current user.
Using administrative delegation for a resource.
Setting all available options within project permissions.
Using the Project Web UI to enter the TimePhased data for other resources.
Using SharePoint permission mode instead of Project Permission Mode.
The code
using System;
using System.Security;
using Microsoft.ProjectServer.Client;
using Microsoft.SharePoint.Client;

namespace ProjectOnlineActuals
{
    static class Program
    {
        const string projectSite = "https://URL.sharepoint.com/sites/pwa/";
        private const string edward = "c39ba8f1-00fe-e311-8894-00155da45f0e";
        private const string admin = "8b1bcfa4-1b7f-e411-af75-00155da4630b";

        static void Main(string[] args)
        {
            TestActuals();
        }

        private static void TestActuals()
        {
            Console.WriteLine("Attempting test # 1 (login: admin, resource: admin)");
            TestActuals("admin@URL.onmicrosoft.com", "123", admin);

            Console.WriteLine("Attempting test # 2 (login: admin, resource: edward)");
            TestActuals("adminy@hmssoftware.onmicrosoft.com", "123", edward);

            Console.ReadLine();
        }

        private static void TestActuals(string username, string password, string resourceID)
        {
            try
            {
                using (ProjectContext context = new ProjectContext(projectSite))
                {
                    DateTime startDate = DateTime.Now.Date;
                    DateTime endDate = DateTime.Now.Date;

                    Login(context, username, password);

                    context.Load(context.Web); // Query for Web
                    context.ExecuteQuery(); // Execute

                    Guid gResourceId = new Guid(resourceID);
                    EnterpriseResource enterpriseResource = context.EnterpriseResources.GetByGuid(gResourceId);
                    context.Load(enterpriseResource, p => p.Name, p => p.Assignments, p => p.Email);

                    Console.Write("Loading resource...");
                    context.ExecuteQuery();
                    Console.WriteLine("done! {0}".FormatWith(enterpriseResource.Name));

                    Console.Write("Adding new resource assignment to collection...");
                    enterpriseResource.Assignments.Add(new StatusAssignmentCreationInformation
                    {
                        Comment = "testing comment - 2016-02-17",
                        ProjectId = new Guid("27bf182c-2339-e411-8e76-78e3b5af0525"),
                        Task = new StatusTaskCreationInformation
                        {
                            Start = DateTime.Now,
                            Finish = DateTime.Now.AddDays(2),
                            Name = "testing - 2016-02-17",
                        }
                    });
                    Console.WriteLine("done!");

                    Console.Write("Trying to save new resource assignment...");
                    enterpriseResource.Assignments.Update();
                    context.ExecuteQuery();
                    Console.WriteLine("done!");

                    Console.Write("Loading TimePhase...");
                    TimePhase timePhase = enterpriseResource.Assignments.GetTimePhase(startDate.Date, endDate.Date);
                    context.ExecuteQuery();
                    Console.WriteLine("done!");

                    Console.Write("Loading TimePhase assignments...");
                    context.Load(timePhase.Assignments);
                    context.ExecuteQuery();
                    Console.WriteLine("done! Found {0} assignments.".FormatWith(timePhase.Assignments.Count));

                    Console.WriteLine("Updating TimePhase assignments...");
                    foreach (var assignment in timePhase.Assignments)
                    {
                        Console.WriteLine("Updating assignment: {0}. ActualWork: {1}".FormatWith(assignment.Name, assignment.ActualWork));
                        assignment.ActualWork = "9h";
                        assignment.RegularWork = "3h";
                        assignment.RemainingWork = "0h";
                    }

                    timePhase.Assignments.SubmitAllStatusUpdates("Status update comment test 2016-02-17");
                    context.ExecuteQuery();
                    Console.WriteLine("done!");

                    Console.WriteLine("Success (retrieved & updated {0} time phase assignments)!".FormatWith(timePhase.Assignments.Count));
                }
            }
            catch (Exception ex)
            {
                if (ex.ToString().Contains("GeneralSecurityAccessDenied"))
                    Console.WriteLine("ERROR! - GeneralSecurityAccessDenied");
                else
                    throw;
            }
            finally
            {
                Console.WriteLine();
                Console.WriteLine();
            }
        }

        private static void Login(ProjectContext projContext, string username, string password)
        {
            var securePassword = new SecureString();
            foreach (char c in password)
                securePassword.AppendChar(c);
            projContext.Credentials = new SharePointOnlineCredentials(username, securePassword);
        }

        static string FormatWith(this string str, params object[] args)
        {
            return String.Format(str, args);
        }
    }
}
Can anyone help?
