How to modify Azure Analysis Services roles using a Logic App?

With Azure Data Factory I have built a pipeline to orchestrate the processing of my Azure Analysis Services model through a dedicated Logic App, as explained in this article, and it works properly.
Now, still using Azure Data Factory (through the Logic App), I would also like to update the list of users in a specific role.
In the article mentioned above, to process the Azure Analysis Services models, the Logic App calls a specific API that has the following format:
https://<rollout>.asazure.windows.net/servers/<serverName>/models/<resource>/refreshes
but this API doesn't seem to work for updating the model's roles.
Does anyone know the correct method to update model roles from a Logic App?
Thanks for any suggestions

If you don't necessarily need to use the Logic App for this, I think it might be possible using Azure Automation and the PowerShell cmdlets for managing Azure Analysis Services (see the runbook sketch after these links):
https://learn.microsoft.com/en-us/azure/analysis-services/analysis-services-refresh-azure-automation
https://learn.microsoft.com/en-us/azure/analysis-services/analysis-services-powershell
https://learn.microsoft.com/en-us/powershell/module/sqlserver/Add-RoleMember?view=sqlserver-ps
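For example, here is a minimal Azure Automation runbook sketch using Add-RoleMember from the SqlServer module (the server, model, role, member, and credential asset names are hypothetical):

# Hypothetical runbook sketch: add a user to a tabular model role.
Import-Module SqlServer

# Credential asset stored in the Automation account (hypothetical name).
$cred = Get-AutomationPSCredential -Name 'AasServiceAccount'

Add-RoleMember -Server 'asazure://westeurope.asazure.windows.net/myserver' `
    -Database 'MyModel' -RoleName 'Readers' `
    -MemberName 'newuser@contoso.com' -Credential $cred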

One alternative approach might be to have fixed AD groups as members of the tabular model roles and add/remove members from those AD groups. The tabular model roles would then not need to be refreshed; it would simply be a matter of adding or removing members from the AD groups as part of your governance process.
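A minimal sketch of that approach with the Az PowerShell module (group and user names are hypothetical, and parameter names may vary between Az versions):

# Hypothetical sketch: manage membership of the AD group that sits in the model role.
Connect-AzAccount

$group = Get-AzADGroup -DisplayName 'AAS-Model-Readers'         # hypothetical group
$user  = Get-AzADUser -UserPrincipalName 'newuser@contoso.com'  # hypothetical user

# Grant access by adding the user to the group...
Add-AzADGroupMember -TargetGroupObjectId $group.Id -MemberObjectId $user.Id

# ...and revoke it later by removing them again.
Remove-AzADGroupMember -GroupObjectId $group.Id -MemberObjectId $user.Id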
A second approach would be to use dynamic row-level security. Adding records to an Azure SQL DB table is perfectly possible with Logic Apps and could be used to drive security, depending on your requirements. You can then refresh your security dimension with the Logic App. See here for more details:
https://learn.microsoft.com/en-us/power-bi/desktop-tutorial-row-level-security-onprem-ssas-tabular
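As a sketch, the row that drives security could also be inserted from PowerShell with Invoke-Sqlcmd (server, database, table, and column names are hypothetical):

# Hypothetical sketch: add a row to the table that drives dynamic row-level security.
Import-Module SqlServer

Invoke-Sqlcmd -ServerInstance 'myserver.database.windows.net' -Database 'MyDb' `
    -Username 'sqladmin' -Password $password `
    -Query "INSERT INTO dbo.UserSecurity (UserPrincipalName, RegionKey)
            VALUES ('newuser@contoso.com', 42);"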
To answer your question directly, however: the Azure Analysis Services REST API is useful but not fully featured, i.e. it does not contain all possible operations for tabular models or the service. Another missing example I found was backups: although it is possible to trigger a pause or resume of the service, it is not possible to trigger a backup of a tabular model via the REST API. I do not believe it is possible to alter role members, or at least the operation is not listed in the REST API, though I am happy to be corrected if I am wrong. To be more specific, roles are not mentioned in the list of available objects which can be passed into the Objects array of POST /refreshes; table and partition are the only ones I'm aware of, as in the sketch below.
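For reference, this is roughly what the documented refresh call looks like; note that only table and partition objects can appear in the Objects array (server, model, and table names are hypothetical, and $token is a previously acquired AAD token):

# Hypothetical sketch of the POST /refreshes call.
$body = @{
    Type       = 'Full'
    CommitMode = 'transactional'
    Objects    = @(
        @{ table = 'DimCustomer'; partition = 'DimCustomer' }
        @{ table = 'DimDate' }
    )
} | ConvertTo-Json -Depth 3

Invoke-RestMethod -Method Post `
    -Uri 'https://westeurope.asazure.windows.net/servers/myserver/models/MyModel/refreshes' `
    -Headers @{ Authorization = "Bearer $token" } `
    -ContentType 'application/json' -Body $body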
There are also no examples on the MS github site:
https://github.com/microsoft/Analysis-Services
Finally, consider calling TMSL via PowerShell in an Azure Function, which you can call from Azure Data Factory.
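As a sketch of that last option, a TMSL createOrReplace on the role object can rewrite its membership in one shot; run it with Invoke-ASCmd from the SqlServer module (server, database, role, and member names are hypothetical):

# Hypothetical sketch: overwrite the membership of a model role with TMSL.
Import-Module SqlServer

$tmsl = @'
{
  "createOrReplace": {
    "object": { "database": "MyModel", "role": "Readers" },
    "role": {
      "name": "Readers",
      "modelPermission": "read",
      "members": [
        { "memberName": "user1@contoso.com" },
        { "memberName": "user2@contoso.com" }
      ]
    }
  }
}
'@

Invoke-ASCmd -Server 'asazure://westeurope.asazure.windows.net/myserver' `
    -Credential (Get-Credential) -Query $tmsl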
HTH

Related

How to handle a Custom Connector that uses header authentication in each API call?

I have an Azure Logic App that uses a Custom Connector that I've made by importing a Postman Collection. The RESTful API that my connector calls requires three authentication headers in each request: UserName, Secret, ApiIntegrationCode. Because it takes three specifically named parameters, I don't believe that any of the authentication presets will work for me.
I know that I can protect the inputs and outputs of various connectors. I have been entertaining the idea of storing the sensitive information in a SQL table that I query in each run and storing the values in variables that I pass to each of my custom connector's API calls.
Would this be a viable way of protecting sensitive data from being seen by people that may have access to my logic app? What is the most secure way I can pass these headers in each call?
There are not too many options within a (consumption) Logic App in this regard.
Your options with Logic Apps
A first step in the right direction is to put your sensitive information into an Azure Key Vault and use the corresponding connector in your Logic App to retrieve the data from there. This is easier to implement and more secure than querying a SQL table for this purpose.
The second thing you can do is activate secure inputs for the connectors that make the API calls. This makes sure that the sensitive information passed to these connectors is obfuscated in the run history of your Logic App and in connected services like Azure Log Analytics.
The problem with this approach is that anyone who has more than just read permissions on your Logic App can simply deactivate the secure inputs setting or create a step that dumps the content of your Key Vault. You can use RBAC to control access to your Logic App, but that of course means administrative overhead.
Alternative: API Management Service
If you absolutely need to allow other developers to change the Logic App without exposing API secrets to them, you might consider using some sort of middle tier to communicate with the API. Azure API Management Service (APIM) is one of the options here.
You would manage your sensitive information in a Key Vault and inject it via "Named Values" into your APIM instance. You can then add your API as a backend in APIM and expose it towards your Logic App.
The advantage here is that you can secure access to your API with APIM subscription keys that you can cycle frequently. You can also restrict access to the original API to only those calls that need to be available to the Logic App.
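As a sketch, a secret named value can be seeded from PowerShell, assuming a recent Az.ApiManagement module (resource group, service, and value names are hypothetical):

# Hypothetical sketch: store an API secret as a secret named value in APIM.
$ctx = New-AzApiManagementContext -ResourceGroupName 'my-rg' -ServiceName 'my-apim'

New-AzApiManagementNamedValue -Context $ctx -NamedValueId 'api-secret' `
    -Name 'ApiSecret' -Value $apiSecret -Secret

The named value can then be referenced as {{ApiSecret}} in a set-header policy on the backend API, so the secret never appears in the Logic App itself.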
Whether APIM is right for you depends on your use case, as it comes at a price. Even the Developer tier costs about $50/month: https://azure.microsoft.com/en-us/pricing/details/api-management/
Alternative: Azure Function
You can use a simple Azure Function that serves as a middle tier between your Logic App and your API. This function can be configured to pull the sensitive data from a Key Vault and can also be secured via a function access key that you can renew on a regular basis.
This is a dirt-cheap option if you are running the functions on a consumption plan: https://azure.microsoft.com/en-us/pricing/details/functions/
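A minimal sketch of such a function in PowerShell (the header names are from your question; the app setting names and target URL are hypothetical, with the settings ideally backed by Key Vault references):

# run.ps1 - hypothetical sketch of an HTTP-triggered middle-tier function.
using namespace System.Net
param($Request, $TriggerMetadata)

# The three secrets are read from app settings (hypothetical names),
# which can be backed by Key Vault references.
$headers = @{
    UserName           = $env:API_USERNAME
    Secret             = $env:API_SECRET
    ApiIntegrationCode = $env:API_INTEGRATION_CODE
}

# Forward the call to the real API (hypothetical URL) and relay the response.
$result = Invoke-RestMethod -Uri 'https://api.example.com/resource' -Method Get -Headers $headers

Push-OutputBinding -Name Response -Value ([HttpResponseContext]@{
    StatusCode = [HttpStatusCode]::OK
    Body       = $result
})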

Azure Split/Merge Service, is it still relevant?

I have managed to get the C# and DB setup working using ListMappings. However, when I try to deploy the split/merge tool to an Azure classic cloud service, it states 'The requested VM tier is currently not available in East US for this subscription. Please try another tier or deploy to a different location.' We tried a few other regions with the same result. Do you know if there is a workaround or an updated version? Is the split/merge service even still relevant? Has anyone got this service to run on Azure lately?
https://learn.microsoft.com/en-us/azure/azure-sql/database/elastic-scale-overview-split-and-merge
The answer to the question of whether it is still relevant is, in my opinion... no. Split/merge is no longer relevant with the maturation of elastic pools. Elastic pools with one database per tenant seem the sustainable way to implement multi-tenancy with legacy code. The initial plan was to add keys to each of our tables to have multiple tenants per database. Elastic pools give us the same flexibility without having to make breaking changes to our existing code.
Late post here, but we are implementing ElasticScale for a client to split ~50 clients into a database-per-tenant model. I don't think the SplitMerge tool will be used over the long term, just for the initial data migration from one db to many shards, but it has been handy for that purpose. We are using the ElasticScale SDK to allow a single API to route queries to the appropriate shard(s) based on sharding key. Happy to compare notes with you if you are still working on this.

How do I get the list of providers from Dynamics 365 FO via a .NET Core API?

So far, using Azure Service Bus, I have managed to get data from purchase orders, payment logs, invoices, etc.
But my API was waiting for one of these events to happen. Now we need to do an active search and show all providers registered in Dynamics.
So far, the ways we found to get information from Dynamics were MS Flow, Azure Logic Apps, and Azure Service Bus. But they all depend on an event within Dynamics 365 FO in order for us to receive information.
Instead of using the tools in Azure (which are fantastic in some scenarios), have you considered interacting directly with the OData endpoints exposed by the D365FO application? You can create custom data entities using MorphX/X++ and expose them as RESTful OData endpoints (returning JSON). There are also hundreds of out-of-the-box data entities that are exposed without needing custom ones created. A sketch of querying one of them follows the links below.
Learn more here: https://learn.microsoft.com/en-us/dynamics365/fin-ops-core/dev-itpro/data-entities/odata
and here: https://learn.microsoft.com/en-us/dynamics365/fin-ops-core/dev-itpro/data-entities/build-consuming-data-entities
and code samples here: https://github.com/microsoft/Dynamics-AX-Integration
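As a sketch, querying one of the out-of-the-box entities only takes an AAD token and an HTTP GET (the environment URL, tenant, app registration, and entity names below are hypothetical - check the entity list in your own environment):

# Hypothetical sketch: query a D365FO OData entity with a client-credentials token.
$tokenBody = @{
    grant_type    = 'client_credentials'
    client_id     = $appId        # AAD app registered for the D365FO environment
    client_secret = $appSecret
    resource      = 'https://myorg.operations.dynamics.com'   # hypothetical environment URL
}
$token = (Invoke-RestMethod -Method Post `
    -Uri "https://login.microsoftonline.com/$tenantId/oauth2/token" `
    -Body $tokenBody).access_token

# VendorsV2 is one of the standard entities; swap in whichever entity you need.
$vendors = Invoke-RestMethod -Uri 'https://myorg.operations.dynamics.com/data/VendorsV2?$top=100' `
    -Headers @{ Authorization = "Bearer $token" }
$vendors.value | Select-Object -First 5   # inspect the returned provider records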

Azure Machine Learning (AML) Webservice REST API with Multiple endpoints

I've been working on developing an API to serve a machine learning model using Azure Machine Learning (AML) webservice deployments on a Kubernetes target as outlined here: https://learn.microsoft.com/en-us/azure/machine-learning/service/how-to-deploy-and-where#prepare-to-deploy
My particular use case requires more than simple scoring of data through the model. The API needs to have multiple endpoints to perform different actions related to the model. For example, an endpoint to upload data, an endpoint to delete data, an endpoint to list existing data, an endpoint to score previously uploaded data, an endpoint to change preprocessing parameters, etc...
I can build all of this logic, but I am struggling with the fact that AML web services only provide one endpoint (the service URI ending in "/score"). Is there a way to add more endpoints to an AML service? For example, I would like users to be able to POST, GET, DELETE, and PUT "/data", GET "/predictions", and POST, GET, DELETE, and PUT "/parameters", etc.
Is there a way to do this in AML or is this not the right tool for what I am trying to accomplish? Is there a better solution within Azure that is more suited for my needs?
Thank you!
Azure ML allows controlled rollout/traffic splitting, but doesn't directly support your API design.
I might need to know more about your use case to make a recommendation. Are you looking at implementing incremental learning? What is the motivation for separate endpoints?
-Andon
Your proposal sounds like a stateful web server, which is more than a REST API service. For example, you would need logic to maintain IDs for the data: if there are two POST /data calls with different payloads, DELETE /data needs to operate on the proper one. This is much more than a single, performance-optimized machine learning service.
I would recommend creating a separate server with all of these logic pieces that only reaches the Azure Machine Learning service when it needs to. You could also build a cache into your service so that you only call the Azure ML service when new data comes in or the local cache has expired. It will also save you money on Azure :-)
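A minimal sketch of that caching idea, assuming the service URI and key come from your AML deployment (the setting names and the naive in-memory cache are hypothetical):

# Hypothetical sketch: call the AML scoring endpoint only on a cache miss.
$scoringUri = $env:AML_SCORING_URI   # e.g. the /score URI from your deployment
$apiKey     = $env:AML_KEY           # hypothetical app setting

$cache = @{}   # naive in-memory cache keyed by a hash of the payload

function Get-Prediction([string]$payloadJson) {
    $hash = [BitConverter]::ToString(
        [Security.Cryptography.SHA256]::Create().ComputeHash(
            [Text.Encoding]::UTF8.GetBytes($payloadJson)))
    if (-not $cache.ContainsKey($hash)) {
        $cache[$hash] = Invoke-RestMethod -Uri $scoringUri -Method Post `
            -Body $payloadJson -ContentType 'application/json' `
            -Headers @{ Authorization = "Bearer $apiKey" }
    }
    return $cache[$hash]
}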

Azure Mobile Services with existing database + table controller

I have created an Azure Mobile Service using the Azure portal and selected an existing database.
Now I have a few questions:
Whenever I run the application, the database gets cleared, because ClearDatabaseSchemaIfModelChanges is configured as the database initializer in WebApiConfig. Can I use the database-first approach with Entity Framework?
My existing database's default schema is dbo, but the service creates a new schema (named after the service) for all tables. How can I overcome this?
Finally, a TableController is tightly bound to a table. But the output required in my case is a bit complex, as I need data from multiple tables joined together. I also want to add some logic before returning the result.
A tightly bound table also requires some common columns like CreatedAt, UpdatedAt, Id, and Deleted, since it inherits from the EntityData class. But those columns are not present in my existing tables.
Help me out!!
You can do the following:
1) Create a custom API for your operations (those that are not so tightly connected to the HTTP verbs and the table)
2) Manipulate the DB manually - it is just an Azure SQL database that can be managed with SQL Server Management Studio, Visual Studio, or queries.
So, in short, if you need the flexibility of doing what you need with the database, you can do it, but Mobile Services is a service, so there are some limitations.
Hope that helps. If not, please elaborate on the issue.
First, you should be using Azure Mobile Apps instead of Azure Mobile Services.
You can follow what Alexandr has mentioned, or you can stick with the Mobile App by extending your existing tables to add the required columns (especially if you want to do sync).
The TableControllers are just plain Web API controllers; you can certainly extend them.
Have a look at this link for a more guided walkthrough on how to reuse an existing database.
