Retrain a classification model automatically based on an updated data set - Azure

We have created an experiment in Azure ML Studio to predict some scheduling activities based on system data and user data. The system data consists of CPU time, heap usage, and other system parameters, while the user data includes the user's active sessions and some user-specific data.
Our experiment works fine and returns results quite close to what we expect, but we are struggling with the following:
1) Our experiment does not consider updated data when training its models.
2) Every time, we have to upload the data and retrain the models manually.
I wonder whether it is possible to feed live data into Azure experiments using web services or an Azure DB. We are trying to update the data in a CSV file that we have created in Azure Storage; that would probably solve our first issue.
This updated data should then be used to retrain the model automatically on a periodic basis.
It would be great if someone could help us out with this.
Note: We are consuming our model through the web services created with Azure ML Studio.

Step 1: Create two web services with Azure ML Studio (one for the training model and one for the predictive model).
Step 2: Create an endpoint for each web service via the Manage Endpoints link in Azure ML Studio.
Step 3: Create two new connections in Azure Data Factory (find Azure ML on the Compute tab) and copy the Endpoint Key and API Key that you will find under the Consume tab of the endpoint configuration you created in Step 2 (Endpoint Key = Batch Requests key, API Key = Primary Key).
Set Disable Update Resource for the training-model endpoint.
Set Enable Update Resource for the predictive-model endpoint (Update Resource Endpoint = Patch key).
Step 4: Create a pipeline with two activities (ML Batch Execution and ML Update Resource).
Set the AML linked service for the ML Batch Execution activity to the connection with Update Resource disabled.
Set the AML linked service for the ML Update Resource activity to the connection with Update Resource enabled.
Step 5: Set the web service inputs and outputs (a sketch of the resulting pipeline is shown below).
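For reference, here is a minimal sketch of what the pipeline from Steps 4-5 might look like, written as a Python dict mirroring the Data Factory pipeline JSON. AzureMLBatchExecution and AzureMLUpdateResource are the ADF activity types behind ML Batch Execution and ML Update Resource; the linked service names, blob paths, and trained model name below are placeholders to replace with your own.

import json

# Sketch of an ADF pipeline definition; names and paths are placeholders.
pipeline = {
    "name": "RetrainAndUpdatePipeline",
    "properties": {
        "activities": [
            {
                # Calls the training web service (Update Resource disabled).
                "name": "RetrainModel",
                "type": "AzureMLBatchExecution",
                "linkedServiceName": {
                    "referenceName": "AzureMLTrainingService",
                    "type": "LinkedServiceReference",
                },
                "typeProperties": {
                    "webServiceInputs": {
                        "input1": {
                            "linkedServiceName": {
                                "referenceName": "BlobStorageService",
                                "type": "LinkedServiceReference",
                            },
                            "filePath": "training/updated-data.csv",
                        }
                    },
                    "webServiceOutputs": {
                        "output1": {
                            "linkedServiceName": {
                                "referenceName": "BlobStorageService",
                                "type": "LinkedServiceReference",
                            },
                            "filePath": "training/retrained-model.ilearner",
                        }
                    },
                },
            },
            {
                # Patches the predictive endpoint (Update Resource enabled)
                # with the freshly trained model once retraining succeeds.
                "name": "UpdateModel",
                "type": "AzureMLUpdateResource",
                "dependsOn": [
                    {"activity": "RetrainModel",
                     "dependencyConditions": ["Succeeded"]}
                ],
                "linkedServiceName": {
                    "referenceName": "AzureMLPredictiveService",
                    "type": "LinkedServiceReference",
                },
                "typeProperties": {
                    "trainedModelName": "Trained Model",
                    "trainedModelLinkedServiceName": {
                        "referenceName": "BlobStorageService",
                        "type": "LinkedServiceReference",
                    },
                    "trainedModelFilePath": "training/retrained-model.ilearner",
                },
            },
        ]
    },
}

print(json.dumps(pipeline, indent=2))

A schedule trigger on this pipeline then gives you the periodic, automatic retraining asked about above.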

You need to use Azure Data Factory to retrain the ML model.
Create a pipeline with the ML Batch Execution and ML Update Resource activities; to call your ML model, configure the endpoint on the web service.
Here are some links to help you:
https://learn.microsoft.com/en-us/azure/data-factory/transform-data-using-machine-learning
https://learn.microsoft.com/en-us/azure/data-factory/update-machine-learning-models

Related

How to list all the model monitoring jobs on Vertex AI given the display name?

I have enabled the model monitoring job for my endpoint on the first pipeline run. On the first run, the first version of the model is created, and monitoring is enabled for it.
When I rerun the pipeline, a new version of the model is created. How do I enable the model monitoring job for the new model version through the Python SDK?
Or will it be enabled automatically, given that both models are deployed to the same endpoint?
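For the listing part of the question, a minimal sketch using the google-cloud-aiplatform SDK might look like the following; the project, region, and display name are placeholders, and whether monitoring automatically carries over to a new model version on the same endpoint is exactly what would need to be verified against the returned jobs.

from google.cloud import aiplatform

# Placeholder project/region -- replace with your own.
aiplatform.init(project="my-project", location="us-central1")

# List model deployment monitoring jobs matching a display name;
# list() accepts a standard Vertex AI filter expression.
jobs = aiplatform.ModelDeploymentMonitoringJob.list(
    filter='display_name="my-monitoring-job"'
)
for job in jobs:
    print(job.resource_name, job.state)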

How to modify Azure Analysis Services roles using a Logic App?

With Azure Data Factory I have built a pipeline to orchestrate the processing of my Azure Analysis Services model through a dedicated Logic App, as explained in this article, and it works properly.
Now, still using Azure Data Factory (through the Logic App), I would also like to update the list of users in specific roles.
In the article mentioned above, to process the Azure Analysis Services models, the Logic App calls a specific API with the following format:
https://<rollout>.asazure.windows.net/servers/<serverName>/models/<resource>/refreshes
but this API does not seem to work for updating the model's roles.
Does anyone know the correct method for updating model roles using a Logic App?
Thanks for any suggestions
If you don't necessarily need to use the Logic App for this, it might be possible using Azure Automation and the PowerShell cmdlets for managing Azure Analysis Services:
https://learn.microsoft.com/en-us/azure/analysis-services/analysis-services-refresh-azure-automation
https://learn.microsoft.com/en-us/azure/analysis-services/analysis-services-powershell
https://learn.microsoft.com/en-us/powershell/module/sqlserver/Add-RoleMember?view=sqlserver-ps
One alternative approach might be to use fixed AD groups as members of the tabular model roles and add or remove members from those AD groups. The tabular model roles would then never need to be refreshed; it would simply be a matter of adding or removing members from the AD groups as part of your governance process.
A second approach would be to use dynamic row-level security. Adding records to an Azure SQL DB table is perfectly possible with Logic Apps and could be used to drive security, depending on your requirements. You can then refresh your security dimension with the Logic App. See here for more details:
https://learn.microsoft.com/en-us/power-bi/desktop-tutorial-row-level-security-onprem-ssas-tabular
To answer your question directly, however: the Azure Analysis Services REST API is useful but not fully featured, i.e. it does not contain all possible operations for tabular models or the service. Another missing example I found was backups: although it is possible to trigger a pause or resume of the service, it is not possible to trigger a backup of a tabular model via the REST API. I do not believe it is possible to alter role members, or at least the operation is not listed in the REST API, although I am happy to be corrected if I am wrong. To be more specific, roles are not mentioned in the list of available objects that can be passed into the Objects array of POST /refreshes (e.g. here); table and partition are the only ones I am aware of.
There are also no examples on the MS GitHub site:
https://github.com/microsoft/Analysis-Services
Finally, consider calling TMSL via PowerShell in an Azure Function, which you can call from Azure Data Factory.
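As a rough sketch of the TMSL such a function might send (the database, role, and member names are hypothetical, and for Azure AD accounts the member entry typically carries an identityProvider of AzureAD):

import json

# Hypothetical TMSL createOrReplace command that rewrites a role's
# member list; executing it (e.g. via Invoke-ASCmd in PowerShell or a
# TOM client) replaces the role definition wholesale.
tmsl = {
    "createOrReplace": {
        "object": {"database": "MyTabularModel", "role": "Readers"},
        "role": {
            "name": "Readers",
            "modelPermission": "read",
            "members": [
                {"memberName": "user@contoso.com",
                 "identityProvider": "AzureAD"}
            ],
        },
    },
}

print(json.dumps(tmsl, indent=2))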
HTH

Trigger next step in Data Factory after tabular model refresh is complete

I have set up an Azure Automation account and webhook to process my Analysis Services database. I am calling it using a Web activity (POST method) in Azure Data Factory to trigger the cube refresh. The Web activity works fine, but it returns without waiting for the refresh to complete.
Now, I want to execute further steps only after the cube processing is complete. Is there a way to detect when the cube has been refreshed and then start the next set of activities in Data Factory?
After a lot of research, I was able to achieve this by using the Microsoft-recommended REST APIs to process my Analysis Services database and get the refresh status back.
Here are some helpful links below:
REST API: https://learn.microsoft.com/en-us/azure/analysis-services/analysis-services-async-refresh
GitHub References:
This link documents the inputs and steps to process the cube and wait until the refresh completes: https://github.com/furmangg/automating-azure-analysis-services/blob/master/README.md#processazureas
Code Repository: https://github.com/furmangg/automating-azure-analysis-services/tree/master/ADFv2
Unlike other solutions that leverage external services such as Azure Logic Apps or custom ADF .NET activities running in Azure Batch, this approach uses only built-in activities and depends on no external services other than Azure Analysis Services itself. So I changed my solution to not use an Azure Automation account or webhook to process the cube.
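As a rough illustration of the async refresh API flow (not the exact code from the repository above): the POST to /refreshes starts processing and returns a Location header, which you can poll until the refresh finishes. The server region, names, and bearer token below are placeholders.

import time
import requests

# Placeholders -- substitute your rollout/region, server, database, and
# a valid Azure AD bearer token for Azure Analysis Services.
BASE = "https://westus.asazure.windows.net/servers/myserver/models/mymodel"
HEADERS = {"Authorization": "Bearer <access-token>"}

# Start an asynchronous full refresh; the service replies 202 Accepted
# with a Location header pointing at the refresh operation.
resp = requests.post(f"{BASE}/refreshes", headers=HEADERS,
                     json={"Type": "Full", "CommitMode": "transactional"})
resp.raise_for_status()
status_url = resp.headers["Location"]

# Poll until the refresh leaves the in-progress state; only then move
# on to the downstream Data Factory activities.
while True:
    status = requests.get(status_url, headers=HEADERS).json()
    if status.get("status") not in ("notStarted", "inProgress"):
        break
    time.sleep(30)

print("Refresh finished with status:", status.get("status"))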
Feel free to get in touch if you need further details.
Hope this helps!!

How to pass input to an Azure Machine Learning experiment from an app (for example, a console app)

I'm trying to build a web-job-style application that runs periodically and makes predictions with Azure Machine Learning Studio. Afterwards, I want to get the result of the experiment and do something with it in my console application. What is the best way to do this in Azure with Machine Learning, or is there something similar for making predictions from data series?
You can try using Azure Data Factory to create a machine learning pipeline, or use Azure ML Studio's predictive web services.
With Azure Data Factory
Follow this link for details. Azure Data Factory implementations may seem difficult at first, but they work great with Azure ML experiments.
Azure Data Factory can run your ML experiment on a schedule or as a one-off at a specified time (I believe only the UTC timezone is supported right now) and lets you monitor it through a dashboard (which is pretty cool).
As an example, you can look at ML Batch Execution. I used this in one of our implementations (we do have latency issues, but we are trying to solve that).
If you want to use the experiment directly from your console application, create a predictive web service out of your ML experiment; details here.
I couldn't work out your exact use case, so I posted two alternatives that should help you. Hope this leads you to a better solution or approach.
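For the second alternative, here is a minimal sketch of calling a Studio predictive (request-response) web service from Python; the endpoint URL, API key, and input schema are placeholders that you would copy from your service's Consume page.

import requests

# Placeholders -- copy the real URL and API key from the web service's
# Consume / API help page in Azure ML Studio.
URL = ("https://ussouthcentral.services.azureml.net/workspaces/<workspace-id>"
       "/services/<service-id>/execute?api-version=2.0&details=true")
API_KEY = "<api-key>"

# Request body in the classic Studio request-response format; the
# column names and values must match your experiment's input schema.
body = {
    "Inputs": {
        "input1": {
            "ColumnNames": ["cpu_time", "heap_usage", "active_sessions"],
            "Values": [["0.42", "512", "17"]],
        }
    },
    "GlobalParameters": {},
}

resp = requests.post(URL, json=body,
                     headers={"Authorization": f"Bearer {API_KEY}"})
resp.raise_for_status()
print(resp.json()["Results"]["output1"])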

Automating Azure Machine Learning

Is there a way of automating the calls to the Azure Machine Learning service (AML)?
I've created the web service from AML. Now I have to make the calls in an automated way. I'm trying to build a system that connects to a Raspberry Pi for sensor data and gets a prediction from the ML service to be saved alongside the data itself.
Is there something in Azure to automate this, or should I do it within the application?
I'm assuming you've created the web service from the experiment and are asking about consuming it. You can consume the web service from anything that can make an API call to the endpoint. I don't know the exact architecture of your solution, but take a look at this as it might suit your scenario.
Stream Analytics on Azure has a feature called Functions (just a heads-up: it is still in preview) that can automate the usage of deployed ML services from your account. Since you are trying to gather data from IoT devices, you might use Event Hubs or IoT Hub to ingest the data and process it with Stream Analytics; during processing you can use the web service as a Function in SA to get on-the-fly ML results.
Usage is relatively simple if you are familiar with Stream Analytics or SQL queries in general. This link shows the step-by-step implementation; the usage looks like this:
WITH subquery AS (
    SELECT text, "webservicealias"(text) AS result FROM input
)
SELECT text, result.[Score]
INTO output
FROM subquery
Hope this helps!
Mert
You can also schedule this automatically using PowerShell commands and any task scheduler.
PowerShell for Azure ML: https://github.com/hning86/azuremlps - its usage is described here: https://github.com/hning86/azuremlps#invoke-amlwebservicerrsendpoint
Task Scheduler for PowerShell: http://www.metalogix.com/help/Content%20Matrix%20Console/SharePoint%20Edition/002_HowTo/004_SharePointActions/012_SchedulingPowerShell.htm
