Automating Azure Machine Learning

Is there a way of automating the calls to the Azure Machine Learning Service (AML)?
I've created the web service from AML. Now I have to make the calls in an automated way. I'm trying to build a system that connects to a Raspberry Pi for sensor data and gets a prediction from the ML service to be saved along with the data itself.
Is there something in Azure to automate this or should I do it within the application?

I'm assuming you've created the web service from the experiment and are asking about consuming it. You can consume the web service from anything that can make an API call to the endpoint (a minimal example of such a call is sketched below). I don't know the exact architecture of your solution, but take a look at this, as it might suit your scenario.
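For illustration, here is a minimal Python sketch of such a call against a classic Azure ML Studio request-response endpoint; the URL, API key, and column names are placeholders you would copy from your own service's consumption page, so treat this as a pattern rather than a drop-in script:
import json
import requests

# Placeholders: copy these from your web service's consumption/API help page.
URL = "https://<region>.services.azureml.net/workspaces/<workspace-id>/services/<service-id>/execute?api-version=2.0"
API_KEY = "<your-api-key>"

def get_prediction(temperature, humidity):
    """Send one row of sensor data to the AML web service and return the scored result."""
    body = {
        "Inputs": {
            # "input1" and the column names must match your experiment's input schema.
            "input1": {
                "ColumnNames": ["temperature", "humidity"],
                "Values": [[temperature, humidity]],
            }
        },
        "GlobalParameters": {},
    }
    headers = {"Content-Type": "application/json", "Authorization": "Bearer " + API_KEY}
    response = requests.post(URL, headers=headers, data=json.dumps(body))
    response.raise_for_status()
    return response.json()

print(get_prediction(21.5, 0.43))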
Stream Analytics on Azure has a new feature called Functions (just a heads-up, it's still in preview) that can automate the usage of ML services deployed from your account. Since you are trying to gather data from IoT devices, you could use Event Hubs or IoT Hub to ingest the data and process it with Stream Analytics, and during that processing call the web service as a Function in SA to get ML results on the fly.
Usage is relatively simple if you are familiar with Stream Analytics or SQL queries in general. This link shows the step-by-step implementation, and the usage looks like this:
WITH subquery AS (
    -- "webservicealias" is the alias given to the ML function in the SA job
    SELECT text, "webservicealias"(text) AS result FROM input
)
SELECT text, result.[Score]
INTO output
FROM subquery
Hope this helps!
Mert

You can also schedule this automatically using the Azure ML PowerShell commands and any task scheduler:
PowerShell for Azure ML - https://github.com/hning86/azuremlps - and its usage is described here: https://github.com/hning86/azuremlps#invoke-amlwebservicerrsendpoint
Task Scheduler for PowerShell - http://www.metalogix.com/help/Content%20Matrix%20Console/SharePoint%20Edition/002_HowTo/004_SharePointActions/012_SchedulingPowerShell.htm

Related

Is it possible to Monitor Azure Integration Runtime?

I am running a few data pipelines in Azure Data Factory, and they use the Azure integration runtime for compute.
I am trying to monitor the CPU/memory usage that the pipelines consume on the Azure IR.
I have checked in Azure Monitor, but I think the CPU/memory metrics there are only for the self-hosted integration runtime.
Also, with diagnostic settings enabled, I tried to verify the details in the logs too, but these details are not available.
Can anyone help with more options?
If you are referring to the Azure AutoResolveIntegrationRuntime, then no, there is not, and this is why (from https://www.cathrinewilhelmsen.net/integration-runtimes-azure-data-factory/):
Microsoft has massive elastic pools across the various locations/regions where Azure is offered, and at runtime ADF determines what pool/hardware it will use to perform the pipeline activities. So there is really no way (and no need) to monitor the Azure AutoResolve IR. But if you are interested in monitoring self-hosted IRs, there are many ways to do it.
One simple and straightforward way to do it is by creating Azure dashboards from the Metrics section of Azure Monitor, which gives a good visual representation of usage/resources over time.
You can visualize the integration runtime itself (CPU/memory) as well as the Azure VM that is hosting the integration runtime. On top of this, you can go into the Metrics dashboard to set up alerts when certain conditions are met (e.g. average CPU usage is over 75% for the last 15 minutes). These alerts can send you a text message or email, and can even do things as complicated as triggering a Logic App or webhook for automated scaling up/out, advanced notifications, etc.
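If you prefer pulling those metrics programmatically rather than through dashboards, here is a hedged sketch using the azure-monitor-query Python package; the resource ID is a placeholder, and the two metric names are my assumption of what ADF exposes for self-hosted IRs, so verify them against the Metrics blade first:
from datetime import timedelta
from azure.identity import DefaultAzureCredential
from azure.monitor.query import MetricsQueryClient

# Placeholder: the full resource ID of your Data Factory.
RESOURCE_ID = ("/subscriptions/<sub-id>/resourceGroups/<rg>"
               "/providers/Microsoft.DataFactory/factories/<factory-name>")

client = MetricsQueryClient(DefaultAzureCredential())

# Metric names are assumptions; check the Metrics blade for the exact names.
result = client.query_resource(
    RESOURCE_ID,
    metric_names=["IntegrationRuntimeCpuPercentage", "IntegrationRuntimeAvailableMemory"],
    timespan=timedelta(hours=24),
)

for metric in result.metrics:
    for series in metric.timeseries:
        for point in series.data:
            if point.average is not None:
                print(metric.name, point.timestamp, point.average)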
This, in my opinion, is the best way to monitor, but another option is to call the Azure Data Factory REST API to get monitoring data for the integration runtimes:
POST https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DataFactory/factories/{factoryName}/integrationRuntimes/{integrationRuntimeName}/monitoringData?api-version=2018-06-01
But this method would require you to incrementally pull in the data, store it, parse it, and then visualize it or act upon it, when all of that is already built in for you. Sometimes it's fun to recreate wheels, though.
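For completeness, a sketch of calling that endpoint from Python; the subscription, resource group, factory, and IR names are placeholders, and the token is acquired via azure-identity (so some credential, e.g. az login, must be configured):
import requests
from azure.identity import DefaultAzureCredential

# Placeholders for your own subscription, resource group, factory, and IR names.
URL = ("https://management.azure.com/subscriptions/<sub-id>"
       "/resourceGroups/<rg>/providers/Microsoft.DataFactory"
       "/factories/<factory-name>/integrationRuntimes/<ir-name>"
       "/monitoringData?api-version=2018-06-01")

# Acquire a bearer token for Azure Resource Manager.
token = DefaultAzureCredential().get_token("https://management.azure.com/.default").token

response = requests.post(URL, headers={"Authorization": "Bearer " + token})
response.raise_for_status()
print(response.json())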
Yes, it is possible to monitor the Azure integration runtime.
"Pipeline Runs" in the Monitoring section has the option to check CPU utilization filtered by pipeline, integration runtime, and more. You can find how it's done here.

Azure Data Factory (ADF) vs Azure Functions: How to choose?

Currently we are using blob-triggered Azure Functions to move JSON data into Cosmos DB. We are planning to replace the Azure Functions with an Azure Data Factory (ADF) pipeline.
I am new to Azure Data Factory (ADF), so I'm not sure whether an ADF pipeline would be the better option.
Though my answer is a bit late, I would like to add that I would not recommend replacing your current setup with ADF. Reasons:
Cost: ADF is too expensive; it costs considerably more than Azure Functions.
Custom logic: ADF is not built to perform cleansing logic or run custom code. Its primary goal is data integration from external systems using its vast connector pool.
Latency: ADF has much higher latency due to the large overhead of its job framework.
Based on your requirements, Azure Data Factory is the perfect option. You could follow this tutorial to configure a Cosmos DB output and an Azure Blob Storage input.
The advantage over Azure Functions is that you don't need to write any custom code unless data cleansing is involved. Azure Data Factory is the recommended option here, and even if you need an Azure Function for other purposes, you can add it within the pipeline.
The fundamental use of Azure Data Factory is data ingestion. Azure Functions are serverless (Function as a Service) and are best used for short-lived executions; functions that run for many seconds are far more expensive. Azure Functions are good for event-driven microservices. For data ingestion, Azure Data Factory is the better option, as its running cost for huge volumes of data will be lower than Azure Functions. You can also integrate Spark processing pipelines into ADF for more advanced ingestion pipelines.
Moreover, it depends on your situation. Azure Functions are serverless, lightweight processes meant for quick responses to an event, rather than the volumetric responses of batch processes.
So, if your requirement is to respond quickly to an event with a little information, stay with Azure Functions; if you need a batch process, switch to ADF.
Cost
Let's calculate the cost (assuming 4 DIUs at $0.25 per DIU-hour):
If your file is large (7.514 GB, a copy duration of 43:51 hours ≈ 43.867 h):
4 DIU * 43.867 h * 0.25 $/DIU-h = $43.867
$43.867 / 7.514 GB = 5.838 $/GB
If your file is small (2.497 MB, taking about 45 seconds, billed as 1/60 h):
4 DIU * 1/60 h * 0.25 $/DIU-h = $0.0167
2.497 MB / 1024 = 0.00244 GB
$0.0167 / 0.00244 GB = 6.844 $/GB
So the small file actually costs more per GB moved than the large one.
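If you want to plug in your own numbers, here is a tiny Python helper encoding the formula above; the 4-DIU default and $0.25/DIU-hour rate come from the figures in this answer and may differ by region or over time:
def copy_cost_per_gb(size_gb, duration_hours, dius=4, rate_per_diu_hour=0.25):
    """Cost per GB for an ADF copy activity: DIUs * hours * rate / GB moved."""
    return dius * duration_hours * rate_per_diu_hour / size_gb

# Large file: 7.514 GB over ~43.867 hours -> about $5.84/GB
print(copy_cost_per_gb(7.514, 43.867))
# Small file: 2.497 MB billed as one minute -> about $6.84/GB
print(copy_cost_per_gb(2.497 / 1024, 1 / 60))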
Scale
The maximum number of instances an Azure Function can scale out to is 200.
ADF can run 3,000 concurrent external activities, and in my test only 1,500 copy activities were running in parallel. (This test wasted a lot of money.)

The right service to listen to IoT Hub and send queries to Azure SQL

I'd like to build a small solution on Azure for practice. I'd be sending data from some devices using IoT Hub, and what I need is some way to interpret this data and make the appropriate queries to Azure SQL.
Basically I would need a way to have my program running all the time, able to:
listen to events from IoT Hub
interpret event information and save/get data to/from the database
send a message to some device using IoT Hub
Which service would be good for that? Would I be able to use Entity Framework?
In an ideal solution I'd create a C# program to do what I need and have it running in Azure, waiting for events from IoT Hub and having access to my database - is that even possible?
I'm very new (rather, completely new) to cloud solutions, so I'd be really grateful for any advice. Currently I feel completely lost in all these Azure services.
There is quite a lot of documentation on Azure IoT and how one can architect an IoT solution.
The IoT documentation is the obvious first step to get an overview of what Azure offers. There are some nice 'Getting Started' walkthroughs as well.
Take a look at this IoT reference architecture; it's quite helpful for getting an overview.
There are tons of links and interesting examples for Azure IoT. Just google around.

How to pass input to an Azure Machine Learning experiment from an app (for example a console app)

I'm trying to build a kind of WebJob application that runs periodically and makes predictions with an Azure Machine Learning Studio experiment. After that I want to get the result of the experiment and do something with it in my console application. What is the best way to do this in Azure with Machine Learning, or with some similar tooling for predicting from data series?
You can try using Azure Data Factory to create a Machine Learning pipeline or use Azure ML Studio's Predictive Web Services.
With Azure Data Factory
Follow this link for details. Azure Data Factory implementations may seem difficult at first, but they work great with Azure ML experiments.
Azure Data Factory can run your ML experiment on a schedule or one-off at a specified time (I believe you can currently only schedule in the UTC timezone) and monitor it through a dashboard (which is pretty cool).
As an example you can look at ML Batch Execution. I used this in one of our implementations (we do have latency issues, but are trying to solve that).
If you want to use the experiment directly from your console application, create a Predictive Web Service out of your ML experiment, details here; a sketch of consuming it on a timer follows.
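Here is a minimal sketch of that console-side polling pattern, assuming a published Predictive Web Service; the endpoint URL, API key, and input schema are placeholders copied from your service's consumption page, and the one-hour interval is arbitrary:
import json
import time
import requests

# Placeholders from your Predictive Web Service's consumption page.
URL = "https://<region>.services.azureml.net/workspaces/<workspace-id>/services/<service-id>/execute?api-version=2.0"
API_KEY = "<your-api-key>"

def score(row):
    """Call the Predictive Web Service with one row of input and return the result."""
    body = {"Inputs": {"input1": {"ColumnNames": ["value"], "Values": [row]}},
            "GlobalParameters": {}}
    headers = {"Content-Type": "application/json",
               "Authorization": "Bearer " + API_KEY}
    resp = requests.post(URL, headers=headers, data=json.dumps(body))
    resp.raise_for_status()
    return resp.json()

while True:
    result = score([42.0])   # replace with your real data series
    print(result)            # do something with the prediction here
    time.sleep(3600)         # run once an hour; adjust to your schedule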
I couldn't exactly understand your use case, so I posted two alternatives that should help. Hope this leads you to a better solution/approach.

Pulling data from Stream Analytics to Azure Machine Learning

I'm working on an IoT telemetry project that receives humidity and weather pollution data from different sites in the field. I will then apply machine learning to the collected data. I'm using Event Hubs and Stream Analytics. Is there a way of pulling the data into Azure Machine Learning without the hassle of writing an application to get it from Stream Analytics and push it to the AML web service?
Stream Analytics has a feature called Functions. You can call any web service you've published using AML from within Stream Analytics and apply it in your Stream Analytics query. Check this link for a tutorial.
An example workflow in your case would be the following:
Telemetry arrives and reaches Stream Analytics.
Stream Analytics (SA) calls the Machine Learning function to apply it to the data.
SA redirects the results to the output accordingly; here you can use Power BI to create a predictions dashboard.
Another way would be using R; here's a good tutorial showing that: https://blogs.technet.microsoft.com/machinelearning/2015/12/10/azure-ml-now-available-as-a-function-in-azure-stream-analytics/
It is more work, of course, but it can give you more control, since you control the code.
Yes - this is actually quite easy, as it is well supported by ASA. You can call a custom Azure ML function from your ASA query once you create that function from the portal. See the following tutorial on how to achieve something like this.
