Difference between Azure ML and Azure ML experimentation - azure

I am new to Azure ML and have a few doubts. Could anyone please clarify the questions listed below?
What is the difference between the Azure ML service and the Azure ML Experimentation service?
What is the difference between Azure ML Workbench and Azure ML Studio?
I want to use the Azure ML Experimentation service for building a few models and creating web APIs. Is it possible to do the same with ML Studio?
Also, the ML Experimentation service requires me to have Docker for Windows installed for creating web services. Can I create web services without using Docker?

I'll do my best to answer these questions and feel free to ask more questions. :)
What is the difference between the Azure ML service and the Azure ML Experimentation service?
Essentially, the Azure ML Service (which I may refer to as Azure ML Studio) uses a drag-and-drop interface to build out your workflow and test models. Azure ML Experimentation is a newer offering in the Azure Portal that hosts your models directly in Azure and offers a better way to manage them. Experimentation uses Azure ML Workbench to build out your models.
What is the difference between Azure ML workbench and Azure ML Studio?
The biggest difference is ML Studio has the drag and drop interface to build the workflow and models, whereas Workbench lets you use Python to programmatically build out your models. Workbench also includes a really nice and powerful way to clean your data from the app. In Studio you have some good modules to clean data, but I don't think it's as powerful as what you can do in Workbench.
EDIT: The Workbench application is deprecated and has been replaced by/upgraded to ML Services. The core functionality is unchanged, though.
I want to use the Azure ML Experimentation service for building a few models and creating web APIs. Is it possible to do the same with ML Studio?
I would actually say it's much easier to do this in ML Studio. The drag-and-drop interface is very intuitive, and it takes only a couple of clicks to create a web API that calls your model. As things stand at the time of this writing, deploying a model from Experimentation is more complex and involves using the Azure CLI.
Also, the ML Experimentation service requires me to have Docker for Windows installed for creating web services. Can I create web services without using Docker?
I'm not too familiar with the Docker parts of Workbench, but I believe you can create and deploy without using Docker. I believe it does require an Azure Model Management resource, though.
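For context, the web services these tools produce are typically driven by a small Python scoring script with an init/run pair. A minimal sketch of that pattern (the model file name and payload shape are assumptions):

```python
import json
import pickle

model = None

def init():
    # Runs once when the web service starts: load the trained model from disk.
    global model
    with open("model.pkl", "rb") as f:   # placeholder file name
        model = pickle.load(f)

def run(raw_data):
    # Runs on every request: parse the JSON payload and return predictions.
    data = json.loads(raw_data)["data"]  # assumed payload shape: {"data": [[...], ...]}
    predictions = model.predict(data)    # assumes a scikit-learn style model
    return json.dumps({"predictions": predictions.tolist()})
```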
I hope this helps and, again, feel free to ask more questions.

AML Experimentation is one of our many new ML offerings, which include data preparation, experimentation, model management, and operationalization. Workbench is a PREVIEW product that provides a GUI for some of these services, but it is really just an installer/wrapper for the CLI that is needed to run them. The services are Spark and Python based; other Python frameworks will work, and you can get a little hacky and call Java/Scala from Python. I'm not really sure what you mean by an "Azure ML Service"; perhaps you are referring to the operationalization service I mentioned above. It lets you quickly create new Python-based APIs using Docker containers, and it connects with the model management account to keep track of the lineage between your models and your services. All of these services are still in preview and may introduce breaking changes before GA release.
Azure ML Studio is an older product that is perhaps simpler for some of us (I am an engineer, not a data scientist). It offers a drag-and-drop experience, but it is limited in its data size to about 10 GB. This product is GA.
It is, but you need smaller data sizes, and the job flow is not Spark based. I use it for rapid PoCs. You will also have less control over the scalability of your scoring (batch or real time), because Studio is PaaS, compared to the newer service, which is more IaaS. I would recommend looking at the new service instead of Studio for most use cases.
The web services are completely based on Docker. Needing Docker for experimentation is more about running things locally, which I myself rarely do. But for the real-time service, everything you package is placed into a Docker container so it can be deployed to an ACS cluster.

Related

Download a trained ML Model from Azure ML studio to deploy on a standalone computer

I have set up an ML model in Azure ML Studio and I am able to use ML Studio's Web API to obtain predictions.
The key challenge with keeping the model hosted within Azure ML Studio is the client computer's internet dependency and the latency associated with each prediction.
I wanted to understand if there is any way to download a model created in Azure ML Studio and consume it locally (say, using a local .NET application).
This question was asked a few years back; some people believed it was possible but gave little detail.
When a model has been trained, I do see the option of 'Save as Trained Model', as shown in the snapshot, but I am not sure how this could eventually be downloaded to a local computer and consumed locally.
Looking forward to hearing potential solutions.
I think you are using the ML Studio classic version.
Classic ML Studio doesn't support code SDKs.
You could migrate to the newer Azure ML:
- Migrate datasets to Azure Machine Learning.
- Use the designer to rebuild experiments.
- Use the designer to redeploy web services.
Once you migrate, the model gets output to blob storage.
You can download the model from there, though I am not sure whether this will help you directly. Alternatively, you can use the trained model from the workspace by following the article here. The newer version supports the SDK: you will have to train and register the model in the ML Designer, after which you can consume it from the local machine.
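For illustration, a minimal sketch of pulling a registered model down and scoring it locally with the v1 azureml-core Python SDK (the workspace config, model name, file name, and pickle format are assumptions; the actual artifact depends on how the model was trained):

```python
import joblib
from azureml.core import Workspace, Model

# Authenticate against the workspace using a downloaded config.json.
ws = Workspace.from_config()

# Download the registered model's files to the local machine.
model = Model(ws, name="my-designer-model")   # hypothetical registered model name
model.download(target_dir=".", exist_ok=True)

# Load and score locally; file name and format depend on how the model was saved.
local_model = joblib.load("model.pkl")
print(local_model.predict([[5.1, 3.5, 1.4, 0.2]]))
```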

Azure Machine Learning (preview) to Customer Insights

I am trying to integrate MS Dynamics Customer Insights (CI) with the model I have built within the new Azure Machine Learning (designer). Currently, I see there is only an integration between CI and Azure Machine Learning studio (classic).
I have deployed my model behind a web service (REST) within the new Azure Machine Learning, but it is not getting picked up in CI. I am, however, able to score/generate predictions from the API using a Python script.
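For reference, a minimal sketch of the kind of Python call that scores against the endpoint (the URL, key, and payload schema below are placeholders):

```python
import json
import requests

scoring_uri = "https://<your-endpoint>/score"   # placeholder real-time endpoint URL
api_key = "<endpoint-key>"                      # placeholder key

headers = {
    "Content-Type": "application/json",
    "Authorization": f"Bearer {api_key}",
}
payload = {"data": [[34, "F", 1200.0]]}         # assumed input schema

response = requests.post(scoring_uri, headers=headers, data=json.dumps(payload))
print(response.status_code, response.json())
```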
Please recommend a way to integrate these two MS services or suggest an architecture where CI can pick up the results.
Here are some hands-on labs which walk through the complete lifecycle for integrating Dynamics 365 Customer Insights with Azure data services. One scenario seen in Lab 5 is integrating the results of an Azure Machine Learning model back into Customer Insights. That walkthrough should give you one good way of accomplishing that task.
I totally agree that there's a real opportunity for a more seamless integration between Azure ML and D365 (including CI). For me, I'm thinking about how cool it would be to have an ML Dataset automatically created from D365 data that could be used in the ML designer, after which the winning model could be registered and then made available inside D365 as a field that scores things in real time.
If you post an "idea" on the forum below, my team and I will upvote it!
https://experience.dynamics.com/ideas/

Is Service Fabric Appropriate for Simple Background Jobs?

We have a bunch of Windows services. We need to be able to continue to deploy our code as Windows services on premises, but we would like to deploy to Azure where appropriate. The goal is to manage less infrastructure. I'm not keen on deploying dozens of bits and pieces as Azure Functions, but I'm not entirely opposed to it either. Azure Batch / WebJobs are also an option. However, the long-term goal is to move all of our services over to an orchestrator like Service Fabric so that all the services can be deployed and orchestrated from one place. This is mainly a deployment consideration.
We will break the existing C# code into .NET Core class libraries and reference them from either Service Fabric hosted in Azure, on-premises Service Fabric, or an on-premises Windows service. Is Service Fabric an appropriate choice? Or is there a strong reason to run background jobs as Azure Batch / Functions / WebJobs?
The answer to the question is that we don't really need full-fledged orchestration right now, but it will become more important moving into the future. I have to balance being able to deploy all our code in one hit against the ease of ad hoc deployment that Azure Functions offers.
(Stateful) Services can be an excellent way to run background jobs. They offer the RunAsync entry point, in which you can run your job, check (and store) progress. SF really shines when multiple services collaborate on tasks, offering SF Remoting as a communication channel, with built-in retry support.
You can choose to containerize your software, which would free you from platform lock-in, but prevent you from using some platform features.
By automating delivery of services (CI/CD), you can deploy to any platform you choose. This is not something that is specific to SF.

What is the most covered way of scripting Azure resources?

I know of ARM templates, the REST SDK, the CLI, and the PowerShell cmdlets.
What I want to know is: which of these has the most extensive support for scripting resources without having to touch the (indeed very slow) Azure Portal?
I would also really like to know which one of these Microsoft usually ships first with regard to preview features.
Each service in Azure is exposed using a REST API. Most of those APIs are publicly supported; some aren't.
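As a small illustration, a minimal sketch of hitting the management REST API directly from Python to list resource groups (the subscription ID is a placeholder; the api-version shown is one I believe works for resource groups):

```python
import requests
from azure.identity import DefaultAzureCredential

subscription_id = "<subscription-id>"   # placeholder

# Acquire a token for the Azure Resource Manager endpoint.
token = DefaultAzureCredential().get_token("https://management.azure.com/.default").token

resp = requests.get(
    f"https://management.azure.com/subscriptions/{subscription_id}/resourcegroups",
    params={"api-version": "2021-04-01"},
    headers={"Authorization": f"Bearer {token}"},
)
for group in resp.json().get("value", []):
    print(group["name"], group["location"])
```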
It depends on the team that builds the elements that make up Azure, and often on their primary customer base. The Windows IaaS and AAD teams have been mostly PowerShell first. Machine learning and AI seem to favor azure-cli, which is built in Python, a very commonly used language in big data scenarios. The Azure DevOps team has recently moved from the Visual Studio brand to the Azure brand (formerly Visual Studio Team Services, Visual Studio Online, and the Team Foundation Service preview). Their tools are mostly Node and PowerShell based. Not everything in Azure is a "Resource", per se, so not all things are created or updated using Azure Resource Manager (ARM) templates.
So, unfortunately, there is no golden hammer when it comes to automating Azure.
The Azure REST API is, obviously, the best way to go, but it's the least convenient (there is probably a better word for this). I really like ARM templates: they (basically) let you define the REST API calls you want to make and allow some looping/parameterizing/etc. Since ARM templates are just a proxy for the REST API, they usually work really well.
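To show what driving ARM templates programmatically can look like, here is a minimal sketch using the azure-mgmt-resource Python package to deploy an inline template (the subscription ID, resource group, region, and storage account name are placeholders):

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient
from azure.mgmt.resource.resources.models import Deployment, DeploymentProperties

subscription_id = "<subscription-id>"   # placeholder
client = ResourceManagementClient(DefaultAzureCredential(), subscription_id)

# A minimal inline ARM template: a single parameterized storage account.
template = {
    "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
    "contentVersion": "1.0.0.0",
    "parameters": {"storageName": {"type": "string"}},
    "resources": [{
        "type": "Microsoft.Storage/storageAccounts",
        "apiVersion": "2021-04-01",
        "name": "[parameters('storageName')]",
        "location": "westeurope",             # placeholder region
        "sku": {"name": "Standard_LRS"},
        "kind": "StorageV2",
    }],
}

poller = client.deployments.begin_create_or_update(
    "my-resource-group",                      # placeholder resource group
    "demo-deployment",
    Deployment(properties=DeploymentProperties(
        mode="Incremental",
        template=template,
        parameters={"storageName": {"value": "mydemostorage123"}},
    )),
)
print(poller.result().properties.provisioning_state)
```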

Internet of Things using MS Azure

I am starting my journey of IoT development with MS Azure and would like some insight into the Azure cloud. I am a total newbie at cloud development. Can someone point me to some good books/links on Azure that will help me understand how I can use Azure for IoT and start development on the same?
Thanks a lot for your inputs.
This totally depends on the architecture of your application. You can use SaaS components for rapid prototyping of parts or all of your application architecture. This will give you better insight into selecting the appropriate stack of tools for your application.
If you want to deploy your own software stack, you would provision Azure Virtual Machines. Azure provides an SDK to interact with the cloud infrastructure.
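For example, a minimal sketch of that SDK in Python (azure-mgmt-compute plus azure-identity) listing the virtual machines in a subscription; the subscription ID is a placeholder:

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.compute import ComputeManagementClient

subscription_id = "<subscription-id>"   # placeholder
compute = ComputeManagementClient(DefaultAzureCredential(), subscription_id)

# Enumerate every VM in the subscription.
for vm in compute.virtual_machines.list_all():
    print(vm.name, vm.location)
```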
Docker is a really good option to use for application deployment these days. Google provides better support for Docker containers using its Kubernetes framework.
Simple APIs or websites can be developed on Azure using Azure Web Apps. I am currently developing a Node application using Azure Websites. The actual container where the site runs is a Windows NT machine with IIS. If you want your SaaS server containers to be Linux based, then you might look at AWS/Google or Red Hat OpenShift.
I have used OpenShift SaaS and found it quite easy to get on board with.
I advise you to have a look at the Build and Ignite events this week; there might be more announcements there. You can definitely have a look at the following white paper: http://download.microsoft.com/download/E/1/F/E1FFDADF-C0FF-4E72-A834-B173A079F393/Microsoft_Internet_of_Things_White_Paper.pdf
The most important services for IoT in Azure are (as of today):
- Azure Event Hubs: a massive ingestion service that can take in millions of telemetry events per second (see the Python sketch below).
- Azure Stream Analytics: real-time complex event processing that combines multiple incoming streams of data and detects patterns in them.
- Power BI: lets users build and explore interactive reports and graphs.
- Azure Machine Learning: leverage prediction & machine learning models.
For storage, you have DocumentDB and Azure Blob storage, among others.
HDInsight will help you work with the (big) data and run jobs against it.
Azure Web Apps and API Apps will allow you to present and expose the data to your users and in custom reports.
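As a small illustration of the ingestion side, a minimal sketch of sending one telemetry event to Event Hubs with the azure-eventhub Python package (the connection string, hub name, and payload are placeholders):

```python
import json
from azure.eventhub import EventHubProducerClient, EventData

conn_str = "<event-hubs-connection-string>"   # placeholder
producer = EventHubProducerClient.from_connection_string(conn_str, eventhub_name="telemetry")

# Batch a single telemetry reading and send it; real devices would stream continuously.
batch = producer.create_batch()
batch.add(EventData(json.dumps({"deviceId": "sensor-01", "temperature": 22.5})))
producer.send_batch(batch)
producer.close()
```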
Good luck
