How to execute multiple rows in an Azure Machine Learning Studio web service

I created a model in Azure ML Studio.
I deployed the web service.
Now, I know how to check one record at a time, but how can I load a CSV file and make the algorithm go through all the records?
If I click on Batch Execution, it asks me to create an Azure Storage account.
Is there any way to execute multiple records from a CSV file without creating any other accounts?

Yes, there is a way, and it is simple. What you need is the Excel add-in; you do not need to create any other account.
You can either read the Excel Add-in for Azure Machine Learning web services documentation or watch the Azure ML Excel Add-in video.
If you search for videos on the Excel add-in for Azure ML, you will find other useful videos too.
I hope this is the solution you are looking for.
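If you would rather script it than use Excel, the same request-response endpoint accepts several rows per call. Here is a rough Python sketch that reads a CSV and posts all rows at once; the endpoint URL, API key, and the input name "input1" are placeholders you would take from your service's API help page.

```python
import csv
import json

import requests

# Placeholders - copy these from the web service's "Request/Response" API help page.
URL = "https://<region>.services.azureml.net/workspaces/<workspace>/services/<service>/execute?api-version=2.0&details=true"
API_KEY = "<your-api-key>"

# Read every record from the CSV (the first row is assumed to be the header).
with open("records.csv", newline="") as f:
    reader = csv.reader(f)
    header = next(reader)
    rows = list(reader)

# Classic Studio web services accept many rows in a single "Values" array.
body = {
    "Inputs": {
        "input1": {  # the input name shown on your service's API help page
            "ColumnNames": header,
            "Values": rows,
        }
    },
    "GlobalParameters": {},
}

resp = requests.post(URL, json=body, headers={"Authorization": "Bearer " + API_KEY})
resp.raise_for_status()

# Scored rows come back under Results -> output1 for a default scoring experiment.
print(json.dumps(resp.json(), indent=2))
```

The response is a single JSON document containing one scored row for each input row.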

Related

Azure Function App to pull files from other project/repo

I've got a straightforward scenario, but I can't figure out how to do it.
I have a SQL Database Project which has a bunch of simple SQL files, and some of these drop/create summary tables.
I am trying to use an Azure Function App on a timer to pick up certain SQL files from this project and run them every night. However, I am confused about how to get these SQL files into my function to run.
I've tried referencing the SQL project and picking up the files from bin/debug, but once the Function App is deployed, I can't see how it will be able to build the SQL project.
Any ideas appreciated.
If you choose Azure Functions to achieve this, you can use REST API calls in the function logic to run operations against the database, whether fetching or updating data.
This approach is fine as a short-term solution.
Refer to the C# Corner article for information and code on making database calls from Azure Functions via REST APIs.
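To make the timer idea concrete, here is a minimal Python sketch of a timer-triggered function that runs a folder of .sql files against the database. The choices here are illustrative, not from the question: the .sql files are simply copied into a sql folder deployed with the Function App, an app setting named SQL_CONNECTION_STRING holds an ODBC connection string, and GO batch separators are split crudely because pyodbc does not understand them.

```python
# run.py - Python timer-triggered Azure Function (v1 programming model).
# function.json binds this to a timer, e.g. schedule "0 0 2 * * *" for 02:00 nightly.
import glob
import logging
import os

import azure.functions as func
import pyodbc


def main(mytimer: func.TimerRequest) -> None:
    conn_str = os.environ["SQL_CONNECTION_STRING"]  # assumed app setting name
    sql_dir = os.path.join(os.path.dirname(__file__), "sql")  # assumed deployed folder

    with pyodbc.connect(conn_str) as conn:
        conn.autocommit = True
        cursor = conn.cursor()
        for path in sorted(glob.glob(os.path.join(sql_dir, "*.sql"))):
            logging.info("Running %s", path)
            with open(path, encoding="utf-8") as f:
                script = f.read()
            # pyodbc cannot execute GO separators, so split the script into batches.
            for batch in (b.strip() for b in script.split("\nGO")):
                if batch:
                    cursor.execute(batch)
```

The point is that the Function App never builds the SQL project; your build or release step just needs to copy the .sql files into the function's folder so they are deployed with it.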
I believe the reliable long-term solution for this requirement is an Azure DevOps project:
Create an Azure DevOps project for the SQL project.
Schedule it so that it deploys on a timer, and also whenever a check-in happens on that project.
Refer to this MS Doc for more information on using the above solution.

What is the best way to automate creating SQL .bak or .bacpac backup files and saving them to an Azure cloud storage container?

Currently I am tasked with researching a solution for easily copying data from one environment to another (QA to DEV, for example), as well as having the flexibility to go back to different points in time to compare our data. It is an easy task to do locally with SSMS, and I am looking for the best ways to do it using Azure and its tools.
These are the options that I found so far:
Backup Service and Backup Vault (the MS solution that I am not asking about; it doesn't generate .bak files)
Azure Function to generate and transfer the SQL backups (flexible, but the code needs to be maintained and authentication managed)
PowerShell process with Azure Automation (also flexible, but needs to be maintained)
Data Factory/SSIS (still learning and researching)
Anyone got any tools/methods that are worth looking into before I dive deeper with a solution?
For Azure SQL Database, SQL Data Sync is one feature for syncing data between Azure SQL and SQL Server (on-premises). Some limits: the Azure SQL database must be the hub, and each table must have a primary key. That may not suit you.
In my experience, Data Factory is the best fit for you. You can copy the data between different environments, and in the sink settings you can use the upsert (insert or update) operation to sync the data.
If you only want to schedule the SQL backup automatically, a third-party tool could also meet your need: SQL Backup and FTP.
Since you have already searched and found almost all the options in Azure, any of these ways can achieve it. You need to pin down your real requirement: data sync, or an automatic backup that creates a .bacpac file in storage. Without that, it is hard to say which way is best; the one that fits your requirement is the best.
I went with writing an Azure Automation PowerShell script. Including cmdlets like New-AzureRmSqlDatabaseExport and passing in the parameters was tricky, but it finally did the job.
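For anyone scripting this outside PowerShell: the export that New-AzureRmSqlDatabaseExport performs is also exposed as the Microsoft.Sql databases/export REST action. Below is a rough Python sketch of calling it; every name and key is a placeholder, and the api-version and body property names should be double-checked against the current REST reference before relying on them.

```python
import requests
from azure.identity import DefaultAzureCredential

# All of these are placeholders for your own environment.
SUBSCRIPTION = "<subscription-id>"
RESOURCE_GROUP = "<resource-group>"
SERVER = "<sql-server-name>"
DATABASE = "<database-name>"

# Acquire an ARM token for the management endpoint.
token = DefaultAzureCredential().get_token("https://management.azure.com/.default").token

url = (
    f"https://management.azure.com/subscriptions/{SUBSCRIPTION}"
    f"/resourceGroups/{RESOURCE_GROUP}/providers/Microsoft.Sql"
    f"/servers/{SERVER}/databases/{DATABASE}/export"
    "?api-version=2021-11-01"  # verify against the current Microsoft.Sql reference
)

body = {
    "storageKeyType": "StorageAccessKey",
    "storageKey": "<storage-account-key>",
    "storageUri": "https://<storage-account>.blob.core.windows.net/<container>/backup.bacpac",
    "administratorLogin": "<sql-admin>",
    "administratorLoginPassword": "<sql-admin-password>",
    "authenticationType": "Sql",
}

resp = requests.post(url, json=body, headers={"Authorization": f"Bearer {token}"})
resp.raise_for_status()
# The export runs asynchronously; poll the operation URL returned in the response headers.
print(resp.status_code, resp.headers.get("Location"))
```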

Azure Data Factory and SharePoint

I have some Excel files stored in SharePoint Online. I want to copy the files stored in SharePoint folders to Azure Blob Storage.
To achieve this, I am creating a new pipeline in Azure Data Factory using the Azure Portal. What are the possible ways to copy files from SharePoint to Azure Blob Storage using Azure Data Factory pipelines?
I have looked at all the linked service types in the Azure Data Factory pipeline but couldn't find any suitable type to connect to SharePoint.
Rather than directly accessing the file in SharePoint from Data Factory, you might have to use an intermediate technology and have Data Factory call that. You have a few options:
Use a Logic App to move the file
Use an Azure Function
Use a custom activity and write your own C# to copy the file.
To call a Logic App from ADF, you use a web activity.
You can directly call an Azure Function now.
We can create a linked service of type 'File system' by providing the directory URL as the 'Host' value. To authenticate the user, provide the username and password/AKV details.
Note: use a self-hosted IR.
You can use a Logic App to fetch data from SharePoint and load it into Azure Blob Storage, and then use Azure Data Factory to fetch the data from the blob. You can even set an event trigger so that whenever a file lands in the blob container, the Azure pipeline is triggered automatically.
You can use Power Automate (https://make.powerautomate.com/) to do this task automatically:
Create an automated cloud flow that triggers whenever a new file is dropped in SharePoint
Use whichever trigger fits your requirement and fill in the SharePoint details
Add an action to create a blob and fill in the details as per your use case
By using this, you will be copying all the SharePoint files to the blob without even using ADF.
My previous answer was true at the time, but in the last few years Microsoft has published guidance on how to copy documents from a SharePoint library. You can copy a file from SharePoint Online by using a Web activity to authenticate and grab an access token from SPO, then passing it to a subsequent Copy activity to copy the data with the HTTP connector as the source.
I ran into some issues with large files and Logic Apps. It turned out there were some extremely large files to be copied from that SharePoint library. SharePoint has a default limit of 100 MB buffer size, and the Get File Content action doesn’t natively support chunking.
I successfully pulled the files with the web activity and copy activity. But I found the SharePoint permissions configuration to be a bit tricky. I blogged my process here.
You can use a binary dataset if you just want to copy the full file rather than read the data.
If my file is located at https://mytenant.sharepoint.com/sites/site1/libraryname/folder1/folder2/folder3/myfile.CSV, the URL I need to retrieve the file is https://mytenant.sharepoint.com/sites/site1/_api/web/GetFileByServerRelativeUrl('/sites/site1/libraryname/folder1/folder2/folder3/myfile.CSV')/$value.
Be careful about when you get your auth token. Your auth token is valid for 1 hour. If you copy a bunch of files sequentially, and it takes longer than that, you might get a timeout error.
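For reference, here is roughly what the Web activity and Copy activity are doing, sketched in Python. The tenant id, client id, and client secret are placeholders from your own SharePoint app registration, and the token request follows the SharePoint app-only (ACS) pattern described in the Microsoft guidance.

```python
import requests

# Placeholders from your SharePoint app registration and tenant.
TENANT_ID = "<tenant-id>"
CLIENT_ID = "<client-id>"
CLIENT_SECRET = "<client-secret>"
TENANT = "mytenant"  # i.e. mytenant.sharepoint.com
SITE = f"https://{TENANT}.sharepoint.com/sites/site1"

# 1. Grab an app-only access token from ACS (what the Web activity does).
token_resp = requests.post(
    f"https://accounts.accesscontrol.windows.net/{TENANT_ID}/tokens/OAuth/2",
    data={
        "grant_type": "client_credentials",
        "client_id": f"{CLIENT_ID}@{TENANT_ID}",
        "client_secret": CLIENT_SECRET,
        "resource": f"00000003-0000-0ff1-ce00-000000000000/{TENANT}.sharepoint.com@{TENANT_ID}",
    },
)
token_resp.raise_for_status()
access_token = token_resp.json()["access_token"]

# 2. Fetch the file contents (what the Copy activity's HTTP source does).
file_url = (
    f"{SITE}/_api/web/GetFileByServerRelativeUrl("
    "'/sites/site1/libraryname/folder1/folder2/folder3/myfile.CSV')/$value"
)
file_resp = requests.get(file_url, headers={"Authorization": f"Bearer {access_token}"})
file_resp.raise_for_status()

with open("myfile.CSV", "wb") as f:
    f.write(file_resp.content)
```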

Azure Unzip automation

I am looking to do the following in Azure. However, I should point out that on my local machine I have no Visual Studio, no admin rights, no IT support, and no tools (except SSMS), but I have a VERY strong drive to complete this work if it's possible.
I have created an Azure blob which receives a file each day (zipped) from a 3rd party. I am looking to do the following:
1) Unzip the data in an automated fashion
2) Get the data into an Azure SQL database (already created) in an automated fashion
What I want to know is if this is possible using Azure alone, or am I going to need admin rights / Visual Studio? If it is possible, any directions you could point me in would be greatly received!
Thanks
Dave
Based on your description, one approach would be to create a blob-triggered Azure Function through the Azure Portal (Visual Studio is not required), unzip/process the file, and save the desired data into Azure SQL. Moreover, considering that there is only one new file per day, prefer the Consumption plan to optimize cost.
Find more details about the Azure Functions blob binding at https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-storage-blob.
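As a rough illustration of that approach, here is a sketch of a blob-triggered Python function that unzips the daily file in memory and loads it into Azure SQL. The SQL_CONNECTION_STRING app setting, the dbo.Staging table, and the assumption that the zip contains CSV files are all illustrative, not taken from the question.

```python
# __init__.py for a blob-triggered Python Azure Function.
import csv
import io
import logging
import os
import zipfile

import azure.functions as func
import pyodbc


def main(myblob: func.InputStream) -> None:
    logging.info("Processing %s (%s bytes)", myblob.name, myblob.length)

    with pyodbc.connect(os.environ["SQL_CONNECTION_STRING"]) as conn:
        cursor = conn.cursor()
        # 1) Unzip in memory - no local tooling or admin rights needed.
        with zipfile.ZipFile(io.BytesIO(myblob.read())) as archive:
            for name in archive.namelist():
                if not name.lower().endswith(".csv"):
                    continue
                with archive.open(name) as member:
                    reader = csv.reader(io.TextIOWrapper(member, encoding="utf-8"))
                    next(reader, None)  # skip the header row
                    # 2) Load into the Azure SQL database (illustrative table/columns).
                    for row in reader:
                        cursor.execute(
                            "INSERT INTO dbo.Staging (col1, col2) VALUES (?, ?)",
                            row[0], row[1],
                        )
        conn.commit()
```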
Spin up a Data Factory instance on Azure; unzip functionality is available in ADF.

Copy files from on-prem to Azure

I'm new to the Azure ecosystem. I'm doing some research on copying data from on-prem to Azure. I found the following options:
AzCopy
Azure Data Factory (Copy Data Tool)
Data Management Gateway
Ours is a Microsoft shop, so I'm looking for tools that gel with the MS platform. Also, down the line, we want to automate the entire thing as much as we can, so I think Azure Storage Explorer is out of the question. Is there a preference among the above three, or are there any better tools?
I think you are mixing things up: the Copy Data Tool is just an Azure Data Factory wizard for setting up simple data movement between resources. Azure Data Factory uses the Data Management Gateway to reach on-premises resources such as files and databases.
What you want to do can be done with Azure Data Factory. I recommend using version 2 (even in its preview) because its authoring experience is easier to understand if you are new to the tool. You can graphically configure linked services, datasets, and pipelines from there.
I hope this helps; if you need further help, just ask away!
If you're already familiar with SSIS, there's also the option to use SSIS in ADF, which enables on-prem data access via a VNet.
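If you do end up scripting part of the copy yourself (for example as a stopgap before ADF is in place), the storage SDK is easy to automate. A minimal Python sketch, with the connection string, container name, and local folder as placeholders:

```python
import os

from azure.storage.blob import BlobServiceClient

# Placeholders: connection string (or use azure-identity), container, and local folder.
CONNECTION_STRING = os.environ["AZURE_STORAGE_CONNECTION_STRING"]
CONTAINER = "landing"
LOCAL_DIR = r"C:\data\to-upload"

service = BlobServiceClient.from_connection_string(CONNECTION_STRING)
container = service.get_container_client(CONTAINER)

for name in os.listdir(LOCAL_DIR):
    path = os.path.join(LOCAL_DIR, name)
    if os.path.isfile(path):
        with open(path, "rb") as data:
            # Overwrite keeps the job idempotent when re-run on a schedule.
            container.upload_blob(name=name, data=data, overwrite=True)
        print("Uploaded", name)
```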
