Azure Function, Blob Trigger: Copy data file from blob to a server

My requirement is to create a blob-triggered Azure Function that fires when a specific file (say, trigger.txt) is copied into a container. Once it is triggered, the PowerShell function should copy this trigger.txt onto an Azure Windows VM in the same subscription and resource group.
I can see that the function is triggered if an example trigger.txt file is present.
What do I need to do to copy this blob from the container to the Azure VM? I have found that azcopy does not work.

You could consider one of the following approaches:
Using Azure File Storage
Mount the Azure File Storage share onto the Windows VM.
The Azure Function would create a new file in File Storage using content from the input blob. Since there is no binding support for File Storage, you will have to use the File Storage SDK directly (a sketch follows below).
You could also consider using Logic Apps which has connectors for both blob storage and file storage. But do note that there are file size restrictions that you may run into depending on your use case.
And finally, you might want to consider using blob events to reduce polling costs for both approaches.
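For illustration, here is a minimal sketch of the first approach as a PowerShell blob-triggered function, assuming the Az.Storage module is available, that the blob trigger binding in function.json is named InputBlob, and that the account name, key setting, and share name below are placeholders:

```powershell
# run.ps1 -- copy the incoming blob into an Azure File share.
# 'InputBlob' must match the blob trigger binding name in function.json.
param([byte[]] $InputBlob, $TriggerMetadata)

# Placeholder account; the key is assumed to be stored in an app setting.
$ctx = New-AzStorageContext -StorageAccountName 'mystorageacct' `
                            -StorageAccountKey $env:STORAGE_KEY

# Stage the blob bytes locally, then push them into the file share.
$tmp = Join-Path $env:TEMP $TriggerMetadata.Name
[System.IO.File]::WriteAllBytes($tmp, $InputBlob)

Set-AzStorageFileContent -ShareName 'myshare' -Source $tmp `
                         -Path $TriggerMetadata.Name -Context $ctx -Force
```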
Use PowerShell Functions to remote into the VM, as suggested by @silent in the comments, and run azcopy to download the file.
There is an official doc which showcases how you can run commands using remoting that you can refer to. The doc takes the context of remoting to an on-premises machine via a hybrid connection which you can ignore for your use case.
Also, if your VM doesn't have a public endpoint, you will have to use Premium plan Functions, which support VNet Integration.
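As a hedged sketch of that second approach, Azure's Run Command can execute a script on the VM without a WinRM session; the resource group, VM name, SAS URL, and paths below are placeholders, and -ScriptString requires a recent Az.Compute (older versions take -ScriptPath instead):

```powershell
# Run azcopy on the target VM via Azure Run Command (Az.Compute module).
Invoke-AzVMRunCommand -ResourceGroupName 'my-rg' -VMName 'my-vm' `
    -CommandId 'RunPowerShellScript' `
    -ScriptString 'azcopy copy "https://mystorageacct.blob.core.windows.net/incoming/trigger.txt?<SAS>" "C:\data\trigger.txt"'
```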

Related

Copying new Azure blobs to different container

We have 5 vendors that are SFTPing files to Blob Storage. When the files come in, I need to copy them to another container and create a folder in that container named with the date to put the files in. From the second container, I need to copy the files to a file share on an Azure server. What is the best way to go about this?
I'm very new to Azure and unsure what the best way is to accomplish what I am being asked to do. Any help would be greatly appreciated.
I'd recommend using Azure Synapse for this task. It will let you move data to and from different storage securely and with little-to-no code.
Specifically, I'd put a blob storage trigger on the SFTP blob container so that the Synapse pipeline that moves the data runs automatically when your vendors drop their files.
Note that when you look for documentation on how to do things in Synapse, most of the time the Azure Data Factory documentation will also be applicable, since most of Data Factory's functionality is now in Synapse.
The ADF and Synapse YouTube channels are excellent resources, as well as the Microsoft Learn courses on Data Engineering.
I need to copy them to another container and create a folder in that container named with the date to put the files in.
You can use AzCopy to copy files to another container by using a SAS token.
Command:
azcopy copy 'https://<storage account>.blob.core.windows.net/test/files?SAS' 'https://<storage account>.blob.core.windows.net/mycontainer/12-01-2023?SAS' --recursive
I need to copy the files to a file share on an Azure server
You can also copy the files from a container to a file share by using AzCopy.
Command:
azcopy copy 'https://<storage account>.blob.core.windows.net/test?SAS' 'https://<storage account>.file.core.windows.net/fileshare/12-01-2023?SAS' --recursive
You can get the SAS token through the portal:
Go to the portal -> your storage account -> Shared access signature -> check the resource types -> click 'Generate SAS and connection string'.
AzCopy is probably a good way to move all or part of the blobs from one container to another, but I would suggest automating it with Azure Functions: an Azure Function can be triggered every time a blob or set of blobs (Azure can process a batch of blobs) is uploaded to the source container.
A note on Azure Functions: depending on the quantity of blobs to be moved and the time it could take, Durable Functions may be the better solution for avoiding timeout exceptions. A durable function returns an immediate response but keeps running in the background.
Consider this article for a more detailed approach to this solution:
https://build5nines.com/azure-functions-copy-blob-between-azure-storage-accounts-in-c/
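As a rough sketch of that automation in a PowerShell blob-triggered function (the account and container names are placeholders, and the dated-folder convention follows the question above), a server-side copy avoids streaming the data through the function:

```powershell
# run.ps1 -- on each new blob, start a server-side copy into a dated "folder"
# in the destination container. All names here are placeholders.
param([byte[]] $InputBlob, $TriggerMetadata)

$ctx = New-AzStorageContext -StorageAccountName 'mystorageacct' `
                            -StorageAccountKey $env:STORAGE_KEY

# Blob "folders" are just name prefixes, so prepend today's date.
$dest = '{0}/{1}' -f (Get-Date -Format 'MM-dd-yyyy'), $TriggerMetadata.Name

# Server-side copy: the blob bytes never pass through the function host.
Start-AzStorageBlobCopy -SrcContainer 'sftp-dropzone' -SrcBlob $TriggerMetadata.Name `
                        -DestContainer 'archive' -DestBlob $dest -Context $ctx
```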

copy file to Azure File Share (Azure Storage)

In addition to my question Code sync from Azure Scale Set VM To Azure Storage: is there any way to copy files from one particular Scale Set VM to an Azure File share (Azure Storage) through ADO pipelines? Since it's a Scale Set, I can't push from the same VM every time. E.g., there will be 100 VMSS servers in the pool; when I try to push code through the pipeline, it should pick one server from the pool and push the code from there. Is that possible?
No, we do not have this kind of built-in task. There is an Azure File Copy task, which copies application files and other artifacts to Microsoft Azure storage blobs or virtual machines (VMs).
When the target is Azure VMs, the files are first copied to an automatically generated Azure blob container and then downloaded into the VMs. The container is deleted after the files have been successfully copied to the VMs.
As a workaround, please use a custom script to copy files to Azure File Storage. In other words, if you are able to do the same thing locally, you should also be able to achieve it through an Azure DevOps pipeline.
You may have to build your own extension or use scripts in the pipeline, as in the sketch below. Take a look at this similar third-party task: AzureDevOpsReleaseToFileStorage.
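For example, a custom PowerShell step on the chosen agent could push the checked-out sources to a file share with azcopy; the share URL and the FILE_SAS variable below are assumptions, not part of any built-in task:

```powershell
# Inline pipeline step: upload the checked-out sources to an Azure File share.
# FILE_SAS is assumed to be mapped into the step from a secret variable.
$share = 'https://mystorageacct.file.core.windows.net/myshare'
azcopy copy "$env:BUILD_SOURCESDIRECTORY/*" "$share/code?$env:FILE_SAS" --recursive
```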

Is there a way to know if a file has been uploaded to a network drive in Azure

I have a network location where a csv file gets dumped every hour. I need to copy that file to an Azure blob. How do I know when a file has been uploaded to that network drive? Is there something like a file watcher in Azure which monitors this network location? Also, is it possible to copy a file from the network location to an Azure blob through code?
I'm using .net core APIs deployed to an Azure App Service.
Please suggest a possible solution.
You can use Azure Event Grid, but as of today Event Grid does not support Azure File Shares.
As your file share is on-premises, the only way I see is to write a custom publisher that runs on-premises and sends events to Azure Event Grid, with a subscriber (which can be an Azure Function) doing the work you want it to do.
https://learn.microsoft.com/en-us/azure/event-grid/custom-event-quickstart-portal
But that will only be an event, not the file itself that was added or changed; to process the file you will still have to upload it into Azure as well. Since this approach requires you to do two things, I would recommend running custom code on-premises, CRON-job-like, that looks for new or edited files, uploads them to Azure Blob Storage, and then triggers an Azure Function to do your processing task.
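If you do go the custom-publisher route, posting to a custom Event Grid topic looks roughly like this sketch (the topic endpoint, key variable, and event fields are placeholders):

```powershell
# Publish a custom 'FileDropped' event to an Event Grid topic from on-prem.
$events = @(@{
    id          = [guid]::NewGuid().ToString()
    eventType   = 'Custom.FileDropped'
    subject     = 'networkshare/hourly.csv'
    eventTime   = (Get-Date).ToUniversalTime().ToString('o')
    data        = @{ path = '\\fileserver\drops\hourly.csv' }
    dataVersion = '1.0'
})

# Event Grid custom topics expect a JSON array of events.
Invoke-RestMethod -Uri 'https://mytopic.westus2-1.eventgrid.azure.net/api/events' `
    -Method Post `
    -Headers @{ 'aeg-sas-key' = $env:TOPIC_KEY } `
    -ContentType 'application/json' `
    -Body (ConvertTo-Json $events -Depth 5)
```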
Since the files are on-premises, you can use PowerShell to monitor a folder for new files and then fire an event to upload the file to an Azure blob.
There is a video showing how to do this here: https://www.youtube.com/watch?v=Usih7UywZYA
The changes you need to make are:
Replace the action with an upload to Azure: https://argonsys.com/microsoft-cloud/library/how-to-upload-files-to-azure-blob-storage-using-powershell-and-azcopy/
Run PowerShell in the context of a user that can upload files (a combined sketch follows below).
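Putting those changes together, a rough sketch (the share path, account, and container are placeholders) could look like:

```powershell
# Watch a network folder and upload each newly created CSV to blob storage.
$watcher = New-Object System.IO.FileSystemWatcher '\\fileserver\drops', '*.csv'
$watcher.EnableRaisingEvents = $true

Register-ObjectEvent $watcher 'Created' -Action {
    $ctx = New-AzStorageContext -StorageAccountName 'mystorageacct' `
                                -StorageAccountKey $env:STORAGE_KEY
    # Upload the new file as a block blob.
    Set-AzStorageBlobContent -File $Event.SourceEventArgs.FullPath `
                             -Container 'incoming' -Context $ctx -Force
}
```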

Copy files from Azure Blob storage to Azure File Storage with Azure functions

I have files in Azure Blob Storage that I want to be able to share with users through an FTP server running on an Azure VM.
As I understand it, you can't mount Blob Storage on a VM, but you can mount an Azure File Share using 'net use'.
The files in Blob Storage will be uploaded incrementally, so ideally I would like to copy them to Azure Files when they are uploaded, and an Azure Function seems like the ideal way since it is easy to set up and handles the trigger on Blob Storage for me.
How would I go about copying a file from Blob Storage to an Azure File Share using an Azure function?
You can set up a Trigger Binding on the Azure Function to be triggered by blobs in the Azure Blob Storage container. Then you'll have to download the file stream of the blob and upload it to the Azure Storage File Share.
Azure Functions do not include support for an Output Binding directly to an Azure Storage File Share. So, you'll need to either use the Azure Storage SDK in code (a sketch follows below), or look into mounting the File Share to the Azure Functions runtime environment so you can write file updates to it from within the Azure Function.
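A minimal sketch of the SDK route in a PowerShell function, using a server-side copy from blob to file share (the account, container, and share names are placeholders, and the InputBlob binding name is assumed):

```powershell
# run.ps1 -- on each new blob, start a server-side copy into the file share
# that the FTP VM mounts. All names here are placeholders.
param([byte[]] $InputBlob, $TriggerMetadata)

$ctx = New-AzStorageContext -StorageAccountName 'mystorageacct' `
                            -StorageAccountKey $env:STORAGE_KEY

Start-AzStorageFileCopy -SrcContainerName 'uploads' -SrcBlobName $TriggerMetadata.Name `
                        -DestShareName 'ftpshare' -DestFilePath $TriggerMetadata.Name `
                        -Context $ctx -DestContext $ctx
```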
An alternative solution would be to use Azure Logic Apps to implement this without writing any code. This article might help for integrating with an Azure Storage File Share: Connect to on-premises file systems from logic apps with the File System connector.

Run command line EXE on Azure, supplying a path to an Azure Blob

I have an Azure web app that stores documents in Azure blob "container X".
Now, we want to run a "job" to generate specialized reports for these documents.
This includes running an EXE file that takes a document path as argument, letting it generate a report on the file system, and uploading this to Azure blob "container Y".
Like: generate-report.exe document.doc generates report.txt.
How can this be done? Do we need to download the blob to the web app, or is it possible to somehow refer to a blob as we refer to a physical disk file?
You cannot refer to a blob as a local file object, since blob storage does not implement any type of file I/O abstraction layer. Yes, you can use the File Storage service, which implements SMB on top of blob storage, but that is different from working with individual blobs.
If your app is built to deal only with file objects, you'd need to copy the blob from blob storage to local disk first, then upload the results from local disk to blob storage.
If you're just building your app, you can directly access blob content via the REST API (or one of the various language-specific SDKs that wrap the API).
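As a hedged sketch of the copy-local workflow (the account, containers, and tool paths are invented for illustration):

```powershell
$ctx = New-AzStorageContext -StorageAccountName 'mystorageacct' `
                            -StorageAccountKey $env:STORAGE_KEY

# 1. Pull the document from container X down to local disk.
Get-AzStorageBlobContent -Container 'container-x' -Blob 'document.doc' `
                         -Destination 'C:\work\document.doc' -Context $ctx -Force

# 2. Run the report generator against the local copy.
& 'C:\tools\generate-report.exe' 'C:\work\document.doc'

# 3. Push the generated report up to container Y.
Set-AzStorageBlobContent -File 'C:\work\report.txt' -Container 'container-y' `
                         -Context $ctx -Force
```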
Reading a file from the blob can be done as a stream, which can later be used to create the text file inside the web app as well.
You can also create WebJobs under the web app to accomplish this task in the background.
https://azure.microsoft.com/en-in/documentation/articles/storage-dotnet-how-to-use-blobs/
