Is there a way to know if a file has been uploaded to a network drive in Azure

I have a network location where a CSV file gets dumped every hour. I need to copy that file to an Azure blob. How do I know that a file has been uploaded to that network drive? Is there something like a file watcher in Azure which monitors this network location? Also, is it possible to copy a file from the network location to an Azure blob through code?
I'm using .NET Core APIs deployed to an Azure App Service.
Please suggest a possible solution.

You can use Azure Event Grid, but as of today Event Grid does not support Azure File Share.
Since your file share is on-premises, the only way I see is to write a custom publisher that runs on-premises and sends events to Azure Event Grid, with a subscriber (which can be an Azure Function) doing the work you want:
https://learn.microsoft.com/en-us/azure/event-grid/custom-event-quickstart-portal
But that only delivers an event, not the file itself that was added/changed; you would then have to upload the file into Azure for processing as well. Since that route requires you to do two things, I would recommend running custom code on-prem, CRON-job style, that looks for new or edited files, uploads them to Azure Blob Storage, and then lets an Azure Function do your processing task.
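For illustration, here is a minimal sketch of that CRON-style approach in PowerShell, assuming the Az.Storage module is installed and authenticated; the network path, storage account, container, and state-file location are all placeholders:

```powershell
# Run hourly (e.g. via Task Scheduler): upload CSVs changed since the last run.
$source    = '\\fileserver\drop'          # hypothetical network location
$stateFile = 'C:\watcher\lastrun.txt'     # remembers when we last checked
$lastRun   = if (Test-Path $stateFile) { [datetime](Get-Content $stateFile) } else { [datetime]::MinValue }

$ctx = New-AzStorageContext -StorageAccountName 'mystorageacct' `
    -StorageAccountKey $env:STORAGE_KEY

Get-ChildItem -Path $source -Filter '*.csv' |
    Where-Object { $_.LastWriteTime -gt $lastRun } |
    ForEach-Object {
        # Overwrite the blob if it already exists.
        Set-AzStorageBlobContent -File $_.FullName -Container 'incoming' `
            -Blob $_.Name -Context $ctx -Force
    }

Get-Date -Format 'o' | Set-Content $stateFile
```

An Azure Function with a blob trigger on the 'incoming' container can then pick up each upload for processing.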

Since the files are on-prem, you can use PowerShell to monitor a folder for new files and then fire an event to upload each file to an Azure blob.
There is a video showing how to do this here: https://www.youtube.com/watch?v=Usih7UywZYA
The changes you need to make are:
- Replace the action with an upload to Azure: https://argonsys.com/microsoft-cloud/library/how-to-upload-files-to-azure-blob-storage-using-powershell-and-azcopy/
- Run PowerShell in the context of a user that can upload files.
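A minimal sketch of that watcher could look like the following; the share path, storage account, and SAS token are placeholders:

```powershell
# Watch the network share and upload each new CSV with AzCopy.
$watcher = New-Object System.IO.FileSystemWatcher
$watcher.Path   = '\\fileserver\drop'     # hypothetical network location
$watcher.Filter = '*.csv'
$watcher.EnableRaisingEvents = $true

Register-ObjectEvent -InputObject $watcher -EventName Created -Action {
    $path = $Event.SourceEventArgs.FullPath
    # Upload as described in the linked article (SAS token is a placeholder).
    azcopy copy $path 'https://mystorageacct.blob.core.windows.net/incoming?<SAS>'
}

# Keep the script alive so the event subscription keeps firing.
while ($true) { Start-Sleep -Seconds 60 }
```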

Related

How to get notified when a file is uploaded in ftp location using Azure and start copy job?

I need to start a copy job whenever a desired file is uploaded to an FTP location. Is there any way to be notified that the file is available in that location (i.e., if the file is available, run this copy job) other than Logic Apps, using ADF? Can anyone please post some suggestions?
Azure Data Factory event-based triggers do not support FTP, so we cannot do this directly from Azure Data Factory.
As you said, you might need to rely on another process such as Logic Apps, Azure Functions, or Azure Automation to check for the file landing on FTP and kick off the ADF pipeline execution.
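As a hedged sketch of that polling approach, an Azure Automation runbook (or timer-triggered Function) could list the FTP directory and kick off the pipeline when the desired file appears; the FTP host, credentials, and factory/pipeline names below are placeholders:

```powershell
# List the FTP folder using .NET's FtpWebRequest.
$request = [System.Net.FtpWebRequest]::Create('ftp://ftp.example.com/drop/')
$request.Method      = [System.Net.WebRequestMethods+Ftp]::ListDirectory
$request.Credentials = New-Object System.Net.NetworkCredential('user', $env:FTP_PASSWORD)

$reader = New-Object System.IO.StreamReader($request.GetResponse().GetResponseStream())
$files  = $reader.ReadToEnd() -split "`r?`n"
$reader.Close()

# Start the ADF pipeline only once the desired file has landed.
if ($files -contains 'desired-file.csv') {
    # Requires the Az.DataFactory module and an authenticated Az context.
    Invoke-AzDataFactoryV2Pipeline -ResourceGroupName 'my-rg' `
        -DataFactoryName 'my-adf' -PipelineName 'CopyFromFtp'
}
```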

Azure Function, Blob Trigger: Copy data file from blob to a server

My requirement is to create a blob-triggered Azure Function which will fire when a specific file (say, trigger.txt) is copied into a container. Once it is triggered, the PowerShell function should copy this trigger.txt onto an Azure Windows VM in the same subscription and resource group.
I can see that the function is triggered if an example trigger.txt file is present.
What do I need to do to copy this blob in the container to the Azure VM? I see that azcopy does not work.
You could consider one of the following approaches:
1) Use Azure File Storage. Mount an Azure file share onto the Windows VM, and have the Azure Function create a new file in file storage using the content from the input blob. Since there is no binding support for file storage, you will have to use the File Storage SDK directly. You could also consider using Logic Apps, which has connectors for both blob storage and file storage, but do note that there are file size restrictions you may run into depending on your use case. Finally, you might want to consider using blob events to reduce polling costs for both approaches.
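As a rough sketch of approach 1, the Az.Storage cmdlets can stand in for the raw File Storage SDK; the storage account, container, and share names are placeholders:

```powershell
# Copy the triggering blob into the file share that the VM has mounted.
$ctx = New-AzStorageContext -StorageAccountName 'mystorageacct' `
    -StorageAccountKey $env:STORAGE_KEY

# Download the blob locally, then push it into the share.
Get-AzStorageBlobContent -Container 'triggers' -Blob 'trigger.txt' `
    -Destination $env:TEMP -Context $ctx -Force
Set-AzStorageFileContent -ShareName 'vmshare' -Source "$env:TEMP\trigger.txt" `
    -Path 'trigger.txt' -Context $ctx -Force
```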
2) Use PowerShell Functions to remote into the VM, as suggested by @silent in the comments, and run AzCopy to download the file. There is an official doc which showcases how you can run commands via remoting that you can refer to; it takes the context of remoting to an on-premises machine via a hybrid connection, which you can ignore for your use case. Also, if your VM doesn't have a public endpoint, you will have to use Premium Functions, which support VNet Integration.
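A minimal sketch of approach 2, assuming PowerShell remoting is enabled on the VM; the VM address, credentials, and SAS URL are placeholders:

```powershell
# Remote into the VM and run AzCopy there to pull the blob down.
$pass = ConvertTo-SecureString $env:VM_PASSWORD -AsPlainText -Force
$cred = New-Object System.Management.Automation.PSCredential('vmadmin', $pass)

Invoke-Command -ComputerName 'myvm.example.internal' -Credential $cred -ScriptBlock {
    # Runs on the VM itself.
    azcopy copy 'https://mystorageacct.blob.core.windows.net/triggers/trigger.txt?<SAS>' `
        'C:\incoming\trigger.txt'
}
```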

Azure: use Logic App to load files from file storage to blob storage

I need to use a Logic App to load some CSV files from a file storage in Azure to a blob storage. What trigger should I use in the Logic App to access the file storage in Azure? I have tried e.g. the File System connector, but that seems to work only for Windows file shares. What I want to do is check whether there is a new file in the file storage and then load it to the blob. I know there are other ways to achieve this, but I have been assigned the task of looking into the feasibility of doing it with a Logic App.
For now, the file storage connector has no trigger like "when a file is added or modified", so you cannot achieve this directly with a Logic App. You could go to the feedback site and ask for this Logic App feature.
As it stands, you can only copy a specified file to blob with "Get file content using path" and "Create blob". Or you could use an Azure Function with a timer trigger to move new files to blob.
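A rough sketch of that timer-trigger alternative using the Az.Storage cmdlets (the account, share, and container names are placeholders; the copy itself runs server-side):

```powershell
# Copy any CSV from the file share that does not exist in the blob container yet.
$ctx = New-AzStorageContext -StorageAccountName 'mystorageacct' `
    -StorageAccountKey $env:STORAGE_KEY

$existing = (Get-AzStorageBlob -Container 'csv-out' -Context $ctx).Name

Get-AzStorageFile -ShareName 'csv-in' -Context $ctx |
    Where-Object { $_.Name -like '*.csv' -and $_.Name -notin $existing } |
    ForEach-Object {
        # Server-side copy straight from the share into the container.
        Start-AzStorageBlobCopy -SrcShareName 'csv-in' -SrcFilePath $_.Name `
            -DestContainer 'csv-out' -DestBlob $_.Name -Context $ctx
    }
```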
If you still have other questions, please let me know.

Is there any trigger for Azure File Share in azure functions or azure logic app?

I created a file share in an Azure storage account and mounted it on my Windows PC (for example as Z:\). Whenever I upload files into that on-premises file share drive, I want a Logic App or Azure Function to trigger automatically, pass the file/image to the Computer Vision API, and store the response in an Azure SQL database.
For that I followed the documentation below:
Extract Text From Images Using Computer Vision API And Azure Function
Computer Vision
But those are related to Azure Blob storage, not Azure File Share.
Actually, there is no trigger for Azure File Share.
There is a similar post you can refer to, and a feedback item you can vote for.

Can Azure Data Factory write to FTP

I want to write the output of a pipeline to an FTP folder. ADF seems to support on-premises files but not FTP folders.
How can I write the output in text format to an FTP folder?
Unfortunately, FTP servers are not a supported data store for ADF as of right now, so there is no out-of-the-box way to interact with an FTP server for either reading or writing.
However, you can use a custom activity to make it possible, although it will require some custom development. A fellow Cloud Solution Architect within MS put together a blog post that talks about how he did it for one of his customers:
https://blogs.msdn.microsoft.com/cloud_solution_architect/2016/07/02/creating-ftp-data-movement-activity-for-azure-data-factory-pipeline/
I hope that this helps.
Upon thinking about it, you might be able to achieve what you want in a mildly convoluted way by writing the output to an Azure Blob storage account and then either:
1) manually: downloading the file from the Blob storage account and pushing it to the FTP site, or
2) automatically: using the Azure CLI to pull the file locally and then pushing it to the FTP site with a batch or shell script as appropriate.
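For option 2, a hedged sketch of the pull-and-push script (storage account, container, blob name, and FTP host are placeholders):

```powershell
# Download the pipeline output with the Azure CLI...
az storage blob download --account-name mystorageacct --container-name output `
    --name result.txt --file result.txt

# ...then push it to the FTP site with .NET's WebClient.
$client = New-Object System.Net.WebClient
$client.Credentials = New-Object System.Net.NetworkCredential('user', $env:FTP_PASSWORD)
$client.UploadFile('ftp://ftp.example.com/inbox/result.txt', 'result.txt')
```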
As a lighter-weight approach than custom activities (which are certainly the better option for heavy work), you may wish to consider using Azure Functions to write to FTP (note there is a timeout when using a Consumption plan, but not in other plans, so it will depend on how big the files are):
https://learn.microsoft.com/en-us/azure/azure-functions/functions-create-storage-blob-triggered-function
You could instruct Data Factory to write to an intermediary blob storage, and use blob storage triggers in Azure Functions to upload the files to FTP as soon as they appear in blob storage.
Or alternatively, write to blob storage and then use a timer in Logic Apps to upload from blob storage to FTP. Logic Apps hide a tremendous amount of power behind their friendly exterior.
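For the Functions route, here is a minimal sketch of a blob-triggered PowerShell Function that forwards each new blob to FTP; it assumes a function.json blobTrigger binding named InputBlob with binary dataType, and the FTP host and credentials are placeholders:

```powershell
# run.ps1 — fires whenever a new blob lands in the intermediary container.
param([byte[]] $InputBlob, $TriggerMetadata)

$name = $TriggerMetadata.Name   # blob name supplied by the trigger

$client = New-Object System.Net.WebClient
$client.Credentials = New-Object System.Net.NetworkCredential('user', $env:FTP_PASSWORD)
$client.UploadData("ftp://ftp.example.com/inbox/$name", $InputBlob)
```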
You can write a Logic App that picks your file up from Azure Storage and sends it to an FTP site, then call the Logic App using a Data Factory Web Activity.
Make sure you do some error handling in your Logic App to return a 400 if the FTP transfer fails.
