Is there any way to know when a new file arrives in Azure File Storage?

I am trying to download a file and perform some action whenever a new file appears in File Storage, but I cannot find any way to do it.
I tried mapping the share as a drive on a VM and using the inotifywait tool, but I never received any notifications.
I also tried a Logic App with SFTP to the file share, but was not able to connect.
I have searched all over the internet and found nothing.
Can anyone suggest an alternative?
Thanks

I am trying to download and perform some action whenever there is a new file in File Storage.
As far as I know, an Azure Files trigger is not supported at the moment; there is also a related request on Azure Feedback.
Can anyone suggest any alternative?
Based on my experience, if Azure Blob Storage is an option, you could use a blob trigger instead. Otherwise, you could use a WebJob timer trigger to periodically check whether there is anything new on the file share.
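To make the timer-polling idea concrete, here is a minimal sketch (my own illustration, not part of the original answer) that polls a file share with the azure-storage-file-share Python package and reports names it has not seen before; the connection string, share name, and polling interval are placeholders.

```python
# Minimal polling sketch, assuming the azure-storage-file-share package
# (pip install azure-storage-file-share). All names below are placeholders.
import time

from azure.storage.fileshare import ShareClient

CONNECTION_STRING = "<storage-connection-string>"  # placeholder
SHARE_NAME = "myshare"                             # placeholder

share = ShareClient.from_connection_string(CONNECTION_STRING, share_name=SHARE_NAME)
seen = set()

while True:
    # List the share root; recurse into subdirectories if you need them too.
    for item in share.list_directories_and_files():
        if not item["is_directory"] and item["name"] not in seen:
            seen.add(item["name"])
            print(f"New file detected: {item['name']}")
            # Download and process the new file here.
    time.sleep(60)  # poll once a minute
```

A WebJob or timer-triggered Function running this loop on a schedule would then download and process each newly detected file.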

Related

Is there a way to know if a file has been uploaded to a network drive in Azure

I have a network location where a CSV file gets dumped every hour, and I need to copy that file to an Azure blob. How do I know that a file has been uploaded to that network drive? Is there something like a file watcher in Azure which monitors this network location? Also, is it possible to copy a file from the network location to an Azure blob through code?
I'm using .NET Core APIs deployed to an Azure App Service.
Please suggest a possible solution.
Azure Event Grid would normally be the answer here, but as of today Event Grid does not support Azure file shares.
Since your file share is on-premises, the only way I see is to write a custom publisher that runs on-premises and sends events to an Event Grid custom topic, with a subscriber (for example an Azure Function) doing the work you want.
https://learn.microsoft.com/en-us/azure/event-grid/custom-event-quickstart-portal
Keep in mind that this only delivers an event, not the file that was added or changed; you would still have to upload the file itself to Azure for processing. Since that approach requires you to do two things, I would instead recommend running custom code on-premises as a cron-like job that looks for new or edited files, uploads them to Azure Blob Storage, and then lets an Azure Function do your processing task.
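As a rough sketch of that recommendation (my own, assuming the azure-storage-blob Python package is acceptable on the on-premises machine), a cron-scheduled script could upload any file that is not yet in the container; the folder, connection string, and container name are placeholders.

```python
# On-prem job (run from cron or Task Scheduler) that uploads new files to
# Blob Storage, assuming the azure-storage-blob package; paths and names
# are placeholders.
import os

from azure.storage.blob import ContainerClient

WATCH_DIR = r"\\server\share\drop"                 # placeholder network folder
CONNECTION_STRING = "<storage-connection-string>"  # placeholder
CONTAINER = "incoming"                             # placeholder

container = ContainerClient.from_connection_string(CONNECTION_STRING, CONTAINER)
already_uploaded = {blob.name for blob in container.list_blobs()}

for name in os.listdir(WATCH_DIR):
    path = os.path.join(WATCH_DIR, name)
    if os.path.isfile(path) and name not in already_uploaded:
        with open(path, "rb") as data:
            container.upload_blob(name=name, data=data)
        # A blob-triggered Azure Function can now pick up and process the file.
        print(f"Uploaded {name}")
```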
Since the files are on-premises, you can use PowerShell to monitor a folder for new files and then fire an action that uploads each file to an Azure blob.
There is a video showing how to do this here: https://www.youtube.com/watch?v=Usih7UywZYA
The changes you need to make are:
- Replace the action with an upload to Azure: https://argonsys.com/microsoft-cloud/library/how-to-upload-files-to-azure-blob-storage-using-powershell-and-azcopy/
- Run PowerShell in the context of a user that can upload files.
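For comparison, here is a rough Python equivalent of the same watch-and-upload idea (not from the video), using the watchdog package for folder events and the azcopy CLI for the upload; both tools are assumed to be installed, and the folder and destination URL with its SAS token are placeholders.

```python
# Rough Python equivalent of the watch-and-upload idea, using the watchdog
# package and the azcopy CLI (both assumed installed); the folder and the
# destination URL with its SAS token are placeholders.
import subprocess
import time

from watchdog.events import FileSystemEventHandler
from watchdog.observers import Observer

WATCH_DIR = r"C:\drop"                                               # placeholder
DEST_URL = "https://myaccount.blob.core.windows.net/incoming?<sas>"  # placeholder


class UploadOnCreate(FileSystemEventHandler):
    def on_created(self, event):
        if not event.is_directory:
            # Push the newly created file to blob storage with azcopy.
            subprocess.run(["azcopy", "copy", event.src_path, DEST_URL], check=True)


observer = Observer()
observer.schedule(UploadOnCreate(), WATCH_DIR, recursive=False)
observer.start()
try:
    while True:
        time.sleep(1)
except KeyboardInterrupt:
    observer.stop()
observer.join()
```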

MarkLogic - Can we configure scheduled backups to Azure Blob?

We want to configure a scheduled backup for our database.
We have set the storage account and access key for Azure Blob under Security -> Credentials for Azure.
In the backup directory field we enter azure://containerName, and this container exists in the given storage account.
In response it says:
The directory azure://backup/ does not exist on host ml01. Would you like to try and create it?
Can anybody please help me configure this?
It sounds like you want to create a job that backs up the data of your MarkLogic database to Azure Blob Storage and is triggered on a schedule. Right? I do not completely understand your setup, so here are just my suggestions.
I'm not familiar with MarkLogic, but after reading the tag info for marklogic I see it supports client APIs for Node.js and Java, so I think you could write a Node.js script or a Java program to do the backup work.
As far as I know, there are normally three ways to deploy it on Azure if you are prepared to do the backup programmatically:
- You can deploy it as a WebJob with a cron expression to trigger the backup work; please refer to the official document Run background tasks with WebJobs in Azure App Service.
- You can deploy it as a Web API on Azure using a service like Web Apps, and use Azure Scheduler to trigger it.
- You can deploy it as an Azure Function with a timer trigger; please refer to the official document Create a function in Azure that is triggered by a timer. A skeleton of this option is sketched below.
Of course, there are other services that can help meet your needs, and I don't know what the best fit for you is. If you have any concerns, please feel free to let me know.
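As a skeleton of the third option (my own sketch, not part of the answer), a timer-triggered Azure Function in Python would look roughly like this; the companion function.json with the CRON schedule is assumed, and run_backup() is a hypothetical placeholder for whatever MarkLogic backup call you choose (for example via its Node.js/Java client or management API), pointed at your azure:// backup directory.

```python
# Skeleton of the timer-triggered Azure Function option (Python). The
# companion function.json is assumed to define a timer binding named
# "mytimer" with a CRON schedule such as "0 0 2 * * *" (02:00 daily).
# run_backup() is a hypothetical placeholder for the actual MarkLogic
# backup call.
import logging

import azure.functions as func


def run_backup() -> None:
    """Hypothetical placeholder for the real MarkLogic backup call."""
    raise NotImplementedError


def main(mytimer: func.TimerRequest) -> None:
    logging.info("Scheduled MarkLogic backup starting")
    run_backup()
    logging.info("Scheduled MarkLogic backup finished")
```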
I was running into a similar issue and was able to resolve it.
Create something inside your container within a folder, so your structure looks like this:
azure://containername/folder
Doing that resolved my issue.

Azure: use a Logic App to load files from File Storage into Blob Storage

I need to use a Logic App to load some CSV files from File Storage in Azure into Blob Storage. Which trigger should I use in the Logic App to access File Storage in Azure? I have tried, for example, the File System connector, but that seems to work only for Windows file shares. What I want to do is check whether there is a new file in File Storage and then load it into a blob. I know there are other ways to achieve this, but I have been assigned the task of looking into the feasibility of doing it with a Logic App.
For now, the File Storage connector has no trigger such as "when a file is added or modified", so you cannot achieve this directly; you could go to the feedback site and request it as a Logic Apps feature.
At the moment you can only copy a specified file to a blob using the Get file content using path and Create blob actions, or use an Azure Function with a timer trigger to move new files to Blob Storage (sketched below).
If you still have other questions, please let me know.
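To illustrate the Azure Function alternative (my own sketch, not part of the answer), the core copy step could list the file share and re-upload each new CSV to Blob Storage with the Python storage SDKs; the connection string, share, and container names are placeholders, and the loop assumes the files sit at the share root.

```python
# Core of the "timer moves new files from the file share to blob" idea,
# assuming the azure-storage-file-share and azure-storage-blob packages;
# all names are placeholders. Run this from a timer-triggered Function
# or WebJob.
from azure.storage.blob import ContainerClient
from azure.storage.fileshare import ShareClient

CONNECTION_STRING = "<storage-connection-string>"  # placeholder
SHARE_NAME = "csvdrop"                             # placeholder
CONTAINER = "csvblobs"                             # placeholder

share = ShareClient.from_connection_string(CONNECTION_STRING, share_name=SHARE_NAME)
container = ContainerClient.from_connection_string(CONNECTION_STRING, CONTAINER)
already_copied = {blob.name for blob in container.list_blobs()}

for item in share.list_directories_and_files():
    name = item["name"]
    if item["is_directory"] or not name.endswith(".csv") or name in already_copied:
        continue
    # Download from the file share and re-upload to blob storage.
    data = share.get_file_client(name).download_file().readall()
    container.upload_blob(name=name, data=data)
    print(f"Copied {name} from the file share to blob storage")
```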

Is there a way to move files in Azure File Storage using only the REST or Python APIs?

I was looking through the Azure File service APIs for Python and for REST, but I didn't see a method to simply move files. I would like to move files in Azure File Storage without mounting a network drive. Is there a way to do this?
As far as I know, your requirement cannot be implemented directly; however, you can combine the copy and delete APIs.
The REST API provides Copy File and Delete File operations, and the Python SDK exposes the corresponding copy and delete methods. Also, an Azure Files share is SMB-compatible, so once it is mounted you should be able to copy files with normal file I/O operations; you could refer to this answer.
If this doesn't meet your requirement or you still have questions, please let me know.
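For illustration only, here is a minimal copy-then-delete "move" using the current azure-storage-file-share Python package (treat the package choice as an assumption, since the question predates it); the connection string, SAS token, and paths are placeholders, and the target directory is assumed to exist.

```python
# "Move" as copy-then-delete within a file share, assuming the current
# azure-storage-file-share package; the connection string, SAS token and
# paths are placeholders, and the target directory is assumed to exist.
import time

from azure.storage.fileshare import ShareClient

CONNECTION_STRING = "<storage-connection-string>"  # placeholder
SAS_TOKEN = "<sas-token-with-read-access>"         # placeholder

share = ShareClient.from_connection_string(CONNECTION_STRING, share_name="myshare")
source = share.get_file_client("incoming/report.csv")
target = share.get_file_client("archive/report.csv")

# Server-side copy; wait for it to finish before deleting the source.
target.start_copy_from_url(f"{source.url}?{SAS_TOKEN}")
while target.get_file_properties().copy.status == "pending":
    time.sleep(1)

source.delete_file()
```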

Can Azure Data Factory write to FTP?

I want to write the output of a pipeline to an FTP folder. ADF seems to support on-premises file systems but not FTP folders.
How can I write the output in text format to an FTP folder?
Unfortunately, FTP servers are not a supported data store for ADF as of right now, so there is no out-of-the-box way to interact with an FTP server for either reading or writing.
However, you can use a custom activity to make it possible, although it will require some custom development. A fellow Cloud Solution Architect within Microsoft put together a blog post that describes how he did it for one of his customers. Please take a look at the following:
https://blogs.msdn.microsoft.com/cloud_solution_architect/2016/07/02/creating-ftp-data-movement-activity-for-azure-data-factory-pipeline/
I hope that this helps.
On reflection, you might be able to achieve what you want in a mildly convoluted way by writing the output to an Azure Blob storage account and then either:
1) manually downloading the file from the Blob storage account and pushing it to the "FTP" site, or
2) automatically using the Azure CLI to pull the file locally and then pushing it to the "FTP" site with a batch or shell script, as appropriate.
As a lighter-weight alternative to custom activities (which are certainly the better option for heavy work), you may wish to consider using Azure Functions to write to FTP (note that there is a timeout on the Consumption plan but not on other plans, so it will depend on how big the files are).
https://learn.microsoft.com/en-us/azure/azure-functions/functions-create-storage-blob-triggered-function
You could instruct Data Factory to write to intermediary blob storage and use a blob storage trigger in Azure Functions to upload the files to the FTP server as soon as they appear in blob storage (a sketch follows below).
Alternatively, write to blob storage and then use a timer in Logic Apps to upload from blob storage to FTP. Logic Apps hide a tremendous amount of power behind their friendly exterior.
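To make the blob-trigger-to-FTP suggestion concrete, here is a hedged Python sketch (mine, not the answerer's): a blob-triggered Azure Function that pushes each incoming blob to an FTP server with ftplib; the function.json blob binding and the FTP host and credentials are placeholders.

```python
# Sketch of a blob-triggered Function that pushes new blobs to an FTP server.
# Assumes a function.json blob-trigger binding named "inputblob"; the FTP
# host and credentials are placeholders and should come from app settings.
import ftplib
import io
import os

import azure.functions as func

FTP_HOST = "ftp.example.com"  # placeholder
FTP_USER = "user"             # placeholder
FTP_PASS = "password"         # placeholder


def main(inputblob: func.InputStream) -> None:
    # inputblob.name looks like "container/filename.txt"; keep just the file name.
    file_name = os.path.basename(inputblob.name)
    with ftplib.FTP(FTP_HOST, FTP_USER, FTP_PASS) as ftp:
        # storbinary streams the blob contents to the FTP server.
        ftp.storbinary(f"STOR {file_name}", io.BytesIO(inputblob.read()))
```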
You can write a Logic App that picks your file up from Azure Storage and sends it to an FTP site, and then call the Logic App using a Data Factory Web Activity.
Make sure you do some error handling in your Logic App so that it returns a 400 if the FTP transfer fails.
