How to create a simple dashboard from Azure Storage and Fileshare

1. I have an Azure storage account.
2. Inside a container, with a client-specific folder structure, some files get pushed every morning.
3. I have a function app which processes and converts these files and calls an external service to work on the processed files.
4. I also have a file share, which is mounted on a VM.
5. The external service, after processing the files (#3), generates the resulting success/failure files inside this file share (#4).
Now the ask is:
Create a simple dashboard that monitors the storage account (and, in effect, the container and the file share). It should capture and show basic information, laid out like the table below (with three simple variations of data):
FileName|ReceivedDateTime|NumberOfRecords
Original_file.csv|20221011 5:21 AM|10
Original_file_Success.csv|20221011 5:31 AM|9
Original_file_Failure.csv|20221011 5:32 AM|1
Here the first record is captured from the container, while the second and third are generated in the file share.
Also, whenever a new failure file is generated, i.e. Original_file_Failure, it should send an email based on a predefined template, with the file name added, to a predefined recipient list.
Any guidance on which Azure service to use?
I have looked at Azure Monitor, Workbooks and other options, but I feel those would be overkill for such a simple requirement.
Thanks in advance.
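To make the requirement concrete, this is roughly the logic such a dashboard job would have to run; a minimal sketch in Python against the storage SDKs (purely illustrative: the connection strings, container/share/folder names and mail settings are placeholders, and record counts would come from parsing the files):

```python
import smtplib
from email.message import EmailMessage

from azure.storage.blob import ContainerClient
from azure.storage.fileshare import ShareClient

container = ContainerClient.from_connection_string("<conn-string>", "incoming")
share = ShareClient.from_connection_string("<conn-string>", "results")

rows = []  # (FileName, ReceivedDateTime, NumberOfRecords)

# First table row: the original file landing in the container.
for blob in container.list_blobs(name_starts_with="client-a/"):
    # NumberOfRecords would come from parsing the CSV itself.
    rows.append((blob.name, blob.last_modified, None))

# Second and third rows: success/failure files generated in the file share.
for item in share.list_directories_and_files("client-a"):
    name = item["name"]
    # ReceivedDateTime/NumberOfRecords would need a properties call
    # and a parse of the file, respectively.
    rows.append((name, None, None))
    if "_Failure" in name:
        # New failure file -> notify the predefined recipient list.
        msg = EmailMessage()
        msg["Subject"] = f"Failure file generated: {name}"
        msg["From"] = "alerts@example.com"
        msg["To"] = "ops-team@example.com"
        msg.set_content(f"A failure file was generated: {name}")
        with smtplib.SMTP("smtp.example.com") as smtp:
            smtp.send_message(msg)
```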

Related

How to Trigger an email in the Event that a Logic App Trigger is skipped?

I have the below scenario:
I have a Logic App which gets triggered once every day (24 hours).
It looks at an SFTP location; if there is a file dropped in there, it pulls the file, pushes it into blob storage and then deletes it from the source (SFTP).
I need to trigger an email in either of these events:
1. The trigger is "Skipped", i.e. it ran but could not find any file on the SFTP server.
2. The upload to blob storage failed.
Is it possible to enable an email trigger in the above scenarios (1 and 2)?
Any guidance will be appreciated, as I am new to the IaC space.
Thanks in advance.
Firstly, you can list the files on the SFTP server and pass each name on to fetch its contents, using the List files in folder and Get file content actions of the SFTP connector.
If the trigger is "Skipped", i.e. it ran but could not find any file on the SFTP server:
For this, in the next step you can use a condition action to check whether a file has been uploaded that day, by comparing the last modified time with the current date. If yes, create a file in blob storage with the file contents from the Get file content step.
If it failed to upload to blob storage:
For this you can create another condition action and check whether the blob was created by using actions('Create_blob_(V2)')['outputs']['statusCode'].
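If you'd rather see the same two checks outside the designer, here is a rough Python equivalent of the flow (an assumption on my part, not the Logic App itself; host names, credentials and addresses are placeholders):

```python
import smtplib
from email.message import EmailMessage

import paramiko
from azure.storage.blob import BlobServiceClient

def notify(subject: str, body: str) -> None:
    # Send a plain-text alert to a fixed recipient (placeholder addresses).
    msg = EmailMessage()
    msg["Subject"] = subject
    msg["From"] = "alerts@example.com"
    msg["To"] = "ops-team@example.com"
    msg.set_content(body)
    with smtplib.SMTP("smtp.example.com") as smtp:
        smtp.send_message(msg)

def run() -> None:
    transport = paramiko.Transport(("sftp.example.com", 22))
    transport.connect(username="user", password="secret")
    sftp = paramiko.SFTPClient.from_transport(transport)
    try:
        names = sftp.listdir("/inbound")
        if not names:
            # Case 1: the daily run found nothing -- the "Skipped" scenario.
            notify("SFTP pickup skipped", "No file found in /inbound today.")
            return
        container = BlobServiceClient.from_connection_string(
            "<conn-string>"
        ).get_container_client("landing")
        for name in names:
            with sftp.open(f"/inbound/{name}", "rb") as f:
                data = f.read()
            try:
                container.upload_blob(name=name, data=data)
            except Exception as exc:
                # Case 2: the upload to blob storage failed.
                notify("Blob upload failed", f"{name}: {exc}")
                continue
            sftp.remove(f"/inbound/{name}")  # delete from source only after upload
    finally:
        transport.close()
```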

Azure az copy command complete event grid notification

Is there a way for Azure Event Grid to trigger when an az copy command completes?
We have clients that use az copy to transfer hundreds of files and subfolders into our Azure storage. The number of files is variable, and the az copy command points at a single root folder on their local machine containing those files and subfolders.
We want to raise an event grid notification when the az copy is complete and successful.
An alternative would be to have a second az copy command in a batch file that transfers a single flag file once the initial command has completed successfully. We would then monitor for this single file as the signal to proceed with further processing.
Perhaps if az copy cannot raise the event, then it can add a verification file signaling the end of the transfer?
You can have Event Grid notifications on an individual blob (or directory, when using ADLS). azcopy essentially creates individual blobs, so you'd get individual notifications. Azure storage doesn't provide a transactional batch of uploads, so you can't get a single notification.
If you wanted a single notification, you'd have to manage this yourself. You mentioned a "flag" file, but you could also create custom topics, use an Azure Function, a Service Bus message, etc. How you ultimately implement this is up to you (and your clients that are uploading content), but tl;dr no, you can't get a single completion event for a batch of uploads.
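For example, the flag-file variant can be as small as a wrapper script that uploads a sentinel blob only when azcopy exits cleanly; a sketch, assuming Python and the azure-storage-blob package (paths, account name and SAS token are placeholders):

```python
import subprocess
from azure.storage.blob import BlobClient

# Run the bulk transfer exactly as the clients do today.
result = subprocess.run(
    [
        "azcopy", "copy",
        r"C:\outbound\batch-root",
        "https://myaccount.blob.core.windows.net/landing?<sas-token>",
        "--recursive",
    ],
    check=False,
)

if result.returncode == 0:
    # azcopy finished successfully: drop the flag blob that an Event Grid
    # subscription (e.g. with a subject filter ending in "_COMPLETE")
    # is watching for, yielding exactly one notification per batch.
    flag = BlobClient.from_blob_url(
        "https://myaccount.blob.core.windows.net/landing/batch-root/_COMPLETE?<sas-token>"
    )
    flag.upload_blob(b"", overwrite=True)
```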
I have tried to reproduce this in my environment and got the notification successfully.
For an Event Grid notification, the endpoint that receives it lives in a function app or a logic app; I created the endpoint in a function app.
In your function app, go to Functions -> Create -> select Azure Event Grid trigger -> Create.
Once created, create an event subscription in the storage account.
When you select the endpoint, the function is shown on the right side by default; confirm the selection and create the subscription.
Once azcopy is complete and the files are uploaded to the container, you will get a notification for each blob.
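For reference, the body of such an Event Grid-triggered function is tiny; a sketch in Python (assuming the v1 programming model with an eventGridTrigger binding named event in function.json):

```python
import logging
import azure.functions as func

def main(event: func.EventGridEvent) -> None:
    # Each blob uploaded by azcopy raises one Microsoft.Storage.BlobCreated event.
    logging.info("Event type: %s", event.event_type)
    logging.info("Subject: %s", event.subject)
    logging.info("Blob URL: %s", event.get_json().get("url"))
```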

Azure Storage account with a container of hierarchical folders: create an automated email if a file is not uploaded to selected folders

I'm pretty new to the Azure world; I have tried googling this but haven't found a good way to go about it. Let me describe the problem.
I have a storage account in Azure. In the container we store various data files. The files are stored in a tree hierarchy of folders (Parent -> year -> month -> day). Each day, new files get uploaded to the specific day folder. If the file for a specific day is not uploaded, I would like to send an email notification.
Please let me know if you have an idea of how I can get this done.
I have managed to do this: basically, use a Logic App to monitor the storage account and trigger an email when a blob is added.
Is there a better way to do this? I would like the logic to be: for specific folders in the container, if there is no file by, let's say, 7 PM every day, then send an email.
Try this. Here is a short explanation of this simple logic:
1. Create a schedule (Recurrence trigger) that fires every day at 7 PM.
2. Use List blobs to check whether there is any file in that day's folder, e.g. /2021/03/17. The path for each day's check can be built with the expression concat('/', utcNow('yyyy/MM/dd')).
3. In Azure Storage, if no file has been uploaded to a path, that path does not exist, and List blobs will fail.
4. Send an email to someone if List blobs fails; set "run after" on the email action so that it runs only after List blobs has failed.
I have tested this on my side and everything works as expected.
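If you ever outgrow the designer, the same check is easy to express in code; a rough Python equivalent of steps 1-4 above (the connection string, container name and mail helper are placeholders):

```python
from datetime import datetime, timezone
from azure.storage.blob import ContainerClient

def send_alert_email(path: str) -> None:
    # Placeholder for whatever mail mechanism you use (SMTP, Graph, ...).
    print(f"ALERT: no file uploaded under {path} today")

container = ContainerClient.from_connection_string("<conn-string>", "mycontainer")

# Build today's path the same way as the concat(...) expression above.
prefix = datetime.now(timezone.utc).strftime("%Y/%m/%d/")

# Listing with a prefix yields an empty iterator when the virtual folder
# does not exist -- the "List blobs fails" case in the Logic App.
if not any(container.list_blobs(name_starts_with=prefix)):
    send_alert_email(prefix)
```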

Copying files in fileshare with Azure Data Factory configuration problem

I am trying to learn to use Azure Data Factory to copy data (a collection of CSV files in a folder structure) from an Azure file share to a Cosmos DB instance.
In Azure Data Factory I'm creating a "copy data" activity and trying to set my file share as the source using the following host:
mystorageaccount.file.core.windows.net\\mystoragefilesharename
When trying to test the connection, I get the following error:
[{"code":9059,"message":"File path 'E:\\approot\\mscissstorage.file.core.windows.net\\mystoragefilesharename' is not supported. Check the configuration to make sure the path is valid."}]
Should I move the data to another storage type, like a blob, or am I not entering the correct host URL?
You'll need to specify the host as "\\\\myserver\\share" (backslashes escaped) if you create the pipeline with JSON directly, or as \\myserver\share if you're using the UI to set up the pipeline.
Here is more info:
https://learn.microsoft.com/en-us/azure/data-factory/connector-file-system#sample-linked-service-and-dataset-definitions
I believe when you created the file linked service, you might have chosen the public IR. With the public IR, a local path (e.g. C:\xxx, D:\xxx) is not allowed, because the machine that runs your job is managed by Microsoft and does not contain any customer data. Please use a self-hosted IR to copy your local files.
Based on the link posted by Nicolas Zhang (https://learn.microsoft.com/en-us/azure/data-factory/connector-file-system#sample-linked-service-and-dataset-definitions) and the examples provided therein, I was able to solve it and successfully create the copy activity. I had two errors (I'm configuring via the Data Factory UI, not directly in JSON):
1. In the host path, the correct value should be: \\mystorageaccount.file.core.windows.net\mystoragefilesharename\myfolderpath
2. The username and password must be those corresponding to the storage account, not the actual user's account, which I was erroneously using.

Creating a folder using Azure Storage Rest API without creating a default blob file

I want to create following folder structure on Azure:
mycontainer
-images
--2007
---img001.jpg
---img002.jpg
Now, one way is to use a PUT Blob request and upload img001.jpg specifying the whole path, as in
PUT "mycontainer/images/2007/img001.jpg"
But I want to first create the folders images and 2007, and then upload the blob img001.jpg in a separate request.
Right now, when I try to do this using a PUT Blob request:
StringToSign:
PUT
x-ms-blob-type:BlockBlob
x-ms-date:Tue, 07 Feb 2017 23:35:12 GMT
x-ms-version:2016-05-31
/account/mycontainer/images/
HTTP URL
sun.net.www.protocol.http.HttpURLConnection:http://account.blob.core.windows.net/mycontainer/images/
It creates a folder, but the folder is not empty: by default an empty blob file without a name is created.
Now, a lot of people say we can't create an empty folder. But then how come we can create one using the Azure portal? The browser must be sending some kind of REST request to create the folder.
I think it has something to do with the Content-Type, i.e. x-ms-blob-content-type, which should be specified in order to tell Azure that it's a folder, not a blob.
But I am confused.
I want to first create the folders images and 2007 and then in a different request upload the blob img001.jpg
I agree with Brendan Green: currently, Azure Blob storage only enables us to create a virtual directory structure by naming blobs with path information in their names.
I think it has something to do with the Content-Type, i.e. x-ms-blob-content-type, which should be specified in order to tell Azure that it's a folder, not a blob. But I am confused.
You could check the description of the request headers that can be set for the Put Blob operation; you will find that no combination of headers makes it create an empty folder.
Besides, as Gaurav Mantri said, if you really want to create an empty folder structure without content, you could try Azure File storage, which also offers a REST API. Its Create Directory operation can be used to create a new directory under the specified share or parent directory:
PUT https://myaccount.file.core.windows.net/myshare/myparentdirectorypath/mydirectory?restype=directory
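If you are calling it from code rather than raw REST, the same operation is exposed by the azure-storage-file-share SDK; a small sketch (connection string, share and directory names are placeholders):

```python
from azure.storage.fileshare import ShareDirectoryClient

# Point the client at the directory path you want to create.
directory = ShareDirectoryClient.from_connection_string(
    conn_str="<conn-string>",
    share_name="myshare",
    directory_path="myparentdirectorypath/mydirectory",
)
directory.create_directory()  # maps to the Create Directory REST operation
```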
This is not possible - the folder structure is virtual only.
See Get started with Azure Blob storage using .NET. You can only create a container, and everything else held in that container is a blob.
Excerpt:
As shown above, you can name blobs with path information in their names. This creates a virtual directory structure that you can organize and traverse as you would a traditional file system. Note that the directory structure is virtual only - the only resources available in Blob storage are containers and blobs.
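To round this off, a short sketch of the virtual-directory behaviour the excerpt describes: uploading a blob whose name contains slashes is all it takes for the "folders" to appear (connection string and names are placeholders):

```python
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string("<conn-string>")
container = service.get_container_client("mycontainer")

# No separate "create folder" call: images/ and images/2007/ exist as soon
# as this blob does, and disappear when the last blob under them is deleted.
with open("img001.jpg", "rb") as data:
    container.upload_blob(name="images/2007/img001.jpg", data=data)
```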
