Access blob storage through Azure Logic Apps

I have Azure blob storage, which contains some CSV files.
My task:
1. Create a logic app for the blob storage.
2. Retrieve the data from the blob storage.
3. Convert the retrieved file to JSON.
4. Upload that data to an online portal through an API.
I've tried retrieving the data with the "Get blob content" action, but I'm not sure where to see the result. I have just created the logic app, and I'm stuck on what to do next.

Regarding "I've tried retrieving the data with the 'Get blob content' action, but not sure where to see the result":
When you run the logic app, you can see the content of your CSV file in the OUTPUTS of the run history.
If you want to use an API to upload the data, add an HTTP action that sends the content to your endpoint (the exact setup also depends on the type of your API's request body).
Since you mentioned your Azure blob storage contains several CSV files, you may need to loop over them: use the "List blobs" action with a "For each" action, and get each blob's content inside the loop.
Hope this helps; if you have any further problems, please let me know.
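For step 3 (CSV to JSON), one common pattern is to have the logic app call a small Azure Function that does the conversion, since the designer has no obvious built-in CSV parser. A minimal sketch, assuming the CSV arrives as a plain-text request body with a header row, comma separators, and no quoted fields:

// Sketch of an Azure Function the logic app could call for the CSV-to-JSON step.
// Assumes a plain-text body with a header row and simple comma-separated values.
module.exports = async function (context, req) {
    const text = req.rawBody || req.body;                  // the CSV text
    const lines = text.trim().split(/\r?\n/);
    const headers = lines[0].split(",").map(h => h.trim());
    const rows = lines.slice(1).map(line => {
        const values = line.split(",");
        return Object.fromEntries(headers.map((h, i) => [h, values[i]]));
    });
    context.res = { body: rows };                          // JSON array back to the logic app
};

The logic app can then pass this output to the HTTP action that calls your portal's API.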

Related

Uploading a string value to Azure storage as file content

I want to upload a string value as file content to an Azure storage location, but without writing any code, using Azure components only. Is there any option available?
There are a few ways to do this:
1. Write the string into a .txt file and upload the file to a storage container through the Azure Portal.
2. Generate a long-lifetime SAS token in the Azure Portal, then use the blob REST API to upload the string to a blob; you can do this directly in Postman (a code sketch follows this answer).
3. Use an Azure Logic App to do this.
Let me know if you have any more questions.
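For option 2, the Put Blob call that Postman makes can be issued from any HTTP client. A minimal sketch in JavaScript; the account, container, blob name, and SAS token are all placeholders:

// Sketch: upload a string as a block blob via the Put Blob REST API and a SAS token.
const sasUrl = "https://<account>.blob.core.windows.net/<container>/hello.txt?<sas-token>";

async function uploadString(value) {
    const response = await fetch(sasUrl, {
        method: "PUT",
        headers: {
            "x-ms-blob-type": "BlockBlob",   // required header for Put Blob
            "Content-Type": "text/plain"
        },
        body: value                          // the string becomes the file content
    });
    if (!response.ok) throw new Error(`Upload failed: ${response.status}`);
}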

Is there a way to pass only the appended blob content to a blob-triggered Azure Function?

I am trying to make a blob-triggered Azure Function for log files, but the problem is that it receives the entire blob content whenever any blob is created or updated.
So I am wondering: is there a way to get only the appended blob content?
module.exports = async function main(context, myBlob) {
    // I am using JavaScript; myBlob contains the entire content of a single blob.
    // For logs in an append blob, this results in duplicated logs, which is not ideal.
};
So I am wondering is there a way to only get the appended blob content?
No, unless you:
1. Maintain the byte index/position per log file where you last read, some place persistent (e.g. a file, a DB, any persistent storage, or a Durable Function).
2. On each change notification, find the last byte index/position and read starting from that location using the appropriate SDK/API. The REST API for ADLS Gen2 describes how to read a byte range out of a file (find the right one if you're using Gen1 or Blob storage); a sketch of the pattern follows below.
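Here is a sketch of that bookkeeping using the JavaScript storage SDK (@azure/storage-blob); loadOffset and saveOffset are hypothetical stand-ins for whatever persistent store holds the last-read position:

// Sketch: read only the bytes appended since the last run, using @azure/storage-blob.
// loadOffset and saveOffset are placeholders for your own persistence layer
// (file/DB/Durable Function state).
const { BlobServiceClient } = require("@azure/storage-blob");

async function readNewLogEntries(connectionString, container, blobName) {
    const blobClient = BlobServiceClient.fromConnectionString(connectionString)
        .getContainerClient(container)
        .getBlobClient(blobName);

    const offset = await loadOffset(blobName);              // last byte already read
    const { contentLength } = await blobClient.getProperties();
    if (contentLength <= offset) return "";                 // nothing new appended

    // downloadToBuffer(offset, count) issues an HTTP range read under the hood
    const buffer = await blobClient.downloadToBuffer(offset, contentLength - offset);
    await saveOffset(blobName, contentLength);              // remember the new position
    return buffer.toString("utf8");
}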

Copy files from Azure blob to Sharepoint folder using Microsoft Flow

I want to copy files from Azure blob storage to a SharePoint folder using Microsoft Flow. I have tried several times, and the flow always fails when it runs.
I have attached the flow that I'm currently trying to execute.
Can someone help me with this?
For your problem, please refer to the logic below (I have uploaded a testcsv.csv file to the blob storage):
After the trigger "When a blob is added or modified", use the "Get blob content" action to get the content of the CSV file. Then add SharePoint's "Create file" action and put the file content we got from the blob into the "File Content" box.
By the way, since it is a CSV file, my blob storage container only has one file by default. If there is more than one file in your blob storage, you can use the "List blobs" action with a "For each" loop and then create each of the files in SharePoint.
I tried that; however, it failed because it does not handle the case where you have a folder structure in the blob container and you'd like to mirror that structure in SharePoint, copying individual files into folders.
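If the designer can't express what you need (for example, the folder-mirroring case in the comment above), one alternative, which is code rather than Flow, is to call the Microsoft Graph drive API yourself, where the target folder path is simply part of the upload URL. A rough sketch; the site ID, drive ID, folder path, and token handling are all placeholders, and this simple upload endpoint only handles files under about 4 MB:

// Sketch: upload blob content into a nested SharePoint folder via Microsoft Graph.
async function uploadToSharePoint(accessToken, folderPath, fileName, content) {
    const url = "https://graph.microsoft.com/v1.0/sites/<site-id>/drives/<drive-id>" +
                `/root:/${folderPath}/${fileName}:/content`;
    const response = await fetch(url, {
        method: "PUT",
        headers: {
            "Authorization": `Bearer ${accessToken}`,
            "Content-Type": "application/octet-stream"
        },
        body: content
    });
    if (!response.ok) throw new Error(`Upload failed: ${response.status}`);
    return response.json();   // DriveItem metadata for the created file
}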

Can you output an Azure Logic App variable to a file and store it in blob storage?

I have searched Google and MSDN, and it's not clear whether you can write a variable to blob storage. Searching the available steps/actions does not yield anything obvious either.
I have constructed an array variable of file names from an SFTP server per the following documentation, but I can't figure out whether it can be stored or saved in any capacity.
Right now it seems these variables are essentially internal to the logic app and can't be made external, or is there a way to export them?
https://learn.microsoft.com/en-us/azure/logic-apps/logic-apps-create-variables-store-values
If you simply want to save the variable's value in a blob, you can do so with the Azure Blob Storage "Create Blob" action.
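For comparison, the same write outside the designer is a few lines with the JavaScript storage SDK; the container and blob names here are made up:

// Sketch: the code equivalent of the "Create Blob" action, using @azure/storage-blob.
// The variable's value is serialized to JSON text before upload.
const { BlobServiceClient } = require("@azure/storage-blob");

async function saveVariableToBlob(connectionString, value) {
    const content = JSON.stringify(value);                  // e.g. the array of file names
    await BlobServiceClient.fromConnectionString(connectionString)
        .getContainerClient("logicapp-output")              // placeholder container
        .getBlockBlobClient("filenames.json")               // placeholder blob name
        .upload(content, Buffer.byteLength(content));
}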

Verifying CloudBlob.UploadFromStream completed with no errors?

I want to save files users upload to my site into my Azure blob storage, and I am using the CloudBlob.UploadFromStream method to do so, but I want to make sure the file finished saving to the blob without problems before doing some more work. Currently I just upload the blob and then check whether a reference to the new blob exists using GetBlockBlobReference inside an if statement. Is there a better way to verify that the upload completed fine?
If there's any problem while uploading the blob, the CloudBlob.UploadFromStream method will throw an exception, so that is the first place to check whether the upload went fine.
I don't think getting a reference to the blob with GetBlockBlobReference does you any good, as it just creates an instance of CloudBlockBlob; it doesn't check whether the blob exists in storage. If you want to check that the blob exists in storage, you could either fetch the blob's attributes using the CloudBlockBlob.FetchAttributes method or create an instance of CloudBlob using CloudBlobContainer.GetBlobReferenceFromServer or CloudBlobClient.GetBlobReferenceFromServer. All three methods fetch information about the blob from storage and throw appropriate errors if something is not right (e.g. a Not Found error if the blob does not exist).
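For illustration, the same verification idea in the JavaScript SDK (@azure/storage-blob), which is not the .NET SDK the question uses: rely on the upload call throwing on failure, with getProperties() as an optional explicit server round trip:

// Sketch: upload-then-verify in @azure/storage-blob. uploadData throws if the
// upload fails; getProperties makes a server round trip and throws a Not Found
// error if the blob is absent.
const { BlobServiceClient } = require("@azure/storage-blob");

async function uploadAndVerify(connectionString, container, name, buffer) {
    const blobClient = BlobServiceClient.fromConnectionString(connectionString)
        .getContainerClient(container)
        .getBlockBlobClient(name);

    await blobClient.uploadData(buffer);        // throws on any upload problem
    const props = await blobClient.getProperties();
    return props.contentLength === buffer.length;
}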
