I've created an Azure Logic App with a trigger that checks every 3 hours for new files created in a OneDrive folder; the action copies the file from that folder to an Azure Storage account container. The app works perfectly while testing, but otherwise it only works some of the time.
The OneDrive folder contains large files (meeting recordings) that may be up to 1.5 GB; I'm not sure whether the size matters. Please advise: is there any other way to copy these files to the storage account apart from the Logic App, or how can I fix this issue?
I have reproduced this in my environment and hit the same issue:
The OneDrive connector in the Standard plan allows only 500 MB of data to be transferred per file, so your requirement of transferring a 1.5 GB file is not currently possible using Logic Apps.
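If the connector limit is the blocker, one alternative is a small piece of code that streams the file straight from OneDrive into Blob Storage. Below is a minimal C# sketch, assuming you have already obtained a direct download URL for the OneDrive file (for example, from the Microsoft Graph API's @microsoft.graph.downloadUrl property) and a storage connection string; the container and blob names are hypothetical.

```csharp
using System;
using System.Net.Http;
using Azure.Storage;
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;

// Hypothetical inputs: a pre-authenticated OneDrive download URL and a connection string.
string downloadUrl = "<onedrive-download-url>";
string connectionString = Environment.GetEnvironmentVariable("STORAGE_CONNECTION");

using var http = new HttpClient();
// Stream the file instead of buffering 1.5 GB in memory.
using var source = await http.GetStreamAsync(downloadUrl);

var blob = new BlobClient(connectionString, "recordings", "meeting-recording.mp4");

// Upload in 8 MB blocks; the SDK stages and commits the blocks for us.
await blob.UploadAsync(source, new BlobUploadOptions
{
    TransferOptions = new StorageTransferOptions
    {
        InitialTransferSize = 8 * 1024 * 1024,
        MaximumTransferSize = 8 * 1024 * 1024
    }
});

Console.WriteLine($"Uploaded to {blob.Uri}");
```

Running this in an Azure Function or a small console job avoids the connector's per-file size ceiling entirely.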
I have a classic ASP.NET MVC project that needs to be migrated to Azure and hosted in App Service. Currently this project saves files under the site root, and file sizes can be up to 2 GB.
Now the question is: should I keep the current logic and store files in the wwwroot folder (e.g. \wwwroot\Files\myfile.txt), or should I store them in a blob?
I am looking for the best practice and would prefer not to change the current logic. Can someone give me an idea?
Thanks
Storing files in Azure Blob Storage:
According to the documentation:
Azure Blob Storage enables the creation of data lakes for analytics purposes and provides storage for the development of powerful cloud-native and mobile apps. Reduce costs by using tiered storage for long-term data and scalability for high-performance computing and machine learning workloads.
The documentation also says:
SAS enables you to securely upload and download files from Azure Blob Storage without having to share the connection string.
While uploading, you can split a large file into small blocks, which decreases upload time; after the upload completes, the blocks are committed back into a single blob.
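As an illustration of the SAS approach, here is a minimal C# sketch using the Azure.Storage.Blobs SDK. The container and blob names are hypothetical, and GenerateSasUri requires the client to be authorized with a shared key credential (for example, constructed from a connection string):

```csharp
using System;
using Azure.Storage.Blobs;
using Azure.Storage.Sas;

string connectionString = Environment.GetEnvironmentVariable("STORAGE_CONNECTION");

// Hypothetical container/blob names.
var blob = new BlobClient(connectionString, "files", "myfile.txt");

// A short-lived read+write SAS URL that a client can use to upload or
// download directly, without ever seeing the connection string.
Uri sasUri = blob.GenerateSasUri(
    BlobSasPermissions.Read | BlobSasPermissions.Write,
    DateTimeOffset.UtcNow.AddHours(1));

Console.WriteLine(sasUri);
```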
Storing file in wwwroot folder:
According to the documentation:
Static resource files are stored in the web root; the default directory is {content root}/wwwroot.
The documentation also makes clear that storage capacity depends on the App Service pricing tier.
Web app performance can be affected by uploading large files.
In Azure Linux web apps, if the file size is around 2 GB, the upload might lead to a TimeoutException.
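Given those limits, storing uploads in blob storage rather than wwwroot is usually the safer pattern. Here is a minimal sketch of an upload action that writes to a blob instead of the local file system; it is written in ASP.NET Core style (a classic MVC controller would use HttpPostedFileBase instead of IFormFile), and all names are hypothetical:

```csharp
using System.Threading.Tasks;
using Azure.Storage.Blobs;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;

public class FilesController : Controller
{
    private readonly BlobContainerClient _container;

    public FilesController(BlobContainerClient container) => _container = container;

    [HttpPost]
    public async Task<IActionResult> Upload(IFormFile file)
    {
        await _container.CreateIfNotExistsAsync();
        var blob = _container.GetBlobClient(file.FileName);

        // Stream straight from the request to blob storage; nothing is
        // written to the App Service file system.
        using var stream = file.OpenReadStream();
        await blob.UploadAsync(stream, overwrite: true);

        return Ok(blob.Uri.ToString());
    }
}
```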
I need to use a Logic App to load some CSV files from a file storage in Azure to a blob storage. What trigger should I use in the Logic App to access the file storage in Azure?
The files are quite large, up to 1 GB, and I'd like to be able to send them to an FTP server or to a RESTful endpoint for upload (using, for example, the PUT verb).
Are Logic Apps able to do this, or would it be better to use Azure Functions? Any resources or help pointing me in the right direction would be useful.
For your question about which trigger to use in the Logic App, it depends on your requirements. If you want the Logic App to be triggered periodically, you can add a "Recurrence" schedule trigger. If you want to trigger it manually, you can add a Request trigger, and then trigger the Logic App by calling its request URL.
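If you go with the Request trigger, the designer shows a callback URL that you can POST to from any client. A minimal C# sketch, where the URL and JSON payload are hypothetical placeholders:

```csharp
using System.Net.Http;
using System.Text;

// Hypothetical callback URL copied from the Request trigger in the designer.
string callbackUrl = "https://prod-00.westus.logic.azure.com/workflows/...";

using var http = new HttpClient();
var payload = new StringContent("{\"source\":\"files\"}", Encoding.UTF8, "application/json");
HttpResponseMessage response = await http.PostAsync(callbackUrl, payload);
response.EnsureSuccessStatusCode();
```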
For your concern about whether a Logic App can do this, I'm a little confused about what you want the Logic App to do: load CSV files from Azure File Storage to Blob Storage, or load CSV files from Blob Storage to FTP? Both can be implemented with a Logic App as long as your files don't exceed its limits.
The "Azure File Storage" connector has general limits below:
The "Azure Blob Storage" connector also has some general limits, shown as below:
Ftp connector's limits are shown as below:
According to the two screenshots above, if your 1 GB files are lots of small files(the number of list blobs can't exceed 5000), your requirements can be implemented in logic app.
If you want to load files from azure file storage to blob storage(your files don't exceed the limits above), you can refer to the logic app below:
If you want to load files from azure blob storage to ftp(your files don't exceed the limits above), you can refer to the logic app below:
By the way, I think it is necessary to mention the price of Logic Apps. They are billed by the number of action executions; you can learn more about Logic Apps pricing from this link. So if you have many files, leading to many action executions in your Logic App, you should compare the cost of a Logic App against an Azure Function. A Function may be cheaper than a Logic App.
I am trying to upload a file to Azure Data Lake using the Azure Data Lake "Upload File" action in Logic Apps. It works fine for small files of about 20 MB, but files of 28 MB or greater fail with status code 413 (Request Entity Too Large).
I have also enabled chunking in the Upload File action. Is there any solution for this?
Thanks for the response, George.
I have found a workaround. My scenario involves getting a file from SharePoint Online and uploading it to Azure Data Lake. In the earlier setup that had the issue above, I was using the SharePoint trigger "When a file is created or modified in a folder", which returns the file content, to get the file from SharePoint, and the Data Lake "Upload File" action to upload it to Azure Data Lake. This setup failed for files larger than 27 MB (Request Entity Too Large, 413) in the Upload File action, even with chunking enabled on that action.
After some troubleshooting, I arrived at a workaround that uses another SharePoint trigger, "When a file is created or modified in a folder (properties only)", which returns metadata instead of file content. After getting the metadata, I used the SharePoint "Get file content" action to fetch the file content and upload it to Azure Data Lake, which worked fine.
Logic Apps has limits on message size; for the specific numbers, see Logic Apps limits and configuration.
However, actions that support chunking can access message content beyond that limit, so you just need to turn "Allow chunking" on.
I tested with a 40 MB blob file and it succeeded. For more information, you can refer to this doc: Handle large messages with chunking in Azure Logic Apps. Hope this helps.
I need to use a Logic App to load some CSV files from a file storage in Azure to a blob storage. What trigger should I use in the Logic App to access the file storage in Azure? I have tried, for example, the File System connector, but that seems to work only for Windows file shares. What I want to do is check whether there is a new file in the file storage and then load it to the blob. I know there are other ways to achieve this, but I have been assigned the task of looking into the feasibility of doing this with a Logic App.
For now, the File Storage connector has no trigger like "when a file is added or modified", so you cannot achieve this directly. You could go to the feedback site and ask for this Logic Apps feature.
Currently, you can only copy a specified file to a blob with "Get file content using path" and "Create blob". Alternatively, you could use an Azure Function with a timer trigger to move new files to the blob, as sketched below.
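A minimal sketch of that Function approach, assuming the Azure.Storage.Files.Shares and Azure.Storage.Blobs SDKs and hypothetical share/container names (for simplicity it copies everything in the share root on each run, rather than tracking only new files):

```csharp
using System;
using System.IO;
using System.Threading.Tasks;
using Azure.Storage.Blobs;
using Azure.Storage.Files.Shares;
using Azure.Storage.Files.Shares.Models;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class FileShareToBlob
{
    // Runs every 15 minutes; adjust the CRON expression as needed.
    [FunctionName("FileShareToBlob")]
    public static async Task Run([TimerTrigger("0 */15 * * * *")] TimerInfo timer, ILogger log)
    {
        string conn = Environment.GetEnvironmentVariable("StorageConnection");
        ShareDirectoryClient dir = new ShareClient(conn, "csv-share").GetRootDirectoryClient();
        var container = new BlobContainerClient(conn, "csv-blobs");
        await container.CreateIfNotExistsAsync();

        await foreach (ShareFileItem item in dir.GetFilesAndDirectoriesAsync())
        {
            if (item.IsDirectory) continue;

            // Stream each file from the share into a blob of the same name.
            using Stream source = await dir.GetFileClient(item.Name).OpenReadAsync();
            await container.GetBlobClient(item.Name).UploadAsync(source, overwrite: true);
            log.LogInformation("Copied {name}", item.Name);
        }
    }
}
```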
If you still have other questions, please let me know.
I'm currently using Azure Blob storage to store files, with upload/download from an ASP.NET application hosted outside of Azure. (I do not have a Web Role or Worker Role.)
Is it possible to zip multiple files into one zip file within Azure Blob storage before downloading?
Thanks in advance!
The only way to achieve this would be to use a Windows Azure compute role in the cloud. You obviously wouldn't want to do it on your on-premises servers, as you'd round-trip the files twice.
One approach you might consider is building a download "client" in Silverlight. This could handle the communication with blob storage, pull down the blobs (maybe in parallel), and then create the zip client-side for saving.
But the short answer is that this is not possible using Windows Azure Storage alone.
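If you do run code on compute inside Azure (a role, a VM, or nowadays a Function or App Service), the zip step itself is straightforward. A minimal sketch with the modern Azure.Storage.Blobs SDK and System.IO.Compression, using hypothetical blob names; note it buffers the zip in memory, so it suits small-to-medium files:

```csharp
using System;
using System.IO;
using System.IO.Compression;
using Azure.Storage.Blobs;

string connectionString = Environment.GetEnvironmentVariable("STORAGE_CONNECTION");
var container = new BlobContainerClient(connectionString, "files");

using var zipStream = new MemoryStream();
using (var archive = new ZipArchive(zipStream, ZipArchiveMode.Create, leaveOpen: true))
{
    // Hypothetical blob names to bundle together.
    foreach (string name in new[] { "report.pdf", "data.csv" })
    {
        ZipArchiveEntry entry = archive.CreateEntry(name);
        using Stream entryStream = entry.Open();
        await container.GetBlobClient(name).DownloadToAsync(entryStream);
    }
}

// Rewind and store the archive back in the same container.
zipStream.Position = 0;
await container.GetBlobClient("bundle.zip").UploadAsync(zipStream, overwrite: true);
```

Since this runs inside Azure, the blob downloads stay within the data center, avoiding the double round-trip mentioned above.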