Azure Logic App Get File Content greater than 300MB - azure

I want to create a logic app which lists all files in my Azure File Storage and then copies them to an SFTP server. I have set up the following flow:
1. List files in file storage
2. Get metadata of file
3. Get content of file
4. Create file on SFTP
With files smaller than 300 MB everything works fine, but when I want to copy a file larger than 300 MB I get:
"The file contains 540.782 megabytes which exceeds the maximum 300 megabytes."
So is there a workaround or another solution for my issue?

This limit could be because the connector is still a preview feature; when I test with Blob Storage it doesn't have this limit.
So I suppose you could call a function to upload the file to a blob after getting the file name, then use the blob connector to get the blob content and upload it to the SFTP server.
Below is my function code. In my test I don't pass the file name from the request, so you could change it to get the name from the request.
public static async Task<IActionResult> Run(
    [HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = null)] HttpRequestMessage req,
    ILogger log)
{
    log.LogInformation("C# HTTP trigger function processed a request.");

    CloudStorageAccount storageAccount = CloudStorageAccount.Parse("storage connection string");

    // Get a reference to the source file in the file share.
    CloudFileClient fileClient = storageAccount.CreateCloudFileClient();
    CloudFileShare share = fileClient.GetShareReference("windows");
    CloudFileDirectory rootDir = share.GetRootDirectoryReference();
    string filename = "Downloads.zip";
    CloudFile file = rootDir.GetFileReference(filename);

    // Create a SAS for the source file so the copy operation can read it.
    string fileSas = file.GetSharedAccessSignature(new SharedAccessFilePolicy()
    {
        // Only read permissions are required for the source file.
        Permissions = SharedAccessFilePermissions.Read,
        SharedAccessExpiryTime = DateTime.UtcNow.AddHours(24)
    });
    Uri fileSasUri = new Uri(file.StorageUri.PrimaryUri.ToString() + fileSas);

    // Start a server-side copy from the file share to a block blob.
    var myClient = storageAccount.CreateCloudBlobClient();
    var container = myClient.GetContainerReference("test");
    var blockBlob = container.GetBlockBlobReference("test.zip");
    await blockBlob.StartCopyAsync(fileSasUri);

    return new OkObjectResult("success");
}
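If you want to pass the file name from the Logic App instead of hard-coding it, a rough sketch of that change (assuming the request body is JSON like { "fileName": "Downloads.zip" } — the property name is only an example) could look like this:
// Sketch only: read the source file name from the request body instead of hard-coding it.
// Assumes the caller posts JSON like { "fileName": "Downloads.zip" } and Newtonsoft.Json is referenced.
string requestBody = await req.Content.ReadAsStringAsync();
dynamic data = JsonConvert.DeserializeObject(requestBody);
string filename = data?.fileName; // e.g. "Downloads.zip"
CloudFile file = rootDir.GetFileReference(filename);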
Below is my test flow. Note the blob path: you need to modify it to /test/@{body('Get_file_metadata')?['DisplayName']}.
And this is the result picture; from the size you can see I tested with a file larger than 300 MB.

Related

Write file to blob storage and save the SAS URL using C#

I am trying to create an Azure Function that creates files in blob storage and then saves a dynamically generated pre-signed blob file URL in an Azure table, so that we can return the blob file URL to the client program to open.
I am able to create the files in blob storage and save the URLs. Right now the code makes the file URLs public; I am not sure how I can make the current code generate a SAS URL instead of a public URL and save it to the Azure table.
I didn't see any example that shows the usage of CloudBlobClient and SAS. Appreciate any help.
[FunctionName("CreateFiles")]
public static async void Run([QueueTrigger("JobQueue", Connection = "")]string myQueueItem,
[Table("SubJobTable", Connection = "AzureWebJobsStorage")] CloudTable subJobTable,
ILogger log)
{
Job job = JsonConvert.DeserializeObject<Job>(myQueueItem);
var storageAccount = CloudStorageAccount.Parse("UseDevelopmentStorage=true");
CloudBlobClient cloudBlobClient = storageAccount.CreateCloudBlobClient();
string containerName = $"{job.Name.ToLowerInvariant()}{Guid.NewGuid().ToString()}";
CloudBlobContainer cloudBlobContainer =
cloudBlobClient.GetContainerReference(containerName);
cloudBlobContainer.CreateIfNotExists();
BlobContainerPermissions permissions = new BlobContainerPermissions
{
PublicAccess = BlobContainerPublicAccessType.Blob
};
cloudBlobContainer.SetPermissions(permissions);
string localPath = "./data/";
string localFileName = $"{job.Id}.json";
string localFilePath = Path.Combine(localPath, localFileName);
File.WriteAllText(localFilePath, myQueueItem);
CloudBlockBlob cloudBlockBlob = cloudBlobContainer.GetBlockBlobReference(localFileName);
log.LogInformation("Uploading to Blob storage as blob:\n\t {0}\n", cloudBlockBlob.Uri.AbsoluteUri);
cloudBlockBlob.UploadFromFile(localFilePath);
// update the table with file uri
DynamicTableEntity entity = new DynamicTableEntity(job.Id, job.PracticeId);
entity.Properties.Add("FileUri", new EntityProperty(cloudBlockBlob.Uri.AbsoluteUri));
entity.Properties.Add("Status", new EntityProperty("Complete"));
TableOperation mergeOperation = TableOperation.InsertOrMerge(entity);
subJobTable.Execute(mergeOperation);
}
It looks like your code is using the older version of the SDK (Microsoft.Azure.Storage.Blob). If that's the case, you would need to use the GetSharedAccessSignature method on CloudBlob to generate a shared access signature token.
Your code would be something like:
...
cloudBlockBlob.UploadFromFile(localFilePath);
var sasToken = cloudBlockBlob.GetSharedAccessSignature(sas-token-parameters);
var sasUrl = $"{cloudBlockBlob.Uri.AbsoluteUri}{sasToken}"; // add a '?' separator only if the SAS token does not already start with one
...
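For example, a rough sketch of what those sas-token-parameters could be with the legacy SDK (the read-only permission and one-hour expiry are only example values), and of saving the SAS URL to the table instead of the public URI:
// Sketch only: build a read-only SAS for the uploaded blob using the legacy
// WindowsAzure.Storage / Microsoft.Azure.Storage.Blob SDK. Permissions and expiry below are example values.
var sasPolicy = new SharedAccessBlobPolicy
{
    Permissions = SharedAccessBlobPermissions.Read,
    SharedAccessStartTime = DateTimeOffset.UtcNow.AddMinutes(-5), // small back-dating to tolerate clock skew
    SharedAccessExpiryTime = DateTimeOffset.UtcNow.AddHours(1)
};
string sasToken = cloudBlockBlob.GetSharedAccessSignature(sasPolicy);
string sasUrl = cloudBlockBlob.Uri.AbsoluteUri + sasToken; // the token already starts with '?'

// Save the SAS URL (instead of the public URI) in the table entity.
entity.Properties.Add("FileUri", new EntityProperty(sasUrl));
With a SAS URL stored this way, the container no longer needs PublicAccess = Blob and can stay private.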

Do you know what could be causing my Azure function app to throw a 503 error after running 30 seconds?

[FunctionName("FileShareDirRead02")]
public static async Task<IActionResult> Run(
    [HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = null)] HttpRequest req,
    ILogger log)
{
    // Get the contents of the POST and store them into local variables
    string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
    dynamic data = JsonConvert.DeserializeObject(requestBody);

    // The following values are passed in to the function through the HTTP POST, or via parameters specified in the Data Factory pipeline
    string storageAccount = data.storageAccount; // Name of the storage account containing the file share you plan to parse and clean up
    string fileshare = data.fileShare; // Name of the file share within the storage account
    string folderPath = data.folderPath; // No leading slash; this is considered the ROOT of the file share. Parsing only goes down from here, never up.
    string keyVaultName = data.keyvaultName; // Name of the key vault where the storage SAS token is stored
    int daysoldbeforeDeletion = data.daysoldbeforeDeletion; // Number of days old a file must be before it is deleted from the file share
    string nameofSASToken = data.nameofsasToken; // Name of the SAS token created through PowerShell prior to execution of this function
    string storageAccountSAS = storageAccount + "-" + nameofSASToken; // Format of the storage account SAS name

    string kvUri = "https://" + keyVaultName + ".vault.azure.net/"; // URI to the key vault
    var client = new SecretClient(new Uri(kvUri), new DefaultAzureCredential()); // Instantiate a SecretClient using the key vault URI
    var storageKey1 = await client.GetSecretAsync(storageAccountSAS); // Obtain the SAS token from the key vault
    string key = storageKey1.Value.Value; // Assign key the SIG value which is part of the SAS token
    key = key.Substring(1); // Trim the leading question mark from the key value since it is not part of the key

    string connectionString = "FileEndpoint=https://" + storageAccount + ".file.core.windows.net/;SharedAccessSignature=" + key; // Connection string used when creating a ShareClient
    ShareClient share = new ShareClient(connectionString, fileshare); // Instantiate a ShareClient which will be used to manipulate the file share

    var folders = new List<Tuple<string, string>>(); // 2-tuple list that will hold the names and paths of files deleted from the share
    ShareDirectoryClient directory = share.GetDirectoryClient(folderPath); // Get a reference to the directory supplied in the POST
    Queue<ShareDirectoryClient> remaining = new Queue<ShareDirectoryClient>(); // Track the remaining directories to walk, starting from the folder path provided in the POST
    remaining.Enqueue(directory);

    while (remaining.Count > 0) // Keep scanning until all folders and files have been evaluated
    {
        ShareDirectoryClient dir = remaining.Dequeue(); // Get all of the next directory's files and subdirectories
        if (dir.GetFilesAndDirectoriesAsync() != null) // Make sure the folder path exists in the file share
        {
            // return new OkObjectResult("{\"childItems\":" + JsonConvert.SerializeObject(remaining.Count) + "}"); // (debug line left commented out)
            await foreach (ShareFileItem item in dir.GetFilesAndDirectoriesAsync()) // For each directory and file
            {
                if (!item.IsDirectory) // Only evaluate files
                {
                    ShareFileClient fileClient = new ShareFileClient(connectionString, fileshare, dir.Path + "/" + item.Name); // Create the file client
                    if (fileClient.Exists())
                    {
                        ShareFileProperties properties = await fileClient.GetPropertiesAsync(); // Get the properties of the current file
                        DateTime convertedtime = properties.LastModified.DateTime; // Last modified date and time of the current file
                        DateTime date = DateTime.UtcNow; // Today's date and time
                        TimeSpan timeSpan = date.Subtract(convertedtime); // Time elapsed since the file was last modified
                        int dayssincelastmodified = timeSpan.Days;
                        if (dayssincelastmodified > daysoldbeforeDeletion)
                        {
                            folders.Add(new Tuple<string, string>(item.Name, fileClient.Path)); // Record the file name and path
                            fileClient.Delete(); // Delete the file from the share
                        }
                    }
                }
                if (item.IsDirectory) // Keep walking down directories
                {
                    remaining.Enqueue(dir.GetSubdirectoryClient(item.Name));
                }
            }
        }
    }

    return new OkObjectResult("{\"childItems\":" + JsonConvert.SerializeObject(folders) + "}"); // Returns a list of all files which were removed from the file share
}
I have written a function app using Visual Studio and C# and published it to an Azure function app. The app is very simple: it reads a directory and all subdirectories of a file share looking for files that have not been modified in the last 90 days, and deletes them. This function works fine when reading a small set of directories and files, but when I run it on a directory with, say, 1000 or more files, the app crashes with a 503 error saying the service is not available and to check back later. I am using an App Service plan (Standard). I thought maybe it was timing out, but this type of plan is not supposed to prevent an app from running, no matter how long it runs. To be sure, I put "functionTimeout": "01:00:00" in my host.json file to rule that out. I cannot find a single log entry that explains what is happening. Any ideas on how to debug this issue?
This problem is often caused by application-level issues, such as:
requests taking a long time
application using high memory/CPU
application crashing due to an exception.
It seems like your function is taking too long to return an HTTP response. As mentioned in the documentation, 230 seconds is the maximum amount of time that an HTTP-triggered function can take to respond to a request. Please refer to this.
You can control how long the function itself is allowed to run via the functionTimeout parameter in host.json (which you have already set to 1 hour), but that does not extend the 230-second limit on the HTTP response.
To debug this issue, refer to the azure-app-service-troubleshoot-http-502-http-503 MSFT documentation and follow the troubleshooting steps provided.
For longer processing times, use the Azure Durable Functions async HTTP API pattern. Refer to this MS Doc.
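A minimal sketch of that pattern (the function and orchestrator names below are only examples, and it assumes the Microsoft.Azure.WebJobs.Extensions.DurableTask package): the HTTP trigger only starts an orchestration and immediately returns 202 with status-polling URLs, so the client never waits past the 230-second limit.
// Sketch only: Durable Functions async HTTP API pattern.
// Names ("CleanupOrchestrator", "CleanupFileShare") are illustrative, not from the original post.
[FunctionName("CleanupHttpStart")]
public static async Task<HttpResponseMessage> HttpStart(
    [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequestMessage req,
    [DurableClient] IDurableOrchestrationClient starter,
    ILogger log)
{
    string requestBody = await req.Content.ReadAsStringAsync();
    string instanceId = await starter.StartNewAsync("CleanupOrchestrator", null, requestBody);
    // Returns 202 Accepted with URLs the caller can poll for status and output.
    return starter.CreateCheckStatusResponse(req, instanceId);
}

[FunctionName("CleanupOrchestrator")]
public static async Task<string> RunOrchestrator(
    [OrchestrationTrigger] IDurableOrchestrationContext context)
{
    string requestBody = context.GetInput<string>();
    // The long-running file share walk/delete would move into the "CleanupFileShare" activity function.
    return await context.CallActivityAsync<string>("CleanupFileShare", requestBody);
}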

Azure CloudFile UploadFromFile "The specified resource name contains invalid characters"

I am trying to upload a file, and I get an exception: "The specified resource name contains invalid characters".
The path I am using is @"C:\Test\Test.txt". When I change to relative addressing (i.e., @".\Test.txt") and have the file in the exe folder, it works.
What I need to know is: is relative addressing the only option to upload a file to Azure File Storage from a .NET client? Is there a way to reference a file with a full path and upload it to File Storage?
Update: Based on the comments and answer below, I realized my mistake: I was supplying the incoming file path to the GetFileReference method, where this should be the name of the new file in Azure, hence it contained the ':' which was invalid. The comments are right, I should have provided the code; it may have been diagnosed more easily.
public static async Task WriteFileToStorage(string filePath)
{
    CloudFileShare fileShare = GetCloudFileShare();
    CloudFileDirectory fileDirectory = fileShare.GetRootDirectoryReference();
    // Bug (see update above): filePath is a full local path containing ':' and '\',
    // but GetFileReference expects the name the file should have in the share.
    CloudFile cloudFile = fileDirectory.GetFileReference(filePath);
    await cloudFile.UploadFromFileAsync(filePath);
}
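For reference, a rough sketch of the corrected method (assuming the file should keep its local name in the share; GetCloudFileShare is the existing helper from the snippet above):
// Sketch: pass only the file name to GetFileReference, and the full local path to UploadFromFileAsync.
public static async Task WriteFileToStorage(string filePath)
{
    CloudFileShare fileShare = GetCloudFileShare();
    CloudFileDirectory fileDirectory = fileShare.GetRootDirectoryReference();
    CloudFile cloudFile = fileDirectory.GetFileReference(Path.GetFileName(filePath)); // e.g. "Test.txt"
    await cloudFile.UploadFromFileAsync(filePath); // full local path, e.g. @"C:\Test\Test.txt"
}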
The .NET client does support a full local path when uploading to Azure File Storage.
It would be better to provide the complete code you're using, including the file name / path both locally and in Azure File Storage.
Here is the code I tested with, and it works (I'm using the package WindowsAzure.Storage, version 9.3.3):
static void Main(string[] args)
{
    CloudStorageAccount storageAccount = new CloudStorageAccount(new StorageCredentials("account_name", "account_key"), true);
    CloudFileClient fileClient = storageAccount.CreateCloudFileClient();
    CloudFileShare fileShare = fileClient.GetShareReference("test");
    CloudFileDirectory rootDir = fileShare.GetRootDirectoryReference();
    CloudFile myfile = rootDir.GetFileReference("mytest.txt");
    // The full file path on the local disk
    myfile.UploadFromFile(@"C:\Test\Test.txt");
}

Why can't I download an Azure Blob using an ASP.NET Core application published to an Azure server?

I am trying to download a blob from an Azure storage account container. When I run the application locally, I get the correct Downloads folder, C:\Users\xxxx\Downloads. When I publish the application to Azure and try to download the file, I get an error. I have tried various "KnownFolders"; some return empty strings, others return folders on the Azure server. I am able to upload files fine and list the files in a container, but I am struggling with downloading a file.
string conn = configuration.GetValue<string>("AppSettings:AzureContainerConn");
CloudStorageAccount storageAcct = CloudStorageAccount.Parse(conn);
CloudBlobClient blobClient = storageAcct.CreateCloudBlobClient();
CloudBlobContainer container = blobClient.GetContainerReference(containerName);

Uri uriObj = new Uri(uri);
string filename = Path.GetFileName(uriObj.LocalPath);

// Get a block blob reference and open it for reading
CloudBlockBlob blockBlob = container.GetBlockBlobReference(filename);
Stream blobStream = await blockBlob.OpenReadAsync();

string _filepath = _knownfolder.Path + "\\projectfiles\\";
Directory.CreateDirectory(_filepath);
_filepath = _filepath + filename;

Stream _file = new MemoryStream();
try
{
    _file = File.Open(_filepath, FileMode.Create, FileAccess.Write);
    await blobStream.CopyToAsync(_file);
}
finally
{
    _file.Dispose();
}
The expected end result is that the file ends up in a folder within the user's "Downloads" folder.
Since you're talking about publishing to Azure, the code is probably from a web application, right? And the code for the web application runs on the server, which means the code is trying to download the blob to the server running the web application.
To present a download link to the user so they can download the file, use the FileStreamResult, which
Represents an ActionResult that when executed will write a file from a stream to the response.
A (pseudo code) example:
[HttpGet]
public FileStreamResult GetFile()
{
var stream = new MemoryStream();
CloudBlockBlob blockBlob = container.GetBlockBlobReference(filename);
blockBlob.DownloadToStream(stream);
blockBlob.Seek(0, SeekOrigin.Begin);
return new FileStreamResult(stream, new MediaTypeHeaderValue("text/plain"))
{
FileDownloadName = "someFile.txt"
};
}

How to copy an Azure File to an Azure Blob?

I've got some files sitting in Azure File storage.
I'm trying to programmatically archive them to Azure Blobs and I'm not sure how to do this efficiently.
I keep seeing code samples about copying from one blob container to another blob container, but not from a File to a Blob.
Is it possible to do this without downloading the entire file content locally and then uploading it again? Maybe using URIs or something?
More Info:
The File and Blob containers are in the same Storage account.
The storage account is RA-GRS
Here is some sample code I was thinking of using, but it just doesn't feel right :( (pseudo code, with validation and checks omitted):
var file = await ShareRootDirectory.GetFileReference(fileName);
using (var stream = new MemoryStream())
{
    await file.DownloadToStreamAsync(stream);
    // Custom method that basically does:
    // 1. GetBlockBlobReference
    // 2. UploadFromStreamAsync
    await cloudBlob.AddItemAsync("some-container", stream);
}
How to copy an Azure File to an Azure Blob?
We can also use CloudBlockBlob.StartCopy(CloudFile); the CloudFile type is accepted by the CloudBlockBlob.StartCopy function.
For how to copy a CloudFile to a blob, please refer to the documentation. The following demo code is a snippet from that document.
// Parse the connection string for the storage account.
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(
Microsoft.Azure.CloudConfigurationManager.GetSetting("StorageConnectionString"));
// Create a CloudFileClient object for credentialed access to File storage.
CloudFileClient fileClient = storageAccount.CreateCloudFileClient();
// Create a new file share, if it does not already exist.
CloudFileShare share = fileClient.GetShareReference("sample-share");
share.CreateIfNotExists();
// Create a new file in the root directory.
CloudFile sourceFile = share.GetRootDirectoryReference().GetFileReference("sample-file.txt");
sourceFile.UploadText("A sample file in the root directory.");
// Get a reference to the blob to which the file will be copied.
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
CloudBlobContainer container = blobClient.GetContainerReference("sample-container");
container.CreateIfNotExists();
CloudBlockBlob destBlob = container.GetBlockBlobReference("sample-blob.txt");
// Create a SAS for the file that's valid for 24 hours.
// Note that when you are copying a file to a blob, or a blob to a file, you must use a SAS
// to authenticate access to the source object, even if you are copying within the same
// storage account.
string fileSas = sourceFile.GetSharedAccessSignature(new SharedAccessFilePolicy()
{
    // Only read permissions are required for the source file.
    Permissions = SharedAccessFilePermissions.Read,
    SharedAccessExpiryTime = DateTime.UtcNow.AddHours(24)
});
// Construct the URI to the source file, including the SAS token.
Uri fileSasUri = new Uri(sourceFile.StorageUri.PrimaryUri.ToString() + fileSas);
// Copy the file to the blob.
destBlob.StartCopy(fileSasUri);
Note:
If you are copying a blob to a file, or a file to a blob, you must use a shared access signature (SAS) to authenticate the source object, even if you are copying within the same storage account.
Use the Transfer Manager:
https://msdn.microsoft.com/en-us/library/azure/microsoft.windowsazure.storage.datamovement.transfermanager_methods.aspx
There are methods to copy from CloudFile to CloudBlob.
Add the "Microsoft.Azure.Storage.DataMovement" nuget package
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;
using Microsoft.WindowsAzure.Storage.File;
using Microsoft.WindowsAzure.Storage.DataMovement;

private string _storageConnectionString = "your_connection_string_here";

public async Task CopyFileToBlob(string blobContainerName, string blobPath, string fileShare, string fileName)
{
    CloudStorageAccount storageAccount = CloudStorageAccount.Parse(_storageConnectionString);

    // Source: the file in the Azure file share.
    CloudFileShare cloudFileShare = storageAccount.CreateCloudFileClient().GetShareReference(fileShare);
    CloudFile source = cloudFileShare.GetRootDirectoryReference().GetFileReference(fileName);

    // Target: the block blob the file will be copied to.
    CloudBlobContainer blobContainer = storageAccount.CreateCloudBlobClient().GetContainerReference(blobContainerName);
    CloudBlockBlob target = blobContainer.GetBlockBlobReference(blobPath);

    // Server-side copy via the Data Movement library.
    await TransferManager.CopyAsync(source, target, true);
}
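A usage sketch (container, blob path, share, and file names below are only examples):
// Sketch: archive "sample-file.txt" from the "sample-share" file share into the "archive" container.
await CopyFileToBlob("archive", "archived/sample-file.txt", "sample-share", "sample-file.txt");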
