I have a Logic App that calls an Azure Function through an HTTP trigger and gets a string back.
The Azure Function is supposed to receive a Base64 string, create a file from that data, and upload it to the assigned storage account, but I keep getting a status code 500 (Internal Server Error) from the function every time I run it. After a lot of trial and error I narrowed the problem down to the point where the file is created from the Base64 string and where the blob container client is created.
Please help.
UPDATE: As per some of your suggestions, I enabled Application Insights, ran it a few times, and got this error occurring twice:
Azure.RequestFailedException
Message: Exception while executing function: BlobAdd The specified resource name contains invalid characters
Status: 400 (The specified resource name contains invalid characters.)
ErrorCode: InvalidResourceName
FailedMethod: Azure.Storage.Blobs.BlobRestClient+Container.CreateAsync_CreateResponse.
public static async Task<IActionResult> Run(
    [HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = null)] HttpRequest req,
    ILogger log)
{
    log.LogInformation("C# HTTP trigger function processed a request.");
    string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
    return await Base64(requestBody);
}
public static async Task<IActionResult> Base64(string Base64Body)
{
    string HoldDBase = "";
    string TestBlobData = "null";

    if (Base64Body.Length > 10)
    {
        HoldDBase = Base64Body;
    }
    else
    {
        TestBlobData = "The Base64Body did not Pass";
        return (ActionResult)new OkObjectResult
        (
            new
            {
                TestBlobData
            }
        );
    }

    // Connection string of the Azure Storage account
    string ConnectionString = "xxxxxxxxx";

    // Create a BlobServiceClient object which will be used to create a container client
    BlobServiceClient blobServiceClient = new BlobServiceClient(ConnectionString);

    // Create a unique name for the container
    string ContainerName = "Base64_Blob" + Guid.NewGuid().ToString();

    // Create the container and return a container client object
    BlobContainerClient ContainerClient = await blobServiceClient.CreateBlobContainerAsync(ContainerName); // Problem here

    // Create a local file
    string localPath = "D:/Reliance/OlaForm/uploadsO";
    string fileName = "quickstart" + Guid.NewGuid().ToString() + ".txt";
    string localFilePath = Path.Combine(localPath, fileName);

    // Convert the Base64 string to bytes
    byte[] BaseBytes = Convert.FromBase64String(HoldDBase);

    // Create the file on local disk
    await File.WriteAllBytesAsync(localFilePath, BaseBytes); // Problem here

    // Get a reference to a blob
    BlobClient blobclient = ContainerClient.GetBlobClient(fileName);

    // Open the file and upload its data
    FileStream uploadFileStream = File.OpenRead(localFilePath);
    await blobclient.UploadAsync(uploadFileStream);
    // blobclient.Upload(uploadFileStream);
    uploadFileStream.Close();

    // Blob id from blobclient and return it
    TestBlobData = blobclient.ToString();
    TestBlobData = HoldDBase;

    return (ActionResult)new OkObjectResult
    (
        new
        {
            TestBlobData
        }
    );
}
There are two problems here. The InvalidResourceName error from Application Insights comes from CreateBlobContainerAsync: container names may only contain lowercase letters, numbers, and hyphens, so "Base64_Blob" + Guid.NewGuid() (which contains uppercase letters and an underscore) is rejected with a 400. In addition, you are trying to write to the "D" drive. An Azure Function does not have access to an arbitrary local disk path like D:/Reliance/OlaForm/uploadsO, so that write fails and surfaces as a 500 error.
To write the data to a blob from an Azure Function, you can use the Azure Storage SDK directly, without creating a local file first.
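For example, here is a minimal sketch of that approach, assuming the same Azure.Storage.Blobs (v12) package and reusing the ConnectionString and HoldDBase values from the question; the lowercase container name and the direct in-memory upload are the key changes:
// Sketch only, assuming Azure.Storage.Blobs (v12) as in the question.
// Container name: only lowercase letters, digits, and hyphens are valid.
string containerName = "base64-blob-" + Guid.NewGuid().ToString();

BlobServiceClient blobServiceClient = new BlobServiceClient(ConnectionString);
BlobContainerClient containerClient = await blobServiceClient.CreateBlobContainerAsync(containerName);

// Decode the Base64 payload and upload it straight from memory,
// so no local file path is needed inside the Function sandbox.
byte[] bytes = Convert.FromBase64String(HoldDBase);
string blobName = "quickstart" + Guid.NewGuid().ToString() + ".txt";
BlobClient blobClient = containerClient.GetBlobClient(blobName);

using (var stream = new MemoryStream(bytes))
{
    await blobClient.UploadAsync(stream);
}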
I need a solution/code to upload large files (more than 100 MB) to blob storage from an Azure-hosted App Service (Web API) using .NET Core. It works from my local machine, but not from the Azure App Service.
The error says the file is too large to upload.
I tried an example like the one below:
[RequestFormLimits(MultipartBodyLengthLimit = 6104857600)]
[RequestSizeLimit(6104857600)]
public async Task<IActionResult> Upload(IFormFile filePosted)
{
    string fileName = Path.GetFileName(filePosted.FileName);
    string localFilePath = Path.Combine(fileName);
    var fileStream = new FileStream(localFilePath, FileMode.Create);
    MemoryStream ms = new MemoryStream();
    filePosted.CopyTo(ms);
    ms.WriteTo(fileStream);

    BlobServiceClient blobServiceClient = new BlobServiceClient("ConnectionString");
    var containerClient = blobServiceClient.GetBlobContainerClient("Container");

    BlobUploadOptions options = new BlobUploadOptions
    {
        TransferOptions = new StorageTransferOptions
        {
            MaximumConcurrency = 8,
            MaximumTransferSize = 220 * 1024 * 1024
        }
    };

    BlobClient bc = containerClient.GetBlobClient("Name");
    await bc.UploadAsync(fileStream, options);

    ms.Dispose();
    return Ok();
}
I tried this in my environment and got the results below:
To upload large files from local storage to Azure Blob Storage or File Storage, you can use the Azure Storage Data Movement library. It provides high performance for uploading and downloading large files.
Code:
using System;
using System.Threading;
using Microsoft.Azure.Storage;
using Microsoft.Azure.Storage.Blob;
using Microsoft.Azure.Storage.DataMovement;

class Program
{
    public static void Main(string[] args)
    {
        string storageConnectionString = "<Connection string>";
        CloudStorageAccount account = CloudStorageAccount.Parse(storageConnectionString);
        CloudBlobClient blobClient = account.CreateCloudBlobClient();
        CloudBlobContainer blobContainer = blobClient.GetContainerReference("test");
        blobContainer.CreateIfNotExists();

        string sourceBlob = @"C:\Users\download\sample.docx";
        CloudBlockBlob destPath = blobContainer.GetBlockBlobReference("sample.docx");
        TransferManager.Configurations.ParallelOperations = 64;

        // Set up the transfer context and track the upload progress
        SingleTransferContext context = new SingleTransferContext
        {
            ProgressHandler = new Progress<TransferStatus>(progress =>
            {
                Console.WriteLine("Bytes uploaded: {0}", progress.BytesTransferred);
            })
        };

        // Upload the blob
        var task = TransferManager.UploadAsync(
            sourceBlob, destPath, null, context, CancellationToken.None);
        task.Wait();
    }
}
After executing the above code, the large file was uploaded successfully to Azure Blob Storage (verified in both the console output and the portal).
I'm trying to download a file from Azure Blob Storage, but it returns only part of the file. What am I doing wrong? The file in storage is not corrupted.
public async Task<byte[]> GetFile(string fileName)
{
    var blobClient = BlobContainerClient.GetBlobClient(fileName);
    var downloadInfo = await blobClient.DownloadAsync();
    byte[] b = new byte[downloadInfo.Value.ContentLength];
    await downloadInfo.Value.Content.ReadAsync(b, 0, (int)downloadInfo.Value.ContentLength);
    return b;
}
I'm using the Azure.Storage.Blobs 12.4.2 package.
I tried this code and it works for me:
public async Task<byte[]> GetFile(string fileName)
{
    var blobClient = BlobContainerClient.GetBlobClient(fileName);
    using (var memorystream = new MemoryStream())
    {
        await blobClient.DownloadToAsync(memorystream);
        return memorystream.ToArray();
    }
}
I am not able to fully understand your code, as the current BlobClient as of v11.1.1 does not expose any download methods. As @Guarav Mantri-AIS mentioned, ReadAsync can behave in that manner.
Consider an alternative using DownloadToByteArrayAsync(), which is part of the API. I have included the code required to connect, but of course this is just for the purpose of demonstrating a full example.
Your method would be condensed as follows:
public async Task<byte[]> GetFile(string containerName, string fileName)
{
    // I am getting the container here, not sure where or how you are doing this
    var container = GetContainer("//your connection string", containerName);

    // Get the blob first
    ICloudBlob blob = container.GetBlockBlobReference(fileName);

    // Fetch its attributes so the length is known, then download it straight to a byte array
    await blob.FetchAttributesAsync();
    var bytes = new byte[blob.Properties.Length];
    await blob.DownloadToByteArrayAsync(bytes, 0);
    return bytes;
}

public CloudBlobContainer GetContainer(string connectionString, string containerName)
{
    // 1. connect to the account
    var account = CloudStorageAccount.Parse(connectionString);

    // 2. create a client
    var blobClient = account.CreateCloudBlobClient();

    // 3. get the container reference
    return blobClient.GetContainerReference(containerName);
}
I'm trying to extract the MD5 and length (size) of an uploaded blob using an Azure Function with an HTTP trigger. Below is the code I'm experimenting with, but I always get null and -1. Please can someone confirm whether the code is correct, or whether another option is available?
public static async Task<IActionResult> Run(HttpRequest req, string inputBlob, ILogger log)
{
    log.LogInformation("C# HTTP trigger function processed a request.");

    string name = req.Query["name"];
    log.LogInformation($"name,{inputBlob}");
    log.LogInformation("Blob content: " + inputBlob); // This prints the content of the blob

    CloudBlockBlob blob;
    var credentials = new StorageCredentials("xxx", "xxxx");
    var client = new CloudBlobClient(new Uri("https://xxx.blob.core.windows.net"), credentials);
    var container = client.GetContainerReference("parent");
    blob = container.GetBlockBlobReference("file.csv");
    log.LogInformation("Blob details: " + blob.Properties.Length); // This prints -1; if I use ContentMD5 it shows null. Basically it's not able to read the blob

    string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
    dynamic data = JsonConvert.DeserializeObject(requestBody);
    name = name ?? data?.name;

    return name != null
        ? (ActionResult)new OkObjectResult($"Hello, {name}")
        : new BadRequestObjectResult("Please pass a name on the query string or in the request body");
}
You are missing a call to FetchAttributesAsync() (or FetchAttributes()) before you try to retrieve any properties of the blob.
// your other code
blob = container.GetBlockBlobReference("file.csv");
await blob.FetchAttributesAsync(); // or blob.FetchAttributes()
// then you can try to get any property here
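Putting it together, a minimal sketch under the same v11 (Microsoft.Azure.Storage) SDK used in the question, reusing its container and log variables and the placeholder blob name:
// Sketch only: same CloudBlobClient/container setup as in the question.
CloudBlockBlob blob = container.GetBlockBlobReference("file.csv");

// Populate blob.Properties from the service before reading them.
await blob.FetchAttributesAsync();

log.LogInformation("Blob length: " + blob.Properties.Length);   // size in bytes
log.LogInformation("Blob MD5: " + blob.Properties.ContentMD5);  // null only if the blob was uploaded without an MD5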
I am uploading documents to my Azure Blob Storage, which works perfectly, but I want to be able to link an ID to each specific uploaded document.
Below is my code for uploading the file:
[HttpPost]
public ActionResult Upload(HttpPostedFileBase file)
{
    try
    {
        var path = Path.Combine(Server.MapPath("~/App_Data/Uploads"), file.FileName);
        string searchServiceName = ConfigurationManager.AppSettings["SearchServiceName"];
        string blobStorageKey = ConfigurationManager.AppSettings["BlobStorageKey"];
        string blobStorageName = ConfigurationManager.AppSettings["BlobStorageName"];
        string blobStorageURL = ConfigurationManager.AppSettings["BlobStorageURL"];

        file.SaveAs(path);

        var credentials = new StorageCredentials(searchServiceName, blobStorageKey);
        var client = new CloudBlobClient(new Uri(blobStorageURL), credentials);

        // Retrieve a reference to a container. (You need to create one using the management portal, or call container.CreateIfNotExists())
        var container = client.GetContainerReference(blobStorageName);

        // Retrieve a reference to a blob named after the uploaded file.
        var blockBlob = container.GetBlockBlobReference(file.FileName);

        // Create or overwrite the blob with contents from the local file.
        using (var fileStream = System.IO.File.OpenRead(path))
        {
            blockBlob.UploadFromStream(fileStream);
        }

        System.IO.File.Delete(path);

        return new JsonResult
        {
            JsonRequestBehavior = JsonRequestBehavior.AllowGet,
            Data = "Success"
        };
    }
    catch (Exception ex)
    {
        throw;
    }
}
I have added the ClientID field to the index (it is at the bottom), but I have no idea how to add this value to the index. This is still all new to me, and I just need a little guidance if someone can help.
Thanks in advance.
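One possible approach, sketched here as an illustration rather than taken from the thread: attach the ID to the blob as metadata at upload time, so it can be read back (or picked up by an indexer) later. The clientId value and the "ClientID" metadata key are assumptions; the blockBlob, path, and fileStream variables come from the upload code above.
// Sketch only: set metadata on the blob before uploading it (v11 SDK, same as the question).
blockBlob.Metadata["ClientID"] = clientId;

using (var fileStream = System.IO.File.OpenRead(path))
{
    blockBlob.UploadFromStream(fileStream); // metadata set on the blob object is stored with the upload
}

// For an already-uploaded blob, the metadata can also be persisted explicitly:
// blockBlob.Metadata["ClientID"] = clientId;
// blockBlob.SetMetadata();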
I'm using Azure block blob storage to keep my files. Here is my code to upload a file.
I'm calling the method twice, as below, for the same file in the same request.
The first call of the method saves the file as expected, but the second call saves the file with a length of 0, so I can't display the image, and no error occurs.
[HttpPost]
public ActionResult Index(HttpPostedFileBase file)
{
    UploadFile(file);
    UploadFile(file);
    return View();
}

public static string UploadFile(HttpPostedFileBase file)
{
    var credentials = new StorageCredentials("accountName", "key");
    var storageAccount = new CloudStorageAccount(credentials, true);
    var blobClient = storageAccount.CreateCloudBlobClient();
    var container = blobClient.GetContainerReference("images");
    container.CreateIfNotExists();

    var containerPermissions = new BlobContainerPermissions
    {
        PublicAccess = BlobContainerPublicAccessType.Blob
    };
    container.SetPermissions(containerPermissions);

    var blockBlob = container.GetBlockBlobReference(Guid.NewGuid().ToString());
    blockBlob.Properties.ContentType = file.ContentType;
    var azureFileUrl = string.Format("{0}", blockBlob.Uri.AbsoluteUri);

    try
    {
        blockBlob.UploadFromStream(file.InputStream);
    }
    catch (StorageException ex)
    {
        throw;
    }

    return azureFileUrl;
}
I found the solution below, which looks similar to my issue, but it does not help:
Strange Sudden Error "The number of bytes to be written is greater than the specified ContentLength"
Any idea?
Thanks
You need to reset the position of the stream back to the beginning. The first UploadFromStream call reads file.InputStream to its end, so the second call starts at the end of the stream and uploads 0 bytes. Put this line at the top of your UploadFile method:
file.InputStream.Seek(0, SeekOrigin.Begin);
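In context, a minimal sketch of the top of the method with the reset in place (the rest of UploadFile stays exactly as in the question):
public static string UploadFile(HttpPostedFileBase file)
{
    // Rewind the stream so each call uploads the full file from the start.
    file.InputStream.Seek(0, SeekOrigin.Begin);

    // ... rest of the method unchanged (create client, container, blob, then UploadFromStream)
}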