Azure Function Blob Storage connection: The format of value '*' is invalid - azure

I am writing a v2 Azure Function in which I will access Azure Blob Storage. Because I was having trouble, I reduced it down to this minimal example.
namespace Test
{
public static class Function1
{
[FunctionName("Function1")]
public static async Task<IActionResult> Run(
[HttpTrigger(AuthorizationLevel.Anonymous, "get", Route = "test")] HttpRequest req,
ILogger log)
{
var azureStorage = CloudStorageAccount.Parse("UseDevelopmentStorage=true");
var blobClient = azureStorage.CreateCloudBlobClient();
var container = blobClient.GetContainerReference("migrated-load-sets-localhost");
var blobReference = container.GetBlockBlobReference("11016093-2f6e-4631-97c1-04f8acfb2370");
var memoryStream = new MemoryStream();
var accessCondition = AccessCondition.GenerateIfExistsCondition();
var blobRequestOptions = new BlobRequestOptions();
await blobReference.DownloadToStreamAsync(memoryStream, accessCondition, blobRequestOptions, null);
var text = System.Text.Encoding.UTF8.GetString(memoryStream.ToArray());
return new OkObjectResult(text);
}
}
}
When I run and hit this, I get the error
System.Private.CoreLib: Exception while executing function: Function1. Microsoft.WindowsAzure.Storage: The format of value '*' is invalid. System.Net.Http: The format of value '*' is invalid.
If I change
var accessCondition = AccessCondition.GenerateIfExistsCondition();
to be
var accessCondition = AccessCondition.GenerateEmptyCondition();
it works.
I have observed in debugging that accessCondition.IfMatchETag equals "*", so it seems like that might be the culprit.
Am I doing something wrong when I use AccessCondition.GenerateIfExistsCondition(), or is there a bug in the library?

If you need to check whether the blob exists before downloading it, all you need is:
if (await blobReference.ExistsAsync())
{
    // Download
}
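Putting it together, a minimal sketch of the Run body with the existence check in place of the If-Exists access condition (this assumes the same development-storage connection string, container and blob names from the question):
var azureStorage = CloudStorageAccount.Parse("UseDevelopmentStorage=true");
var blobClient = azureStorage.CreateCloudBlobClient();
var container = blobClient.GetContainerReference("migrated-load-sets-localhost");
var blobReference = container.GetBlockBlobReference("11016093-2f6e-4631-97c1-04f8acfb2370");
if (await blobReference.ExistsAsync())
{
    using (var memoryStream = new MemoryStream())
    {
        // No AccessCondition is needed; the existence check already guards the download.
        await blobReference.DownloadToStreamAsync(memoryStream);
        var text = System.Text.Encoding.UTF8.GetString(memoryStream.ToArray());
        return new OkObjectResult(text);
    }
}
return new NotFoundResult();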

Related

HTTP Listener in a HTTP Trigger Azure Function

I have an HttpListener console app that works on my local machine. When I try to use it inside an HTTP Trigger Azure Function, I always get a 418 error code.
In my console app:
HttpListener listener = new HttpListener();
try
{
listener.Prefixes.Add("http://localhost:11000/");
listener.Start();
}
catch (Exception e)
{
    // ...
}
do {
var ctx = listener.GetContext();
var res = ctx.Response;
var req = ctx.Request;
var reqUrl = req.Url;
var readStream = new StreamReader(req.InputStream);
var content = readStream.ReadToEnd();
Console.WriteLine(content);
// business logic
readStream.Close();
res.StatusCode = (int)HttpStatusCode.OK;
res.ContentType = "text/plain";
res.OutputStream.Write(new byte[] { }, 0, 0);
res.Close();
if (stopListener) { listener.Stop(); }
} while (listener.IsListening);
Now, the HTTP Trigger Function uses the HttpRequest class, and that seems to give me the 418 error code. I replaced it with HttpListener, but when I add the Azure Function's connection string as the prefix (on the CLI), the stream never goes through; it's as if it isn't capturing anything. What connection should I use? I feel like self-referencing the function is the reason it's not working.
Azure Function:
public static async Task<IActionResult> Run(
[HttpTrigger(AuthorizationLevel.Anonymous, "get", "post", Route = null)] HttpListener listener,
ILogger log,
IBinder binder)
{//same as above}
Is this the right approach to getting data from an external app? So far this has been the way I can see it working via the HTTP Listener.
Any suggestions are welcomed.
Is this the right approach to getting data from an external app?
The right way to access data from an external source is to expose it through an API and call that API from your function.
For creating an Azure Function, refer to the Microsoft documentation here.
Below is sample code for accessing a web API in an Azure Function:
var _httpclient = new HttpClient();
var _response = await _httpclient.GetAsync(url);
var result_ = await _response.Content.ReadAsStringAsync();
Using it is just like calling an API from regular C# code.
Azure Function Code:-
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.AspNetCore.Http;
using Microsoft.Extensions.Logging;
using System.Net.Http;
namespace _73093902_FunctionApp10
{
public static class Function1
{
[FunctionName("Function1")]
public static async Task<IActionResult> Run(
[HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = null)] HttpRequest req,
ILogger log)
{
var _httpclient = new HttpClient();
var _response = await _httpclient.GetAsync("https://localhost:7101/WeatherForecast");
var result_ = await _response.Content.ReadAsStringAsync();
return new OkObjectResult(result_);
}
}
}
Debug Output: (screenshot omitted)
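A side note on the sample above: creating a new HttpClient on every invocation can exhaust sockets under load, so a common pattern is to reuse a single static instance. A minimal sketch, reusing the example endpoint from the code above:
private static readonly HttpClient _httpClient = new HttpClient();

[FunctionName("Function1")]
public static async Task<IActionResult> Run(
    [HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = null)] HttpRequest req,
    ILogger log)
{
    // Reuses one HttpClient across invocations instead of newing one up per request.
    var response = await _httpClient.GetAsync("https://localhost:7101/WeatherForecast");
    var result = await response.Content.ReadAsStringAsync();
    return new OkObjectResult(result);
}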

Testing a Multipart file upload Azure Function

So I have written a simple Azure Function (AF) that accepts (via Http Post method) an IFormCollection, loops through the file collection, pushes each file into an Azure Blob storage container and returns the url to each file.
The function itself works perfectly when I do a single file or multiple file post through Postman using the 'multipart/form-data' header. However when I try to post a file through an xUnit test, I get the following error:
System.IO.InvalidDataException : Multipart body length limit 16384 exceeded.
I have searched high and low for a solution and tried different things, namely:
Replicating the request object to be as close as possible to Postman's request.
Playing around with the 'boundary' in the header.
Setting 'RequestFormLimits' on the function.
None of these have helped so far.
The details of the project are as follows:
Azure Function v3: targeting .netcoreapp3.1
Startup.cs
public class Startup : FunctionsStartup
{
public IConfiguration Configuration { get; private set; }
public override void Configure(IFunctionsHostBuilder builder)
{
var x = builder;
InitializeConfiguration(builder);
builder.Services.AddSingleton(Configuration.Get<UploadImagesAppSettings>());
builder.Services.AddLogging();
builder.Services.AddSingleton<IBlobService,BlobService>();
}
private void InitializeConfiguration(IFunctionsHostBuilder builder)
{
var executionContextOptions = builder
.Services
.BuildServiceProvider()
.GetService<IOptions<ExecutionContextOptions>>()
.Value;
Configuration = new ConfigurationBuilder()
.SetBasePath(executionContextOptions.AppDirectory)
.AddJsonFile("appsettings.json")
.AddJsonFile("appsettings.Development.json", optional: true)
.AddEnvironmentVariables()
.Build();
}
}
UploadImages.cs
public class UploadImages
{
private readonly IBlobService BlobService;
public UploadImages(IBlobService blobService)
{
BlobService = blobService;
}
[FunctionName("UploadImages")]
[RequestFormLimits(ValueLengthLimit = int.MaxValue,
MultipartBodyLengthLimit = 60000000, ValueCountLimit = 10)]
public async Task<IActionResult> Run(
[HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = "images")] HttpRequest req)
{
List<Uri> returnUris = new List<Uri>();
if (req.ContentLength == 0)
{
string badResponseMessage = $"Request has no content";
return new BadRequestObjectResult(badResponseMessage);
}
if (req.ContentType.Contains("multipart/form-data") && req.Form.Files.Count > 0)
{
foreach (var file in req.Form.Files)
{
if (!file.IsValidImage())
{
string badResponseMessage = $"{file.FileName} is not a valid/accepted Image file";
return new BadRequestObjectResult(badResponseMessage);
}
var uri = await BlobService.CreateBlobAsync(file);
if (uri == null)
{
return new ObjectResult($"Could not blob the file {file.FileName}.");
}
returnUris.Add(uri);
}
}
if (!returnUris.Any())
{
return new NoContentResult();
}
return new OkObjectResult(returnUris);
}
}
Exception Thrown:
The below exception is thrown at the second if statement above, when it tries to process req.Form.Files.Count > 0, i.e.
if (req.ContentType.Contains("multipart/form-data") && req.Form.Files.Count > 0) {}
Message:
System.IO.InvalidDataException : Multipart body length limit 16384 exceeded.
Stack Trace:
MultipartReaderStream.UpdatePosition(Int32 read)
MultipartReaderStream.ReadAsync(Byte[] buffer, Int32 offset, Int32 count, CancellationToken cancellationToken)
StreamHelperExtensions.DrainAsync(Stream stream, ArrayPool`1 bytePool, Nullable`1 limit, CancellationToken cancellationToken)
MultipartReader.ReadNextSectionAsync(CancellationToken cancellationToken)
FormFeature.InnerReadFormAsync(CancellationToken cancellationToken)
FormFeature.ReadForm()
DefaultHttpRequest.get_Form()
UploadImages.Run(HttpRequest req) line 42
UploadImagesTests.HttpTrigger_ShouldReturnListOfUploadedUris(String fileNames)
xUnit Test Project: targeting .netcoreapp3.1
Over to the xUnit Test project, basically I am trying to write an integration test. The project references the AF project and has the following classes:
TestHost.cs
public class TestHost
{
public TestHost()
{
var startup = new TestStartup();
var host = new HostBuilder()
.ConfigureWebJobs(startup.Configure)
.ConfigureServices(ReplaceTestOverrides)
.Build();
ServiceProvider = host.Services;
}
public IServiceProvider ServiceProvider { get; }
private void ReplaceTestOverrides(IServiceCollection services)
{
// services.Replace(new ServiceDescriptor(typeof(ServiceToReplace), testImplementation));
}
private class TestStartup : Startup
{
public override void Configure(IFunctionsHostBuilder builder)
{
SetExecutionContextOptions(builder);
base.Configure(builder);
}
private static void SetExecutionContextOptions(IFunctionsHostBuilder builder)
{
builder.Services.Configure<ExecutionContextOptions>(o => o.AppDirectory = Directory.GetCurrentDirectory());
}
}
}
TestCollection.cs
[CollectionDefinition(Name)]
public class TestCollection : ICollectionFixture<TestHost>
{
public const string Name = nameof(TestCollection);
}
HttpRequestFactory.cs: To create Http Post Request
public static class HttpRequestFactory
{
public static DefaultHttpRequest Create(string method, string contentType, Stream body)
{
var request = new DefaultHttpRequest(new DefaultHttpContext());
var contentTypeWithBoundary = new MediaTypeHeaderValue(contentType)
{
Boundary = $"----------------------------{DateTime.Now.Ticks.ToString("x")}"
};
var boundary = MultipartRequestHelper.GetBoundary(
contentTypeWithBoundary, (int)body.Length);
request.Method = method;
request.Headers.Add("Cache-Control", "no-cache");
request.Headers.Add("Content-Type", contentType);
request.ContentType = $"{contentType}; boundary={boundary}";
request.ContentLength = body.Length;
request.Body = body;
return request;
}
private static string GetBoundary(MediaTypeHeaderValue contentType, int lengthLimit)
{
var boundary = HeaderUtilities.RemoveQuotes(contentType.Boundary);
if (string.IsNullOrWhiteSpace(boundary.Value))
{
throw new InvalidDataException("Missing content-type boundary.");
}
if (boundary.Length > lengthLimit)
{
throw new InvalidDataException(
$"Multipart boundary length limit {lengthLimit} exceeded.");
}
return boundary.Value;
}
}
The MultipartRequestHelper.cs class is available here
And finally, the test class:
[Collection(TestCollection.Name)]
public class UploadImagesTests
{
readonly UploadImages UploadImagesFunction;
public UploadImagesTests(TestHost testHost)
{
UploadImagesFunction = new UploadImages(testHost.ServiceProvider.GetRequiredService<IBlobService>());
}
[Theory]
[InlineData("testfile2.jpg")]
public async void HttpTrigger_ShouldReturnListOfUploadedUris(string fileNames)
{
var formFile = GetFormFile(fileNames);
var fileStream = formFile.OpenReadStream();
var request = HttpRequestFactory.Create("POST", "multipart/form-data", fileStream);
var response = (OkObjectResult)await UploadImagesFunction.Run(request);
//fileStream.Close();
Assert.True(response.StatusCode == StatusCodes.Status200OK);
}
private static IFormFile GetFormFile(string fileName)
{
string fileExtension = fileName.Substring(fileName.IndexOf('.') + 1);
string fileNameandPath = GetFilePathWithName(fileName);
IFormFile formFile;
var stream = File.OpenRead(fileNameandPath);
switch (fileExtension)
{
case "jpg":
formFile = new FormFile(stream, 0, stream.Length,
fileName.Substring(0, fileName.IndexOf('.')),
fileName)
{
Headers = new HeaderDictionary(),
ContentType = "image/jpeg"
};
break;
case "png":
formFile = new FormFile(stream, 0, stream.Length,
fileName.Substring(0, fileName.IndexOf('.')),
fileName)
{
Headers = new HeaderDictionary(),
ContentType = "image/png"
};
break;
case "pdf":
formFile = new FormFile(stream, 0, stream.Length,
fileName.Substring(0, fileName.IndexOf('.')),
fileName)
{
Headers = new HeaderDictionary(),
ContentType = "application/pdf"
};
break;
default:
formFile = null;
break;
}
return formFile;
}
private static string GetFilePathWithName(string filename)
{
var outputFolder = Path.GetDirectoryName(System.Reflection.Assembly.GetExecutingAssembly().Location);
return $"{outputFolder.Substring(0, outputFolder.IndexOf("bin"))}testfiles\\{filename}";
}
}
The test seems to be hitting the function and req.ContentLength does have a value. Considering this, could it have something to do with the way the File Streams are being managed? Perhaps not the right way?
Any inputs on this would be greatly appreciated.
Thanks
UPDATE 1
As per this post, I have also tried setting the ValueLengthLimit and MultipartBodyLengthLimit in the Startup of the Azure Function and/or the Test Project as opposed to attributes on the Azure Function. The exception then changed to:
"The inner stream position has changed unexpectedly"
Following this, I then set the fileStream position in the test project to SeekOrigin.Begin. I started getting the same error:
"Multipart body length limit 16384 exceeded."
It took me a 50km bike ride and a good night's sleep, but I finally figured this one out :-).
The Azure function (AF) accepts an HttpRequest object as a parameter with the name of 'req' i.e.
public async Task<IActionResult> Run(
    [HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = "images")] HttpRequest req)
The hierarchy of the files object in the HttpRequest object (along with the parameter names) is as follows:
HttpRequest -> req
FormCollection -> Form
FormFileCollection -> Files
This is what the AF accepts and one would access the files collection by using req.Form.Files
In my test case, instead of posting a FormCollection object, I was trying to post a Stream of a file to the Azure Function.
var formFile = GetFormFile(fileNames);
var fileStream = formFile.OpenReadStream();
var request = HttpRequestFactory.Create("POST", "multipart/form-data", fileStream);
As a result, req.Form received a Stream it could not interpret, and req.Form.Files was raising an exception.
In order to rectify this, I had to do the following:
Revert all changes made as part of UPDATE 1. This means I removed the 'RequestFormLimits' settings from the Startup file and left them as attributes on the function's Run method.
Instantiate a FormFileCollection object and add the IFormFile to it
Instantiate a FormCollection object using this FormFileCollection as a parameter.
Add the FormCollection to the request object.
To achieve the above, I had to make the following changes in code.
Change 'Create' method in the HttpRequestFactory
public static DefaultHttpRequest Create(string method, string contentType, FormCollection formCollection)
{
var request = new DefaultHttpRequest(new DefaultHttpContext());
var boundary = $"----------------------------{DateTime.Now.Ticks.ToString("x")}";
request.Method = method;
request.Headers.Add("Cache-Control", "no-cache");
request.Headers.Add("Content-Type", contentType);
request.ContentType = $"{contentType}; boundary={boundary}";
request.Form = formCollection;
return request;
}
Add a private static GetFormFiles() method
I wrote an additional GetFormFiles() method that calls the existing GetFormFile() method, instantiates a FormFileCollection and adds each IFormFile to it. This method in turn returns the FormFileCollection.
private static FormFileCollection GetFormFiles(string fileNames)
{
var formFileCollection = new FormFileCollection();
foreach (var file in fileNames.Split(','))
{
formFileCollection.Add(GetFormFile(file));
}
return formFileCollection;
}
Change the Testmethod
The test method calls GetFormFiles() to get a FormFileCollection, instantiates a FormCollection from it, and then passes that FormCollection to the HttpRequest object instead of passing a Stream.
[Theory]
[InlineData("testfile2.jpg")]
public async void HttpTrigger_ShouldReturnListOfUploadedUris(string fileNames)
{
var formFiles = GetFormFiles(fileNames);
var formCollection = new FormCollection(null, formFiles);
var request = HttpRequestFactory.Create("POST", "multipart/form-data", formCollection);
var response = (OkObjectResult) await UploadImagesFunction.Run(request);
Assert.True(response.StatusCode == StatusCodes.Status200OK);
}
So in the end the issue was not really with the 'RequestFormLimits' but rather with the type of data I was submitting in the POST message.
I hope this answer provides a different perspective to someone that comes across the same error message.
Cheers.

reading content of blob from azure function

I'm trying to read the content of a blob inside an azure function.
Here's the code:
Note:
If I comment out the using block and return the blob i.e.
return new OkObjectResult(blob);
I get back the blob object.
However, if I use the using block, I get 500.
Any idea why I can't get the content?
string storageConnectionString = "myConnectionString";
CloudStorageAccount storageAccount;
CloudStorageAccount.TryParse(storageConnectionString, out storageAccount);
CloudBlobClient cloudBlobClient = storageAccount.CreateCloudBlobClient();
CloudBlobContainer drawingsContainer = cloudBlobClient.GetContainerReference("drawcontainer");
var blob = drawingsContainer.GetBlockBlobReference("notes.txt");
using (StreamReader reader = new StreamReader(blob.OpenRead()))
{
content = reader.ReadToEnd();
}
return new OkObjectResult(content);
HTTP 500 indicates that the code threw an error. The most probable cause is the variable 'content': it is never declared, and if you declare it inside the using block its scope is limited to that block. Declare it outside the using block, something like below:
try
{
string content = string.Empty;
using (StreamReader reader = new StreamReader(blob.OpenRead()))
{
content = reader.ReadToEnd();
}
}
catch (Exception ex)
{
// Log exception to get the details.
}
Always make use of try catch to get more details about errors in the code.
The OpenRead method wasn't available, so I used the async one (OpenReadAsync) and that solved it.
I arrived at this solution after creating an Azure Function in Visual Studio and publishing it, and it works.
Here's the code I used:
public static class Function1
{
[FunctionName("Function1")]
public static async Task<ActionResult> Run([HttpTrigger(AuthorizationLevel.Anonymous, "get", "post", Route = null)]HttpRequest req, TraceWriter log)
{
log.Info("C# HTTP trigger function processed a request.");
string storageConnectionString = "DefaultEndpointsProtocol=https;AccountName=avitest19a1c;AccountKey=<AccessKey>";
CloudStorageAccount storageAccount = null;
CloudStorageAccount.TryParse(storageConnectionString, out storageAccount);
CloudBlobClient cloudBlobClient = storageAccount.CreateCloudBlobClient();
CloudBlobContainer drawingsContainer = cloudBlobClient.GetContainerReference("drawcontainer");
var blob = drawingsContainer.GetBlockBlobReference("notes.txt");
string content = string.Empty;
var contentStream = await blob.OpenReadAsync(); // the key change: async OpenRead
using (StreamReader reader = new StreamReader(contentStream))
{
content = reader.ReadToEnd();
}
return new OkObjectResult(content);
}
}
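As a follow-up, if the blob is plain text, the same SDK also offers a one-call alternative. A sketch, assuming the same drawingsContainer and notes.txt references as above:
var blob = drawingsContainer.GetBlockBlobReference("notes.txt");
// DownloadTextAsync avoids the stream/reader plumbing for text blobs.
string content = await blob.DownloadTextAsync();
return new OkObjectResult(content);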

Azure Function Cosmos DB Output Binding - Custom JsonSerializerSettings

I have an Azure Function with a CosmosDB output binding, like this:
public static class ComponentDesignHttpTrigger
{
[FunctionName("ComponentDesignInserter-Http-From-ComponentDesign")]
public static IActionResult Run(
[HttpTrigger(AuthorizationLevel.Anonymous, "post", Route = "fromComponentDesign")] HttpRequest request,
[CosmosDB(
databaseName: StorageFramework.CosmosDb.DatabaseId,
collectionName: Storage.ComponentDesignCollectionId,
ConnectionStringSetting = "CosmosDBConnection")] out ComponentDesign componentDesignToInsert,
ILogger log)
{
var requestBody = new StreamReader(request.Body).ReadToEnd();
componentDesignToInsert = JsonConvert.DeserializeObject<ComponentDesign>(requestBody);
return new OkObjectResult(componentDesignToInsert);
}
}
In this function, componentDesignToInsert is automatically serialized and written to Cosmos DB after the function finishes executing. But the default serialization does not put things in camelCase. For this, Json.NET lets you provide custom serializer settings, like this:
var settings = new JsonSerializerSettings
{
ContractResolver = new CamelCasePropertyNamesContractResolver()
};
var json = JsonConvert.SerializeObject(yourObject, settings);
but I'm unsure how I can integrate this with my output binding. How can I accomplish this?
The output binding does not expose the serializer settings at this moment.
One thing you can do, though, is leverage your own custom DocumentClient for the operation.
One important thing is that the DocumentClient instance needs to be static (more details at https://github.com/Azure/azure-functions-host/wiki/Managing-Connections).
private static Lazy<DocumentClient> lazyClient = new Lazy<DocumentClient>(InitializeDocumentClient);
private static DocumentClient documentClient => lazyClient.Value;
private static DocumentClient InitializeDocumentClient()
{
// Perform any initialization here
var uri = new Uri("example");
var authKey = "authKey";
var settings = new JsonSerializerSettings
{
ContractResolver = new CamelCasePropertyNamesContractResolver()
};
return new DocumentClient(uri, authKey, settings);
}
[FunctionName("ComponentDesignInserter-Http-From-ComponentDesign")]
public static async Task<IActionResult> Run(
[HttpTrigger(AuthorizationLevel.Anonymous, "post", Route = "fromComponentDesign")] HttpRequest request,
ILogger log)
{
var requestBody = new StreamReader(request.Body).ReadToEnd();
var componentDesignToInsert = JsonConvert.DeserializeObject<ComponentDesign>(requestBody);
var collectionUri = UriFactory.GetDocumentCollectionUri(StorageFramework.CosmosDb.DatabaseId, Storage.ComponentDesignCollectionId);
await documentClient.UpsertDocumentAsync(collectionUri, componentDesignToInsert);
return new OkObjectResult(componentDesignToInsert);
}
Another option is to decorate the class with JsonProperty if that suits your scenario.
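A minimal sketch of that option, assuming ComponentDesign has properties along these lines (the property names here are illustrative):
public class ComponentDesign
{
    // Json.NET attributes force the emitted property names, regardless of the default contract resolver.
    [JsonProperty("id")]
    public string Id { get; set; }

    [JsonProperty("componentName")]
    public string ComponentName { get; set; }
}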

How to upload a file to a storage location through URL using Azure function app

I want to upload a file from Azure Blob Storage to a storage location through a URL using an Azure Function app. I'm able to pull the file from Azure Blob Storage, but not able to upload the file through the URL.
Below is the code I have written. Could anyone help me with this?
#r "Newtonsoft.Json"
#r "Microsoft.WindowsAzure.Storage"
#r "System.IO"
using System;
using System.IO;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;
using Microsoft.WindowsAzure.Storage.Auth;
using System.Xml;
using System.Collections.Generic;
using Newtonsoft.Json;
using System.Net;
using System.Net.Http;
public static async Task Run(string input, TraceWriter log)
{
    log.Info($"C# manual trigger function processed\n");
    const string StorageAccountName = "";
    const string StorageAccountKey = "";
    var storageAccount = new CloudStorageAccount(new StorageCredentials(StorageAccountName, StorageAccountKey), true);
    var blobClient = storageAccount.CreateCloudBlobClient();
    var container = blobClient.GetContainerReference("hannahtest");
    var Destcontainer = blobClient.GetContainerReference("hannahtestoutput");
    var blobs = container.ListBlobs();
    var client = new HttpClient();
    log.Info($"Creating Client and Connecting");
    foreach (IListBlobItem item in container.ListBlobs(null, false))
    {
        if (item is CloudBlockBlob blockBlob)
        {
            using (StreamReader reader = new StreamReader(blockBlob.OpenRead()))
            {
                // Read the block blob (XML) till the end
                string oldContent1 = reader.ReadToEnd();
                log.Info(oldContent1);
                // Post the blob content as the request body
                var content = new StringContent(oldContent1);
                var response = await client.PostAsync("http://www.example.com/recepticle.aspx", content);
                var responseString = await response.Content.ReadAsStringAsync();
                log.Info($"Success");
            }
        }
    }
}
Have a look at Blob Output Binding - that's how the blobs are intended to be uploaded from Azure Functions, without messing with Azure Storage SDK.
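A minimal sketch of that idea, reusing the 'hannahtestoutput' container from the question (the blob name pattern, connection setting, and function name are illustrative):
using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;

public static class CopyToBlobFunction
{
    [FunctionName("CopyToBlob")]
    public static async Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequest req,
        // The output binding writes whatever is copied into this stream to a new blob.
        [Blob("hannahtestoutput/{rand-guid}.xml", FileAccess.Write, Connection = "AzureWebJobsStorage")] Stream outputBlob)
    {
        await req.Body.CopyToAsync(outputBlob);
        return new OkResult();
    }
}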
An Azure Function to upload multiple image files to Blob Storage:
using Microsoft.WindowsAzure.Storage.Auth;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;
public static class ImageUploadFunction
{
[FunctionName("ImageUploadFunction")]
public static async Task<IActionResult> Run([HttpTrigger(AuthorizationLevel.Function, "post")]HttpRequestMessage req, ILogger log)
{
var provider = new MultipartMemoryStreamProvider();
await req.Content.ReadAsMultipartAsync(provider);
var files = provider.Contents;
List<string> uploadsurls = new List<string>();
foreach (var file in files) {
var fileInfo = file.Headers.ContentDisposition;
Guid guid = Guid.NewGuid();
string oldFileName = fileInfo.FileName;
string newFileName = guid.ToString();
var fileExtension = oldFileName.Split('.').Last().Replace("\"", "").Trim();
var fileData = await file.ReadAsByteArrayAsync();
try {
//Upload file to azure blob storage method
var upload = await UploadFileToStorage(fileData, newFileName + "." + fileExtension);
uploadsurls.Add(upload);
}
catch (Exception ex) {
log.LogError(ex.Message);
return new BadRequestObjectResult("Something went wrong.");
}
}
return uploadsurls != null
? (ActionResult)new OkObjectResult(uploadsurls)
: new BadRequestObjectResult("Something went wrong.");
}
private static async Task<string> UploadFileToStorage(byte[] fileStream, string fileName)
{
// Create storagecredentials object by reading the values from the configuration (appsettings.json)
StorageCredentials storageCredentials = new StorageCredentials("<AccountName>", "<KeyValue>");
// Create cloudstorage account by passing the storagecredentials
CloudStorageAccount storageAccount = new CloudStorageAccount(storageCredentials, true);
// Create the blob client.
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
// Get reference to the blob container by passing the name by reading the value from the configuration (appsettings.json)
CloudBlobContainer container = blobClient.GetContainerReference("digital-material-library-images");
// Get the reference to the block blob from the container
CloudBlockBlob blockBlob = container.GetBlockBlobReference(fileName);
// Upload the file
await blockBlob.UploadFromByteArrayAsync(fileStream,0, fileStream.Length);
blockBlob.Properties.ContentType = "image/jpg";
await blockBlob.SetPropertiesAsync();
return blockBlob.Uri.ToString();
//return await Task.FromResult(true);
}
}
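For completeness, a sketch of how a client might post a file to this function; the function URL, key, and file path below are placeholders:
// Posts one image as multipart/form-data to the function above.
using (var client = new HttpClient())
using (var form = new MultipartFormDataContent())
{
    var bytes = File.ReadAllBytes(@"C:\temp\photo1.jpg");
    var fileContent = new ByteArrayContent(bytes);
    fileContent.Headers.ContentType = new System.Net.Http.Headers.MediaTypeHeaderValue("image/jpeg");
    form.Add(fileContent, "file", "photo1.jpg");

    var response = await client.PostAsync(
        "https://<your-function-app>.azurewebsites.net/api/ImageUploadFunction?code=<function-key>", form);
    var urls = await response.Content.ReadAsStringAsync();
    Console.WriteLine(urls);
}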
