Blazor server side upload file to Azure Blob Storage

I am using a file upload component that posts the file to an API controller, and this works OK, but I need to get the progress of the upload.
[HttpPost("upload/single")]
public async Task SingleAsync(IFormFile file)
{
    try
    {
        // Azure connection string and container name passed as arguments to get the Blob reference of the container.
        var container = new BlobContainerClient(azureConnectionString, "upload-container");
        // Create our container if it doesn't exist.
        var createResponse = await container.CreateIfNotExistsAsync();
        // If the container was successfully created, set the public access type to Blob.
        if (createResponse != null && createResponse.GetRawResponse().Status == 201)
            await container.SetAccessPolicyAsync(Azure.Storage.Blobs.Models.PublicAccessType.Blob);
        // Create a new Blob client.
        var blob = container.GetBlobClient(file.FileName);
        // If a blob with the same name exists, delete the blob and its snapshots.
        await blob.DeleteIfExistsAsync(Azure.Storage.Blobs.Models.DeleteSnapshotsOption.IncludeSnapshots);
        // Open a file stream and use UploadAsync to upload the blob, reporting progress.
        uploadFileSize = file.Length;
        var progressHandler = new Progress<long>();
        progressHandler.ProgressChanged += UploadProgressChanged;
        using (var fileStream = file.OpenReadStream())
        {
            await blob.UploadAsync(fileStream, new BlobHttpHeaders { ContentType = file.ContentType }, progressHandler: progressHandler);
        }
        Response.StatusCode = 400;
    }
    catch (Exception e)
    {
        Response.Clear();
        Response.StatusCode = 204;
        Response.HttpContext.Features.Get<IHttpResponseFeature>().ReasonPhrase = "File failed to upload";
        Response.HttpContext.Features.Get<IHttpResponseFeature>().ReasonPhrase = e.Message;
    }
}

private double GetProgressPercentage(double totalSize, double currentSize)
{
    return (currentSize / totalSize) * 100;
}

private void UploadProgressChanged(object sender, long bytesUploaded)
{
    uploadPercentage = GetProgressPercentage(uploadFileSize, bytesUploaded);
}
I am posting this file and it does upload, but the file upload progress event is inaccurate: it says the upload is complete after a few seconds, when in reality the file takes ~90 seconds on my connection to appear in the Azure Blob Storage container.
So in the code above I have the progress handler, which works (I can put a breakpoint on it and see the value increasing), but how do I return this value to the UI?
I found one solution that used Microsoft.AspNetCore.SignalR, but I can't manage to integrate it into my own code, and I'm not even sure I'm on the right track.
using BlazorReportProgress.Server.Hubs;
using Microsoft.AspNetCore.Mvc;
using Microsoft.AspNetCore.SignalR;
using System.Threading;
namespace BlazorReportProgress.Server.Controllers
{
[ApiController]
[Route("api/[controller]")]
public class SlowProcessController : ControllerBase
{
private readonly ILogger<SlowProcessController> _logger;
private readonly IHubContext<ProgressHub> _hubController;
public SlowProcessController(
ILogger<SlowProcessController> logger,
IHubContext<ProgressHub> hubContext)
{
_logger = logger;
_hubController = hubContext;
}
[HttpGet("{ClientID}")]
public IEnumerable<int> Get(string ClientID)
{
List<int> retVal = new();
_logger.LogInformation("Incoming call from ClientID : {ClientID}", ClientID);
_hubController.Clients.Client(ClientID).SendAsync("ProgressReport", "Starting...");
Thread.Sleep(1000);
for (int loop = 0; loop < 10; loop++)
{
_hubController.Clients.Client(ClientID).SendAsync("ProgressReport", loop.ToString());
retVal.Add(loop);
Thread.Sleep(500);
}
_hubController.Clients.Client(ClientID).SendAsync("ProgressReport", "Done!");
return retVal;
}
}
}
I read the Steve Sanderson blog post, but it says not to use that code as it has been superseded by built-in Blazor functionality.
My application is only for a few users, so I'm not too worried about backend APIs etc. If the upload component used a service rather than a controller I could get the progress more easily, but the components all seem to post to controllers.
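To be concrete, the kind of wiring I am imagining is sketched below. It is only a rough sketch, not working code: it assumes a ProgressHub is mapped in Startup (endpoints.MapHub<ProgressHub>("/progressHub")), that IHubContext<ProgressHub> can be injected into the controller, and that the uploader sends its SignalR connection ID with the request, none of which my current setup does.
using System;
using System.Threading.Tasks;
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.AspNetCore.SignalR;

[ApiController]
[Route("api/[controller]")]
public class UploadController : ControllerBase
{
    // Hypothetical: ProgressHub is assumed to be mapped in Startup.
    private readonly IHubContext<ProgressHub> _progressHub;
    private readonly string azureConnectionString = "<connection string from configuration>";

    public UploadController(IHubContext<ProgressHub> progressHub) => _progressHub = progressHub;

    [HttpPost("upload/single")]
    public async Task<IActionResult> SingleAsync(IFormFile file, [FromForm] string connectionId)
    {
        var container = new BlobContainerClient(azureConnectionString, "upload-container");
        await container.CreateIfNotExistsAsync();
        var blob = container.GetBlobClient(file.FileName);

        // Forward each progress callback to the calling client over SignalR (fire-and-forget).
        var progressHandler = new Progress<long>(bytesUploaded =>
            _progressHub.Clients.Client(connectionId)
                .SendAsync("UploadProgress", (double)bytesUploaded / file.Length * 100));

        using var stream = file.OpenReadStream();
        await blob.UploadAsync(stream,
            new BlobHttpHeaders { ContentType = file.ContentType },
            progressHandler: progressHandler);

        return Ok();
    }
}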
Can anyone please enlighten me as to the best way to solve this?

Related

Azure Cognitive Services\Computer Vision\OCR - Can I use it in a C# website?

I'm trying to use Azure OCR in my C# website.
I added the Microsoft.Azure.CognitiveServices.Vision.ComputerVision package and wrote the following code, using the key and endpoint of my subscription.
static string subscriptionKey = "mykey";
static string endpoint = "https://myocr.cognitiveservices.azure.com/";
private const string ANALYZE_URL_IMAGE = "https://raw.githubusercontent.com/Azure-Samples/cognitive-services-sample-data-files/master/ComputerVision/Images/printed_text.jpg";
protected void Page_Load(object sender, EventArgs e)
{
// Create a client
ComputerVisionClient client = Authenticate(endpoint, subscriptionKey);
// Analyze an image to get features and other properties.
AnalyzeImageUrl(client, ANALYZE_URL_IMAGE).Wait();
}
public static ComputerVisionClient Authenticate(string endpoint, string key)
{
ComputerVisionClient client =
new ComputerVisionClient(new ApiKeyServiceClientCredentials(key))
{ Endpoint = endpoint };
return client;
}
public static async Task AnalyzeImageUrl(ComputerVisionClient client, string imageUrl)
{
// Read text from URL
var textHeaders = await client.ReadAsync(imageUrl);
...
}
It all seems OK, but at the line
var textHeaders = await client.ReadAsync(urlFile);
the website crashes.
I don't understand why. There is no error; it just stops.
So I ask: can Azure OCR only be used from a console app?
EDIT
The code works fine in a console app and in a web app, but not in my ASP.NET website.
Could it be a problem with async?
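I wonder whether blocking on the task with .Wait() in Page_Load is the issue. As a sketch of what I could try instead (this assumes the page directive is changed to Async="true"; the names are the ones from my snippet above):
// Sketch only: requires <%@ Page Async="true" ... %> in the .aspx page directive.
protected void Page_Load(object sender, EventArgs e)
{
    RegisterAsyncTask(new PageAsyncTask(async () =>
    {
        ComputerVisionClient client = Authenticate(endpoint, subscriptionKey);
        // Await instead of .Wait() so the ASP.NET synchronization context is not blocked.
        await AnalyzeImageUrl(client, ANALYZE_URL_IMAGE);
    }));
}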
You can use OCR from a web app as well. I created a .NET Core 3.1 web app in Visual Studio and installed the Microsoft.Azure.CognitiveServices.Vision.ComputerVision dependency (with the "Include prerelease" checkbox ticked).
After creating a Computer Vision resource in the Azure portal, I copied the key and endpoint from there and used them in the C# code below.
using System;
using System.Collections.Generic;
using Microsoft.Azure.CognitiveServices.Vision.ComputerVision;
using Microsoft.Azure.CognitiveServices.Vision.ComputerVision.Models;
using System.Threading.Tasks;
using System.IO;
using Newtonsoft.Json;
using Newtonsoft.Json.Linq;
using System.Threading;
using System.Linq;
namespace ComputerVisionQuickstart
{
class Program
{
// Add your Computer Vision subscription key and endpoint
static string subscriptionKey = "c1****b********";
static string endpoint = ".abc.cognitiveservices.azure.com/";
private const string READ_TEXT_URL_IMAGE = "https://raw.githubusercontent.com/Azure-Samples/cognitive-services-sample-data-files/master/ComputerVision/Images/printed_text.jpg";
static void Main(string[] args)
{
Console.WriteLine("Azure Cognitive Services Computer Vision - .NET quickstart example");
Console.WriteLine();
ComputerVisionClient client = Authenticate(endpoint, subscriptionKey);
// Extract text (OCR) from a URL image using the Read API
ReadFileUrl(client, READ_TEXT_URL_IMAGE).Wait();
}
public static ComputerVisionClient Authenticate(string endpoint, string key)
{
ComputerVisionClient client =
new ComputerVisionClient(new ApiKeyServiceClientCredentials(key))
{ Endpoint = endpoint };
return client;
}
public static async Task ReadFileUrl(ComputerVisionClient client, string urlFile)
{
Console.WriteLine("----------------------------------------------------------");
Console.WriteLine("READ FILE FROM URL");
Console.WriteLine();
// Read text from URL
var textHeaders = await client.ReadAsync(urlFile);
// After the request, get the operation location (operation ID)
string operationLocation = textHeaders.OperationLocation;
Thread.Sleep(2000);
// Retrieve the URI where the extracted text will be stored from the Operation-Location header.
// We only need the ID and not the full URL
const int numberOfCharsInOperationId = 36;
string operationId = operationLocation.Substring(operationLocation.Length - numberOfCharsInOperationId);
// Extract the text
ReadOperationResult results;
Console.WriteLine($"Extracting text from URL file {Path.GetFileName(urlFile)}...");
Console.WriteLine();
do
{
results = await client.GetReadResultAsync(Guid.Parse(operationId));
}
while ((results.Status == OperationStatusCodes.Running ||
results.Status == OperationStatusCodes.NotStarted));
// Display the found text.
Console.WriteLine();
var textUrlFileResults = results.AnalyzeResult.ReadResults;
foreach (ReadResult page in textUrlFileResults)
{
foreach (Line line in page.Lines)
{
Console.WriteLine(line.Text);
}
}
Console.WriteLine();
}
}
}
The above code is taken from the Microsoft documentation.
I was able to read the text inside the image successfully.

IFormFileCollection: Cannot access a Disposed Object

I am trying to upload a document to Azure Blob Storage, but it throws the error below while reading the file stream.
Error:
Cannot access a disposed object.
Object name: 'FileBufferingReadStream'.
The error occurs when OpenReadStream() is called:
public async Task UploadAsync(IFormFileCollection files, string directoryName)
{
var blobContainer = await _azureBlobConnectionFactory.GetBlobContainer();
CloudBlobDirectory directory = blobContainer.GetDirectoryReference(directoryName);
for (int i = 0; i < files.Count; i++)
{
CloudBlockBlob blockblob = directory.GetBlockBlobReference(files[i].FileName);
using (var stream = files[i].OpenReadStream())
{
await blockblob.UploadFromStreamAsync(stream);
}
}
}
I am calling the UploadAsync method from my actual service class like this:
public async Task<bool> UploadToBlob(DocumentModel model,string directorypath)
{
try
{
//string directory = directorypath + model.EmailId + "/" + model.Files[0].FileName;
await _blobService.UploadAsync(model.Files, "Documents/dummy.pdf");
return true;
}
catch(Exception e)
{
throw e;
}
}
Where did I go wrong?
The return type of the action method can be a possible cause of this issue.
The error occurs if the return type is not Task<T>.
You can modify your action method as follows:
[HttpPost]
public async Task<int> ActionMethod(IFormFile img)
{
// same
return resultValue;
}
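Applied to the method names in the question, that would look something like the sketch below (the route and the [FromForm] binding are assumptions, and _documentService is a hypothetical field holding the service that exposes UploadToBlob):
[HttpPost("upload")]
public async Task<IActionResult> Upload([FromForm] DocumentModel model)
{
    // Await the whole upload before the action returns, so the request
    // (and the IFormFileCollection it owns) is not disposed mid-upload.
    if (!await _documentService.UploadToBlob(model, "Documents/"))
        return StatusCode(500, "File failed to upload");
    return Ok();
}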
References :
When trying to upload file, FileBufferingReadStream.ThrowIfDisposed() error occur
Cannot access a disposed object. Object name: FileBufferingReadStream
IFormFile copy to memorystream ObjectDisposedException

Can I set the access tier when I upload a blob? If yes, how do I do that?

I did not find any way to set the access tier of a blob when I upload it. I know I can set a blob's access tier after I have uploaded it, but I want to know whether I can upload the blob and set its access tier in a single step, and whether there is a Go API to do that.
I googled it but got nothing helpful so far.
Here is what I do now, i.e. upload the blob and then set its access tier.
// Here's how to upload a blob.
blobURL := containerURL.NewBlockBlobURL(fileName)
ctx := context.Background()
_, err = azblob.UploadBufferToBlockBlob(ctx, data, blobURL, azblob.UploadToBlockBlobOptions{})
handleErrors(err)
//set tier
_, err = blobURL.SetTier(ctx, azblob.AccessTierCool, azblob.LeaseAccessConditions{})
handleErrors(err)
But I want to upload the blob and set its tier in one step, not in two steps as I do now.
The short answer is no. According to the official REST API reference, the operation you want has to be done via two REST APIs: Put Blob and Set Blob Tier. All the SDK APIs for the different languages are implemented as wrappers around the related REST APIs.
Only for Page Blobs can you set the x-ms-access-tier header in the upload request to do what you want.
For Block Blobs, the two-step operation is necessary and the steps cannot be merged.
It is now possible using the new x-ms-access-tier header:
x-ms-access-tier
Here is an example calling the REST API with authentication:
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Net.Mime;
using System.Security.Cryptography;
using System.Text;
using System.Threading.Tasks;
namespace WhateverYourNameSpaceIs
{
class Program
{
private const string StorageKey = @"PutYourStorageKeyHere";
private const string StorageAccount = "PutYourStorageAccountHere";
private const string ContainerName = "PutYourContainerNameHere";
private const string Method = "PUT";
private const string ContentType = MediaTypeNames.Image.Jpeg;
private static readonly string BlobStorageTier = StorageTier.Cool;
private static readonly List<Tuple<string, string>> HttpContentHeaders = new List<Tuple<string, string>>()
{
new Tuple<string, string>("x-ms-access-tier", BlobStorageTier),
new Tuple<string, string>("x-ms-blob-type", "BlockBlob"),
new Tuple<string, string>("x-ms-date", DateTime.UtcNow.ToString("R")),
new Tuple<string, string>("x-ms-version", "2018-11-09"),
new Tuple<string, string>("Content-Type", ContentType),
};
static async Task Main()
{
await UploadBlobToAzure("DestinationFileNameWithoutPath", "LocalFileNameWithPath");
}
static async Task<int> UploadBlobToAzure(string blobName, string fileName)
{
int returnValue = (int)AzureCopyStatus.Unknown;
try
{
using var client = new HttpClient();
using var content = new ByteArrayContent(File.ReadAllBytes(fileName));
HttpContentHeaders.ForEach(x => content.Headers.Add(x.Item1, x.Item2));
var stringToSign = $"{Method}\n\n\n{content.Headers.ContentLength.Value}\n\n{ContentType}\n\n\n\n\n\n\n";
foreach (var httpContentHeader in HttpContentHeaders.Where(x => x.Item1 != "Content-Type").OrderBy(x => x.Item1))
stringToSign += $"{httpContentHeader.Item1.ToLower()}:{httpContentHeader.Item2}\n";
stringToSign += $"/{StorageAccount}/{ContainerName}/{blobName}";
HMACSHA256 hmac = new HMACSHA256(Convert.FromBase64String(StorageKey));
string signature = Convert.ToBase64String(hmac.ComputeHash(Encoding.UTF8.GetBytes(stringToSign)));
client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("SharedKey", $"{StorageAccount}:{signature}");
var httpResponse = await client.PutAsync($"https://{StorageAccount}.blob.core.windows.net/{ContainerName}/{blobName}", content);
returnValue = (int)httpResponse.StatusCode;
}
catch (IOException ioException)
{
Console.WriteLine(ioException.ToString());
returnValue = (int)AzureCopyStatus.FileNotFound;
}
catch (Exception exception)
{
Console.WriteLine(exception.ToString());
returnValue = (int)AzureCopyStatus.Error;
}
return returnValue;
}
internal enum AzureCopyStatus
{
Unknown = -1,
Error = 0,
FileNotFound = 2
}
internal static class StorageTier
{
internal static string Cool = "Cool";
internal static string Hot = "Hot";
}
}
}
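If switching SDKs is an option, the newer .NET client (Azure.Storage.Blobs v12) also lets you pass the tier in the same upload call through BlobUploadOptions. A minimal sketch (the connection string, container, blob and file names are placeholders, and this is assumed to run inside an async method):
using System.IO;
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;

string connectionString = "<your storage connection string>";
var blob = new BlobClient(connectionString, "my-container", "my-blob.jpg");

// Upload the content and set the access tier in a single request.
using var stream = File.OpenRead("local-file.jpg");
await blob.UploadAsync(stream, new BlobUploadOptions
{
    AccessTier = AccessTier.Cool,
    HttpHeaders = new BlobHttpHeaders { ContentType = "image/jpeg" }
});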

Any Example of WebJob using EventHub?

I've tried to come up with something from the example in the WebJobs SDK GitHub repository:
var eventHubConfig = new EventHubConfiguration();
string eventHubName = "MyHubName";
eventHubConfig.AddSender(eventHubName,"Endpoint=sb://test.servicebus.windows.net/;SharedAccessKeyName=SendRule;SharedAccessKey=xxxxxxxx");
eventHubConfig.AddReceiver(eventHubName, "Endpoint=sb://test.servicebus.windows.net/;SharedAccessKeyName=ReceiveRule;SharedAccessKey=yyyyyyy");
config.UseEventHub(eventHubConfig);
JobHost host = new JobHost(config);
But I'm afraid that's not far enough for someone of my limited "skillset"!
I can find no instance of JobHostConfiguration that has a UseEventHub property (using the v1.2.0-alpha-10291 version of the Microsoft.Azure.WebJobs package), so I can't pass the EventHubConfiguration to the JobHost.
I've used Event Hubs before, but not within the WebJobs context. I can't tell whether the EventProcessorHost is still required when using WebJobs triggering, or does the WebJobs trigger essentially act as the EventProcessorHost?
Anyway, if anyone has a more complete example for a simpleton like me that would be really sweet! Thanks
From the documentation here, you should have all the information you need.
What you are missing is a reference to the Microsoft.Azure.WebJobs.ServiceBus 1.2.0-alpha-10291 NuGet package.
UseEventHub is an extension method declared in that package.
Otherwise your configuration looks OK.
Here is an example of how to receive and send messages from/to an Event Hub:
public class BasicTest
{
public class Payload
{
public int Counter { get; set; }
}
public static void SendEvents([EventHub("MyHubName")] out Payload x)
{
x = new Payload { Counter = 100 };
}
public static void Trigger(
[EventHubTrigger("MyHubName")] Payload x,
[EventHub("MyHubName")] out Payload y)
{
x.Counter++;
y = x;
}
}
EventProcessorHost is still required, as the WebJob just provides the hosting environment for running it. As far as I know, EventProcessorHost is not integrated that deeply into WebJobs, so the WebJobs triggering mechanism cannot be used for processing Event Hub messages. I use a WebJob to run EventProcessorHost continuously:
public static void Main()
{
    RunAsync().Wait();
}

private static async Task RunAsync()
{
    try
    {
        using (var shutdownWatcher = new WebJobsShutdownWatcher())
        {
            await Console.Out.WriteLineAsync("Initializing...");
            var eventProcessorHostName = "eventProcessorHostName";
            var eventHubName = ConfigurationManager.AppSettings["eventHubName"];
            var consumerGroupName = ConfigurationManager.AppSettings["eventHubConsumerGroupName"];
            var eventHubConnectionString = ConfigurationManager.ConnectionStrings["EventHub"].ConnectionString;
            var storageConnectionString = ConfigurationManager.ConnectionStrings["EventHubStorage"].ConnectionString;
            var eventProcessorHost = new EventProcessorHost(eventProcessorHostName, eventHubName, consumerGroupName, eventHubConnectionString, storageConnectionString);

            await Console.Out.WriteLineAsync("Registering event processors...");
            var processorOptions = new EventProcessorOptions();
            processorOptions.ExceptionReceived += ProcessorOptions_ExceptionReceived;
            await eventProcessorHost.RegisterEventProcessorAsync<CustomEventProcessor>(processorOptions);

            await Console.Out.WriteLineAsync("Processing...");
            await Task.Delay(Timeout.Infinite, shutdownWatcher.Token);

            await Console.Out.WriteLineAsync("Unregistering event processors...");
            await eventProcessorHost.UnregisterEventProcessorAsync();

            await Console.Out.WriteLineAsync("Finished.");
        }
    }
    catch (Exception ex)
    {
        await HandleErrorAsync(ex);
    }
}

private static async void ProcessorOptions_ExceptionReceived(object sender, ExceptionReceivedEventArgs e)
{
    await HandleErrorAsync(e.Exception);
}

private static async Task HandleErrorAsync(Exception ex)
{
    await Console.Error.WriteLineAsync($"Critical error occurred: {ex.Message}{ex.StackTrace}");
}
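CustomEventProcessor in the snippet above is your own IEventProcessor implementation and is not shown here. A minimal sketch of what it could look like (this assumes the older Microsoft.ServiceBus.Messaging types, which is what the EventProcessorOptions.ExceptionReceived event above suggests):
using System;
using System.Collections.Generic;
using System.Threading.Tasks;
using Microsoft.ServiceBus.Messaging;

public class CustomEventProcessor : IEventProcessor
{
    public Task OpenAsync(PartitionContext context)
    {
        Console.WriteLine($"Processor opened for partition {context.Lease.PartitionId}");
        return Task.CompletedTask;
    }

    public async Task ProcessEventsAsync(PartitionContext context, IEnumerable<EventData> messages)
    {
        foreach (var eventData in messages)
        {
            // Handle the event payload here.
            byte[] body = eventData.GetBytes();
            Console.WriteLine($"Received {body.Length} bytes from partition {context.Lease.PartitionId}");
        }

        // Checkpoint so processing resumes from here after a restart.
        await context.CheckpointAsync();
    }

    public async Task CloseAsync(PartitionContext context, CloseReason reason)
    {
        if (reason == CloseReason.Shutdown)
            await context.CheckpointAsync();
    }
}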

UWP apps accessing files from random location on system

In UWP there are file and permission restrictions, so we can only access files directly from a few folders, or we can use the file picker to access files from anywhere on the system.
How can I use files picked with the file picker again the next time the app runs? I tried to open them again by path, but that gives a permission error. I know about the FutureAccessList, but its limit is 1,000 entries, and won't it also make the app slow, if I'm not wrong?
Is there a better way to do this? Or can we somehow store links to the StorageFiles in a local SQLite database?
If you need to access lots of files, asking the user to select the parent folder and then storing that is probably a better solution (unless you really want to store 1,000 individually picked files from different locations). You can store StorageFolders in the access list as well.
I'm not sure why you think it will make your app slow, but the only real way to know whether this affects your performance is to try it and measure against your goals.
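For reference, a minimal sketch of storing a picked folder and reopening it on a later run via StorageApplicationPermissions.FutureAccessList (the settings key name here is arbitrary, and both snippets are assumed to run inside an async method):
using Windows.Storage;
using Windows.Storage.AccessCache;
using Windows.Storage.Pickers;

// First run: let the user pick the parent folder and remember it.
var folderPicker = new FolderPicker { SuggestedStartLocation = PickerLocationId.DocumentsLibrary };
folderPicker.FileTypeFilter.Add("*");
StorageFolder folder = await folderPicker.PickSingleFolderAsync();
if (folder != null)
{
    string token = StorageApplicationPermissions.FutureAccessList.Add(folder);
    ApplicationData.Current.LocalSettings.Values["parentFolderToken"] = token;
}

// Later run: reopen the folder from the stored token and enumerate its files.
string savedToken = (string)ApplicationData.Current.LocalSettings.Values["parentFolderToken"];
StorageFolder savedFolder = await StorageApplicationPermissions.FutureAccessList.GetFolderAsync(savedToken);
var files = await savedFolder.GetFilesAsync();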
Considering this method..
public async static Task<byte[]> ToByteArray(this StorageFile file)
{
byte[] fileBytes = null;
using (IRandomAccessStreamWithContentType stream = await file.OpenReadAsync())
{
fileBytes = new byte[stream.Size];
using (DataReader reader = new DataReader(stream))
{
await reader.LoadAsync((uint)stream.Size);
reader.ReadBytes(fileBytes);
}
}
return fileBytes;
}
This class..
public class AppFile
{
public string FileName { get; set; }
public byte[] ByteArray { get; set; }
}
And this variable
List<AppFile> _appFiles = new List<AppFile>();
Just..
var fileOpenPicker = new FileOpenPicker();
IReadOnlyList<StorageFile> files = await fileOpenPicker.PickMultipleFilesAsync();
foreach (var file in files)
{
var byteArray = await file.ToByteArray();
_appFiles.Add(new AppFile { FileName = file.DisplayName, ByteArray = byteArray });
}
UPDATE
using Newtonsoft.Json;
using System.Linq;
using Windows.Security.Credentials;
using Windows.Storage;
namespace Your.Namespace
{
public class StateService
{
public void SaveState<T>(string key, T value)
{
var localSettings = ApplicationData.Current.LocalSettings;
localSettings.Values[key] = JsonConvert.SerializeObject(value);
}
public T LoadState<T>(string key)
{
var localSettings = ApplicationData.Current.LocalSettings;
if (localSettings.Values.ContainsKey(key))
return JsonConvert.DeserializeObject<T>(((string) localSettings.Values[key]));
return default(T);
}
public void RemoveState(string key)
{
var localSettings = ApplicationData.Current.LocalSettings;
if (localSettings.Values.ContainsKey(key))
localSettings.Values.Remove((key));
}
public void Clear()
{
ApplicationData.Current.LocalSettings.Values.Clear();
}
}
}
A bit late, but yes, the future access list will slow down your app, in that it returns StorageFile, StorageFolder, or StorageItem objects. These run via the runtime broker, which hits a huge performance barrier at about 400 objects regardless of the host's capability.
