Azure: Blob Trigger throwing error in WebJobs

I am trying to create a WebJob with a blob trigger; my goal is to automate a process whenever a new blob is uploaded to a container.
I have written a few lines of code to test whether my WebJob is working, but it is not, and it throws the error below:
Microsoft.WindowsAzure.Storage.StorageException was unhandled
HResult=-2146233088
Message=The remote server returned an error: (400) Bad Request.
Can you help me sort out the issue?
I am attaching a snapshot of what I wrote to achieve this.

I am not able to repro the issue. Take a look at the code sample below, which works fine for me.
static void Main()
{
    CreateDemoData();
    // The connection string is read from App.config
    JobHost host = new JobHost();
    host.RunAndBlock();
}

private static void CreateDemoData()
{
    string connectionString = AmbientConnectionStringProvider.Instance.GetConnectionString(ConnectionStringNames.Storage);
    CloudStorageAccount storageAccount = CloudStorageAccount.Parse(connectionString);

    CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
    CloudBlobContainer container = blobClient.GetContainerReference("input");
    container.CreateIfNotExists();

    CloudBlockBlob blob = container.GetBlockBlobReference("BlobOperations.txt");
    blob.UploadText("Hello!");

    CloudQueueClient queueClient = storageAccount.CreateCloudQueueClient();
    CloudQueue queue = queueClient.GetQueueReference("persons");
    queue.CreateIfNotExists();

    Person person = new Person()
    {
        Name = "Mohit",
        Age = 30
    };
    queue.AddMessage(new CloudQueueMessage(JsonConvert.SerializeObject(person)));
}

// Simple POCO serialized onto the queue (JsonConvert comes from Newtonsoft.Json).
public class Person
{
    public string Name { get; set; }
    public int Age { get; set; }
}
Reference to GitHub project: Microsoft Azure WebJobs SDK Samples
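For completeness, a blob-triggered function that fires whenever a new blob lands in the "input" container could look like the following. This is a minimal sketch against the WebJobs SDK; the method simply logs the blob name:

using Microsoft.Azure.WebJobs;
using System.IO;

public class Functions
{
    // Fires when a new blob appears in the "input" container;
    // {name} binds the blob's name to the string parameter.
    public static void ProcessBlob(
        [BlobTrigger("input/{name}")] TextReader input,
        string name,
        TextWriter log)
    {
        log.WriteLine("New blob detected: " + name);
    }
}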

.NET Core: Reading Azure Storage Blob into Memory Stream throws NotSupportedException in HttpBaseStream

I want to download a storage blob from Azure and stream it to a client via a .NET web app. The blob was uploaded correctly and is visible in my Azure storage account.
Surprisingly, the following throws an exception within HttpBaseStream:
[...]
var blobClient = _containerClient.GetBlobClient(Path.Combine(fileName));
var stream = await blobClient.OpenReadAsync();
return stream;
-> When I step further and return a File (return File(stream, MediaTypeNames.Application.Octet);), the download works as intended.
I tried to push the stream into a MemoryStream, which also fails with the same exception:
[...]
var blobClient = _containerClient.GetBlobClient(Path.Combine(fileName));
var stream = new MemoryStream();
await blobClient.DownloadToAsync(stream);
return stream;
-> When I step further, returning the file results in a timeout.
How can I fix that? Why do I get this exception? I followed the official quickstart guide from Microsoft.
the following throws an exception within HttpBaseStream
It looks like the HTTP result type is attempting to set the Content-Length header and is reading Length to do so. That would be the natural thing to do. However, it would also be natural to handle the NotSupportedException and just not set Content-Length at all.
If the NotSupportedException only shows up when running in the debugger, then just ignore it.
If the exception is actually thrown to your code (i.e., causing the request to fail), then you'll need to follow the rest of this answer.
First, create a minimal reproducible example and report a bug to the .NET team.
To work around this issue in the meantime, I recommend writing a stream wrapper that returns an already-determined length, which you can get from the Azure blob attributes. E.g.:
public sealed class KnownLengthStreamWrapper : Stream
{
    private readonly Stream _stream;
    private readonly long _length;

    public KnownLengthStreamWrapper(Stream stream, long length)
    {
        _stream = stream;
        _length = length;
    }

    // Stream.Length is get-only, so the override cannot add a setter.
    public override long Length => _length;

    ... // override all other Stream members and forward to _stream.
}
That should be sufficient to get your app working.
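A possible usage sketch, assuming the Azure.Storage.Blobs BlobClient from the question (GetPropertiesAsync exposes the blob's ContentLength):

// Fetch the blob's length up front, then wrap the non-seekable read stream.
var properties = await blobClient.GetPropertiesAsync();
var stream = await blobClient.OpenReadAsync();
return new KnownLengthStreamWrapper(stream, properties.Value.ContentLength);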
I tried to push the stream into a MemoryStream
This didn't work because you'd need to "rewind" the MemoryStream at some point, e.g.:
var stream = new MemoryStream();
await blobClient.DownloadToAsync(stream);
stream.Position = 0;
return stream;
Check this sample of all the blob operations, which I have already posted on GitHub and which works as expected. Reference
public void DownloadBlob(string path)
{
    storageAccount = CloudStorageAccount.Parse(CloudConfigurationManager.GetSetting("StorageConnectionString"));
    CloudBlobClient client = storageAccount.CreateCloudBlobClient();
    CloudBlobContainer container = client.GetContainerReference("images");
    CloudBlockBlob blockBlob = container.GetBlockBlobReference(Path.GetFileName(path));

    using (MemoryStream ms = new MemoryStream())
    {
        blockBlob.DownloadToStream(ms);
        HttpContext.Current.Response.ContentType = blockBlob.Properties.ContentType;
        HttpContext.Current.Response.AddHeader("Content-Disposition", "Attachment; filename=" + Path.GetFileName(path));
        HttpContext.Current.Response.AddHeader("Content-Length", blockBlob.Properties.Length.ToString());
        HttpContext.Current.Response.BinaryWrite(ms.ToArray());
        HttpContext.Current.Response.Flush();
        HttpContext.Current.Response.Close();
    }
}

Azure Blob Storage Uploads Fine, but Blob Doesn't Exist

Here is my code:
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;
using System;
using Microsoft.WindowsAzure;
using System.Net.Http;

namespace Test
{
    class Program
    {
        static void Main(string[] args)
        {
            // Get the storage account from the connection string.
            CloudStorageAccount storageAccount = CloudStorageAccount.Parse("DefaultEndpointsProtocol=https;AccountName=[account name];AccountKey=[account key];EndpointSuffix=core.windows.net");

            // Instantiate the client.
            CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();

            // Set the container.
            CloudBlobContainer container = blobClient.GetContainerReference("images");

            // Get the blob reference.
            CloudBlockBlob blockBlob = container.GetBlockBlobReference("myblob.jpg");

            // Get the image from a stream and upload it.
            using (var client = new HttpClient())
            {
                using (var stream = client.GetStreamAsync(some_url).GetAwaiter().GetResult())
                {
                    if (stream != null)
                    {
                        blockBlob.UploadFromStreamAsync(stream);
                    }
                }
            }
        }
    }
}
The storage account instantiation works fine.
The container referencing works fine (it actually exists).
The block blob referencing works, as well, with no errors.
The stream has the image I am getting from the URL referenced.
Finally, the upload returns no errors.
Except, there is no image when I navigate to the Blob URI.
I get the following error:
The specified blob does not exist. RequestId:7df0aadc-0001-007c-6b90-f95158000000 Time:2017-07-10T15:21:25.2984015Z
I have also uploaded an image via the Azure Portal and that exists and can be navigated to through a browser.
Am I missing something?
Update the line below in your code, since you're calling an async method without awaiting it; Main exits before the upload finishes, so the blob is never committed.
blockBlob.UploadFromStreamAsync(stream).GetAwaiter().GetResult();
This should resolve your problem.
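Alternatively, on C# 7.1 or later you can make Main async and await the upload directly. A minimal sketch, reusing the setup from the question:

using System.Threading.Tasks;

static async Task Main(string[] args)
{
    // ... same storage and HttpClient setup as above ...
    await blockBlob.UploadFromStreamAsync(stream);
}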

An unhandled exception of type 'System.StackOverflowException' occurred in Microsoft.WindowsAzure.Storage.dll

I am presently working on a Windows 10 Universal application in which I am working with Azure Storage.
I am getting the above error when downloading a file from Azure Storage.
Here is my download code:
private async Task<int> DownloadFromAzureStorage()
{
    try
    {
        // Connect to Azure Storage.
        CloudStorageAccount storageAccount = CloudStorageAccount.Parse("DefaultEndpointsProtocol=https;AccountName=<myaccname>;AccountKey=<mykey>");

        // Create a blob client.
        CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();

        // Create a container.
        CloudBlobContainer container = blobClient.GetContainerReference("sample");
        await container.CreateIfNotExistsAsync();

        // Create a block blob.
        CloudBlockBlob blockBlob = container.GetBlockBlobReference("abc.jpg");

        FileSavePicker openPicker = new FileSavePicker();
        openPicker.SuggestedStartLocation = PickerLocationId.PicturesLibrary;
        openPicker.FileTypeChoices.Add("File", new List<string>() { ".jpg" });
        openPicker.SuggestedFileName = "New Documents";
        var imgFile = await openPicker.PickSaveFileAsync();

        await blockBlob.DownloadToFileAsync(imgFile); // Error occurs on this line.
        return 1;
    }
    catch
    {
        // Return error.
        return 0;
    }
}
Uploading a file to my Azure Storage succeeds, but when I download the file I uploaded, it throws "An unhandled exception of type 'System.StackOverflowException' occurred in Microsoft.WindowsAzure.Storage.dll" on the await blockBlob.DownloadToFileAsync(imgFile); line.
Please help me resolve this issue.
This is a known issue with the DownloadToFileAsync() method in the Win10 Universal Storage Client (unfortunately). The bug is fixed in the current preview release (7.0.2-preview), and will be fixed in an upcoming non-preview release. To fix your issue for now, please change your DownloadToFile call to the following:
await blockBlob.DownloadToFileAsync(imgFile, null, null, null, CancellationToken.None);

Azure - Creating a queue keeps returning "(400) Bad Request"

I am trying to simply create a new Storage Queue with Azure, but it keeps crashing without explanation; creating tables worked just fine. This is the relevant code:
private CloudTable userTable;
private CloudTable petTable;
private CloudQueue healingQueue;

public override bool OnStart()
{
    CloudStorageAccount storageAccount = CloudStorageAccount.Parse(CloudConfigurationManager.GetSetting("connectionString"));
    CloudTableClient tableClient = storageAccount.CreateCloudTableClient();

    userTable = tableClient.GetTableReference("users");
    userTable.CreateIfNotExists();

    petTable = tableClient.GetTableReference("pets");
    petTable.CreateIfNotExists();

    // This is where I create the queue:
    CloudQueueClient queueClient = storageAccount.CreateCloudQueueClient();
    healingQueue = queueClient.GetQueueReference("healQueue");
    healingQueue.CreateIfNotExists(); // This line makes the code crash.

    return base.OnStart();
}
The code crashes on the healingQueue.CreateIfNotExists(); line with the explanation (400) Bad Request.
The tables are created just fine, so I assume there is no problem with the storage account. Any ideas what I can do?
The problem is in the following line of code:
healingQueue = queueClient.GetQueueReference("healQueue");
Essentially, you're getting this error because you chose an invalid name for the queue. Try healqueue (all lowercase).
Please see this link for Queue naming rules: https://msdn.microsoft.com/en-us/library/azure/dd179349.aspx.
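For instance, a sketch of the corrected call, with the naming rules summarized in the comment:

// Queue names must be 3-63 characters long, contain only lowercase letters,
// digits, and hyphens, begin and end with a letter or digit, and must not
// contain consecutive hyphens.
healingQueue = queueClient.GetQueueReference("healqueue"); // all lowercase
healingQueue.CreateIfNotExists();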

Getting an error when uploading a file to Azure Storage

I'm converting a website from a standard ASP.NET website over to use Azure. The website had previously taken an Excel file uploaded by an administrative user and saved it on the file system. As part of the migration, I'm saving this file to Azure Storage. It works fine when running against my local storage through the Azure SDK. (I'm using version 1.3 since I didn't want to upgrade during the development process.)
When I point the code to run against Azure Storage itself, though, the process usually fails. The error I get is:
System.IO.IOException occurred
Message=Unable to read data from the transport connection: The connection was closed.
Source=Microsoft.WindowsAzure.StorageClient
StackTrace:
at Microsoft.WindowsAzure.StorageClient.Tasks.Task`1.get_Result()
at Microsoft.WindowsAzure.StorageClient.Tasks.Task`1.ExecuteAndWait()
at Microsoft.WindowsAzure.StorageClient.CloudBlob.UploadFromStream(Stream source, BlobRequestOptions options)
at Framework.Common.AzureBlobInteraction.UploadToBlob(Stream stream, String BlobContainerName, String fileName, String contentType) in C:\Development\RateSolution2010\Framework.Common\AzureBlobInteraction.cs:line 95
InnerException:
The code is as follows:
public void UploadToBlob(Stream stream, string BlobContainerName, string fileName,
    string contentType)
{
    // Set up the connection to Windows Azure Storage.
    CloudStorageAccount storageAccount = CloudStorageAccount.Parse(GetConnStr());

    DiagnosticMonitorConfiguration dmc = DiagnosticMonitor.GetDefaultInitialConfiguration();
    dmc.Logs.ScheduledTransferPeriod = TimeSpan.FromMinutes(1);
    dmc.Logs.ScheduledTransferLogLevelFilter = LogLevel.Verbose;
    DiagnosticMonitor.Start(storageAccount, dmc);

    CloudBlobClient BlobClient = null;
    CloudBlobContainer BlobContainer = null;
    BlobClient = storageAccount.CreateCloudBlobClient();

    // For large file copies you need to set up a custom timeout period,
    // and using parallel settings appears to spread the copy across multiple threads.
    // If you have big bandwidth you can increase the thread number below,
    // because Azure accepts blobs broken into blocks in any order of arrival.
    BlobClient.Timeout = new System.TimeSpan(1, 0, 0);
    Role serviceRole = RoleEnvironment.Roles.Where(s => s.Value.Name == "OnlineRates.Web").First().Value;
    BlobClient.ParallelOperationThreadCount = serviceRole.Instances.Count;

    // Get and create the container.
    BlobContainer = BlobClient.GetContainerReference(BlobContainerName);
    BlobContainer.CreateIfNotExist();

    // Delete the prior version if one exists.
    BlobRequestOptions options = new BlobRequestOptions();
    options.DeleteSnapshotsOption = DeleteSnapshotsOption.None;
    CloudBlob blobToDelete = BlobContainer.GetBlobReference(fileName);
    Trace.WriteLine("Blob " + fileName + " deleted to be replaced by newer version.");
    blobToDelete.DeleteIfExists(options);

    // Set the stream to its starting position.
    stream.Position = 0;
    long totalBytes = 0;

    // Open the stream and read it back.
    using (stream)
    {
        // Create the blob and upload the file.
        CloudBlockBlob blob = BlobContainer.GetBlockBlobReference(fileName);
        try
        {
            BlobClient.ResponseReceived += new EventHandler<ResponseReceivedEventArgs>((obj, responseReceivedEventArgs) =>
            {
                if (responseReceivedEventArgs.RequestUri.ToString().Contains("comp=block&blockid"))
                {
                    totalBytes += Int64.Parse(responseReceivedEventArgs.RequestHeaders["Content-Length"]);
                }
            });
            blob.UploadFromStream(stream);

            // Set the metadata on the blob.
            blob.Metadata["FileName"] = fileName;
            blob.SetMetadata();

            // Set the properties.
            blob.Properties.ContentType = contentType;
            blob.SetProperties();
        }
        catch (Exception exc)
        {
            Logging.ExceptionLogger.LogEx(exc);
        }
    }
}
I've tried a number of different alterations to the code: deleting a blob before replacing it (although the problem exists on new blobs as well), setting container permissions, not setting permissions, etc.
Your code looks like it should work, but it has lots of extra functionality that is not strictly required. I would cut it down to an absolute minimum and go from there. It's really only a gut feeling, but I think it might be the using statement giving you grief. This entire function could be written (presuming the container already exists) as:
public void UploadToBlob(Stream stream, string BlobContainerName, string fileName,
    string contentType)
{
    // Set up the connection to Windows Azure Storage.
    CloudStorageAccount storageAccount = CloudStorageAccount.Parse(GetConnStr());
    CloudBlobClient BlobClient = storageAccount.CreateCloudBlobClient();
    CloudBlobContainer BlobContainer = BlobClient.GetContainerReference(BlobContainerName);
    CloudBlockBlob blob = BlobContainer.GetBlockBlobReference(fileName);
    stream.Position = 0;
    blob.UploadFromStream(stream);
}
Notes on the stuff that I've removed:
You should set up diagnostics just once when your app starts, not every time a method is called, usually in RoleEntryPoint.OnStart() (see the sketch after this list).
I'm not sure why you're trying to set ParallelOperationThreadCount higher if you have more instances. Those two things seem unrelated.
It's not good form to check for the existence of a container/table every time you save something to it. It's more usual to do that check once when your app starts or to have a process external to the website to make sure all the required containers/tables/queues exist. Of course if you're trying to dynamically create containers this is not true.
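A minimal sketch of moving the diagnostics setup into the role entry point (assuming the standard SDK 1.x diagnostics connection-string setting name):

public class WebRole : RoleEntryPoint
{
    public override bool OnStart()
    {
        // Configure diagnostics once at role startup instead of on every upload.
        DiagnosticMonitorConfiguration dmc = DiagnosticMonitor.GetDefaultInitialConfiguration();
        dmc.Logs.ScheduledTransferPeriod = TimeSpan.FromMinutes(1);
        dmc.Logs.ScheduledTransferLogLevelFilter = LogLevel.Verbose;
        DiagnosticMonitor.Start("Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString", dmc);
        return base.OnStart();
    }
}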
The problem turned out to be firewall settings on my laptop. It's my personal laptop, originally set up at home, so the firewall rules weren't configured for a corporate environment, resulting in slow performance on uploads and downloads.
