Azure Queue duplicate items

Does anyone see a problem with this code? I have a single Azure Queue in operation that I write to after each transaction on a Web Application. The queue is only called once per transaction, but items sometimes appear more than once - 10 duplicates in some cases. This is sporadic and happens rarely, but it is annoying nonetheless.
public async Task<ActionResult> DoWork()
{
    // Carry out some processing
    string jsonMessage = new JavaScriptSerializer().Serialize(MyDataObject);
    await storageAccess.AddItemToQueue(new CloudQueueMessage(jsonMessage));
    // ... return an ActionResult
}
public async Task AddItemToQueue(CloudQueueMessage message)
{
    CloudStorageAccount storageAccount = CloudStorageAccount.Parse(
        ConfigurationManager.AppSettings["StorageConnectionString"]);

    // Create the queue client.
    CloudQueueClient queueClient = storageAccount.CreateCloudQueueClient();
    var queue = queueClient.GetQueueReference("outbox");
    await queue.CreateIfNotExistsAsync();
    await queue.AddMessageAsync(message);
}
If there is a timeout or a transient error, could the AddMessageAsync method be retrying the send until it succeeds?
Inspecting the queue, the items look like this: 3 items, all containing the same MyObject1, inserted well after the transaction completed.
1. ksasdsad-asdsad-asdsads-s222 12:05 MyObject1
2. sadadsd-weqwe-sdddsd-e21323 12:10 MyObject1
3. 32323-asdsads-2123213-sddwe 12:30 MyObject1
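One thing worth ruling out is client-side retry behaviour: the storage client retries failed or timed-out requests by default, and if the original AddMessage request actually reached the service but the response was lost, the retry enqueues a second copy of the same payload under a new message ID. Below is a minimal sketch, assuming the legacy CloudQueue client shown above (namespaces from the Microsoft.Azure.Storage.Queue package; in the older WindowsAzure.Storage package they live under Microsoft.WindowsAzure.Storage.*). The AddItemToQueueOnceAsync name and the option values are illustrative only; the idea is to disable retries for the add call and see whether the duplicates stop.

using System;
using System.Configuration;
using System.Threading.Tasks;
using Microsoft.Azure.Storage;
using Microsoft.Azure.Storage.Queue;
using Microsoft.Azure.Storage.RetryPolicies;

public async Task AddItemToQueueOnceAsync(CloudQueueMessage message)
{
    var storageAccount = CloudStorageAccount.Parse(
        ConfigurationManager.AppSettings["StorageConnectionString"]);
    var queueClient = storageAccount.CreateCloudQueueClient();
    var queue = queueClient.GetQueueReference("outbox");

    var options = new QueueRequestOptions
    {
        RetryPolicy = new NoRetry(),                     // fail fast instead of re-sending
        MaximumExecutionTime = TimeSpan.FromSeconds(30)  // overall cap for the whole call
    };

    // Overload of AddMessageAsync that accepts per-call request options
    // (message, timeToLive, initialVisibilityDelay, options, operationContext).
    await queue.AddMessageAsync(message, null, null, options, null);
}

If the duplicates disappear with NoRetry, the cause is a retry on a lost response rather than your application calling the method twice.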

Related

Batch insert to Table Storage via Azure function

I have the following Azure Storage queue-triggered Azure Function, which is bound to an Azure Table for output.
[FunctionName("TestFunction")]
public static async Task<IActionResult> Run(
[QueueTrigger("myqueue", Connection = "connection")]string myQueueItem,
[Table("TableXyzObject"), StorageAccount("connection")] IAsyncCollector<TableXyzObject> tableXyzObjectRecords)
{
var tableAbcObject = new TableXyzObject();
try
{
tableAbcObject.PartitionKey = DateTime.UtcNow.ToString("MMddyyyy");
tableAbcObject.RowKey = Guid.NewGuid();
tableAbcObject.RandomString = myQueueItem;
await tableXyzObjectRecords.AddAsync(tableAbcObject);
}
catch (Exception ex)
{
}
return new OkObjectResult(tableAbcObject);
}
public class TableXyzObject : TableEntity
{
public string RandomString { get; set; }
}
}
}
I am looking for a way to read 15 messages from a poison queue (which is different from myqueue, the queue trigger on the above Azure Function) and batch insert them into a dynamic table (tableXyz, tableAbc, etc.) based on a few conditions in the queue message. Since we have different poison queues, we want to pick up messages from multiple poison queues (the name of the poison queue will be provided in the myqueue message). This is done to avoid spinning up a new Azure Function every time we have a new poison queue.
Following is the approach I have in mind:
--> I might have to get 15 queue messages using a new queue client and the ReceiveMessages(15) method of the Azure.Storage.Queues package
--> And do a batch insert using the TableBatchOperation class (cannot use the output binding)
Is there any better approach than this?
Unfortunately, storage queues don't have a great solution for this. If you want it to be dynamic, then implementing your own queue clients and table writes is probably your best option; see the sketch below. The one thing I would suggest changing is using a timer trigger instead of a queue trigger. If you are putting a message on your trigger queue every time you add something to a poison queue, it would work as is; if not, a timer trigger ensures that poisoned messages are handled in a timely fashion.
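A rough sketch of that approach, assuming a timer trigger, the Azure.Storage.Queues QueueClient for draining the poison queues, and Microsoft.Azure.Cosmos.Table's TableBatchOperation for the batch insert. The queue names, the ResolveTableName/ToEntity helpers and the schedule are placeholders, not code from the question:

using System;
using System.Linq;
using System.Threading.Tasks;
using Azure.Storage.Queues;
using Azure.Storage.Queues.Models;
using Microsoft.Azure.Cosmos.Table;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class DrainPoisonQueues
{
    [FunctionName("DrainPoisonQueues")]
    public static async Task Run(
        [TimerTrigger("0 */5 * * * *")] TimerInfo timer, // every 5 minutes
        ILogger log)
    {
        string connection = Environment.GetEnvironmentVariable("connection");

        // The poison queues to drain; in practice this list would come from configuration.
        foreach (string poisonQueueName in new[] { "myqueue-poison", "otherqueue-poison" })
        {
            var poisonQueue = new QueueClient(connection, poisonQueueName);
            if (!(await poisonQueue.ExistsAsync()).Value)
                continue;

            // The service returns at most 32 messages per call; 15 matches the question.
            QueueMessage[] messages = (await poisonQueue.ReceiveMessagesAsync(maxMessages: 15)).Value;
            if (messages.Length == 0)
                continue;

            // Route each message to its target table, then insert per table in one batch.
            foreach (var group in messages.GroupBy(m => ResolveTableName(m.MessageText)))
            {
                var table = CloudStorageAccount.Parse(connection)
                    .CreateCloudTableClient()
                    .GetTableReference(group.Key);

                // Entities in a single TableBatchOperation must share a PartitionKey
                // (max 100 entities per batch).
                var batch = new TableBatchOperation();
                foreach (var msg in group)
                    batch.Insert(ToEntity(msg.MessageText));

                await table.ExecuteBatchAsync(batch);
                log.LogInformation("Inserted {count} rows into {table}", group.Count(), group.Key);
            }

            // Only remove messages from the poison queue once the inserts succeeded.
            foreach (var msg in messages)
                await poisonQueue.DeleteMessageAsync(msg.MessageId, msg.PopReceipt);
        }
    }

    // Placeholder helpers: decide the target table and build the entity from the message body.
    private static string ResolveTableName(string messageText) => "TableXyzObject";

    private static ITableEntity ToEntity(string messageText) =>
        new TableXyzObject
        {
            PartitionKey = DateTime.UtcNow.ToString("MMddyyyy"),
            RowKey = Guid.NewGuid().ToString(),
            RandomString = messageText
        };
}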
Original Answer (incorrectly relating to Service Bus queues)
Bryan is correct that creating a new queue client inside your function isn't the best way to go about this. Fortunately, the Service Bus extension does allow batching. Unfortunately, the docs haven't quite caught up yet.
Just make your trigger receive an array:
[QueueTrigger("myqueue", Connection = "connection")] string[] myQueueItems
You can set your max batch size in the host.json:
"extensions": {
"serviceBus": {
"batchOptions": {
"maxMessageCount": 15
}
}
}
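For reference, a batched Service Bus trigger (which is what this answer actually applies to) would look roughly like the sketch below; it assumes the Microsoft.Azure.WebJobs.Extensions.ServiceBus extension, and the function and queue names are illustrative:

[FunctionName("BatchedServiceBusFunction")]
public static void Run(
    [ServiceBusTrigger("myqueue", Connection = "connection")] string[] myQueueItems,
    ILogger log)
{
    // Each invocation receives up to maxMessageCount messages from host.json.
    foreach (var item in myQueueItems)
    {
        log.LogInformation("Processing message: {item}", item);
    }
}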

Azure Storage Queue performance

We are migrating a transaction-processing service, which used to process messages from MSMQ and store transactions in a SQL Server database, to the Azure Storage Queue (storing the IDs of the messages in the queue and placing the actual messages in Azure Blob Storage).
We should be able to process at least 200,000 messages per hour, but at the moment we barely reach 50,000 messages per hour.
Our application requests batches of 250 messages from the queue (which now takes about 2 seconds to get the IDs from the Azure queue and about 5 seconds to get the actual data from Azure Blob Storage), and we store this data into the database in a single call using a stored procedure that accepts a DataTable.
Our service also runs in Azure on a virtual machine, and we use the NuGet libraries Azure.Storage.Queues and Azure.Storage.Blobs suggested by Microsoft to access the Azure Storage queue and blob storage.
Does anyone have suggestions on how to improve the speed of reading messages from the Azure queue and then retrieving the data from Azure Blob Storage?
var managedIdentity = new ManagedIdentityCredential();

UriBuilder fullUri = new UriBuilder()
{
    Scheme = "https",
    Host = string.Format("{0}.queue.core.windows.net", appSettings.StorageAccount),
    Path = string.Format("{0}", appSettings.QueueName),
};

queue = new QueueClient(fullUri.Uri, managedIdentity);
queue.CreateIfNotExists();
...
var result = await queue.ReceiveMessagesAsync(1);
...
UriBuilder fullUri = new UriBuilder()
{
    Scheme = "https",
    Host = string.Format("{0}.blob.core.windows.net", storageAccount),
    Path = string.Format("{0}", containerName),
};

_blobContainerClient = new BlobContainerClient(fullUri.Uri, managedIdentity);
_blobContainerClient.CreateIfNotExists();
...
public async Task<BlobMessage> GetBlobByNameAsync(string blobName)
{
    Ensure.That(blobName).IsNotNullOrEmpty();

    var blobClient = _blobContainerClient.GetBlobClient(blobName);
    if (!blobClient.Exists())
    {
        _log.Error($"Blob {blobName} not found.");
        throw new InfrastructureException($"Blob {blobName} not found.");
    }

    BlobDownloadInfo download = await blobClient.DownloadAsync();
    return new BlobMessage
    {
        BlobName = blobClient.Name,
        BaseStream = download.Content,
        Content = await GetBlobContentAsync(download)
    };
}
Thanks,
Vincent.
Based on the code you posted, I can suggest two improvements:
Receive 32 messages at a time instead of 1: Currently you're getting just one message at a time (var result = await queue.ReceiveMessagesAsync(1);). You can receive a maximum of 32 messages from the top of the queue in a single call, so change the code to var result = await queue.ReceiveMessagesAsync(32);. This saves you 31 round trips to the storage service per batch and should lead to a noticeable performance improvement.
Don't create the blob container every time: Currently you're trying to create the blob container every time you process a message (_blobContainerClient.CreateIfNotExists();), which is unnecessary. Fetching 32 messages already cuts this call down considerably, but better still, move it to your application startup so that it runs only once during the application's lifetime.
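Put together, here is a sketch of both suggestions using the same Azure.Storage.Queues / Azure.Storage.Blobs packages as the question. The MessageReader class name is illustrative, it assumes the queue message text carries the blob name (as the question describes), and it reuses the GetBlobByNameAsync method posted above:

using System.Collections.Generic;
using System.Threading.Tasks;
using Azure.Storage.Blobs;
using Azure.Storage.Queues;
using Azure.Storage.Queues.Models;

public class MessageReader
{
    private readonly QueueClient _queue;
    private readonly BlobContainerClient _blobContainerClient;

    public MessageReader(QueueClient queue, BlobContainerClient blobContainerClient)
    {
        _queue = queue;
        _blobContainerClient = blobContainerClient;

        // One-time call at startup instead of once per message.
        _blobContainerClient.CreateIfNotExists();
    }

    public async Task<List<BlobMessage>> ReadBatchAsync()
    {
        // 32 is the maximum number of messages the queue service returns per request.
        QueueMessage[] messages = (await _queue.ReceiveMessagesAsync(maxMessages: 32)).Value;

        // Reuses the GetBlobByNameAsync method from the question for each message.
        var blobMessages = new List<BlobMessage>();
        foreach (var message in messages)
        {
            blobMessages.Add(await GetBlobByNameAsync(message.MessageText));
        }
        return blobMessages;
    }
}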

Azure Service Bus, MassTransit and DLQs: moving from the DLQ to the original queue

It really annoys me that we're unable to move messages from a dead-letter queue back to the original queue for processing when using Azure Service Bus. So I figured I would try to implement this feature myself. We are using MassTransit to publish events. The queue name in ASB will be an event's full assembly name.
I've created a REST endpoint in my application to move messages from the DLQ to the original queue for reprocessing. This is where I'm stuck at the moment.
To get all messages in a DLQ, the user gives me the queue name, and I format it to point to the dead-letter sub-queue, like this:
myproject.events.usercreatedevent -> myproject.events.usercreatedevent/$DeadLetterQueue
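As a side note, the Microsoft.Azure.ServiceBus package already has a helper that builds this path, so the string formatting does not have to be done by hand. A small sketch:

using Microsoft.Azure.ServiceBus;

// EntityNameHelper appends "/$DeadLetterQueue" to the entity path.
string queueName = "myproject.events.usercreatedevent";
string dlqPath = EntityNameHelper.FormatDeadLetterPath(queueName);
// dlqPath == "myproject.events.usercreatedevent/$DeadLetterQueue"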
I get all the messages from this queue using classes from the NuGet package Microsoft.Azure.ServiceBus:
public async Task RequeueMessagesAsync(string queueName)
{
    var msg = new MessageReceiver(BuildConnectionString(), queueName);
    var messages = await msg.PeekAsync(50);

    foreach (var message in messages)
    {
        var content = Encoding.UTF8.GetString(message.Body);
        var jsonObject = JsonConvert.DeserializeObject<JObject>(content);

        var destinationAddress = jsonObject["destinationAddress"].ToString();
        var messageContent = jsonObject["message"].ToString();
        var messageType = destinationAddress.Split("/").Last();

        await _bus.SendAsync(jsonObject, messageType);
    }
}
When calling _bus.SendAsync(object, address), the message ends up in a _skipped queue. I think the reason for this is that the message type is set to JObject and not the actual message type. I cannot use reflection to recreate the event either, as we have a lot of microservices and the source code of the event is not necessarily available. The code behind _bus.SendAsync(object, address) looks like this:
public async Task SendAsync(object message, string queueName, CancellationToken cancellationToken = default)
{
    ISendEndpoint sender = await GetSenderAsync(queueName);
    sender.ConnectSendObserver(new ErrorQueueConfiguration(_addressProvider.GetAddress("error")));
    await sender.Send(message, cancellationToken);
}
Can I trick MassTransit into forwarding this "unknown" type to my consumer by changing the message headers somehow? Has anyone successfully moved messages from a DLQ back to its original queue?

Azure queue trigger duplicate value

I have an Azure Queue that I write to after each execution of an Azure timer trigger. The timer trigger adds a message to the queue, but sometimes duplicate values are added. The queue trigger's job is to send a notification to the user at a particular time, but sometimes the user gets 3 notifications.
The timer trigger executes at 15-minute intervals. When it executes at 11:45 AM, the user gets 3 notifications in total: at 11:45 AM, 11:50 AM and 11:55 AM. We actually only need to send one notification, at 11:45 AM. I checked the timer trigger log, and it executes only at 15-minute intervals.
public async Task AddToQueue(string queueData, string name, string connectionString)
{
    //"DefaultEndpointsProtocol=https;AccountName=myAccount;AccountKey=c3RyaW5nIGxlbmd0aCB2YWxpZA=="
    var storageAccount = CloudStorageAccount.Parse(connectionString);
    var queueClient = storageAccount.CreateCloudQueueClient();
    var queue = queueClient.GetQueueReference(name);

    // These two calls were fire-and-forget in the original (not awaited),
    // so failures were silently dropped; await them instead.
    await queue.CreateIfNotExistsAsync();

    var message = new CloudQueueMessage(queueData);
    await queue.AddMessageAsync(message);
}
Are duplicates being added to the queue, or is there an issue in my code?
Note: in total I have 3 queue trigger functions and one timer trigger function.

ApproximateMessageCount always null after calling FetchAttributesAsync in a Universal windows App

I am making a small App that should list the number of items in my Azure queues.
When I use FetchAttributesAsync and ApproximateMessageCount in a Console App, I get the expected result in ApproximateMessageCount after a call to FetchAttributesAsync (or FetchAttributes).
When I use the same in a Universal Windows app, ApproximateMessageCount remains stuck at null after a call to FetchAttributesAsync (FetchAttributes is not available there).
Console code:
CloudStorageAccount _account;
if (CloudStorageAccount.TryParse(_connectionstring, out _account))
{
    var queueClient = _account.CreateCloudQueueClient();
    Console.WriteLine(" {0}", _account.QueueEndpoint);
    Console.WriteLine(" ----------------------------------------------");

    var queues = (await queueClient.ListQueuesSegmentedAsync(null)).Results;
    foreach (CloudQueue q in queues)
    {
        await q.FetchAttributesAsync();
        Console.WriteLine($" {q.Name,-40} {q.ApproximateMessageCount,5}");
    }
}
Universal App code:
IEnumerable<CloudQueue> queues;
CloudStorageAccount _account;
CloudQueueClient queueClient;

CloudStorageAccount.TryParse(connectionstring, out _account);
queueClient = _account.CreateCloudQueueClient();
queues = (await queueClient.ListQueuesSegmentedAsync(null)).Results;

foreach (CloudQueue q in queues)
{
    await q.FetchAttributesAsync();
    var count = q.ApproximateMessageCount;
    // count is always null here!!!
}
I have tried all kinds of alternatives, like Wait()'s and such on the awaitables. Whatever I try, ApproximateMessageCount stays null with determination :-(.
Am I missing something?
I think you have discovered a bug in the storage client library. I looked up the code on GitHub: essentially, instead of reading the value of the Approximate Message Count header, the code reads the value of the Lease Status header.
In the QueueHttpResponseParsers.cs class:
public static string GetApproximateMessageCount(HttpResponseMessage response)
{
    return response.Headers.GetHeaderSingleValueOrDefault(Constants.HeaderConstants.LeaseStatus);
}
This method should have been:
public static string GetApproximateMessageCount(HttpResponseMessage response)
{
    return response.Headers.GetHeaderSingleValueOrDefault(Constants.HeaderConstants.ApproximateMessagesCount);
}
I have submitted a bug for this: https://github.com/Azure/azure-storage-net/issues/155.
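Until a fixed package is available, one possible workaround is to bypass the client library and read the x-ms-approximate-messages-count header directly from the Get Queue Metadata REST operation. The sketch below is only an illustration under assumptions: it presumes you can supply a SAS token (without the leading '?') that permits reading queue metadata, and the account and queue names are placeholders.

using System;
using System.Linq;
using System.Net.Http;
using System.Threading.Tasks;

public static async Task<int?> GetApproximateMessageCountAsync(
    string accountName, string queueName, string sasToken)
{
    // Get Queue Metadata: GET https://<account>.queue.core.windows.net/<queue>?comp=metadata
    var uri = new Uri(
        $"https://{accountName}.queue.core.windows.net/{queueName}?comp=metadata&{sasToken}");

    using (var http = new HttpClient())
    using (var response = await http.GetAsync(uri))
    {
        response.EnsureSuccessStatusCode();

        // The service reports the count in this response header.
        if (response.Headers.TryGetValues("x-ms-approximate-messages-count", out var values))
        {
            return int.Parse(values.First());
        }
        return null;
    }
}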
