Why do I see a FunctionIndexingException when creating a QueueTrigger WebJob Function? - azure

I created a function like this:
public static Task HandleStorageQueueMessageAsync(
    [QueueTrigger("%QueueName%", Connection = "%ConnectionStringName%")] string body,
    TextWriter logger)
{
    if (logger == null)
    {
        throw new ArgumentNullException(nameof(logger));
    }

    logger.WriteLine(body);
    return Task.CompletedTask;
}
The queue name and the connection string name come from my configuration, which has an INameResolver to provide the values. The connection string itself I load from my secret store into the app config at app start. If the connection string is a normal storage connection string granting all permissions for the whole account, the method works as expected.
However, in my scenario I am getting a SAS from a partner team that only offers read access to a single queue. I created a storage connection string from it that looks similar to
QueueEndpoint=https://accountname.queue.core.windows.net;SharedAccessSignature=st=2017-09-24T07%3A29%3A00Z&se=2019-09-25T07%3A29%3A00Z&sp=r&sv=2018-03-28&sig=token
(I successfully connected using this connection string in Microsoft Azure Storage Explorer.)
The queue name used in the QueueTrigger attribute is also taken from the SAS.
However, I am now getting the following exceptions:
$exception {"Error indexing method 'Functions.HandleStorageQueueMessageAsync'"} Microsoft.Azure.WebJobs.Host.Indexers.FunctionIndexingException
InnerException {"No blob endpoint configured."} System.Exception {System.InvalidOperationException}
If you look at the connection string, you can see the exception is right: I did not configure the blob endpoint. However, I don't have access to it, and I don't want to use it either; I'm using the storage account only for this QueueTrigger.
I am using Microsoft.Azure.WebJobs v2.2.0. Other dependencies prevent me from upgrading to v3.x.
What is the recommended way to consume messages from a storage queue when only a SAS URI with read access to a single queue is available? And if I am already on the right path, what do I need to do to get rid of the exception?
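For context, my INameResolver is roughly like this (a simplified sketch; ConfigurationManager.AppSettings is an assumption here, my real resolver reads from the same config store that receives the secret at app start):

```csharp
using System.Configuration;
using Microsoft.Azure.WebJobs;

// Sketch of the INameResolver wired into JobHostConfiguration.
public class AppSettingNameResolver : INameResolver
{
    public string Resolve(string name)
    {
        // "%QueueName%" arrives here as "QueueName" (the % markers are stripped).
        return ConfigurationManager.AppSettings[name];
    }
}

// Wire-up:
// var config = new JobHostConfiguration();
// config.NameResolver = new AppSettingNameResolver();
```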

As you have seen, the v2 WebJobs SDK requires access to the blob endpoint as well. I am afraid this is by design; supporting connection strings without full access, such as SAS, is a tracked improvement that has not been implemented yet.
Here are the permissions required by the v2 SDK. It needs to get Blob Service properties (Blob, Service, Read) and to get queue metadata and process messages (Queue, Container & Object, Read & Process).
A queue trigger gets messages and deletes them after processing, so the SAS requires the Process permission. That means the SAS string you got is not authorized correctly even setting aside the SDK's blob requirement.
You could ask the partner team to generate a SAS connection string in the Azure portal with the minimum permissions above. If they can't provide blob access, the v3 SDK seems worth a try, but there are some problems:
1. Other dependencies prevent updating, as you mentioned.
2. The v3 SDK is based on .NET Core, which means code changes can't be avoided.
3. The v3 SDK documentation and samples are still under construction right now.
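If the partner team generates the SAS in code rather than in the portal, an account SAS with the minimum permissions above would look roughly like this (a sketch using the Microsoft.WindowsAzure.Storage SDK; the account name and key are placeholders for the partner team's credentials):

```csharp
using System;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Auth;

// Sketch: account SAS with the minimum permissions the v2 SDK needs.
var account = new CloudStorageAccount(
    new StorageCredentials("accountname", "account-key"), useHttps: true);

var policy = new SharedAccessAccountPolicy
{
    // Blob + Queue services, since v2 also reads Blob Service properties.
    Services = SharedAccessAccountServices.Blob | SharedAccessAccountServices.Queue,
    ResourceTypes = SharedAccessAccountResourceTypes.Service
                  | SharedAccessAccountResourceTypes.Container
                  | SharedAccessAccountResourceTypes.Object,
    Permissions = SharedAccessAccountPermissions.Read
                | SharedAccessAccountPermissions.ProcessMessages,
    SharedAccessExpiryTime = DateTimeOffset.UtcNow.AddYears(1)
};

string sas = account.GetSharedAccessSignature(policy);
// The connection string handed to the consumer then needs both endpoints:
// BlobEndpoint=...;QueueEndpoint=...;SharedAccessSignature=<sas without the leading '?'>
```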

I was having a load of issues getting a SAS token to work for a QueueTrigger.
Not having blob included was my problem. Thanks Jerry!
Slightly newer screenshot (I needed Add as well):

Related

Azure Data Explorer oneclick Ingest from blob container (UI)

I'm trying to configure and use the Azure Data Explorer OneClick Ingest from blob container (continuous ingest).
Whatever I try, the URL is never accepted; I always end up with this error:
Invalid URL. Either the URL leads to a blob instead of a container, or the permissions are incorrect. If you just grant permission, please wait couple of minutes and try again.
The URL I'm using follows this pattern:
https://mystorageaccount.blob.core.windows.net/mycontainer?sp=rl&st=2022-04-26T22:01:42Z&se=2032-04-27T06:01:42Z&spr=https&sv=2020-08-04&sr=c&sig=Z4Mlh7s5%2Fm1890kdfzlkYLSIHHDdGJmTSyYXVYsHdn01o%3D
I'm probably missing something, either in the URL syntax or in the SAS generation.
Has anyone successfully used it? Any idea what could be wrong?
Thanks
I finally found out what the issue was.
Probably due to the security in place on my storage account, I had to create a managed private endpoint in the Azure Data Explorer Networking panel, pointing to my storage resource (and then approve that endpoint in the storage account's Networking settings):
https://learn.microsoft.com/en-us/azure/data-explorer/security-network-managed-private-endpoint-create

Azure Storage Queue message to Azure Blob Storage

I have access to an Azure Storage queue via a connection string that was provided to me (not a queue I created). Messages are sent once every minute. I want to take all the messages and place them in Azure Blob Storage.
My issue is that I haven't been successful in getting the messages from the attached storage queue. What is the "easiest" way of storing this data?
I've tried accessing the external queue using Logic Apps and then placing the messages in my own queue before moving them to Blob Storage, but without luck.
If you want to access an external storage account in the Logic App, you will need the name of the storage account and its key.
Choose the trigger for Azure Queues and then click "Manually enter connection information".
In the next step you will be able to choose the queue you want to listen to.
I recommend you use an Azure Function instead, something like in this article:
https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-storage-blob-output?tabs=csharp
First you can try only reading the messages, and then add the output binding that creates your blob:
[FunctionName("GetMessagesFromQueue")]
public void GetMessagesFromQueue(
    [QueueTrigger("%ExternalStorage.QueueName%", Connection = "ExternalStorage.StorageConnection")] ModelMessage modelmessage,
    [Blob("%YourStorage.ContainerName%/{id}", FileAccess.Write, Connection = "YourStorage.StorageConnection")] Stream myBlob)
{
    // put the modelmessage into the stream
}
You can bind to a lot of types, not only Stream. The link above has all the information.
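For completeness, ModelMessage is just a POCO that the queue message JSON is deserialized into; the {id} token in the blob path binds to a matching property on it. The class below is a hypothetical example (the property names are assumptions, not part of the original answer):

```csharp
// Hypothetical POCO: the queue message JSON is deserialized into it
// automatically by the queue trigger. The {id} token in the Blob path
// resolves to the Id property of the incoming message.
public class ModelMessage
{
    public string Id { get; set; }
    public string Content { get; set; }
}
```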
I hope I've helped

BlobTrigger in Functions Runtime preview 2 with local storage account

I have the Functions Runtime preview 2 installed.
I was able to create and run functions with a timer trigger, but the blob trigger doesn't seem to 'trigger'.
I am using the local Azure Storage Emulator (a local development blob container) as my trigger source.
Is this a known issue?
I notice that the mouseover in the 'Integrate' section shows the endpoint protocol as https. I specified an http endpoint when creating it (since that's what the local storage emulator supports), but the runtime seems to be picking up https on its own.
I was able to create and run functions with a timer trigger, but the blob trigger doesn't seem to 'trigger'.
It seems the issue is your storage account connection string: you only specify the blob endpoint. If you don't want to use 'UseDevelopmentStorage=true', you need to write the complete connection string instead:
DefaultEndpointsProtocol=http;AccountName=devstoreaccount1;AccountKey=Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw==;BlobEndpoint=http://127.0.0.1:10000/devstoreaccount1;TableEndpoint=http://127.0.0.1:10002/devstoreaccount1;QueueEndpoint=http://127.0.0.1:10001/devstoreaccount1;
You could also check the function logs on your side. Maybe there is an error.
I notice that the mouseover in the 'Integrate' section shows the endpoint protocol as https
I have tested on my side; the system goes by the endpoint URL's 'http'. Even if DefaultEndpointsProtocol=https, we can still use the storage connection string successfully. If we set DefaultEndpointsProtocol=http and the endpoint URLs to 'https', the connection string does not work.
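You can also sanity-check the connection string offline before handing it to the runtime. A sketch using the Microsoft.WindowsAzure.Storage SDK (no network call is made by parsing):

```csharp
using System;
using Microsoft.WindowsAzure.Storage;

// Sketch: parse the connection string and inspect the endpoints
// the runtime will actually use.
string cs = "UseDevelopmentStorage=true"; // or the full emulator string above
if (CloudStorageAccount.TryParse(cs, out CloudStorageAccount account))
{
    Console.WriteLine(account.BlobEndpoint);   // http://127.0.0.1:10000/devstoreaccount1
    Console.WriteLine(account.QueueEndpoint);  // http://127.0.0.1:10001/devstoreaccount1
}
else
{
    Console.WriteLine("Connection string could not be parsed.");
}
```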

Do I need an Azure Storage Account to run a WebJob?

So I'm fairly new to working with Azure and there are some things I can't quite wrap my head around. One of them is the Azure Storage Account.
My WebJob keeps stopping with the following error: "Unhandled Exception: System.InvalidOperationException: The account credentials for '[account_name]' are incorrect." Understanding the error, however, is not the problem, at least that's what I think. The problem lies in understanding why I need an Azure Storage Account to overcome it.
Please read on as I try to take you through the steps taken thus far. Hopefully the real question will become clearer to you.
In my efforts to deploy a WebJob on Azure we have created the following resources so far:
App Service Plan
App Service
SQL server
SQL database
I'm using the following code snippet to prevent my web job from exiting:
JobHostConfiguration config = new JobHostConfiguration();
config.DashboardConnectionString = null;

new JobHost(config).RunAndBlock();
To my understanding from other sources, the Dashboard connection string is optional but the AzureWebJobsStorage connection string is required.
I tried setting the required connection string in the portal, using the configuration format found here:
DefaultEndpointsProtocol=[http|https];AccountName=myAccountName;AccountKey=myAccountKey
Looking further, I found this answer that clearly states where I would get the needed values: a (my missing) Azure Storage Account.
So now for the actual question: why do I need an Azure Storage Account when I seemingly have all the resources I need in place for the WebJob to run? What does it do? Is it a billing thing? I thought we had that defined in the App Service Plan. I've tried reading up on Azure Storage Accounts over here, but I need a bit more help understanding how it relates to everything.
From the docs:
An Azure storage account provides resources for storing queue and blob data in the cloud.
It's also used by the WebJobs SDK to store logging data for the dashboard.
Refer to the getting started guide and documentation for further information
The answer to your question is "no": it is not mandatory to use Azure Storage when you are trying to set up and run an Azure WebJob.
If you are using JobHost or JobHostConfiguration, however, there is indeed a dependency on a storage account.
A sample code snippet is given below.
class Program
{
    static void Main()
    {
        Functions.ExecuteTask();
    }
}

public class Functions
{
    [NoAutomaticTrigger]
    public static void ExecuteTask()
    {
        // Execute your task here
    }
}
The answer is no, you don't. You can have a WebJob run without it being tied to an Azure Storage Account. As Murray mentioned, your WebJob dashboard does use a storage account to log data, but that's completely independent.
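If you do end up creating a storage account, wiring it into the v2 JobHost explicitly looks roughly like this (a sketch; the account name and key are placeholders, and normally AzureWebJobsStorage is read from app settings instead):

```csharp
using Microsoft.Azure.WebJobs;

// Sketch: v2 JobHost with an explicit storage connection string.
var config = new JobHostConfiguration
{
    DashboardConnectionString = null, // the dashboard stays optional
    StorageConnectionString =
        "DefaultEndpointsProtocol=https;AccountName=<name>;AccountKey=<key>"
};

new JobHost(config).RunAndBlock();
```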

"The type initializer for 'Lucene.Net.Store.FSDirectory' threw an exception" error

I'm using the Azure Library for Lucene.net (ALL) to create and search indexes on my test Azure account.
I have set up blob storage and am able to access it using the Azure Portal and the Azure Storage Explorer.
I'm having issues with writing indexes to blob storage, as well as with reading Lucene.net indexes that I created locally and manually moved up.
I seem to be going backwards, because initially I was able to see that the ALL created the write.lock, but first things first...
Now I'm getting a "The type initializer for 'Lucene.Net.Store.FSDirectory' threw an exception" error.
I'm using the blob storage connection string as follows: DefaultEndpointsProtocol=http; AccountName=; AccountKey=;
The code it is now failing on is:
var account = CloudStorageAccount.Parse(BlobStorageConnectionString);
AzureDirectory azureDirectory = new AzureDirectory(account, indexByName);
Note: yes, I should probably be reading this from the config file; I'm just trying to get it to work correctly, and the account var seems valid.
Thoughts on what I should look into?
