Table attribute binding not working when using IQueryable<T> - Azure

I have an Azure function app that I would like to read from an Azure Storage table. I am new to Azure Functions and Table storage, and I cannot get the samples to work properly.
The functions are created in Visual Studio 2017 and published to Azure.
The function app has two functions: one for posting and one for getting.
The post function works as expected.
The get function fails with the error below:
Internal error 500: "'TableName' can't be invoked from Azure WebJobs SDK. Is it missing Azure WebJobs SDK attributes?"
Get function:
[FunctionName("FunctionName")]
public static HttpResponseMessage Run(
    [HttpTrigger(AuthorizationLevel.Anonymous, "get")] HttpRequestMessage req,
    [Table("TableName", Connection = "Default")] IQueryable<Person> inTable,
    TraceWriter log)
{
    return req.CreateResponse(HttpStatusCode.OK);
}
The only difference between the post and get methods is their signature.
The post applies the Table attribute to an ICollector parameter:
[Table("TableName", Connection = "Default")] ICollector<Person> outTable
And the get applies the Table attribute to an IQueryable parameter:
[Table("TableName", Connection = "Default")] IQueryable<Person> inTable
Any input is appreciated.
This is almost the default sample provided for working with table storage.

When using IQueryable<T>, T must implement ITableEntity (or derive from TableEntity) from the Storage SDK. ICollector<T> does not have this constraint.
Be sure to use the same version of the Storage SDK that your project already gets from its Microsoft.NET.Sdk.Functions reference, and do not pull in a new, different Storage SDK that loads side by side.
There are some more details here: https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-storage-table. Those examples are *.csx (where the bindings are in a separate function.json file) rather than *.cs (where the bindings are inline attributes), but the C# types and rules are the same.
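As a sketch of what that constraint means in practice (the FirstName/LastName properties are illustrative, not from the original question), the Person type bound through IQueryable<Person> could look like:

```csharp
using Microsoft.WindowsAzure.Storage.Table;

// Deriving from TableEntity supplies PartitionKey, RowKey, Timestamp and ETag,
// which satisfies the ITableEntity constraint that the IQueryable<T> binding requires.
// ICollector<T> would accept a plain POCO, which is why only the get function failed.
public class Person : TableEntity
{
    public string FirstName { get; set; }
    public string LastName { get; set; }
}
```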

Related

Azure Function built in VS vs built in portal performance for adding in Storage Queue

I noticed that adding messages to a Storage queue was quite slow (an average of 1.5 s) from an HTTP Azure Function compared to other functions created in the Azure Portal.
To test this, I created a very basic Azure function that takes the HTTP request, adds a unique ID to it, and adds it as a message to a Storage queue. I created two similar Function Apps. In one I created this simple function in the Azure Portal, and to the second one I deployed it from a new VS project. The VS project function averaged 900 ms running time over 500 requests. The function created in the Azure Portal averaged 200 ms running time over 500 requests. Why is there so much performance difference?
Function below:
[FunctionName("Function1")]
public static async Task<IActionResult> Run(
    [HttpTrigger(AuthorizationLevel.Anonymous, "get", "post", Route = null)] HttpRequest req,
    [Queue("data"), StorageAccount("AzureWebJobsStorage")] ICollector<string> queue,
    ILogger log)
{
    string uniqId = Guid.NewGuid().ToString("N");
    string queryString = $"{req.QueryString.ToString().Trim('?')}&ui={uniqId}";
    queue.Add(queryString);
    return new OkObjectResult("OK");
}
There is a code-type difference between Portal Azure Functions and a VS IDE Azure Functions project: the Portal function app uses C# Script (.csx), while the VS project produces a C# class library (.cs), and that accounts for the performance difference.
As stated in the MS docs for Azure Functions, it may make a small difference for portal functions since they have to be compiled before they run, whereas functions deployed from the IDE are precompiled. Comparatively, precompiled functions should have the upper hand.

Azure Functions - StorageBlob Trigger method signature using CloudBlockBlob not Stream

Creating Azure Functions targeting .Net Standard 2.0 using Visual Studio 2017.
Using the Add New Azure Function wizard, a blob trigger method is successfully created with the following method signature.
public static void Run([BlobTrigger("attachments-collection/{name}")] Stream myBlob, string name, ILogger log)
This method compiles and works fine.
However, we want to be able to access the metadata attached to the CloudBlockBlob being saved to storage, which as far as I know is not possible using a Stream. Other answers on this site, such as (Azure Function Blob Trigger CloudBlockBlob binding), suggest you can bind to a CloudBlockBlob instead of a Stream and access the metadata that way. But the suggested solution does not compile in the latest version of Azure Functions.
Microsoft's online documentation (https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-storage-blob#trigger---usage) also seems to confirm that it is possible to bind the trigger to a CloudBlockBlob rather than a Stream, but gives no example of the syntax.
Could someone please clarify the exact syntax required to enable Azure Function Blob storage trigger to bind to a CloudBlockBlob instead of the standard Stream?
Thanks
Thanks to Jerry Liu's insights, this problem has been solved.
Method:
Use the default storage package for Azure Storage that is installed when you create a new Function App:
Microsoft.Azure.WebJobs.Extensions.Storage (3.0.1)
This installs the dependency:
WindowsAzure.Storage (9.3.1)
Then both of the following method signatures will run correctly
public static async Task Run([BlobTrigger("samples-workitems/{name}")]Stream myBlob, string name, ILogger log)
and
public static async Task Run([BlobTrigger("samples-workitems/{name}")]CloudBlockBlob myBlob, string name, ILogger log)
Actually CloudBlockBlob does work, and we don't need FileAccess.ReadWrite because this is a BlobTrigger rather than a Blob input or output binding.
public static Task Run([BlobTrigger("samples-workitems/{name}")]CloudBlockBlob blob, string name, ILogger log)
Update for Can't bind BlobTrigger to CloudBlockBlob
There's an issue tracking this: the Functions SDK has a problem integrating with WindowsAzure.Storage >= 9.3.2. So just remove any WindowsAzure.Storage package reference; the Functions SDK references v9.3.1 internally by default.
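As a sketch of why the CloudBlockBlob binding matters here (reusing the samples-workitems container from the signatures above; the function name and log format are illustrative), the trigger can read the blob's metadata directly, which a Stream parameter cannot expose:

```csharp
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;
using Microsoft.WindowsAzure.Storage.Blob;

public static class BlobMetadataFunction
{
    [FunctionName("LogBlobMetadata")]
    public static async Task Run(
        [BlobTrigger("samples-workitems/{name}")] CloudBlockBlob blob,
        string name,
        ILogger log)
    {
        // Metadata is not guaranteed to be populated on the client object;
        // fetch the blob's attributes before reading it.
        await blob.FetchAttributesAsync();

        foreach (var pair in blob.Metadata)
        {
            log.LogInformation("Blob {name}: {key} = {value}", name, pair.Key, pair.Value);
        }
    }
}
```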

Why do I see a FunctionIndexingException when creating a QueueTrigger WebJob Function?

I created a function like this
public static Task HandleStorageQueueMessageAsync(
[QueueTrigger("%QueueName%", Connection = "%ConnectionStringName%")] string body,
TextWriter logger)
{
if (logger == null)
{
throw new ArgumentNullException(nameof(logger));
}
logger.WriteLine(body);
return Task.CompletedTask;
}
The queue name and the connection string name come from my configuration that has an INameResolver to get the values. The connection string itself I put from my secret store into the app config at app start. If the connection string is a normal storage connection string granting all permissions for the whole account, the method works like expected.
However, in my scenario I am getting an SAS from a partner team that only offers read access to a single queue. I created a storage connection string from it that looks similar to
QueueEndpoint=https://accountname.queue.core.windows.net;SharedAccessSignature=st=2017-09-24T07%3A29%3A00Z&se=2019-09-25T07%3A29%3A00Z&sp=r&sv=2018-03-28&sig=token
(I tried successfully to connect using this connection string in Microsoft Azure Storage Explorer)
The queue name used in the QueueTrigger attribute is also gathered from the SAS
However, now I am getting the following exceptions
$exception {"Error indexing method 'Functions.HandleStorageQueueMessageAsync'"} Microsoft.Azure.WebJobs.Host.Indexers.FunctionIndexingException
InnerException {"No blob endpoint configured."} System.Exception {System.InvalidOperationException}
If you look at the connection string, you can see the exception is right. I did not configure the blob endpoint. However I also don't have access to it and neither do I want to use it. I'm using the storage account only for this QueueTrigger.
I am using Microsoft.Azure.WebJobs v2.2.0. Other dependencies prevent me from upgrading to a v3.x
What is the recommended way for consuming messages from a storage queue when only a SAS URI with read access to a single queue is available? If I am already on the right path, what do I need to do in order to get rid of the exception?
As you have seen, the v2 WebJobs SDK requires access to the blob endpoint as well. I'm afraid that's by design; supporting connection strings without full access, such as SAS, is an improvement that has been tracked but not implemented yet.
Here are the permissions required by the v2 SDK. It needs to get blob service properties (Blob, Service, Read) and to get queue metadata and process messages (Queue, Container & Object, Read & Process).
A queue trigger gets messages and deletes them after processing, so the SAS requires the Process permission. That means the SAS string you got is not authorized correctly even if the SDK didn't require blob access.
You could ask the partner team to generate a SAS connection string in the Azure portal with the minimum permissions above. If they can't provide blob access, the v3 SDK seems an option to try.
But there are some problems:
1. Other dependencies prevent updating, as you mentioned.
2. The v3 SDK is based on .NET Core, which means code changes can't be avoided.
3. The v3 SDK documentation and samples are still under construction right now.
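As a hedged sketch of what the partner team would generate (assuming the WindowsAzure.Storage client library; the method and expiry are illustrative), an account SAS with roughly the minimum permissions listed above could be produced like this:

```csharp
using System;
using Microsoft.WindowsAzure.Storage;

public static class SasSketch
{
    // Sketch: generate an account SAS covering Blob (service-level read) and
    // Queue (read + process messages), matching the permission set described above.
    // This must be run by someone who holds the full account connection string.
    public static string CreateMinimalSas(string fullConnectionString)
    {
        var account = CloudStorageAccount.Parse(fullConnectionString);

        var policy = new SharedAccessAccountPolicy
        {
            Permissions = SharedAccessAccountPermissions.Read
                        | SharedAccessAccountPermissions.ProcessMessages,
            Services = SharedAccessAccountServices.Blob
                     | SharedAccessAccountServices.Queue,
            ResourceTypes = SharedAccessAccountResourceTypes.Service
                          | SharedAccessAccountResourceTypes.Container
                          | SharedAccessAccountResourceTypes.Object,
            SharedAccessExpiryTime = DateTimeOffset.UtcNow.AddDays(30)
        };

        return account.GetSharedAccessSignature(policy);
    }
}
```

The returned token is what gets embedded as the SharedAccessSignature part of the consumer's connection string.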
I was having a load of issues getting a SAS token to work for a QueueTrigger.
Not having blob included was my problem. Thanks Jerry!
A slightly newer screenshot (which I needed to add as well):

Reference file from Azure function 2.0 with .net core

I am trying to create an Azure function that reads from a .mmdb file GeoLite2 Country DB
I have added the GeoLite2 file next to my function, but I cannot find a way to programmatically reference the file path so that it remains the same on my local machine and when deployed/published.
string GeoLocationDbPath = @"D:<path_to_project>\Functions\GeoLocation-Country.mmdb";
var reader = new DatabaseReader(GeoLocationDbPath);
I came across this article How to add assembly references to an Azure Function App
I was hoping there was a better way to reference a file both locally and deployed.
Any ideas?
Other links I've looked at:
How to add and reference a external file in Azure Function
How to add a reference to an Azure Function C# project?
Retrieving information about the currently running function
Azure functions – Read file and use SendGrid to send an email
You can get the path to folder by injecting ExecutionContext to your function:
public static HttpResponseMessage Run(HttpRequestMessage req, ExecutionContext context)
{
var funcPath = context.FunctionDirectory; // e.g. d:\home\site\wwwroot\HttpTrigger1
var appPath = context.FunctionAppDirectory; // e.g. d:\home\site\wwwroot
// ...
}
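Building on that, the .mmdb lookup from the question could be made location-independent like this (a sketch; it assumes the MaxMind.GeoIP2 package for DatabaseReader and that GeoLocation-Country.mmdb is copied to the function app's output directory):

```csharp
using System.IO;
using System.Net;
using System.Net.Http;
using MaxMind.GeoIP2;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;

public static class GeoFunction
{
    [FunctionName("GeoFunction")]
    public static HttpResponseMessage Run(
        [HttpTrigger(AuthorizationLevel.Function, "get")] HttpRequestMessage req,
        ExecutionContext context)
    {
        // Resolve the database relative to the function app root, so the same
        // code works both locally and after publishing.
        var dbPath = Path.Combine(context.FunctionAppDirectory, "GeoLocation-Country.mmdb");

        using (var reader = new DatabaseReader(dbPath))
        {
            // ... look up the caller's country, build the response payload, etc.
        }

        return req.CreateResponse(HttpStatusCode.OK);
    }
}
```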
I've come to the conclusion that referencing files local to the Azure function is not a good approach. I read the Azure Functions best practices: Azure functions are meant to be stateless, and the folder structure changes when you deploy.
If I were to continue, I would upload the .mmdb file to an Azure blob container and use CloudStorageAccount to access the file.
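That alternative could be sketched as follows (assuming the WindowsAzure.Storage client library and a hypothetical geo-data container holding the file):

```csharp
using System.IO;
using System.Threading.Tasks;
using Microsoft.WindowsAzure.Storage;

public static class MmdbDownloader
{
    // Downloads the database to a local temp file once and returns its path,
    // so the function can open it with DatabaseReader regardless of where it runs.
    public static async Task<string> EnsureLocalCopyAsync(string connectionString)
    {
        var account = CloudStorageAccount.Parse(connectionString);
        var container = account.CreateCloudBlobClient().GetContainerReference("geo-data");
        var blob = container.GetBlockBlobReference("GeoLocation-Country.mmdb");

        var localPath = Path.Combine(Path.GetTempPath(), "GeoLocation-Country.mmdb");
        if (!File.Exists(localPath))
        {
            await blob.DownloadToFileAsync(localPath, FileMode.Create);
        }
        return localPath;
    }
}
```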

Azure Function EventHubTrigger Attribute not finding event hub name

I am having an issue where my event hub name is not found when I publish my function to a function app (it works fine locally if I just run it in VS 2017). I am receiving the following error in the Azure portal when I open the published function.
This is the attribute on my Run method.
public static void Run([EventHubTrigger("%eventHubName%", Connection = "eventHubConnection")]string data, TraceWriter log)
Now if I don't wrap the eventHubName in % signs, when I run it locally it says it can't find the event hub (it uses the eventHubName string literally instead of looking it up in local.settings.json the way it does for the connection string), but it works when published. I want to avoid putting the actual name in the attribute because different environments have unique event hub names.
Azure Functions uses the local.settings.json file when you are developing locally. When your Function App is running in Azure, it reads the values from Application Settings.
Using %zzz% is the correct way to read settings, so this makes me question whether you have a setting called eventHubName in Application Settings when you deploy to Azure.
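For local development, that would mean a local.settings.json along these lines (the setting values are illustrative; the same eventHubName and eventHubConnection keys must then exist in the portal's Application Settings for the published function):

```json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "eventHubName": "my-dev-hub",
    "eventHubConnection": "Endpoint=sb://..."
  }
}
```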
https://learn.microsoft.com/en-us/azure/app-service/web-sites-configure
