A follow-up question to my previous post about Azure Functions. I need to update a document in DocumentDB using the imperative binder (Binder). I don't really understand the documentation and I can't find any examples (more or less the only kind of example I find is the TextWriter one). The documentation says I can bind to "out T", but I find no examples of this.
Say that the document looks like this before running the function:
{
child: {
value: 0
}
}
And the function looks like this:
var document = await binder.BindAsync<dynamic>(new DocumentDBAttribute("myDB", "myCollection")
{
ConnectionStringSetting = "my_DOCUMENTDB",
Id = deviceId
});
log.Info($"C# Event Hub trigger function processed a message: document: { document }");
document.value = 100;
document.child.value = 200;
log.Info($"Updated document: { document }");
According to the second log line, the document isn't properly updated: child, which existed when the document was read from the store, is not updated, and value is added at the top level. Either way, nothing is persisted. I've tried adding an output binding in function.json, but the compiler complains about it and the documentation states that you shouldn't have one.
What am I missing?
Mathew's sample (using DocumentClient) works, but I wanted to clarify the other way you can do it with Binder and an output binding.
You're bumping into two issues:
The Document implementation of dynamic appears to return a new object instance every time you request a child object. This isn't related to Functions, but it explains why document.child.value = 200 doesn't work: you are updating an instance of child that is not actually attached to the document. I'll try to double-check this with the DocumentDB folks, but it is confusing. One way around this is to request a JObject instead of a dynamic; my code below does that.
As @mathewc pointed out, Binder does not auto-update the document. We'll track that in the issue he filed. Instead, you can use an output binding with IAsyncCollector<dynamic> to update the document. Behind the scenes we call InsertOrReplaceDocumentAsync, which updates the document.
Here's a full sample that worked for me:
Code:
#r "Microsoft.Azure.WebJobs.Extensions.DocumentDB"
#r "Newtonsoft.Json"
using System;
using Newtonsoft.Json.Linq;
public static async Task Run(string input, Binder binder, IAsyncCollector<dynamic> collector, TraceWriter log)
{
string deviceId = "0a3aa1ff-fc76-4bc9-9fe5-32871d5f451b";
dynamic document = await binder.BindAsync<JObject>(new DocumentDBAttribute("ItemDb", "ItemCollection")
{
ConnectionStringSetting = "brettsamfunc_DOCUMENTDB",
Id = deviceId
});
log.Info($"C# Event Hub trigger function processed a message: document: { document }");
document.value = 100;
document.child.value = 200;
await collector.AddAsync(document);
log.Info($"Updated document: { document }");
}
And the function.json binding:
{
"type": "documentDB",
"name": "collector",
"connection": "brettsamfunc_DOCUMENTDB",
"direction": "out",
"databaseName": "ItemDb",
"collectionName": "ItemCollection",
"createIfNotExists": false
}
Yes, I do believe there is an issue here, and I've logged a bug in our repo to track it.
To work around this until we fix it, you can bind to DocumentClient and use it directly to perform the update, e.g.:
public static async Task Run(
string input, Binder binder, DocumentClient client, TraceWriter log)
{
var docId = "c31d48aa-d74b-46a3-8ba6-0d4c6f288559";
var document = await binder.BindAsync<JObject>(
new DocumentDBAttribute("ItemDb", "ItemCollection")
{
ConnectionStringSetting = "<mydb>",
Id = docId
});
log.Info("Item before: " + document.ToString());
document["text"] = "Modified!";
var docUri = UriFactory.CreateDocumentUri("ItemDb", "ItemCollection", docId);
await client.ReplaceDocumentAsync(docUri, document);
}
However, once you're using DocumentClient directly like this, you might find that you can just use it for all of your operations; your call. For example:
public static async Task Run(
string input, DocumentClient client, TraceWriter log)
{
var docId = "c31d48aa-d74b-46a3-8ba6-0d4c6f288559";
var docUri = UriFactory.CreateDocumentUri("ItemDb", "ItemCollection", docId);
var response = await client.ReadDocumentAsync(docUri);
dynamic document = response.Resource;
log.Info("Value: " + dynamic.text);
document.text = "Modified!";
await client.ReplaceDocumentAsync(docUri, document);
}
Related
I am currently making an API that will be hosted via Azure Functions, running .NET Core 3.1. The way I have the project routed right now is to define the accepted methods as a parameter for the HttpTrigger, and then use an if statement to determine how the endpoint was called. I am attempting to use the OpenAPI package to create API definitions, but when I assign Methods to the function, the Swagger document only picks up the first Method listed (PUT). I am unsure of the intended structure/usage of endpoints that have multiple possible request methods.
See code below. (OpenAPI tags are placeholder descriptions)
namespace Dyn.Sync.Func.PractifiSync
{
public class Prospect
{
[FunctionName("Prospect")]
[OpenApiOperation(operationId: "Run", tags: new[] { "name" })]
[OpenApiSecurity("function_key", SecuritySchemeType.ApiKey, Name = "code", In = OpenApiSecurityLocationType.Query)]
[OpenApiParameter(name: "name", In = ParameterLocation.Query, Required = true, Type = typeof(string), Description = "The **Name** parameter")]
[OpenApiResponseWithBody(statusCode: HttpStatusCode.OK, contentType: "text/plain", bodyType: typeof(string), Description = "The OK response")]
public async Task<IActionResult> Create([HttpTrigger(AuthorizationLevel.Anonymous, "post", "put", Route = null)] HttpRequest req, ILogger log)
{
string primarySecretsContainerName = "Main";
DynUser user = await DynAuthManager.CreateDynUserAsync(req);
DynProspect prospect = JsonSerializer.Deserialize<DynProspect>(req.Body);
PFIConnection pfiConnector = PFIConnectionsCache.GetConnection(user, DynSecretsCache.GetSecretsContainer(primarySecretsContainerName));
try
{
if (!pfiConnector.IsConnected) { await pfiConnector.Connect(); }
if (req.Method == "POST") { return await pfiConnector.CreateProspect(prospect); }
if (req.Method == "PUT") { return await pfiConnector.UpdateProspect(prospect); }
else { return new ObjectResult("Invalid method.") { StatusCode = 400 }; }
}
catch (Exception ex)
{
DynError dynError = new DynError(ex);
log.LogError(ex, "Exception " + dynError.RequestID.ToString() + " occured.");
return (IActionResult)new ExceptionResult(ex, true);
}
}
}
}
My question is this: when the Swagger document is created, it only lists whichever method I defined first (in other words, it ignores the "put" method). What is the intended way to structure an API when creating it in Azure Functions? I tried creating a separate method in the same class for each HTTP method it should accept, but then I couldn't even hit the endpoint when making requests. Does Microsoft want us to create a new function class for each endpoint? So then instead of:
PUT http://myapi.azure.com/api/prospect
POST http://myapi.azure.com/api/prospect
it would be:
PUT http://myapi.azure.com/api/updateprospect
POST http://myapi.azure.com/api/prospect
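For illustration, the kind of split I mean would look roughly like this. This is a simplified, hypothetical sketch: the function names, operation ids, and the explicit shared Route = "prospect" are made up, and I haven't verified whether this actually fixes the Swagger output.
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Azure.WebJobs.Extensions.OpenApi.Core.Attributes;
using Microsoft.Extensions.Logging;
namespace Dyn.Sync.Func.PractifiSync
{
    public class ProspectSplit
    {
        // One function per HTTP method; the explicit Route keeps both on /api/prospect.
        [FunctionName("ProspectCreate")]
        [OpenApiOperation(operationId: "CreateProspect", tags: new[] { "prospect" })]
        public IActionResult Create(
            [HttpTrigger(AuthorizationLevel.Anonymous, "post", Route = "prospect")] HttpRequest req,
            ILogger log)
        {
            // POST-only logic would go here.
            return new OkResult();
        }

        [FunctionName("ProspectUpdate")]
        [OpenApiOperation(operationId: "UpdateProspect", tags: new[] { "prospect" })]
        public IActionResult Update(
            [HttpTrigger(AuthorizationLevel.Anonymous, "put", Route = "prospect")] HttpRequest req,
            ILogger log)
        {
            // PUT-only logic would go here.
            return new OkResult();
        }
    }
}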
I should note that this will eventually live under an Azure API Management instance, which makes me even more worried about implementing it in a "one function per method" fashion: when loading the Azure Functions the way I have done it, APIM correctly assigns the methods, and I'd prefer not to have to configure that manually.
I have been searching for documentation on this specific issue with no luck. Anyone have any ideas how Microsoft intended this to be used?
Thanks.
I have an Azure Function that returns a Service Bus message. However, I want to conditionally return a Service Bus message, instead of being forced to return the message every time.
here is an example
[FunctionName("ServiceBusOutput")]
[return: ServiceBus("myqueue", Connection = "ServiceBusConnection")]
public static string ServiceBusOutput([HttpTrigger] dynamic input, ILogger log)
{
log.LogInformation($"C# function processed: {input.Text}");
// check condition here, abort return completely
// Otherwise, return
return input.Text;
}
Said another way, I want to return a message on a Service Bus only when certain conditions apply within the function code block. Is this possible?
One idea that does not work is to throw an exception. However, this just results in the message being placed into the dead-letter (DL) queue. I want to completely abort the operation of returning the message on the Service Bus, and avoid the DL queue.
Another idea that does not work is to simply execute
return;
But this results in a compile-time error, which is sort of expected:
"An object of a type convertible to 'MyReturnType1' is required"
I can think of a hack which I don't like, which is to return null and handle the null later in the chain. But that feels sort of dirty to me.
You could just bind the Service Bus output as the MessageSender type, then use its SendAsync() method to send the message.
Below is my test code; if the request name equals "george", it sends the name to the message queue.
public static class Function1
{
[FunctionName("Function1")]
public static async System.Threading.Tasks.Task RunAsync(
[HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = null)] HttpRequest req,
[ServiceBus("myqueue", Connection = "ServiceBusConnection")] MessageSender messagesQueue,
ILogger log)
{
log.LogInformation("C# HTTP trigger function processed a request.");
string name = req.Query["name"];
string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
dynamic data = JsonConvert.DeserializeObject(requestBody);
name = name ?? data?.name;
if (name.Equals("george")) {
byte[] bytes = Encoding.ASCII.GetBytes(name);
Message m1 = new Message();
m1.Body = bytes;
await messagesQueue.SendAsync(m1);
}
}
}
I suppose this is what you want; hope this helps. If you still have problems, please feel free to let me know.
Instead of using the output binding available in Azure Functions, you can send a message to the queue from a custom queue client created inside the function.
Posting the message based on a condition is not possible with the output binding.
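For what it's worth, here is a rough sketch of that approach (my own illustration, not the answerer's code), assuming the Microsoft.Azure.ServiceBus package; the "ServiceBusConnection" app setting and "myqueue" queue name are placeholders:
using System;
using System.IO;
using System.Text;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.Azure.ServiceBus;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;
public static class ConditionalSend
{
    [FunctionName("ConditionalSend")]
    public static async Task Run(
        [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequest req,
        ILogger log)
    {
        string text = await new StreamReader(req.Body).ReadToEndAsync();

        // Only create the client and send when the condition holds; otherwise nothing is sent.
        if (!string.IsNullOrEmpty(text))
        {
            var client = new QueueClient(Environment.GetEnvironmentVariable("ServiceBusConnection"), "myqueue");
            await client.SendAsync(new Message(Encoding.UTF8.GetBytes(text)));
            await client.CloseAsync();
        }
    }
}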
I have an Azure Function taking in messages from an Azure Service Bus queue and sending documents to Cosmos DB. I'm using Azure Functions 1.x:
public static class Function1
{
[FunctionName("Function1")]
public static void Run([ServiceBusTrigger("ServiceBusQueue", AccessRights.Manage, Connection = "ServiceBusQueueConnection")]BrokeredMessage current, [DocumentDB(
databaseName: "DBname",
collectionName: "Colname",
ConnectionStringSetting = "CosmosDBConnection")]out dynamic document, TraceWriter log)
{
document = current.GetBody<MyObject>();
log.Info($"C# ServiceBus queue triggered function processed the message and sent to cosmos");
}
}
This inserts into Cosmos successfully, but when updating I get errors:
Microsoft.Azure.Documents.DocumentClientException: Entity with the specified id already exists in the system.
The key I'm trying to update on is the partition key of that collection.
I saw this question: Azure function C#: Create or replace document in cosmos db on HTTP request
But it seems like my usage is similar to the one in Matias Quarantas' answer. He also mentioned that using an out parameter causes an upsert on Cosmos.
How can I create this "upsert" function while still using Azure Functions 1.x?
The binding does indeed do an Upsert operation.
I created this sample Function that takes an Http payload (JSON) and stores it in Cosmos DB as-is:
[FunctionName("Function1")]
public static HttpResponseMessage Run([HttpTrigger(AuthorizationLevel.Anonymous, "get", "post", Route = null)]HttpRequestMessage req,
[DocumentDB("MyDb", "MyCollection", ConnectionStringSetting = "MyCosmosConnectionString")] out dynamic document,
TraceWriter log)
{
log.Info("C# HTTP trigger function processed a request.");
dynamic data = req.Content.ReadAsAsync<object>().GetAwaiter().GetResult();
document = data;
return req.CreateResponse(HttpStatusCode.OK);
}
If I send a JSON payload to the HTTP endpoint, the output binding works as expected, and when I check the Data Explorer I see the new document. If I then send a second payload with the same id, this time adding a property, the Data Explorer shows the document was updated, with the same Function code.
Can you add the full exception/error trace? Does your Service Bus message include an "id"? Is your collection partitioned?
If your collection is partitioned and you are changing the value of the partition key property, then the binding won't update the existing document; it will create a new one, because the Upsert operation won't find an existing document (based on the id/partition key). But it won't throw an exception.
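To illustrate that last point (my own sketch, not part of the answer): for the out-parameter upsert to replace an existing document, the outgoing object has to carry the same id and the same partition key value as the stored document. The shape below is a guess at what MyObject would need; the /state partition key and the specific properties are assumptions:
using Newtonsoft.Json;
public class MyObject
{
    // Must match the stored document's id, otherwise the upsert inserts a new document.
    [JsonProperty("id")]
    public string Id { get; set; }

    // Assumed partition key path (/state): changing this value sends the upsert to a
    // different partition, which also results in a new document rather than an update.
    [JsonProperty("state")]
    public string State { get; set; }

    public int Value { get; set; }
}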
I would like to be able to add custom properties to a queue/topic message as I place it in a queue from an Azure Function. The custom properties are for filtering the messages into different topics. I must be missing something, because this working example doesn't seem to have anywhere to put custom properties.
public static async Task<HttpResponseMessage> Run(HttpRequestMessage req,
TraceWriter log,
ICollector<Contact> outputSbMsg)
{
var contactList = await req.Content.ReadAsAsync<ContactList>();
foreach(var contact in contactList.Contacts)
{
if (contact.ContactId == -1)
{
continue;
}
contact.State = contactList.State;
outputSbMsg.Add(contact);
}
}
I'm coding the function through the Azure Portal. The contact list comes into the function in the body of an HTTP request. The function parses out each contact, modifies some properties, and submits each contact to the queue topic. Additionally, I pull other data from the request headers and the contact list, and I would like to use that data in the queue topic to filter the requests into different subscriptions.
Edit:
As per Sean Feldman's suggestion below, the data is added to a BrokeredMessage before adding the BrokeredMessage to the output collection. The key part is to serialize the contact object before adding it to the BrokeredMessage.
public static async Task<HttpResponseMessage> Run(HttpRequestMessage req,
TraceWriter log,
ICollector<BrokeredMessage> outputSbMsg)
{
var contactList = await req.Content.ReadAsAsync<ContactList>();
foreach(var contact in contactList.Contacts)
{
if (contact.ContactId == -1)
{
continue;
}
string jsonData = JsonConvert.SerializeObject(contact);
BrokeredMessage message = new BrokeredMessage(jsonData);
message.Properties.Add("State", contactList.State);
outputSbMsg.Add(message);
}
}
Thank you
To be able to set custom/user properties, the output collector should be of a native Azure Service Bus message type, BrokeredMessage.
In your case, you'll have to change ICollector<Contact> to ICollector<BrokeredMessage>.
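As a side note (my own sketch, not part of this answer): the custom property set in the edit above is what a subscription filter can then match on. With the classic Microsoft.ServiceBus SDK that BrokeredMessage comes from, a filtered subscription could be created roughly like this; the topic name, subscription name, and connection string are placeholders:
using Microsoft.ServiceBus;
using Microsoft.ServiceBus.Messaging;
public static class SubscriptionSetup
{
    // One-time setup: a subscription that only receives messages whose "State" property is "MN".
    public static void EnsureStateSubscription(string connectionString)
    {
        var namespaceManager = NamespaceManager.CreateFromConnectionString(connectionString);
        if (!namespaceManager.SubscriptionExists("contacts-topic", "mn-contacts"))
        {
            namespaceManager.CreateSubscription(
                new SubscriptionDescription("contacts-topic", "mn-contacts"),
                new SqlFilter("State = 'MN'"));
        }
    }
}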
I'm seeing a lot of exceptions in the collectionSelfLink when making DocumentDB calls -- see image below.
I'm able to connect to DocumentDB and read data, but these exceptions concern me -- especially in something that's pretty straightforward, like a collectionSelfLink.
Any idea what may be causing them and how to fix them?
Here's the function that's using the selfLink
public async Task<IEnumerable<T>> ReadQuery<T>(string dbName, string collectionId, SqlQuerySpec query)
{
// Prepare collection self link
// IMPORTANT: This is where I'm seeing those exceptions when I inspect the collectionLink. Otherwise, I'm NOT getting any errors.
var collectionLink = UriFactory.CreateDocumentCollectionUri(dbName, collectionId);
var result = _client.CreateDocumentQuery<T>(collectionLink, query, null);
return await result.QueryAsync();
}
And here's the QueryAsync() extension method
public async static Task<IEnumerable<T>> QueryAsync<T>(this IQueryable<T> query)
{
var docQuery = query.AsDocumentQuery();
var batches = new List<IEnumerable<T>>();
do
{
var batch = await docQuery.ExecuteNextAsync<T>();
batches.Add(batch);
}
while (docQuery.HasMoreResults);
var docs = batches.SelectMany(b => b);
return docs;
}
So SelfLink is an internal property that is set by DocumentDB. It cannot be set by the user and will only be populated on resources that have been returned from a call to the server.
The UriFactory code that you are using constructs a link that can be used to execute operations, but it is not a SelfLink.
If you are looking at the SelfLink property on a newly initialized DocumentCollection() object, the SelfLink will be null, as the collection has not been persisted on the server yet. This would explain all those errors in the debug watch.
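For what it's worth, here is a minimal sketch of the difference (my addition, assuming the same DocumentClient SDK and that the lines run inside an async method with the same _client instance; "myDb" and "myCollection" are placeholders): the UriFactory link works fine for queries, whereas a populated SelfLink only exists on a resource that came back from the server.
// URI-based link: usable for CreateDocumentQuery and other operations, but it is not a SelfLink.
var collectionUri = UriFactory.CreateDocumentCollectionUri("myDb", "myCollection");
// A populated SelfLink only appears on a collection read back from the server.
var response = await _client.ReadDocumentCollectionAsync(collectionUri);
string selfLink = response.Resource.SelfLink;   // something like "dbs/<rid>/colls/<rid>/"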