I have an Azure Function taking in messages from an Azure Service Bus queue and sending documents to Cosmos DB. I'm using Azure Functions 1.x:
public static class Function1
{
    [FunctionName("Function1")]
    public static void Run(
        [ServiceBusTrigger("ServiceBusQueue", AccessRights.Manage, Connection = "ServiceBusQueueConnection")] BrokeredMessage current,
        [DocumentDB(
            databaseName: "DBname",
            collectionName: "Colname",
            ConnectionStringSetting = "CosmosDBConnection")] out dynamic document,
        TraceWriter log)
    {
        document = current.GetBody<MyObject>();
        log.Info($"C# ServiceBus queue triggered function processed the message and sent to cosmos");
    }
}
This inserts into Cosmos successfully, but when updating I get errors:
Microsoft.Azure.Documents.DocumentClientException: Entity with the specified id already exists in the system.
The key I'm trying to update on is the partition key of that collection.
I saw this question: Azure function C#: Create or replace document in cosmos db on HTTP request
But it seems like my usage is similar to the one in Matias Quaranta's answer. He also mentioned that using an out parameter causes an upsert on Cosmos.
How can I create this "upsert" function while still using Azure Functions 1.x?
The binding does indeed do an Upsert operation.
I created this sample Function that takes an HTTP payload (JSON) and stores it in Cosmos DB as-is:
[FunctionName("Function1")]
public static HttpResponseMessage Run([HttpTrigger(AuthorizationLevel.Anonymous, "get", "post", Route = null)]HttpRequestMessage req,
[DocumentDB("MyDb", "MyCollection", ConnectionStringSetting = "MyCosmosConnectionString")] out dynamic document,
TraceWriter log)
{
log.Info("C# HTTP trigger function processed a request.");
dynamic data = req.Content.ReadAsAsync<object>().GetAwaiter().GetResult();
document = data;
return req.CreateResponse(HttpStatusCode.OK);
}
If I send a JSON payload to the HTTP endpoint, the output binding works as expected and the document shows up in the Data Explorer. If I then send a second payload with the same id but an added property, the Data Explorer shows the document was updated, with the same Function code.
Can you add the full exception/error trace? Does your Service Bus message include an "id"? Is your collection partitioned?
If your collection is partitioned and you are changing the value of the partition key property, the binding won't update the existing document; it will create a new one, because the Upsert operation won't find an existing document matching that id/partition key combination. It won't throw an exception, though.
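For the update to land on the existing document, the outgoing document needs the same "id" (and an unchanged partition key value) as the document already stored. Below is a minimal sketch under those assumptions; MyObject's Id and Key properties are hypothetical stand-ins for however your message carries those values:
// A minimal sketch (Functions 1.x). Assumptions: MyObject exposes Id and Key
// properties, Id maps to the Cosmos DB "id" field, and Key holds the partition
// key value. If "id" is missing, Cosmos generates a new one on every message,
// so Upsert inserts instead of replacing.
[FunctionName("UpsertToCosmos")]
public static void Run(
    [ServiceBusTrigger("ServiceBusQueue", AccessRights.Manage, Connection = "ServiceBusQueueConnection")] BrokeredMessage current,
    [DocumentDB("DBname", "Colname", ConnectionStringSetting = "CosmosDBConnection")] out dynamic document,
    TraceWriter log)
{
    var body = current.GetBody<MyObject>();
    document = new
    {
        id = body.Id,   // stable id => Upsert replaces the existing document
        key = body.Key, // partition key value must stay the same across updates
        data = body
    };
    log.Info("Upserted document with a stable id.");
}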
Related
I want to schedule a Splunk report to post to an Azure webhook and persist it into Cosmos DB (after some processing). This tutorial gave me some insight on how to process and persist data into Cosmos DB via Azure Functions (in Java). To solve the next part of the puzzle, I'm reaching out for advice on how to go about the following:
How do I set up and host a webhook on Azure?
Should I set up an HttpTrigger inside the EventHubOutput function and deploy it into the function app? Or should I use the webhook from Azure Event Grid? (I'm not clear on how to do this.) I'm NOT looking to stream any heavy volumes of data and want to keep the consumption cost low, so which route should I take here? Any pointers to tutorials would help.
How do I handle webhook data processing with @EventHubOutput (referring to the Java example in the tutorial)? What setup and configuration do I need here? Any working examples would help.
I ended up using just the @HttpTrigger and binding the output using @CosmosDBOutput to persist the data. Something like this; I would like to know if there are any better approaches:
public class Function {
    @FunctionName("PostData")
    public HttpResponseMessage run(
            @HttpTrigger(
                name = "req",
                methods = {HttpMethod.GET, HttpMethod.POST},
                authLevel = AuthorizationLevel.ANONYMOUS)
            HttpRequestMessage<Optional<String>> request,
            @CosmosDBOutput(
                name = "databaseOutput",
                databaseName = "SplunkDataSource",
                collectionName = "loginData",
                connectionStringSetting = "CosmosDBConnectionString")
            OutputBinding<String> document,
            final ExecutionContext context) {

        context.getLogger().info("Java HTTP trigger processed a request.");

        // Parse the payload; orElse(null) avoids the NoSuchElementException
        // that Optional.get() would throw when the body is empty.
        String data = request.getBody().orElse(null);
        if (data == null) {
            return request.createResponseBuilder(HttpStatus.BAD_REQUEST)
                .body("Please pass a payload in the request body").build();
        } else {
            // Write the data to the Cosmos DB document.
            document.setValue(data);
            context.getLogger().info("Persisting payload to db: " + data);
            return request.createResponseBuilder(HttpStatus.OK).body(data).build();
        }
    }
}
So I have a basic setup for listening to events coming into an Event Hub, using an Event Hub Trigger function, defined as follows:
[FunctionName("myfunction")]
public async Run([EventHubTrigger(myeventhub, Connection = "EventHubConnectionAppSetting", ConsumerGroup = %myconsumergroup%)], EventData eventHubMessage, ILogger logger)
Where the values for the connection and consumergroup parameters are in my local settings.
Running my function and sending events via Postman (properly authenticated), the function is not triggered.
The only way the function is ever triggered is if I remove the ConsumerGroup from the parameters, which causes it to point to the $Default consumer group for the event hub:
[FunctionName("myfunction")]
public async Run([EventHubTrigger(myeventhub, Connection = "EventHubConnectionAppSetting"], EventData eventHubMessage, ILogger logger)
My goal is to keep my custom consumer group and have the function trigger for events coming into that consumer group.
I will mention that I'm testing this out locally, and using local storage:
AzureWebJobsStorage="UseDevelopmentStorage=true"
But obviously the event hub in question is an actual created resource on Azure, with the relevant consumer group existing under it as well.
You can directly use the consumer group name in the function, like below:
[FunctionName("myfunction")]
public async Run([EventHubTrigger(myeventhub, Connection = "EventHubConnectionAppSetting", ConsumerGroup = "myconsumergroup_name")], EventData eventHubMessage, ILogger logger)
Or, if you want to use the %myconsumergroup% syntax, you can follow these steps:
In local.settings.json (remember to right-click the file -> select Properties -> set "Copy to Output Directory" to "Copy if newer"), define the key-values like below:
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet",
    "EventHubConnectionAppSetting": "xxxx",
    "EventHubConsumerGroup": "your_consumergroup_name"
  }
}
Then define the function like below:
[FunctionName("Function1")]
public static async Task Run([EventHubTrigger("eventhub_name", Connection = "EventHubConnectionAppSetting",ConsumerGroup = "%EventHubConsumerGroup%")] EventData[] events, ILogger log)
Note: I use the latest packages for the function.
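For completeness, a minimal body for that trigger might look like the sketch below (that message bodies are UTF-8 encoded is an assumption; adjust to however your producer encodes events):
[FunctionName("Function1")]
public static async Task Run(
    [EventHubTrigger("eventhub_name", Connection = "EventHubConnectionAppSetting", ConsumerGroup = "%EventHubConsumerGroup%")] EventData[] events,
    ILogger log)
{
    foreach (var eventData in events)
    {
        // Microsoft.Azure.EventHubs exposes the body as an ArraySegment<byte>;
        // this assumes the producer sent UTF-8 text.
        string messageBody = Encoding.UTF8.GetString(eventData.Body.Array, eventData.Body.Offset, eventData.Body.Count);
        log.LogInformation($"Event Hub trigger function processed a message: {messageBody}");
    }
}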
I can't seem to get my C# Azure Function to return data to my Logic App correctly. Within local Function testing I get the data correctly, but when I call it from the Logic App the content length returns as 0 when returned as a string. If I pass it as JSON with number=foo, I can get the "number" key entry, but the value from foo is still blank.
A local run within the Function gives the results, but the data returned to the Logic App shows a content length of 0. The output binding is the default:
return new HttpResponseMessage(HttpStatusCode.OK)
{
    Content = new StringContent(foo, Encoding.UTF8, "application/json")
};
As I have tested, it works fine on my side.
Within the local Function testing I get the data correctly.
You mean you used the Azure portal to run the function and you could get the data?
I am not sure whether you apply some logical judgment to foo. If the foo value is null, you would get the result in the second picture you provided.
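If there is a chance foo ends up null or empty, a guard makes the failure visible instead of handing the Logic App an empty body. A minimal sketch (how foo is computed is an assumption):
// Assumption: foo was computed earlier in the function.
if (string.IsNullOrEmpty(foo))
{
    // Surface the problem instead of returning an empty 200 response.
    return req.CreateResponse(HttpStatusCode.BadRequest, "foo was not set");
}
return new HttpResponseMessage(HttpStatusCode.OK)
{
    Content = new StringContent(foo, Encoding.UTF8, "application/json")
};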
Here are my working steps, which you can refer to:
1. Test with the Azure Function in the portal:
public static HttpResponseMessage Run(HttpRequestMessage req, TraceWriter log)
{
    log.Info("C# HTTP trigger function processed a request.");
    string foo = "123456789";
    return new HttpResponseMessage(HttpStatusCode.OK)
    {
        Content = new StringContent(foo, Encoding.UTF8, "application/json")
    };
}
2. Logic App design: use a queue trigger.
3. Add a message to the queue.
I would like to be able to add custom properties to a queue/topic message as I place it in a queue from an Azure Function. The custom properties are for filtering the messages into different topics. I must be missing something, because this working example doesn't seem to have anywhere to put custom properties:
public static async Task<HttpResponseMessage> Run(HttpRequestMessage req,
    TraceWriter log,
    ICollector<Contact> outputSbMsg)
{
    var contactList = await req.Content.ReadAsAsync<ContactList>();
    foreach (var contact in contactList.Contacts)
    {
        if (contact.ContactId == -1)
        {
            continue;
        }
        contact.State = contactList.State;
        outputSbMsg.Add(contact);
    }
}
I'm coding the function through the Azure portal. The contact list comes into the function in the body of an HTTP request. The function parses out each contact, modifies some properties, and submits each contact to the queue topic. Additionally, I pull other data from the request headers and the contact list, and I would like to use that data in the queue topic to filter the requests into different subscriptions.
Edit:
As per @Sean Feldman's suggestion below, the data is added to a BrokeredMessage before adding the BrokeredMessage to the output collection. The key part is to serialize the contact object before adding it to the BrokeredMessage.
public static async Task<HttpResponseMessage> Run(HttpRequestMessage req,
    TraceWriter log,
    ICollector<BrokeredMessage> outputSbMsg)
{
    var contactList = await req.Content.ReadAsAsync<ContactList>();
    foreach (var contact in contactList.Contacts)
    {
        if (contact.ContactId == -1)
        {
            continue;
        }
        string jsonData = JsonConvert.SerializeObject(contact);
        BrokeredMessage message = new BrokeredMessage(jsonData);
        message.Properties.Add("State", contactList.State);
        outputSbMsg.Add(message);
    }
}
Thank you
To be able to set custom/user properties, the output collector should be of a native Azure Service Bus message type, BrokeredMessage.
In your case, you'll have to change ICollector<Contact> to ICollector<BrokeredMessage>.
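Once the messages carry a State property, the filtering itself happens on the subscription side. A minimal sketch using the classic WindowsAzure.ServiceBus SDK (the same package that provides BrokeredMessage); the topic and subscription names here are hypothetical:
// Creates a subscription that only receives messages whose State property is "CA".
var namespaceManager = NamespaceManager.CreateFromConnectionString(connectionString);
if (!namespaceManager.SubscriptionExists("contacts", "ca-contacts"))
{
    namespaceManager.CreateSubscription("contacts", "ca-contacts", new SqlFilter("State = 'CA'"));
}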
How can we do CRUD with Table Storage in Azure Functions?
I have insert working, but would like to know how to return entities and do updates and deletes too.
public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log, IAsyncCollector<User> outputTable)
{
    log.Info($"C# HTTP trigger function processed a request. RequestUri={req.RequestUri}");
    var user = new User();
    user.PartitionKey = "Users";
    user.RowKey = DateTime.Now.Ticks.ToString();
    user.UserId = "aaaa";
    user.Country = "uk";
    await outputTable.AddAsync(user);
    // ...
You can bind your function to an instance of the CloudTable class, and then you have its full API at hand.
I think you should be able to just replace IAsyncCollector<User> with CloudTable in your function definition and adjust the usage (provided you have a valid output binding).
See "Output usage" under Azure Functions Storage table bindings.