Azure CosmosDB Query Explorer vs Data Explorer

I'm running the same query against my CosmosDB instance (using SQL API):
SELECT c.partition, COUNT(1) AS total
FROM c
WHERE c.system = "SF"
GROUP BY c.partition
I'm a bit surprised that I'm getting the expected results from Data Explorer, while under the Query Explorer tab I'm getting a 400 Bad Request with the message below:
{"code":400,"body":"{\"code\":\"BadRequest\",\"message\":\"Message: {\\"Errors\\":[\\"Cross partition query only supports 'VALUE ' for aggregates.\\"]}\r\nActivityId: d8523615-c2ff-47cf-8102-5256237c7024, Microsoft.Azure.Documents.Common/2.7.0\"}","activityId":"d8523615-c2ff-47cf-8102-5256237c7024"}
I know I can use the first one, but the same exception occurs when I try to run my query from Logic Apps.
So the question is pretty simple: is this query syntactically correct or not?

For your requirements, I think we can try using an Azure Function in your Logic App instead of the "Query documents V2" action.
Here is my function code:
using System;
using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.AspNetCore.Http;
using Microsoft.Extensions.Logging;
using System.Collections.Generic;

namespace HuryCosmosFun
{
    public static class Function1
    {
        [FunctionName("Function1")]
        public static IActionResult Run(
            [HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = null)] HttpRequest req,
            [CosmosDB(
                databaseName: "ToDoList",
                collectionName: "Items",
                SqlQuery = "SELECT c.partition, COUNT(1) AS total FROM c WHERE c.system = 'SF' GROUP BY c.partition",
                ConnectionStringSetting = "CosmosDBConnection")]
            IEnumerable<ResultsClass> results,
            ILogger log)
        {
            log.LogInformation("C# HTTP trigger function processed a request.");
            foreach (ResultsClass result in results)
            {
                log.LogInformation(result.partition);
                log.LogInformation(result.total.ToString());
            }
            return new OkResult();
        }
    }
}

namespace HuryCosmosFun
{
    public class ResultsClass
    {
        public string partition { get; set; }
        public int total { get; set; }
    }
}
For more information about the code above, you can refer to this tutorial.
After publishing to Azure, we can add the Azure Function as an action in the Logic App, and it will perform the SQL query for us.
By the way, this tutorial mentions that the Azure Cosmos DB SDK now supports GROUP BY.
So we could also write code in the Azure Function that connects to and queries Cosmos DB through the SDK in this document, and then call the Azure Function from the Logic App as in the first solution.
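For illustration, here is a minimal sketch of querying with GROUP BY through the SDK, assuming a recent Microsoft.Azure.Cosmos (v3) package that supports GROUP BY and the same database/container names and connection setting as above:

using System;
using System.Threading.Tasks;
using Microsoft.Azure.Cosmos;

public static class GroupByQuerySample
{
    public static async Task RunAsync(string connectionString)
    {
        CosmosClient client = new CosmosClient(connectionString);
        Container container = client.GetContainer("ToDoList", "Items");

        QueryDefinition query = new QueryDefinition(
            "SELECT c.partition, COUNT(1) AS total FROM c WHERE c.system = 'SF' GROUP BY c.partition");

        // The SDK handles the cross-partition GROUP BY aggregation for us.
        FeedIterator<dynamic> iterator = container.GetItemQueryIterator<dynamic>(query);
        while (iterator.HasMoreResults)
        {
            foreach (var row in await iterator.ReadNextAsync())
            {
                Console.WriteLine($"{row.partition}: {row.total}");
            }
        }
    }
}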
Hope this helps with your requirements.

Related

Azure Function binding working locally but not in portal

I have this azure function v3:
using System;
using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.AspNetCore.Http;
using Microsoft.Extensions.Logging;
using Newtonsoft.Json;
using Microsoft.Azure.Cosmos.Table;

namespace FunctionApp4
{
    public static class Function1
    {
        [FunctionName("Function1")]
        public static async Task<IActionResult> Run(
            [HttpTrigger(AuthorizationLevel.Anonymous, "get", "post", Route = null)] HttpRequest req,
            [Table("Items")] CloudTable table,
            ILogger log)
        {
            log.LogInformation("C# HTTP trigger function processed a request.");
            string name = req.Query["name"];
            string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
            dynamic data = JsonConvert.DeserializeObject(requestBody);
            name = name ?? data?.name;
            string responseMessage = string.IsNullOrEmpty(name)
                ? "This HTTP triggered function executed successfully. Pass a name in the query string or in the request body for a personalized response."
                : $"Hello, {name}. This HTTP triggered function executed successfully.";
            return new OkObjectResult(responseMessage);
        }
    }
}
Runs perfectly locally but when published to the portal I get:
Error indexing method 'Function1' Cannot bind parameter 'table' to type CloudTable. Make sure the parameter Type is supported by the binding. If you're using binding extensions (e.g. Azure Storage, ServiceBus, Timers, etc.) make sure you've called the registration method for the extension(s) in your startup code (e.g. builder.AddAzureStorage(), builder.AddServiceBus(), builder.AddTimers(), etc.).
Does anyone know what could be causing this?
These are the dependencies for the function:

Azure CosmosDB: function app- how to update a document

I am new to Azure. I was wondering if I could get some help with updating an existing record via an HTTP trigger.
Many solutions I find online either create a new record or replace the complete document. I just want to update 2 properties in the document.
I tried [this][1] and the following code, but it didn't work:
[FunctionName("Function1")]
public static HttpResponseMessage Run([HttpTrigger(AuthorizationLevel.Anonymous, "get", "post", Route = null)]HttpRequestMessage req,
[DocumentDB("MyDb", "MyCollection", ConnectionStringSetting = "MyCosmosConnectionString")] out dynamic document,
TraceWriter log)
{
log.Info("C# HTTP trigger function processed a request.");
dynamic data = req.Content.ReadAsAsync<object>().GetAwaiter().GetResult();
document = data;
return req.CreateResponse(HttpStatusCode.OK);
}
I want to pass the primary key and 2 other values so that the document can be updated based on the primary key. Can anyone help?
I just want to update 2 properties in the document.
As of 2020/10/20, this feature is still not supported. You can check its progress here:
https://feedback.azure.com/forums/263030-azure-cosmos-db/suggestions/6693091-be-able-to-do-partial-updates-on-document#{toggle_previous_statuses}
Work to support the feature started a year ago and is still not finished, so for now the only thing we can do is wait.
On your side, you need to read the document, change the relevant properties, and then replace it.
A simple example:
using System;
using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.AspNetCore.Http;
using Microsoft.Extensions.Logging;
using Newtonsoft.Json;
using Azure.Cosmos;
using System.Collections.Generic;

namespace FunctionApp21
{
    public static class Function1
    {
        private static CosmosClient cosmosclient = new CosmosClient("AccountEndpoint=https://testbowman.documents.azure.com:443/;AccountKey=xxxxxx;");

        [FunctionName("Function1")]
        public static async Task<IActionResult> Run(
            [HttpTrigger(AuthorizationLevel.Anonymous, "get", "post", Route = null)] HttpRequest req,
            ILogger log)
        {
            CosmosContainer container = cosmosclient.GetContainer("testbowman", "testbowman");
            // Read the existing document, change the properties you care about, then replace it.
            ItemResponse<ToDoActivity> wakefieldFamilyResponse = await container.ReadItemAsync<ToDoActivity>("testbowman", new PartitionKey("testbowman"));
            ToDoActivity itemBody = wakefieldFamilyResponse;
            itemBody.status = "This is been changed.";
            wakefieldFamilyResponse = await container.ReplaceItemAsync<ToDoActivity>(itemBody, itemBody.id, new PartitionKey(itemBody.testbowman));
            return new OkObjectResult("");
        }
    }

    public class ToDoActivity
    {
        public string id { get; set; }
        public string status { get; set; }
        public string testbowman { get; set; }
    }
}
The official doc:
https://learn.microsoft.com/en-us/azure/cosmos-db/create-sql-api-dotnet-v4#replace-an-item

Adding Custom Dimension to Request Telemetry - Azure functions

I am creating a new Function app using v2.x and am integrating Application Insights for request logging, which happens automatically now that Azure Functions is integrated with App Insights (as mentioned in the documentation link). What I need to do is log a few custom fields in the custom dimensions of the Application Insights request telemetry. Is this possible without using custom request logging (the TrackRequest method)?
About adding custom properties, you could refer to this tutorial: Add properties: ITelemetryInitializer. Below is my test HTTP trigger function.
public static class Function1
{
    private static string key = "Your InstrumentationKey";
    private static TelemetryClient telemetry = new TelemetryClient() { InstrumentationKey = key };

    [FunctionName("Function1")]
    public static async Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Anonymous, "get", "post", Route = null)] HttpRequest req,
        ILogger log)
    {
        log.LogInformation("C# HTTP trigger function processed a request.");
        if (!telemetry.Context.Properties.ContainsKey("Function_appName"))
        {
            telemetry.Context.Properties.Add("Function_appName", "testfunc");
        }
        else
        {
            telemetry.Context.Properties["Function_appName"] = "testfunc";
        }
        telemetry.TrackEvent("eventtest");
        telemetry.TrackTrace("tracetest");
        string name = req.Query["name"];
        string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
        dynamic data = JsonConvert.DeserializeObject(requestBody);
        name = name ?? data?.name;
        return name != null
            ? (ActionResult)new OkObjectResult($"Hello, {name}")
            : new BadRequestObjectResult("Please pass a name on the query string or in the request body");
    }
}
After running this function, you can check the data in Application Insights Search, or go to Logs (Analytics).
Update:
You should use an ITelemetryInitializer (which can add a custom dimension to a specific telemetry type, e.g. only to requests) in the function app. Please follow the steps below:
1. In Visual Studio, create a function app (in my test, I created a blob-triggered function) and install the following NuGet packages:
Microsoft.ApplicationInsights, version 2.10.0
Microsoft.NET.Sdk.Functions, version 1.0.29
2. Then in Function1.cs, write code like below:
using Microsoft.ApplicationInsights.Channel;
using Microsoft.ApplicationInsights.DataContracts;
using Microsoft.ApplicationInsights.Extensibility;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Hosting;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Logging;
using System.IO;

[assembly: WebJobsStartup(typeof(FunctionApp21.MyStartup))]
namespace FunctionApp21
{
    public static class Function1
    {
        [FunctionName("Function1")]
        public static void Run([BlobTrigger("samples-workitems/{name}", Connection = "AzureWebJobsStorage")] Stream myBlob, string name, ILogger log)
        {
            log.LogInformation($"C# Blob trigger function Processed blob\n Name:{name} \n Size: {myBlob.Length} Bytes");
        }
    }

    internal class MyTelemetryInitializer : ITelemetryInitializer
    {
        public void Initialize(ITelemetry telemetry)
        {
            // Check that telemetry is RequestTelemetry so the property is only added to requests.
            if (telemetry != null && telemetry is RequestTelemetry && !telemetry.Context.GlobalProperties.ContainsKey("my_custom_dimen22"))
            {
                telemetry.Context.GlobalProperties.Add("my_custom_dimen22", "Hello, this is custom dimension for request!!!");
            }
        }
    }

    public class MyStartup : IWebJobsStartup
    {
        public void Configure(IWebJobsBuilder builder)
        {
            builder.Services.AddSingleton<ITelemetryInitializer, MyTelemetryInitializer>();
        }
    }
}
3. Publish it to Azure, then navigate in the Azure portal to the published function app -> Monitor -> add an Application Insights instance.
4. Run the function from Azure, wait a few minutes, then navigate to the Application Insights portal and check the telemetry data; you can see the custom dimension is only added to request telemetry:
The other solutions don't quite answer the question of how to add custom properties to the request telemetry. There is a very simple solution: add the following within your function's code:
Activity.Current?.AddTag("my_prop", "my_value");
You'll need:
using System.Diagnostics;
This can then be dynamic per function invocation/request, rather than a fixed global property.
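For example, a minimal sketch inside an HTTP-triggered function (the function name and tag names are just illustrative):

using System;
using System.Diagnostics;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;

public static class TagDemoFunction
{
    [FunctionName("TagDemoFunction")]
    public static IActionResult Run(
        [HttpTrigger(AuthorizationLevel.Function, "get", Route = null)] HttpRequest req,
        ILogger log)
    {
        // These tags end up in customDimensions of this invocation's request telemetry.
        Activity.Current?.AddTag("my_prop", "my_value");
        Activity.Current?.AddTag("invocation_utc", DateTime.UtcNow.ToString("o"));

        log.LogInformation("Tagged the current request telemetry.");
        return new OkResult();
    }
}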

verify or debug azure hybrid connection

I've got an Azure Function running with a Hybrid Connection to an on-premises server. Everything works nicely. In the future we will have multiple hybrid connections in multiple locations, and it's possible they might have the same server name and port combination.
Is there a way to verify (or debug) the Service Bus namespace or UserMetadata properties of the hybrid connection being used, before the SQL operation is executed?
Here is run.csx from the function app:
#r "System.Data"
using System.Net;
using System.Collections.Generic;
using System.Configuration;
using Newtonsoft.Json;
using System.Data.SqlClient;
using System.Text;
public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log)
{
log.Info("C# HTTP trigger function processed a request.");
string connectionString = "Server=MyServerName;Initial Catalog=BillingDB;";
string queryString = "select top 2 * from Billing";
List<Billing> bills = new List<Billing>();
SqlConnection conn = new SqlConnection(connectionString);
conn.Open();
/***************
Here is where I would like to be able to verify/validate
the namespace or UserMetadata of the hybrid connection so that I'm sure
we're connecting to the desired database. "Server" in the connection string
is not enough in our case.
******************/
SqlCommand DBCmd = new SqlCommand(queryString, conn);
SqlDataReader myDataReader;
myDataReader = DBCmd.ExecuteReader();
while (myDataReader.Read())
{
bills.Add(new Billing
{
Student_ID = Convert.ToInt32(myDataReader[0]),
Transaction_Number = Convert.ToInt32(myDataReader[1]),
Log = myDataReader[2].ToString(),
Amount_Owed = Convert.ToDecimal(myDataReader[3])
}
);
}
myDataReader.Close();
conn.Close();
var json = JsonConvert.SerializeObject(bills);
log.Info("json: " + json);
return req.CreateResponse(HttpStatusCode.OK,json);
}
public class Billing {
public int Student_ID { get; set; }
public int Transaction_Number { get; set; }
public string Log { get; set; }
public decimal Amount_Owed { get; set; }
}
I was eventually able to solve this by making a GET request to the Azure Resource Manager REST API (clicking on Resource Manager in the Application Settings shows the URL to call as well as the expected response body). In addition, an Active Directory application needs to be created to acquire a token to access the resources.
https://management.azure.com/subscriptions/{subscriptionid}/resourceGroups/{resourcegroupname}/providers/Microsoft.Web/sites/{functionname}/hybridConnectionRelays?api-version=2016-08-01
This returns a JSON object listing the properties of the Hybrid Connections which are 'Connected' to the individual application/function app.
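As a rough sketch, assuming you have already obtained an Azure AD access token for https://management.azure.com (e.g. via a service principal) and fill in your own subscription, resource group, and function app names, the callout could look like this:

using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

public static class HybridConnectionCheck
{
    private static readonly HttpClient httpClient = new HttpClient();

    public static async Task<string> GetHybridConnectionRelaysAsync(
        string accessToken, string subscriptionId, string resourceGroup, string functionAppName)
    {
        // ARM endpoint from the answer above, api-version 2016-08-01.
        string url = $"https://management.azure.com/subscriptions/{subscriptionId}" +
                     $"/resourceGroups/{resourceGroup}/providers/Microsoft.Web/sites/{functionAppName}" +
                     "/hybridConnectionRelays?api-version=2016-08-01";

        var request = new HttpRequestMessage(HttpMethod.Get, url);
        request.Headers.Authorization = new AuthenticationHeaderValue("Bearer", accessToken);

        HttpResponseMessage response = await httpClient.SendAsync(request);
        response.EnsureSuccessStatusCode();

        // JSON listing the hybrid connections (including service bus namespace and user metadata)
        // connected to this function app; inspect it before running the SQL query.
        return await response.Content.ReadAsStringAsync();
    }
}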

Azure Functions and DocumentDB triggers

Is it possible to specify that DocumentDB should fire triggers when writing to DocumentDB?
I have an Azure function that pulls JSON messages off a Service Bus Queue and puts them into DocumentDB like so:
using System;
using System.Threading.Tasks;

public static string Run(string myQueueItem, TraceWriter log)
{
    log.Info($"C# ServiceBus queue trigger function processed message: {myQueueItem}");
    return myQueueItem;
}
This inserts new documents into the database as they are added to the Service Bus queue; however, I need DocumentDB to process them as they are added and add attachments. This cannot be done in the present setup, and I would like to tell DocumentDB to fire a trigger.
I have tried something like this:
using System;
using System.Threading.Tasks;

public static string Run(string myQueueItem, TraceWriter log)
{
    log.Info($"C# ServiceBus queue trigger function processed message: {myQueueItem}");
    return "x-ms-documentdb-post-trigger-include: addDocument\n" + myQueueItem;
}
It doesn't work and gives me errors like this:
Exception while executing function:
Functions.ServiceBusQueueTriggerCSharp1. Microsoft.Azure.WebJobs.Host:
Error while handling parameter _return after function returned:.
Newtonsoft.Json: Unexpected character encountered while parsing value:
x. Path '', line 0, position 0.
I like this setup because I can saturate the queue with requests to add records and they simply buffer until the database can deal with them, which handles spikes in demand: data is offloaded from the client machine as fast as the network can carry it, and the queue/database combination catches up when demand drops again.
You could refer to the following code sample to create a document with a trigger enabled in Azure Functions.
using System;
using System.Collections.Generic;
using System.Threading.Tasks;
using Microsoft.Azure.Documents;
using Microsoft.Azure.Documents.Client;

public static void Run(string myQueueItem, TraceWriter log)
{
    string EndpointUri = "https://{documentdb account name}.documents.azure.com:443/";
    string PrimaryKey = "{PrimaryKey}";
    DocumentClient client = new DocumentClient(new Uri(EndpointUri), PrimaryKey);
    client.CreateDocumentAsync(
        UriFactory.CreateDocumentCollectionUri("{databaseid}", "{collectionid}"),
        new MyChunk { MyProperty = "hello" },
        new RequestOptions
        {
            // Server-side trigger(s) to execute before the document is created.
            PreTriggerInclude = new List<string> { "YourTriggerName" },
        }).Wait();
    log.Info($"C# ServiceBus queue trigger function processed message: {myQueueItem}");
}

public class MyChunk
{
    public string MyProperty { get; set; }
}
Note: to use the Microsoft.Azure.DocumentDB NuGet package in a C# function, please upload a project.json file to the function's folder in the function app's file system.
project.json
{
    "frameworks": {
        "net46": {
            "dependencies": {
                "Microsoft.Azure.DocumentDB": "1.13.1"
            }
        }
    }
}
Besides, please make sure you have created the triggers in your DocumentDB collection; for details about creating triggers, please refer to this article.
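If you prefer creating the trigger from code rather than through the portal, here is a minimal sketch using the same Microsoft.Azure.DocumentDB SDK (the trigger name, trigger body, and database/collection ids are placeholders):

using System;
using System.Threading.Tasks;
using Microsoft.Azure.Documents;
using Microsoft.Azure.Documents.Client;

public static class TriggerSetup
{
    public static async Task CreateTriggerAsync(DocumentClient client)
    {
        var trigger = new Trigger
        {
            Id = "YourTriggerName",
            // JavaScript body executed server-side before each document is created.
            Body = @"function YourTriggerName() {
                var context = getContext();
                var doc = context.getRequest().getBody();
                doc.createdAt = new Date().toISOString();
                context.getRequest().setBody(doc);
            }",
            TriggerOperation = TriggerOperation.Create,
            TriggerType = TriggerType.Pre
        };

        await client.CreateTriggerAsync(
            UriFactory.CreateDocumentCollectionUri("{databaseid}", "{collectionid}"), trigger);
    }
}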
