Locally testing Azure Functions with DocumentDB (client library) and receiving 'Invalid API version' for 1.13.2 to 1.17.0

I'm testing Azure Functions locally using VS2017 (Preview 7.1). The function writes to DocumentDB locally using the emulator (1.11.136.2) and everything works fine when using Microsoft.Azure.DocumentDB 1.13.1. As soon as I upgrade to any of the newer versions (1.13.2 to 1.17.0), I receive the following error:
Invalid API version. Ensure a valid x-ms-version header value is passed.
When calling the function from Postman I add an x-ms-version: 2017-02-22 header, but I suspect this is required only for the REST API.
using Microsoft.Azure.Documents;
using Microsoft.Azure.Documents.Client;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Azure.WebJobs.Host;
using System;
using System.Configuration;
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;

namespace AzureFunction
{
    public static class Function1
    {
        private static readonly ConnectionPolicy connectionPolicy =
            new ConnectionPolicy
            {
                UserAgentSuffix = " tilt",
                ConnectionMode = ConnectionMode.Direct,
                ConnectionProtocol = Protocol.Tcp,
                EnableEndpointDiscovery = false,
            };

        [FunctionName("FunctionApp")]
        public static async Task<HttpResponseMessage> Run(
            [HttpTrigger(AuthorizationLevel.Function, "post", Route = "func/app")] HttpRequestMessage req,
            TraceWriter log)
        {
            string setting = ConfigurationManager.AppSettings["DOCUMENTDB"];
            string databaseName = "test1";
            Tuple<Uri, string> conn = Connection(setting);

            using (var client = new DocumentClient(conn.Item1, conn.Item2, connectionPolicy))
            {
                // error thrown at next line
                await client.CreateDatabaseIfNotExistsAsync(new Database { Id = databaseName });
            }

            return req.CreateResponse(HttpStatusCode.OK);
        }

        // Parses an "AccountEndpoint=...;AccountKey=..." style connection string.
        static Tuple<Uri, string> Connection(string configSetting)
        {
            string[] setting = configSetting.Split(';');
            string endpoint = setting[0].Split('=')[1];
            // Split('=') strips the key's base64 '==' padding, so re-append it.
            string key = setting[1].Split('=')[1] + "==";
            return new Tuple<Uri, string>(new Uri(endpoint), key);
        }
    }
}
I could continue using 1.13.1, but I would like to start using Graph DB, which is not compatible with this version.
Why am I receiving this error for the client library, and why only from version 1.13.2?

It turns out the emulator was the wrong version. For Microsoft.Azure.DocumentDB 1.14.0 and above, Azure DocumentDB Emulator 1.13.58.2 is required, which I installed from https://chocolatey.org/packages/azure-documentdb-emulator.
It seems this page https://learn.microsoft.com/en-us/azure/cosmos-db/local-emulator still points to the old version.
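To confirm that the SDK and emulator versions agree after an upgrade, a minimal smoke test can help. This is a sketch using the emulator's well-known localhost endpoint and master key (the same for every emulator install); it should surface the same 'Invalid API version' DocumentClientException when the emulator is older than the SDK expects:

using System;
using Microsoft.Azure.Documents.Client;

class EmulatorSmokeTest
{
    // Well-known local emulator endpoint and master key.
    private const string Endpoint = "https://localhost:8081/";
    private const string Key =
        "C2y6yDjf5/R+ob0N8A7Cgv30VRDJIWEHLM+4QDU5DE2nQ9nDuVTqobD4b8mGGyPMbIZnqyMsEcaGQy67XIw/Jw==";

    static void Main()
    {
        using (var client = new DocumentClient(new Uri(Endpoint), Key))
        {
            // Fails with 'Invalid API version' if the emulator is too old for the SDK.
            var account = client.GetDatabaseAccountAsync().GetAwaiter().GetResult();
            Console.WriteLine("Emulator reachable; consistency: " +
                account.ConsistencyPolicy.DefaultConsistencyLevel);
        }
    }
}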

Related

"Access to the path 'C:\\home\\site\\wwwroot\\dataModel.csv' is denied." in a .NET Core app, deployed in Azure

I have an ASP.NET Core API app which runs some background processes via Hangfire. One of the processes involves writing a CSV file to the wwwroot folder, as follows:
public async Task Work(PerformContext context)
{
    var latestLikes = await this.likeRepository
        .All()
        .Select(l => new LatestLikesServiceModel
        {
            UserId = l.UserId,
            BeatId = l.BeatId,
        })
        .ToListAsync();

    var modelPath = this.webHostEnvironment.ContentRootPath + "\\dataModel.csv";

    using (var writer = new StreamWriter(modelPath))
    using (var csv = new CsvWriter(writer, CultureInfo.InvariantCulture))
    {
        csv.WriteRecords(latestLikes);
    }
}
On localhost it works perfectly, but when I deploy it to Azure the Hangfire log returns:
"System.UnauthorizedAccessException","ExceptionMessage":"Access to the path 'C:\home\site\wwwroot\dataModel.csv' is denied."
How can I resolve this issue?
After one week of researching I finally found the solution.
I should have mentioned that I use Azure DevOps for CI & CD.
By default the Azure App Service is deployed as a zip package, which cuts off direct write access to the file system. What I had to do was change the deployment method to Web Deploy (under Additional Deployment Options), and now I finally have access to the file system.
For more detailed information, please refer to:
https://tomasherceg.com/blog/post/azure-app-service-cannot-create-directories-and-write-to-filesystem-when-deployed-using-azure-devops#comments
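As an aside, if you need to keep the zip-based deployment, writing to a location that remains writable, such as the temp folder, is a possible workaround rather than switching deployment methods. A minimal sketch, assuming the same CsvHelper API as in the question; CsvExport.WriteToTemp is a hypothetical helper name:

using System.Collections.Generic;
using System.Globalization;
using System.IO;
using CsvHelper;

public static class CsvExport
{
    // Writes records to the temp folder, which stays writable even when the
    // deployed site content itself is read-only.
    public static string WriteToTemp<T>(IEnumerable<T> records, string fileName)
    {
        var path = Path.Combine(Path.GetTempPath(), fileName);
        using (var writer = new StreamWriter(path))
        using (var csv = new CsvWriter(writer, CultureInfo.InvariantCulture))
        {
            csv.WriteRecords(records);
        }
        return path;
    }
}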
The code below works fine on my side:
using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;
using Newtonsoft.Json;

namespace FunctionApp97
{
    public static class Function1
    {
        [FunctionName("Function1")]
        public static async Task<IActionResult> Run(
            [HttpTrigger(AuthorizationLevel.Anonymous, "get", "post", Route = null)] HttpRequest req,
            ExecutionContext ec,
            ILogger log)
        {
            // FunctionAppDirectory points at the deployed function app root.
            string str = ec.FunctionAppDirectory;
            log.LogInformation(str);

            using (var reader = new StreamReader(str + @"\filename.txt"))
            {
                string line;
                while ((line = reader.ReadLine()) != null)
                {
                    log.LogInformation(line);
                }
            }

            return new OkObjectResult(str);
        }
    }
}
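One detail worth noting: for the StreamReader above to find filename.txt, the file has to be published with the function app. In an SDK-style project that typically means marking the file to copy to the output directory, for example (filename.txt is the sample file from the code above):

<ItemGroup>
  <None Update="filename.txt">
    <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
  </None>
</ItemGroup>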

Inference for custom vision model hosted on Azure Cognitive Services stopped working in December for me

I have a service that has been successfully performing inferences for 2 years, but the API stopped working in December. I have created a simple app based on documentation from Microsoft to reproduce the problem; please see the code below.
Is anybody else experiencing this problem?
using System;
using System.IO;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

namespace TestCustomVision
{
    class Program
    {
        public static void Main()
        {
            string imageFilePath = <My Image>;
            MakePredictionRequest(imageFilePath).Wait();
            Console.WriteLine("\n\nHit ENTER to exit...");
            Console.ReadLine();
        }

        public static async Task MakePredictionRequest(string imageFilePath)
        {
            var client = new HttpClient();

            // Request headers - replace this example key with your valid Prediction-Key.
            client.DefaultRequestHeaders.Add("Prediction-Key", <My key>);

            // Prediction URL - replace this example URL with your valid Prediction URL.
            string url = <Prediction URL>;

            HttpResponseMessage response;

            // Request body. Try this sample with a locally stored image.
            byte[] byteData = GetImageAsByteArray(imageFilePath);
            using (var content = new ByteArrayContent(byteData))
            {
                content.Headers.ContentType = new MediaTypeHeaderValue("application/octet-stream");
                response = await client.PostAsync(url, content);
                Console.WriteLine(await response.Content.ReadAsStringAsync());
            }
        }

        private static byte[] GetImageAsByteArray(string imageFilePath)
        {
            // Dispose the streams once the bytes have been read.
            using (var fileStream = new FileStream(imageFilePath, FileMode.Open, FileAccess.Read))
            using (var binaryReader = new BinaryReader(fileStream))
            {
                return binaryReader.ReadBytes((int)fileStream.Length);
            }
        }
    }
}
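The code above prints only the response body, so a hedged debugging addition is to log the HTTP status code as well; a 401/403 points at the Prediction-Key, while a 404/410 suggests the endpoint itself has changed or been retired. ResponseDebug is a hypothetical helper you could call right after PostAsync:

using System;
using System.Net.Http;
using System.Threading.Tasks;

static class ResponseDebug
{
    // Logs the status code alongside the body to make the failure mode visible.
    public static async Task Dump(HttpResponseMessage response)
    {
        Console.WriteLine($"HTTP {(int)response.StatusCode} {response.ReasonPhrase}");
        Console.WriteLine(await response.Content.ReadAsStringAsync());
    }
}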

Download File from Blob Storage .net core Azure Function C#

Note: This is a share.
A couple of days ago I tried to use an Azure Function to build an API for blob storage CRUD operations. I investigated a solution for the download operation, since most of the solutions I found on the internet work locally, but once the function is deployed the web server needs permission on a path to create the file and download it locally, which produced the error: "Access to the path is denied".
I then solved the download via the HTTP response with Azure Functions V2, C# .NET Core 2.1.
This is the basic code that works for me; I hope it helps you...
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.AspNetCore.Http;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;
using Microsoft.WindowsAzure.Storage.Auth;
using System.IO;
using System.Net.Http.Headers;
using System.Net.Http;
using System.Net;

namespace BloApi
{
    public static class BlobOperations
    {
        [FunctionName("DownloadBlob")]
        public static async Task<HttpResponseMessage> DownloadBlob(
            [HttpTrigger(AuthorizationLevel.Anonymous, "get", Route = "DownloadBlob/{name}")] HttpRequest req, string name)
        {
            StorageCredentials storageCredentials = new StorageCredentials("Storage",
                "CamEKgqVaylmQ.....ow2VHlyCww==");
            CloudStorageAccount storageAccount = new CloudStorageAccount(storageCredentials, true);
            // Container names must be all lowercase, or the request fails.
            CloudBlobContainer container = storageAccount.CreateCloudBlobClient().GetContainerReference("myblobcontainer");
            CloudBlockBlob block = container.GetBlockBlobReference(name);

            // Stream the blob straight into the HTTP response instead of
            // saving it to the local file system first.
            HttpResponseMessage message = new HttpResponseMessage(HttpStatusCode.OK);
            Stream blobStream = await block.OpenReadAsync();
            message.Content = new StreamContent(blobStream);
            message.Content.Headers.ContentLength = block.Properties.Length;
            message.Content.Headers.ContentType = new MediaTypeHeaderValue(block.Properties.ContentType);
            message.Content.Headers.ContentDisposition = new ContentDispositionHeaderValue("attachment")
            {
                FileName = $"CopyOf_{block.Name}",
                Size = block.Properties.Length
            };
            return message;
        }
    }
}
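For a quick local test of the endpoint, a minimal console caller could look like the sketch below; the port, route, and blob name are placeholders for your local Functions host:

using System;
using System.IO;
using System.Net.Http;
using System.Threading.Tasks;

class DownloadBlobCaller
{
    static async Task Main()
    {
        // Placeholder URL for a locally running Functions host.
        var url = "http://localhost:7071/api/DownloadBlob/sample.pdf";
        using (var http = new HttpClient())
        using (var response = await http.GetAsync(url))
        using (var body = await response.Content.ReadAsStreamAsync())
        using (var file = File.Create("CopyOf_sample.pdf"))
        {
            await body.CopyToAsync(file);
            Console.WriteLine("Saved CopyOf_sample.pdf");
        }
    }
}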

Why does my Azure V2 timer function crash with a Newtonsoft reference?

I have a simple timer-based Azure function that crashes with the following message.
I have added the NuGet package for Newtonsoft.Json, so I am not sure why this is a problem.
[1/11/2018 07:00:26] Executed 'PimDataFeeder' (Failed, Id=291e9147-7f57-4fd3-887d-a8001afc8230)
[1/11/2018 07:00:26] System.Private.CoreLib: Exception while executing function: PimDataFeeder. System.Private.CoreLib: Could not load file or assembly 'Newtonsoft.Json, Version=11.0.0.0, Culture=neutral, PublicKeyToken=30ad4fe6b2a6aeed'. Could not find or load a specific file. (Exception from HRESULT: 0x80131621). System.Private.CoreLib: Could not load file or assembly 'Newtonsoft.Json, Version=11.0.0.0, Culture=neutral, PublicKeyToken=30ad4fe6b2a6aeed'.
---- EDIT ----
The simple function looks like this. Basically it downloads a file from a remote destination and manipulates it in memory before writing it into a Cosmos DB instance, or at least that's the idea once it starts working. Putting a breakpoint in the loop tells me that the first loop iteration works and the first line of the string is indeed split properly; then follows the crash.
using System;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;
using System.Net.Http;
using System.Net.Http.Handlers;
using System.Net.Http.Headers;
using System.IO.Compression;
using System.IO;
using System.Text;
using System.Net;
using Microsoft.Azure.Documents;
using Microsoft.Azure.Documents.Client;
using Newtonsoft.Json.Linq;
using Newtonsoft.Json;
using System.ComponentModel.DataAnnotations;

namespace NWCloudPimDataFeeder
{
    public static class PimDataFeeder
    {
        [FunctionName("PimDataFeeder")]
        public static async System.Threading.Tasks.Task RunAsync([TimerTrigger("0 */15 * * * *")] TimerInfo myTimer, TraceWriter log)
        {
            // The endpoint to your cosmosdb instance
            var endpointUrl = "https://example.com";
            // The key to your cosmosdb
            var key = "XXX";
            // The name of the database
            var databaseName = "XXX";
            // The name of the collection of json documents
            var databaseCollection = "XXX";

            log.Info($"C# Timer trigger function executed at: {DateTime.Now}");

            HttpClientHandler handler = new HttpClientHandler()
            {
                AutomaticDecompression = DecompressionMethods.GZip | DecompressionMethods.Deflate
            };
            // Pass the handler in, otherwise the decompression settings are never used.
            HttpClient client = new HttpClient(handler);
            client.DefaultRequestHeaders.Add("Authorization", "Bearer XXX");
            client.DefaultRequestHeaders.Accept.Add(new MediaTypeWithQualityHeaderValue("application/json"));

            HttpResponseMessage file = await client.GetAsync("https://example.com");
            var content = await file.Content.ReadAsByteArrayAsync();
            MemoryStream originalFileStream = new MemoryStream(content);

            using (GZipStream decompressionStream = new GZipStream(originalFileStream, CompressionMode.Decompress))
            {
                MemoryStream decompressedFileStream = new MemoryStream();
                decompressionStream.CopyTo(decompressedFileStream);
                byte[] fileResult = new byte[decompressedFileStream.Length];
                decompressedFileStream.Position = 0;
                decompressedFileStream.Read(fileResult, 0, fileResult.Length);
                string result = System.Text.Encoding.UTF8.GetString(fileResult);
                //log.Info(result);

                foreach (var singleItem in result.Split(new string[] { "\r\n", "\r", "\n" }, StringSplitOptions.RemoveEmptyEntries))
                {
                    log.Info("singleItem looks like: " + singleItem);
                    log.Info("In the loop");
                    var itemWrapper = new ItemWrapper { NWID = Guid.NewGuid(), Item = singleItem, DocumentType = "Item" };

                    // Create a cosmosdb client
                    using (var docClient = new DocumentClient(new Uri(endpointUrl), key))
                    {
                        // Save the document to cosmosdb; await instead of blocking on the result.
                        await docClient.CreateDocumentAsync(UriFactory.CreateDocumentCollectionUri(databaseName, databaseCollection), itemWrapper);
                    }
                }
            }
        }
    }

    public class ItemWrapper
    {
        public Guid NWID { get; set; }

        [Required]
        [JsonProperty("item")]
        public string Item { get; set; }

        [Required]
        [JsonProperty("documentType")]
        public string DocumentType { get; set; }
    }
}
Right now the CLI and runtime output is as below when we debug the function in VS. A function project created with Microsoft.NET.Sdk.Functions 1.0.23 (>= 1.0.14) references Newtonsoft.Json 11.0.2 by default.
Azure Functions Core Tools (2.1.748 Commit hash: 5db20665cf0c11bedaffc96d81c9baef7456acb3)
Function Runtime Version: 2.0.12134.0
I can only repro the problem with an old Function runtime, which still requires v10 of Newtonsoft.Json. So check the Function runtime version and make sure VS consumes the latest.
Download and set up the CLI manually:
Delete the old Function CLI used by VS: remove the subfolders under %localappdata%\AzureFunctionsTools\Releases.
Delete the template engine consumed by VS: %userprofile%\.templateengine.
Go to the CLI feed to download the latest CLI; right now that is feed 2.10.1 and CLI 2.1.748.
Go to %localappdata%\AzureFunctionsTools\Releases and create a folder named 2.10.1.
Decompress the zip, rename the folder to cli, and drag it under 2.10.1.
Copy the templates folder under cli to 2.10.1, and rename the two files inside by removing the version, e.g. itemTemplates.2.0.0-10300.nupkg to itemTemplates.nupkg.
Create a manifest.json under 2.10.1 as below, changing UserName.
{
    "CliEntrypointPath": "C:\\Users\\UserName\\AppData\\Local\\AzureFunctionsTools\\Releases\\2.10.1\\cli\\func.exe",
    "FunctionsExtensionVersion": "~2",
    "MinimumRuntimeVersion": "2.1",
    "ReleaseName": "2.10.1",
    "RequiredRuntime": ".NET Core",
    "SdkPackageVersion": "1.0.23",
    "TemplatesDirectory": "C:\\Users\\UserName\\AppData\\Local\\AzureFunctionsTools\\Releases\\2.10.1\\templates"
}
The folder structure should then look like this (screenshot omitted): a 2.10.1 folder containing the cli folder, the templates folder, and manifest.json.
After restarting VS, everything should work as expected.
These errors are often caused by not having the latest version of the Functions SDK/tools installed. The reference to Newtonsoft 11.0.0 suggests that this might be the case (the latest version is 11.0.2). You can navigate to Tools -> Extensions and Updates -> Updates (bottom left of the window) in VS to check for updates to the Azure Functions tooling. Once you have the latest update, try re-creating the function to see if it works.
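If updating the tooling alone doesn't help, another possible workaround (not part of the answers above) is to pin the package versions explicitly in the .csproj, using the versions mentioned earlier:

<ItemGroup>
  <!-- Pin the Functions SDK and the Newtonsoft.Json version it expects. -->
  <PackageReference Include="Microsoft.NET.Sdk.Functions" Version="1.0.23" />
  <PackageReference Include="Newtonsoft.Json" Version="11.0.2" />
</ItemGroup>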

A namespace cannot directly contain members in the Azure Function App

TARGET: I followed the Azure Function tutorial below and copied the code, but got several errors when executing locally in VS2017. I appreciate your help.
https://www.cyotek.com/blog/upload-data-to-blob-storage-with-azure-functions
ERROR 1 - related to Run:
CS0116 A namespace cannot directly contain members such as fields or methods UploadToBlobFunctionApp C:\AzureFunctions\UploadToBlobFunctionApp\UploadToBlobFunctionApp\UploadToBlobFunction.cs 15 Active
ERROR 2 - related to Task CreateBlob:
CS0116 A namespace cannot directly contain members such as fields or methods UploadToBlobFunctionApp
C:\AzureFunctions\UploadToBlobFunctionApp\UploadToBlobFunctionApp\UploadToBlobFunction.cs 45 Active
ERROR 3 - related to await CreateBlob:
CS0103 The name 'CreateBlob' does not exist in the current context UploadToBlobFunctionApp C:\AzureFunctions\UploadToBlobFunctionApp\UploadToBlobFunctionApp\UploadToBlobFunction.cs 36 Active
CODE Function.cs:
using System;
using System.Configuration;
using System.IO;
using System.Net;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;
using Microsoft.Azure;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log)
{
    HttpStatusCode result;
    string contentType;

    result = HttpStatusCode.BadRequest;
    contentType = req.Content.Headers?.ContentType?.MediaType;

    if (contentType == "application/json")
    {
        string body;
        body = await req.Content.ReadAsStringAsync();

        if (!string.IsNullOrEmpty(body))
        {
            string name;
            name = Guid.NewGuid().ToString("n");
            await CreateBlob(name + ".json", body, log);
            result = HttpStatusCode.OK;
        }
    }

    return req.CreateResponse(result, string.Empty);
}

private async static Task CreateBlob(string name, string data, TraceWriter log)
{
    string accessKey;
    string accountName;
    string connectionString;
    CloudStorageAccount storageAccount;
    CloudBlobClient client;
    CloudBlobContainer container;
    CloudBlockBlob blob;

    accessKey = "qwertyw4VhRajxlZn9C4hTMB8oSwE4klNUsvTy9VeTCIQ11111vFVVGExDwJ+JUboFv2B79j+W6foqLWE92w==";
    accountName = "mystorage";
    connectionString = "DefaultEndpointsProtocol=https;AccountName=" + accountName + ";AccountKey=" + accessKey + ";EndpointSuffix=core.windows.net";

    storageAccount = CloudStorageAccount.Parse(connectionString);
    client = storageAccount.CreateCloudBlobClient();
    container = client.GetContainerReference("functionupload");
    await container.CreateIfNotExistsAsync();

    blob = container.GetBlockBlobReference(name);
    blob.Properties.ContentType = "application/json";

    using (Stream stream = new MemoryStream(Encoding.UTF8.GetBytes(data)))
    {
        await blob.UploadFromStreamAsync(stream);
    }
}
The example that you are referencing uses scripted functions (.csx files). They are mostly used when editing code directly in the Azure portal.
I think you are trying to create a precompiled application with .csproj and .cs files. In that case, your code has to be valid C#, i.e. all methods must be inside classes.
Have a look at this example.
You can also use attributes to mark your functions and triggers instead of authoring function.json manually; see examples here.
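For illustration, a minimal precompiled version of the tutorial code could look like the sketch below; the class and function names are made up, and the method bodies are the ones from the question, moved inside a class:

using System;
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Azure.WebJobs.Host;

namespace UploadToBlobFunctionApp
{
    public static class UploadToBlobFunction
    {
        [FunctionName("UploadToBlob")]
        public static async Task<HttpResponseMessage> Run(
            [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequestMessage req,
            TraceWriter log)
        {
            // Move the body of the scripted Run method here unchanged.
            string body = await req.Content.ReadAsStringAsync();
            await CreateBlob(Guid.NewGuid().ToString("n") + ".json", body, log);
            return req.CreateResponse(HttpStatusCode.OK);
        }

        private static Task CreateBlob(string name, string data, TraceWriter log)
        {
            // Move the body of the scripted CreateBlob method here unchanged.
            return Task.CompletedTask;
        }
    }
}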
