We are building a system that some customers will run in Azure and some will run in Docker on their own hardware via docker-compose. We are basing our microservices on Azure Functions.
I have written a docker-compose file to set up the various images (web site, Azure Functions, and RabbitMQ).
The docker-compose looks like this (Simplified):
version: "3"
services:
  abmicroservice:
    build:
      context: ./AbMicroservice
    depends_on:
      - rabbitmq
When the docker-compose starts up, I get this error when the Azure Function project is started:
abmicroservice_1 | No job functions found. Try making your
job classes and methods public. If you're using binding extensions
(e.g. Azure Storage, ServiceBus, Timers, etc.) make sure you've called
the registration method for the extension(s) in your startup code
(e.g. builder.AddAzureStorage(), builder.AddServiceBus(),
builder.AddTimers(), etc.).
But when I run the same Azure Function using the func.exe tool or the Visual Studio debugger, it runs fine.
I am guessing that the issue lies in my host.json and similar settings files, or in the settings in docker-compose.yml.
My Function is just a hello-world test that runs great when Visual Studio 2019 runs it:
public static class TriggerFunction
{
    [FunctionName("TriggerFunction")]
    public static void Run(
        [RabbitMQTrigger("hello")] string message,
        ILogger log)
    {
        log.LogInformation($"************* Message received from RabbitMQ trigger: {message}");
    }
}
A few things that could solve it:
Check the connection strings.
Make sure the connection strings are passed as environment variables to your Docker container (look into function.json: the value of "connection" should be the name of the environment variable that holds the connection string).
{
  "scriptFile": "__init__.py",
  "bindings": [
    {
      "name": "msg",
      "type": "serviceBusTrigger",
      "direction": "in",
      "queueName": "nameOfTheQue",
      "connection": "connectionVariable"
    }
  ]
}
This means your Docker container should have an environment variable named "connectionVariable" containing the connection string (to Service Bus in this example).
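For illustration, a minimal docker-compose sketch of passing such a variable into the function container (the service layout mirrors the simplified compose file above; the variable name connectionVariable and its value are placeholders):
services:
  abmicroservice:
    build:
      context: ./AbMicroservice
    environment:
      # the name must match the "connection" value in function.json
      - connectionVariable=Endpoint=sb://<namespace>.servicebus.windows.net/;SharedAccessKeyName=<keyName>;SharedAccessKey=<key>
    depends_on:
      - rabbitmq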
Related
I'm currently developing Azure Functions (I'm new at it), but I'm getting the error below while trying to read from a topic/subscription. I have no idea what's causing this. Any help would be appreciated.
[20/12/2018 14:22:22] Loaded custom extension: ServiceBusExtensionConfig from 'referenced by: Method='Function.ContentCacheUpdate.ReadNotificationQueue.Run', Parameter='mySbMsg'.'
[20/12/2018 14:22:22] Generating 1 job function(s)
[20/12/2018 14:22:23] Found the following functions:
[20/12/2018 14:22:23] Function.ContentCacheUpdate.ReadNotificationQueue.Run
[20/12/2018 14:22:23]
[20/12/2018 14:22:23] Host initialized (1208ms)
Listening on http://localhost:7071/
Hit CTRL-C to exit...
[20/12/2018 14:22:23] Host started (1682ms)
[20/12/2018 14:22:23] Job host started
[20/12/2018 14:22:23] Host lock lease acquired by instance ID '000000000000000000000000EB6A5850'.
My function looks like this
private const string TopicName = "testtopic";
[FunctionName("Function2")]
public static void Run([ServiceBusTrigger(TopicName, "SubscriptionName", Connection = "MyBindingConnection")]string mySbMsg, ILogger log)
{
log.LogInformation($"C# ServiceBus topic trigger function processed message: {mySbMsg}");
}
and my local.settings.json file is
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "AzureWebJobsDashboard": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet",
    "TopicName": "testtopic",
    "SubscriptionName": "testsubscription",
    "MyBindingConnection": "Endpoint=sb://test-.xxxxxxxxxxxxxxxxxxxx="
  }
}
Thanks
It looks like the Azure Functions Core Tools used by VS are outdated. To fix this:
First, go to VS menu > Tools > Extensions and Updates, find Azure Functions and Web Jobs Tools, and update it if it's not the latest (15.10.2046.0 right now). Close all VS instances and wait for the update to finish (if there is one).
Then clean the old tools and templates and use VS to download the new tools.
Remove the %localappdata%\AzureFunctionsTools and %userprofile%\.templateengine folders.
Reopen VS to create a new Function project and wait at the creation dialog; you will see Making sure all templates are up to date....
After a while the tip changes; click Refresh to work with the latest templates instantly.
After creating a new v2 Service Bus topic trigger, change your code as below. Connection looks for its value in app settings (local.settings.json) by default, while for other properties we need to wrap the setting name in percent signs. Check the details in the docs.
[FunctionName("MyServiceBusTrigger")]
public static void Run([ServiceBusTrigger("%TopicName%", "%SubscriptionName%", Connection = "MyBindingConnection")]string mySbMsg, ILogger log)
{
log.LogInformation($"C# ServiceBus topic trigger function processed message: {mySbMsg}");
}
local.settings.json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet",
    "TopicName": "testtopic",
    "SubscriptionName": "testsubscription",
    "MyBindingConnection": "Endpoint=sb://test-.xxxxxxxxxxxxxxxxxxxx="
  }
}
I did it this way as well. Can you cross-check whether there are messages in the subscription?
We have started to use the Queue binding in our Azure functions for longer-running tasks such as sending bulk e-mails and "clean-up" tasks for CosmosDB. We develop locally with the Functions emulator then commit into VSTS/Azure DevOps which then auto-deploys into our Function App.
It seems as though pretty quickly we're going to have multiple Functions (two local emulators and one cloud function) all listening to the same queue. We tried disabling and renaming queues locally, but these all seem like awkward workarounds that require too much manual work and risk pushing the wrong queue name forward into VSTS.
How do we configure the queue name in the function.json to read an environment variable? The connection setting in the binding takes the name of an environment variable, but the queue setting wants a string.
{
  "disabled": false,
  "bindings": [
    {
      "name": "myQueueItem",
      "type": "queueTrigger",
      "direction": "in",
      "queueName": "emailer",
      "connection": "STORAGE_CONNECTION_STRING"
    }
  ]
}
Just wrap the variable name with % and the function can read its value from Application settings on the portal, and from Values in local.settings.json locally.
"queueName": "%myqueue%"
The connection property of triggers and bindings is a special case and automatically resolves values as app settings, without percent signs.
See Binding expressions - app settings.
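Applied to the binding above, the trigger definition could then read something like this (a sketch; the setting name myqueue is illustrative, and its value would live in Application settings on the portal or in Values of local.settings.json):
{
  "name": "myQueueItem",
  "type": "queueTrigger",
  "direction": "in",
  "queueName": "%myqueue%",
  "connection": "STORAGE_CONNECTION_STRING"
}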
I have a v.2 Service Bus Trigger function which, when I attempt to start, throws the following exception:
System.InvalidOperationException
HResult=0x80131509
Message=The host has not yet started.
Source=Microsoft.Azure.WebJobs.Host
StackTrace:
at Microsoft.Azure.WebJobs.JobHost.StopAsync() in C:\projects\azure-webjobs-sdk-rqm4t\src\Microsoft.Azure.WebJobs.Host\JobHost.cs:line 121
at Microsoft.Azure.WebJobs.Hosting.JobHostService.StopAsync(CancellationToken cancellationToken) in C:\projects\azure-webjobs-sdk-rqm4t\src\Microsoft.Azure.WebJobs.Host\Hosting\JobHostService.cs:line 32
at Microsoft.Extensions.Hosting.Internal.Host.<StopAsync>d__10.MoveNext()
I've searched around but cannot find anyone with a similar issue (and fix). I'm running VS 15.8.7 with all extensions and packages updated.
Here's what my function looks like:
[FunctionName("ServiceBusListenerFunction")]
public static void Run([ServiceBusTrigger("myTopic", "MySubscription", Connection = "MyConnection")]string mySbMsg, ILogger log)
{
log.LogInformation($"C# ServiceBus topic trigger function processed message: {mySbMsg}");
}
And here's my local.settings.json:
{
  "IsEncrypted": false,
  "Values": {
    "FUNCTIONS_WORKER_RUNTIME": "dotnet",
    "MyConnection": "UseDevelopmentStorage=true",
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "AzureWebJobsDashboard": "UseDevelopmentStorage=true"
  },
  "Host": {
    "LocalHttpPort": 7077
  }
}
I also tried doing the following in launchSettings.json, but it didn't help:
{
  "profiles": {
    "MyProject": {
      "commandName": "Project",
      "executablePath": "C:\\Users\\[myUserName\\AppData\\Roaming\\npm\\node_modules\\azure-functions-core-tools\\bin\\func.dll",
      "commandLineArgs": "host start --port 7077"
    }
  }
}
I have Service Bus Explorer running and have created the above-named topic and subscription on it. The project in which the functions are located is built against .NET Standard 2.0.
Please let me know if you have any suggestions or need additional information.
EDIT: I grabbed the red exception text that appears briefly in the console window before it closes (which happens right before I get the above exception), and it reads:
Host initialized
A host error has occurred
System.Private.Uri: Value cannot be null
Parameter name: uriString
Stopping job host
Searching on this, I found this, but it doesn't seem as though I should have to change the attribute to get this working.
Thanks in advance for any help.
The problem is caused by this setting:
"MyConnection": "UseDevelopmentStorage=true"
UseDevelopmentStorage=true represents the Storage Emulator connection string; for a Service Bus trigger, use a Service Bus connection string (the same one used in Service Bus Explorer, or find it in the Azure portal).
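For example, a sketch of what local.settings.json could look like with a real Service Bus connection string (the namespace and key are placeholders):
{
  "IsEncrypted": false,
  "Values": {
    "FUNCTIONS_WORKER_RUNTIME": "dotnet",
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "MyConnection": "Endpoint=sb://<namespace>.servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=<key>"
  }
}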
Some improvements:
In local.settings.json, LocalHttpPort somehow doesn't work in VS; you can remove it, as commandLineArgs in launchSettings.json works as expected.
AzureWebJobsDashboard is not required any more, so it can be deleted unless you have a specific reason to keep it.
In launchSettings.json, remove executablePath, which is invalid as well. Usually we don't need this setting, as VS uses the latest CLI by default.
Another way: I sorted the issue by removing the connection string from the [ServiceBusTrigger] attribute and providing it through local.settings.json.
In the function file:
[ServiceBusTrigger("Your-Topics-Name", "SubscriptionName", Connection = "MyServiceBus")]
Inside local.settings.json:
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "AzureWebJobsMyServiceBus": "your connection string",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet"
  }
}
Note: the connection string setting name starts with "AzureWebJobs", so you only put the remaining part as the Connection name in the attribute.
In my case, I just had to update Microsoft.Azure.WebJobs.Extensions.ServiceBus from 4.7.x to 5.x.x, and that was it :-)
I had to install Azure Functions Core Tools. It includes a version of the same runtime that powers Azure Functions, which you can run on your local development computer. It also provides commands to create functions, connect to Azure, and deploy function projects.
In my case the problem was the Platform target; change it to Any CPU instead of x86.
I solved the issue by updating all the packages. I had old packages that were incompatible with a recent package I had installed.
I am trying to deploy a precompiled Azure Function that uses a Blob trigger.
After publishing the function, I get the following error in Kudu and my function is not executed:
2017-05-30T14:34:11.436 Starting Host (HostId=sfl-data-forecast-dev-funcs, Version=1.0.10945.0, ProcessId=17328, Debug=True, Attempt=0)
2017-05-30T14:34:11.436 Development settings applied
2017-05-30T14:34:11.436 No job functions found. Try making your job classes and methods public. If you're using binding extensions (e.g. ServiceBus, Timers, etc.) make sure you've called the registration method for the extension(s) in your startup code (e.g. config.UseServiceBus(), config.UseTimers(), etc.).
2017-05-30T14:34:11.436 Job host started
2017-05-30T14:34:11.436 The following 1 functions are in error:
Import: The function type name 'Forecasts.Functions.ImportForecastsFunction' is invalid.
I do not understand why I get this error. The Azure Function is in a web project targeting framework 4.6.1. The WebJobs SDK and Extensions NuGet packages were added. I downgraded Newtonsoft.Json to version 9.0.1, but it didn't change anything.
I have the following function.json:
{
  "scriptFile": "..\\bin\\SFL.Data.Forecasts.Functions.dll",
  "entryPoint": "SFL.Data.Forecasts.Functions.ImportForecastsFunction.Run",
  "bindings": [
    {
      "name": "file",
      "type": "blobTrigger",
      "direction": "in",
      "path": "forecasts/{name}",
      "connection": "HotStorageAccount.ConnectionString"
    }
  ],
  "disabled": false
}
I faced the same exception. It turned out that the runtime version was invalid: it was erroneously defined as ~1, even though the function references netcore2.1, which is not supported by runtime version 1.
In particular, the invalid version was caused by an ARM template based resource group deployment defining the function app's setting FUNCTIONS_EXTENSION_VERSION as ~1 instead of ~2.
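For reference, a minimal sketch of the relevant fragment of such an ARM template (only the app setting is shown; the surrounding Microsoft.Web/sites function app resource is omitted):
"siteConfig": {
  "appSettings": [
    {
      "name": "FUNCTIONS_EXTENSION_VERSION",
      "value": "~2"
    }
  ]
}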
I solved the problem by providing a namespace for the Azure Function file.
namespace MyProject.AppFunctions
This is my class:
namespace MyProject.AppFunctions
{
    public static class SomeFunction
    {
        public static async Task<HttpResponseMessage> Run(...)
        {
            // CODE
        }
    }
}
This is my functions.json file:
{
  "scriptFile": "..\\bin\\MyProject.AppFunctions.dll",
  "entryPoint": "MyProject.AppFunctions.SomeFunction.Run",
  ...
}
FWIW, I just resolved the same issue. The problem was that my debugger settings were pointing to an old version of the func.exe application. I switched my debugger settings to launch %AppData%\npm\func.cmd instead and everything worked fine.
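If it helps, a sketch of what such a launchSettings.json profile could look like (the profile name is illustrative; adjust the path if your Core Tools are installed elsewhere):
{
  "profiles": {
    "MyProject": {
      "commandName": "Executable",
      "executablePath": "%AppData%\\npm\\func.cmd",
      "commandLineArgs": "host start"
    }
  }
}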
(Solution) Function (FunctionName/CsvUpload) Error: The function type name 'Functions.CsvUpload' is invalid
I solved this error by changing the value in the Azure configuration from ~1 to ~2. Make sure you use Visual Studio 2019 and Microsoft.NET.Sdk.Functions 3.0.2 when you are working on .NET Core 3.0 or thereabouts. If you are using Visual Studio 2017, you have to set your SDK version lower than 3.0.2 (Microsoft.NET.Sdk.Functions 1.0.29 or so); only then can you set FUNCTIONS_EXTENSION_VERSION = ~1.
If you are using Visual Studio 2019, use FUNCTIONS_EXTENSION_VERSION = ~2.
If you are using Visual Studio 2017, use FUNCTIONS_EXTENSION_VERSION = ~1.
This also happens when you have multiple Azure Functions Core Tools versions installed at the same time.
I got this error when trying to debug a v4 Azure Function locally. It turned out I had v2 installed. Once I removed v2, things started working again.
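Assuming the tools were installed through npm, a quick way to check which version is picked up and to replace it (standard npm and Core Tools commands):
func --version
npm ls -g azure-functions-core-tools
npm uninstall -g azure-functions-core-tools
npm install -g azure-functions-core-tools@4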
I'd like to run unit / integration tests that utilise the Azure Storage Emulator rather than real storage from an Azure DevOps build.
The emulator is installed on the Hosted Build Controller as part of the Azure SDK in its usual place (C:\Program Files (x86)\Microsoft SDKs\Azure\Storage Emulator\AzureStorageEmulator.exe).
However, the emulator is in an uninitialised state on the Build Controller. When trying to run the Init command from the command line, I get the following error:
This operation requires an interactive window station
Is there a known workaround for this, or plans to support the emulator in Azure DevOps builds?
Despite all the answers here to the contrary, I've been running the Azure Storage Emulator on a VS2017 hosted build agent for over a year.
The trick is to initialise SQL LocalDB first (the emulator uses it), and then start the emulator. You can do this with a command line task that runs:
sqllocaldb create MSSQLLocalDB
sqllocaldb start MSSQLLocalDB
sqllocaldb info MSSQLLocalDB
"C:\Program Files (x86)\Microsoft SDKs\Azure\Storage Emulator\AzureStorageEmulator.exe" start
As already stated, you can't run the Azure Storage Emulator. What you can run, though, is Azurite, an open-source alternative.
Please note: Azurite can emulate blobs, tables and queues. However, I have only used the blob storage emulation in this way.
At the start of your build configuration, add a NuGet step that runs the custom command install Azurite -version 2.2.2. Then add a command line step that runs start /b $(Build.SourcesDirectory)\Azurite.2.2.2\tools\blob.exe (a pipeline sketch follows below).
It runs on the same port as the Azure Storage Emulator so you can use the standard connection strings.
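A sketch of those two steps in YAML pipeline form (assuming the nuget CLI is available on the agent; the package version and blob.exe path are taken from the answer above):
steps:
- script: nuget install Azurite -Version 2.2.2 -OutputDirectory $(Build.SourcesDirectory)
  displayName: Install Azurite
- script: start /b $(Build.SourcesDirectory)\Azurite.2.2.2\tools\blob.exe
  displayName: Start Azurite blob service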
No, the Hosted Build Controller does not run in Interactive Mode, so the emulator won't work in that environment. See the Q&A in Hosted build controller for XAML builds for details.
Q: Do you need to run your build service in interactive mode?
A: No. Then you can use the hosted build controller.
I recommend you set up an on-premises build controller and run the build service in Interactive Mode. Refer to Setup Build Server and Setup Build Controller for details.
It seems like the answer may have to come from the Visual Studio Online side. There's a User Voice entry if anyone has similar issues.
I'm not really sure why the emulator doesn't have a non-interactive mode; personally I don't use its UI 99% of the time. There's a general User Voice entry for making Azure Storage more unit testable.
If you want to start the Azure Storage Emulator right in your integration test code in C#, you can put this into your test initialization (startup) code (the example is for xUnit):
[Collection("Database collection")]
public sealed class IntegrationTests
{
public IntegrationTests(DatabaseFixture fixture)
{
this.fixture = fixture;
}
[Fact]
public async Task TestMethod1()
{
// use fixture.Table to run tests on the Azure Storage
}
private readonly DatabaseFixture fixture;
}
public class DatabaseFixture : IDisposable
{
public DatabaseFixture()
{
StartProcess("SqlLocalDB.exe", "create MSSQLLocalDB");
StartProcess("SqlLocalDB.exe", "start MSSQLLocalDB");
StartProcess("SqlLocalDB.exe", "info MSSQLLocalDB");
StartProcess(EXE_PATH, "start");
var client = CloudStorageAccount.DevelopmentStorageAccount.CreateCloudTableClient();
Table = client.GetTableReference("tablename");
InitAsync().Wait();
}
public void Dispose()
{
Table.DeleteIfExistsAsync().Wait();
StartProcess(EXE_PATH, "stop");
}
private async Task InitAsync()
{
await Table.DeleteIfExistsAsync();
await Table.CreateAsync();
}
static void StartProcess(string path, string arguments, int waitTime = WAIT_FOR_EXIT) =>
Process.Start(path, arguments).WaitForExit(waitTime);
public CloudTable Table { get; }
private const string EXE_PATH =
"C:\\Program Files (x86)\\Microsoft SDKs\\Azure\\Storage Emulator\\AzureStorageEmulator.exe";
private const int WAIT_FOR_EXIT = 60_000;
}
[CollectionDefinition("Database collection")]
public class DatabaseCollection : ICollectionFixture<DatabaseFixture>
{
// This class has no code, and is never created. Its purpose is simply
// to be the place to apply [CollectionDefinition] and all the
// ICollectionFixture<> interfaces.
}