Expose WebJobs functions to dashboard without Azure Storage

In this question there's an example of how to use a WebJob to perform background operations without interacting with Azure table storage.
I tried to replicate the code in the answer, but it throws the following error:
'Void ScheduleNotifications()' can't be invoked from Azure WebJobs SDK. Is it missing Azure WebJobs SDK attributes?
In this link they have a similar error, and one of the answers says it was fixed in the 0.4.1-beta release. I'm running the 0.5.0-beta release and still see the error.
Here's a copy of my code:
class Program
{
    static void Main()
    {
        var config = new JobHostConfiguration(AzureStorageAccount.ConnectionString);
        var host = new JobHost(config);
        host.Call(typeof(Program).GetMethod("ScheduleNotifications"));
        host.RunAndBlock();
    }

    [NoAutomaticTrigger]
    public static void ScheduleNotifications()
    {
        // Do work
    }
}
I want to know if I'm missing something or is this still a bug in the Webjobs SDK.
Update: Per Victor's answer, the Program class has to be public.
Working code:
public class Program
{
    static void Main()
    {
        var config = new JobHostConfiguration(AzureStorageAccount.ConnectionString);
        var host = new JobHost(config);
        host.Call(typeof(Program).GetMethod("ScheduleNotifications"));
        host.RunAndBlock();
    }

    [NoAutomaticTrigger]
    public static void ScheduleNotifications()
    {
        // Do work
    }
}

Unless you use a custom type locator, a function has to satisfy all of the conditions below:
- it has to be public
- it has to be static
- it has to be non-abstract
- it has to be in a non-abstract class
- it has to be in a public class
Your function doesn't meet the last condition. If you make the class public, it will work.
Also, if you use WebJobs SDK 0.5.0-beta and run a program containing only the code in your example, you will see a message saying that no functions were found.

Came looking for an answer here and didn't quite find it in the answer above, though everything it says is true. My problem was that I had accidentally changed the inbound parameter names of an Azure WebJob function so that they DIDN'T match the properties of the object the function was supposed to catch. Duh!
For the concrete example:
My WebJob was listening for a queue message based on this class:
public class ProcessFileArgs
{
    public ProcessFileArgs() { }

    public string DealId { get; set; }
    public ProcessFileType DmsFileType { get; set; }
    public string Email { get; set; }
    public string Filename { get; set; }
}
But the function definition in my Functions.cs file declared parameters whose names didn't match the properties of the queue message class it was waiting for:
public static async Task LogAndLoadFile(
    [QueueTrigger(Queues.SomeQueueName)] ProcessFileArgs processFileArgs,
    string dealid,
    string emailaddress,   // no ProcessFileArgs property named 'emailaddress' (it's Email)
    string file,           // no ProcessFileArgs property named 'file' (it's Filename)
    [Blob("{fileFolder}/{Filename}", FileAccess.Read)] Stream input,
    TextWriter log,
    CancellationToken cancellationToken)
{
    // ...
}
So if you run into this problem, check to make sure the parameter and attribute names match.
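For reference, a sketch of the corrected signature under that rule: each plain parameter is renamed to match a property of ProcessFileArgs so the SDK can bind it from the queue message (the queue name comes from the question; the blob path is illustrative and assumes DealId yields a valid container name):
public static async Task LogAndLoadFile(
    [QueueTrigger(Queues.SomeQueueName)] ProcessFileArgs processFileArgs,
    string dealId,     // binds to ProcessFileArgs.DealId
    string email,      // binds to ProcessFileArgs.Email
    string filename,   // binds to ProcessFileArgs.Filename
    [Blob("{DealId}/{Filename}", FileAccess.Read)] Stream input,
    TextWriter log,
    CancellationToken cancellationToken)
{
    // ...
}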


Trying to follow .AddService & .UseService pattern

In my Minimal API, I use and integrate with Kofax TotalAgility WCF endpoints. I wanted to implement this integration properly, so I added a remote assembly containing the WCF contract along with the service interface and implementation:
Service Interface:
public interface IKofaxService
{
    public Task<string> CreateJob(long letterId);
    public Task ActionHandler(PortalActionRequest request);
}
Service implementation:
public class KofaxService : IKofaxService
{
    private readonly ILogger<KofaxService> logger;
    private readonly KofaxSetup config;
    private readonly KtaJob.IJobService jobService;
    private readonly KtaActivity.IActivityService activityService;

    public KofaxService(ILogger<KofaxService> inLogger, KofaxSetup inConfig)
    {
        logger = inLogger;

        // Here is the problem: the constructor's parameter should be IOptions<KofaxSetup>
        // instead of just KofaxSetup, and the line below would become:
        // config = inConfig.Value;
        config = inConfig;

        // WCF-generated stuff within this remote assembly
        jobService = new KtaJob.JobServiceClient(GetBinding(), GetEndpointAddress(config.KtaUrlApiJob));
        activityService = new KtaActivity.ActivityServiceClient(GetBinding(), GetEndpointAddress(config.KtaUrlApiActivity));
    }

    public async Task<string> CreateJob(long letterId)
    {
        ...
    }

    public async Task ActionHandler(PortalActionRequest request)
    {
        ...
    }
}
In order to have a Services.AddKofaxTotalAgility() style fluent API, I added the following extension method (in the remote assembly):
Service extension method:
public static class ServiceCollectionExtensions
{
    public static IServiceCollection AddKofaxTotalAgility(this IServiceCollection services)
    {
        services.AddScoped<IKofaxService, KofaxService>();
        return services;
    }
}
Also in the remote assembly, I have a class representing the settings object from the appSettings section:
Config class:
public class KofaxSetup
{
    public string KtaUrlApiActivity { get; set; } = string.Empty;
    public string KtaUrlApiJob { get; set; } = string.Empty;
    public string SessionId { get; set; } = string.Empty;
    public string ProcessId { get; set; } = string.Empty;
}
Back in the Minimal API project, I added a reference to the remote assembly and also have the settings in appSettings.json file:
appSettings.json:
{
  ...
  "KofaxSetup": {
    "KtaUrlApiActivity": "https://kofax.somewhere.com/TotalAgility/Services/SDK/ActivityService.svc",
    "KtaUrlApiJob": "https://kofax.somewhere.com/TotalAgility/Services/SDK/JobService.svc",
    "SessionId": "7DB87F70018D4770BF6114B1C9BA6041",
    "ProcessId": "66EC6EED5D024E7AB0013D60F7A04A1A"
  },
  ...
}
Lastly, modifications to Program.cs are as follows:
Minimal API Program.cs
var builder = WebApplication.CreateBuilder(args);
...
// Bind the KofaxSetup object from the appSettings section
builder.Services.Configure<KofaxSetup>(builder.Configuration.GetSection(nameof(KofaxSetup)));
...
// Add the service to DI
builder.Services.AddKofaxTotalAgility();
...
All of this just results in the following exception at startup, thrown at var app = builder.Build():
System.AggregateException: 'Some services are not able to be constructed (Error while validating the service descriptor 'ServiceType: DACRL.Integrations.Kofax.IKofaxService Lifetime: Scoped ImplementationType: DACRL.Integrations.Kofax.KofaxService': Unable to resolve service for type 'DACRL.Integrations.Kofax.Configs.KofaxSetup' while attempting to activate 'DACRL.Integrations.Kofax.KofaxService'.) (Error while validating the service descriptor 'ServiceType: DACRL.Application.Core.Services.ILetterService Lifetime: Transient ImplementationType: DACRL.Api.Services.LetterService': Unable to resolve service for type 'DACRL.Integrations.Kofax.Configs.KofaxSetup' while attempting to activate 'DACRL.Integrations.Kofax.KofaxService'.) (Error while validating the service descriptor 'ServiceType: DACRL.Application.Core.Services.ILetterService Lifetime: Transient ImplementationType: DACRL.Api.Services.LetterService': Unable to resolve service for type 'DACRL.Integrations.Kofax.Configs.KofaxSetup' while attempting to activate 'DACRL.Integrations.Kofax.KofaxService'.)'
1/2:
InvalidOperationException: Error while validating the service descriptor 'ServiceType: DACRL.Integrations.Kofax.IKofaxService Lifetime: Scoped ImplementationType: DACRL.Integrations.Kofax.KofaxService': Unable to resolve service for type 'DACRL.Integrations.Kofax.Configs.KofaxSetup' while attempting to activate 'DACRL.Integrations.Kofax.KofaxService'.
2/2:
InvalidOperationException: Unable to resolve service for type 'DACRL.Integrations.Kofax.Configs.KofaxSetup' while attempting to activate 'DACRL.Integrations.Kofax.KofaxService'.
Note that ILetterService is working properly; it's the service that internally receives the IKofaxService from DI via its constructor. I'm thinking the error has something to do with the KofaxSetup object.
Is there a best practice that I'm missing here? Am I supposed to have a parameterless constructor somewhere? Is the ILogger<KofaxService> injection within the service's implementation not valid?
I actually sorted the issue out but didn't want to waste a well-written question.
The problem was, in fact, the KofaxSetup class. I was receiving it directly as its own type in the service's constructor. I had to use IOptions<KofaxSetup> instead to solve the issue.
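For completeness, here's a sketch of the corrected constructor; it mirrors the comment already embedded in the code above (IOptions<T> comes from Microsoft.Extensions.Options):
public KofaxService(ILogger<KofaxService> inLogger, IOptions<KofaxSetup> inConfig)
{
    logger = inLogger;
    config = inConfig.Value; // unwrap the wrapper that builder.Services.Configure<KofaxSetup>(...) registers

    jobService = new KtaJob.JobServiceClient(GetBinding(), GetEndpointAddress(config.KtaUrlApiJob));
    activityService = new KtaActivity.ActivityServiceClient(GetBinding(), GetEndpointAddress(config.KtaUrlApiActivity));
}
Alternatively, if you'd rather keep injecting the bare KofaxSetup type, you can register it explicitly, e.g. services.AddSingleton(sp => sp.GetRequiredService<IOptions<KofaxSetup>>().Value), since Configure<T>() only registers IOptions<T> and its siblings, not T itself.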

There is no implicit reference conversion from table to ITableEntity in Azure Function

I am writing my first Azure Function and Azure Table code. I am running into an issue when I write the Get query function. I have the following code that tries to get all the jobs from the table.
public static class GetJobStatus
{
    [FunctionName("GetJobStatus")]
    public static async Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Anonymous, "get", Route = null)] HttpRequest req,
        [Table("JobTable")] CloudTable jobTable,
        ILogger log)
    {
        log.LogInformation("Get job status.");

        string jobId = req.Query["jobid"];

        TableQuery<JobTable> query = new TableQuery<JobTable>();
        var segment = await jobTable.ExecuteQuerySegmentedAsync(query, null);
        var data = segment.Select(JobExtension.ToJob);

        return new OkObjectResult("");
    }
}
But I get compile-time errors on these statements:
TableQuery<JobTable> query = new TableQuery<JobTable>();
var segment = await jobTable.ExecuteQuerySegmentedAsync(query, null);
The actual error messages, which appear on hover, were included as screenshots: the conversion error from the title on the TableQuery<JobTable> line, and a corresponding error on the ExecuteQuerySegmentedAsync method.
My JobTable implements ITableEntity (from Azure.Data.Tables):
public class JobTable : ITableEntity
{
    public string Id { get; set; }
    public DateTime CreatedTime { get; set; }
    public JobRequest Request { get; set; }

    // ITableEntity members
    public virtual string PartitionKey { get; set; } = "Job";
    public virtual string RowKey { get => Id; set => Id = value; }
    public DateTimeOffset? Timestamp { get; set; }
    public ETag ETag { get; set; }
}
I have the relevant NuGet packages installed (listed in a screenshot in the original post).
I was trying to implement this from this article, but it uses older NuGet packages and I was running into trouble.
Update #1:
As per the suggestion from Gaurav Mantri, to be consistent I removed Azure.Data.Tables and started using Microsoft.WindowsAzure.Storage.Table. That fixed the compile-time errors, but now I get the following runtime error:
Microsoft.Azure.WebJobs.Host: Error indexing method 'GetJobStatus'. Microsoft.Azure.WebJobs.Extensions.Tables: Can't bind Table to type 'Microsoft.WindowsAzure.Storage.Table.CloudTable'.
Update #2:
I couldn't make it work, so I reverted all my code and references to use Microsoft.Azure.Cosmos.Table as described in the article I was referencing. Everything works as expected now, but I would still like to see how I can use the newer libraries. The original issue I was asking about was solved by Gaurav's suggestion, so I will accept that answer for now.
I believe you are running into this issue because you are using two different SDKs: Azure.Data.Tables and Microsoft.WindowsAzure.Storage.Table.
Your JobTable entity implements ITableEntity from Azure.Data.Tables, yet you are using it with a CloudTable from Microsoft.WindowsAzure.Storage.Table.
Can you try removing the Azure.Data.Tables package and using just Microsoft.WindowsAzure.Storage.Table?
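To illustrate, here's a sketch of the entity reworked for a single SDK (assuming Microsoft.WindowsAzure.Storage.Table; Microsoft.Azure.Cosmos.Table from the article exposes the same types). Deriving from that SDK's TableEntity base class gives you PartitionKey, RowKey, Timestamp and ETag for free, and TableQuery<JobTable> then compiles because JobTable satisfies the SDK's own ITableEntity constraint:
using Microsoft.WindowsAzure.Storage.Table;

public class JobTable : TableEntity
{
    public JobTable()
    {
        PartitionKey = "Job"; // fixed partition, as in the original entity
    }

    // RowKey (inherited from TableEntity) plays the role of the old Id property
    public DateTime CreatedTime { get; set; }
}
One caveat: complex properties such as the original JobRequest are not persisted as table columns by this SDK without custom serialization; only primitive property types round-trip automatically.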

Access SignalR Hub without Constructor Injection

With AspNetCore.SignalR (1.0.0-preview1-final) and AspNetCore.All (2.0.6), how can I invoke a method on a hub from server code that is not directly in a Controller and is in a class that cannot be created via dependency injection?
Most examples assume the server code is in a Controller and should 'ask' for the hub via an injectable parameter in a class that will be created by DI.
I want to be able to call the hub's method from server code at any time, in code that is not injected. The old SignalR had a GlobalHost that enabled this approach. Basically, I need the hub to be a global singleton.
Now everything seems to be dependent on dependency injection, which introduces a dependency that I don't want!
I've seen this request voiced in a number of places, but haven't found a working solution.
Edit
To be more clear, all I need is to be able to later access the hubs that I've registered in the Configure routine of the Startup class:
app.UseSignalR(routes =>
{
    routes.MapHub<PublicHubCore>("/public");
    routes.MapHub<AnalyzeHubCore>("/analyze");
    routes.MapHub<ImportHubCore>("/import");
    routes.MapHub<MainHubCore>("/main");
    routes.MapHub<FrontDeskHubCore>("/frontdesk");
    routes.MapHub<RollCallHubCore>("/rollcall");
    // etc.
});
If I register them like this:
services.AddSingleton<IPublicHub, PublicHubCore>();
it doesn't work, since I get back an uninitialized Hub.
No, it's not possible. See the "official" answer from David Fowler: https://github.com/aspnet/SignalR/issues/1831#issuecomment-378285819
How to inject your hub context: the best solution is to inject a hub context like IHubContext<TheHubWhichYouNeedThere> hubContext into the constructor.
See for more details:
Call SignalR Core Hub method from Controller
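For completeness, a minimal sketch of that pattern, using one of the hubs from the question (the controller and client method names are illustrative):
public class NotificationsController : Controller
{
    private readonly IHubContext<PublicHubCore> _hubContext;

    public NotificationsController(IHubContext<PublicHubCore> hubContext)
    {
        _hubContext = hubContext;
    }

    public async Task<IActionResult> Notify(string message)
    {
        // broadcast to every client connected to PublicHubCore
        await _hubContext.Clients.All.SendAsync("notify", message);
        return Ok();
    }
}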
Thanks to those who helped with this. Here's what I've ended up with for now...
In my project, I can call something like this from anywhere:
Startup.GetService<IMyHubHelper>().SendOutAlert(2);
To make this work, I have these extra lines in Startup.cs to give me easy access to the dependency injection service provider (unrelated to SignalR):
public static IServiceProvider ServiceProvider { get; private set; }

public static T GetService<T>() { return ServiceProvider.GetRequiredService<T>(); }

public void Configure(IServiceProvider serviceProvider)
{
    ServiceProvider = serviceProvider;
}
The normal SignalR setup calls for:
public void Configure(IApplicationBuilder app)
{
    // merge with the existing Configure routine
    app.UseSignalR(routes =>
    {
        routes.MapHub<MyHub>("/myHub");
    });
}
I don't want all my code to have to invoke the raw SignalR methods directly, so I make a helper class for each hub. I register those helpers in the DI container:
public void ConfigureServices(IServiceCollection services)
{
    services.AddSingleton<IMyHubHelper, MyHubHelper>();
}
Here's how I made the MyHub set of classes:
using Microsoft.AspNetCore.SignalR;
using System.Threading.Tasks;

public class MyHub : Hub { }

public interface IMyHubHelper
{
    void SendOutAlert(int alertNumber);
}

public class MyHubHelper : IMyHubHelper
{
    public IHubContext<MyHub> HubContext { get; }

    public MyHubHelper(IHubContext<MyHub> hubContext)
    {
        HubContext = hubContext;
    }

    public void SendOutAlert(int alertNumber)
    {
        // do anything you want to do here, this is just an example;
        // IAlertGenerator and GetAlertMessage are illustrative placeholders
        var msg = Startup.GetService<IAlertGenerator>().GetAlertMessage(alertNumber);

        HubContext.Clients.All.SendAsync("serverAlert", alertNumber, msg);
    }
}
This is a nice solution. However, in .NET Core 2.1 the service provider is disposed and you get 'Cannot access a disposed object'. The fix is to create a scope:
public void Configure(IApplicationBuilder app, IHostingEnvironment env, IServiceProvider serviceProvider)
{
    ServiceProvider = serviceProvider.CreateScope().ServiceProvider;
}

ETW events in Azure diagnostics (SDK 2.5) are logged with incorrect / missing schema

I upgraded to Azure SDK 2.5 and switched to semantic logging with EventSources.
Logging works locally with a custom EventListener.
When deployed, logs are written to a storage table, but only the EventId, Pid, Tid, etc. are populated; the really interesting fields (Message, Task, Keyword, Opcode) are left blank.
The diagnostics infrastructure log is full of errors with regard to ETW, but I don't know what to make of them:
Failed to load backup EventSource manifest file C:\Resources\{13b7ec61-6424-d4d3-9972-a83e58d8d6bb}\directory\f71b19461fcf494d89d3717b3a13cadf.something.WorkerRole.DiagnosticStore\WAD0103\Configuration\EventSource_Manifest_fe06b63d-39aa-5419-0529-18c4dacf4f68_Ver_20.backup.xml;
EventSource events will be logged without a proper schema until provider sends the manifest packets
Load manifest file failed for C:\Resources\{13b7ec61-6424-d4d3-9972-a83e58d8d6bb}\directory\f71b19461fcf494d89d3717b3a13cadf.something.WorkerRole.DiagnosticStore\WAD0103\Configuration\EventSource_Manifest_fe06b63d-39aa-5419-0529-18c4dacf4f68_Ver_20.xml
Failed to manage manifest version for file C:\Resources\{13b7ec61-6424-d4d3-9972-a83e58d8d6bb}\directory\f71b19461fcf494d89d3717b3a13cadf.something.WorkerRole.DiagnosticStore\WAD0103\Configuration\EventSource_Manifest_fe06b63d-39aa-5419-0529-18c4dacf4f68_Pid_3436.xml
Failed to process EventSource manifest event GUID:fe06b63d-39aa-5419-0529-18c4dacf4f68, event id:0xFFFE
Change in the number of events lost since the last sample: EventsCaptured=2 EventsLogged=1 EventsLost=0
I do not use a manifest file and specify the EventSource via class / attribute name:
<EtwEventSourceProviderConfiguration scheduledTransferPeriod="PT3M" scheduledTransferLogLevelFilter="Information" provider="something.Core">
  <DefaultEvents eventDestination="CoreEvents" />
</EtwEventSourceProviderConfiguration>
I must be missing something, but I do not know what.
The remaining diagnostic services all work (infrastructure logs, performance counters, etc.).
The EventId that is being logged is the correct one, but all the important information of the log entry is missing, I suppose because of an incomplete configuration?
Edit: here is my EventSource code. I won't post the entire thing because it's quite large. I use another type that calls the EventSource methods and handles formatting of parameters (if the source is enabled at that level). Most method arguments are of type string; there are no objects or other complex types passed around (the wrapper type takes care of that).
[EventSource(Name = "something.Core")]
public sealed class CoreEventSource : EventSource
{
    private static readonly CoreEventSource SoleInstance = new CoreEventSource();

    static CoreEventSource() { }
    private CoreEventSource() { }

    public static CoreEventSource Instance
    {
        get { return SoleInstance; }
    }

    public static EventKeywords AllKeywords = (EventKeywords)(-1);

    public class Keywords
    {
        public const EventKeywords None = (EventKeywords)(1 << 1);
        public const EventKeywords Infrastructure = (EventKeywords)(1 << 2);
        [...]
    }

    public class Tasks
    {
        public const EventTask None = EventTask.None;

        // generic operations
        public const EventTask Create = (EventTask)11;
        public const EventTask Update = (EventTask)12;
        public const EventTask Delete = (EventTask)13;
        public const EventTask Get = (EventTask)14;
        public const EventTask Put = (EventTask)15;
        public const EventTask Remove = (EventTask)16;
        public const EventTask Process = (EventTask)17;
    }

    [Event(1, Message = "Initialization of {0} failed: {1}.", Level = EventLevel.Critical, Keywords = Keywords.Infrastructure)]
    public void CriticalInitializationFailure(string component, string details, string exception)
    {
        this.WriteEvent(1, component, details, exception);
    }

    [Event(2, Message = "[Role '{0}'] Startup: {1}", Level = EventLevel.Informational, Keywords = Keywords.Infrastructure)]
    public void RoleStartup(string roleName, string message)
    {
        this.WriteEvent(2, roleName, message);
    }

    [Event(3, Message = "[Role '{0}'] Stop failed: {1}.", Level = EventLevel.Error, Keywords = Keywords.Infrastructure)]
    public void RoleStopFailed(string roleName, string details, string exception)
    {
        this.WriteEvent(3, roleName, details, exception);
    }

    [Event(4, Message = "An unhandled exception occurred.", Level = EventLevel.Critical, Keywords = Keywords.Infrastructure)]
    public void UnhandledException(string exception)
    {
        this.WriteEvent(4, exception);
    }

    [Event(5, Message = "An unobserved exception occurred in a faulted task.", Level = EventLevel.Critical, Keywords = Keywords.Infrastructure)]
    public void UnobservedTaskException(string exception)
    {
        this.WriteEvent(5, exception);
    }

    [...]
}
Turns out there were quite a few problems with my EventSource. The first thing I'd recommend to anyone working with ETW is to use the Microsoft TraceEvent library from NuGet, even if you use System.Diagnostics.Tracing, because it comes with a tool that will verify your EventSource code and notify you about problems.
I had to fix the following:
- EventSource names must not contain a period (.)
- Task/Opcode pairs must be unique within an EventSource
- One must not declare a None field in a custom Keywords or Tasks enumeration
Hope this is of some use to anyone who encounters a similar problem.
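As a sketch, the fixed declarations would look something like this (a hyphenated name is one common way to avoid the period; note that the provider attribute in the diagnostics configuration above would then need to be updated to match the new name):
[EventSource(Name = "something-Core")]
public sealed class CoreEventSource : EventSource
{
    public class Keywords
    {
        // no None member; start directly with real keywords
        public const EventKeywords Infrastructure = (EventKeywords)(1 << 2);
    }

    public class Tasks
    {
        // likewise no None member, and each task value is unique
        public const EventTask Create = (EventTask)11;
        public const EventTask Update = (EventTask)12;
    }

    // ... events as before ...
}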
Another thing that should be taken care of (this is what fixed our case): EventSources should only have a Name or a Guid, not both.
In our case, having both caused:
- the EtwEventSourceProvider to not log anything
- the EtwEventManifestProvider to log the same way you outlined, with empty data points

Spec fails when run by mspec.exe, but passes when run by TD.NET

I wrote about this topic in another question.
However, I've since refactored my code to get rid of configuration access, thus allowing the specs to pass. Or so I thought. They run fine from within Visual Studio using TestDriven.Net. However, when I run them during rake using the mspec.exe tool, they still fail with a serialization exception. So I've created a completely self-contained example that does basically nothing except set up fake security credentials on the thread. This test passes just fine in TD.Net, but blows up in mspec.exe. Does anybody have any suggestions?
Update: I've discovered a workaround. After researching the issue, it seems the cause is that the assembly containing my principal object is not in the same folder as mspec.exe. When mspec creates a new AppDomain to run my specs, that new AppDomain has to load the assembly containing the principal object in order to deserialize it. That assembly is not in the same folder as the mspec executable, so it fails. If I copy my assembly into the same folder as mspec.exe, it works fine.
What I still don't understand is why ReSharper and TD.Net can run the test just fine. Do they not use mspec.exe to actually run the tests?
using System;
using System.Security.Principal;
using System.Threading;
using Machine.Specifications;

namespace MSpecTest
{
    [Subject(typeof(MyViewModel))]
    public class When_security_credentials_are_faked
    {
        static MyViewModel SUT;

        Establish context = SetupFakeSecurityCredentials;

        Because of = () =>
            SUT = new MyViewModel();

        It should_be_initialized = () =>
            SUT.Initialized.ShouldBeTrue();

        static void SetupFakeSecurityCredentials()
        {
            Thread.CurrentPrincipal = CreatePrincipal(CreateIdentity());
        }

        static MyIdentity CreateIdentity()
        {
            return new MyIdentity(Environment.UserName, "None", true);
        }

        static MyPrincipal CreatePrincipal(MyIdentity identity)
        {
            return new MyPrincipal(identity);
        }
    }

    public class MyViewModel
    {
        public MyViewModel()
        {
            Initialized = true;
        }

        public bool Initialized { get; set; }
    }

    [Serializable]
    public class MyPrincipal : IPrincipal
    {
        private readonly MyIdentity _identity;

        public MyPrincipal(MyIdentity identity)
        {
            _identity = identity;
        }

        public bool IsInRole(string role)
        {
            return true;
        }

        public IIdentity Identity
        {
            get { return _identity; }
        }
    }

    [Serializable]
    public class MyIdentity : IIdentity
    {
        private readonly string _name;
        private readonly string _authenticationType;
        private readonly bool _isAuthenticated;

        public MyIdentity(string name, string authenticationType, bool isAuthenticated)
        {
            _name = name;
            _isAuthenticated = isAuthenticated;
            _authenticationType = authenticationType;
        }

        public string Name
        {
            get { return _name; }
        }

        public string AuthenticationType
        {
            get { return _authenticationType; }
        }

        public bool IsAuthenticated
        {
            get { return _isAuthenticated; }
        }
    }
}
Dan, thank you for providing a reproduction.
First off, the console runner works differently from the TestDriven.NET and ReSharper runners. Basically, the console runner has to perform a lot more setup work, in that it creates a new AppDomain (plus configuration) for every assembly that is run. This is required to load the .dll.config file for your spec assembly.
Per spec assembly, two AppDomains are created:
- The first AppDomain (Console) is created implicitly when mspec.exe is executed,
- a second AppDomain is created by mspec.exe for the assembly containing the specs (Spec).
Both AppDomains communicate with each other through .NET Remoting: for example, when a spec is executed in the Spec AppDomain, it notifies the Console AppDomain of that fact. When Console receives the notification, it acts accordingly by writing the spec information to the console.
This communication between Spec and Console is realized transparently through .NET Remoting. One property of .NET Remoting is that some properties of the calling AppDomain (Spec) are automatically included when sending notifications to the target AppDomain (Console). Thread.CurrentPrincipal is such a property. You can read more about that here: http://sontek.vox.com/library/post/re-iprincipal-iidentity-ihttpmodule-serializable.html
The context you provide will run in the Spec AppDomain. You set Thread.CurrentPrincipal in the Establish. After the Because has run, a notification is issued to the Console AppDomain. The notification includes your custom MyPrincipal, which the receiving Console AppDomain tries to deserialize. It cannot do that, since it doesn't know about your spec assembly (it is not in its private bin path).
This is why you had to put your spec assembly in the same folder as mspec.exe.
There are two possible workarounds:
- Derive MyPrincipal and MyIdentity from MarshalByRefObject so that they can take part in cross-AppDomain communication through a proxy instead of being serialized (see the sketch after the example below)
- Set Thread.CurrentPrincipal transiently in the Because
Because of = () =>
{
    var previousPrincipal = Thread.CurrentPrincipal;
    try
    {
        Thread.CurrentPrincipal = new MyPrincipal(...);
        SUT = new MyViewModel();
    }
    finally
    {
        Thread.CurrentPrincipal = previousPrincipal;
    }
};
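For comparison, a sketch of the first workaround: deriving from MarshalByRefObject means the Console AppDomain talks to the principal through a proxy instead of deserializing it, so it never needs to load your spec assembly:
public class MyPrincipal : MarshalByRefObject, IPrincipal
{
    private readonly MyIdentity _identity;

    public MyPrincipal(MyIdentity identity)
    {
        _identity = identity;
    }

    public bool IsInRole(string role)
    {
        return true;
    }

    public IIdentity Identity
    {
        get { return _identity; }
    }
}

// MyIdentity gets the same treatment; [Serializable] is then no longer required on either type.
public class MyIdentity : MarshalByRefObject, IIdentity
{
    // members unchanged from the question's version
}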
ReSharper, for example, handles all the communication work for us. MSpec's ReSharper runner can hook into the existing infrastructure (which, AFAIK, does not use .NET Remoting).
