I have an MVC website that gets published to Azure, where it uses an Azure SQL Database.
The time has come when we need to run a scheduled task to send SMS reminders. I was under the impression that Azure WebJobs was a good fit for this, but I am having some issues getting it up and running.
I have added a console app to my website solution and referenced my EF data model from the console app (which I would like to publish as a web job).
The current console app looks as follows:
class Program
{
    static void Main(string[] args)
    {
        JobHost host = new JobHost();
        host.RunAndBlock();
    }

    public static void ProcessNotifications()
    {
        var uow = new KirkleesDatabase.DAL.UnitOfWork();
        uow.CommunicationRepository.SendPALSAppointmentReminders();
    }
}
Running the console app will then throw the exception:
Additional information: User account connection string is missing. This can be set via the 'AzureJobsData' connection string or via the constructor.
This suggests that the Web Job is specifically looking for a connection string that points at a storage account. However, I would like the web job to query an Azure database rather than work with storage.
Is this doable?
Thanks,
As Victor wrote, the SDK events are triggered by blobs, queues and tables.
A simple solution for your need: write a console application with a polling approach. No SDK needed. Details at the beginning of http://blog.amitapple.com/post/73574681678/git-deploy-console-app/
while (true)
{
    if (IsThereWorkToDo()) // your own check, e.g. query the database for pending reminders
    {
        var uow = new KirkleesDatabase.DAL.UnitOfWork();
        uow.CommunicationRepository.SendPALSAppointmentReminders();
    }

    Thread.Sleep(10000); // set an appropriate sleep interval
}
Unfortunately the WebJobs SDK does not support SQL databases as triggers. For triggers, it only supports Azure Storage (blobs, queues and tables).
You can access Azure SQL Databases from the web job as demonstrated in this answer.
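As a sketch of that SQL access (the connection-string name "KirkleesDatabase" here is purely an assumption for illustration), the console app can read its Azure SQL connection string from configuration and open it directly; no storage account is involved:

```csharp
using System.Configuration;
using System.Data.SqlClient;

class SqlAccessSketch
{
    static void Main()
    {
        // "KirkleesDatabase" is a hypothetical connection-string name;
        // define it under <connectionStrings> in App.config, or in the
        // Connection Strings section of the Azure portal for the site.
        string connectionString = ConfigurationManager
            .ConnectionStrings["KirkleesDatabase"].ConnectionString;

        using (var connection = new SqlConnection(connectionString))
        {
            connection.Open();
            // Run the reminder queries (or the EF UnitOfWork code from
            // the question) against the database here.
        }
    }
}
```

The same connection string can also back the EF data model shown in the question, since EF reads it from the same configuration section.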
To create a web job you don't have to use the webjobs sdk, you can use several types of executables (.exe, .cmd, .js, .php, .py, ...).
Try using this new visual studio add-on: http://visualstudiogallery.msdn.microsoft.com/f4824551-2660-4afa-aba1-1fcc1673c3d0, follow the steps there to link between your web application and your console application which will be deployed as a webjob when published to your Azure site.
I want to know if there is a way to see all the requests made to my Service Fabric application while it is published on Azure servers, or at least to capture all requests and save them somewhere. I need information like the source and body.
Thanks in advance!
To see all the requests and responses, you first need to log them in somewhere. Here are the available approaches:
Streaming into VS Cloud Explorer
You could use ServiceEventSource to log the valuable information, and then you will be able to see it by attaching to your SF cluster via Cloud Explorer in VS. You can find more info in Debug your Service Fabric application by using Visual Studio.
Windows Azure Diagnostics
The WAD extension, which you can install on your VMs, uploads logs to Azure Storage, and also has the option to send logs to Azure Application Insights or Event Hubs. Check out Event aggregation and collection using Windows Azure Diagnostics.
EventFlow
Using EventFlow allows you to have services send their logs directly to an analysis and visualization platform, and/or to storage. Other libraries (ILogger, Serilog, etc.) might be used for the same purpose, but EventFlow has the benefit of having been designed specifically for in-process log collection and to support Service Fabric services.
Event analysis and visualization with OMS
When OMS is configured, you will have access to a specific OMS workspace, from where data can be queried or visualized in dashboards.
After data is received by Log Analytics, OMS has several Management Solutions that are prepackaged solutions to monitor incoming data, customized to several scenarios. These include a Service Fabric Analytics solution and a Containers solution, which are the two most relevant ones to diagnostics and monitoring when using Service Fabric clusters. Find more info on Event analysis and visualization with OMS and Assess Service Fabric applications and micro-services with the Azure portal.
There are a number of ways you could capture the source and body, or whatever else you need. Below are a couple:
Apply an ActionFilterAttribute to your controller class if you don't have one, and log all the information you need within its OnActionExecuted method.
Add middleware in the Startup class:
public static void ConfigureApp(IAppBuilder appBuilder)
{
    // Configure Web API for self-host.
    HttpConfiguration config = new HttpConfiguration();
    config.Routes.MapHttpRoute(
        name: "DefaultApi",
        routeTemplate: "api/{controller}/{id}",
        defaults: new { id = RouteParameter.Optional }
    );

    appBuilder.Use(async (IOwinContext context, Func<Task> next) =>
    {
        await next.Invoke();
        // log anything you want here
        ServiceEventSource.Current.Message($"Response status code = {context.Response.StatusCode}");
    });

    appBuilder.UseWebApi(config);
}
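The other option above, an action filter, might look like the following sketch (RequestLoggingFilter is a hypothetical name; it assumes the same ServiceEventSource used elsewhere in the service):

```csharp
using System.Web.Http.Filters;

public class RequestLoggingFilter : ActionFilterAttribute
{
    public override void OnActionExecuted(HttpActionExecutedContext actionExecutedContext)
    {
        var request = actionExecutedContext.Request;

        // Log whatever you need about the request/response pair here;
        // to capture the body as well, read request.Content.
        ServiceEventSource.Current.Message(
            $"{request.Method} {request.RequestUri} -> {actionExecutedContext.Response?.StatusCode}");
    }
}
```

Register it globally with config.Filters.Add(new RequestLoggingFilter()); in ConfigureApp, or apply it as an attribute on a controller class.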
I have a web app built using ASP.NET 4.5/C# that is hosted in Azure as a Web App. The site allows users to upload PDF files, which are then stored in an Azure blob container and can later be downloaded via the website as needed. So far so good, and everything was working fine.
We now have a new requirement which involves processing these files using a custom Win32 executable, and the website must know whether the processing was successful. This exe has a setup file and must be installed on the target machine before it can be used.
I have been scratching my head over how to architect this feature. I have come across many articles which say that a Worker Role or a VM is needed, but they all seem very abstract.
Given that the installer for the executable requires manual intervention, I am thinking an Azure VM is the way to go. But how will the web app communicate with it? How do I notify the web app of the result of the process?
You cannot install such software in an Azure Web App, since web apps are sandboxed. As such, you won't be able to run that setup exe.
For such processing, you'd need to run that part of your app in a Virtual Machine or web/worker role.
But how will the web app communicate with it? How do I notify the web app of the result of the process?
Azure Queue storage could meet your requirement. It can provide cloud messaging between application components. Your VM could write the process result to a queue and your web app could read the process result from the same queue.
To add a new message to a queue, you could reference the following code:
// Retrieve the storage account from the connection string.
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(
    CloudConfigurationManager.GetSetting("StorageConnectionString"));

// Create the queue client.
CloudQueueClient queueClient = storageAccount.CreateCloudQueueClient();

// Retrieve a reference to a queue.
CloudQueue queue = queueClient.GetQueueReference("myqueue");

// Create the queue if it doesn't already exist.
queue.CreateIfNotExists();

// Create a message and add it to the queue.
CloudQueueMessage message = new CloudQueueMessage("Hello, World");
queue.AddMessage(message);
In your Web App, you could create a QueueTrigger WebJob; the job will be executed as soon as a new message is added to the queue.
public static void ProcessQueueMessage([QueueTrigger("myqueue")] string processResult, TextWriter log)
{
    // You can get the processResult and do anything needed here.
}
I am using Azure WebJobs in a system to process XML data, received in real time via a web service, that is queued for later processing. Some of the WebJob functions are invoked at quite a high frequency (hundreds of times per minute). When I first trialed the system the logging seemed to perform well. However, now that several weeks' worth of log data has accumulated, it seems to stop updating and fairly constantly displays "indexing in process".
How do I 'purge' or clear-out the logs?
Can I and should I turn off logging selectively for the frequently updated jobs? How can this be achieved?
My WebJobs are continuous and use the C# API. My question isn't the same as Azure webjobs output logs indexing taking very long, although that answer is also relevant; I was specifically asking how to purge the logs and turn off logging selectively.
You would have configured the AzureWebJobsStorage application setting or connection string in your WebJob. The logs are stored in the blob storage of this storage account. You should be able to manually clear them up there.
Assuming that you are using the Azure WebJobs SDK, you can plug in a custom logger
JobHostConfiguration config = new JobHostConfiguration();
config.Tracing.Tracers.Add(new CustomTraceWriter(TraceLevel.Info));
JobHost host = new JobHost(config);
host.RunAndBlock();
CustomTraceWriter can wrap all writes within a check for an application setting
if (bool.Parse(CloudConfigurationManager.GetSetting("EnableWebJobLogging")))
{
    ....
}
(GetSetting returns a string, so it needs to be parsed to a bool.)
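A minimal sketch of such a CustomTraceWriter, deriving from the WebJobs SDK's TraceWriter base class ("EnableWebJobLogging" is this answer's hypothetical app setting name, not something the SDK defines):

```csharp
using System;
using System.Diagnostics;
using Microsoft.Azure;
using Microsoft.Azure.WebJobs.Host;

public class CustomTraceWriter : TraceWriter
{
    public CustomTraceWriter(TraceLevel level) : base(level) { }

    public override void Trace(TraceEvent traceEvent)
    {
        // "EnableWebJobLogging" is a hypothetical app setting name;
        // anything other than "true" silently drops the trace.
        bool enabled;
        if (!bool.TryParse(CloudConfigurationManager.GetSetting("EnableWebJobLogging"), out enabled) || !enabled)
        {
            return;
        }

        // Write wherever you like: console, a file, a telemetry service, etc.
        Console.WriteLine($"{traceEvent.Level}: {traceEvent.Message}");
    }
}
```

Because the setting is read on every trace, you can toggle logging for the running WebJob from the portal without redeploying.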
I have looked around the Azure portal and searched the net, but I have been unable to find an answer. Is there a way, perhaps via the api or powershell, to get metrics on webjobs? Such as average runtime per individual job? I would also like to find out the average queued time of a message that a webjob triggers from (though that is probably a storage metric not a webjob metric). Any pointers would be appreciated.
As Igorek said, I don't think it is possible either. There are many tools for monitoring applications; two of them have Azure integration:
Application Insights
New relic
I have used Application Insights to send metrics from a webjob. You can follow this tutorial to set up Application Insights in your webjob:
Application Insights on Windows Desktop apps, services and worker roles
If you want to calculate the time to process a message from a queue, you can do something like that:
public async Task ProcessAsync([ServiceBusTrigger("queueName")] BrokeredMessage incomingMessage)
{
    var stopwatch = Stopwatch.StartNew();

    // Process your message
    // ...

    stopwatch.Stop();

    // You should only instantiate the TelemetryClient once in your application.
    var telemetryClient = new TelemetryClient() { InstrumentationKey = "MyInstrumentationKey" };

    // Send your metric
    telemetryClient.TrackMetric("ProcessQueueMessageElapsedTime", stopwatch.ElapsedMilliseconds);
}
I don't think this is possible without 3rd-party services. In fact, the only one I know that does this specifically is CloudMonix, which I'm affiliated with.
I am developing an Azure application using queues, blob storage and SQL Azure. We anticipate that some clients will not be willing to have their data hosted in the cloud (for reasons of paranoia, or legal limitations on the jurisdiction in which data can be stored) and will want to run the system on a single server located within their own data centres. Using SQL Server and building an alternative to blob storage should be easy, but Azure queues are more complicated. I guess using the development fabric is undesirable because the MS documentation says it must run as administrator.
How should I go about this?
I would add a layer of abstraction over the AzureQueues.
Something like:
public interface IQueueService
{
    // will create if not exists
    IQueue GetQueue(string name);

    IQueue GetQueueIfExists(string name);
}

public interface IQueue
{
    string Name { get; set; }
    void AddMessage(SimpleMessage message);
    void DeleteMessage(SimpleMessage message);
    SimpleMessage PeekMessage();
    void Clear();
}
etc...
That should give you an idea. You can then provide two implementations: one that utilizes Azure queues and another that uses MSMQ (http://en.wikipedia.org/wiki/Microsoft_Message_Queuing).
You choose the implementation depending on whether you are running on Azure or not.
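That selection step could be sketched as follows (AzureQueueService and MsmqQueueService are the two hypothetical implementations of IQueueService described above; the Azure check uses RoleEnvironment.IsAvailable):

```csharp
using Microsoft.WindowsAzure.ServiceRuntime;

public static class QueueServiceFactory
{
    public static IQueueService Create()
    {
        // RoleEnvironment.IsAvailable is true only when running under
        // the Azure fabric, so on-premises installs fall back to MSMQ.
        if (RoleEnvironment.IsAvailable)
        {
            return new AzureQueueService();
        }

        return new MsmqQueueService();
    }
}
```

The rest of the application only ever sees IQueueService, so it doesn't need to know which environment it is running in.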
I have done something very similar in the past.
You don't need to run on the development fabric to access Azure resources. Blobs are very easy to access via the web, and I'm fairly certain you can do the same with tables and queues, as the "http://'accountname'.queue.core.windows.net/" URLs are publicly available.
For a neat solution you should look at the Azure AppFabric Service Bus. It basically allows you to connect, or "project", an on-premises app's web service endpoints into the cloud; it's essentially a relay service. (It sounds like magic, but it's actually pretty simple.) You can use the same Service Bus to give Azure Worker Role services public URL endpoints.
http://msdn.microsoft.com/en-us/library/ee732537.aspx
http://www.microsoft.com/windowsazure/appfabric/overview/default.aspx