I've run into a problem in my Azure project.
I'm trying to schedule a task:
on the 16th of every month, download an .xml file from a particular web site.
In my project I have one web role and one worker role.
I'm trying to schedule it so that on the 16th of every month the web role creates a message (connects to the site and downloads the .xml file) and inserts it into queue storage, and the worker role gets the message, processes it and deletes it.
Any suggestions?
Have you taken a look at this post - http://blog.smarx.com/posts/building-a-task-scheduler-in-windows-azure? It discusses one approach for building a task scheduler, which seems pretty much like what you're looking to do.
In your worker role you can start a timer, or use Quartz.NET: http://quartznet.sourceforge.net/
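A rough sketch of the Quartz.NET route, assuming the older synchronous 2.x API (the job class and the cron schedule are illustrative), with a trigger that fires at 03:00 on the 16th of every month:

```csharp
using Quartz;
using Quartz.Impl;

public class DownloadXmlJob : IJob
{
    public void Execute(IJobExecutionContext context)
    {
        // Connect to the site, download the .xml file and enqueue it for processing.
    }
}

public static class SchedulerSetup
{
    public static void Start()
    {
        IScheduler scheduler = StdSchedulerFactory.GetDefaultScheduler();
        scheduler.Start();

        IJobDetail job = JobBuilder.Create<DownloadXmlJob>().Build();

        // Cron fields: sec min hour day-of-month month day-of-week.
        // "0 0 3 16 * ?" = 03:00 on the 16th day of every month.
        ITrigger trigger = TriggerBuilder.Create()
            .WithCronSchedule("0 0 3 16 * ?")
            .Build();

        scheduler.ScheduleJob(job, trigger);
    }
}
```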
An Azure web role is a Windows server, and as such has the Task Scheduler built in, just like any other Windows server. You can use a startup task to add a scheduled task to the Task Scheduler on a web or worker role, which will then fire on the 16th of every month.
However, keep in mind that if you have multiple web/worker role instances, each one will have this scheduled task set up. So you'll need some way of knowing whether another instance has already taken care of the work, or the scheduled task will run once for each instance.
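One simple way to coordinate, sketched below under the assumption that you use the classic Microsoft.WindowsAzure.Storage client (the container and blob names are made up), is to have the scheduled program claim a month-stamped marker blob before doing the work; the conditional upload succeeds for exactly one instance:

```csharp
using System;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

public static class MonthlyTaskGate
{
    // Returns true only for the instance that manages to create this month's marker blob.
    public static bool TryClaimThisMonthsRun(string connectionString)
    {
        var container = CloudStorageAccount.Parse(connectionString)
            .CreateCloudBlobClient()
            .GetContainerReference("task-markers");          // illustrative name
        container.CreateIfNotExists();

        var marker = container.GetBlockBlobReference(
            "feed-download-" + DateTime.UtcNow.ToString("yyyy-MM"));

        try
        {
            // "If-None-Match: *" makes the upload fail if the blob already exists.
            marker.UploadText("claimed",
                accessCondition: AccessCondition.GenerateIfNoneMatchCondition("*"));
            return true;    // we won the race; do the download here
        }
        catch (StorageException ex)
        {
            if (ex.RequestInformation.HttpStatusCode == 409)
                return false;   // another instance already claimed this month's run
            throw;
        }
    }
}
```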
I am writing an Azure-hosted MVC website for a gym booking system. I need to be able to maintain membership expiry and suspensions, as well as gym class attendance (i.e. logging to the database if a session has been missed). Each of these tasks requires a "C# service function" to be run that will go through the database, perform some checks and update records as and when required.
I need this to run pretty regularly to ensure that missed sessions are logged ASAP. Should I be developing this as an Azure WebJob and running it continuously? Should I be doing it in another manner? If I could get some suggestions on routes to take, that would be massively appreciated.
Thanks
You have a few options: Web Jobs, Scheduler, and Worker Roles.
Web Jobs are a nice add-on to an existing Azure web app and have the benefit of no additional cost. Web Jobs use the Scheduler under the covers if you choose to run the Web Job at an interval other than continuously. Here is a nice answer that describes the differences between the two.
Worker Roles would be the next logical step up from a Web Job. Worker Roles are dedicated Cloud Service VMs that can provide more dedicated power and offer greater scaling capabilities. Worker Roles can also do much more than just run jobs.
For the application you have described, if you are already running on Azure App Services (Web App) it sounds like a continuously running Web Job would be the correct choice.
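As a minimal sketch of that route (a continuous WebJob is just a console app that Azure keeps running; the ten-minute interval and the helper methods below are illustrative placeholders):

```csharp
using System;
using System.Threading;

public class Program
{
    public static void Main()
    {
        // Azure runs this executable continuously and restarts it if it exits.
        while (true)
        {
            CheckMembershipExpiry();
            CheckSuspensions();
            LogMissedSessions();

            Thread.Sleep(TimeSpan.FromMinutes(10));
        }
    }

    static void CheckMembershipExpiry() { /* query and update the database */ }
    static void CheckSuspensions()      { /* ... */ }
    static void LogMissedSessions()     { /* flag sessions that were not attended */ }
}
```

If you prefer a cron-style schedule rather than a sleep loop, the WebJobs SDK's timer extension offers that with the same deployment model.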
Good day,
I would like to ask a conceptual question that has been nagging me for a while. There is probably no right or wrong answer here, but I hope to get a better understanding of my options.
Situation:
I have a Windows Azure Application. It consists of ONE Web Role with ONE instance and ONE worker role with TWO instances.
The main focus is on the worker role. It implements the Quartz.NET scheduler to perform a task of getting information from a CRM system, making a table out of it and uploading it to an FTP server, say, every 8 hours. Very simple, nothing fancy.
The web role is used for manually triggering the job if someone needs it to run between 8 hour intervals. Simple user interface with pretty much one button.
However, I would like to be able to change some configuration options of the worker role from the web role, for example the credentials of the destination FTP server and the schedule of the job (e.g. make it run hourly instead of every 8 hours). The configuration DOES NOT need to persist if the role goes offline. At the moment the config is kept in a static class.
This wouldn't seem a problem if I were running one worker role instance: I would send a message from the web role via a queue and change some static variables in the worker role, for example. But what confuses me is that a queue message can only be picked up by one role instance, not both at the same time. So I would end up with the job running every 8 hours on one instance and every hour on the other.
Is there any way to notify both instances that configuration needs to change?
There are several ways you could accomplish this. Let me offer two, aside from what @Gaurav suggested:
Windows Azure configuration settings. If you store your FTP server name and time interval as configuration settings and then change the configuration, each role instance can capture and handle the RoleEnvironment.Changing event and take action based on the new settings (see the sketch after these options).
Service bus pub/sub. You can have each role instance subscribe to a configuration change topic. Then, whenever you have a change, publish the change to the topic, and each instance, being subscribers, will receive the message and act accordingly.
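A minimal sketch of the first option, assuming the FTP server and schedule live in the service configuration (the setting names and the JobConfig holder are hypothetical):

```csharp
using System.Linq;
using Microsoft.WindowsAzure.ServiceRuntime;

public class WorkerRole : RoleEntryPoint
{
    public override bool OnStart()
    {
        // Changing fires before the change is applied; leaving e.Cancel = false
        // means the instance will NOT be recycled for a settings-only change.
        RoleEnvironment.Changing += (sender, e) => { };

        // Changed fires on every instance after the new configuration is applied.
        RoleEnvironment.Changed += (sender, e) =>
        {
            if (e.Changes.OfType<RoleEnvironmentConfigurationSettingChange>().Any())
            {
                JobConfig.FtpServer = RoleEnvironment.GetConfigurationSettingValue("FtpServer");
                JobConfig.Schedule  = RoleEnvironment.GetConfigurationSettingValue("JobCronSchedule");
            }
        };

        return base.OnStart();
    }
}

// Hypothetical static holder read by the Quartz.NET job.
public static class JobConfig
{
    public static string FtpServer;
    public static string Schedule;
}
```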
One possibility would be to do a "PEEK" at the message in your worker role instances instead of a "GET". That way the message will remain visible to all the instances; obviously the challenge there would be when to delete the message. Another alternative would be to create a separate queue for each worker role instance (you can name the queues so that each worker role instance GETs messages from the queue intended for that instance, e.g. if your worker role instances are WorkerRole1_IN_0 and WorkerRole1_IN_1, you could name your queues workerrole1-in-0, workerrole1-in-1 and so on). Just thinking out loud :)
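A rough sketch of the per-instance-queue idea (classic Microsoft.WindowsAzure.Storage client assumed; the name-sanitising scheme is illustrative, since queue names only allow lowercase letters, digits and hyphens):

```csharp
using System.Text.RegularExpressions;
using Microsoft.WindowsAzure.ServiceRuntime;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Queue;

public static class InstanceQueues
{
    // Each worker role instance listens on its own queue, derived from its instance ID.
    public static CloudQueue GetMyConfigQueue(string connectionString)
    {
        string instanceId = RoleEnvironment.CurrentRoleInstance.Id;   // e.g. ends in "WorkerRole1_IN_0"
        string queueName = Regex.Replace(instanceId.ToLowerInvariant(), "[^a-z0-9]+", "-").Trim('-');

        var queue = CloudStorageAccount.Parse(connectionString)
            .CreateCloudQueueClient()
            .GetQueueReference(queueName);
        queue.CreateIfNotExists();
        return queue;
    }
}
```

The web role could then enumerate the worker role's instances via RoleEnvironment.Roles and post the same configuration message to each instance's queue.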
I'm fairly new to Azure and the whole worker role concept; previously, if I wanted some back-end work done I would just create a Windows Forms application and run it as a scheduled task.
With my new site I have created a Windows Forms application, which I run every hour, that reads in XML feeds, does all the processing and inserts the information into SQL Azure.
There are also image links that I want to store in Azure blob storage, and possibly resize the images, which I had trouble doing from my VB.NET application.
My question is: should I move all the processing from my Windows Forms application to the worker role, or should I set up a worker role just to push the images into blob storage?
How much compute time does a worker role use? I have seen examples where there is a sleep timer, but is it possible to run it every hour on the hour?
You can easily set up a timer to trigger every hour on the hour. Where you run your code depends on your application architecture. If you have a web role, you can place this in your web role instead of a dedicated worker role, unless you really need the extra processing power of a separate instance and are willing to pay for it. Also, the number of instances of each role (web/worker) will add complications to the solution.
A detailed outline of your architecture would provide a better frame of reference for the answer you seek.
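To illustrate the "every hour on the hour" part, here is a small sketch (ProcessFeeds is a hypothetical stand-in for your hourly work):

```csharp
using System;
using System.Threading;

public static class HourlyTimer
{
    private static Timer _timer;   // keep a reference so the timer isn't garbage collected

    public static void Start()
    {
        DateTime now = DateTime.UtcNow;
        TimeSpan untilNextHour = now.Date.AddHours(now.Hour + 1) - now;

        // First callback at the top of the next hour, then every hour after that.
        _timer = new Timer(_ => ProcessFeeds(), null, untilNextHour, TimeSpan.FromHours(1));
    }

    private static void ProcessFeeds()
    {
        // Read the XML feeds, update SQL Azure, push images to blob storage, etc.
    }
}
```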
A worker role is designed for doing all sorts of background activities, so you can move all of the processing logic from your Windows application to the worker role. It is simply a class library with an entry-point class.
You cannot schedule any work automatically in a worker role; you will have to write that logic yourself. A worker role executes a piece of code in an infinite loop. You will be charged for the number of compute hours your worker role uses, regardless of whether it is idle or processing something. In addition, if you are using blob and queue storage, you will also be charged for accessing them and for the data you store in them.
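For reference, the entry-point class and infinite loop described above look roughly like this (DoWork and the hourly interval are illustrative):

```csharp
using System;
using System.Threading;
using Microsoft.WindowsAzure.ServiceRuntime;

public class WorkerRole : RoleEntryPoint
{
    public override void Run()
    {
        // The role stays "running" (and billed) as long as Run() does not return.
        while (true)
        {
            DoWork();                              // your background processing
            Thread.Sleep(TimeSpan.FromHours(1));   // crude schedule: once an hour
        }
    }

    private void DoWork()
    {
        // Process feeds, resize images, write to blob storage, etc.
    }
}
```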
Is there a way to use the Windows Task Scheduler to kick off a URL or an .exe on a schedule?
Can I write a program as an .exe, create an Azure VM, RDP into the VM and hook the program up to the Windows Task Scheduler?
Azure does have a scheduler now.
It allows you to invoke a web service over HTTP/HTTPS or post a message to a Windows Azure storage queue. It's very new, but it can be free if you do not need the scheduler to execute often. Otherwise there is a small monthly fee, which comes with scheduled tasks that can run as often as every minute.
Things have become much easier lately; see this link https://azure.microsoft.com/en-us/services/scheduler/ to create a scheduled job in the new Azure Scheduler. There is a basic free tier as well as some paid options, but I think this is exactly what many of us were looking for. It is an excellent solution for triggering URLs with GET, POST, PUT and DELETE requests.
Just follow the simple instructions, starting by selecting "Scheduler" from the Azure dashboard app menu.
Today the Scheduler has been superseded by Azure Logic Apps:
https://learn.microsoft.com/en-us/azure/logic-apps/logic-apps-overview
If you are looking for something like a cron job (a job that runs again and again at specific times), then check out Azure Functions:
https://learn.microsoft.com/en-us/azure/azure-functions/functions-overview
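For example, a timer-triggered function (assuming the in-process C# model; the function name and NCRONTAB schedule are illustrative) that runs at 02:30 UTC every day:

```csharp
using System;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class NightlySweep
{
    [FunctionName("NightlySweep")]
    public static void Run(
        [TimerTrigger("0 30 2 * * *")] TimerInfo timer,   // sec min hour day month day-of-week
        ILogger log)
    {
        log.LogInformation("Timer fired at {time}", DateTime.UtcNow);
        // Do the recurring work here.
    }
}
```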
Look up Azure storage queues. They allow you to schedule jobs that will run at a later date: you can add a message with an initial visibility delay, so it only becomes available for processing at the time you specify.
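A quick sketch of that delay, assuming the classic Microsoft.WindowsAzure.Storage client (the queue name and payload are made up):

```csharp
using System;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Queue;

public static class DelayedJobs
{
    public static void Enqueue(string connectionString)
    {
        var queue = CloudStorageAccount.Parse(connectionString)
            .CreateCloudQueueClient()
            .GetQueueReference("jobs");
        queue.CreateIfNotExists();

        // The message stays invisible for two hours, so a worker will only
        // pick it up (and "run the job") after that delay.
        queue.AddMessage(
            new CloudQueueMessage("resize-image:42"),
            timeToLive: null,
            initialVisibilityDelay: TimeSpan.FromHours(2));
    }
}
```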
I guess starting a timer job from within the code requires Farm admin credentials. However, I need to start a timer job from a web part that will be used in any site. Now when I try to start the job it gives me an access denied error obviously because the app pool identity is not farm admin. Any ideas on how to resolve this issue?
Thanks,
Timer jobs run as the farm administrator and are not intended to be triggered directly by an end-user. Since some jobs may be resource intensive, only the farm admin can create new jobs or modify the schedule for existing jobs.
One solution is to use the SPWorkItem infrastructure to queue user tasks, which are then processed by a custom timer job derived from SPWorkItemJobDefinition. Your web part would call SPSite.AddWorkItem to add the work item. When your timer job runs, it will look for any work items with the matching WorkItemType GUID and invoke your ProcessWorkItem override.
You are right: to start a timer job the app pool user has to be a farm administrator, since starting a timer job requires you to update an SPJobDefinition object with an SPSchedule. SPJobDefinition is an SPPersistedObject, which is stored in the SharePoint config database, and only farm administrators can write to this database.
I don't see a way to get past this issue.
Workaround:
Depending on your requirements, you could write a master job that runs on a regular basis. This job could query a SharePoint list and start another job defined by each list item. Since the master job runs under the farm administrator account, it would be able to start a new timer job.
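A rough sketch of such a master job (the class name, site URL and list name are made up, and the helper that actually starts the requested job is left as a stub):

```csharp
using System;
using Microsoft.SharePoint;
using Microsoft.SharePoint.Administration;

public class MasterJob : SPJobDefinition
{
    public MasterJob() : base() { }

    public MasterJob(string name, SPWebApplication webApp)
        : base(name, webApp, null, SPJobLockType.Job) { }

    public override void Execute(Guid targetInstanceId)
    {
        // Runs under the farm account, so creating/starting other jobs is allowed here.
        using (SPSite site = new SPSite("http://intranet"))        // hypothetical site URL
        using (SPWeb web = site.OpenWeb())
        {
            SPList requests = web.Lists["Job Requests"];           // hypothetical list
            foreach (SPListItem request in requests.Items)
            {
                StartRequestedJob(request);
            }
        }
    }

    private void StartRequestedJob(SPListItem request)
    {
        // Create or schedule the timer job described by this list item.
    }
}
```

The master job itself would be registered once (for example from a farm-scoped feature receiver) with something like an SPMinuteSchedule, so it polls the list every few minutes.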