WCF service, launch a task every X hours - multithreading

I have a WCF service which needs to perform some actions on a database every hour and also needs to generate a file with some information.
So which is better: doing it through a timer or a thread?
The problem with a thread would be constantly iterating in a loop (with a small delay), checking whether the time has elapsed and, if so, performing the action.
Any ideas on how to achieve this scenario as efficiently as possible?

Sounds like you need a long-running service.
WCF by itself is not a good solution for this.
You should look at Windows Services, or WCF + WF hosted in AppFabric.
One of the reasons: WCF does not support autostart, so you would have to start the service again after every application pool recycle (if you host in IIS or any other hosting process).
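To illustrate the Windows Service approach: a minimal sketch of a service that runs the hourly work on a System.Threading.Timer. The service name and DoHourlyWork body are placeholders, not code from the question.

```csharp
using System;
using System.ServiceProcess;
using System.Threading;

public class HourlyWorkerService : ServiceBase
{
    private Timer _timer;

    protected override void OnStart(string[] args)
    {
        // Fire once immediately, then every hour; no polling loop needed.
        _timer = new Timer(_ => DoHourlyWork(), null,
                           TimeSpan.Zero, TimeSpan.FromHours(1));
    }

    protected override void OnStop()
    {
        _timer?.Dispose();
    }

    private void DoHourlyWork()
    {
        // Placeholder: update the database and generate the report file here.
    }
}
```

Unlike an IIS-hosted WCF service, a Windows Service starts with the machine and keeps running, so the timer survives without any warm-up requests.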

Related

How to tell Azure not to remove particular server during scale down

I have a .NET app running on Azure App Service.
Auto-scale is set up, and sometimes it goes up to 10 instances and then back down to 3.
I have a background task (hangfire) that runs every hour on one of the instances (I don't know on which one, it is random).
Is there a way to tell Azure, during scale down, not to remove the server where the task is currently executing on?
You should never rely on such a thing; instead, design your background job processors to be able to shut down gracefully.
This is why you should be using cancellation tokens in your jobs, and each job should be able to pick up from where it left off.
Hangfire has its own implementation for this; in other cases you can use .NET's CancellationToken.
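A sketch of the checkpoint-and-cancel pattern described above. The IProgressStore interface and the PendingIds/Process bodies are hypothetical stand-ins; Hangfire can inject a CancellationToken into job methods and re-queues a job that throws on cancellation.

```csharp
using System.Collections.Generic;
using System.Linq;
using System.Threading;

// Hypothetical checkpoint store so a restarted job can resume where it left off.
public interface IProgressStore
{
    int LoadLastProcessedId();
    void SaveLastProcessedId(int id);
}

public class ReportJob
{
    private readonly IProgressStore _progress;
    public ReportJob(IProgressStore progress) => _progress = progress;

    public void Run(CancellationToken token)
    {
        int last = _progress.LoadLastProcessedId();
        foreach (int id in PendingIds(last))
        {
            // Thrown when the host shuts down (e.g. during scale-down);
            // the job stops cleanly and is retried on another instance.
            token.ThrowIfCancellationRequested();
            Process(id);
            _progress.SaveLastProcessedId(id); // checkpoint after each unit of work
        }
    }

    // Placeholder work items for the sketch.
    private static IEnumerable<int> PendingIds(int after) => Enumerable.Range(after + 1, 100);
    private static void Process(int id) { /* one unit of work */ }
}
```

With this shape it no longer matters which instance Azure removes: the interrupted job restarts elsewhere from the last saved checkpoint.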

How to host long running process into Azure Cloud?

I have a C# console application which extracts a 15 GB Firebird database file at a server location into multiple files and loads the data from those files into a SQL Server database. The console application uses the System.Threading.Tasks.Parallel class to load the data from the files into the SQL Server database in parallel.
It is a weekly process and it takes 6 hours to complete.
What is the best option for moving this (console application) process to the Azure cloud - a WebJob, a WorkerRole, or some other cloud service?
How can the execution time (6 hrs) be reduced after moving to the cloud?
How do I implement the suggested option? Please provide pointers, code samples, etc.
Detailed comments are very much appreciated.
Thanks,
Bhanu.
Let me give some thoughts on this question of yours:
"What is the best option to move this (console application) process to
the Azure cloud - WebJob or WorkerRole or any other cloud service?"
First, you can achieve the task with both a WebJob and a WorkerRole, but I would suggest you go with a WebJob.
The pros of a WebJob are:
Deployment is quicker - you can turn your console app, without any changes, into a continuously running WebJob within minutes (https://azure.microsoft.com/en-us/documentation/articles/web-sites-create-web-jobs/)
Built-in timer support, whereas with a WorkerRole you would need to handle scheduling on your own
Fault tolerance - when your WebJob fails, there is built-in resume logic
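As an illustration of the built-in timer support: a triggered WebJob can be scheduled by deploying a settings.job file next to the executable. The six-field NCRONTAB expression below is an assumed example meaning 2:00 AM every Sunday, which would fit a weekly process:

```json
{
  "schedule": "0 0 2 * * SUN"
}
```

With a WorkerRole you would have to write this scheduling loop yourself.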
You might want to check out Azure Functions. You pay only for the processing time you use and there doesn't appear to be a maximum run time (unlike AWS Lambda).
They can be set up on a schedule or kicked off from other events.
If you are already doing the work in parallel, you could break some of the parallel tasks out into separate Azure Functions. Aside from that, speeding things up would require specific knowledge of what you are trying to accomplish.
In the past, when I've tried to speed up work like this, I would start by emitting log messages during processing that contain the current time or that calculate durations (using the Stopwatch class), then find out which areas can be improved. The slowness may also be due to a bottleneck on the SQL Server side; more investigation would be needed on your part. But the first step is always capturing metrics.
Since Azure Functions can scale out horizontally, you might want to first break the data from the files into smaller chunks and let a function handle each chunk, then spin up parallel processing of those chunks. Be sure not to spin up more than your SQL Server can handle.
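One way to keep the parallel load from overwhelming SQL Server, as a sketch: cap the degree of parallelism with ParallelOptions. The chunk list and LoadChunk body are placeholders for your own data-load code.

```csharp
using System.Collections.Generic;
using System.Threading.Tasks;

public static class ChunkLoader
{
    public static void LoadAll(IEnumerable<string> chunkFiles)
    {
        // Tune this against what your SQL Server instance can absorb;
        // unbounded Parallel.ForEach can saturate the database side.
        var options = new ParallelOptions { MaxDegreeOfParallelism = 4 };
        Parallel.ForEach(chunkFiles, options, LoadChunk);
    }

    private static void LoadChunk(string file)
    {
        // Placeholder: bulk-insert this chunk into SQL Server (e.g. via SqlBulkCopy).
    }
}
```

Measuring each chunk's duration (as suggested above) will tell you whether raising or lowering the cap actually helps.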

Stop Multiple WebAPI requests from Azure Scheduler

I have a Web API which performs a task, and it currently takes a couple of minutes depending on the data. This can increase over time.
I have an Azure Scheduler job which calls this Web API every 10 minutes. I want to avoid the case where the second call, 10 minutes later, overlaps with the first call because execution time has grown. How can I put the smarts in the Web API so that it detects the first call still running and skips the second?
Can I use an AutoResetEvent or lock statements? Or is keeping a flag in storage to indicate busy/free a better option?
Persistent state is best managed via storage. Can your long-running activity persist through a role reset? (After all, a role may be reset at any time, as long as availability constraints are met.)
Make sure you think through scenarios where your long-running job terminates halfway through.
The Windows Azure Scheduler has a 30-second timeout, so we cannot have a long-running task called by the scheduler; an overlap of two subsequent calls is out of the question.
Also, having a long-running task in a Web API seems like bad design because of app pool recycling. I ended up using Azure Service Bus: when a task is requested, a message is posted to the queue. That way the time occupied by the Web API is limited.
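A sketch of that hand-off using the Azure.Messaging.ServiceBus SDK: the Web API only enqueues a message and returns, while a separate worker drains the queue. The connection string and queue name are placeholders.

```csharp
using System.Threading.Tasks;
using Azure.Messaging.ServiceBus;

public static class TaskEnqueuer
{
    public static async Task EnqueueAsync(string payload)
    {
        // Placeholder connection string and queue name.
        await using var client = new ServiceBusClient("<connection-string>");
        ServiceBusSender sender = client.CreateSender("task-queue");

        // The HTTP request finishes as soon as the message is accepted;
        // the long-running work happens in a background worker reading this queue.
        await sender.SendMessageAsync(new ServiceBusMessage(payload));
    }
}
```

Because the queue serializes requests, an overlapping scheduler call simply adds a second message instead of starting a second concurrent execution.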

Long running WCF in shared hosting

I have a WCF service (on top of IIS) which will be hosted in a shared hosting environment, so I don't have access to Windows services or permissions for installations.
This WCF service performs a long-running computation (a spatial interpolation), so my question is about which architecture to use in order not to hurt performance; in particular, I don't want to take threads from the ASP.NET thread pool for such a long task.
I know that a possible solution would be a Windows service for the multi-threaded computation and MSMQ for communication between the WCF service and the Windows service, but as I said, I can't install a service.
Can anybody suggest a solution?
Thanks in advance.
You could simply use an asynchronous/one-way method on your WCF service and call that.
We use a similar method to upload data and kick off an import process. The client then polls another WCF method, and when the initial process has finished, we update the relevant data to indicate that and return it to the client in the poll.
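A sketch of the one-way-plus-polling contract described above; the service and method names are illustrative, not from the answer.

```csharp
using System.ServiceModel;

[ServiceContract]
public interface IInterpolationService
{
    // IsOneWay = true: the call returns as soon as the message is accepted,
    // without waiting for the long computation to finish.
    [OperationContract(IsOneWay = true)]
    void StartInterpolation(string jobId);

    // The client polls this until the job reports completion.
    [OperationContract]
    bool IsComplete(string jobId);
}
```

Note that the one-way operation must return void; the computation itself should still run off the request thread so it doesn't tie up the ASP.NET thread pool.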

How to create a job in IIS capable of running an extended process

I have a web service app with one web service call that could take anything from 1 hour to 14 hours, depending on the data that needs to be processed and the time of the month.
Is there any way to create a job in IIS capable of running this extended process? I also need job management and reporting, to see which jobs are running so that new jobs aren't started on top of others.
I will be working with IIS 6 primarily, and would like to use C# code.
Right now I am using a web service call, but I don't like the idea of having web services run for such a long time, and due to the nature of the service, I can't split the functionality any further.
IIS jobs would be awesome if they were available. Any ideas?
If I were you, I would make a command-line app that is kicked off by the web service. Running a command-line app is pretty straightforward; basically (with using System.Diagnostics;):
Process p = new Process();
p.StartInfo.UseShellExecute = false;
p.StartInfo.FileName = "appname.exe"; // path to your command-line app
p.Start();
There are a limited number of worker processes per machine; they aren't really meant for long-running jobs.
One possibility, with some setup cost, is to have your processing run as a Windows service that listens to a message queue (MSMQ or similar), and have your web service simply post the request onto the message queue to be handled by the processing service.
Monitoring jobs is more difficult: your web service would need a way of querying your processing service to find out its state. This is an IPC (interprocess communication) problem, which has many different solutions with various trade-offs depending on your environment and circumstances.
That said, for simple cases, Matt's solution is probably sufficient.
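The queue hand-off can be sketched with System.Messaging (the .NET Framework MSMQ API); the queue path is a placeholder, and the Windows service would read from the same queue.

```csharp
using System.Messaging;

public static class JobQueue
{
    // Placeholder local private queue path.
    private const string Path = @".\private$\longjobs";

    public static void Enqueue(string jobDescription)
    {
        // Exists/Create only work like this for local private queues.
        if (!MessageQueue.Exists(Path))
            MessageQueue.Create(Path);

        using (var queue = new MessageQueue(Path))
            queue.Send(jobDescription, "long-running job");
    }
}
```

The web service call returns as soon as the message is queued; MSMQ's durable delivery means the request survives even if the processing service is restarted.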
