Our website has an application pool. When new data is uploaded via CSV, the application pool crashes. A call to the hosting company for an IIS reset gets us back up again, but only until the next upload. Any solutions?
I have built a Windows Phone app. I send data to a database in Azure. I need this data to be processed and the results to be sent back to the user, so the app remains 'light'.
Sorry if the question is a bit general, but what service should I use to process data in the cloud?
We have a VM hosting our web application, whose users upload big files (mainly 3D models) to their profiles via the web app's Web API.
What I'm looking to do is find a way to hand these long-running upload jobs/processes off somewhere other than the VM hosting the application.
I know there are some Azure options, but I want to get it right. Which is the most efficient way to do this: a Worker Role, a web page running a script to upload to storage, or maybe WebJobs over a queue?
The function that uploads the file also does some other work, such as generating thumbnails and storing the data in SQL.
I have a situation which involves an MVC app, to which a potentially large number of up to 32MB chunks of data are uploaded. After each chunk is uploaded, it needs to be processed and a response sent before the client browser uploads the next chunk.
Ultimately the data and the results of its processing need to be stored on Azure storage. The data processing is CPU intensive. Given that transferring this amount of data takes an appreciable amount of time, I am looking to reduce the number of trips the data needs to do between machines, as well as move the work out of the web server threads.
Currently this is done by queuing up the jobs which are consumed by a single worker thread.
However, this process needs to be upgraded so that it runs an executable to do the heavy work.
At the end of processing, the data is uploaded to Azure Blob storage. So, the data already needs to be transferred twice over the network (AFAIK) before the response is sent. Not ideal.
I am aware of the different queuing options in Azure, but am wary of making the situation worse rather than better. I don't want to overkill this problem, but do need to make the entire process run as quickly and efficiently as possible.
a) What kind of data transfer speeds can I expect between an Azure Web Role and Worker Role in a Cloud Service?
b) Is there any way to transfer the data directly to Azure storage and then process it there, without transferring it again?
c) Can / Will the worker role and web role actually run on the same machine?
d) Can I just run the .exe from inside the web app? How to get the path?
I would suggest a workflow similar to:
Client uploads data directly to Blob storage (in smaller chunks as per this guide)
When upload is finished client notifies your web service, and the web service posts a message on a Service Bus Queue (jobQueue). The message contains a unique session identifier and the Blob Url of the data uploaded. The web service then blocks and listens on another service bus queue (replyQueue) for the reply message with the specified sessionId.
A multi-threaded Worker Role long-polls the Service Bus jobQueue; each message it receives is processed, the processed data is stored somewhere, and then a reply message is created and posted to the replyQueue with the sessionId set.
The web service will then receive the reply message (for the given sessionId) and can return a result to your client.
With an architecture similar to this you can scale vertically by using a bigger machine for your worker role, or horizontally by adding more instances of the worker role.
To make the process a bit more resilient, you may want to return to the client immediately after it has notified the web service of the uploaded data, and then signal the client directly from the worker role using SignalR when the data has been processed.
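The request/reply flow above can be sketched with in-process queues standing in for Service Bus (a real implementation would use the Azure Service Bus SDK with sessions or correlation filters so replies are routed directly to the right waiter; all names and the "processing" step here are illustrative):

```python
import queue
import threading
import uuid

job_queue = queue.Queue()    # stands in for the Service Bus jobQueue
reply_queue = queue.Queue()  # stands in for the replyQueue

def worker():
    """Worker role: long-poll the job queue, process, post a reply."""
    while True:
        job = job_queue.get()
        if job is None:  # shutdown signal
            break
        # Placeholder for the CPU-intensive processing of the uploaded blob.
        result = job["blob_url"].upper()
        reply_queue.put({"session_id": job["session_id"], "result": result})

def handle_upload_complete(blob_url):
    """Web service: post a job message, then block until the matching reply arrives."""
    session_id = str(uuid.uuid4())
    job_queue.put({"session_id": session_id, "blob_url": blob_url})
    while True:
        reply = reply_queue.get()
        if reply["session_id"] == session_id:
            return reply["result"]
        reply_queue.put(reply)  # not ours; put it back for another waiter

threading.Thread(target=worker, daemon=True).start()
print(handle_upload_complete("blob://container/model.obj"))  # prints BLOB://CONTAINER/MODEL.OBJ
```

Note the re-queueing of replies that belong to other sessions is only needed in this toy version; Service Bus sessions give each waiting web-service call its own filtered view of the reply queue.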
My answers to the other parts of your question are:
a) I'm unsure what guarantees there are on data transfer speeds between the roles
b) Yes, you can transfer data directly to Blob Storage, and then from the blob to the Worker Role
c) You can run Worker Role-style processing on the Web Role: call your Worker Role-style code from WebRole.OnStart and WebRole.Run. Then, as you need to scale, this code can be moved to its own dedicated Worker Role
I've got a web role on Azure, one of whose jobs is to upload a picture, format it, and then upload it to a blob.
I do this with a temp directory on the web role, so there is a temp file there which I delete after it is uploaded to a blob.
Sometimes the upload is interrupted, or the web role has a problem, and the temp image file stays on the web role.
I want to create a worker role that every X hours will clean that folder. It is possible that I will have 100 web roles (each in its own isolated environment) and only 2 worker roles, so they would have to somehow get to the web roles, one by one, and delete those files.
So my question is - Is this even possible?! if so, how?
Thanks!
If you created a worker role, it would run on a separate VM, not the same one as your web role, and that would defeat the whole idea: you can't get to another VM without a carefully crafted interface, and such an interface would definitely be overkill for this task.
What you really want is just a separate thread (System.Threading.Thread) that you start from within the web role entry point and that periodically sweeps the temporary folder for leftover files. That will be cheap, and it works.
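The cleanup-thread idea can be sketched as follows (a Python stand-in for the same pattern; in the web role this would be a System.Threading.Thread started from the role entry point — the folder path, interval, and age threshold are all assumptions to tune):

```python
import os
import threading
import time

def clean_temp_folder(folder, max_age_seconds):
    """Delete files older than max_age_seconds from folder; return the names removed."""
    now = time.time()
    removed = []
    for name in os.listdir(folder):
        path = os.path.join(folder, name)
        if os.path.isfile(path) and now - os.path.getmtime(path) > max_age_seconds:
            os.remove(path)
            removed.append(name)
    return removed

def start_cleanup_thread(folder, interval_seconds=3600, max_age_seconds=3600):
    """Start from the web role entry point; the daemon thread lives as long as the process."""
    def loop():
        while True:
            clean_temp_folder(folder, max_age_seconds)
            time.sleep(interval_seconds)
    t = threading.Thread(target=loop, daemon=True)
    t.start()
    return t
```

Keying the sweep on file age (rather than deleting everything) avoids racing with an upload that is still in progress.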
I am working on TCP/IP in Windows Azure and have successfully developed a TCP client to send data and a web role to receive it.
I want to display this received data in an .aspx page. How should I access the web role's data from the .aspx page?
Having read the long comment stream, I think I get the essence of this scenario and question. It appears that there's some TCP-listener code being launched from within webrole.cs, not within the ASP.NET app code.
Here's the thing: A Web Role is Windows Server 2008 with IIS, along with some Windows Azure code for handling bootstrap and shutdown tasks. The webrole.cs file you speak of is the bootstrap/shutdown code entry point, with methods such as OnStart(), Run(), OnStop(), and Stopping(). This code runs in a separate AppDomain from your web app.
If you're launching a ServiceHost (or some other port listener) from webrole.cs, that's fine, but you'd need to store the content somewhere temporarily after it's uploaded, then make it available to your web app later. You could choose durable storage such as SQL Azure or Azure Storage (blobs or tables), or volatile storage (e.g. a local disk). You could then use some type of communication scheme to notify your web app that it has new data to display, possibly by placing a message on an Azure queue, or by having your web app simply query a table for data each time a user requests it.
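Because the listener and the web app live in separate AppDomains, they can only meet through some shared medium. A minimal sketch of the "store, then read" approach, using a local file as the volatile storage mentioned above (Python stand-in for the pattern; the path and message shape are illustrative, and a real deployment would more likely use an Azure queue or table):

```python
import json
import os

STORE_PATH = "received_messages.jsonl"  # illustrative shared location

def store_message(payload):
    """Listener side (the webrole.cs AppDomain): append each received message."""
    with open(STORE_PATH, "a", encoding="utf-8") as f:
        f.write(json.dumps({"data": payload}) + "\n")

def read_messages():
    """Web app side (the .aspx page): read whatever has been stored so far."""
    if not os.path.exists(STORE_PATH):
        return []
    with open(STORE_PATH, encoding="utf-8") as f:
        return [json.loads(line)["data"] for line in f]
```

Swapping the file for an Azure queue or table changes only these two functions; the decoupling between listener and page stays the same.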