I host my web application on IIS.
The application pool it resides in has 4 worker processes (a web garden).
As a test, for each request to an .aspx page I write the process ID of the executing process to a log4net file log.
When I open the log file, I only see entries from the first process.
I am sure the other processes are running because I can see them in Task Manager.
Could the other processes be failing to access the log file because the first process is writing to it?
How can I write to the same log file from different worker processes of the same application pool?
The way to fix this is to let each worker process write to a different log file.
To do this you will need to use dynamic file names. See http://geekswithblogs.net/rgupta/archive/2009/03/03/dynamic-log-filenames-with-log4net.aspx
for an example.
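log4net's PatternString (for example a %processid token in the configured file name) is the usual config-file way to do this. As a rough programmatic sketch of the same idea, assuming log4net is referenced and with placeholder paths and layout:

    using System.Diagnostics;
    using log4net.Appender;
    using log4net.Config;
    using log4net.Layout;

    public class Global : System.Web.HttpApplication
    {
        protected void Application_Start(object sender, System.EventArgs e)
        {
            // Build the log file name from the current worker process ID,
            // so each w3wp.exe in the web garden gets its own file.
            var layout = new PatternLayout("%date [%thread] %-5level %logger - %message%newline");
            layout.ActivateOptions();

            var appender = new RollingFileAppender
            {
                // e.g. App_Data\app-1234.log, App_Data\app-5678.log (one per process)
                File = Server.MapPath("~/App_Data/app-" + Process.GetCurrentProcess().Id + ".log"),
                AppendToFile = true,
                Layout = layout
            };
            appender.ActivateOptions();

            BasicConfigurator.Configure(appender);
        }
    }

Because every worker process now appends to its own file, the processes never contend for the same file handle, which is why the single shared log appeared to contain only one process's entries.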
I have a pretty data-heavy Node application. Besides common things like file uploading, the app also spawns detached (long-running) child processes.
For example, consider a user who uploads a file: the detached process triggers a native tool installed on the system to do some heavy processing. This can take anywhere from a second to several minutes, so the process is detached and the user is notified within the web site (when online) or via email.
I'm considering using pm2 as a monitoring tool. It seems great, but how would I monitor individual detached child processes with it? I've read most of the docs and checked the code examples, but I didn't find an example for my particular problem.
Concrete config examples would be welcome, since I'm new to pm2.
As of November 2020, there is an open issue for this feature:
https://github.com/Unitech/pm2/issues/1869
Sorry, I am a complete noob when it comes to web applications, and I was just wondering: what happens if my web application calls an external binary executable that can take some time to process an input file, and multiple users try to call it at the same time, or one user calls it while it is still processing the previous request?
I think this has something to do with threading, but I'm not sure how that applies to external executables. If someone could point me to a resource where I can learn how this works, that would be great too!
When a process is launched it is isolated from other processes, and the same executable can be launched several times; the only limitation is CPU and memory utilization. You do have to use some concurrent-access protection if the processes write to a shared file, but if you use a database the DB engine takes care of concurrent access, so that is not a problem.
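As a minimal sketch of both points (the tool path, arguments, and results file below are made up): each request starts its own copy of the external tool, and a named mutex protects the one file they all append to.

    using System.Diagnostics;
    using System.IO;
    using System.Threading;

    public static class ConverterRunner
    {
        public static void Run(string inputFile)
        {
            // Every caller launches its own copy of the tool; the OS runs them side by side.
            using (var tool = Process.Start(new ProcessStartInfo
            {
                FileName = @"C:\Tools\converter.exe",      // hypothetical external binary
                Arguments = "\"" + inputFile + "\"",
                UseShellExecute = false
            }))
            {
                tool.WaitForExit();
            }

            // Only the shared results file needs protection; a named mutex works across processes.
            using (var mutex = new Mutex(false, @"Global\ConverterResultsLock"))
            {
                mutex.WaitOne();
                try
                {
                    File.AppendAllText(@"C:\Tools\results.log", inputFile + " done\r\n");
                }
                finally
                {
                    mutex.ReleaseMutex();
                }
            }
        }
    }
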
I have a dashboard and I want a process to run when the user clicks on a button. That process might take a long time to complete.
My options so far:
using popen or something similar to execute the process
having a daemon monitor a directory. When this directory is changed (a file created) the daemon will do the job and then delete the file before idling again.
using cron, running every 5 seconds and also monitoring some directory.
Which one is the most Linux-friendly? Are there any options I have not considered?
This is what task queueing systems like Celery and Redis Queue are for.
Another option is to have a daemon (as in your 2nd option) that listens on a socket. Your WSGI application could then just connect and send a command. There are many possibilities for how the communication over the socket could work; choosing the right one depends a lot on the actual case.
This has the advantage that you can eventually run the two applications (the WSGI app and the daemon) on different computers or VMs at some point.
I have a large web app that runs on our two live servers. Part of our server-side C# code calls a third-party app to do a task for us.
That task works most of the time, but at a certain point it stops working until the AppPool is recycled.
This all happens in w3wp.exe, so I can see it running in Process Monitor. It looks like this when it is not working:
Thread Create
Access the file PreviewGenerator.exe
Hive unloaded (this is the registry)
Thread Exit
And like this when it is working:
Thread Create
Access the file PreviewGenerator.exe
Process Start
Does lots of work with PreviewGenerator.exe, including reading/writing files, registry access, etc.
Process Exit
Hive unloaded
Thread Exit
How can I debug what is going on in my AppPool and why starting a separate process is not working some of the time?
I found the best thing to do was to create a separate app pool for my application in IIS and set an upper limit for the amount of RAM it could use. Also I found it useful to turn on the 'Generate Recycle Event Log Entry' items under the app pool settings.
You can then go to the System event log and filter for items with a source of 'WAS' to understand what is going on with the app pools: when they are restarting, and when they shut down after being idle, etc.
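If you prefer to pull those entries from code rather than Event Viewer, a rough sketch along these lines works (it assumes the source name really is 'WAS' as described above; the exact provider name can vary by Windows version):

    using System;
    using System.Diagnostics;

    class WasEventDump
    {
        static void Main()
        {
            // Read the System event log and print only entries written by the 'WAS' source,
            // i.e. the app pool recycle/start/stop events mentioned above.
            using (var systemLog = new EventLog("System"))
            {
                foreach (EventLogEntry entry in systemLog.Entries)
                {
                    if (entry.Source == "WAS")
                    {
                        Console.WriteLine("{0:u}  {1}", entry.TimeGenerated, entry.Message);
                    }
                }
            }
        }
    }
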
I think the main problem in our case was that the IIS box was running out of memory. Tuning the app pools and adding some extra RAM seems to have solved it.
Suppose there are two executables. One is mine and the other is some other application. Now if the other app is running, I want my app to run until the other one exits or is stopped.
Writing a separate service seems quite an overkill.
You can first obtain a Process object, say via Process.GetProcessesByName, or better, use the process ID of the process you wish to monitor if you have it. You can then obtain a WaitHandle from it, as discussed e.g. here, and call WaitOne on it (or WaitAll, if you're monitoring several instances).
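A minimal sketch of that approach, assuming the target is found by name ("notepad" is just a placeholder) and wrapping the raw process handle in a WaitHandle:

    using System;
    using System.Diagnostics;
    using System.Threading;
    using Microsoft.Win32.SafeHandles;

    class ProcessWatcher
    {
        static void Main()
        {
            Process[] targets = Process.GetProcessesByName("notepad");   // placeholder name
            if (targets.Length == 0)
            {
                Console.WriteLine("Target process is not running.");
                return;
            }

            using (Process target = targets[0])
            using (var exited = new ManualResetEvent(false)
            {
                // The process handle is signaled when the process exits,
                // so wrapping it lets us block on it like any other WaitHandle.
                SafeWaitHandle = new SafeWaitHandle(target.Handle, ownsHandle: false)
            })
            {
                exited.WaitOne();   // returns once the other process has exited
                Console.WriteLine("Other application exited; shutting down.");
            }
        }
    }

If you only have a single instance and a Process reference to it, target.WaitForExit() achieves the same thing with less ceremony.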
Write a Windows service that continuously monitors the other application's executable. If the service finds it running, it starts your executable (if it is not already running) and keeps it running throughout the other application's life cycle. As soon as the other app terminates, your Windows service also terminates your exe.
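A rough polling sketch of such a service, under the assumption that a 5-second timer is good enough and with made-up process and exe names:

    using System.Diagnostics;
    using System.ServiceProcess;
    using System.Timers;

    public class WatchdogService : ServiceBase
    {
        private readonly Timer _timer = new Timer(5000);   // poll every 5 seconds
        private Process _child;

        protected override void OnStart(string[] args)
        {
            _timer.Elapsed += (s, e) => Check();
            _timer.Start();
        }

        protected override void OnStop()
        {
            _timer.Stop();
        }

        private void Check()
        {
            bool otherRunning = Process.GetProcessesByName("OtherApp").Length > 0;   // placeholder name
            bool childRunning = _child != null && !_child.HasExited;

            if (otherRunning && !childRunning)
            {
                _child = Process.Start(@"C:\MyApp\MyApp.exe");   // placeholder path to your exe
            }
            else if (!otherRunning && childRunning)
            {
                _child.Kill();   // the other app is gone, so stop yours too
            }
        }
    }
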