Safe to Run LogParser Against Live Production IIS Log?

Is it safe to run LogParser against our live production IIS log file?
Currently, I have been copying it over to another location and then running LogParser 2.2 against the log file.
Instead, I would really like to run it against the live data so that I can see changes to it immediately. However, I am a little concerned that it might cause issues.
Does anyone know if querying the live IIS logs would cause a problem?

It shouldn't cause any problems, as I don't believe Log Parser locks the file. That said, why would it be a problem to copy the file first, just to be safe? Even if you just copy it to a local folder, a batch file could make that easy: copy the file and run it through Log Parser.
But it should be fine against live files.
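For reference, a minimal batch sketch of that copy-then-query approach (the paths and the query are placeholders for illustration; IIS 7+ names its W3C logs u_ex*.log, older versions differ):

@echo off
rem Copy the current IIS logs to a scratch folder, then query the copies.
copy /Y C:\inetpub\logs\LogFiles\W3SVC1\u_ex*.log C:\temp\logs\
LogParser.exe "SELECT TOP 10 cs-uri-stem, COUNT(*) AS Hits FROM C:\temp\logs\u_ex*.log GROUP BY cs-uri-stem ORDER BY Hits DESC" -i:IISW3C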

It's definitely safe since, as Tom says, Log Parser does not lock its input files. Running in production against live log files was a key target scenario.

Related

executing a script uploaded through file upload in public folder

We have to fix some security vulnerabilities in our system, and one of the items is to disable execution of uploaded scripts/exes through the file upload control.
We have an Excel upload facility. Let's say, hypothetically, a hacker renames an .exe to .xls and uploads it (there are ways to block that, but ignore that for now). Also assume that
the upload folder is within the public directory from which the website is served in IIS, OR
someone can access that file by specifying its full path through some API endpoint that the hacker is aware of.
Now, given that there is an exe or a script which is accessible to the hacker through the above means, is it possible for the hacker to run that script/exe in some way that can harm the server where the site is hosted?
I am not really a security expert, hence I can't think of ways this could be done. How can a hacker remotely run an exe/script on the server, given that they do not have any access to the server?
One of the things that you should definitely do is remove the IIS handler permissions for running scripts; otherwise anybody can upload a ".asp" or a ".aspx" or any other script-engine file and then execute it by requesting it. One simple way to test that is to create a "test.asp" file containing "<%= Now() %>": if requesting it returns the date, then anybody can upload scripts and run them on your server.
The way to disable that in IIS 7+ is to add a configuration file in a parent directory and edit the permissions for handlers. For example, assuming a child folder called "public", you can drop the following web.config to disable that:
<configuration>
  <location path="public">
    <system.webServer>
      <handlers accessPolicy="Read" />
    </system.webServer>
  </location>
</configuration>
You can then test that it no longer executes the file and instead blocks it. If you want to allow downloads of those files, you'll need to configure the static file handler (and request filtering) to handle everything instead, but make sure you do that for that folder only, since you don't want people downloading your source code.
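For completeness, a sketch of what that might look like, extending the web.config above (the handler name is arbitrary, and the attribute set should be verified against your IIS version):

<configuration>
  <location path="public">
    <system.webServer>
      <handlers accessPolicy="Read">
        <clear />
        <add name="PublicStaticOnly" path="*" verb="GET,HEAD" modules="StaticFileModule" resourceType="Either" requireAccess="Read" />
      </handlers>
    </system.webServer>
  </location>
</configuration>

This routes every request under "public" to the static file handler, so uploaded script files are served as plain downloads instead of being executed.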
Running the script would require remote access to the server, either directly or by exploiting some bug in the website code (similar to SQL injection). The risk here is mostly in hosting malware, especially if you allow user uploads to be downloaded by other users. While getting malware onto a machine is not as simple as just renaming an executable to another file type (it still has to be run as an executable rather than opened as an Excel spreadsheet, for instance, to function), it is possible to embed malware in various types of files, such that the act of opening the file causes execution of the malware. In that sense, you really can't tell at a glance whether a file is malware or not. It could look like an Excel file, even open properly in Excel, and still wreak havoc. The only way to be safe is to scan all user-uploaded files with a good antimalware application.
As for running something remotely, though, the access to the server required to run the script would provide a much better avenue for mischief than your upload form anyway. So anyone who could manage that kind of access isn't going to bother exploiting you through your upload form, and anyone who uploads something malicious without that access can't really do anything with it.

How persistent is data I put on my Azure WebApp via FTP?

I've been searching around and can't find any clear answers to this. I need a small amount of data - talking kilobytes, probably not ever reaching megabyte range - available as a file on my Azure instance, outside the web app itself, for a web job to work with. I won't get into why this is necessary, but it is (alternatives have been explored), and the question is now where to put those files. The obvious answer seems to be to connect to the FTP, create a directory, plop them there and work with them there.
I did a quick test and I'm able to create a "downloads" directory within the "data" directory, drop some files in it, and work with them there. It works great for this very small, simple need that I have.
How long will that data stay there? Is that directory purged at any point automatically by the servers? Is that directory part of any backups that are maintained? How "safe" is something I manually put outside the wwwroot folder?
It will never be purged. The only folder that can get purged is the %TEMP% folder. All other folders that you have write access to will be persisted forever.

rsync --delay-updates on Cygwin doesn't work?

What I'm trying to do:
I want to deploy files to a .NET-based website. Any time the DLLs change, Windows recycles the web app. When I rsync files over, the app can recycle several times because of the delay, instead of the preferred single time. This takes the site out of commission for a longer period.
How I tried to solve it:
I attempted to remedy this by using --delay-updates, which is supposed to stage all of the file changes in temporary files before swapping them into place. This appeared to be exactly what I wanted; however, passing the --delay-updates argument does not appear to behave as advertised. There is no discernible difference in the output (with -vv), and the end behavior is identical (the app recycles multiple times rather than once).
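A representative invocation (the host and paths here are hypothetical):

rsync -avv --delay-updates --delete ./site/ deploy@webserver:/cygdrive/c/inetpub/wwwroot/

With --delay-updates, rsync is supposed to stage each updated file in a temporary location inside the destination and then move them all into place in one quick pass at the end of the transfer.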
I don't want to run Cygwin on all of the production machines for stability reasons, otherwise I could rsync to a local staging directory, and then perform a local rsync, which would be fast enough to be "atomic".
I'm running Cygwin 1.7.17, with rsync 3.0.9.
I came across atomic-rsync (http://www.opensource.apple.com/source/rsync/rsync-40/rsync/support/atomic-rsync), which accomplishes this by rsyncing to a staging directory, renaming the existing directory, and then renaming the staging directory. Sadly, this does not work on Windows, because you cannot rename folders containing running DLL files (permission denied).
You can remove folders containing running binaries, but that recycles the app on every deploy, rather than only when the DLLs change, which is worse.
Does anyone know how to either:
Verify that --delay-updates is actually working
Accomplish my goal of updating all the files atomically (or rather, very very quickly)?
Thanks for the help.
This is pretty ancient, but I eventually discovered that --delay-updates was actually working as intended. The app only appeared to be recycling multiple times due to other factors.

couchdb log files missing, can't get any to be created :-(

I'm running CouchDB 1.0.1 on Ubuntu and everything is working OK, except that I've just noticed my log files are non-existent. They seem to have been like this for nearly a year, but to be fair I haven't really been using the system, as it is a test bed for a project I've just picked up again.
/var/log/couchdb contained two files: an old (many months!) couch.log.1, and a couch.log with size 0, which is suspicious. I've deleted the old files and tried restarting CouchDB, but the log files stubbornly stay absent!
I've restarted couch using
/etc/init.d/couchdb restart
But no joy.
My local.ini file has this entry:
[log]
level = debug
file = /var/log/couchdb/couch.log
And /var/log/couchdb is owned by couchdb and is in group couchdb, so I don't think it is a permissions issue. There is plenty of disk space on the server too.
I've rebooted the server as well in frustration - no difference.
How do I persuade CouchDB to start logging anything again? The reason it has become an issue is that I'm trying to PUT some standalone attachments, but only the small ones are working, so I'm trying to look in my (non-existent) log files to see what the problem might be.
Any ideas?
There is a possibility that the log file configuration is being set by some other .ini file.
Issue a GET request to http://localhost:5984/_config/log to see what CouchDB has set.
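For example, with curl (5984 being the default port):

curl http://localhost:5984/_config/log

This should return the effective settings as JSON, along the lines of {"file":"/var/log/couchdb/couch.log","level":"debug"}. If the file path shown there differs from the one in your local.ini, another .ini file is taking precedence.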
I have had things like this happen to me because I had installed CouchDB multiple times using different methods (compiling from source, using apt, the install script that CouchOne put out at one point, etc.). It was hard to figure out which local.ini was the real one!
OK, so it looks as if my LIVE ini files were actually at /usr/local/etc/couchdb/local.ini and not /etc/couchdb/local.ini.
And the real logs were under /usr/local as well.
Not quite sure why I had both sets; I guess I had installed CouchDB a couple of times in the past and was looking at the legacy files by mistake!
Hope this helps someone else ... I have been scratching my head for a couple of hours over it now!

Restarting IIS on file changed

AFAIK, IIS restarts whenever any of the web.config files is changed.
I've created my own configuration files (my.config, with slightly different hierarchy). Is there any possibility to have IIS automatically (automagically :)) restarted, whenever any of these are changed, too?
EDIT: I've considered filesystem watchers, but I'm not sure where to put them.
You mean to say that whenever you change my.config, IIS has to be restarted automatically.
Maybe you can write a batch file that performs the iisreset for you, if you don't want the user to restart IIS manually. But even with a batch file, the user still needs to execute it.
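A minimal sketch of such a batch file (iisreset ships with Windows; run it with administrative rights):

@echo off
rem Restart IIS so the edited my.config is picked up.
iisreset /restart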
A quick and ugly fix would be to put the config files in the bin directory.
(BTW, I can't believe I am writing this ;))
These changes restart the web app:
* web.config
* machine.config
* global.asax
* Anything in the bin directory or its sub-directories
Copied from Common reasons why your application pool may unexpectedly recycle
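Building on that list, one lightweight alternative to a full iisreset (my suggestion, not from the linked article): since any change to web.config recycles the app, you can force a recycle after editing my.config by updating web.config's timestamp without changing its contents, using cmd's append-nothing idiom:

rem "Touch" web.config: the append-nothing copy updates its timestamp,
rem which triggers an app restart per the list above.
copy /b web.config+,,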
Use SomeAssembly.dll.config, which will be put into ~/Bin, be read automatically on app (re)start, and cause an app restart when edited.
Note that App.config in the project becomes $(OutputAssembly).config on build.
