How secure are .htaccess-protected pages?

Are there any known flaws with htaccess protected pages?
I know they are susceptible to brute-force attacks, as there is no limit to the number of times someone can attempt to log in. And if a user can upload and execute a file on the server, all bets are off...
Are there any other .htaccess flaws?

.htaccess is just a means of specifying Apache configuration directives on a per-directory basis, and those directives allow numerous different kinds of password protection.
If you are talking about HTTP Basic Authentication then the username and password are sent in cleartext with every request and are subject to sniffing (assuming you aren't using SSL).
Aside from that, they are subject to the usual issues that any password based system suffers from.
Using HTTP Basic Authentication doesn't grant any additional ability for users to upload and execute files. If they can do that already, then they can still do that. If they couldn't, they can't.
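For reference, here is a minimal sketch of such a Basic Authentication setup in .htaccess (the /etc/apache2/.htpasswd path is an assumption; the file just needs to live outside the document root):
AuthType Basic
AuthName "Restricted area"
AuthUserFile /etc/apache2/.htpasswd
Require valid-user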

The use of .htaccess is common and is fairly secure on its own. However, it can be undermined by other vulnerabilities, such as file disclosure through local file inclusion (LFI). For instance, the following code could be used to undermine .htaccess:
include("./path/to/languages/".$_GET['lang']);
An exploit would look like this:
http://127.0.0.1/LFI_Vuln.php?lang=../../../.htaccess
This will cause the contents of .htaccess to be disclosed to the attacker (and the same trick can then be pointed at the .htpasswd file it references).

Related

Is there any way to configure .htaccess to write to the access.log file that is already handled by apache2.conf?

We have hosted our website with an external agency, in a Linux environment.
We have now added cookies to our website code and want to track them in access.log. When we asked our hosting provider, they turned down the request to modify apache2.conf; instead they suggested using a .htaccess file to enable cookie logging in access.log. Right now we do not want to use any method of logging cookies other than the .htaccess file.
We did not find any solution for enabling cookie logging in access.log using a .htaccess file.
We need the following questions answered:
1) Is it possible to use a .htaccess file to enable cookie logging in access.log?
2) If yes, what are the steps? It would be greatly appreciated if they were explained keeping in mind that the user is a layman.
As far as I know you cannot customize log files from .htaccess. And I think there is a valid reason for disabling this: it could pose security issues in a shared environment.
You would need to have the host enable mod_usertrack. Then they would need to allow you to override the configuration settings with .htaccess.
LogFormat "%{Apache}n %r %t" usertrack
CustomLog logs/clickstream.log usertrack
I track cookies, users, sessions, browsers, everything in a MySQL database. It's a lot easier to access the data with stats than by mining logs. (It does take up a bit of room, though.)

How secure is .htaccess password protection?

Is password-protecting a directory with .htaccess the best way to prevent its files from being seen by unauthorized users? Are there any alternatives for protecting a directory's content while still making it accessible to people who are authorized to view it?
Also, couldn't someone try to brute-force their way in, causing strain on the server?
Several things to notice:
Anything you secure with a .htaccess file can always be secured without the .htaccess, by using <Directory> directives in the main configuration (or the virtual host configuration). It will be faster (especially if you remove support for .htaccess completely with AllowOverride None) and you won't run the risk of someone altering your .htaccess.
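For illustration, a minimal sketch of that main-configuration equivalent (the paths here are assumptions):
<Directory "/var/www/example/private">
    # Ignore .htaccess files entirely within this tree
    AllowOverride None
    AuthType Basic
    AuthName "Restricted"
    AuthUserFile /etc/apache2/.htpasswd
    Require valid-user
</Directory>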
There are several ways of adding security in .htaccess files; one of them is HTTP Basic Authentication with .htpasswd files. These .htpasswd files shouldn't be inside the web document root. Another possibility is HTTP Digest Authentication, with the restriction that very old browsers (like IE6) won't support it.
We usually encounter HTTP Basic Authentication. This is very weak protection, simply because of the way it works. On the first request you're rejected; then your browser asks you for a login and password, and memorizes that login/password association for the requested web server. Then, for every request sent to this web server until you close your browser, the login and password are added to the request header, unencrypted. A simple base64 encoding is applied to the string 'Yourlogin:Yourpassword', to make it look like a pure ASCII string and prevent encoding problems.
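To illustrate with the placeholder above, the header attached to every request would look like this (any base64 decoder turns it back into 'Yourlogin:Yourpassword' instantly; this is encoding, not encryption):
Authorization: Basic WW91cmxvZ2luOllvdXJwYXNzd29yZA==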
So anyone sniffing your requests (Wi-Fi hotspot, man in the middle, local network, hub, etc.) will learn your login and password. Bad. The rule is:
never ever use HTTP Basic Authentication if the connection isn't HTTPS (SSL).
If your web server is completely in HTTPS there's no problem (see the edit at the bottom): the cleartext login/password is encrypted by SSL.
As for the brute-force problem (and yes, some people will try to brute-force the login/password, unless you tune a mod_security rule to prevent it), the Security Considerations section of the htpasswd documentation is quite clear:
When using the crypt() algorithm, note that only the first 8 characters of the password are used to form the password. If the supplied password is longer, the extra characters will be silently discarded
and:
On the Windows and MPE platforms, passwords encrypted with htpasswd are limited to no more than 255 characters in length. Longer passwords will be truncated to 255 characters.
So use SHA hashing for passwords (even though it's not salted).
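As an example, a sketch of creating such a file on the server (the path and username are assumptions; -c creates the file and -s selects SHA-1, while on Apache 2.4+ -B selects the stronger bcrypt):
htpasswd -c -s /etc/apache2/.htpasswd alice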
Another way to let authenticated users browse a directory's content is to handle the directory listing and file upload within your application (PHP, Tomcat, etc.) rather than with Apache's automatic listing. In terms of security, the automatic listing module (mod_autoindex) is something you shouldn't even have loaded on a production Apache.
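If you can't unload the module, a single line in .htaccess disables automatic listings for that directory (assuming the server permits it via AllowOverride Options):
Options -Indexes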
Edit
A full-HTTPS server is not required if you only want to protect some URLs with HTTP authentication. What you really need is for all the protected URLs to be in HTTPS; if the non-protected URLs stay in the HTTP domain, the authentication headers won't be sent for them, as this is a different domain (the authentication headers are sent per domain). So you could add basic redirection rules for these URLs in the HTTP domain, maybe something like this:
RedirectMatch 301 ^/secure/(.*)$ https://www.example.com/secure/$1

Implementing HTTP or HTTPS depending on page

I want to implement HTTPS on only a selection of my web pages. I have purchased my SSL certificates etc. and got them working. Despite this, due to speed demands, I cannot afford to use HTTPS on every single page.
Instead, I want my server to serve up HTTP or HTTPS depending on the page being viewed. An example where this has been done is 99designs.
The problem in slightly more detail:
When my visitors first visit my site they only have access to non-sensitive information, and therefore I want them to be served plain HTTP.
Then, once they log in, they are granted access to more sensitive information, e.g. profile information, which should be delivered over HTTPS.
Despite being logged in, if the user goes back to a non-sensitive page such as the homepage, I want it delivered over HTTP.
One common solution seems to be the .htaccess file. The problem is that my site is relatively large, meaning that this would require me to write a rule for every page (several hundred) to determine whether it should be served over HTTP or HTTPS.
And then there is the problem of defining user-generated content pages.
Please help,
Many thanks,
David
You've not mentioned anything about the architecture you are using. Assuming that SSL termination is on the web server, you should set up separate virtual hosts with completely separate and non-overlapping document trees and, for preference, use a path schema that does not overlap (to avoid little accidents).
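As a sketch of that idea (all names and paths here are assumptions, not a definitive setup):
<VirtualHost *:80>
    ServerName www.example.com
    DocumentRoot /var/www/example/public
</VirtualHost>
<VirtualHost *:443>
    ServerName www.example.com
    # A separate, non-overlapping tree for the sensitive pages
    DocumentRoot /var/www/example/secure
    SSLEngine on
    SSLCertificateFile /etc/ssl/certs/example.crt
    SSLCertificateKeyFile /etc/ssl/private/example.key
</VirtualHost>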

What are the pros and cons of a 100% HTTPS site?

First, let me admit that what I know about HTTPS is pretty rudimentary. I don't know much about session security, encryption, or how either of those things is supposed to be done.
What I do know is that web security is important; that horror stories of XSS, CSRF, and database injections pop up over and over again. I know that a preventative stance against such exploits is better than a reactive one.
But the motivation for this question comes from a different point of view. I work at a site that regularly accepts payment from users. Obviously, the payments are sent over a secure channel (HTTPS). I mainly work on the CSS, HTML, and JavaScript of the site. What I've been told is that it is necessary to duplicate CSS, JavaScript, and image files before they can be called over HTTPS. So assume I have the following files:
css/global.css
js/global.js
images/
logo.png
bg.png
The way I understand it, these files need to be duplicated before they can be "added" to HTTPS. So a file can either be under security (HTTPS) or not.
If this is true, then this is a major hindrance. In even the smallest site, it would be a major pain to duplicate files and then have to maintain them every time you make a CSS or JS change. Obviously this could be alleviated by moving everything into the HTTPS.
So what I want to know is, what are the pros and cons of a site that is completely behind HTTPS? Does it cause noticeable overhead? Is it just foolish to place the entire site under encryption? Would users feel safer seeing the "secure" notifications in their browser during their entire visit? And last but not least, does it truly make for a more secure site? What can HTTPS not protect against?
You can serve the same content via HTTPS as you do via HTTP (just point it to the same document root).
Cons that may be major or minor, depending:
serving content over HTTPS is slower than serving it via HTTP.
certificates signed by well-known authorities can be expensive
if you don't have a certificate signed by a trusted authority (e.g., you sign it yourself), visitors will get a warning
Those are pretty basic, but they're a few things to note. Also, personally, I feel much better seeing that an entire site is HTTPS when it's anything related to financial matters; but for general browsing, no, I don't care.
Noticeable overhead? Yes, but that matters less and less these days as clients and servers are much faster.
You don't need to make a copy of everything, but you do need to make those files accessible via HTTPS. Your HTTPS and HTTP services can use the same doc root.
Is it foolish to put the whole site under encryption? Typically no.
Would users feel safer? Probably.
Does it truly make for a more secure site? Only when dealing with the communication channel between the client and the server. Everything else is still up for grabs.
You've been misinformed. The CSS, JS, and image files need not be duplicated, assuming you've set up the HTTP and HTTPS mappings to point to the same physical website on the server. The only important thing is that these files are referenced via https URLs whenever the page you're looking at is also served over https. This prevents the dreaded security warning saying that some objects on the page are not secure.
For every other page, where you're running the site over HTTP (unsecured), you can reference those same files in the same locations, but with an http address.
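One widely used trick that implements exactly this (not mentioned in the answer above; www.example.com is a placeholder) is a protocol-relative URL, which inherits the scheme of the page referencing it:
<link rel="stylesheet" href="//www.example.com/css/global.css">
On an https page this fetches the stylesheet over https; on an http page, over http; with no duplicated files or markup.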
To answer your other question: there would indeed be a performance penalty for putting the entire site under HTTPS. The server has to work harder to encrypt everything it sends over the wire, and some not-so-old browsers won't cache HTTPS content to disk by default, which of course results in an even heavier load on the server.
Because I like my sites to be as responsive as possible, I'm always selective about which sections of a site I choose to be SSL-encrypted. In most typical e-commerce sites, the only pages that need SSL encryption are the login, registration, and checkout pages.
The traditional reason for not putting an entire site behind SSL is processing time. It does take more work for both the client and the server to use SSL. However, this overhead is fairly small on modern processors.
If you are running a very large site, you may need to scale slightly faster if you are encrypting everything.
You also need to buy a certificate, or use a self-signed one, which may not be trusted by your users.
You also need a dedicated IP address (at least where SNI is not supported). If you are on a shared hosting system, you need an IP that you can dedicate to having SSL only for your site.
But if you can afford a certificate and a private IP, and don't mind needing a slightly faster server, using SSL on your entire site is a great idea.
With the number of attacks that SSL mitigates, I would say do it.
You do not need multiple copies of these files for them to work with HTTPS. You may need two copies if the hosting setup has been configured such that you have a separate HTTPS directory. So, to answer your question: no, duplicate files are not required for HTTPS, but depending on the web hosting configuration, they may be.
Regarding the pros and cons of HTTPS vs. HTTP, there are already a few posts addressing that:
HTTP vs HTTPS performance
HTTPS vs HTTP speed comparison
HTTPS only encrypts the data between the client computer and the server. It does not fix software holes or issues such as remote JavaScript includes. HTTPS doesn't make your application better; it only helps secure the data between the user and your app. You still need to make sure your app has no security holes: filter all incoming data (especially anything that reaches SQL) and review security logs frequently.
However, if you're only responsible for the frontend part of the site, I wouldn't worry about it; I would instead raise the security concerns with the main backend developer.
One concern is that HTTPS traffic can be blocked. For example, on Apple computers with parental controls enabled, HTTPS traffic is blocked because the filter cannot read the encrypted content; you can read about it here:
http://support.apple.com/kb/ht2900
https note: For websites that use SSL encryption (the URL will usually begin with https), the Internet content filter is unable to examine the encrypted content of the page. For this reason, encrypted websites must be explicitly allowed using the Always Allow list. Encrypted websites that are not on the Always Allow list will be blocked by the automatic Internet content filter.
An important "pro" for more https at your site is the following:
a user connecting thru an unencrypted WiFi, like at an airport, can give their password in https, but if the site then switches back to http after the password page, the session cookie becomes exposed and can be immediately used by an eavesdropper.
See this article http://steve.grc.com/2010/10/28/why-firesheeps-time-has-come/#comment-2666

Can I unprotect a single script via .htaccess using CodeIgniter?

I'm in a development environment and we're using basic .htaccess/.htpasswd authentication to keep lurkers out. But some of my AJAX calls are coming back with HTTP 401 authentication errors. Is it possible to allow unauthenticated access to only those specific URLs? I can't easily do it by popping a new .htaccess into a subfolder, because CodeIgniter uses rewrites.
It's not possible to allow unauthenticated access to only those specific URLs. Unfortunately, .htaccess/.htpasswd authentication operates at the directory level only. And you're exactly right about why just using a subdirectory won't work: because of CI's rewrites, which resolve routes after Apache has already transferred control to CodeIgniter's index.php front controller.
The easy option, if you're working on something that (1) is unlikely to be hacked in the first place, and (2) can't reveal sensitive data even if it is, is security through obscurity. Don't link to your dev site from anywhere, include a noindex directive for search engine crawlers, and go on your merry way. This also has the advantage that you can show versions of the site to colleagues and friends by just telling them the URL.
If you're more worried about security, then you're probably building an auth module for your website's users anyway. In that case, for your dev environment, just call that auth module in the constructor of each of your controllers, and redirect to the login page if the user is not logged in.
Good luck!
