Best way to password protect a site? .htaccess

I created/edited a .htaccess file and got my site password protected fine. One question, though: is there such a thing as a URL key? Maybe I'm wording that incorrectly, but I would like to keep my site hidden yet still be able to send out a specific URL that can view the site. What's the best way to accomplish this?
Thanks in advance.

If doing as Greg suggests and putting it in a folder isn't good enough for you, you could set the .htaccess to rewrite all URLs to a PHP file (or whatever language you are using) that checks some sort of database (or XML file, or whatever format you want) for a key or parameter in the URL; if it's not there, the script can return a 404 in the header. That way, unless someone guesses the URL exactly, it will return a 404 as if nothing were there. A sketch of this is below.
And be sure to turn automatic directory listings off too (Options -Indexes).
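A minimal sketch of that approach, assuming a key parameter named k and a flat file keys.txt of valid keys (both names are made up for illustration). First the .htaccess, which funnels everything through one script:

RewriteEngine on
# Send every request to the gatekeeper script, except the script itself
RewriteCond %{REQUEST_URI} !^/gate\.php
RewriteRule ^(.*)$ /gate.php?path=$1 [QSA,L]

And gate.php, which answers 404 unless a valid key is present:

<?php
// gate.php - hypothetical gatekeeper; deny with a 404 unless ?k= matches a known key
$validKeys = file('keys.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
if (!isset($_GET['k']) || !in_array($_GET['k'], $validKeys, true)) {
    header('HTTP/1.0 404 Not Found'); // looks exactly like a missing page
    exit;
}
// ...otherwise include or serve the content for $_GET['path']...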

Simply put your site in a directory that's using a "secret" name:
http://example.com/opensesame/
If you don't link to that, then it will be "hidden" unless you give out the URL.
Naturally, this doesn't protect against other people publishing your "secret" URL and linking to your site anyway.

Related

codeigniter controlled access to a url/folder

I am stuck in a situation where I want a URL, which points to a folder containing some files (HTML, SWF etc.), to be accessible only after I have validated the user.
For example, the URL to access is:
A - http://mysite.com/files/version/1/file.swf
and this URL is reached from the link
B - http://mysite.com/view/1
I have implemented a way to hide URL A from a normal user, but if the user is a semi-techie person he can find the SWF file's location with Firebug or other tools. So what should I do to make access to the file secure?
If a user somehow knows the first URL (A) and enters it in the browser, I have to check whether the user is logged in, and only if validation passes should URL A be allowed to load.
Since, in CI, controller names cannot be the same as folder names in the root directory, I cannot have a controller called "files". So the only option left to make this secure access work seems to be an .htaccess rule/condition. If that is the only option, how can it be achieved with .htaccess, and if not, what other options do I have?
Will CodeIgniter's URI routes work? I tried this:
$route['files/version/1/(:any)'] = "view/$1";
but it doesn't work, maybe because there is no controller/function/param as files/version/1 ...
Looking for quick help. Thanks
There isn't a sure-fire way to do it without, for example, using .htpasswd.
One thing you could implement is a sort of "security by obscurity". In that case you could redirect all requests for a file to the URL http://mysite.com/view/file-id and then, instead of serving the requested file directly, load a .php template that outputs the appropriate headers, be it for an image, a Flash file or anything else.
But it really depends on how the files are going to be managed, since every file will need an entry in the database and you would have to output different headers for different types of files. And if someone still manages to guess the path to the file, it will be directly accessible.
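A rough sketch of such a template, with a hypothetical is_logged_in() check and a hard-coded lookup standing in for the database:

<?php
// view.php - hypothetical: /view/1 is rewritten to /view.php?id=1
session_start();

function is_logged_in() {
    // stand-in for your real authentication check
    return !empty($_SESSION['user_id']);
}

// id => array(path on disk, MIME type); in practice this comes from the database
$files = array(
    1 => array('/var/www/protected/files/version/1/file.swf',
               'application/x-shockwave-flash'),
);

$id = isset($_GET['id']) ? (int) $_GET['id'] : 0;
if (!is_logged_in() || !isset($files[$id])) {
    header('HTTP/1.0 404 Not Found'); // hide the file's existence
    exit;
}

header('Content-Type: ' . $files[$id][1]);
header('Content-Length: ' . filesize($files[$id][0]));
readfile($files[$id][0]);

The /files/ directory itself still needs to be denied direct access (for example with Deny from all in an .htaccess inside it), so the script is the only way in.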

apache - smart way to protect and hide admin folder? robots.txt? .htaccess? Anything else or better?

How do I protect and hide the /adminblah/ folder from robots and from users, so that only the administrator knows it exists?
1) To keep robots and bots out, we can use a robots.txt file.
But that file would then contain Disallow: /adminblah/. As a result, anyone who wants to can learn the path to the administrator's folder simply by reading robots.txt.
To guard against that, we can put an .htaccess file in /adminblah/ to password protect that folder.
Is that smart? Any smarter solutions to limit access to the /adminblah/index.php page?
This question concerns all the content: admin PHP files, admin pictures, etc.
Mentioning the directory in robots.txt is not a solution; as you say yourself, it's worse than doing nothing.
.htaccess protection is a very good option on its own; add it at the root of /adminblah/ and even if someone guesses the path (robots included), they'll get nothing.
Using robots.txt to hide a directory from search engines is nothing more than security through obscurity. You must have proper access controls on the content, and an .htaccess file is perfect for this.
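For reference, the classic Basic Auth block inside /adminblah/.htaccess (the AuthUserFile path is a placeholder; keep the real .htpasswd outside the web root):

AuthType Basic
AuthName "Administration"
AuthUserFile /home/user/.htpasswd
Require valid-user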
I like to do things like this:
RedirectMatch permanent (?i)adminblah http://www.fbi.gov

Dynamically creating URLs for other websites

I'd like to know how websites create URLs that contain other domains, like these on trafficestimate.com.
I'm guessing it's some .htaccess trick that redirects domain names to a dynamic page?
Thanks
Your URL contains a GET request. When someone calls the page http://google.com/search with the parameters hl=en, safe=off etc., the page can process those parameters. For instance, safe=off means that you want unfiltered search results, and q=site:... is your search string; Google looks it up in its database and gives you the results. So when you call this URL there is probably no .htaccess processing done. However, you can process the URL and GET request with .htaccess and, for example, redirect the user to another page.
Maybe you can describe a bit further what exactly you are trying to do or want to know; that would make explaining easier.
EDIT: After reading Gumbo's comment I looked at the Google result page. So your question probably means the trafficestimate URLs. They look like http://trafficestimate.com/example.org. This is really a good case for .htaccess: they take the URL and rewrite it to http://www.trafficestimate.com/websites/?domain=example.org. There you again have a GET request, and an application builds the page.
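A minimal mod_rewrite sketch of that mapping (the internal endpoint /websites/ and the domain parameter are taken from the URL above; the rest is an assumption):

RewriteEngine on
# Leave real files and directories alone, rewrite everything else
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^([a-z0-9.-]+)$ /websites/?domain=$1 [NC,L]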
Some URL rewriting is probably involved. Otherwise they would have to create an existing file for every possible request.
Using Apache’s mod_rewrite in a .htaccess file is one option. But since the server identifies itself with “Microsoft-IIS/7.5”, they are probably rather using ISAPI_Rewrite, a mod_rewrite derivative for Microsoft’s IIS.

On password-protected site, how to whitelist certain referring domains?

I have a site that is password-protected using a .htaccess and .htpasswd file. I'd like for users to bypass the login prompt ONLY if they come from a certain domain. Can this be done by embedding the .htaccess credentials as parameters in the link somehow?
I do manage the domain I'd like to whitelist, so how can I pass GET parameters in the link that the .htaccess file will process?
You should rethink this as it is trivial to spoof the referring domain (or any information from the client).
Your users can easily choose to save their username/password if they wish.
That would be highly insecure; the HTTP referrer can be easily manipulated and your login bypassed.
If you own the other sites, you can add some HTTP header or GET variable. If you don't, start thinking about another solution for what you want to do.
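If you still want the convenience anyway, Apache can combine Basic Auth with a Referer check via SetEnvIf. This is Apache 2.2 syntax, and trusted.example.com plus the .htpasswd path are placeholders; keep in mind the caveat above that the Referer header is trivially forged, so this is convenience, not security:

SetEnvIf Referer "^https?://trusted\.example\.com/" from_trusted
AuthType Basic
AuthName "Restricted"
AuthUserFile /home/user/.htpasswd
Require valid-user
Order Deny,Allow
Deny from all
Allow from env=from_trusted
# Satisfy Any: either a valid password OR a trusted Referer gets in
Satisfy Any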

.htaccess and sessions for security?

In my application users have their own "websites" which can be reached if they are signed in.
However, since these websites are just directories containing HTML and other documents, everyone in the world can reach them if they know the address. I can't have that :) A user should be able to decide whether or not the world may see their files.
Can I use .htaccess to activate a PHP-script every time a request is made to that directory?
I.e. if the requested site is "/websites/{identifier}", run is-user-allowed-to-view.php?website={identifier}
The identifier is a numeric value which refers both to a physical folder and to a record in the database, and the script would then return true or false.
Or is there perhaps another way of solving the same issue?
Cheers!
You can use mod_rewrite to rewrite requests with such a URL internally to your script:
RewriteEngine on
RewriteRule ^website/([0-9]+)$ is-user-allowed-to-view.php?website=$1
But this rule is only for the URL path /website/12345 and nothing else.
Or make every page a PHP page and put a single line at the top that redirects if the session/cookie is missing or incorrect (see the sketch below). Obviously this wouldn't work for non-PHP content such as images.
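That top-of-page check might look like this (the session key user_id and the login URL are assumptions):

<?php
session_start();
// Hypothetical gate: send anonymous visitors to the login page
if (empty($_SESSION['user_id'])) {
    header('Location: /login.php');
    exit;
}
?>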
What you need is a proper front-end (written in whatever language), and you need your web server (Apache, it seems, in your case) to pass the requests to that front-end.
You cannot do what you are asking for with just .htaccess files.
