How can I prevent scripts from running inside a directory? - security

I have a files directory for image storage in my web root folder, and I want to know how to secure that folder. I try to prevent people from uploading scripts to it: I check file extensions, and if a file is not an image it does not get saved there.
But extensions are easily faked, so what happens if someone manages to upload a script to my files directory and accesses it from the browser?
So I need a way to prevent scripts from running inside that folder and to only allow images to be served.
I know .htaccess can do that, but I don't know how to set it up. My .htaccess file looks like this:
AddHandler cgi-script .php .pl .py .jsp .asp .htm .shtml .sh .cgi
Options -ExecCGI
ForceType application/octet-stream
<FilesMatch "(?i)\.(gif|jpe?g|png)$">
ForceType none
</FilesMatch>
Options All -Indexes
But it is not working: I saved a PHP file in that folder, then tried to access it from the browser, and I can still access it. Do you know how to make this work? Or if you have a more secure approach to this, please tell me.
Thank you

I think it isn't working because you have only added an extra handler; you haven't removed the other handlers.
It is easiest to put another .htaccess file in the folder you want to protect (rather than messing with the match directive) that contains:
# Fix PHP; you should add matching commands for JSP, ASP and HTML
RemoveType application/x-httpd-php php
# .... add the other remove-handler statements here .... #
# Optionally make these equivalent to text files.
# UPDATE: Taken this out as you don't want people to see PHP files at all
#AddType text/html php
# To disable cgi and server side includes & indexes
# You need to check the setup of Apache, some of the file types
# listed should already be handled as CGI (.pl, .py, .sh)
Options -ExecCGI -Includes -Indexes
# Completely block access to PHP files
<FilesMatch "\.(php|phps|html|htm|jsp|asp)$">
Order allow,deny
Deny from all
</FilesMatch>
# Add in any additional types to block
That covers PHP and CGI; you should add matching commands for JSP and ASP.
UPDATE: Added code to completely block access to PHP files - sorry, I initially thought that you simply didn't want them executing. Also note that I've commented out the line that turns PHP files into text files.
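If the server runs Apache 2.4 or later (an assumption, since the question doesn't say which version), the Order/Deny pair above is deprecated and the equivalent block is:
<FilesMatch "\.(php|phps|html|htm|jsp|asp)$">
# Apache 2.4+ replacement for "Order allow,deny" / "Deny from all"
Require all denied
</FilesMatch>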

Related

How to redirect a URL that does not contain specific text using htaccess?

I have a site with some PDF files to download, and I want to block direct file downloads.
For example, if a user tries https://skpselearning.enovic.in/uploads/document/viewer.html?file=lesson9.pdf it should download, but without viewer.html? it should be blocked, i.e. https://skpselearning.enovic.in/uploads/document/lesson9.pdf should be blocked. How can I do this?
(You've not stated how your viewer.html script is downloading/displaying the .pdf files? Simply blocking direct access to the .pdf files could also block access to your script, depending on how it is implemented.)
So, all you are really asking (and all that can be answered) is how to block direct access to the .pdf files...
To block (403 Forbidden) HTTP access to all .pdf files on the site:
<Files "*.pdf">
Require all denied
</Files>
Or, to block only .pdf files in the /uploads/document subdirectory then you can use the following mod_rewrite directives at the top of the .htaccess file:
RewriteEngine On
RewriteRule ^uploads/document/[^/]+\.pdf$ - [F]
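If it turns out that viewer.html loads the PDFs through ordinary browser requests, a rough Referer-based variant is sketched below (this assumes the viewer sends a Referer header; Referer checks are easily stripped or spoofed, so treat this as a convenience rather than real protection):
RewriteEngine On
# Block direct .pdf requests, but let through requests that appear to come from viewer.html
RewriteCond %{HTTP_REFERER} !viewer\.html [NC]
RewriteRule ^uploads/document/[^/]+\.pdf$ - [F]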

.htaccess parent folder is not completely overwritten

I have adapted the .htaccess on my WordPress site and made additions such as enabling GZIP compression:
<IfModule mod_deflate.c>
SetOutputFilter DEFLATE
</IfModule>
Then I put a rule in the subfolder that should prohibit access to certain pages.
<FilesMatch "connection\.php|data\.php|protection\.php">
order allow,deny
deny from all
</FilesMatch>
Of course, I wanted the rules created in the root folder to also apply alongside the additional rules in the subfolders (each only within its respective folder). Today I read an article that made me suspicious: it claimed that a .htaccess file in a subfolder completely overwrites a .htaccess file from a parent folder, instead of just adding its own directives and overriding only those that are re-declared. So I tried it out, and in my opinion it is not true, because according to a GZIP tester the files also get compressed in a folder where I did not add this:
<IfModule mod_deflate.c>
SetOutputFilter DEFLATE
</IfModule>
Here is a diagram from the page where I found this article.
It's German, but I think you will understand it. (Verzeichnis = root folder, Unterverzeichnis = subfolder)
The question is: which is right? Did I make a mistake, and do I have to re-list the root folder's rules in every subfolder's .htaccess file each time I want to extend them, or was the text on that website simply wrong?
From the official Apache docs (https://httpd.apache.org/docs/current/howto/htaccess.html#how):
The configuration directives found in a .htaccess file are applied to the directory in which the .htaccess file is found, and to all subdirectories thereof. However, it is important to also remember that there may have been .htaccess files in directories higher up. Directives are applied in the order that they are found. Therefore, a .htaccess file in a particular directory may override directives found in .htaccess files found higher up in the directory tree. And those, in turn, may have overridden directives found yet higher up, or in the main server configuration file itself.
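In other words, directives are merged down the directory tree, and a subfolder's .htaccess only overrides what it re-declares, so your GZIP filter keeps working in subfolders. A minimal illustration (hypothetical layout):
# /.htaccess (web root) - applies to the whole site and all subfolders
<IfModule mod_deflate.c>
SetOutputFilter DEFLATE
</IfModule>
Options -Indexes

# /subfolder/.htaccess - merged on top of the root file, not a replacement for it
# The + prefix re-enables directory listings only in this folder; the DEFLATE
# output filter inherited from the root still applies here.
Options +Indexes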

Access files outside of webroot through alias and .htaccess rule

I need to make PDF files that are stored in a folder (with subfolders) outside of the web root publicly accessible via a plain URL. An alias has been created in Apache that points to this folder, so what I need now is a rule in .htaccess to make this work.
I have this alias: https://www.examplesite.com/certificate
The URLs that will be used to access these PDFs are for example: https://www.examplesite.com/certificate/2018/LGOIGD9E9345034GJERGJER.PDF
https://www.examplesite.com/certificate/2017/GSDFJGLKJNL345L34LSNFLSD.PDF
How should I format my redirect rule in .htaccess to decide if the file is to be downloaded or viewed in the browser?
Sorry about the noise, I found the answer by myself:
<FilesMatch "\.pdf$">
ForceType application/octet-stream
Header set Content-Disposition attachment
</FilesMatch>
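That forces every PDF to be downloaded. If you instead want the browser to display a PDF, a sketch of the opposite choice (assuming mod_headers is available) would be:
<FilesMatch "\.pdf$">
# Serve with the real PDF media type and ask the browser to render it inline
ForceType application/pdf
Header set Content-Disposition inline
</FilesMatch>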

Exclude directory from mod_autoindex.c Options -Indexes

I have the following in my .htaccess:
# "-Indexes" will have Apache block users from browsing folders without a
# default document. Usually you should leave this activated, because you
# shouldn't allow everybody to surf through every folder on your server (which
# includes rather private places like CMS system folders).
<IfModule mod_autoindex.c>
Options -Indexes
</IfModule>
However, I have a directory that does not have an index file (and I prefer to keep it this way), for which I need to enable access. How do I exclude this directory from the above code?
Create an .htaccess file in that directory, and in it put the following:
<IfModule mod_autoindex.c>
Options +Indexes
</IfModule>
Despite the following comment (I got this code from someone else):
# "-Indexes" will have Apache block users from browsing folders without a
# default document
...this code does not block users from accessing files in a directory that does not have an index file. Therefore, I do not need to exclude my directory from the above code.
As an example, example.com/testdir does not have an index file. But, I'm able to access example.com/testdir/testfile.txt
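One caveat, as an assumption about the server rather than something stated in the question: a per-directory .htaccess can only change Options if the main server configuration permits it, for example:
# In httpd.conf or the vhost config (the path here is hypothetical), not in .htaccess
<Directory "/var/www/html">
AllowOverride Options Indexes
</Directory>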

File only to be downloaded on specific page

I have a website with files that should only be downloaded via the download.php file; they're saved in a folder like /uploads/map1_/file.bin.
I don't want people to be able to download the files directly from the directory, only from the download page. I think this is possible with .htaccess, but I can't find how to do that.
First, create a .htaccess file in your /uploads/ folder.
Then, put this code into it:
<FilesMatch "\.bin$">
Order Allow,Deny
Deny from All
</FilesMatch>
Options -Indexes
Note: the most secure solution is to put your .bin files outside the public web root.
