I made a site a year ago using PHP, when I had a lot less experience. My teacher and I were analysing the code today and there seems to be a security issue. He wants me to fix it before he gives me the points I need.
I've got an index.php and an edit.php file in the root directory, and a login page at /php/login.php (which I now realise is a very silly place to put a login file; if I were to rewrite my site, I would probably swap edit.php's and login.php's directories).
Basically, I want these three files to be accessible externally. I want all other PHP files to be restricted from the outside, so it's impossible to make an AJAX call to /php/savefile.php from outside the system (which is the security issue I mentioned). edit.php makes the AJAX call to /php/savefile.php.
I think this is what I need to get the job done:
Order Deny,Allow
Deny from all
Allow from 127.0.0.1
<Files index.php>
Order Allow,Deny
Allow from all
</Files>
But how can I add three files instead of just one after <Files and before >?
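(I suspect a FilesMatch block with a regex could cover all three names at once; this is an untested sketch, and it would match those filenames in any directory this .htaccess applies to:)
<FilesMatch "^(index|edit|login)\.php$">
Order Allow,Deny
Allow from all
</FilesMatch>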
I've also tried a second approach:
Order Deny,Allow
Deny from all
This doesn't seem to work, because an AJAX call is just a regular HTTP request as well, so it also gets a 403 response.
Another approach I tried was putting the restricted PHP files inside a folder called "private",
in the same directory as "httpdocs" (i.e. in the parent folder of the webroot). My teacher had told me about an admin folder that no one can access but the site itself. I tried including the restricted PHP files from the private folder, but they didn't seem to be included properly...
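For clarity, the kind of include I mean looks roughly like this (the folder and file names here are illustrative):
<?php
// A script inside httpdocs pulling code from the "private" folder that
// sits next to httpdocs, one level above the webroot.
require __DIR__ . '/../private/savefile.php';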
Any help or tips for this novice at .htaccess would be appreciated :-)
Edit:
.htaccess allow access to files only from includes
Ray's comment said:
Of course, because they are requested by the client. You can't "allow the client" and "not allow the client" to serve files.
I suppose this is true, but how can I prevent people from calling my ajax file?
I secured it by checking if the user was logged in.
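For anyone reading later, the check I mean is just a session test at the top of the AJAX endpoint; a minimal sketch (the endpoint name and session key are illustrative):
<?php
// /php/savefile.php (illustrative): refuse to do anything unless the
// caller has a logged-in session; the session key is an assumption.
session_start();
if (empty($_SESSION['logged_in'])) {
    http_response_code(403);
    exit('Not allowed.');
}
// ...continue saving the file for the authenticated user...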
Related
I have a ton of PDF files in different folders on my website. I need to prevent them from being indexed by Google using .htaccess (since robots.txt apparently doesn't prevent indexing if other pages link to the files).
However, I've tried adding the following to my .htaccess file:
<Files ~ "\.pdf$">
Header append X-Robots-Tag "noindex, nofollow, noarchive, nosnippet"
</Files>
to no avail; the PDF files still show up when googling "site:mysite.com pdf", even after I've asked Google to re-index the site.
I don't have the option of hosting the files elsewhere or protecting them with a login system; I'd really like to simply get the htaccess file to do the job. What am I missing?
From the comment made on another answer, I understand that you are looking to remove files/folders that Google has already indexed. You can temporarily forbid access to them using the following, provided you stop anyone from accessing them directly.
First, let me give you a workaround; after that I will explain an option that takes a bit longer.
<Files "path/to/pdf/* ">
Order Allow,Deny
Deny from all
Require all denied
</Files>
This way, all matching PDF files under the directory where this .htaccess lives will be forbidden over HTTP. That means you can still work with them programmatically (sending them as attachments, deleting them, and so on), but users will not be able to view them directly.
You can write a script on your server side that reads the file internally and serves it, instead of exposing the direct URL (assuming the data is critical for now).
Example
<?php
// $filePath is the server-side path to the protected PDF
$contents = file_get_contents($filePath);
header('Content-Type: ' . mime_content_type($filePath));
header('Content-Length: ' . filesize($filePath));
echo $contents;
Indexing vs. Forbidding (not strictly needed now)
Preventing indexing stops the folder/files from being indexed by Google's bots or other search-engine bots; anyone visiting the URL directly will still be able to view the file.
Forbidding means that no external user or bot will be able to see or access the file/folder at all.
If you have recently forbidden access to your PDF folder, the files may still be visible in Google until Googlebot revisits your site and finds them missing, or until you set noindex for that specific folder.
You can read more about crawl rate at https://support.google.com/webmasters/answer/48620?hl=en
If you still want these removed, you can request removal in Google Search Console: https://www.google.com/webmasters/tools/googlebot-report?pli=1
Just paste this in your .htaccess file, using set instead of append:
<Files ~ "\.pdf$">
Header set X-Robots-Tag "noindex, nofollow"
</Files>
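One caveat: Header comes from mod_headers, so if there is any chance that module is not enabled, you can wrap the block in an IfModule check so it is skipped instead of breaking the site:
<IfModule mod_headers.c>
<Files ~ "\.pdf$">
Header set X-Robots-Tag "noindex, nofollow"
</Files>
</IfModule>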
I use my website to host some files, but I do not want users to download the files directly. I first want to show a preview of the docx file (which I am doing via the Office viewer from Microsoft).
Since the viewer is an embed link that takes a parameter pointing to the file, I obviously can't block the URL from Microsoft.
I have tried to allow it by IP and by URL, and I have been looking around on the internet, but I haven't found a solution yet that works for me. I mostly found solutions for blocking a site from viewing, and I have no clue how to do the inverse.
My code is currently this:
Order deny,allow
Deny from all
Allow from view.officeapps.live.com
How can I keep denying all, but allowing the domain of view.officeapps.live.com?
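(From the Apache docs, Allow from also accepts a partial domain name, but that works via a double reverse-DNS lookup of the connecting IP, so something like the sketch below would only help if Microsoft's fetching servers actually reverse-resolve to that domain, which I have not been able to confirm:)
Order deny,allow
Deny from all
# only matches clients whose IP reverse-resolves to a host under this domain
Allow from .officeapps.live.com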
Thanks in advance
Following an answer to this question, I'm starting to look at keeping multiple .htaccess files for my different environments. The gist of it is, you create a file for each environment (.htaccess-dev, .htaccess-prod, etc) so you can track them all in Git, then symlink .htaccess to whichever file you want to use on a given environment. Simple enough, and easy to rebuild if it gets destroyed.
Before I implement this though, I wanted to do my due diligence: I can't find anything relating to the security of dotfiles beyond .htaccess/.htpasswd. If I had .htaccess-dev and .htaccess-prod on my production server, would they be accessible through a browser? Are there any other security considerations I should be aware of?
There's probably something like this inside your server configuration (older Apache):
<FilesMatch "^.ht">
Order allow,deny
Deny from all
</FilesMatch>
Or maybe this (new Apache):
<Files ".ht*">
Require all denied
</Files>
Or even this (nginx):
location ~ /\.ht {
deny all;
}
As the first line of each snippet suggests, these rules restrict access to any file whose name starts with .ht. However, there's no guarantee that this configuration is present; it just happens to be in the default config for some web servers.
In short, there's nothing magical about .htaccess files being inaccessible; it's all in your config file. In your case, your alternative htaccess files happen to match the rule, but you're probably better off writing similar rules for the other files you want to deny access to, so it's explicit that you do want them stored but don't want them published.
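For example, to cover the alternative files from the question explicitly, a rule along these lines (using the newer Require syntax) should do it:
<FilesMatch "^\.htaccess-(dev|prod)$">
Require all denied
</FilesMatch>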
Is there a way to restrict external users from accessing my server files?
For example, when I access the directory http://puptaguig.net/evaluation/js/ it shows the 404 page (though that's not obvious), but when I tried to view controls.js directly at http://puptaguig.net/evaluation/js/controls.js it opened up.
IndexIgnore *
<Files .htaccess>
order allow,deny
deny from all
</Files>
I just want to keep these files inside my server directory secure from outside viewing, for various reasons. But how?
Best Regards..
siegheil/js? Should be siegheil/ns for sure?
You could chmod 000 and then no one would see them or access them. You can't have people accessing and not seeing them at the same time. Can't be done.
You can add the lines below to your httpd.conf or .htaccess; this will block access to your JavaScript files:
<Files ~ "\.js$">
Order allow,deny
Deny from all
Satisfy All
</Files>
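On Apache 2.4 and newer, a roughly equivalent block using the newer authorization syntax would be:
<Files ~ "\.js$">
Require all denied
</Files>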
The only way I can think of to manage this is to deny access to your js files by throwing a .htaccess into the siegheil/js/ folder with something along the lines of:
deny from all
or simply put your code in a folder above the document root of the site itself.
After that, you use something like minify to retrieve the js files from the backend (PHP or some other server-side language) and have the minified/obfuscated code placed in another folder or output directly from the script.
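A minimal sketch of that server-side approach, assuming the scripts are moved to a private/js/ folder above the document root (the paths and the whitelist are illustrative):
<?php
// serve-js.php: streams a whitelisted script from outside the webroot.
// The folder layout and allowed filenames are assumptions for this sketch.
$allowed = array('controls.js');
$name = basename(isset($_GET['file']) ? $_GET['file'] : '');
if (!in_array($name, $allowed, true)) {
    http_response_code(404);
    exit;
}
header('Content-Type: application/javascript');
readfile(__DIR__ . '/../private/js/' . $name);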
With all that said, in the end, the js code must be downloaded one way or another to be run by the browser. This will make it impossible to prevent people from looking at your code and figuring out what it does if they really want to.
You were able to access http://puptaguig.net/evaluation/js/controls.js but not http://puptaguig.net/evaluation/js/ because most Apache installs prevent an anonymous user from viewing the directory contents, and only permit access to specific files in the directory.
There is no way to "hide" client-side JS, because without access to those files your users will not be able to run your script. As suggested by @General Redneck, you can obfuscate and minify your js using a tool like minify or UglifyJS, but that can potentially be un-minified (minification is still a good idea for performance reasons). Ultimately you are fighting against the "open" nature of the web. I'd suggest putting a license on your code, and keeping an open mind : )
If you really need something to be secure, try accomplishing the essential functionality (which you want to keep private) with a backend language like PHP or ASP.NET and feeding the relevant data to your JS script.
You should create an .htaccess file in the relevant directory that has
Options -Indexes
in it. This will prevent listing of the directory and will cause a 403 error to be raised, which your application can then handle however it wants, displaying whatever you want.
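If you want something friendlier than the default 403 page, an ErrorDocument line in the same .htaccess can point it at your own handler (the path here is illustrative):
ErrorDocument 403 /errors/forbidden.php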
Hey. I need to prevent direct access to http://www.site.com/wp-content/uploads/folder/something.pdf through the browser.
However the Download Monitor plugin I am using, which allows logged in users to download the file, needs to be able to work.
Trying
Order Allow,Deny
Deny from all
<Files "http://www.site.com/wp-content/plugins/download-monitor/download.php">
Allow from all
</Files>
but now the download links do not work... even though (I think) they are links produced by the script, e.g.
http://www.site.com/wp-content/plugins/download-monitor/download.php?id=something.pdf
Enter that in the address bar and you correctly get a WordPress message, 'You must be logged in to download this file.'
However, if someone knows the URL where the file was uploaded
http://www.site.com/wp-content/uploads/folder/something.pdf
they can still access it directly.
I don't know how (guesswork?) they would find the direct URL anyway, but the client wants it stopped!
Thanks for any help.
You cannot solve this by setting Deny in .htaccess, because WordPress and a standard file request are handled by the same server user (www-data/apache/http or something similar).
You can, for example, set the folder's chmod to 700; that will allow access for the script but not for a direct file call.
And please accept answers to your recent questions.