I want to include multiple user agent strings in one .htaccess file (or in multiple files if that is needed, i.e. a separate file for each agent string), and I am not sure whether that is possible.
This is my current code:
SetEnvIfNoCase User-Agent "PRIVATESTRING" good_bot
<FilesMatch "\.exe$">
Order Deny,Allow
Deny from All
Allow from env=good_bot
</FilesMatch>
PRIVATESTRING is a placeholder: I am using an app that sends a custom user agent string when a user logs in to download an item from my website.
EDIT:
Answer: repeat the first line multiple times, once for each string.
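For example (GOODAPP1 and GOODAPP2 are placeholder strings, not real ones), each SetEnvIfNoCase line can set the same environment variable, and the rest of the block stays unchanged:
SetEnvIfNoCase User-Agent "GOODAPP1" good_bot
SetEnvIfNoCase User-Agent "GOODAPP2" good_bot
<FilesMatch "\.exe$">
Order Deny,Allow
Deny from All
Allow from env=good_bot
</FilesMatch>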
Related
I would like to deny access to some JSON files (for example, opening mysite.com/files/name.json in the browser has to return a 403 error), but I want to allow a specific page to read these JSON files in a JavaScript fetch function.
So I put an .htaccess file in the folder that contains the JSON files, and I tried to write this:
<Files ~ "^.*\.([Jj][Ss][Oo][Nn])">
order deny,allow
deny from all
allow from ^/lab/app/maker.js
</Files>
But it doesn't work and returns a 500 error. How can I say something like "allow these files to be read if the page trying to read them is maker.js"?
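The 500 error comes from the last line: allow from expects a client hostname or IP address, not a URL path. One common workaround, though it can be spoofed, is to key on the Referer header instead. A minimal sketch, assuming the pages that run the fetch live under /lab/app/ on mysite.com:
SetEnvIf Referer "^https?://(www\.)?mysite\.com/lab/app/" allow_fetch
<Files ~ "\.json$">
order deny,allow
deny from all
allow from env=allow_fetch
</Files>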
I have a subdomain that uses file_get_contents in PHP, and I need it to access an otherwise restricted file. In my .htaccess I have:
<Files "file.txt">
Order Allow,Deny
Allow from subdomain.site.com
Deny from all
</Files>
The main problem is that it doesn't unblock access from subdomain.site.com. I can't access the subdomain via a URL path because of site isolation put in place by my hosting provider.
The tutorial I found says that you can whitelist certain websites to access this file, but even when I follow its syntax, it won't whitelist that site.
Tutorial:
https://www.askapache.com/htaccess/#developing_sites
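Two things are worth noting here. With Order Allow,Deny, the Deny directives are evaluated last, so Deny from all wins and everything is blocked regardless of the Allow line. Also, Allow from matches the connecting client's hostname or IP (resolved by reverse DNS), not the site whose code makes the request; a file_get_contents call from the subdomain arrives from that server's own IP. A sketch of the usual shape, assuming you know the subdomain server's IP (203.0.113.10 is a placeholder):
<Files "file.txt">
Order Deny,Allow
Deny from all
Allow from 203.0.113.10
</Files>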
I have a ton of PDF files in different folders on my website. I need to prevent them from being indexed by Google using .htaccess (since robots.txt apparently doesn't prevent indexing if other pages link to the files).
However, I've tried adding the following to my .htaccess file:
<Files ~ "\.pdf$">
Header append X-Robots-Tag "noindex, nofollow, noarchive, nosnippet"
</Files>
to no avail; the PDF files still show up when googling "site:mysite.com pdf", even after I've asked Google to re-index the site.
I don't have the option of hosting the files elsewhere or protecting them with a login system; I'd really like to simply get the htaccess file to do the job. What am I missing?
As I see in a comment made on another answer, I understand that you are looking to remove files/folders that Google has already indexed. You can temporarily forbid access using the following, provided you stop anyone from accessing the files directly.
First, let me give you a workaround; after that I will explain what else you can do, which will take a bit longer.
# <Files> matches file names, not paths, so match the extension (or place this
# .htaccess inside the PDF folder). Use the directives for your Apache version, not both.
<Files ~ "\.pdf$">
# Apache 2.2:
Order Allow,Deny
Deny from all
# Apache 2.4:
# Require all denied
</Files>
This way, all files in that folder are forbidden over HTTP. You can still access them programmatically (to send them as attachments, delete them, and so on), but users will not be able to view them.
You can write a script on your server side that reads the file internally and serves it, instead of exposing a direct URL (assuming the data is sensitive for now).
Example (the file path is a placeholder):
<?php
// $filePath is the server-side path to the protected file
$filePath = '/path/to/protected/file.pdf';
$contents = file_get_contents($filePath);
header('Content-Type: ' . mime_content_type($filePath));
header('Content-Length: ' . filesize($filePath));
echo $contents;
Indexing vs. forbidding (not needed now)
Preventing indexing stops search engine bots (such as Googlebot) from indexing the folder/files, but anyone visiting the URL directly will still be able to view the file.
With forbidding, no external entity (user or bot) will be able to see or access the file/folder.
If you have recently forbidden access to your PDF folder, the files may still show up in Google until Googlebot revisits your site and finds them missing, or until you set noindex for that specific folder.
You can read more about the crawl rate at https://support.google.com/webmasters/answer/48620?hl=en
If you still want them removed, you can request removal in Google Search Console: https://www.google.com/webmasters/tools/googlebot-report?pli=1
Just paste this into your .htaccess file, using set instead of append:
<Files ~ "\.pdf$">
Header set X-Robots-Tag "noindex, nofollow"
</Files>
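Note that the Header directive needs mod_headers; if that module is not loaded, the directive causes a server error rather than silently doing nothing. Whether it is enabled on your host is an assumption worth checking; a guarded variant that keeps the site up either way:
<IfModule mod_headers.c>
<Files ~ "\.pdf$">
Header set X-Robots-Tag "noindex, nofollow"
</Files>
</IfModule>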
I have one MP3 folder that I want to protect. I'm using an .htaccess FilesMatch deny-all rule, but I need access from an audio player on one specific HTML page. Is it possible to create an exception to the deny-all rule to allow access from one specific file/domain?
Thanks
See Access Denial/Approval by Domain
<Limit GET>
order deny,allow
deny from all
allow from .yourplayerdomain.com
</Limit>
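Note that allow from matches the connecting visitor's hostname or IP, not the page that embeds the player. If the player page is on your own site, a Referer-based exception (spoofable, but usually enough for this kind of hotlink protection) is closer to what you describe. A sketch, assuming the player page is mysite.com/player.html and the files are MP3s (both placeholders):
SetEnvIf Referer "^https?://(www\.)?mysite\.com/player\.html" allow_player
<FilesMatch "\.mp3$">
order deny,allow
deny from all
allow from env=allow_player
</FilesMatch>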
I'd like to allow a friend to upload some photos for me over FTP to my server (shared host). It's a trusted friend, but I'd still like to block the execution of any PHP or similar scripts.
How can I use .htaccess (in a directory above the one I allow FTP access to) to block everything except a list of approved extensions (images) and disallow .htaccess files (to prevent any further modifications)?
Does such a method still have security risks?
Thanks!
You should be able to use:
<FilesMatch ".+">
Order Deny,Allow
Deny From All
# or whatever host you need here
Allow From localhost
</FilesMatch>
<FilesMatch "\.(jpg|gif|stuff)$">
Order Deny,Allow
Allow From All
</FilesMatch>
EDIT
For preventing further modifications to .htaccess, you need to set filesystem permissions accordingly (this is OS dependent), since you are most likely giving your friend full FTP access (including delete/overwrite/append).
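As an extra layer on top of the extension whitelist, a sketch that refuses to serve script-like files at all from the upload directory (the extension list is an assumption; extend it to whatever your host will execute):
<FilesMatch "\.(php[0-9]?|phtml|phar)$">
Order Deny,Allow
Deny From All
</FilesMatch>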