htaccess doesn't allow my domain - .htaccess

I created a .htaccess file and put it in the public_html directory (the root of my server):
<IfModule mod_rewrite.c>
RewriteEngine on
RewriteCond %{HTTP_REFERER} !^http://mdpcomics.ir/.*$ [NC]
RewriteRule \.(jpg|jpeg|png|gif)$ https://blogvault.net/wp-content/uploads/2014/12/no-hotlinking.png [NC,R,L]
</IfModule>
It blocks hotlinking and shows the replacement image instead, but it also blocks my own domain.
i.e. this link should not show the image:
http://dl.mdpcomics.ir/logo.png
but this link should show it, i.e. I want the image to be displayed when it is requested through:
http://mdpcomics.ir/?imagename=logo
but that is blocked too.
I have already searched everywhere and tried many suggestions, but all of them ended up in one of two states:
everyone can see the images
no one can see the images, including myself O.o
My server OS is Linux and my control panel is DirectAdmin.
Edit:
I realized that my host reports a fake or invalid IP: 178.63.56.20320
I got that IP with this PHP code:
echo $_SERVER['REMOTE_ADDR'];

Maybe this is your problem: in your RewriteRule you use https://, but your RewriteCond only tests for http://. You can make it match both schemes by simply adding s?:
RewriteCond %{HTTP_REFERER} !^https?://mdpcomics.ir/.*$ [NC]
Further things to be aware of:
clients won't send the Referer in a non-HTTPS request if the request originated from an HTTPS location
some clients never send the Referer header or set it to something totally different in order to protect their privacy
To catch these cases I would add another condition:
# Assume client sent "real" Referer if it begins with http(s)://
RewriteCond %{HTTP_REFERER} ^https?:// [NC]
The downside is, images can be hotlinked from foreign HTTPS locations if your site is running normal HTTP.
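Putting both suggestions together, a minimal sketch of the whole block (same domain and replacement image as in the question, with the dot in the domain escaped) would be:
<IfModule mod_rewrite.c>
RewriteEngine on
# Only act when the client sent a "real" Referer (begins with http:// or https://)
RewriteCond %{HTTP_REFERER} ^https?:// [NC]
# ...and that Referer is not your own domain (either scheme)
RewriteCond %{HTTP_REFERER} !^https?://mdpcomics\.ir/ [NC]
# Hotlinked images get the replacement image instead
RewriteRule \.(jpg|jpeg|png|gif)$ https://blogvault.net/wp-content/uploads/2014/12/no-hotlinking.png [NC,R,L]
</IfModule>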

Related

htaccess - Prevent Hotlinking/Webpage Scraping & Redirect Attacker's Webpage to Warning Page

Malicious website owners are displaying the contents of our website (say, example.com) on their own websites (say, spam.com), using something like:
<?php
$url='https://example.com/';
// using file() function to get content
$lines_array=file($url);
// turn array into one variable
$lines_string=implode('',$lines_array);
//output, you can also save it locally on the server
echo $lines_string;
?>
We want to prevent the contents of our website from displaying on their websites and redirect those requests to a warning page on our website (to a webpage and not an image).
After doing some R&D, we tried doing this:
<IfModule mod_rewrite.c>
RewriteEngine on
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^https://example\.com/.*$ [NC]
RewriteRule ^(.*) https://example.com/404 [R=301,L]
</IfModule>
But it doesn't work. What are we doing wrong?
Reference: htaccess prevent hotlink also prevents external links
"Hotlinking" and "webpage scraping" are two very different things. What you describe with the snippet of simplified PHP code is a form of "webpage scraping" or even "cloning". This does not (or is very unlikely to) generate a Referer header in the request, so cannot be blocked by simply checking the Referer (ie. HTTP_REFERER server variable) as you would do with "hotlinking".
(Your example mod_rewrite code blocks "hotlinking", not "scraping/cloning".)
The only way to block these types of requests is to block the IP address of the server making the request. For example, if the "malicious" requests are coming from 203.0.113.111 then you would do something like the following in the Apache 2.4 config (or .htaccess file) to block such requests:
<RequireAll>
Require all granted
Require not ip 203.0.113.111
</RequireAll>
However, the requests may not be coming from the same IP address that is hosting the "cloned" content. You'll need to determine this from your server's access logs. But to further complicate this the "attacker" may be using a series of IP addresses or have access to a botnet of ever-changing IPs. This can quickly become almost impossible to block without access to a more comprehensive firewall.
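For example, if the offending requests at least come from a contiguous block of addresses, Require not ip also accepts CIDR notation (the range below is purely illustrative):
<RequireAll>
Require all granted
# Block an entire (illustrative) range of addresses
Require not ip 203.0.113.0/24
</RequireAll>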
You can try other techniques such as issuing redirects to the canonical hostname from client-side code. However, more advanced "cloning" software (and/or reverse proxy servers) will "simply" modify the code/URLs to thwart your redirection attempts.
So I tried googling it, and found this:
RewriteEngine On
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example\.com/ [NC]
# R=404 returns a 404 response; the substitution is ignored for non-3xx codes, so use "-"
RewriteRule ^(.*)$ - [R=404,L]

Clicking on my website in Google results redirects me back to Google

I'm helping a friend out with a website created using an online platform powered by Plesk, and there's an issue when trying to access the site through Google.
Typing the domain directly into the browser works fine, but when the site is accessed through a Google search, the user is redirected back to Google.
What could be the issue?
This is my .htaccess file:
<IfModule mod_negotiation.c>
Options -MultiViews
</IfModule>
<IfModule mod_rewrite.c>
RewriteEngine On
#HTTP-HTTPS
RewriteCond %{HTTPS} off
RewriteRule (.*) https://srad.wtf/es_ES/$1 [R=301,L,QSA]
RewriteCond %{REQUEST_URI}::$1 ^(/.+)/(.*)::\2$
RewriteRule ^(.*) - [E=BASE:%1]
# Sets the HTTP_AUTHORIZATION header removed by Apache
RewriteCond %{HTTP:Authorization} .
RewriteRule ^ - [E=HTTP_AUTHORIZATION:%{HTTP:Authorization}]
RewriteCond %{ENV:REDIRECT_STATUS} ^$
RewriteRule ^index\.php(?:/(.*)|$) %{ENV:BASE}/$1 [R=301,L]
# If the requested filename exists, simply serve it.
# We only want to let Apache serve files and not directories.
RewriteCond %{REQUEST_FILENAME} -f
RewriteRule ^ - [L]
# Rewrite all other queries to the front controller.
RewriteRule ^ %{ENV:BASE}/index.php [L]
</IfModule>
<IfModule !mod_rewrite.c>
<IfModule mod_alias.c>
# When mod_rewrite is not available, we instruct a temporary redirect of
# the start page to the front controller explicitly so that the website
# and the generated links can still be used.
RedirectMatch 307 ^/$ /index.php/
# RedirectTemp cannot be used instead
</IfModule>
</IfModule>
I've removed most of the comments from the file to keep it clean.
As stated in comments, there doesn't appear to be anything in your .htaccess file that would cause this redirect.
the redirect response ... appears to be coming from an Nginx server (possibly a front-end proxy), not Apache.
@MrWhite does that mean it's something that I can't solve myself?
The Nginx server from/through which the response is ultimately being served (a front-end/caching proxy, I suspect) is part of your server config - so you would expect to have some control over this - although since you are "using an online platform", maybe not?
However, the redirect(s) you are seeing may be coming from your application server/PHP (not Nginx or Apache). The problem isn't just with "Google Chrome" (as you have tagged) or even with Google SERPs. Any inbound link to the homepage is being 302 redirected back to itself (the HTTP Referer).
Not wanting to sound alarmist, but this sort of redirect is quite typical of a hacked site - as it is potentially damaging for SEO. Although, since this only affects the homepage, is a 302 (temporary) redirect, and you appear to have other language-specific redirects in the application logic, this may just be a misconfiguration - although redirecting back to the "HTTP Referer" is quite a deliberate action!
For example, the following link to your homepage currently 302 redirects back to "this page"!
https://srad.wtf/
Workaround
Your site appears to be in two languages, as denoted by the first path segment, /en/ or /es_ES/ (default). The application logic appears to unconditionally redirect(302) to /es_ES/ if omitted (it is not deduced from the user's browser preferences or remembered for returning visitors).
You may be able to redirect to /es_ES/ early in .htaccess before the application kicks in. (By the same logic that requesting the HTTP homepage also works OK, since it is redirected to HTTPS early in .htaccess.)
Try the following, after the RewriteEngine directive:
RewriteRule ^$ https://example.com/es_ES/ [R=302,L]
Note that this is a "workaround", it doesn't fix the underlying problem.
Additionally...
#HTTP-HTTPS
RewriteCond %{HTTPS} off
RewriteRule (.*) https://srad.wtf/es_ES/$1 [R=301,L,QSA]
This HTTP to HTTPS redirect is not strictly correct, as it unconditionally prefixes the request with /es_ES/ even when a valid language code is already present. e.g. Request http://example.com/es_ES/about (HTTP) and you are redirected to https://example.com/es_ES/es_ES/about (404), etc.
The HTTP to HTTPS redirect should simply redirect to the same URL-path (resolve any other language/path issues elsewhere*1). For example, this should be written:
RewriteCond %{HTTPS} off
RewriteRule (.*) https://example.com/$1 [R=301,L]
The QSA (Query String Append) flag is not required since the query string (if any) is passed through by default, unless you create a new query string on the substitution string (the QSA flag would then be required to append the query string from the original request).
(*1 To some extent, the preceding "workaround" resolves the missing language code.)
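As an aside, here is a sketch of the one case where QSA would be needed: the substitution itself creates a new query string (the from=http parameter is purely hypothetical), and QSA appends the query string from the original request to it.
RewriteCond %{HTTPS} off
# Hypothetical: a new query string (?from=http) is created on the substitution,
# so QSA is needed to also append the query string from the original request
RewriteRule (.*) https://example.com/$1?from=http [R=301,L,QSA]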
Strictly speaking, the language should be defaulted conditionally based on the value of the Accept-Language HTTP request header - but this is best done in PHP, not .htaccess.

Trying to put website into Maintenance Mode (302) - 'Too many redirects' htaccess issue

I'm trying to put my webpage into Maintenance Mode by using htaccess to redirect any page that begins with (domain name) to a maintenance.php file within a folder inside the root.
I got this to work on localhost with no issues, but it just won't work when I put it on my web host server. It keeps saying there are too many redirects (there's an infinite loop going on).
# MAINTENANCE-PAGE REDIRECT
<IfModule mod_rewrite.c>
RewriteEngine on
RewriteCond %{REMOTE_ADDR} !^100\.184\.54\.96
RewriteCond %{REQUEST_URI} !/maintenance/maintenance.php$ [NC]
RewriteCond %{REQUEST_URI} !\.(jpe?g?|png|gif) [NC]
RewriteRule .* /maintenance/maintenance.php [R=302,L]
</IfModule>
I tried plenty of the answers given to other questions such as
.htaccess error - ERR_TOO_MANY_REDIRECTS
htaccess maintenance page redirect results in "too many redirects" error
...among others. The same error keeps coming. I have another domain (domain-1) redirecting to the current webpage (domain-2), tried turning that off to see if it works, nope.
After following a ton of suggestions and styles from around the net, I finally came to a solution that worked for this issue.
To redirect all pages and sub-directories for your domain name to a maintenance page, create two files:
maintenance.html (maintenance page)
maintenance.enable (empty file)
Use the following code in your .htaccess file:
RewriteEngine On
RewriteCond %{REMOTE_ADDR} !^105\.228\.123\.16
RewriteCond %{DOCUMENT_ROOT}/maintenance.html -f
RewriteCond %{DOCUMENT_ROOT}/maintenance.enable -f
RewriteCond %{SCRIPT_FILENAME} !maintenance.html
RewriteRule ^.*$ /maintenance.html [R=503,L]
ErrorDocument 503 /maintenance.html
Header Set Cache-Control "max-age=0, no-store"
Be sure to place the 2 files in the same directory as your index page.
That's the solution that worked in my case. I'm yet to try it out with external resources (CSS/JS files and images), but I think it shouldn't take more than some tweaking of the above code. Hope it helps someone else too.
EDIT
For external resources and styling just add this line:
RewriteCond %{REQUEST_URI} !\.(jpe?g?|png|gif|css|js|ico)
Be sure to place all of the relevant directories (containing the stylesheets and scripts) in the same directory as the maintenance.html page.
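For reference, a sketch of the combined block with that extra exception folded in (same IP address and filenames as above) might look like this:
RewriteEngine On
# Let your own IP through
RewriteCond %{REMOTE_ADDR} !^105\.228\.123\.16
# Don't touch static resources (images, stylesheets, scripts, favicon)
RewriteCond %{REQUEST_URI} !\.(jpe?g|png|gif|css|js|ico)
# Only act while both maintenance files exist
RewriteCond %{DOCUMENT_ROOT}/maintenance.html -f
RewriteCond %{DOCUMENT_ROOT}/maintenance.enable -f
# Avoid looping on the maintenance page itself
RewriteCond %{SCRIPT_FILENAME} !maintenance.html
RewriteRule ^.*$ /maintenance.html [R=503,L]
ErrorDocument 503 /maintenance.html
Header Set Cache-Control "max-age=0, no-store"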
I could be wrong but it seems like a bad idea to use this in conjunction with Header Set Cache-Control "max-age=0, no-store" if you're going to keep the maintenance page up for a while. I leave that for the experts though :-)
My maintenance page is a fancy countdown page.
This is actually part of the problem. Your "fancy" page contains links to numerous CSS and JS files (and the favicon.ico file) - 17 files in total - your .htaccess redirect will redirect these requests as well (all to your maintenance.php page - which will trigger further redirects etc.). You'll need to make additional exceptions for these URL/file extensions. For example:
RewriteEngine on
RewriteCond %{REMOTE_ADDR} !^100\.184\.54\.96
RewriteCond %{REQUEST_URI} !\.(jpe?g?|png|gif|css|js|ico)$
RewriteRule !maintenance\.php$ /maintenance/maintenance.php [R=302,L]
The <IfModule mod_rewrite.c> wrapper is not required (you know your server).
The NC flag is not required unless you really do have mixed case extensions.
I realise this isn't a normal "site down for maintenance" type page; however, maintenance pages should ideally link to as few external resources as possible - to avoid issues like the above, but also because you don't want to be in a situation where the maintenance page itself cannot be displayed because the site is down for maintenance!

.htaccess cans and can'ts

I am very new to the idea of .htaccess and thought that it was what you used to do something like turn this:
http://www.domain.com/some/ugly/url/here.html
into this:
http://www.domain.com/niceurl
I was just told by my ISP that, to get that to happen, you put the document into the web root folder, and that .htaccess isn't used at all.
Does anyone know if this is true? I see a lot of examples about what .htaccess DOES but not so much about what it can't do. Somehow I thought this was all that was needed.
Lastly, if someone types in www.domain.com/niceurl what will happen? Don't I need to have that linked (if not by htaccess, how?!) to the location of the actual file?
Thank you for any and all help. I realize that .htaccess questions abound but they're hard to pick through for the layperson and I'm hoping to answer this specific question.
Here's what I believe is the answer you want; put the block below into your .htaccess:
Answer:
## Enable Apache's mod_rewrite module.
RewriteEngine On
# Base URL-path for relative rewrites (just / for the web root, e.g. www.domain.com/)
RewriteBase /
# Redirect www.domain.com/bar to www.domain.com/foo (mod_alias, independent of the rules below)
Redirect 301 /bar /foo
# Restrict rewriting to URLs that do not map to an existing directory or file
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_FILENAME} !-f
# Internally load the long URL without changing the URL in the address bar
RewriteRule ^foo/?$ /some/ugly/long/thing/here.html [L,NC]
As a result, www.domain.com/bar will be redirected to www.domain.com/foo and /foo will internally load http://www.domain.com/some/ugly/long/thing/here.html
FYI:
Your website's URLs don't have to be directly related to physical file paths. URL path segments can serve as aliases for URL parameters. For example,
http://www.domain.com/index.php?key1=value1&key2=value2
can be represented as
http://www.domain.com/value1/value2
Note: you need to implement a server-side script to act as a router that handles the URL segments.
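For illustration, a minimal (hypothetical) set of rules that maps those two path segments onto the query-string parameters above, with index.php acting as the router, might look like:
RewriteEngine On
# Skip existing files and directories
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
# Internally map /value1/value2 to index.php?key1=value1&key2=value2
RewriteRule ^([^/]+)/([^/]+)/?$ index.php?key1=$1&key2=$2 [L,QSA]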
For more information about using .htaccess, check this out
Ref: http://htaccess-guide.com/
.htaccess files can be used to alter the configuration of the Apache Web Server software to enable/disable additional functionality and features that the Apache Web Server software has to offer. These facilities include basic redirect functionality, for instance if a 404 file not found error occurs, or for more advanced functions such as content password protection or image hot link prevention.
Below are a few examples:
# Custom Error Pages for Better SEO,
# for e.g, to handle 404 file not found error
ErrorDocument 404 http://www.domain.com/404page.html
# Deny visitors by IP address
order allow,deny
deny from 122.248.102.86
deny from 188.40.112.210
allow from all
# Redirects
Redirect 302 /en/my-dir/my-page.html /en/my-path/example.html
# Disallow some silly bots from crawling your sites
RewriteCond %{HTTP_USER_AGENT} (?i)^.*(BlackWidow|Bot\ mailto:craftbot@yahoo.com|ChinaClaw|Custo|DISCo|Download\ Demon|eCatch|EirGrabber|EmailSiphon|EmailWolf|Express\ WebPictures|ExtractorPro|EyeNetIE|FlashGet|GetRight|GetWeb!|Go!Zilla|Go-Ahead-Got-It|GrabNet).*$
RewriteRule .* - [R=403,L]
# Setting server timezone
SetEnv TZ America/Los_Angeles
# trailing slash enforcement,
# e.g, http://www.domain.com/niceurl to http://www.domain.com/niceurl/
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_URI} !#
RewriteCond %{REQUEST_URI} !(.*)/$
RewriteRule ^(.*)$ http://www.domain.com/$1/ [L,R=301]
Enable mod_rewrite and .htaccess through httpd.conf (if not already enabled), and then you can use this code in your DOCUMENT_ROOT/.htaccess file:
RewriteEngine On
RewriteRule ^niceurl/?$ some/ugly/url/here.html [L,NC]
This will allow you to use http://domain.com/niceurl in your browser and it will internally load http://domain.com/some/ugly/url/here.html without changing URL in browser.
If you also want to force redirection from ugly URL to pretty URL then add this redirect rule just below RewriteEngine On line:
RewriteCond %{THE_REQUEST} \s/+some/ugly/url/here\.html [NC]
RewriteRule ^ /niceurl [R=302,L,NE]

htaccess only accept traffic from specific http_referer

I'm trying to set up a htaccess file that would accomplish the following:
Only allow my website to be viewed if the viewing user is coming from a specific domain (link)
So, for instance, I have a domain called protect.mydomain.com. I only want people coming from a link on unprotected.mydomain.com to be able to access protect.mydomain.com.
The big outstanding issue I have is that if you get to protect.mydomain.com from unprotected.mydomain.com and then click a link on protect.mydomain.com that goes to another page under protect.mydomain.com, I get sent back to my redirect, because the HTTP_REFERER is protect.mydomain.com. To combat that, I put in a check to also allow the referrer to be protect.mydomain.com. It's not working, and access is allowed from everywhere. Here is my .htaccess file (all of this is under HTTPS):
RewriteEngine On
RewriteBase /
RewriteCond %{HTTP_REFERER} ^https://(.+\.)*mydomain\.com
RewriteCond %1 !^(protect|unprotected)\.$
RewriteRule ^.*$ https://unprotected.mydomain.com/ [R=301,L]
You are matching your referer against ^https://(.+\.)*mydomain\.com. Which means if some completely other site, say http://stealing_your_images.com/ links to something on protect.mydomain.com, the first condition will fail, thus the request is never redirected to https://unprotected.mydomain.com/. You want to approach it from the other direction, only allow certain referers to pass through, then redirect everything else:
RewriteEngine On
RewriteBase /
# allow these referers to passthrough
RewriteCond %{HTTP_REFERER} ^https://(protect|unprotected)\.mydomain\.com
RewriteRule ^ - [L]
# redirect everything else
RewriteRule ^ https://unprotected.mydomain.com/ [R,L]
