I want to redirect all requests to a Perl file, which is supposed to handle them depending on the subdomain.
I tried using this .htaccess:
Options +ExecCGI
AddHandler fcgid-script .pl
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(.+?)\.example\.com$
RewriteRule ^(.*)$ /%1/main.pl/$1 [L]
I would expect that, if I open master.example.com, my ~/html-root/master/main.pl would get executed, but instead I get an "Internal Server Error".
Apache's error_log says:
[Wed Jan 20 18:06:36 2016] [error] [client xxx.xxx.xxx.xxx] Request exceeded the limit of 10 internal redirects due to probable configuration error. Use 'LimitInternalRecursion' to increase the limit if necessary. Use 'LogLevel debug' to get a backtrace., referer: http://master.example.com/
If I try to visit example.com/master/main.pl my script gets executed just fine.
What am I doing wrong?
Try your rule this way. The extra RewriteCond compares the subdomain captured from the host (%1) against the start of the request URI and skips the rewrite once the URI already begins with /subdomain/, which stops the internal redirect loop:
Options +ExecCGI
AddHandler fcgid-script .pl
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(.+?)\.example\.com$
RewriteCond %1::%{REQUEST_URI} !^(.*?)::/\1/? [NC]
RewriteRule ^(.*)$ /%1/main.pl/$1 [L]
I have found sort of a workaround:
Options +ExecCGI
AddHandler fcgid-script .pl
RewriteEngine On
RewriteCond %{ENV:REDIRECT_ENV} !^.+$
RewriteCond %{HTTP_HOST} ^(.+?)\.example\.com$
RewriteRule ^(.*)$ /%1/main.pl/$1 [L,E=ENV:%1]
Instead of excluding requests that already point at /&lt;subdomain&gt;/main.pl, I set an environment variable on the first internal redirect. Apache re-exposes variables set with E= under a REDIRECT_ prefix after an internal redirect, which is why the condition checks %{ENV:REDIRECT_ENV}. If that variable is already set, no further rewrite is applied, so I get out of the infinite loop that would otherwise happen.
I'm using .htaccess to remove the .html extension from my URLs. For example...
https://www.example.com/work.html > https://www.example.com/work
I want the site to show a 404 page (or the correct error page) if someone tries to access a file or directory that's not there, for example https://www.example.com/work/new. But this returns a 500 Internal Server Error.
The .htaccess line ErrorDocument 500 /errors/500.html doesn't return that file (it works for 404, 403, etc.). It returns this error...
Additionally, a 500 Internal Server Error error was encountered while trying to use an ErrorDocument to handle the request.
Apache error log says...
AH00124: Request exceeded the limit of 10 internal redirects due to probable configuration error. Use 'LimitInternalRecursion' to increase the limit if necessary. Use 'LogLevel debug' to get a backtrace.
I am assuming this is because it is essentially trying to find a directory within an HTML file (looking for the directory "new" inside the file "work.html"), so it returns the 500 server error rather than a 404.
Below is my .htaccess file. Can anyone help with a better way of doing this or a way around it?
# Disable Directory Listing
Options -Indexes
# X-Robots-Tag
Header set X-Robots-Tag "noindex, nofollow"
# Rewrite Engine
RewriteEngine On
# Root Directory
RewriteBase /
# Remove .html Extension
RewriteCond %{THE_REQUEST} ^GET\ (.*)\.html\ HTTP
RewriteRule (.*)\.html$ $1 [R=301]
# Remove index + Reference Directory
RewriteRule (.*)/index$ $1/ [R=301]
# Remove Trailing Slash **If Not Directory**
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_URI} /$
RewriteRule (.*)/ $1 [R=301]
# Forward Request To html File, **But Don't Redirect (Bot Friendly)**
RewriteCond %{REQUEST_FILENAME}.html -f
RewriteCond %{REQUEST_URI} !/$
RewriteRule (.*) $1\.html [L]
# Errors
ErrorDocument 403 /errors/403.html
ErrorDocument 404 /errors/404.html
ErrorDocument 500 /errors/500.html
You need L (last) flags on all your external redirects in order to prevent further processing. For example:
RewriteRule (.*)\.html$ $1 [R=301,L]
Also, make sure MultiViews is disabled:
Options -Indexes -MultiViews
Presumably, you are already linking to the URL without the extension?
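Putting the two changes together, a sketch of the rewrite-related part of the question's .htaccess, with MultiViews disabled and an L flag on every external redirect, might look like this (same file layout and error pages as in the question):
# Disable directory listing and content negotiation
Options -Indexes -MultiViews
RewriteEngine On
RewriteBase /
# Remove .html extension (external redirect, stop this pass)
RewriteCond %{THE_REQUEST} ^GET\ (.*)\.html\ HTTP
RewriteRule (.*)\.html$ $1 [R=301,L]
# Remove index + reference directory
RewriteRule (.*)/index$ $1/ [R=301,L]
# Remove trailing slash if not a directory
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_URI} /$
RewriteRule (.*)/ $1 [R=301,L]
# Internally map the extensionless URL back to the .html file
RewriteCond %{REQUEST_FILENAME}.html -f
RewriteCond %{REQUEST_URI} !/$
RewriteRule (.*) $1.html [L]
# Errors
ErrorDocument 403 /errors/403.html
ErrorDocument 404 /errors/404.html
ErrorDocument 500 /errors/500.html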
I'm not sure how to fix this or how to replicate the error, but it seems that sometimes my site goes into a 403 redirect error.
I've deleted cookies in my own browser, but it still returns 403 on other machines (on different IPs).
My hunch is that it could be the .htaccess file...
Can anyone spot anything odd with the following rules?
Thanks...
<files .htaccess>
Order allow,deny
Deny from all
</files>
Options +FollowSymLinks
RewriteEngine On
#Options -Indexes
# All pages www
RewriteEngine On
RewriteCond %{HTTP_HOST} !^www\.
RewriteRule ^(.*)$ http://www.%{HTTP_HOST}/$1 [R=301,L]
# below to force https
RewriteEngine On
RewriteCond %{HTTP_HOST} !^www\. [OR,NC]
RewriteCond %{HTTPS} off
RewriteRule ^ https://www.my-site.com%{REQUEST_URI} [NE,R=301,L]
I found out what it was: my DNS was set differently. I was using OpenDNS and it was messing up my personal browsing experience...
Apparently Bingbot is getting caught in an infinite loop on my site. It downloads pages like http://www.htmlcodetutorial.com/quicklist.html/applets/applets/applets/applets/applets/applets/applets/applets/applets/applets/applets/applets/applets/applets/sounds/forms/linking/frames/document/linking/images/_AREA_onMouseOver.html . Since I set my server to interpret .html as PHP the page is simply a copy of http://www.htmlcodetutorial.com/quicklist.html . How do I stop Bingbot from looking for these bogus copies?
Why is Bingbot looking for those pages to begin with?
I'd like to do something like the last line of the .htaccess file shown below (like at "Redirect to Apache built-in 404 page with mod_rewrite?"), but when I try RewriteRule ^.*\.html\/.*$ - [R=404] the entire site shows a 500 error.
Even when I use the last line below, it redirects to http://www.htmlcodetutorial.com/home/htmlcode/public_html/help.html, which is not what I wanted.
AddType application/x-httpd-php .php .html
RewriteEngine on
Options +FollowSymlinks
RewriteRule ^help\/.* help.html [L]
RewriteCond %{HTTP_HOST} ^example.com
RewriteRule (.*) http://www.htmlcodetutorial.com/$1 [R=301,L]
ErrorDocument 404 /404.html
RewriteRule ^.*\.html\/.*$ help.html [R=301]
P.S. I know the site is way out of date.
The problem here is that you either have MultiViews turned on, or Apache is interpreting requests like /quicklist.html/blah/blah as PATH_INFO-style requests, which are treated as valid requests for /quicklist.html.
So turn off MultiViews by changing your Options line to:
Options +FollowSymLinks -MultiViews
Then replace your last rule with:
RewriteCond %{DOCUMENT_ROOT}%{REQUEST_URI} !-f
RewriteCond %{DOCUMENT_ROOT}%{REQUEST_URI} !-d
RewriteRule ^ - [L,R=404]
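For context, integrated into the .htaccess from the question, the relevant part might look roughly like this (a sketch built only from the directives already shown in the question and this answer; the 404 rule replaces the original last RewriteRule):
AddType application/x-httpd-php .php .html
Options +FollowSymLinks -MultiViews
RewriteEngine on
RewriteRule ^help\/.* help.html [L]
RewriteCond %{HTTP_HOST} ^example.com
RewriteRule (.*) http://www.htmlcodetutorial.com/$1 [R=301,L]
# Anything that maps to neither a file nor a directory gets a 404
RewriteCond %{DOCUMENT_ROOT}%{REQUEST_URI} !-f
RewriteCond %{DOCUMENT_ROOT}%{REQUEST_URI} !-d
RewriteRule ^ - [L,R=404]
ErrorDocument 404 /404.html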
Alternatively, change your last rule to this; it returns a 404 for any URL that has extra path segments after a .html file:
RewriteRule ^(.+?\.html)/.+$ - [R=404,L,NC]
I have this very basic rewrite rule; no matter what I try, it results in an Error 500.
RewriteEngine On
RewriteRule ^folder/(.*) /folder/index.php?Alias=$1 [L]
My httpd.conf file has the following content (which seems OK to me):
<Directory "/var/www/html">
Options -Indexes FollowSymLinks
AllowOverride All
Order allow,deny
Allow from all
<IfModule mod_suphp.c>
suPHP_Engine On
suPHP_UserGroup webapps webapps
SetEnv PHP_INI_SCAN_DIR
</IfModule>
</Directory>
Any suggestions on what might be going wrong? I've also tried adding $ at the end of my rewrite rule.
The rewrite engine loops repeatedly until the URI stops changing, or until the internal redirect limit is reached, which is what causes the 500 error. Your rule's target URI, /folder/index.php, gets thrown back into the rewrite engine, and the same rule's regex, ^folder/(.*), matches it again. So you need to add some kind of condition to prevent the loop.
RewriteEngine On
RewriteCond %{REQUEST_URI} !^/folder/index\.php
RewriteRule ^folder/(.*) /folder/index.php?Alias=$1 [L]
This simply won't apply the rule if the URI already starts with /folder/index.php. You can also try:
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^folder/(.*) /folder/index.php?Alias=$1 [L]
This condition is a little less restrictive. It only applies the rule if the requested URI doesn't map to an existing file or directory. This assumes that when you go to /folder/blahblah there isn't an actual file or directory called blahblah, and that you want to route the request through your index.php.
I cannot get this .htaccess to work because there is some syntax error that I can't determine. When I access the website, it prints an Internal Server Error.
RewriteEngine On
RewriteBase /
RewriteCond %{REMOTE_ADDR} !^201\.191\.20\.108 #
RewriteCond %{REQUEST_URI} !^/mantenimiento\.php$ #
RewriteRule ^(.*)$ http://example.com/mantenimiento.php [R=307,L] #
By the way, the following error is printed in the error_log:
.htaccess: RewriteCond: bad flag delimiters
The idea behind this .htaccess is to redirect all requests on example.com to example.com/mantenimiento.php in order to simulate a "maintenance mode" for the website.
Here is an example:
Requested URL:
http://example.com
Substitution URL:
http://example.com/mantenimiento.php
RewriteEngine On
RewriteCond %{REMOTE_ADDR} !^201\.191\.20\.108
RewriteRule ^(.*)$ http://example.com/mantenimiento.php [R=301,L]
The "bad flag delimiters" error comes from the trailing # characters: Apache does not allow inline comments after a directive, so the # is parsed as an extra (flags) argument. The REQUEST_URI condition is not needed here, as there is none in the requested URL. For the redirect to happen, the REMOTE_ADDR must not be the one in the pattern. You can also use [R=307,L] for a temporary redirect.
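One caveat, offered only as a sketch: with the rule above, a non-whitelisted visitor who follows the redirect to /mantenimiento.php matches the rule again and is redirected once more, so browsers may report a redirect loop. Keeping the exclusion from the question (with the trailing # comments removed) avoids that:
RewriteEngine On
# Skip the redirect for the whitelisted IP
RewriteCond %{REMOTE_ADDR} !^201\.191\.20\.108
# Don't redirect the maintenance page to itself
RewriteCond %{REQUEST_URI} !^/mantenimiento\.php$
RewriteRule ^(.*)$ http://example.com/mantenimiento.php [R=307,L]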