My server is case-sensitive, and I'd like to make it case-insensitive.
An example of what I mean: let's say I upload Fruit.php. Then going to this file won't work:
www.website.com/fruit.php
but this one will:
www.website.com/Fruit.php
Is there a way to make both Fruit.php and fruit.php work? The same goes for directories, e.g.:
/Script/script.php
/script/Script.php
You need to use the mod_speling (sic) Apache module:
http://httpd.apache.org/docs/1.3/mod/mod_speling.html
In .htaccess
<IfModule mod_speling.c>
CheckCaseOnly On
CheckSpelling On
</IfModule>
The CheckSpelling directive makes Apache put more effort into finding a match, e.g. correcting common spelling mistakes.
Case sensitivity depends on the file system, not Apache. There is a partial solution, however. mod_rewrite can coerce everything to lowercase (or uppercase) like so:
# Note: RewriteMap is only allowed in the server or virtual host configuration, not in .htaccess.
RewriteMap tolowercase int:tolower
RewriteRule ^(.*)$ ${tolowercase:$1}
Reference: http://httpd.apache.org/docs/2.2/mod/mod_rewrite.html#rewritemap
Unfortunately, this only works if all your files are already lowercase, whereas you specified mixed-case filenames (Fruit.php). Are you comfortable renaming all the files in your project to lowercase?
UNIX servers are case-sensitive: they distinguish between upper-case and lower-case letters in file and folder names. So if you move your website from a Windows server to a UNIX server (when you change web host, for instance), you risk a certain number of "Page not found" (404) errors, because directories and other websites linking to yours sometimes get the case wrong (typically writing the first letter of folder names in upper-case, etc.). This JavaScript-based custom 404 error page works around the problem by converting URLs to lowercase.
You can get the script from http://www.forbrugerportalen.dk/sider/404casescript.js
Happy coding!
I want robots.txt to allow only index.php and the images folder and disallow all other folders. Is this possible?
This is my code:
User-agent: *
Allow: /index.php
Allow: /images
Disallow: /
Secondly, is it possible to do the same job with .htaccess?
First, be aware that the "Allow" option is actually a non-standard extension and is not supported by all crawlers. See the wiki page (in the "Nonstandard extensions" section) and the robotstxt.org page.
To quote the latter (robotstxt.org): "This is currently a bit awkward, as there is no 'Allow' field. The easy way is to put all files to be disallowed into a separate directory, say 'stuff', and leave the one file in the level above this directory."
Some major crawlers do support it, but frustratingly they handle it in different ways. For example, Google prioritises Allow statements by matching characters and path length, whereas Bing prefers you to simply put the Allow statements first. The example you've given above will work in both cases, though.
Bear in mind that crawlers which do not support it will simply ignore it and will therefore only see your "Disallow" rule, effectively stopping them from indexing your entire site! You have to decide whether the extra work of moving files around (or writing a long list of Disallow rules for all your subdirectories) is really worth the bonus of getting indexed by the lesser crawlers. Probably not.
Regarding .htaccess, you can't really do anything useful with it here. You'd have to match the user agent against a long list of known bots, and you'd just end up missing some, or worse, blocking real users.
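Just for illustration, a sketch of what such a blacklist would look like (the bot names here are placeholders, not real crawlers, which is part of the problem: any real list is long and always out of date):
# Hypothetical user-agent blacklist; ExampleBot and OtherCrawler are made-up names.
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} (ExampleBot|OtherCrawler) [NC]
RewriteRule ^ - [F]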
Yes, that code is correct. The robots.txt file is read from top to bottom, so as long as the Disallow is at the bottom you won't run into problems. This is because a crawler matches the first applicable rule; if the Disallow were at the top, it would never reach the Allow statements.
Edit/Sidenote:
This only applies to "good" robots (Googlebot, Bingbot, etc.) which follow the standard. Plenty of other robots either misinterpret the robots.txt file or ignore it completely.
I want to remove the numbers at the end of URLs in a specific folder, using .htaccess (the numbers and the minus sign before them), for all URLs in that folder.
For example
http://www.example.com/music/new-track-released-52
or
http://www.example.com/music/helo-there-4
Need to look like
http://www.example.com/music/new-track-released
http://www.example.com/music/helo-there
This should apply to all links in the music folder.
(I've already removed the .php extension with .htaccess.)
How can I do that?
Probably something like this:
RewriteEngine on
RewriteRule ^/music/(.+)-[0-9]+$ /music/$1
Note that this is the version for the host configuration; for .htaccess style files it has to be slightly modified. Whenever possible you should prefer the real host configuration over .htaccess style files, which are notoriously error-prone, hard to debug, and really slow the server down.
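A rough sketch of the .htaccess flavour, assuming the file sits in the document root (in per-directory context the leading slash is stripped, so the pattern must not start with one):
RewriteEngine on
RewriteRule ^music/(.+)-[0-9]+$ /music/$1 [L]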
The code below doesn't work in my .htaccess file. I mean, after this code is applied, I can still view folder indexes in the browser.
# BEGIN disable folder index
Options All -Indexes
# END disable folder index
However, the code below does work. I mean, after this code is applied, the server gives a 403 if I try to view the index of a folder that I know exists.
Options All -Indexes
I'm on shared hosting and have no access to the server config. The .htaccess file was created with Notepad++ using the encoding setting UTF-8 without BOM. Its permissions are set to 0644, and there is no other code in it.
What does this situation mean? What am I doing wrong?
OK, it looks like my original comment above pushed you in the right direction:
Most likely this is a problem with the line breaks, so that for the interpreting part of the HTTP server that "Options" line is not on a separate line and is thus also commented out. Check your line-ending characters with a hex editor; that's the only reliable tool for the job.
I usually want to set .htaccess rules slightly differently based on which server they are on, e.g. live or development.
The ErrorDocument usually needs to be different, as well as some of the AddType and SetHandler bits.
Is there any way I can make this a bit more dynamic and have it auto-detect the environment based on the URL, then set a variable and use conditionals further down in the .htaccess?
I want to do this entirely from URL detection rather than setting parameters in Apache, please :)
No, there isn't any way to set those things via some URL detection. You can't wrap the things you want (AddType, SetHandler and ErrorDocument) in normal if conditions.
You could use environment variables and mod_rewrite, but I don't think you'll like the end result. You'd have to do something like the following, using the env|E=[!]VAR[:VAL] flag syntax.
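A minimal sketch of that approach, assuming a made-up dev.example.com hostname for the development box; the variable can be tested by later RewriteCond lines via %{ENV:APP_ENV}, but it still cannot drive AddType, SetHandler or ErrorDocument:
RewriteEngine On
# APP_ENV is a hypothetical variable name.
RewriteCond %{HTTP_HOST} ^dev\.example\.com$ [NC]
RewriteRule ^ - [E=APP_ENV:development]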
If you were in the httpd.conf or vhost file you might be able to separate your different setups using <Directory> sections. But <Directory> is not allowed in .htaccess.
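Something along these lines, sketched for the host configuration with made-up paths (none of this works from .htaccess):
<Directory "/var/www/dev/htdocs">
    ErrorDocument 404 /errors/dev-404.html
</Directory>
<Directory "/var/www/live/htdocs">
    ErrorDocument 404 /errors/live-404.html
</Directory>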
Also, I wouldn't do this in a production environment anyway, since something could go wrong, and the detection is slower and not needed. You may instead want to look into a build script that creates/deploys your different setups for development and production depending on hostname and other factors.
I've found that one of the main things that cause .htaccess rewrite rulesets to do seemingly bizarre things is Apache deciding to apply them inside a subrequest. So much so that I now always either use the [NS] flag on my rules or use a prefix rule:
RewriteCond %{IS_SUBREQ}%{ENV:END} t|1 [NC]
RewriteRule ^ - [L]
(The %{ENV:END} bit just allows me to use E=END:1 to do the same as the Apache 2.4 END flag.)
My question is: can anyone give me a good use case where I wouldn't want to do this (or, alternatively, where I would want to use the special -U or -F condition patterns)? I realise there may be many I haven't thought of, but the accepted-answer tick goes to the first valid one.
I'd guess the typical situations where you'd want rewrite rules applied to subrequests are more or less the same as the ones where you'd use symlinks inside your document root.
For a plausible example, let's say you're using Server Side Includes, and have a bunch of files scattered around with suffixes like .html, .shtml and .htm, and perhaps some uppercase variants of these too. At some point, you decide to standardize on the .html suffix, and rename all your files accordingly. But you still have a bunch of legacy code and links that use the other suffixes, and rooting them all out will take a while.
In that case, you might want a rewrite rule like this:
RewriteRule ^(.*)\.s?html?$ $1.html [NC]
By applying this to subrequests too, you ensure that your Server Side Includes don't break because of the renaming.
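For contrast, a sketch of the same rule with [NS]: subrequests triggered by the includes would then skip it and hit the old, now-missing filenames.
# With [NS], a subrequest for e.g. <!--#include virtual="/header.shtml" -->
# is no longer rewritten to /header.html and 404s once the file has been renamed.
RewriteRule ^(.*)\.s?html?$ $1.html [NC,NS]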