Need to forward folder of 1000 pages - .htaccess

I have been given the task of forwarding a folder of hundreds of URLs to a new folder name, so that http://domain.com/old/1_of_1000_pages redirects to http://domain.com/new/1_of_1000_pages.
What is the best way to do this? Use .htaccess? I shouldn't have to write all of them individually, right? There might be 1,000. Is there a way to just forward everything going to /old/ to /new/ while still reaching the correct /1_of_1000_pages?
I'm a web designer, but I'm not that familiar with .htaccess code yet!
Thank you ahead of time!!!

You can use a single generic RedirectMatch rule in your website root .htaccess to handle all of the URLs:
RedirectMatch 301 ^/old/(.*)$ /new/$1
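If that same .htaccess also contains mod_rewrite rules, mixing mod_alias (RedirectMatch) with mod_rewrite can give surprising results, since the two modules act on the URL independently. A sketch of an equivalent mod_rewrite rule, assuming the .htaccess sits in the document root:
RewriteEngine on
# redirect /old/anything to /new/anything, preserving the rest of the path in $1
RewriteRule ^old/(.*)$ /new/$1 [R=301,L]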

Related

Too many Rewrite Rules in .htaccess

I had to redesign a site last week. The problem is that the old URLs weren't SEO friendly, so, in order to avoid Google penalizing my site because of too many 404 errors, I have to create a lot of rewrite rules, because all the content had awful URLs (and that content ranked well on SERPs).
For example:
RewriteRule ^documents/documents_for_subject/22-ecuaciones-exponenciales-y-logaritmicas http://%{HTTP_HOST}/1o-bachillerato/matematicas-cc.ss/aritmetica-y-algebra/ecuaciones-exponenciales-y-logaritmicas [R=301,L]
Is this a problem for my site's performance? Is there another solution to my situation?
Thanks
They are in the same domain.
Then an internal redirect is much better. A header redirect sends the new URL to the browser and causes it to make a new request; an internal one is handled, as the name says, internally.
This should work:
RewriteRule ^documents/documents_for_subject/22-ecuaciones-exponenciales-y-logaritmicas /1o-bachillerato/matematicas-cc.ss/aritmetica-y-algebra/ecuaciones-exponenciales-y-logaritmicas [L]
Any performance issues are going to be negligible with this - except maybe if you have many thousands or tens of thousands of individual rules, those may slow down Apache. In that case, if you have access to the central server configuration, put the rules there instead of a .htaccess file, because instructions in the server config get stored in memory and are faster.
A. Yes, using a 301 is the right way to notify search bots about changed URLs, and eventually your old URLs will be removed from search results.
B. You don't need %{HTTP_HOST} in your rewrite rule when the redirect stays on the same host; just use a relative target like this:
RewriteRule ^documents/documents_for_subject/22-ecuaciones-exponenciales-y-logaritmicas /1o-bachillerato/matematicas-cc.ss/aritmetica-y-algebra/ecuaciones-exponenciales-y-logaritmicas [R=301,L]
C. If you have lots of RewriteRules like the one above, I recommend using RewriteMap, or else use some scripting support (like PHP) to redirect from the old to the new URL with a 301.
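A minimal sketch of the RewriteMap approach, assuming the map file lives at /etc/apache2/redirects.map (path and map name are placeholders; note that RewriteMap itself must go in the server or virtual host configuration, not in .htaccess):
RewriteEngine on
RewriteMap redirectmap txt:/etc/apache2/redirects.map
# redirect only when the requested path has an entry in the map
RewriteCond ${redirectmap:%{REQUEST_URI}} !=""
RewriteRule ^ ${redirectmap:%{REQUEST_URI}} [R=301,L]
The map file then holds one "old-path new-path" pair per line, keys including the leading slash, for example:
/documents/documents_for_subject/22-ecuaciones-exponenciales-y-logaritmicas /1o-bachillerato/matematicas-cc.ss/aritmetica-y-algebra/ecuaciones-exponenciales-y-logaritmicas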

.htaccess 301 redirect for thousands of entries or RewriteMap

I have a site with thousands of pages that need to be redirected. I was thinking of using a 301 redirect in my .htaccess, but I'm just afraid that this will be very inefficient.
Would having a .htaccess with thousands of lines (there is no way to use a single rewrite rule; they have to be mapped one by one) mean that every time someone accesses one of our pages, the entire .htaccess has to be read? Is that a bad thing? This site is on a shared host.
I saw a previous answer here about using RewriteMap. How is that different than having the 301 redirects?
Thanks
For simple page redirects a 301 is best, and it's very fast. RewriteMap is for more complex rewrite functions or for doing very specific rewrite tasks.
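For thousands of entries, a hashed dbm map is the fast variant of RewriteMap; here is a hedged sketch with placeholder paths. Keep in mind that RewriteMap requires access to the server configuration, so it is usually not available on a shared host:
# one-time conversion of a plain-text map to a dbm hash:
#   httxt2dbm -i /etc/apache2/redirects.txt -o /etc/apache2/redirects.dbm
RewriteEngine on
RewriteMap redirects dbm:/etc/apache2/redirects.dbm
# redirect only when the requested path has an entry in the map
RewriteCond ${redirects:%{REQUEST_URI}} !=""
RewriteRule ^ ${redirects:%{REQUEST_URI}} [R=301,L]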
Before blacklisting your pages server side, I would try remapping with your application first.
If you set up the redirect with .htaccess, those pages will be dead to Google, which of course may or may not be a bad thing. Basically, once Google indexes those redirects, there really is no going back (SEO-wise).
In short redirect wisely.

.htaccess Redirect Issue, struggling with parameters

I am struggling with a URL/parameter issue.
The main index.php page of my website was previously accessed with URLs like
www.domain.com/index.php?site=3&word=21 and
www.domain.com/index.php?site=3
I have since changed this, but the old URLs are still listed in Google. So what I need is to redirect all URLs of the type above to the main page/domain, www.domain.com.
I believe .htaccess redirect is the right thing to use then, is that correct?
What type of code do I need to redirect those to the main domain? Just the URLs with parameters after index.php, etc., not other directories, e.g. www.domain.com/sunshine/.
Would be great if someone could help me with this. I believe it is actually a quick fix, but I have failed to get the code correct.
Thanks.
You can add this to an .htaccess in your site root:
RedirectMatch permanent ^/index\.php$ http://www.domain.com/
But be careful if your new index page is also named index.php, because you might run into a redirect loop.
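If you only want to catch the URLs that carry parameters, and also drop those parameters from the target, a hedged mod_rewrite alternative (a sketch, placed in the site root .htaccess) would be:
RewriteEngine on
# act only when there is a non-empty query string
RewriteCond %{QUERY_STRING} .
# the trailing "?" discards the old parameters from the redirect target
RewriteRule ^index\.php$ http://www.domain.com/? [R=301,L]
Because a plain request for /index.php has no query string, this version also sidesteps the redirect loop mentioned above.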

.htaccess redirection for specific files

I know that there are a lot of questions about .htaccess, but I am not able to find the solution.
I have an index.php on a folder on my domain.
mydomain.com/folder/index.php
This index.php lists 3 URLs that I do not want people to see; I only want them to be redirected to the first one.
I cannot modify the index.php file, so I need to redirect the traffic using .htaccess, this way:
I need to redirect everything that goes to
mydomain.com/folder/
www.mydomain.com/folder/
mydomain.com/folder/index.php
www.mydomain.com/folder/index.php
and send them into this url
www.mydomain.com/folder/text
Any ideas?
Thank you very much
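One possible sketch, assuming mod_rewrite is available and the .htaccess sits in the domain root (the pattern ignores the host, so it covers both the www and non-www variants):
RewriteEngine on
# match /folder/ and /folder/index.php; /folder/text does not match the pattern, so there is no loop
RewriteRule ^folder/(index\.php)?$ /folder/text [R=301,L]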

How do I tell search engines not to index content via secondary domain names?

I have a website at a.com (for example). I also have a couple of other domain names which I am not using for anything: b.com and c.com. They currently forward to a.com. I have noticed that Google is indexing content from my site using b.com/stuff and c.com/stuff, not just a.com/stuff. What is the proper way to tell Google to only index content via a.com, not b.com and c.com?
It seems as if a 301 redirect via htaccess is the best solution, but I am not sure how to do that. There is only the one htaccess file (each domain does not have its own htaccess file).
b.com and c.com are not meant to be aliases of a.com, they are just other domain names I am reserving for possible future projects.
robots.txt is the way to tell spiders what to crawl and what not to crawl. If you put the following in the root of your site at /robots.txt:
User-agent: *
Disallow: /
A well-behaved spider will not search any part of your site. Most large sites have a robots.txt, like Google's:
User-agent: *
Disallow: /search
Disallow: /groups
Disallow: /images
Disallow: /news
#and so on ...
You can simply create a redirect with a .htaccess file like this:
RewriteEngine on
RewriteCond %{HTTP_HOST} (^|\.)b\.com$ [NC,OR]
RewriteCond %{HTTP_HOST} (^|\.)c\.com$ [NC]
RewriteRule ^(.*)$ http://a.com/$1 [R=301,L]
It pretty much depends on what you want to achieve. A 301 says that the content has moved permanently (and it is the proper way of transferring PageRank); is this what you want to achieve?
You want Google to behave? Then you may use robots.txt, but keep in mind there is a downside: this file is readable from outside and always located in the same place, so you basically give away the location of directories and files that you may want to protect. So use robots.txt only if there is nothing worth protecting.
If there is something worth protecting, then you should password protect the directory; this would be the proper way. Google will not index password protected directories.
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=93708
For the last method, it depends on whether you want to use the httpd.conf file or .htaccess. The best way is to use httpd.conf, even if .htaccess seems easier.
http://httpd.apache.org/docs/2.0/howto/auth.html
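A minimal sketch of such password protection in httpd.conf, with hypothetical paths (the user file would be created beforehand with the htpasswd utility):
<Directory "/var/www/a.com/private">
    AuthType Basic
    AuthName "Restricted area"
    # created with: htpasswd -c /etc/apache2/.htpasswd someuser
    AuthUserFile /etc/apache2/.htpasswd
    Require valid-user
</Directory>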
Have your server-side code generate a canonical reference that points to the page to be considered the "source", a link element in the page head like <link rel="canonical" href="http://a.com/stuff" />.
Reference:
http://googlewebmastercentral.blogspot.com/2009/02/specify-your-canonical.html
- Update: this link-tag is currently also supported by Ask.com, Microsoft Live Search and Yahoo!.