301 complex redirection via .htaccess [closed]

Closed. This question is off-topic. It is not currently accepting answers.
Want to improve this question? Update the question so it's on-topic for Stack Overflow.
Closed 10 years ago.
Hi guys, I'm currently in the middle of a migration and need the client's traffic sent on to the new URLs as a failsafe.
The old domain has URLs like olddomain.com/abcd/1234, and I need to redirect those to newdomain.com/?parameter1=1234&parameter2=ABCD.
This is probably easy to do, and I'm guessing it can be done via a .htaccess rule so we can make sure there's no leftover traffic. I'm not a developer, but I'm a techie and need to advise the client's tech on putting this in place, since it was my idea to help them out.
Any help would be much appreciated!
Thanks in advance!

The following .htaccess rule will redirect http://olddomain.com/abcd/1234 to http://newdomain.com/?parameter1=abcd&parameter2=1234.
Notice that the two captured groups (defined by parentheses) in the regular expression are numbered from 1 upwards, so the first captured group is available in the redirect URL as $1 and the second as $2. Please adjust to your requirements accordingly.
RewriteRule ^(.+)/(.+)$ http://newdomain.com/?parameter1=$1&parameter2=$2 [R=301,L]
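As a quick sanity check outside Apache, the capture-group behaviour of that pattern can be reproduced with an ordinary regex engine; this is a hypothetical test in Python, not part of the rule itself:

```python
import re

# RewriteRule patterns are PCRE-flavoured, so Python's re module is close
# enough to check what the two capture groups pick up.
pattern = re.compile(r"^(.+)/(.+)$")

m = pattern.match("abcd/1234")
print(m.group(1), m.group(2))  # abcd 1234

# The first (.+) is greedy, so with extra path segments it swallows
# everything up to the last slash:
m2 = pattern.match("abcd/efgh/1234")
print(m2.group(1), m2.group(2))  # abcd/efgh 1234
```

If the old site has deeper paths than the two-segment example, that greediness is worth keeping in mind when adjusting the rule.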

Related

SEO using .htaccess? And, how to redirect a fake subdirectory to an individual page? [closed]

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
This question does not appear to be about programming within the scope defined in the help center.
Closed 9 years ago.
Since a recent redesign of our website, we've noticed that the search rankings for certain pages have plummeted, as individual publications are no longer on their own pages but rather on publications.php?magazine=xx, where xx is a unique ID number for the publication.
Is there any way to use a .htaccess file to redirect fake subdirectories to the pages, i.e. visiting /publications/magazine-name takes you to publications.php?magazine=xx, and if so: would this even have an effect on their SEO?
If not, is there any other way you can make these URL query strings more search engine-friendly?
I'm only halfway there, but using the mod_rewrite tool with something like:
RewriteRule ^advanced-lift-truck/?$ pub-automotive.php?mag=1 [NC,L]
can get me a URL that Google will understand and trawl.
Now, it's just a case of figuring out what I can do about each page effectively having the same "content", just with different CSS showing/hiding parts.
I'm investigating the following:
http://www.webdesignerdepot.com/2013/10/how-to-optimize-single-page-sites-for-search-engines/
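For what it's worth, per-page rules like the one above can be generalised with a capture group, assuming the PHP side is able to resolve a slug itself; that slug handling is an assumption here, so treat this as a sketch rather than a drop-in rule:

```apache
# Internally rewrite /publications/magazine-name to the real script.
# Assumes publications.php can look a slug up itself; otherwise you need
# one rule per publication, as in the advanced-lift-truck example above.
RewriteRule ^publications/([^/]+)/?$ publications.php?magazine=$1 [NC,L]
```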

What domain names are allowed? Is it possible to get a ".df" domain? [closed]

Closed 10 years ago.
I don't really understand the domain business, but I have seen some rare domains around, for example these webpages:
http://c9.io/
http://repl.it/
And I want to know if it is possible to get any domain I want, let's say a .rf domain or a .kj domain.
Thanks for your help
No, it doesn't seem to be possible to get a .df domain. Here is a list of the possible domain extensions:
http://www.idcwebs.com/Understanding_Web_Extensions.htm
http://www.webopedia.com/quick_ref/topleveldomains/countrycodeA-E.asp
You can probably get df.someavailabledomainextension, for example df.com or df.me, and then create subdomains like mysite.df.com, if that would work for you.

.htaccess to allow #fontface [closed]

Closed 11 years ago.
I am using the master .htaccess from http://docs.joomla.org
It's working well, but it's also locking out all my woff|eot|svg|ttf files.
How can I add these to the allowed file types?
My fonts reside in /templates/mytemplate/css/type/ folder
I believe this is the line in question:
## Allow limited access for certain Joomla! system directories with client-accessible content
RewriteRule ^(components|modules|plugins|templates)/([^/]+/)*([^/.]+\.)+(jp(e?g|2)?|png|gif|bmp|css|js|swf|html?|mp(eg?|[34])|avi|wav|og[gv]|xlsx?|docx?|pptx?|zip|rar|pdf|xps|txt|7z|svg|od[tsp]|flv|mov)$ - [L]
Basically, in certain folders, including the templates/ folder, only certain file types are allowed, and your files aren't in the list. Adding them to the final bracketed group, pipe-separated, should do the job (note that svg is already in the list, so only the missing extensions need adding). Something like...
RewriteRule ^(components|modules|plugins|templates)/([^/]+/)*([^/.]+\.)+(jp(e?g|2)?|png|gif|bmp|css|js|swf|html?|mp(eg?|[34])|avi|wav|og[gv]|xlsx?|docx?|pptx?|zip|rar|pdf|xps|txt|7z|svg|od[tsp]|flv|mov|woff|eot|ttf)$ - [L]
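If you want to check the pattern before deploying it, the allow-list can be exercised with an ordinary regex engine. The sketch below uses a shortened version of the extension list and made-up file paths, purely as a hypothetical test, not Joomla code:

```python
import re

# Reduced version of the Joomla allow-list pattern, with the font
# extensions appended (svg already appears in the original list).
allowed = re.compile(
    r"^(components|modules|plugins|templates)/([^/]+/)*([^/.]+\.)+"
    r"(css|js|png|gif|svg|woff|eot|ttf)$"
)

# A font under the templates/ tree now passes the allow-list...
print(bool(allowed.match("templates/mytemplate/css/type/myfont.woff")))

# ...while an unlisted extension still does not.
print(bool(allowed.match("templates/mytemplate/css/type/myfont.xyz")))
```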

Difference between Apache rules to block libwww-perl [closed]

Closed 10 years ago.
I want to know the difference between these rules, and which is most effective at blocking libwww-perl via a .htaccess file:
SetEnvIfNoCase User-Agent "libwww-perl" bad_bot
Order Deny,Allow
Deny from env=bad_bot
or
RewriteCond %{HTTP_USER_AGENT} libwww-perl.*
RewriteRule .* - [F,L]
Thank you!
Functionally, I think they are much the same, with some minor pros and cons. However, the former is probably more portable, since you won't have to worry about mod_rewrite being installed on the server should you move the site at some point in the future.
Naturally, if you have other mod_rewrite rules this won't make much difference to you.
You also have a wildcard in the mod_rewrite rule that isn't present in the SetEnvIfNoCase line. I understand it's possible to do that there as well, and it might be wise to, since you can then catch different libwww versions.
I'm sure you know libwww-perl can send an arbitrary user-agent string, so neither approach will stop a determined attacker.
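One further difference worth noting: SetEnvIfNoCase matches regardless of case, whereas a RewriteCond pattern is case-sensitive unless you add the [NC] flag. A rough Python analogy (the user-agent strings here are made up for illustration):

```python
import re

ua = "LibWWW-Perl/5.803"

# RewriteCond default: case-sensitive, so the mixed-case agent slips past.
case_sensitive = re.search(r"libwww-perl", ua)

# SetEnvIfNoCase (or RewriteCond with [NC]): case-insensitive, so it's caught.
case_insensitive = re.search(r"libwww-perl", ua, re.I)

print(bool(case_sensitive))    # False: case mismatch
print(bool(case_insensitive))  # True
```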

Learning .htaccess [closed]

Closed 9 years ago.
I would like to learn about the .htaccess file, from the very basics to the complex parts: all its capabilities, including blocking users, authentication, hiding files, and redirection. So far I have only used such files, but I want to learn about them and understand them, so that I will be able to create my own rules.
Could you please guide me through this and point me to basic and expert guides, lessons, or even books? Anything, from basic to complex.
The page "more .htaccess tips and tricks" is the best simple introduction to using .htaccess for rewriting and redirecting that I've found, and it's easier to understand than the official Apache guide. You have to figure everything out from the examples, but it's a good selection of most of the common things you'd want to do, rewrite-wise.
There is also an "ultimate" sample .htaccess file and Apache's URL rewriting guide.
