URL rewrite image links - .htaccess

I have third-party sites that link to some images on my site. The images were placed in Magento's image cache some time ago, but when the cache is refreshed, Magento changes the file names and the links become unreachable. It is not every image, just certain ones; I have 22 images that I need to handle this way.
How can I modify my .htaccess to make the links go to a static copy of the image located in another directory?

Take a look at mod_alias and RedirectMatch. You can use regular expressions to match against a URI and give a target (where to redirect to); if you don't need regular expressions, you can just use Redirect.
RedirectMatch /old_image_uri /new_image_uri
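For example, assuming the cached copies live under Magento's catalog/product/cache/ path (the exact cache path varies per install) and the static copies sit in a hypothetical /static-images/ directory, a permanent redirect per image might look like this:
RedirectMatch 301 /catalog/product/cache/.*/some-image\.jpg$ /static-images/some-image.jpg
With only 22 images, repeating one such line per image is the simplest approach; if the old URLs are exact paths rather than patterns, a plain Redirect 301 with the full old path and the new path works just as well.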

Related

Web URL without any kind of subdirectories

It is my first time seeing something like this.
Does anyone know what the name/kind/type of website is that does not show any kind of subdirectories in the web URL, where the address always just stays as the plain domain name, how it is made, and how it can be worked around, since I need to send an API call to one of those subdirectories?
Example:
I have a website, let's call it example.net. It has a UI with a home page, which I would expect to look like this in a browser: example.net/home. The UI also has a /shipment section, so the URL should look like this:
example.net/shipment, and there is one more subdirectory inside it, for example /report, so if I select it, it should look like this: example.net/shipment/report (something like this).
But whichever subdirectory I open on the website, the URL in the browser just stays as example.net the whole time, without any subdirectories ever appearing in it.
It is an internal website at work, so I cannot post examples of it here.
Does anyone know what this kind of setup is called?
And how can it be worked around, since I need to send an API request to one of those subdirectories?
I am not a developer and I am new to IT, so I am not really sure what this is called or how it works.
If you are on example.net/shipment and you want to link to a subdirectory, the link needs to include that subdirectory. You have two possibilities:
Root relative links: <a href="/shipment/report">
Absolute links: <a href="https://example.net/shipment/report">
If your shipment directory has a trailing slash (example.net/shipment/), you have a third possibility. (Note this only works with a shipment URL that is different from the one you specified in your question.)
Document relative links: <a href="report">
There is no name that I know of for websites that don't have subdirectories. Websites are often set up like this to make the URLs easy to type and remember, which helps with SEO.

.htaccess: redirect specific link to another?

I have these three links:
localhost/my_projects/my_website.php
localhost/my_projects/my_website.html
localhost/my_projects/my_website
The paths of the php and html files are as follows:
C:\xampp\htdocs\my_projects\my_website.php
C:\xampp\htdocs\my_projects\my_website.html
The link without an extension is "artificial" and I want to use said link:
localhost/my_projects/my_website
to get the contents of either of these links:
localhost/my_projects/my_website.php
localhost/my_projects/my_website.html
The reason for the two example files, instead of just one, is that I want to be able to switch between those two files when I edit the htaccess file. Obviously I only want to access one of those files at a time.
What do I need to have in my .htaccess file inside the my_projects folder to accomplish that? How can I make one specific link redirect to another specific link?
After reading your comment clarifying your folder structure, I corrected the RewriteRule. (By the way, it would be best if you added that info to the question itself instead of leaving it in the comments.)
The url you want to target is: http://localhost/my_projects/my_website
http:// is the protocol
localhost is your domain (it could also be 127.0.0.1, or a domain name like www.example.com on the Internet)
I assume you are running Apache on port 80, otherwise in the url you need to also specify the port. For port 8086 for example it would be http://localhost:8086/my_projects/my_website.
The real path is htdocs/my_projects/my_website.php or htdocs/my_projects/my_website.html depending on your needs (obviously both won't work at the same time).
Here the my_projects in the "fake" url collides with the real folder "my_projects", so Apache will go for the folder and see that there is no my_website (with no extension) document there (it won't reach the rewrite rules).
There is a question on SO that provides a workaround for this, but it is not a perfect solution; it has edge cases where the url will still fail or make other urls fail. I had posted it yesterday, but I can't seem to find it now.
The simple solution if you have the flexibility for doing it is to change the "fake" url for it not to collide with the real path.
One option is for example to replace the underscores with hyphens.
Then you would access the page as http://localhost/my-projects/my-website if you want to keep a sort of "fake" folder structure in the url. Otherwise you could simply use http://localhost/my-website.
Here are both alternatives:
# This is for the directory not to be shown. You can remove it if you don't mind that happening.
Options -Indexes
RewriteEngine On
#Rule for http://localhost/my-projects/my-website
RewriteRule ^my-projects/my-website(.+)?$ my_projects/my_website.php$1 [NC,L]
#Rule for http://localhost/my-website
RewriteRule ^my-website(.+)?$ my_projects/my_website.php$1 [NC,L]
(Don't use both, just choose one of these two, or use them to adapt it to your needs)
The first part of the rewrite rule is the regular expression for your "fake" url; the second part is the relative path of your real folder structure up to the page you want to show.
In the regular expression we capture whatever may follow .../my_website (the optional (.+)? group) and paste it after my_website.php in the second part of the rule (the $1).
Later on, if you want to point the url to my_website.html instead, change the second part of the rule: where it says .php, replace it with .html.
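For example, the second rule from above, switched over to serve the HTML file, would become:
#Rule for http://localhost/my-website, now pointing at the HTML file
RewriteRule ^my-website(.+)?$ my_projects/my_website.html$1 [NC,L]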
By the way, it is perfectly valid, and you will see it on most SEO-friendly web sites, to write a url as http://www.somesite.com/some-page-locator and have a rewrite rule that translates that url to a page on the website, which is what I had written in my first answer.

RewriteRule - redirect multi variable URL to multi variable URL

Our old website has a search URL structure like this:
example.com/Country/United States/Region/California/Area/Southern California/City/San Diego/Suburb/South Park/Type/House/Bedrooms/4/Bathrooms/3/
This is currently rewritten to point to the physical page:
/search/index.aspx
The parameters in the URL can be mixed up in different orders, and the URL can include one or more parameters.
We want to 301 redirect these old URLs to a new structure that is ordered in a logical way and more concise:
example.com/united-states/california/southern-california/san-diego/south-park/?type=house&bedrooms=4&bathrooms=3
example.com/united-states/california/?type=house&bedrooms=4&bathrooms=3
Is there a way with URL rewriting to interrogate the old URL, work out what parameters are existing and then write out the new URL structure?
Even if we can limit it to just the Country, Region, Area, City and Suburb, that may be good enough to at least return some results even if it's not perfect.
Also, spaces should be turned into hyphens and all text made lowercase.
I already have the RewriteRule to turn the new URL structure into a URL that points to a physical page. It's just transforming the old URL into the new URL that I need help with. I've googled endlessly and it's just beyond me!
Can anyone help? Thanks.
Since you already have the old search page, with rewriting rules set up for it, and it is capable of parsing all the parameters you need, the easiest and most appropriate solution I see here is to issue the redirect you require from that old search page's code. Just put code in the page that composes the new URL from all the parameters and issues the redirect - this should be a lot easier than trying to parse all these parameters in .htaccess and combine them into the new format.

CakePHP nice urls - how to prevent normal urls from working

I have a website that's written using CakePHP. I've added some rewrite rules in the .htaccess file to change the default urls to different ones (instead of /controller1/action1/parameter I have /some-string-about-controller-and-action/parameter, for example).
The problem is that now both the normal url and the nice one are available, and Google seems to be indexing both, which is a problem. I'd like to only keep the nice one. What is the proper way to handle this so that it affects the Google results as little as possible?
I don't know why you don't want to use CakePHP's own routing (if you are having trouble doing what you want, you can accomplish it with a custom route class). Either way, make sure that you redirect all relevant URLs in your .htaccess file to the desired URL using a MOVED PERMANENTLY (301) redirect.
This way Google will index the target url instead of the undesirable one. You are right to be concerned about this; duplicate indexing is a good way to harm your SEO rankings.
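A minimal sketch of such a redirect, using the placeholder URLs from the question (substitute your real controller/action and nice URL). The THE_REQUEST condition makes the 301 fire only for URLs the client actually requested, so it won't loop with the internal rewrite that maps the nice URL back onto the controller:
RewriteEngine On
# Redirect direct requests for the default URL to the nice URL
RewriteCond %{THE_REQUEST} \s/controller1/action1/
RewriteRule ^controller1/action1/(.*)$ /some-string-about-controller-and-action/$1 [R=301,L]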

Using one directory with different aliases using .htaccess

I'm redeveloping a site (replacing it with one based on CodeIgniter). The current site is a horrid mess of repeated procedural code, but it has good search engine rankings. Because of this, I need to keep the exact same URL structure.
The company has many different quote pages, which are all essentially the same - so I've produced one clean version which can be used everywhere.
The quote system is now in a folder called /get-quote, but due to the old URLs being required, that folder mustn't be visible anywhere.
I'd like the following to happen, but don't know how to:
A user accessing /insurancequote.php should (on the server) load the /get-quote/ directory (which in turn will load the default CI route). The Base URL in CI should be http://www.mysite.com/insurancequote.php (I'm able to do that bit), so moving to step 2 would result in: http://www.mysite.com/insurancequote.php/step2 (which would map to /get-quote/step2).
Secondly, a user accessing /brokerquote.php should show mysite.com/broker in the address bar (redirect?), but on the server access /get-quote/broker.
Thirdly, a user accessing one of many broker-specific pages, e.g. mysite.com/brokername1.php or mysite.com/broker/brokername2.php (yep, they are scattered all over the place! - but I do know where each one is) should show mysite.com/broker/brokername1 or mysite.com/broker/brokername2. On the server, /get-quote/broker/brokername1 or /get-quote/broker/brokername2 should be accessed.
I don't think what I've written is completely clear, so maybe pseudocode helps:
If '/insurancequote.php'
    Dont Redirect
    Use '/get-quote/'
If '/brokerquote.php'
    Redirect '/broker/'
    Use '/get-quote/broker/'
// Do the following (manually) for each broker
If '/brokername1.php'
    Redirect '/broker/brokername1/'
    Use '/get-quote/broker/brokername1/'
If '/brokers/bname2.php'
    Redirect '/broker/brokername2/'
    Use '/get-quote/broker/brokername2/'
If '/mybrokerpage.php'
    Redirect '/broker/mybroker/'
    Use '/get-quote/broker/mybroker/'
Is this possible? If so, how would I go about doing it?
Thanks!
The risk you take is messing up your new clean code for historical reasons (and the guy coming after you will say, WTF, this is a mess!).
For me the right solution would be handling the url migration in Apache and not in your application. Every referenced url that you do not want to keep should get a 410 - Gone response (think about referenced images, for example), and every referenced page that has a new matching page should get a 301 - Moved Permanently redirection to the right page. Then, after some time, check the access log of your server, and if nobody requests the old urls anymore, remove the rules.
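A minimal sketch of those two cases in .htaccess, reusing a couple of URLs from the question (the image path is made up purely for illustration):
# Old pages that have a new equivalent: permanent (301) redirect
Redirect 301 /brokerquote.php /broker/
Redirect 301 /brokername1.php /broker/brokername1/
# Old resources with no replacement (e.g. hotlinked images): 410 Gone
Redirect gone /images/old-quote-banner.gif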
If you know every old url and every matching new url, then use a mapping file (or a hash file, which is faster) and manage the 301 redirections with rewriteMap. You can have a really big number of entries in a hash file and the lookup should still be fast. And it should be a temporary measure, in place only while waiting for robots to pick up the new urls.
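If you go the rewriteMap route, note that the RewriteMap directive itself has to be declared in the server or virtual-host configuration (it is not allowed in .htaccess), although the rules that use the map can live in either place. A rough sketch, assuming a dbm ("hash") map built with httxt2dbm from lines of the form "old-path new-path" (the map location below is a placeholder):
# Server/vhost config: declare the map
RewriteMap legacyurls dbm:/etc/apache2/legacy-urls.map
# Rewrite rules: 301 any request whose path appears in the map
RewriteEngine On
RewriteCond ${legacyurls:$1|NOT_FOUND} !NOT_FOUND
RewriteRule ^/?(.+)$ ${legacyurls:$1} [R=301,L]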
