My shared SSL URL is: https://server.domain.com/~username/folder/product.php?products_id=123
I am using this RewriteRule in my .htaccess:
RewriteRule ^(.*)-p-([0-9]+).html$ product.php?products_id=$2&%{QUERY_STRING}
The problem is that the link appears as: https://server.domain.com/~username/folder/name-p-123.html
But when it is clicked, a "page not found" error appears, because of the ~username/folder/ part.
How can I use the rewrite correctly in this case?
Thank you.
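A likely fix, sketched below: in a userdir setup like this, mod_rewrite's per-directory prefix does not map back to /~username/folder/, so a relative substitution can resolve against the wrong path. This is an untested sketch assuming the .htaccess sits in the folder directory; the QSA flag replaces the manual &%{QUERY_STRING} append:
RewriteEngine On
# Assumption: this .htaccess lives in /~username/folder/
RewriteBase /~username/folder/
# QSA re-appends the original query string automatically
RewriteRule ^(.*)-p-([0-9]+)\.html$ product.php?products_id=$2 [QSA,L]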
You can use http://www.visiospark.com/mod-rewrite-rule-generator/
It's fairly simple. Keep in mind that I'm not an expert. I was playing around with .htaccess a few days ago and the rules were not updating on the client side, i.e. in my browser. The reason was that it was loading from cache. I deleted the cache in Firefox using Ctrl+Shift+Delete, restarted httpd/Apache, and reloaded the page with Ctrl+F5 (it must be Ctrl+F5). Voilà, fixed.
Some time ago (I think a couple of years ago) this simple RewriteRule in my .htaccess stopped working.
RewriteRule tags/ tags.php [L]
It worked for years; then one day, after a server change, a server upgrade, or a switch to php-fpm (I don't remember which), it stopped working.
I solved it by deleting it, and sending all my links directly to the tags.php file.
This rule is part of a small CMS that I use for many of my sites. The sites work and everything works correctly.
But invariably, when I create a new site, after a few days Google sends me a warning telling me that the URL mysite.com/tags/ returns a 404 error.
And this is strange, because the URL mysite.com/tags/ has not existed on my sites for years now, nor in my sitemaps. I am sure of this because it was used only once, in the main menu of my sites, and it has been replaced with mysite.com/tags.php.
Above all, it cannot exist on the new sites. At first I didn't pay much attention to it on the old sites: Google may have seen it there and not forgotten it yet. But it surely can't have seen it on the new sites.
So, I have a couple of unanswered questions.
The first, and perhaps the most important to understand: how does Google see the URL mysite.com/tags/? Is it possible that Google reads my .htaccess to understand what kind of URLs I'm going to create?
Second: how can I solve the problem permanently?
--------------------------update---------------------
Sorry for the late reply (summer vacation).
Regarding anubhava's answer, I have a doubt, but it's my fault: maybe I omitted part of the code.
The next rule says:
RewriteRule ^tags-([^/]+)\/$ tags.php?letter=$1 [L]
and it makes URLs like this work:
mysite.com/tags-k/
These URLs work, but if I put a 301 redirect on tags.php, will they still work?
No, Google (or any other search bot) cannot read your .htaccess.
It is difficult to figure out how a search bot found the /tags URI, but it is definitely hidden somewhere in your web pages.
Now, the way to tell search bots that /tags doesn't exist any more is to use a redirect rule with R=301:
RewriteEngine On
RewriteRule ^tags(/.*)?$ /tags.php [L,NC,R=301]
With this 301 rule in place, search bots will eventually let go of the old /tags/ result and remember /tags.php only.
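To the doubt raised in the update: the 301 rule only matches /tags and /tags/..., so the internal rewrite for the letter pages can sit right after it and keep working. A minimal sketch combining both rules, in this order:
RewriteEngine On
# Permanently redirect the obsolete /tags and /tags/... URLs
RewriteRule ^tags(/.*)?$ /tags.php [L,NC,R=301]
# Internal rewrite (no redirect) for letter pages such as /tags-k/
RewriteRule ^tags-([^/]+)/$ tags.php?letter=$1 [L]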
I am working on an e-commerce website project which I created on localhost. It worked fine until I moved it online.
Since I moved it online, I've had issues accessing the admin page and index.php. I've managed to make the admin page work and can now access the back office without any issues, but my index.php still shows me a "too many redirects" error page.
What's happening?
The main page of my website is stuck in a redirect loop (Chrome error message: this URL tried to redirect you too many times).
Every time I reload the main page, the URL switches between www.mydomain.com and mydomain.com (might be an .htaccess issue? see the sketch after the list below).
What I've done to try and solve the problem:
I have checked everything in the core_config_data table to make sure the right URLs are written in web/secure/base_url and web/unsecure/base_url. They are.
I have manually cleared var/cache and var/session via FTP.
I have cleared all cookies and cache from Chrome and Firefox.
I have re-uploaded the files and the database multiple times, thinking it might be due to a file corrupted during the upload.
I have tried to edit the .htaccess, but it didn't change anything.
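Since the URL flips between www and non-www on each reload, it is worth checking whether something in .htaccess redirects to the bare domain while Magento's redirect-to-base setting sends it back to www (or the other way round). A hedged sketch of a canonical-host rule that agrees with a www base URL; mydomain.com is a placeholder and must match what web/secure/base_url and web/unsecure/base_url contain:
RewriteEngine On
# Redirect the bare domain to www so .htaccess and Magento's base URL agree
RewriteCond %{HTTP_HOST} ^mydomain\.com$ [NC]
RewriteRule (.*) http://www.mydomain.com/$1 [R=301,L]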
What should I do now?
I feel like I've tried everything.
As it's my first time with Magento, I'm sure it's some dumb thing I might not know about, but I've read nearly every single post about this kind of issue on this website and haven't found anything that resolves it.
So I'm asking you. I'm willing to try every single idea you throw at me, as I've been stuck on this issue for a while now ^^
Thanks for reading :)
Weird, it seems that you did everything right. Try to find and update all URL settings in core_config_data: SELECT * FROM core_config_data WHERE path LIKE '%url%'.
You can also try updating the web/url/redirect_to_base config to 0 (if you have 1).
Remember to clear the cache afterwards.
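A minimal SQL sketch of both suggestions, assuming a stock Magento install with no table prefix (add yours if you use one); clear var/cache afterwards:
-- List every URL-related setting to spot a www/non-www mismatch
SELECT * FROM core_config_data WHERE path LIKE '%url%';
-- Stop Magento from redirecting every request to the configured base URL
UPDATE core_config_data SET value = '0' WHERE path = 'web/url/redirect_to_base';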
Goal: every non-https link has to become https. Always.
Achieved so far:
Changed the base-url to https
Each link is https IF I'm already on https
My issue is with the second point. For some reason the links target https if I am already on an https page, but otherwise they target http. I know I could implement a workaround using .htaccess, but that isn't the clean way, and I want to implement this cleanly.
What I've done to fix it:
I know one can force the protocol for a single TYPO3 page, and I could update all pages via the database, but in this case that is not possible, as it is a multi-tree site and not every tree has to use https.
I also googled and read about config.baseUrl, but I had already changed that before I even googled.
On this site an extension "Enforce https" with the key "https" is installed, but I can't find it in the extension repository.
Further stuff:
I suspect the issue is not caused by .htaccess, but just in case, this is in my .htaccess file:
RewriteCond %{HTTP_HOST} ^mydomain\.de$
RewriteRule (.*) https://www.mydomain.de/$1 [R=301,L]
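A side note on that snippet: it forces https for the bare domain even though not every tree should use https. If the intent is only to canonicalize the host, a hedged sketch that preserves the request scheme (assuming Apache 2.4+, where %{REQUEST_SCHEME} is available):
RewriteCond %{HTTP_HOST} ^mydomain\.de$ [NC]
RewriteRule (.*) %{REQUEST_SCHEME}://www.mydomain.de/$1 [R=301,L]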
Any help is appreciated.
Solution
The solution was to deactivate the extension "https" that was installed on the TYPO3 site. Why? Well, for some reason the extension replaces/renders the URLs as absolute and, which is even weirder, with "http".
I will continue the investigation and update this post later with in-depth details.
I'm new to the .htaccess file.
My site is hosted on 1and1, and by default it shows www.mydomain.com/defaultsite when nothing is uploaded to the account. Now I've uploaded my WP site and have managed to make it go to the index, but if someone enters www.domain.com/defaultsite in the URL bar, they will still end up in the wrong place.
How can I handle this with the .htaccess file so that any request to defaultsite takes the user to www.mydomain.com?
I'm not a 1and1 user, but this could be a DNS cache issue. First, check your document root for the presence of a directory called defaultsite. If it exists, remove it. If not, you can redirect it away using mod_rewrite. Insert this rule immediately after RewriteEngine On in your .htaccess file:
RewriteRule ^defaultsite/?$ http://yourdomain.com/ [R=302,L]
If it's working for you, you can safely change 302 to 301 to make it permanent and cacheable.
I have also seen comments referring to an index.html file in the document root. If you see one, delete it; it could be that, internally, 1and1 maps defaultsite to index.html.
Also, it will help to clear your browser cache when testing. If you're using Chrome, you can open the Developer Tools (Ctrl+Shift+I), click the settings cog at the top right of the panel that opens, and check 'Disable cache (while DevTools is open)'.
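For context, a minimal sketch of where that rule might sit in a typical WordPress .htaccess (the WordPress block is the stock one; yourdomain.com is a placeholder):
RewriteEngine On
RewriteBase /
# Send the leftover 1and1 placeholder path back to the site root
RewriteRule ^defaultsite/?$ http://yourdomain.com/ [R=302,L]
# Standard WordPress front-controller rules, unchanged
RewriteRule ^index\.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]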
I had a similar issue and was pulling my hair out trying to figure this out. 1&1 does the hosting while Namecheap holds my domain. I was able to access my page without /defaultsite on Safari and mobile Chrome, but on desktop Chrome I was being redirected to /defaultsite.
To remedy this I cleared my cache, flushed my DNS cache, and cleared my browsing history. I'm not sure whether the latter two were necessary, but having done all three did resolve the issue.
We are moving a site from one CMS to another. The .htaccess file has been changed, and it needs to be picked up for the new site to work right. From what I understand, the .htaccess file will only be refreshed if the browser cache is cleared? It is fine for those of us creating the site to clear our caches, but is there a way to get the users' browsers to pick up the new .htaccess behaviour without each user clearing their cache manually on their own initiative?
If you're using RewriteRule, just use R (which defaults to a 302 temporary redirect that browsers are not supposed to cache) instead of R=301. For other purposes, you'll have to clear your browser cache whenever you change a redirect.
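A minimal sketch of the difference (old-page and new-page are placeholder paths):
# While still iterating, use a temporary redirect; browsers will keep re-checking it
RewriteRule ^old-page$ /new-page [R,L]
# Once the destination is final, switch to permanent (browsers may cache it)
# RewriteRule ^old-page$ /new-page [R=301,L]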
from https://stackoverflow.com/a/7749784/1066234
Some servers will reload the configuration as soon as you replace the .htaccess file.
If so, it will instantly be used for all subsequent requests; you do not need to refresh any caches.
Some servers only re-read .htaccess periodically.
If you already had an R=301 redirect and you just changed where it redirects to in .htaccess, but it isn't updating for you, then it's probably cached in your browser. You can't do a hard refresh on that page, because it redirects. Try another browser; if it works there, you just need to clear the cache in Chrome: Settings > Privacy > Clear browsing data, then check only "Cached images and files". This will not clear your logins or anything.
I had a RewriteRule in my .htaccess file like this:
RewriteCond %{HTTPS} !on
RewriteRule (.*) https://example.com%{REQUEST_URI} [L,R=301]
and once I opened the site on localhost, it never gave me a chance to hard-refresh the page.
Solution: I added a random query string, like localhost/mywebsite/index.php?1234, so the browser treated it as a new URL instead of replaying the cached redirect.
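A more durable fix, sketched under the assumption that the same .htaccess is shared between localhost and production: skip the https redirect on localhost entirely, so the cached-redirect problem never appears during development.
RewriteEngine On
# Don't force https while developing on localhost
RewriteCond %{HTTP_HOST} !^localhost [NC]
RewriteCond %{HTTPS} !on
RewriteRule (.*) https://example.com%{REQUEST_URI} [L,R=301]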