htaccess: add a second rewrite rule [closed]

I have the following in my .htaccess file. It redirects the non-www site to the www site. Can I use this same file to do the same for another site, mysite2? How?
Options +FollowSymLinks
RewriteEngine on
RewriteCond %{HTTP_HOST} ^mysite1\.com$ [NC]
RewriteRule ^(.*)$ http://www.mysite1.com/$1 [R=301,L]

You need a separate .htaccess file for every site root you're applying rewrite rules to. In other words, you can't do what you're trying in a single file.
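If mysite2.com is served from its own document root, a minimal sketch of the .htaccess you would drop in there (assuming you want the same non-www to www redirect, with mysite2.com standing in for the real domain) could look like this:
Options +FollowSymLinks
RewriteEngine on
# redirect mysite2.com to www.mysite2.com, preserving the requested path
RewriteCond %{HTTP_HOST} ^mysite2\.com$ [NC]
RewriteRule ^(.*)$ http://www.mysite2.com/$1 [R=301,L]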

Related

Redirecting HTTP to HTTPS. What's the best method? [closed]

I've searched high and low for a good answer to this question so any help is greatly appreciated.
I have a Linux server with Apache, and I'm trying to understand what specific methods are available to accept HTTP requests and forward or redirect them to HTTPS on the same server.
I also need to know why one method might be better than another: why would you choose a specific method over another? This is quite important for me to understand.
Thanks for your help
With Apache+PHP there are a couple of different ways: server configuration or PHP code. The benefit of performing this kind of check in the server configuration is that it applies everywhere, or only on a per-directory basis, without having to change the PHP on every page in that folder or on your site. The server configuration is also faster at runtime.
Server Configuration
Using mod_rewrite, you could add the following lines to your VHost config file:
RewriteEngine On
# only act on requests that arrived over plain HTTP
RewriteCond %{HTTPS} off
# capture the host name so it is available as %1 in the substitution
RewriteCond %{SERVER_NAME} (.*)
RewriteRule ^/?(.*)$ https://%1/$1 [R=301,L]
Obviously, mod_rewrite must be enabled on your server for this to work. Apache's mod_rewrite documentation has more details.
PHP Code
As others have mentioned, PHP provides a header function that can be used to set the Location HTTP response header, and redirect the client to HTTPS. This does not require access to the server configuration or mod_rewrite.
Well, assuming you're using PHP, what I do is:
header('Location: https://'.$_SERVER['HTTP_HOST'].$_SERVER['REQUEST_URI'], true, 301);
exit; // stop executing the rest of the page after sending the redirect
but there are many other ways, all depending on what you're working with at the time. I'm sure someone will have a better answer than mine.
Good luck!

Using .htaccess to redirect bots [closed]

So my website has a page called presets_pc.html, which dynamically loads content, and I want to redirect bots to presets_pc_fallback.php, which literally dumps the database on the page.
I'm pretty new to this stuff, and can't get RewriteRule to work; can you help me?
Thanks :)
You can tell whether a bot is accessing your site through the User-Agent HTTP request header. Anyone can spoof it, so there's no real guarantee. If you have a list of bots you want to target, you can look up their user agents on a site like robotstxt.org and chain them into a RewriteCond match.
For example, google, infoseek, msn:
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} (googlebot|InfoSeek|msnbot) [NC]
RewriteRule ^/?presets_pc\.html$ /presets_pc_fallback.php [L,R]

htaccess: redirect all files to a new path [closed]

All my images are in this path:
root/lib/images/demo/
Now I have moved all the images to a new folder:
root/lib/images_new/demo/
How do I redirect all pictures to the new address?
If a user goes to this address:
root/lib/images/demo/test.png
it should be redirected to:
root/lib/images_new/demo/test.png
Assuming the old image URL is http://www.example.com/root/lib/images/demo/test.png, add the following directives to a .htaccess in the root directory of your website:
RewriteEngine on
RewriteRule ^root/lib/images/demo/(.*)$ /root/lib/images_new/demo/$1 [R=301,L,QSA,NC]

Was my htaccess hacked? [closed]

I found the following lines in my .htaccess file (I've replaced my website name with example.com). I didn't add them to my .htaccess file...
AuthName "example.com"
AuthUserFile "/home3/examplec1/.htpasswds/public_html/example.com/passwd"
What is this and why was it added to my .htaccess file? Is it possible someone has downloaded the public_html directory?
If I did get hacked, how do I prevent getting hacked again?
I did previously try to protect a folder (can't remember which one) in cPanel; would this alter my .htaccess?
This was probably added by your hosting provider, or by someone else who has access to your environment, for example when a folder was password-protected through cPanel (which writes exactly this kind of directive into .htaccess). Adding protection cannot seriously harm you; removing or changing protection would be worse.
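For comparison, a complete directory-protection block as generated by cPanel's password-protection feature usually looks roughly like the sketch below (the realm name and path are illustrative, taken from the lines in the question):
AuthType Basic
AuthName "example.com"
AuthUserFile "/home3/examplec1/.htpasswds/public_html/example.com/passwd"
Require valid-user
Note that the two lines you found are only the AuthName/AuthUserFile half of such a block; without an AuthType and a Require directive they don't actually restrict access to anything.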

How to force all requests to come from my website to avoid spoofing using Apache [closed]

My website got spoofed, and I don't want other domains to get my site's info/data through HTTP requests anymore.
How can I deny all requests that come from outside my website?
Thanks!
From the question, I'd guess you want to avoid hotlinking of your resources. In this case, you can add the following lines to your .htaccess file:
RewriteEngine on
# let requests with an empty referer through (proxies/firewalls often strip it)
RewriteCond %{HTTP_REFERER} !^$
# let requests coming from your own domain through
RewriteCond %{HTTP_REFERER} !^http(s)?://(www\.)?yourdomain\.com [NC]
# don't match the error image itself, otherwise the redirect would loop
RewriteCond %{REQUEST_URI} !showerror\.gif$
RewriteRule \.(jpg|jpeg|png|gif)$ http://yourdomain.com/showerror.gif [NC,R,L]
The above code will redirect users who are not coming from yourdomain.com to a page or resource of your choice that shows the message, in this case showerror.gif.
The code also checks for a blank referer and allows those requests through, so legitimate users browsing behind proxies/firewalls that strip the referer aren't blocked.
The file extensions in parentheses can be changed, separated by |.
Another scenario is forms, where you want users to post data from your website's own form and not from anywhere else. In this case, you can use a CSRF token as a hidden field in your form: check it against a token stored in the session, and regenerate it regularly to keep it fresh.
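A minimal PHP sketch of that idea, assuming plain sessions and no particular framework (function and field names are illustrative):
<?php
session_start();

// called when rendering the form: create a token once and keep it in the session
function csrf_token() {
    if (empty($_SESSION['csrf_token'])) {
        $_SESSION['csrf_token'] = bin2hex(random_bytes(32));
    }
    return $_SESSION['csrf_token'];
}

// called when handling the POST: constant-time comparison against the stored token
function csrf_valid($submitted) {
    return isset($_SESSION['csrf_token'])
        && hash_equals($_SESSION['csrf_token'], (string) $submitted);
}
In the form you would then output <input type="hidden" name="csrf_token" value="<?php echo csrf_token(); ?>"> and reject the POST if csrf_valid($_POST['csrf_token'] ?? '') returns false.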
Hope this helps.
