So my website has a page called presets_pc.html, which dynamically loads content, and I want to redirect bots to presets_pc_fallback.php, which literally dumps the database on the page.
I'm pretty new to this stuff, and can't get RewriteRule to work; can you help me?
Thanks :)
The way you can tell whether a bot is accessing your site is through the User-Agent HTTP request header. Anyone can spoof what that is, so there's no real guarantee. If you have a list of bots you want to affect, you can look up the "exclusion" user agents from a site like robotstxt and chain them into a RewriteCond match.
For example, Google, InfoSeek, MSN:
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} (googlebot|InfoSeek|msnbot) [NC]
RewriteRule ^/?presets_pc\.html$ /presets_pc_fallback.php [L,R]
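Note that [R] without an explicit code defaults to a 302 redirect, so the bot is sent to the fallback URL itself. If you would rather serve the fallback content transparently under the original URL, a sketch of an internal-rewrite variant (same bot list, no redirect flag) would be:
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} (googlebot|InfoSeek|msnbot) [NC]
# Internal rewrite: the bot still requests presets_pc.html but receives the PHP output
RewriteRule ^/?presets_pc\.html$ /presets_pc_fallback.php [L]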
I've searched high and low for a good answer to this question, so any help is greatly appreciated.
I have a Linux server with Apache, and I'm trying to understand what specific methods are available to accept HTTP requests and forward or redirect them to HTTPS on the same server.
I also need to know WHY one method might be better than another: why would you choose a specific method over another? This is quite important for me to understand.
Thanks for your help
With Apache+PHP there are a couple of different ways: server configuration or PHP code. The benefit of performing this kind of check in the server configuration is that it is applied everywhere, or only on a per-directory basis, without having to change the PHP on every page in that folder or on your site. The server configuration approach is also faster at runtime.
Server Configuration
Using mod_rewrite, you could add the following lines to your VHost config file:
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteCond %{SERVER_NAME} (.+)
RewriteRule ^/?(.*) https://%1/$1 [R=301,L]
Obviously, mod_rewrite must be enabled on your server for this to work. Apache's mod_rewrite documentation has more details.
PHP Code
As others have mentioned, PHP provides a header function that can be used to set the Location HTTP response header, and redirect the client to HTTPS. This does not require access to the server configuration or mod_rewrite.
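For example, a minimal sketch of that approach in plain PHP (assuming the usual $_SERVER variables populated by Apache) might look like:
<?php
// Redirect to HTTPS if the current request came in over plain HTTP (minimal sketch)
if (empty($_SERVER['HTTPS']) || $_SERVER['HTTPS'] === 'off') {
    $url = 'https://' . $_SERVER['HTTP_HOST'] . $_SERVER['REQUEST_URI'];
    header('Location: ' . $url, true, 301);
    exit; // stop the script so nothing else is sent along with the redirect
}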
Well, assuming you're using PHP, what I do is:
header('Location: https://'.$_SERVER['HTTP_HOST'].$_SERVER['REQUEST_URI'], true, 301);
but there are many other ways, all depending on what you're working with at the time. I'm sure someone will have a better answer than mine.
Good luck!
I have 2 domains:
www.first.com
www.second.com
Let's assume that on the first one I have an online store, and on the second one I have only the products of this store (separate applications running on the server).
The product links are:
www.second.com/firstProduct
www.second.com/secondProduct
www.second.com/thirdProduct
etc.
I want to redirect users to the first website when someone hits www.second.com itself, i.e. the bare domain rather than a full product path.
What redirect should I use? 301? In terms of SEO what is the best approach?
Thanks.
Yes, 301 Moved Permanently is the code you want to return for this redirect. Search engines will typically queue up 301s for updates to their results, as this status indicates that the resource is now found at the new URL and that the old one is soon to be obsolete.
In your case, since you never want www.second.com/ to be accessed directly, the 301 is exactly what you want.
You might also consider adding a robots.txt file with allow + disallow statements in there, as most of the bots you actually care about for SEO will honor it.
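If the second site runs on Apache with mod_rewrite, a minimal sketch of that root-only redirect (the domain names are the placeholders from the question) could go in its .htaccess:
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?second\.com$ [NC]
# Match only the bare root so /firstProduct, /secondProduct, etc. keep resolving
RewriteRule ^/?$ http://www.first.com/ [R=301,L]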
I am moving my blog from a sub-domain (blog.example.com) to a sub-folder (example.com/blog/).
The URLs and the content are staying exactly the same.
What would be the best action to take SEO-wise? I was thinking of the following:
Add rel="canonical" to sub-domain URLs and let the spiders crawl my pages to become aware of the new links.
Add a 301 redirect from sub-domain to sub-folder.
I understand that there's no point in having canonical if there's a 301 redirect.
Any help would be highly appreciated, thank you in advance!
The URLs are not staying the same. They are changing, so you need to tell search engines and users where to find the content. 301 redirects are exactly what you want: they tell search engines where to find the new content and to update their indexes (plus Google will transfer PageRank), and when users go to the old URL they are automatically redirected to the new one, which canonical URLs do not do.
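As a sketch, assuming Apache with mod_rewrite and the example.com placeholder from the question, the redirect could look like this in the sub-domain's configuration or .htaccess:
RewriteEngine On
RewriteCond %{HTTP_HOST} ^blog\.example\.com$ [NC]
# Send every old sub-domain URL to the same path under /blog/ on the main domain
RewriteRule ^/?(.*)$ http://example.com/blog/$1 [R=301,L]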
My website got spoofed, and I don’t want other domains to get my site info/data through http requests anymore.
How can I deny all requests that come from outside my website?
Thanks!
From the question, I would guess you want to prevent hotlinking of your resources. In that case, you can add the following lines to your .htaccess file.
RewriteEngine on
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^http(s)?://(www\.)?yourdomain\.com [NC]
# Skip the error image itself, otherwise the rule below would redirect it in a loop
RewriteCond %{REQUEST_URI} !showerror\.gif$
RewriteRule \.(jpg|jpeg|png|gif)$ http://yourdomain.com/showerror.gif [NC,R,L]
The above code redirects users who are not coming from yourdomain.com to some other page or resource that shows a message, in this case showerror.gif.
The code also allows requests with a blank referer, so legitimate users browsing behind proxies or firewalls that strip the Referer header are not blocked.
The file extensions in the parentheses can be changed, with | separating them.
Another scenario is forms, where you want users to post data from your website's form and not submit it from anywhere else. In this case, you can use a CSRF token as a hidden field in your form. Check this token against a stored session token and regenerate it on every request to keep it fresh, as in the sketch below.
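A minimal PHP sketch of that idea (the field name csrf_token and the 403 response are illustrative choices, not from the original answer):
<?php
session_start();

// Issue a token for the form (rotated again after every successful check below)
if (empty($_SESSION['csrf_token'])) {
    $_SESSION['csrf_token'] = bin2hex(random_bytes(32));
}

if ($_SERVER['REQUEST_METHOD'] === 'POST') {
    $sent = $_POST['csrf_token'] ?? '';
    // Constant-time comparison against the token stored in the session
    if (!hash_equals($_SESSION['csrf_token'], $sent)) {
        http_response_code(403);
        exit('Form submission rejected.');
    }
    // Token was valid: regenerate it so it cannot be replayed
    $_SESSION['csrf_token'] = bin2hex(random_bytes(32));
    // ... handle the form data here ...
}
?>
<form method="post" action="">
    <input type="hidden" name="csrf_token" value="<?= htmlspecialchars($_SESSION['csrf_token']) ?>">
    <!-- other form fields -->
    <input type="submit" value="Send">
</form>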
Hope this helps.
I have the following in my .htaccess file. It redirects the non-www site to the www site. Can I use this same file to do the same for another site, mysite2? How?
Options +FollowSymLinks
RewriteEngine on
RewriteCond %{HTTP_HOST} ^mysite1.com$ [NC]
RewriteRule ^(.*)$ http://www.mysite1.com/$1 [R=301,L]
You need a separate .htaccess file for every root URL you're applying rewrite rules to. In other words, you can't do what you're trying.
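For completeness, a minimal sketch of what mysite2's own .htaccess could contain (assuming mysite2.com has its own document root and mod_rewrite is enabled):
Options +FollowSymLinks
RewriteEngine on
# Redirect the bare domain to the www host, mirroring the mysite1 rules
RewriteCond %{HTTP_HOST} ^mysite2\.com$ [NC]
RewriteRule ^(.*)$ http://www.mysite2.com/$1 [R=301,L]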