I would like to recreate old pages that were online two years ago. Unfortunately, all I have is the old .htaccess file. Is there a way to find all the URLs that were used?
Thanks in advance!
.htaccess does not store any information about pages or their URLs. The most you can get out of it is the redirection mechanism and/or some information about the folder structure, if such rules were used. If you paste the source of your file here, someone could help you with this. In general, in the most optimistic scenario, you will just get a set of rules of the form
regular expression --> server query
(unless the author enumerated all the URLs instead of using regular expressions)
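For example, here is a hedged sketch of the kind of rule you might find, with hypothetical names, and what it tells you:
# a rule like this implies the site answered URLs such as /products/123,
# serving them internally from product.php?id=123
RewriteRule ^products/([0-9]+)$ product.php?id=$1 [L]
Reading such rules backwards gives you the public URL patterns, although the concrete URLs themselves are not stored anywhere in the file.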
Check http://archive.org/web/web.php (the Wayback Machine) for cached versions of your website.
I'm trying to create friendly URLs for my site, but with no success, and I have two questions.
The first is:
How do I change the URL from domain.com/page/something.php to domain.com/something?
And the second is:
Will the changes create duplicate content, and if so, how do I fix that problem?
Thank you for your time,
Have a nice day
Check out the official URL Rewriting guide: http://httpd.apache.org/docs/2.0/misc/rewriteguide.html
You'll be using the simplest use case, Canonical URLs. Assuming you have no other pages that you need to worry about, you can use a rule like this: (note: untested, your usage may vary)
RewriteEngine On
RewriteRule ^([^/.]+)/?$ page/$1.php [L]
While that example might not exactly work for your use case, hopefully reading the guide along with my example will help get you on your way.
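As for your second question: yes, if both domain.com/page/something.php and domain.com/something resolve, search engines may index both. A common approach (again untested, with the paths taken from your example) is to permanently redirect the old URL to the clean one, placed before the internal rewrite above:
# redirect direct requests for the old .php URL to the clean URL;
# matching against THE_REQUEST avoids redirecting internally rewritten requests
RewriteCond %{THE_REQUEST} \s/page/([^/.]+)\.php
RewriteRule ^ /%1 [R=301,L]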
How do you rewrite a URL in Notes 9 XPages?
Let's say I have:
www.example.com/myapp.nsf/page-name
How do I get rid of that .nsf part:
www.example.com/page-name
I don't want to set up lots of manual redirects, because my pages are formed dynamically, like WordPress.
I've read this: http://www.ibm.com/developerworks/lotus/library/ls-Web_site_rules/
It does not address the issue.
If you use substitution rules like the following, you can get rid of the db.nsf part and call your XPages directly as example.com/xpage1.xsp:
Rule (substitution): /db.nsf/* -> /db.nsf/*
Rule (substitution): /* -> /db.nsf/*
The first rule is an identity mapping: it lets URLs that already contain /db.nsf/ pass through unchanged, so the second rule does not prefix them a second time.
However, you have to "manually" generate your URLs without the db.nsf part in e.g. menus, because the XPages runtime will include the db.nsf part in the URLs if you use, for instance, the openPage simple action.
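For the URL in your question, with myapp.nsf taken from your example, the pair of rules would be:
Rule (substitution): /myapp.nsf/* -> /myapp.nsf/*
Rule (substitution): /* -> /myapp.nsf/*
A request for www.example.com/page-name.xsp is then served as www.example.com/myapp.nsf/page-name.xsp, while existing /myapp.nsf/... URLs keep working.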
To completely control what goes in and out, put your Domino server behind an Apache HTTP server and use mod_rewrite. On Domino 9.0 for Windows you can use mod_domino.
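A minimal sketch of that Apache front end, assuming Domino listens on localhost:8080 and the application is myapp.nsf (both hypothetical; requires mod_proxy and mod_rewrite):
# proxy clean URLs through to the Domino application, hiding the .nsf part
RewriteEngine On
RewriteRule ^/?([^/.]+)$ http://localhost:8080/myapp.nsf/$1 [P,L]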
You can do it with a mix of substitutions, a URL pattern, and partial refresh.
I had the same problem: my customers want clean URLs for SEO.
My URLs now look like this:
www.myserver.de/products/financesoftware/anyproduct
First, I used one substitution to cover the folder, database, and XPage part of the URL.
My substitution: "/products" -> "/web/techdemo.nsf/product.xsp"
The problem with this is that any update on the site (when in redirect mode) sends the user back to the "dirty" URL.
I solved this by using partial refreshes only.
Last but not least, I use my own slash pattern after the XPage call (.xsp).
In my case that's the "/financesoftware/anyproduct/" part.
I used facesContext.getExternalContext().getRequestPathInfo() to resolve that URL part.
Currently I use good old regular expressions to get the slash-separated parameters back out of the URL, but I am investigating a REST solution at the moment.
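A minimal server-side JavaScript sketch of that parsing step (variable names hypothetical; plain splitting stands in for the regular expressions mentioned above):
// pathInfo is e.g. "/financesoftware/anyproduct/" for the URL shown above
var pathInfo = facesContext.getExternalContext().getRequestPathInfo();
var parts = [];
var raw = pathInfo.split("/");
for (var i = 0; i < raw.length; i++) {
    // skip the empty strings produced by the leading and trailing slashes
    if (raw[i] != "") { parts.push(raw[i]); }
}
var category = parts[0]; // "financesoftware"
var product = parts[1];  // "anyproduct"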
I haven't actually done this, but I just saw the option yesterday while looking for something else. In your XPage, go to All Properties and look at 'navigationRules' and 'pageBaseUrl'. I think you will find what you are looking for there.
I have a website that's written using CakePHP. I've added some rewrite rules in the .htaccess file to change the default URLs to different ones (instead of /controller1/action1/parameter I have /some-string-about-controller-and-action/parameter, for example).
The problem is that now both the normal URL and the nice one are available, and Google seems to be indexing both, which is a problem. I'd like to keep only the nice one. What is the proper way to handle this so that it affects the Google results as little as possible?
I don't know why you don't want to use Cake's own routing (if you are having trouble doing what you want, you can accomplish it with a custom route class). Either way, make sure that you redirect all relevant URLs in your .htaccess file to the desired URL using a MOVED PERMANENTLY (301) redirect.
This way Google will index the target URL instead of the undesirable one. You are right to be concerned about this: double indexing is a great way to harm your SEO rankings.
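A hedged sketch of the routing approach, assuming CakePHP 2.x-style routes and reusing the controller/action names from your example:
// app/Config/routes.php
// map the nice URL to the real controller/action so only one URL is ever linked
Router::connect(
    '/some-string-about-controller-and-action/*',
    array('controller' => 'controller1', 'action' => 'action1')
);
The 301 redirect for the old /controller1/action1/... URLs then goes in your .htaccess as described above.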
I have a website with multiple folders and I was trying to fix them in my .htaccess. After a little while, I had a big .htaccess full of rules that conflict.
Now every time I want to add a folder I have to add it to the .htaccess.
I did some research and found out I can create symbolic links instead, so no more .htaccess.
With both solutions I have to create or modify something, so for me the end result is the same, but is it better practice to create symbolic links instead?
Symbolic links are faster, yes (like Aki said), but here are my thoughts on this.
If you have images, CSS, or JS files, then you don't need to rewrite or create symbolic links. You can use the full URL (e.g. /images/...) or use a common domain like i.domain.com (or anything you want) and refer all your JS, images, and CSS there, e.g. i.domain.com/logo.jpg or js.domain.com/site.js.
This way, you never have to think about rewrite rules or create links you might forget about one day.
This is very easy to manage and maintain if you need to add images or change JS or CSS, since you only have one point of entry and everything is updated automatically.
Use symlinks: .htaccess has to be processed by Apache, whereas symlinks are resolved by the OS, which is faster.
Compare creating 100 rewrite rules with creating 100 symlinks: if the rule you are looking for is the last one, Apache has to parse all of them before applying the one you need, while a symlink (for example, ln -s /var/www/files/newfolder /var/www/html/newfolder) is resolved directly by the filesystem.
I'm in the process of rewriting all the URLs on my site that end with .php and/or are dynamic, so that they're static and more search-engine friendly.
I'm trying to decide if I should rewrite the file names as simple strings of words, or if I should add .html to the end of everything. For example, is it better to have a URL like
www.example.com/view-profiles
or
www.example.com/view-profiles.html
?
Does anyone know if the search engines favor doing it one way or another? I've looked all over Stack Overflow (and several other resources) but can't find an answer to this specific question.
Thanks!
SEO-optimized URLs should follow this logic (listed in order of priority):
unique (1 URL == 1 resource)
permanent (they do not change)
manageable (1 logic per site section, no complicated exceptions)
easily scalable logic
short
with a targeted keyword phrase
Based on this,
www.example.com/view-profiles
would be the better choice.
That said:
Google has something I call "dust crawling prevention" (see the paper "Do Not Crawl in the DUST: Different URLs with Similar Text" from this Google researcher: http://research.google.com/pubs/author6593.html), so when Google discovers a URL it must decide whether that specific page is worth crawling.
Google gives URLs ending in .html a "bonus" credit of trust: "this is an HTML page, I probably want to crawl it".
That said: if your site mostly consists of HTML pages with actual textual content, this "bonus" is not needed.
I personally only add .html to HTML sitemap pages that consist solely of long lists, and only if I have a few million of them, as I have seen a slightly better crawl rate on those pages. For all other pages I strictly keep to the URL logic mentioned above.
Best regards,
Franz, Vienna, Austria
P.S.: please see https://webmasters.stackexchange.com/ for non-programming-related SEO questions.