Smarty string replace two times in a row

Good day all.
I have a small issue with generating https links, and the only entry point I have is Smarty. My idea was simply to search and replace the string http with https but, just to be safe, I'd also like to convert the string httpss back into https to handle links that are already served as https.
Currently I'm doing this:
{$link->getCategoryLink($smarty.get.id_category, null, $lang.id_lang,null,null )|replace:"http":"https"}
Is there a way to add another string replace on the same line?
I mean something like:
{($link->getCategoryLink($smarty.get.id_category, null, $lang.id_lang,null,null )|replace:"http":"https")|replace:"httpss":"https"}
I only use Smarty a couple of times a year, so I'm not much of an expert, and I don't want to add complexity that I won't be able to read next year ;)

There are better options in Smarty (regex_replace for example), but for a more correct replacement, couldn't you just replace the whole protocol?
{$link->getCategoryLink($smarty.get.id_category, null, $lang.id_lang,null,null )|replace:"http://":"https://"}
That way you can be sure it won't replace a part of the link that merely contains the word http outside the protocol, and only http:// links will be rewritten.
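For completeness, chaining a second replace on the same line (the asker's own idea) should also work, since Smarty applies modifiers from left to right; a minimal sketch:
{$link->getCategoryLink($smarty.get.id_category, null, $lang.id_lang, null, null)|replace:"http":"https"|replace:"httpss":"https"}
The protocol-based replace above is still the safer choice, since it leaves non-protocol occurrences of http untouched.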

Related

Notes 9, rewriting URLs

How do you rewrite a URL in Notes 9 XPages?
Let's say I have:
www.example.com/myapp.nsf/page-name
How do I get rid of that .nsf part:
www.example.com/page-name
I don't want to set up lots of manual redirects because my pages are dynamically generated, like WordPress.
I've read this: http://www.ibm.com/developerworks/lotus/library/ls-Web_site_rules/
It does not address the issue.
If you use substitution rules like the following, you can get rid of the db.nsf part and call your XPages directly as example.com/xpage1.xsp:
Rule (substitution): /db.nsf/* -> /db.nsf/*
Rule (substitution): /* -> /db.nsf/*
However, you have to "manually" generate your URLs without the db.nsf part in e.g. menus because the XPages runtime will include the db.nsf part in the URLs if you use for instance the openPage simple action.
To completely control what goes in and out, put your Domino server behind an Apache HTTP Server and use mod_rewrite. On Domino 9.0 for Windows you can use mod_domino.
You can do it with a mix of substitutions, "URL patterns" and partial refresh.
I had the same problem; my customers want clean URLs for SEO.
My URLs now look like this:
www.myserver.de/products/financesoftware/anyproduct
First I used one substitution to cover the folder, database and XPage parts of the URL.
My substitution: "/products" -> "/web/techdemo.nsf/product.xsp"
The problem with this is that any update on the site (when in redirect mode) sends the user back to the "dirty" URL.
I solved this by using partial refreshes only.
Last but not least, I use my own slash-separated pattern at the end of the XPage call (.xsp).
In my case that's the "/financesoftware/anyproduct/" part.
I used facesContext.getExternalContext().getRequestPathInfo() to resolve that URL part.
Currently I use good old regular expressions to get the slash-separated parameters back out of the URL, but I am investigating a REST solution at the moment.
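As a rough illustration of that idea (not the poster's actual code), the extra path info could be split into its slash-separated parts in server-side JavaScript roughly like this; the variable names are made up:
// Hypothetical sketch: parse "/financesoftware/anyproduct/" into its parts
var pathInfo = "" + facesContext.getExternalContext().getRequestPathInfo(); // coerce to a JS string, e.g. "/financesoftware/anyproduct/"
var parts = pathInfo.replace(/^\/+|\/+$/g, "").split("/");
var category = parts[0]; // e.g. "financesoftware"
var product = parts[1]; // e.g. "anyproduct"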
I haven't actually done this, but I just saw the option yesterday while looking for something else. In your XPage, go to All Properties and look at 'navigationRules' and 'pageBaseUrl'. I think you will find what you are looking for there.

htaccess: Add random string to URL

I'm having a URL like this:
http://www.foobar.com/
If a user enters it, I want the URL to be expanded by a random string like
http://www.foobar.com/f896c0fb0924db5dfeae58d430c2d6ca
(In the example an MD5 hash is added, but anything else would be fine too.)
Is it possible to do this via .htaccess and some clever Rewrite-rules?
It is possible with mod_rewrite, but a programmatic rewrite map (prg:) needs to be defined in the server config; you can then only use it from .htaccess, not define it there. You could come close with a long chain of predefined strings (meh) or use other pseudo-random sources. One option is to combine several TIME server variables such as TIME_SEC. With repetition and no separators, it would not be easy to notice the pattern.
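A minimal sketch of that TIME-variable idea for .htaccess (the 302 status and the particular variable combination are just an illustration; your application still has to handle the resulting path):
RewriteEngine On
# Redirect the bare root URL to a pseudo-random-looking path built from time variables
RewriteRule ^$ /%{TIME_HOUR}%{TIME_SEC}%{TIME_DAY}%{TIME_MIN}%{TIME_MON}%{TIME_SEC} [R=302,L]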

& Ampersand in URL

I am trying to figure out how to use the ampersand symbol in a URL.
I have seen it here: http://www.indeed.co.uk/B&Q-jobs and I wish to do something similar.
I'm not exactly sure what the server is going to call when the URL is accessed.
Is there a way to grab a request like this with .htaccess and rewrite to a specific file?
Thanks for your help!
Ampersands are commonly used in a query string. Query strings are one or more variables at the end of the URL that the page uses to render content, track information, etc. Query strings typically look something like this:
http://www.website.com/index.php?variable=1&variable=2
Notice how the first special character in the URL after the file extension is a ?. This designates the start of the query string.
In your example, there is no ?, so no query string is started. According to RFC 1738, ampersands are not valid URL characters except for their designated purposes (to link variables in a query string together), so the link you provided is technically invalid.
The way around that invalidity, and what is likely happening, is a rewrite. A rewrite informs the server to show a specific file based on a pattern or match. For example, an .htaccess rewrite rule that may work with your example could be:
RewriteEngine on
RewriteRule ^/?B&Q-(.*)$ /scripts/b-q.php?variable=$1 [NC,L]
This rule would find any URLs starting with http://www.indeed.co.uk/B&Q- and show the content of http://www.indeed.co.uk/scripts/b-q.php?variable=jobs instead.
For more information about Apache rewrite rules, check out their official documentation.
Lastly, I would recommend against using ampersands in URLs, even when doing rewrites, unless they are part of the query string. The purpose of an ampersand in a URL is to string variables together in a query string; using it outside that purpose is not correct and may cause confusion in the future.
A URI like /B&Q-jobs gets sent to the server encoded like this: /B%26Q-jobs. However, when it gets sent through the rewrite engine, the URI has already been decoded so you want to actually match against the & character:
RewriteRule ^/?B&Q-jobs$ /a/specific/file.html [L]
This makes it so when someone requests /B&Q-jobs, they actually get served the content at /a/specific/file.html.

Is a canonical URL possible for this pattern in htaccess: /a/*/id/uniqueid?

A big problem is that I am not a programmer! So I need to solve this with means within my own competence. I would be very happy for help!
I have an issue with a lot of duplicated URLs in the Google index and there are strong signs that it is causing SEO problems.
I don't have duplicate links on the site itself, but as it was once set up, for certain pages the system allows all sorts of variations in the URL. As long as it has a specific article ID, the same content will be presented under an infinite number of URLs.
I guess the duplicates in Google's index have been growing over a long time and are due to malformed links from other sites that link to mine. The problem is that the system has accepted the variations.
Here are examples of variations that exists in the Google index:
site.com/a/Cow_Cat/id/5272
site.com/a/cow_cat/id/5272
site.com/a/cow…cat/id/5272
site.com/a/cowcat/id/5272
site.com/a/bird/id/5272
The first URL, with mixed case, is the one used site-wide and for now I have to live with it; it would take too long to change everything to lower case. I cannot make a manual effort via htaccess as there are a total of 300,000 articles. I believe tens of thousands of them have one or more duplicates.
My question is this:
Is it possible to create rules for canonical URLs in htaccess so that the above URLs are handled as one, and likewise for the rest of the 300,000?
I.e., is there a way to say that all URLs having
/a/*/id/uniqueid
should be seen as one, based only on the unique ID and without regard to the text expressed by the “*”?
My hope is that it would be possible to say that a certain pattern like above should only be differentiated by the last unique segment.
If it is not possible in htaccess, how would it be done with link rel="canonical" on each page? Can the code include wildcards?
I should add that the majority of the duplicates are caused by incoming links being lower case while the site itself uses a mix. Would it be OK to assign a canonical URL that is all lower case even though the site itself basically always uses a mix of lower and upper case?
If this is possible, I would be very happy to be helped with how to do it!!!!
Jonas
Hi Michael! I am not an expert but this is how I think it could be done:
1) My problem is that the URLs have mixed cases and I cannot change that now.
2) If it is OK for the search engines, it would be fine for me to make the canonical URL identical to the actual URL except all in lower case; that would solve approximately 90% of the duplicates. I.e., this would be the URL in use: site.com/a/Cow_Cat/id/5272 and this would be the canonical: site.com/a/cow_cat/id/5272. As I understand it, that would be good SEO... or...?
My idea was NOT to change the address in the browser address bar (i.e. using a 301 redirect) but rather just to tell the search engines which URLs are duplicates. As I understand it, that can be done by defining a canonical URL either in htaccess (as a pattern, I hope) or as a tag on each page.
3) IF it were possible to find a wildcard solution... I am not sure whether this is possible at all, but it would mean not assigning a specific canonical URL but rather a "group pattern", i.e. "Please, search engine, treat all URLs with this pattern - having the unique identifier at the end - as one and the same URL; you, the search engine, decide which one you prefer": /a/*/id/uniqueid
Would that work? It will only work in htaccess if canonical URLs can be defined for a group, where the group is defined as a pattern with a designated part as the unique ID.
Is it possible, when adding a tag to each page, to say that "all URLs containing this unique ID should be treated the same"? If that worked, it would look something like this:
link rel="canonical" /a/*/id/5272
I don't know if this wildcard syntax exists, but it would be nice :)
My advice would be to use 301 redirects with URL rewriting. Ask your webmaster to place this in your Apache config or virtual host config:
RewriteMap lc int:tolower
Then inside your .htaccess file you can use the map ${lc:$1} to convert matches to lower case. Here, the $1 part is a match (backreference from brackets in a regex in the RewriteRule) and the ${lc: } part is just how you apply the lc (lowercase) function set up earlier. Here is an example of what you might want in your .htaccess file:
# This matches a URL with any uppercase characters
RewriteCond %{REQUEST_URI} [A-Z]
# This redirects to the lowercase version
RewriteRule (.*) /${lc:$1} [L,R=301]
As for matching the IDs, presuming your examples mean "always end with the ID" you could use a regex like:
^(.+/)(\d+)$
The first match (brackets) gets everything up to and including the forward slash before the ID, and the second part grabs the ID. We can then use it to point to a single, specific URL (like canonical, but with a 301).
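Putting those pieces together, a rough sketch for .htaccess could look like the following; the /a/article/id/ target is a made-up placeholder for whatever single URL you pick per ID, and the extra condition just keeps the rule from redirecting the canonical form to itself:
# Hypothetical sketch: send every /a/<text>/id/<number> variation to one URL per ID
RewriteCond %{REQUEST_URI} !^/a/article/id/
RewriteRule ^a/.+/id/(\d+)$ /a/article/id/$1 [NC,R=301,L]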
If you do just want to use canonical tags, then you'll have to say what you're using code-wise, but an example I use in PHP (so as not to add tags to hundreds of individual pages, for instance) would be:
// Prefer the pre-rewrite URL set by mod_rewrite, otherwise fall back to the request URI
if (!empty($_SERVER["REDIRECT_URL"])) {
    $canonicalUrl = $_SERVER["SERVER_NAME"] . $_SERVER["REDIRECT_URL"];
} elseif (!empty($_SERVER["REQUEST_URI"])) {
    // Strip the query string, keeping only the path
    $canonicalUrl = $_SERVER["SERVER_NAME"] . preg_replace('/^([^?]+)\?.*$/', "$1", $_SERVER['REQUEST_URI']);
}
Here, the redirect URL is used if it's available, and if not the request URI is used. This code strips off the query string (the ?something=true part in http://www.mysite.com/a/blah/12345/?something=true). Of course you can add to this code to specify a custom path, not just strip off the query string, by playing with the regex.
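To actually emit the tag from that value, something along these lines could follow the snippet above; the http:// scheme is an assumption, so adjust it to your protocol:
// Hypothetical follow-up: output the canonical link tag in the page head
if (!empty($canonicalUrl)) {
    echo '<link rel="canonical" href="http://' . htmlspecialchars($canonicalUrl) . '">';
}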

CakePHP nice urls - how to prevent normal urls from working

I have a website that's written using CakePHP. I've added some rewrite rules in the .htaccess file to change the default URLs to different ones (instead of /controller1/action1/parameter I have /some-string-about-controller-and-action/parameter, for example).
The problem is that now both the normal URL and the nice one are available, and Google seems to be indexing both, which is a problem. I'd like to keep only the nice one; what is the proper way to handle this so that it affects the Google results as little as possible?
I don't know why you don't want to use CakePHP's own routing (if you are having trouble doing what you want, you can accomplish it with a custom route class). Then make sure that you redirect all relevant URLs in your .htaccess file to the desired URL using a MOVED PERMANENTLY (301) redirect.
This way Google will index the target URL instead of the undesirable one. You are right to be concerned about this; double indexing is a great way to harm your SEO rankings.
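A minimal sketch of such a redirect in .htaccess, reusing the example URLs from the question and placed before the existing rewrite rules:
RewriteEngine On
# Only redirect when the browser actually asked for the old URL, so the
# internal rewrite of the nice URL back to the controller does not loop
RewriteCond %{THE_REQUEST} \s/controller1/action1/
RewriteRule ^controller1/action1/(.*)$ /some-string-about-controller-and-action/$1 [R=301,L]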
