Language in base URL for a multilingual website on Kohana 3

I couldn't find an example for my particular case.
I am creating a multilingual website on Kohana 3.3. I need to create URLs like:
http://example.com for English version
http://es.example.com for Spanish
http://fr.example.com for French
and so on. But all the examples I could find were like http://example.com/lang/.
What do I have to do to accomplish this? Do I have to make a separate application for every language and then adjust .htaccess? Or is there a way to read the language from the URL (in the case of lang.example.com)?

I've never done this, so I am merely describing how I would approach the problem if I were faced with it.
First, I'd set up the server so that all the domains point to one single application (which I assume you already have in place).
Next, in your bootstrap.php, when setting the language, I'd use a helper function that determines it for me based on the URL the website is accessed at; my bootstrap would then probably look like this:
/**
* Set the default language
*/
I18n::lang(MyApp::get_request_language_code());
I would expect MyApp::get_request_language_code() to return a string matching a language code understood by the I18n class. This helper should also redirect the request if the URL is invalid (e.g. xxx.example.com).
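As a rough illustration, such a helper could look like the sketch below. The MyApp class, the language map and the redirect behaviour are all assumptions of this answer, not anything Kohana ships with:

class MyApp {

	// Map of subdomain prefixes to language codes usable with I18n::lang().
	// An empty prefix means the plain domain, i.e. the English version.
	protected static $languages = array(
		''   => 'en-us',
		'es' => 'es-es',
		'fr' => 'fr-fr',
	);

	/**
	 * Work out the language code from the Host header, redirecting to the
	 * default domain when the subdomain is unknown (e.g. xxx.example.com).
	 */
	public static function get_request_language_code()
	{
		$host = isset($_SERVER['HTTP_HOST']) ? $_SERVER['HTTP_HOST'] : 'example.com';

		// Everything before ".example.com" is treated as the language prefix.
		$prefix = preg_replace('/\.?example\.com$/', '', $host);

		if ( ! array_key_exists($prefix, self::$languages))
		{
			header('Location: http://example.com'.$_SERVER['REQUEST_URI'], TRUE, 302);
			exit;
		}

		return self::$languages[$prefix];
	}
}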
Then, I'd probably set the base URL using $_SERVER['HTTP_HOST'], i.e.:
Kohana::init(array(
'base_url' => $_SERVER['HTTP_HOST'],
));
Alternatively, I'd have another helper function which would return the domain prefix which I would concatenate with my actual domain, e.g.:
Kohana::init(array(
'base_url' => MyApp::get_request_language_url_prefix().'example.com',
));
By setting the domain name with the language prefix in base_url during Kohana::init(), you ensure that all URLs generated using either the Route or the URL class include the appropriate language.
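The prefix helper could be another method on the same hypothetical MyApp class, along these lines:

// Added to the MyApp class sketched above.
/**
 * Return the subdomain prefix for the current request ('' for the default
 * domain, 'es.' for Spanish, 'fr.' for French, ...), for use in base_url.
 */
public static function get_request_language_url_prefix()
{
	$code = self::get_request_language_code();

	// Reverse lookup in the language map; '' means the default domain.
	$prefix = array_search($code, self::$languages);

	return ($prefix === '' OR $prefix === FALSE) ? '' : $prefix.'.';
}

For a Spanish request, 'base_url' would then expand to 'es.example.com', while the default English version keeps the plain 'example.com'.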
And that's how I would try to solve the problem. Again, I'd like to stress that I am only attempting to come up with a possible solution here and the above approach has not been tested.

Related

Multilingual Jekyll website running on multiple domains

I'm trying to set up a proper solution for a multilingual website generated with Jekyll. I checked some plugins and some tricks that work without plugins, but I'm still not sure how to achieve it. I found that it's possible to generate the output for every language into a subfolder, e.g.:
/en/ contains English version of website
/cz/ contains Czech version of website
But in my case every language will be published on its own domain (example.com, example.cz), and this is where I'm running into trouble with the implementation. When I have every language in its own folder (/en/, /cz/), it means that {{page.url}} and permalinks will also contain that /en/... or /cz/... part.
Could you help me find the trick I need to use? What is the correct setup in this case?
Note: the only solution that comes close to my situation is this: https://frozenfractal.com/blog/2016/5/13/building-a-multilingual-website-in-jekyll/ However, it doesn't allow implementing a language switcher, because the solution excludes all files in the alternative languages. (When I'm on www.example.com/contact I need to be able to switch to the Czech alternative www.example.cz/kontakt.)
Two different URLs make sense to me. Google will have a different page rank for each of your sites, but that is the only downside I can think of. I would set the language and set alternate (hreflang) tags; you can use your page front matter to fill the alternate tags. If you succeed in building both sites from one repo, you might be able to automatically match the different language versions of your pages with an English page identifier (for your alternate tags).

What's the best way to use multiple languages on a website?

I was wondering what would be the best way to build a multi-language, template-based website. Say I want to offer my website in English and German; there are several different methods. My interest is mainly SEO, so which would be the best way for search engines?
The first way I often see is using a different directory for each language, for example www.example.com for English and www.example.com/de/ for the German translation. The disadvantage of this is that when a file changes, it has to be changed in every directory manually. And wouldn't search engines consider the two directories duplicate content?
The second way I know is just using some GET value like www.example.com?lang=de and then setting a cookie. But this way search engines probably won't even find the different languages.
So is there another way, or which one is the best?
I worked on internationalised websites until this year. The advice we always had from SEO gurus was to distinguish languages based on the URL - so, www.example.com/en and www.example.com/de.
I think this is also better for users; if I bookmark a page in German, then when I come back to it, I get a page in German even if my cookies have expired. Similarly, I can do things like post the URL on Facebook, and have my German-speaking friends click on it and get the site in German.
Note that if your site serves multiple countries, you should handle those along with language - so, you might have example.com/de-DE, example.com/en-GB, example.com/en-IE, etc.
However, this should not involve duplication. Instead, you should set your application up to process the URL, extract the locale information, and then forward the request internally to a locale-independent page. So, a request for example.com/de-DE/info and a request for example.com/en-IE/info should both be passed to /info.jsp (or I'm guessing info.php in your case). That page should then be coded to emit text in the appropriate language, using a page-level localisation mechanism.
Things are a bit trickier if you want the URLs themselves to be localised (e.g. example.org/de-DE/anmelden vs example.org/en-IE/sign-in). However, the same principle applies: extract the locale, then forward to a common page. The difference is that there must be more sophistication in figuring out which page the URL refers to; you will need a mapping from the natural-language slug in the URL to the page filename.
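To make that concrete, here is a rough PHP front-controller sketch of the extract-locale-then-forward idea; the file layout, slug map and default locale are invented for the example:

<?php
// Extract the locale from the first path segment, then hand off to a
// locale-independent page. Everything below is illustrative only.

$path = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);   // e.g. "/de-DE/anmelden"

if (preg_match('#^/([a-z]{2}-[A-Z]{2})(/.*)?$#', $path, $m)) {
    $locale = $m[1];                                        // e.g. "de-DE"
    $rest   = isset($m[2]) && $m[2] !== '' ? $m[2] : '/';
} else {
    $locale = 'en-GB';                                      // fall back to a default
    $rest   = $path;
}

// Map localised slugs back to one internal, locale-independent page.
$pages = array(
    '/'         => 'home',
    '/info'     => 'info',
    '/sign-in'  => 'signin',
    '/anmelden' => 'signin',
);

$page = isset($pages[$rest]) ? $pages[$rest] : 'home';

// The included page reads $locale and emits text in the right language.
include __DIR__ . '/pages/' . $page . '.php';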

Complex Orchard Layer Rules Support

Is it possible to use a more complicated layer URL rule matching syntax? I want to be able to choose the layer to display based on a more regex-like rule that matches the routes I have set up in my module's custom routing.
I would like to be able to achieve something along the lines of:
url('~/my-{\w*}/something/{\w*}')
It's not available out of the box, but it could be pretty easy to implement yourself in a custom module (if you don't want to alter the core code).
It can be implemented as a slight modification of the existing Orchard.Widgets.RuleEngine.UrlRuleProvider, so that regexes are also taken into account.
Just create an implementation of IRuleProvider, name your function e.g. 'urlregex' (so it doesn't collide with the existing 'url' processed by UrlRuleProvider) and do all the processing inside the Process(RuleContext ruleContext) method. It's a very simple class to implement and would involve just a few lines of code; take a look at the default URL rule provider mentioned above.

Passing an urlencoded URL as parameter to a controller / action at CakePHP

I'm fairly new to CakePHP, and because of that there are some basic things I used to do with Zend Framework that I'm struggling with in Cake.
I'm working on a project where I have to pass a named parameter to a controller / action. Setting up the route and passing the parameter is fairly simple; my problem is when the parameter is a URL-encoded URL.
For example, http://www.cakephp.com/controller/action/http%3A%2F%2Fwww.google.com will throw a 404 regardless of the controller and action setup, while /controller/action/http://www.google.com works after a fashion, except that it identifies the http part as a named parameter. On the other hand, /controller/action?url=http://www.google.com works.
The workaround I have used is to pass the value as a base64-encoded string, but that has some limitations. For instance, for an API there is no way to guarantee that the system calling the API can base64-encode a string.
Anyway, the best solution would still be to pass a URL-encoded string as a named parameter. The question is: why does CakePHP not accept a URL-encoded string as a parameter, and why does it throw a 404?
Thanks all in advance.
I have found a workaround for this issue. The previous answer, which pointed to a post, actually explained why it was happening along with one of the solutions. The problem is that the .htaccess workaround on Apache is a bit dangerous because it disables a security check.
There are two ways to work this out in code (and I'm using both):
Send all URLs as base64-encoded strings.
Accept the URLs as named params; but, as you will notice, Cake converts any http:// to http:/, so it is necessary to detect when this happens and only then correct the string (see the sketch below).
It is far from being a beautiful solution, but it is definitely a practical one.
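A minimal sketch of that second fix, in plain PHP; the helper name is made up for the example:

// If the router collapsed "http://" into "http:/" in a passed parameter,
// restore the double slash before using the value.
function restore_scheme($value)
{
    // Turns "http:/www.google.com" back into "http://www.google.com"
    // while leaving already-correct values untouched.
    return preg_replace('#^(https?:/)(?!/)#i', '$1/', $value);
}

// restore_scheme('http:/www.google.com') === 'http://www.google.com'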
I stumbled upon this same problem in CakePHP 4.x.
Apparently you can create a custom route with ** that disables the default URL decoding, which fixes the problem.
So right now I pass a base64_encode(Security::encrypt($val)) value into the Router::url() function. This URL-encodes the params by default, so it becomes a valid, working URL.
CakePHP then URL-decodes by default, which is good. But it does it twice, causing it to split up the params if there is a / present, which isn't good.
So in my routes.php I added:
$builder->connect('/orders/callback/**', ['controller' => 'orders', 'action' => 'callback']);
Kinda annoying how this works, but it works now. It works like a charm in 4.x and cost me an entire afternoon. Just leaving this here in case anyone else has this problem (and for future me).
Source: https://github.com/cakephp/cakephp/issues/4723#issuecomment-56912905
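For context, here is roughly what the receiving end could look like with that route in place; the controller class, action signature and the use of Security::getSalt() as the encryption key are assumptions, not taken from the linked issue:

// src/Controller/OrdersController.php (sketch)
namespace App\Controller;

use Cake\Utility\Security;

class OrdersController extends AppController
{
    // With the trailing ** route above, CakePHP passes the remainder of the
    // URL to the action as a single argument instead of splitting it on "/".
    public function callback(?string $token = null)
    {
        // Reverse of the base64_encode(Security::encrypt(...)) used when the
        // URL was built with Router::url().
        $value = Security::decrypt(base64_decode((string)$token), Security::getSalt());

        // ... handle $value ...
    }
}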

How do search engines crawl websites?

I am creating a multilingual website and I use a resource manager for each language.
When a user selects a language, all pages use the selected resource bundles.
Since the entire site is only available in one language at a time, how do search engines crawl the other languages?
Or do search engines crawl the optionally provided languages at all?
As you know, when you have a static multilingual website with a separate page for each language, you don't have any problem with search engines, since each page has a unique URL.
But in a dynamic application you don't have a separate page for each language and have to use resources instead; you can add a new language, remove an existing one, and so on.
Therefore, we have to use URL rewriting/routing to generate a unique URL for each language. Check out the following example.
Suppose we have a web form at the following URL and our application supports two languages (e.g. English United States, en-US, and English Great Britain, en-GB).
www.domain.com/home.aspx
There is a problem: we have one permanent URL for all languages, so search engines will only ever index the default language. The solution is simple: generate a separate URL for each language by using URL rewriting/routing, as follows.
www.domain.com/{country}/{language}/home.aspx
Afterwards, you have to infer the specified culture name from the above URL and set the current Culture and UICulture properties. The requested page will then be shown in the desired language.
In this case, the sitemap should be generated programmatically and use the same URL scheme as above:
www.domain.com/{country}/{language}/sitemap.xml
You have to infer the specified culture from the above URL and generate the sitemap for that culture. To introduce the available sitemaps to search engines, you have to use a robots.txt, which should be generated programmatically as well.
You might be using cookies/sessions to remember the selected language, right?
Neither of them affects search engines; they simply ignore cookies. However, if you rely on a session variable to remember the selected language, then in the absence of cookies a new session will be created on each request, cancelling the language selection.
Ankit
