Best practice multi language website

I've been struggling with this question for quite a few months now, but I haven't been in a situation before in which I needed to explore all the possible options. Right now, I feel like it's time to get to know the possibilities and form my own personal preference to use in my upcoming projects.

Let me first sketch the situation I'm looking at:

I'm about to upgrade/redevelop a content management system which I've been using for quite a while now, and I feel multi-language support would be a great improvement to this system. Before, I did not use any frameworks, but I'm going to use Laravel 4 for the upcoming project. Laravel seems the best choice for a cleaner way to code PHP. Sidenote: Laravel 4 should be no factor in your answer. I'm looking for general ways of translation that are platform/framework independent.

What should be translated

As the system I am looking for needs to be as user-friendly as possible, the method of managing the translations should be inside the CMS. There should be no need to start up an FTP connection to modify translation files or any HTML/PHP-parsed templates.

Furthermore, I'm looking for the easiest way to translate multiple database tables, preferably without the need to create additional tables.

What did I come up with myself

I've already been searching, reading and trying things myself, and there are a couple of options I have. But I still don't feel like I've reached a best-practice method for what I am really seeking. Right now, this is what I've come up with, but this method also has its side effects.

  • PHP Parsed Templates : the template system should be parsed by PHP. This way I'm able to insert the translated parameters into the HTML without having to open the templates and modify them. Besides that, PHP-parsed templates give me the ability to have one template for the complete website instead of having a subfolder for each language (which I've had before). The method to reach this target can be either Smarty, TemplatePower, Laravel's Blade or any other template parser. As I said, this should be independent of the written solution.
  • Database Driven : perhaps I don't need to mention this again, but the solution should be database driven. The CMS is aimed to be object oriented and MVC, so I would need to think of a logical data structure for the strings. As my templates are structured templates/Controller/View.php, perhaps this structure would make the most sense: Controller.View.parameter . The database table would have these fields along with a value field. Inside the templates we could use a method like echo __('Controller.View.welcome', array('name' => 'Joshua')) where the parameter contains Welcome, :name , the result being Welcome, Joshua . This seems a good way to do it, because parameters such as :name are easy for the editor to understand.
  • Low Database Load : Of course the above system would cause a lot of database load if these strings were loaded on the go. Therefore I would need a caching system that re-renders the language files as soon as they are edited/saved in the administration environment. Because files are generated, a good file-system layout is also needed. I guess we can go with languages/en_EN/Controller/View.php or .ini, whatever suits you best. Perhaps an .ini is even parsed quicker in the end. This file should contain the data in the format parameter=value; . I guess this is the best way of doing it, since each View that is rendered can include its own language file if it exists. Language parameters should then be loaded for a specific view and not in a global scope, to prevent parameters from overwriting each other.
  • Database Table translation : this in fact is the thing I'm most worried about. I'm looking for a way to create translations of News/Pages/etc. as quickly as possible. Having two tables for each module (for example News and News_translations ) is an option, but it feels like too much work to get a good system. One of the things I came up with is based on a data versioning system I wrote: there is one database table named Translations , and this table has a unique combination of language , tablename and primarykey . For instance: en_EN / News / 1 (referring to the English version of the News item with ID=1). But there are 2 huge disadvantages to this method: first of all this table tends to get pretty long with a lot of data in the database, and secondly it would be a hell of a job to use this setup to search the table. E.g. searching for the SEO slug of an item would be a full-text search, which is pretty dumb. On the other hand: it's a quick way to create translatable content in every table very fast, but I don't believe this pro outweighs the cons.
  • Front-end Work : Also the front-end would need some thinking. Of course we would store the available languages in a database and (de)activate the ones we need. This way the script can generate a dropdown to select a language, and the back-end can decide automatically which translations can be made using the CMS. The chosen language (e.g. en_EN) would then be used when getting the language file for a view, or to get the right translation for a content item on the website.
  • So, there they are: my ideas so far. They don't even include localization options for dates etc. yet, but as my server supports PHP 5.3.2+, the best option is to use the intl extension as explained here: http://devzone.zend.com/1500/internationalization-in-php-53/ - but this would be of use at a later stage of development. For now the main issue is how to follow best practices for translation of the content in a website.
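The "Low Database Load" idea above can be sketched in a few lines. This is only an illustration under assumptions not in the original post: a hypothetical translations table with lang, controller, view, param and value columns, and a PHP array file as the cache format instead of .ini. The CMS would call this after an editor saves a string, so the front end never queries the database for interface strings.

```php
<?php
// Hypothetical sketch: rebuild one view's language cache after an editor
// saves a translation. Table and column names are assumptions.
function regenerate_language_cache(PDO $db, string $lang, string $controller, string $view, string $baseDir = 'languages'): void
{
    $stmt = $db->prepare(
        'SELECT param, value FROM translations
         WHERE lang = ? AND controller = ? AND view = ?'
    );
    $stmt->execute([$lang, $controller, $view]);
    $pairs = $stmt->fetchAll(PDO::FETCH_KEY_PAIR);

    $dir = "$baseDir/$lang/$controller";
    if (!is_dir($dir)) {
        mkdir($dir, 0775, true);
    }
    // A plain PHP array file is usually at least as fast as .ini in the end,
    // because OPcache can keep the compiled file in memory.
    file_put_contents(
        "$dir/$view.php",
        '<?php return ' . var_export($pairs, true) . ';',
        LOCK_EX
    );
}
```

Whether the output is .ini or .php, the key point stays the same: the rendered view only ever includes a pre-generated file.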

    Besides everything I explained here, I still have another thing which I haven't decided yet. It looks like a simple question, but in fact it's been giving me headaches:

    URL Translation? Should we do this or not? and in what way?

    So... if I have this URL: http://www.domain.com/about-us and English is my default language, should this URL be translated into http://www.domain.com/over-ons when I choose Dutch as my language? Or should we go the easy road and simply change the content of the page visible at /about-us ? The last option doesn't seem valid, because that would generate multiple versions of the same URL, and thus indexing the content will fail.

    Another option is using http://www.domain.com/nl/about-us instead. This generates at least a unique URL for each piece of content. It would also be easier to go to another language, for example http://www.domain.com/en/about-us , and the URL provided is easier to understand for both Google and human visitors. Using this option, what do we do with the default language? Should the default language omit the language identifier by default, redirecting http://www.domain.com/en/about-us to http://www.domain.com/about-us ? In my eyes this is the best solution, because when the CMS is set up for only one language there is no need to have this language identification in the URL.

    And a third option is a combination of both options: using the "language-identification-less" URL ( http://www.domain.com/about-us ) for the main language, and a URL with a translated SEO slug for sub-languages: http://www.domain.com/nl/over-ons & http://www.domain.com/de/uber-uns

    I hope my question gets your heads cracking; it cracked mine for sure! It already helped me to work things out by writing it down as a question here, and gave me the possibility to review the methods I've used before and the ideas I have for my upcoming CMS.

    I would like to thank you already for taking the time to read this bunch of text!

    // Edit #1 :

    I forgot to mention: the __() function is an alias to translate a given string. Within this method there obviously should be some sort of fallback where the default text is loaded when no translation is available yet. If the translation is missing, it should either be inserted or the translation file should be regenerated.
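    As a sketch, such a __() helper with a fallback could look like the following. The associative-array parameter style and the languages/ path layout are assumptions chosen to match the ideas above, not an existing implementation.

```php
<?php
// Hypothetical __() helper: loads the cached language file for the view
// named in the key, falls back to the raw key when no translation exists,
// and substitutes :name style placeholders.
function __(string $key, array $params = [], string $lang = 'en_EN'): string
{
    static $cache = [];

    [$controller, $view, $param] = explode('.', $key, 3);

    $file = "languages/$lang/$controller/$view.php";
    if (!isset($cache[$file])) {
        $cache[$file] = is_file($file) ? require $file : [];
    }

    // Fallback: return the key itself so missing strings are easy to spot
    // (and could be queued for insertion / cache regeneration).
    $text = $cache[$file][$param] ?? $key;

    foreach ($params as $name => $value) {
        $text = str_replace(':' . $name, $value, $text);
    }
    return $text;
}

// e.g. echo __('Controller.View.welcome', ['name' => 'Joshua']);
```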


    Topic's premise

    There are three distinct aspects in a multilingual site:

  • interface translation
  • content
  • url routing
    While they are all interconnected in different ways, from a CMS point of view they are managed using different UI elements and stored differently. You seem to be confident in your implementation and understanding of the first two. The question was about the latter aspect: "URL Translation? Should we do this or not? and in what way?"

    What the URL can be made of?

    A very important thing is: don't get fancy with IDN. Instead, favor transliteration (also: transcription and romanization). While at first glance IDN seems a viable option for international URLs, it actually does not work as advertised, for two reasons:

  • some browsers will turn the non-ASCII chars like 'ч' or 'ž' into '%D1%87' and '%C5%BE'
  • if the user has custom themes, the theme's font is very likely not to have symbols for those letters
    I actually tried the IDN approach a few years ago in a Yii-based project (horrible framework, IMHO). I encountered both of the above-mentioned problems before scrapping that solution. Also, I suspect that it might be an attack vector.

    Available options ... as I see them.

    Basically you have two choices, which could be abstracted as:

  • http://site.tld/[:query] : where [:query] determines both language and content choice

  • http://site.tld/[:language]/[:query] : where [:language] part of URL defines the choice of language and [:query] is used only to identify the content

    Query is Α and Ω ..

    Let's say you pick http://site.tld/[:query] .

    In that case you have one primary source of language, the content of the [:query] segment, and two additional sources:

  • the value of $_COOKIE['lang'] for that particular browser
  • the list of languages in the HTTP Accept-Language (1), (2) header

    First, you need to match the query against one of the defined routing patterns (if your pick is Laravel, then read here). On a successful match of the pattern, you then need to find the language.

    You would have to go through all the segments of the pattern, find the potential translations for all of those segments, and determine which language was used. The two additional sources (cookie and header) would be used to resolve routing conflicts when (not "if") they arise.

    Take for example: http://site.tld/blog/novinka .

    That's a transliteration of "блог, новинка" , which in English means approximately "blog", "latest" .

    As you can already notice, in Russian "блог" will be transliterated as "blog". Which means that for the first part of [:query] you will (in the best-case scenario) end up with the list of possible languages ['en', 'ru'] . Then you take the next segment, "novinka". That might have only one language on the list of possibilities: ['ru'] .

    When the list has one item, you have successfully found the language.

    But if you end up with 2 possibilities (example: Russian and Ukrainian) or more, or with 0 possibilities as the case might be, you will have to use the cookie and/or header to find the correct option.

    And if all else fails, you pick the site's default language.
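    The elimination process described above can be sketched as follows. The segment-to-language lookup is assumed to have happened already, so this hypothetical function only receives the candidate lists; all inputs are plain arrays to keep the sketch self-contained.

```php
<?php
// Sketch of the language detection: intersect the candidate languages of
// every URL segment, then fall back to the cookie, the Accept-Language
// list and finally the site default.
function detect_language(array $candidatesPerSegment, ?string $cookieLang, array $acceptLangs, string $default): string
{
    $possible = array_shift($candidatesPerSegment) ?? [];
    foreach ($candidatesPerSegment as $candidates) {
        $possible = array_intersect($possible, $candidates);
    }
    $possible = array_values($possible);

    if (count($possible) === 1) {
        return $possible[0];              // unambiguous match
    }
    if ($cookieLang !== null && ($possible === [] || in_array($cookieLang, $possible, true))) {
        return $cookieLang;               // cookie resolves the conflict
    }
    foreach ($acceptLangs as $lang) {     // then Accept-Language order
        if (in_array($lang, $possible, true)) {
            return $lang;
        }
    }
    return $default;                      // all else failed
}
```

For the http://site.tld/blog/novinka example: "blog" yields ['en', 'ru'] and "novinka" yields ['ru'], so the intersection is ['ru'] and no tie-breaker is needed.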

    Language as parameter

    The alternative is to use a URL that can be defined as http://site.tld/[:language]/[:query] . In this case, when translating the query, you do not need to guess the language, because at that point you already know which one to use.

    There is also a secondary source of language: the cookie value. But here there is no point in messing with the Accept-Language header, because you are not dealing with an unknown number of possible languages in the case of a "cold start" (when a user opens the site with a custom query for the first time).

    Instead you have 3 simple, prioritized options:

  • if the [:language] segment is set, use it
  • if $_COOKIE['lang'] is set, use it
  • use the default language

    When you have the language, you simply attempt to translate the query, and if translation fails, use the "default value" for that particular segment (based on the routing results).
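    Those three prioritized options fit in a few lines. This is only a sketch; the superglobals are passed in as plain arrays so the function stays self-contained.

```php
<?php
// Sketch of the prioritized language resolution: URL segment first,
// cookie second, site default last.
function resolve_language(?string $segment, array $cookie, string $default): string
{
    if ($segment !== null && $segment !== '') {
        return $segment;          // the [:language] segment wins
    }
    return $cookie['lang'] ?? $default;
}
```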

    Isn't there a third option?

    Yes, technically you can combine both approaches, but that would complicate the process and only accommodate people who want to manually change the URL from http://site.tld/en/news to http://site.tld/de/news and expect the news page to change to German.

    But even this case could probably be mitigated using the cookie value (which would contain information about the previous choice of language), to implement it with less magic and hope.

    Which approach to use?

    As you might have already guessed, I would recommend http://site.tld/[:language]/[:query] as the more sensible option.

    Also, in a real-world situation you would have a third major part in the URL: the "title", as in the name of a product in an online shop or the headline of an article on a news site.

    Example: http://site.tld/en/news/article/121415/EU-as-global-reserve-currency

    In this case '/news/article/121415' would be the query, and the 'EU-as-global-reserve-currency' part is the title, purely for SEO purposes.
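    Splitting such a URL is straightforward. This sketch assumes the fixed segment layout of the example above (language first, SEO title last); it is not a general-purpose router.

```php
<?php
// Sketch: split '/en/news/article/121415/EU-as-global-reserve-currency'
// into language, routing query and the SEO-only title.
function split_url(string $path): array
{
    $segments = explode('/', trim($path, '/'));
    $language = array_shift($segments);  // e.g. 'en'
    $title    = array_pop($segments);    // ignored for routing, SEO only
    return [
        'language' => $language,
        'query'    => '/' . implode('/', $segments),
        'title'    => $title,
    ];
}
```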

    Can it be done in Laravel?

    Kinda, but not by default.

    I am not too familiar with it, but from what I have seen, Laravel uses a simple pattern-based routing mechanism. To implement multilingual URLs you will probably have to extend the core class(es), because multilingual routing needs access to different forms of storage (database, cache and/or configuration files).

    It's routed. What now?

    As a result of all this, you would end up with two valuable pieces of information: the current language and the translated segments of the query. These values can then be used to dispatch to the class(es) which will produce the result.

    Basically, the following URL: http://site.tld/ru/blog/novinka (or the version without '/ru' ) gets turned into something like

    $parameters = [
       'language' => 'ru',
       'classname' => 'blog',
       'method' => 'latest',
    ];
    

    Which you just use for dispatching:

    $className = $parameters['classname'];
    $instance  = new $className;
    $instance->{'get'.$parameters['method']}( $parameters );
    

    .. or some variation of it, depending on the particular implementation.


    Implementing i18n Without The Performance Hit Using a Pre-Processor, as suggested by Thomas Bley

    At work, we recently went through an implementation of i18n on a couple of our properties, and one of the things we kept struggling with was the performance hit of dealing with on-the-fly translation. Then I discovered this great blog post by Thomas Bley, which inspired the way we're using i18n to handle large traffic loads with minimal performance issues.

    Instead of calling functions for every translation operation, which as we know in PHP is expensive, we define our base files with placeholders, then use a pre-processor to cache those files (we store the file modification time to make sure we're serving the latest content at all times).

    The Translation Tags

    Thomas uses {tr} and {/tr} tags to define where translations start and end. Because we're using Twig, we don't want to use { (to avoid confusion), so we use [%tr%] and [%/tr%] instead. Basically, it looks like this:

    return [%tr%]formatted_value[%/tr%];
    

    Note that Thomas suggests using the base English in the file. We don't do this because we don't want to have to modify all of the translation files if we change the value in English.

    The INI Files

    Then, we create an INI file for each language, in the format placeholder = translated :

    // lang/fr.ini
    formatted_value = number_format($value * Model_Exchange::getEurRate(), 2, ',', ' ') . '€'
    
    // lang/en_gb.ini
    formatted_value = '£' . number_format($value * Model_Exchange::getStgRate())
    
    // lang/en_us.ini
    formatted_value = '$' . number_format($value)
    

    It would be trivial to allow a user to modify these inside the CMS: just get the key pairs with a preg_split on \n and = , and make the CMS able to write to the INI files.
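    A sketch of that CMS-side round trip, reading and writing the key pairs by hand. parse_ini_file() is avoided deliberately here, since the values are PHP expressions that INI parsing could mangle; the function names are made up for illustration.

```php
<?php
// Hypothetical CMS helpers: read an INI-style language file into key/value
// pairs and write edited pairs back.
function read_lang_pairs(string $file): array
{
    $pairs = [];
    foreach (preg_split('/\R/', file_get_contents($file), -1, PREG_SPLIT_NO_EMPTY) as $line) {
        if (strpos($line, '=') === false) {
            continue;                     // skip comments / malformed lines
        }
        [$key, $value] = array_map('trim', explode('=', $line, 2));
        $pairs[$key] = $value;
    }
    return $pairs;
}

function write_lang_pairs(string $file, array $pairs): void
{
    $lines = [];
    foreach ($pairs as $key => $value) {
        $lines[] = $key . ' = ' . $value;
    }
    file_put_contents($file, implode("\n", $lines) . "\n", LOCK_EX);
}
```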

    The Pre-Processor Component

    Essentially, Thomas suggests using a just-in-time 'compiler' (though, in truth, it's a preprocessor) function like this to take your translation files and create static PHP files on disk. This way, we essentially cache our translated files instead of calling a translation function for every string in the file:

    // This function was written by Thomas Bley, not by me
    function translate($file) {
      $cache_file = 'cache/'.LANG.'_'.basename($file).'_'.filemtime($file).'.php';
      // (re)build translation?
      if (!file_exists($cache_file)) {
        $lang_file = 'lang/'.LANG.'.ini';
        $lang_file_php = 'cache/'.LANG.'_'.filemtime($lang_file).'.php';
    
        // convert .ini file into .php file
        if (!file_exists($lang_file_php)) {
          file_put_contents($lang_file_php, '<?php $strings='.
            var_export(parse_ini_file($lang_file), true).';', LOCK_EX);
        }
        // translate .php into localized .php file
        $tr = function($match) use (&$lang_file_php) {
          static $strings = null;
          if ($strings===null) require($lang_file_php);
          return isset($strings[ $match[1] ]) ? $strings[ $match[1] ] : $match[1];
        };
        // replace all [%tr%]abc[%/tr%] by tr()
        file_put_contents($cache_file, preg_replace_callback(
          '#\[%tr%\](.*?)\[%/tr%\]#', $tr, file_get_contents($file)), LOCK_EX);
      }
      return $cache_file;
    }
    

    Note: I didn't verify that the regex works, as I didn't copy it from our company server, but you can see how the operation works.

    How to Call It

    Again, this example is from Thomas Bley, not from me:

    // instead of
    require("core/example.php");
    echo (new example())->now();
    
    // we write
    define('LANG', 'en_us');
    require(translate('core/example.php'));
    echo (new example())->now();
    

    We store the language in a cookie (or a session variable if we can't get a cookie) and then retrieve it on every request. You could combine this with an optional $_GET parameter to override the language, but I don't suggest subdomain-per-language or page-per-language, because it'll make it harder to see which pages are popular and will reduce the value of inbound links, as you'll have them more sparsely spread.
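    That bootstrap logic might look like the following sketch. The whitelist of available languages is an assumption added here so an arbitrary ?lang= value can't be injected; superglobals are passed in as arrays to keep the sketch self-contained.

```php
<?php
// Sketch: pick the language from an optional ?lang= override, then the
// cookie, then the default, validating against a whitelist.
function current_language(array $get, array $cookie, array $available, string $default): string
{
    if (isset($get['lang']) && in_array($get['lang'], $available, true)) {
        return $get['lang'];   // caller should also refresh the cookie
    }
    if (isset($cookie['lang']) && in_array($cookie['lang'], $available, true)) {
        return $cookie['lang'];
    }
    return $default;
}

// e.g. define('LANG', current_language($_GET, $_COOKIE, ['en_us', 'en_gb', 'fr'], 'en_us'));
```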

    Why use this method?

    We like this method of preprocessing for three reasons:

  • The huge performance gain from not calling a whole bunch of functions for content which rarely changes (with this system, 100k visitors in French will still only end up running translation replacement once).
  • It doesn't add any load to our database, as it uses simple flat-files and is a pure-PHP solution.
  • The ability to use PHP expressions within our translations.
    Getting Translated Database Content

    We just add a column to our content tables in the database called language , then we use an accessor method for the LANG constant which we defined earlier on, so our SQL calls (using ZF1, sadly) look like this:

    $query = $this->select()->from($this->_name)
                     ->where('language = ?', User::getLang())
                     ->where('id       = ?', $articleId)
                     ->limit(1);
    

    Our articles have a compound primary key over id and language so article 54 can exist in all languages. Our LANG defaults to en_US if not specified.
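    The same compound-key lookup can be written against plain PDO instead of ZF1; the table and column names below mirror the example and are assumptions.

```php
<?php
// Sketch: fetch one article row by its compound (id, language) key.
function fetch_article(PDO $db, int $id, string $lang = 'en_US'): ?array
{
    $stmt = $db->prepare(
        'SELECT * FROM articles WHERE id = ? AND language = ? LIMIT 1'
    );
    $stmt->execute([$id, $lang]);
    $row = $stmt->fetch(PDO::FETCH_ASSOC);
    return $row === false ? null : $row;
}
```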

    URL Slug Translation

    I'd combine two things here: one is a function in your bootstrap which accepts a $_GET parameter for language and overrides the cookie variable, and the other is routing which accepts multiple slugs. Then you can do something like this in your routing:

    "/wilkommen" => "/welcome/lang/de"
    ... etc ...
    

    These could be stored in a flat file which could be easily written to from your admin panel. JSON or XML may provide a good structure for supporting them.
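    With JSON as the storage format (the file name and structure are assumptions), the lookup the admin panel would feed is just a map from localized slug to canonical route:

```php
<?php
// Sketch: resolve a localized slug like '/wilkommen' to its canonical
// route using a flat JSON map that the admin panel can rewrite.
function route_slug(string $slug, string $mapFile): ?string
{
    $map = json_decode(file_get_contents($mapFile), true);
    return $map[$slug] ?? null;
}
```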

    Notes Regarding A Few Other Options

    PHP-based On-The-Fly Translation

    I can't see that these offer any advantage over pre-processed translations.

    Front-end Based Translations

    I've long found these interesting, but there are a few caveats. For example, you have to make the entire list of phrases on your website that you plan to translate available to the user; this could be problematic if there are areas of the site you're keeping hidden or haven't allowed them access to.

    You'd also have to assume that all of your users are willing and able to use JavaScript on your site; from my statistics, around 2.5% of our users are running without it (or using NoScript to block our sites from using it).

    Database-Driven Translations

    PHP's database connectivity speeds are nothing to write home about, and this adds to the already high overhead of calling a function on every phrase to translate. The performance & scalability issues seem overwhelming with this approach.


    I suggest you not reinvent the wheel: use gettext and the ISO language abbreviation list. Have you seen how i18n/l10n is implemented in popular CMSes or frameworks?

    Using gettext you will have a powerful tool in which many cases are already implemented, like plural forms of numbers. In English you have only 2 options: singular and plural. But in Russian, for example, there are 3 forms, and it's not as simple as in English.

    Also, many translators already have experience working with gettext.

    Take a look at CakePHP or Drupal . Both are multilingual-enabled: CakePHP as an example of interface localization and Drupal as an example of content translation.

    For l10n, using the database isn't the way to go at all. It would mean tons of queries. The standard approach is to get all l10n data into memory at an early stage (or during the first call to the l10n function, if you prefer lazy loading). That can mean reading from a .po file or reading all the data from the DB at once, and then just reading the requested strings from an array.

    If you need to implement an online tool to translate the interface, you can have all that data in the DB, but then still save all the data to a file to work with. To reduce the amount of data in memory, you can split all your translated messages/strings into groups and then load only the groups you need, if possible.

    So you are totally right in your #3, with one exception: usually it is one big file, not a per-controller file or the like, because it is best for performance to open one file. You probably know that some high-load web apps compile all PHP code into one file to avoid file operations when include/require is called.

    About URLs: Google indirectly suggests using translation:

    to clearly indicate French content: http://example.ca/fr/vélo-de-montagne.html

    Also, I think you need to redirect the user to the default language prefix, e.g. http://example.com/about-us redirects to http://example.com/en/about-us . But if your site uses only one language, you don't need prefixes at all.

    Check out: http://www.audiomicro.com/trailer-hit-impact-psychodrama-sound-effects-836925 http://nl.audiomicro.com/aanhangwagen-hit-effect-psychodrama-geluidseffecten-836925 http://de.audiomicro.com/anhanger-hit-auswirkungen-psychodrama-sound-effekte-836925

    Translating content is a more difficult task. I think there will be some differences between different types of content, e.g. articles, menu items, etc. But in #4 you're on the right track. Take a look at Drupal to get more ideas. It has a clear enough DB schema and a good enough interface for translating: you create an article and select a language for it, and then you can later translate it into other languages.

    (Screenshot: Drupal translation interface)

    I don't think the URL slugs are a problem. You can just create a separate table for slugs, and that will be the right decision. With the right indexes it isn't a problem to query the table even with a huge amount of data. And it isn't a full-text search but a string match, if you use the varchar data type for the slug, and you can have an index on that field too.
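    That slug table idea can be sketched as an exact varchar match, which an ordinary index serves well; the slugs table and its columns are assumptions for illustration.

```php
<?php
// Sketch: look up the record behind a localized slug with a plain string
// match, not a full-text search.
function find_by_slug(PDO $db, string $slug, string $lang): ?int
{
    $stmt = $db->prepare(
        'SELECT record_id FROM slugs WHERE slug = ? AND language = ? LIMIT 1'
    );
    $stmt->execute([$slug, $lang]);
    $id = $stmt->fetchColumn();
    return $id === false ? null : (int) $id;
}
```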

    PS Sorry, my English is far from perfect though.
