gettext command-line utility not looking up translations - locale

I'm trying to translate some strings using gettext from a shell script under Linux, but have had no success. For example the following always returns the untranslated string:
LC_MESSAGES=et_EE gettext iso_639-3 'Estonian, Standard'
I've tried setting many combinations of the LC_ALL, LANG, LANGUAGE and LC_MESSAGES environment variables to et, et_EE, et_EE.UTF-8 or empty strings, but nothing seems to work, even when using a clean environment (e.g. with env -i LC_MESSAGES=et_EE).
Although gettext --help clearly shows that the standard search directory is /usr/share/locale, the respective translation files (e.g. iso_639-3.mo) are properly installed under /usr/share/locale/et/LC_MESSAGES/ and locale-gen has been run, gettext does not seem to look for the translations under /usr/share/locale/ at all. Output from strace -f -e trace=%file shows that gettext only tries to read /usr/lib/locale/locale-archive, /usr/lib/locale/et_EE/LC_MESSAGES and /usr/lib/locale/et/LC_MESSAGES, with the last two openat() system calls failing with ENOENT since those files don't exist.
I'm probably just doing something plain stupid and don't know how to use the gettext utility. What am I doing wrong? How do I use the gettext tool properly?

Related

SConstruct 101—moving on from Makefiles

Like make, scons has a large number of predefined variables and rules. (Try scons | wc on an SConstruct containing env = Environment(); print(env.Dump()) to see how extensive the set is.)
But suppose we aren't after the wizardry of presets, but rather want to do something a lot more primitive: simulate launching a few instructions from the (bash, etc.) command line.
Also suppose we're quite happy with the default Decider('MD5'). What is the translation of the one-source-one-target:
out/turquoise.xyz: out/chartreuse.xyz
    chartreuse_to_turquoise $< $@
of the two-source-one-target:
out/purple.xyz: out/lilac.xyz out/salmon.xyz
    gen_purple $^ $@
and of:
run_this:
    python prog.py
which we would run on-demand by typing make run_this?
What does the SConstruct for these elementary constructs look like?
All the answers you're looking for are in the User Guide (and the man page).
Firstly, assuming you don't want to scan the input files for additional dependencies (included files specified inside them), you can use Command()
(See info here: https://scons.org/doc/production/HTML/scons-user.html#chap-builders-commands)
Then you'll want an alias to specify a non-file command-line target
(See here: https://scons.org/doc/production/HTML/scons-user.html#chap-alias)
Putting those two together yields
env = Environment()
# one source, one target
env.Command('out/turquoise.xyz', 'out/chartreuse.xyz', 'chartreuse_to_turquoise $SOURCE $TARGET')
# two sources, one target
env.Command('out/purple.xyz', ['out/lilac.xyz', 'out/salmon.xyz'], 'gen_purple $SOURCES $TARGET')
# Your .PHONY-style make target. This is not great for reproducibility or for
# deciding when it should be rerun, because it specifies no sources or targets.
# An alias with an attached action also needs AlwaysBuild() to run on demand:
env.AlwaysBuild(env.Alias('run_this', [], 'python prog.py'))
Note: SCons does NOT propagate your shell environment variables. So if you depend on (for example) a non-system path in your PATH, you'll need to specify that explicitly in env['ENV']['PATH']. For more details, read through the User Guide, man page and FAQ.
https://scons.org/doc/production/HTML/scons-user.html
https://scons.org/doc/production/HTML/scons-man.html
https://scons.org/faq.html
And you can reach the community directly via our Discord server, IRC channel, or users mailing list.

How to pass a QMAKE variable from the command line?

I am trying to cross-compile Qt from a Linux terminal. When I run qmake, it applies the mkspecs qmake.conf in my context in such a manner that the CROSS_COMPILE variable must be defined. For example, there is a critical conf line that looks like this:
QMAKE_CXX = $${CROSS_COMPILE}g++
qmake returns an error, though, which clearly indicates that $${CROSS_COMPILE} is not being resolved. It simply uses "g++" instead of the whole value that ought to be there.
I've tried to invoke qmake and define the variable from a bash script like this:
qmake qt.pro "CROSS_COMPILE=${CROSS_COMPILE}"
And like this:
qmake qt.pro -- "CROSS_COMPILE=${CROSS_COMPILE}"
And a few other such stabs at it. I've also tried hard-coding the value in that command in case that had anything to do with it. I've tried defining it as an environment variable too (just in case)...
Nothing works. Yet I've seen piles of examples where this syntax seems to be valid. What am I doing wrong? Could there be a character-escaping complication?
Your problem is that the shell already interpreted the ${} inside your string as a form of variable substitution.
Since you did not define the variable CROSS_COMPILE in the shell, it had no value, so what qmake actually got were the two arguments qt.pro and CROSS_COMPILE=, meaning you have in fact made qmake set CROSS_COMPILE to an empty value.
What you should try is:
qmake qt.pro "CROSS_COMPILE=\${CROSS_COMPILE}"
Note the backslash before the dollar sign: it escapes it so the shell does not give it any special meaning, and the ${} is passed on literally to qmake.
This question has already been asked on Stack Overflow as well:
Define a string in qmake command line
More on variable substitution in Bash:
https://www.gnu.org/software/bash/manual/html_node/Shell-Parameter-Expansion.html
EDIT:
Example:
I just tried it myself with a fresh project file with the following contents:
SOME_OTHER_VAR=$${SOME_VAR}_something
message($${SOME_OTHER_VAR})
and doing
SOME_VAR=value
qmake qmake_variables.pro "SOME_VAR=${SOME_VAR}"
does work for me, printing:
Project MESSAGE: value_something
This is not the best answer, but I "solved" the problem by adding this to my qmake.conf:
CROSS_COMPILE=$$(CROSS_COMPILE)
That defined the variable in qmake by getting it from an environment variable I set in my calling bash script.

How to replace paths to executables in source code with Nix that are not in PATH

I wish to write some Haskell that calls an executable as part of its work, and to install this on a NixOS host. I don't want the executable to be in my PATH (and relying on that would disrupt the beautiful dependency model of nix).
If this were, say, a Perl script, I would have a simple builder that looked for strings of a certain format and replaced them with the executable names, based upon dependencies declared in the .nix file. But that seems somewhat harder with the cabal-based building common to Haskell.
Is there a standard idiom for encoding the paths to executables at build time (including during development, as well as at install time) within Haskell code on nix?
For the sake of a concrete example, here is a trivial "script":
import System.Process ( readProcess )

main = do
  stdout <- readProcess "hostname" [] ""
  putStrLn $ "Hostname: " ++ stdout
I would like to be able to compile and run this (in principle) without relying on hostname being in the PATH, but rather replacing hostname with the full /nix/store/-inetutils-/bin/hostname path, and thus also gaining the benefits of dependency management under nix.
This could possibly be managed by using a shell (or similar) script, built using a replacement scheme as defined above, that sets up the environment the Haskell executable expects; but that would still need some bootstrapping via cabal.mkDerivation, and since I'm a lover of optparse-applicative's bash completion, I'm loath to slow that down with another script to fire up every time I hit the tab key. But if that's what's needed, fair enough.
I did look through cabal.mkDerivation for some sort of pre-build step, but if it's there I'm not seeing it.
Thanks,
Assuming you're building the Haskell app in Nix, you can patch a configuration file via your Nix expression. For an example of how to do this, have a look at this small project.
The crux is that you can define a postConfigure hook like this:
pkgs.haskell.lib.overrideCabal yourProject (old: {
  postConfigure = ''
    substituteInPlace src/Configuration.hs --replace 'helloPrefix = Nothing' 'helloPrefix = Just "${pkgs.hello}"'
  '';
})
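On the Haskell side, the module being patched might look something like this (a sketch inferred from the substitution above; everything except the helloPrefix literal is assumed):
module Configuration where

-- The postConfigure hook above rewrites this literal to
-- 'Just "/nix/store/...-hello-..."' before the package is compiled,
-- so the path is baked in at build time instead of being looked up in PATH.
helloPrefix :: Maybe FilePath
helloPrefix = Nothing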
What I do with my xmonad build in nix [1] is refer to executable paths as things like ##compton##/bin/compton. Then I use a script like this to generate my default.nix file:
#!/usr/bin/env bash
set -eu

# Collect all ##package## tags referenced in the Haskell source.
packages=($(grep '##[^#]*##' src/Main.hs | sed -e 's/.*##\(.*\)##.*/\1/' | sort -u))

extra_args=()
for p in "${packages[@]}"; do
  extra_args+=(--extra-arguments "$p")
done

# Generate the nix expression, dropping its closing brace so a
# patchPhase can be appended before closing it again.
cabal2nix . "${extra_args[@]}" \
  | head -n-1

echo "  patchPhase = ''";
echo "    substituteInPlace src/Main.hs \\"
for p in "${packages[@]}"; do
  echo "      --replace '##$p##' '\${$p}' \\"
done
echo "  '';"
echo "}"
What it does is grep through src/Main.hs (it could easily be changed to find all Haskell files, or some specific configuration module) and pick out all the tags surrounded by ##, like ##some-package-name##. It then does two things with them:
1. passes them to cabal2nix as extra arguments for the nix expression it generates
2. post-processes the nix expression output from cabal2nix to add a patch phase, which replaces the ##some-package-name## tags in the Haskell source file with the actual path to the derivation. [2]
This generates a nix-expression like this:
{ mkDerivation, base, compton, networkmanagerapplet, notify-osd
, powerline, setxkbmap, stdenv, synapse, system-config-printer
, taffybar, udiskie, unix, X11, xmonad, xmonad-contrib
}:
mkDerivation {
  pname = "xmonad-custom";
  version = "0.0.0.0";
  src = ./.;
  isLibrary = false;
  isExecutable = true;
  executableHaskellDepends = [
    base taffybar unix X11 xmonad xmonad-contrib
  ];
  description = "My XMonad build";
  license = stdenv.lib.licenses.bsd3;
  patchPhase = ''
    substituteInPlace src/Main.hs \
      --replace '##compton##' '${compton}' \
      --replace '##networkmanagerapplet##' '${networkmanagerapplet}' \
      --replace '##notify-osd##' '${notify-osd}' \
      --replace '##powerline##' '${powerline}' \
      --replace '##setxkbmap##' '${setxkbmap}' \
      --replace '##synapse##' '${synapse}' \
      --replace '##system-config-printer##' '${system-config-printer}' \
      --replace '##udiskie##' '${udiskie}' \
  '';
}
The net result is I can just write Haskell code and a cabal package file; I don't have to worry much about maintaining the nix package file as well, only re-running my generate-nix script if my dependencies change.
In my Haskell code I just write paths to executables as if ##the-nix-package-name## was an absolute path to a folder where that package is installed, and everything magically works.
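For the hostname example from the question, the source ends up looking something like this (a sketch; the inetutils package name is taken from the question and would need to appear as one of the ## tags):
import System.Process ( readProcess )

main :: IO ()
main = do
  -- "##inetutils##" is rewritten by the generated patchPhase to that
  -- package's /nix/store path before compilation, so no PATH lookup happens.
  stdout <- readProcess "##inetutils##/bin/hostname" [] ""
  putStrLn $ "Hostname: " ++ stdout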
The installed xmonad binary ends up containing hardcoded references to the absolute paths to the executables I call, which is how nix likes to work (this means it automatically knows about the dependency during garbage collection, for example). And I don't have to worry about keeping the things I called in my interactive environment's PATH, or maintaining a wrapper that sets up PATH just for this executable.
[1] I have it set up as a cabal project that gets built and installed into the nix store, rather than having it dynamically recompile itself from ~/.xmonad/xmonad.hs
[2] Step 2 is a little meta, since I'm using a bash script to generate nix code with an embedded bash script in it.
This is not intended to be the answer, but if I posted this in the comment section it would end up badly formatted.
Also, I am not sure if this hack is the right way to do the job.
I notice that if I use nix-shell I can get the full path into the nix store.
Assuming the hash is always the same (AFAIK it is), you can hard-code it in the build recipe.
$ which bash
/run/current-system/sw/bin/bash
[wizzup# ~]
$ nix-shell -p bash
[nix-shell:~]$ which bash
/nix/store/wb34dgkpmnssjkq7yj4qbjqxpnapq0lw-bash-4.4-p12/bin/bash
Lastly, I doubt you have to do any of this if you use buildInputs; it should be the same path.

programmatically access IME

Is there a way to access a Japanese or Chinese IME either from the command line or from Python? I have Linux/OS X/Win8 boxes, so whichever system exposes the most easily accessible API is fine.
I'm experimenting with building a Japanese kana-kanji conversion algorithm and would like to establish a baseline using existing tools. I also have some collections of kana I would like to process.
Preferably I would like something along the lines of
$ ime JP "きしゃのきしゃがきしゃできしゃした"
貴社の記者が汽車で帰社した
I've looked at anthy, mozc and dbus on Linux but can't find any way to interact with them via the terminal or via scripting (such as Python).
Anthy provides a CLI tool.
Personally, I prefer Google's IME / mozc for better results, but perhaps this helps.
The source for anthy (SourceForge, file anthy-9100h.tar.gz) includes a simple CLI program for testing. Download the source file, extract it, and run:
./configure && make
Enter the directory test, which contains the binary anthy. By default, it reads from test.txt and uses EUC-JP encoding.
Simple test:
Input file test.txt
*にほんごにゅうりょく
*もももすももももものうち。
Run (using iconv to convert to UTF-8):
./anthy --all | iconv -f EUC-JP -t UTF-8
Output:
1:(にほんごにゅうりょく)
|にほんご|にゅうりょく
にほんご(日本語:(1,1000,N,72089)2500,001 ,にほんご:(N,0,-)2 ,ニホンゴ:(N,0,-)1 ,):
にゅうりょく(入力:(1,1000,N,62394)2500,001 ,にゅうりょく:(N,0,-)2 ,ニュウリョク:(N,0,-)1 ,):
2:(もももすももももものうち。)
|ももも|すももも|もものうち|。
ももも(桃も:(,1000,Ny,72089)225,279 ,ももも:(N,1000,Ny,72089)220,773 ,モモも:(,1000,Ny,72089)205,004 ,腿も:(,1000,Ny,72089)204,722 ,股も:(,1000,Ny,72089)146,431 ,モモモ:(N,0,-)1 ,):
すももも(すももも:(N,1000,Ny,72089)202,751 ,スモモも:(,1000,Ny,72089)168,959 ,李も:(,1000,Ny,72089)168,677 ,スモモモ:(N,0,-)1 ,):
もものうち(桃のうち:(,1000,N,655)2,047 ,もものうち:(N,1000,N,655)2,006 ,モモのうち:(,1000,N,655)1,863 ,腿のうち:(,1000,N,655)1,861 ,股のうち:(,1000,N,655)1,331 ,モモノウチ:(N,0,-)1 ,):
。(。:(1N,100,N,70203)57,040 ,.:(1,100,N,70203)52,653 ,.:(1,100,N,70203)3,840 ,):
You can uncomment some printf statements in the source files test/main.c and src-main/context.c to make the output more readable/parsable, e.g.:
1 にほんごにゅうりょく
にほんご 日本語
にゅうりょく 入力
2 もももすももももものうち。
ももも 桃も
すももも すももも
もものうち 桃のうち
。 。

Handling command line options with multiple arguments for some flags

I'm writing a program where the command line usage should be something like:
mkblueprint FILE FILE FILE -o <output name> -s <string> -r <number> -p pOPT1 pOPT2 pOPT3
I'm currently using CmdLib and I can't figure out a way to handle this: a flag is required for each input (so I can't just have FILEs sitting alone), and there doesn't appear to be a way to pass multiple arguments to a flag, as with -p. These are extremely common in command-line programs, so I figure I'm just misunderstanding the documentation, but it's not mentioned in any command-line library I've looked at for Haskell.
After some more work with CmdLib I was able to handle the bare FILE input via the Extra tag and then checking that each string is a valid file, which seems to be the standard way to handle it despite the name. -p pOPT1 pOPT2 pOPT3 is apparently not allowed under the POSIX standard, which is why I'm not finding libraries that will do it.
You might consider the GetOpt bindings that come with base. They're not as sexy as some of the more modern alternatives, but they support bare arguments and final options well.
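A minimal sketch of that approach (only the -o and -s flags from the usage line are modelled here; the option names are illustrative): bare FILE arguments come back alongside the parsed flags, so no flag is needed for them.
import System.Console.GetOpt
import System.Environment ( getArgs )

-- Illustrative flags; a real mkblueprint parser would model -r and -p as well.
data Flag = Output String | Str String
  deriving Show

options :: [OptDescr Flag]
options =
  [ Option ['o'] ["output"] (ReqArg Output "NAME")   "output name"
  , Option ['s'] ["string"] (ReqArg Str    "STRING") "a string option"
  ]

main :: IO ()
main = do
  argv <- getArgs
  -- Permute allows bare FILE arguments to be interleaved with flags;
  -- they are returned as the second element of the result.
  case getOpt Permute options argv of
    (flags, files, []) -> print (flags, files)
    (_, _, errs)       -> ioError (userError (concat errs ++ usageInfo "mkblueprint" options))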

Resources