Can a makefile update the calling environment? - linux

Is it possible to update the environment from a makefile? I want to be able to create a target to set the client environment variables for them. Something like this:
AXIS2_HOME ?= /usr/local/axis2-1.4.1
JAVA_HOME ?= /usr/java/latest
CLASSPATH := foo foo
setenv:
	export AXIS2_HOME
	export JAVA_HOME
	export CLASSPATH
So that the client can simply do:
make setenv all
java MainClass
and have it work without them needing to set the classpath for the java execution themselves.
Or am I looking to do this the wrong way and there is a better way?

No, you can't update the environment in the calling process this way. In general, a subprocess cannot modify the environment of the parent process. One notable exception is batch files on Windows, when run from a cmd shell. Based on the example you show, I guess you are not running on Windows though.
Usually, what you're trying to accomplish is done with a shell script that sets up the environment and then invokes your intended process. For example, you might write a go.sh script like this:
#!/bin/sh
AXIS2_HOME=/usr/local/axis2-1.4.1
JAVA_HOME=/usr/java/latest
CLASSPATH="foo foo"
export AXIS2_HOME
export JAVA_HOME
export CLASSPATH
java MainClass
Make go.sh executable and now you can run your app as ./go.sh. You can make your script more elaborate too, if you like -- for example, you may want to make "MainClass" a parameter to the script rather than hard coding it.
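For instance, a minimal sketch of such a parameterized variant (the script name run.sh and the use of "$@" are just one way to do it):
#!/bin/sh
# Set up the environment, then run whatever java command line was passed in.
AXIS2_HOME=/usr/local/axis2-1.4.1
JAVA_HOME=/usr/java/latest
CLASSPATH="foo foo"
export AXIS2_HOME JAVA_HOME CLASSPATH
java "$@"
Invoked as ./run.sh MainClass arg1 arg2, everything after the script name is passed to java with the environment already in place.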

From your question I am assuming you're using the bash shell.
You can place the variable definitions in a shell script, like so:
AXIS2_HOME=/usr/local/axis2-1.4.1
export AXIS2_HOME
#etc
And then source the script into the current environment, with
source <filename>
or just
. <filename>
That executes the script in the current shell (i.e. no child process), so any environment changes the script makes will persist.
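For example, assuming the definitions above live in a file named setenv.sh (a name chosen just for illustration), the client from the original question could do:
. ./setenv.sh
make all
java MainClass
Because setenv.sh is sourced rather than executed, AXIS2_HOME, JAVA_HOME and CLASSPATH stay set for the make and java commands that follow.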

The quick answer is yes, but in your code you would need to define the variables inside the setenv: target. Defining them at the beginning of the Makefile makes them local to the Makefile. I would use LOCAL_... names at the top of the file and then set them in the setenv: target with VAR=LOCAL_VAR and so on. Also remember that you will need to call the makefile with make setenv on its own. I would really look into doing this in a bash script instead, as the variable needs to be created outside of the Makefile. Once the variable exists in the environment, you should be able to assign to it and export it from the Makefile.

Related

can not set linux environment variables as I expect

I open two terminals.
In first terminal:
export CLASSPATH="abc"
printenv CLASSPATH ---> output is abc
then in second terminal:
printenv CLASSPATH ---> no output
Why don't I have the variable in the second terminal?
It's not going to work because each program inherits its environment (the list of environment variables and their values) from its parent process. The environment is not automatically propagated to all other programs on the system; it is only inherited by children of the given program. To set a global variable that is available in every newly opened terminal, you need to set it in a file that is sourced each time you open a terminal. Which file that is depends on which shell you use and on your local system setup. For example, if you use bash you should put export CLASSPATH="abc" in ~/.bashrc.
To read the value of the variable you need to put $ in front of its name. Are you doing that?
Try echo $CLASSPATH
I think you will find this helpful.

Set a temporary environment ($PATH)

I may be falling into an X-Y problem with this question, and I encourage you to correct me if I am wrong.
I would like to configure a toolchain environment that can work across different platforms and compiler versions. I initially wrote a long Perl script that generates a configuration Makefile containing only variables. I wanted to keep it simple, so I did not write anything complex using automake or autoconf. Moreover, I wanted the reconfiguration process to be very fast; in my case my hand-written ./configure does everything in less than a second. I am very happy with that.
However, I feel I can use a better approach based on environment variables: instead of writing a Makefile with the specific variables, I can set up the current shell environment directly. For example:
export cc=gcc
Unfortunately, some variables, such as $PATH, already have a value. The workaround is to prepend the new path to the existing one:
export PATH=/new/toolchain/path:$PATH
echo $PATH
/new/toolchain/path:/old/toolchain/path:/usr/bin:/bin...
I feel this is ugly; I would like to remove the old path before adding the new one.
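One way I could do that, sketched with the example paths above (the sed call is just an illustration):
# drop the old toolchain entry from PATH, then prepend the new one
PATH=$(echo "$PATH" | sed 's|/old/toolchain/path:||')
export PATH=/new/toolchain/path:$PATH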
To conclude:
Is it better to use the environment instead of custom makefiles to set a build configuration?
How to properly adjust existing environment variables?
When I have several variables to set, I write a wrapper script which I then use as a prefix to the command that I want to modify. That lets me use the prefix either
for a single command, such as make, or
to initialize a shell, so that subsequent commands use the altered settings.
I use wrappers for
setting compiler options (such as clang, to set the CC variable, making configure scripts "see" it as the chosen compiler),
setting locale variables, to test with POSIX C versus en_US versus en_US.UTF-8, etc.
testing with reduced environments, such as in cron.
Each of the wrappers does what is needed to identify the proper PATH, LD_LIBRARY_PATH, and similar variables.
For example, I wrote this ad hoc script about ten years ago to test with a local build of python:
#!/bin/bash
ver=2.4.2
export TOP=/usr/local/python-$ver
export PATH=$TOP/bin:$PATH
export LD_LIBRARY_PATH=`newpath -n LD_LIBRARY_PATH -bd $TOP/lib $TOP/lib/gcc/i686-pc-linux-gnu/$ver`
if test -d "$TOP"
then
    exec "$@"
else
    echo "no $TOP"
    exit 1
fi
and used it as with-python-2.4.2 myscript.
Some wrappers simply call another script.
For example, I use this wrapper around the configure script to setup variables for cross-compiling:
#!/bin/sh
# $Id: cfg-mingw,v 1.7 2014/09/20 20:49:31 tom Exp $
# configure to cross-compile using mingw32
BUILD_CC=${CC:-gcc}
unset CC
unset CXX
TARGET=`choose-mingw32`
if test -n "$TARGET"
then
    PREFIX=
    test -d /usr/$TARGET && PREFIX="--prefix=/usr/$TARGET"
    cfg-normal \
        --with-build-cc=$BUILD_CC \
        --host=$TARGET \
        --target=$TARGET \
        $PREFIX "$@"
else
    echo "? cannot find MinGW compiler in path"
    exit 1
fi
where choose-mingw32 and cfg-normal are scripts that (a) find the available target name for the cross-compiler and (b) provide additional options to the configure script.
Others may suggest shell aliases or functions. I do not use those for this purpose because my command-line shell is usually tcsh, while I run these commands from (a) other shell scripts, (b) directory editor, or (c) text-editor. Those use the POSIX shell (except of course, for scripts requiring specific features), making aliases or functions of little use.
You can create an individualized environment for a particular command invocation:
VAR1=val1 VAR2=val2 VAR3=val3 make
I find this cleaner than doing:
export VAR1=val1
export VAR2=val2
export VAR3=val3
make
unless you're in a wrapper script, and maybe even then, since with
VAR1=val1 VAR2=val2 VAR3=val3 make the VAR variables are only set for that one make invocation; afterwards they are whatever they were before (including, possibly, unexported or nonexistent).
Long lines are a non-issue; you can always split them across several lines:
VAR1=val1 \
VAR2=val2 \
VAR3=val3 \
make
You can set up environment variables like this for any Unix command; the shell takes care of it all.
Some applications (such as make or rake) will modify their environment based on arguments that look like variable definitions (see prodev_paris's answer), but that depends on the application.
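A quick sketch of the difference, using CC as the example variable:
CC=clang make        # sets CC in the environment that make inherits
make CC=clang        # passes CC as a make command-line definition, which overrides assignments in the makefile
Both forms affect only that single invocation; neither changes the calling shell.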
Is it better to use the environment instead of custom makefiles to set a build configuration?
The best practice for build systems is not to depend on any environment variables at all, so that nothing more than this is needed to build your project:
git clone ... my_project
make -C my_project
Having to set environment variables is error prone and may lead to inconsistent builds.
How to properly adjust existing environment variables?
You may not need to adjust those at all. By using complete paths to tools like compilers you disentangle your build system from the environment.
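For example (reusing the toolchain path from the question purely as an illustration), a build rule can name the compiler by its full path instead of relying on PATH:
/new/toolchain/path/bin/gcc -o prog main.c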
As we all know, it is preferable to integrate standard tools for a task like building your products instead of creating your own approach. The effort usually pays off in the long term.
That being said, a simple approach would be to define different environment files (e.g. build-phone.env) setting working directory, PATH, CC etc. for your different products and source your environment files interactively on demand:
. /path/to/build-phone.env
[your build commands]
. /path/to/build-watch.env
[your build commands]
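A minimal sketch of what such an env file might contain (every name and path here is made up for illustration):
# build-phone.env
export CC=/opt/toolchains/phone/bin/gcc
export PATH=/opt/toolchains/phone/bin:$PATH
export SYSROOT=/opt/toolchains/phone/sysroot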
I think you may benefit from using direct variable definition when you call your makefile, like in the following:
make FOO=bar target
where FOO is the variable you want to set to the value bar.
Note that in this case it takes precedence over the environment definition! So you can easily override your PATH variable...
Please have a look at this detailed answer for more info: https://stackoverflow.com/a/2826178/4716013

Linux: export environment variable in a shell script to make it flexible on any server

In order to run a Tcl script on Linux, I need to set the environment variable "$LD_LIBRARY_PATH" each time.
For convenience, I am writing a shell script to do this. Currently, on my own server, if I type
echo $LD_LIBRARY_PATH
the result is:
/opt/lsf/9.1/linux2.6-glibc2.3-x86_64/lib
so in my shell script I write the following code:
export LD_LIBRARY_PATH="/opt/lsf/9.1/linux2.6-glibc2.3-x86_64/lib:$INSTALL_ROOT/tcl_tk/lib64:$INSTALL_ROOT/tcl_tk/lib64"
where the "$INSTALL_ROOT/tcl_tk/lib64:$INSTALL_ROOT/tcl_tk/lib64" part is what I want to add. It works well. Now the issue is:
I want to be able to run the script on any server, where the original "$LD_LIBRARY_PATH" will be different, as far as I understand. So how do I make the script flexible enough to work on any server?
I try this in my shell script:
export LD_LIBRARY_PATH="$LD_LIBRARY_PATH:$INSTALL_ROOT/tcl_tk/lib64:$INSTALL_ROOT/tcl_tk/lib64"
but I am not so sure about it.
I am new to system-level stuff and need some help. I hope I have explained the issue clearly.
If your default shell is bash, I would define the variables in ~/.bashrc, like so:
export INSTALL_ROOT=...   # assuming it is already defined
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$INSTALL_ROOT/tcl_tk/lib64:$INSTALL_ROOT/tcl_tk/lib64
This way you don't have to worry about setting the variables in every shell script, as .bashrc will set them up for you beforehand.

How to make declare in a Linux shell script?

I want to put the declare below into a shell script called proxy_set:
declare -x https_proxy="https://192.168.220.4:8080/"
And then I execute it like below.
$ ./proxy_set
But "export" shows nothing happened.
But if instead I execute it like this:
$ source proxy_set
Then "export" shows it works!
My question is: how can I make it work without the additional "source" command?
Thanks!
You can't. Setting variables in the environment only affects the environment of that shell and any future children it spawns; there's no way to affect the parent shell. When you run it without the source (or .), a brand new shell is started up, then the variable is set in that shell's environment, and then that shell exits, taking its environment with it.
The source reads the commands and executes them within the current shell as if you had typed them.
So if you want to set environment variables in a script, you have to source it. Alternatively, you can have a command generate shell commands as output instead of running them, and then the parent can evaluate the output of the command. Things like ssh-agent use this approach.
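A tiny sketch of that pattern (the helper name print-proxy-env is made up): the helper prints export statements on stdout instead of executing them,
#!/bin/sh
# print-proxy-env: emit shell commands for the parent shell to evaluate
echo 'export https_proxy="https://192.168.220.4:8080/"'
and the parent shell evaluates the output with eval "$(./print-proxy-env)", which is the same trick ssh-agent relies on.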
Try just adding:
export https_proxy="https://192.168.220.4:8080/"
Then execute your script normally.

How do I import environment settings into my Perl program?

I have a script whose content simply exports a variable on Linux.
export LD_LIBRARY_PATH=....
I want to run this script in my Perl script so whoever is running my Perl script will have their LD_LIBRARY_PATH set. Can I just do this at the beginning of my Perl script:
#!/usr/bin/perl -w
system(". /myfolder1/myfolder2/myScript.sh");
#!/bin/sh
. /myfolder1/myfolder2/myScript.sh
exec perl -wxS "$0" "$@"
#!/usr/bin/perl -w
# .. the rest of your script as normal
When you run this, it will first be executed by /bin/sh, which is capable of loading myScript.sh into the local environment. sh then execs Perl, which is told to continue from the following line.
This won't work. To change the environment inside your Perl script (and to change the environment that will be passed on to commands run from inside your Perl script), change the %ENV variable.
$ENV{"LD_LIBRARY_PATH"} = ... ;
This won't work. There is no way for a subshell to manipulate the environment of the parent process.
But you could make your script echo the string you want to set as LD_LIBRARY_PATH and then from within your Perl script you could do something like that:
$ENV{LD_LIBRARY_PATH} = `path/to/your/script.sh`;
Of course, a bit of error checking might also be a good idea.
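For that to work the script must print the value rather than export it; a rough sketch (the path is a placeholder):
#!/bin/sh
# path/to/your/script.sh: print the value instead of exporting it
echo "/some/computed/lib/path"
On the Perl side you would also chomp the captured value to drop the trailing newline before assigning it to $ENV{LD_LIBRARY_PATH}.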
No. Your environment changes made in a child cannot affect the parent. This means running a script will not affect perl. Also perl will not affect the shell from which it was called. You can edit the environment inside perl by changing the special variable %ENV. If there's some kind of unreproducible calculation done in that script, maybe the script should just echo the setting and perl can pick that up on STDOUT and use it.
I {changed directory, modified my environment} in a perl script. How come the change disappeared when I exited the script? How do I get my changes to be visible?
Unix: In the strictest sense, it can't be done -- the script executes as a different process from the shell it was started from. Changes to a process are not reflected in its parent, only in its own children created after the change.
I had a similar problem a few years ago and whipped up a little module, Env::Sourced, that should do the trick.
use Env::Sourced qw(/myfolder1/myfolder2/myScript.sh);
...
Another option (other than making the changes directly in Perl's %ENV) is to make the changes you want a Perl module, so that you can say:
use MyEnvironment;
and have it modify your environment in all your scripts. It would make it simple to make changes after the fact that will not require editing every script.
The module itself will be simple, something like this:
package MyEnvironment;
$ENV{LD_LIBRARY_PATH} .= ":/some/path/you/want/appended";
# Any other changes you want here.
1;
That won't work. An (unpleasant) alternative might be to replace /usr/bin/perl with a shell script that first executes your script and then executes the perl executable.
This can't be done in the way you're trying to do this.
It either needs a wrapper shell script that sets LD_LIBRARY_PATH and then calls your perl script, or any user executing the script needs to have LD_LIBRARY_PATH set correctly in the first place.
If doing the latter, then this can be managed globally by editing /etc/profile and /etc/cshrc (for the ksh, sh, bash, csh and tcsh shells). You can then test for the value of LD_LIBRARY_PATH in your script and, if it is not set or set incorrectly, print a friendly message to the user. Alternatively, individual users can set this in their local .profile/.cshrc files.
Note: you haven't given any information about the environment or users that might run this, so there's also the possibility that users may set LD_LIBRARY_PATH to something they need. If you do check LD_LIBRARY_PATH for a "good" value in your script, then keep in mind that several paths may have been specified, so you will need to parse this environment variable properly.
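A rough sketch of such a check, done in a wrapper shell script rather than in Perl (the required directory is a placeholder):
case ":$LD_LIBRARY_PATH:" in
    *:/required/lib/dir:*) ;;   # already present somewhere in the list
    *) echo "Please add /required/lib/dir to LD_LIBRARY_PATH" >&2
       exit 1 ;;
esac
Wrapping the value in leading and trailing colons lets the pattern match the directory whether it appears first, last, or in the middle.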
If you can find the right place in your perl script, this works as in my example:
$ENV{"LD_LIBRARY_PATH"} = "/oracle/product/10g/lib";
And it didn't require me to call another script to set the env var.
The Env::Modify module addresses this issue, at least for POSIX-y platforms:
use Env::Modify 'source';
source("/myfolder1/myfolder2/myScript.sh");
... environment settings from myScript.sh are now available to Perl ...
