I'd like to retrieve the git build and branch data from an environment variable I can set/export at build time.
is there a way I can reference such an env variable somewhere in the markdown files or config files? tia!
Related
I am using mlflow run with a GitHub uri.
When I run using the below command
mlflow run <git-uri>
The command sets up a conda environment and then clones the Git repo into a temp directory, but I need it set up in a specific directory.
I checked the entire documentation, but I can't find such an option. Is there no way to do this in one shot?
For non-local URIs, MLflow uses Python's tempfile.mkdtemp function (source code) to create the temporary directory. You may have some control over it by setting the TMPDIR environment variable as described in the Python docs (they list TMP & TEMP as well, but those didn't work for me on macOS) - but this only sets the base path for temporary directories and files; the directory/file names themselves will still be random.
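As a sketch of that workaround (the directory location is an assumption, and the mlflow invocation is left commented out): point TMPDIR at a directory you control, and mlflow's clone will land in a random subdirectory of it.

```shell
# Point TMPDIR at a directory you control; the base path is then yours,
# but the leaf directory name is still random.
export TMPDIR="$HOME/mlflow-tmp"    # assumed location
mkdir -p "$TMPDIR"
# The same call mlflow uses internally, just to show where clones would go:
python3 -c 'import tempfile; print(tempfile.mkdtemp())'
# mlflow run <git-uri>              # would now clone under $TMPDIR
```

Note that the directory must already exist: if TMPDIR points at a missing or unwritable path, Python silently falls back to the system default.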
I have two .env files (.env.development and .env.production) and two different build scripts, one for dev and one for prod. Now I want exactly one build script, and depending on a global environment variable I want to decide which of these two .env files should be used for the build.
Is there a way for the build script to check which value the environment variable is set to?
You can solve this problem by passing environment variables from your Unix server in production, and from the .env file in development. This way you don't need two build scripts, because the app will read its variables either from .env or from the Unix environment.
To pass an env variable to your Node.js app on a Unix OS:
Open Terminal and write the command as
> export MY_ENV="my environment value"
After that you can verify the variable with
> echo "$MY_ENV"
But I suggest using Docker and setting the env variables in your Docker environment. This way you separate your env variables from the OS environment and prevent inconsistencies.
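A minimal sketch of such a single build script (the variable name APP_ENV and the hand-off to the real build are assumptions about your setup):

```shell
#!/bin/sh
# Pick the .env file from one global variable instead of keeping two scripts.
APP_ENV="${APP_ENV:-development}"   # assumed variable name; defaults to dev
case "$APP_ENV" in
  production) ENV_FILE=".env.production" ;;
  *)          ENV_FILE=".env.development" ;;
esac
echo "building with $ENV_FILE"
# cp "$ENV_FILE" .env && npm run build   # e.g. hand off to your real build
```

Run it as `APP_ENV=production ./build.sh` on the server and plain `./build.sh` locally.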
I have a Dockerfile that in one of its RUN instructions creates a conan file. I'd like to edit and save that conan file in my Dockerfile to set project specific settings. Is there a way to do so via command line, for example the Python prompt?
Alternatively is there a way to embed a Python file in a Dockerfile?
I don't know of a command to do that, but I would suggest another approach:
Create a Conan template file with environment variables as placeholders (youconanfile.dist).
Use the envsubst command to generate, at run time, the file you need from the current project variables.
I use this technique in a Docker stack to generate multiple files (wp-cli.yml, deploy.php...). My example is in a Makefile. If you need to use it in your Dockerfile, it is possible, assuming that:
envsubst is installed in your container, and
the COPY command is used to push the Conan template file into your container.
In a mixed-language project I often have to refer to configuration files, header files, or even Makefiles. I usually use relative paths from the current file. This can be confusing when, for example, a YAML configuration file is parsed by an external Python script: should the relative path be taken from the configuration file or from the tool that parses it? Also, if I decide to move my files, I have to adjust all the relative paths as well.
To solve this issue I am thinking to use relative paths to the project's absolute root path.
Is there any convention to define a project's path such as:
other_file: "$PROJECT_ROOT/src/foo/bar.xml"
One possible other solution is to always refer to Git, or define an environment variable for that:
other_file: "$(git rev-parse --show-toplevel)/src/foo/bar.xml"
$ export PROJECT_ROOT=`git rev-parse --show-toplevel`
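Putting that together, a tool can resolve every configured path against the repo root instead of its own location (a sketch; it assumes it is run inside a git checkout, and the src/foo/bar.xml path is the example from above):

```shell
# Resolve a configured path against the repository root, not the current
# directory, so it works no matter where the parsing tool is invoked from.
PROJECT_ROOT=$(git rev-parse --show-toplevel)   # empty outside a git checkout
CONFIG_PATH="$PROJECT_ROOT/src/foo/bar.xml"
echo "$CONFIG_PATH"
```

Note that `git rev-parse --show-toplevel` prints the physical (symlink-resolved) path of the working-tree root.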
I had one CVS repository located in a remote location, which CVS can access via these environment variables:
export CVSROOT=:ext:xyz#abc.com:/home/xyz/cvsroot
export CVS_RSH=ssh
export CVS_SERVER=cvs
Recently I added another server which has a different location and a different repository. I tried adding the location via
export CVSROOT=$CVSROOT:ext:xyz#fgh.com:/cvs/cvsroot
However, I am unable to perform operations such as checkout and update with the following error:
Cannot access /home/xyz/cvsroot:xyz#fgh.com:/cvs/cvsroot
No such file or directory
What am I doing wrong?
When you are in a CVS working directory, any cvs operation takes its CVSROOT information from the current directory's CVS/Root file, regardless of any CVSROOT environment variable. So your only issue is how to do the initial checkout from each of your different repositories.
When you type export CVSROOT=$CVSROOT:ext:xyz#fgh.com:/cvs/cvsroot, that says "change the CVSROOT environment variable to the old value with ':ext:xyz#fgh.com:/cvs/cvsroot' appended to the end of it". That's likely not what you want; you need to take the $CVSROOT out of the right-hand side.
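Spelled out with the two roots from the question (just shell variable handling, no CVS needed to see the difference):

```shell
# Starting point: the old repository's root.
export CVSROOT=:ext:xyz#abc.com:/home/xyz/cvsroot
# Wrong: $CVSROOT on the right-hand side appends, producing one mangled root.
# export CVSROOT=$CVSROOT:ext:xyz#fgh.com:/cvs/cvsroot
# Right: assign the new root outright, replacing the old value.
export CVSROOT=:ext:xyz#fgh.com:/cvs/cvsroot
echo "$CVSROOT"
```

The echoed value now names only the new repository, which is what cvs expects.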
As a workable workflow, you can either
run cvs -d <newcvsroot> co <reponame> to specify each CVSROOT as you do your cvs co (you may need to unset CVS_RSH as well, I don't know about that), or
you could also export CVSROOT=<newcvsroot>; unset CVS_RSH; cvs co <reponame>.
If you're frequently checking out from multiple repositories, you may want to have environment variables set up like this, and then you can easily check out from each repo. (Add repos as necessary.)
# in your .bashrc or something
export REPO1ROOT=:ext:xyz#fgh.com:/cvs/cvsroot
export REPO2ROOT=:ext:xyz#abc.com:/home/xyz/cvsroot
export CVSROOT=$REPO1ROOT # default
# when you use the command line
cvs co repo1
cvs -d $REPO2ROOT co repo2