I have a SCons Tool, which works when I put mytool.py and __init__.py under site_scons/site_tools/mytool.
Now I would like to change it so it is referenced via an absolute path from somewhere else.
So I called it via:
mytoolpath = '/tools/mytool'
env = Environment(tools=['mytool'], toolpath=mytoolpath)
and it fails with: EnvironmentError: No tool named 'mytool': not a Zip file:
mytool.py is located in /tools/mytool, so I really do not understand where the problem is. Could someone shed some light?
Turns out this is one of the few places where strings are not automatically converted to lists.
So you have to invoke this via:
env = Environment(tools=['mytool'], toolpath=[mytoolpath])
For a set of programs written in most languages (C for instance) a script can normally run those programs without any sort of interference between dynamic link libraries and with no special hand holding so long as they are all found on PATH. That is, the following will work:
#!/bin/bash
prog1
prog2
prog3
However, if these three programs are written in Python and they import conflicting package versions, then to run each one successfully it must either be installed into a virtualenv, or each must have a separate site-packages directory referenced via PYTHONPATH. Either way they need a setup and possibly a teardown step before running. That is, for virtualenv:
#!/bin/bash
source $PROG1_ROOT/bin/activate
prog1
deactivate
source $PROG2_ROOT/bin/activate
prog2
deactivate
source $PROG3_ROOT/bin/activate
prog3
deactivate
and for separate site-packages:
#!/bin/bash
export PYTHONPATH=$PROG1_ROOT/lib/python3.6/site-packages
prog1
export PYTHONPATH=$PROG2_ROOT/lib/python3.6/site-packages
prog2
export PYTHONPATH=$PROG3_ROOT/lib/python3.6/site-packages
prog3
This problem arises because
import pkg_resources
(at least through Python 3.6) cannot reliably import the proper versions when multiple versions of a package share the same site-packages directory, even if a preceding __requires__ lists all the version restrictions.
It occurs to me that if PYTHONPATH, or some equivalent, could be specified relative to the program instead of to $PWD, and some consistency in directory layout were observed, then it would only have to be set once. That is, if prog1 is in $PROG1_ROOT/bin and its libraries are in $PROG1_ROOT/lib/python3.6/site-packages, then setting PYTHONPATH to "../lib/python3.6/site-packages" would work not only for prog1, but also for prog2, prog3, and for as many more as are needed through progN.
However, PYTHONPATH is normally given as an absolute path, and relative paths are, I believe, resolved with respect to $PWD, not to the Python program (prog1). Is there some other Python path variable which has the desired property? Failing that, is there some type of file which could be dropped into $PROG1_ROOT/bin which would normally be picked up by a Python program when it starts and which could direct it to use $PROG1_ROOT/lib/python3.6/site-packages? Either a relative or an absolute path in that file would be OK, although the former would still be preferred, because then one could move the entire PROG1_ROOT directory tree to another location in the file system without having to rewrite this special file. I really want to avoid solutions which would require modifying the programs themselves (i.e., prog1 in the example).
Thanks.
EDITED:
I wrote this:
https://sourceforge.net/projects/python-devirtualizer/
to implement some of these ideas. At this point it is Linux (or at least POSIX) specific. It slightly modifies python scripts in a package's "bin" directory by changing the first line, and it "wraps" everything in that directory with a replacement native binary which injects a custom PYTHONPATH into the true target's environment. That binary looks up its location using a function from libSDL2 and then specifies the PYTHONPATH relative to that. So far it has worked pretty well, and the "programs" in installed python packages (the "bin" directory's contents) are run based on PATH just like any other program, no futzing about with PYTHONPATH in the shell.
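The wrapper idea can be sketched in plain shell, too (the tool itself uses a native binary instead; the prog1.real name below is hypothetical):

```shell
#!/bin/bash
# Resolve the directory containing this script, following symlinks,
# so the result is independent of the caller's $PWD.
HERE="$(cd "$(dirname "$(readlink -f "${BASH_SOURCE[0]:-$0}")")" && pwd)"

# Point Python at the site-packages tree that sits next to bin/.
export PYTHONPATH="$HERE/../lib/python3.6/site-packages"

# Hand control to the renamed real entry point (hypothetical name),
# passing all arguments through.
if [ -x "$HERE/prog1.real" ]; then
    exec "$HERE/prog1.real" "$@"
fi
```

Because everything is computed relative to the wrapper's own location, the whole PROG1_ROOT tree can be moved without editing anything.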
Making search paths relative to the executable is a Very Bad Idea (TM). Move the executable or libraries around and all hell breaks loose. Some enterprising miscreant might notice the path settings and place a script just right to get their own doctored libraries (or just flawed old versions) to be used. And so on.
Clean up the misbehaving scripts. Chances are that by using old versions they are vulnerable to by now fixed security boo-boos, or other misbehaviours. Or find a way to load the stuff in the script itself.
I have some scripts in my package, that rely on some template xml files.
Those scripts are callable by entry points and I wanted to reference the template files by a relative path.
When calling the script via python -m ..., the scripts themselves run from within lib\site-packages, and there the XML files are available, since I included them in my setup.py like this:
setup(
    ...
    packages=['my_pck'],
    package_dir={'my_pck': 'python/src/my_pck'},
    package_data={'my_pck': ['reports/templates/*.xml']},
    ...
)
I know I could also copy those templates by using data_files in my setup.py, but using package_data seems better to me.
Unfortunately package_data seems not to copy those files to the Scripts folder where the entry points are located.
So my question is: is this even achievable via package_data, and if so, how?
Or is there a more pythonic, easier way to achieve this? Maybe not referencing those files via paths relative to the scripts?
Looks like importlib-resources might help here. This library is able to find the actual path to a resource file packaged as package_data by setuptools.
Access the package_data files from your code with something like this:
import importlib_resources

with importlib_resources.path('my_pck.reports.templates', 'a.xml') as xml_path:
    do_something(xml_path)
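On newer Pythons (3.9+), the standard library's importlib.resources offers the same lookup without the third-party backport. A sketch under the same assumed package layout (my_pck is the package from the question, load_template is a hypothetical helper):

```python
from importlib import resources

def load_template(name: str) -> str:
    # files() returns a Traversable rooted at the package directory;
    # read_text() retrieves the resource without needing a real path.
    return resources.files("my_pck.reports.templates").joinpath(name).read_text()
```

When an actual filesystem path is required (e.g. to pass to an external tool), resources.as_file() provides a context manager analogous to importlib_resources.path().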
I'm building my first Python package (which I then install with pip) and I need to use some non-Python files. In these answers, it is explained that I should use the pkg_resources module. But I can't figure out a working example. Let's say I have this project structure:
package_name/
----data/
--------image.png
----package_name/
--------__init__.py
--------file.py
----setup.py
----MANIFEST.in
----conf.yml
Now I want to access conf.yml and image.png from file.py. How should I proceed in:
file.py ?
setup.py ?
MANIFEST.in ?
The simplest way to access these files would be to include in MANIFEST.in
global-include *.png
global-include *.yml
MANIFEST.in only affects source distributions, though, while setup.py controls what is included in binary/wheel distributions, so to be safe, add this inside your setup.py:
include_package_data = True,
package_data = {
    '': ['*.png', '*.yml'],
},
Then you can reference the specific file like so from file.py
from pkg_resources import resource_string

def foo():
    pngfile = resource_string(__name__, 'data/image.png')
    ymlfile = resource_string(__name__, 'conf.yml')
Notice how for the png file I've specified the directory.
This solution also does not account for files of the same extension which you may want to exclude, but those could easily be taken care of with exclude or specifying filenames rather than using the asterisk.
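One detail worth noting: resource_string returns raw bytes, so a text resource such as conf.yml needs decoding before it can be parsed (load_conf here is a hypothetical helper, not part of the original answer):

```python
from pkg_resources import resource_string

def load_conf() -> str:
    # resource_string returns bytes; decode to text before handing
    # the result to a parser such as yaml.safe_load.
    return resource_string(__name__, "conf.yml").decode("utf-8")
```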
I know there are questions that could easily be considered duplicates, but I had trouble getting a workable example as well and after a good while badgering away myself I managed to get something to work, and this was it.
I'm trying to write an SConstruct file that will install headers in a destination directory. The intended effect is:
cp include/a.h ../dest/a.h
cp include/b.h ../dest/b.h
Or just as good:
cp include/a.h ../dest/include/a.h
cp include/b.h ../dest/include/b.h
Here's what I have so far:
env = Environment()
for header in Glob("include/*.h"):
    env.Command(Dir("../dest").Append(header), header, Copy("$TARGET", "$SOURCE"))
    env.Alias("includes", Dir("../dest").Append(header))
This obviously doesn't work because there's no Append function. Glob returns Node objects, and a Dir is also a Node object. I can't figure out how I'm supposed to combine two Node objects into a longer path. Can anyone help?
You don't need to paste those paths together on your own (thanks for describing the actual problem that you're trying to solve). You're looking for the already provided Install() method. Please also check the User Guide, chapter 11, "Installing Files in Other Directories: the Install Builder", but a concrete solution should look something like this (off the top of my head):
env = Environment()
includes = env.Install("../dest", Glob("include/*.h"))
env.Alias("includes", includes)
And if you should ever really need this
str(node)
will return the path of the node in question. ;)
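And if a combined path is ever genuinely needed, converting nodes with str() and joining the pieces with os.path works like it would for any strings (the literal paths below stand in for str(node) results):

```python
import os.path

# str(node) on a Dir or File node yields its path as a string;
# os.path.join then combines the pieces like any other paths.
dest = "../dest"          # stands in for str(Dir("../dest"))
header = "include/a.h"    # stands in for str(File("include/a.h"))
target = os.path.join(dest, os.path.basename(header))
```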
In a Makefile.am file I have come across the variable assignment dist_bin_SCRIPTS = foo
From this website, under the heading Makefile.am, it says this installs the script into the /usr/local/bin directory.
It also says there are ways to define your own values (directories to install to)... In my case I would like to change it to install the script foo in /etc/bash_completion.d; does anybody know how to do this?
I've tried looking at the automake manual but I couldn't find out the required info sadly.
All help is appreciated :)
bin_SCRIPTS installs to /usr/local/bin because bindir is already predefined (with many layers of defaults). So,
foodir = ${sysconfdir}/bash_completion.d
foo_DATA = mycompletion.sh
is the quick way without parametrization from configure (but still overridable at make time).
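If the script should also be shipped in make dist tarballs, the dist_ prefix from the original question combines with the custom directory the same way (a sketch; the variable name bashcomp is illustrative, and automake derives the install directory from the matching bashcompdir):

```makefile
bashcompdir = ${sysconfdir}/bash_completion.d
dist_bashcomp_DATA = foo
```

Using the DATA primary rather than SCRIPTS is deliberate here: completion files are sourced, not executed, so they don't need the executable bit that SCRIPTS would set.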