Pants includes OS X specific Python wheels

TL;DR: Pants fetches OS X specific wheels because I'm developing on a Mac. How can I avoid this, or specify that I will deploy to Ubuntu?
Full story:
I'm trying to package a Python application with Pants. It's going great so far, but I've run into a problem that I've been stuck on for a while. I'm developing on a MacBook but deploying to EC2 Ubuntu.
Here's what I've done so far:
Created a virtualenv.
Added BUILD files to the applications, with the suggested 3rdparty pattern for third-party packages.
Ran ./pants run.py backend:admin_server, which ran fine and generated dist/admin_server.pex.
Scp'd that .pex onto a fresh EC2 Ubuntu box.
However when I run the application there, I get:
Failed to execute PEX file, missing compatible dependencies for:
mysql-python
pycrypto
The problem seems to be that Pants bundles OS X specific wheels for these two:
pex: - MySQL_python-1.2.5-cp27-none-macosx_10_11_intel.whl
pex: - pycrypto-2.6.1-cp27-none-macosx_10_11_intel.whl
How can I avoid that, or specify which OS they should run on?
Here's the full output:
ubuntu@ip-***:~$ export PEX_VERBOSE=1
ubuntu@ip-***:~$ python admin_server.pex
pex: Found site-library: /usr/local/lib/python2.7/dist-packages
pex: Found site-library: /usr/lib/python2.7/dist-packages
pex: Tainted path element: /usr/local/lib/python2.7/dist-packages
pex: Tainted path element: /usr/lib/python2.7/dist-packages
pex: Scrubbing from site-packages: /usr/local/lib/python2.7/dist-packages
pex: Scrubbing from site-packages: /usr/lib/python2.7/dist-packages
pex: Scrubbing from user site: /home/ubuntu/.local/lib/python2.7/site-packages
pex: Failed to resolve a requirement: MySQL-python==1.2.5
pex: Failed to resolve a requirement: pycrypto==2.6.1
pex: Unresolved requirements:
pex: - mysql-python
pex: - pycrypto
pex: Distributions contained within this pex:
pex: - six-1.10.0-py2.py3-none-any.whl
pex: - protobuf-2.6.1-py2.7.egg
pex: - setuptools-19.5-py2.py3-none-any.whl
pex: - MySQL_python-1.2.5-cp27-none-macosx_10_11_intel.whl
pex: - pycrypto-2.6.1-cp27-none-macosx_10_11_intel.whl
pex: - futures-3.0.4-py2-none-any.whl
pex: - webapp2-2.5.2-py2-none-any.whl
pex: - requests-2.9.0-py2.py3-none-any.whl
pex: - jmespath-0.9.0-py2.py3-none-any.whl
pex: - beautifulsoup4-4.4.1-py2-none-any.whl
pex: - python_dateutil-2.4.2-py2.py3-none-any.whl
pex: - boto3-1.2.3-py2.py3-none-any.whl
pex: - WebOb-1.5.1-py2.py3-none-any.whl
pex: - cssutils-1.0.1-py2-none-any.whl
pex: - webapp2_static-0.1-py2-none-any.whl
pex: - Paste-2.0.2-py2-none-any.whl
pex: - docutils-0.12-py2-none-any.whl
pex: - botocore-1.3.22-py2.py3-none-any.whl
pex: - protobuf_to_dict-0.1.0-py2-none-any.whl
Failed to execute PEX file, missing compatible dependencies for:
mysql-python
pycrypto
PS: To make sure I wasn't picking up my locally installed versions of the Python libraries, I pip uninstalled both PyCrypto and MySQL-python.

One of the nice things about distributing your project as a PEX file is that you can prepare it to run on multiple platforms. For example, one PEX can run on both Linux and Mac platforms. For many projects, there is nothing special to do other than build a PEX. But when your project has dependencies on platform specific binary code, you will need to perform some extra steps.
One example of a library that contains platform specific code is the psutil library. It contains C code that is compiled into a shared library when the module is installed. To create a PEX file that contains such dependencies, you must first provide a pre-built version of that library for every platform other than the one where you are running pants.
The easiest way to pre-build libraries is to use the pip tool to build wheels.
This recipe assumes the following:
You want to build a multi-platform pex to run on both Linux and Mac.
You are going to pre-build the libraries in the Linux environment, then build the PEX in the Mac environment.
Your project directory lives under ~/src/cookbook.
Let’s take a simple program that references a library and create a pex from it.
# src/python/ps_example/main.py
import psutil

for proc in psutil.process_iter():
    try:
        pinfo = proc.as_dict(attrs=['pid', 'name'])
    except psutil.NoSuchProcess:
        pass
    else:
        print(pinfo)
With Pants, you can define an executable by defining a python_binary target in a BUILD file:
# src/python/ps_example/BUILD
python_binary(name='ps_example',
  source = 'main.py',
  dependencies = [
    ':psutil',   # defined in requirements.txt
  ],
)

# Defines targets from specifications in requirements.txt
python_requirements()
In the same directory, list the python libraries in a requirements.txt file:
# src/python/ps_example/requirements.txt
psutil==3.1.1
Now, to make the multi-platform pex, you'll need access to a Linux box to create the Linux version of the psutil wheel. Copy the requirements.txt file to the Linux machine, then execute the pip tool:
linux $ mkdir ~/src/cookbook/wheelhouse
linux $ pip wheel -r src/python/ps_example/requirements.txt \
    --wheel-dir=~/src/cookbook/wheelhouse
This will create a platform specific wheel file.
linux $ ls ~/src/cookbook/wheelhouse/
psutil-3.1.1-cp27-none-linux_x86_64.whl
Now, you will need to copy the platform specific wheel over to the machine where you want to build your multi-platform pex (in this case, your Mac laptop). If you use this recipe on a regular basis, you will probably want to configure a Python Repository to store your pre-built libraries.
We'll use the same BUILD file setup as above, but modify python_binary to specify the platforms= parameter.
# src/python/ps_example/BUILD
python_binary(name='ps_example',
  source = 'main.py',
  dependencies = [
    ':psutil',   # defined in requirements.txt
  ],
  platforms=[
    'linux-x86_64',
    'macosx-10.7-x86_64',
  ],
)

# Defines targets from specifications in requirements.txt
python_requirements()
You will also need to tell Pants where to find the pre-built Python packages. Edit pants.ini and add:
[python-repos]
repos: [
    "%(buildroot)s/wheelhouse/"
  ]
Now, copy the file psutil-3.1.1-cp27-none-linux_x86_64.whl over to the Mac workstation and place it in a directory named wheelhouse/ under the root of your repo.
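If the two machines can reach each other over SSH, scp is the simplest way to do the copy. A minimal sketch, assuming the same ~/src/cookbook layout on both machines and a hypothetical host name mylinuxbox:
mac $ mkdir -p ~/src/cookbook/wheelhouse
mac $ scp mylinuxbox:~/src/cookbook/wheelhouse/psutil-3.1.1-cp27-none-linux_x86_64.whl \
    ~/src/cookbook/wheelhouse/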
Once this is done, you can build the multi-platform pex with:
mac $ ./pants binary src/python/ps_example
You can verify that libraries for both Mac and Linux are included in the pex by unzipping it:
mac $ unzip -l dist/ps_example.pex | grep psutil
    17290  12-21-15 22:09   .deps/psutil-3.1.1-cp27-none-linux_x86_64.whl/psutil-3.1.1.dist-info/DESCRIPTION.rst
    19671  12-21-15 22:09   .deps/psutil-3.1.1-cp27-none-linux_x86_64.whl/psutil-3.1.1.dist-info/METADATA
     1340  12-21-15 22:09   .deps/psutil-3.1.1-cp27-none-linux_x86_64.whl/psutil-3.1.1.dist-info/RECORD
      103  12-21-15 22:09   ...
      ...                   .deps/psutil-3.1.1-cp27-none-macosx_10_11_intel.whl/psutil-3.1.1.dist-info/DESCRIPTION.rst
    19671  12-21-15 22:09   .deps/psutil-3.1.1-cp27-none-macosx_10_11_intel.whl/psutil-3.1.1.dist-info/METADATA
     1338  12-21-15 22:09   .deps/psutil-3.1.1-cp27-none-macosx_10_11_intel.whl/psutil-3.1.1.dist-info/RECORD
      109  12-21-15 22:09   ...
      ...
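As a final sanity check, you can copy the pex to a Linux box and run it there. This is just a sketch of the round trip (the host name is a placeholder); with a CPython 2.7 interpreter on the Linux side it should start without the "missing compatible dependencies" error:
mac $ scp dist/ps_example.pex mylinuxbox:
linux $ python2.7 ps_example.pex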

Related

Rust import PAM C functions

I'm trying to figure out how to write a Linux PAM module in Rust, and I've started with this repo. When I compile it with --features libpam (which enables the #[link(name="pam")] attribute that imports the external C functions), the compiler complains:
error: linking with `cc` failed: exit status: 1
...
note: /usr/bin/ld: cannot find -lpam: No such file or directory
Looking around SO, I've realized that the pam library could not be found for linking. However, I just cannot figure out how to either install it (I can find many libpam-something packages, but no libpam) or locate it so that I can tell the compiler/linker where to find it.
When I look into other crates that also work with PAM, they all import the C functions using #[link(name="pam")], and none specifies how to make it work.
I'm using Ubuntu 22.04.
The necessary headers for the PAM library can be installed on Ubuntu as part of the libpam0g-dev package (link for jammy):
sudo apt install libpam0g-dev
The Debian package has the same name (so sudo apt-get install libpam0g-dev), while CentOS uses the pam-devel package for this (sudo yum install pam-devel).
Alternatively, on any platform you can use the source code from GitHub and follow the installation instructions from its README.
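If you want to double-check that the linker input is now present, the dev package also ships the libpam.so symlink that -lpam looks for. A quick check, assuming a 64-bit Ubuntu install (the exact path varies by architecture):
$ dpkg -L libpam0g-dev | grep 'libpam\.so$'
/usr/lib/x86_64-linux-gnu/libpam.so
After that, rebuilding with --features libpam should get past the "cannot find -lpam" error.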

Building and installing MediaPipe on Windows: getting error "Permission denied" for opencv_world3410.dll

I am using Windows 10 for building the MediaPipe library locally.
Environment setup:
Python 3.10
Windows 10
Bazel version = 3.7.2
Visual Studio 2019
I followed the instructions from this link for building with Bazel.
With some hiccups, I was able to build the Hello World example from that same link.
With the help of a friend, I found the build command for the hands module.
Command 1, for building hand_tracking:
bazel build -c opt --define MEDIAPIPE_DISABLE_GPU=1 --action_env PYTHON_BIN_PATH="C://python_310//python.exe" mediapipe/examples/desktop/hand_tracking:hand_tracking_cpu
Command 2, for running the same:
set GLOG_logtostderr=1
.\bazel-bin\mediapipe\examples\desktop\hand_tracking\hand_tracking_cpu --calculator_graph_config_file=.\mediapipe\graphs\hand_tracking\hand_tracking_desktop_live.pbtxt
This runs just like in this link (press the run button at the top right).
I need this as a library installed in my venv.
I don't want the readily available pip install mediapipe, as it is not detecting some hand movements correctly. If I build my own on my PC, it detects them.
I tried the command below:
(mediapipe_venv) C:\dev\mediapipe_repo\mediapipe>python setup.py install
After building, when I run this setup.py install, it fails as below.
Warning: skipping import of repository 'pybind11' because it already exists.
WARNING: C:/dev/mediapipe_repo/mediapipe/mediapipe/framework/BUILD:54:24: in cc_library rule //mediapipe/framework:calculator_cc_proto: target '//mediapipe/framework:calculator_cc_proto' depends on deprecated target '#com_google_protobuf//:cc_wkt_protos': Only for backward compatibility. Do not use.
WARNING: C:/dev/mediapipe_repo/mediapipe/mediapipe/framework/tool/BUILD:182:24: in cc_library rule //mediapipe/framework/tool:field_data_cc_proto: target '//mediapipe/framework/tool:field_data_cc_proto' depends on deprecated target '#com_google_protobuf//:cc_wkt_protos': Only for backward compatibility. Do not use.
INFO: Analyzed target //mediapipe/python:_framework_bindings.so (0 packages loaded, 0 targets configured).
INFO: Found 1 target...
Target //mediapipe/python:_framework_bindings.so up-to-date:
bazel-bin/mediapipe/python/_framework_bindings.so
INFO: Elapsed time: 0.815s, Critical Path: 0.01s
INFO: 1 process: 1 internal.
INFO: Build completed successfully, 1 total action
error: [Errno 13] Permission denied: 'build\\lib.win-amd64-3.10\\mediapipe\\python\\opencv_world3410.dll'
(mediapipe_venv) C:\dev\mediapipe_repo\mediapipe>
Highlighting the error from above:
error: [Errno 13] Permission denied: 'build\lib.win-amd64-3.10\mediapipe\python\opencv_world3410.dll'
Please suggest if I am doing anything wrong.
I am new to building libraries locally, and new to Bazel.
The command prompt you're using does not have write access to 'build\lib.win-amd64-3.10\mediapipe\python\opencv_world3410.dll'.
Run the same command from a command prompt with administrative privileges.
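For example, a sketch only: the repo path comes from the question, but the venv activation path is an assumption about where the venv was created.
:: open cmd.exe via "Run as administrator", then:
C:\> cd C:\dev\mediapipe_repo\mediapipe
C:\dev\mediapipe_repo\mediapipe> mediapipe_venv\Scripts\activate
(mediapipe_venv) C:\dev\mediapipe_repo\mediapipe> python setup.py install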

They could not be imported: _sqlite3 / Python 3.6 / CentOS 6

Environment
# cat /etc/redhat-release
CentOS release 6.8 (Final)
$ which python3.6
/usr/local/bin/python3.6
# find /usr/local -name _sqlite3.so
/usr/local/lib/python2.7/lib-dynload/_sqlite3.so
# yum install sqlite-devel
When I use SQLite3, this error occurs:
No module named '_sqlite3'
I decided to build it from source again.
# ./configure --enable-shared --prefix=/usr/local LDFLAGS="-Wl,-rpath /usr/local/lib" --with-sqlite=/usr/local/lib/python2.7/lib-dynload/_sqlite3.so
# make
Python build finished successfully!
Following modules built successfully but were removed because they
could not be imported:
_sqlite3
Why?
/usr/local/lib/python2.7/lib-dynload/_sqlite3.so is the Python 2 wrapper module, not the actual SQLite library. Moreover, --with-sqlite should point to the headers, not the shared library. Compilation requires the headers; the dynamic library is located at runtime.
Note that you don't need to use --with-sqlite at all once you have installed the sqlite-devel headers; those are put in a default location that the Python configure script knows how to find, /usr/include. Only if you have headers in a non-default location would you use that option, to point to the directory holding the sqlite3.h and sqlite3ext.h headers.
Once compiled, the sqlite3 Python module loads the _sqlite3*.so extension module (the exact name differs with the exact Python 3 release and platform), and the dynamic loader will find the right libsqlite3.so version.
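Concretely, a sketch of the rebuild with the --with-sqlite flag simply dropped (the remaining flags are copied from the question; whether you keep them depends on your setup):
# yum install sqlite-devel
# ./configure --enable-shared --prefix=/usr/local LDFLAGS="-Wl,-rpath /usr/local/lib"
# make
If the headers were picked up, _sqlite3 should no longer appear in the "removed because they could not be imported" list.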

How to install scikit-learn-0.14.1 with ATLAS at a non-standard location

In Ubuntu 12.04, under Python 2.7.3, I am trying to install scikit-learn-0.14.1 using the "test the package before installing" approach. The first step (python setup.py build_ext --inplace) seems to finish OK.
See the output at [ https://gist.github.com/anonymous/7043418 ]
In the testing step, however (nosetests sklearn/), things are not so nice. It ended with 22 errors and 9 failed tests.
See the output at [ https://gist.github.com/anonymous/7044411 ]
If I am right, the problem seems to be related to the ATLAS library. I am using my own build of the blas/lapack/atlas libraries, and the directory where they are installed (/home/myacct/myProg/NumLibs64b/lib) is listed at the beginning of the build process. However, nowhere do the gcc/g++ compile options include -llapack -lcblas -lf77blas -latlas; only a mention of -DNO_ATLAS_INFO=-1 appears during the build. Does this option mean that ATLAS is not used?
NumPy and SciPy were built successfully using the same blas/lapack/atlas libraries (via the corresponding static versions).
So I guess the question is: how do I modify setup.py of scikit-learn-0.14.1 in order to build it using a non-standard location for the static blas/lapack/atlas libraries?
Regards,
Sergio

Python 2.7.5 fails to build modules when installing the Gentoo Prefix

I tried to install the Gentoo Prefix on my 32-bit LFS system, but I get this error:
Python build finished, but the necessary bits to build these modules
were not found:
_bsddb _sqlite3 _tkinter bsddb185 dl nis sunaudiodev
To find the necessary bits, look in setup.py in detect_modules() for
the module's name.
Failed to build these modules: crypt
This happens in the last step, emerge -e system.
If you need any more information, please tell me.
Try this: mv $EPREFIX/tmp/usr/lib/libpython2.7.a $EPREFIX/tmp/usr/lib/_libpython2.7.a, as mentioned in the following link:
http://forums.gentoo.org/viewtopic-t-890016-start-0.html
Also look at this post for resolving some of the modules if the above doesn't work:
http://www.kelvinwong.ca/tag/python-2-7/
