Setup.py Install Package under a different top level name - python-3.x

I have a project such as:
myproject/
    setup.py
    myproject/
        package1/
        package2/
I am using a setup.py as:
from setuptools import setup, find_packages

NAME = 'myproject'

setup(
    name=NAME,
    version=VERSION,
    description=DESCRIPTION,
    long_description=long_description,
    long_description_content_type='text/markdown',
    author=AUTHOR,
    author_email=EMAIL,
    python_requires=REQUIRES_PYTHON,
    url=URL,
    packages=find_packages(exclude=('tests',)),
    package_data={NAME: ['VERSION']},
    install_requires=require(),
    extras_require={},
    include_package_data=True,
)
When I install this (pip install -e .) I can access the packages as import myproject.package1. However, I want to import it as import mynewname.package1 instead. If I change NAME='mynewname' in the example above and reinstall, the packages are no longer visible, and import mynewname raises a ModuleNotFoundError (the name argument is the distribution name, not the import name).
I don't want to change the name of the project or its structure, just the top-level name under which the package is installed. Something like import mynewname.myproject.package1 would also work, but I'm unsure how to do this.
Thanks
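One way to get close to this (a sketch, not guaranteed for every workflow) is setuptools' package_dir mapping, which decouples the import name from the directory name. Note that editable installs (pip install -e .) have historically handled package_dir remapping poorly, so test with a regular pip install .:

from setuptools import setup, find_packages

setup(
    name='myproject',
    # Map the new import name onto the existing source directory.
    package_dir={'mynewname': 'myproject'},
    # The remapped package and its subpackages must be listed explicitly.
    packages=['mynewname'] + ['mynewname.' + p for p in find_packages('myproject')],
)

After installing, the code is importable as mynewname.package1 while the on-disk layout stays unchanged.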

Related

including a python stub file for pybind11 module

I'm trying to package a module built with pybind11, which has a stub .pyi file associated with it.
The project structure is something like this:
my-pkg/
    setup.py
    my_pkg.pyi
    my_pkg/
    src/
        my_cpp_files.cpp
The setup.py file for the package looks something like this:

# setup.py
from glob import glob
from setuptools import setup
from pybind11.setup_helpers import Pybind11Extension

ext_modules = [
    Pybind11Extension("my_pkg", sorted(glob("src/*.cpp")), include_dirs=['./'], cxx_std=17),
]

setup(
    name='my-pkg',
    description='This is my package',
    version='1.0.0',
    ext_modules=ext_modules,
)
This successfully installs the my_pkg module into site-packages like so:

site-packages/
    my_pkg-1.0.0.dist-info/
    my_pkg.cpython-38-x86_64-linux-gnu.so
However, the my_pkg.pyi is not included. I have manually copied the my_pkg.pyi stub file into the site-packages directory to see if my IDE (VS Code) recognises it, and it does.
Therefore I need a way for setup.py to install my_pkg.pyi into site-packages.
I have tried including the following lines into the setup() call without success:
data_files=[('.', ['my_pkg.pyi'])]
package_data={'.': ['my_pkg.pyi']}
package_data={'my_pkg': ['my_pkg.pyi']}
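
One route that is known to work for typed pybind11 modules is to make my_pkg a real package (PEP 561 style) instead of a bare top-level .so: build the extension as a private submodule, re-export it from __init__.py, and ship the stub plus a py.typed marker as package data. A sketch under those assumptions (my_pkg/_my_pkg, __init__.pyi and py.typed are names introduced here, not from the original project):

# setup.py (sketch)
from glob import glob
from setuptools import setup
from pybind11.setup_helpers import Pybind11Extension

ext_modules = [
    # Build the extension as a private submodule inside the my_pkg package.
    Pybind11Extension("my_pkg._my_pkg", sorted(glob("src/*.cpp")), cxx_std=17),
]

setup(
    name='my-pkg',
    version='1.0.0',
    packages=['my_pkg'],
    # my_pkg/__init__.py re-exports the bindings: from ._my_pkg import *
    # __init__.pyi is the renamed stub; py.typed marks the package as typed (PEP 561).
    package_data={'my_pkg': ['__init__.pyi', 'py.typed']},
    ext_modules=ext_modules,
)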

python setup.py override /usr/local/bin to /usr/local/MyApp for entry_point console_scripts

My setup.py
import setuptools
import os, stat

with open('README.md', 'r') as fh:
    long_description = fh.read()

setuptools.setup(
    python_requires='>3.7',
    name='MyApp',
    version='1.0.0',
    description='MyApp',
    long_description=long_description,
    long_description_content_type='text/markdown',
    packages=setuptools.find_packages(),
    data_files=[
        ('MyApp/', ['MyAppScripts/my_script'])],
    entry_points={'console_scripts': ['my_script = MyApp.myapp:main']}
)
My Package:
./README.md
./MyAppScripts
./MyAppScripts/my_script
./MyApp
./MyApp/__init__.py
./MyApp/myapp.py
./setup.py
Hello Everyone, I hope I find you well and happy.
I have created a Python application and would like the entry_point scripts to install into the directory /usr/local/MyApp and NOT /usr/local/bin. So far I have been unable to get this to work, and I wondered whether there is a way to override the install location for the entry_point scripts only? The package files should live in the default location.
As a workaround I have generated the entry_point scripts myself and placed them in my setup directory below MyAppScripts. Using data_files they are then copied relative to /usr/local into /usr/local/MyApp at install time, which achieves the overall aim, but this is a bit of a kludge and I'd really like those entry_point scripts to be generated and land in the correct spot at install time.
I tried unsuccessfully:
entry_points = { 'console_scripts':['MyApp/my_script = myapp.scripts.myapp:main'] }
I also tried numerous install options such as:
python3 -m pip install --install-option="--prefix=/usr/local/MyApp" dist/MyApp-1.0.0-py3-none-any.whl
WARNING: Disabling all use of wheels due to the use of --build-option / --global-option / --install-option.
which didn't work out (possibly because my build is a wheel?).
My build command:
python3 setup.py bdist_wheel
Please excuse my ignorance around packaging; it is something I am only just getting to grips with.
To summarise I'd like to run pip install and end up with entry_point script:
/usr/local/MyApp/my_script
Is anyone able to provide any advice please?
Thank you.
Regards,
D
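
For what it's worth, distutils' legacy install command does expose an install-scripts directory that console_scripts wrappers are written into. A sketch under that assumption (this is the legacy path: it is honoured by python setup.py install and by older pip versions, but recent pip has removed --install-option, so it is not a general pip-era answer):

# setup.cfg (sketch)
[install]
install-scripts = /usr/local/MyApp

The equivalent one-off form is python setup.py install --install-scripts=/usr/local/MyApp; package files still go to the default location.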

Installing python packages locally doesn't always work

I'm creating a python 3.9 program and want to install packages locally. So the way my project is set up is this:
__main__.py
test.py
requirements.txt
lib/
    __init__.py
In my requirements.txt file I have 3 lines:
colorama==0.2.2
click==8.0.3
pendulum==2.1.2
Then I run: python -m pip install -r requirements.txt -t ./lib
This installs all the packages and dependencies inside of the lib directory.
Then I import the modules at the top of my test.py file:
from lib import colorama
from lib import click
from lib import pendulum
In doing some testing, I've found that colorama works fine. I'll use it in a simple test:
print(colorama.Fore.BLUE + "Hello, World!")
The text is blue in the console and everything is working.
I then try to use the other packages and get a ModuleNotFoundError exception:
print(pendulum.now('Europe/Paris'))
Exception has occurred: ModuleNotFoundError - No module named 'pendulum'
This is coming from one of its own files.
The same thing happens when I use Click, but it's a little different: I get the same ModuleNotFoundError, but for its own dependency on Colorama. I don't think it's related to the fact that I'm also importing Colorama, because if I uninstall it I get the same error.
I've also tried this with the python-docx package. I added python-docx==0.8.11 to the requirements.txt file, then issued the same command as above to install to my local lib directory. It seems to install fine; I see the docx directory and all its dependencies. Then I import it with from lib import docx and do something simple in test.py:
doc = docx.Document()
Then I get a ModuleNotFoundError: File "C:\Users\name\Development\python\test-local-package\lib\docx\__init__.py", line 3, in (Current frame): No module named 'docx'
Does anyone know what I'm doing wrong?
When you put those libraries into your lib folder and import them the way you are doing, you're changing their package names. No longer is colorama a top-level package, it's now lib.colorama. Some libraries might be fine with that, but for others, they expect to be able to import their own code using their normal names. If colorama.some_submodule tries to import colorama, it will fail.
It's important to realize that a statement like from lib import colorama doesn't change how colorama can be found everywhere. It only changes the local namespace. The package is still lib.colorama, we've just bound it to the name colorama in the current module.
As JonSG has suggested in comments, a better solution is to put the lib folder into the Python search path so that import colorama will find the package with its normal name. Modifying sys.path is one way to do that, another is the PYTHONPATH environment variable (probably not ideal for your current issue, but sometimes useful in other situations).
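A minimal sketch of that suggestion, assuming test.py sits next to the lib folder: prepend lib to sys.path before any of the imports, so each package keeps its canonical top-level name.

# test.py (sketch)
import os
import sys

# Make lib/ a search-path root so 'colorama', 'click', etc. keep their real names.
sys.path.insert(0, os.path.join(os.path.dirname(os.path.abspath(__file__)), 'lib'))

import colorama
import click
import pendulum

print(colorama.Fore.BLUE + "Hello, World!")
print(pendulum.now('Europe/Paris'))

With this approach lib/__init__.py is unnecessary: lib acts as a search-path root, not as a package.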

how to load pip-installed module from sublimetext?

I installed recordclass via pip3 install recordclass, which installs it under /Users/timothee/homebrew/lib/python3.7/site-packages/recordclass; I'm trying to use it from a Sublime Text package but it fails:
import sys
sys.path.append('/Users/timothee/homebrew/lib/python3.7/site-packages')
import recordclass
ImportError: No module named 'recordclass.mutabletuple'
If I instead use sys.path.append('/Users/timothee/homebrew/lib/python3.7/site-packages/recordclass') I get: SystemError: Parent module '' not loaded, cannot perform relative import
Note that it does work for some other pip3 installed modules, eg jstyleson, simplejson.
Here are the contents of recordclass:

\ls
__init__.py
__pycache__
arrayclass.py
dataobject.cpython-37m-darwin.so
datatype.py
litelist.cpython-37m-darwin.so
mutabletuple.cpython-37m-darwin.so
recordclass.py
recordobject.cpython-37m-darwin.so
structclass.py
test
typing
utils.py
I'm not sure whether the reason is that Sublime bundles its own Python interpreter (3.3, IIRC) while my system Python is 3.7, and the module installs as recordobject.cpython-37m-darwin.so, so it can't be seen by Sublime's Python; but temporarily renaming it to recordobject.cpython-33m-darwin.so didn't help.
Note: I've read other similar questions, but they didn't help, e.g.:
* Install python module without PIP => not relevant; I can install it, just not load it from Sublime
* Installing python modules like "Web3" without pip/pip3? => ditto
* https://github.com/SublimeText/UnitTesting/issues/67
This is the most relevant post: Sublime Plugin: How can I import wx? But the solution there doesn't seem to work, since in my case the problem seems to be these darwin .so files.
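
One quick check worth making (a sketch, not from the original post): a compiled extension is only importable by an interpreter whose version and ABI match the tag in its filename (cpython-37m here), so renaming the .so cannot make a 3.7 binary load under a different Python. Sublime's console (View > Show Console) reveals which interpreter the plugin host actually runs:

import sys
print(sys.version)   # the cpython-XYm tag of any bundled .so must match this
print(sys.path)      # where that interpreter actually looks for modules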

Add numpy.get_include() argument to setuptools without preinstalled numpy

I am currently developing a python package that uses cython and numpy and I want the package to be installable using the pip install command from a clean python installation. All dependencies should be installed automatically. I am using setuptools with the following setup.py:
import setuptools

my_c_lib_ext = setuptools.Extension(
    name="my_c_lib",
    sources=["my_c_lib/some_file.pyx"]
)

setuptools.setup(
    name="my_lib",
    version="0.0.1",
    author="Me",
    author_email="me@myself.com",
    description="Some python library",
    packages=["my_lib"],
    ext_modules=[my_c_lib_ext],
    setup_requires=["cython >= 0.29"],
    install_requires=["numpy >= 1.15"],
    classifiers=[
        "Programming Language :: Python :: 3",
        "Operating System :: OS Independent"
    ]
)
This has worked great so far. The pip install command downloads cython for the build and is able to build my package and install it together with numpy.
Now I want to improve the performance of my cython code, which leads to some changes in my setup.py. I need to add include_dirs=[numpy.get_include()] to either the call of setuptools.Extension(...) or setuptools.setup(...), which means that I also need to import numpy. (See http://docs.cython.org/en/latest/src/tutorial/numpy.html and Make distutils look for numpy header files in the correct place for the rationale.)
This is bad: now the user cannot call pip install from a clean environment, because import numpy will fail. The user needs to pip install numpy before installing my library. Even if I move "numpy >= 1.15" from install_requires to setup_requires, the installation fails, because import numpy is evaluated earlier.
Is there a way to evaluate include_dirs at a later point of the installation, for example after the dependencies from setup_requires or install_requires have been resolved? I would really like all dependencies to be resolved automatically, and I don't want the user to type multiple pip install commands.
The following snippet works, but it is not officially supported because it uses an undocumented (and private) method:
class NumpyExtension(setuptools.Extension):
    # setuptools calls this function after installing dependencies
    def _convert_pyx_sources_to_lang(self):
        import numpy
        self.include_dirs.append(numpy.get_include())
        super()._convert_pyx_sources_to_lang()

my_c_lib_ext = NumpyExtension(
    name="my_c_lib",
    sources=["my_c_lib/some_file.pyx"]
)
The article How to Bootstrap numpy installation in setup.py proposes using a cmdclass with custom build_ext class. Unfortunately, this breaks the build of the cython extension because cython also customizes build_ext.
First question: when is numpy needed? It is needed during the setup (i.e. when the build_ext functionality is called) and at installation time, when the module is used. That means numpy should be in both setup_requires and install_requires.
There are the following alternatives for solving the issue at setup time:
using PEP 517/518 (which is more straightforward, IMO)
using the setup_requires argument of setup and postponing the import of numpy until setup's requirements are satisfied (which is not the case at the start of setup.py's execution)
PEP 517/518 solution:
Put a pyproject.toml file next to setup.py, with the following content:
[build-system]
requires = ["setuptools", "wheel", "Cython>=0.29", "numpy >= 1.15"]
which defines the packages needed for building; then install using pip install . in the folder with setup.py. A disadvantage of this method is that python setup.py install no longer works, as it is pip that reads pyproject.toml. However, I would use this approach whenever possible.
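
To make this concrete, a minimal sketch of the setup.py that can accompany the pyproject.toml above (names taken from the question): because pip installs the build requirements into an isolated environment before running setup.py, a plain top-level import numpy is now safe.

# setup.py (sketch; relies on the pyproject.toml above)
import numpy
import setuptools

my_c_lib_ext = setuptools.Extension(
    name="my_c_lib",
    sources=["my_c_lib/some_file.pyx"],
    # Safe: pip installed numpy into the build environment before running this file.
    include_dirs=[numpy.get_include()],
)

setuptools.setup(
    name="my_lib",
    version="0.0.1",
    packages=["my_lib"],
    ext_modules=[my_c_lib_ext],
    install_requires=["numpy >= 1.15"],
)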
Postponing import
This approach is more complicated and somewhat hacky, but it also works without pip.
First, let's take a look at the unsuccessful attempts so far:
pybind11 trick
@chrisb's "pybind11" trick, which can be found here: with the help of an indirection, one delays the call to import numpy until numpy is present during the setup phase, i.e.:
class get_numpy_include(object):
    def __str__(self):
        import numpy
        return numpy.get_include()

...

my_c_lib_ext = setuptools.Extension(
    ...
    include_dirs=[get_numpy_include()]
)
Clever! The problem: it doesn't work with the Cython compiler: somewhere down the line, Cython passes the get_numpy_include object to os.path.join(..., ...), which checks whether the argument is really a string, which it obviously isn't.
This could be fixed by inheriting from str, but the above shows the dangers of the approach in the long run: it doesn't use the designed mechanics, is brittle, and may easily fail in the future.
the classical build_ext solution
Which looks as follows:
...
from setuptools.command.build_ext import build_ext as _build_ext

class build_ext(_build_ext):
    def finalize_options(self):
        _build_ext.finalize_options(self)
        # Prevent numpy from thinking it is still in its setup process:
        __builtins__.__NUMPY_SETUP__ = False
        import numpy
        self.include_dirs.append(numpy.get_include())

setuptools.setup(
    ...
    cmdclass={'build_ext': build_ext},
    ...
)
Yet this solution also doesn't work with Cython extensions, because pyx files don't get recognized.
The real question is: how did pyx files get recognized in the first place? The answer is this part of setuptools.command.build_ext:
...
try:
    # Attempt to use Cython for building extensions, if available
    from Cython.Distutils.build_ext import build_ext as _build_ext
    # Additionally, assert that the compiler module will load
    # also. Ref #1229.
    __import__('Cython.Compiler.Main')
except ImportError:
    _build_ext = _du_build_ext
...
That means setuptools tries to use Cython's build_ext if possible, and because the import of the module is delayed until build_ext is called, it finds Cython present.
The situation is different when setuptools.command.build_ext is imported at the beginning of setup.py: Cython isn't yet present, and a fallback without Cython functionality is used.
mixing up the pybind11 trick and the classical solution
So let's add an indirection, so we don't have to import setuptools.command.build_ext directly at the beginning of setup.py:
...

# factory function
def my_build_ext(pars):
    # import delayed:
    from setuptools.command.build_ext import build_ext as _build_ext

    # include_dirs adjusted:
    class build_ext(_build_ext):
        def finalize_options(self):
            _build_ext.finalize_options(self)
            # Prevent numpy from thinking it is still in its setup process:
            __builtins__.__NUMPY_SETUP__ = False
            import numpy
            self.include_dirs.append(numpy.get_include())

    # object returned:
    return build_ext(pars)

...

setuptools.setup(
    ...
    cmdclass={'build_ext': my_build_ext},
    ...
)
One (hacky) suggestion would be to use the fact that extension.include_dirs is first requested in build_ext, which is called after the setup dependencies are downloaded.
class MyExt(setuptools.Extension):
    def __init__(self, *args, **kwargs):
        self.__include_dirs = []
        super().__init__(*args, **kwargs)

    @property
    def include_dirs(self):
        # numpy is imported only when include_dirs is first read, in build_ext:
        import numpy
        return self.__include_dirs + [numpy.get_include()]

    @include_dirs.setter
    def include_dirs(self, dirs):
        self.__include_dirs = dirs

my_c_lib_ext = MyExt(
    name="my_c_lib",
    sources=["my_c_lib/some_file.pyx"]
)

setup(
    ...,
    setup_requires=['cython', 'numpy'],
)
Update
Another (less, but I guess still pretty hacky) solution would be overriding build instead of build_ext, since we know that build_ext is a subcommand of build and will always be invoked by build on installation. This way, we don't have to touch build_ext and can leave it to Cython. It will also work when invoking build_ext directly (e.g. via python setup.py build_ext, to rebuild the extensions in place while developing), because build_ext ensures all options of build are initialized and, by coincidence, Command.set_undefined_options first ensures the command has finalized (I know, distutils is a mess).
Of course, now we're misusing build - it runs code that belongs to build_ext finalization. However, I'd still probably go with this solution rather than with the first one, ensuring the relevant piece of code is properly documented.
import setuptools
from distutils.command.build import build as build_orig

class build(build_orig):
    def finalize_options(self):
        super().finalize_options()
        # I stole this line from ead's answer:
        __builtins__.__NUMPY_SETUP__ = False
        import numpy
        # or just modify my_c_lib_ext directly here, ext_modules should contain a reference anyway
        extension = next(m for m in self.distribution.ext_modules if m == my_c_lib_ext)
        extension.include_dirs.append(numpy.get_include())

my_c_lib_ext = setuptools.Extension(
    name="my_c_lib",
    sources=["my_c_lib/some_file.pyx"]
)

setuptools.setup(
    ...,
    ext_modules=[my_c_lib_ext],
    cmdclass={'build': build},
    ...
)
I found a very easy solution in this post:
Or you can stick to https://github.com/pypa/pip/issues/5761. Here you install cython and numpy using setuptools.dist before the actual setup:
from setuptools import dist
dist.Distribution().fetch_build_eggs(['Cython>=0.15.1', 'numpy>=1.10'])
Works well for me!
