I’m following PyPA’s current guidance to use only setup.cfg without setup.py for configuring metadata when building a package for distribution. (“Static metadata (setup.cfg) should be preferred. Dynamic metadata (setup.py) should be used only as an escape hatch when absolutely necessary. setup.py used to be required, but can be omitted with newer versions of setuptools and pip.”)
To be concrete, I’m working with the following directory/file structure:
my-project/
├── LICENSE
├── pyproject.toml
├── README.md
├── setup.cfg
├── src/
│   └── my_package/
│       ├── __init__.py
│       └── my_module.py
└── tests/
(If you’re unfamiliar with this structure that has a src/ directory intermediating between the project directory and the import-package directory, it’s the structure given in the Packaging Python Projects tutorial and argued for by Ionel Cristian Mărieș and Mark Smith, et al. See also § “Using a src/ layout” in Configuring setuptools using setup.cfg files.)
I had been successfully specifying the version number in setup.cfg simply with:
[metadata]
version = 0.0.1
(Note that setup.cfg is an INI-style file, so the value is written without quotation marks; quotes would become part of the string.)
However, PyPA et al. have discussed the concept of “single-sourcing the package version,” which basically means AFAICT moving the definition of the version outside of setup.py and setup.cfg to (a) make it more broadly accessible by other tools and (b) put the definition of the version into a file that’s within the import-package directory (viz., my-project/src/my_package) and therefore installed with the code. (Admittedly, my use case is so simple there may be little benefit to moving the definition outside setup.cfg, but I’m drawn to exploring best practices—perhaps irrationally so. 😉)
In particular, much of the discussion includes defining a separate file, e.g., __version__.py, version.py, or even VERSION, to hold the version information. Moreover, this version file would be located at (a) the root of the import package, e.g., of my_package:
path-to/project-directory/src/my_package/__version__.py
rather than at (b) the root of the project-directory where setup.cfg resides. (I deliberately hyphenate project-directory and use underscores in my_package to emphasize that PyPI normalizes underscores, etc. to hyphens in project names, whereas those normalized names would be illegal as the name of a Python package.)
The overwhelming majority of the discussion about single-sourcing the version number takes place in the context of defining metadata dynamically using setup.py rather than statically using only setup.cfg. And that part of the discussion that does address using static metadata (i.e., setup.cfg without setup.py) (e.g., Configuring setuptools using setup.cfg files) raises as many questions as it answers. In particular, it’s silent on how to configure the separate version-specifying file and how the various files involved interrelate.
What are the options, and how precisely do you configure the appropriate files, to specify the version of a package/module when you specify the project metadata statically with setup.cfg?
In the course of formulating the above question, I incrementally solved it to find three methods that work:
Two methods use the attr: special directive in setup.cfg. Of these:
One puts the version number directly in the package’s __init__.py file.
The other puts the version number in a separate file (__version__.py) and then __init__.py imports the version string from that separate file.
The third method uses instead the file: special directive in setup.cfg.
This reads the separate version-specifying file (VERSION) directly and doesn’t involve the package’s __init__.py file at all.
In light of these three possibilities, I re-present the directory/file structure with the addition of the two version-specifying files __version__.py and VERSION:
my-project/
├── LICENSE
├── pyproject.toml
├── README.md
├── setup.cfg
├── src/
│   └── my_package/
│       ├── __init__.py
│       ├── __version__.py
│       ├── VERSION
│       └── my_module.py
└── tests/
Of course, you’d have at most one of those two files, depending on which of the three solutions you chose to implement.
The two attr: special-directive solutions
In both of the solutions using the setup.cfg’s attr: special-directive, setup.cfg obtains the import package’s version from the import package’s __version__ attribute. (When you import some_package that has a version, use dir(some_package) and you’ll see that it has a __version__ attribute.) Now you see the connection here between the attr: name of the special directive and our goal.
The key task: how to assign the __version__ attribute to my_package?
We assign the __version__ attribute, either directly or indirectly, using the package’s __init__.py file, which already exists (assuming you have a traditional package rather than a namespace package, which is outside the scope of this answer).
The snippet in setup.cfg that is common in both Method A and Method B
In both of these attr: special-directive solutions, the configuration of the setup.cfg file is the same, with the following snippet:
[metadata]
name = my-project
version = attr: my_package.__version__
To be clear, here the .__version__ references an attribute, not a file, subpackage, or anything else.
Now we branch depending on whether the version information goes directly into __init__.py or instead into its own file.
Method A: Put the assignment into the package’s __init__.py file
This method doesn’t use a separate file for specifying the version number, but rather inserts it into the package’s __init__.py file:
# path-to/my-project/src/my_package/__init__.py
__version__ = '0.0.2'
Note two elements of the assignment:
the left-hand side (__version__) corresponds to the attr: line in setup.cfg (version = attr: my_package.__version__)
the legitimate version string on the right-hand side is a string enclosed by quotes.
We’re done with Method A.
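As a sanity check, here is a minimal sketch (using a throwaway temporary package rather than a real installed project) of the attribute that the attr: directive resolves. It reproduces Method A's __init__.py in a scratch directory and confirms that importing the package exposes __version__:

```python
import sys
import tempfile
from pathlib import Path

# Sketch only: build a throwaway package whose __init__.py assigns
# __version__, exactly as in Method A. This attribute is what
# `version = attr: my_package.__version__` resolves in setup.cfg.
pkg_root = Path(tempfile.mkdtemp())
pkg = pkg_root / "my_package"
pkg.mkdir()
(pkg / "__init__.py").write_text("__version__ = '0.0.2'\n")

sys.path.insert(0, str(pkg_root))
import my_package

print(my_package.__version__)  # prints 0.0.2
```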
Method B: Put the assignment into a __version__.py file and import it in __init__.py
Create __version__.py and put the version string in it
We construct a new Python file and locate it at the same level as the import package’s __init__.py file.
We insert the exact same __version__ assignment that we inserted in __init__.py in Method A:
# my-project/src/my_package/__version__.py
__version__ = '0.0.2'
From within __init__.py, import __version__ from __version__.py
In __init__.py, we do a relative import to access the __version__ that was assigned in the separate file:
# path-to/my-project/src/my_package/__init__.py
from .__version__ import __version__
To unpack this a little…
We’re doing a relative import, so we have to use the from … import … syntax. (Absolute imports may use either the import <> or from <> import <> syntax, but relative imports may only use the second form.)
The . indicates a relative import, starting with the current package.
The first occurrence of __version__ refers to the “module” __version__.py.
This file name doesn’t have to be __version__.py. That’s just conventional. Whatever the filename is, however, it must match the name after from . (except that the .py is stripped off in the from . import statement).
The second occurrence of __version__ refers to the assignment statement inside of __version__.py.
This imported name does need to end up as __version__ on the package, because that is the attribute setup.cfg reads (my_package.__version__); it must also match the name assigned inside __version__.py, although you could rename on import with from .__version__ import some_name as __version__.
We’re done with Method B.
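The same kind of throwaway-package sketch works for Method B; the point of interest here is the relative import, which re-exports __version__ from the separate file so that my_package still ends up with a __version__ attribute for attr: to find:

```python
import sys
import tempfile
from pathlib import Path

# Sketch only: reproduce Method B in a scratch package — the version
# string lives in __version__.py, and __init__.py re-exports it via a
# relative import, so the package-level attribute still exists.
pkg_root = Path(tempfile.mkdtemp())
pkg = pkg_root / "my_package"
pkg.mkdir()
(pkg / "__version__.py").write_text("__version__ = '0.0.2'\n")
(pkg / "__init__.py").write_text("from .__version__ import __version__\n")

sys.path.insert(0, str(pkg_root))
import my_package

print(my_package.__version__)  # prints 0.0.2
```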
Method C: Using the file: special directive
The separate file is populated differently
In this method, we use a separate file for the version number, as in Method B. Unlike Method B, we read the contents of this file directly, rather than importing it.
To prevent confusion, I’ll call this file simply VERSION. Like __init__.py and Method B’s __version__.py, VERSION is at the root level of import package. (See the directory/file diagram.) (Of course, in this method, you won’t have __version__.py.)
However, the contents of this VERSION file are quite different from the contents of Method B's __version__.py.
Here’s the contents of my-project/src/my_package/VERSION:
0.0.2
Note that:
This file contains nothing but the contents of the version string itself. In particular, do not enclose this string in quotation marks!
There’s also no assignment syntax going on; there’s no “__version__ = ” preamble to the assignment string.
This isn't even a Python file, so I didn't include a comment line with the path to the file: the comment would be read as part of the version and would give the error VERSION does not comply with PEP 440: # comment line.
setup.cfg is different than before
There are two points of note that distinguish setup.cfg in Method C from the setup.cfg that was common to both Methods A and B.
setup.cfg uses file: rather than attr:
In Method C, we use a different formulation in setup.cfg, swapping out the attr: special directive and replacing it with the file: special directive. The new snippet is:
[metadata]
name = my-project
version = file: src/my_package/VERSION
The file path to VERSION is relative to the project directory
Note the path to VERSION in the assignment statement: src/my_package/VERSION.
The relative file path to the VERSION file is resolved from the root of the project directory, my-project. This differs from Method B, where the relative import was relative to the import-package root, i.e., my_package.
We’re done with Method C.
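In effect, the file: directive just reads the file's raw text and strips surrounding whitespace — no quoting, no assignment, no Python syntax at all. A minimal sketch of that behavior (using a scratch directory, not the real project):

```python
import tempfile
from pathlib import Path

# Sketch of what the `file:` directive effectively does with VERSION:
# read the raw text and strip surrounding whitespace. The trailing
# newline written here is harmlessly removed by strip().
project = Path(tempfile.mkdtemp())
(project / "VERSION").write_text("0.0.2\n")

version = (project / "VERSION").read_text().strip()
print(version)  # prints 0.0.2
```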
Pros and cons
Method A might seem to have the virtue of needing no additional file to set the version (besides setup.cfg, which is needed in any case, it touches only __init__.py, which likewise already exists). However, a separate version file has its own virtue: it makes obvious where the version number is set. With Method A, someone sent to bump the version number who didn't already know where it was stored might take a while to find it; it wouldn't be obvious to look in __init__.py.
Method C might seem to have the advantage over Method B, because Method B requires modification to two files (__init__.py and __version__.py) rather than only one for Method C (VERSION). The only perhaps countervailing advantage of Method B is that its __version__.py is a Python file that allows embedded comments, which Method C’s VERSION does not.
I use setuptools-scm to do this. Create a pyproject.toml file with the following contents:
[build-system]
requires = [
    "setuptools>=42",
    "setuptools_scm[toml]>=6.2",
    "wheel",
]
build-backend = "setuptools.build_meta"

[tool.setuptools_scm]
write_to = "path/to/_version.py"
fallback_version = "1.0.0"
When you install the package, this will result in setuptools creating the file _version.py where you told it to.
_version.py will have a variable in it called version. I read this into __version__ in the main package __init__.py.
# __init__.py
from ._version import version as __version__
I guess I'm assuming you're using git here. I don't know if it works for other scm tools. It works by reading the tag information out of the git metadata. So your project needs to contain tags with pip-compatible version strings.
I have a TypeScript project with absolute paths set via the baseUrl property in the tsconfig file, pointing to the src directory. The problem is that I want to have a constants.ts file in the root of the src directory, but when I import something from that file, the compiler instead resolves the deprecated constants Node.js module, and I see the message below in the IDE:
module "constants"
@deprecated — since v6.3.0 - use constants property exposed by the relevant module instead.
How can I hide this module or create an alias, so that src/constants.ts file would be read first?
I'm trying to make my first Node.js program with TypeScript. Maybe I need to import or reference the index.d.ts from @types/node, but it only works after executing
npm install @types/node and appending code like import { setTimeout } from 'timers'.
I don't understand in what way it was referenced.
Sorry for my poor English, and thanks.
TypeScript has an approximation of Node.js's module resolution algorithm baked into it so that it can make these scenarios easier.
When you do not specify a value for --module, or "module" in tsconfig.json, it defaults to "commonjs" which in turn defaults the --moduleResolution setting to "node".
In addition to recursively traversing the node_modules directories above and within your working directory looking for files and directories matching the imported name, this also enables automatic resolution of the "packageName" >>> packageName/package.json.main resolution behavior of the NodeJS require function. It also enables the "packageName" >>> packageName/index convention where NodeJS requires a directory as a module via its index file.
In your case, this is how, roughly speaking, it resolves index.d.ts automatically:
When you import from a non-relative module specifier, it looks in node_modules directory adjacent to the working directory.
From there it first looks for an @types subdirectory.
If it finds a directory with the same name as the import, it looks for a package.json file in that directory.
If it finds a package.json with a "main" property it resolves the import to that file.
That file is index.d.ts.
Hint: to see this in action pass the --traceResolution flag to tsc, or set
{
  "compilerOptions": {
    "traceResolution": true,
    // ...
  }
}
in tsconfig.json.
Look at the output and you will see its file system walking behavior and each attempt to resolve the import.
I am trying to create Rust bindings for the C++ library cryptominisat. The actual code works, but I'm not sure how to properly package it up with Cargo.
The git repository looks like
src/
    c++ code here
.gitignore
readme, etc.
I added a rust directory, and created my Cargo project inside of it like so
rust/
    cryptominisat/
        Cargo.toml
        build.rs
        src/
            rust code here
src/
    c++ code here
.gitignore
readme, etc.
Unfortunately, cargo package doesn't seem to want to package up anything outside of the rust/cryptominisat directory, which means it doesn't include the C++ code needed to actually build the library. What can I do? I don't want to move the entire repository into the rust directory if I can avoid it, since that would make it impossible to merge upstream.
The way it's generally solved:
Use a git submodule (or a script run before publishing) to embed a copy of the C++ repo inside the Rust repo (e.g. in rust/cryptominisat/vendor/). During development you could use a symlink instead to avoid having two copies of the C++ code.
Use build.rs to download a tarball, clone, or rsync the code at build time. You can dump it into the directory Cargo specifies via the OUT_DIR environment variable to avoid polluting user-visible directories.
Make the C++ code a system-level library. The Rust package would not build it, but expect it's already installed, and only search for it and specify link flags for it. That's how most *-sys crates work.
Instead of requiring code relatively, i.e., with paths starting with ./ or .., I'd like to define a module "globally". For example, take the following package structure:
/src
    /index.js
    /a.js
    /b.js
/lib
    /index.js
...
When in src/a.js or src/b.js, to require lib, I would have to do require('../lib') each time. This gets annoying when you start nesting more as you would have to manually resolve ../lib or ../../lib or ../../../lib.
I want to be able to do require('lib'). Can I do this? Or should I just use globals?
Using a non-relative path to require your source files is not how Node's require is intended to work! Don't try to work around this restriction by placing arbitrary code files in the node_modules directory, or by changing the NODE_PATH environment variable.
If you want to use require without a path you should extract the required code as a node module and depend on this node module. This leads to better structured code, less complex modules, encapsulated functionality, better testability and easier code reuse.
You can include package dependencies from http or git so there is no requirement to publish node modules you use in npm. Take a look at npm dependencies for more detail.
Use module.exports in the index.js file, and place the module inside the node_modules folder.
If relative paths annoy you and you want to use lib throughout your application, you can use a global variable like this:
var lib = require('./lib');
global.lib = lib;
You can set lib as a global variable in your entry point; after that, you can access it as just lib.
But it pollutes the global scope, so you have to use it carefully.
Placing your module in node_modules doesn't require you to include a path or relative path.
EDIT:
If you place a file named package.json inside the module directory, Node will try to parse that file and use its main attribute as the relative path to the entry point. For instance, if your ./myModuleDir/package.json file looks something like the following, Node will try to load the file at the path ./myModuleDir/lib/myModule.js:
{
  "name": "myModule",
  "main": "./lib/myModule.js"
}
If that folder does not contain a package definition file named package.json, the package entry point will assume the default value of index.js, and Node will look, in this case, for a file under the path ./myModuleDir/index.js.