My team and I are using Python 3.7.5 and virtual environments. We are developing and using a fairly informal package. It's an engineering tool, and because of the informal, fast pace we aren't using a proper versioning scheme (not even 0.0.0-style numbers) yet. However, I would like an easy way to know exactly which version of the tool people are using when they run into issues, and I need a solution that doesn't require manually updating a version number in setuptools, because I don't trust my coworkers to do that.
We are using python setuptools and pip to generate wheel files and install the packages into our virtual environments.
My question is: how can I get the commit SHA of a particular package that is installed in the virtual environment at run time?
If the solution involves adding some code to embed that data via setuptools during installation, that's good too.
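One approach (a sketch only, not tested against your exact setup) is to have setup.py capture the current git commit at build time and write it into a small module that ships inside the wheel; at run time you just import that module. The package name mytool and the file _build_info.py below are placeholders:

```python
# setup.py -- minimal sketch; assumes the wheel is built from inside the git checkout
import subprocess
from pathlib import Path
from setuptools import setup, find_packages

def git_sha():
    try:
        return subprocess.check_output(["git", "rev-parse", "HEAD"], text=True).strip()
    except (OSError, subprocess.CalledProcessError):
        return "unknown"

# Generate a tiny module containing the commit before the wheel is assembled.
Path("mytool", "_build_info.py").write_text(f'GIT_SHA = "{git_sha()}"\n')

setup(
    name="mytool",
    version="0.0.0",
    packages=find_packages(),
)
```

```python
# At run time, inside the installed package:
from mytool._build_info import GIT_SHA
print(f"running mytool built from commit {GIT_SHA}")
```

If deriving the version from git tags/commits (rather than embedding a raw SHA yourself) is acceptable, setuptools_scm automates essentially this and works with pip/wheel builds.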
Unfortunately, we have projects with different Node.js versions. I am not sure if this is like Java, where I can have multiple JDKs installed (multiple Node.js versions installed) and each project magically uses the correct version via a config file. Also, most commands in tutorials do not include version numbers when installing tools and libraries, like so:
npm install -g expo-cli (tool)
npm install @react-navigation/native (library)
Coming from Gradle with the Gradle wrapper, where everyone on the team uses the tool and library versions defined in the build.gradle file, this is odd to me. In the Gradle world, everyone on the team uses the exact same version of Gradle (Gradle itself ensures that if one person upgrades it in the repo, everyone gets the upgrade and stays in sync on the same version). Then there are plugins/tools, whose versions are defined, and then libraries, whose versions are defined.
How do I guarantee everyone is using the same npm, Node, Expo, etc. tools?
How do I guarantee everyone is using the same libraries?
How do I guarantee everyone is using the same TypeScript?
Ideally, we upgrade any of these in the repo, plus any fixes required by the upgrade, so that on checkout developers start using the new tool and the new *.tsx files seamlessly, much like in the Gradle world. In Gradle, I upgrade
- the Gradle version via a property
- the versions of plugins/tools
- the build.gradle files
- any source code files
and check all of that in as a unit, so that any developer who checks out is using all the correct versions together. I want this in React Native for our iOS/Android mobile project, or as close to it as I can get.
I have hacked things before, as in installing Ant into the git repo, and this worked wonders (even though it is such an ugly hack): everyone used the tool in the repo instead of the one on their OS. Perhaps there is a way to do that?
A bloated repo with binaries was worth its weight in gold to prevent version-compatibility hell as people upgraded libraries over time. Not only that, we found that tool bugs were easier to track down because we could revert the repo. Not only that, we could reproduce builds from a year ago because the tooling was reverted along with everything else, whereas today's npm tooling can't build the year-old thing due to all the changes. The advantages just kept piling up, and I can't even remember all of them.
Tooling running from the repo, either via a bootstrap like the Gradle wrapper or as the full-blown thing, is generally the best option, until the full-blown thing gets really bloated; even then, locking it to a hash in another tooling repo could be better.
Any ideas are welcome here to put my team on all the same tooling (it also works great for people joining the company or the team from another project, since they don't have to install much).
thanks,
Dean
TypeScript and libraries should be taken care of by removing any carets and tildes in your package.json and specifying exact versions.
One low-overhead possibility for the rest could be shell scripting and a private package repo. You could host the versions you want to install internally and fetch them all with curl.
Or you could add some simple checks to your npm preinstall script; for example, nvm use 12.2.1 should, through its error messages, guide the user to install nvm and switch to the proper version.
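To make that concrete, here is a rough sketch of what the pinning could look like in package.json (the versions and the preinstall one-liner are illustrative, not recommendations):

```json
{
  "engines": { "node": "12.2.1", "npm": "6.14.5" },
  "scripts": {
    "preinstall": "node -e \"if (!process.version.startsWith('v12.2.')) { console.error('Wrong Node version -- run: nvm use 12.2.1'); process.exit(1) }\""
  },
  "dependencies": {
    "@react-navigation/native": "5.9.4"
  },
  "devDependencies": {
    "typescript": "4.1.3"
  }
}
```

Pairing this with an .nvmrc file containing 12.2.1 (so a bare nvm use picks it up), engine-strict=true in the project's .npmrc (so npm refuses to install under a mismatched Node), and a committed package-lock.json (so transitive library versions are identical for everyone) gets you fairly close to the Gradle-wrapper experience.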
I have Python 3.7 installed (system-wide) along with packages like Numpy, Pandas, pptx, xlsxwriter, and some more. Recently, I learned about virtual environments (yes I am very late to the party), and clearly see the benefits. But I am confused about a few things. Hopefully I can get answers here.
In most of my projects, I use common packages like Numpy, Pandas, Matplotlib, mysql.connector, etc. Should I install these system-wide? Or should I install them in each virtual environment that I create for each project, which, for example, amounts to installing Pandas 10 times in 10 different virtual environments?
If I install packages (system wide) that are not part of the Python Standard Library, for example, pptx and mysql.connector, AND I create a virtual environment, will I have access to these packages from within a virtual environment, or should I install them in the virtual environment as well?
What about a tool like Jupyter Notebook, which is not part of any project specifically, but which I love using for simple code development and testing (simple plots, etc.)? Should this be installed system-wide?
I am considering uninstalling Python 3.7 and all the packages from my computer and doing a fresh install of Python 3.8. I want to approach this the "right" way. I would think packages like Numpy and Pandas (I do quite a bit of data manipulation/plotting) should be installed system-wide, with each virtual environment having access to them automatically, while a more specialized package (for me at least) like pptx should be installed in a particular virtual environment for a particular project. Is my assumption correct?
Please advise on how to best approach package installation in the context of having virtual environments.
EDIT: based on Ni's reply, I would ask one more question: Are there modules (for example, python-dateutil, that might be used in many projects) and/or circumstances where it makes sense to install a module system-wide?
In general, I never install packages system wide.
You might install packages in your environments which require specific versions of Numpy. In those cases, if you update the system-wide version of Numpy, the package in the environment might break and you won't know that happened.
Yes, you can access them from a virtual environment (provided the environment is created with access to the system site-packages, e.g. via the --system-site-packages option). But in general, don't install packages system-wide.
Again, I wouldn't install that system-wide. For example, you might have environments running different Python versions, which might not all be compatible with the same version of Jupyter.
Seems like you're doing a lot of data science work - you might want to use Anaconda to help you manage your virtual environments and package installations
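For the plain-venv route (conda has an equivalent workflow), a minimal per-project setup looks roughly like this; the file names are just conventions:

```
# one isolated environment per project
python -m venv .venv
source .venv/bin/activate          # on Windows: .venv\Scripts\activate

# install only what this project needs
pip install numpy pandas python-pptx

# record the exact versions so the environment can be recreated anywhere
pip freeze > requirements.txt
pip install -r requirements.txt    # on another machine / after a fresh checkout
```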
I work on a Python project that in one place calls Julia code, and in another uses OpenCV.
Unfortunately, PyJulia prefers the Python interpreter to be dynamically linked to libpython. (I know I can build a custom Julia system image, but I fear the build delays when I want to test a development version of my Julia code from Python.)
What has worked so far is using Spack instead of Conda. Python built by Spack has a shared libpython, and Spack's repository does include a recent OpenCV.
Unfortunately, contrary to Conda, Spack is designed around a paradigm of compiling everything rather than downloading binaries. The installation time of OpenCV is well over an hour, which is barely acceptable for a one-off install in the development environment, but is dismayingly long for building a Docker image.
So I have a thought: maybe it is possible to integrate my own Python with the rest of the Conda ecosystem?
This isn't a full solution, but Spack does support binary packages, as well as GitLab build pipelines to build them in parallel and keep them updated. What it does not have (yet) is a public binary mirror, so that you could install these things very quickly from pre-existing builds. That's in the works.
So, if you like the Spack approach, you can set up your own binary caches and automated builds for your dev environment.
I am not sure what the solution would be with Conda. You could make your own conda-forge packages, but I think if you deviate from the standard ones, you may end up reimplementing a lot of packages to support your use case. On the other hand, they may accept patches to make your particular configuration work.
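If you stay with Spack, the binary-cache workflow mentioned above looks roughly like the following; the mirror location is a placeholder and the exact buildcache subcommands/flags differ between Spack versions, so treat this as an assumption-laden sketch rather than copy-paste instructions:

```
# on a build machine you control: compile once
spack install opencv

# register a mirror and push the built binaries into it
spack mirror add mycache file:///shared/spack-cache
spack buildcache push mycache opencv    # older releases call this "buildcache create"

# on other machines or in the Docker build: install from the cache instead of compiling
spack install --cache-only opencv
```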
So I am in the following situation:
We are writing an application which needs to be deployed onto customer machines running either some older version of Ubuntu (16.04 or older) or Debian 9. Until today our application was packaged as a standard .deb package, and its system dependencies were declared in such a way that a plain apt install could handle installing it together with all of its dependencies (such as Qt, sqlite, gdal, proj, etc.).
Recently, however, we have developed a few features which require the latest versions of some of those libraries, specifically gdal and proj. Those versions are not to be found on these older systems, and we can't go to each of our customers and compile the libraries on their machines.
So, obviously the question comes:
What are the best recommendations to deploy our application in the most painless way with the new libraries onto the old systems?
I have looked into AppImage, but I just can't get my head around it (and did not find a good tutorial on how to deploy a Qt application with it), and Flatpak and Snap are not options, since we don't want to depend on any other repositories.
So I wrote a little command-line app and want to provide an easy way to clone and use it. Problem: my app has some dependencies, like Hashlib and Pyperclip.
Is there a best practice to get missing packages installed on a different machine?
Can I do so without people needing to install global pip packages?
Is it a good idea to ship it as a venv?
Can I compile them into a single Python file (somewhat like a binary)?
If I cannot do so, how do I get packages installed on a different machine?
Is there a nice routine to allow the user to completely remove my app once installed?
You could use the py2exe module to make a standalone application for the Windows platform. You can use py2app to make standalone apps on the Mac platform. For Linux, you should prepare a package using the .deb format to target the Ubuntu/Debian environment (which is one of the most popular flavors of *nix). Each other *nix flavor has its own packaging system that you would have to follow to target it as a platform.
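For the Windows/py2exe route mentioned above, the classic recipe is just a tiny setup.py; the script name myapp.py is a placeholder for your CLI entry point:

```python
# setup.py -- minimal py2exe sketch for a console (command-line) application
from distutils.core import setup
import py2exe  # importing it registers the "py2exe" command

setup(console=["myapp.py"])
```

Running python setup.py py2exe then produces a dist/ folder containing an .exe plus the DLLs and library archive it needs, which users can run without installing Python or any pip packages; removing the app is just deleting that folder.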