I'm on a Mac running Python 3 in a Jupyter notebook. I'm pushing myself to learn more Python via a project on road maps.
I'm reading in a shapefile like so
import networkx as nx
g = nx.read_shp('Road files/geo_export_4d537b7d-a470-4eb9-b147-1d0ea89e6b60.shp')
And it's working dandy.
But then I read about OSMnx and think "that's pretty cool! I could dynamically pull shapefiles, rather than hunt them out online".
So I tried to install it (pip install osmnx) but kept getting failures, so I tried the other method mentioned (conda install -c conda-forge osmnx).
Now, I can no longer run my initial networkx read_shp because of this error:
ImportError: read_shp requires OGR: http://www.gdal.org/
I've gone to the site and installed GDAL, but the error persists.
I also cannot import osmnx. It errors on
from fiona.ogrext import Iterator, ItemsIterator, KeysIterator due to
ImportError: dlopen(/Users/sb/anaconda/lib/python3.5/site-packages/fiona/ogrext.cpython-35m-darwin.so, 2): Library not loaded: #rpath/libjpeg.8.dylib
Referenced from: /Users/sb/anaconda/lib/libgdal.20.dylib
Reason: image not found
1. What the heck did I just do to my environment?
2. How do I restore networkx functionality? Presumably through a proper GDAL (re?)installation.
3. How do I prep for osmnx?
Sorry for the vague open-endedness here, I've pushed my code a bit too far beyond my abilities.
Update
I ran conda config --add channels conda-forge, then re-ran conda install gdal and conda install libgdal.
Unfortunately I still error out, but now it's a different error claiming that networkx needs GDAL (which should be installed?):
/Users/sb/anaconda/lib/python3.6/site-packages/networkx/readwrite/nx_shp.py in read_shp(path, simplify)
ImportError: read_shp requires OGR: http://www.gdal.org/
fwiw, /Users/sb/anaconda/lib/ has both a python3.6 and python3.5 folder.
In general, you might want to avoid mixing conda channels. Presumably your environment had been using the defaults channel, and then you installed OSMnx via the conda-forge channel. In practice mixing channels usually works OK, but sometimes it causes package conflicts like the ones you're seeing.
Per the OSMnx documentation, you could install it in a clean, dedicated virtual environment to ensure it is isolated:
conda create --yes -c conda-forge -n OSMNX python=3 osmnx
source activate OSMNX
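Once the environment is active, a quick sanity check along these lines should confirm the install worked (the place name below is just an example, not from the original question):
import osmnx as ox
import networkx as nx
# pull a small drivable street network directly from OpenStreetMap
G = ox.graph_from_place('Piedmont, California, USA', network_type='drive')
print(nx.info(G))  # the result is a networkx graph, so the usual networkx tools apply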
If that still doesn't work then there is indeed an issue with the conda-forge packaging for your platform and version of Python, in which case you should open an issue in its conda-forge GitHub repo.
Finally, you might also consider making conda-forge the highest-priority channel in your Anaconda setup. Check your .condarc file and ensure that the conda-forge channel sits above defaults so it gets priority (see the snippet after the list below). As another answer elsewhere suggests, there are three main reasons to use the conda-forge channel instead of the defaults channel maintained by Continuum:
Packages on conda-forge may be more up-to-date than those on the defaults channel
There are packages on the conda-forge channel that aren't available from defaults
You would prefer to use a dependency such as openblas (from conda-forge) instead of mkl (from defaults).
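For reference, the relevant part of a .condarc with conda-forge given priority over defaults is just the channel order (a minimal sketch; anything else already in your file can stay as-is):
channels:
  - conda-forge
  - defaults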
Wes McKinney has similarly commented on the benefits of using conda-forge.
Related
I am working on an existing Python 3 code base that provides a setup.py, so the code is installed as a Python library. I am trying to get this internal library installed with its own dependencies (the usual data science ones, e.g. pandas, pyodbc, sqlalchemy, etc.).
I would like this internal library to declare these dependencies itself, so that if the library is installed, all of its transitive dependencies can be assumed to be installed as well. I would also prefer the Anaconda (conda) versions of those packages rather than the pip versions.
I started with a requirements.txt, but moved quickly to this field in setup.py:
install_requires=[
"pyodbc>=4.0.27",
"sqlalchemy>=1.3.8",
"pandas>=0.25.1",
"requests>=2.22.0",
"assertpy>=0.14",
"cycler>=0.10.0",
]
However, when I run the installation process:
either with python setup.py install --record installed_files.txt
or with pip install .
I see that there is some gcc / C++ compilation going on, with logs about Python wheels. (I don't completely understand the implications of Python eggs and wheels, but AFAIK if conda is available I should use the conda packages rather than eggs/wheels, so I don't have to take care of the C++ code underneath the Python code.)
I would really prefer to have conda install these C++ blobs wrapped in Python code as libraries, e.g. pandas.
Is it possible at all to have conda drive the installation process described in setup.py, so I am not dealing with gcc?
How can I make sure that other Python code depending on this internal library (installed via setup.py) uses the same (transitive) dependencies defined in that setup.py?
Regardless of the installation method, how can I make sure that the dependencies of e.g. pandas are installed as well? Sometimes I see that numpy, as a dependency of pandas, is not installed when running setup.py, but I would like to avoid handling this manually (e.g. with some requirements.txt file).
pip doesn't know about conda, so you cannot build a pip-installable package that pulls in its dependencies from conda channels.
conda doesn't care about setup.py; it uses a different format for recording dependencies.
To install your code with conda, you should create a conda package, and specify your dependencies in a meta.yaml file. Refer to the documentation of "conda build" for details.
https://docs.conda.io/projects/conda-build/en/latest/resources/define-metadata.html
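As a rough illustration (not a drop-in recipe; the package name, version, and build script here are placeholders), a meta.yaml for your internal library could express the same pins as your install_requires:
package:
  name: mylib            # hypothetical name for the internal library
  version: "0.1.0"       # placeholder version

source:
  path: .                # build from the local checkout containing setup.py

build:
  script: python -m pip install . --no-deps -vv

requirements:
  host:
    - python
    - pip
    - setuptools
  run:
    - python
    - pyodbc >=4.0.27
    - sqlalchemy >=1.3.8
    - pandas >=0.25.1
    - requests >=2.22.0
Building it with conda build . and installing the result with conda install --use-local mylib means conda, not pip, resolves pandas, numpy, and the other transitive dependencies.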
In my program, I need to use some joblib functions. However, when I run the program, I get the error message: sklearn.externals.joblib is deprecated in 0.21 and will be removed in 0.23.
Apparently the library has been updated in this GitHub repo, but I did not have success installing it with the pip install command.
I did a test just to install from the setup file:
pip install https://github.com/dsxuser/scikit-learn/setup.py/0.20.x.zip
but I got a 404 error.
What I need is to update the whole joblib library from that branch.
Does anyone know how to properly install it?
That's not an error, that's a warning. It tells you that you shouldn't use sklearn.externals.joblib anymore if you want your code to be compatible with later versions of scikit-learn. "Shouldn't" means that you still can, as long as you do NOT upgrade scikit-learn to 0.23 or later.
The way to make your code ready for later versions of scikit-learn is to not use the deprecated sklearn.externals.joblib, but to use joblib directly instead. It's not pre-installed, so you can do one of these:
conda install joblib
pip install joblib
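In the code itself the fix is mostly an import swap, roughly like this (a sketch; the estimator and filename are just illustrative):
import joblib                                  # instead of: from sklearn.externals import joblib
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

X, y = load_iris(return_X_y=True)
model = LogisticRegression(max_iter=200).fit(X, y)

joblib.dump(model, 'model.pkl')                # persist the fitted estimator
restored = joblib.load('model.pkl')            # load it back later
print(restored.score(X, y))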
You didn't mention what part of Watson Studio you are using. If it's notebooks without Spark, the preferred way to install packages is with conda. You can define a custom environment with this customization:
dependencies:
- joblib=0.13.2
or else you can call conda from a notebook cell:
!conda install joblib=0.13.2
If you're using some other part of Watson Studio, give conda a try, and if it doesn't work, switch to pip. Note that pip expects == instead of = before the version number. Specifying the version number protects you from surprises when new versions of joblib are released.
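For example, the pip equivalent of the conda line above would be:
!pip install joblib==0.13.2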
I've installed pygrib by using conda install -c conda-forge pygrib and no issues were raised. However, when importing it in order to use it I get this message:
ImportError: dlopen(/Users/andrea1994/anaconda3/lib/python3.6/site-packages/pygrib.cpython-36m-darwin.so, 2): Library not loaded: #rpath/libpng16.16.dylib
Referenced from: /Users/andrea1994/anaconda3/lib/python3.6/site-packages/pygrib.cpython-36m-darwin.so
Reason: Incompatible library version: pygrib.cpython-36m-darwin.so requires version 51.0.0 or later, but libpng16.16.dylib provides version 49.0.0
I've gone through several procedures that were supposed to solve similar issues, but none worked (updating libpng, uninstalling and reinstalling Anaconda, ...). Does anyone have any clue? I'm not an expert in this field: most of the time I manage to get things working, but as you see, sometimes I fail. Thank you!
I know this is old, but I had the same issue and was finally able to import pygrib after I started a clean environment and installed it from conda:
conda install -c conda-forge pygrib
Then I installed jasper. I believe it comes in with the pygrib install, but I am not sure whether the correct one is picked up:
conda install jasper -c conda-forge
sudo python -m pip install pygrib
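After that, a minimal import test like this should succeed if the dylib issue is gone (the file path is a placeholder; any local GRIB file will do):
import pygrib                          # fails with the dlopen error if libpng is still mismatched
grbs = pygrib.open('example.grib2')    # placeholder path to a GRIB file
for grb in grbs:
    print(grb)                         # prints a one-line summary of each message
grbs.close()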
When installing Theano, Anaconda automatically tries to install pygpu despite this being an optional dependency. I have deleted the .theanorc file from my Windows user directory.
Also, when running my application, Theano tries to load the GPU. It's like it remembers somehow?
conda install theano
Fetching package metadata .............
Solving package specifications: .
Package plan for installation in environment
C:\Users\zebco\Miniconda3\envs\py35:
The following NEW packages will be INSTALLED:
libgpuarray: 0.6.9-vc14_0
pygpu: 0.6.9-py36_0
theano: 0.9.0-py36_0
Proceed ([y]/n)?
As you can see, I've only asked to install theano, yet conda wants to install everything, including optional dependencies.
Whether pygpu is really optional depends on the package manager you are using.
Regular Python (pip)
If you are using a direct Python install (obtained via brew or from the Python site), then you would be using pip to install theano. The package basically comes from:
https://pypi.python.org/pypi/Theano/1.0.0
If you download the file, unzip it, and open setup.py, you will see the lines below:
install_requires=['numpy>=1.9.1', 'scipy>=0.14', 'six>=1.9.0'],
These are set as the dependencies for this package, which means that when you install theano you will also get numpy, scipy, and six.
Anaconda Python (conda)
Now coming to Anaconda Python: Anaconda doesn't use the package format that PyPI and pip use; it has its own format. With Anaconda you should be using conda, not pip, to install the packages you need.
Conda has channels, which are just repositories of available packages. You can install a package from a specific channel like this:
conda install -c <channel-name> <package-name>
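For example, to pull theano explicitly from the conda-forge channel:
conda install -c conda-forge theano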
One of the most widely used channels is conda-forge. If you look at the theano package there:
https://anaconda.org/conda-forge/theano/files
Download and extract it; there will be an info/recipe/meta.yaml file, which contains the following:
requirements:
build:
- ca-certificates 2017.7.27.1 0
- certifi 2017.7.27.1 py36_0
- ncurses 5.9 10
- openssl 1.0.2l 0
- python 3.6.2 0
- readline 6.2 0
- setuptools 36.3.0 py36_0
- sqlite 3.13.0 1
- tk 8.5.19 2
- xz 5.2.3 0
- zlib 1.2.11 0
run:
- python
- setuptools
- six >=1.9.0
- numpy >=1.9.1
- scipy >=0.14
- pygpu >=0.6.5,<0.7
This specifies that if you want to run this package, then pygpu is one of its dependencies. So conda downloads pygpu as a dependency, which you thought was optional (which is probably true if you were using regular Python and pip).
Update:
Usually, 'optional dependency' is an oxymoron: something optional is not a dependency; a dependency is a software package that another piece of software needs in order to function.
One may get by without a dependency if it only supports a single isolated feature that is not being used. As a beginner, I would suggest you not take this path.
I am not super familiar with Theano, but Theano can use the system's GPU to speed up its computations, and it seems to me pygpu and libgpuarray are what enable this functionality, which means they are not treated as optional.
I believe pygpu is 'optional' if you do not wish to use the GPU for speeding up computation (only done if the GPU is powerful enough to be useful for this).
The --no-deps flag mentioned in the original answer below lets you install a package without its dependencies, but that is rarely wise unless you really know what you are doing. As a beginner, I would not recommend going down this path yet. Conda was designed specifically so that scientific packages are easily managed with everything necessary installed without fuss; pip is a general Python package manager and is not built specifically for scientific packages.
If you wish to install theano without installing its dependencies, you have three options:
use conda install theano --no-deps.
Install it using pip instead of conda, using pip install theano. This will install theano, numpy, scipy and six but not pygpu and libgpuarray.
Create a custom conda build file for Theano. Documentation is at:
https://conda.io/docs/user-guide/tasks/build-packages/index.html
Original Answer:
You probably know this already, but use this command instead:
conda install theano --no-deps
This does not install dependencies of the package. If you already have the essential dependencies installed, as it would seem, this should work out for you.
libgpuarray is a dependency of pygpu. With this command switch neither will be installed.
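Separately, if your underlying goal is just to stop Theano from trying to use the GPU even when pygpu is present, the device flag controls that. This is not from the answer above, just a sketch using Theano's standard configuration mechanism:
import os
# must be set before theano is imported; equivalent to device=cpu in .theanorc
os.environ['THEANO_FLAGS'] = 'device=cpu,floatX=float32'
import theano
print(theano.config.device)  # should report 'cpu'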
Can you share the .yaml file that you edited?
I'm trying to install pydot in Python 3 and I came up with some questions:
Are the packages referenced by pip3.3 the same as those referenced by pip2.7, or is there a different repository for the packages?
How does all the packaging/distribution work in Python?
What should I do to install pydot through pip?
Actually the creator says that Python 3 is not supported, but pydot is listed in pip3.3.
A fork of pydot (https://bitbucket.org/prologic/pydot) that works on Python 3 exists; why is it not listed in pip?
Can I install pydot through pip?
Almost all package/program distribution in Python is done with distutils. This is very well documented in the Python documentation.
To your specific problem: pip usually searches PyPI for the package and then downloads a distribution of it. Most often this is a source package that needs to be byte-compiled. If that succeeds, the package is most probably compatible with the Python version you're using; if not, you will most probably get SyntaxErrors or something like that.
As far as I know, PyPI has no other completely reliable version-compatibility classifiers.
So the surest way to tell whether the package is compatible is to install it and then check whether it works.
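For instance, once an install appears to succeed on Python 3, a tiny smoke test like this tells you whether the package actually works (no Graphviz binary is needed just to build the DOT source):
import pydot

graph = pydot.Dot(graph_type='digraph')      # a directed graph
graph.add_node(pydot.Node('a'))
graph.add_node(pydot.Node('b'))
graph.add_edge(pydot.Edge('a', 'b'))
print(graph.to_string())                     # prints the generated DOT source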