My system already has Python 3.8 installed, but I want to create a virtual environment with Python 3.7 to use in VS Code. How can I do that?
To work with different environments, you can use the Anaconda Prompt.
You can follow the steps given here: https://docs.conda.io/projects/conda/en/latest/user-guide/tasks/manage-environments.html
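For example, with conda this could look roughly like the following (a minimal sketch; the environment name py37 is arbitrary):
conda create -n py37 python=3.7    # create an environment that uses Python 3.7, alongside the existing 3.8
conda activate py37                # switch to it
python --version                   # should now report Python 3.7.x
In VS Code you can then run the "Python: Select Interpreter" command and pick the py37 environment so the editor and integrated terminal use it.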
I am taking a beginner course in Python, so I downloaded and installed Anaconda on
Windows 10. Is it possible to install a newer version of Python, say 3.9.x or 3.10, to run in VS Code? How do I avoid conflicts between the two versions? Can anyone tell me how I should do this, because I have to use that editor? Thanks in advance.
Yes, but Anaconda includes an environment management tool, Conda, so it is more idiomatic to use that to create a new environment rather than installing a separate system- or user-level Python interpreter. If you are using Anaconda, then from the Anaconda Prompt try
conda create -n py39 python=3.9
or
conda create -n py310 python=3.10
Note that the argument passed to -n (or --name) is arbitrary - feel free to name environments as you'd like.
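To check that the new environment does not conflict with the base Anaconda installation, here is a small sketch from the Anaconda Prompt (where is the Windows counterpart of which; the environment name follows the examples above):
conda env list        # lists every environment; the active one is marked with *
conda activate py39
where python          # should resolve to the py39 environment before the base installation
python --version      # Python 3.9.x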
I have two virtual environments. One has Python 3.7.9, the other has 3.8.5. I have downloaded torch 1.6.0 in the first virtual environment. But when trying to install the same torch version in the second virtual environment, pip downloads the entire package again instead of installing from the cache.
Is there any way to force installation from the cache? I want the same torch==1.6.0 in both environments, and it is frustrating to download it every time.
Have a look at the listed files...
https://pypi.org/project/torch/#files
There are different wheels for different Python versions.
This makes sense: each wheel contains code built for a specific Python version, so the wheel cached for 3.7 cannot be reused under 3.8 and has to be downloaded again.
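To see this in the cache, here is a small sketch assuming pip 20.1 or newer (which provides the pip cache subcommand); the ./wheels directory is just an illustrative name:
pip cache list torch                            # in the 3.7 environment the cached wheel carries a cp37 tag, which Python 3.8 cannot use
pip download torch==1.6.0 -d ./wheels           # optional: run once per Python version to keep both wheels on disk
pip install --find-links ./wheels torch==1.6.0  # later installs can reuse the local copies instead of downloading again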
When I try to check the Python version on CentOS 7 by typing the command
1) python --version
-bash: /usr/local/bin/python3.6: No such file or directory
OR
2) which python
"alias python='/usr/local/bin/python3.6'.
Could anyone explain why it shows this instead of the Python version?
Thanks.
To sum up, for future readers of this question:
When installing different versions of Python on Linux using a package manager (in this case yum, because you are using CentOS, but it might be apt or something else), Linux installs them side by side, meaning you have all of the versions installed together.
If you want to use a specific version other than your Linux distribution's default one, you need to call it explicitly (i.e. python3.6 or python3.8).
Make sure you are looking for the wanted Python version on the right path.
When you want to run some version of Python after installing it, I suggest just typing python in the shell and hitting Tab twice; bash will suggest all of the installed versions on your $PATH.
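For the alias problem above, a small diagnostic sketch in bash (using the paths from the question):
ls /usr/local/bin/python*    # check which interpreters actually exist at that location
type -a python               # shows the alias and any real python binaries on $PATH
unalias python               # drop the stale alias for the current session (remove it from ~/.bashrc to make this permanent)
python3.6 --version          # or call whichever installed version you want explicitly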
I am using the Anaconda installers and have Python 2 installed. However, I want to use both Python 2 and 3 interchangeably in Jupyter Notebook. I read through the instructions provided in Installing the IPython kernel.
Does it mean that I do not need to install Python 3, just the kernel to start using both Python 2 and 3 in Jupyter?
Conda makes it a breeze. The full documentation should guide you through managing both Python 3 and Python 2.
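As an illustration, a minimal sketch of the conda route, assuming Jupyter runs from your existing Python 2 base installation (the environment name py3 and the display name are arbitrary):
conda create -n py3 python=3 ipykernel          # a new Python 3 environment that includes the kernel package
conda activate py3
python -m ipykernel install --user --name py3 --display-name "Python 3"
After this, notebooks started from the base installation should offer both a Python 2 and a Python 3 kernel.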
In order to run an optimization problem, we set up Gurobi 6.0.4 together with
Anaconda (version 2.2.0) Python (Python 2.7.9) on
Linux CentOS release 6.6 (Final) with the 2.6.32-504.16.2.el6.x86_64 kernel.
Following the installation guidelines of Gurobi (listed here: http://www.gurobi.com/documentation/6.0/quickstart_linux.pdf)
everything worked out at first: Gurobi was installed and we could obtain a license. The PATH variables were also set (in .bashrc) according to the manual, with a small extension referring to the Anaconda Python (and not the other local versions of Python, 2.7 and 3.4):
export GUROBI_HOME="/opt/gurobi604/linux64"
export PATH="${PATH}:${GUROBI_HOME}/bin:/opt/anaconda/bin"
export LD_LIBRARY_PATH="${LD_LIBRARY_PATH}:${GUROBI_HOME}/lib"
Following the procedure, we executed python2.7 setup.py install in the respective directory /opt/gurobi604/linux64. After this, you can usually run the import gurobipy command in the Python interpreter without errors. For older versions of Gurobi (such as 5.6.3) this works out very well.
For 6.0.4, though, we constantly receive the error:
ImportError: /opt/anaconda/lib/python2.7/site-packages/gurobipy/gurobipy.so: undefined symbol: _Py_FalseStruct
This is fully reproducible, no matter whether we also put Anaconda in the global PATH, and we checked the shell for anything overwriting the environment variables, which is not the case.
On Windows 8, Gurobi 6.0.4 and Anaconda Python 2.2.0 work together without any problems.
Also applying hints from here: Python Module Error on Linux did not work out.
Has anyone else experienced these problems with this combination of tools? Thanks.
The error message indicates that the Python 3.4 build of the module ended up in your Python 2.7 package directory. This can happen if you do not clean your Python module build directory between builds. Please try the following:
Completely remove the gurobipy package from your Python 2.7 installation (e.g. remove /opt/anaconda/lib/python2.7/site-packages/gurobipy)
Completely remove the Python module build directory from your Gurobi installation (e.g. /opt/gurobi604/linux64/build)
Re-run the build process for the Python 2.7 module (e.g. run "python2 setup.py install" in /opt/gurobi604/linux64)
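As a concrete sketch of those three steps, using the paths from the question:
rm -rf /opt/anaconda/lib/python2.7/site-packages/gurobipy   # 1. remove the stale package
rm -rf /opt/gurobi604/linux64/build                         # 2. remove the old build directory
cd /opt/gurobi604/linux64
python2 setup.py install                                    # 3. rebuild and install against Python 2.7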
Please note that CentOS is currently a non-supported platform for Gurobi.
Thank you for the hint. I think we tried that, but did not finish the procedure in this way. We tried to clean the system, but in that particular case we still had both Python versions on the machine (due to other applications that use 3.4). Our solution in this case was just to reinstall everything cleanly on an Ubuntu 14.04 VM. Since then no further problems have occurred. (I know, not the cleanest solution.)
We had some similar issues when we updated to Gurobi 6.5, but those could be solved by correctly addressing the usual path issues.
Thank you in any case for the reply; I think this will really help us with the next, then clean, deployment :-)