PyTorch installation from source into a second environment breaks

I installed PyTorch from source: I first built it into one environment and it installed fine. Now, using the same cloned git repo with a second environment, it returns:
$ python setup.py install
Building wheel torch-1.13.0a0+gitf50a248
-- Building version 1.13.0a0+gitf50a248
cmake --build . --target install --config Release
No such file or directory
CMake Error: Generator: execution of make failed. Make command was: /home/me/miniconda3/envs/torch_source/bin/ninja install &&
I am using a different environment than torch_source, but the build still tries to use torch_source's ninja, and that environment no longer exists.
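A likely cause (my assumption, not confirmed in the question) is that PyTorch's build/ directory caches absolute tool paths from the first environment in CMakeCache.txt, including the old ninja. A sketch of a fix under that assumption:

```shell
# inspect which build tool the cached configuration points at (path will vary)
grep -m1 -i 'MAKE_PROGRAM' build/CMakeCache.txt

# wipe the stale cache so CMake re-detects tools from the active environment
python setup.py clean          # or simply: rm -rf build
python setup.py install
```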


Building kobuki_ros and kobuki_interfaces returns a rosidl_generate_interfaces error

To replicate full error
Requirements
Ubuntu 22.04 LTS
ROS2 HUMBLE
PYTHON3
Installation/Setup guide
Installing pyenv:
https://realpython.com/intro-to-pyenv/#virtual-environments-and-pyenv
If your python3 version is already < 3.8, skip this step.
After you are done installing pyenv:
pyenv install -v 3.7.16
Each version of python installed will be located in pyenv directory
ls ~/.pyenv/versions
You can uninstall a Python version with the following (don't do this now):
pyenv uninstall <version>
Check all installed Python versions with:
pyenv versions
For more information about pyenv usage, read the website linked above; it provides a very detailed walkthrough of pyenv.
If your Python version is > 3.8:
Remember to source ROS 2.
Create the workspace and set the local Python version:
mkdir turtlebot_ws && cd turtlebot_ws
pyenv local 3.7.16 #change local python for this folder
python -V
Output should be
Python 3.7.16
In this folder, download the scripts that set up the workspace for Kobuki:
#a virtual environment launcher that will fetch build tools from pypi (colcon, vcstools)
wget https://raw.githubusercontent.com/kobuki-base/kobuki_documentation/release/1.0.x/resources/venv.bash || exit 1
#custom build configuration options for eigen, sophus
wget https://raw.githubusercontent.com/kobuki-base/kobuki_documentation/release/1.0.x/resources/colcon.meta || exit 1
#list of repositories to git clone
wget https://raw.githubusercontent.com/kobuki-base/kobuki_documentation/release/1.0.x/resources/kobuki_standalone.repos || exit 1
After that, create a virtual environment with this local Python version:
pyenv virtualenv 3.7.16 .venv
# usage: pyenv virtualenv <version> <env-name>
Activate the virtual env
pyenv activate .venv
Then install kobuki from source
mkdir src
#vcs handles distributed fetching of repositories listed in a .repos file
vcs import ./src < kobuki_standalone.repos || exit 1
Now go into the src folder, delete ecl_lite, and clone in the stonier version:
cd src
rm -rf ecl_lite
git clone https://github.com/stonier/ecl_lite.git
Also clone two more repos and install some additional dependencies:
git clone https://github.com/kobuki-base/kobuki_ros_interfaces.git
git clone https://github.com/kobuki-base/kobuki_ros.git
pip install catkin_pkg lark empy
Now we can go back to the workspace root and build:
cd ..
# build everything
colcon build --merge-install --cmake-args -DBUILD_TESTING=OFF
# disable any unused cmake variable warnings (e.g. sophus doesn't use BUILD_TESTING)
colcon build --merge-install --cmake-args -DBUILD_TESTING=OFF --no-warn-unused-cli
# build a single package
colcon build --merge-install --packages-select kobuki_core --cmake-args -DBUILD_TESTING=OFF
# build everything, verbosely
VERBOSE=1 colcon build --merge-install --event-handlers console_direct+ --cmake-args -DBUILD_TESTING=OFF
# build release with debug symbols
colcon build --merge-install --cmake-args -DBUILD_TESTING=OFF -DCMAKE_BUILD_TYPE=RelWithDebInfo
Then update the source workspace and deactivate the venv:
# update the source workspace
vcs pull ./src
source install/setup.bash
pyenv deactivate
Error
--- stderr: kobuki_ros_interfaces
CMake Error at /opt/ros/humble/share/rosidl_adapter/cmake/rosidl_adapt_interfaces.cmake:42 (get_executable_path):
Unknown CMake command "get_executable_path".
Call Stack (most recent call first):
/opt/ros/humble/share/rosidl_cmake/cmake/rosidl_generate_interfaces.cmake:130 (rosidl_adapt_interfaces)
CMakeLists.txt:55 (rosidl_generate_interfaces)
---
Failed <<< kobuki_ros_interfaces [0.64s, exited with code 1]
Aborted <<< ecl_eigen [0.05s]
Aborted <<< ecl_config [0.07s]
Aborted <<< ecl_mpl [0.07s]
Aborted <<< ecl_command_line [0.07s]
I tried building kobuki_ros_interfaces alone and it works. It fails when I try to build everything together.
The error you are getting is verbose enough to tell you what is wrong, i.e.:
--- stderr: kobuki_ros_interfaces
CMake Error at /opt/ros/humble/share/rosidl_adapter/cmake/rosidl_adapt_interfaces.cmake:42 (get_executable_path):
Unknown CMake command "get_executable_path".
Judging by this, the CMake file called from within [...]/rosidl_generate_interfaces.cmake invokes a function named get_executable_path. You didn't share rosidl_adapt_interfaces.cmake with us, so I had to google it.
Here is the link for the actual .cmake file that is causing the issues.
The lines that you should be interested in are lines 40-42:
find_package(ament_cmake_core REQUIRED) # for get_executable_path
find_package(Python3 REQUIRED COMPONENTS Interpreter)
get_executable_path(python_interpreter Python3::Interpreter CONFIGURE)
You are most likely missing the ament_cmake_core package, which is required for get_executable_path, as the author of that .cmake file noted in the comment.
Install the missing package, or make sure the version you are using provides the missing function.
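Following that advice, a quick check-and-install sketch; the apt package name ros-humble-ament-cmake-core is my assumption based on ROS 2 Humble's packaging convention:

```shell
# does the package that defines get_executable_path exist in this install?
ls /opt/ros/humble/share/ament_cmake_core >/dev/null 2>&1 \
  || echo "ament_cmake_core is missing"

# if it is missing, install it (package name assumed)
sudo apt install ros-humble-ament-cmake-core
```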

How to get pycrypto working properly in Docker?

I am using the Pychromeless repo successfully on AWS Lambda.
But now I need the pycrypto dependency, and I am getting
configure: error: no acceptable C compiler found in $PATH
 
when running make docker-build
(after placing pycrypto==2.6.1 at requirements.txt file).
There's this thread where someone mentions the same problem:
 
"The gcc compiler is not in your $PATH. It means either you dont have gcc installed or it's not in your $PATH variable".
So I tried adding apt-get install build-essential to the Dockerfile, but I got
/bin/sh: apt-get: command not found
Then, I tried with yum install gcc
only to get
The command '/bin/sh -c yum install gcc' returned a non-zero code: 1
The docker-lambda info page (https://hub.docker.com/r/lambci/lambda/) says:
This project consists of a set of Docker images for each of the supported Lambda runtimes.
There are also a set of build images that include packages like gcc-c++, git, zip and the aws-cli for compiling and deploying.
So I guess I shouldn't need to install gcc. Maybe gcc is just not in $PATH, but I don't know how to fix that.
Here is the Dockerfile:
FROM lambci/lambda:python3.6
MAINTAINER tech#21buttons.com
USER root
ENV APP_DIR /var/task
WORKDIR $APP_DIR
COPY requirements.txt .
COPY bin ./bin
COPY lib ./lib
RUN mkdir -p $APP_DIR/lib
RUN pip3 install -r requirements.txt -t /var/task/lib
Any help on solving this?
Well, well, well...today was a lucky day for me.
So simple: all I had to do was replace
pycrypto==2.6.1
by
pycryptodome
on my requirements.txt file.
This thread says: "Highly recommend NOT to use pycrypto. It is old and not maintained and contains many vulnerabilities. Use pycryptodome instead - it is compatible and up to date".
And that's it! Docker builds just fine with pycryptodome.
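The swap can be scripted; this sketch edits a local requirements.txt in place (GNU sed assumed):

```shell
# before: the pinned, unmaintained library
printf 'pycrypto==2.6.1\n' > requirements.txt
# replace it with the maintained, API-compatible fork
sed -i 's/^pycrypto==.*$/pycryptodome/' requirements.txt
cat requirements.txt
```

pycryptodome keeps the `Crypto` import namespace, so existing `from Crypto.Cipher import AES`-style code should keep working without changes.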

Building cv_bridge package with ROS Kinetic and Python 3 ignores CMake arguments

I'm trying to integrate a ROS package into our system for a research project; the cv_bridge package with Python 3 is needed to get it working. Currently I can't get cv_bridge to build against Python 3 despite multiple attempts; it always builds into the python2 directory.
Working on Ubuntu 16.04 with ROS Kinetic, using Python 3.5.
Error Message:
[ERROR] [1563897986.999724]: bad callback: <function color_callback at 0x7f00ffa06598>
Traceback (most recent call last):
File "/opt/ros/kinetic/lib/python2.7/dist-packages/rospy/topics.py", line 750, in _invoke_callback
cb(msg)
File "/home/rival/Documents/Repos/ROS/src/rcnn_apple_detector/detection.py", line 84, in color_callback
image = bridge.imgmsg_to_cv2(image_msg, "bgr8")
File "/home/rival/Documents/Repos/ROS/src/vision_opencv/cv_bridge/python/cv_bridge/core.py", line 163, in imgmsg_to_cv2
dtype, n_channels = self.encoding_to_dtype_with_channels(img_msg.encoding)
File "/home/rival/Documents/Repos/ROS/src/vision_opencv/cv_bridge/python/cv_bridge/core.py", line 99, in encoding_to_dtype_with_channels
return self.cvtype2_to_dtype_with_channels(self.encoding_to_cvtype2(encoding))
File "/home/rival/Documents/Repos/ROS/src/vision_opencv/cv_bridge/python/cv_bridge/core.py", line 91, in encoding_to_cvtype2
from cv_bridge.boost.cv_bridge_boost import getCvType
ImportError: dynamic module does not define module export function (PyInit_cv_bridge_boost)
I've tried the steps in this previous question's answer:
Unable to use cv_bridge with ROS Kinetic and Python3
"You are right, you should build cv_bridge with python3.
You can do it with passing -DPYTHON_EXECUTABLE=/usr/bin/python3
-DPYTHON_INCLUDE_DIR=/usr/include/python3.5m -DPYTHON_LIBRARY=/usr/lib/x86_64-linux-gnu/libpython3.5m.so args to cmake. Or, if you are using catkin to build packages, you can do next
steps:"
The only variation from those steps is that I have to use catkin_make, because several older packages I rely on don't currently work with catkin build. I don't know if that is the cause, but I pass the CMake arguments into my workspace and it still targets the ROS python2 directory.
Location where cv_bridge is currently being built:
/opt/ros/kinetic/lib/python2.7/dist-packages
Version:
apt-cache show ros-kinetic-cv-bridge | grep Version
Version: 1.12.8-0xenial-20190320-142632-0800
catkin config:
-------------------------------------------------------------------------------------------------------------------------------------------------
Profile: default
Extending: [env] /home/rival/Documents/Repos/ROS/devel:/opt/ros/kinetic
Workspace: /home/rival/Documents/Repos/ROS
---------------------------------------------------------------------------------------------------------------------------------------------------
Build Space: [exists] /home/rival/Documents/Repos/ROS/build
Devel Space: [exists] /home/rival/Documents/Repos/ROS/devel
Install Space: [missing] /home/rival/Documents/Repos/ROS/install
Log Space: [missing] /home/rival/Documents/Repos/ROS/logs
Source Space: [exists] /home/rival/Documents/Repos/ROS/src
DESTDIR: [unused] None
---------------------------------------------------------------------------------------------------------------------------------------------------
Devel Space Layout: linked
Install Space Layout: merged
---------------------------------------------------------------------------------------------------------------------------------------------------
Additional CMake Args: -DPYTHON_EXECUTABLE=/usr/bin/python3 -DPYTHON_INCLUDE_DIR=/usr/include/python3.5m -DPYTHON_LIBRARY=/usr/lib/x86_64-linux-gnu/libpython3.5m.so
Additional Make Args: None
Additional catkin Make Args: None
Internal Make Job Server: True
Cache Job Environments: False
---------------------------------------------------------------------------------------------------------------------------------------------------
Whitelisted Packages: None
Blacklisted Packages: None
---------------------------------------------------------------------------------------------------------------------------------------------------
Workspace configuration appears valid.
Exact steps taken:
sudo apt-get install python-catkin-tools python3-dev python3-catkin-pkg-modules python3-numpy python3-yaml ros-kinetic-cv-bridge
catkin clean
catkin config -DPYTHON_EXECUTABLE=/usr/bin/python3 -DPYTHON_INCLUDE_DIR=/usr/include/python3.5m -DPYTHON_LIBRARY=/usr/lib/x86_64-linux-gnu/libpython3.5m.so
cd src/vision_opencv/
git checkout 1.12.8
catkin_make cv_bridge
source devel/setup.bash --extend
I expect cv_bridge to be built with the cmake arguments but it still gets targeted back into the python2 directory. It seems the cmake args are being ignored/not targeting correctly.
I followed this blog post.
Essentially, you need to create a new catkin workspace so that its configuration doesn't interfere with your original (python2) workspace.
Install python3 and dependencies. I like using --user without sudo to make resolving dependency conflicts a little easier.
sudo apt-get install python-catkin-tools python3-dev python3-numpy
sudo apt-get install python3-pip python3-yaml
pip3 install rospkg catkin_pkg --user
Make a new workspace
mkdir py3_catkin_ws
cd py3_catkin_ws
mkdir src
Initialize catkin with Python 3 configured (use your python3.x paths):
catkin config -DPYTHON_EXECUTABLE=/usr/bin/python3 -DPYTHON_INCLUDE_DIR=/usr/include/python3.6m -DPYTHON_LIBRARY=/usr/lib/x86_64-linux-gnu/libpython3.6m.so
catkin config --install
Clone and catkin build vision_opencv in your new workspace, then continue as usual. Python 3 should now be able to locate cv_bridge. The blog post suggests sourcing install/setup.bash --extend, but it wasn't necessary in my case.
I was having the same issue under ROS Melodic on Ubuntu 18.04, using conda to separate the environments. I started to get some strange errors, like NumPy not being found. At that point it was too much of a hassle and a likely waste of time.
This was especially annoying as I could launch my ROS node using Python 3 and OpenCV worked perfectly well, only the transfer from ROS to CV was missing...
I ended up just copying the lines from cv_bridge that do the translation into my package, and doing the translation without importing the cv_bridge package.
As ROS is getting more Python 3 friendly, and packages will hopefully follow, this will not be an issue in the future.

How to fix "CMake Error at CMakeLists.txt (bison_target_or_gen):"

I am setting up clingo-master on Ubuntu; clingo is an ASP application. But when I followed this guide and ran cmake:
cmake -H./libgringo -B./gringoBin -DCMAKE_BUILD_TYPE=Release
I met the error:
CMake Error at CMakeLists.txt:70 (bison_target_or_gen):
Unknown CMake command "bison_target_or_gen".
I have installed bison, but the error is still there. I am new to Linux and have no idea what to do.
This is what the guide says:
When cloning the git repository, do not forget to update the submodules (with source releases, you can skip this step):
git submodule update --init --recursive
To build gringo, clingo, and reify in their default configurations in release mode, run:
cmake -H<SOURCE_DIR> -B<BUILD_DIR> -DCMAKE_BUILD_TYPE=Release
cmake --build <BUILD_DIR>
The resulting binaries and shared libraries will be in <BUILD_DIR>/bin and are ready to use.
To install all binaries and development files under cmake's install prefix (see the build options), run:
cmake --build <BUILD_DIR> --target install
To run the tests, enable option CLINGO_BUILD_TESTS (see build options) and run:
cmake --build <BUILD_DIR> --target test
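Since bison_target_or_gen is not a built-in CMake command, it most likely comes from a macro defined in one of clingo's git submodules; my guess is that either the submodules were never fetched, or cmake was pointed at the libgringo subdirectory instead of the source root. A sketch under those assumptions:

```shell
# uninitialized submodules are listed with a leading '-'
git submodule status | awk '$1 ~ /^-/ {print "missing:", $2}'

# fetch them, then configure from the top-level source directory
git submodule update --init --recursive
cmake -H. -B./build -DCMAKE_BUILD_TYPE=Release
cmake --build ./build
```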

Pybind11 Linux tests build failure - 'Could not find package configuration file pybind11Config.cmake or pybind11-config.cmake'

I'm trying to build the pybind11 tests on a Linux box. I downloaded the source and did the following:
cd pybind11-master
cd tests
mkdir build
cd build
cmake ..
I get this error:
Could not find a package configuration file provided by "pybind11" with any of
the following names:
pybind11Config.cmake
pybind11-config.cmake
Add the installation prefix of "pybind11" to CMAKE_PREFIX_PATH or set
"pybind11_DIR" to a directory containing one of the above files. If "pybind11"
provides a separate development package or SDK, be sure it has been installed.
I followed this link - http://pybind11.readthedocs.io/en/master/basics.html - and followed the instructions in the section 'Compiling Test Cases for Linux/Mac'.
I am not sure how to proceed. Any pointers are helpful.
I hit the same problem after installing pybind11 with pip install pybind11.
I'll post my solution here in case someone else ends up here.
I installed it following this link; everything went OK, and the needed file was there.
Basically:
$ git clone https://github.com/pybind/pybind11.git
$ cd pybind11
$ mkdir build
$ cd build
$ cmake ..
$ make -j`nproc`
$ make check
$ make -j`nproc` install
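If pybind11 was installed with pip rather than built from source, an alternative (for pybind11 versions that expose their CMake directory, roughly 2.6 and newer) is to point CMake at the pip install directly:

```shell
# ask the pip-installed pybind11 where its *.cmake config files live,
# and pass that directory to CMake so find_package(pybind11) succeeds
cmake .. -Dpybind11_DIR="$(python3 -m pybind11 --cmakedir)"
```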
