gcc on Lubuntu not working - linux

I am trying to use gcc on Lubuntu.
I have run the following installs, but I still get the message "The program 'gcc' can be found in the following packages":
sudo apt-get install gcc
sudo apt-get install build-essential
What am I doing wrong? Is there a way I can verify that gcc was correctly installed?

I bet it's not in your PATH variable. Check with:
env | grep -i path
If gcc doesn't show up on the PATH, you need to add the directory containing the gcc binary to your PATH environment variable.
To find gcc, try:
find / -name gcc 2>/dev/null
Then add its directory to PATH and confirm:
export PATH=$PATH:/path/to/gcc
echo $PATH
If that fails, try this guide; it may be an issue with your apt-get calls: https://askubuntu.com/questions/240919/how-to-install-gcc-4-7-on-lubuntu-11-10
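To answer the verification part of the question: a quick sanity check (a generic sketch for Debian/Ubuntu-style systems, using a throwaway file name like hello.c) is to ask the shell where gcc lives, print its version, and compile a trivial program:
which gcc
gcc --version
dpkg -l gcc build-essential    # shows whether the packages are actually installed
printf 'int main(void){return 0;}\n' > hello.c
gcc hello.c -o hello && ./hello && echo "gcc works"
If which gcc prints nothing even though the packages installed cleanly, the PATH advice above applies.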

Related

pip install fasttext gives fatal error on Linux [duplicate]

I am trying to build a shared library using a C extension file but first I have to generate the output file using the command below:
gcc -Wall utilsmodule.c -o Utilc
After executing the command, I get this error message:
> utilsmodule.c:1:20: fatal error: Python.h: No such file or directory
compilation terminated.
I have tried all the solutions suggested on the internet, but the problem still exists. Python.h itself is present on my machine; I managed to locate the file.
Looks like you haven't properly installed the header files and static libraries for python dev. Use your package manager to install them system-wide.
For apt (Ubuntu, Debian...):
sudo apt-get install python-dev # for python2.x installs
sudo apt-get install python3-dev # for python3.x installs
For yum (CentOS, RHEL...):
sudo yum install python-devel # for python2.x installs
sudo yum install python3-devel # for python3.x installs
For dnf (Fedora...):
sudo dnf install python2-devel # for python2.x installs
sudo dnf install python3-devel # for python3.x installs
For zypper (openSUSE...):
sudo zypper in python-devel # for python2.x installs
sudo zypper in python3-devel # for python3.x installs
For apk (Alpine...):
# This is a departure from the normal Alpine naming
# scheme, which uses py2- and py3- prefixes
sudo apk add python2-dev # for python2.x installs
sudo apk add python3-dev # for python3.x installs
For apt-cyg (Cygwin...):
apt-cyg install python-devel # for python2.x installs
apt-cyg install python3-devel # for python3.x installs
Note: python3-dev does not automatically cover all minor versions of Python 3; if you are using e.g. Python 3.8, you may need to install python3.8-dev.
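Once the right package is in place, a quick check that the header actually landed (a sketch for Debian/Ubuntu-style systems; paths on other distros differ) is:
dpkg -s python3-dev | grep Status      # apt-based systems only
python3-config --includes              # prints the -I flags gcc needs
find /usr/include -name Python.h
If python3-config prints an -I path and Python.h shows up under it, the compiler should be able to find the header.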
On Ubuntu, I was running Python 3 and I had to install
sudo apt-get install python3-dev
If you want to use a version of Python that is not linked to python3, install the associated python3.x-dev package. For example:
sudo apt-get install python3.5-dev
For Python 3.7 on Ubuntu in particular, I needed:
sudo apt install libpython3.7-dev
I think at some point the package names were changed from pythonm.n-dev to this libpythonm.n-dev form.
For Python 3.6, and 3.8 through 3.10 (and counting…), it is similar:
sudo apt install libpython3.6-dev 
sudo apt install libpython3.8-dev 
sudo apt install libpython3.9-dev
sudo apt install libpython3.10-dev
There are two things you have to do.
First, install the development package for Python; on Debian/Ubuntu/Mint this is done with:
sudo apt-get install python-dev
Second, the include files are not on the include path by default, nor is the Python library linked against the executable by default. You need to add these flags (adjust the Python version accordingly):
-I/usr/include/python2.7 -lpython2.7
In other words your compile command ought to be:
gcc -Wall -I/usr/include/python2.7 -lpython2.7 utilsmodule.c -o Utilc
On Fedora, run this for Python 2:
sudo dnf install python2-devel
and for Python 3:
sudo dnf install python3-devel
If you are using tox to run tests on multiple versions of Python, you may need to install the Python dev libraries for each version of Python you are testing on.
sudo apt-get install python2.6-dev
sudo apt-get install python2.7-dev
etc.
Make sure that the Python dev files come with your OS.
You should not hard code the library and include paths. Instead, use pkg-config, which will output the correct options for your specific system:
$ pkg-config --cflags --libs python2
-I/usr/include/python2.7 -lpython2.7
You may add it to your gcc line:
gcc -Wall utilsmodule.c -o Utilc $(pkg-config --cflags --libs python2)
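The same approach works for Python 3; the only wrinkle worth flagging (a detail of newer CPython packaging, so treat the module names as something to verify on your system) is that from Python 3.8 on, the link flags live in a separate python3-embed module:
pkg-config --cflags python3                  # include flags only
pkg-config --cflags --libs python3-embed     # include + link flags on 3.8+
gcc -Wall utilsmodule.c -o Utilc $(pkg-config --cflags --libs python3-embed)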
For me, changing it to this worked:
#include <python2.7/Python.h>
I found the file at /usr/include/python2.7/Python.h, and since /usr/include is already on the include path, python2.7/Python.h is sufficient.
You could also add the include path from the command line instead: gcc -I/usr/include/python2.7 (thanks @erm3nda).
Solution for Cygwin
You need to install the package python2-devel or python3-devel, depending on the Python version you're using.
You can quickly install it using the 32-bit or 64-bit setup.exe (depending on your installation) from Cygwin.com.
Example (modify setup.exe's filename and Python's major version if you need):
$ setup.exe -q --packages=python3-devel
You can also check my other answer for a few more options to install Cygwin's packages from the command-line.
On the AWS API (CentOS) it's:
yum install python27-devel
On an AWS EC2 instance running python34:
sudo yum install python34-devel
If you use a virtualenv with Python 3.6 (the bleeding-edge version right now), be sure to install the matching python3.6-dev package with sudo apt-get install python3.6-dev; otherwise running sudo apt-get install python3-dev will install python3-dev 3.3.3-1, which won't solve the issue.
In my case, what fixed it in Ubuntu was to install the packages libpython-all-dev (or libpython3-all-dev if you use Python 3).
It's not exactly the same situation, but it also worked for me, and now I can use SWIG with Python 3.5.
I was trying to compile:
gcc -fPIC -c existe.c existe_wrap.c -I /usr/include/python3.5m/
With Python 2.7 it works fine, but not with my version 3.5:
existe_wrap.c:147:21: fatal error: Python.h: No such file or directory
compilation terminated.
After running this on my Ubuntu 16.04 installation:
sudo apt-get install python3-dev # for python3.x installs
I can now compile against Python 3.5 without problems:
gcc -fPIC -c existe.c existe_wrap.c -I /usr/include/python3.5m/
I managed to solve this issue and generate the .so file in one command:
gcc -shared -o UtilcS.so -fPIC -I/usr/include/python2.7 -lpython2.7 utilsmodule.c
I also encountered this error when I was installing CoolProp on Ubuntu.
For Ubuntu 16.04 with Python 3.6:
sudo apt-get install python3.6-dev
If this doesn't work, try installing/updating gcc itself:
sudo apt-get install gcc
Try apt-file. It is hard to remember which package a missing file lives in, and apt-file is generic and works for any package's files.
For example:
root@ubuntu234:~/auto# apt-file search --regexp '/Python.h$'
pypy-dev: /usr/lib/pypy/include/Python.h
python2.7-dbg: /usr/include/python2.7_d/Python.h
python2.7-dev: /usr/include/python2.7/Python.h
python3.2-dbg: /usr/include/python3.2dmu/Python.h
python3.2-dev: /usr/include/python3.2mu/Python.h
root@ubuntu234:~/auto#
Now you can make an educated guess as to which one to choose.
This problem can also arise when you have several Python versions installed and you use a pip that is not the system one. In that case, the non-system pip won't find the right version of the Python headers.
It happened to me when trying to pip install a package for a Python bundled with an application. Since it was not the system's Python, apt install pythonXX-dev didn't help.
In that case, the solution is to find the right Python header:
find / -iname 'Python.h'
In the output, you will see system python headers, and hopefully the one you are looking for, for example:
/usr/include/python3.7m/Python.h
/usr/include/python3.6m/Python.h
/home/ubuntu/workspace/blender-git/lib/linux_centos7_x86_64/python/include/python3.7m/Python.h
/home/ubuntu/miniconda3/pkgs/python-3.8.5-h7579374_1/include/python3.8/Python.h
/home/ubuntu/miniconda3/pkgs/python-3.7.0-h6e4f718_3/include/python3.7m/Python.h
/home/ubuntu/miniconda3/include/python3.8/Python.h
/home/ubuntu/miniconda3/envs/sim/include/python3.7m/Python.h
/home/ubuntu/src/blender-deps/Python-3.7.7/Include/Python.h
/opt/lib/python-3.7.7/include/python3.7m/Python.h
Then you can set a compiler flag that gcc will pick up when it is invoked by pip.
Mine was /home/ubuntu/src/blender-deps/Python-3.7.7/Include/Python.h, so I did:
export CPPFLAGS=-I/home/ubuntu/src/blender-deps/Python-3.7.7/Include
pip install <package>
For CentOS 7:
sudo yum install python36u-devel
I followed the instructions here for installing python3.6 on several VMs: https://www.digitalocean.com/community/tutorials/how-to-install-python-3-and-set-up-a-local-programming-environment-on-centos-7
and was then able to build mod_wsgi and get it working with a python3.6 virtualenv.
For the OpenSuse comrades out there:
sudo zypper install python3-devel
Here is yet another solution, because none of the other solutions worked for me. For reference, I was trying to pip install something on an Amazon Linux AMI base Docker image for Python 3.6.
Non-docker solution:
# Install python3-devel like everyone says
yum -y install python36-devel.x86_64
# Find the install directory of `Python.h`
rpm -ql python36-devel.x86_64 | grep -i "Python.h"
# Forcefully add it to your include path
C_INCLUDE_PATH='/usr/include/python3.6m'
export C_INCLUDE_PATH
Docker solution:
# Install python3-devel like everyone says
RUN yum -y install python36-devel.x86_64
# Find the install directory of `Python.h`, for me it was /usr/include/python3.6m
RUN rpm -ql python36-devel.x86_64 | grep -i "Python.h" && fake_command_so_docker_fails_and_shows_us_the_output
# Since the previous command contains a purposeful error, remove it before the next run
# Forcefully add it to your include path
ARG C_INCLUDE_PATH='/usr/include/python3.6m'
NOTE: If you're getting the error when compiling C++, use CPLUS_INCLUDE_PATH.
Alternatively, you may prefer to use another Docker image. For example, I was trying to install asyncpg~=0.24.0 on python:3.9.4-slim, which generated the same error as you saw. However, when I updated the image to python:3, it worked fine.
If you're using Python 3.6 on Amazon Linux (based on RHEL, but the RHEL answers given here didn't work):
sudo yum install python36-devel
You must install the Python development files on your operating system if the Python provided with your operating system does not come with them. The many answers on this question show the myriad ways this can be achieved on different systems.
When you have done so, the problem is telling the compiler where they're located and how to compile against them. Python comes with a program called python-config. For compilation, you need the --includes output and for linking a program against the Python library (embedding Python into your program) the --ldflags output. Example:
gcc -c mypythonprogram.c $(python3-config --includes)
gcc -o program mypythonprogram.o $(python3-config --ldflags)
The python-config program may be named after the Python version; on Debian and Ubuntu, for example, it can be called python3-config or python3.6-config.
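A related caveat, not from the answer above but worth adding with a hedge: since Python 3.8, python3-config only emits -lpython3.x when given the --embed flag, so for embedding on a 3.8+ system the link step would look more like:
gcc -c mypythonprogram.c $(python3-config --includes)
gcc -o program mypythonprogram.o $(python3-config --ldflags --embed)
On older versions the plain --ldflags call shown above is sufficient.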
Sure, python-dev or libpython-all-dev is the first thing to install (via apt), but if that doesn't help, as was my case, I advise you to install the Foreign Function Interface packages with sudo apt-get install libffi-dev and sudo pip install cffi.
This should help especially if the error comes from c/_cffi_backend.c:2:20: fatal error: Python.h: No such file or directory.
Try locating your Python.h:
gemfield@ThinkPad-X1C:~$ locate Python.h
/home/gemfield/anaconda3/include/python3.7m/Python.h
/home/gemfield/anaconda3/pkgs/python-3.7.6-h0371630_2/include/python3.7m/Python.h
/usr/include/python3.8/Python.h
If it is not found, install python-dev or python3-dev; otherwise, pass the correct header path to the compiler:
g++ -I/usr/include/python3.8 ...
I am on Ubuntu and had installed all the packages recommended in other answers:
sudo apt-get install python-dev # for python2.x installs
sudo apt-get install python3-dev # for python3.x installs
I still had the problem with the line:
#include "Python.h"
(and some others). Editing the include paths by hand is possible, but it is bad practice.
The fix that worked for me comes from the Cython source code: the Makefile below, with which the file compiles without errors.
Change PYTHON to the Python version you have (python or python3) and FILE to your C file name. The makefile must be named Makefile. Then run it with:
make all
# Makefile for creating our standalone Cython program
FILE := file.c
PYTHON := python3
PYVERSION := $(shell $(PYTHON) -c "import sys; print(sys.version[:3])")
PYPREFIX := $(shell $(PYTHON) -c "import sys; print(sys.prefix)")
INCDIR := $(shell $(PYTHON) -c "from distutils import sysconfig; print(sysconfig.get_python_inc())")
PLATINCDIR := $(shell $(PYTHON) -c "from distutils import sysconfig; print(sysconfig.get_python_inc(plat_specific=True))")
LIBDIR1 := $(shell $(PYTHON) -c "from distutils import sysconfig; print(sysconfig.get_config_var('LIBDIR'))")
LIBDIR2 := $(shell $(PYTHON) -c "from distutils import sysconfig; print(sysconfig.get_config_var('LIBPL'))")
PYLIB := $(shell $(PYTHON) -c "from distutils import sysconfig; print(sysconfig.get_config_var('LIBRARY')[3:-2])")
CC := $(shell $(PYTHON) -c "import distutils.sysconfig; print(distutils.sysconfig.get_config_var('CC'))")
LINKCC := $(shell $(PYTHON) -c "import distutils.sysconfig; print(distutils.sysconfig.get_config_var('LINKCC'))")
LINKFORSHARED := $(shell $(PYTHON) -c "import distutils.sysconfig; print(distutils.sysconfig.get_config_var('LINKFORSHARED'))")
LIBS := $(shell $(PYTHON) -c "import distutils.sysconfig; print(distutils.sysconfig.get_config_var('LIBS'))")
SYSLIBS := $(shell $(PYTHON) -c "import distutils.sysconfig; print(distutils.sysconfig.get_config_var('SYSLIBS'))")

.PHONY: paths all clean test

paths:
	@echo "PYTHON=$(PYTHON)"
	@echo "PYVERSION=$(PYVERSION)"
	@echo "PYPREFIX=$(PYPREFIX)"
	@echo "INCDIR=$(INCDIR)"
	@echo "PLATINCDIR=$(PLATINCDIR)"
	@echo "LIBDIR1=$(LIBDIR1)"
	@echo "LIBDIR2=$(LIBDIR2)"
	@echo "PYLIB=$(PYLIB)"
	@echo "CC=$(CC)"
	@echo "LINKCC=$(LINKCC)"
	@echo "LINKFORSHARED=$(LINKFORSHARED)"
	@echo "LIBS=$(LIBS)"
	@echo "SYSLIBS=$(SYSLIBS)"

$(FILE:.c=): $(FILE:.c=.o)
	$(LINKCC) -o $@ $^ -L$(LIBDIR1) -L$(LIBDIR2) -l$(PYLIB) $(LIBS) $(SYSLIBS) $(LINKFORSHARED)

$(FILE:.c=.o): $(FILE)
	$(CC) -c $^ -I$(INCDIR) -I$(PLATINCDIR)

all: $(FILE:.c=)
This error occurred when I attempted to install ctds on CentOS 7 with Python 3.6. I tried all the tricks mentioned here, including yum install python34-devel. The problem was that Python.h could be found in /usr/include/python3.4m but not in /usr/include/python3.6m. I tried using --global-option to point at that include dir (pip3.6 install --global-option=build_ext --global-option="--include-dirs=/usr/include/python3.4m" ctds), which only led to an "lpython3.6m not found" error when linking ctds.
Finally, what worked was fixing the development environment for Python 3.6 itself, so that the correct includes and libs were present:
yum -y install https://dl.iuscommunity.org/pub/ius/stable/CentOS/7/x86_64/python36u-libs-3.6.3-1.ius.centos7.x86_64.rpm
Python.h needs to be on gcc's include path. Whichever version of Python is used, for example 3.6, it will typically be at /usr/include/python3.6m/Python.h.
Sometimes, even after installing python-dev, the error persists.
In that case, check whether the error is actually gcc being missing.
First install the dev package as described in https://stackoverflow.com/a/21530768/8687063, then install gcc:
For apt (Ubuntu, Debian...):
sudo apt-get install gcc
For yum (CentOS, RHEL...):
sudo yum install gcc
For dnf (Fedora...):
sudo dnf install gcc
For zypper (openSUSE...):
sudo zypper in gcc
For apk (Alpine...):
sudo apk add gcc
This often appears when you have removed python3.5 and installed python3.6.
In that case, using python3 (where python3 -V reports python3.6) to install packages that still require the python3.5 headers will produce this error.
Resolve it by installing the python3.6-dev package.
This means that Python.h isn't in your compiler's default include paths. Have you installed it system-wide or locally? What's your OS?
You could use the -I<path> flag to specify an additional directory where your compiler should look for headers. You will probably have to follow up with -L<path> so that gcc can find the library you'll be linking with using -l<name>.

arm-linux-gnueabi-g++: command not found

I am trying to compile C++ code for the ARM architecture. I don't know the exact processor name yet (I'm waiting for information from the hardware people); I only know it is some ARM part.
The problem I have is this.
I use the following command to compile my source files for the ARM architecture:
arm-linux-gnueabi-g++ myApp.cpp -g -Wall -o myApp
and also
arm-linux-gnueabi-gcc myApp.cpp -g -Wall -lstdc++ -o myApp
and get the output:
-bash: arm-linux-gnueabi-g++: command not found
and
-bash: arm-linux-gnueabi-gcc: command not found
On the Linux machine I am using, I am not sure whether the ARM gcc/g++ packages are installed.
These return results:
locate arm-none-linux-gnueabi-
locate arm-linux-gnueabihf-g++
locate arm-linux-gnueabihf-gcc
while these return nothing:
locate arm-linux-gnueabi-g++
locate arm-linux-gnueabi-gcc
I am not allowed to experiment and install ARM packages myself, because this Linux box is a server that many developers are attached to.
Setting PATH in the shell:
PATH=$PATH:/opt/eds/x86_64/13.1-2/embedded/ds-5/bin/arm-linux-gnueabihf-g++
(or the gcc equivalent) doesn't solve the issue.
Setting it in:
~/.bashrc
doesn't solve it either, because additional problems occur and I then cannot connect to the Linux server.
Thanks in advance for any help.
For kernel or U-Boot cross compiling, the commands below are enough:
sudo apt-get install -y gcc-arm-linux-gnueabihf
sudo apt-get install -y libncurses-dev
sudo apt-get install -y libqt4-dev pkg-config
sudo apt-get install -y u-boot-tools
sudo apt-get install -y device-tree-compiler
but for C++ cross compiling you should also install g++ with the command below:
sudo apt-get install g++-arm-linux-gnueabihf
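Since locate already shows the hard-float (gnueabihf) toolchain on the server, a sketch of using it, assuming the target really is a hard-float ARM part (something the hardware people would have to confirm), is:
arm-linux-gnueabihf-g++ myApp.cpp -g -Wall -o myApp
file myApp    # should report an ARM ELF executable
If the target turns out to be soft-float instead, the gcc-arm-linux-gnueabi / g++-arm-linux-gnueabi packages are the ones to ask the server admin to install.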

"pkg-config script could not be found" on OSX

I am trying to install some software on my mac; however I keep receiving the same error:
configure: error: The pkg-config script could not be found or is too old. Make sure it
is in your PATH or set the PKG_CONFIG environment variable to the full
path to pkg-config.
Alternatively, you may set the environment variables XMEDCON_GLIB_CFLAGS
and XMEDCON_GLIB_LIBS to avoid the need to call pkg-config.
See the pkg-config man page for more details.
To get pkg-config, see <http://pkg-config.freedesktop.org/>.
See `config.log' for more details
I am not quite sure how to go about adding pkg-config to the PATH. I have seen online (see link below) that I should add the following:
Link showing how to direct PATH variable
export PATH=$PATH:/opt/local/bin # Fixed typo as mentioned in comment
which is where I have placed pkg-config. I still keep getting the error, though, every time I try to configure the files using ./configure. Any help would be much appreciated!
For Ubuntu/Debian OS,
apt-get install -y pkg-config
For Redhat/Yum OS,
yum install -y pkgconfig
For Archlinux OS,
pacman -S pkgconf
For me (on OS X), the problem was solved by doing this:
brew install pkg-config
The answer to my question (after several Google searches) turned out to be the following:
$ curl https://pkgconfig.freedesktop.org/releases/pkg-config-0.29.tar.gz -o pkgconfig.tgz
$ tar -zxf pkgconfig.tgz && cd pkg-config-0.29
$ ./configure && make install
from the following link: Link showing above
Thanks to everyone for their comments, and sorry for my Linux/OS X ignorance! Doing this fixed my issue, as mentioned above.
If you get this error:
configure: error: Either a previously installed pkg-config or "glib-2.0 >= 2.16" could not be found. Please set GLIB_CFLAGS and GLIB_LIBS to the correct values or pass --with-internal-glib to configure to use the bundled copy.
then instead of running:
$ ./configure && make install
run:
./configure --with-internal-glib && make install
Try:
which pkg-config
If the output is empty, then run:
brew install pkg-config
or, to install Homebrew first: ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install)" < /dev/null 2> /dev/null
MacOS users
Unfortunately, pkg-config does not come with OS X by default. Here are some notes on how to compile from source. It assumes that you have Xcode installed.
1) Download and extract
curl -O http://pkgconfig.freedesktop.org/releases/pkg-config-0.28.tar.gz
tar xfz pkg-config-0.28.tar.gz
2) Configure and install
cd pkg-config-0.28
setenv CC /usr/bin/cc (for tcsh)
export CC=/usr/bin/cc (for bash)
2a) If you have super-user powers
./configure --prefix=/usr/local CC=$CC --with-internal-glib
make
sudo make install
2b) if not
./configure --prefix=$HOME/someplace/in/my/path CC=$CC --with-internal-glib
make
make install
Source: https://opensource.ncsa.illinois.edu/confluence/display/DESDM/Installing+pkg-config+from+source+for+OSX
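Whichever route you take, a quick way to confirm that configure will find pkg-config is below; /opt/local/bin is just the example path from the question, and the pkgconfig directory shown is a typical location rather than a guarantee:
which pkg-config
pkg-config --version
export PATH="$PATH:/opt/local/bin"
export PKG_CONFIG_PATH="/opt/local/lib/pkgconfig:$PKG_CONFIG_PATH"   # only if the .pc files live in a non-standard prefix
If which pkg-config prints a path and --version prints a number, ./configure should stop complaining.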

bash - make command not found

I have created a make file named Makefile on my Linux EC2 server.
all: a b
a: daemon.cpp dictionary_exclude.cpp
	g++ -o a daemon.cpp dictionary_exclude.cpp -lpthread -std=c++0x -L.
b: user_main.cpp client.cpp
	g++ -o b user_main.cpp client.cpp
I can run each of these g++ commands independently with success.
But when I execute
make
make -f Makefile
it says: -bash: make: command not found
Any idea? I can see that the manual for make is available through man make.
Please execute the following command to install make on your system (this is for a yum-based distribution such as Amazon Linux or CentOS; on Debian/Ubuntu the equivalent is the build-essential package):
sudo yum install make
In CentOS or Red Hat, try this:
yum groupinstall "Development Tools"
It might be that you have not installed binutils (http://en.wikipedia.org/wiki/GNU_Binutils) or that your PATH variable does not include the location of binutils.
sudo apt-get install build-essential on Ubuntu 16 worked for me.
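After installing, a quick check that make is present and visible to the shell (standard commands, nothing specific to this Makefile) is:
which make
make --version
make -n    # dry run in the Makefile's directory; prints the g++ commands without executing them
If which make still prints nothing, check that /usr/bin is on your PATH.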
