Boost not finding locally installed ICU - Linux

I am trying to build Boost 1.70.0.
I've already compiled ICU4C and installed it locally under $HOME/usr. I also had it build icu-config to help with detection.
I have $HOME/usr/lib in my LD_LIBRARY_PATH environment variable, and I have $HOME/usr/bin in my PATH.
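For reference, that setup amounts to the following (paths exactly as described above):
export PATH=$HOME/usr/bin:$PATH
export LD_LIBRARY_PATH=$HOME/usr/lib:$LD_LIBRARY_PATH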
I am running bootstrap this way:
./bootstrap.sh --with-icu=$HOME/usr/ --prefix=$HOME/usr/
The output suggests that ICU will be enabled, and the supplied path is saved in the bjam configuration.
Later, when running
./b2 --reconfigure
it says ICU will not be used:
- bzip2 : yes
- lzma : no
- zstd : no
- iconv (libc) : yes
- icu : no
- icu (lib64) : no
- native-atomic-int32-supported : yes
Although not critical, it is interesting that it doesn't find lzma either, even though it is also installed locally.
I am somewhat lost, with no idea what I am missing. I have recompiled ICU a couple of times and tried different approaches I found online, but nothing works. Any help is welcome.
config.log shows:
In file included from /home/ambs/usr/include/unicode/uversion.h:30:0,
from libs/regex/build/has_icu_test.cpp:12:
/home/ambs/usr/include/unicode/umachine.h:340:13: error: 'char16_t' does not name a type
typedef char16_t UChar;
^
In file included from libs/regex/build/has_icu_test.cpp:12:0:
/home/ambs/usr/include/unicode/uversion.h:173:55: error: 'UChar' does not name a type
u_versionFromUString(UVersionInfo versionArray, const UChar *versionString);
^
as well as other similar errors. I am (or at least think I am) compiling with the same compiler, which also isn't in a standard path but is in my PATH.

While I still don't understand why bjam isn't detecting the compiler correctly, if I use:
./bjam cxxflags='-std=c++11'
then not only is ICU found, but C++11 features are detected as well.
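For completeness, a minimal sketch of the full invocation this leads to, assuming the same local prefix as above (b2 and bjam are the same build engine in Boost 1.70):
./b2 --reconfigure cxxflags='-std=c++11' --prefix=$HOME/usr install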

Related

How do I fix "ld: error: unable to find library -lgcc" when cross-compiling rust to android?

I'm trying to get rust working on android. However, when I try to cross-compile to android I get the following linking error:
$ cargo build --target=arm-linux-androideabi
Compiling <project> v0.1.0 (<project>)
error: linking with `/opt/android-sdk/ndk/23.0.7599858/toolchains/llvm/prebuilt/linux-x86_64/bin/armv7a-linux-androideabi31-clang` failed: exit status: 1
(very long toolchain command from cargo)
ld: error: unable to find library -lgcc
clang-12: error: linker command failed with exit code 1 (use -v to see invocation)
I have installed the NDK and changed the linker in .cargo/config to the Android clang linker. I also tried the standalone toolchains, with the same result. The guide I followed was: https://mozilla.github.io/firefox-browser-architecture/experiments/2017-09-21-rust-on-android.html
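For reference, a minimal sketch of the kind of .cargo/config entry I mean (the target triple is taken from the cargo command above; the linker path is the one from the error output, so it only reflects my own NDK install):
[target.arm-linux-androideabi]
linker = "/opt/android-sdk/ndk/23.0.7599858/toolchains/llvm/prebuilt/linux-x86_64/bin/armv7a-linux-androideabi31-clang"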
Cross-compilation does work when using crate-type = ["rlib"] instead of crate-type = ["cdylib"], but I need an .so file not an .rlib file.
In case it's relevant, I'm using Manjaro Linux.
UPDATE:
I found the following pull request: https://github.com/rust-lang/rust/pull/85806 After switching to NDK 22 it worked. I haven't verified whether the pull request fixes the issue (it probably does).
Without switching to an older NDK version, I found that the workaround provided by ssrlive works for me. Here's their comment:
Fixing build error for NDK 23 and above
find all 4 folders containing the file libunwind.a; on my PC, one of them is
C:\Users\Administrator\AppData\Local\Android\Sdk\ndk\23.1.7779620\toolchains\llvm\prebuilt\windows-x86_64\lib64\clang\12.0.8\lib\linux\x86_64\
and there are more. Create 4 text files named libgcc.a in the same folders,
each with this content:
INPUT(-lunwind)
In macOS, the paths are
~/Library/Android/sdk/ndk/24.0.8215888/toolchains/llvm/prebuilt/darwin-x86_64/lib64/clang/14.0.1/lib/linux/i386/libunwind.a
~/Library/Android/sdk/ndk/24.0.8215888/toolchains/llvm/prebuilt/darwin-x86_64/lib64/clang/14.0.1/lib/linux/arm/libunwind.a
~/Library/Android/sdk/ndk/24.0.8215888/toolchains/llvm/prebuilt/darwin-x86_64/lib64/clang/14.0.1/lib/linux/aarch64/libunwind.a
~/Library/Android/sdk/ndk/24.0.8215888/toolchains/llvm/prebuilt/darwin-x86_64/lib64/clang/14.0.1/lib/linux/x86_64/libunwind.a
In Linux, the paths are
~/Android/Sdk/ndk/24.0.8215888/toolchains/llvm/prebuilt/linux-x86_64/lib64/clang/14.0.1/lib/linux/i386/libunwind.a
~/Android/Sdk/ndk/24.0.8215888/toolchains/llvm/prebuilt/linux-x86_64/lib64/clang/14.0.1/lib/linux/aarch64/libunwind.a
~/Android/Sdk/ndk/24.0.8215888/toolchains/llvm/prebuilt/linux-x86_64/lib64/clang/14.0.1/lib/linux/x86_64/libunwind.a
~/Android/Sdk/ndk/24.0.8215888/toolchains/llvm/prebuilt/linux-x86_64/lib64/clang/14.0.1/lib/linux/arm/libunwind.a
In Windows, the paths are
~/AppData/Local/Android/Sdk/ndk/24.0.8215888/toolchains/llvm/prebuilt/windows-x86_64/lib64/clang/14.0.1/lib/linux/aarch64/libunwind.a
~/AppData/Local/Android/Sdk/ndk/24.0.8215888/toolchains/llvm/prebuilt/windows-x86_64/lib64/clang/14.0.1/lib/linux/arm/libunwind.a
~/AppData/Local/Android/Sdk/ndk/24.0.8215888/toolchains/llvm/prebuilt/windows-x86_64/lib64/clang/14.0.1/lib/linux/i386/libunwind.a
~/AppData/Local/Android/Sdk/ndk/24.0.8215888/toolchains/llvm/prebuilt/windows-x86_64/lib64/clang/14.0.1/lib/linux/x86_64/libunwind.a
Command to create the file on Linux/macOS:
cat << EOF > libgcc.a
INPUT(-lunwind)
EOF
This is of course extremely brittle and not the "right" solution, but the workaround works fine as of 2022-10-12 with ndk version 25.1.8937393.
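On Linux/macOS the whole workaround can be scripted. A rough sketch, assuming ANDROID_NDK_HOME points at the NDK root described above:
# ANDROID_NDK_HOME is assumed to point at e.g. ~/Android/Sdk/ndk/<version>
# drop a libgcc.a linker script next to every libunwind.a shipped with the NDK
for dir in $(find "$ANDROID_NDK_HOME/toolchains/llvm/prebuilt" -name libunwind.a -exec dirname {} \; | sort -u); do
    echo 'INPUT(-lunwind)' > "$dir/libgcc.a"
done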

Building x64 NSIS (using VS2012)

I am in the process of creating a pure x64 version of my application. In order to do that, I also need an x64 installer. I've read online that the NSIS code does support x64, but since they don't distribute an x64 build, I need to build it from source (including all plugins, etc.).
I've been able to build NSIS (v3.0.4) from source for x86 using Python 2.7/SCons 3.1.1/VS 2012/Zlib 1.2.7.
scons ZLIB_W32=C:\Source\zlib-1.2.7
But when I add the TARGET_ARCH=amd64 to the scons command,
scons ZLIB_W32=C:\Source\zlib-1.2.7 TARGET_ARCH=amd64
it doesn't work. Initially it built but didn't link, because zlib was still x86:
C:\Source\zlib-1.2.7\lib\zdll.lib : warning LNK4272: library machine type 'x86' conflicts with target machine type 'x64'
However, after rebuilding zlib1.dll (using VS2012) to be x64 (dumpbin confirms this), I now get an error that it can't find zlib:
scons ZLIB_W32=C:\Source\zlib-1.2.7 TARGET_ARCH=amd64
scons: Reading SConscript files ...
WARNING: VER_PACKED not set, defaulting to 0x03003666!
Delete("nsis-18-Nov-2019.cvs")
Delete(".instdist")
Delete(".test")
Using Microsoft tools configuration (14.2)
Checking for memset requirement... (cached) yes
Checking for memcpy requirement... (cached) yes
Checking for C library gdi32... (cached) yes
Checking for C library user32... (cached) yes
Checking for C library pthread... (cached) no
Checking for C library iconv... (cached) no
Checking for C library shlwapi... (cached) yes
Checking for C library oleaut32... (cached) yes
Checking for C library version... (cached) yes
Checking for C library zdll... no
Checking for C library z... no
zlib (win32) is missing!
Note that I have made sure that the directory structure matches in the x64 build so that the following files exist:
C:\Source\zlib-1.2.7\zlib1.dll
C:\Source\zlib-1.2.7\lib\zlib1.lib
C:\Source\zlib-1.2.7\include\zconf.h
C:\Source\zlib-1.2.7\include\zlib.h
It did occur to me that I'm telling scons to look for x86 zlib (hence the W32 in ZLIB_W32) but I didn't see an option for telling scons to look for x64 zlib in the -h output.
What am I missing?
-UPDATE 1-
I'm making progress but not out of the woods yet. I've identified several issues with my build. #1, I wasn't actually using VS2012 like I thought (I have several versions installed); see above where the scons output says 14.2 (VS2019). Oops. Unfortunately, simply adding MSVC_VERSION=11.0 to my command line didn't fix it; it seems the nsis project isn't passing this along to scons. The only way I could figure out how to do this was to modify the nsis SConstruct file from:
######################################################################
####### Build Environment ###
######################################################################
path = ARGUMENTS.get('PATH', '')
toolset = ARGUMENTS.get('TOOLSET', '')
arch = ARGUMENTS.get('TARGET_ARCH', 'x86')
if toolset and path:
    defenv = Environment(TARGET_ARCH = arch, ENV = {'PATH' : path}, TOOLS = toolset.split(',') + ['zip'])
else:
    if path:
        defenv = Environment(TARGET_ARCH = arch, ENV = {'PATH' : path})
    if toolset:
        defenv = Environment(TARGET_ARCH = arch, TOOLS = toolset.split(',') + ['zip'])
    if not toolset and not path:
        defenv = Environment(TARGET_ARCH = arch)
Export('defenv')
to:
######################################################################
####### Build Environment ###
######################################################################
path = ARGUMENTS.get('PATH', '')
toolset = ARGUMENTS.get('TOOLSET', '')
arch = ARGUMENTS.get('TARGET_ARCH', 'x86')
vs_version = ARGUMENTS.get('MSVC_VERSION', '')
if toolset and path:
    defenv = Environment(TARGET_ARCH = arch, ENV = {'PATH' : path}, TOOLS = toolset.split(',') + ['zip'], MSVC_VERSION = vs_version)
else:
    if path:
        defenv = Environment(TARGET_ARCH = arch, ENV = {'PATH' : path}, MSVC_VERSION = vs_version)
    if toolset:
        defenv = Environment(TARGET_ARCH = arch, TOOLS = toolset.split(',') + ['zip'], MSVC_VERSION = vs_version)
    if not toolset and not path:
        defenv = Environment(TARGET_ARCH = arch, MSVC_VERSION = vs_version)
Export('defenv')
Now my build output correctly identifies as VS2012:
C:\Source\nsis\nsis-code-r7069-NSIS-tags-v304>scons ZLIB_W32=C:\Source\zlib MSVC_VERSION=11.0
scons: Reading SConscript files ...
WARNING: VER_PACKED not set, defaulting to 0x03003666!
Delete("nsis-19-Nov-2019.cvs")
Delete(".instdist")
Delete(".test")
Using Microsoft tools configuration (11.0)
Checking for memset requirement... (cached) yes
<snip>
It seems like there should be a better way to allow this but I'm not familiar enough (yet) with scons or nsis's usage of it to submit a PR to fix this. Or maybe it already exists and I'm just not smart enough to find it (yet).
Now, on to building x64 nsis. Big thanks to @Anders for narrowing down which files are needed a bit, and to @bdbaddog for suggesting to look into the scons config.log, where I found that I had named the zlib.dll lib file incorrectly when I built it for x64. After changing the output lib to
C:\Source\zlib-1.2.7\lib\zdll.lib
It at least tries to build but fails at linking...
-edit to update 1- All of the linking errors that were here previously were a bad rabbit hole I went down yesterday. I'm not sure whether it was caused by scons leaving some temp files somewhere, by VS doing the same, or by me just being careless, but today, when I started trying to track down the linking errors, I couldn't reproduce them (there was a machine reboot yesterday, and who knows what might have been locked/cached until then).
-UPDATE 2-
This update removes all the misleading information from update 1. The current situation is that the x64 NSIS build (using both my zlib-1.2.7-amd64 build AND the official NSIS zlib-1.2.8-amd64 pre-built binaries from the NSIS wiki) fails at linking with the same error:
System.obj : error LNK2019: unresolved external symbol CallProc2 referenced in function Call
build\urelease\System\amd64-unicode\System.dll : fatal error LNK1120: 1 unresolved externals
scons: *** [build\urelease\System\amd64-unicode\System.dll] Error 1120
scons: building terminated because of errors.
So far I'm not having any luck figuring out what CallProc2 is but I think this means zlib is no longer at fault here.
-UPDATE 3-
Thanks to @Anders for the direction here, since I was really stumped on the CallProc2 issue: the CallProc2 function is isolated to the System plugin, and the error is caused by that plugin not building correctly. There is some discussion in the comments below trying to find the cause, but for now I'm just excluding that plugin to get to a working build before coming back to this.
Right now, the scons build completes and produces x64 binaries.
C:\Source\nsis\build\urelease\makensisw\>dumpbin /HEADERS ./makensisw.exe
Microsoft (R) COFF/PE Dumper Version 11.00.61030.0
Copyright (C) Microsoft Corporation. All rights reserved.
Dump of file ./makensisw.exe
PE signature found
File Type: EXECUTABLE IMAGE
FILE HEADER VALUES
8664 machine (x64)
However, the installer build now results in the following error:
scons dist-installer ZLIB_W32=C:\Source\zlib-1.2.8-x64 MSVC_VERSION=11.0
link /nologo /map /subsystem:console,5.01 /OUT:build\urelease\VPatch\Source\GenPat\GenPat.exe /LIBPATH:C:\Source\zlib-1.2.8-x64\lib zdll.lib build\urelease\VPatch\Source\GenPat\adler32.obj build\urelease\VPatch\Source\GenPat\Checksums.obj build\urelease\VPatch\Source\GenPat\ChunkedFile.obj build\urelease\VPatch\Source\GenPat\FileFormat1.obj build\urelease\VPatch\Source\GenPat\GlobalTypes.obj build\urelease\VPatch\Source\GenPat\main.obj build\urelease\VPatch\Source\GenPat\md5.obj build\urelease\VPatch\Source\GenPat\PatchGenerator.obj build\urelease\VPatch\Source\GenPat\POSIXUtil.obj
adler32.obj : error LNK2019: unresolved external symbol _adler32 referenced in function "unsigned long __cdecl Checksum::adler32(unsigned long,unsigned char const *,unsigned int)" (?adler32@Checksum@@YAKKPBEI@Z)
build\urelease\VPatch\Source\GenPat\GenPat.exe : fatal error LNK1120: 1 unresolved externals
scons: *** [build\urelease\VPatch\Source\GenPat\GenPat.exe] Error 1120
Unfortunately, without the installer, I'm kinda lost again. The x64 binaries run but when trying to compile any script (I believe), I get the following error (note that I copied makensis and zlib1.dll into the same directory for this):
C:\Source>makensis.exe ./myapp.nsi
Error: reading stub "C:\Stubs\zlib-amd64-unicode"
Error initalizing CEXEBuild: error setting default stub
-UPDATE 4-
I have finally gotten a build to work successfully after ignoring the System plugin. This works both when compiling zlib from source for x64 and with the prebuilt version from the NSIS wiki (linked in @Anders' answer below):
scons ZLIB_W32=C:\Source\zlib-1.2.8-x64 TARGET_ARCH=amd64 MSVC_VERSION=11.0 SKIPPLUGINS=System
and I was able to deploy to my dev system using (from an admin enabled prompt since I was installing to Program Files):
scons PREFIX="C:\Program Files (x86)\NSIS" install ZLIB_W32=C:\Source\zlib-1.2.8-x64 TARGET_ARCH=amd64 SKIPPLUGINS=System
Unfortunately, you cannot build the NSIS installer from here (dist-installer) because the NSIS installer itself apparently depends on the System plugin. So this scenario isn't sufficient for prepping a build machine unless you are going to build everything from scratch on that machine.
Also, I'm not out of the woods yet because I also use the System plugin for my installer so I need to figure out why that isn't working.
However, I was able to compile and run a pure x64 bare minimum installer package with the above setup:
# name the installer
OutFile "Installer.exe"
# default section start; every NSIS script has at least one section.
Section
# default section end
SectionEnd
The above was straight out of the NSIS docs.
My environment:
cl.exe:
Microsoft (R) C/C++ Optimizing Compiler Version 19.24.28316 for x64
nsis version: 3.06.1
zlib: Zlib-1.2.8-win64-AMD64
I modified SCons/Tool/masm.py:
C:\miniconda3\Lib\site-packages\SCons-4.0.1-py3.7.egg\SCons\Tool>diff -c masm.old.py masm.py
*** masm.old.py Fri Oct 09 22:02:45 2020
--- masm.py Fri Oct 09 21:58:06 2020
***************
*** 61,66 ****
--- 61,68 ----
shared_obj.add_emitter(suffix, SCons.Defaults.SharedObjectEmitter)
env['AS'] = 'ml'
+ if env.get('TARGET_ARCH')=='amd64':
+     env['AS'] = 'ml64'
env['ASFLAGS'] = SCons.Util.CLVar('/nologo')
env['ASPPFLAGS'] = '$ASFLAGS'
env['ASCOM'] = '$AS $ASFLAGS /c /Fo$TARGET $SOURCES'
Also modified nsis-3.06.1-src\Contrib\System\SConscript as follows:
C:\dev\nsis-3.06.1-src\Contrib\System>diff -c SConscript.old SConscript
*** SConscript.old Fri Oct 09 22:06:31 2020
--- SConscript Fri Oct 09 19:17:40 2020
***************
*** 4,9 ****
--- 4,10 ----
Source/Buffers.c
Source/Plugin.c
Source/System.c
+ Source/Call-amd64.S
""")
libs = Split("""
Then C:\dev\nsis-3.06.1-src>scons TARGET_ARCH=amd64 ended successfully.
I have
\zlib1.dll
\include\zconf.h
\include\zlib.h
\lib\libzdll.a
\lib\zdll.lib
\lib\zlib.lib
You can get pre-built zlib for NSIS here.
In the past I have actually used Process Monitor to figure out which file SCons is looking for, that sadly seems to be the fastest way to find out.
Regarding CallProc2. Add "System" to SKIPPLUGINS to get past this issue to make sure everything else compiles. CallProc2 is implemented in the amd64 .S file and used by system.c. You need the 64-bit Microsoft assembler detected correctly to build the .S file.
If you only need to compile NSIS once then you can cheat; Open a Visual Studio command prompt and execute ml64.exe /c Call-amd64.S and then copy the .obj file to the same build directory as system.obj and run Scons again.
To do it properly you need to investigate why Scons is not compiling the .S file. Could be a configuration issue or a bug in the related Sconscript file.
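A rough sketch of that cheat, from an x64 Visual Studio command prompt (the source and build paths are guesses based on the output earlier in this thread, so adjust them to your tree):
cd C:\dev\nsis-3.06.1-src\Contrib\System\Source
ml64.exe /c Call-amd64.S
rem copy the object next to System.obj so the linker finds it, then re-run scons
copy Call-amd64.obj C:\dev\nsis-3.06.1-src\build\urelease\System\amd64-unicode\
cd C:\dev\nsis-3.06.1-src
scons TARGET_ARCH=amd64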

Error in attempting to create a new Stack project with GHCJS compiler

I am attempting to set up a new Stack project on NixOS with GHCJS as the compiler following the instructions at http://docs.haskellstack.org/en/stable/ghcjs.html
I have included in my stack.yaml file the following lines of code (all on one line because tab spaces seem to give issues):
# Compiler specifying the GHCJS compiler for this project (using improved base).
compiler: ghcjs-0.2.0.20151230.3_ghc-7.10.2
compiler-check: match-exact
setup-info:
ghcjs: source:
ghcjs-0.2.0.20151230.3_ghc7.10.2:
url: "https://github.com/nrolland/ghcjs/releases/download/v.0.2.0.20151230.3/ghcjs-0.2.0.20151230.3.tar.gz"
and I have retrieved the following error message when I ran stack setup
Could not parse '/home/lorkaan/pandocJS/stack.yaml':
InvalidYaml (Just (YamlParseException {yamlProblem = "mapping values are not allowed in this context", yamlContext = "", yamlProblemMark = YamlMark {yamlIndex = 487, yamlLine = 12, yamlColumn = 17}}))
See https://github.com/commercialhaskell/stack/blob/release/doc/yaml_configuration.md.
Additionally, I tried removing the setup-info field because Stack was complaining about it, leaving my stack.yaml file like:
# Compiler specifying the GHCJS compiler for this project (using improved base).
compiler: ghcjs-0.2.0.20151230.3_ghc-7.10.2
compiler-check: match-exact
which produces this output with the stack setup command:
Warning: /home/lorkaan/pandocJS/stack.yaml: Unrecognized field in ProjectAndConfigMonoid: compiler
Preparing to install GHC to an isolated location.
This will not interfere with any system-level installation.
Already downloaded.
The following executables are missing and must be installed: make
Does anybody have any idea why this would be happening?
The first error is because of a basic syntax error in your YAML configuration. The correct version would be:
setup-info:
  ghcjs:
    source:
      ghcjs-0.2.0.20151230.3_ghc7.10.2:
        url: "https://github.com/nrolland/ghcjs/releases/download/v.0.2.0.20151230.3/ghcjs-0.2.0.20151230.3.tar.gz"
The second error is because of exactly what it says: you are lacking the make utility. You need to use your Linux distribution's package management system to install make. Since I don't know which distribution you are on, I can only recommend simply executing the $ make command and seeing if the environment is smart enough to point out which package it can be found in. Ubuntu typically does that. Then it's only a matter of apt-get install-ing the package, or possibly yum install-ing on e.g. CentOS and Fedora, etc.
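For example (I'm assuming which package manager applies to your system; since the question mentions NixOS, the nix-env line is my guess at the equivalent there):
# Debian/Ubuntu
sudo apt-get install make
# CentOS/Fedora (older releases)
sudo yum install make
# NixOS (assumed attribute path)
nix-env -iA nixos.gnumake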
P.S. questions like yours normally get a downvote for not having shown sufficient effort in diagnosing the problem (or for putting 2 totally separate problems under a single question) but I'm giving you the benefit of the doubt and just hoping you'll be tidier next time.

VAPI problems with GTK+ 3

I'm trying to compile some Vala on Arch Linux, and when I try to include the package gtk+-3.0, it seems GDK and GTK+ 2.0 are being included as well; valac --pkg gtk+-3.0 test.vala gives the following errors:
gdk-2.0.vapi:8.3-8.28: error: `Gdk.Selection' already contains a definition for `convert'
public static void convert (Gdk.Window requestor, Gdk.Atom selection, Gdk.Atom target, uint32 time_);
^^^^^^^^^^^^^^^^^^^^^^^^^^
gdk-3.0.vapi:8.3-8.28: note: previous definition of `convert' was here
public static void convert (Gdk.Window requestor, Gdk.Atom selection, Gdk.Atom target, uint32 time_);
^^^^^^^^^^^^^^^^^^^^^^^^^^
gdk-2.0.vapi:10.3-10.44: error: `Gdk.Selection' already contains a definition for `owner_get'
public static unowned Gdk.Window owner_get (Gdk.Atom selection);
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
gdk-3.0.vapi:10.3-10.44: note: previous definition of `owner_get' was here
public static unowned Gdk.Window owner_get (Gdk.Atom selection);
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
--snip--
Compilation failed: 942 error(s), 0 warning(s)
Is there some way of specifying not to include gtk+-2.0 or of making valac ignore these errors?
Without having access to your source code or build environment (assuming you're not merely typing the valac command directly), it's tough to troubleshoot this. Using a dead-simple test.vala with Vala 0.12.1, it builds fine on my system.
In the past I've seen bad Vala environments due to old versions of Vala (and its support files) lurking around. I recommend uninstalling Vala 0.12.1 completely, then going through /usr for any remnants. An easy and thorough way (although time-consuming) is to do this:
$ find /usr -name "*vala*"
$ find /usr -name "*.vapi"
Remove anything that's obviously not part of another package. (Note that some packages install their own VAPIs, like libgee.) Then reinstall Vala 0.12.1 and see if the problem persists.
What version of vala? I fear it must be something messed up in your distribution. Can you paste the contents of /usr/share/.../gtk+-3.0.deps?
Also try running valac --verbose so that you can see all the VAPIs being loaded. For each VAPI, look at the corresponding .deps file and check whether there is a gdk-2.0 reference lying around.
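A quick sketch of that check, assuming the system VAPIs live under /usr/share/vala/vapi (the exact path may differ on your distribution):
# see which VAPIs valac actually loads
valac --verbose --pkg gtk+-3.0 test.vala 2>&1 | grep -i vapi
# look for a stray gdk-2.0 reference in the deps files
cat /usr/share/vala/vapi/gtk+-3.0.deps
grep -l 'gdk-2.0' /usr/share/vala/vapi/*.deps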

Problem in Cross-Compiling libSDL for MIPS Platform

I was trying to compile libSDL-1.2.14 for my MIPS platform, but it was not successful.
These were the steps I tried:
export PATH=/opt/mips-4.3/bin:$PATH
Went inside the libSDL-1.2.14 source folder.
Ran ./configure --prefix=/usr/local/SDL_Lib --host=mips-linux-gnu
Executed the make command.
This was the error received:
cc1: warning: include location "/usr/include" is unsafe for cross-compilation
./src/audio/dma/SDL_dmaaudio.c: In function 'DMA_WaitAudio':
./src/audio/dma/SDL_dmaaudio.c:167: error: can't find a register in class 'COP3_REGS' while reloading 'asm'
./src/audio/dma/SDL_dmaaudio.c:167: error: 'asm' operand has impossible constraints
make: *** [build/SDL_dmaaudio.lo] Error 1
But then I reconfigured the build by running the following commands:
make clean
./configure --prefix=/usr/local/SDL_Lib --host=mips-linux-gnu CPPFLAGS=-I/opt/mips-4.3/mips-linux-gnu/libc/usr/include/
make
NOTE: /opt/mips-4.3/mips-linux-gnu/libc/usr/include/ is the path where the select.h header for the MIPS platform is located.
It contains the definitions of the macros FD_ZERO and FD_SET.
Still I am getting the same error.
cc1: warning: include location "/usr/include" is unsafe for cross-compilation
./src/audio/dma/SDL_dmaaudio.c: In function 'DMA_WaitAudio':
./src/audio/dma/SDL_dmaaudio.c:167: error: can't find a register in class 'COP3_REGS' while reloading 'asm'
./src/audio/dma/SDL_dmaaudio.c:167: error: 'asm' operand has impossible constraints
make: *** [build/SDL_dmaaudio.lo] Error 1
Please help me with some valuable pointers.
Thanks,
Sen
First, don't set the path to the cross-compiler as the first part of your PATH, set it as last:
export PATH=$PATH:<path to cross-compiler>
It's safer this way. Second, run ./configure --help to get all the options. What that error message would say if it was smarter is the following:
You're trying to cross-compile since you're setting the --host flag
But you're not changing any of the other options for where to find includes and libs for the target environment
I'm going to use /usr/include by default
But that's for the host system which will not work when cross-compiling
Check what other configure options you need to set to tell the configure script where to find the .h files (includes) and the libraries for your target. These usually come with the cross-compiler that you download. Also, you should probably set the CROSS_COMPILE environment variable to the cross-compiler prefix before running configure. The prefix is the part before gcc in a cross-compiler, assuming you're using GCC as your cross-compiler.
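A rough sketch of what such a configure line could look like, assuming the include/lib layout of the mips-4.3 toolchain mentioned in the question (the sysroot paths and the prefix are assumptions, not verified):
export PATH=$PATH:/opt/mips-4.3/bin
export CROSS_COMPILE=mips-linux-gnu-
./configure --prefix=/usr/local/SDL_Lib \
    --host=mips-linux-gnu \
    CC=mips-linux-gnu-gcc \
    CPPFLAGS="-I/opt/mips-4.3/mips-linux-gnu/libc/usr/include" \
    LDFLAGS="-L/opt/mips-4.3/mips-linux-gnu/libc/usr/lib"
make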
