Build SQLCipher on Ubuntu - Linux

Hi, I'm trying to build SQLCipher on Ubuntu (11.10 minimal); on Mac OS X I had no problems.
I followed the instructions from sqlcipher.net. The first step was configure. I tried to execute configure with the following command:
./configure --enable-tempstore=yes CFLAGS="-DSQLITE_HAS_CODEC" LDFLAGS="-lcrypto"
but I got the following error message: "configure: error: C compiler cannot create executables"
In the config.log some lines caught my eye, but I don't know how to fix the problem:
gcc version 4.6.1 (Ubuntu/Linaro 4.6.1-9ubuntu3)
configure:2544: $? = 0
configure:2551: gcc -V >&5
gcc: error: unrecognized option '-V'
gcc: fatal error: no input files
compilation terminated.
configure:2555: $? = 4
configure:2578: checking for C compiler default output file name
configure:2600: gcc -DSQLITE_HAS_CODEC -lcrypto conftest.c >&5
/usr/bin/ld: cannot find -lcrypto
collect2: ld returned 1 exit status
Has anybody successfully built SQLCipher on Ubuntu yet? Regards

You will need to install the GNU compiler toolchain in order to build from source. Execute the following command in a terminal.
$ sudo apt-get install build-essential
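The config.log also shows "/usr/bin/ld: cannot find -lcrypto", so the OpenSSL development headers are very likely missing as well; on Ubuntu they come from the libssl-dev package. A minimal sketch of the full sequence, assuming OpenSSL is the crypto provider you want:
$ sudo apt-get install build-essential libssl-dev
$ ./configure --enable-tempstore=yes CFLAGS="-DSQLITE_HAS_CODEC" LDFLAGS="-lcrypto"
$ make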

Related

Micronucleus does not update

I'm trying to upgrade my micronucleus to upload my code to my Digispark, but when I try to upgrade, this happens:
Building command line tool: micronucleus...
gcc -Ilibrary -O -g -D LINUX -o micronucleus micronucleus.c micronucleus_lib.o littleWire_util.o -static -L/usr/lib/x86_64-linux-gnu -lusb
/usr/bin/ld: cannot find -lusb
collect2: error: ld returned 1 exit status
make: *** [Makefile:61: micronucleus] Error 1
I'm a little confused as to how you've gotten it compiling but not linking, because, at least on Debian-based distributions, the header needed during compilation is provided by the same package that provides the libusb.a it is failing to link against.
If you are on a Debian-based distro, try (re)installing libusb-dev:
sudo apt install libusb-dev
This is what I've built it against locally.
If you have a libusb.a and it's not in /usr/lib/x86_64-linux-gnu, then you'd need a different directory supplied to -L.
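If you're not sure where libusb.a ended up, here is a quick sketch for a Debian-based system: dpkg -L lists the files a package installed, and the reported directory can then be passed to -L when re-running the failing link.
dpkg -L libusb-dev | grep libusb.a
gcc -Ilibrary -O -g -D LINUX -o micronucleus micronucleus.c micronucleus_lib.o littleWire_util.o -static -L/path/from/dpkg/output -lusb
The -L path here is a placeholder; substitute whatever directory the dpkg command reports.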

ld: error: unable to find library -lmysqlclient

I am trying to compile my game program and it's giving me this:
root@vps:/usr/src/Sursa/Server/source/game/src # gmake -j20
linking ../game
ld: error: unable to find library -lmysqlclient
c++: error: linker command failed with exit code 1 (use -v to see invocation)
gmake: *** [Makefile:228: ../game] Error 1
root@vps:/usr/src/Sursa/Server/source/game/src #
This is the library path section of my Makefile:
# mysql
INCDIR += -I../../../extern/mysql
LIBDIR += -L/usr/local/lib/mysq
LIBS += -lmysqlclient -lz -pthread -lm -lssl -lcrypto
### END
You need to update your GCC compiler, since some features of C++14 are not supported by GCC 4.9.
You can search the available packages in FreeBSD using
pkg search <package_name>
and install whatever you want via
pkg install <package_name>
Note: you might need sudo before those commands if your current user is not root.
Finally, if you have problems such as
Fatal error: "some_file"."some_extension" file not found
you can search for the package name via the aforementioned command and install it in order to compile successfully.
For example to mitigate the following error
fatal error: 'boost/intrusive_ptr.hpp' file not found
you can install the boost-libs package.
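Applying the same idea to the original -lmysqlclient error, a rough sketch (the package name below is an assumption; the right one depends on your FreeBSD version and on whether you build against MySQL or MariaDB):
pkg search mysql | grep client
pkg install mysql80-client
Afterwards, check that the -L path in the Makefile points at the directory where libmysqlclient actually landed (typically /usr/local/lib/mysql on FreeBSD).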

nvm: install fails on BSD while building from source

System Info: FreeBSD 11.3-RELEASE-p3, amd64
I have tried using nvm to install node v12.16.2, v10.20.1, and v10.15.3; however, it fails when building from source (no binary is available to pull on BSD) with the same error on all three:
/usr/bin/ld:/usr/home/ifiht/.nvm/.cache/src/node-v12.16.2/files/out/Release/obj.target/tools/v8_gypfiles/libv8_libbase.a: file format not recognized; treating as linker script
/usr/bin/ld:/usr/home/ifiht/.nvm/.cache/src/node-v12.16.2/files/out/Release/obj.target/tools/v8_gypfiles/libv8_libbase.a:1: syntax error
Configure completes successfully:
$>./configure --prefix=/home/ifiht/.nvm/versions/node/v12.16.2
INFO: configure completed successfully
gmake -C out BUILDTYPE=Release V=0
and the last command before failure is:
/usr/bin/clang++ -o /usr/home/ifiht/.nvm/.cache/src/node-v12.16.2/files/out/Release/bytecode_builtins_list_generator -pthread -rdynamic -m64 -Wl,--export-dynamic -Wl,--start-group /usr/home/ifiht/.nvm/.cache/src/node-v12.16.2/files/out/Release/obj.target/bytecode_builtins_list_generator/deps/v8/src/builtins/generate-bytecodes-builtins-list.o /usr/home/ifiht/.nvm/.cache/src/node-v12.16.2/files/out/Release/obj.target/bytecode_builtins_list_generator/deps/v8/src/interpreter/bytecode-operands.o /usr/home/ifiht/.nvm/.cache/src/node-v12.16.2/files/out/Release/obj.target/bytecode_builtins_list_generator/deps/v8/src/interpreter/bytecodes.o /usr/home/ifiht/.nvm/.cache/src/node-v12.16.2/files/out/Release/obj.target/tools/v8_gypfiles/libv8_libbase.a -L/usr/local/lib -lexecinfo -Wl,--end-group
I'm out of troubleshooting ideas. If anyone knows how to enable nvm install verbosity, that would also help; I'm not sure why the linker is trying to read random .a (assembly?) files.

Cabal update fails with unix package configure error

I have a very old Cabal (1.16.0) installed from my distro's repo (Linux Mint). Today, I tried updating it and got:
cabal install cabal-install -v3
...
unix-2.7.2.2 failed during the configure step. The exception was:
ExitFailure 1
zlib-0.6.1.2 failed during the building phase. The exception was:
ExitFailure 1
I also see
checking for suffix of executables...
checking whether we are cross compiling... configure: error: in `/tmp/unix-2.7.2.2-2138/unix-2.7.2.2':
configure: error: cannot run C compiled programs.
If you meant to cross compile, use `--host'.
See `config.log' for more details
sh returned ExitFailure 1
After checking the config.log, I see some errors, but I am not sure whether they are related or whether the configuration actually failed because of them:
Thread model: posix
gcc version 4.8.4 (Ubuntu 4.8.4-2ubuntu1~14.04.3)
configure:2738: $? = 0
configure:2727: gcc -V >&5
gcc: error: unrecognized command line option '-V'
gcc: fatal error: no input files
compilation terminated.
configure:2738: $? = 4
configure:2727: gcc -qversion >&5
gcc: error: unrecognized command line option '-qversion'
gcc: fatal error: no input files
compilation terminated.
configure:2738: $? = 4
configure:2758: checking whether the C compiler works
configure:2780: gcc -Wl,--hash-size=31 -Wl,--reduce-memory-overheads conftest.c >&5
configure:2784: $? = 0
configure:2832: result: yes
configure:2835: checking for C compiler default output file name
configure:2837: result: a.out
configure:2843: checking for suffix of executables
configure:2850: gcc -o conftest -Wl,--hash-size=31 -Wl,--reduce-memory-overheads conftest.c >&5
configure:2854: $? = 0
configure:2876: result:
configure:2898: checking whether we are cross compiling
configure:2906: gcc -o conftest -Wl,--hash-size=31 -Wl,--reduce-memory-overheads conftest.c >&5
This is weird (I tried running the gcc... command manually and it also worked fine). Any help with this error is welcome.
You can install the cabal-install binary for Linux from the Cabal homepage as well: https://www.haskell.org/cabal/download.html
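If you go that route, a rough sketch of installing the prebuilt binary into your user PATH (the archive name is an assumption; use whatever file you actually download from that page, which should unpack to a single cabal executable):
tar xf cabal-install-<version>-x86_64-unknown-linux.tar.gz
mkdir -p ~/.local/bin
cp cabal ~/.local/bin/
export PATH="$HOME/.local/bin:$PATH"
cabal --version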

Clang on Raspberry Pi/Raspbian Error?

When I try to run clang as my C compiler, I get an error (I think a linker error):
Compilation started at Sun Nov 11 14:34:55
make -k
clang -std=c99 -ggdb -o0 -Wall -Werror helloworld.c -o helloworld
clang: warning: unknown platform, assuming -mfloat-abi=soft
/usr/bin/ld: cannot find crt1.o: No such file or directory
/usr/bin/ld: cannot find crti.o: No such file or directory
clang: error: linker command failed with exit code 1 (use -v to see invocation)
make: *** [helloworld] Error 1
Compilation exited abnormally with code 2 at Sun Nov 11 14:34:56
I actually re-installed the Raspbian image thinking the problem would go away, but it persists. The Raspbian version is the latest Raspbian Wheezy hard-float ABI image (2012-10-28).
As you've essentially figured out, the installed clang is unusably broken. If you installed this through a package manager, complain to whoever distributes the package, because they clearly didn't bother testing it.
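If you need a stopgap until the package is fixed: the missing crt1.o/crti.o normally live in a multiarch directory that this clang build isn't searching, and the -B option can point the compiler driver at it. A sketch only, with the path being an assumption for hard-float Raspbian (verify it with the find command first):
find /usr/lib -name crt1.o 2>/dev/null
clang -std=c99 -ggdb -O0 -Wall -Werror -B /usr/lib/arm-linux-gnueabihf helloworld.c -o helloworld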
