Why doesn't a 32-bit .deb package install on 64-bit Ubuntu?

My .deb package, built on 32-bit Ubuntu and containing executables compiled with gcc, won't install on the 64-bit version of the OS (the error message says 'Wrong architecture i386'). This is confusing to me because I thought that, in general, 32-bit software worked on 64-bit hardware, but not vice versa.
Will it be possible for me to produce a .deb file that I can install on a 64-bit OS, using my 32-bit machine? Is it just a matter of using the appropriate compiler flags to produce the executables (and if so, what are they), or is the .deb file itself somehow specific to one processor architecture?

The deb installer is probably refusing to install your package because it was (correctly) labeled with a conflicting Architecture: field. i386 code can be executed on an amd64 machine, but it requires that all the appropriate dependencies (32-bit libraries, etc.) be present. It's better to build separate packages for each architecture.
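For reference, on Ubuntu releases with multiarch support (roughly 12.04 and later), a rough sketch of installing the i386 package together with its 32-bit dependencies looks like this; the .deb file name below is only a placeholder:
sudo dpkg --add-architecture i386     # allow i386 packages alongside amd64
sudo apt-get update
sudo dpkg -i mypackage_1.0_i386.deb   # placeholder name for your package
sudo apt-get -f install               # pull in the missing 32-bit dependencies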
Yes, you can build for 64-bit on your 32-bit machine. It's called cross-compiling, and it requires that you create a build environment for that purpose. To get started, you might want to look up the dpkg-cross and apt-cross tools.
Alternatively, you can just install a virtual machine running a 64-bit OS, and build for your secondary architecture there.
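On the compiler-flag side of the question: with the gcc-multilib package installed, gcc on a 32-bit host can in principle emit 64-bit code via -m64, although the package's control file would still need to declare Architecture: amd64. A minimal sketch (file names are made up):
sudo apt-get install gcc-multilib   # biarch support; pulls in the 64-bit libraries and headers
gcc -m64 -o hello hello.c           # build a 64-bit binary on the 32-bit host
file hello                          # should report 'ELF 64-bit LSB executable'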

The architecture is just a field in the Debian package's control file. By default it is taken from the build machine (essentially from uname). You can override it, but there is an easier way.
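Concretely, it is the Architecture field in the binary package stanza of debian/control; a minimal sketch with a made-up package name:
Package: mytool
Architecture: amd64
Other common values are i386, any (build for whatever architecture the package is being built on or for), and all (architecture-independent).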
In general, most 32-bit programs will run fine on 64-bit. However, unless you have a very old PC, it is also very easy to install a minimal 64-bit Debian in a VirtualBox virtual machine. You probably only need the base system plus build-essential and the development libraries, which does not take much disk space. If you can spare 2 GB of disk space, just install a desktop Debian.
There are more options for cross-compilation, with varying degrees of automation.
I use the VirtualBox method regularly; it is easy and fast.
If you run 64-bit Linux, creating a 32-bit environment is as easy as debootstrap + linux32 + chroot.
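A minimal sketch of that chroot route, assuming a standard Debian mirror and a target directory of your choosing (both are placeholders here):
sudo debootstrap --arch=i386 stable /srv/chroot32 http://deb.debian.org/debian
sudo linux32 chroot /srv/chroot32 /bin/bash   # inside, uname -m reports i686
apt-get install build-essential               # then build the package inside the chroot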

Related

I need a way to change Msys2 configuration to use Arch Linux AUR server mirrors instead of Msys2 ones

Because MSYS2 sucks,
as mentioned above, I need to change its default server mirrors to point to the Arch Linux MinGW-w64 AUR ones, and make those the default.
So when I issue pacman -S mingw-w64-* it will download the package from the Arch Linux repository and not from MSYS2.
I need to use MSYS2 only as a shell.
The MSYS2 MinGW 32/64 builds use DWARF instead of SJLJ as the exception model, and this is a very bad choice: they don't catch exceptions from DLLs that were built with other toolchains, and the application will crash (for example, Firebird 2).
Arch Linux is smart and has chosen SJLJ as the exception model for its MinGW 32/64 builds.
This seems very unlikely to work. pacman for MSYS2 will download Windows PE binaries for your MSYS2 environment. pacman for Arch Linux is going to download Linux ELF binaries. You won't be able to run these on your Windows device.
You may be able to get what you want if you use Windows Subsystem for Linux (WSL).

Building with Mono mkbundle on x86 won't run on x64

I have a .NET application that runs on Linux, using Mono. I want to avoid users having to install Mono, so am using mkbundle. I am running mkbundle on an x86 machine, with the expectation of the resulting binary being able to run on x64 machines:
mkbundle MyApp.exe *.dll -o MyApp
I can then run the resulting application on the build machine with ./MyApp
However, when I copy it to an x64 machine (and make it executable) it won't run, just outputting:
bash: ./MyApp: No such file or directory
If I try ldd I get:
not a dynamic executable
Shouldn't binaries built for x86 run on x64 systems?
I'm rather new to Linux, and it seems x86/x64 isn't quite as straightforward as it is on Windows, as many x64 Linux distributions don't ship with the capability to run 32-bit binaries.
After installing 32-bit libraries on the x64 machine, the x86 code will execute as expected (e.g. on Ubuntu 7.04, apt-get install ia32-libs).
While this works, as I need to target a number of distributions I've decided to just create separate builds for x86 and x64 instead.
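For anyone hitting this on a newer release where ia32-libs no longer exists: file confirms what the binary actually is, and multiarch can pull in a 32-bit runtime. The library list below is only a guess at what a mkbundle'd binary typically needs; let ldd guide you once the 32-bit loader is present:
file ./MyApp                                  # e.g. 'ELF 32-bit LSB executable, Intel 80386'
sudo dpkg --add-architecture i386
sudo apt-get update
sudo apt-get install libc6:i386 zlib1g:i386   # add further :i386 libraries as ldd reports them missing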

make-kpkg not working in Fedora 20

I have been working with the Linux kernel, compiling and inserting modules into my custom kernels. Previously I had Ubuntu, where I worked with my custom kernel, and all the commands for compiling and installing the kernel worked like a charm once I had installed the required libraries.
Now I have switched over to Fedora 20, and I want to install my custom kernel here. For that I installed every kernel tool I could find, namely the Kernel Development and Kernel Tools group installs, along with other libraries: the ia32 libraries (as I am working on a 64-bit OS) and the kernel-devel package. Still I am not able to use the make-kpkg command; it says bash: make-kpkg: command not found....
I googled and tried everything I could.
Can anyone get me out of this trouble?
make-kpkg is a Debian kernel packaging tool. It does not exist on RHEL family distributions, such as Fedora.
Please refer to the Fedora documentation page "Building a custom kernel" for the correct procedure. (I have not reproduced it here as it is rather long, and I'm not sure how far you may have gotten.)
The make-kpkg tool is part of the kernel-package package on Debian systems; it is a Debian tool for producing .deb package files. Ubuntu is based on Debian and has this tool, but Fedora uses a different packaging system (RPM), so make-kpkg is not available on Fedora.
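If the underlying goal is simply "compile my kernel and end up with an installable package", the kernel source tree has its own packaging targets that do not need make-kpkg; a sketch, assuming the kernel is already configured (exact target availability varies a little by kernel version):
make binrpm-pkg   # on Fedora/RHEL: build binary RPMs from the current .config
make deb-pkg      # the Debian/Ubuntu counterpart, if you ever switch back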

Compile for CentOS on Ubuntu

Can I install an older version of gcc/g++ (4.1.3) on the latest Ubuntu (which comes with 4.4.3) and use it to compile a .so that should run on CentOS? The binary compiled with the Ubuntu version of gcc fails to load on CentOS because of missing imports (GLIBC_2.11, ...). I need C++ (including exceptions), so I can't just statically link against glibc, which I have already tried.
Can I install the older gcc without removing the newer one? How do I go about getting the libraries required by the older gcc?
I'm currently developing code in CentOS, but it's such a pain to use. I really want to move to an Ubuntu desktop.
g++-4.1 is available for Ubuntu; just run apt-get install g++-4.1 then run g++-4.1 instead of g++. However, simply using an older compiler may not fix all of your library issues.
Like Joachim Sauer said, your best bet is to do your development on Ubuntu then do the final compilation on CentOS.
Even though you're using C++, static linking should still be an option. (However, you're much better off compiling on CentOS and using dynamic linking.)
Edit: A virtual machine is the most straightforward way to build on CentOS, but if you want to avoid the memory and CPU overhead of running a VM and don't care about differences between Ubuntu's and CentOS's kernels, you can create a subdirectory containing a CentOS or Fedora filesystem and chroot into it to do your builds. This blog posting has the details.
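Whichever route you choose, it is worth checking which glibc symbol versions the resulting .so actually requires, since that is exactly what the missing-imports error above is complaining about. A sketch with made-up file names:
g++-4.1 -fPIC -shared -o libmylib.so mylib.cpp   # build the shared object with the older compiler
objdump -T libmylib.so | grep GLIBC_ | sort -u   # list the glibc symbol versions it depends on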

Cannot run 32-bit apps on 64-bit Linux

I have a very minimal install of Ubuntu 8.04 64-bit.
When I try to run some 32-bit programs, such as my jhead program, I get the message No such file or directory.
I figured it may be a library problem, but when I do:
ldd jhead
instead of a list of libraries it needs, I just get the message not a dynamic application. Same for another old 32-bit app I use.
So it would appear some very important components for running 32 bit apps are not installed. But how do I even determine what these are?
You will need to install the 32-bit library package:
ia32-libs - ia32 shared libraries for use on amd64 and ia64 systems
To do so, execute:
sudo aptitude install ia32-libs
Running ldd against the binary might help you see which library dependencies are successfully resolved.
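To confirm the diagnosis before (or after) installing anything, two quick checks help; the misleading "No such file or directory" message comes from the kernel failing to find the 32-bit loader the binary asks for, not from the binary itself being absent:
file ./jhead                              # should report 'ELF 32-bit LSB executable, Intel 80386'
readelf -l ./jhead | grep interpreter     # shows the requested loader, typically /lib/ld-linux.so.2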
