One of our projects is a cross-platform piece of code. We build it on Windows, Linux, and Solaris/SPARC mostly. Of the three, we deal with Solaris the least; it's a maintenance pain to keep our SPARC box up and running, and Solaris administration is generally not our competency.
A few years back I built a working cross-compiler for SPARC64 on Linux, and that part worked great. What stopped us from going forward was the last part of our build process, which involves building a Solaris package with pkgmk and pkgtrans.
I was never able to find a Linux solution for building pkg files that can be installed on SPARC Solaris -- does anyone know if one exists today?
I have personally used the tools from the Heirloom project: http://heirloom.sourceforge.net/
The idea was to cross-compile for SPARC on a faster Linux machine and build the package there as well.
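For reference, the packaging step with the Heirloom tools mirrors the native Solaris workflow. A minimal sketch, assuming the Heirloom pkgmk/pkgtrans are on your PATH and you already have pkginfo and prototype files; the package name MYPKG and the paths are placeholders:

    # build the package file tree described by the prototype file
    pkgmk -o -d /tmp/pkgbuild -f prototype
    # translate it into a single datastream .pkg file that pkgadd can install on Solaris
    pkgtrans -s /tmp/pkgbuild /tmp/MYPKG.pkg MYPKG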
I don't know that anyone has done the work to build it on Linux, but Sun has released the pkgmk sources as part of the OpenSolaris source base.
See https://hg.java.net/hg/solaris~on-src/file/tip/usr/src/cmd/svr4pkg/ for the source to the entire suite of SVR4 pkg* commands, though it may have dependencies on other libraries as well.
Related
I'm relatively new to coding and building apps, and I want to contribute to PowerPC Linux via packages. So I'm trying to port apps and tools over and then package them for different distros such as Ubuntu and Void. But as it stands with Ubuntu MATE 16.04, basically everything is out of date (LLVM, Clang, etc.), so I'm having to build things by hand to get other things built, which at the moment is Firefox 61.
No problem! But what tools should I acquire? (The system itself is as up to date as I can make it, by the way; it's the tools that are out of date.) I have the code for LLVM, Clang, and associated tools, but I don't know what other tools I may need in the future. (If I have Clang, do I need GCC? If an app calls for GCC in its build process, can I switch it to Clang instead? What tools do I need for Python? Or Rust? What are your general recommendations for languages I should keep up to date?)
For note, I'm building on-platform on a PowerBook G4 (A1138). It's a little slow, but it's not that bad overall. I know I could cross-compile, but something doesn't sound right about building on x86 for PowerPC; it feels like I'm gimping the platform by doing that.
Anyways, all the info I can get would be helpful! Thanks!
There are clang-5.0 and clang-6.0 packages for powerpc in the Ubuntu repository.
The PPA of the "Ubuntu Toolchain Uploads (restricted)" team has a very recent toolchain for powerpc on 16.04 too.
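As a hedged example of pulling those in on 16.04 (the PPA path ppa:ubuntu-toolchain-r/test and the exact package names are assumptions; check what is actually published for powerpc):

    # optional: add the toolchain PPA for newer GCC releases
    sudo add-apt-repository ppa:ubuntu-toolchain-r/test
    sudo apt-get update
    # install the clang package already in the archive
    sudo apt-get install clang-6.0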
Background:
I'm using Qt and have Visual Studio 2012 as my IDE (using the Qt plugin for Visual Studio).
The whole project is finally done. However, due to my .NET background I have no experience deploying a project so it can be run on Linux.
Question:
Does anyone know how to deploy a Qt project made in Visual Studio to Linux?
You should install Linux and prepare a Qt development environment on it. You can then copy your project there, compile it, and see the results in the real environment. This way you can easily cope with the minor differences that come up when porting from one OS to another.
So don't think of cross-compiling your app for Linux on Windows. From a complexity point of view, I think setting up a Linux machine (VM or not) and the necessary Qt environment is a whole lot simpler than cross-compiling and bug hunting afterwards. After all, you will need a real target environment to finally test your application.
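Once the Linux environment is set up, the build itself is just the standard qmake workflow; a minimal sketch, with the project file name as a placeholder:

    qmake myproject.pro   # generate a Makefile from the Qt project file
    make                  # compile and link
    ./myproject           # run the result in the real target environment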
Before you can deploy something you have to compile it for that platform, and here you have two main choices: either you cross-compile, which means you compile on Windows using a set of tools so that your software is built to run on Linux, or you get a Linux machine, copy your entire project over, and let Qt for Linux do the magic.
Once you have a working binary compiled on Linux (or for Linux), then you can start thinking about deployment.
If you really want to be fully Linux-compatible and do things "the Linux way", you should distribute your source code, prepared with tools like automake, so that Linux users can compile it on any Linux version.
If you do not want to release your source code, you can technically distribute binaries without it (check that your licenses allow this), but be aware that there is no single standard on Linux for distributing binary packages; there are at least two main packaging formats, the Ubuntu/Debian style (.deb) and the Red Hat (and friends) style (.rpm).
You are going to find plenty of documentation about all of this, from cross-compiling to automake, and of course about building Debian packages and Red Hat RPM packages.
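As a rough illustration of the Debian-style route, a binary .deb can be assembled by hand with dpkg-deb; this is only a sketch, and the package name, paths, and control fields are placeholders:

    # lay out the package tree and a minimal control file
    mkdir -p myapp_1.0/DEBIAN myapp_1.0/usr/bin
    cp myapp myapp_1.0/usr/bin/
    cat > myapp_1.0/DEBIAN/control <<EOF
    Package: myapp
    Version: 1.0
    Architecture: amd64
    Maintainer: Your Name <you@example.com>
    Description: Qt application built on Linux
    EOF
    # produce myapp_1.0.deb
    dpkg-deb --build myapp_1.0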
I have developed a small application in Qt Creator on Ubuntu 12.04 which I want to run on any other Linux distro (mostly different versions of CentOS and Ubuntu), just like any portable application on Windows does.
I want to be able to simply share the application's binary file and have it run.
I am able to do this successfully on Windows by building the project in Qt Creator, putting the required libraries in the application directory, and then transferring everything to other Windows systems.
I searched all over and found out that I should be trying to build the project with LSB (Linux Standard Base) compatibility so that it runs on other Linux distros. Is that the right way to do this?
I am very new to Qt and also to Linux (I don't know much shell scripting).
Thus, I don't know how I should proceed to make the application LSB compliant.
I have referred to the following links:
Distributing Qt-based binaries on Linux and
Deploying Qt applications on Linux, but have not been able to understand what I am supposed to do.
I also found this question here, which describes a situation very similar to mine, but because I am a novice, I don't know how I should go about it.
Moreover, considering that the first two articles were written six years back, shouldn't there be a simpler way to deploy Qt apps on the Linux platform by now?
I also saw something about static linking; is that the way to go?
Isn't there a way by which all of this can be done through Qt Creator itself?
If there is no hope of creating a portable Qt application for Linux, then is there a way, say a shell script or something, that would combine all the steps required to compile the Qt project on another computer and run it? Say, download the Qt SDK if not present, run qmake and make, and then launch the newly compiled application, so that the user can run the program just by running one script.
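To illustrate the kind of script I have in mind, here is a very rough sketch; the package names and project file name are just guesses on my part and would differ per distro:

    #!/bin/sh
    set -e
    # install the Qt build tools if qmake is missing (package names vary by distro)
    command -v qmake >/dev/null 2>&1 || sudo apt-get install -y build-essential libqt4-dev
    qmake myapp.pro   # generate the Makefile
    make              # compile
    ./myapp           # launch the freshly built application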
Your problem here is not the Linux Standard Base, but rather the presence or absence of the specific version of Qt you need (or a later one).
Exactly as on a Windows machine, a user may have any version of Qt installed, or may not have it at all. On Windows it is easier to check for the presence of a certain version of Qt than it is on Linux, so it is easier to write install tools that automate the experience.
To solve your problem there are a few ways:
Inform the user that your program requires a certain version of Qt or higher, and let the user handle the problem
Learn how to create packages for every distribution you want to target and create specific packages
Use a program like 0Install or Elf Statifier to create a package/executable containing all the necessary libraries.
The latter is similar to what many Windows and Mac programs do (they include every library they need within the installer), but it is not the preferred way on Linux, which relies heavily on shared libraries.
Making a binary application compatible with any other Linux distro is practically impossible, since you will never know in advance which libraries are available in distro X, or what version of a given library is available. Even within a single distro (e.g. Ubuntu), binary applications are almost never backward-compatible: anything built on Ubuntu 12.04 will depend on the versions of libraries installed on that release, and trying to run that binary on Ubuntu 10.04 will most probably fail, simply because 10.04 doesn't have a recent enough version of glibc or some other necessary library.
However, the idea becomes much more feasible if you limit yourself to a finite list of distros and versions of those distros. You can then know which libraries are available for those distros and aim for the lowest common denominator. I used to maintain a binary application which had to support several distros (Ubuntu, Fedora, openSUSE, SLED, Mandriva), and the way I did it was to install the oldest distro I was targeting on my build machine. That way, the binary application would be linked to the oldest versions of the libraries available on those distros. Unless there's a new major version of such a library (which happens quite rarely, and even then distros usually keep distributing the previous major version for a while for compatibility purposes), your compiled binary will remain compatible with all your targeted distros.
Therefore, the quick piece of advice I would give for your situation is: use the oldest LTS version of Ubuntu which is still supported (10.04 at the moment) for your development, and you should be pretty safe for most recent popular distros. For the application you already developed on Ubuntu 12.04, you should have no problem simply recompiling the same source on 10.04. Understand, however, that you will never achieve 100% compatibility with a compiled C++ Qt application.
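A quick way to check what the binary you ship will actually demand from the target is to inspect its versioned symbol references; the binary name below is a placeholder:

    # list the glibc/libstdc++ symbol versions the compiled binary requires
    objdump -T ./myapp | grep GLIBC | sort -u

Every version it prints must be provided by the oldest distro you target.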
If Qt is not all that important to you, you could use a higher-level or interpreted language such as Python, Java, Perl or Ruby. With such languages, you can usually count on the language implementation already being installed on the target distro.
Deploying an application on Linux is a nightmare; luckily there are some solutions. Check these projects to build a portable binary with all its dependencies bundled:
http://statifier.sourceforge.net/statifier/main.html
http://www.magicermine.com/index.html
http://www.pgbovine.net/cde.html
Another solution is to make a portable 0install package:
http://0install.net/
I recommend this solution. Personally, I have had problems with the first three packagers.
I've been fighting a whole day with UNIX utilities, so sorry if I appear confused! I'm describing my painful and (so far) fruitless process a little, because someone may correct me, or because describing the process might be helpful to someone later on. If you want to skip this, the question is bolded below.
So I'm trying to convert a Linux program developed using KDevelop, and make it run on Windows 7. (This is the SHoUT speech toolkit mentioned here, developed by Marijn Huijbregts.)
I've wasted half a day trying to install KDevelop on Windows, only to realize that KDevelop can't run on Windows and that I've been installing KDE all that time :( (If KDevelop CAN run on Windows, information would be highly appreciated.)
OK, so following the advice in SO's Best environment to port C/C++ code from Linux to Windows, I installed MinGW32, only to find out that SHoUT's makefile contains targets such as aclocal, autoheader, etc. -- I've come face to face with the hitherto unknown GNU Build System.
I'm now in the middle of installing GnuWin32 using GetGnuWin32. This is taking hours. And I suspect that once it finishes, I'll stumble on something else.
A day of pain -- and still not one line of code compiled :((.
So, I'm thinking about an alternative approach: install Linux and use KDevelop there to cross-compile to Windows. As this is a console application, MAYBE it'll be easier.
So, finally, my question:
If I want to install a Linux guest in VMware Workstation (version 8, running on a Windows 7 host), I understand I need a "distribution", and that there's a ton of distributions, some free, some paid.
Which distribution should I choose that will run KDevelop and be as simple as possible? I just want to ##$$ing compile, and I can't stand one more day like this...
Avi
Edit:
I've tried compiling the code using VS -- very tedious. There are many differences between Linux/GCC and Windows/MSVC. Moreover, this is code developed by someone else, and I'm not even sure that the program solves the business need. So I've decided on the following process:
Configure Linux and run the software on Linux.
Validate that the program solves the business need. If not, abort.
Try cross-compiling on Linux. If it runs on Windows, verify by comparing outputs to those obtained on Linux. If good, done.
Try compiling on Windows using ported Windows versions of the GNU build tools, using the understanding and values obtained from building on Linux. If good, done. Else:
Abort and try another solution to the business problem, or try the MS tools (again using the understanding and values obtained from building on Linux).
Many distributions are possible. Mandriva, for example, is KDE-based.
But you can also install a Debian distribution, and install KDE in it.
I suggest contacting the SHoUT project community.
You should not cross-compile. MinGW can come in handy, but it is not required. What you need is to port the code and its dependencies to Windows, and there is nothing wrong with using Visual Studio, for example.
I am using Ubuntu in VirtualBox OSE, run KDevelop in it, and it works seamlessly. Alternatively, you can try Kubuntu.
Why VirtualBox OSE? It's free and mature.
It is easier to compile with MinGW on Windows than to cross-compile on Linux.
As for the build system: it could be quite easy to write your own, much easier than the actual porting of the C++ code, and possibly even easier than using the GNU Build System.
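For a console program like this one, "your own build system" can be as little as one compiler invocation under MinGW; a hedged sketch, with the source paths and output name made up:

    # compile every source file and link it into a Windows console executable
    g++ -O2 -Wall -Isrc -o shout.exe src/*.cpp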
Please DON'T install Linux! It will take you another half a day and more questions asked here if you're doing it for the first time.
Just install VirtualBox and grab a ready-made VirtualBox image from a site like this one. Kubuntu should work fine with your KDE stuff: http://virtualboxes.org/images/kubuntu/
It will get you a running KDE Linux in just 5 minutes.
I'd like to set up a cross-compilation environment on an Ubuntu 9.10 box. From the documents I've read so far (these ones, for example), this involves compiling the toolchain of the target platforms.
My question is: how do you determine the required version of each of the packages in the toolchain for a specific target platform? Is there any rule of thumb I can follow?
This is a list found in one of the websites linked above:
binutils-2.16.1.tar.bz2
linux-2.6.20.1.tar.bz2
glibc-2.5.tar.bz2
glibc-linuxthreads-2.5.tar.bz2
gcc-core-4.2.0.tar.bz2
gcc-g++-4.2.0.tar.bz2
But suppose I want to generate executables for standard Ubuntu 8.04 and CentOS 5.3 boxes. What are the necessary packages?
My primary need is to avoid errors like "/usr/lib/libstdc++.so.6: version `GLIBCXX_3.4.11' not found" in the customers' machines but in the future I want to deal with different architectures as well.
It is generally a good idea to build a cross-toolchain that uses the same version of libc (and other libraries) found on the target system. This is especially important in the case of libraries that use versioned symbols, or you could wind up with errors like "/usr/lib/libstdc++.so.6: version 'GLIBCXX_3.4.11' not found".
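A quick way to see which versions a given target actually provides is to list the version tags in its libstdc++; the library path below is typical but may differ per distro and architecture:

    # show the GLIBCXX symbol versions available on the target system
    strings /usr/lib/libstdc++.so.6 | grep GLIBCXX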
Same Architecture
For generating executables for standard Ubuntu 8.04 and CentOS 5.3 systems, you could install the distributions in virtual machines and do the necessary compilation from within the virtual machine to guarantee the resulting binaries are compatible with the library versions from each distribution.
Another option would be to set up chroot build environments instead of virtual machines for the target distributions.
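For example, a Debian/Ubuntu target chroot can be created with debootstrap; this is only a sketch, and the suite name, directory, and mirror URL are assumptions to adapt to your targets:

    # create an Ubuntu 8.04 (hardy) chroot and install the build tools inside it
    sudo debootstrap hardy /srv/chroots/hardy http://old-releases.ubuntu.com/ubuntu/
    sudo chroot /srv/chroots/hardy apt-get update
    sudo chroot /srv/chroots/hardy apt-get install -y build-essential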
You could also build toolchains targeted at different environments (different library versions) and build under your Ubuntu 9.10 environment without using virtual machines or chroot environments. I have used Dan Kegel's crosstool for creating such cross-toolchains.
Different Architecture
As I noted in my answer to another cross-compiler question, I used Dan Kegel's crosstool for creating my ARM cross-toolchain.
It appears it may be slightly out of date, but there is a matrix of build results for various architectures to help determine a suitable combination of gcc, glibc, binutils, and linux kernel headers.
Required Package Versions
In my experience, there really isn't a rule of thumb. Not all combinations of gcc, binutils, glibc, and linux headers will build successfully. Even if the build completes, some level of testing is necessary to validate the build's success. This is sometimes done by compiling the Linux kernel with your new cross-toolchain. Depending on the target system and architecture, some patching of the source may be necessary to produce a successful build.
Since you are setting up this cross-compilation environment on Ubuntu 9.10, you might want to look into the dpkg-cross package.
Compiling for other Linux distributions is easiest done by installing them in virtual machines (apt-get install kvm) and then doing the compilation from within the VM. You can also script this to do it automatically. Building a cross-compiler and providing exactly the same versions of all libraries and such as the other Linux distro does is nearly impossible.
My question is: how do you determine the required version of each of the packages in the toolchain for a specific target platform?
...
binutils-2.16.1.tar.bz2
gcc-core-4.2.0.tar.bz2
gcc-g++-4.2.0.tar.bz2
Generally pick the latest stable: these only affect your local toolchain, not runtime.
linux-2.6.20.1.tar.bz2
You don't need this. (For targeting embedded platforms you might use it.)
glibc-2.5.tar.bz2
glibc-linuxthreads-2.5.tar.bz2
You don't need these. I.e. you should not download them or build them; you should link against the versions from the oldest distro you want to support.
Is there any rule of thumb I can follow?
But suppose I want to generate executables for standard Ubuntu 8.04 and CentOS 5.3 boxes. What are the necessary packages?
You survey the distros you want to target, find the lowest-common-denominator versions of libc, libstdc++, pthreads, and any other shared library you will link with, then copy those libs and the corresponding headers from the box that has these LCD versions to your toolchain.
[edit] I should clarify, you really want to get all the dependent libs from a single system. Picking and choosing the LCD of each file version from different distributions is a recipe for a quick trip to dependency hell.
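One way to use such a copied tree, as a hedged sketch rather than the only option, is GCC's --sysroot switch, so that headers and libraries are resolved from the copied system instead of your build host; the path and file names are placeholders:

    # compile against the headers/libraries copied from the oldest target system
    g++ --sysroot=/opt/sysroots/centos53 -o myapp main.cpp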
Depending on your target platforms, have you considered using Optware?
I'm currently working on getting Mono and Moonlight built for my Palm Pre using the cross-compilation toolchain (and the Optware makefiles handle the majority of dependencies already).