Does the Gentoo install CD contain everything for C++ development? - linux

I'd like to install Gentoo. I need it to develop GUI C++ applications using wxWidgets, so I need:
build tools: make, automake, autoconf, etc.
C++ compiler (GCC)
X Window System for testing (Fluxbox or something minimal would be enough)
Now, I have two options:
download the small network installer (57 MB) and do a network install
download the 600 MB CD
I'd like to download as little as possible and still have all the tools above.
I also don't understand whether the network installer will first prompt me for the packages I want, or whether it will fetch 600 MB of data anyway.
I might want to install it on other computers later, so I'd go with the 'full' install from CD if the network install does not save me anything.

Gentoo is ultra-minimalist by default.
The install CD gets you a basic working system, a basic compile environment (some version of the GCC suite), and package management.
It's up to you then to install what you want to use.
It's not like many other distributions, where there's a big set of "default" packages to have installed.
You have to know what you want, and install what you want.
The "Live" CD will make things a bit quicker by having a few precompiled binaries available, but beyond that, you still have to choose what you want to install.
I also don't understand whether the network installer will first prompt me for the packages I want, or it will fetch 600 MB of data anyway?
It will only install what you want it to. If you use the network install and install nothing except GCC, it will only download enough to have GCC.
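For example, a rough sketch of what your setup might boil down to right after the base install (package names are from memory and may have changed since):

$ emerge --sync                                            # fetch the Portage tree
$ emerge x11-base/xorg-server x11-wm/fluxbox x11-libs/wxGTK
# make, gcc, autoconf and automake already come with the stage3 toolchain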
Welcome to Gentoo.
It can be a little daunting for first-timers, but once you've gotten past the steep learning curve you'll love it :)

You're not missing anything. Furthermore, if you actually want any useful applications, you're going to have to do a lot more downloading than that. The point is, the small network install CD lets you download whichever versions of those components you want, along with the latest version of Portage, etc., instead of providing you with (likely) outdated copies on the full CD.

Gentoo is fundamentally a network-based distro. The minimal CD is really minimal: it contains just enough to have a functional system booting as a live CD so you can install the distribution from the network. The live CD (there are also live DVDs around, just not released as regularly, since they eat disk space and bandwidth) contains a full graphical environment and, this being a compile-it-yourself distro, obviously GCC as the C++ compiler, and it can be used to install a binary version of the packages to disk (actually from the live CD environment, using some clever hackery).
However, Gentoo is a continuously updated distribution. If you want to update your system you need to get the packages from the network (there are ways to figure out what to download ahead of time, but that is not for beginners) and update. In general, if you don't update every couple of months, your updates can become painful or really painful.
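For reference, the usual update cycle is roughly this (on older Portage versions the target is spelled world rather than @world):

$ emerge --sync                              # refresh the Portage tree
$ emerge --update --deep --newuse @world     # rebuild whatever changed
$ emerge --depclean                          # optionally drop packages nothing depends on any more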

After you have installed Gentoo, emerge vim wxGTK, read the wxWidgets documentation, and you are ready to go!
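Once wxGTK is emerged, a minimal build of a wxWidgets program looks something like this (hello.cpp is just a placeholder name):

$ g++ hello.cpp -o hello $(wx-config --cxxflags --libs)
$ ./hello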
Happy coding.

Use the network install. It does save you something, and even if you do multiple installations you'll probably want the newest packages anyway.
And no, the network installer will not download the 600 MB without asking you; that would make no sense.

Gentoo doesn't exactly "prompt" for packages. The network CD will get you a small base system; then it's up to you to set up everything else you need yourself. This is one of the positive and negative things about Gentoo.

Update: looks like the network install is not that minimal. There's a 57 MB .iso image, but you also need to download a stage3 tarball, which is about 120 MB, and a Portage snapshot, which is 29 MB. And later on, you need the Linux kernel sources, which are about 46 MB. This totals about 250 MB. Or am I missing something?
Update: I have installed the kernel, the X Window System, mc, LILO, etc. and have a working system :) Summing it all up, it downloaded about 580 MB. Well, that is still less than the install CD!

Related

Install minimal lightweight Mono CLR on linux

I can install, compile code with, and run Mono on Linux just fine, but I am interested in this code running on client machines. The Mono installation is a 200+ MB package, however, which seems very heavy (a minimal Python installation seems to be smaller, for example) and some clients may hesitate at such a size. So ideally there would be a package that only contains the CLR, so that they can run the .exe files (or whatever) but contains nothing else; is this possible?
It seems that the only packages available include developer tools, which obviously clients don't need:
https://www.mono-project.com/download/stable/#download-lin
Using apt show mono-complete I can see that there are many dependencies like "mono-runtime" but I have no idea what combination of these I will need to be able to run programs.
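Not an authoritative answer, but on Debian/Ubuntu the runtime is packaged separately from the developer tools, so something like the following should be enough to run a compiled .exe (package names are from memory and may differ per release):

$ sudo apt install mono-runtime      # just the CLR/JIT, no compilers or dev tools
$ mono MyApp.exe
# GUI apps (WinForms, System.Drawing, ...) may additionally need class-library
# packages such as libmono-system-windows-forms4.0-cil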

Distribute wxWidgets applications on Linux platforms

I started to learn wxWidgets some days ago and I found it really good, but when I tried to "export" it to another platform, it didn't work. I expected that, but when I searched for how to distribute wxWidgets apps, I didn't find much, except to install the whole wxWidgets library on that platform.
I use Debian 10, while the other platform where I need to export my app uses Lubuntu. It says that it needs the wxgtk3.0 library, but I guess that library will need even more libraries.
How do I export my wxWidgets application without asking the user to manually install and build the whole library on their platform? Maybe even using a .deb package (if really needed). Thanks in advance.
Please understand that every single distribution uses its own format, compared to Windows, where there is an installation wizard, or OS X, where there is a bundle.
Now, you can create a .deb file where you set everything up.
You can also create an .rpm (which is basically the same thing as a .deb, just for a different distribution).
This is the best way, as it ensures that all dependencies and their versions are satisfied.
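As a rough, hand-rolled sketch (real packages are normally built with dpkg-buildpackage and a proper debian/ directory; myapp and the Depends line are placeholders you would have to adjust for your app and target release):

$ mkdir -p myapp_1.0-1/DEBIAN myapp_1.0-1/usr/bin
$ cp myapp myapp_1.0-1/usr/bin/
$ cat > myapp_1.0-1/DEBIAN/control <<'EOF'
Package: myapp
Version: 1.0-1
Architecture: amd64
Maintainer: Your Name <you@example.com>
Depends: libwxgtk3.0-0v5
Description: My wxWidgets application
EOF
$ dpkg-deb --build myapp_1.0-1        # produces myapp_1.0-1.deb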
Hope this helps!
Many Linux distributions have wxWidgets packages, so one possibility would be to simply ask users to install these packages when they need to use your application. This is not really different from installing GTK libraries or even X11 (or Wayland) that your application also depends on -- the only difference is that these libraries are almost surely already installed on any Linux desktop system, while wxWidgets ones might not be.
Another alternative is to link your application statically with wxWidgets libraries. This will make it much bigger and will prevent the users from upgrading the libraries on their system to improve the application behaviour, but can be simpler for the users to install. Note that if you choose the static linking route, you typically need to build your application on the oldest distribution you want to support (which is probably not Debian 10, which is relatively recent), as this is a simple way to ensure that it doesn't require newer versions of the (other, non-wx) libraries than the ones already present on the user's system.
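A rough sketch of the static route, assuming you build wxWidgets yourself (version numbers and file names are placeholders):

$ cd wxWidgets-3.0.5 && ./configure --disable-shared && make && sudo make install
$ g++ myapp.cpp -o myapp $(wx-config --static=yes --cxxflags --libs)
$ ldd myapp        # only system libraries (GTK, X11, libc, ...) should remain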

RHEL5 Qt compiler/linker/qmake issues... advice?

I have a few problems with a new install of the Qt SDK. I probably only need advice, but specific answers are also welcome. Before I begin a mini-story: I am running RHEL5 on an academic license under VirtualBox on OS X 10.6, using Qt version 4.5.3. This is my situation...
1.) I couldn't compile because g++ wasn't found. I fixed this by creating a link: g++ -> g++34. This allowed me to compile but it generated more errors at link-time. I had installed the framework in my home directory unintentionally so I uninstalled/reinstalled the entire SDK to /usr/local/qt.
2.) At this point I could compile but the linker complained about a missing freetype package. I had that already installed but wasn't sure why it couldn't be found. So I installed a few packages that I thought might be missing like libqt4-devel and libqt4-devel-debug. I also installed a few other general programming packages for later use.
3.) Somewhere in this process I could no longer run qmake. I ran it before, and I have it installed at /usr/local/qt/qt/bin/qmake. I could create a link to it (though I shouldn't have to, or I could ensure that the location is in the PATH variable). However, at this point Qt Creator says there's no Qt installation found. I re-pointed it to the installation location (using Tools/Options) but it still won't run qmake, or anything else for that matter...
I only need this linux install to compile and test my Qt projects which I am developing in OSX. So my question is, should I just wipe this RHEL install and start over? And if so, should I use something else like Ubuntu? I am having plenty of hassles that I don't want to deal with as is. Note, this project will require good OpenGL support.
Is there a particular reason that you don't simply use the Qt package that's part of RHEL?
If for some reason you need to build your own, you can get all of the build dependencies with:
$ yum install yum-utils
$ yum-builddep <whatever the qt package's name is>
#scotchi is right, and you should try to use the Qt package that comes with your system unless you need a very different version. I don't know what version of Qt comes with RHEL, but if it's not up to date enough for you (and it might not be, see below) then you could consider changing OS versions. I would only do this after trying his suggestion, though, because you may be able to get things working without the hassle of a full OS install.
Now, as to why you might want to switch: RHEL is, as its name ("Enterprise Linux") indicates, aimed at companies who want to run servers or large deployments of desktops. It emphasizes stability and reliability over being cutting edge. Fairly often the versions of the compiler and development libraries lag a little behind the curve. This is what their clients want: a stable, thoroughly tested platform they can develop against and run programs on for a period of time, without constantly needing to keep up with the latest changes. But for people doing development at home it may not be necessary to stay that conservative. I don't know if this is for work, school or personal programming, but it sounds to me like you should move to one of the more desktop-oriented distros. Ubuntu is great, as is Fedora. If you prefer a RHEL-like environment, then choose Fedora.

Distributing binary applications across linux distros

I've written an application which as of yet is not open source, and I'd like to distribute the executable across various Linux distros. What's the best way to do this? I've looked a little bit at .rpm and .deb packaging, but I can't find out whether that can be used for binaries or not. Ideally I'd like something like the PackageMaker on OS X or a regular installer on Windows that will automatically copy it into /usr/bin. Is that what .rpm and .deb packages are for, or do I have to bundle a shell script that will do it automatically?
RPM and DEB packages are the two primary mechanisms for distributing binary packages in Linux. RPM is used by RedHat and its derivatives (Fedora, CentOS), while DEB is used in Debian and Ubuntu.
The .rpm and .deb files themselves are generally "dumb" archives, and are installed to the correct locations in the filesystem by pre-installed helper applications. You don't have to worry about writing scripts to install files, unless it's a very complicated application which needs special per-system configuration.
The usual patterns I see for distributing binaries are (the user-side commands are sketched just after this list):
Release a compressed tarball (.tar.gz or .tar.bz2), and let distribution packagers worry about the details. This works well for popular applications, but if yours is newly released, nobody will care enough about your application to package it.
Release as a tarball, plus RPM and/or DEB packages (depending on customer needs). Customers with a supported distribution may install the pre-made package. Anybody who's using an unusual distribution is probably happy to install from a tarball anyway.
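For what it's worth, the end-user side of either pattern is a one-liner (myapp is a placeholder name):

$ tar xzf myapp-1.0.tar.gz                # tarball route: unpack and build/copy by hand
$ sudo dpkg -i myapp_1.0-1_amd64.deb      # Debian/Ubuntu
$ sudo rpm -i myapp-1.0-1.x86_64.rpm      # Red Hat/Fedora/CentOS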
MojoSetup is a user-friendly, perfectly cross-distro solution and nicely-licensed (zlib, very permissive). All it requires is the standard sh shell which comes with any Linux distribution. It also allows for desktop shortcuts the easy way by creating freedesktop.org spec shortcuts, which are supported by just about all graphical environments for Linux (so just dump in a PNG at different resolutions and fill in the blanks of the .desktop file).
Installers are scripted using the very simple Lua programming language, and there are several example installer scripts in the Mercurial repository as well as a lengthy tutorial. It also has many years of development behind it as a lightweight cross-distro installer.
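The freedesktop.org shortcut mentioned above is just a small text file; a hedged example of what an installer might drop in (all paths and names are placeholders):

$ cat > ~/.local/share/applications/myapp.desktop <<'EOF'
[Desktop Entry]
Type=Application
Name=MyApp
Exec=/opt/myapp/bin/myapp
Icon=/opt/myapp/share/icons/myapp.png
Categories=Utility;
EOF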
The rpm and deb will store the binaries. You'll most likely need different binaries for each distro or distro variant, simply because things like paths differ between distros.
I recommend starting with the two you mentioned, rpm and deb, and nailing those two distros. Then maybe do a tarball for miscellaneous distros that people can extract themselves, handling the directory structure, copying, and permissions on their own.
Also, for things like deb you can set up a site as a repository. That makes it easy for people to add the repo and install the deb in Ubuntu. A lot of third-party closed-source devs do that.
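Setting up such a repository is its own topic, but on the user's side adding it is typically just the following (the URL is a placeholder, and current releases also expect the repository's signing key to be installed):

$ echo "deb https://repo.example.com/apt stable main" | sudo tee /etc/apt/sources.list.d/myapp.list
$ sudo apt-get update && sudo apt-get install myapp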
That's what .rpm and .deb files do, BUT you have to be sure that the distro you're installing on has the ability to deal with .rpm or .deb files. If you want something that's sure to run across multiple distros, where you can't be sure that they will have the right package manager, then you pretty much have to resort to the shell script method. I would advise, if you can get away with it, building your binary for both .rpm and .deb: this way you get most of the distros covered, you allow users to install in a way they are comfortable and familiar with, and you don't have to try to roll your own installer/uninstaller shell scripts.
You should probably provide a native package for each Linux distribution that you officially support (as you officially support them, you'll be testing on them so doing this should be trivial), and provide a .tar.gz which people can drop in for other ones.
Users can always make their own .rpm etc for some alien distribution which you don't support; but they can't complain to you unless it doesn't work on an officially supported OS.
Which OSs do you officially support? You'll obviously need to test on them all (at the very least, you'll need to pass all your regression test suite on each OS on each release).
This is of course complicated if you support multiple architectures.

Best approach to writing a generic installer for a Linux application?

We have a Linux server application that is made up of a number of open-source tools as well as programs we've written ourselves. Ideally we would like to be able to install this application on any common Linux distribution.
In the past, we've written perl scripts to automate installs of this application. Unfortunately, due to idiosyncrasies of different Linux distros, the logic inside these install scripts gets horribly complex, and can change as new versions of each supported distro are released. Maintaining the installer thus becomes one of the most time-intensive parts of the project!
I'm looking for assistance, be it a framework, documentation, or code samples, that can make this process less painful. Here are the types of things our installer needs to do:
Create user/group accounts
Create directory trees with specific ownership and permissions
Install open-source applications, potentially compiling them from source during install
Insert pre-compiled binaries, scripts, config files, and docs into specific directories
Register init-type startup and shutdown scripts
Generate encryption keys
Verify connectivity to a central server
Instead of the installer approach, I think a better way is to have a build system which generates .deb or .rpm files suitable for installation on each system you have to support, rather than a single script that does everything at install time.
A poor man's way of doing that might be to use checkinstall, which creates packages from the files installed via 'make install'. So you'd build your app on each system and have the package magically created in the distro's native format.
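A rough sketch of the checkinstall route (it also asks a few interactive questions about name, version and so on; myapp is a placeholder):

$ ./configure && make
$ sudo checkinstall --pkgname=myapp --pkgversion=1.0   # runs 'make install' and wraps the result in a .deb or .rpm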
I believe that most of the tasks which you describe are fairly standardized between Linux distros. In my experience, the following should work on the Debian family (including Ubuntu) and the Red Hat family (including Fedora and CentOS); a rough sketch of the corresponding commands follows the list:
Create user / group accounts - adduser command
Create directory trees - mkdir or install, or just expand a tarball
Install open source applications - Unless you have particularly esoteric needs, this should probably be left to the distro's package manager.
Install files - install, or just expand a tarball
Startup and shutdown scripts - install to /etc/init.d then symlink to /etc/rc*.d
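Roughly, the commands behind that list look like this on the Debian side (names and paths are examples only; on Red Hat you'd use useradd and chkconfig instead of adduser and update-rc.d):

$ adduser --system --group myappuser                                      # create user/group accounts
$ install -d -o myappuser -g myappuser -m 0750 /opt/myapp/bin /opt/myapp/data   # directory trees with ownership and permissions
$ install -m 0755 myapp /opt/myapp/bin/                                   # drop in a pre-compiled binary
$ install -m 0755 myapp.init /etc/init.d/myapp                            # register the startup/shutdown script...
$ update-rc.d myapp defaults                                              # ...and create the /etc/rc*.d symlinks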
VMware Server is freely available for Linux and does most of the tasks which you describe. It uses Perl and maybe shell for its installation and configuration, so you might see the approach that it takes.
However, speaking as a Linux admin, I strongly prefer applications that integrate with my package management system. In other words, create .deb and .rpm files, as Vinko Vrsalovic suggested. Building packages is extremely well documented:
Building RPMs for Fedora (or Red Hat or CentOS): draft documentation, RPM Guide
Building .debs for Debian (or Ubuntu): Debian Maintainer's Guide
I tried Autopackage a few years ago; I don't know how universal it is, but it worked quite well (it was the only truly universal option back then). Surely you have to provide some LSB-compatible ways of setting up proper directories on your own, but this piece of software should help you.
There's probably still too much diversity among Linux distributions to do everything in a completely platform-agnostic way, but I may be wrong.
You may want to try BitRock InstallBuilder. It is a cross-platform installation tool that allows you to do exactly what you are looking for (adding users, installing services, installing pre-compiled binaries, etc.). Although some of the other posts mention a number of tools that you could use in your scripts, the problem is that every Linux distribution is a bit different, and simple tasks like adding a user or installing a service are suddenly non-trivial when you need to do them across Debian, Ubuntu, Mandriva, RedHat, Gentoo, etc. A good cross-platform installer should isolate you from all that. Many commercial open source companies like MySQL, SugarCRM, Zenoss, Jaspersoft, Groundwork, etc. have built installers based on our technology exactly because of that (in addition to their regular source code tarballs, etc.). We also provide free licenses for open source projects.
Autopackage has now merged with the Listaller project. The documentation's not really thorough yet, but it seems to be working.

Resources