Linux distribution binary compatibility

Is there any way to build a binary on one Linux distribution and run it on another distribution of the same architecture, or should I compile and build it separately on each distribution?
Is there any binary compatibility between Red Hat and Debian based distributions?
(I want to use my Ubuntu binary on Fedora!)

Enter the Linux Standard Base (LSB), an effort to reduce the differences between individual Linux distributions. See:
http://www.linuxfoundation.org/collaborate/workgroups/lsb
http://en.wikipedia.org/wiki/Linux_Standard_Base

Statically linking your binaries can actually make them LESS portable, because some libraries then won't work correctly on the target machine (differing authentication methods, etc.).
If you statically link any "unusual" libraries and keep your set of supported distros to a minimum, you should be OK.
Don't statically link the C library (or the whole binary); that's a recipe for trouble :)
Look at what (e.g.) Google do with Chrome.
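A minimal sketch of that selective approach with GCC (libfoo is a placeholder for your "unusual" dependency; the -Wl,-Bstatic/-Wl,-Bdynamic toggles are the point):
    # Link only the unusual library statically; keep glibc and friends dynamic
    g++ -o myapp main.o \
        -Wl,-Bstatic -lfoo \
        -Wl,-Bdynamic -lpthread -lm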

What language is your application coded in? If it's in a language like Python (with no C bindings), or Java, or any other VM-based language, then I think you can trust the VM to make sure your application works across the different Linux distributions.
Also, there is the Linux Standard Base which you can refer to.
HTH, Amit

I realize this is a very old question, but it comes up high in search results and this hasn't been mentioned:
CDE is a tool to create portable Linux applications. It packages together all the files an application needs (including libraries) by analyzing the program at run time. I have used it successfully on command-line tools several times; one example was getting tcpdump to run on an old hardware appliance running a custom distribution. CDE also doesn't require source; it just packages an executable you are able to run.
At one point I had an error running the cde command, which was fixed by prepending the command with LD_ASSUME_KERNEL=2.4.1; this might not be necessary in recent versions, as that was years back.
Code is also on GitHub: https://github.com/pgbovine/CDE
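From memory of the tool's documented workflow (tcpdump is just an example workload here, and the wrapper name/layout may differ between CDE versions):
    # Run the program once under cde; it traces execution and copies every file it touches
    cde tcpdump -i eth0 -c 10
    # This produces a self-contained cde-package/ directory. Copy it to the target machine
    # and invoke the generated wrapper script instead of the original command:
    ./tcpdump.cde -i eth0 -c 10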

It can work, but it also depends on the versions of the shared libraries you use, including libc and libstdc++, which are dictated by the compiler version and may differ from distro to distro.

The best way is to distribute the source code and to make it easy to build that source on any reasonable Linux distribution. This is better than binary distribution because it is not enough to make the binary compatible with shared libraries; you also need to adapt your program to things like distribution-specific locations and conventions for where web apps go, how e-mail is sent, how services are started, how the default paper size is determined, and a myriad of other details.
See, for example, the Debian Policy Manual for a document describing many of the things a distribution needs to decide to ensure compatibility between applications running on it. You don't need to read it through or learn it by heart, but it shows the scope of the issues that may trip you up.
You should probably work together with several of the major distributions to ensure your application works well with all of them. Most distributions' developers will happily help if you approach them politely. If you're lucky, you can attract volunteers from the distros to make the binary packaging for you, and that will quickly give you feedback on what you need to change at the source level to make your application easy to package.
The Linux Standard Base already mentioned by others attempts to work out a cross-distribution solution to these variations, but it is not comprehensive and not fully supported by most distributions. However, most distributions consider it a problem if they accidentally break LSB compatibility.

Normally it's OK to use binaries across Linux distributions as long as you have the same set of libraries available. You can use ldd to check which libraries a binary needs; libc in particular should be the same version on all the distributions involved.
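For example (myapp is a placeholder):
    # List the shared libraries a binary needs and where the loader resolves them
    ldd ./myapp
    # Typical output: libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007f...)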

You could statically link your executables for portability.

LSB is definitely worth checking out, though with regard to working with libraries I was most satisfied by this answer here at SO https://stackoverflow.com/questions/1209674/shipping-closed-source-application-for-linux/1242738#1242738 and this detailed treatment of the rpath mechanism: http://www.eyrie.org/~eagle/notes/rpath.html
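A minimal sketch of the rpath approach those links describe, assuming you ship your own library copies in a lib/ directory next to the binary (myapp and libfoo are placeholders):
    # $ORIGIN expands at run time to the directory containing the executable
    g++ -o myapp main.o -L./lib -lfoo -Wl,-rpath,'$ORIGIN/lib'
    # Verify what got embedded:
    readelf -d myapp | grep -E 'RPATH|RUNPATH'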

How about HTML?
It's cross-platform, it's been around forever, and if you consult caniuse you know your target environment. It can render any UI I've ever dreamt of, and if you're willing to learn JavaScript, you can approach this from both a server programming perspective and a client programming perspective without switching languages, which comes in handy if you're the one doing both.
It's probably the closest thing we have to a machine lingua franca these days, and that is a good thing, because it means there are potentially more pickings for everybody involved.
Many people don't realize that the web browser does most things you would want from a program, including rendering 3D graphics and PGP-style encryption.
The biggest benefit I see in the browser as a platform is that everyone's nephew knows how to install a browser on a new computer, and from there your app is just a URL away, including packaging in some of the stores.

Related

CMake/CPack: Preferred package generators for different platforms

I want to distribute executables and libraries of a C/C++ project on Linux, OSX and Windows. What are the preferred CPack generators, i.e. which are likely to be available for most users? On Windows there only seems to be NSIS, but on Linux and OSX there are several alternatives.
By the way, a source distribution is generated as well, so in theory, users of all platforms should be able to compile the code themselves, but we want to provide precompiled binaries for convenience.
There are multiple common practices on each of the different platforms. Which one is best for you will depend on a variety of factors, but the following should at least help choose among the more popular formats that CMake/CPack has direct support for. I'm assuming you are using CPack via CMake (i.e. via the CPack module, possibly with package components using the CPackComponent module as well).
Windows
The NSIS package generator produces executable installers which average users are well accustomed to using. These support component-based installs, so you could provide the source as an optional component. CMake's support for this package generator is fairly mature, but it is perhaps becoming a less preferred method in recent times.
The WIX package generator produces MSI installers. Support for this is newer and seems to be more active in terms of feature development, etc. It also supports component-based installs and seems to be becoming the preferred format over NSIS.
Mac
There are a number of options to choose from for Mac, but which one is most appropriate depends on what you want to package up. If you just want to provide a single app bundle, the DMG package generator (also sometimes referred to as the DragNDrop generator) is probably what you want. Users are well acquainted with these and they are easy to use. Avoid the Bundle generator: it is older and more limited in what it supports, and the DMG generator should be preferred instead.
For packages containing more than a single bundle, the DMG generator is still potentially suitable, but a proper installer may be more appropriate. Until recent years, the PackageMaker generator was the go-to generator for that, but it has been superseded by the ProductBuild generator (supported by CMake since version 3.7).
Linux
On RedHat-based systems, RPM is usually the package format of choice (use the RPM generator), whereas for Debian-based systems the DEB format is preferred (use the DEB generator). Debian-based systems can support RPM using tools like alien, but users almost always prefer a native DEB format. If you're willing to provide both, you can keep both camps happy, but note that you will have to pay careful attention to binary compatibility. Simple packages used to be able to build against the LSB (Linux Standard Base) to produce a single RPM that would work on all major Linux distributions (even Debian-based ones), but the LSB hasn't really kept up with recent developments and it never really supported the full set of functionality most complex apps needed (or the versions of packages it provided were too old). The LSB does, however, provide very useful tools like the app checker for assessing whether packages you've built (by any means) will be missing symbols, etc. across various Linux distributions.
Note that for Linux, you should distinguish between whether you are targeting packaging for inclusion in Linux distributions themselves or whether you expect users to download and install the packages outside of the distribution's packaging system. Larger independent commercial software products tend to be distributed as standalone packages, including the relevant libraries, etc., and installing under /opt by default (if they follow guidelines like those advocated by the LSB and the Filesystem Hierarchy Standard, FHS). Ideally, you would make your packages relocatable so that distribution maintainers have an easier time adapting your packaging method to their distribution's requirements.
RPM and DEB both support source packages to some degree.
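As a rough sketch of driving this from the command line (assuming your CMakeLists.txt already calls include(CPack) and sets the usual CPACK_* variables):
    # Configure and build, then generate packages from the build tree
    cmake -S . -B build
    cmake --build build
    cd build
    cpack -G "DEB;RPM"   # native packages for Debian- and RedHat-style systems
    cpack -G TGZ         # plain tarball as a lowest-common-denominator fallback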
Cross-platform
The IFW package generator is favoured by some as a way to produce installers which have a similar look and feel across all platforms. It is also quite progressive in offering support for features like downloadable components. If having an easy-to-use graphical installer across all platforms is of interest, then this one is probably what you're looking for.
The Archive package generator provides support for archives like ZIP, tarballs, 7z and more. These are very basic formats that simply lump your files together into a single archive. They lack features like desktop integration, pre-/post-install scripts and uninstall support, but they are handy as a second alternative package format to one of the above. In particular, they can be useful for users who don't have admin access on their systems and simply want to unpack to a convenient location.

Can one Linux OS based environment application run in another Linux OS environment?

I don't have any knowledge of the Linux/Unix environment, so to gain some understanding I am putting this question in front of all the developers and Unix/Linux technical people.
By applications I mean the IDEs used by developers, especially:
Visual Studio
IntelliJ Idea Community Version
PyCharm Community Version
Eclipse
And other peripheral apps used by developers, gamers and network engineers.
To some experienced Linux users my question might seem baseless, but consider me a beginner with Linux. Thank you in advance.
The term "application" is a very vague, fuzzy one these days. It does not describe some artifact with a certain internal structure and way how to invoke it but merely the general fact that it is something that can be "used".
Different types of applications are in wide spread use on today's systems, that is why I asked for a clarification of your usage of the term "application" in the comments. The examples you then gave are diverse though they appear comparable at first sight.
A correct and general answer to your question would be:
One application can be used in different Linux based environments if each environment provides the necessary preconditions to do so.
So the core of your question effectively shifts towards whether different flavors of Linux based systems offer similar execution environments. Actually it makes sense to extend that question to operating systems in general; the difference between today's alternatives is relatively small from an application's point of view.
A more detailed answer has to differentiate between the different types of applications, or better, between their different preconditions. Those can be derived from the architectural platform the application is built on. The following is a bit simplified, but should express what the situation actually is:
Take for example IntelliJ IDEA and the Eclipse IDE. Both are Java based IDEs. Java can be seen as a kind of abstraction layer that offers a very similar execution environment on different systems. Therefore both IDEs can typically be used on all systems offering such a "Java runtime environment", though differences in behavior will exist where necessary. Those differences are either programmed into the IDEs or originate from the fact that certain components (for example file selection dialogs) are not actually part of the application but of the chosen platform. Naturally they may look and behave differently on different platforms.
There is however another aspect that is important here, especially when regarding Linux based environments: the diversity of what is today referred to as "Linux". Unlike pure operating systems like MS Windows or Apple's Mac OS X, which both follow a centralized and restrictively controlled approach, we find differences between the various Linux flavors that extend far beyond things like component versions and availability. Freedom of choice allows for flexibility, but also results in a slightly more complex reality. Here that means different Linux flavors do indeed offer different environments:
different hardware architectures: unlike MS Windows and Mac OS X, the system can be used not only on Intel x86 based hardware, but on a variety of (maybe 120) completely different hardware architectures.
the graphical user interface (GUI or desktop environment: windows, panels, buttons, ...) is not an integral part of the operating system in the Linux (Unix) world, but a separate add-on. That means you can choose.
the amount of base components available in installations of different Linux flavors differs vastly. For example there are full-fledged, fat desktop flavors like openSUSE, Red Hat or Ubuntu, but there are also minimalistic variants like Raspbian, Damn Small Linux, Puppy or Scientific Linux, distributions specialized in certain tasks like firewalling, or even variants tailored for embedded devices like washing machines or moon rockets. Obviously they offer different environments for applications. They only share the same operating system core, the "kernel", which is all the name "Linux" actually refers to.
...
However, given all that diversity with its positive and negative aspects, the Linux community has always been extremely clever and active and has crafted solutions to handle this specific situation. That is why all modern desktop-targeting distributions come with a mighty software management system these days. It tracks dependencies between software packages and makes sure that those dependencies are met or resolved when attempting to install some package, for example an additional IDE as in your example. So the system would take care of installing a working Java environment if you attempted to install one of the two Java based IDEs mentioned above. That mechanism only works, however, if the package to be installed is correctly prepared for the distribution. This is where the usage of Linux based systems differs dramatically from other operating systems: here comes the introduction of repositories, how to search, select and install available software packages on a system, and so on; all a bit too wide a field to be covered here. Basically: if the producer of a package does their homework (or someone else does it for them) and correctly "packages" the product, then the dependencies are correctly resolved. If, however, the producer only dumps a raw bunch of files, maybe as a ZIP archive, and insists on a "wild" installation as typically done on MS Windows based systems (writing files into the local file system by handing administrative rights to some bundled "installer" script that can do whatever it wants, including breaking, ruining or corrupting the system it is executed on), then the system's software management is bypassed and the outcome is often "broken".
However, no sane Linux user or administrator would follow such a path and install such software. That would show a complete lack of understanding of how their own system actually works, and the consequent abandonment of all the advantages and comfort it offers.
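For illustration, dependency resolution on a Debian/Ubuntu-style system looks something like this (eclipse is just an example; whether a given IDE is packaged, and under which name, varies by distribution):
    # Install an IDE; apt pulls in the Java runtime and other dependencies automatically
    sudo apt install eclipse
    # Inspect the dependency chain the package manager resolves
    apt-cache depends eclipse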
To make a complex story simple:
An "application" usually can be used in different Linux based environments if that application is packaged in a suitable way and the requirements like runtime environment posed by the application are offered by that system.
I hope that shed some light on a non trivial situation ;-)
Have fun!

Deploying C++ game in Linux

I am an indie game developer working on the Windows platform, but I have little to no experience with Linux and deploying apps for it. I am polishing my game, written in C++11 and based on SDL 2.0 with several other cross-platform dependencies (like AngelScript or PugiXML), on Windows, and I want to distribute it for Linux too, so I have a few questions about that. The game is commercial and closed source, currently on Steam Greenlight, but I want to distribute a free alpha version downloadable from my website regardless of Greenlight status.
1.) Are the main Linux distributions ABI (application binary interface) compatible? Or do I need to compile my game on every supported distribution/platform?
2.) If so, which distributions/platforms are reasonable choices to support?
3.) What is the best way to install an app and its dependencies on Linux? I've read about the deb and rpm systems, but it's still confusing - is there any way to automatically generate setup packages for the various distributions?
4.) How does Steam work with Linux? How should I prepare my app for distribution via it?
Excuse me if I'm asking the wrong questions; the whole world of Linux is pretty new to me and I got lost reading various articles and manual pages...
This depends on what the distribution is derived from. Generally, there's no need to recompile a program built on something like Ubuntu to run it under Fedora, so long as the code remains unchanged: Ubuntu and Fedora use largely the same libraries (albeit perhaps in different locations), and anything OpenGL-related is a driver issue. I would be extremely surprised if you had to recompile your software, since all distros use pretty much the same set of libraries, ship bash, and use the Linux kernel; they mainly differ in their package managers. The last part is where it gets complex:
The aforesaid distributions have different package managers, which requires you to repackage your software accordingly. You could release pre-compiled binaries in a tar.gz file and simply have distro maintainers package the software for you, though if you want to control how your software is distributed you had better do that yourself. Because of this issue surrounding the many package managers out there, people still resort to recompiling source code through a makefile, which can be generated through CMake. Problems only arise if certain dependencies are, for whatever reason, renamed, in which case, because of a simple name change, the program magically doesn't find the dependency. Since there's also no naming convention, that makes life even harder. The most important virtue here is trust: trust that developers follow naming conventions so everyone can reference the same package by the same name.
The best distros to support are the most popular ones: Ubuntu and openSUSE would be great starting points. Linux Mint, Fedora, Red Hat and Debian use package managers similar to those of the aforesaid.
Meanwhile, you should know that you can't statically link GPL'd code into your software without making your software GPL as well. A way to work around this is to resolve dependencies by either: A. including the relevant dependencies in the same folder as the executable (much like *.dll files on Windows), or B. relying on the system your program runs on to provide the same libraries your program was linked against. The latter is riskier, since it assumes the user will have the libraries, and unmodified at that. The former increases the size of your package, but it ensures consistency across all systems.
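If you go the bundled-libraries route, the usual trick is a small launcher script next to the binary; a hedged sketch (mygame and the lib/ layout are placeholders):
    #!/bin/sh
    # Launcher: point the dynamic loader at the libraries bundled alongside the binary
    HERE="$(dirname "$(readlink -f "$0")")"
    export LD_LIBRARY_PATH="$HERE/lib${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
    exec "$HERE/mygame" "$@"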
As for installation, you would need a shell script that moves the contents of your directory to the right locations. Generally, the main app goes into /usr/bin and any app-related data goes into the user's home folder. This again depends on the distro; look for a .local directory, or you could create a hidden directory dedicated to your app. Hidden folders are prefixed with a period. Why put this stuff in the home folder, and what goes there? The why: because the user's home folder is writable without special permissions. The what: anything that needs to be used by the app without the user having to authorize it. Dependencies can thus be located in the home folder, preferably under your own directory. Different conventions exist; some may disagree with me on this one.
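A sketch of such an install script following the layout suggested above (all names are examples; the system-wide part needs root, while the per-user part should run as the actual user):
    #!/bin/sh
    # System-wide part (run with sudo): the executable itself
    install -D -m 755 mygame /usr/bin/mygame
    # Per-user part (run as the user): hidden data directory in the home folder
    mkdir -p "$HOME/.mygame"
    cp -r data/. "$HOME/.mygame/"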
You might also want to use Steam's API, which does most of this work for you. Under Steam, your app may live under Steam's own directory and thus function as a Steam app with all the functionality therein.
http://www.steampowered.com/steamworks/
See the link above to find out more about how to get your app on Steam. I have to say I was really impressed, and they even include code samples. The best part is that this API is available on Linux as well. I don't know much beyond the fact that Steam handles the execution of your app through its own layer, in which case there's no need to distribute your app independently via the previous steps.
Note that you can also distribute your software through the Ubuntu Software Centre if you are interested.
http://developer.ubuntu.com/apps/
Ubuntu, though, has more of a focus on getting apps running regardless of platform.
Know that Linux has no single convention; its conventions are derived from pragmatism, not theory. At the end of the day, how you want your software to run on Linux is up to you.
I'm not a game developer and I come from the open source community, so I can't really advise on delivering binaries. I'll try to answer some of your questions though:
Valve has a Steam runtime you can target on Linux (https://github.com/ValveSoftware/steam-runtime) - this would be the best way to port your game. I saw it mentioned in one of their Linux dev videos on YouTube; my understanding is that it bundles a bunch of libraries, including SDL, and that it is set up to emulate a specific version of Ubuntu. So if you write your game against the Steam runtime, it will run in any Linux distro that Steam has been ported to.
As for natively packaging your game, one thing to consider is that if you package it as a deb or RPM and instruct it to depend on distro-provided libraries, your app may break when the libraries are updated (some distros update libs quite often; others are more stable). Dynamically linking against system libraries works well for open source, since people can patch the code when libraries change; it is not ideal for closed-source stuff.
You can statically link your binary at build time, which means you get a larger binary, but then you don't have to worry about the app breaking when libs are updated.
Some programs like Chrome bundle their own libs (which are essentially forks of the system libs); again this makes the download size much larger and also has the potential to cause security problems, so people tend to frown on this. (See: http://lwn.net/Articles/378865/)
1.) No, the ABIs of the main Linux distributions are not fully compatible. In a narrow sense they are mostly compatible, but in a broader sense there are many differences in how some elements of the system work, are configured, etc. OTOH, making "packages" is not a big problem per se, just some work. Also, what works for a "major" distribution (like Debian) works (pretty much always) on all derivatives (like Ubuntu, Mint...).
2.) A good starting list to support is: Debian (.deb), plus Red Hat and Fedora (.rpm for both). Their package formats and tools are mature and well known, and this will "cover" a lot of derivatives.
3.) There are some "cross-distribution" package builders, but they are mostly not up to the task. Writing a definition script for most package formats is not hard once you get the hang of it; see the sketch below. Also, some commercial tools have "installation script generators" for Linux which take care of things for you (they don't generate .deb or .rpm, but rather a complex shell script, similar to a Windows EXE installer).
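To give a feel for how small such a package definition is, here is a hedged sketch of hand-building a minimal binary .deb (mygame and all field values are placeholders; a real package should follow the distribution's policy):
    # DEBIAN/control needs at least these fields:
    #   Package: mygame
    #   Version: 1.0
    #   Architecture: amd64
    #   Maintainer: You <you@example.com>
    #   Description: Example game package
    mkdir -p mygame_1.0/DEBIAN mygame_1.0/usr/bin
    cp mygame mygame_1.0/usr/bin/
    $EDITOR mygame_1.0/DEBIAN/control   # write the fields above
    dpkg-deb --build mygame_1.0         # produces mygame_1.0.deb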
4.) Sorry, I don't know much about Steam. O:)
So, from my experience, the best way to do it is to accept the ugly truth that you have to select a few major distributions and their versions ('cause things can change very much between versions) and make packages for them, and test often on all of them. And be happy that you're not developing some long-lived kernel module/driver, because if you add tracking kernel API changes to the whole picture... :)

Linux distro/version to support when releasing software on Linux

We are about to release a couple of software products with Linux support.
For Mac and Windows, the number of versions to support is quite limited (XP, 2000, Vista, 7 for Windows; 10.4-10.6 for Mac). But for Linux it's another story.
We'd like to support as many Linux variants as possible, but the choice is large.
The questions are:
Which distribution format (binaries) should we use to support as many Linux variants as possible?
For testing, what "base Linux" can we test on and extend our results from to other Linux variants?
Assuming we provide a statically linked binary with all the dependencies, what do we need to check? I assume the kernel version and the libc version, but I'm wondering what else.
Our software is written in ANSI compliant C with a bit of BSD and POSIX (gettimeofday, pthreads).
So you think three versions each for Mac and Windows is normal, but you shy away from Linux? Hm.
Just make sure it builds using the standard tool chain -- traditionally configure, make and make install. The rest should take care of itself.
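That traditional dance, for reference:
    # The standard source-release build steps users on any distro will expect
    ./configure --prefix=/usr/local
    make
    sudo make install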
Else, pick what you are comfortable with. For me that would be Debian/Ubuntu; others prefer Fedora. Look at the Linux Standard Base and things like FreeDesktop.org for other standards. Kernel and libc should not matter unless you are doing something very hardware- or driver-specific.
The kernel strives to maintain a backwards-compatible binary API. Statically linked binaries built against 1.0 series kernels are supposed to still run fine to this day on the latest 2.6 series kernels.
If you are statically linking everything (including libc), then the major problem you are likely to face is different filesystem arrangements, which may not even be a great issue for you. (Testing is the only way to find out, though.)
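A few quick checks covering the points above (myapp is a placeholder):
    file ./myapp                          # architecture; "statically linked" vs "dynamically linked"
    ldd ./myapp                           # prints "not a dynamic executable" for a fully static binary
    objdump -T ./myapp | grep GLIBC_      # for dynamic builds: glibc symbol versions the binary requires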
An idea is to survey your proposed customer base to see which Linux versions they run and make a short list from their feedback. However, from what I know (which is subjective!)...
I would suggest two different distribution formats -- rpm and .tar.gz. With rpm you cater for the latest Fedora/openSUSE/RHEL/SLES (and derived distros, which is a fair chunk of the corporate market). You are already handling a lot of the dependency problem by static linking, so the kernel version should be a sufficient check.
With a .tar.gz distribution you cater for 'all others', but watch out for support and configuration problems, as they quickly become a time sink.
For testing, have virtual machines of each version you choose to support. These can also be used for product support (I assume you will need to provide product support?). I wouldn't try to extrapolate results between Linux versions because there are too many hidden 'gotchas'.
You can release statically compiled Linux binaries against a given kernel and glibc version; you really only need to worry about compatibility-breaking revisions. If you have some time, you can set everything up to cross-compile on the same host. The kernel is backward compatible; glibc is more temperamental.
File paths can be assumed to follow the Linux Standard Base if you want to package it with an installer. The more flexible you can be here, the better. I've never heard a customer complain about receiving a tarball of binaries, which I'd recommend offering. I have had customers complain about incorrect assumptions.
Your best bet for a formal package format is probably between DEB (Debian and derivatives, like Ubuntu) and RPM (Red Hat and derivatives, like CentOS). Packages are nice to have, but they are just a headache if you don't plan on utilizing the native update manager.
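A hedged sketch of the static-build-plus-tarball route described above (myapp is a placeholder; note that even a -static build can still load shared objects at run time through glibc's name-service machinery):
    # Build a fully static binary and ship it as a plain tarball
    gcc -static -O2 -o myapp main.c -lpthread
    file myapp                            # should report "statically linked"
    tar czf myapp-1.0-linux-x86_64.tar.gz myapp README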
For test & build, I'd personally recommend Gentoo. It's pretty raw, however, so you might want to look into Ubuntu as a distant second choice.
This is an issue for your product management team. Once they have determined that producing a Linux version is a desirable idea (i.e. on a cost-benefit basis), then you will need to find out what distros your customers use or want supported.
In principle you can support any of them, but the more you support, the more of a headache it will be, so you want as FEW as possible.
Support as few OS / architecture combinations as your PM thinks you can get away with
Deprecate OSs / architectures as soon as you can
Only take on new ones if premium support customers demand it, or to get big deals, as per your PM's decision.
How hard it is to support them is largely dependent on how complex your product is (esp. dependencies) and how complete its auto-test suite is. Adding more supported OSs ties your hands with respect to library usage, kernel feature usage etc as well as testing, so it's not something you want to be lumbered with long-term.
So in short, it's not a software engineering issue, but a product management one.

Which Linux distribution fits a new learner of Linux programming?

As I came to Linux, I found the commands are different in openSUSE and Ubuntu.
Which of them is suitable for somebody who is new to Linux and wants to master the commands needed for programming and general use?
I got the impression that openSUSE does some things a little unconventionally (kdesu, gksu), but it's a fine (KDE) distro. I've found (K)Ubuntu is a little better for beginners, since it has access to huge compiled package repositories, plus the community is unbeatable.
They're pretty similar for most things, including programming.
Whichever one the people around you know, so that you have someone to ask questions of. The value of a knowledgeable support network vastly outweighs the benefits of a particular distribution. If you don't have a local support network, I'd go with Ubuntu; they tend to have more useful resources on the 'net (and it's the distro I'd prefer out of those two options).
Any of Ubuntu/Slackware/Gentoo should be fine as a development environment. You didn't mention what kind of programming you're interested in, and that may have some influence on the answer, too.
If you are at all interested in writing better code, I also recommend dual-booting (or running a second computer) with a non-Linux system such as OpenBSD/FreeBSD, OpenSolaris, etc. Writing code that's portable across UNIX systems isn't just a good idea for portability's sake; it can also help shake out bugs. The same can be said for working with 32-bit vs. 64-bit and big-endian vs. little-endian platforms. You can pick up an old 64-bit Sun workstation cheaply and use it to test your code.
We'd have to know your preferences better to be able to answer that. Basically any Linux would do, I guess. I've heard nice feedback about Ubuntu, though.
I'm running Ubuntu right now. It was easier than Windows to get running - very smooth. Of course, advanced functions are beyond me right now; with C# classes and everything else I gave up on trying to learn too much and just run it as is. It has a very good user interface, and I've heard a new version is out or coming out as well, probably with more eye candy.
Check out Ubuntu. Doing a dual boot can't hurt, since then you can try it out! You can also run some distributions straight off the disc to try them out... it can't hurt to try when it is free. It has been much more stable for me than XP, and faster. I HOPE Windows 7 ends up being less of a monster! I'd stick with Ubuntu if it were more compatible with Windows programs - .NET development while in school typically isn't done on a Linux distribution!
Best wishes, try it out!
I prefer Ubuntu. My 2 cents...
They are tools: use the one that you are most comfortable with. Really, calling them tools isn't a good analogy; better to call them vehicles. Use the vehicle that fits your needs and desires.
Actually, as a programmer you will face such questions every day. Which framework? Which language? Which data structure? ... you get the idea. There is no right answer.
Choose any; they are not too different. Soon you will no longer be a "new learner", and then it won't matter anyway.
Depending on how deeply you want to learn, one possible candidate distro is Linux From Scratch. It has awesome documentation, and playing with it will surely make you think more the "Linux" way.
This is a very hard question; most distros aim to be "the best", or at least "good enough", for a wide variety of activities, of course including programming.
It's also an issue that easily spawns "wars", where people fight to claim that the distribution they use is the best and that all others should conform. Heh.
My current preference is Gentoo, and I think one (probably minor) advantage it has for programming is that, since it is a source-based distribution, you typically never need to bother getting the "development version" of packages. If you have e.g. readline installed, you will have its header file(s) too, and so on. Many other distros split packages into "user" and "developer" versions, so you need to install both packages.
Of course, I guess in those cases the developer packages depend on the non-developer versions anyway, so if you always install developer versions, you'd be all set. Oh well. Nevermind, then. :)
When choosing a Linux distro I usually consider two things:
1: Packaging system (and release cycle):
openSUSE probably has the most up-to-date packages of any distro (without building your own), though Ubuntu's packaging system tends to hold your hand a little bit more. I have used both and found that as a developer I slightly preferred openSUSE, since it was easier to get the latest versions of development packages (for example, IDEs).
2: Default configuration/ease of administration:
All Linux distros have their quirks here. Both openSUSE and Ubuntu are well documented and have good support forums. openSUSE has YaST, which is a nice one-stop shop for most configuration tasks. Ubuntu seems to be slightly better at automatically configuring itself. Really, either distro is fine here.
The good news is that there is no wrong decision per se. I have used a lot of (more than 10) Linux distributions, and I now stick with openSUSE. Ubuntu was a close second; the only reason I don't use it is that I was often stuck waiting for its 6-month release cycle to get up-to-date dev packages (building the MonoDevelop beta was not feasible at the time). openSUSE's build service and the Packman third-party repository seem to keep nearly current packages for everything I've ever wanted.
I use Gentoo and Ubuntu for development.
Gentoo I love because I can so easily select which packages I have available and in which versions. The guy that did Flash 9 and 10 does his development on a Gentoo system as well.
Ubuntu I enjoy now because it's so stable. After a few years, a Gentoo installation will tend toward instabilities that sometimes require rebuilding the whole installation.
Another I'd look at is Slackware.
The principles are mostly the same across the different distributions, so I'd suggest you choose one and dig in. However, there are some considerations to make. If you want to write GUI programs, you need to decide which graphics toolkit library to use (Qt, GTK+, ...), which also implies the choice of your desktop environment (GNOME, KDE, Xfce). As you will notice, in the Linux world everything (or almost everything) depends on something else; I'm talking about the packages. It is quite common to reuse available libraries rather than write your own, so the decisions you will probably have to make are about which libraries/frameworks to use and which language.
I, however, chose Ubuntu and don't feel sorry at all.
It depends on what you want to do. If you want something easy to set up that basically just works, use Ubuntu. If you want to /learn/ Linux, I would recommend Slackware. Getting Slackware up and running will force you to know HOW and WHAT is going on with your installation. This can be good or bad, depending on your desires.
Ubuntu fits the bill, and the community is very helpful.
If you're exploring Linux/GNU as a programmer, you might consider selecting a distribution that uses the apt packaging tool.
You will likely need to install lots of libraries with development headers and obtain the source code of other things. Apt makes such things quite simple; it is very good at resolving package dependencies and fetching source packages.
Distributions using apt (either with .deb or .rpm packages) are Debian, Ubuntu (and its forks) and others.
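For example, on a Debian/Ubuntu system (the package names are just examples):
    # A library together with its development headers
    sudo apt-get install libreadline-dev
    # Everything needed to rebuild a package (requires deb-src lines in sources.list),
    # then the package's source code itself
    sudo apt-get build-dep hello
    apt-get source hello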
That being said, Ubuntu does a really good job of keeping up to date with recent libraries and tools while resisting the urge to cherry-pick alpha/unstable code. My desktop is my development machine; I use Ubuntu.
I would vote for Ubuntu. The principles are the same even if some of the "commands" are different; I assume you mean differences such as "sudo".
I'm very happy with my Ubuntu server, which handles all my development Windows VMs. I even used Ubuntu desktop on my laptop for a time... at least until I needed Visual Studio on it again. :-)
EDIT - "sudo" does exist in OpenSUSE
Gentoo is the easiest to use. I'd go with that
