Are Git versions compatible between Windows and Linux?

Does somebody know if the Windows and Linux versions of Git are compatible?
I need to know because I need to share the disk where my local repositories live between Linux and Windows, which runs in VirtualBox on the Linux PC.
I develop on Linux, but I need to use Git on Windows when I work remotely (because of a VPN issue). Another option would be to always use Git from Windows, but I prefer not to start VirtualBox.
Has somebody done this? I suppose it could be a bit risky, or would it be OK to rely on the versions being 100% compatible? I would not like to have my repository corrupted...
Cheers, Henning

Yes, as far as the repository (database) format is concerned, they are compatible. Moreover, such compatibility is required even for "non-native" implementations like the JGit and libgit2 libraries.
I can only see two possible problems:
From time to time Git might change behaviour in a way that is not compatible with some of its past versions (but very rarely, and with bold warnings in the release notes long before the change is made).
So find out which versions you're running (git --version) in both worlds, and if there's a major difference between them (the X in a 1.X.Y version number), consider reading the changelog of the newer one for possible gotchas.
Potential filesystem issues: these days you can mount an NTFS volume read/write on a recent Linux kernel, and the same goes for Ext2 (but not Ext3, and certainly not Ext4) on Windows. In theory, though, you might accidentally hit a problem with these drivers; they haven't received as much love as their native counterparts.
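As a quick sanity check before relying on the shared disk, something like the following on both sides should be enough (the repository path is just a placeholder):

    # Compare the Git versions on Linux and on Windows
    git --version

    # Inside the shared repository, verify the object database is intact
    cd /path/to/shared/repo
    git fsck --full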

Are there any benefits to keeping your files (scripts) on the WSL filesystem?

When reading the WSL documentation, it is stated that:
"Unlike our practice with trying to exclusively install programs and software on Ubuntu, our files and folders need to live exclusively on the Windows FS [...] Windows and Windows Apps can only read and write Windows files, and VSCode will be making our changes."
I understand the reasoning behind this and indeed, if one uses VSCode for example, it all makes sense. But my question is:
Is there any real reason why you couldn't keep your files (i.e. scripts) on the WSL filesystem itself? More specifically, if you don't ever intend to use the Windows filesystem (i.e. you won't ever need a GUI or anything else on the Windows side), is there any sense in placing the files on the Windows FS?
Obviously you need to make sure you back up your data (GitHub or elsewhere), but aside from that, is there any downside? I guess what I'm saying is: can I use WSL like a VM? Can I keep BOTH software AND scripts all in WSL, separate from the Windows filesystem?
PS: The reason for avoiding a VM in this context is that I have a low-spec laptop which has struggled a lot in the past with VMs (slow, not enough RAM), and so far WSL seems to be running much more smoothly.
Thanks
The simple answer is yes, you can use WSL as if it were a VM. WSL is for the most part fully-fledged Linux, and you can use it as your primary operating system, ignoring the fact that you need to start it from within Windows. I haven't tried WSL 2, but it's said to be implemented as a fast VM, which is exactly what you ask for. (Further, the lack of GUIs can be mitigated using the built-in support for sending X data over SSH to the Windows half of your computer and displaying it with an X server. If I remember correctly, these two articles got me most of the way there.)
However, if you want to get pedantic, you can't store any files separate from your Windows filesystem on WSL 1. If you run e.g. Ubuntu, your Linux filesystem is always contained within %USERPROFILE%\AppData\Local\Packages\CanonicalGroupLimited.Ubuntu18.04onWindows_79rhkp1fndgsc\LocalState, so it technically won't be separate. I can't test WSL 2, but according to this article, WSL 2 also stores its data in that folder, just as a single VHDX image. Presumably every WSL distro stores its data on the Windows filesystem.
Warning: Do not access the files in your Linux filesystem inside AppData using Windows tools, or you run a high risk of corrupting those files.
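On the GUI point above, a rough sketch of the X-server route, assuming WSL 1 (where localhost is shared with Windows) and an X server such as VcXsrv already listening on display :0 on the Windows side:

    # Inside WSL: point X clients at the X server running on Windows
    export DISPLAY=localhost:0
    xeyes &   # any X client should now appear in a window on the Windows desktop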
Yes, you can; only place files on the Windows filesystem if you want to share them with Windows programs. Moreover, since Windows 10 1903 you don't even need to place files on the Windows filesystem to share them with Windows programs: they can access the Linux filesystem directly.
In WSL 2 they encourage you to keep everything on the WSL filesystem to take advantage of its performance improvements.
So, yes you can and you should.
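As a sketch of what that 1903 feature looks like in practice:

    # From inside WSL (Windows 10 1903 or later), open the current Linux
    # directory in Windows Explorer via the \\wsl$ network path
    explorer.exe .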

Best way to build cross toolchains on Mac OS X

I spent the last three weeks researching cross-development under Mac OS X. I want to achieve two separate results, but I believe they can be reached through the same path.
I want to
set up distcc to help my old Gentoo laptop using the iMac I recently got at home (OS X 10.6, 64 bit native) which I also use for iOS development, so Xcode 4 tools are already there;
develop my pet project, which is an ELF kernel for x86, x86_64, and ARM (and I'll stop here as it's OT).
So, after a lot of that thinking thing we all do in these cases, I came up with the idea that to reach the first goal I need to set up an i686-pc-linux-gnu toolchain (or is it i686-unknown-linux-gnu?) with all the appropriate versions (e.g. gcc-4.4) and make it callable by distcc. It seems like a reasonable task, but unfortunately there seem to be clearer tools and instructions for building toolchains for obscure archs like SPARC or MIPS, and not a single reasonably updated resource on the best way to go for x86. Therefore, first question: is there anybody that successfully built such a toolchain and feels like sharing the pain? :)
Second goal. My current workbench is made of Gentoo on an i686 laptop (yes, the same as in the first goal) with all the regular development stuff, and I use QEMU to test it (its gdb integration is awesome). What I'd really like to do is keep using the laptop while travelling (I do a lot of commuting) and continue to work and test on the iMac when I'm home (git is awesome in this respect). Hence, second question: is there anybody that has done something like this and wants to share?
I'd really appreciate any input. Seriously.
EDIT: I know about MacPorts, crosstool, and crosstool-ng. I tried installing i386-elf-binutils 2.18 from MacPorts just to discover I have 2.20 on my laptop. Also, I couldn't get gcc44 to produce i686-pc-linux-gnu ELF objects, and using i386-elf-gcc is not an option as I need 4.4 and the packaged one is 4.3.
This is no easy task, especially because you want to cross-compile for so many different platforms.
The most common approach is to run a virtual machine with the desired OS (e.g. VirtualBox, Parallels, VMware Fusion) and install your workbench tools there. This is popular because it's not complex to set up, and it also makes it easier to write, test and debug code for/from the target system.
Of course, if you search enough you'll find all sorts of hacks/tricks to set up a toolchain on Mac OS X and compile code for other architectures:
One of these uses Buildroot, but that means that there is no official support for Mac OS X.
Another one, also interesting, offers a .dmg package with the tools needed to compile for Linux on MacOS X.
You already mentioned Gentoo, so I think you should take a look at Gentoo Prefix. Gentoo Prefix lets you install a small Gentoo system in a user-defined directory (= prefix). From there, you can start a shell which lets you use Portage (= Gentoo's package system), which should enable you to install the necessary tools.
I do not know what shape Prefix on OS X is in today, but I was able to install it on a friend's MacBook a year or so ago. If you are interested, I can give further details about the installation process, which can be a bit tricky.
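Whichever way the cross toolchain gets built, wiring it into distcc on the Gentoo laptop is comparatively simple. A rough sketch, assuming the cross compiler on the iMac is installed as i686-pc-linux-gnu-gcc and is on distccd's PATH (the address is made up):

    # On the Gentoo laptop: tell distcc where the iMac is
    export DISTCC_HOSTS="192.168.1.10 localhost"
    # Route compile jobs through distcc so they can be distributed
    make -j8 CC="distcc i686-pc-linux-gnu-gcc"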

Linux distro/version to support when releasing software on Linux

We are about to release a couple of software products with Linux support.
For Mac and Windows, the number of versions to support is quite limited (XP, 2000, Vista, 7 for Windows; 10.4-10.6 for Mac). But for Linux it's another story.
We'd like to support as many Linux distributions as possible, but the choice is large.
The questions are:
Which distribution format (binaries) should we use to support as many distros as possible?
For testing, what "base Linux" can we test on and extend our results to other distros?
Assuming we provide a statically linked binary with all the dependencies, what do we need to check? I assume kernel version and libc version, but I'm wondering.
Our software is written in ANSI-compliant C with a bit of BSD and POSIX (gettimeofday, pthreads).
So you think three versions each for Mac and Windows is normal, but you shy away from Linux? Hm.
Just make sure it builds using the standard tool chains -- configure, make and make install traditionally. The rest should take care of itself.
Else, pick what you are comfortable with. For me that would be Debian/Ubuntu, others prefer Fedora. Look at the Linux Standards Base and things like FreeDesktop.org for other standards. Kernel and libc should not matter unless you are doing something very hardware or driver-specific.
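For reference, the "standard tool chain" build mentioned above is simply (a sketch, assuming an autotools-based source tree):

    ./configure --prefix=/usr/local
    make
    sudo make install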
The kernel strives to maintain a backwards-compatible binary API. Statically linked binaries built against 1.0 series kernels are supposed to still run fine to this day on the latest 2.6 series kernels.
If you are statically linking with everything (including libc), then the major problem you are likely to face is different filesystem arrangements, which may not even be a great issue for you. (Testing is the only way to find out, though).
An idea is to survey your proposed customer base to see which Linux versions they run and make a short list from their feedback. However, from what I know (which is subjective!)...
I would suggest offering two different distribution formats: rpm and .tar.gz. With rpm you cater for the latest Fedora/openSUSE/RHEL/SLES (and derived distros, which is a fair chunk of the corporate market). You are already handling a lot of the dependency problems by static linking, so kernel version should be sufficient.
With the .tar.gz distribution you cater for 'all others', but watch support and configuration problems as they quickly become a time sink.
For testing, have virtual machines of each version you choose to support. These can also be used for product support (I assume you will need to provide product support?). I wouldn't try to extrapolate results between Linux versions because there are too many hidden 'gotchas'.
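A sketch of shipping both formats (file and package names are illustrative):

    # Build a binary rpm from a spec file
    rpmbuild -bb myapp.spec
    # ...and roll a plain tarball for everyone else
    tar czf myapp-1.0-linux-i686.tar.gz myapp/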
You can release statically compiled Linux binaries against the kernel and a version of glibc. You really only need to worry about compatibility-breaking revisions. If you have some time, you can set everything up to cross-compile on the same host. The kernel is backwards compatible; glibc is more temperamental.
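A minimal example of that kind of static build, with hypothetical file names (note that glibc still loads NSS plugins dynamically at runtime for functions like getpwnam, so "fully static" has caveats):

    # Link everything statically, including libc and libpthread
    gcc -static -o myapp main.c -lpthread
    # Confirm there are no runtime library dependencies
    ldd myapp    # should report "not a dynamic executable"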
File paths can be assumed to be Linux Standard Base, if you want to package it with an installer. The more flexible you can be here, the better. I've never heard a customer complain about receiving a tarball of binaries, which I'd recommend offering. I have had customers complain about incorrect assumptions.
Your best bet for a formal package format is probably between DEB (Debian and derivatives, like Ubuntu) and RPM (Red Hat and derivatives, like CentOS). Packages are nice to have, but are just a headache if you don't plan on utilizing the native update manager.
For test & build, I'd personally recommend Gentoo. It's pretty raw, however, so you might want to look into Ubuntu as a distant second choice.
This is an issue for your product management team. Once they have determined that producing a Linux version is a desirable idea (i.e. on a cost-benefit basis), then you will need to find out what distros your customers use or want supported.
In principle you can support any but the more you support the more of a headache it will be, so you want as FEW as possible.
Support as few OS / architecture combinations as your PM thinks you can get away with
Deprecate OSs / architectures as soon as you can
Only take on new ones if premium support customers demand it, or to get big deals, as per your PM's decision.
How hard it is to support them is largely dependent on how complex your product is (esp. dependencies) and how complete its auto-test suite is. Adding more supported OSs ties your hands with respect to library usage, kernel feature usage etc as well as testing, so it's not something you want to be lumbered with long-term.
So in short, it's not a software engineering issue, but a product management one.

Which Linux distribution fits a new learner of Linux programming?

As I got started with Linux, I found that the commands differ between openSUSE and Ubuntu.
Which of them is suitable for somebody who is new to Linux and wants to master the commands needed for programming and everyday use?
I got the impression that OpenSuSE did some things a little unconventionally (kdesu, gksu), but it's a fine (KDE) distro. I've found (K)Ubuntu is a little better for beginners since it has access to huge compiled package repositories, plus the community is unbeatable.
They're pretty similar for most things, including programming.
Whichever one the people around you that you can ask questions of know best. The value of a knowledgeable support network vastly outweighs the benefits of a particular distribution. If you don't have a local support network, I'd go with Ubuntu; they tend to have more useful resources on the 'net (and it's the distro I'd prefer out of those two options).
Any of Ubuntu/Slackware/Gentoo should be fine as a development environment. You didn't mention what kind of programming you're interested in, and that may have some influence on the answer, too.
If you are at all interested in making better code I also recommend dual booting (or running a 2nd computer) into a non-Linux system such as OpenBSD/FreeBSD, OpenSolaris, etc. Writing code that's portable across UNIX systems isn't just a good idea for portability's sake, it can also help shake out some bugs. The same can be said for working with 32-bit vs. 64-bit, big endian vs. little endian platforms. You can pick up an old 64-bit Sun workstation for cheap and use it to test your code.
We'd have to know your preferences better to be able to answer that. Basically any Linux would do, I guess. I've heard nice feedback about Ubuntu, though.
I'm running Ubuntu right now. It was easier than Windows to get running. Very smooth. Of course, advanced functions are beyond me right now. With C# classes and everything else, I gave up on trying to learn too much and just run it as is. It has a very good user interface, and I've heard a new version is out or coming out as well, probably with more eye candy.
Check out Ubuntu. Doing a dual boot can't hurt, since then you can try it out! You can also run some distributions straight off the disc to try them out... Can't hurt to try it when it is free. Much more stable for me than XP, and faster. I HOPE Windows 7 ends up being less of a monster! I'd stick with Ubuntu if it were more compatible with Windows programs. .NET development while in school typically isn't done on a Linux distribution!
Best wishes, try it out!
I prefer Ubuntu. My 2 cents...
They are tools. Use the one that you are most comfortable with. Really, calling them tools isn't a good analogy; it's better to call them vehicles. Use the vehicle that fits your needs and desires.
Actually, as a programmer you will face such questions every day. Which framework? Which language? Which data structure? ... you get the idea. There is no right answer.
Choose any. They are not too different. Soon you won't be a "new learner" anymore, and then it won't matter anyway.
Depending on how deeply you want to learn, one possible candidate distro would be Linux From Scratch. It also has awesome documentation, and playing with it will surely make you think in a more "Linux" way.
This is a very hard question; most distros aim to be "the best", or at least "good enough", for a wide variety of activities, of course including programming.
It's also an issue that easily spawns "wars", where people fight to claim that the distribution they use is the best and that all others should conform. Heh.
My current preference is for Gentoo, and I think one (probably minor) advantage it has when programming is that since it is a source-based distribution, you typically never need to bother to get the "development version" of packages. If you have e.g. readline installed, you will have its header file(s) too, and so on. Many other distros split packages into "user" and "developer" versions, so you need to install both packages.
Of course, I guess in those cases the developer packages depend on the non-developer versions anyway, so if you always install developer versions, you'd be all set. Oh well. Nevermind, then. :)
When choosing a Linux distro I usually consider two things:
1: Packaging system (and release cycle):
openSUSE probably has the most up-to-date packages of any distro (without building your own); Ubuntu's packaging system tends to hold your hand a little bit more, though. I have used both and found that as a developer I slightly preferred openSUSE, since it was easier to get the latest versions of development packages (for example, IDEs).
2: Default configuration/ease of administration:
All Linux distros have their quirks here. Both openSUSE and Ubuntu are well documented and have good support forums. openSUSE has YaST, which is a nice one-stop shop for most configuration tasks. Ubuntu seems to be slightly better at automatically configuring itself. Really, either distro is fine here.
The good news is that there is no wrong decision per se. I have used a lot of Linux distributions (more than 10) and I now stick with openSUSE. Ubuntu was a close second; the only reason I don't use it is that I found I was often stuck waiting for its six-month release cycle to get up-to-date dev packages (building the MonoDevelop beta was not feasible at the time). openSUSE's Build Service and the Packman third-party repository seem to keep nearly current packages for everything I've ever wanted.
I use Gentoo and Ubuntu for development.
Gentoo I love because I can so easily select which packages I have available and which versions. The guy that did Flash 9 and 10 does his development on a Gentoo system as well.
Ubuntu I enjoy now because it's so stable. After a few years, a Gentoo installation tends toward instabilities that sometimes require rebuilding the whole installation.
Another I'd look at is Slackware.
The principles are mostly the same across the different distributions, so I'd suggest you choose one and dig in. However, there are some considerations to make. If you want to write GUI programs, you need to decide which graphics toolkit library to use (Qt, GTK+, ...), which also implies the choice of your desktop environment (GNOME, KDE, Xfce). As you will notice, in the Linux world everything (or almost everything) depends on something else. I'm talking about the packages. It is quite common to reuse available libraries rather than write your own, so the decisions you will probably have to make are about which libraries/frameworks to use and which language.
I, however, chose Ubuntu and don't feel sorry at all.
It depends on what you want to do. If you want something that is easy to set up and basically just works, use Ubuntu. If you want to /learn/ Linux, I would recommend Slackware. Getting Slackware up and running will force you to know HOW and WHAT is going on with your installation. This can be good or bad, depending on your desires.
Ubuntu fits the bill, and the community is very helpful.
If you're exploring Linux / GNU as a programmer, you might consider selecting a distribution that uses the apt packaging tool.
You will likely need to install lots of libraries with development headers and obtain the source code to other things. Apt makes it quite simple to do such things, it is very good at resolving package dependencies and fetching source packages.
Distributions using apt (either with .deb or .rpm packages) are Debian, Ubuntu (and its forks) and others.
That being said, Ubuntu does a really good job at keeping up to date with recent libraries and tools, while resisting the urge to cherry pick alpha / unstable code. My desktop is my development machine, I use Ubuntu.
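A few typical apt commands for that workflow (the package name is just an example; apt-get source also needs deb-src lines in sources.list):

    # Compilers, make, libc headers and friends
    sudo apt-get install build-essential
    # Everything needed to rebuild an existing package
    sudo apt-get build-dep vim
    # Fetch its source code
    apt-get source vim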
I would vote for Ubuntu. The principles are the same while some of the "commands" are different. I assume you mean differences such as "sudo".
I'm very happy with my Ubuntu Server which handles all my development Windows VMs. I even used Ubuntu desktop on my laptop for a time...at least until I needed Visual Studio on it again. :-)
EDIT - "sudo" does exist in OpenSUSE
Gentoo is the easiest to use. I'd go with that.

Linux Lightweight Distro and X Windows for Development [closed]

I want to build a lightweight Linux configuration to use for development. The first idea is to use it inside a virtual machine under Windows, or on old laptops with 1GB of RAM at most. Maybe even a distributable environment for developers.
So the whole idea is to use a LAMP server, a Java application server (Tomcat or Jetty) and X Windows (any window manager, from FVWM to Enlightenment), Eclipse, maybe jEdit, and of course Firefox.
Edit: I am changing this post to compile a list of possible distros and window managers that can be used to configure a really lightweight development environment.
I am using personal experiences on this matter as the basis. Info about the distros can easily be found on their sites, so please focus on personal use of those systems.
Distros
Ubuntu / Xubuntu
Pros:
Personal experience in old systems or low-RAM environments - #Schroeder, #SCdF
Several suggestions based on personal knowledge - #Kyle, #Peter Hoffmann
Gentoo
Pros:
Not targeted at desktop users - #paan
Doesn't come with a huge amount of applications - #paan
Slackware
Pros:
Suggested as having the best performance with a wise install/configuration - #Ryan
Damn Small Linux
Pros:
Main focus is the lightweight factor - 50MB LiveCD - #Ryan
Debian
Pros:
Very versatile, can be configured for both heavy and lightweight computers - #Ryan
APT as package manager - #Kyle
Based on compatibility and usability - #Kyle
-- Feel free to add pros and cons to this, so we can compile a good reference.
-- X Windows suggestions keep coming in for XFCE. If there are others to add here, open a section for them like the distro one :)
Try using Gentoo. Most distros with X are targeted towards desktop users and by default include a lot of applications you don't need, while at the same time lacking a lot of the stuff you do need. You could customize the install, but usually a lot of useless stuff will get into the 'base' install anyway.
If you're worried about compile time, you can tell Portage (the Gentoo package management system) to fetch binaries when available instead of compiling. This lets you get the flexibility of installing a system with only the stuff you want.
I used Gentoo and never went back.
http://www.gentoo.org/
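The binary-fetching behaviour mentioned above is Portage's getbinpkg feature; a rough sketch (the package is just an example, and a binhost must be configured for prebuilt packages to be found):

    # One-off: prefer a prebuilt binary package if one is available
    emerge --ask --getbinpkg x11-base/xorg-server
    # Or make it the default by adding this to make.conf:
    #   FEATURES="getbinpkg"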
I installed Arch (www.archlinux.org) on my old Mac Mini (there is a PPC version), which only has 512MB of RAM and a single 2.05GHz processor, and it absolutely flies!
It is almost bare after installation, so about as lightweight as you can get, but it comes with pacman, a software package manager which is as good as apt-get (Ubuntu/Debian), if not better.
You have a choice of installing many desktop/window managers such as awesome, dwm, wmii, FVWM, GNOME, XFCE, KDE, etc. straight from pacman with a single command.
In my opinion(!!) it's lightweight like Gentoo, but being a binary distro it isn't as much hassle (although I can imagine it can be a little daunting if you're new to Linux). I had a system running (with X and the awesome WM) in about 1.5 hours!
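That "single command" looks roughly like this, run as root (package/group names are illustrative):

    # Install an X server plus a desktop environment group in one go
    pacman -S xorg-server xfce4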
I'm in a similar situation to Schroeder; having a laptop with 512MB of RAM is a PITA. I tried running Xubuntu, but to be honest I didn't find that it was either usable or a great saver on RAM. So I switched to Ubuntu and it's worked out pretty well.
My 2c:
I'd recommend basing your system on Debian - the apt system has become the de-facto way to quickly install and update programs on Linux. Ubuntu is Debian based with an emphasis on usability and compatibility. As for windowing managers, in my opinion Xfce hits the right balance between being lightweight and functional. The Ubuntu-based Xubuntu would probably be a good match.
Remember - for security only install essential network services like SSH.
If it were my decision, I would set up a PXE boot server to easily install Ubuntu Server Edition to any computer on the network. The reason why I would choose Ubuntu is because it's the one I've had the most experience with and the one I can easily find help for. If I needed a windowing manager for the particular installation, I would also install either Xfce or Blackbox. In fact, I have an old laptop in my basement that I've set up in exactly this way and it's worked out quite well for me.
I would recommend Arch Linux, which I'm using now. XFCE is my choice of desktop environment for now, but if you prefer a more lightweight one you can try LXDE.
Arch Linux is much like Gentoo, but with prebuilt binary packages and simpler configuration.
If all those distros still won't work for you, you may want to try LFS - Linux From Scratch.
I would recommend Xubuntu. It's based on Ubuntu/Debian and optimized for a small footprint with the Xfce desktop environment.
I am writing this on a Centrino 1.5GHz, 512MB RAM running Ubuntu. It's Debian based and is the first Linux distro I have tried that actually worked with my laptop on first install. Find more info here.
Second the Arch suggestion. You will be tinkering quite a few configuration files to get everything going, but I've found none better for a lean and mean setup.
I suggest you check out the following three distros:
Damn Small Linux - Very lightweight. It includes its own lightweight browser (Dillo), but you can install Firefox easily. The entire distro fits on a 50MB LiveCD.
Slackware - Performance-wise, Slackware will probably perform the best of the three, but I'd suggest running your own benchmarks on your hardware.
Debian - Debian is extremely versatile. This is the only distro of the three I'd recommend for both a 32-bit 1GB RAM laptop and a 64-bit 4GB RAM machine.
I would recommend something much lighter than XFCE: IceWM. It takes some time to configure it to be really usable, but it's worth it. I have a fully running IceWM setup which only takes about 5MB of RAM.
The primary reason I use Linux is because it can be lightweight. In 1999, I used Redhat, Mandrake (now Mandriva), and Debian. All were faster and more lightweight than my typical Windows 98 installations.
Not so anymore. I now have to research and experiment in order to find distros that are lightweight in both storage and memory footprint, and speedy. These are the ones I have played with lately:
Slitaz, a French distro (I use the English version and it works well).
Crunchbang, a lightweight Ubuntu and Debian-derived distro
Crux, which is source-only and very low-level geeky (I chose it because it has good support for PowerPC, and I was using it on my aging PowerBook G4)
Currently, however, I use Archlinux for most of my work, as it offers a good compromise between lightweight and feature-full.
But if you decide to roll your own distro from scratch, you may want to try Buildroot or OpenEmbedded. I do not have much experience yet with OpenEmbedded, but using Buildroot I have been able to create a very simple OS that boots quickly, loads only what I want, and only takes up 7MB of storage space (adding development tools will increase this greatly, of course; I am merely using it as an ssh terminal, although I can do some editing with vi, and some text-only web browsing).
As far as window managers go, I have been very happy with Openbox. I frequently experiment with the lighter-weight window managers listed on this page, however.
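For the Buildroot route mentioned above, the workflow is roughly this (a sketch, run inside an unpacked Buildroot source tree):

    make menuconfig   # pick the target architecture, toolchain and packages
    make              # builds a cross toolchain and an image under output/images/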
Here are my opinions as well. I have used Fedora, Gentoo, SliTaz, Arch Linux, and Puppy Linux for development. The constraints: the system virtual image had to be under 800MB to allow for easy download and include all necessary software. The system had to be fast and customizable. It had to support version control (SVN and Git), XAMPP or LAMP, an SSH client, a window environment (X or whatever) with the latest video drivers/higher resolution, and some graphical manipulation software for images.
I tried Arch Linux, Puppy, and SliTaz. I have to say that SliTaz was the easiest to work with and to set up. The complete base-OS install from the image is around 120MB using the cooking version. TazPkg is a great package manager, but some of the listed packages were outdated; some of the latest versions needed to be built from source code.
SliTaz is extremely lightweight, and you have to live with some older packages in the supported TazPkg package list. There is increasing support, and XAMPP, Java, Perl, Python, and SVN port well using TazPkg with the latest versions. SliTaz is all about customization and being lightweight. The final size was 800MB with all necessary software. Arch Linux and Puppy, although also lightweight, were over 1.5GB after all of the software was installed; the base systems were not comparable to SliTaz.
If anyone is interested in a virtual image of SliTaz with XAMPP to try out, get in touch and a link will be posted.
All the best and happy development! :)
