Copy installed ghc and all packages to another machine - haskell

I am facing a peculiar problem. Here at my high school we have about 10 donated computers (all the same model: same CPU, same memory, etc.), which are now running Debian after a reinstall. I have been trying to teach the pupils some Haskell; I have learned a little of it myself, and the kids are interested. The problem is that we are in a third-world country and the internet is very slow and costly. I installed the basic ghc and ghci on all machines from deb packages (found using apt-rdepends) after downloading all of the deb files once, on one machine, over a limited-time free internet connection. It took more than 10 hours to download all the missing ghc deb files.
I want to know whether the same trick is possible with cabal. I am willing to download all the required tar (or other) files once, on one computer, over the costly and slow connection, but I do not want to spend all my money downloading them from the internet again on each of the 10 computers.
I want to show the kids the diagrams and gloss packages, as they are enjoyable and fun.
I am inspired by this gentleman Smith.
How should I do this? And is there a general way that works for other packages, not just diagrams and gloss?
Thank you, and sorry for my bad English.

By default, cabal caches each package it downloads to ~/.cabal/packages (and prefers its cache to re-fetching the package unless you explicitly request a re-fetch). So it should be simple enough to just copy that directory between computers.
This would still require you to build all the packages on each machine. If you would prefer to skip even that step, you could consider directly copying GHC's package database around to each of the machines. This is a bit more delicate, but could save quite some time/power.
The global package database (where you should be installing packages that you want to be shared between users) is in /usr/local/lib/ghc-$version by default, and you should be able to copy that directory around to all your computers as well. You can check that you have installed the packages you want into the global database using ghc-pkg list, which will list all the package/version combos installed, separating them by whether they are installed in the global or user package database.
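A minimal sketch of the cache-copying idea, assuming the default cache location (~/.cabal/packages); the tarball name is made up:

```shell
# On the machine with internet, after cabal has downloaded the packages once:
mkdir -p "$HOME/.cabal/packages"              # no-op if it already exists
tar czf cabal-cache.tar.gz -C "$HOME" .cabal/packages

# Carry cabal-cache.tar.gz to each lab machine (a USB stick works) and unpack:
tar xzf cabal-cache.tar.gz -C "$HOME"

# Afterwards, `ghc-pkg list` shows what is registered in the global and
# user package databases.
```

On the lab machines, cabal should then find the tarballs in its cache instead of trying to fetch them.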

In the past I have done this to get GHC and Cabal working on a machine behind a firewall that "cabal install" couldn't see through.
You can use "wget" to download the latest version of every Hackage package. (Or you might try doing something similar with Stack, but I haven't tried that). Also download https://hackage.haskell.org/packages/index.tar.gz, which is the index file.
Install GHC, Cabal, and cabal-install, then find the cabal-install configuration file and point it at a local repository containing the index.tar.gz index file and the archives of the packages you downloaded. Then, hopefully, you should find that "cabal install" works from the local repository.
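The exact configuration stanza depends on your cabal-install version; with older releases, something along these lines in ~/.cabal/config worked (the path /srv/hackage-mirror is a made-up example for the directory holding the index and the package tarballs):

```
-- ~/.cabal/config (sketch; syntax varies by cabal-install version)
-- comment out or remove the remote-repo line, then add:
local-repo: /srv/hackage-mirror
```

Newer cabal-install versions use `repository` stanzas instead (e.g. with a file+noindex URL), so check the documentation for the version you ship to the machines.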

Related

Deploy linux binaries without internet access and without root?

boundaries:
I have a Linux64 (Redhat 6.8) test-server here which I can access via FTP in our intranet. This server has no access to the internet.
I want to use an SVN command-line client, and Python with cx_Oracle and an Oracle client, on that machine.
I don't have root access.
I don't have much idea of Linux
I thought I will start with the easiest thing which was SVN in my opinion:
My first guess was, that I could just download the binaries for SVN for Redhat 6 on my windows machine and copy them to the Linux machine using FTP.
But what I found here was "yum install subversion" (which does not work due to missing root and internet access) and a file "subversion_installer_1.9.sh" from WANdisco (which also needs root and internet access).
Why is this so complicated? I come from the Windows world and I am a little disappointed at the moment, because I always thought this kind of thing would be easy on Linux (just copy the binaries and you are good to go).
What do I overlook?
How would you do that?
You can "install" Subversion and Python and cx_Oracle without root access but since you are straying outside of the "normal" approach to things you will find it much more difficult than if you simply followed the "normal" approach. And the "normal" approach is to use a package manager (like yum) which requires root access. The Windows approach is simply different. There have been many arguments over which is "better" but I won't get into that here!
Installing something on Linux is as easy as copying binaries. The difficulty lies in getting the right binaries to copy. Unlike Windows, where the system API (kernel32.dll/user32.dll/gdi32.dll) is extremely consistent and highly compatible between versions, Linux distributions have multiple C libraries providing the system API (glibc, newlib, uclibc) and more frequent ABI breakage (whenever the n in libc.so.n changes).
When you download binaries from a repository hosted by your distribution maintainer, you know that they are built to use the same versions of the various dependencies as every other binary on your system. There's no such guarantee for binaries obtained from the developer, who may use a totally different distribution.
So the common approach for open-source projects such as Subversion is to obtain a source archive from the developer, unpack it, run ./configure to adapt the makefiles to the system libraries on your machine, run make to build binaries against your distro's particular flavor of those libraries, and then install into a directory you have write access to, e.g. by configuring with --prefix=$HOME/somesoft and running make install (or staging the files with make install DESTDIR=~/somesoft).

How to distribute open source package you built yourself?

I built ZeroMQ and Sodium from source and have them installed properly on my development machine, which is just a Pi 2. I have one other machine that I want to make sure these get installed on properly. Is there a proper way to do this other than just copying .a and .so files around?
So, there are different ways of handling this particular issue.
If you're installing all your built-from-source packages into a dedicated tree (maybe /usr/local, or /opt/mypackages) then simply copying files around is a fine solution, using something like rsync. Particularly since you only have two machines, anything more complicated may not be worth the effort.
If you're trying to install ZeroMQ and Sodium alongside system-managed files (in, e.g., /usr/lib and /usr/bin)...don't do that. That is, don't try to mix "things installed by packages" with "things installed from source", because that way lies sadness and doom.
That said, a more manageable way of distributing these files would be to build custom packages and then set up a local apt repository, so that you can just apt install the packages on your systems. There are various guides out there if you want to go down this route. It's a good skill to have in general, especially if you ever want to share your tools with someone else (because it makes it easy for them to install any necessary dependencies).
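A minimal sketch of hand-rolling a .deb (the package name mytool and all the metadata below are made up; real packaging usually goes through dpkg-buildpackage or similar tooling instead):

```shell
# Lay out the package tree: control metadata plus the files to install.
mkdir -p mytool_1.0/DEBIAN mytool_1.0/usr/local/bin
cat > mytool_1.0/DEBIAN/control <<'EOF'
Package: mytool
Version: 1.0
Architecture: all
Maintainer: You <you@example.com>
Description: Demo package built by hand
EOF
printf '#!/bin/sh\necho ok\n' > mytool_1.0/usr/local/bin/mytool
chmod 755 mytool_1.0/usr/local/bin/mytool

# Build the package: produces mytool_1.0.deb in the current directory.
dpkg-deb --build mytool_1.0
```

Once you have .deb files, one common approach is to drop them in a directory, generate a Packages index with dpkg-scanpackages (from dpkg-dev), and point the other machines' apt sources at it.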

Installing Meteor packages globally

Is there a way to install meteor packages globally?
That way, packages installed globally once would be available without an internet connection for projects created later, avoiding repeated downloads, among other benefits one may imagine.
This is like Node.js, where npm (the Node Package Manager) with the -g flag, as in npm install -g, installs packages into a global directory; when a JavaScript program loads a package, it is loaded from there if available, in addition to looking in the project's node_modules folder.
Meteor already downloads packages into a global repository that all your local apps benefit from.
So if you meteor add iron:router#1.0.7, it is downloaded and added to your project. The next time another project requires the same version, it is reused from that same spot.
Also, there is a PACKAGES_DIR environment variable which, when set, allows you to keep your own local packages in a central place so that you can share them among projects. In fact, you can keep that directory on a network drive (NFS) which your whole team can mount and use centrally.
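For example (the path below is just an illustration):

```shell
# Point Meteor at a shared directory of local packages; put this in your
# shell profile (or use the mounted NFS path) so every project picks it up.
export PACKAGES_DIR="$HOME/shared-meteor-packages"
```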
Yet there is an inherent problem. Meteor's version resolver looks for updates unless you pin down your package dependency versions, which is exactly why Meteor seems so desperate to be connected.
Even if you pin your dependencies, the packages you depend on may not have pinned theirs (which apparently is the case for most packages), so Meteor keeps looking for updates to the whole package tree and downloads whatever the version constraint resolver deems satisfactory.
The good news is that they are constantly improving the tooling: fewer lookups, faster builds, better search, etc.
All in all, there is not much you can do unless Meteor provides some way of hosting an entire mirror of its package repository for you to consume offline. And I guess that is very unlikely to happen.
Meteor is a tool for the connected world, and it does assume connectivity. Heck, the whole journey begins with curl https://install.meteor.com/ | sh
And yes, it would be great if we could hack away on a remote beach, or the 12 hour flight to that beach.
Until then, happy coding online ;)

Light weight packaging tool

I am looking for a good way to install an application I developed, with all its dependencies, in a fancy way. Currently I have a big makefile that downloads, unpacks, compiles, and installs all dependencies. This, however, is a little tedious, since there are quite a few dependencies and the makefile keeps growing, which will eventually make it hard to maintain. Therefore I am looking for a packaging tool with the following features:
It should be a lightweight package manager that is very easy to install (or even installs itself and afterwards all my dependencies)
The destination of the installed binaries, libraries, etc. should be customizable
Each dependency's installation process should be easy to configure
It should be possible to include self-written scripts that get executed at a specific point during the installation process (in order to manipulate makefiles, flags, etc.)
No admin rights should be necessary, since the clients that will install my application do not have admin rights and cannot use an already-installed package manager
I do not know if this kind of software exists; I don't have much experience with packaging tools myself.
Thx in advance for any link, hint, or suggestion!
opkg is based on ipkg (now defunct), which in turn was based on dpkg. It's used in embedded systems. Lightweight for sure.
What about ports from CRUX Linux (www.crux.nu)?
A quick search turns up InstallJammer. I would propose making debs, rpms, and tarballs and sticking with the standard installation process (root privileges and such), but if you can't do that, then, well, you can't.
I'm sure you know how suspicious it would look to the user.

Please recommend a way to deploy into a Linux box in a LAN environment

have you struggled with Linux deployment before?
I need to deploy an application into a dedicated Linux box with no outside network access.
The configuration should be as simple as possible, robust for different configurations (missing libraries and build tools) and preferably automatic. Another difficulty I need to consider is that I need to connect to an Oracle database.
What would you recommend as the best way for deployment? I have some ideas, but not sure which is the best.
I can use Java
I will need to install the JDK, and that mostly solves everything
Another big problem is that the Java code we presently have is poorly written and slow.
I'm not sure if I need to install Instantclient to connect to Oracle under Linux
I can use C (I do have the source code for a very well-written LGPL program)
And use dpkg to deploy
The Linux box is most likely a Ubuntu server, but I'm not sure which version is installed
I can't use apt-get, but I can copy all the packages I need
I know I can use dpkg -s to check which packages they are, but I'm really not sure whether I might miss dependencies.
I guess I will need build-essential and libpcap or the like
And use static linking
I configured it with ./configure LDFLAGS=-static with no errors, and it works on my computer now
I have chrooted into this directory and run it without problems; does this mean it is okay?
I really need to test this on a new Linux box to make sure
And use Statifier
I browsed stackoverflow and found this app, haven't tried it out yet.
Seems like people have used it with mixed success.
And create a build environment and make
I have no confidence that this is going to work
Using C leaves some problems
But the program is incomplete, I have to process this data, preferably not in C.
I have to install Instantclient, which is difficult to deploy
I can use Perl
I can't use CPAN
I have already downloaded the libraries, so maybe I could just copy them into the deployed machine, I am not sure how or whether this works
Perl is slow
I have to install Instantclient anyways
Please share your similar experience.
C with static linking solves a lot of the portability problems at the expense of a larger executable. To make sure that everything is truly statically linked and not secretly depending on any outside libraries, run ldd on your executable and make sure it isn't dynamically loading anything. Note that this won't be 100% portable among various Linux machines because Oracle Instantclient has some dependencies on kernel versions, but it should work on any reasonably new kernel.
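A self-contained way to see what ldd reports for a static binary (the toy program below is just an illustration; the exact wording of ldd's message varies between systems, and glibc warns at link time if a program uses features such as NSS that resist full static linking):

```shell
# Build a trivial program statically, then inspect it with ldd:
cat > t.c <<'EOF'
int main(void) { return 0; }
EOF
cc -static t.c -o t_static
ldd ./t_static || true    # typically reports "not a dynamic executable"
```

A dynamically linked binary would instead list its shared libraries with "=>" arrows, which is the telltale sign something was not linked in.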
Edit: If the box has LAN access and just no internet access, why not run your own apt repository on the local network? You could even create a .deb for your application and put it on the same server; then on that machine you just need to execute apt-get install myApplication and it will pull down your app and any not-yet-installed dependencies as well. Setting up an apt mirror is actually pretty easy, and this would be pretty slick. If network access is missing altogether, you can still create an install DVD that has all the debs, including your app, and set apt-get up to pull from there.
