Yum/apt-get before cpan to manage UNIX system-wide Perl modules?

Perl's cpan command is a powerful way to manage Perl modules. However, when maintaining modules system-wide under UNIX, Michal Ingeli notes that another possible option is
yum install 'perl(PerlModuleName)'. If available, should yum be my first resort in this case?
For example, the command cpanm CGI installs the CGI module under my ~/perl5 directory, which may be best if the CGI module is only needed by scripts run under my account. But this won't provide the CGI module to scripts run by other accounts.
I can use cpanm -l <directory> to force the cpanm command to load modules to a specific directory (e.g., cpanm -l /usr/local CGI to install CGI to /usr/local/lib/perl5), or I can edit ~/.cpan/CPAN/MyConfig.pm to change the default install location cpan uses.
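For example, lines like these in ~/.cpan/CPAN/MyConfig.pm would redirect cpan installs (the /usr/local target is just an illustration; makepl_arg and mbuildpl_arg are standard CPAN.pm config keys):
'makepl_arg'   => q[INSTALL_BASE=/usr/local],
'mbuildpl_arg' => q[--install_base /usr/local],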
But on nearly all systems, multiple Perl system library locations exist (/usr/local/share/perl5, /usr/share/perl5/vendor_perl, /usr/lib64/perl5, etc.), and choosing the correct one is somewhat arbitrary since these are not generated by the cpan command.
With this in mind, should I turn to yum (if available) before cpan for system-wide UNIX Perl module management? It's easy enough to test with a command like:
yum install 'perl(LWP::Simple)'
If yum failed in this instance, I would fall back to:
cpanm -l <directory> LWP::Simple
What do you recommend in this type of case, and why?
(Note that nxadm has answered a more general question about this.)
To summarize the answers so far (a brief shell sketch follows this list):
If at all possible, use the system package manager to update CPAN modules. E.g., for LWP::Simple:
yum install 'perl(LWP::Simple)', or
apt-get install liblwp-simple-perl
If the preceding fails, try to implement a separate Perl environment in which to use CPAN modules not present in the system-wide libraries. Consider local::lib or Perlbrew for this;
Only if the above options don't apply, use cpanm -l <directory> to load the module to a system-wide directory.
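In shell terms, the first and last steps of that fallback might be sketched like this (the module name and target directory are illustrative):
yum -y install 'perl(LWP::Simple)' || cpanm -l /usr/local LWP::Simple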

I can't speak from experience with RPM/yum systems, but I have done a lot of work with Perl applications on Debian systems, and I highly recommend going with the system-packaged versions of CPAN modules if you can. I know a lot of people disagree, and historically they may have had good reason, but I've been doing it for a long time and find it works very well.
In the Debian world there are an enormous number of Perl modules in pre-packaged form and if you happen to need one that isn't packaged you can build your own package with dh-make-perl and put it in your local apt repository. Being able to run apt-get install your-application and have it pull in all the required dependencies is a real time saver when your code is moving through Dev -> Staging/UAT -> Production workflows. It also gives you confidence that the version of a particular module you're deploying to production is the same as the one you tested in UAT.
One thing you absolutely should not do is use cpanm or the cpan shell as root to install modules into the system directories. If you decide to install direct from CPAN, then use local::lib to install the modules in an application-specific lib directory.
[Edit] Some sample commands as requested:
On a Debian-based system, you would first install the dh-make-perl tool:
sudo apt-get install dh-make-perl
Then to download a package from CPAN and build it into a .deb file you would run a command like this*:
dh-make-perl --build --cpan Algorithm::CouponCode
You could install the resulting .deb file with:
sudo dpkg -i libalgorithm-couponcode-perl_1.005-1_all.deb
Managing your own apt repository is a whole other topic. In my case I'd copy the .deb to an appropriate directory on the local apt server and run a script to update the index (I think our script uses dpkg-scanpackages).
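For what it's worth, the index-update step can be as simple as this (run from the repository directory; /dev/null stands in for the optional override file in the classic dpkg-scanpackages usage):
dpkg-scanpackages . /dev/null | gzip -9c > Packages.gz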
Note that in my opening paragraph above I recommend using system packages "if you can". To be clear, I meant the case where most of the modules you want are already packaged by Debian. The example above did not build packages for any dependencies. If your app involves installing modules with long dependency chains that are not in Debian already, then using cpanm and local::lib will simplify the install. But then you shoulder the burden of repeating that as your code advances through staging to production servers, and you may need to use cpanfile or Carton to make sure you're getting the same versions at each step.
* one gotcha: if you have previously set up local::lib so that cpan installs go into a private directory (e.g.: /home/user/perl5) then that will affect the pathnames used in the .deb produced by dh-make-perl. To avoid that, run this before dh-make-perl:
unset PERL5LIB PERL_LOCAL_LIB_ROOT PERL_MB_OPT PERL_MM_OPT

Your system's perl was put there for your system's use. The folks who maintain your distribution will update it to another version when it suits the needs of the system. Using your system's package manager to manage its modules is really your best bet.
Feel free to use the system perl, but if you need a different version for whatever reason, you are best off rolling your own in a separate location. When maintaining your own perl install, use CPAN.

Related

RPM Vs Tar based Installation

My knowledge of Linux administration is limited, so I wanted to ask here about the pros and cons of installing RHEL/CentOS Linux software from rpm packages versus installing from tar/zip files.
Thanks
a non-exhaustive list of pros and cons (example commands follow the list):
rpm
intelligent dependency management
conflict checking
allow easy and clean uninstall
allow for upgrades / downgrades
list all files owned by a package
a central database with all packages installed, which files they own, their interdependencies
from source
you choose all the compiler flags yourself
you can choose a custom installation path
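For example, the rpm side translates into queries like these (package name illustrative):
rpm -ql perl-libwww-perl       # list every file owned by the package
rpm -qf /usr/bin/lwp-request   # find which package owns a file
sudo rpm -e perl-libwww-perl   # clean uninstall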
I have tried to explain the differences, pros, and cons below.
Tar
Tar is basically the old way of distributing software on Linux; it has been around since Linux was created.
A tar archive usually contains source code, which must be compiled into binaries before we can use it.
Pros:
Using tar packages, you gain more control over the programs that you install.
If you want to leave certain portions out, you can do that as you go, which gives you the upper hand.
Cons:
The main issue is the maintainability of the installed packages.
They are hard to manage. Once you install something, there is no way to manage the software unless it is well documented. It is also hard to track versions, so you are left guessing which version of the software you have. The likely reason is that the files are not indexed anywhere: they can be spread across your file system, which makes the software difficult to remove or upgrade.
Hard to automate.
It is also hard to automate, because of the complexity of maintaining the packages.
Below I have tried to explain how a tar file is compiled, to give a better understanding.
Prepare (set up) the environment for building
./configure
This script has lots of options that you may need to change, like --prefix or --with-dir=/foo, which means every system can have a different configuration. ./configure also checks for missing libraries that need to be installed; anything wrong here causes your application not to build. That's why distros install packages in different places: every distro thinks it's better to install certain libraries and files into certain directories. The instructions just say to run ./configure, but in practice you should almost always adjust its options.
Building the system
make
This is actually make all by default, and every makefile defines different actions: some do the build, some run tests after building, some check out code from external SCM repositories. Usually you don't have to pass any parameters, but again, some packages handle this differently.
Install to the system
make install
This installs the package in the place specified to configure. If you want, you can tell ./configure to point to your home directory. However, many configure options default to /usr or /usr/local, which means you actually have to run sudo make install, because only root can copy files to /usr and /usr/local.
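For example, an unprivileged install into your home directory could look like this (the prefix path is just an illustration):
./configure --prefix=$HOME/opt/myapp
make
make install    # no sudo needed, because the prefix is writable by you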
Please go through the link below for more information on the above commands:
Why always ./configure; make; make install; as 3 separate steps?
RPM
The RPM Package Manager (RPM) is an open packaging system.
RPM packages are pre-compiled binary packages (source packages also exist) designed for an easy one-click installation experience. RPM by itself does not manage dependencies or resolve conflicts; combined with Yum or PackageKit, it will resolve all the dependencies for a package.
RPM makes system updates easy. Installing, uninstalling, and upgrading RPM packages can be accomplished with short commands. RPM maintains a database of installed packages and their files, so you can run powerful queries and verifications on your system. During upgrades, RPM handles configuration files carefully so that you never lose your customisations, something you cannot accomplish with plain .tar files.
RPM also has the ability to verify packages. If you delete an important file belonging to some package, you can verify the package; you will be notified of any changes, at which point you can reinstall the package if necessary. Any configuration files that you modified are preserved during reinstallation.
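For example, verification and repair might look like this (the package name is illustrative, and yum reinstall assumes the same version is still available in the repositories):
rpm -V httpd               # report files that differ from the RPM database
sudo yum reinstall httpd   # restore missing or altered files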
Pros:
Install, reinstall, remove, upgrade and verify packages
Use a database of installed packages to query and verify packages
Use metadata to describe packages, their installation instructions, and so on
Package pristine software sources into source and binary packages
Add packages to Yum repositories
Digitally sign your packages
Querying a package (whether the package is on your local file system or already installed)
Validating a package (checking that a package has not been tampered with, before or after installation)
Cons:
Not as customisable as tar.
e.g. on usability, here is how to install a package using tar versus RPM:
in Tar:
$ tar xvf package.tar
$ cd package
$ ./configure --prefix=PREFIX
$ make
$ make install
in RPM:
rpm -U package-2.4.x-1.i686.rpm
That simple!
It basically depends on usability and the purpose of your use.
Each of them has its own pros and cons, depending on how and for what you use it.
I know this is a long explanation; I hope it gives you a clear picture. There are more aspects I have left untouched, such as architecture and execution, which I am not confident enough to explain here.
In simple words, you can say that rpms are prepackaged binaries. They're just ready to go; the package does everything for you. But to install an rpm or a deb you need to be root, with write permissions across the system. That leaves a serious security hole: you may unknowingly install a Trojan horse. Also, if a package is screwed up, it may cause the installation to fail altogether.
I personally recommend using tar, as it leaves you in more control. It is old school, I know, and that's why it is a bit more difficult, but in my opinion it is the best way to go.
You can further refer to the link:
https://tldp.org/HOWTO/Software-Building-HOWTO-4.html

How to automate the installation process on Linux-like operating systems?

I wrote a simple C application, but it has some dependencies. Instead of giving my friend (who is a Linux noob) commands to run in a terminal to install the dependencies, I would like to give him a single file that installs everything my application needs.
By the way, is a makefile a good idea, or would a bash script be more appropriate? I would like to ask for the root password only once, save it somewhere (in a script/makefile variable), and then simply use it to install all the dependencies. Any ideas how to do this the most professional way?
This is what packages are for.
Depending on the target OS/distribution, you will need to package a DEB or an RPM.
There are tools to simplify this process that allow declaring dependencies as well as running pre/post install/uninstall scripts.
The most professional way to distribute this is using a private repository.
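As a rough sketch, a hand-rolled Debian binary package looks like this; every name, version, and dependency below is hypothetical and must be adapted to your application:
mkdir -p myapp_1.0-1/DEBIAN myapp_1.0-1/usr/bin
cp myapp myapp_1.0-1/usr/bin/
cat > myapp_1.0-1/DEBIAN/control << 'EOF'
Package: myapp
Version: 1.0-1
Architecture: amd64
Maintainer: Your Name <you@example.com>
Depends: libc6, libcurl4
Description: My simple C application
EOF
dpkg-deb --build myapp_1.0-1
sudo apt install ./myapp_1.0-1.deb   # apt resolves the Depends line
Your friend then only needs that last command.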

Make - Make Install and Linux update

I am new to Linux and just trying my hand at it.
The following command is very useful:
sudo apt-get install <application>;
As it adds the application to the Linux programs list and automatically upgrades it when the update manager runs.
But I would like to get more knowledge on installing the programs from the .tar.gz archives as well.
So I do:
Extract the archive
./configure;
make;
make install;
I have two questions in this process:
1) I read in a forum that "make install" is not good if we are updating the binaries.
So should I just do "make" and then the "install" step separately?
2) My second question: is there a way to add a program installed in this manner to the Linux Software Update list, so that I do not have to use the terminal for every new version that is released?
Installing programs from tarballs:
You really do not want to install packages from .tar.gz when they are in the repositories. It is much harder to update or remove software manually than it is with apt-get.
If you really have to compile a program yourself, use checkinstall instead of make install. This creates a package that you can install via the package manager and later remove using apt-get. This is much cleaner.
Also you may want to type
./configure && make && sudo checkinstall
instead of the commands you wrote. This way the program is only compiled if the configuration succeeded, and the package is only built if the compilation succeeded. With ; instead of &&, every step would be attempted regardless of whether its prerequisites succeeded.
Graphical package managers
You can also install packages from GUI programs. Kubuntu, for example, uses muon for this, but the programs vary between distributions.
make install is "not good" if you want to be able to easily remove the files associated with a package, as there is no log of the work it does and often no easy way to reverse the process. That has little to nothing to do with updating the software, though (although updates can certainly run into related issues).
No, you can't add manually compiled and installed software to your distribution's list of packaged software (other than through something like checkinstall, or by creating a package yourself), since that is exactly what you avoided in the first place.
That all being said, if the package exists for your distribution and you want to build it from source yourself, you can often just build a more-or-less official version of the package from the distribution's source package.
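On a Debian/Ubuntu system that workflow might look like this (the package name is illustrative, and deb-src lines must be enabled in your APT sources):
sudo apt-get build-dep hello    # install the build dependencies
apt-get source hello            # fetch and unpack the source package
cd hello-*/
dpkg-buildpackage -us -uc       # build an unsigned .deb
sudo dpkg -i ../hello_*.deb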

Install CPAN Modules without messing up the system Perl installation

I have heard that it is best not to install modules from CPAN into the place where your system's version of Perl lives. I know how to install modules using the command line; I was just wondering if there is a way to keep CPAN separate from the system's core Perl.
Should I:
Download the source and make a directory specifically for these modules?
Anybody have any other ideas or implementations they have used successfully?
I am using Arch Linux with Perl 5.16.2.
Are you looking for something like local::lib?
local::lib - create and use a local lib/ for perl modules with PERL5LIB
Download and extract the latest version of local::lib:
curl -LO http://search.cpan.org/CPAN/authors/id/A/AP/APEIRON/local-lib-1.008004.tar.gz
tar xzf local-lib-1.008004.tar.gz
cd local-lib-1.008004/
Deploy it:
perl Makefile.PL --bootstrap=$HOME/perl5
make
make test
make install
Save persistent configuration:
cat << PROFILE >> $HOME/.profile
eval \$(perl -I\$HOME/perl5/lib/perl5/ -Mlocal::lib)
PROFILE
Now you can log off and back on, or simply source ~/.profile.
After these steps, CPAN modules will be installed locally.
You don't need to install the module manually. You just need somewhere to install it to, and your environment configured to install it there. Then you can use cpan/cpanp/cpanm/etc. as normal. (cpanminus wins for me.)
Setting up that environment manually is a bit of a pain, so most people use an application to set up the configuration for them.
The two main choices for this are:
local::lib — This sets up your environment variables so you can install modules away from the system perl, but continues to use the system perl.
Perlbrew — this installs a complete perl for you, letting you avoid your system perl entirely and use a more up-to-date version of perl itself than might come with your system. It also manages multiple perl installs side by side (so you can test your modules against different versions of perl).
Personally, I prefer Perlbrew, as it makes it easy to play with shiny new features like the yada yada operator and smart match (not that smart match is all that new now), but it takes longer to set up (as you have to compile perl).
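For the record, a minimal Perlbrew bootstrap looks something like this (the version number is just an example, and the install step compiles perl, which takes a while):
\curl -L https://install.perlbrew.pl | bash
source ~/perl5/perlbrew/etc/bashrc
perlbrew install perl-5.16.3
perlbrew switch perl-5.16.3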
I have heard that it is best not to install modules from CPAN into the place where your system's version of Perl lives.
The idea is to avoid breaking your distro's tools by upgrading a module they use.
Installing the modules to a fresh directory and telling Perl about it using PERL5LIB (which is what the aforementioned local::lib does) is not going to help at all in that case, since Perl sees exactly the same thing as if you had installed the module in the usual site directory.
(One would mainly use PERL5LIB to install modules when one doesn't have permission to install to the default directories.)
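For illustration, that mechanism is nothing more than an environment variable pointing at a private directory (the path is an example):
export PERL5LIB=$HOME/perl5/lib/perl5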
The other problem with using the system Perl is that you are prevented from upgrading it.
The solution to both is to install your own build of Perl. This is very easy to do using perlbrew.
What about cpanminus?
CPAN minus module
Why don't you pack the modules into real packages, rpm or deb style? That way you keep control over the installed software: you can remove and update the packages as required, in the way you are used to. So instead of bypassing the package management, which is rarely a good idea, you stay in control.
If you are using an rpm-based distribution, I really recommend OBS (the Open Build Service) for this task. You can create your own project, configure sources, test them, and have packages created for all sorts of distributions and architectures. And when you import your home project's repository into your software management, installing the packages comes down to a single click.

How to find out and control where Perl modules are stored locally?

Some Perl modules, such as DBI, need to be downloaded, compiled and installed.
I'm connecting to a remote production testing computer, for which I have only my local user password (no root, for obvious reasons). I've used wget to download some external modules that I need, such as DBI, and unpacked them, resulting in directories like ~/modules/DBI-<version>.
Normally, when compiling something for Linux, you run configure to pre-configure everything before installation; and one of its switches is --prefix=<some_dir>, which controls where the compiled executable and all compiled dependencies will ultimately end up.
But for Perl modules, you don't run configure, so my first question is:
Can I control where the compiled modules (e.g. DBI.pm) go when I run make? If so, how?
Failing that, I at least need to update @INC so that I can refer to the module; so my second question is:
How can I find out where the compiled modules went when I ran make?
I can't issue make install after compiling, and moreover, I've been asked not to. (I've been asked to design the script so that it doesn't rely on external modules being in the standard system path.)
Perl modules should be installed either with the distribution's packaging system (emerge on Gentoo, pkg_add on BSD, etc.) or by using CPAN. Don't do what you're doing; that is going to confuse both you and the system.
perl -MCPAN -e "install DBI"
You can use local::lib to install Perl modules in a custom directory. Modules so installed can be used from Perl scripts:
use local::lib '/path/to/custom/directory'; # Custom modules can be `use`d from hereon
cpanm uses local::lib internally when you use the -l or -L flag. To install a module in the current directory:
cpanm -l. DBI
The installation directory is set when the makefile for the module is built. Each module comes with a Makefile.PL, which must be run to build the makefile, taking the current Perl configuration into account. Makefile.PL accepts a PREFIX option that says where the build is going to be installed, so after unpacking the module's distribution and cd-ing into the unpacked directory you can say
perl Makefile.PL PREFIX=/module/directory/path
make
make test
make install
This process is described in the Perl documentation; read perldoc perlmodinstall. You could also go into the CPAN shell and use the 'o' (lower-case) option, which lets you change the options passed to Makefile.PL, but I think the manual build/test/install is more straightforward and gives you more control over the process.
Remember to add
use lib qw(/module/directory/path);
to the start of your program to make sure Perl searches the new directory for modules.
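To confirm where Perl is actually searching, and which copy of a module it loaded, you can ask perl directly (DBI here is just an example):
perl -e 'print "$_\n" for @INC'              # the module search path
perl -MDBI -e 'print $INC{"DBI.pm"}, "\n"'   # full path of the DBI.pm that was loaded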
