Perl 5.10 with Spreadsheet-ParseXLSX-0.17? - linux

Hi, I sit behind a firewall at work, unable to get a direct route to the internet, but I want to install a version of Perl for Linux that comes with, or is capable of being made to run, Spreadsheet::ParseXLSX.
I am modifying a program that uses Spreadsheet::ParseExcel to work with xlsx files, but I am new to Perl installations and only have user access.
Help would be appreciated.

In general you can get along fine by copying the Perl modules from a distribution into a directory of your choice on the server and adding that directory to the PERL5LIB environment variable. Observe the local directory structure the distribution defines for its files. In the case of the Spreadsheet modules, that would be:
_your dir_/
    Spreadsheet/
        ParseExcel/
            ...(lots of stuff)...
        WriteExcel/
            ...(lots of stuff)...
        XLSX/
            Fmt2007.pm
            Utility2007.pm
        ParseExcel.pm
        WriteExcel.pm
        XLSX.pm
Offhand I don't remember any dependencies that aren't satisfied by the core modules of 5.10 - however, if there are some, your perl will tell you ;-).
A slightly more robust method is to install the modules on a local machine under your control, using e.g. the CPAN module, and then copy the files from the build subdirectory or the site_perl subdirectory of your Perl installation.
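For example, a minimal sketch of the copy-and-PERL5LIB approach (all paths and the distribution version are placeholders):
# Copy the distribution's lib/ tree into a directory you own:
mkdir -p ~/perl5lib
cp -r Spreadsheet-ParseExcel-0.65/lib/Spreadsheet ~/perl5lib/
# Make perl search it (add the export to ~/.profile to persist):
export PERL5LIB=$HOME/perl5lib:$PERL5LIB
# Quick check that the module is now found:
perl -MSpreadsheet::ParseExcel -e 'print $Spreadsheet::ParseExcel::VERSION, "\n"'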
Caveat
This practice will only work reliably with pure-Perl modules!
It will always be better to approach the sysadmin people and ask them nicely to install the needed modules!

Related

Can one just copy perl modules from one Linux machine to another?

I have a remote CentOS server (Release 6.10) set up by someone else. I have quite a few perl modules installed on the machine.
I have set up a local CentOS server (Release 7.7.1908). I would like to have the EXACT same set of perl modules on my local machine. Installing them one by one via cpan is an option, but I could run into issues, as some of the Perl modules are (very) old versions.
I was wondering, if I can copy modules from the remote server to my local server. Can I do that? Are there other options?
It's not safe to copy modules from one machine to another because things may not be set up identically. It's best to reinstall them.
You can use the autobundle command in the cpan shell to create a dump of all the modules you have installed on the old machine. You can then use that dump to tell the cpan shell on the new machine what modules to install.
Thanks to Polar Bear, here's a link to an article that explains how to reinstall modules from the autobundle snapshot.
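In outline, the flow looks like this (the snapshot name below is illustrative; autobundle generates and prints the real one):
# On the old machine, inside the cpan shell:
cpan> autobundle
# ...writes e.g. ~/.cpan/Bundle/Snapshot_2020_01_01_00.pm and prints its name.
# On the new machine, install everything the snapshot lists:
cpan> install Bundle::Snapshot_2020_01_01_00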
The solution suggested by Andy Lester is perhaps the best way to do it. I found documentation here in addition to the links suggested.
In my case, however, this was not straightforward, because the source server environment is very old and there were many dependencies that I needed to resolve manually. In general, if you have similar environments and clean installs, the autobundle approach makes it easy.

How to manage development and installed versions of a shared library?

In short: This question is basically about telling Linux to load the development version of the .so file for executables in the dev directory and the installed .so file for others.
In long: Imagine a shared library, let's call it libasdf.so. And imagine the following directories:
/home/user/asdf/lib: libasdf.so
/home/user/asdf/test: ... perform_test
/opt/asdf/lib: libasdf.so
/home/user/jkl: ... use_asdf
In other words, you have a development directory for your library (/home/user/asdf) and you have an installed copy of its previous stable version (/opt/asdf) and some other programs using it (/home/user/jkl).
My question is: how can I tell Linux to load /home/user/asdf/lib/libasdf.so when executing /home/user/asdf/test/perform_test, and to load /opt/asdf/lib/libasdf.so when executing /home/user/jkl/use_asdf? Note that, even though I specify the directory with -L during linking, Linux uses other methods (for example /etc/ld.so.conf and $LD_LIBRARY_PATH) to find the .so file.
The reason I need such a thing is that, of course, the executables in the development directory need to link against the latest version of the library, while the other programs want to use the stable version.
Putting ../lib in the library path doesn't seem like a secure idea, not to mention that it's not completely correct, since you can't run the test from a different directory.
One solution I thought about is to have perform_test link with libasdf-dev.so and upon install, copy libasdf-dev.so as libasdf.so and have others link with that. This solution has one problem though. Imagine the following additional directory:
/home/user/asdf/tool: ... use_asdf_too
Which gets installed to:
/opt/asdf/bin: use_asdf_too
In my solution, it is unknown what use_asdf_too should be linked against. If linked against libasdf.so, it wouldn't work properly if invoked from the dev directory and if linked against libasdf-dev.so, it wouldn't work properly if invoked from the installed location.
What can I do? How is this managed by other people?
Installed shared objects usually don't just end with ".so". Usually they also include their soname, such as libasdf.so.42.1. The .so file for development is typically a symlink to the fully-versioned filename. The linker will look for the .so file and resolve it to the full filename, and the loader will then load the fully-versioned library instead.
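As an illustration (the version numbers follow the example above; real names depend on how the library was built), the installed files typically look like this, and you can check a library's recorded soname with readelf:
ls -l /opt/asdf/lib
# libasdf.so      -> libasdf.so.42.1   (dev symlink, used by the linker at build time)
# libasdf.so.42   -> libasdf.so.42.1   (soname symlink, used by the loader at run time)
# libasdf.so.42.1                      (the real file)
readelf -d /opt/asdf/lib/libasdf.so.42.1 | grep SONAME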

Install CPAN Modules without messing up the system Perl installation

I have heard that it is best not to install modules from CPAN into the same place as your system's version of Perl. I know how to install modules using the command line; I was just wondering if there is a way to keep CPAN modules separate from the system's core Perl.
Should I:
Download the source and make a directory specifically for these modules?
Anybody have any other ideas or implementations they have used successfully?
I am using Arch Linux with Perl 5.16.2.
Are you looking for something like local::lib?
local::lib - create and use a local lib/ for perl modules with PERL5LIB
Download and extract the latest version of local::lib:
curl -LO http://search.cpan.org/CPAN/authors/id/A/AP/APEIRON/local-lib-1.008004.tar.gz
tar xzf local-lib-1.008004.tar.gz
cd local-lib-1.008004/
Deploy it:
perl Makefile.PL --bootstrap=$HOME/perl5
make
make test
make install
Save persistent configuration:
cat << PROFILE >> $HOME/.profile
eval \$(perl -I\$HOME/perl5/lib/perl5/ -Mlocal::lib)
PROFILE
Now, you can logoff/logon your session or simply source ~/.profile.
After these steps, CPAN modules will be installed locally.
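To sanity-check the result (the module name is just an example):
# @INC should now list the $HOME/perl5 tree first:
perl -e 'print "$_\n" for @INC'
# ...and installs land under $HOME/perl5 instead of the system tree:
perl -MCPAN -e 'install DBI'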
You don't need to install the module manually. You just need to have somewhere to install it to, and your environment configured to install it there. Then you can use cpan/cpanp/cpanm/etc as normal. (cpan minus wins for me)
Setting up that environment manually is a bit of a pain, so most people use an application to set up the configuration for them.
The two main choices for this are:
local::lib — This sets up your environment variables so you can install modules away from the system perl, but continues to use the system perl.
Perlbrew — this installs a complete perl for you, letting you avoid your system perl entirely and use a more up-to-date version of perl than might come with your system. It also manages multiple perl installs side by side (so you can test your modules against different versions of perl).
Personally, I prefer Perlbrew, as it makes it easy to play with shiny new features like the yada yada operator and smart match (not that smart match is all that new now), but it takes longer to set up (as you have to compile perl).
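For reference, a typical Perlbrew bootstrap looks something like this (the version number is illustrative):
# Install perlbrew, make it available in the current shell, then build a perl:
curl -L https://install.perlbrew.pl | bash
source ~/perl5/perlbrew/etc/bashrc
perlbrew install perl-5.16.2
perlbrew switch perl-5.16.2
perl -v   # now reports the perlbrew perl, not the system one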
I have heard that it is best to not install modules from CPAN where your system's version of Perl is.
The idea is to avoid breaking your distro's tools by upgrading a module they use.
Installing the modules to a fresh directory and telling Perl about it using PERL5LIB (which is what the aforementioned local::lib does) is not going to help at all in that case, since Perl sees exactly the same thing as if you had installed the module in the usual site directory.
(One would mainly use PERL5LIB to install modules when one doesn't have permission to install to the default directories.)
The other problem with using the system Perl is that you are prevented from upgrading it.
The solution to both is to install your own build of Perl. This is very easy to do using perlbrew.
What about cpanminus?
See the CPAN minus module.
Why don't you pack the modules into real packages, rpm or deb style? That way you keep control over the installed software: you can remove and update the packages as required and as you are used to. So instead of bypassing the package management, which is rarely a good idea, you stay in control.
If you are using an RPM-based distribution, I really recommend OBS for this task. You can create your own project, configure sources, test them, and have packages created for all sorts of distributions and architectures. And when you import your home project's repository into your software management, installing the packages comes down to a single click.

How to find out and control where Perl modules are stored locally?

Some Perl modules, such as DBI, need to be downloaded, compiled and installed.
I'm connecting to a remote production testing computer, for which I have only my local user password (no root, for obvious reasons). I've used wget to download some external modules that I need, such as DBI, and unpacked these, resulting in directories like ~/modules/DBI-<version>.
Normally, when compiling something for Linux, you run configure to pre-configure everything before installation; and one of its switches is --prefix=<some_dir>, which controls where the compiled executable and all compiled dependencies will ultimately end up.
But for Perl modules, you don't run configure, so my first question is:
Can I control where the compiled modules (e.g. DBI.pm) go when I run make? If so, how?
Failing that, I at least need to update @INC so I can refer to the module; so my second question is:
How can I find out where the compiled modules went when I ran make?
I can't issue make install after compiling, and moreover, I've been asked not to. (I've been asked to design the script so that it doesn't rely on external modules being in the standard system path.)
Perl modules should either be installed with the distribution's packaging system (e.g. emerge on Gentoo, pkg_add on BSD, etc.) or by using CPAN. Don't do what you're doing; that is going to confuse you and the system.
perl -MCPAN -e "install DBI"
You can use local::lib to install Perl modules in a custom directory. Modules so installed can be used from Perl scripts:
use local::lib '/path/to/custom/directory'; # Custom modules can be `use`d from hereon
cpanm uses local::lib internally when you use the -l or -L flag. To install a module in the current directory:
cpanm -l. DBI
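For a one-off run against that directory, you can also load local::lib from the command line (the path is illustrative):
# -Mlocal::lib=DIR adds DIR's lib/perl5 tree to @INC for this invocation only:
perl -Mlocal::lib=. -MDBI -e 'print $DBI::VERSION, "\n"'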
The installation directory is set when the makefile for the module is built. Each module comes with a Makefile.PL which must be run to build the makefile, taking into account the current Perl configuration. Makefile.PL has a PREFIX option that says where the build is going to be installed, so after unpacking the module's distribution and cd-ing to the unpacked directory you can say
perl Makefile.PL PREFIX=/module/directory/path
make
make test
make install
This process is described in the Perl documentation - read perldoc perlmodinstall. You could go into the CPAN shell and use the 'o' (lower-case) option that allows you to change the options passed to Makefile.PL, but I think the manual build/test/install is more straightforward and gives you more control over the process.
Remember to add
use lib qw(/module/directory/path);
to the start of your program to make sure Perl searches the new directory for modules.
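If you are unsure which subdirectory make install actually used under your PREFIX (the exact layout varies with the Perl build), you can locate the .pm files and export PERL5LIB as an alternative to use lib (the site_perl path below is only an example):
# Find where the modules landed:
find /module/directory/path -name '*.pm'
# Point perl at the containing directory:
export PERL5LIB=/module/directory/path/lib/perl5/site_perl:$PERL5LIB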

What's the accepted method for deploying a linux application that relies on shared libraries?

I have an application that relies on Qt, GDCM, and VTK, with the main build environment being Qt. All of these libraries are cross-platform and compile on Windows, Mac, and Linux. I need to deploy the application to Linux after deploying on Windows. The versions of VTK and GDCM I'm using are trunk versions from git (about a month old), more recent than what I can get via apt-get on Ubuntu 11.04, which is my current (and only) Linux deployment target.
What is the accepted method for deploying an application that relies on these kinds of libraries?
Should I be statically linking here, to avoid LD_LIBRARY_PATH? I see conflicting reports on LD_LIBRARY_PATH; tutorials like this one suggest that it's the 'right way' to modify the library path to use shared libraries through system reboots. Others suggest that I should never set LD_LIBRARY_PATH. In the default version of GDCM, the installation already puts libraries into the /usr/local/lib directory, so those libraries get seen when I run ldd <my program>. VTK, on the other hand, puts its libraries into /usr/local/lib/vtk-5.9, which is not part of the LD_LIBRARY_PATH on most user's machines, and so is not found unless some change is made to the system. Copying the VTK files into '/usr/local/lib' does not allow 'ldd' to see the files.
So, how can I make my application see VTK to use the libraries?
On Windows, deploying the DLLs is very straightforward, because I can just include them in the installer, and the application finds them because they are in the local directory. That approach does not work on Linux, so I was going to have the users install Qt, GDCM, and VTK from whatever appropriate source and use the default locations, and then have the application point to those default locations. However, since VTK is putting things into a non-standard location, should I also expect users to modify LD_LIBRARY_PATH? Should I include the specific versions of the libraries that I want and then figure out how to make the executable look in the local directory for those libraries and ignore the ones it finds in the library path?
Every "serious" commercial application I have ever seen uses LD_LIBRARY_PATH. They invariably include a shell script that looks something like this:
#!/bin/sh
here="${0%/*}" # or you can use `dirname "$0"`
LD_LIBRARY_PATH="$here"/lib:"$LD_LIBRARY_PATH"
export LD_LIBRARY_PATH
exec "$0".bin "$#"
They name this script something like .wrapper and create a directory tree that looks like this:
.wrapper
lib/ (directory full of .so files)
app1 -> .wrapper (symlink)
app1.bin (executable)
app2 -> .wrapper (symlink)
app2.bin (executable)
Now you can copy this whole tree to wherever you want, and you can run "/path/to/tree/app1" or "/path/to/tree/app2 --with --some --arguments" and it will work. So will putting /path/to/tree in your PATH.
Incidentally, this is also how Firefox and Chrome do it, more or less.
Whoever told you not to use LD_LIBRARY_PATH is full of it, IMHO.
Which system libraries you want to put in lib depends on which Linux versions you want to officially support.
Do not even think about static linking. The glibc developers do not like it, they do not care about supporting it, and they somehow manage to break it a little harder with every release.
Good luck.
In general, you're best off depending on the 'normal' versions of the libraries for whatever distribution you're targeting (and saying you don't support dists that don't ship recent enough versions of the lib), but if you REALLY need to depend on a bleeding-edge version of some shared lib, you can link your app with -Wl,-rpath,'$ORIGIN' and then install a copy of the exact version you want in the same directory as your executable.
Note that if you use make, you'll need $$ in the makefile to get a single $ into the argument that is actually sent to the linker. The single quotes are needed so the shell doesn't munge things...
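A short illustration of the quoting at each layer (the library and file names are hypothetical):
# What the linker must receive:  -Wl,-rpath,$ORIGIN
# On a shell command line, single quotes stop the shell expanding $ORIGIN:
gcc -o myapp main.o -L. -lfoo -Wl,-rpath,'$ORIGIN'
# In a makefile recipe, also double the $:  -Wl,-rpath,'$$ORIGIN'
# Verify what got recorded in the binary:
readelf -d myapp | grep -E 'RPATH|RUNPATH'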
Well, there are two options for deploying a Linux application.
The correct way:
make a package for your app and for the libraries, if they are so special that they can't be installed from standard repositories
There are two major package formats: RPM and DEB.
The easy way:
make a self-extracting file that will install the "Windows way" into /opt.
You can have libraries in the same directory as the executable, it's just not the preferred way.
