Linux file utility magic.mgc database get content - linux

I'm writing a project where I need to identify certain file formats.
For some formats I have found signatures that make identification easy (MP3, Ogg), but with other formats (like MPEG ADTS) I have a big problem - I just cannot find what kind of signature can be used for them.
I found out that the file utility in the Linux environment can do it.
I tried searching for it in the source code, but I've found nothing.
I found that the file utility holds its database in a magic.mgc file, but it's held in binary form.
It looks like:
Does someone perhaps know how to find that database in plain text format?

That utility isn't a Linux-specific utility; it's the version of the UN*X file command originally written by Ian Darwin. The binary .mgc file is generated from a bunch of source files.
Your Linux distribution probably has a source code package for it; where you get that package, and how you install it, depends on which distribution you're using.
The source files from which the .mgc file was generated might also be available on your distribution without installing the source package for file; if so, you could use the file command to generate it, using the -C flag. I don't see them anywhere obvious on my Ubuntu 12.04 virtual machine, so that might require some other package to be installed (file itself is installed). (On OS X, they're in the directory /usr/share/file/magic.)
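If you do find the plain-text magic sources, compiling and using them is straightforward; the path below is the OS X location mentioned above, so substitute wherever your distribution keeps them:
file -C -m /usr/share/file/magic
which writes a compiled magic.mgc in the current directory. You can also point file at the plain-text sources directly, without compiling them at all:
file -m /usr/share/file/magic somefile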
Alternatively, you could download the standard version of that file (which might have been modified by your distribution, so you might not want that version) and modify and build it.
Note that, on some versions of UN*X systems, the bulk of the work done by the file command is done in library routines in the "libmagic" library; see whether your distribution has that or can install it (try, for example, man libmagic) and whether it can do the job for you.

Related

How to copy an executable with all needed libraries?

I have two fairly identical (Linux) systems, but one has just a minimal set of packages installed.
Now I need a way to copy all needed shared libraries as well. Currently I start the application on the source system and then go through the output of
lsof | grep <PID>
or
ldd <FILE>
to get a list of all libraries currently loaded by the application and copy them over manually.
Now my question is: before I start to automate this approach, run into lots of little problems, and end up with yet another reinvented wheel - is there a tool which already automates this for me? The tool I'm dreaming of right now would work like this:
$ pack-bin-for-copy <MY_EXE>
which creates a .tgz with all the shared libraries needed to run this executable.
or
$ cp-bin <MY_EXE> user@target:/target/path/
which would just copy the binary and the libraries it needs over in one go.
Note: I do NOT need a way to professionally deploy an application (via RPM/apt/etc.). I'm looking for a 'just for now' solution.
One tool that does something similar to what you suggest is linuxdeploy. While the tool is intended to ease the creation of an AppImage (see here for more information), it will pack your executable with any dependencies into a directory. Then you can just create a 'tgz' file of that directory instead of an AppImage.
ldd usage is correct if you also enable -Wl,--no-dynamic-lookup at link time.
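If you just want a quick "for now" script along the lines the question describes, a minimal sketch might look like the one below (pack-bin-for-copy is the hypothetical name from the question; note that libraries loaded at run time via dlopen() will not show up in ldd output, so this only catches statically recorded dependencies):
#!/bin/sh
# pack-bin-for-copy: bundle an executable with the shared libraries ldd reports
EXE="$1"
LIBS=$(ldd "$EXE" | awk '/=> \//{print $3}')
tar czf "$(basename "$EXE").tgz" "$EXE" $LIBS
Copy the resulting .tgz to the target machine and unpack the libraries somewhere LD_LIBRARY_PATH points to.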

AutoIt unzipping files

I've been searching all day for a solution to unzip a file with AutoIt Script. I would like to unzip a file called full.zip to a folder.
This is my last place to turn since I can't find a solution of my own. I have found many solutions made by others (AutoIt3 files containing functions), but the code has issues that I do not understand, and I'm unable to copy them here because I'm using a screen reader and it doesn't seem to format properly. This is why I cannot copy code here.
Does anyone know of a method, tutorial or resource that I can use to unzip a file with AutoIt?
Thanks for any help,
josh.
There are a lot of solutions people have coded. A few examples are the 7zip UDF, Zip.au3, and the zipfldr UDF. If those are not working for you, it is most likely because of small changes to AutoIt, which usually just means #includes being restructured.
I usually just keep 7za.exe around (7-Zip's standalone executable; 7-Zip can be downloaded from here, and after installing you can copy 7za.exe from its program directory).
Then it becomes as simple as a call to RunWait to create the archive:
RunWait("7za.exe a MyNewArchive.zip file1.ext file2.ext ...")
And then to unzip:
RunWait('7za.exe x MyArchive.zip -o"Path\To\MyOutputFolder"')
The 7-zip FAQ also mentions that you can use this exe in your own applications (including commercial ones) provided you mention it in the documentation and provide a link. That means you are ok to use FileInstall(...) to include 7za.exe in the compiled .exe.

how to create a debian package which updates only required files while updating the package

After a few weeks of struggle I am able to create a medium-sized native Debian package which works well for installing and removing the package.
http://www.quietsche-entchen.de/cgi-bin/wiki.cgi/-wiki/CreatingDebianPackages, the Debian wiki http://wiki.debian.org/HowToPackageForDebian and http://www.debian.org/doc/manuals/maint-guide/ are quite good material for beginners.
I have a basic problem: when upgrading the package, all the files in data.tar.gz are updated by default. I want only a few files in data.tar.gz to get updated, based on a key variable stored in all of the files. After the unpacking (that is, once the preinst script has been executed), all the files in data.tar.gz are already updated.
My idea was to take a backup of the files initially, before upgrading the package, and check the key variable in the files; if the key variable is greater than the current one, replace the file. That means writing a simple backup script and executing it in the postinst file.
I don't think this is a good idea, and moreover the limitations of dash scripting make it a very tough job.
What are you trying to accomplish here? During the reinstallation (or upgrading) of a Debian package, replacement of all of the non-conffiles with the latest version is exactly what's supposed to happen. If the file hasn't changed since the last installed version of the package then there's no harm in updating it anyway, and if it has changed, it's supposed to be updated.
If you have specific files which might be modified by the user and should be preserved across upgrades, make them conffiles. The package system will prompt the user and ask them whether they want to keep the package maintainer's version or the locally modified version.
(But if you're going to make every file a conffile, then you're probably doing something wrong.)
To make a file a conffile, list it in debian/conffiles. But if the file is going to be installed under /etc then you don't need to do this because dh_installdeb will do it for you.
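For example, a one-line debian/conffiles covering a config file your package installs outside /etc (the path is purely hypothetical) would be:
/usr/share/mypackage/mypackage-defaults.conf
Anything installed under /etc is, as noted, picked up automatically.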
EDIT following additional information in comment:
Suppose you have files test1.sh and test2.sh (among others) in your package. In the Debian world, they are either conffiles that are intended to be modified by the end user, or they're not.
conffiles should be relatively few in number and as short as possible, to minimize the burden of having to reconcile changes made by the package maintainer with conflicting changes made by the end user.
If there are things mixed into the code that the end user is likely to want to tune, try to factor them out into a configuration file. If you put that file in /etc, you don't even have to manually designate it as a conffile.
If the end user needs to make a change to a non-conffile, they should use the dpkg-divert protocol to (1) move the original file aside, and (2) edit a copy. Diverted files are respected by package upgrades. The end user who uses dpkg-divert should be aware that things might break after upgrades as a result, because the package maintainer hasn't foreseen that these files would be modified by end users and the locally modified version might be incompatible with a newly upgraded version of a different file. dpkg-divert should be used carefully and sparingly.
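As a concrete sketch of that protocol for one of the files mentioned above (the install path is made up for the example):
dpkg-divert --add --rename --divert /usr/share/mypackage/test2.sh.distrib /usr/share/mypackage/test2.sh
cp /usr/share/mypackage/test2.sh.distrib /usr/share/mypackage/test2.sh
The first command moves the packaged file aside and keeps it diverted across upgrades; the second gives you a local copy in its place to edit.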

let ./configure find library files in specific directory

I'm currently installing R on a shared space across several servers. After installation I found that when I log in on different servers, R is not guaranteed to run because some library files are missing on different machines.
Here is what I'm trying to do: since the installation of R is machine-dependent, I'd like to put all the missing library files, like libtermcap.so.2, libg2c.so.1, etc., into a single directory on the shared space, so that when I run ./configure, it will also search that directory. Since this directory is shared, the installation would become machine-independent, so I won't need to add the missing files on each server.
Is there an option to achieve this when I run ./configure? Thanks.
Assuming you have copied the library files to /shared/lib/ and the header files to /shared/include/, you can run
./configure LDFLAGS=-L/shared/lib CPPFLAGS=-I/shared/include ...other options...
Note, however, that you are bound to run into trouble at run time, when you have to convince your installation to use the shared libraries from the right directory, especially in case someone decides to upgrade the default version on the respective host. That whole business is platform and installation dependent. I think if your hosts are not at least mostly identical, you ought to install your software (R) locally in a way suitable to the respective system.
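One common way to take some of that run-time pain away (a general technique, not something configure does for you automatically) is to also bake the shared directory into the run-time search path of what you build:
./configure LDFLAGS="-L/shared/lib -Wl,-rpath,/shared/lib" CPPFLAGS=-I/shared/include ...other options...
Alternatively, setting LD_LIBRARY_PATH=/shared/lib in the users' login environment points every dynamically linked program at that directory, which may or may not be what you want.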
Peter's answer is correct (+1), and please take special note of his suggestion to install locally. Using the local package management system and auto updating on each box is (in the long run) a much easier solution than trying to get compatible binaries/libraries on a shared drive. To simplify using Peter's solution, note that you can place the appropriate arguments in /shared/share/config.site. For example:
$ cat > /shared/share/config.site << EOF
: ${LDFLAGS=-L/shared/lib}
: ${CPPFLAGS=-I/shared/include}
EOF
Whenever you run configure with --prefix=/shared, the config.site file will be read and defaults will be set.

Creating a self-extracting zip archive on a linux box

Due to a number of constraints that I won't get into, I have to create a self-extracting ZIP archive on a linux box. The resulting archive should be executable on Windows only. Is this at all possible? If so, what tools would do the job?
Background: when the user downloads a setup package from my Linux box, I need that setup package to be pre-populated with a certain key. I only know that key at runtime; my idea was to write a simple .xml file with that key, then zip up the .msi installer with that xml file into a self-extracting archive, and send that file to the browser. If you have alternative ideas on how to make it happen, I'd love to hear those, too.
Your answer can be found at the following page: http://ubuntuforums.org/showthread.php?t=847329. Please note that I do not take credit for anything other than using Google and finding something that sounds logically like it would work. I do not guarantee that this information will work, or that you will still be able to find the mentioned materials online. I'm just a fan of Google. I hope this helps.
The problem
A few days back I had to create a Win32 compatible self-extracting ZIP file for a friend. Sounds easy, right? The problem was that I didn't have a Windows machine nearby and I didn't want to install any archiving programs under Wine.
NOTE: A freeware ZIP program such as IZArc under Wine can be used to create a Win32 self-extracting ZIP file too. That will not be covered by this howto, sorry.
The "research"
Googling around I found this forum post dated August 2003. Reading it I found out that self-extracting ZIP files are nothing more than a suitable unzip binary followed by a normal ZIP file. I used the unzipsfx.exe included in Info-ZIP UnZip 5.52.
The link in that post worked a few days ago, so I got my hands on the unzipsfx.exe that I was looking for. Today, 2nd July 2008, I found the link dead. After some googling I didn't find a working link anywhere. I read the licence a few times and understood that I can redistribute the original unzipsfx.exe with the license included.
Please note that unzipsfx-552_win32.tar.gz (80 kB) is not an official Info-ZIP package and it includes copyrighted software that I take no credit for. More info is in the Info-ZIP license that is also included in the tarball. The source code for the included binaries can be found here.
The solution
Step one, getting the unzipsfx.exe and zip packages:
* open the Terminal (in Ubuntu press Alt+F2 and type gnome-terminal)
* type in the following commands:
wget http://kolmoskone.homelinux.org/~kaja/kamaa/unzipsfx-552_win32.tar.gz
tar zxf unzipsfx-552_win32.tar.gz
sudo apt-get install zip
Step two, creating a ZIP file in Ubuntu:
* open the file manager (Nautilus) and select the files you want to have zipped
* right-click and select Create an archive (or similar). Select a location for the ZIP file; using your home directory is the easiest. Select type .zip. See man zip for information on how to create a ZIP file on the command line.
Step three, making the ZIP file self-extracting:
* type in the following commands:
cat unzipsfx-552_win32/unzipsfx.exe MYZIPFILE.zip > mysfxfile.exe
zip -A mysfxfile.exe
mysfxfile.exe can now be opened on any Win32 compatible system (including for example Windows XP/2000/Vista and even Wine on Linux) or with ANY ZIP compatible archive program such as file-roller in Ubuntu.
I was able to make this work with unzipsfx. There's a newer version of it available - version 6.0, which came out in April 2009. Version 5.52 didn't support the critical functionality that I needed: launching a particular setup file after the extraction is completed.
So I downloaded the source files for 6.0. I then modified them to exclude the "prompt to launch stuff" check that is there by default. I recompiled using Visual Studio 2008, tried the steps described in the tutorial above, and it all works like a charm now.
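Tying this back to the original use case (building the download on the Linux box at request time), the whole flow with the cat/zip -A trick from the quoted howto might look roughly like this; the file names and the XML layout are only placeholders:
echo "<config><key>${USER_KEY}</key></config>" > key.xml
zip package.zip setup.msi key.xml
cat unzipsfx.exe package.zip > setup_package.exe
zip -A setup_package.exe
setup_package.exe is then what you send to the browser.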
