Issue extracting a macOS-created tar archive with BusyBox - Linux

I have created a tar on my Mac with the following command:
$ tar -czvf test.tar.gz test/
Extracting it on my target system (BusyBox) gives me the following error message:
$ tar xzvf test.tar.gz
tar: corrupted octal value in tar header
I googled and found some other folks who had the same problem.
Apparently it's an incompatibility between the two implementations.
I tried $ brew install gnu-tar and then $ gtar -czvf test.tar.gz test/ but it is still not working.
Unfortunately I did not find another solution.
$ tar
BusyBox v1.13.2 (2012-04-08 17:28:57 CDT) multi-call binary
$ tar --version
bsdtar 2.8.3 - libarchive 2.8.3
$ gtar --version
tar (GNU tar) 1.29
UPDATE:
gtar -czvf test.tar.gz test/ --format=posix
This works, but it still generates "skipping header" warnings:
tar: warning: skipping header 'x'
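The remaining warnings come from BusyBox skipping the pax extended header records (the 'x' entries) that --format=posix adds; the files themselves extract fine. A minimal sketch that avoids extended headers entirely, assuming your file names fit within the older formats' limits:
$ gtar --format=ustar -czvf test.tar.gz test/   # plain POSIX ustar, no extended headers
$ gtar --format=gnu -czvf test.tar.gz test/     # GNU format, long names usually supported by BusyBox
Either archive should then extract silently under BusyBox tar.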

Related

How can I compile and install Emacs in the home directory?

I want to install Emacs without using APT/YUM, because I am not a superuser.
I downloaded the source code from
https://www.gnu.org/software/emacs/manual/html_node/efaq/Installing-Emacs.html. Then I ran:
./configure --prefix=$HOME/.local/emacs/27_1
Standard output said
configure: error: The following required libraries were not found:
gnutls
Maybe some development libraries/packages are missing?
To build anyway, give:
--with-gnutls=ifavailable
as options to configure.
So I ran:
./configure --with-gnutls=ifavailable --prefix=$HOME/.local/emacs/27_1
Standard output said
configure: error: The required function 'tputs' was not found in any library.
The following libraries were tried (in order):
libtinfo, libncurses, libterminfo, libcurses, libtermcap
Please try installing whichever of these libraries is most appropriate
for your system, together with its header files.
For example, a libncurses-dev(el) or similar package.
So I downloaded ncurses and installed it with:
wget https://ftp.gnu.org/pub/gnu/ncurses/ncurses-6.2.tar.gz
tar xvfz ncurses-6.2.tar.gz
cd ncurses-6.2
./configure --prefix=$HOME/.local/ncurses/6_2 --with-shared --with-pkg-config-libdir=$HOME/ncurses/6_2/lib/pkgconfig --enable-pc-files
make
make install
Then I created symbolic links and set environment variables:
cd ~/.local/bin
ls ../ncurses/6_2/bin/ | xargs -I {} ln -s ../ncurses/6_2/bin/{} {}
cd ~/.local/include
ls ../ncurses/6_2/include/ | xargs -I {} ln -s ../ncurses/6_2/include/{} {}
cd ~/.local/lib
ls ../ncurses/6_2/lib/ | xargs -I {} ln -s ../ncurses/6_2/lib/{} {}
cd ~/.local/lib/pkgconfig
ls ../../ncurses/6_2/lib/pkgconfig/ | xargs -I {} ln -s ../../ncurses/6_2/lib/pkgconfig/{} {}
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$HOME/.local/lib
export PKG_CONFIG_PATH=$PKG_CONFIG_PATH:$HOME/.local/lib/pkgconfig
I ran configure again, but the same error occurred. What should I do?
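The configure script probes for tputs by compiling and linking a test program, and neither the compiler nor the linker consults LD_LIBRARY_PATH (that variable only affects lookup at run time). A minimal sketch of the usual fix, assuming the ncurses install above succeeded: pass the header and library paths through CPPFLAGS and LDFLAGS. (Note also that the --with-pkg-config-libdir value above, $HOME/ncurses/6_2/lib/pkgconfig, does not match the install prefix $HOME/.local/ncurses/6_2.)
export CPPFLAGS="-I$HOME/.local/include"
export LDFLAGS="-L$HOME/.local/lib -Wl,-rpath,$HOME/.local/lib"
./configure --with-gnutls=ifavailable --prefix=$HOME/.local/emacs/27_1
The -Wl,-rpath part embeds the library directory in the resulting binary, so the built emacs also runs without LD_LIBRARY_PATH being set.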

tar czf with --remove-files and -c

When I want to tar a directory and remove all the original files, I use the following command:
tar -cvzf /xx/yy/data-backup.tar.gz --remove-files -C /aa/daily-backup/ .
All the files under /aa/daily-backup/ are tarred and removed successfully, but at the end of the output the terminal shows:
tar: /aa/daily-backup/ .: Cannot rmdir:Invalid argument
If I remove --remove-files, the command runs successfully.
Obviously I don't want to remove /aa/daily-backup/ itself; how should I revise my command?
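tar tries to rmdir every directory named on the command line once its contents are archived, and "." can never be removed, hence the error; as you observed, the archive is still created and the files are still removed. One workaround, a sketch assuming no hidden files at the top level (extend the glob if you have them): name the directory's entries instead of ".", so tar removes them but never attempts to rmdir the parent.
cd /aa/daily-backup && tar -cvzf /xx/yy/data-backup.tar.gz --remove-files -- *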

Duplicate files with find and tar

I'm trying to back up a bunch of directories with tar, using find to get the files. I've seen this solution elsewhere in an old post, but it duplicates every file and directory in the tarball; find itself doesn't duplicate anything:
find d1 d2 -print0 | tar -czvf backup.tar.gz --null -T -
Using Ubuntu 18.04 LTS, GNU find 4.7.0, and GNU tar 1.29.
I can just give the directories to tar, but I'm curious why this behaviour is happening.
Why use find? Just pass all directories to the tar command:
tar -czf backup.tar.gz d1 d2
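The duplication happens because find prints d1 and d2 themselves along with every path beneath them; tar then archives each directory recursively and each listed child a second time. If you do want find (say, to filter files), a sketch using GNU tar's --no-recursion option so every entry is added exactly once:
find d1 d2 -print0 | tar --no-recursion -czvf backup.tar.gz --null -T -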

How can I remove GHDL 0.29 from Debian?

I followed these steps to install the GHDL compiler on my Debian system, but now I need to uninstall it to install the x64 version, and I can't.
By downloading the binaries and unpacking them manually:
$ wget http://ghdl.free.fr/site/uploads/Main/ghdl-i686-linux-latest.tar
$ sudo tar xvf ghdl-i686-linux-latest.tar
(This generates the file ghdl-0.29-i686-pc-linux.tar.bz2)
$ cd ghdl-0.29-i686-pc-linux
$ sudo tar -C / -jxvf ghdl-0.29-i686-pc-linux.tar.bz2
(This copies the files to /usr/local/bin and /usr/local/lib.)
I have used dpkg --purge ghdl, but if I run ghdl --version, GHDL 0.29 is still on the system.
How can I remove it?
I faced the same situation, and this is how I did it: go to where your tarball is and use this command:
sudo tar -tf ghdl-0.29-i686-pc-linux.tar.bz2 | sed 's|^\./|/|' | grep -v '/$' | parallel sudo rm -f
Explanation:
tar -tf ghdl-0.29-i686-pc-linux.tar.bz2 lists all files in the archive
sed 's|^\./|/|' rewrites the leading ./ so each path is absolute and the command works from any directory
grep -v '/$' drops the directory entries, so rm never deletes a whole directory such as /usr/local/bin
parallel sudo rm -f removes each listed file
This will leave some empty directories under /usr/local; if you want to get rid of them, you can use:
sudo find /usr/local/* -type d -empty -delete
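If GNU parallel is not installed, plain xargs does the same job; a sketch assuming GNU xargs and no newlines in the archived paths:
sudo tar -tf ghdl-0.29-i686-pc-linux.tar.bz2 | sed 's|^\./|/|' | grep -v '/$' | xargs -d '\n' sudo rm -f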

untar all .gz in directories and subdirectories

For starters, I have checked most of the solutions available, including:
How to untar all .tar.gz with shell-script?
Unzipping multiple zip files in a directory?
I have a directory that has subdirectories with .gz files in them, and I would like to extract them all, either into one folder or keeping them in their own folders.
Thank you in advance
Problem
You want to decompress all compressed files inside a directory and all its subdirectories.
Solution
Use bash and the utility find to output to the console a list of all contents from the present directory. Use a looping construct to decompress each file.
Decompress all files in the current directory:
$ for file in *; do
    sudo tar -xvf "${file}"
done
Decompress all archives in the current directory and any subdirectories (my personal favorite):
$ find . -type f -print0 | while IFS= read -r -d '' file; do
    sudo tar -xvf "${file}"
done
Decompress all archives recursively and do the same again for any remaining:
# Wrap the loop in a function so it can be called more than once;
# running it twice handles compressed files inside compressed files.
$ decompress_all_complete () {
    decompress_all () {
        find . -type f -print0 | while IFS= read -r -d '' file; do
            sudo tar -xvf "${file}"
        done
    }
    for i in 1 2; do
        decompress_all
    done
}
You could use variants of this loop, if you like adventure :-) (substitute each program's own options where indicated):
$ for program in tar unzip; do   # you could simply add to this list...
    find . -type f -print0 | while IFS= read -r -d '' file; do
        sudo "${program}" [EACH PROGRAM'S OWN OPTIONS] "${file}"
    done
done
Discussion
The utility find lists everything from your present directory, and is fast. The snippets above decompress one file at a time, but they illustrate the simple logic running through all of this code. The options and variations above all solve the problem; I initially assumed that your present working directory contains all the files you want to decompress, so that the simplest version of the snippet would do.
This pseudocode tries to communicate the logic behind my solution briefly:
# Use all decompressing programs locally installed :: apropos compress
for --temp_container_1-- in --list_of_programs_to_decompress_files--; do
    # run each program over every file in the current directory
    for --temp_container_2-- in --list_of_files--; do
        # use program with x options on y file
        sudo "${temp_container_1}" [TYPE COMMON OPTIONS] "${temp_container_2}"
        # Successfully decompressed some files. Try another program.
    done
    # There is nothing else to do.
done
Context
I tested these snippets on:
* Linux debian 3.16.0-4-amd64 #1 SMP Debian 3.16.7-ckt9-3 (2015-04-23) x86_64 GNU/Linux
* find (GNU findutils) 4.4.2
* GNU bash, version 4.3.33(1)-release (x86_64-pc-linux-gnu)
* tar (GNU tar) 1.27.1
If I understand your question, you could use something like
DEST=<Destination Folder>
SRC=<Src Folder>
find "$SRC" \( -name "*.tar.gz" -or -name "*.tgz" \) -exec tar -xzvf {} -C "$DEST" \;
In case you want to recursively untar into the same folder the archive was in, you can do the following:
for file in $(find . -name '*.tar.gz' -or -name '*.tgz'); do
    DIR_NAME=$(dirname "$file")
    sudo tar -xzvf "${file}" -C "$DIR_NAME"
    sudo rm "$file"
done
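The loop above still breaks on paths containing whitespace. A more robust variant, a sketch assuming GNU find (for -execdir and -delete): extract each archive inside its own directory, deleting it only if tar exited successfully.
find . \( -name '*.tar.gz' -or -name '*.tgz' \) -execdir tar -xzf {} \; -delete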
Works on:
$ lsb_release -a
No LSB modules are available.
Distributor ID: Ubuntu
Description: Ubuntu 20.04.1 LTS
Release: 20.04
Codename: focal
$ tar --version
tar (GNU tar) 1.30
$ find --version
find (GNU findutils) 4.7.0
$ bash --version
GNU bash, version 5.0.17(1)-release (x86_64-pc-linux-gnu)
This should recursively extract all tar.gz files in the current directory and all sub-directories.
for subdir in $(find . -type d); do
    for f in "$subdir"/*.tar.gz; do
        [ -e "$f" ] || continue   # skip directories with no matching archives
        tar -xzf "$f" -C <destination>
    done
done
