bash: /bin/tar: Argument list too long when compressing many files with tar - linux

I am trying to compress files into an archive with the command
tar -czvf compress_file.tar.gz $(cat file_list.txt)
and I get this error:
-bash: /bin/tar: Argument list too long
The number of files is too large; how can I resolve this?

Use the "-T" option to pass a file to tar that contains the filenames to tar up.
tar -czv -T file_list.txt -f tarball.tar.gz
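To sanity-check that everything in the list made it into the archive, you can compare counts afterwards (a quick sketch; it assumes one plain file per line in file_list.txt):
tar -tzf tarball.tar.gz | wc -l
wc -l < file_list.txt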

And here is one way to build the list of files to tar up. First create the list of files:
ls > temp
then archive it (the -X FILE option reads exclude patterns from a file named FILE; drop it if you have nothing to exclude):
tar cvzf dicionario_ultra.tgz -X FILE -T temp
and finally
rm temp
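If you are using bash, you can skip the temporary file entirely with process substitution (a sketch; it assumes the filenames contain no newlines):
tar cvzf dicionario_ultra.tgz -T <(ls)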

You can use find to avoid the issue: it lists the files under the current folder, and -print writes one name per line, which tar then reads via -T -.
find . -type f -print | tar -cvf somefile.tar -T -
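If any filenames might contain newlines or other unusual characters, a null-delimited variant of the same pipeline is safer (a sketch, assuming GNU find and GNU tar):
find . -type f -print0 | tar --null -T - -cvf somefile.tar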

Related

Linux tar with pattern match and removing leading paths

I'm trying to archive all the .log files located in the /var/log directory and, on creation, remove all leading paths from the file names.
I have found I can archive all the .log files easily with:
tar -cvf ~/backup.tar /var/log/*.log
Unfortunately, after searching online, the way to remove leading paths is to use -C to change directory for the command; only now *.log is not expanded and the * is treated literally.
using:
tar -cvf ~/backup.tar -C /var/log *.log
I get an error saying it cannot find the file *.log.
I imagine my syntax must be off; I've tried some changes to the syntax, to no avail.
Using find to pass files to tar:
find /var/log -name '*.log' -printf '%P\n' |
tar -C /var/log -czf backup.tar.gz -T -
find looks for *.log files, and -printf '%P\n' prints each path relative to /var/log, i.e. just the file names.
tar's -T - tells it to read the file names from stdin.
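To confirm the leading /var/log path really was stripped, you can list the archive afterwards (a quick check):
tar -tzf backup.tar.gz | head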

Archive all the files from a source directory into an xyz.gz file and move it to a target directory using a UNIX shell script

Requirement: archive files into .gz format using a UNIX shell script, without the directory structure.
I am using the command below:
tar -C source_dir -zcvf target_dir/xyz.gz source_dir
example:
tar -C /home/log -zcvf /home/archive/xyz.gz /home/log
here xyz.gz contains /home/log
It creates the xyz.gz file but keeps the directory structure. I want only the files to be archived, without the directory structure.
You can try the following command:
$ cd /home/log
$ tar zcvf /home/archive/xyz.gz *
You can use the --transform option to strip leading path components from the archived file names using a sed expression:
tar -C /home/log -zcvf /home/archive/xyz.gz --transform 's_.*/__' /home/log
This however will also write an entry for each encountered directory. If you don't want that, you can use find to find only regular files and pass them to tar on stdin like this:
cd /home/log
find -type f -print0 | tar -zcvf /home/archive/xyz.gz --transform 's_.*/__' --verbatim-files-from --null -T -
Note that this may create multiple entries with the same name in the tar archive, if files with the same name exist in different subdirectories. Also you should probably use the conventional .tar.gz or .tgz extension for the compressed tar archive.
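If you want to spot such basename collisions before creating the archive, one option is to compare the flattened names (a sketch, assuming GNU findutils):
find /home/log -type f -printf '%f\n' | sort | uniq -d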

Extract files from tar file using wildcard

Can anyone tell me how to extract tar files using wildcards? For example:
$ tar -xvf file1_*.tar dir1/
Thanks in advance
You can execute the following in the same dir as the tars.
for filename in ./file1_*.tar; do tar -xvf "$filename" -C ./dir1/; done
To extract multiple tar files in a single directory, try the following (from the directory containing the files):
ls file1_*.tar | xargs -I{} tar -xvf {} dir1/
The command lists the tar files using your pattern in the current directory, piping them to xargs, which will execute the tar command on each file using the pattern tar -xvf {filename} dir1/.
To see exactly what will be performed, modify the above command to
ls file1_*.tar | xargs -I{} echo tar -xvf {} dir1/
xargs is an incredibly powerful command-line tool to learn for situations where a single command needs to be run on many inputs, and it will often save you a lot of time.
This post also has another alternative.
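If the tar file names could contain spaces, a null-delimited find/xargs variant of the same approach is more robust (a sketch):
find . -maxdepth 1 -name 'file1_*.tar' -print0 | xargs -0 -I{} tar -xvf {} dir1/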

for each dir create a tar file

I have a bunch of directories that need to be restored, but they have to first be packaged into a .tar. Is there a script that would allow me to package all 100+ directories into their own tar so dir becomes dir.tar.
My attempt so far:
for i in *; do tar czf $i.tar $i; done
The loop you wrote will not work if a directory name contains spaces, because the unquoted name will be split into words, and it will also tar any regular files that exist at this level.
You can use this command to list directories not recursively:
find . -maxdepth 1 -mindepth 1 -type d
and this one to perform a tar on each one:
find . -maxdepth 1 -mindepth 1 -type d -exec tar cvf {}.tar {} \;
Do you have any directory names with spaces in them at that level? If not, your script will work just fine.
What I usually do is write a script with the command I want to execute echoed out:
$ for i in *
do
echo tar czf $i.tar $i
done
Then you can look at the output and see if it's doing what you want. After you've determined that the program will work, edit the command line and remove the echo command.
If there are spaces in the directory names, then just put the variables inside double quotes:
for i in *
do
tar czf "$i.tar" "$i"
done
Get them all done simply and in parallel with GNU Parallel:
parallel tar -cf {}.tar {} ::: *
If you want to check what it is going to do without actually doing anything, add --dry-run like this:
parallel --dry-run tar -cf {}.tar {} ::: *
Sample Output
tar -cf ab.tar ab
tar -cf cd.tar cd
If the number of directories is very large and their names are long, then after executing the first loop,
for i in *
do
echo tar czf $i.tar $i
done
you may get the error "string too long".
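In that case a streaming alternative avoids expanding everything at once (a sketch, assuming GNU find and bash):
find . -maxdepth 1 -mindepth 1 -type d -print0 |
while IFS= read -r -d '' dir; do
    # one tar per directory; names with spaces are handled safely
    tar czf "$dir.tar" "$dir"
done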

How do I tar a directory without retaining the directory structure?

I'm working on a backup script and want to tar up a file directory:
tar czf ~/backup.tgz /home/username/drupal/sites/default/files
This tars it up, but when I untar the resulting file, it includes the full file structure: the files are in home/username/drupal/sites/default/files.
Is there a way to exclude the parent directories, so that the resulting tar just knows about the last directory (files)?
Use the --directory option:
tar czf ~/backup.tgz --directory=/home/username/drupal/sites/default files
Here is an alternative for cases where changing into the specified directory is not possible (Makefiles, etc.):
tar -cjvf files.tar.bz2 -C directory/contents/to/be/compressed .
Do not forget the dot (.) at the end!
cd /home/username/drupal/sites/default/files
tar czf ~/backup.tgz *
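Note that * does not match hidden files by default; if those need to be included as well, you can enable dotglob in bash first (a sketch):
shopt -s dotglob
tar czf ~/backup.tgz *
shopt -u dotglob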
Create a tar archive
tar czf $sourcedir/$backup_dir.tar --directory=$sourcedir WEB-INF en
Un-tar files on a local machine
tar -xvf $deploydir/med365/$backup_dir.tar -C $deploydir/med365/
Upload to a server
scp -r -i $privatekey $sourcedir/$backup_dir.tar $server:$deploydir/med365/
echo "File uploaded.. deployment folders"
Un-tar on server
ssh -i $privatekey $server tar -xvf $deploydir/med365/$backup_dir.tar -C $deploydir/med365/
To archive all txt (*.txt) files from /home/myuser/workspace/zip_from/
to /home/myuser/workspace/zip_to/ without the directory structure of the source files, run the following from inside the source directory (the shell expands *.txt relative to the current directory, so change there first):
cd /home/myuser/workspace/zip_from/
tar -cvzf /home/myuser/workspace/zip_to/mydoc.tar.gz *.txt
If you want to tar files while keeping the structure but ignore it partially or completely when extracting, use the --strip-components argument when extracting.
In this case, where the full path is /home/username/drupal/sites/default/files, the following command would extract the tar.gz content without the full parent directory structure, keeping only the last directory of the path (e.g. files/file1).
tar -xzv --strip-components=5 -f backup.tgz
I've found this tip on https://www.baeldung.com/linux/tar-archive-without-directory-structure#5-using-the---strip-components-option.
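If you are unsure how many components to strip, list the archive first and count the leading path segments (a quick sketch):
tar -tzf backup.tgz | head -n 1
# e.g. home/username/drupal/sites/default/files/file1 -> strip 5 components to keep files/file1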
To build on nbt's and MaikoID's solutions:
tar -czf destination.tar.gz -C source/directory $(ls source/directory)
This solution:
Includes all files and folders in the directory
Does not include any of the directory structure (or .) in the final product
Does not require you to change directories.
However, it requires the directory to be given twice, so it may be most useful in another script. It may also be less efficient if there are a lot of files/folders in source/directory. Adjust the subcommand as necessary.
So for instance for the following structure:
|- source
| |- one
| `- two
`- working
the following command:
working$ tar -czf destination.tar.gz -C ../source $(ls ../source)
will produce destination.tar.gz where both one and two (and sub-files/-folders) are the first items.
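Note that the $(ls ...) substitution breaks if any name contains spaces; a find-based version of the same idea is safer (a sketch, assuming GNU find and GNU tar):
find source/directory -mindepth 1 -maxdepth 1 -printf '%P\0' | tar --null -czf destination.tar.gz -C source/directory -T -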
This worked for me:
gzip -dc "<your_file>.tgz" | tar xf - -C <location>
For me, -C or --directory did not work, so I use this instead:
cd source/directory/or/file
tar -cvzf destination/packaged-app.tgz *.jar
# return to the directory you were in before
cd -
Use the command below to generate a tar file without the directory structure:
tar -C <directoryPath> -cvzf <Path of the tar.gz file> filename1 filename2... filename N
e.g.:
tar -C /home/project/files -cvzf /home/project/files/test.tar.gz text1.txt text2.txt
tar -czf ~/backup.tgz -C /home/username/drupal/sites/default files
-C does the cd for you (note that it takes the directory as its own argument)
