Zip multiple folders and files depending on filesize in Linux/Ubuntu

I have a directory "mapnik" with hundreds of sub-directories, each containing more than 10000 files. I would like to zip "mapnik" recursively, preserving the folder structure but only adding files greater than 103 bytes to the archive.
How can I accomplish this? I tried using find and pipes, but I couldn't get the syntax right, and with the huge number of files, "trial and error" is not the best way to get it done ;)
Thanks for your help guys!

How about
find -size +103c -print0 | xargs -0 zip -r outname.zip

Delan's suggestion produced some kind of zip error with files of the same name. But it got me on the right track. This is what worked for me:
cd mapnik
find . -size +103c -print | zip archive.zip -@
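If some of the filenames contain spaces or other unusual characters, an -exec variant sidesteps the newline-separated pipe entirely. A sketch (run from inside mapnik; the archive goes in the parent directory so it can't end up inside itself, and zip appends if the batched command runs more than once):
cd mapnik
find . -type f -size +103c -exec zip ../archive.zip {} +
-type f keeps directories out of the match; without it, directories (whose inodes are usually larger than 103 bytes) would pass the size test too.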

Related

Linux Copy All Files with specific filename length

I want to copy all files in my directory with a specific file name length.
e.g.
These files exist:
1.py
12.py
123.py
321.py
1234.py
Then I want to copy only the files 123.py and 321.py (because their names are 3 characters long).
I am new to Linux and don't know how to accomplish this. Can anyone help me?
If I understood correctly, you want to copy files whose names consist of three characters followed by .py. This could be done using:
cp ???.py destination_directory/
(Note: this could fail if a very large number of files match, because the expanded command line can exceed the system's argument-length limit, but that limit is typically large on modern systems.)
You can also do it using find:
find directory1 -type f -name '???.py' -exec cp -nv {} directory2/ \;
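If you want the length check to be explicit (say, to swap the 3 for another number), a small shell loop works too. A minimal sketch, assuming the files sit in the current directory and GNU cp for -n:
for f in *.py; do
  base=${f%.py}                                  # name without the .py suffix
  [ "${#base}" -eq 3 ] && cp -nv "$f" destination_directory/
done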

How to pipe find results to unzip?

I have a lot of folders with a zip file in each. Most of the zip files in the folders have been opened already. I just want to unzip those which have not been opened, which I know all have the same date.
I'm trying to use the following, but I'm just getting hit back with unzip's usage text. The first part finds all the files I need, but piping the results to unzip, as I have done, isn't enough.
find *2019-01-05* | unzip
You can use xargs to take the results from find and unzip them; unzip accepts only one archive per invocation, so have xargs pass them one at a time:
find *2019-01-05* | xargs -n1 unzip
That's:
find -type f -name \*2019-01-05\*.zip -exec unzip {} \;
-type f for good measure, in case there are similarly named directories. (Note that -exec unzip {} + would batch several names into one unzip call, and unzip treats everything after the first archive as member patterns to extract, so the one-at-a-time \; form is the right one here.)
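If each archive should be extracted next to where it lives rather than into the current directory, GNU find's -execdir runs the command from the matched file's own directory. A sketch, with -n so files that were already extracted are never overwritten:
find . -type f -name '*2019-01-05*.zip' -execdir unzip -n {} \;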

Zipping and deleting files of a certain age

I'm trying to put together a single command that will find files that haven't been modified in over 6 months and zip them. Afterwards I want to delete all the files I just archived.
My current command to find the directories with the files is
find /var/www -type d -mtime -400 ! -mtime -180 | xargs ls -l > testd.txt
This gave me all the directories, including the files, that are older than 6 months.
Now I was wondering if there is a way of zipping all the results and deleting them afterwards. Something along the lines of
find /var/www -type f -mtime -400 ! -mtime -180 | gzip -c archive.gz
If anyone knows the proper syntax to achieve this, I'd love to know. Thanks!
Edit: after a few tests, this command results in a corrupted file
find /var/www -mtime -900 ! -mtime -180 | xargs tar -cf test4.tar
Any ideas?
Break this into several distinct steps that you can implement and thoroughly test separately. (A likely reason the tar attempt above came out corrupted: xargs can split a long file list across several tar -cf invocations, each of which overwrites the archive written by the one before.)
1. Build a list of files to be archived and then deleted, saved to a temp file.
2. Use the list from step 1 to add the files to a .tar.gz archive. Give the archive file a name following a specific pattern that won't appear in the files to be archived, and put it in a directory outside the hierarchy of files being archived.
3. Read back the files from the .tar.gz and compare them (or their hashes) to the original files to ensure that you got them all without corruption.
4. Use the list from step 1 to delete the files. Do not use a wildcard for deletion. Put in some guard code to prevent deletion of any file matching the name pattern of the archive .tar.gz file(s) created in step 2.
When testing a script that can do irreversible damage, always code the dangerous command with a leading echo and leave it that way until you are sure everything works. Only then remove the echo.
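A minimal sketch of those steps, assuming GNU find/tar/xargs; the paths and the 180-day cutoff are placeholders, not values from the question:
LIST=/tmp/archive-list.txt
ARCHIVE=/var/backups/www-$(date +%Y%m%d).tar.gz
# step 1: regular files not modified for roughly 6 months
find /var/www -type f -mtime +180 > "$LIST"
# step 2: archive exactly the files on the list
tar -czf "$ARCHIVE" -T "$LIST"
# step 3: compare the archive against the filesystem; differences are reported
tar -dzf "$ARCHIVE"
# step 4: delete, keeping the guarding echo until step 3 comes back clean
xargs -d '\n' echo rm -- < "$LIST"
A newline-delimited list breaks on filenames that themselves contain newlines; if that can happen, switch to -print0, tar's --null, and xargs -0.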
Consider zip; it should meet your requirements.
find ... | zip -m archive.zip -@
-m (move) deletes the input directories/files after making the specified zip archive.
-@ takes the list of input files from standard input.
You may find more options which are useful to you in the zip manual, e.g.
-r (recurse) travels the directory structure recursively.
-sf (show-files) shows the files that would be operated on, then exits.
-t or --from-date operates on files not modified prior to the specified date.
-tt or --before-date operates on files not modified after or at the specified date.
This could possibly make find expendable.
zip -mr --from-date 2012-09-05 --before-date 2013-04-13 archive /var/www
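Since -m deletes the originals, a cautious first pass with -sf (described above) should show what would be archived and removed without touching anything:
zip -sf -r --from-date 2012-09-05 --before-date 2013-04-13 archive /var/www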

Extracting nested archives of different types from different folders

I have got an archive of many fonts, but I have trouble extracting them all into one folder. I tried to write a long script for 3 hours now; it somehow breaks on a path issue. I tried piping like find . -name *.zip | unzip -d ~/fonts but it doesn't work. I changed so much in the script I wrote that it is not really presentable :(.
Each font file is supposedly (I didn't check them all, there are really many) inside a rar archive, which together with a readme is in a zip archive, which together with another readme sits in its own folder. Can this be done in one line?
Try changing the one-liner like this; -n1 makes xargs run unzip once per archive, since unzip accepts only one archive at a time:
find . -name "*.zip" | xargs -n1 unzip -d ~/fonts
Try this
find . -name "*.zip" -exec unzip -d ~/fonts {} \;

How to build one file that contains other files, selected by mask?

I need to put the contents of all *.as files in some specified folder into one big file.
How can I do it in Linux shell?
You mean cat *.as > onebigfile?
If you need all files in all subdirectories, the most robust way to do this is:
rm -f onebigfile
find -name '*.as' -print0 | xargs -0 cat >> onebigfile
This:
deletes any existing onebigfile (-f keeps rm quiet if there is none)
for each file found, appends it onto onebigfile (this is why we delete it in the previous step -- otherwise you could end up tacking onto some existing file.)
A less robust but simpler solution:
cat `find -name '*.as'` > onebigfile
(The latter version doesn't handle very large numbers of files or files with weird filenames so well.)
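If the concatenation order matters (find emits files in essentially arbitrary order), sorting the list first makes the result reproducible. A sketch, assuming GNU find and sort for the NUL-separated flags:
find . -name '*.as' -print0 | sort -z | xargs -0 cat > onebigfile
The shell opens onebigfile once for the whole xargs command, so even if the list is split across several cat invocations they all write sequentially to the same file.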
Not sure what you mean by "build" but are you looking for tar?
