Filter and copy all the subdirectories and their content [closed] - linux

Closed. This question is not about programming or software development. It is not currently accepting answers.
This question does not appear to be about a specific programming problem, a software algorithm, or software tools primarily used by programmers. If you believe the question would be on-topic on another Stack Exchange site, you can leave a comment to explain where the question may be able to be answered.
Closed 15 days ago.
I have a list of directories and files that I need to migrate. The directory tree looks something like:
-var
  -data
    -archive
      -111111
        -logs
          datetime.log
        meter.txt
      -222222
        -logs
          datetime.log
        meter.txt
      -configurations
        -rules
          config.json
      -recycle
When copying, the following conditions must be satisfied:
1. Copy only files/directories that are at most 30 days old. (This condition does not apply to the configurations directory: copy everything under configurations regardless of its creation/modification time.)
2. Exclude the directory recycle.
3. Copy only the content under archive. That means do not recreate the directories var, data & archive in the target.
After copying the directory should look something like this:
-target
  -111111
    -logs
      datetime.log
    meter.txt
  -222222
    -logs
      datetime.log
    meter.txt
  -configurations
    -rules
      config.json
I came up with this:
find var/data/archive ! -path "*/recycle*" \( -path "*/configurations*" -o -name "*" -mtime -30 \) | rsync -av --inplace --files-from=- . target
I am able to achieve the first two conditions, i.e. copying files that are less than 30 days old and excluding the recycle directory, but I am not able to strip the var/data/archive prefix from the copied paths. How do I fix this?

This worked for me:
cd ${HOME}/var/data/archive;\
find . ! -path "*/recycle*" \( -path "*/configurations*" -o -name "*" -mtime -10 \) | rsync -av --inplace --files-from=- . ${HOME}/target

Related

Why do I need '-o' in linux find command? [closed]

I want to list files with a certain name pattern under a certain directory, excluding a certain sub-directory.
By doing
find "../../" -path "../../backup" -prune -regex "\.*\.v" -print
nothing is outputted.
But by adding -o
find "../../" -path "../../backup" -prune -o -regex "\.*\.v" -print
I get the correct results.
-o means or, but I don't think there is any "or" logic in my requirement; I think it should be "and":
file name with a certain pattern AND under a certain directory AND not under a certain sub-directory.
Am I doing something wrong?
From the find man page:
-prune True; if the file is a directory, do not descend into it.
If -depth is given, false; no effect.
expr1 -o expr2
Or; expr2 is not evaluated if expr1 is true.
The construct -prune -o \( ... -print0 \) is quite common.
The idea here is that the expression before -prune matches
things which are to be pruned.
However, the -prune action itself returns true, so the following
-o ensures that the right hand side is evaluated only for
those directories which didn't get pruned (the contents of
the pruned directories are not even visited, so their contents are irrelevant).
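A minimal demonstration of this point, using -name instead of the question's -regex for simplicity, and a hypothetical demo tree:

```shell
# Build a tiny tree: one file inside the pruned directory, one outside it.
mkdir -p demo/backup demo/src
touch demo/backup/a.v demo/src/b.v

# Without -o there is an implicit AND: a path would have to match
# demo/backup (and be pruned) AND match -name "*.v" -- nothing can
# satisfy both, so nothing is printed.
find demo -path demo/backup -prune -name "*.v" -print

# With -o, pruned paths stop at -prune (which returns true), and
# everything else falls through to the right-hand side.
find demo -path demo/backup -prune -o -name "*.v" -print
# prints demo/src/b.v
```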

Gzip or tars logs that are older than 7 days [closed]

I need to create a single script that would gzip or tar logs that are older than 7 days in multiple (3) paths /home/temp, home/logs, then confirm this one is over 50% capacity: home/var/lib/mongo. This is what I have so far, but I can't think of how to combine these:
find . -mtime +7 -print -exec gzip {} \; for all 3 paths, but then:
find /tmp/log/ -mtime +7 -type f -exec sh -c \
'tar -czvPf /tmp/older_log_$(basename $0)_$(date +%F).tar.gz $0' {} \;
# create a single tar file for each archive +7 days old on one mount
If I understand you correctly, you would like to archive multiple old logs in one tar file:
find /tmp/log /home/temp /home/logs -mtime +7 -type f \
| xargs tar -czf /tmp/older_log_$(date +%F).tar.gz --remove-files
For easy reading I put it on two lines. It works as follows:
- It searches for all the files you want to archive.
- The found files are passed as arguments to tar (using xargs).
- tar creates the new archive and adds all the files.
- Additionally, tar removes the original files via the last option, --remove-files (GNU tar).
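A whitespace-safe variant of the same pipeline (GNU find and GNU tar assumed): -print0/--null keep filenames with spaces or newlines intact, and -T - lets tar read the file list from stdin directly, avoiding xargs argument-length limits.

```shell
# NUL-delimited file list piped straight into tar; --remove-files
# deletes each original after it has been archived (GNU tar).
find /tmp/log /home/temp /home/logs -mtime +7 -type f -print0 \
  | tar --null --remove-files -czf "/tmp/older_log_$(date +%F).tar.gz" -T -
```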

Copy files from one folder to another folder with filters [closed]

I have these files in my source folder
source_path/date=20191230/somefile.txt
source_path/date=20191231/somefile.txt
source_path/date=20200101/somefile.txt
source_path/date=20200102/somefile.txt
If I run the below command, all files will be copied to my dest_path folder:
cp --recursive source_path/ dest_path/
I just want to copy all folders where the dates are in 2020. I just need these two files of 2020:
source_path/date=20200101/somefile.txt
source_path/date=20200102/somefile.txt
How can I add filters with the cp command?
This question is not suitable for Stack Overflow, but this is the answer:
cp --recursive source_path/date=20200* dest_path/
Or does dest_path not exist? Then you would write
mkdir -p dest_path && cp --recursive source_path/date=20200* dest_path/
You can use find with the -name, -type and -exec flags; so if the source directory were /home/foo and the destination directory /tmp:
find /home/foo -type d -name "date=2020*" -exec cp -r '{}' /tmp \;
-type d means we are only searching for directories, -name is the name pattern we are searching for, and -exec is the command we execute on each result. Note that cp needs -r here because the matches are directories.
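Applied to the layout in the question (source_path and dest_path as given), a sketch of the find-based variant:

```shell
# Copy only the top-level date=2020* directories into dest_path,
# keeping their names; -maxdepth 1 stays at the top level of
# source_path, and cp -r is needed because the matches are directories.
mkdir -p dest_path
find source_path -maxdepth 1 -type d -name "date=2020*" \
  -exec cp -r '{}' dest_path/ \;
```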

No such file or directory find command on linux [closed]

I've created a script which deletes old backup files from a directory. This command worked fine until a week ago, and nothing has changed in the script or packages, but I'm still getting the error below:
root#:# find /var/backups/abc/* -type d -mtime +6
/var/backups/abc/2016-03-09_0321
root#:~# find /var/backups/abc/* -type d -mtime +6 -exec rm -rf {} \;
find: `/var/backups/abc/2016-03-08_0321': No such file or directory
The problem is that this script runs every day from cron, and I get a mail like "find: `/var/backups/abc/2016-03-08_0321': No such file or directory". The files are deleted, but such mails keep coming from root.
find /var/backups/abc/* -type d -mtime +6 -prune -exec rm -rf {} \;
Here, we use -prune on the directories that we're about to delete, so find will not try to read their contents.
The error happens because, after having returned your directory, find tries to descend into it (to continue its recursive search) and fails, because you just removed it.
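A variant of the same fix that also avoids the /var/backups/abc/* shell glob (which fails in its own way when the directory is empty); a sketch using the path from the question:

```shell
# -mindepth 1 -maxdepth 1 takes the place of the glob, and -prune
# stops find from descending into a directory that -exec rm -rf
# is about to remove, so no "No such file or directory" mails.
find /var/backups/abc -mindepth 1 -maxdepth 1 -type d -mtime +6 \
  -prune -exec rm -rf '{}' +
```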

What is the linux command to give execution permission of all files in current folder [closed]

Can you remind me of the command to change the permissions of all files in the current folder?
It depends on what you mean by "current" folder; if you mean the current folder (and all subfolders) then you could use find and chmod like so:
find . -type 'f' -exec chmod +x {} \;
If you mean the current folder (and no sub-folders) then you would use it like so:
find . -maxdepth 1 -type 'f' -exec chmod +x {} \;
OR you could use find (possibly with -maxdepth) and xargs like so:
find . -type f -print0 | xargs -0 chmod +x
Note that these commands will correctly handle files with spaces in the name and most other edge cases.
Use chmod with a glob:
chmod +x *
(Technically that will give permission to list directories, too, but that shouldn't be a problem.)
