Linux find command: copy and rename files at the same time - linux

Will you be able to help me write a script? I just want to find log files over 2GB and copy them to an archive folder in the same directory. I wrote a find command but it is not working; I'd appreciate it if someone could help me.
ex - main log folder - /vsapp/logs/
- app1, app2, app3
There are a lot of logs in the app1, app2 and app3 folders.
So I want to find the logs in those folders which are over 2GB, and copy them to an archive folder under a different name with today's date appended.
ex - abcd.log -----copy to -----> abcd.log-08-22-2016
My command at the moment, which is not working:
find $i/* -type f -size +2G -exec cp '{}' $i/$arc/{}-$date

You can do:
find /src -type f -name '*.log' -size +2G -exec sh -c 'cp "$1" "/dest/${1##*/}-$(date -I)"' _ {} \;
Additions/modifications I made:
-name '*.log' searches only for log files, as we are only interested in those. You can look for files with any name too if unsure; just omit -name '*.log' in that case.
$(date -I) is command substitution; the output will be today's date in the format YYYY-mm-dd. You can also define a custom format, see man date.
The sh -c wrapper is needed because {} expands to the full source path; ${1##*/} strips the leading directories so the copy lands directly in /dest.
End the -exec action of find with \;
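If you want to keep the per-app layout from the question (an archive folder inside each app directory, and the MM-DD-YYYY suffix from the example), a sketch along these lines may be closer; the archive folder name and date format are assumptions based on the example above:
for dir in /vsapp/logs/app1 /vsapp/logs/app2 /vsapp/logs/app3; do
  mkdir -p "$dir/archive"
  # copy each >2G log into the app's own archive folder, date-stamped
  find "$dir" -maxdepth 1 -type f -name '*.log' -size +2G -exec sh -c '
    cp "$1" "$2/archive/$(basename "$1")-$(date +%m-%d-%Y)"
  ' _ {} "$dir" \;
done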

Related

Task Scheduler Script copy NEW files only - Synology

Similar to this question, I want to make a Task Scheduler script to copy NEW files (last 24h) to a new folder.
I tried to use this code:
find /volume1/start/ -mtime -1 -type f -exec cp -r {} /volume1/target/ \;
but it delivers a 1kb filename.pdf#SynoEAStream file instead of the file itself.
How can I fix that?
OK, actually it seems to work as supposed; I just had the wrong file modification date in the files where it didn't work. The script additionally copies some "useless" #SynoEAStream files, which I now avoid by only looking for PDF files, which is what I wanted:
find /volume1/TestScan/start/ -mtime -1 -type f -iname '*.pdf' -exec rsync -r {} /volume1/TestScan/ziel/ \;
Maybe it's helpful for someone.
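If you need file types other than PDF, an alternative sketch (assuming the unwanted metadata files always carry the #SynoEAStream suffix, as in the output above) is to exclude them by name instead:
find /volume1/TestScan/start/ -mtime -1 -type f -not -name '*#SynoEAStream' -exec rsync -r {} /volume1/TestScan/ziel/ \;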

Linux move files from dir to dir by name mask

I am trying to move all files whose names start with SML from one directory to another.
Tried with
find /var/.../Images/ -name SML\* mv /var/.../Images/Small but it doesn't work.
try find /var/.../Images/ -name SML\* -exec mv {} /var/.../Images/Small/ \;
I guess you want something like this:
dir=/path/to/your/Images
mkdir -p "$dir/Small"
find "$dir" -name "SML*" -not -wholename "$dir/Small/*" -exec mv {} "$dir/Small/" \;
Since the directory you move the files to is a subdirectory of the one you search in, you need to exclude the files already moved there. So I added -not -wholename "$dir/Small/*".
To execute a command for each found file, you need -exec .... The alternative would be to pipe your find results to a while read loop.
When using -exec, the found name can be referenced by {}.
See man find for a lot more information.
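For completeness, a minimal sketch of the while read alternative mentioned above (using -print0 and read -d '' so file names with spaces survive the pipe):
find "$dir" -name "SML*" -not -wholename "$dir/Small/*" -print0 |
while IFS= read -r -d '' file; do
  mv "$file" "$dir/Small/"
done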

Copy specifically named directories whose content changed in the last 24 hours

I can recursively find and copy all my test directories (with content) from the current directory:
find . -name test ! -path "./my_dest/*" -exec cp -r --parents {} /path/to/my_dest \;
But now I want to copy only those test directories (with content) whose content was changed within the last 24 hours.
What do I have to add to my line above?
Edit: I want the same results as my find line above, but only those test folders in which a folder or a file has been changed within the last 24 hours (or some other time span).
The line
find . -name test ! -path "./my_dest/*" ! -ctime +0 -exec cp -r --parents {} /path/to/my_dest \;
does not do that! This line would find & copy only the test folders whose own directory entry changed, but not those where only a file inside changed.
You can use the rsync command, which is built specifically for this kind of task; see its manual page.
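For example, a sketch that combines find with rsync -aR (which preserves the relative path, much like cp --parents); -newermt is a GNU find extension and an assumption here:
find . -path ./my_dest -prune -o -type d -name test -print | while IFS= read -r dir; do
  # copy the test directory if anything inside it changed in the last 24 hours
  if [ -n "$(find "$dir" -newermt '24 hours ago' -print -quit)" ]; then
    rsync -aR "$dir" /path/to/my_dest/
  fi
done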

Best way to tar and zip files meeting specific name criteria?

I'm writing a shell script on a Linux machine to be run via a crontab which is meant to move all files older than the current day to a new folder, and then tar and zip the entire folder. Seems like a simple task but for some reason, I'm running into all kinds of roadblocks. I'm new to this and self-taught so any help or redirection would be greatly appreciated.
Specific criteria for which files to archive:
All log files are in /home/tech/logs/ and all pdfs are in /home/tech/logs/pdf
All files are over a day old as indicated by the file name (file name does not include $CURRENT_DATE)
All files must be *.log or *.pdf (i.e. don't archive files that don't include $CURRENT_DATE if they aren't log or pdf files).
Filename formatting specifics:
All the log file names in /home/tech/logs are in the format NAME 00_20180510.log, and all the pdf files are in a "pdf" subdirectory (/home/tech/logs/pdf) with the format NAME 00_20180510_00000000.pdf ("20180510" would be whenever the file was created and the 0's would be any number). I need to use the name rather than the file metadata for the creation date, and all files (pdf/log) whose name does not include the current date are "old". I also can't just move all files that don't contain $CURRENT_DATE in the name, because that would take any non-*.pdf or non-*.log files with it.
Right now the script creates a new folder with a new pdf subdir for the old files (mkdir -p /home/tech/logs/$ARCHIVE_NAME/pdf). I then want to move the old logs into $ARCHIVE_NAME, and move all old pdfs from the original pdf subdirectory into $ARCHIVE_NAME/pdf.
Current code:
find /home/tech/logs -maxdepth 1 -name ( "*[^$CURRENT_DATE].log" "*.log" ) -exec mv -t "$ARCHIVE_NAME" '{}' ';'
find /home/tech/logs/pdf -maxdepth 1 -name ( "*[^$CURRENT_DATE]*.pdf" "*.pdf" ) -exec mv -t "$ARCHIVE_NAME/pdf" '{}' ';'
This hasn't been working because it treats the numbers in $CURRENT_DATE as a list of numbers to exclude rather than a literal string.
I've considered just using tar's exclude options like this:
tar -cvzPf "$ARCHIVE_NAME.tgz" --directory /home/tech/logs --exclude="$CURRENT_DATE" --no-unquote --recursion --remove-files --files-from="/home/tech/logs/"
But a) it doesn't work, and b) it would theoretically include all files that weren't *.pdf or *.log files, which would be a problem.
Am I overcomplicating this? Is there a better way to go about this?
I would go about this using bash's extended glob features, which allow you to negate a pattern:
#!/bin/bash
shopt -s extglob
mv /home/tech/logs/!(*"$CURRENT_DATE"*).log "$ARCHIVE_NAME"
mv /home/tech/logs/pdf/!(*"$CURRENT_DATE"*).pdf "$ARCHIVE_NAME"/pdf
With extglob enabled, !(pattern) expands to everything that doesn't match the pattern (or list of pipe-separated patterns), so !(*"$CURRENT_DATE"*) matches every name that does not contain the current date. (Note the negation has to wrap the whole *date* pattern; *!("$CURRENT_DATE")*.log would match every .log file, because !(...) can match the empty string.)
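A quick illustration with hypothetical file names:
$ shopt -s extglob
$ CURRENT_DATE=20180510
$ cd /home/tech/logs
$ ls *.log
NAME 00_20180509.log  NAME 00_20180510.log
$ echo !(*"$CURRENT_DATE"*).log
NAME 00_20180509.log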
Using find it should also be possible:
find /home/tech/logs -maxdepth 1 -name '*.log' -not -name "*$CURRENT_DATE*" -exec mv -t "$ARCHIVE_NAME" {} +
(-maxdepth 1 keeps find from descending into the pdf and archive subdirectories.)
Building on @tom-fenech's answer, optimized to avoid many mv invocations (-print0 and xargs -0 added so the space-containing file names survive the pipe):
find /home/tech/logs -maxdepth 1 -name '*.log' -not -name "*_${CURRENT_DATE?}.log" -print0 | \
  xargs -0 mv -t "${ARCHIVE_NAME?}"
An interesting feature of processing the files through pipes is the ability to filter them with extra tools (e.g. grep), which can arguably be more readable:
find /home/tech/logs -maxdepth 1 -name '*.log' -print0 | grep -Fzv "_${CURRENT_DATE?}" | \
  xargs -0 mv -t "${ARCHIVE_NAME?}"
Then similarly for the pdf ones; BTW, you can "dry-run" the above by just replacing mv with echo mv.
--jjo
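Spelled out for the pdf files, the same pattern would be something like this (a sketch with the same placeholder variables):
find /home/tech/logs/pdf -maxdepth 1 -name '*.pdf' -print0 | grep -Fzv "_${CURRENT_DATE?}" | \
  xargs -0 mv -t "${ARCHIVE_NAME?}/pdf"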

In the Linux shell, how to cp/rm files by time?

In the Linux shell, when I run
ls -al -t
it shows the times of the files.
How do I cp/rm files by time? For example, copy all the files that were created today or yesterday. Thanks a lot.
Depending on what you actually want to do, find provides -[acm]time options for finding files by access, change, or modification time (note that ctime is inode change time, not creation time), along with -newer and -[acm]min. You can combine them with -exec to copy, delete, or do whatever you want. For example:
find -maxdepth 1 -mtime +1 -type f -exec cp '{}' backup \;
Will copy all the regular files in the current directory more than 1 day old to the directory backup (assuming the directory backup exists).
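For the "today or yesterday" case from the question, a sketch using GNU find's -newermt (both -newermt and date -d are GNU extensions):
# copy regular files modified since the start of yesterday into backup/
find . -maxdepth 1 -type f -newermt "$(date -d yesterday +%Y-%m-%d)" -exec cp -t backup '{}' +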
Simple Example
find /path/to/folder/ -mtime 1 -exec rm {} \;   # deletes all files modified yesterday
For more examples, search the web for "bash find time", or see man find.
