find -exec unzip multiple .zip files, each into its own directory, where source and destination are different - linux

I have a directory into which new .zip files get placed every day. I need to find the files added within the last day and unzip each into its own directory in a different location. What I have found with a lot of searching almost does this for me:
find /source1/source2/source3 -maxdepth 1 -type f -mtime -1 \
-exec sh -c 'unzip -d /dest1/dest2/"${1%.*}" "$1"' _ {} \;
The problem with the above line is that the destination directory it tries to create is /dest1/dest2/source1/source2/source3/{zip filename}/{unzipped files}. I need it to be just /dest1/dest2/{filename}.
Is there a way to strip the source directories out of the ${1%.*} expansion? Or, if there is a better way to get this done, I'm open to any suggestion.

You can strip the source directories with basename. Just replace "${1%.*}" with $(basename "${1%.*}").
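Putting it together, the complete command would look something like this (a sketch; the -name '*.zip' filter is an addition to skip any non-zip files, and unzip -d creates the destination directory if it does not exist):
find /source1/source2/source3 -maxdepth 1 -type f -mtime -1 -name '*.zip' \
-exec sh -c 'unzip -d /dest1/dest2/"$(basename "${1%.*}")" "$1"' _ {} \;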

Related

Copy everything except specific files

How can I copy everything (files and directories, even if they are empty) from one directory to another, except files ending in ".php" and files named "config.yml"?
I need to do this with a single command.
I have tried this one:
find ./ -type f ! \( -name "*.php" -o -name "config.yml" \) -exec cp --parents -r -t /my/directory/ "{}" +
It works, but if a directory contains only ".php" files, the command skips it and does not copy it as an empty directory; I need the directory even if it ends up empty.
Sorry, I don't have enough reputation to post a comment, so I'll ask here: is the use of "find" a requirement?
If it is not, you can do this easily with the rsync command:
rsync -av --exclude=config.yml --exclude="*.php" ORIGINFOLDER/ DESTFOLDER
Just change ORIGINFOLDER and DESTFOLDER to your folder names, and take a look at the man page to see the meaning of the options.
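To preview what would be transferred before running it for real, rsync's -n (--dry-run) flag can be combined with the same options:
rsync -avn --exclude=config.yml --exclude="*.php" ORIGINFOLDER/ DESTFOLDER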

In BASH, how do you reference a directory name in a copy statement of files you recursively "find"

I want to recursively find all files that end with *DETAIL.pdf
Create a new directory (in another drive) for each file with the same name as the original directory
Copy the file into the new directory
I have this as my current attempt:
find . -name \*DETAIL.pdf -type f -not -path "./test2" -exec cp -R {} ./test2 \;
I am struggling to create new directories for all these files by referencing the original directory name of each file.
The example mentions using cp but the question/problem itself does not, so I would suggest just using find and tar. Also, though the question is a little ambiguous, the example seems to suggest that the desired output directory is a child of the same directory being searched. Given that:
find . -path "./test2" -prune -o -type f -name '*DETAIL.pdf' -print0 | \
tar c --null --files-from=- | \
tar xC test2
This uses find for file selection, generating a null-separated file list; the first tar reads that list and archives the files to stdout, and the second tar creates the relative directories as needed and writes out the copied files.
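If GNU cp is available, a similar effect is possible without tar, since --parents recreates the relative directory structure under the target (a sketch based on the same find expression):
find . -path ./test2 -prune -o -type f -name '*DETAIL.pdf' -exec cp --parents -t test2 {} +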

Best way to tar and zip files meeting specific name criteria?

I'm writing a shell script on a Linux machine, to be run via crontab, that is meant to move all files older than the current day to a new folder and then tar and zip the entire folder. It seems like a simple task, but for some reason I'm running into all kinds of roadblocks. I'm new to this and self-taught, so any help or redirection would be greatly appreciated.
Specific criteria for which files to archive:
All log files are in /home/tech/logs/ and all pdfs are in /home/tech/logs/pdf
All files are over a day old as indicated by the file name (file name does not include $CURRENT_DATE)
All files must be *.log or *.pdf (i.e. don't archive files that lack $CURRENT_DATE if they aren't log or pdf files).
Filename formatting specifics:
All the log files in /home/tech/logs are named in the format NAME 00_20180510.log, and all the pdf files are in a "pdf" subdirectory (/home/tech/logs/pdf) named in the format NAME 00_20180510_00000000.pdf ("20180510" is the date the file was created and the 0's can be any digits). I need to use the name rather than the file metadata for the creation date, and all files (pdf/log) whose names do not include the current date are "old". I also can't just move every file that doesn't contain $CURRENT_DATE in its name, because that would take any non-*.pdf or non-*.log files with it.
Right now the script creates a new folder with a new pdf subdir for the old files (mkdir -p /home/tech/logs/$ARCHIVE_NAME/pdf). I then want to move the old logs into $ARCHIVE_NAME, and move all old pdfs from the original pdf subdirectory into $ARCHIVE_NAME/pdf.
Current code:
find /home/tech/logs -maxdepth 1 -name ( "*[^$CURRENT_DATE].log" "*.log" ) -exec mv -t "$ARCHIVE_NAME" '{}' ';'
find /home/tech/logs/pdf -maxdepth 1 -name ( "*[^$CURRENT_DATE]*.pdf" "*.pdf" ) -exec mv -t "$ARCHIVE_NAME/pdf" '{}' ';'
This hasn't been working because [^...] is a character class, so find treats the digits of $CURRENT_DATE as a set of characters to exclude rather than as a literal string.
I've considered just using tar's exclude options like this:
tar -cvzPf "$ARCHIVE_NAME.tgz" --directory /home/tech/logs --exclude="$CURRENT_DATE" --no-unquote --recursion --remove-files --files-from="/home/tech/logs/"
But a) it doesn't work, and b) it would theoretically include all files that weren't *.pdf or *.log files, which would be a problem.
Am I overcomplicating this? Is there a better way to go about this?
I would go about this using bash's extended glob features, which allow you to negate a pattern:
#!/bin/bash
shopt -s extglob
mv /home/tech/logs/!(*"$CURRENT_DATE"*).log "$ARCHIVE_NAME"
mv /home/tech/logs/pdf/!(*"$CURRENT_DATE"*).pdf "$ARCHIVE_NAME"/pdf
With extglob enabled, !(pattern) expands to everything that doesn't match the pattern (or list of pipe-separated patterns).
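For example, given hypothetical files a_20180509.log and b_20180510.log in the current directory:
shopt -s extglob
echo !(*20180510*).log    # prints a_20180509.log only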
Using find it should also be possible:
find /home/tech/logs -name '*.log' -not -name "*$CURRENT_DATE*" -exec mv -t "$ARCHIVE_NAME" {} +
Building on Tom Fenech's answer above, optimized to avoid many mv invocations:
find /home/tech/logs -maxdepth 1 -name '*.log' -not -name "*_${CURRENT_DATE?}.log" -print0 | \
xargs -0 mv -t "${ARCHIVE_NAME?}"
An interesting feature of processing the file list through pipes is the ability to filter it with extra tools (e.g. grep), which can arguably be more readable:
find /home/tech/logs -maxdepth 1 -name '*.log' -print0 | grep -zvF "_${CURRENT_DATE?}" | \
xargs -0 mv -t "${ARCHIVE_NAME?}"
Then do the same for the pdf files; by the way, you can "dry-run" any of the above by replacing mv with echo mv.
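For instance, the pdf counterpart would look like this (a sketch following the same pattern, using the $ARCHIVE_NAME/pdf destination from the question):
find /home/tech/logs/pdf -maxdepth 1 -name '*.pdf' -print0 | grep -zvF "_${CURRENT_DATE?}" | \
xargs -0 mv -t "${ARCHIVE_NAME?}/pdf"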
--jjo

Copy modified files with directory structure in linux

How can I copy a list of files modified today, along with their directory structure, into a new directory? As shown in the following command, I want to copy all the files modified today from /dev1/Java/src into /dev2/Java/src. The src folder has many subdirectories.
find /dev1/Java/src -newermt 2014-06-10 > 1.txt
for f in $(cat 1.txt) ; do cp $f /dev2/Java/src; done
You can take advantage of the find and cpio utilities.
cd /dev1/Java/src; find . -mindepth 1 -mtime -1 | cpio -pdmuv /dev2/Java/src
The above command changes to the source directory and finds the list of new files relative to it.
cpio reads that list and copies the files into the target directory with the same structure as the source, hence the need for relative pathnames (-p selects pass-through mode, -d creates directories as needed, -m preserves modification times, -u overwrites unconditionally, and -v is verbose).
Alternatively, this extracts the files modified within the last day and copies them to the desired path, though it copies them flat, without preserving the directory structure:
find . -type f -mtime -1 -exec cp {} /path \;
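If the directory structure needs to be preserved without cpio, GNU cp's --parents flag can do it too (a sketch, run from inside the source directory so the paths stay relative):
cd /dev1/Java/src && find . -type f -mtime -1 -exec cp --parents -t /dev2/Java/src {} +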

How to replace a file in several sub-folders

I have a series of directories containing a set of files. There is a new copy of one of these files, and I would like to replace all instances with it. How can I do this with the find command?
The latest file is in /var/www/html and is called update_user.php.
There are 125 directories, each containing several other files including a copy of update_user.php. I want to replace these copies with the one in /var/www/html, excluding the master copy itself.
This should do the job:
find /path/to/old/files -type f -name update_user.php -exec cp /path/to/new/update_user.php {} \;
You should check that the new file is not inside /path/to/old/files; if it is, first copy it outside and use that copy. But it'll do no harm if you don't: that one cp will simply fail with an "are the same file" error.
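Applied to the layout in the question, this would be something like the following (a sketch; -mindepth 2 assumes the 125 directories sit below /var/www/html, so the master copy at the top level is skipped):
find /var/www/html -mindepth 2 -type f -name update_user.php \
-exec cp /var/www/html/update_user.php {} \;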
You can use
cp -v to see what it does
cp -u to copy only when the source file is newer
echo cp to perform a dry run
I would suggest first checking whether all the destination files are identical, with:
find /path/to/old/files -type f -name update_user.php -exec md5sum {} \; | awk '{print $1}' | sort | uniq
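If that prints more than one checksum, the copies differ. Replacing uniq with uniq -c also shows how many files carry each version:
find /path/to/old/files -type f -name update_user.php -exec md5sum {} \; | awk '{print $1}' | sort | uniq -c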
