linux shell script to copy directory tree and link files [closed]

I would like to create a copy of a directory tree and soft-link the files in it.
For example, from
/home/user/origin/a/sub/file.txt
I would like to get
/home/user/destination/a/sub/file.txt
with the latter being a symbolic link to the original file.txt.
I tested with
find /home/user/origin/ -type d -printf "mkdir -vp '/home/user/destination%p'\n" -o -type f -printf "ln -vs '%p' '/home/user/destination%p'\n" | sh
but it has two problems:
I'd like to copy from origin to destination, but it copies from origin to /home/user/destination/home/user/origin. That is not a big deal, as I can move the tree afterwards.
If the file name is something like
In Fifty Years We'll All Be Chicks.txt
It stops working because of the ' in the name.

Assuming I understand what you're trying to do, it seems easier to just use -exec:
find /home/user/origin/ \
-type d -exec sh -c 'mkdir -v "/home/user/destination/${0#/home/user/origin/}"' {} \; \
-o \
-type f -exec sh -c 'ln -vs "$0" "/home/user/destination/${0#/home/user/origin/}"' {} \;
Note that -o having lower precedence than the implied -a between the tests and actions is important here.
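If GNU coreutils is available, cp can also build such a symlink tree in one step; a minimal sketch, assuming a GNU cp (note that with -s the source has to be given as an absolute path):
mkdir -p /home/user/destination
cp -as /home/user/origin/. /home/user/destination/
Here -a recreates the directory structure (preserving attributes), while -s makes symbolic links to the files instead of copying them.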

Related

Find and replace file/folder names and contents in whole Linux file system [closed]

I have a Linux (Ubuntu) server on which I am hosting a website. I am changing my domain name, let's say from xxxxx.xx to yyyyy.yy.
What I would like is to find xxxxx.xx and replace it with yyyyy.yy, both in all file and folder names and in all file contents across the whole file system, to reflect this change.
I don't believe this question has been asked in its entirety, but if I've missed it please point me in the right direction. Thanks.
You can use GNU find and a bit of bash string manipulation for the actual renaming.
*xxxx.xx* is a glob pattern that matches files/folders with this string anywhere in their names.
I strongly recommend NOT running the renames straight away; run the commands with echo first to see whether the files are listed properly. I'm providing two separate commands, one for renaming files and one for folders, because renaming folders needs a couple of extra options so that the renames don't break paths find still has to descend into. Note that the commands use bash -c rather than sh -c, because the ${x/.../...} substitution is a bash feature.
For renaming folders:
find . -depth -type d -name "*xxxx.xx*" -execdir bash -c 'x=$1; y="${x/xxxx.xx/yyyy.yy}"; mv -v "$x" "$y"' bash {} \;
For files:
find . -type f -name "*xxxx.xx*" -exec bash -c 'x=$1; y="${x/xxxx.xx/yyyy.yy}"; mv -v "$x" "$y"' bash {} \;
Do NOT run the commands above right away; first run the commands below to check that the original and renamed file/folder names come out as you intend.
find . -type f -name "*xxxx.xx*" -exec bash -c 'x=$1; y="${x/xxxx.xx/yyyy.yy}"; echo "$x" "$y"' bash {} \;
(and)
find . -depth -type d -name "*xxxx.xx*" -execdir bash -c 'x=$1; y="${x/xxxx.xx/yyyy.yy}"; echo "$x" "$y"' bash {} \;
Since you also want to change file contents, add an in-place sed substitution after the rename:
find . -type f -name "*xxxx.xx*" -exec bash -c 'x=$1; y="${x/xxxx.xx/yyyy.yy}"; mv -v "$x" "$y"; sed -i "s/xxxx\.xx/yyyy\.yy/g" "$y"' bash {} \;
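Note that this only edits the contents of files whose names match the pattern. If the old domain can also appear inside files whose names do not contain it, something along these lines (GNU grep and sed assumed) would cover those as well; review the file list produced by the grep part before letting sed loose on it:
grep -rlZ 'xxxx\.xx' . | xargs -0 sed -i 's/xxxx\.xx/yyyy\.yy/g'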

find *.tar then extract and delete the files [closed]

I'm trying to find a tar file, extract the files, then remove all the extracted files. I'm able to perform the find and extraction, or find the file and remove it, but I'm not able to string all three together.
Here is my best attempt below. It runs without error but doesn't delete the extracted files so I'm stuck on how to remove the files I've extracted to the current directory.
find ~ -name '*.tar' | xargs tar -xf && rm -f
I tried extracting the tar to another directory then removing the directory but couldn't get it to work while using xargs. I've tried searching quite a few different areas but couldn't find anything so I appreciate the help.
The && ends the pipeline; it's not part of the xargs command, so rm -f runs on its own (with no arguments) after the pipeline finishes.
You can just run the commands using the -exec option to find:
find ~ -name '*.tar' -exec tar -xf {} \; -exec rm -f {} \;
To run two or more commands per file with xargs:
find ~ -name '*.tar' | xargs -I {} sh -c 'tar -xf "$1" && rm -f "$1"' sh {}
The tar file is deleted only after it has been unpacked successfully.
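If what you actually need to remove is the extracted files rather than the archive, one way, assuming the .tar is still present, is to delete exactly the regular files it lists. A rough sketch for a single archive (archive.tar stands for whatever find turned up):
tar -tf archive.tar | while IFS= read -r f; do [ -f "$f" ] && rm -v -- "$f"; done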

Linux find with prune and negation [closed]

I want to grep all files in a directory except for
subdirectories of lib
images (png and jpg)
I'm doing it in a shell script, passing the arguments to grep, no problem.
This command excludes the subdirectories of lib
find src \
-name lib -prune -o \
-type f -exec grep -P "$#" {} +
and this one excludes the images
find src \
! -name "*.jpg" ! -name ".png" \
-type f -exec grep -P "$#" {} +
Put together as
find src \
-name lib -prune -o \
! -name "*.jpg" ! -name ".png" \
-type f -exec grep -P "$#" {} +
it fails to exclude the images. Any idea what's going on?
It fails to exclude the png images because you left out the * in -name ".png"; it should be -name "*.png".
A generally useful alternative is to filter the results with grep after a pipe, which keeps the find command itself simple:
find [simplified find options] | egrep -v '\.(jpg|png)$'
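For completeness, the combined command from the question with just that * added back in:
find src \
-name lib -prune -o \
! -name "*.jpg" ! -name "*.png" \
-type f -exec grep -P "$#" {} +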

chmod exclusions [closed]

I know I should assign a group and then set a umask so that group-writable permissions persist, but for whatever reason I can't do this. I need to chmod a directory recursively, except for one subfolder (web10). Would the following work?
cd /var/www/clients/
find . -type f -not -path "*web10*" -exec chmod 777 '{}' \;
If you want to exclude files or directories, use -prune:
find /var/www/clients/ -name web10 -type d -prune -o -type f -print0 | xargs -0 chmod 0640
You should also use xargs where possible. With -exec the command is run once for every file found, whereas xargs collects as many files as possible and runs the command once for N files, which is more efficient. (find's -exec ... + form batches arguments in the same way.)
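Applied to the question itself (a recursive chmod that skips the web10 subtree), that would look something like the following; the 777 mode is taken from the question, though a more restrictive mode is usually advisable:
cd /var/www/clients/
find . -name web10 -type d -prune -o -type f -print0 | xargs -0 chmod 777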

exclude directories mv unix [closed]

The command below moves every hidden or normal file ending in string, as long as the character just before string is not . or _.
mv {.,}*[!._]string /destination
How can I also exclude moving all directories in the above command?
Try
find /WHERE/TO/FIND -name '*STRING' ! -name '*_STRING' ! -name '*.STRING' -type f -exec mv {} /WHERE/TO/MOVE \;
Note: if you want to move files from only the /WHERE/TO/FIND directory itself (not its subdirectories), add -maxdepth 1 right after the starting directory.
How about:
for file in {.,}*[!._]string; do test -f "$file" && mv "$file" /destination; done
In what shell does the [!._] glob actually work when combined with {.,}? You would probably be better off avoiding the brace notation and doing:
for file in .*[!._]string *[!._]string; do ... ; done
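Combining the two suggestions, with the same test/mv body as above:
for file in .*[!._]string *[!._]string; do test -f "$file" && mv "$file" /destination; done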
