How to replace the first line of every file found using "find" - linux

I have the list of files whose first line I want to replace, from running this command:
find . -type f -name "*.txt"
I want to replace the first line of every file found with the text "line 1".
Doing my research I found a way to delete the first line with
ex -sc '1d|x' file.txt
then prepend a file with
echo "line 1"|cat - file.txt > out && mv out file.txt
but I don't know how to delete the first line and prepend for every file found.

You can use -exec:
find . -name "*.txt" -exec sed -i .ORI '1s/.*/line 1/' {} \;
to edit the files in place, saving backups with a .ORI suffix.
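Note that the backup-suffix syntax differs between sed implementations: BSD/macOS sed takes the suffix as a separate argument (as above), while GNU sed expects it attached to -i. A sketch of the GNU form, assuming you want the same .ORI backups:
find . -name "*.txt" -exec sed -i.ORI '1s/.*/line 1/' {} \;
With GNU sed, a bare -i with no suffix edits in place without keeping a backup.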

You can use -exec or xargs with find, but for more complicated commands I always run the processing in a loop. It's clearer that way. Here's what you can do:
find . -type f -name '*.txt' | while read -r f; do
    ex -sc '1d|x' "$f"
    echo "line 1" | cat - "$f" > out && mv out "$f"
done
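If any file name could contain a newline, a null-delimited variant of the same loop is safer (a sketch assuming bash, since it relies on read -d ''):
find . -type f -name '*.txt' -print0 | while IFS= read -r -d '' f; do
    ex -sc '1d|x' "$f"
    echo "line 1" | cat - "$f" > out && mv out "$f"
done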

Related

Remove closing PHP tags from all files

I am trying to find and remove all the closing PHP tags ?> from the end of the PHP files on a website.
I have used:
find . -type f -exec sh -c 'tail -n 1 "$1" | grep -q "?>" && printf "%s\n" "$1"' -- {} \;
This gives me a list of all the files that end with the PHP tag, which is expected, but the list is over 500 files long, so it would take a lot of time to go through manually.
I'm looking for a relatively easy way to do this if it is possible.
I've tried using sed like so:
find . -type f -exec sh -c 'tail -n 1 "$1" | grep -q "?>" && printf "%s\n" "$1"' -- {} \; | sed -i '$ d' ./*
But this has two issues: it doesn't go into directories, and it removes the last line from all files rather than just the ones that have the PHP closing tag.
I am expecting it to find all files with ?> on the very last line (I can get this to work) and then delete that last line with the tag.
I feel like I'm quite close, just missing something.
To remove the last line only if it contains ?>, you may use
sed '${/?>/d;}'
The '${/?>/d;}' means:
$ - get the last line only
/?>/ - match the line only if it contains ?> text
d - delete the line.
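A quick demo on made-up input shows the effect; only the final line is removed, and only because it contains the tag:
printf '<?php\necho "hi";\n?>\n' | sed '${/?>/d;}'
# prints:
# <?php
# echo "hi";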
To run the sed command recursively on a directory, use
cd /your/dir/here && find . -type f -print0 | xargs -0 sed -i '${/?>/d;}'
See more specific solutions for recursive file matching at How to do a recursive find/replace of a string with awk or sed?

Save command output in a variable and write a for loop

I want to write a shell script. I list my jpg files inside nested subdirectories with the following command line:
find . -type f -name "*.jpg"
How can I save the output of this command inside a variable and write a for loop for that? (I want to do some processing steps for each jpg file)
You don't want to store output containing multiple file names in a variable/array and post-process it later. You can just act on the files as you go.
Assuming you have bash shell available, you could write a small script as
#!/usr/bin/env bash
# ^^^^ bash shell needed over any POSIX shell because
# of the need to use process-substitution <()
while IFS= read -r -d '' image; do
    printf '%s\n' "$image"
    # Your other actions can be done here
done < <(find . -type f -name "*.jpg" -print0)
The -print0 option writes file names with a null-byte terminator, which the read command then consumes one name at a time. This ensures that file names containing special characters are handled without choking on them.
Rather than storing the list in a variable, use this:
find . -type f -name "*.jpg" -exec command {} \;
If you want, command can even be a full shell script.
A demo is better than an explanation, no? Copy-paste these lines into a terminal:
cat<<'EOF' >/tmp/test
#!/bin/bash
echo "I play with $1 and I can replay with $1, even 3 times: $1"
EOF
chmod +x /tmp/test
find . -type f -name "*.jpg" -exec /tmp/test {} \;
Edit: a new demo (prompted by follow-up questions in the comments)
find . -type f -name "*.jpg" | head -n 10 | xargs -n1 command
(this alternative does not handle file names with newlines or spaces)
This one does:
#!/bin/bash
shopt -s globstar
count=0
for file in **/*.jpg; do
    # stop after the first 10 files
    if ((++count <= 10)); then
        echo "process file $file number $count"
    else
        break
    fi
done

Recursively prepend text to file names

I want to prepend text to the name of every file of a certain type - in this case .txt files - located in the current directory or a sub-directory.
I have tried:
find -L . -type f -name "*.txt" -exec mv "{}" "PrependedTextHere{}" \;
The problem with this is dealing with the ./ part of the path that comes with the {} reference.
Any help or alternative approaches appreciated.
You can do something like this
find -L . -type f -name "*.txt" -exec bash -c 'echo "$0" "${0%/*}/PrependedTextHere${0##*/}"' {} \;
Where
bash -c '...' executes the command
$0 is the first argument passed in, in this case {} -- the full filename
${0%/*} removes everything including and after the last / in the filename
${0##*/} removes everything before and including the last / in the filename
Replace the echo with a mv once you're satisfied it's working.
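To see what those expansions produce, here is a throwaway example with a made-up path:
f=./docs/notes.txt
echo "${f%/*}"                               # ./docs
echo "${f##*/}"                              # notes.txt
echo "${f%/*}/PrependedTextHere${f##*/}"     # ./docs/PrependedTextHerenotes.txt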
Are you just trying to move the files to a new file name that has Prepend before it?
for F in *.txt; do mv "$F" Prepend"$F"; done
Or do you want it to handle subdirectories and prepend between the directory and file name:
dir1/PrependA.txt
dir2/PrependB.txt
Here's a quick shot at it. Let me know if it helps.
for file in $(find -L . -type f -name "*.txt")
do
    parent=$(echo "$file" | sed "s=\(.*/\).*=\1=")
    name=$(echo "$file" | sed "s=.*/\(.*\)=\1=")
    mv "$file" "${parent}PrependedTextHere${name}"
done
This ought to work, as long as the file names do not contain newline characters. If they might, have find use -print0 and read the names null-delimited (see the sketch after the script).
#!/bin/sh
IFS='
'
for I in $(find -L . -name '*.txt' -print); do
    echo mv "$I" "${I%/*}/prepend-${I##*/}"
done
P.S. Remove the echo to make the script take effect; it's there to avoid accidental breakage for people who copy-paste stuff from here straight into their shell.
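For completeness, a null-delimited sketch of the same rename (assuming bash rather than plain sh, since it needs read -d ''):
find -L . -name '*.txt' -print0 | while IFS= read -r -d '' I; do
    echo mv "$I" "${I%/*}/prepend-${I##*/}"
done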

How to perform a for-each loop over all the files under a specified path?

The following command attempts to enumerate all *.txt files in the current directory and process them one by one:
for line in "find . -iname '*.txt'"; do
echo $line
ls -l $line;
done
Why do I get the following error?:
ls: invalid option -- 'e'
Try `ls --help' for more information.
Here is a better way to loop over files as it handles spaces and newlines in file names:
#!/bin/bash
find . -type f -iname "*.txt" -print0 | while IFS= read -r -d $'\0' line; do
    echo "$line"
    ls -l "$line"
done
The for-loop iterates over each (space-separated) entry of the provided string.
You do not actually execute the find command, but provide it as a string (which the for-loop then iterates over).
Instead of the double quotes use either backticks or $():
for line in $(find . -iname '*.txt'); do
    echo "$line"
    ls -l "$line"
done
Furthermore, if your file paths/names contain spaces, this method fails (since the for-loop iterates over space-separated entries). In that case it is better to use the method described in dogbane's answer.
To clarify your error:
As said, for line in "find . -iname '*.txt'"; iterates over all space separated entries, which are:
find
.
-iname
'*.txt' (I think...)
The first two do not result in an error (besides the undesired behavior), but the third is problematic as it executes:
ls -l -iname
Many command-line utilities allow single-character options to be combined, so ls parses -iname as -i -n -a -m -e. And voilà: your invalid option -- 'e' error!
A more compact version that works with spaces and newlines in file names:
find . -iname '*.txt' -exec sh -c 'echo "{}" ; ls -l "{}"' \;
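Embedding {} inside the quoted script breaks on file names that contain double quotes; a slightly longer but safer sketch passes the name as a positional argument instead (the same pattern used in the PHP-tag question above):
find . -iname '*.txt' -exec sh -c 'echo "$1"; ls -l "$1"' sh {} \;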
Use command substitution rather than quotes, so that find is executed instead of being passed as a string:
for line in $(find . -iname '*.txt'); do
    echo "$line"
    ls -l "$line"
done

How do I replace the word "hello" with "goodbye" in every file in this directory, and also recursively?

Suppose I have many files in this directory. I want to replace "hello" with "goodbye" everywhere, also recursively.
find . -type f -exec sed -i 's/hello/goodbye/g' {} +
for file in $(find ./ -type f); do sed -e 's/hello/goodbye/g' "$file" > tmp && mv tmp "$file"; done
(note this still breaks on file names containing whitespace; the find -exec form above does not)
You can use a perl one-liner:
perl -p -i -e 's/oldstring/newstring/g' `find ./ -name '*.html'`
(Taken from here http://joseph.randomnetworks.com/2005/08/18/perl-oneliner-recursive-search-and-replace/)
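If you want to avoid the backtick expansion (which word-splits file names), a hedged alternative is to let find hand the files to perl directly:
find . -type f -name '*.html' -exec perl -p -i -e 's/oldstring/newstring/g' {} +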
