Linux find and replace

How can I replace "abc" with "abcd" in all files in a folder using the shell?
Is it possible with the sed command?

Try the following command for the file file.txt:
sed -i 's/abc/abcd/g' file.txt
Try the following command for all files in the current folder:
find . -maxdepth 1 -type f -exec sed -i 's/abc/abcd/g' {} \;
For the files in the current directory and all subdirectories:
find . -type f -exec sed -i 's/abc/abcd/g' {} \;
Or, if you are a fan of xargs:
find . -type f | xargs -I {} sed -i 's/abc/abcd/g' {}
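If filenames may contain spaces or newlines, a null-delimited variant is safer (a sketch assuming GNU find and xargs, which support -print0/-0):
find . -type f -print0 | xargs -0 sed -i 's/abc/abcd/g'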

sed -i 's/abc/&d/g' *
should work.
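The & in the replacement stands for whatever the pattern matched, so this appends d to every abc. A quick check:
echo abc | sed 's/abc/&d/'
prints abcd.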

Yes:
find /the/folder -type f -exec sed -i 's,\<abc\>,&d,g' {} \;
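The \< and \> word boundaries (a GNU sed extension) restrict the change to abc as a whole word, so longer words are untouched. For example:
echo 'abc abcdef' | sed 's/\<abc\>/&d/g'
prints abcd abcdef.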

Related

sed ack search / replace line break with string

I have looked into a few SO threads, none of which have helped my specific situation.
I am trying to update a PHP app that I took over from PHP 5.6 to PHP 8.0.
With that said, there are MANY instances that look like:
<?
echo ...
function
I need to find all cases where <? is followed directly by a newline and replace it with <?php(newline)
Per the SO posts I've read, I think I am coming close with the following:
find ./ -type f -readable -writable -exec sed -i ":a;N;$!ba;s/\<\?\n/\<\?php\n/g" {} \;
I think I am close, but I can't figure out why it won't replace <?\n with <?php\n, since the sed statement works without the newline. Per THIS POST it looks like I am doing it correctly.
What am I doing wrong?
Iterations I've tried:
$ find ./ -type f -readable -writable -exec sed -i ":a;N;$!ba;s/\<\?\n/\<\?php\n/g" {} \;
$ find ./ -type f -readable -writable -exec sed -i ":a;N;$!ba;s/<\?\n/<\?php\n/g" {} \;
$ find ./ -type f -readable -writable -exec sed -i ":a;N;$!ba;s/<?\n/<?php\n/g" {} \;
$ find ./ -type f -readable -writable -exec sed -i ":a;N;$!ba;s/<?\n\r/<?php\n/g" {} \;
$ find ./ -type f -readable -writable -exec sed -i ":a;N;$!ba;s/<?\r\n/<?php\n/g" {} \;
The sed command itself could be something as simple as:
sed -i 's/<?$/<?php/'
Glue that together with find and it might work for you.
$ is an anchor matching the end of a line, you might consider using ^ to anchor the match to the beginning as well:
s/^<?$/<?php/
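Glued together with find, it could look like this (just a sketch, assuming you only want to touch .php files; keep the sed script in single quotes, since inside double quotes the shell expands $! before sed ever sees it):
find . -type f -name '*.php' -exec sed -i 's/^<?$/<?php/' {} \;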

Find and exec command in Linux. Is it possible to pipe two find and exec commands

I'm trying to accomplish this task:
1) Find directory A (DIR_A) and copy all files in the directory (including its sub-directories, if any) into a new directory called DIR_B
2) In directory DIR_B, replace the word apple with orange
I executed the following code and for some reason it copies all the files but fails on the second task (replace apple with orange). I would appreciate help on this. Below is my code:
find DIR_A -iname FILEA -type f -exec cp {} DIR_B \;|find DIR_B/ -iname \*.* -type f -exec sed -i "s|apple|orange|g" {} \;
Rather than trying to pipe the output from one find into the other, why not just run them sequentially? I'm not sure that find reads from its stdin.
find DIR_A -iname FILEA -type f -exec cp {} DIR_B \; ; find DIR_B/ -iname \*.* -type f -exec sed -i "s|apple|orange|g" {} \;
I've replaced your pipe with a semi-colon.
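A small variation (just a sketch): using && instead of ; makes the second find run only if the first find command itself exits successfully:
find DIR_A -iname FILEA -type f -exec cp {} DIR_B \; && find DIR_B/ -iname \*.* -type f -exec sed -i "s|apple|orange|g" {} \;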
Try this:
Sed syntax:
sed 's/old/new/g'
find DIR_A -iname FILEA -type f -exec cp {} DIR_B \; ; find DIR_B/ -iname \*.* -type f -exec sed -i "s/apple/orange/g" {} \;
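As a quick check of the substitution itself:
echo 'apple pie' | sed 's/apple/orange/g'
prints orange pie.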

How to convert some files from DOS format to Unix

I know how to change the file format from DOS to Unix by using dos2unix, but how can I change ALL the files under a directory tree? Can dos2unix change files recursively?
for example, I have some files like following:
TOPDIR
|
+-----dir1
| |
| +---file1,file2, file3
|
+-----dir2
|
+---file4,file5
How can I change them all in one go, or with a shell script?
Better to do find /path -type f -exec dos2unix '{}' \;
find /path -name '*' -type f -exec dos2unix {} \;
dos2unix -k `find . -type f`
find . -type f -exec dos2unix -k '{}' \;
find . -type f -print | xargs dos2unix -k
Any of the above commands can be run from TOPDIR.
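If filenames may contain spaces, a null-delimited variant of the xargs form (a sketch assuming GNU find and xargs) avoids word-splitting:
find . -type f -print0 | xargs -0 dos2unix -k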

find & sed: remove lines

I am trying to delete some lines in PHP files. I tried to use a find/exec combination:
find . -name '*.php' -exec sed '/#category/d' {} \;
but it only prints out the file contents. Is there anything wrong with the syntax? Or what is the problem?
Could you try this command:
find . -name '*.php' -exec sed -i '/#category/d' {} \;
I think you've missed the -i option.
It works, but probably not how you expect.
find . -name '*.php' -exec sed -i '/#category/d' {} \;
Will kill the lines in question.
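If your sed has no -i (plain POSIX sed), an equivalent approach (just a sketch) is to write to a temporary file and rename it back:
find . -name '*.php' -exec sh -c 'sed "/#category/d" "$1" > "$1.tmp" && mv "$1.tmp" "$1"' _ {} \;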
This should be the command for sed, so try adding -i (here with a .bak backup suffix):
sed -i.bak '/culpa/d' test.txt
find . -name '*.php' -exec sed -i '/#category/d' {} \;
Source of the answer:
Bash - find a keyword in a file and delete its line

dos2unix command

I have this script
#!/bin/sh
for i in `ls -R`
do
echo "Changing $i"
fromdos $i
done
I want to remove "^M" characters from many files spread across several subdirectories. I got this:
fromdos: Unable to access file
Is there something I'm missing?
Thanks in advance.
ls -R lists everything, including directories. So in some cases you're telling fromdos to act on actual directories.
Try something like this:
find . -type f -exec fromdos {} \;
I guess you don't need a for loop.
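If you do want a loop (a sketch assuming bash and GNU find, since read -d '' and -print0 are not plain sh features), iterate over find's null-delimited output instead of parsing ls -R:
find . -type f -print0 | while IFS= read -r -d '' f; do
    echo "Changing $f"
    fromdos "$f"
done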
Here is a quick panorama of solutions for files with the extension ".ext" (it is wise to keep such commands somewhat restrictive):
note: a literal ^M is typed with "CTRL-V" + "CTRL-M"
# PORTABLE SOLUTION
find /home -type f -name "*.ext" -exec sed -i -e 's/^M$//' {} \;
# GNU-sed
find /home -type f -name "*.ext" -exec sed -i -e "s/\x0D$//g" {} \;
# SED with more recent nux
find /home -type f -name "*.ext" -exec sed -i -e "s/\r$//g" {} \;
# DOS2UNIX
find /home -type f -name "*.ext" -print0 | while read -r -d "$(printf "\000")" -r path; do dos2unix $path $path"_new"; done
# AWK
find /home -type f -name "*.ext" -print0 | while read -r -d "$(printf "\000")" -r path; do awk '{ sub("\r$", ""); print }' $path > $path"_new"; done
# TR
find /home -type f -name "*.ext" -print0 | while read -r -d "$(printf "\000")" -r path; do cat $path | tr -d '\r' > $path"_new"; done
# PERL
find /home -type f -name "*.ext" -exec perl -pi -e 's/\r//g' {} \;
