Replace strings in PHP files with a shell script - Linux

Good morning, everyone. I am trying to replace a series of characters in several PHP files, taking the following into account:
The files contain lines like this:
if($_GET['x']){
which I want to replace with:
if(isset($_GET['x'])){
But we must take into account that the files also contain lines like the following, which I do not want to modify:
if($_GET["x"] == $_GET["x"]){
I tried the following, but I cannot get it to work because it changes all lines containing $_GET["x"]:
My example:
find . -name "*.php" -type f -exec ./code.sh {} \;
sed -i 's/\ if($_GET['x']){/ if(isset($_GET['x'])){/' "$1"

find . -name "*.php" -type f -print0 | xargs -0 sed -i -e "s|if *(\$_GET\['x'\]) *{|if(isset(\$_GET['x'])){|g" --
The pattern above for if($_GET['x']){ would never match if($_GET["x"] == $_GET["x"]){.
Update:
This would change if($_GET['x']){ or if($_GET["x"]){ to if(isset($_GET['x'])){:
find . -name "*.php" -type f -print0 | xargs -0 sed -i -e "s|if *(\$_GET\[[\"']x[\"']\]) *{|if(isset(\$_GET['x'])){|g" --
Another update:
find . -name "*.php" -type f -print0 | xargs -0 sed -i -e "s|if *(\$_GET\[[\"']\([^\"']\+\)[\"']\]) *{|if(isset(\$_GET['\1'])){|g" --
Would change anything in the form of if($_GET['<something>']){ or if($_GET["<something>"]){.
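As a quick sanity check (test.php here is just a throwaway sample file made up for this demonstration), you can run the expression without -i first and confirm that only the single-condition line is rewritten:
printf '%s\n' "if(\$_GET['x']){" 'if($_GET["x"] == $_GET["x"]){' > test.php   # sample file for illustration only
sed -e "s|if *(\$_GET\[[\"']\([^\"']\+\)[\"']\]) *{|if(isset(\$_GET['\1'])){|g" test.php
# the first line becomes if(isset($_GET['x'])){, the comparison line is left untouched
Once the output looks right, add -i back to edit the files in place.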

Related

How to make find . -name "*.txt" | xargs grep "text" work with file names containing spaces

find . -name "*.txt" | xargs grep "text"
fails when a file name contains spaces.
How can I make this work with file names that have spaces?
try this:
find . -name "*.txt" -print0 | xargs -0 grep "text"
This will work for all file names and it will also be slightly more efficient because it avoids the need for a pipeline:
find . -name "*.txt" -exec grep "text" {} +
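For instance, with a hypothetical file named my notes.txt (created below just for the demonstration), the unquoted pipeline splits the name into two words, while the -print0/-0 pair keeps it intact:
mkdir -p demo && printf 'text\n' > 'demo/my notes.txt'    # throwaway test file
find demo -name "*.txt" | xargs grep "text"               # fails: the name is split into two arguments
find demo -name "*.txt" -print0 | xargs -0 grep "text"    # works: matches inside 'demo/my notes.txt'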

"find -type f -iname '*.txt' | xargs -I {} echo ""> {}" does not work

I am working in Bash on Linux and would like to clear the contents of all .txt files. However, the command "find -type f -iname '*.txt' | xargs -I {} echo "" > {}" does not seem to work. Any suggestions? Any ideas about a better solution?
The > {} redirection is interpreted by the shell that runs the pipeline, not by xargs, so it just creates a file literally named {} instead of truncating each result. I replaced echo with truncate in order to clear each file and used find's -exec instead of piping to xargs:
find . -type f -name "*.txt" -exec truncate -s 0 {} \;
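If you would rather keep the xargs form, a null-delimited variant (a sketch, assuming GNU findutils and the coreutils truncate command) would be:
find . -type f -iname "*.txt" -print0 | xargs -0 truncate -s 0   # truncates every .txt file to zero bytes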

Retrieve specific files under a directory in Linux

I want to list specific files under a directory on Linux.
For example:
I have the following sub-directories in my current directory:
Feb 16 00:37 a1
Feb 16 00:38 a2
Feb 16 00:36 a3
Now if I do ls a*, I see:
bash-4.1$ ls a*
a:
a1:
123.sh 123.txt
a2:
a234.sh a234.txt
a3:
a345.sh a345.txt
I want to filter out only the .sh files so that the output is:
a1:
123.sh
a2:
a234.sh
a3:
a345.sh
Is it possible?
Moreover, is it possible to also print the first line of each .sh file?
The following find command should work for you:
find . -maxdepth 2 -mindepth 2 -path '*/a*/*.sh' -print -exec head -n1 {} \;
Have a look at the options: -mindepth 2 -maxdepth 2 restricts the search to entries exactly two levels below the current directory, -path '*/a*/*.sh' matches .sh files inside the a* directories, -print shows each path, and -exec head -n1 prints its first line. I hope this is what you are looking for.
basic 'find file' commands
find / -name foo.txt -type f -print # full command
find / -name foo.txt -type f # -print isn't necessary
find / -name foo.txt # don't have to specify "type==file"
find . -name foo.txt # search under the current dir
find . -name "foo.*" # wildcard
find . -name "*.txt" # wildcard
find /users/al -name Cookbook -type d # search '/users/al'
search multiple dirs
find /opt /usr /var -name foo.scala -type f # search multiple dirs
case-insensitive searching
find . -iname foo # find foo, Foo, FOo, FOO, etc.
find . -iname foo -type d # same thing, but only dirs
find . -iname foo -type f # same thing, but only files
find files with different extensions
find . -type f \( -name "*.c" -o -name "*.sh" \) # *.c and *.sh files
find . -type f \( -name "*cache" -o -name "*xml" -o -name "*html" \) # three patterns
find files that don't match a pattern (-not)
find . -type f -not -name "*.html" # find all files not ending in ".html"
find files by text in the file (find + grep)
find . -type f -name "*.java" -exec grep -l StringBuffer {} \; # find StringBuffer in all *.java files
find . -type f -name "*.java" -exec grep -il string {} \; # ignore case with -i option
find . -type f -name "*.gz" -exec zgrep 'GET /foo' {} \; # search for a string in gzip'd files
Only using ls, you can get the .sh files and their parent directory with:
ls -1 * | grep ":\|.sh" | grep -B1 .sh
Which will provide the output:
a1:
123.sh
a2:
a234.sh
a3:
a345.sh
However, note that this won't behave correctly if you have a file named, for example, 123.sh.txt.
To print the first line of every .sh file:
head -n1 $(ls -1 */*.sh)
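If any of the paths contain spaces, the command substitution above will split them; a plain glob loop (a small sketch) avoids that and also labels each line with its file:
for f in */*.sh; do printf '%s: ' "$f"; head -n 1 "$f"; done   # prints "path: first line" per script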
Yes, and it is very easy to do with ls itself:
ls -d */*.sh
If you would like to print it with one entry per line:
$ ls -d */*.sh | tr ' ' '\n'
d1/file.sh
d2/file.sh
d3/file.sh
Or
ls -d */*.sh | tr '/' '\n'
the output:
d1
file.sh
d2
file.sh
d3
file.sh
Also, for just the first entry if you want:
$ ls -d */*.sh | tr ' ' '\n' | head -n 1
d1/file.sh

Create tar.gz on Linux with a specific list of files from sed output

Here is my command line:
find . -type f -exec file {} \; \
| sed 's/\(.*png\): .* \([0-9]* x [0-9]*\).*/\2 \1/' \
| sed 's/\(.*jpg\): .* \([0-9]*x[0-9]*\).*/\2 \1/' \
| awk 'int($1) < 1000' \
| sed 's/^.*[[:blank:]]//' \
| tar -czvf images.tar.gz --null -T -
And the error I got is:
tar: Unix\n./test.png\n./test2.jpg\n: Cannot stat: No such file or
directory
tar: Exiting with failure status due to previous errors
What I want is to find all images in the current directory whose width is less than 1000 px and tar them into an archive.
To use --null, you need to convert newlines to NUL bytes first:
...
| tr '\n' '\0' \
| tar -czvf images.tar.gz --null -T -
(tested, working.)
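Applied to the original pipeline from the question, the minimal fix is just to insert the tr stage before tar:
find . -type f -exec file {} \; \
| sed 's/\(.*png\): .* \([0-9]* x [0-9]*\).*/\2 \1/' \
| sed 's/\(.*jpg\): .* \([0-9]*x[0-9]*\).*/\2 \1/' \
| awk 'int($1) < 1000' \
| sed 's/^.*[[:blank:]]//' \
| tr '\n' '\0' \
| tar -czvf images.tar.gz --null -T -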
also, here are a number of suggestions on speed and style in decreasing order of importance.
a. don't run find and file on more files than you need to:
find . -type f \( -iname "*.png" -or -iname "*.jpg" \)
b. for commands that can run on multiple files per command, such as file, use xargs to save a lot of time:
find . -type f \( -iname "*.png" -or -iname "*.jpg" \) -print0 | xargs -0 file
c. if you put | at the end of each line, you can continue on the next line without also using \.
find . -type f \( -iname "*.png" -or -iname "*.jpg" \) -print0 |
xargs -0 file
d. since your max width is 999, you can save yourself a lot of trouble by just grepping for 1-, 2-, or 3-digit widths, though the awk '$1 < 1000' is ultimately better in case you ever want to use a different threshold:
find . -type f \( -iname "*.png" -or -iname "*.jpg" \) -print0 |
xargs -0 file |
grep ', [0-9][0-9]\?[0-9]\? x '
e. grep and awk are faster than sed, so use them where possible:
find . -type f \( -iname "*.png" -or -iname "*.jpg" \) -print0 |
xargs -0 file |
grep ', [0-9][0-9]\?[0-9]\? x ' |
grep -o -i '.*\.\(png\|jpg\)'
final command:
find . -type f \( -iname "*.png" -or -iname "*.jpg" \) -print0 |
xargs -0 file |
grep ', [0-9][0-9]\?[0-9]\? x ' |
grep -o -i '.*\.\(png\|jpg\)' |
tr '\n' '\0' |
tar -czvf images.tar.gz --null -T -
You can also do it with awk alone:
find . -type f \( -name "*.png" -or -name "*.jpg" \) -exec file {} \; | awk -v width_limit=1000 '
{
    match($0, /,\s+([0-9]+)\s*x\s*([0-9]+)/, items)
    if (items[1] < width_limit) {
        match($0, /(.*):/, filename)
        print filename[1]
    }
}' | tar -czvf allfiles.tar -T -
The width can be configured with the width_limit variable. Note that the three-argument form of match() used here is a GNU awk (gawk) extension.
A quick way using perl:
find . -type f -exec file {} + |
perl -ne '
print $1."\0" if /^(.*):\s*(JPEG|PNG).*,\s*(\d+)\s+x\s*\d+\s*,/ &&
$3 < 1000;
' | tar -czvf images.tar.gz --null -T -
Using the + terminator with -exec has the same effect as -print0 | xargs -0.

Find's placeholder {}: sed from variable1 and put into variable2

When I use this script:
for line in "`cat fromDirs.txt`";
do
find "$line" -type f \( -name '*good*' -o -exec grep -F "(NODES_'TASK')" {} \; \) -exec cp {} /tmp/ \;;
done
I get only bare file names in /tmp, but I need the copied file names to contain the full paths they came from. I am tired of fighting with sed, please help.
So I just need to take each {} value and replace every slash (/) with a minus sign (-).
I have tried many variants but nothing works; this code does not work either:
for line in "`cat fromDirs.txt`";
do
find "$line" -type f \( -name '*good*' -o -exec grep -F "(NODES_'TASK')" {} \; \) -exec cp {} /tmp/$(sed "s/\//-/g" <<< {}) \;;
done
The file fromDirs.txt contains:
/home/orders/
/etc/bin/school/
There is no output, nothing happens at all. Maybe because I use sh? I do not have bash on the system at all.
I think the problem is that sed reads the placeholder {} as a file instead of a string, so if {} = /home/orders/good.php then sed opens that file and changes all of its slashes to minus signs. But I need to change the slashes only in the file name, so /home/orders/good.php -> -home-orders-good.php, and then cp it to /tmp/-home-orders-good.php.
I guess you get the problem because you double-quote the output of the file.
Try changing from:
for line in "`cat fromDirs.txt`";
to:
for line in `cat fromDirs.txt`;
or better (replace the old and outdated backticks):
for line in $(cat fromDirs.txt);
best (use while to read the file):
#!/bin/bash
while read line
do
find "$line" -type f \( -name '*good*' -o -exec grep -F "(NODES_'TASK')" {} \; \) -exec cp {} /tmp/$(sed "s/\//-/g" <<< {}) \;;
done < fromDirs.txt
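Note that this still leaves the $(sed ... <<< {}) problem: the command substitution is expanded by the outer shell before find ever runs, so sed still sees the literal string {}. A minimal sketch that actually performs the slash-to-minus rename (assuming a POSIX sh and GNU find; grep's -q flag just silences the match output) runs a small sh -c loop on each batch of matches instead:
#!/bin/sh
# For each directory listed in fromDirs.txt, copy every matching file to /tmp,
# turning the slashes of its path into minus signs,
# e.g. /home/orders/good.php -> /tmp/-home-orders-good.php
while read -r line
do
    find "$line" -type f \( -name '*good*' -o -exec grep -qF "(NODES_'TASK')" {} \; \) \
        -exec sh -c 'for f in "$@"; do cp "$f" "/tmp/$(printf %s "$f" | tr "/" "-")"; done' sh {} +
done < fromDirs.txt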
