I had this question in an interview.
The interviewer put a situation in front of me: there are 12 files in your Linux operating system.
Give me a command which will open the file containing the data "Hello".
I told him I only know the grep command, which will give you the names of the files containing the data "Hello".
Please tell me if there is any command to open a file in this way.
Assuming only one file contains the word hello:
less $(grep -H "hello" *.txt | sed 's/:.*//')
Here it first captures the file name using grep with the -H parameter, then uses sed to remove everything except the filename, and finally uses less to open the file.
Maybe this could help:
$ echo "foo" > file1.txt
$ echo "bar" > file2.txt
$ grep -l foo * | xargs cat
foo
You have 2 files, and you are looking for the one with the string "foo" in it. Replace cat with your command of choice to open files. You might try vi, emacs, nano, pico... (no, not another flame war!)
You may want a different approach if several files contain the string you are looking for... I only thought of the case where a single file contains the string.
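If several files do contain the string, one option (a minimal sketch, assuming the filenames contain no spaces) is to hand them all to less at once; inside less, :n and :p move between the files:
# open every matching file in less in one go
less $(grep -l "foo" *)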
One of my Linux MySQL servers suffered a crash, so I restored a backup; however, this time MySQL is running locally (localhost) instead of remotely (on an IP address).
Thanks to Stack Overflow users I found an excellent command to find the IP address in all .php files in a given directory! The command I am using for this is:
grep -r -l --include="*.php" "100.110.120.130" .
This outputs the matching files with their locations, of course. If there were fewer than 10 results, I would obviously just change them by hand. However, I received over 200 hits/results.
So now I want to know whether there is a safe command which replaces the IP address (example: 100.110.120.130) with the text "localhost" in all .php files in the given directory (/var/www/vhosts/), recursively.
And maybe, if it's possible and not too much work, also output the changed lines to a file? I don't know if that's even possible.
Maybe someone can provide me with a working solution? To be honest, I don't dare to fool around with this out of the blue. That's why I created a new thread.
The most standard way of replacing a string in multiple files would be to use a tool such as sed. The list of files you've obtained via grep could be read line by line (when output to a file) using a while loop in combination with sed.
$ grep -r -l --include="*.php" "100.110.120.130" . > list.txt
# this will output all matching files to list.txt
Replacing the IP in the matched files (note that the dots are escaped so they match literal dots rather than "any character"):
while read -r line ; do echo "$line" >> updated.txt ; sed -i 's/100\.110\.120\.130/localhost/g' "${line}" ; done < list.txt
This will take list.txt and read it line by line, running sed on each listed file, which should replace all occurrences of the IP with "localhost". The echo command directly before sed writes every filename that will be modified into a file updated.txt (it isn't strictly necessary, since list.txt contains the exact same filenames, but it can serve as a means of verification).
To do a dry run before modifying all of the matched files, remove the -i from the sed command and it will print the output to stdout instead of modifying the files in place.
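If you would rather skip the intermediate list.txt, the same replacement can be done in a single pipeline (a sketch assuming GNU sed and GNU xargs; try it on a copy of the directory first):
# edit every matching .php file in place; xargs -r runs nothing if grep finds no match
grep -r -l --include="*.php" "100.110.120.130" . | xargs -r sed -i 's/100\.110\.120\.130/localhost/g'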
I want to save a command to a file (for example, I want to save the string "cat /etc/passwd" to a file), but I can't use the echo command.
How can I create and save a string to a file directly without using the echo command?
You can redirect cat to a file, type the text, and press Control-D when you're done, like this:
cat > file.txt
some text
some more text
^D
By ^D I mean pressing Control-D at the end; the line must be empty.
It will not be part of the file, it is just there to terminate the input.
Are you avoiding echo for security purposes (e.g. you're using a shared terminal and you don't want to leave a trace in the shell history of what you've written inside your files), or are you just curious about an alternative method?
Simple alternative to echo:
As someone said, redirecting cat is probably the simplest way to go.
I'd suggest you manually type your end-of-file marker, like this:
cat <<EOF > outputfile
> type here
> your
> text
> and finish it with
> EOF
Here's the string you're asking for, as an example:
cat <<EOF > myscript.sh
cat /etc/passwd
EOF
You probably don't want everyone to know you've peeked into that file, but if that's your purpose, please notice that wrapping it inside an executable file won't make it more private, as those lines will be logged anyway...
Security - avoiding history logs etc.
In many modern shells (for example bash with HISTCONTROL=ignorespace set), just add a space at the beginning of the command and it will be kept out of the history, so you can use whatever you want freely.
BTW, my best hint is to avoid using that terminal at all, if you can. If you have two shells (another machine, or even just another secure user on the same machine), I'd recommend using netcat. See here: http://www.thegeekstuff.com/2012/04/nc-command-examples/
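A minimal sketch of that netcat idea, assuming the traditional netcat flags (OpenBSD nc drops the -p, and some builds need -N or -q 0 on the sending side to close the connection at end of input); the host name and port are placeholders:
# on the receiving shell: listen on port 1234 and write whatever arrives to a file
nc -l -p 1234 > myscript.sh
# on the shell you are typing in: send the text across the network instead of echoing it locally
nc receiving-host 1234 <<EOF
cat /etc/passwd
EOF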
{ { command ls $(dirname $(which cat)) |
grep ^ca't$'; ls /etc/passwd; } |
tr \\n ' '; printf '\n'; } > output-file
But it's probably a lot simpler to just do: printf 'cat /etc/passwd\n' > output-file
To be clear, this is a tongue-in-cheek solution. The initial command is an extraordinarily convoluted way to get what you want, and this is intended to be a humorous answer. Perhaps instructive to understand.
I am not sure I understood you correctly, but:
cat /etc/passwd > target.file
Use the > operator to write it to a file without echoing.
If you need to use it inside a program:
cat <<EOF >file.txt
some text
some more text
EOF
I would imagine that you are probably trying to write the content of a string to a file, hence you mentioned echo.
You are avoiding this:
echo "cat /etc/passwd" > target.file
You can use a here string combined with cat.
cat > target.file <<< "cat /etc/passwd"
Now the file target.file will contain the string cat /etc/passwd.
$ cat target.file
cat /etc/passwd
$
To create the string, store it in a variable (quote it so the whole command is kept together):
var1="cat /etc/passwd"
To save a variable or an existing file to a new file without echo, redirect cat:
cat <<< "$var1" > /new/file/path
cat "$FILE" > /new/file/path
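Another simple echo substitute, if your shell has printf (a small sketch; target.file as in the earlier answer):
# write the literal string plus a trailing newline, no echo involved
printf '%s\n' 'cat /etc/passwd' > target.file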
I would like to merge two files and create a new file using a Linux command.
I have two files, named a1b.txt and a1c.txt.
Content of a1b.txt
Hi,Hi,Hi
How,are,you
Content of a1c.txt
Hadoop|are|world
Data|Big|God
And I need a new file called merged.txt with the content below (expected output):
Hi,Hi,Hi
How,are,you
Hadoop|are|world
Data|Big|God
To achieve that, I am running the command below in the terminal, but it gives me output like this:
Hi,Hi,Hi
How,are,youHadoop|are|world
Data|Big|God
cat /home/cloudera/inputfiles/a1* > merged.txt
Could somebody help me get the expected output?
Probably your files do not end with a newline character. Here is how to add the missing newline to them (the sed expression '$a\' appends a newline to the last line if it lacks one; this relies on GNU sed):
$ sed -i -e '$a\' /home/cloudera/inputfiles/a1*
$ cat /home/cloudera/inputfiles/a1* > merged.txt
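If you would rather not modify the original files, a non-destructive sketch using awk (awk terminates every output record with a newline, so the missing final newline is added automatically):
# "1" is an always-true condition, so awk prints every line of every input file
awk 1 /home/cloudera/inputfiles/a1* > merged.txt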
If you are allowed to be destructive (i.e. you do not have to keep the original two files unmodified), then:
robert@debian:/tmp$ cat fileB.txt >> fileA.txt
robert@debian:/tmp$ cat fileA.txt
this is file A
This is file B.
I want to use the grep command to search for files containing the string "/usr/vm/data". For searching a normal string like "how are you", I know I can do:
grep -inr "how are you" *
to search recursively. But I am getting stuck in the cases where I need to search for a path like "/usr/vm/data". I tried:
grep -inr "\/usr\/vm\/data" directory1
and also
grep -inr "/usr/vm/data/" directory1
but had no success.
Don't torture yourself with escaping: it is a normal string (especially when you put it in quotes).
echo "/usr/vm/data Hello world" | grep -i "/usr/vm/data"
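If you want to make it explicit that the pattern is a literal string and not a regular expression, grep's -F (fixed strings) option expresses exactly that; a small sketch using the flags from the question:
# -F treats "/usr/vm/data" as a fixed string, not a regex
grep -inrF "/usr/vm/data" directory1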
Your command works fine:
$ cat directory1/somefile
foo
bar
this line contains /usr/vm/data/
$ grep -inr "/usr/vm/data/" directory1
directory1/somefile:3:this line contains /usr/vm/data/
$
Perhaps your file is behind a symlink, which grep -r does not follow (try -R if that is the case), or perhaps you don't have any matching files?
This is not exactly the easiest one to explain in a title.
I have a file inputfile.txt that contains parts of filenames:
file1.abc
filed.def
fileq.lmn
This file is an input file that I need to use to find the full filenames in an actual directory. The ends of the filenames differ from case to case, but part of them is always the same.
I figured that I could grep the text from the input file against the ls output of said directory (or redirect the ls output to a simple text file), and then use awk to produce my desired result, but I'm having some trouble doing that.
file1.abc is read from the input file inputfile.txt
It's checked against the directory contents.
If the file exists, specific directories based on the filename are created.
(I'm also in a Busybox environment.. I don't have a lot at my disposal)
Something like this...
cat lscommandoutput.txt \
| awk -F: '{print("mkdir" system("grep $0"); inputfile.txt}' \
| /bin/sh
Thank you.
Edit: My apologies for not being clear on this.
The output should be the full filename of each line found in lscommandoutput.txt, using inputfile.txt to grep for those specific lines.
If inputfile.txt contains:
file1.abc
filed.def
fileq.lmn
and lscommandoutput.txt contains:
file0.oba.ca-1.fil
file1.abc.de-1.fil
filed.def.com-2.fil
fileh.jkl.open-1.fil
fileq.lmn.he-2.fil
The extra lines that aren't contained in inputfile.txt are ignored. The ones that are in inputfile.txt get a directory created for them, named after the matching line from lscommandoutput.txt.
/dir/dir2/file1.abc.de-1.fil/ <-- directories in which files can be placed
/dir/dir2/filed.def.com-2.fil/
/dir/dir2/fileq.lmn.he-2.fil/
Hopefully that is a little bit clearer.
First, you win a useless use of cat award.
Secondly, you've explained this really badly. If you can't describe the problem clearly in plain English, it's not surprising you are having trouble turning it into a script or set of commands.
grep -f is a good way to get the directory names, but I don't understand what you want to do with them afterwards.
"My problem now is using the outputted file with the one file I want to put the folders"
Wut? What does "the one file I want to put the folders" mean? Where does the file come from? Is it the file named in inputlist.txt? Does it go in the directory that it matched?
If you just want to create the directories you can do:
fgrep -f ./inputfile.txt ./lscommandoutput.txt | xargs mkdir
N.B. you probably want fgrep so that the input strings aren't treated as regular expressions and regex metacharacters such as . are matched literally.
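Putting it together with the /dir/dir2 prefix from your example, a minimal sketch that should also work in a BusyBox shell (the prefix is an assumption taken from the question; adjust it to your real target path):
# for each partial name in inputfile.txt, find the matching full names
# in lscommandoutput.txt and create one directory per match
grep -F -f inputfile.txt lscommandoutput.txt | while read -r name; do
    mkdir -p "/dir/dir2/$name"
done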