How to flush contents in shell pipes? [duplicate] - linux

This question already has answers here:
How to make output of any shell command unbuffered?
(5 answers)
Force flushing of output to a file while bash script is still running
(13 answers)
Closed 3 years ago.
When I type a command like:
find . -type f -iname "*part*"
I get the file names appearing slowly, one at a time, as soon as they are found.
Adding a sed expression after it...
find . -type f -iname "*part*" | sed "s#^\./##"
… delays the display until the end of the find command, and then everything is printed at once.
How can I avoid that? How can I make the output flush through to sed?
PS: Alternatively, how can I remove the ./ prefix from every line without delaying the display?
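A sketch of the usual fixes, assuming GNU sed and coreutils: sed's -u switch disables its output buffering, and stdbuf -oL forces line buffering for most stdio-based programs:
find . -type f -iname "*part*" | sed -u "s#^\./##"
find . -type f -iname "*part*" | stdbuf -oL sed "s#^\./##"
For the PS: with GNU find, -printf '%P\n' prints each path with the ./ starting point stripped, so no sed is needed at all.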

Related

find: missing argument to `-exec' when running from script file [duplicate]

This question already has answers here:
Are shell scripts sensitive to encoding and line endings?
(14 answers)
Closed 3 years ago.
When I run the following command from my script file it gives me:
find: missing argument to `-exec'
But the same command from the command line works normally:
find /home/poseidoncharters/poseidon_backup/*.sql -mtime +1 -exec rm -f {} \;
I run the script like ./myscript.sh
Well, as others said in their comments, the command seems correct, and it was. The problem I was having was a newline character problem: I wrote this script on a Windows machine and then uploaded it to a Unix server, so the file had DOS (CRLF) line endings. As soon as I ran dos2unix remove_backup.sh, it started to work.
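A quick way to confirm that diagnosis before converting (a sketch; the file utility and dos2unix are common but not guaranteed to be installed):
file remove_backup.sh    # a DOS-formatted script is reported "... with CRLF line terminators"
dos2unix remove_backup.sh
If dos2unix is unavailable, GNU sed can strip the carriage returns instead: sed -i 's/\r$//' remove_backup.sh.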

search and remove specific file using linux command [duplicate]

This question already has answers here:
Delete files with string found in file - Linux cli
(8 answers)
Closed 5 years ago.
I am using this command to search for all files containing this word, and I want to remove every file that contains it in a specific directory. The grep command works perfectly; please suggest how I can use
rm -rf
with the command below:
grep -l -r -i "Pending" . | grep -n . | wc -l
This can be done by using the -l flag and piping the filenames to xargs:
-l
(The letter ell.) Write only the names of files containing selected
lines to standard output. Pathnames are written once per file searched.
If the standard input is searched, a pathname of (standard input) will
be written, in the POSIX locale. In other locales, standard input may be
replaced by something more appropriate in those locales.
grep -l -r 'Pending' . | xargs rm
The above will delete all files under the current directory (recursively, because of -r) that contain the word Pending.
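One caveat, since the plain pipe splits on whitespace: a file named Pending report.txt would reach rm as two arguments. A null-delimited sketch, assuming GNU grep and xargs:
grep -lrZ 'Pending' . | xargs -0 rm
-Z (--null) makes grep terminate each filename with a NUL byte, and xargs -0 reads them back the same way.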

Change the output of find command [duplicate]

This question already has answers here:
How to strip leading "./" in unix "find"?
(8 answers)
Closed 8 years ago.
Hello guys, I'm using the find command to find the .apk files in a directory, but the output of the find command is ./foo.apk.
I don't want the leading ./.
cd output/dist
output_apk=`find ./ -name "*.apk" -print0`
echo "$output_apk"
The output is ./foo.apk.
I have tried the sed command with no luck.
find output/dist -name "*.apk" |
sed 's%^output/dist/%%'
This also avoids the useless cd and drops the erroneous -print0; unless you are piping into a program that requires null-terminated input, that option is wrong.
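With GNU find you can skip sed entirely: the -printf directive %P expands to each file's path with the starting point removed (a GNU-only sketch):
find output/dist -name "*.apk" -printf '%P\n'
This prints foo.apk rather than output/dist/foo.apk.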

Rename all files in directory [duplicate]

This question already has answers here:
Argument list too long error for rm, cp, mv commands
(31 answers)
Closed 8 years ago.
I'm trying to rename the files like:
Name1_searchstats_metrics_20141230T133000036.log
to something like: Name2_searchstats_metrics_20141230T133000036.log
I'm trying: rename -n 's/\Name1_/\Name2_/' *.log but am getting the error:
bash: /usr/bin/rename: Argument list too long
Can someone please help ?
Probably the easiest solution, since you're using bash, is to iterate over the list of files with a for loop:
$ for i in *; do rename -n 's/Name1_/Name2_/' "$i"; done
You can also filter the files if needed by using any wildcard in the command, like *.log.
There are other, more convoluted ways to achieve this, especially if you need to do particular string manipulation of the file name, e.g. using awk or find -exec, but hopefully this helps you sort things out in a clear way.
Edited answer as suggested by #glglgl
a more comprehensive and detailed explanation of the above can be found on superuser:
https://superuser.com/questions/31464/looping-through-ls-results-in-bash-shell-script
If the argument list is too long for a Linux command, xargs usually comes to the rescue. Note, though, that ls *.log would expand the same glob and hit the same limit, so feed the names from a shell builtin instead; printf is built into bash and is not subject to the exec argument limit:
printf '%s\0' *.log | xargs -0 rename -n 's/Name1_/Name2_/'
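If some names might contain whitespace, a null-delimited variant is safer (a sketch, assuming GNU find and xargs):
find . -maxdepth 1 -name 'Name1_*.log' -print0 | xargs -0 rename -n 's/Name1_/Name2_/'
Drop -n, which only previews the renames, once the output looks right.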

Best way to find the numeric value in UNIX file system [duplicate]

This question already has answers here:
How to find all files containing specific text (string) on Linux?
(54 answers)
Closed 8 years ago.
I need to grep for a particular port number from a huge set of files.
I am using a command:
find . |xargs grep "9461"
But it does not find all the occurrences of the number 9461.
Can anyone suggest a better Unix/Linux command to do so?
The kinds of files it searches are x.log, y.txt, z.htm, a.out, etc.,
but it was not able to search abc.conf files.
You surely have some reason for using find in combination with grep, but just in case:
You can replace your command by:
grep -r "9461" .
and if you also want line numbers
grep -rn "9461" .
As Jonathan Leffler commented, there is also the -e option, which makes grep match against a regular expression, so the full command would be
grep -rne 9461 .
You should take a look at the grep man page.
A final note: you should check whether what you want to grep for is "9461" or the bare 9461 without the quotes.
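If find really is needed, for example to restrict the search to regular files, the usual reason occurrences go missing is that xargs splits filenames containing spaces; null delimiters avoid that (a sketch, assuming GNU find and xargs):
find . -type f -print0 | xargs -0 grep -n "9461"
-print0 and -0 pass the names NUL-terminated, so no filename is ever split apart.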
