Call a custom function inside a bash command - linux

I have the following bash script:
#!/bin/bash
find . -maxdepth 1 -mmin +1 -type f -name "240*.ts" |
    xargs -L 1 bash -c 'mv "${1}" "$(get_crtime ${1} | awk '{print $5}').ts"' \;
The idea is to find files that are older than one minute matching a certain pattern (in my case, files that start with '240') and rename them from their original name (240-1458910816045.ts) to a desired format (15:00:16.ts).
Inside the script I am using the get_crtime command, which is a custom function included in /etc/bash.bashrc with the following implementation:
get_crtime() {
    for target in "${@}"; do
        inode=$(stat -c '%i' "${target}")
        fs=$(df "${target}" | awk '{a=$1}END{print a}')
        crtime=$(sudo debugfs -R 'stat <'"${inode}"'>' "${fs}" 2>/dev/null |
            grep -oP 'crtime.*--\s*\K.*')
        printf "%s\t%s\n" "${target}" "${crtime}"
    done
}
When I call the function from the shell, like this:
get_crtime 240-1458910816045.ts | awk '{print $5}'
I get the desired output:
15:00:16
Which is a portion of the file creation date.
My problem is when I include the function call inside my initial script I get the following error:
}).ts": -c: line 0: unexpected EOF while looking for matching `)'
}).ts": -c: line 1: syntax error: unexpected end of file
I think this is caused by incorrectly invoking awk, so I tried removing it, leaving just:
find . -maxdepth 1 -mmin +1 -type f -name "240*.ts" |
    xargs -L 1 bash -c 'mv "${1}" "$(get_crtime ${1}).ts"' \;
I get the following error, which is more suggestive:
;: get_crtime: command not found
How can I call the custom function defined in bashrc from inside the initial command without getting the last error?
Thank you!
The OS is Ubuntu and the shell is bash.

You can't use single quotes inside a single-quote delimited script. Look:
$ bash -c 'printf "%s\n" "$(date | awk '{print $0}')"'
-bash})": -c: line 0: unexpected EOF while looking for matching `)'
-bash})": -c: line 1: syntax error: unexpected end of file
$ bash -c 'printf "%s\n" "$(date | awk "{print \$0}")"'
Fri, Mar 25, 2016 8:59:31 AM
I'm not recommending that you use double quotes around your awk script, though. Instead, create a script to do the mv for you, or find some other way to implement it that also solves your function-access problem.
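For instance, a minimal sketch of the helper-script approach (the script name rename_by_crtime.sh is made up for illustration, and it assumes get_crtime can be sourced from /etc/bash.bashrc; note that Ubuntu's default bash.bashrc returns early for non-interactive shells, so a dedicated file holding the function may be more reliable):
#!/bin/bash
# rename_by_crtime.sh - rename each given file to its creation time (HH:MM:SS.ts)
source /etc/bash.bashrc   # assumed to define get_crtime, as in the question
for f in "$@"; do
    # field 5 of get_crtime's output is the HH:MM:SS part of the creation date
    mv -- "$f" "$(get_crtime "$f" | awk '{print $5}').ts"
done
Invoked via find, no nested quoting is needed:
find . -maxdepth 1 -mmin +1 -type f -name "240*.ts" -exec ./rename_by_crtime.sh {} +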

This example uses the file's modification time instead, which can be obtained with stat -c '%y'. The xargs -I parameter makes it possible to place the file name twice: once for stat and once for mv. Bash parameter expansion is then used to extract only the time from stat's human-readable output:
find . -maxdepth 1 -mmin +1 -type f -name "240*.ts" | \
    xargs -I_ bash -c 'MTIME=$(stat -c "%y" "_") && MTIME=${MTIME#* } && mv "_" "${MTIME%.*}.ts"'
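To see what the two expansions do, assume stat printed a timestamp like this (the value is illustrative):
$ MTIME='2016-03-25 15:00:16.123456789 +0100'
$ MTIME=${MTIME#* }     # strip everything up to the first space (the date)
$ echo "${MTIME%.*}"    # strip from the last dot onward (fractional seconds and zone)
15:00:16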

You need to export the function:
export -f get_crtime
That will make it available to child bash processes (but not to other shells).
Also, as @EdMorton points out, you cannot use single quotes inside a single-quoted string, which was the problem with the invocation of awk. So you'll need to come up with a different way of quoting the interior argument to awk, or fix get_crtime to return just the string you want.
By the way, you might consider using find's -exec action instead of xargs. That would allow you to use a loop over a number of files, which would be a bit more efficient.
eg.
find . -maxdepth 1 -mmin +1 -type f -name "240*.ts" \
    -exec bash -c 'for f in "$@"; do
        mv "$f" "$(get_crtime "$f" | awk {print\$5}).ts"
    done' _ {} +
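Putting the two together, assuming get_crtime is already defined in your interactive shell (e.g. via /etc/bash.bashrc), a sketch:
export -f get_crtime   # make the function visible to the child bash spawned by find
find . -maxdepth 1 -mmin +1 -type f -name "240*.ts" \
    -exec bash -c 'for f in "$@"; do
        mv -- "$f" "$(get_crtime "$f" | awk "{print \$5}").ts"
    done' _ {} +
Here the awk script is double-quoted with the dollar sign escaped, which sidesteps the nested single-quote problem from the question.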

Related

How to use grep to reverse search files in a folder

I'm trying to create a script which will find missing topics from multiple log files. These log files are filled top-down, so the newest logs are at the bottom of the file. I would like to grep only the last line in each file which includes UNKNOWN_TOPIC_OR_PARTITION. This should be done in multiple files with completely different names. Is grep the best solution, or is there another solution that suits my needs? I already tried adding tail, but that doesn't seem to work.
missingTopics=$(grep -Ri -m1 --exclude=*.{1,2,3,4,5} UNKNOWN_TOPIC_OR_PARTITION /app/tibco/log/tra/domain/)
You could try a combination of find, tac and grep:
find /app/tibco/log/tra/domain -type f ! -name '*.[1-5]' -exec sh -c \
'tac "$1" | grep -im1 UNKNOWN_TOPIC_OR_PARTITION' "sh" '{}' \;
tac prints a file's lines in reverse order. The -exec sh -c SCRIPT "sh" '{}' \; action of find executes the shell SCRIPT each time a file matching the previous tests is found. The SCRIPT is executed with "sh" as parameter $0 and the path of the found file as parameter $1.
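A quick illustration of that $0/$1 convention (the path is made up):
$ sh -c 'echo "\$0=$0 \$1=$1"' "sh" "/var/log/tra/domain/example.log"
$0=sh $1=/var/log/tra/domain/example.log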
If performance is an issue you can probably improve it with:
find . -type f ! -name '*.[1-5]' -exec sh -c 'for f in "$@"; do
    tac "$f" | grep -im1 UNKNOWN_TOPIC_OR_PARTITION
done' "sh" '{}' +
which will spawn fewer shells. If security is also a concern, you can replace -exec with -execdir (though with this SCRIPT I do not immediately see any exploit).

How do I write a shell script that accepts one or more arguments and outputs a line for each argument that names a UTF-8 file?

I understand that I have to use an array of arguments, but have no experience doing so. I am using Emacs for my shell scripting.
This is what I have so far:
#!/bin/bash
find $@ -type f -exec file {} + | grep UTF-8
Answer because I can't comment yet:
"$@" and "${name[@]}" should always be used with surrounding double quotes. Otherwise words with spaces are broken. See man bash for details.
I don't understand why you want to use $@ (every parameter), but I would solve your problem as follows:
#!/bin/bash
ARR=($(find . -type f -exec file {} + | grep script | sed -r 's/([^:]*).*/\1/'))
for i in "${ARR[@]}"; do
    if [ -x "$i" ]; then
        echo "$i is an executable script"
    fi
done
Find every file (including binaries), filter shell scripts with grep, and take only the file name with sed:
find . -type f -exec file {} + | grep script | sed -r 's/([^:]*).*/\1/'
You can loop over the array items by using the "@" subscript. There are several other expansions which might be useful in the future:
for i in "${ARR[@]}"; do
    #code
done
Finally, check whether the script is executable with the -x test of [:
if [ -x "$i" ]; then
    #do something
fi
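A quick demonstration of the -x test (the file name is made up; a typical umask creates files non-executable):
$ touch demo.sh
$ [ -x demo.sh ] && echo executable || echo not executable
not executable
$ chmod +x demo.sh
$ [ -x demo.sh ] && echo executable || echo not executable
executable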
p.s. isn't vim better than emacs? ;-)
Just loop over the arguments with a for
#!/bin/bash
for f in "$@"; do
    find -name "$f" -type f -exec file {} \; | grep UTF-8
done
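If the arguments are meant to be the file paths themselves (as the question suggests), a minimal sketch that checks each argument directly instead of searching for it:
#!/bin/bash
for f in "$@"; do
    # file(1) reports the detected encoding; keep only the UTF-8 lines
    file "$f" | grep UTF-8
done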

How to perform a for-each loop over all the files under a specified path?

The following command attempts to enumerate all *.txt files in the current directory and process them one by one:
for line in "find . -iname '*.txt'"; do
echo $line
ls -l $line;
done
Why do I get the following error?:
ls: invalid option -- 'e'
Try `ls --help' for more information.
Here is a better way to loop over files as it handles spaces and newlines in file names:
#!/bin/bash
find . -type f -iname "*.txt" -print0 | while IFS= read -r -d $'\0' line; do
    echo "$line"
    ls -l "$line"
done
The for-loop runs with the entire double-quoted string as a single item; the unquoted $line is then split into its space-separated words when echo and ls are invoked. You do not actually execute the find command, but provide it as a string.
Instead of the double quotes use either backticks or $():
for line in $(find . -iname '*.txt'); do
    echo "$line"
    ls -l "$line"
done
Furthermore, if your file paths/names contain spaces, this method fails (since the for-loop iterates over space-separated entries). Instead it is better to use the method described in dogbane's answer.
To clarify your error:
As said, the string "find . -iname '*.txt'" ends up split into space-separated words, which are:
find
.
-iname
'*.txt' (I think...)
The first two do not result in an error (besides the undesired behavior), but the third is problematic as it executes:
ls -l -iname
A lot of (bash) commands can combine single-character options, so -iname is the same as -i -n -a -m -e. And voilà: your invalid option -- 'e' error!
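You can reproduce the error directly:
$ ls -iname
ls: invalid option -- 'e'
Try `ls --help' for more information.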
More compact version working with spaces and newlines in the file name:
find . -iname '*.txt' -exec sh -c 'echo "{}" ; ls -l "{}"' \;
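Note that embedding {} inside the shell script breaks if a file name contains quotes or other shell syntax; a safer variant passes the name as a positional parameter instead:
find . -iname '*.txt' -exec sh -c 'echo "$1"; ls -l "$1"' sh {} \;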
Use command substitution instead of quotes to execute find instead of passing the command as a string:
for line in $(find . -iname '*.txt'); do
    echo "$line"
    ls -l "$line"
done

Using * to parse through files. Need to write file names

I have the following problem using UNIX commands. I wish to go through a large number of files and convert them with a conversion command. My idea is to work like this: command *.fileending > *.newfileending
The problem is that I wish to keep the file-names and only replace the file-ending. Thus filename.fileending should become filename.newfileending. How do I achieve this?
Use a for loop:
for file in *.krn; do
    hum2mid "$file" -o "${file%.krn}.mid"
done
In a single line: for file in *.krn; do hum2mid "$file" -o "${file%.krn}.mid"; done
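The ${file%.krn} expansion removes the shortest trailing match of .krn, so the new suffix can simply be appended:
$ file=sonata.krn
$ echo "${file%.krn}.mid"
sonata.mid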
To apply the command to files and subdirectories recursively, use the find|xargs pattern:
find -type f -name '*.krn' -print0 \
    | xargs -0 -n1 sh -c 'hum2mid "$1" -o "/destination/dir/$(basename "${1%.krn}.mid")"' -
Note that this will overwrite already converted files, if a file from another directory has the same name.
rename .fileending .newfileending *
#!/bin/bash
ls -1 *.fileending | while read -r i; do
    command "$i" > "${i/%.fileending/.newfileending}"
done
If you need to process 'weird' file names (with an embedded '\n', for example), you can use the following trick:
create file foo.sh:
#!/bin/bash
command "$1" > "${1/%.fileending/.newfileending}"
then do chmod +x foo.sh and finally run:
find . -maxdepth 1 -a -type f -a -name '*.fileending' -print0 | xargs -0 -n 1 -J '%' ./foo.sh "%"
(note that -J is a BSD xargs flag; with GNU xargs use -I '%' instead)

How do I send multiple results from one command to another in bash?

I'm not sure if this is possible in one line (i.e., without writing a script), but I want to run an ls | grep command and then for each result, pipe it to another command.
To be specific, I've got a directory full of images and I only want to view certain ones. I can filter the images I'm interested in with ls | grep -i <something>, which will return a list of matching files. Then for each file, I want to view it by passing it in to eog.
I've tried simply passing the results in to eog like so:
eog $(ls | grep -i <something>)
This doesn't quite work as it will only open the first entry in the result list.
So, how can I execute eog FILENAME for each entry in the result list without having to bundle this operation into a script?
Edit: As suggested in the answers, I can use a for loop like so:
for i in `ls | grep -i ...`; do eog $i; done
This works, but the loop waits to iterate until I close the currently opened eog instance.
Ideally I'd like for n instances of eog to open all at once, where n is the number of results returned from my ls | grep command. Is this possible?
Thanks everybody!
I would use xargs:
$ ls | grep -i <something> | xargs -n 1 eog
A bare ls piped into grep is sort of redundant given arbitrary?sh*ll-glo[bB] patterns (unless there are too many matches to fit on a command line, in which case the find | xargs combinations in other answers should be used).
eog is happy to take multiple file names, so
eog pr0n*really-dirty.series?????.jpg
is fine and simpler.
Use find:
find . -mindepth 1 -maxdepth 1 -regex '...' -exec eog '{}' ';'
or
find . -mindepth 1 -maxdepth 1 -regex '...' -print0 | xargs -0 -n 1 eog
If the pattern is not too complex, then globbing is possible, making the call much easier:
for file in *.png
do
    eog -- "$file"
done
Bash also has builtin regex support:
pattern='.+\.png'
for file in *
do
    [[ $file =~ $pattern ]] && eog -- "$file"
done
Never use ls in scripts, and never use grep to filter file names.
#!/bin/bash
shopt -s nullglob
for image in *pattern*
do
    eog "$image"
done
Bash 4
#!/bin/bash
shopt -s nullglob
shopt -s globstar
for image in **/*pattern*
do
    eog "$image"
done
Try looping over the results:
for i in `ls | grep -i <something>`; do
    eog $i
done
Or you can one-line it:
for i in `ls | grep -i <something>`; do eog $i; done
Edit: If you want the eog instances to open in parallel, launch each in a new process with eog $i &. The updated one-liner would then read:
for i in `ls | grep -i <something>`; do (eog $i &); done
If you want more control over the number of arguments passed on to eog, you may use "xargs -L" in combination with "bash -c":
printf "%s\n" {1..10} | xargs -L 5 bash -c 'echo "$#"' arg0
ls | grep -i <something> | xargs -L 5 bash -c 'eog "$@"' arg0
