Shell script to pass the output of first command to the next in pipe - linux

I would like to grep for "exception" or "error" in logs, but the problem is that the exact log file name is unknown. The one thing I know for sure is that the latest file is my log file, and I want to do this in a single command, since I'll be using it over ssh from a single source to multiple servers,
like
ssh user@server "ls -ltr console*.log | tail -1; egrep -i 'exception|error' <<output of first command (i.e. the log file name)>>"
Is it possible to do this in a single command?

Does this work for you:
egrep -i 'exception|error' < $(\ls -tr console*.log|tail -1)
To get only the log name, do not use -l in ls.
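If you want the whole thing in the single ssh call from the question, a minimal untested sketch (assuming the console*.log files sit in the remote login directory) is to wrap the pipeline in single quotes so the command substitution runs on the remote side:
ssh user@server 'egrep -i "exception|error" "$(ls -tr console*.log | tail -1)"'
The single quotes matter: with double quotes the $(...) would be expanded by your local shell before ssh ever sends the command.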

Thanks everyone, I finally got it to work:
ssh user@server "ls -ltr console*.log | tail -1 | awk '{print \$9}' | xargs egrep -i 'exception|error'"
(The $9 is escaped as \$9 so the local shell does not expand it before the command is sent to the remote host.)

Related

How to grep text patterns from remote crontabs using xargs through SSH?

I'm developing a script to search for patterns within scripts executed from CRON on a bunch of remote servers through SSH.
Script on client machine -- SSH --> Remote Servers CRON/Scripts
For now I can't get the correct output.
Script on client machine
#!/bin/bash
server_list=( '172.x.x.x' '172.x.x.y' '172.x.x.z' )
for s in ${server_list[@]}; do
ssh -i /home/user/.ssh/my_key.rsa user@${s} crontab -l | grep -v '^#\|^[[:space:]]*$' | cut -d ' ' -f 6- | awk '{print $1}' | grep -v '^$\|^echo\|^find\|^PATH\|^/usr/bin\|^/bin/' | xargs -0 grep -in 'server.tld\|10.x.x.x'
done
This only gives me the paths of the scripts from the crontabs, not the matched lines and line numbers, and the first line is prefixed with "grep:" (example below):
grep: /opt/directory/script1.sh
/opt/directory/script2.sh
/opt/directory/script3.sh
/opt/directory/script4.sh
How to get proper output, meaning the script path plus line number plus line of matching pattern?
Remote CRON examples
00 6 * * * /opt/directory/script1.sh foo
30 6 * * * /opt/directory/script2.sh bar
Remote script content examples
1) This will match the grep pattern:
#!/bin/bash
ping -c 4 server.tld && echo "server.tld ($1)"
2) This won't match the grep pattern:
#!/bin/bash
ping -c 4 8.x.x.x && echo "8.x.x.x ($1)"
Without example input, it's really hard to see what your script is attempting to do. But the cron parsing could almost certainly be simplified tremendously by refactoring all of it into a single Awk script. Here is a quick stab, with obviously no way to test.
#!/bin/sh
# No longer using an array for no good reason, so /bin/sh will work
for s in 172.x.x.x 172.x.x.y 172.x.x.z; do
ssh -i /home/user/.ssh/my_key.rsa "user@${s}" crontab -l |
awk '! /^#|^[[:space:]]*$/ && $6 !~ /^$|^(echo|find|PATH|\/usr\/bin|\/bin\/)/ { print $6 }' |
# no -0; use grep -E and properly quote literal dot
xargs grep -Ein 'server\.tld|10.x.x.x'
done
Your command would not output null-delimited data to xargs so probably the immediate problem was that xargs -0 would receive all the file names as a single file name which obviously does not exist, and you forgot to include the ": file not found" from the end of the error message.
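A quick local illustration of that mismatch (the input lines stand in for file names):
printf 'a\nb\nc\n' | xargs -0 printf '[%s]\n'   # one argument containing embedded newlines
printf 'a\nb\nc\n' | xargs printf '[%s]\n'      # three separate arguments: [a] [b] [c]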
The use of grep -E is a minor hack to enable a more modern regex syntax which is more similar to that in Awk, where you don't have to backslash the "or" pipe etc.
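For comparison (the file name is just a placeholder):
grep -in 'server\.tld\|10\.x\.x\.x' somefile    # basic regex: alternation needs a backslash
grep -Ein 'server\.tld|10\.x\.x\.x' somefile    # extended regex: a bare | means "or", as in awk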
This script, like your original, runs grep on the local system where you run the SSH script. If you want to run the commands on the remote server, you will need to refactor to put the entire pipeline in single quotes or a here document:
for s in 172.x.x.x 172.x.x.y 172.x.x.z; do
ssh -i /home/user/.ssh/my_key.rsa "user@${s}" <<\________HERE
crontab -l |
awk '! /^#|^[[:space:]]*$/ && $6 !~ /^$|^(echo|find|PATH|\/usr\/bin|\/bin\/)/ { print $6 }' |
xargs grep -Ein 'server\.tld|10.x.x.x'
________HERE
done
The refactored script contains enough complexities in the quoting that you probably don't want to pass it as an argument to ssh, which requires you to figure out how to quote strings both locally and remotely. It's easier then to pass it as standard input, which obviously just gets transmitted verbatim.
If you get "Pseudo-terminal will not be allocated because stdin is not a terminal.", try using ssh -t. Sometimes you need to add multiple -t options to completely get rid of this message.

cat: pid.txt: No such file or directory

I have a problem with cat. I want to write a script that does the same thing as ps -e. In pid.txt I have the PIDs of the running processes.
ls /proc/ | grep -o "[0-9]" | sort -h > pid.txt
Then I want to use $line as part of the path to cmdline for every PID.
cat pid.txt | while read line; do cat /proc/$line/cmdline; done
I tried a for loop too:
for id in 'ls /proc/ | grep -o "[0-9]\+" | sort -h'; do
cat /proc/$id/cmdline;
done
I don't know what I'm doing wrong. Thanks in advance.
I think what you're after is this - there were a few flaws with all of your approaches (or did you really just want to look at processes with single-digit PIDs?):
for pid in $(ls /proc/ | grep -E '^[0-9]+$' | sort -h); do cat /proc/${pid}/cmdline | tr '\0' '\n'; done
You seem to be in a different current directory when running the cat pid.txt... command than when you ran your ls... command. Run both commands in the same terminal window, or use an absolute path, like /path/to/pid.txt.
Apart from that error, you might want to remove -o from your grep command, since it prints each matching digit on its own line; for example, PID 423 comes out as 4, 2 and 3. @Roadowl also pointed that out already.
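If the goal is just a ps -e-style listing, here is a sketch that skips the intermediate pid.txt file entirely (the output format is approximate, not identical to ps -e):
#!/bin/bash
# Iterate over the numeric entries of /proc directly via a glob.
for dir in /proc/[0-9]*/; do
    pid=${dir#/proc/}; pid=${pid%/}
    # comm holds the short command name; skip processes that exited in the meantime
    [ -r "${dir}comm" ] && printf '%6s %s\n' "$pid" "$(cat "${dir}comm")"
done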

Linux commands to get Latest file depending on file name

I am new to Linux. I have a folder with many files in it and I need to get the latest file based on the file name. Example: I have 3 files, RAT_20190111.txt, RAT_20190212.txt and RAT_20190321.txt. I need a Linux command to move the latest file here, RAT_20190321.txt, to a specific directory.
If the file name pattern stays the same, then you can try the command below:
mv $(ls RAT*|sort -r|head -1) /path/to/directory/
As pointed out by @wwn, there is no need for sort here: since the file names sort lexicographically, ls already lists them in order, so the command becomes:
mv $(ls RAT*|tail -1) /path/to/directory
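A variant that avoids parsing ls output altogether (a sketch, assuming bash and the RAT_*.txt naming pattern):
files=( RAT_*.txt )    # glob expansion is already sorted lexicographically
mv -- "${files[${#files[@]}-1]}" /path/to/directory/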
The following command works.
ls -p | grep -v '/$' | sort | tail -n 1 | xargs -d '\n' -r mv -t /path/to/directory --
The command filters directories out of the ls output (-p marks them with a trailing /), sorts the names, takes the last one, and xargs (splitting on newlines) hands it to mv, which moves it to the required directory given via -t.
Hope it helps.
Use the command below:
cp "$(ls | tail -n 1)" /data...

Awk - capturing errors in cp command

We are using the awk command below in a shell script to distribute the files listed in a given input file into four subfolders of a given destination directory.
awk -v dir=$base_dir -v p_num=$2 -v node=$3 -F '::' 'NR%p_num==node {print dir $1}' $input_file | xargs -P 6 -I {} cp {} $destination_directory/$node/
This is working fine, but we are not able to capture errors while running cp.
How can I capture errors, especially file-not-found errors, from cp?
Sample of input_file:
/data/subdir1/subdir2/subdir3/file1::2334::5667::2014-09-08
/data/subdir1/subdir2/subdir3/file2::4454::5667::2014-09-09
/data/subdir1/subdir2/subdir3/file3::9895::4445::2014-09-10
/data/subdir1/subdir2/subdir3/file4::3674::5667::2014-09-18
How can I capture the case where an entry in the above file is missing or non-existent and causes cp to fail?
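No answer is shown here, but one common approach (an untested sketch; cp_errors.log is a made-up log file name and the variables are taken from the question) is to replace the xargs call with a read loop so every failing cp can be logged individually. Note that this gives up the -P 6 parallelism:
awk -v dir="$base_dir" -v p_num="$2" -v node="$3" -F '::' \
    'NR % p_num == node { print dir $1 }' "$input_file" |
while IFS= read -r file; do
    # cp's stderr (e.g. "No such file or directory") is appended to the log
    if ! cp -- "$file" "$destination_directory/$node/" 2>>cp_errors.log; then
        echo "copy failed: $file" >>cp_errors.log
    fi
done
If you would rather keep xargs, adding 2>cp_errors.log to the end of the original pipeline also captures the cp error messages, just without the per-file marker lines.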

grep for line containing jboss- until next /

I am trying to grep, out of an init script's status output, just the JBoss directory.
So in the stdout there is this line:
JBOSS_CMD_START = ulimit -c 2500000; cd /home/blah; /apps/jboss-eap-5.1.2/jboss-as/bin/run.sh -c jboss-blahtest -b 1.1.2.3 -Djboss.messaging.ServerPeerID=1
And out of that, I am trying to grep just the directory, up until /jboss-as, so the results would be:
/apps/jboss-eap-5.1.2/jboss-as/
The problem is the jboss version can be a number of things, so I need to get from /apps/jboss- to /jboss-as/
grep -oE '/apps/jboss-eap-[^/]+/jboss-as/'
One option
grep -Eo '/[^[:space:]]+jboss-as/'
grep -oE '\S+/jboss-as/' should do it.
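For example, assuming the status output of the init script goes to stdout (the init-script path here is only a guess):
/etc/init.d/jboss status 2>&1 | grep -oE '/apps/jboss-[^/]+/jboss-as/'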
