Check if date_filename.txt exists - Linux

I've created a set of Linux commands that run a series of pings and write the results into two log files, date-time_successping and date-time_failping. I want to add an action so that when a date-time_failping file exists, the system sends an email to the PIC. The problem is that the filename is unpredictable because of the date/time prefix, e.g. 20170911-160455_failping. I tried tests like -c/-f, but they can't search without an exact name, and there will be multiple files created with different dates. So I need some advice on this; I hope someone can help.
thanks
P/S: I'm sorry if the information given is not enough; please reply if you need more info so that I can provide it and you can help me solve this issue.
Regards

You can optionally do it something like this.
This command gets yesterday's date:
VAR1=`date +%Y%m%d -d "yesterday"`
Or, if you want to work with today's date:
VAR1=`date +%Y%m%d`
file="${VAR1}_failping"
if [ -f "$file" ]
then
    echo "My message" | mail -s subject user@gmail.com
else
    # Do something else
    :
fi
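Since the real filenames also carry a time component (e.g. 20170911-160455_failping), an exact-name test like the one above won't match them. A minimal sketch of an alternative, assuming the logs sit in the current directory and user@gmail.com is just a placeholder address, is to loop over a glob for today's date:
VAR1=`date +%Y%m%d`
for f in "${VAR1}"-*_failping; do
    # If nothing matches, the glob stays unexpanded and the -f test filters it out
    if [ -f "$f" ]; then
        echo "Ping failures logged in $f" | mail -s "ping failure" user@gmail.com
    fi
done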

You can use inotify to monitor your log directory for file creation events and then check whether the new file name matches the pattern xxxxx_failping. Assuming your log files are put in ping_logs, you can use the following to monitor:
$ inotifywait -rme create ping_logs/ | awk '{if($NF ~ /[0-9]+-[0-9]+_failping/) print $NF}'
This just prints the file name; you can change that to your mailing action.
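As a sketch of what the mailing action could look like (assuming inotify-tools and a working mail setup, with user@gmail.com again a placeholder):
inotifywait -rme create ping_logs/ |
    awk '{if($NF ~ /[0-9]+-[0-9]+_failping/) {print $NF; fflush()}}' |
    while read f; do
        # fflush() keeps awk from buffering names; send one alert per fail log as it appears
        echo "Ping failures logged in ping_logs/$f" | mail -s "ping failure" user@gmail.com
    done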

Related

Ldapsearch filtering using variables not displaying data

I am currently trying to query an LDAP server to find whether the email passed to the script exists on our system.
Below is the ldapsearch command I am trying to use:
ldapdata=`ldapsearch -h ### -b "ou=###,o=###" "email=$email" email firstname surname`
echo "ldapdata: $ldapdata"
This works perfectly when the filter includes a predetermined email, i.e. "mail=firstname-surname####"; however, when it is passed a variable such as $email, the output cannot be manipulated by further grep/awk statements and will not display any data in the echo statement.
From some Googling I have figured out that it could be to do with the line wrapping that LDAP uses.
What I have already tried to solve the issue:
| perl -p00e 's/\r?\n //g'
| sed '/^$/d'
-o ldif-wrap=no
My question is: what is the best method to solve this issue? Many thanks in advance.
Just for anyone having the same issue: the problem was actually due to me writing and testing the program in a Windows environment.
I was pulling the $email variable from a file that was in DOS format.
To fix this, all I did was:
dos2unix $FILELOCATION
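If running dos2unix on the source file isn't convenient, an equivalent inline fix is to strip the stray carriage return from the variable before building the filter (a sketch that reuses the redacted host/base from the question; tr is standard):
# Drop any \r picked up from the DOS-formatted input file
email=$(printf '%s' "$email" | tr -d '\r')
ldapdata=`ldapsearch -h ### -b "ou=###,o=###" "email=$email" email firstname surname`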

Linux Bash auto command, source text file

Thank you for your concern.
I'm a noob trying to do bulk IP lookups with the [geoiplookup -f GeoLiteCity.dat] command.
I have more than 700 IPs to look up, which are saved as c.txt (in the same folder).
How can I make a bash shell script? I've already made one, and all I got was:
sudo: unable to execute ./ok.sh: No such file or directory
Here is my script
It would also be OK to use another language.
To make it more clear:
[geoiplookup -f GeoLiteCity.dat IP1]
[geoiplookup -f GeoLiteCity.dat IP2]
...
[geoiplookup -f GeoLiteCity.dat IP700]
and save the results as one text file (which would be 700 rows).
I'm Korean and sorry for my poor English, but I couldn't find anything in my language on how to do this. I'd really appreciate any help; otherwise I'll have to look them up one by one until Friday (internet speed is extremely slow in my company).
Please help me. I will pray for you every Sunday morning. Thank you.
Found a very simple answer with a DuckDuckGo search for 'iterate through each line of file bash':
stackoverflow.com/questions/1521462/looping-through-the-content-of-a-file-in-bash
#!/usr/bin/bash
printf "\n\n"
while read ip; do
    echo "LOOKING UP IP $ip"
    geoiplookup "$ip"
    printf "\n\n"
done < ipaddresses.txt
save it as iplookup.sh and run, without 'sudo':
bash iplookup.sh
Tested and working. Be sure to rename your file 'c.txt' to 'ipaddresses.txt'! Also, the 'ipaddresses.txt' file must be in the same directory as the script.
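If you want to keep the city database from the question and collect everything into a single file, a small variation (assuming GeoLiteCity.dat sits next to the script; results.txt is just an example output name) could be:
#!/usr/bin/bash
while read ip; do
    echo "LOOKING UP IP $ip"
    # Use the GeoLiteCity database instead of the default country database
    geoiplookup -f GeoLiteCity.dat "$ip"
done < ipaddresses.txt > results.txt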

Find and Replace in bash Shell

Please advise on replacing a variable with the latest date & time.
Here is my requirement.
FN=`basename "$0"`
TS=`date '+%m/%d/%Y %T'`
QD='08/27/2014 16:25:45'
Then I have a query to run. After it has run, I need to take $TS (current system date & time) and assign it as a value to the $QD variable. This is a loop process and gets updated every time the script runs.
I've tried using sed but was not successful.
Please help.
Programmatically modifying your script to have a different timestamp constant is absolutely and emphatically the wrong way to handle this problem.
Instead, when you want to mark that the query has been done, simply touch a file:
touch lastQueryCompletion
...and when you want to know when the query was last done, check that file's timestamp:
# with GNU date
QD=$(date -r lastQueryCompletion '+%m/%d/%Y %T')
# or, with Mac OS X stat
QD=$(stat -t '%Y/%m/%d %H:%M:%S' -f '%Sm' lastQueryCompletion)
Although you haven't mentioned the overall goal that you wish to accomplish, I have a feeling something like this would be more robust than using sed to update an existing script file.
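A minimal sketch of how the two pieces might fit together inside the script (run_query is a hypothetical placeholder for whatever actually runs the query; GNU date assumed):
# Default for the very first run; otherwise read when the last query finished
QD='08/27/2014 16:25:45'
[ -e lastQueryCompletion ] && QD=$(date -r lastQueryCompletion '+%m/%d/%Y %T')
run_query "$QD"            # hypothetical placeholder for the real query
# Record this run's completion time for the next run
touch lastQueryCompletion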
FN=`basename "$0"`
TS=`date '+%m/%d/%Y %T'`
QD='08/27/2014 16:25:45'
# Load the latest QD (from the last run), overriding the default above
[ -e ~/.QD.saved ] && QD="`cat ~/.QD.saved`"
...Later in that file...
# Save the new QD variable for the next run
echo "$TS" > ~/.QD.saved
Although I'm not sure sed is the tool you're looking for, I believe your command would have to go something like this (double quotes so $TS expands, and a delimiter other than / because $TS itself contains slashes):
sed -i -r "s|^QD=.*|QD='$TS'|" "$FN"
I'm assuming you're using GNU sed, whose -i option tells it to do the substitution in place in the file rather than writing the result to standard output.
Well, hope it helps.
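A quick way to see it work, on a throwaway copy since -i edits the file in place (myscript.sh is a hypothetical name for the script being updated):
TS=`date '+%m/%d/%Y %T'`
cp myscript.sh /tmp/myscript.sh
sed -i -r "s|^QD=.*|QD='$TS'|" /tmp/myscript.sh
grep '^QD=' /tmp/myscript.sh    # now shows QD='<current date & time>'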

Piping with multiple commands

Assume you have a file called “heading” as follows
echo "Permissions^V<TAB>^V<TAB>Size^V<TAB>^V<TAB>File Name" > heading
echo "-------------------------------------------------------" >> heading
Write a (single) set of commands that will create a report as follows:
make a list of the names, permissions and size of all the files in your current directory,
matching (roughly) the format of the heading you just created,
put the list of files directly following the heading, and
save it all into a file called “file.list”.
All this is to be done without destroying the heading file.
I need to be able to do this all in a pipeline without altering the heading file. I can't seem to do it without destroying the file. Can somebody please make a pipe for me?
You can use a command group:
{ cat heading; ls -l | sed 's/:/^V<tab>^V<tab>/g'; } > file.list
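If the literal Ctrl-V tab characters are awkward to reproduce, a sketch of the same command-group idea, with awk picking out just the permissions, size and name columns (it assumes file names without spaces), would be:
{ cat heading; ls -l | awk 'NR>1 {printf "%s\t\t%s\t\t%s\n", $1, $5, $9}'; } > file.list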

Find out who/what is calling a script (cron)

I was having a problem recently where somebody's cron job was calling a script that sent an alert to me when it was run. I wanted to find out whose job it was and which server it was running on.
The problem has been resolved by someone else, but I was wondering what I could have done to find out which host/username the job is being run from. One thing I could think of was to edit the script (Perl) and use Sys::Hostname. Anything else?
Thanks!
As you said, you can get the hostname with Sys::Hostname. You can also get the username with getpwuid($<):
use Sys::Hostname;
my $info = getpwuid($<) . '@' . hostname;
print "$info\n"; # prints user@host
There is no automatic way to do that unless you use mail to send out the alerts. Emails contain the hostname in their headers, so you can at least see where an alert came from (user and host). The timestamp should then help you locate the cron job.
For all other forms of alert (SMS, pager, etc.), you should make it a policy to include the user and hostname in the message.
You could also add to your script: print `env|sort`; -- that would reveal the USERNAME or LOGNAME. If you don't want to mess with the output of your program, log it to a file:
use POSIX 'strftime';
open my $log, '>>', 'logfile' or die "can't append to logfile: $!\n";
print $log strftime("%Y-%m-%d %T", localtime), " - starting $0\n";
print $log `env|sort`;
close $log;
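Beyond instrumenting the script itself, if you have root on the suspected servers you could also search every user's crontab for the script name. A sketch, assuming the script is called alert_script.pl (a hypothetical name) and crontab -l -u is available:
for u in $(cut -d: -f1 /etc/passwd); do
    # Print user@host for any crontab that references the script
    crontab -l -u "$u" 2>/dev/null | grep -q 'alert_script.pl' && echo "$u@$(hostname)"
done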
