How to check whether the date has changed in Linux

I wrote a script to capture an ngrep log daily, but it does not create a new file automatically when the date changes.
#!/bin/bash
month=$( date +%B )
mkdir -p /home/log/$month
NOW=$(date '+%Y%m%d')
LOGFILE1="/home/log/$month/5110_$NOW.txt"
LOGFILE2="/home/log/$month/5150_$NOW.txt"
LOGFILE3="/home/log/$month/5160_$NOW.txt"
while true
do
ngrep -t -q -d any -W byline port 5110 >> $LOGFILE1 &
ngrep -t -q -d any -W byline port 5150 >> $LOGFILE2 &
ngrep -t -q -d any -W byline port 5160 >> $LOGFILE3
exec bash
sleep 2
done
Please help.
Thanks

You would have to move the variable definitions into the while loop for that to happen (and drop the stray exec bash, which would otherwise replace the script with an interactive shell on the first pass).
#!/bin/bash
while true
do
    month=$(date +%B)
    mkdir -p "/home/log/$month"
    NOW=$(date '+%Y%m%d')
    LOGFILE1="/home/log/$month/5110_$NOW.txt"
    LOGFILE2="/home/log/$month/5150_$NOW.txt"
    LOGFILE3="/home/log/$month/5160_$NOW.txt"
    ngrep -t -q -d any -W byline port 5110 >> "$LOGFILE1" &
    ngrep -t -q -d any -W byline port 5150 >> "$LOGFILE2" &
    ngrep -t -q -d any -W byline port 5160 >> "$LOGFILE3"
    sleep 2
done
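A different sketch of the same idea, assuming GNU date and the coreutils timeout utility are available (neither is part of the answer above): run each capture only until midnight, so the loop restarts every day and opens fresh files.
#!/bin/bash
# Sketch only: restart the captures at each day boundary so the filenames roll over.
while true
do
    month=$(date +%B)
    NOW=$(date '+%Y%m%d')
    mkdir -p "/home/log/$month"
    LOGFILE1="/home/log/$month/5110_$NOW.txt"
    LOGFILE2="/home/log/$month/5150_$NOW.txt"
    LOGFILE3="/home/log/$month/5160_$NOW.txt"

    # Seconds left until midnight; each ngrep is stopped at the day boundary.
    secs_left=$(( $(date -d 'tomorrow 00:00' +%s) - $(date +%s) ))

    timeout "$secs_left" ngrep -t -q -d any -W byline port 5110 >> "$LOGFILE1" &
    timeout "$secs_left" ngrep -t -q -d any -W byline port 5150 >> "$LOGFILE2" &
    timeout "$secs_left" ngrep -t -q -d any -W byline port 5160 >> "$LOGFILE3"

    wait    # make sure the backgrounded captures have stopped before opening the next day's files
done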

Related

Netcat [nc] listen grep ip and disconnect

Is there a way to grep the IP address of the inbound connection and disconnect after a timeout?
If I do
nc -vv -l -p <portnum>
it's connected forever.
$nc -h
[v1.10]
connect to somewhere: nc [-options] hostname port[s] [ports] ...
listen for inbound: nc -l -p port [-options] [hostname] [port]
options:
-4 Use IPv4 (default)
-6 Use IPv6
-c shell commands as -e; use /bin/sh to exec [dangerous!!]
-e filename program to exec after connect [dangerous!!]
-A algorithm cast256, mars, saferp, twofish, or rijndael
-k password AES encrypt and ascii armor session
-b allow broadcasts
-g gateway source-routing hop point[s], up to 8
-G num source-routing pointer: 4, 8, 12, ...
-h this cruft
-i secs delay interval for lines sent, ports scanned
-l listen mode, for inbound connects
-n numeric-only IP addresses, no DNS
-o file hex dump of traffic
-p port local port number
-r randomize local and remote ports
-q secs quit after EOF on stdin and delay of secs
-s addr local source address
-t answer TELNET negotiation
-u UDP mode
-v verbose [use twice to be more verbose]
-w secs timeout for connects and final net reads
-z zero-I/O mode [used for scanning]
port numbers can be individual or ranges: lo-hi [inclusive];
hyphens in port names must be backslash escaped (e.g. 'ftp\-data').
I'm trying, but I get no result.
My netcat is dated; the nc version is 1.10.
EDIT
@VictorLee gave me some alternatives, and I put something together.
Here is a little server script that listens and logs every new distinct client IP.
If anyone wants to use or modify it, the code is below:
#!/bin/bash
unset PIDTMP; rm -rf tmplog.log 2>/dev/null
while true; do
    # Start a listener if none is running, logging everything to tmplog.log.
    if [[ "$PIDTMP" == "" ]]; then
        nc -vv -l -p <YOURPORT> > tmplog.log 2>&1 & PIDTMP=$!
    fi
    if [[ "$PIDTMP" != "" ]]; then
        if [[ -f tmplog.log ]]; then
            # Pull the most recent [x.x.x.x] address out of the log and strip the brackets.
            thisip="$(cat -v tmplog.log 2>/dev/null | tr -d '\0' | grep -aiv "failed" | grep -ioE -m2 "\\[([0-9]{1,3}\.){3}[0-9]{1,3}\\]" | tail -1 | sed 's/^.\(.*\).$/\1/')"
            # uncomment if you want the output on screen
            #if [[ "$thisip" != "" ]]; then cat tmplog.log 2>/dev/null; fi
        fi
        if [[ "$thisip" != "" ]]; then
            # A client connected: stop this listener and record the new IP once.
            kill $PIDTMP 2>/dev/null
            wait $PIDTMP 2>/dev/null; unset PIDTMP
            if [[ "$(grep -rnw log.log -e "$thisip" 2>/dev/null)" == "" ]]; then
                echo "$thisip" >> log.log
            fi
            unset thisip
        fi
    fi
    sleep 2
done
Try this:
nc -vv -l -p <portnum> >>/tmp/nc.log 2>&1 & sleep <timeout>;kill -9 $!
If you only want the connection IP, you can run grep -oP "(?<=Connection from \[)[\w\.]*(?=])" /tmp/nc.log; as a single line:
nc -vv -l -p <portnum> >>/tmp/nc.log 2>&1 & sleep <timeout>;kill -9 $!;grep -oP "(?<=Connection from \[)[\w\.]*(?=])" /tmp/nc.log
First collect the nc output in /tmp/nc.log, force-kill the nc process when the timeout expires, then extract the connection IP with grep.
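If the coreutils timeout command is available (an assumption, it is not used in the answer above), the sleep/kill -9 pair can be replaced by letting timeout manage nc's lifetime:
# Sketch: timeout ends nc after <timeout> seconds, then grep extracts the IP.
timeout <timeout> nc -vv -l -p <portnum> >>/tmp/nc.log 2>&1
grep -oP "(?<=Connection from \[)[\w\.]*(?=])" /tmp/nc.log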

Curl errors on remote but not on local

I am attempting to run the following script on a virtual instance I have created on Google Cloud:
#!/bin/bash
set -eu
DS=$(date "+%Y-%m-%d" -d "7 days ago")
DE=$(date "+%Y-%m-%d" -d "1 day ago")
account=123
## above specifies last weeks delivery
rm -f cookiejar
curl /dev/null -s -S -L -f -c cookiejar 'https://url.io/auth/authenticate' -d name=usr -d passwd='pwd'
curl -o /dev/null -s -S -L -f -b cookiejar -c cookiejar 'https://adloox.io/auth/adminaccounts' -d account=$account
curl -s -S -L -f -o "report1.xlsx" -J -b cookiejar -c cookiejar "https://url.io/adquality/ajax-adblocking?categoryFw=&platform_id[]=7&id1=All&id2=&id3=All&id4=All&id5=&id11=&date=2019-12-09&date_start=$DS&date_end=$DE&website=&keywords=&zfTablePage=1&zfTableColumn=&zfTableOrder=desc&zfTableQuickSearch=&zfTableItemPerPage=100&zfTableDataTablesMaxRows=2628&zfTableItemPerPage=10000&zfTableExport=xlsx"
curl -s -S -L -f -o "report2.xlsx" -J -b cookiejar -c cookiejar "https://url.io/report/ajax-by-tag2?platform_id[]=7&id1=All&id2=&id3=All&id4=All&id5=&id11=&date=2019-12-09&date_start=$DS&date_end=$DE&website=&zfTablePage=1&zfTableColumn=&zfTableOrder=desc&zfTableQuickSearch=&zfTableItemPerPage=100&zfTableDataTablesMaxRows=10000&zfDetails=true&by_viewability=imps_sivt&device_id[]=all&tag_type_id[]=all&support_id[]=all&by_website=1&zfTableItemPerPage=10000&zfTableExport=xlsx"
Set Up
I have curl installed on my local desktop (running Windows 10 with Ubuntu 18.04.3 LTS).
I have curl installed on my remote (a Google Cloud virtual instance set up with Ubuntu 18.04.3 LTS).
Both have curl version 7.58.0.
Issue
When run on my local desktop there is no issue and the files download. When run on my remote, I am able to log in; however, I receive the following error for the next curl line:
+ curl -o /dev/null -s -S -L -f -b cookiejar -c cookiejar https://url.io/auth/adminaccounts -d account=123
curl: (22) The requested URL returned error: 500 Internal Server Error
Can someone confirm what else I should be looking at here? I would have thought that if my Linux and curl versions are the same, there wouldn't be an issue. Sorry if there are other straightforward checks I should have done; this is my first time setting up a server.
If your virtual instance does not have a public IP address, this might help you out.
#!/bin/bash
set -eu
DS=$(date "+%Y-%m-%d" -d "7 days ago")
DE=$(date "+%Y-%m-%d" -d "1 day ago")
account=123
## above specifies last weeks delivery
rm -f cookiejar
curl -o /dev/null -s -S -L -f -c cookiejar 'https://url.io/auth/authenticate' -d name=usr -d passwd='pwd'
curl -o /dev/null -s -S -f -b cookiejar -c cookiejar 'https://adloox.io/auth/adminaccounts' -d account=$account
curl -s -S -L -f -o "report1.xlsx" -J -b cookiejar -c cookiejar "https://url.io/adquality/ajax-adblocking?categoryFw=&platform_id[]=7&id1=All&id2=&id3=All&id4=All&id5=&id11=&date=2019-12-09&date_start=$DS&date_end=$DE&website=&keywords=&zfTablePage=1&zfTableColumn=&zfTableOrder=desc&zfTableQuickSearch=&zfTableItemPerPage=100&zfTableDataTablesMaxRows=2628&zfTableItemPerPage=10000&zfTableExport=xlsx"
curl -s -S -L -f -o "report2.xlsx" -J -b cookiejar -c cookiejar "https://url.io/report/ajax-by-tag2?platform_id[]=7&id1=All&id2=&id3=All&id4=All&id5=&id11=&date=2019-12-09&date_start=$DS&date_end=$DE&website=&zfTablePage=1&zfTableColumn=&zfTableOrder=desc&zfTableQuickSearch=&zfTableItemPerPage=100&zfTableDataTablesMaxRows=10000&zfDetails=true&by_viewability=imps_sivt&device_id[]=all&tag_type_id[]=all&support_id[]=all&by_website=1&zfTableItemPerPage=10000&zfTableExport=xlsx"
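Since a 500 comes from the server side, it may also help to re-run just the failing request verbosely from the remote instance to see the full request/response exchange. This is a diagnostic step, not a fix; -f is dropped so curl shows the error body instead of aborting:
# Re-run only the failing request with verbose output to inspect the server's 500 response.
curl -v -L -b cookiejar -c cookiejar 'https://adloox.io/auth/adminaccounts' -d account=123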

Bash script. Open new terminal and run command [duplicate]

This question already has answers here:
how do i start commands in new terminals in BASH script
(2 answers)
Closed 19 days ago.
How can I run a command in a new terminal from a bash script?
If I run it from just one terminal, mosquitto_sub blocks the script. xterm -e opens a new terminal, but my script still blocks...
#!/bin/bash
COUNTER=0
xterm -e mosquitto_sub -h 192.168.1.103 -t test
mosquitto_pub -h 192.168.1.103 -t test -m "Connected"
cd Desktop/ScreenTool/image/
while [ $COUNTER == 0 ]; do
tesseract c.png output
if grep -q Click "/root/Desktop/ScreenTool/image/output.txt"; then
mosquitto_pub -h 192.168.1.103 -t test -m "Rain is here"
echo -en "\007"
fi
cat "/root/Desktop/ScreenTool/image/output.txt"
sleep 3;
done
To execute a command without waiting for it to finish, put it in the background with &.
#!/bin/bash
COUNTER=0
# Run the subscriber in its own terminal, in the background, so the script keeps going.
xterm -e mosquitto_sub -h 192.168.1.103 -t test &
mosquitto_pub -h 192.168.1.103 -t test -m "Connected"
cd Desktop/ScreenTool/image/
while [ $COUNTER == 0 ]; do
    tesseract c.png output
    if grep -q Click "/root/Desktop/ScreenTool/image/output.txt"; then
        mosquitto_pub -h 192.168.1.103 -t test -m "Rain is here"
        echo -en "\007"
    fi
    cat "/root/Desktop/ScreenTool/image/output.txt"
    sleep 3
done
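One small extension, an assumption about the desired behaviour rather than part of the answer: the backgrounded xterm keeps running after the script ends, so it may be worth closing it on exit. XTERM_PID is an illustrative name.
# Sketch: remember the backgrounded xterm and clean it up when the script exits.
xterm -e mosquitto_sub -h 192.168.1.103 -t test &
XTERM_PID=$!
trap 'kill "$XTERM_PID" 2>/dev/null' EXIT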

What's wrong with my "ssh + nohup" script?

I want to execute a specific script on a remote server via ssh, in the background.
I found some solutions using nohup, but nohup does not work without "2>&1".
I want to know what the difference is between including "2>&1" and leaving it out. Does nohup need the "2>&1" redirection?
(Please excuse my English.)
This is my 'iperf_server.sh' script.
iperf -s -p 1 -w 128K
And, It is my host machine command.
$ ssh [id]@[host] "nohup echo [password] | sudo -S [Home_dir]/iperf_server.sh > /dev/null &"
$ ssh [id]@[host] "nohup echo [password] | sudo -S [Home_dir]/iperf_server.sh > /dev/null 2>&1 &"
$ ssh -t [id]@[host] "nohup echo [password] | sudo -S [Home_dir]/iperf_server.sh > /dev/null &"
Connection to iperf-server closed.
$ ssh -t [id]@[host] "nohup echo [password] | sudo -S [Home_dir]/iperf_server.sh > /dev/null 2>&1 &"
Connection to iperf-server closed.
This is the ps command result on the iperf server:
# ps -eLf | grep iperf | grep -v grep
# ps -eLf | grep iperf | grep -v grep
00:00:00 sudo -S [HOME_DIR]/iperf_server.sh
00:00:00 sh [HOME_DIR]/iperf_server.sh
00:00:00 iperf -s -p 1 -w 128K
00:00:00 iperf -s -p 1 -w 128K
00:00:00 iperf -s -p 1 -w 128K
# killall iperf
# ps -eLf | grep iperf | grep -v grep
# ps -eLf | grep iperf | grep -v grep
Take the & off the end.
This should do it:
ssh -t [id]@[host] "nohup echo [password] | sudo -S [Home_dir]/iperf_server.sh > /dev/null 2>&1"
By the way, this is a huge security risk. Don't echo the password on the command line! If you really want to use a password like this, at least do something like cat pwd.txt | sudo -S instead.
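For example, a sketch of the same command with the password read from a protected file rather than echoed; iperf_pwd.txt is a hypothetical file name:
# Sketch: feed sudo -S from a file with restrictive permissions (chmod 600) instead of echoing the password.
# iperf_pwd.txt is hypothetical; create it on the remote host first.
ssh -t [id]@[host] "nohup sudo -S [Home_dir]/iperf_server.sh < ~/iperf_pwd.txt > /dev/null 2>&1"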

Update bash script, file check, how?

#!/bin/sh
LOCAL=/var/local
TMP=/var/tmp
URL=http://um10.eset.com/eset_upd
USER=""
PASSWD=""
WGET="wget --user=$USER --password=$PASSWD -t 15 -T 15 -N -nH -nd -q"
UPDATEFILE="update.ver"
cd $LOCAL
CMD="$WGET $URL/$UPDATEFILE"
eval "$CMD" || exit 1;
if [ -n "`file $UPDATEFILE | grep -i rar`" ]; then
    (
        cd $TMP
        rm -f $TMP/$UPDATEFILE
        unrar x $LOCAL/$UPDATEFILE ./
    )
    UPDATEFILE=$TMP/$UPDATEFILE
    URL=`echo $URL | sed -e s:/eset_upd::`
fi
TMPFILE=$TMP/nod32tmpfile
grep file=/ $UPDATEFILE|tr -d \\r > $TMPFILE
FILELIST=`cut -c 6- $TMPFILE`
rm -f $TMPFILE
echo "Downloading updates..."
for FILE in $FILELIST; do
    CMD="$WGET \"$URL$FILE\""
    eval "$CMD"
done
cp $UPDATEFILE $LOCAL/update.ver
perl -i -pe 's/\/download\/\S+\/(\S+\.nup)/\1/g' $LOCAL/update.ver
echo "Done."
So I have this code to download definitions for my antivirus. The only problem is that it downloads all the files every time I run the script. Is it possible to implement some sort of file checking? For example:
"if the file is present and has the same file size, skip it"
The -nc argument to wget will not re-fetch files that already exist. It is, however, not compatible with the -N switch. So you'll have to change your WGET line to:
WGET="wget --user=$USER --password=$PASSWD -t 15 -T 15 -nH -nd -q -nc"
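If the check really needs to compare sizes rather than just existence (a stricter reading of the question), a per-file guard inside the download loop might look like the sketch below. It assumes the server reports Content-Length; LOCALFILE, REMOTESIZE and LOCALSIZE are illustrative names.
# Sketch: skip a download when a local copy exists and matches the remote size.
for FILE in $FILELIST; do
    LOCALFILE=$(basename "$FILE")
    if [ -f "$LOCALFILE" ]; then
        # Ask the server for headers only and read the advertised size.
        REMOTESIZE=$(wget --user=$USER --password=$PASSWD --spider --server-response "$URL$FILE" 2>&1 \
            | grep -i 'Content-Length:' | tail -1 | awk '{print $2}' | tr -d '\r')
        LOCALSIZE=$(wc -c < "$LOCALFILE")
        if [ -n "$REMOTESIZE" ] && [ "$REMOTESIZE" = "$LOCALSIZE" ]; then
            continue    # already present with the same size, skip it
        fi
    fi
    $WGET "$URL$FILE"
done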
