remove file in Cron task - linux

I have the following command line that I run on a Debian system:
ls -d mypath/filename* -tp | grep -v '/$' | tail -n +1 | xargs rm -f
Result: OK (the file is removed)
But I want to run this command periodically, and if I put it in a cron job, it doesn't work (the file is not removed)
I checked the logs, and there are no errors or warnings
What I tried :
* * * * * bash -c "ls -d mypath/filename* -tp | grep -v '/$' | tail -n +1 | xargs rm -f"
Any idea ?
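A likely culprit, going by the working-directory issues in the related threads below, is the relative path: cron starts the job from the invoking user's home directory, so mypath/filename* resolves against $HOME rather than against the directory you tested from. A minimal sketch using an absolute path (/full/path/to/mypath is a placeholder for the real directory):
* * * * * bash -c "ls -d /full/path/to/mypath/filename* -tp | grep -v '/$' | tail -n +1 | xargs rm -f"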

Related

Using ssh inside a script to run another script that itself calls ssh

I'm trying to write a script that builds a list of nodes, then sshes into the first node of that list
and runs a checknodes.sh script, which itself is just a for loop that calls checknode.sh.
The first two lines seem to work OK and the list builds successfully, but then I either get just the echo line of checknodes.sh printed out, or an error saying cat: gpcnodes.txt: No such file or directory
MYSCRIPT.sh:
#gets the master node for the job
MASTERNODE=`qstat -t -u \* | grep $1 | awk '{print$8}' | cut -d'@' -f 2 | cut -d'.' -f 1 | sed -e 's/$/.com/' | head -n 1`
#builds list of nodes in job
ssh -qt $MASTERNODE "qstat -t -u \* | grep $1 | awk '{print$8}' | cut -d'#' -f 2 | cut -d'.' -f 1 | sed -e 's/$/.com/' > /users/issues/slow_job_starts/gpcnodes.txt"
ssh -qt $MASTERNODE cd /users/issues/slow_job_starts/
ssh -qt $MASTERNODE /users/issues/slow_job_starts/checknodes.sh
checknodes.sh
for i in `cat gpcnodes.txt `
do
echo "### $i ###"
ssh -qt $i /users/issues/slow_job_starts/checknode.sh
done
checknode.sh
str=`hostname`
cd /tmp
time perf record qhost >/dev/null 2>&1 | sed -e 's/^/${str}/'
perf report --pretty=raw | grep % | head -20 | grep -c kernel.kallsyms | sed -e "s/^/`hostname`:/"
When ssh -qt $MASTERNODE cd /users/issues/slow_job_starts/ is finished, the changed directory is lost.
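One hedged way around that, assuming the same paths as above, is to chain both steps into a single remote command so they share one session:
ssh -qt $MASTERNODE 'cd /users/issues/slow_job_starts/ && ./checknodes.sh'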
With the backquotes replaced by $(..) (not an error here, but get used to it), the script would be something like
for i in $(cat /users/issues/slow_job_starts/gpcnodes.txt)
do
echo "### $i ###"
ssh -nqt $i /users/issues/slow_job_starts/checknode.sh
done
or better (the while read form avoids the word-splitting and globbing that the $(cat ...) loop performs):
while read -r i; do
echo "### $i ###"
ssh -nqt $i /users/issues/slow_job_starts/checknode.sh
done < /users/issues/slow_job_starts/gpcnodes.txt
Perhaps you would also like to change your last script to start with cd /users/issues/slow_job_starts.
You will find more problems, like sed -e 's/^/${str}/' (the ${str} inside single quotes won't be replaced by the hostname), but this should get you started.
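A quick illustration of that quoting pitfall (not specific to this cluster):
str=$(hostname)
echo line | sed -e "s/^/${str}: /"   # double quotes: ${str} expands to the hostname
echo line | sed -e 's/^/${str}: /'   # single quotes: the literal text ${str} is prepended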
EDIT:
I added the option -n to the ssh call.
It redirects stdin from /dev/null (actually, it prevents ssh from reading stdin).
Without this option only one node is checked, because ssh consumes the rest of the input the loop is reading.
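A minimal sketch that reproduces the effect (hosts.txt being a hypothetical file of reachable hosts):
while read -r host; do
echo "### $host ###"
ssh "$host" true          # without -n, ssh drains the remaining lines of hosts.txt
done < hosts.txt          # so the loop body runs only once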

echo $variable in cron not working

I'm having trouble printing the result of the following when run by cron. I have a script named /usr/local/bin/test:
#!/bin/sh
PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
ARAW=`date +%y%m%d`
NAME=`hostname`
TODAY=`date '+%D %r'`
cd /directory/bar/foo/
VARR=$(ls -lrt /directory/bar/foo/ | tail -1 | awk {'print $8'} | ls -lrt `xargs` | grep something)
echo "Resolve2 Backup" > /home/user/result.txt
echo " " >> /home/user/result.txt
echo "$VARR" >> /home/user/result.txt
mail -s "Result $TODAY" email#email.com < /home/user/result.txt
I configured it in /etc/cron.d/test to run daily at 1am:
00 1 * * * root /usr/local/bin/test
When I run it manually on the command line
# /usr/local/bin/test
I get the complete value. But when I let cron do the work, it never displays the part from echo "$VARR" >> /home/user/result.txt
Any ideas?
VARR=$(ls -lrt /directory/bar/foo/ | tail -1 | awk {'print $8'} | ls -lrt `xargs` | grep something)
ls -ltr /path/to/dir will not include the directory in the filename part of the output. Then, you call ls again with this output, and this will look in your current directory, not in /path/to/dir.
In cron, your current directory is likely to be /, and in your manual testing, I bet your current directory is /path/to/dir
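One hedged way to confirm the difference is to log the working directory from a temporary cron entry (for /etc/cron.d, which takes a user field; remove it after one run):
* * * * * root pwd >> /tmp/cron-cwd.log 2>&1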
Here's another approach to finding the newest file in a directory that emits the full path name:
stat -c '%Y %n' /path/to/dir/* | sort -nr | head -1 | cut -d" " -f 2-
Requires GNU stat, check your man page for the correct invocation for your system.
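On BSD or macOS, where GNU stat is unavailable, a roughly equivalent invocation (untested here) would be:
stat -f '%m %N' /path/to/dir/* | sort -nr | head -1 | cut -d" " -f 2-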
I think your VARR invocation can be:
latest_dir=$(stat -c '%Y %n' /path/to/dir/* | sort -nr | head -1 | cut -d" " -f 2-)
interesting_files=$(ls -ltr "$latest_dir"/*something*)
Then, no need for a temp file:
{
echo "Resolve2 Backup"
echo
echo "$interesting_files"
} |
mail -s "Result $TODAY" email#email.com
Thanks for all your tips and responses. I solved my problem. The problem was the output of $8 and $9 in cron; I don't know what special field was being read while it ran under cron. I'm just a newbie in scripting, so sorry for my bad script =)

Why part of the script cannot execute in the crontab

I have a script stopping the application and zipping some files:
/home/myname/project/stopWithZip.sh
With the properties below:
-rwxrwxr-x. 1 myname myname 778 Jun 25 13:48 stopWithZip.sh
Here is the content of the script:
ps -ef | grep project | grep -v grep | awk '{print $2}' |xargs kill -15
month=`date +%m`
year=`date +%Y`
fixLogs=~/project/log/fix/$year$month/*.log.*
errorLogs=~/project/log/error/$year$month/log.*
for log in $fixLogs
do
if [ ! -f "$log.gz" ];
then
gzip $log
echo "Archived:"$log
else
echo "skipping" $log
fi
done
echo "Archived fix log files done"
for log in $errorLogs
do
if [ ! -f "$log.gz" ]; then
gzip $log
echo "Archived:"$log
else
echo "skipping" $log
fi
done
echo "Archived errorlog files done"
The problem is that, except for this ps -ef | grep project | grep -v grep | awk '{print $2}' | xargs kill -15 command, the other gzip commands are not executed. I totally don't understand why.
I cannot see any compression of the logs in the directory.
BTW, when I execute stopWithZip.sh explicitly on the command line, it works perfectly fine.
In crontab:
00 05 * * 2-6 /home/myname/project/stopWithZip.sh >> /home/myname/project/cronlog/$(date +"\%F")-stop.log 2>&1 (does NOT work)
In command line:
/home/myname/project>./stopWithZip.sh (work)
Please help
The script fails when run under cron because your script is invoked with project in its path, so the kill pipeline kills the script too.
You could prove (or disprove) this by adding some tracing. Log the output of ps and of awk to log files:
ps -ef |
tee /tmp/ps.log.$$ |
grep project |
grep -v grep |
awk '{print $2}' |
tee /tmp/awk.log.$$ |
xargs kill -15
Review the logs and see that your script is one of the processes being killed.
The crontab entry contains:
/home/myname/project/stopWithZip.sh >> /home/myname/project/cronlog/$(date +"\%F")-stop.log 2>&1
When ps lists that, it contains 'project' and does not contain 'grep' so the kill in the script kills the script itself.
When you run it from the command line (using a conventional '$' as the prompt), you run:
$ ./stopWithZip.sh
and when ps lists that, it does not contain 'project' so it is not killed.
If you ran:
$ /home/myname/project/stopWithZip.sh >> /home/myname/project/cronlog/$(date +"\%F")-stop.log 2>&1
from the command line, like you do with cron (crontab), you would find it fails.
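A hedged workaround that keeps the original pipeline but excludes the script's own process tree (assuming the application's processes still match project):
ps -ef |
grep project |
grep -v grep |
awk '{print $2}' |
grep -vw -e "$$" -e "$PPID" |   # drop this script and the shell cron wrapped it in
xargs -r kill -15               # -r (GNU xargs): do nothing when no PIDs remain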

How to print a list of users one by one with complete information in Linux using shell

I have started writing a small piece of code to print the list of users available on the Linux box. But I want to pass the users one by one into my command, to display each user's details together.
To list all users:
root@bt# getent passwd | grep /home/ | cut -d ':' -f 1
root
san
postgres
Now I want to pass the users one by one into the command below, to display each user's details together.
root@bt# chage -l ${user1} ; chage -l ${user2} etc.
Should I use a for loop or a while loop here?
Can anyone suggest how to write this?
You can use the while loop:
getent passwd | grep /home/ | cut -d ':' -f 1 | \
while read user ; do
chage -l "$user"
done
or the for loop:
for user in $(getent passwd | grep /home/ | cut -d ':' -f 1) ; do
chage -l "$user"
done
or xargs:
getent passwd | grep /home/ | cut -d ':' -f 1 | \
xargs -n1 chage -l
I would use xargs, which runs a command on each output item of the previous pipe:
getent passwd | grep /home/ | cut -d ':' -f 1 | sudo xargs -I % sh -c '{ echo "User: %"; chage -l %; echo;}'
sudo is used to get information about all users, if you don't have access to this information then you can remove sudo
-I % is used to specify that % is a placeholder for the input item (in your case a user)
sh -c '{ command1; command2; ...;}' is the command executed by xargs on every % item; in turn, the command sh -c allows multiple shell commands to be executed
'{ echo "User: %"; chage -l %; echo;}' echoes the current user in %, then runs chage -l on this user and finished with a final empty echo to format the ouput

Grepping folder names but excluding "@domain.com"

Is it possible to exclude a domain from a grep? What I have tried below doesn't seem to work.
ls -l /var/www/folder | grep -E -o --exclude-dir="@somedomain.com" --color "\b[a-zA-Z0-9.-]+@[a-zA-Z0-9.-]+\.[a-zA-Z0-9.-]+\b">>test.txt
How about this:
ls -l /var/www/folder | grep -v "@somedomain.com"
test case:
$ mkdir -p /tmp/test && cd $_
$ touch {a,b,c,d}@domain.com
$ touch {e,f}@somedomain.com
$ ls
a@domain.com b@domain.com c@domain.com d@domain.com e@somedomain.com f@somedomain.com
$ ls -1 | grep -v "@somedomain.com"
a@domain.com
b@domain.com
c@domain.com
d@domain.com
Here is what the man page says for -v
-v, --invert-match
Invert the sense of matching, to select non-matching lines. (-v is specified by POSIX.)
Applied to the original command, that becomes:
ls -l /var/www/folder | grep --invert-match "@somedomain.com" | grep -E -o --color "\b[a-zA-Z0-9.-]+@[a-zA-Z0-9.-]+\.[a-zA-Z0-9.-]+\b">>test.txt
