Bash script runs normally when executed manually but not with crontab - linux

Hello, I have a script like this one:
#!/usr/bin/bash
ARSIP=/apps/bea/scripts/arsip
CURDIR=/apps/bea/scripts
OUTDIR=/apps/bea/scripts/out
DIRLOG=/apps/bea/jboss-6.0.0/server/default/log
LISTFILE=$CURDIR/tmp/file.$$
DATE=`perl -e 'use POSIX; print strftime "%Y-%m-%d", localtime time-86400;'`
JAVACMD=/apps/bea/jdk1.6.0_26/bin/sparcv9/java
HR=00
for (( c=0; c<24; c++ ))
do
echo $DATE $HR
$JAVACMD -jar LatencyCounter.jar LatencyCounter.xml $DATE $HR
sleep 1
cd $OUTDIR
mv btw_120-180.txt btw_120-180-$DATE-$HR.txt
mv btw_180-360.txt btw_180-360-$DATE-$HR.txt
mv btw_60-120.txt btw_60-120-$DATE-$HR.txt
mv failed_to_deliver.txt failed_to_deliver-$DATE-$HR.txt
mv gt_360.txt gt_360-$DATE-$HR.txt
mv out.log out-$DATE-$HR.log
cd -
let HR=10#$HR+1
HR=$(printf %02d $HR);
done
cd $OUTDIR
tar -cf latency-$DATE.tar btw*-$DATE-*.txt gt*$DATE*.txt out-$DATE-*.log
sleep 300
gzip latency-$DATE.tar
sleep 300
/apps/bea/scripts/summaryLatency.sh
sleep 300
rm -f btw* failed* gt* out*
#mv latency-$DATE.tar.gz ../$ARSIP
cd -
It basically executes jar files in the same directory as this script, then tars the results, gzips them, and runs another bash script before deleting all of the previously collected files. The problem is that I need this script to run daily, and I use crontab to do that. It keeps returning an empty tar file, but if I execute it manually it works well. I also have 4 other scripts running in crontab and they work fine. I still can't figure out the main reason for this phenomenon.
Thank you

I'll take a stab: your script is run by /bin/sh instead of /bin/bash.
Try explicitly running it with bash at the cron entry, like this:
* * * * * /bin/bash /your/script

I'm guessing that when you execute $JAVACMD -jar LatencyCounter.jar LatencyCounter.xml $DATE $HR, you're not in the directory containing LatencyCounter.jar. You might want to cd $CURDIR before you enter the for loop.
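A minimal sketch of that fix, reusing the variables from the question (it assumes LatencyCounter.jar really does sit in $CURDIR next to the script):
#!/usr/bin/bash
CURDIR=/apps/bea/scripts
JAVACMD=/apps/bea/jdk1.6.0_26/bin/sparcv9/java
DATE=$(perl -e 'use POSIX; print strftime "%Y-%m-%d", localtime(time - 86400);')
HR=00
# cron starts jobs in $HOME, so change into the jar's directory before the loop
cd "$CURDIR" || exit 1
"$JAVACMD" -jar LatencyCounter.jar LatencyCounter.xml "$DATE" "$HR"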

Related

How to run a script multiple times, waiting after every execution until the device is ready to run again?

I have this bash script:
#!/bin/bash
rm /etc/stress.txt
cat /dev/smd10 | tee /etc/stress.txt &
for ((i=0; i< 1000; i++))
do
echo -e "\nRun number: $i\n"
# wait until the module restarts and is ready for the next restart
dmesg | grep ERROR
echo -e 'AT+CFUN=1,1\r\n' > /dev/smd10
echo -e "\nADB device booted successfully\n"
done
I want to restart the module 1000 times using this script.
The module is like an Android device which has Linux inside it, but I work from Windows.
AT+CFUN=1,1 - reset
After I push the script, I need a command that waits for the module to come back up after every restart, so the script can run 1000 times. Then I pull a .txt file and save all of the output.
Which command should I use?
I tried commands like wait, sleep, watch, adb wait-for-device, ps aux | grep... Nothing works.
Can someone help me with this?
I found the solution. This is how my script actually looks:
#!/bin/bash
cat /dev/smd10 &                        # keep reading the module's serial output
TEST=$(cat /etc/output.txt)             # number of restarts completed so far
RESTART_TIMES=1000
if [[ $TEST != $RESTART_TIMES ]]
then
echo $((TEST+1)) > /etc/output.txt      # bump the counter before resetting
dmesg
echo -e 'AT+CFUN=1,1\r\n' > /dev/smd10  # trigger the next reset
fi
There is no loop in the script itself: installed as an init script, it runs once on every boot, increments the counter, and triggers another reset until the counter reaches RESTART_TIMES. These are the steps you need to do:
adb push /path/to/your/script /etc/init.d
cd /etc
echo 0 > output.txt - make an output file and write 0 inside it
cd init.d
ls - you should see rc5.d
cd ../rc5.d - go inside rc5.d
ln -s ../init.d/yourscript.sh S99yourscript.sh
ls - you should see S99yourscript.sh
cd ../init.d - return to the init.d directory
chmod +x yourscript.sh - add execute permission to your script
./yourscript.sh
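Once the counter in /etc/output.txt reaches 1000 the script stops resetting the module, and you can pull the results back to the host, for example (assuming adb is available, as in the question):
adb pull /etc/output.txt .   # should read 1000 when the run is complete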

What causes multiple mails when using cron with a bash script

I've made a little bash script to back up my Nextcloud files, including my database, from my Ubuntu 18.04 server. I want the backup to be executed every day. When the job is done I want to receive one mail saying so (and additionally whether it was successful or not). With the current script I receive almost 20 mails and I can't figure out why. Any ideas?
My cronjob looks like this:
* 17 * * * "/root/backup/"backup.sh >/dev/null 2>&1
My bash script
#!/usr/bin/env bash
LOG="/user/backup/backup.log"
exec > >(tee -i ${LOG})
exec 2>&1
cd /var/www/nextcloud
sudo -u www-data php occ maintenance:mode --on
mysqldump --single-transaction -h localhost -u db_user --password='PASSWORD' nextcloud_db > /BACKUP/DB/NextcloudDB_`date +"%Y%m%d"`.sql
cd /BACKUP/DB
ls -t | tail -n +4 | xargs -r rm --
rsync -avx --delete /var/www/nextcloud/ /BACKUP/nextcloud_install/
rsync -avx --delete --exclude 'backup' /var/nextcloud_data/ /BACKUP/nextcloud_data/
cd /var/www/nextcloud
sudo -u www-data php occ maintenance:mode --off
echo "###### Finished backup on $(date) ######"
mail -s "BACKUP" name@domain.com < ${LOG}
Are you sure about the cron string? For me this means "at every minute past hour 17", so the job starts (and mails) once a minute for that whole hour.
It should be more like 0 17 * * *, right?
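With that change the entry fires once a day:
0 17 * * * /root/backup/backup.sh >/dev/null 2>&1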

Run all shell scripts in folder

I have many .sh scripts in a single folder and would like to run them one after another. A single script can be executed as:
bash wget-some_long_number.sh -H
Assume my directory is /dat/dat1/files
How can I run bash wget-some_long_number.sh -H for each of them, one after another?
I understand something in these lines should work:
for i in *.sh;...do ....; done
Use this:
for f in *.sh; do
bash "$f"
done
If you want to stop the whole execution when a script fails:
for f in *.sh; do
bash "$f" || break # execute successfully or break
# Or more explicitly: if this execution fails, then stop the `for`:
# if ! bash "$f"; then break; fi
done
If you want to run, e.g., x1.sh, x2.sh, ..., x10.sh:
for i in `seq 1 10`; do
bash "x$i.sh"
done
To preserve the exit code of a failed script (responding to @VespaQQ): with set -e, the loop stops at the first failure and the wrapper exits with that script's status.
#!/bin/bash
set -e
for f in *.sh; do
bash "$f"
done
There is a much simpler way, you can use the run-parts command which will execute all scripts in the folder:
run-parts /path/to/folder
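One caveat worth hedging: on Debian-based systems, run-parts skips file names containing dots by default, so *.sh scripts may be silently ignored unless you pass a filter:
run-parts --regex '\.sh$' /dat/dat1/files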
I ran into this problem in a situation where I couldn't use loops, and run-parts works with cron.
Answer:
foo () {
bash -H $1
#echo $1
#cat $1
}
cd /dat/dat1/files #change directory
export -f foo #export foo
parallel foo ::: *.sh #equivalent to putting a & in between each script
This uses GNU parallel, which executes everything in the directory, with the added benefit of it happening at a much faster rate. And it isn't just for script execution; you could put any command in the function and it will work.
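Note that parallel runs the scripts concurrently by default; if you do need them strictly one after another while keeping this structure, limiting it to one job slot should work (my assumption, not part of the original answer):
parallel -j1 foo ::: *.sh   # one job at a time, i.e. sequential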

crontab not executing complex bash script

SOLVED! Add #!/bin/bash at the top of all my scripts in order to make use of bash extensions. Otherwise it restricts itself to POSIX shell syntax. Thanks Barmar!
Also, I'll add that I had trouble with gpg decryption not working from the cronjob after I got it executing, and the answer was to add the --no-tty option (no terminal output) to the gpg command.
I am fairly new to linux, so bear with me...
I am able to execute a simple script with crontab -e when logged in as ubuntu:
* * * * * /ngage/extract/bin/echoer.sh
and this bash script simply prints output to a file:
echo "Hello" >> output.txt
But when I try to execute my more complex bash script in exactly the same way, it doesn't work:
* * * * * /ngage/extract/bin/superMasterExtract.sh
This script calls into other bash scripts. There are 4 scripts in total, with 3 levels of hierarchy. It goes superMasterExtract > masterExtract > (decrypt, unzip).
Here is the code for superMasterExtract.sh (top level):
shopt -s nullglob # make globs that match no files expand to nothing
cd /str/ftp
DIRECTORY='writeable'
for d in */ ; do # for all directories in /str/ftp
if [ -d "$d$DIRECTORY" ]; then # if the directory contains a folder called 'writeable'
files=($d$DIRECTORY/*)
dirs=($d$DIRECTORY/*/)
numdirs=${#dirs[@]}
numFiles=${#files[@]}
((numFiles-=$numdirs))
if [ $numFiles -gt 0 ]; then # if the folder has at least one file in it
bash /ngage/extract/bin/masterExtract.sh /str/ftp ${d:0:${#d} - 1} # execute this masterExtract bash script with two parameters passed in
fi
fi
done
masterExtract.sh:
DATE="$(date +"%m-%d-%Y_%T")"
LOG_FILENAME="log$DATE"
LOG_FILEPATH="/ngage/extract/logs/$2/$LOG_FILENAME"
echo "Log file is $LOG_FILEPATH"
bash /ngage/extract/bin/decrypt.sh $1 $2 $DATE
java -jar /ngage/extract/bin/sftp.jar $1 $2
bash /ngage/extract/bin/unzip.sh $1 $2 $DATE
java -jar /ngage/extract/bin/sftp.jar $1 $2
echo "Log file is $LOG_FILEPATH"
decrypt.sh:
shopt -s nullglob
UPLOAD_FILEPATH="$1/$2/writeable"
DECRYPT_FOLDER="$1/decryptedFiles/$2"
HISTORY_FOLDER="$1/encryptHistory/$2"
DONE_FOLDER="$1/doneFiles/$2"
LOG_FILENAME="log$3"
LOG_FILEPATH="/ngage/extract/logs/$2/$LOG_FILENAME"
echo "DECRYPT_FOLDER=$DECRYPT_FOLDER" >> $LOG_FILEPATH
echo "HISTORY_FOLDER=$HISTORY_FOLDER" >> $LOG_FILEPATH
cd $UPLOAD_FILEPATH
for FILE in *.gpg;
do
FILENAME=${FILE%.gpg}
echo ".done FILE NAME=$UPLOAD_FILEPATH/$FILENAME.done" >> $LOG_FILEPATH
if [[ -f $FILENAME.done ]]; then
echo "DECRYPTING FILE=$UPLOAD_FILEPATH/$FILE INTO $DECRYPT_FOLDER/$FILENAME" >> $LOG_FILEPATH
cat /ngage/extract/.sftpPasswd | gpg --passphrase-fd 0 --output "$DECRYPT_FOLDER/$FILENAME" --decrypt "$FILE"
mv $FILE $HISTORY_FOLDER/$FILE
echo "MOVING FILE=$UPLOAD_FILEPATH/$FILE INTO $HISTORY_FOLDER/$FILE" >> $LOG_FILEPATH
else
echo "Done file not found!" >> $LOG_FILEPATH
fi
done
cd $DECRYPT_FOLDER
for FILE in *
do
mv $FILE $DONE_FOLDER/$FILE
echo "DECRYPTED FILE=$DONE_FOLDER/$FILE" >> $LOG_FILEPATH
done
If anyone has a clue why it refuses to execute my more complicated script, I'd love to hear it. I have also tried setting some environment variables at the beginning of crontab as well:
SHELL=/bin/bash
PATH=/sbin:/bin:/usr/sbin:/usr/local/bin:/usr/bin
MAILTO=jgardnerx85@gmail.com
HOME=/
* * * * * /ngage/extract/bin/superMasterExtract.sh
Note, I don't know that these are the appropriate variables for my installation or my script. I just pulled them off other posts and tried it to no avail. If these aren't the correct environment variables, can someone tell me how I can deduce the right ones for my particular application?
You need to begin your script with
#!/bin/bash
in order to make use of bash extensions. Otherwise it restricts itself to POSIX shell syntax.
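Putting that together with the gpg note at the top of the question, a sketch of both fixes; the shebang must be the very first line of each of the four scripts, and --no-tty goes on the gpg call in decrypt.sh:
#!/bin/bash
# ^ line 1 of superMasterExtract.sh, masterExtract.sh, decrypt.sh and unzip.sh

# decrypt.sh, with the poster's --no-tty addition so gpg never tries to use a terminal:
cat /ngage/extract/.sftpPasswd | gpg --no-tty --passphrase-fd 0 --output "$DECRYPT_FOLDER/$FILENAME" --decrypt "$FILE"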

Linux - Execute All Bash Scripts in Directory By Date Added

I am looking for a script that will execute all of the bash scripts in a given directory in the order by which they were added. For example, the earliest scripts added to the directory would be executed first.
This is what I am using now, but it doesn't seem to execute the scripts by date added.
for each in /dir/*.sh;
do
bash $each > /dev/null 2>&1 ;
rm $each > /dev/null 2>&1 ;
done ;
Let me know how I could modify this to order the files in the directory by date added.
If you mean "in order of creation":
ls -ctr /dir/*.sh | while read script
do
bash $script > /dev/null 2>&1
rm $script > /dev/null 2>&1
done
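A hedged, word-splitting-safe variant of the same idea (it assumes GNU find and sort; %C@ is the file's last status change time, the same key ls -c sorts on):
find /dir -maxdepth 1 -name '*.sh' -printf '%C@ %p\0' |
sort -zn |
while IFS= read -r -d '' entry; do
script=${entry#* }                 # drop the leading timestamp
bash "$script" > /dev/null 2>&1
rm -- "$script" > /dev/null 2>&1
done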
