I have a simple script that works when I manually put it into the ~/ directory.
#!/bin/bash
source /opt/python/current/env
source /opt/python/run/venv/bin/activate
cd /opt/python/current/app
scrapy crawl myspider
deactivate
exit 0
Since Elastic Beanstalk will eventually replace the instance, deleting the script and the crontab, I need to define them in a config file in .ebextensions.
So I created cron-linux.config.
But I can't make it work. It doesn't do anything (I should see changes in a database).
files:
  "/etc/cron.d/crawl":
    mode: "000644"
    owner: root
    group: root
    content: |
      * * * * * /usr/local/bin/crawl.sh

  "/usr/local/bin/crawl.sh":
    mode: "000755"
    owner: ec2-user
    group: ec2-user
    content: |
      #!/bin/bash
      source /opt/python/current/env
      source /opt/python/run/venv/bin/activate
      cd /opt/python/current/app
      scrapy crawl myspider
      deactivate
      exit 0

commands:
  remove_old_cron:
    command: "rm -f /etc/cron.d/crawl.bak"
Do you know where the problem is?
Even when I run eb logs > log.txt and search the file, there is no mention of 'crawl', 'cron', etc.
EDIT
I can even run the script manually: [ec2-user@ip-xxx-xx-x-xx ~]$ /usr/local/bin/crawl.sh
And I see that it works. It's only cron that doesn't.
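One general cron detail that may be relevant here, stated as a fact about cron rather than a confirmed diagnosis of this setup: files under /etc/cron.d use the system crontab format, which requires a user field between the schedule and the command. A line without it is rejected, and the job never runs. With the user field, the entry above would look like this:

```shell
# System crontab format (/etc/cron.d, /etc/crontab): schedule, then the
# user to run the command as, then the command itself.
* * * * * root /usr/local/bin/crawl.sh
```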
Related
There is a shell script in /file/location/azcopy/, and the AzCopy binary is located in the same directory.
The command below runs successfully when I run it manually:
./azcopy cp "/abc/def/Goa.csv" "https://.blob.core.windows.net/abc\home\xyz?"
However, when I scheduled it in crontab, the ./azcopy command didn't execute.
Below is the script:
#!/bin/bash
./azcopy cp "/abc/def/Goa.csv" "https://<Blobaccount>.blob.core.windows.net/abc\home\xyz?<SAS-Token>"
Below is the crontab entry:
00 17 * * * root /file/location/azcopy/script.sh
Is there something I'm doing wrong?
Could someone please help me figure out what's wrong?
When root executes /file/location/azcopy/script.sh from cron, the working directory is /root, so you need to add cd /file/location/azcopy/ to script.sh to change the working directory first. You can also add pwd to the script to print the current working directory.
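A minimal sketch of that fix. This variant uses dirname "$0" instead of the hard-coded path so it works wherever the script lives; the azcopy invocation is left commented out because the account name and SAS token are placeholders:

```shell
#!/bin/bash
# Cron starts jobs in the invoking user's home directory (/root for the
# root crontab), so "./azcopy" is not found there. Changing into the
# directory containing this script makes relative paths resolve again.
cd "$(dirname "$0")" || exit 1
pwd   # sanity check: prints the script's own directory
# ./azcopy cp "/abc/def/Goa.csv" "https://<Blobaccount>.blob.core.windows.net/abc\home\xyz?<SAS-Token>"
```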
I installed Debian 9 on my VPS. I installed LAMP on the server. I'm logged in as root, I created a new site "/var/www/example.com" and I see that the permissions are "root:root". The web page is displayed in the browser.
I created a cron.php file that writes the current time to a file. In crontab I have /usr/bin/php /var/www/example.com/cron.php. If I run this command from the terminal, everything works. However, the cron job fails because it does not have write permissions, even though crontab runs as root and the directory has 777 permissions.
I tried setting /var/www to www-data:www-data and installing the crontab for that user (crontab -u www-data -e). The result is the same: cron runs but does not write to the file.
EDIT:
I found that if the script contains file_put_contents('output.txt', 'xxx');, the file created by cron ends up in /root. If I set the full path, everything is fine: file_put_contents('/var/www/example.com/output.txt', 'xxx');. Is there any way to modify this behavior?
You can create a wrapper script like this:
#!/bin/bash
source ~/.bashrc #or use .bash_profile
/usr/bin/php /var/www/example.com/cron.php >>/path/to/output
and add it as cron record:
0 * * * * /path/to/script.sh
I have a couple of Linux commands I want to run once every morning using cron, but I'm not sure how to do this.
I understand that it should be done with a shell script, but I don't know how to do this in Linux. I will be able to create the cron job using cPanel, though...
These are the commands
rm -rf <directory>
mkdir <directory>
chmod 777 <directory>
You can put these commands in a shell script, for example script.sh:
#!/usr/bin/bash
rm -rf <directory>
mkdir <directory>
chmod 777 <directory>
<other commands or logic>...
In Linux you can add a cron job either with the crontab -e command or by creating a file in the /etc/cron.d directory. The difference is that crontab -e installs the job for the user who runs it, while an entry in a /etc/cron.d file must name the user ahead of the command.
Examples of cron jobs to be executed at 06:00 AM:
With crontab -e:
0 6 * * * /usr/bin/bash /path_to_script/script.sh
Creating a file in /etc/cron.d:
0 6 * * * root /usr/bin/bash /path_to_script/script.sh
Alternatively you can just put the commands in your cron job as:
0 6 * * * rm -rf <directory> && mkdir <directory> && chmod 777 <directory>
Attention: remember to use absolute paths for the directories you want to remove or create.
PS: you can write your scripts in any language that can make shell calls, such as PHP with the shell_exec() function or Perl's system() function.
I updated the script with absolute paths. Also, here is my current cron job entry.
I went and fixed the SSH key issue, so I know that works now, but I might still need to tell rsync which key to use.
The script runs fine when called manually by the user. It looks like not even the rm commands are being executed by the cron job.
UPDATE
I updated my script, but it is basically the same as the one below. Below I have a new cron time and an added error output.
I get nothing; it looks like the script doesn't even run.
crontab -e
35 0 * * * /bin/bash /x/y/z/s/script.sh 2>1 > /tmp/tc.log
#!/bin/bash
# Clean up
/bin/rm -rf /z/y/z/a/b/current/*
cd /z/y/z/a/to/
/bin/rm -rf ?s??/D????
cd /z/y/z/s/
# Find the latest file
FILE=`/usr/bin/ssh user@server /bin/ls -ht /x/y/z/t/a/ | /usr/bin/head -n 1`
# Copy over the latest archive and place it in the proper directory
/usr/bin/rsync -avz -e /usr/bin/ssh user@server:"/x/y/z/t/a/$FILE" /x/y/z/t/a/
# Unzip the zip file and place it in the proper directory
/usr/bin/unzip -o /x/y/z/t/a/$FILE -d /x/y/z/t/a/current/
# Run Dev's script
cd /x/y/z/t/
./old.py a/current/ t/ 5
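A side note on the crontab entry in the UPDATE, offered as a general shell fact: in "2>1 > /tmp/tc.log", the "2>1" part sends stderr to a file literally named 1 in the working directory, and only stdout reaches the log. To capture both streams, redirect stdout first and then duplicate it for stderr:

```shell
# Wrong order: "2>1" creates a file named "1"; stderr never hits the log.
# Correct: send stdout to the log, then point stderr at the same place.
sh -c 'echo out; echo err >&2' > /tmp/tc.log 2>&1
cat /tmp/tc.log   # contains both lines: "out" and "err"
```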
Thanks for the help.
I figured it out: I'm used to working in CST and the server was on GMT.
Thanks everybody for the help.
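For anyone hitting the same thing: cron fires according to the server's local time zone, so it's worth comparing the server clock with UTC before picking schedule times. A quick check (timedatectl is only available on systemd-based systems):

```shell
date        # server-local time, with the zone abbreviation
date -u     # the same instant in UTC
# timedatectl | grep "Time zone"   # on systemd systems, the configured zone
```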
I am trying to archive my localhost's root folder with tar and want to automate its execution on a daily basis with crontab. For this purpose, I created a 'backupfolder' in my personal folder. I am running Ubuntu 12.04.
The execution of tar in the command line works fine without problems:
sudo tar -cvpzf backupfolder/localhost.tar.gz /var/www
However, when I schedule the command for a daily backup (let's say at 17:00) in sudo crontab -e, it does not execute, i.e. the backup does not update, with the following entry:
0 17 * * * sudo tar -cpzf backupfolder/localhost.tar.gz /var/www
I already tried the full path /home/user/backupfolder/localhost.tar.gz without success.
/var/log/syslog gives me the following output for the scheduled execution:
Feb 2 17:00:01 DESKTOP-PC CRON[12052]: (root) CMD (sudo tar -cpzf backupfolder/localhost.tar.gz /var/www)
Feb 2 17:00:01 DESKTOP-PC CRON[12051]: (CRON) info (No MTA installed, discarding output)
/etc/crontab specifies the following path:
SHELL=/bin/sh
PATH=/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin
I assume that the job is not executing because it is a sudo command.
Is there a way how I can get this running? What is the recommended, safe way if I don't want to hardcode my root password?
Well, the command that works for you is
sudo tar -cvpzf backupfolder/localhost.tar.gz /var/www
That means you have to run the command with sudo access, and it will not work from within your crontab.
I would suggest adding the cron job to the root user's crontab.
Basically, do
sudo crontab -e
And add an entry there
0 17 * * * cd /home/user/backupfolder && tar -cpzf localhost.tar.gz /var/www
If that doesn't work, add the full path of tar (like /bin/tar).
Also, while debugging, set the cron job to run every minute (* * * * *).
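Putting those debugging suggestions together, a hypothetical every-minute entry for the root crontab that also captures output (the log path is only an example):

```shell
# Run every minute while debugging, logging both stdout and stderr:
* * * * * cd /home/user/backupfolder && /bin/tar -cpzf localhost.tar.gz /var/www >> /tmp/tar-cron.log 2>&1
```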
Basically the problem is the sudo command, so we will allow sudo to run tar for "user" without prompting for the password.
Add the following line in /etc/sudoers file.
user ALL=(ALL) NOPASSWD:/bin/tar
where user is the user installing the crontab.
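With that sudoers rule in place, the idea is that the user's own crontab can then invoke tar through sudo without a password prompt, something like:

```shell
# In the user's crontab (crontab -e); sudo now succeeds non-interactively,
# but only for /bin/tar as allowed by the NOPASSWD rule above:
0 17 * * * sudo /bin/tar -cpzf /home/user/backupfolder/localhost.tar.gz /var/www
```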
I suspect a PATH problem; try setting some variables at the top of sudo crontab -e:
MAILTO=your_email@domain.tld # to receive the output if there are errors
PATH=/usr/bin:/bin:/usr/local/bin:/usr/local/sbin:/sbin
You can put your command in a script such as run.sh:
#!/bin/sh -l
tar -cvpzf backupfolder/localhost.tar.gz /var/www
then use the crontab to run the script.
IMPORTANT NOTE: the script's first line has the "-l" option.
Try it.
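For reference, the effect of that -l flag: it makes sh behave as a login shell, so profile files (/etc/profile, ~/.profile) are read and PATH ends up closer to what an interactive session sees, which works around cron's minimal default environment. A quick way to compare:

```shell
# A login shell (-l) sources profile files before running the command,
# so its PATH may include entries that cron's bare environment lacks.
sh -lc 'echo "$PATH"'
sh -c  'echo "$PATH"'
```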