I am trying to import a CSV file into BigQuery with the bq command line tool, and it works without issue from a bash script. When I try to run that script from crontab, I get the error message "bq: command not found".
Here is my script
#!/bin/bash
bq load --field_delimiter=";" --source_format=CSV.....
This script works when I run it from the shell.
This is the cron job
* * * * * /users/user/desktop/test.sh
Why isn't it working from crontab?
Your PATH variable must contain the directory that holds your programs, not the path to a single program.
In this case, it should be ../google-cloud-sdk/bin and not ../google-cloud-sdk/bin/bq.
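For example, assuming the Cloud SDK lives under /users/user/google-cloud-sdk (adjust this to your actual install location), you could add that directory to PATH at the top of test.sh so the script also works under cron's minimal environment:
#!/bin/bash
# Hypothetical SDK location; point this at wherever the Cloud SDK is actually installed
export PATH="$PATH:/users/user/google-cloud-sdk/bin"
bq load --field_delimiter=";" --source_format=CSV ...
Alternatively, set PATH on its own line at the top of the crontab, or call bq by its absolute path.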
I am facing an issue where crontab does not run my Python script on Raspbian GNU/Linux 9.4 (stretch) installed on a Raspberry Pi 3. I have done a lot of research on this topic, e.g. by following the troubleshooting guide at https://askubuntu.com/questions/23009/why-crontab-scripts-are-not-working/24527#24527 , but none of it solved my issue.
The Python script that I want to run is located at
/home/pi/Documents/Fork_BookManager.py
I have made sure that anyone can read, write, and execute the above file.
I can run this file from the terminal with
/usr/bin/python3 /home/pi/Documents/Fork_BookManager.py
I know the file runs, because Fork_BookManager.py uses selenium to open a web browser, and I can see the browser being opened. No error is observed when running Fork_BookManager.py from the terminal.
In the terminal I executed
crontab -e
Using nano as the editor, I first verified that crontab actually worked with
* * * * * env > /home/pi/Documents/env.output
After a minute there is an env.output file in the Documents folder with the parameters below.
LANGUAGE=da_DK.UTF-8
HOME=/home/pi
LOGNAME=pi
PATH=/usr/bin:/bin
LANG=da_DK.UTF-8
SHELL=/bin/sh
LC_ALL=da_DK.UTF-8
PWD=/home/pi
I noticed that the PATH from crontab is different from the PATH when env is called from the terminal; calling env from the terminal gives
PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/local/games:/usr/games
So I know that crontab -e was working, since otherwise the env.output file would not have been created in the Documents folder. I proceeded by calling crontab -e from the terminal and copied the PATH from env into the crontab like below:
PATH = /usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/local/games:/usr/games
* * * * * /usr/bin/python3 /home/pi/Documents/Fork_BookManager.py
I saved the crontab and waited. Nothing happened.
To summarize, I have:
Made sure that the Fork_BookManager.py is executable
Verified that crontab is working
Updated the PATH from env in the crontab
which are the three largest causes of difficulty in getting crontab to work, according to this guide:
https://askubuntu.com/questions/23009/why-crontab-scripts-are-not-working/24527#24527
Still it is not working. What have I missed? Is there a better practice for what I want to achieve, namely executing the Python script once a minute, without doing it from the Python script itself, such as with a while loop and time.sleep(60)?
ADDITIONAL INFORMATION
I even tried to expand the PATH in the crontab with
PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/local/games:/usr/games:/usr/bin:/home/pi/Documents
in an attempt to specify the folders in which python3 and Fork_BookManager.py are located. Still no luck.
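One further check that often helps here (a sketch, reusing the paths above) is to redirect the job's output and errors to a log file, so any failure message from the script becomes visible:
* * * * * /usr/bin/python3 /home/pi/Documents/Fork_BookManager.py >> /home/pi/Documents/cron_run.log 2>&1
The name cron_run.log is just an illustrative choice; after a minute its contents should show why the script fails under cron even though it runs fine from the terminal.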
I'm having issues getting my crontab to run. I have the following line added via crontab -e, but it won't start. The command runs fine if I run it manually.
0 */3 * * * cd /home/sam/p/ && /usr/bin/python3.5 start.py
Not getting any error messages and can't see the process when I run top or grep for it.
Usually this happens because the cron environment is different from your own. Make sure your start.py script uses full paths to any referenced files or external scripts. Make sure that start.py does not rely on environment variables that exist in your shell but may not exist under cron. Try piping the cron output to a mail command so you can see what it is doing, like so:
0 */3 * * * cd /home/sam/p/ && /usr/bin/python3.5 start.py | mail -s "cron output" myself@example.com
An easier way to troubleshoot this is to write a wrapper shell script and send the output to a log file.
Create a file python_start_cron.sh with the following contents:
#!/bin/bash
cd /home/sam/p/ && /usr/bin/python3.5 start.py
Set the execute bit on this script and make sure it works when run manually.
Modify the cron job as shown below, using the full path to wherever you saved the wrapper (cron's minimal PATH will not find it by name alone):
0 */3 * * * /path/to/python_start_cron.sh >/tmp/python_start_cron.log 2>&1
After cron executes, check the contents of the log file to ascertain the cause of the problem.
I have my PHP script at /var/www/html/dbsync/index.php. When I cd /var/www/html/dbsync/ and run php index.php, it works perfectly.
I want to call the PHP file through a shell script. The location of the shell script is as below:
/var/www/html/dbsync/dbsync.sh
This is the content of the dbsync.sh file:
/usr/bin/php /var/www/html/dbsync/index.php >> /var/www/html/dbsync/myscript.log 2>&1 -q -f
When I cd /var/www/html/dbsync/ and run ./dbsync.sh it works perfectly as well.
Now if I set up crontab as below:
1 * * * * /var/www/html/dbsync/dbsync.sh /var/www/html/dbsync
However, this crontab is not working as expected.
What can be wrong?
As seen in the comments, the problem is that you are not defining which program should be used to execute the script. Take into account that a cron job runs in a tiny environment where not much can be assumed; this is why we define full paths, etc.
So you need to say something like:
1 * * * * /bin/sh /var/www/html/dbsync/dbsync.sh /var/www/html/dbsync
# ^^^^^^^
/bin/sh being the binary you want to use to execute the script.
Otherwise, you can set execute permissions on the script and add a shell-script header telling it which interpreter to use:
#!/bin/sh
If you do this, specifying the interpreter in the crontab line is not necessary.
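A sketch of that second option, using the paths from the question:
chmod +x /var/www/html/dbsync/dbsync.sh
and then in the crontab:
1 * * * * /var/www/html/dbsync/dbsync.sh /var/www/html/dbsync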
From Troubleshooting common issues with cron jobs:
Using relative paths. If your cron job is executing a script of some
kind, you must be sure to use only absolute paths inside that script.
For example, if your script is located at /path/to/script.php and
you're trying to open a file called file.php in the same directory,
you cannot use a relative path such as fopen(file.php). The file must
be called from its absolute path, like this: fopen(/path/to/file.php).
This is because cron jobs do not necessarily run from the directory in
which the script is located, so all paths must be called specifically.
Also, I understand you want to run this every minute. If so, 1 * * * * won't do; instead, it will run at the first minute past every hour. If you want to run it every minute, use * * * * *.
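Putting both points together, a crontab entry such as the following (with the paths from the question and an explicit interpreter) runs the script every minute:
* * * * * /bin/sh /var/www/html/dbsync/dbsync.sh /var/www/html/dbsync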
It is important to understand what "login shell" and "interactive shell" mean.
Login shell: briefly, this is what you get when you sign in (for example over an SSH session) and receive a terminal window where you can enter shell commands. After login the system executes your profile files (.bash_profile, .bash_login, .profile) and sets some environment variables, such as the PATH variable, for you.
Interactive shell: after logging in, you can manually start additional shell terminals. For these, the system executes the per-shell startup file (.bashrc), which also sets environment variables and initializes the PATH variable for your manually opened shell session.
Shell scripts started by the OS and cron jobs do not fit either of the above ways of starting a shell. Therefore neither the startup scripts nor the user profiles are executed, which means the PATH variable is not fully initialized. Shell commands cannot be found because PATH does not point to the right places.
This explains why your script runs successfully if you start it manually but fails when you start it via crontab.
Solution-1:
Use the absolute path of every shell command instead of just the command name in your script file(s).
instead of "awk" use "/usr/bin/awk"
instead of "sed" use "/bin/sed"
Solution-2: Initialize environment variables and especially the PATH variable before executing shell scripts!
Method 1: add this header to your dbsync.sh:
#!/bin/bash -l
Method 2: add bash -l to your cron entry:
1 * * * * bash -l /var/www/html/dbsync/dbsync.sh /var/www/html/dbsync
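A third common option (a sketch, assuming the standard system directories cover everything the script calls) is to set PATH at the top of the crontab itself, so every job inherits it:
PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
1 * * * * /var/www/html/dbsync/dbsync.sh /var/www/html/dbsync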
I have a script to back up my database at /home/<user>/bin/dbbackup. The script is executable by all users and owned by me. The files /etc/cron.allow and /etc/cron.deny do not exist.
In my crontab I have the following lines (including a new blank line after the last line of code):
@reboot /home/<user>/.dropbox-dist/dropboxd
30 2 * * * bash /home/<user>/bin/dbbackup
However, cron is not running my dbbackup script. When I run a manual test of the script it works. When I run this test on the command line: * * * * * /bin/echo "cron works" >> ~/file I get the following error:
No command 'dbbackup' found, did you mean:
Command 'dvbackup' from package 'dvbackup' (universe)
Command 'tdbbackup' from package 'tdb-tools' (main)
dbbackup: command not found
My server is running Ubuntu Trusty. Any help please?
As the comments noted, it appears that amiga_os needed to remove the reference to bash in the line:
30 2 * * * bash /home/<user>/bin/dbbackup
It should be:
30 2 * * * /home/<user>/bin/dbbackup
I usually just call scripts by their path and use "#!/bin/bash" (or wherever your bash lives) as the first line of the script. It appears amiga_os had already done this, which is good. I don't like putting long commands directly into cron because it makes me nervous.
I think it was a path issue: cron executes as the user but does not read the bash profile, so it does not behave exactly like your interactive shell and might not have access to your $PATH.
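A quick way to confirm the script can be run directly by cron (a sketch; the exact interpreter line depends on how dbbackup was written):
head -1 /home/<user>/bin/dbbackup    # should print a shebang such as #!/bin/bash
chmod +x /home/<user>/bin/dbbackup   # ensure the execute bit is set so cron can run it without a leading bash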
I am using Linux CentOS to schedule a job.
I have created a shell script file called Im_daily_loads.sh to run the job at 12:42 PM every day,
with the following commands:
#!/bin/sh
42 12 * * * cd $pdi; ./kitchen.sh -file="/opt/kff/software/pdi/5.0.1.A/data-integration/projects/IML/code/stg/IML_Load_Frm_SRC_To_PSA.kjb" -level=Basic > -logfile="/opt/kff/software/pdi/5.0.1.A/data-integration/projects/IML/log/iml_daily_loads.err.log"
Then I loaded the file into crontab by issuing the command crontab Im_daily_loads.sh, but my job is not running.
What would be the problem?
Why not just use
crontab -e
as the user you plan to execute the job as, enter the job, save and exit the editor?
Also, it looks like you need to define $pdi in your script; otherwise, how is cron supposed to know where kitchen.sh is located?
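A sketch of how this could be split, assuming $pdi should point at the data-integration directory that already appears in your own paths (adjust if your layout differs). Im_daily_loads.sh becomes a plain shell script with no schedule line in it:
#!/bin/sh
# Assumed PDI location, taken from the paths in the question; adjust as needed
pdi=/opt/kff/software/pdi/5.0.1.A/data-integration
cd "$pdi" && ./kitchen.sh -file="$pdi/projects/IML/code/stg/IML_Load_Frm_SRC_To_PSA.kjb" -level=Basic -logfile="$pdi/projects/IML/log/iml_daily_loads.err.log"
The schedule then goes into the crontab itself (entered via crontab -e), pointing at the script by its full path:
42 12 * * * /path/to/Im_daily_loads.sh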
First, run a very simple job to be sure crontab works at all.
For example:
set > /tmp/crontab_works.log 2>&1
It will write out all the environment variables, so you will see that not all of the variables available in your shell are available under cron.
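For instance, as a one-minute test entry in the crontab (the log file name is just an illustrative choice):
* * * * * set > /tmp/crontab_works.log 2>&1
Compare its contents with the output of set in your interactive shell to see which variables cron is missing.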