I have written a cron job to dump my MySQL database, but it's not working. Here it is:
/usr/bin/mysqldump -h localhost --user=user --password=mypass --databases dbname | gzip > /home/username/site_dir_name/uploads/backup.sql.gz
I am using cPanel.
Try this
mysqldump -u{user name without space} -p{password without space} Database_name | gzip -9 > /home/backup_$(date +"\%Y.\%m.\%d.\%S.\%N").sql.gz 2> /home/db.log
Add this line to your cron job. Note that each % in the date format must be escaped as \% inside a crontab, since cron treats a bare % as a newline.
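If you're pasting into a raw crontab rather than cPanel's command field, the line also needs the five schedule fields in front. A minimal sketch, assuming a daily 2 a.m. schedule and the paths from the question:

```shell
# Run the dump daily at 02:00; the \% escapes are required because
# cron treats a bare % as a newline.
0 2 * * * /usr/bin/mysqldump -h localhost --user=user --password=mypass --databases dbname | gzip > /home/username/backup_$(date +"\%Y.\%m.\%d").sql.gz 2> /home/username/db.log
```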
I am executing the command below to remotely disable and enable a cron job on a Linux server, but it does not disable the cron job on the remote server, although it works when run on the server itself.
ssh mysql@$pas_ser_name "crontab -l | sed "/^[^#].*$job_name/s/^/#/" | crontab -"
Could anyone help with this? I want to enable and disable a Linux cron job remotely by job name.
crontab -l | grep -v 'wget php -q http://www.example.com/event_reminder.php' | crontab -
crontab -l lists the current crontab entries,
grep -v filters out the matching line, and
crontab - installs the remaining output as the new crontab.
You can use this; I hope it helps.
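The same idea can be wrapped in small reusable filters (the job name below is hypothetical). Each function reads crontab text on stdin and writes the modified text on stdout, so you can pipe it between `crontab -l` and `crontab -`, locally or over ssh:

```shell
# Comment out (disable) every active line that mentions the job name.
disable_job() {
  sed "/^[^#].*$1/s/^/#/"
}

# Strip one leading '#' (re-enable) from commented lines that mention it.
enable_job() {
  sed "/^#.*$1/s/^#//"
}

# Local usage:  crontab -l | disable_job backup.sh | crontab -
# Remote usage: ssh user@host "crontab -l | sed '/^[^#].*backup.sh/s/^/#/' | crontab -"
printf '%s\n' '0 4 * * * /home/user/backup.sh' | disable_job backup.sh
```

Note the single quotes around the sed script in the remote form: they survive the outer double quotes of the ssh command, which is exactly the nesting problem in the question.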
I have a problem with the syntax of a cron job in Plesk 12.5. I used the following syntax in Plesk 12.0 and it ran fine; however, in Plesk 12.5 it won't work.
mysqldump -u user -ppassword database | gzip > /var/www/vhosts/domain.com/backupmysql/backup$( date +"\%Y_\%m_\%d_\%H_\%M" ).sql.gz
Can anybody help me with the correct syntax for Plesk 12.5?
Try using the full path to mysqldump:
/usr/bin/mysqldump -u user -ppassword database | gzip > /var/www/vhosts/domain.com/backupmysql/backup$( date +"\%Y_\%m_\%d_\%H_\%M" ).sql.gz
At least a command like
/usr/bin/mysqldump -uadmin -p$(cat /etc/psa/.psa.shadow) psa | gzip > /var/www/vhosts/domain.com/backup$( date +"\%Y_\%m_\%d_\%H_\%M" ).sql.gz
works like a charm.
date: command not found -: date: command not found
Make sure that /bin/date is installed; it is part of the coreutils package.
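Another common cause of "command not found" under cron is that cron jobs run with a minimal environment, so the crontab's PATH may not include the directories you expect. Setting PATH at the top of the crontab, or using absolute paths everywhere, avoids this. A sketch (the paths are typical; verify yours with `which date`):

```shell
# Crontab sketch: declare PATH once at the top so date, gzip and
# mysqldump all resolve without absolute paths in every job line.
PATH=/usr/local/bin:/usr/bin:/bin

0 2 * * * mysqldump -u user -ppassword database | gzip > /backup/db_$(date +"\%Y_\%m_\%d").sql.gz
```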
I am having an issue that I cannot find any information about, despite an extensive Google search.
I have a Linux cron job, run via crontab, that works great until I try to add a date variable to the name of the file. But when I run the same command from the command line, it works fine. The cron job also works if I take out the date part.
Command line code that works:
sudo mysqldump -h mysql.url.com -u user -pPassword intravet sites | gzip > /mnt/disk2/database_`date '+%m-%d-%Y'`.sql.gz
Cron that works:
15 2 * * * root mysqldump -h mysql.url.com -u user -pPassword intravet sites | gzip > /mnt/disk2/database.sql.gz
Cron that DOESN'T work:
15 2 * * * root mysqldump -h mysql.url.com -u user -pPassword intravet sites | gzip > /mnt/disk2/database_`date '+%m-%d-%Y'`.sql.gz
I don't understand why I cannot use the date command inside a cron job. Everything I find says I can, but in practice I cannot.
Server details:
Ubuntu 12.04.5
Thank you for any insight.
You just need to escape the % signs:
* * * * * touch /tmp/foo_`date '+\%m-\%d-\%Y'`.txt
Result:
[root@linux tmp]# ls -l /tmp/foo_*
-rw-r--r-- 1 root root 0 Apr 18 02:17 /tmp/foo_04-18-2015.txt
Try replacing the backticks with $() and escaping your %s, such as:
15 2 * * * root mysqldump -h mysql.url.com -u user -pPassword intravet sites | gzip > /mnt/disk2/database_$(date '+\%m-\%d-\%Y').sql.gz
I only mention removing the backticks because you will end up having all kinds of escaping problems later in your coding endeavours. Stick with using $() for command substitution.
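A quick illustration of why $() is the better habit: it nests without the backslash gymnastics that backticks require (the variable names here are made up for the demo). The \% escaping, by contrast, only applies inside crontab lines, not inside shell scripts:

```shell
# Command substitution with $() nests cleanly; with backticks the inner
# pair would have to be escaped as \` ... \`.
stamp=$(date '+%m-%d-%Y')               # e.g. 04-18-2015
name="database_$(date '+%Y')_$stamp"    # nested substitution, no escaping
echo "$name"
```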
I am trying to execute CQL commands from a shell script.
I am able to connect to cqlsh (the CQL version I'm using is 1.1.18) but unable to send queries to it.
Any ideas or suggestions on how to proceed?
Do I need to connect to Cassandra and execute a few commands (select/update) with a shell script?
cqlsh -e "select * from ks.table limit 1;" > ~/output
I'm not sure about Cassandra 1.1.18, but you should be able to accomplish this with the -f flag of cqlsh. Let's say you have a file of CQL commands called "commands.cql". I can invoke those commands against my local Cassandra instance like this:
$ cqlsh -f commands.cql -u myusername -p mypassword localhost
If I wanted to invoke that from within a Bash script, the script's code would look something like this:
#!/bin/bash
cqlsh -f commands.cql -u myusername -p mypassword localhost
Save that as an executable file, and run it like any other.
I need to connect to Cassandra and execute a few commands (select/update) with a shell script.
You can execute your commands from a shell script like this:
echo "some QUERY; exit" | cqlsh CASSANDRA_HOST -u 'USER' -p 'PASS'
The "exit" command in the last suggestion is a bit hacky.
I would propose using xargs with cqlsh -e.
echo "some QUERY;" | xargs cqlsh CASSANDRA_HOST -u 'USER' -p 'PASS' -e
I recently had to use this approach when working with Docker, because cqlsh -f was not an option (it was too complex to configure access to the needed file).
But what if your Cassandra instance is on a different server from the one where the shell script runs? (Specifically in StreamSets: wouldn't the above require cqlsh installed on the same server, so that the script has access to the cqlsh binary?)
I'm using the following command
mysqldump -hHOST -uUSERNAME -pPASSWORD DATABASE > file.sql
I would like a separate file created each time the cron job runs, instead of overwriting the existing file.
I'm using a shell script to solve this problem on my server. Here is the code if you want to try it. Create a .sh file, e.g. backupdb.sh:
#!/bin/bash
NOW=$(date +"%m%d%Y")
SQL="backupdb.$NOW.sql"
DBNAME='live'                 # name of the database
DUMP='/usr/bin/mysqldump'
USER='root'                   # MySQL user
HOST='10.0.0.8'               # IP of the database server
$DUMP -h "$HOST" -u "$USER" -pYOURPASSWORD --routines --events "$DBNAME" > "/home/backupdb/$SQL"
gzip -9 "/home/backupdb/$SQL"
echo "BACKUP DATABASES FROM MYSQL MASTER SUCCESS $SQL" | mail -s "BACKUP DATABASES" your@email.com # change this to your own email address
Save the file.
Don't forget to make it executable:
chmod +x backupdb.sh
Then add this line to your crontab:
00 04 * * * sh /home/backupdb/backupdb.sh
Save it, and that's it.
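Since the script above writes a new dated file every run, the backup directory grows without bound. A sketch of a companion cleanup, assuming the /home/backupdb directory from the script and a 30-day retention (both are adjustable):

```shell
#!/bin/bash
# Delete dumps older than a retention window so dated backups
# don't accumulate forever.
prune_backups() {
  dir=$1
  days=$2
  find "$dir" -type f -name 'backupdb.*.sql.gz' -mtime +"$days" -delete
}

# Example: prune_backups /home/backupdb 30
# (could run from cron right after backupdb.sh)
```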
Sorry if my English is not good