Shell script in cron.daily does not work - linux

I want to back up my database daily and automatically, so I made a shell script and put it in the cron.daily folder on Ubuntu 12.
The script is not complicated:
#!/bin/sh
DIR=`date +%m%d%y`           # date stamp for today's backup, e.g. 013124
DEST=/db_backups/$DIR        # one backup directory per day
mkdir $DEST
mongodump -d myapp -o $DEST  # dump the myapp database into it
This script works well when I run it manually, e.g. ./automongobackup.sh; it creates the backup files in the proper location. So I expected that if I put it in cron.daily, the backup would be generated automatically. But when I checked the backup folder today, it was empty, and I realized something was wrong.
Should I set another option? The permissions are 755. I attached some screenshots: the first is the ls -l output of cron.daily and the second is the script. Did I miss something?

Try renaming your script to automongobackup rather than automongobackup.sh: run-parts, which runs the jobs in cron.daily, cron.hourly, etc., skips scripts with full stops/periods in the filename.
Reference: https://askubuntu.com/questions/611336/why-putting-a-script-in-etc-cron-hourly-is-not-working
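If you want to confirm the fix before the next nightly run, run-parts itself can list what it would execute (a quick check; /etc/cron.daily is the usual location on Ubuntu):
# rename so run-parts will pick the script up, then list what would run
sudo mv /etc/cron.daily/automongobackup.sh /etc/cron.daily/automongobackup
run-parts --test /etc/cron.daily
If automongobackup shows up in the output, cron.daily will run it.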

Related

Copy Linux files to another location

We have a Linux server, and for some transactions it keeps the log files only for the last 10 days; after that the files get deleted. I want to copy these files to another location using a script. I searched Google but couldn't find a satisfactory result. I'm also new to Linux.
Can someone please guide me on whether this can be achieved and how?
You can use the previous answer by nissim abehcera in a shell script:
cp -R SOURCE_DIRECTORY DESTINATION_DIRECTORY
Just paste the commands into a text file, name it file.sh, and make sure it is executable:
chmod +x file.sh
You can just run the script and it will do whatever you wrote in there.
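As a rough sketch (the paths below are made up; substitute your real log and destination directories), the whole script could be as small as:
#!/bin/sh
# copy-logs.sh - sketch only; SRC and DEST are placeholder paths
SRC=/var/log/myapp          # where the server keeps the transaction logs
DEST=/backup/myapp-logs     # safe location outside the 10-day cleanup
mkdir -p "$DEST"
cp -R "$SRC"/. "$DEST"/
Make it executable with chmod +x copy-logs.sh and run it (for example from cron) once a day, before the old files are removed.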

Directory structure variations in bash script

I have a shell script, myautoappupgrade.sh, with which I automate an application upgrade process. The script has to be run on a few different servers. Unfortunately, the application is located in a slightly different directory on each server: the number in the parent directory varies between 1 and 20. How can I modify the script so that the directory can be replaced by some sort of variable? I don't want to edit the script for each server, because there are many directory references in the automation script.
example:
cd /ae1/apps/myapp/upgradefiles/
unzip file.zip
./install.sh
the directory slightly changes on another server:
cd /ae2/apps/myapp/upgradefiles/
unzip file.zip
./install.sh
and another..
cd /ae3/apps/myapp/upgradefiles/
unzip file.zip
./install.sh
Try something like this:
#!/bin/bash
num=$1                                    # parent-directory number, passed as the first argument
cd "/ae${num}/apps/myapp/upgradefiles/" || exit 1
unzip file.zip
./install.sh
Then call the script with the number as the first argument:
myautoappupgrade.sh 1
The simple and obvious solution is to not hard-code the directory at all. Modify the script so it accepts the parent directory as an argument, or just cd into the parent directory before running the script.
Perhaps something like this:
while read server dir; do
ssh "$server" "cd '$dir' && unzip apps/myapp/upgradefiles/file.zip/file.zip && ./install.sh"
done <<\:
ernie /ae1
bert /ae2
cookiemonster /home/cmonster/anN
:
It would probably be even better if you unzipped into a temporary directory, but hopefully this should get you moving in the right direction.
Of course, if you can be sure that /ae[0-9] always exists and there is only one match,
cd /ae[0-9]/apps/myapp/upgradefiles/file.zip
would do what you are asking.
(Do you really have a file named file.zip inside a directory also named file.zip? I'm guessing not; you probably want to take file.zip off the end of the cd path.)
By simply using:
cd /ae*/apps/myapp/upgradefiles/
the * wildcard expands to match any characters, so the same command works on every server (as long as only one /ae* directory exists).
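If you want the glob approach with a safety net, a small sketch like this (bash; the file names are taken from your example) fails loudly when zero or several /ae* directories match instead of cd'ing somewhere unexpected:
#!/bin/bash
# resolve the server-specific /aeN prefix once, then run the upgrade there
shopt -s nullglob
dirs=(/ae*/apps/myapp/upgradefiles)
if [ "${#dirs[@]}" -ne 1 ]; then
    echo "expected exactly one upgrade directory, found ${#dirs[@]}" >&2
    exit 1
fi
cd "${dirs[0]}" || exit 1
unzip file.zip
./install.sh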

How to create a shell script

I am trying to create a shell script to remove certain files from a directory. How would I be able to achieve this?
Can I write the standard commands in a script as follows:
#!/bin/sh
rm -f /directory/of/file/file1.txt
rm -f /directory/of/file/file2.txt
rm -f /directory/of/file/file3.txt
rm -f /directory/of/file/file4.txt
Or is there a specific way to delete files in a shell script?
This is my first question here, so please bear with me as I do not know all the rules.
Thanks in advance :)
Edit:
Thanks for all the answers in a short matter of time, I really appreciate it.
Forgot to mention that this will be executed by root's cron (crontab -e) every Tuesday and Friday at 5 PM.
Do I still need to chmod +x the file if root is executing it?
Your question can be split into a few points:
You can use those commands to delete the specific files (if you have the permissions).
Make sure you add execute permission to the shell script file (the one that performs the rm commands) with: chmod +x file_name.sh
If you want to delete a folder's contents rather than the folder itself, the command should be: rm -r /path/to/dir/*
Yes, you can. However, if you don't have permission to delete the files, those statements will produce errors. Handle that error case and you are good to go.
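For the cron part of your edit, a crontab entry along these lines would run the script at 17:00 on Tuesdays and Fridays (the script path here is a placeholder):
# in root's crontab (crontab -e): minute hour day-of-month month day-of-week command
0 17 * * 2,5 /root/scripts/cleanup.sh
If cron invokes the script directly like this, it still needs the execute bit (chmod +x); if you instead write sh /root/scripts/cleanup.sh, the execute bit is not required.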

How do you run bash script as a command?

I have a bash script which I use to configure various parameters in text files on my wireless access media server.
The script is located in one directory, and because I do all of the configuration over PuTTY, I have to either use the full path to the file or change into the directory that contains it. I would like to avoid this.
Is it possible to save or edit the bash script so that I can run it as a command, like the cp or ls commands?
The script needs to be executable, with:
chmod +x scriptname
(or similar).
Also, you want the script to be located in a directory that is in your PATH.
To see your PATH use:
echo $PATH
Your choices are: to move (or link) the file into one of those directories, or to add the directory it is in to your PATH.
You can add a directory to your PATH with:
PATH=$PATH:/name/of/my/directory
and if you do this in the file $HOME/.bashrc it will happen for each of your shells automatically.
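Putting those pieces together, assuming the script lives in a hypothetical ~/scripts directory:
mkdir -p ~/scripts
mv /path/to/scriptname ~/scripts/         # move (or copy) the script there
chmod +x ~/scripts/scriptname
echo 'PATH=$PATH:$HOME/scripts' >> ~/.bashrc
source ~/.bashrc                          # reload for the current session
scriptname                                # now runs from any directory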
You can place a softlink to the script under /usr/local/bin (which should be in $PATH, as John said):
ln -s /path/to/script /usr/local/bin/scriptname
This should do the trick.
You can write a minimal wrapper in your home directory:
#!/bin/bash
exec /yourpath/yourfile.extension
And run it with this command: ./NameOfYourScript
Update: Unix hawks will probably say the first solution is a no-no because of the additional admin work it will load on you. Agreed, but for your requirements my solution works :)
Otherwise, you can use an alias; you will have to amend your .bashrc
alias menu='bash /yourpath/menuScript.sh'
Another way is to run it with:
/bin/bash /path/to/script
Then the file doesn't need to be executable.

tar file not archiving

I am doing the following in a shell script:
tar cvzf mytar.tgz *
It works fine when I run the shell script from a terminal. When the shell script is run from a cron job (crontab), it looks like it has archived something because the tgz file is there, but the file size is nearly nothing and when I untar it there is nothing inside. When I run the shell script from a terminal, the tgz has a larger file size and I can untar it.
Anyone know why it won't work via the cronjob?
Try specifying the complete path to the files you want to archive:
tar cvzf mytar.tgz /path/to/your/files/*
Cron runs the job from a different working directory than the one you use in your terminal session.
What's the working directory of the cronjob process? If there's nothing in it, then the command will archive all of the nothing.
First, there is no need to be verbose (v) in a cron job.
Second, it looks like you are using relative paths there. Consider using absolute paths, even for the tar command itself.
Last, which user is running the cron job? Is there potential for a permissions issue or a quota issue?
The other answers so far give good advice. Cron has a lot of special rules about what is allowed in the command. I have the most success when I make a simple shell script, put it in $HOME/cron, chmod 755 it, and put the full path to it in the crontab, making sure to test the script and cd inside it as necessary. Be aware that cron not only won't necessarily run the command from your home directory, but it will also likely have a different PATH, and other environment settings will be missing.
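A sketch of that approach, with placeholder paths, might look like this (everything absolute, so it does not matter where cron starts the job):
#!/bin/sh
# backup.sh - placeholder paths; adjust SRC and DEST to your own layout
SRC=/home/myuser/data
DEST=/home/myuser/backups
/bin/tar czf "$DEST/mytar-$(date +%F).tgz" -C "$SRC" .
and the crontab line would then reference it by its full path, for example:
0 2 * * * /home/myuser/cron/backup.sh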
