Linux script backing up data and copying files to another server

I want to write a script on CentOS that backs up data and copies files to another server every day.
I want to set up a script that dumps the DB and then, once that is done, copies the dump file to another server.
As I understand it, I need to put those commands in a file and then add it to crontab.
Where I'm stuck is how to write that file, as I'm not familiar with Linux server commands. Would it be something like the script below? What should I fix?
#!/bin/sh
backupscript -r ~/path/to/db ~/path/to/backup
sshpass -f "/path/to/passwordfile" scp -r /some/local/path user@example.com:/some/remote/path
But how will scp know to run only after backupscript has finished?

You can use scripts like the ones below to back up and scp. I hope this will help you create your script.
DB backup script
#!/bin/bash
mysqldump -uroot -p'root@123' dbname > /opt/db_dumps/dbname.sql
SCP script
#!/bin/bash
scp /opt/db_dumps/dbname.sql root@10.200.172.46:/opt/db_dumps/
# then restore the dump into MySQL on the remote server:
ssh root@10.200.172.46 "mysql -uroot -p'root@123' dbname < /opt/db_dumps/dbname.sql"
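To answer the original question about ordering: commands in a shell script run one after another, so the copy starts only once the dump command has returned. A minimal combined sketch, reusing the credentials and paths above (the sshpass password file is from the question; adjust names to taste):
#!/bin/bash
# The dump runs first; scp starts only after mysqldump has exited.
# && additionally skips the copy if the dump failed.
mysqldump -uroot -p'root@123' dbname > /opt/db_dumps/dbname.sql \
  && sshpass -f /path/to/passwordfile \
       scp /opt/db_dumps/dbname.sql user@example.com:/some/remote/path/
To run it every day, add a line to crontab (via crontab -e), e.g. daily at 02:00:
0 2 * * * /path/to/backup.sh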

Related

How to use linux shell command to upload entire directory to ftp server?

I'm writing a continuous integration script for Android. Once the build task finishes, I want to use a shell script to upload the entire output directory to an FTP server.
I wonder whether there is a shell command like ftp or curl that can do this. I searched a lot on Google but found nothing. Can anyone do me a favor? Thanks a lot.
Looks like a job for lftp like this:
lftp -u login,passwd -e "mirror --reverse /my/from/dir/ /ftp/target/dir/" <ftp.server>
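If lftp isn't available, curl can also upload over FTP, though only one file per invocation, so uploading a whole directory means looping over it. A rough sketch (the server name, credentials, and paths are placeholders, not from the question):
#!/bin/bash
# Upload every file under the build output directory to an FTP server.
# --ftp-create-dirs tells curl to create missing remote directories.
cd /path/to/build/output || exit 1
find . -type f | while read -r f; do
    curl -T "$f" --ftp-create-dirs \
        "ftp://login:passwd@ftp.example.com/ftp/target/dir/${f#./}"
done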

Execute shell script in remote machine using ssh command with config file

I want to execute a shell script on a remote machine, and I achieved this using the command below:
ssh user@remote_machine "bash -s" < /usr/test.sh
The shell script executed properly on the remote machine. Now I have made some changes in the script to get some values from a config file. The script contains the lines below:
#!/bin/bash
source /usr/property.config
echo "testName"
property.config:
testName=xxx
testPwd=yyy
Now if I run the shell script on the remote machine, I get a "no such file" error, since /usr/property.config is not available on the remote machine.
How do I pass the config file along with the shell script to be executed on the remote machine?
The only way you can reference the config file you created and still run your script is to put the config file at the required path on the remote machine. There are two ways to do it:
If the config is almost always fixed and you don't need to change it, create the config locally on the machine where the script will run, put the absolute path to the config file in your script, and make sure the user running the script has permission to access it.
If you need to ship your config file every time you run the script, simply scp the file over before you call the script:
scp property.config user@remote_machine:/usr/property.config
ssh user@remote_machine "bash -s" < /usr/test.sh
Edit
As per the request, if you want to force it all into a single line, this is how it can be done:
property.config
testName=xxx
testPwd=yyy
test.sh
#!/bin/bash
# do not source /usr/property.config here; it arrives on stdin (see below)
echo "$testName"
Now you can run your command as John has suggested:
ssh user@remote_machine "bash -s" < <(cat /usr/property.config /usr/test.sh)
Try this:
ssh user@remote_machine "bash -s" < <(cat /usr/property.config /usr/test.sh)
Then your script should not source the config internally.
Second option, if all you need to pass are environment variables:
There are a few techniques described here: https://superuser.com/questions/48783/how-can-i-pass-an-environment-variable-through-an-ssh-command
My favorite one is perhaps the simplest:
ssh user@remote_machine VAR1=val1 VAR2=val2 bash -s < /usr/test.sh
This of course means you'll need to build up the environment variable assignments from your local config file, but hopefully that's straightforward.
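A minimal sketch of that build-up step, assuming the config only contains simple key=value lines with no spaces or quoting (as property.config above does):
#!/bin/bash
# Join the key=value lines of the config into one line of VAR=val words
# (xargs with no command just echoes its input joined by spaces), then
# prefix the remote bash invocation with them.
vars=$(grep -v '^#' /usr/property.config | xargs)
ssh user@remote_machine "$vars bash -s" < /usr/test.sh
Values containing spaces or shell metacharacters would need proper quoting before being passed this way.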

linux or windows shell script to upload a file and copy it to another server

I am struggling with a problem. I have:
ServerA (closer to me and much faster)
ServerB (my website, where I want the final file to end up)
What I want the shell script (either Windows batch or Linux; I have cygwin installed) to do, passing the filename as a parameter, is:
1) upload a file to ServerA with FTP
2) log in to ServerB with ssh and wget the file from ServerA
I managed to do step 1 with a shell script, but I don't understand how to do step 2 in the shell.
Thanks
I would recommend using scp to accomplish step 2. You can use the syntax:
scp path/to/file serverb@hostname:/path/to/destination
You can read more about the syntax for scp here: http://www.hypexr.org/linux_scp_help.php
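For completeness, a rough end-to-end sketch of both steps; servera.example.com, serverb.example.com, the FTP credentials, and the remote paths are all placeholders:
#!/bin/bash
# Usage: ./transfer.sh <filename>
file="$1"
name=$(basename "$file")
# Step 1: upload the file to ServerA over FTP (curl works from cygwin too).
curl -T "$file" "ftp://login:passwd@servera.example.com/incoming/"
# Step 2: log in to ServerB and pull the file from ServerA with wget.
ssh user@serverb.example.com \
    "wget -O /var/www/$name ftp://login:passwd@servera.example.com/incoming/$name"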
You could use TeraTerm which has a powerful scripting language to automate both tasks.

sh script with scp variables

I am trying to write a simple script which will figure out the latest version of a file from its filename and then download that file to the local computer.
What I can't figure out is why my code works when typed in the shell but not when I run it as a script. I am also running the script on cygwin; not sure if that makes a difference.
Here is the script:
#!/bin/sh
x=$(ssh user@hostname 'ls -r -t /vgf/day1*.gif | tail -1')
echo $x
scp user@hostname:"${x}" /images/day1.gif
x is correctly assigned, but when I get to the scp command I receive something along the lines of
: No such file or directoryif
However, if I run the scp command in the shell, it works:
$ sh download.sh
/vgf/day1.gif
: No such file or directoryif
$ scp user@hostname:"${x}" /images/day1.gif
day1.gif 100% 22KB 22.1KB/s 00:00
I would be open to different solutions. If I could prevent the version number from increasing via some Linux administration, I might follow that route, although I am still wondering what the problem is here.
By version, I mean day1_001.gif, with the next version becoming day1_002.gif, and so on. So when a new day1.gif is saved, it would overwrite the original without creating another version.
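One plausible culprit, given the cygwin environment and the mangled error message, is a stray carriage return in $x: if the script was saved with Windows (CRLF) line endings, or the captured ls output contains a \r, scp looks for a filename ending in an invisible CR and its error message prints garbled. A sketch of the fix, assuming that diagnosis:
#!/bin/sh
# Strip any trailing carriage return from the captured filename before
# handing it to scp.
x=$(ssh user@hostname 'ls -r -t /vgf/day1*.gif | tail -1' | tr -d '\r')
echo "$x"
scp user@hostname:"${x}" /images/day1.gif
Running dos2unix on the script itself (or saving it with Unix line endings) addresses the same problem on the script-file side.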

shell script, for loop, ssh and alias

I'm trying to do something like this: I need to take backups from 4 blades, and all of them should be stored under /home/backup/esa, which contains 4 directories named after the nodes (sc-1, sc-2, pl-1, pl-2). Each directory should contain the respective node's backup.
But I see that only the data from the node where I execute the command gets copied to all 4 directories. Any idea why this happens? My script is like this:
for node in $(grep "^node" /cluster/etc/cluster.conf | awk '{print $4}');
do echo "Creating backup for node ${node}";
ssh $node source /etc/profile.d/bkUp.sh;
asBackup -b /home/backup/esa/${node};
done
Your problem is this piece of code:
ssh $node source /etc/profile.d/bkUp.sh;
asBackup -b /home/backup/esa/${node};
It does:
Create a remote shell on $node
Execute the command source /etc/profile.d/bkUp.sh in the remote shell
Close the remote shell and forget about anything done in that shell!!
Run asBackup on the local host.
This is not what you want. Change it to:
ssh "$node" "source /etc/profile.d/bkUp.sh; asBackup -b '/home/backup/esa/${node}'"
This does:
Create a remote shell on $node
Execute the command(s) source /etc/profile.d/bkUp.sh; asBackup -b '/home/backup/esa/${node}' on the remote host
Make sure that /home/backup/esa/${node} is an NFS mount (otherwise, the files will only be backed up in a directory on the remote host).
Note that /etc/profile is a very bad place for backup scripts (or their config). Consider moving the setup/config to /home/backup/esa which is (or should be) shared between all nodes of the cluster, so changing it in one place updates it everywhere at once.
Also note the usage of quotes: The single and double quotes make sure that spaces in the variable node won't cause unexpected problems. Sure, it's very unlikely that there will be spaces in "$node" but if there are, the error message will mislead you.
So always quote properly.
The formatting of your question is a bit confusing, but it looks as if you have a quoting problem. If you do
ssh $node source /etc/profile.d/bkUp.sh; esaBackup -b /home/backup/esa/${node}
then only the command source is executed on $node. After that command finishes, the remote connection is closed, and with it the shell that contains the result of sourcing /etc/profile.d/bkUp.sh. The esaBackup command then runs on the local machine, and it won't see anything that you set up in bkUp.sh.
What you need to do is put quotes around all the commands you want the remote shell to run -- something like
ssh $node "source /etc/profile.d/bkUp.sh; esaBackup -b /home/backup/esa/${node}"
That will make ssh run the full list of commands on the remote node.
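Putting that fix back into the original loop, the whole script would look something like this (cluster.conf, the backup path, and the backup command name are taken from the question):
#!/bin/bash
# Back up every node listed in cluster.conf. The double quotes make both
# commands run in the same remote shell, so the backup command sees the
# environment set up by bkUp.sh.
for node in $(grep "^node" /cluster/etc/cluster.conf | awk '{print $4}'); do
    echo "Creating backup for node ${node}"
    ssh "$node" "source /etc/profile.d/bkUp.sh; asBackup -b '/home/backup/esa/${node}'"
done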
