There are 4 files generated each day on a Linux server. The files have to be sent daily via Connect:Direct to another server, which runs UNIX.
e.g. ABC_1JUNE.txt, BCD_1JUNE.txt, CDE_1JUNE.txt, DEF_1JUNE.txt
How can I do this in a shell script?
To schedule daily jobs on a UNIX-like system you can usually use cron. Create a script for the job in the /etc/cron.daily directory on the Linux server and the cron daemon will run it automatically. The script simply contains the commands to be run. In this case it could look something like this:
#!/usr/bin/env bash
source=<local-dir>
destination=<remote-server>:<remote-dir>
# Build today's suffix, e.g. 1JUNE.txt (the %-d and %^B formats assume GNU date);
# hard-coding it would break a job that runs every day.
suffix=$(date +%-d%^B).txt
for file in {ABC,BCD,CDE,DEF}_${suffix}; do
    scp "$source/$file" "$destination"
done
This assumes there is an SSH daemon running on the remote server that you can connect to with scp. Replace the values for source and destination to match your real server name and file structure. The source here could also be a remote server.
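The question asks for Connect:Direct rather than scp. If that transport is required, the rough shape would be to submit one copy process per file. The node names, paths, and exact process-language syntax below are assumptions from memory, so verify them against the Sterling Connect:Direct for UNIX documentation:
#!/usr/bin/env bash
# Hedged sketch: submit one Connect:Direct copy process per file.
# REMOTE.NODE and the directories are examples, not real values.
suffix=$(date +%-d%^B)   # e.g. 1JUNE (assumes GNU date)
for f in ABC BCD CDE DEF; do
cat > /tmp/send_$f.cdp <<EOF
sendproc process snode=REMOTE.NODE
step1 copy from (file=/data/out/${f}_${suffix}.txt pnode)
           to (file=/data/in/${f}_${suffix}.txt snode disp=rpl)
pend;
EOF
direct <<< "submit file=/tmp/send_$f.cdp;"
done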
Mount the target server's shared folder on the source server (or vice versa) and copy the files there using the cp command.
For filesystem mounts, see:
http://linux.about.com/od/commands/l/blcmdl8_mount.htm
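For example, assuming the remote server exports a directory over NFS (the export name and paths here are hypothetical):
# mount the remote NFS export locally, then copy with plain cp
sudo mkdir -p /mnt/remote
sudo mount -t nfs remote-server:/export/incoming /mnt/remote
cp /data/*_1JUNE.txt /mnt/remote/
sudo umount /mnt/remote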
I want to transfer files between two servers; the total size is approximately 170GB.
On one server there is the DirectAdmin control panel, and on the other one is cPanel.
I have FTP & SSH access on both servers. I know about the scp command over SSH, but when I tried it I didn't succeed, so I would prefer to use FTP commands. There were connection and other errors over SSH, so the transfer kept stopping and I couldn't resume it by skipping the already uploaded files. So what should I do?
You can use rsync; it will continue where it stopped.
Go to one of the servers and do:
rsync -avz other.server.com:/path/to/directory /where/to/save
You can omit the z option if the data is not compressible. For very large files, adding --partial (or -P, which also shows progress) lets rsync resume a partially transferred file instead of restarting it.
This is with the assumption that the user name on both servers is the same.
If not, you will need to add -e 'ssh -l login_name' to the above command.
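Putting those pieces together, a sketch with a different remote user name (login_name is a placeholder):
rsync -avzP -e 'ssh -l login_name' other.server.com:/path/to/directory /where/to/save
Here -P is shorthand for --partial --progress, so interrupted files are kept and resumed rather than transferred again from scratch.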
I'm wondering if it is possible to write a shell script like this and, if so, whether there is any reference/sample implementation I can refer to. Thanks.
Step 1: scp a local file from the local box to a remote box ABC. This may need a password as input, or could we possibly hard-code the password into the script?
Step 2: remotely execute a script on box ABC that uses the file uploaded in Step 1.
Step 3: the output of Step 2 (which is on the console/stdout) is redirected to the local box.
I tried this:
scp ~/Downloads/data/1.dat root@host:/root/data /root/exercise/test /root/data/1.dat
I understand that you want to copy a file to a remote machine, run a command there with that file as an argument and get the output on your local machine. Apparently, you need the test program, which is on the remote machine.
Your try takes you halfway there. You could do it as follows:
scp ~/Downloads/data/1.dat root@host:/root/data
ssh root@host '/root/exercise/test /root/data/1.dat'
The first command copies your file, the second runs the command on the remote machine. Depending on the test command, you can now get an output file back to your local machine:
scp root@host:/root/results/outputfile .
Or, if the command writes to standard out, you could redirect the output to a file on the remote machine by appending > /root/results/outputfile inside the quoted remote command, and then scp it back to your local machine. The redirection has to be inside the quotes; otherwise your local shell performs it and the file ends up on the local machine.
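Since ssh delivers the remote command's standard output back to your local terminal, you can also capture it locally without any remote output file. A minimal sketch using the paths from the question (local_output.txt is just an example name):
# copy the input file up, run the remote program, capture its stdout locally
scp ~/Downloads/data/1.dat root@host:/root/data/
ssh root@host '/root/exercise/test /root/data/1.dat' > local_output.txt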
You can execute commands using ssh, for example:
$ ssh user@host ls -la
will connect to host as user and, after successful authentication, execute the ls -la command, presenting the output locally. After the command finishes, the connection is closed.
I have a web application that generates a shell script with commands to run on a server.
I also have another Linux server where the script should be executed.
Could someone point me in the right direction on how to send a shell script from one server to another Linux server and execute it on the second server?
You can use scp to transfer the file over:
scp <source_file> <destination>
If your destination is the host in question:
scp myfile.sh username@x.x.x.x:/path/to/new/script.sh
For executing on the server, you have various options. You can use a cron job to execute it periodically. You can use rc.local to execute at startup. You can use ssh.
Let's take SSH as an example:
ssh username@x.x.x.x 'sh /path/to/new/script.sh'
The above ssh command will run the uploaded script.sh on the server.
For Linux machines the easiest way is
ssh root@MachineB 'bash -s' < local_script.sh
as explained in Jason R. Coombs's answer. Here bash -s reads the script from standard input, so nothing needs to be copied to the remote disk first.
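If the script takes arguments, they can go after -s and become the positional parameters inside the script; the values below are just examples:
# $1=/tmp/workdir and $2=verbose inside local_script.sh
ssh root@MachineB 'bash -s -- /tmp/workdir verbose' < local_script.sh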
Can anyone please let me know if there is a way to monitor a directory on a CentOS (Linux) server that is on a different machine, so that when a new file arrives in that directory I can copy that file to my server?
One way would be to run the rsync command periodically from a cron job. rsync is famous for its delta-transfer algorithm, which reduces the amount of data sent over the network by sending only the differences between the source files and the existing files in the destination.
If you want to run the transfer from the remote server to the local server every 10 minutes, add a line like the one below to your crontab. It also does a few other things:
Compresses the data in transit
Transfers over ssh
Logs the output
Note: You will have to do ssh key exchange to avoid the password prompt.
*/10 * * * * /usr/bin/rsync -zrvh -e ssh root@<remoteIP>:<remoteDir> <localDir> >/var/log/rsync_my.log 2>&1
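For the key exchange mentioned in the note above, a typical one-time setup looks like this; an unattended cron job needs a key without a passphrase:
ssh-keygen -t ed25519            # generate a key pair; leave the passphrase empty for cron use
ssh-copy-id root@<remoteIP>      # append your public key to the remote authorized_keys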
I'm trying to set up MySQL database backups with cron in order to back up the database to my local NAS storage. I would like to store the command(s) in a .sh file on the server and then use cron to execute it.
Up to now I've managed to get the command that saves the database from the remote server to my NAS (QNAP), which is:
mysqldump \
    --add-drop-table \
    --comments \
    -u [database_username] \
    -p[database_password] \
    [database_name] |
gzip -c |
ssh [nas_user]@[nas_ip_address] \
    "cat > /share/mysqlBackup/backup-`date +%Y-%m-%d_%H-%M-%S`.sql.gz"
The above works fine, but the problems I have are:
I'm not sure how to create the .sh file on the remote server and put the command in it.
This command asks for the password each time you execute it. Is there a way to put it in the .sh file so that it can be executed in the background without prompting for it, or to define the password in the command?
Examples of how to solve the above would be very welcome.
I believe that expect could be used, but again, I'm not familiar with it and its documentation is a bit confusing for me.
I guess the password is being asked for by the SSH connection, so you can make your SSH connection passwordless.
Here in the answer passwordless ssh connection is explained:
https://serverfault.com/questions/241588/how-to-automate-ssh-login-with-password
After this step is done on your remote server, the rest is pretty straightforward: write the command you gave above into a .sh file, and add it to cron to do this backing up periodically.
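Putting it together, a minimal sketch; the script path and schedule are examples:
#!/usr/bin/env bash
# /root/bin/mysql_backup.sh (example path): dump, compress and stream to the NAS.
# Assumes passwordless SSH to the NAS is already set up, as described above.
mysqldump --add-drop-table --comments -u [database_username] -p[database_password] [database_name] |
gzip -c |
ssh [nas_user]@[nas_ip_address] "cat > /share/mysqlBackup/backup-`date +%Y-%m-%d_%H-%M-%S`.sql.gz"
Make it executable with chmod +x /root/bin/mysql_backup.sh, then add a crontab entry (crontab -e), for example a nightly run at 02:30:
30 2 * * * /root/bin/mysql_backup.sh >/var/log/mysql_backup.log 2>&1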