Can anyone please let me know if there is a way to monitor a directory on a CentOS (Linux) machine on a different server, so that when a new file arrives in that directory I can copy it to my server?
One way would be to have an rsync command running periodically as a cron job. rsync is known for its delta-transfer algorithm, which reduces the amount of data sent over the network by sending only the differences between the source files and the existing files in the destination.
If you want to run the transfer from the remote server to the local server every 10 minutes, add a line like the one below to your crontab. It also does a few other things:
Compresses the data in transit
Transfers over SSH
Logs output
Note: you will have to do an SSH key exchange to avoid the password prompt (see the sketch after the crontab line).
*/10 * * * * /usr/bin/rsync -zrvh -e ssh root@<remoteIP>:<remoteDir> <localDir> >/var/log/rsync_my.log 2>&1
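For the key exchange, a minimal sketch using OpenSSH's ssh-keygen and ssh-copy-id (run once on the local server; the remote host is a placeholder):

# Generate a key pair (accept the default path; leave the passphrase
# empty so the cron job can run unattended).
ssh-keygen -t rsa

# Install the public key in the remote user's authorized_keys.
ssh-copy-id root@<remoteIP>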
I'm using PuTTY on a Windows server to log in to a remote server. I need to monitor some jobs on that remote Linux box, and I need some script or binary that will send me a notification on the Windows server/PC as soon as a job fails on the remote server.
notify-send is not working there. I'm using Red Hat Linux.
You can set up a cron job on your Linux machine which will mail you if any failure occurs, or with any detail you need. For example, I am using this approach to monitor a dump copy process.
This script takes a backup and notifies me by mail after completion.
#!/bin/bash
# Timestamp for the dump file name (one hour ago, to the hour).
date=`date -d '1 hour ago' "+%Y-%m-%d-%H"`
#/usr/bin/svnadmin dump /abc/xyz/ > /home/server1/backup/dump_$date.dump
/usr/bin/svnadmin dump /abc/xyz/ > /root/svn/dump_$date.dump
# Send a notification mail once the dump has finished.
mail -s "SVN DUMP" abc@xyz.com < /opt/body.txt
Here, the body.txt file will contain the mail body.
cron will execute this file as below:
1 1 * * * sh dump.sh
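If you specifically want a mail only when a job fails, a variant of the same idea (a sketch; the job command and address are placeholders):

#!/bin/bash
# Run the job; send a mail only if it exits with a non-zero status.
if ! /usr/bin/svnadmin dump /abc/xyz/ > /root/svn/dump_$(date +%Y-%m-%d-%H).dump; then
    echo "svnadmin dump failed on $(hostname) at $(date)" | mail -s "SVN DUMP FAILED" abc@xyz.com
fi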
This may help.
I want to transfer files between two servers; the total size is approximately 170GB.
One server has the DirectAdmin control panel, and the other has cPanel.
I have FTP and SSH access on both servers. I know about the scp command over SSH, but I tried it and didn't succeed, so I would prefer to use FTP commands: there were connection or other errors over SSH, the transfer kept stopping, and I couldn't resume it by skipping the already uploaded files. What should I do?
You can use rsync; it will continue where it stopped.
Go to one of the servers and do:
rsync -avz other.server.com:/path/to/directory /where/to/save
You can omit the z option if the data is not compressible.
This assumes the user name is the same on both servers; if not, add -e 'ssh -l login_name' to the above command.
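Since the question mentions transfers that stop partway, the --partial flag is worth adding: it keeps partially transferred files so a re-run resumes them instead of starting over (a sketch; host and paths are placeholders):

rsync -avz --partial --progress other.server.com:/path/to/directory /where/to/save

--progress simply prints per-file transfer progress, which is useful to watch on a 170GB job.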
What is the best and most secure way for a terminal to poll a server every 60 seconds for a list of commands to execute? For example, it could download a file (that contains the commands) or query a database, and then execute what is there.
Are there more efficient/secure ways to accomplish the above?
Thanks
If you want to make it into a script:
commands.ssh
echo "This will run on the remote machine."
# Do a backup or something...
Then you can pass this file to the remote machine using:
ssh user@remote -i id_rsa < commands.ssh
I recommend using an SSH key so that you don't have to keep your login information in the commands file.
Note: make sure the permissions for the commands.ssh file are secure!
chmod 600 commands.ssh
You can use SSH connections, which are encrypted. If the commands are predefined you can rely on a cron job; then you don't need to log in to a terminal again and again to run them.
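For the every-60-seconds polling in the question, a minimal sketch of the cron approach (assuming key-based SSH login is already set up; user, host, and paths are placeholders):

# Local crontab: every minute, fetch the command file from the server
# and execute it. Note this fully trusts the server's contents.
* * * * * scp -q user@remote:/path/to/commands.ssh /tmp/commands.ssh && sh /tmp/commands.ssh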
There are 4 files generated each day on a Linux server. The files have to be sent daily via Connect:Direct to another server, which runs Unix.
e.g. ABC_1JUNE.txt, BCD_1JUNE.txt, CDE_1JUNE.txt, DEF_1JUNE.txt
How can this be done in a shell script?
To schedule daily jobs on a UNIX-like system you can usually use cron. Create a script for the job in the /etc/cron.daily directory on the Linux server and the cron daemon will run it automatically. The script should simply contain the commands to be run; in this case it could look something like this:
#!/usr/bin/env bash
source=<local-dir>
destination=<remote-server>:<remote-dir>
suffix=1JUNE.txt
for file in {ABC,BCD,CDE,DEF}_${suffix}; do
scp "$source/$file" "$destination"
done
This assumes there is an SSH daemon running on the remote server that you can connect to with scp. Replace the values for source and destination to match your real server name and file structure. The source here could also be a remote server.
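Since the date is baked into the file names, the suffix could also be derived at run time instead of being hard-coded (a sketch, assuming GNU date, whose %-d and %^B flags give the unpadded day and the upper-cased month name):

# Produces e.g. 1JUNE.txt on June 1st.
suffix=$(date +%-d%^B).txt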
Mount the target server's shared folder in a folder on the source server (or vice versa) and copy the files there using the cp command.
For filesystem mounts:
http://linux.about.com/od/commands/l/blcmdl8_mount.htm
Remote mounting
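As a concrete example, sshfs can mount a remote directory over SSH without any server-side export configuration (a sketch; user, host, and paths are placeholders):

# Mount the remote shared folder locally, copy the files, then unmount.
sshfs user@target:/shared/folder /mnt/target
cp /path/to/files/* /mnt/target/
fusermount -u /mnt/target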
I'm trying to set up MySQL database backups with cron in order to back up the database to my local NAS storage. I would like to store the command(s) in a .sh file on the server and then use cron to execute it.
So far I've managed to get the command that saves the database from the remote server to my NAS (QNAP), which is:
mysqldump \
    --add-drop-table \
    --comments [database_name] \
    -u [database_username] \
    -p[database_password] |
gzip -c |
ssh [nas_user]@[nas_ip_address] \
    "cat > /share/mysqlBackup/backup-`date +%Y-%m-%d_%H-%M-%S`.sql.gz"
The above works fine, but the problems I have are:
I'm not sure how to create the .sh file on the remote server and put the command in it.
This command asks for the password each time you execute it - is there a way to put it in the .sh file so that it can be executed in the background without a prompt, or to define the password in the command?
Examples of how to solve the above would be very welcome.
I believe that expect could be used, but again - I'm not familiar with it and its documentation is a bit confusing for me.
I guess the password is being asked for by the SSH connection, so you can make your SSH connection passwordless.
Passwordless SSH connections are explained in this answer:
https://serverfault.com/questions/241588/how-to-automate-ssh-login-with-password
After this step is done on your remote server, the rest is pretty straightforward: write the command you gave above into an .sh file, and add it to cron to do the backup periodically.
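A minimal sketch of what that could look like (the file name, schedule, and bracketed placeholders are assumptions):

#!/bin/bash
# /root/backup.sh - dump the database, compress it, and stream it to the NAS.
mysqldump --add-drop-table --comments [database_name] \
    -u [database_username] -p[database_password] |
gzip -c |
ssh [nas_user]@[nas_ip_address] \
    "cat > /share/mysqlBackup/backup-$(date +%Y-%m-%d_%H-%M-%S).sql.gz"

And a crontab entry to run it, e.g. nightly at 2 AM:

0 2 * * * sh /root/backup.sh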