How to sync two folders between two Linux VMs?

I am a newbie on Linux. I have folder1 on VM1 and folder2 on VM2. How can I configure things so that as soon as I edit folder1, folder2 changes too?
Thanks.

You could consider rsync (https://linux.die.net/man/1/rsync).
However, I am using scp ("secure copy") with my private key, like so:
scp -r -i /home/private_key_file what_to_copy.txt /var/projects/some_folder root@123.123.123.123:/var/projects/where_to_copy_to >> log_file_with_results.log 2>&1
So my VM2 is protected with a private key (/home/private_key_file) and I log in as user "root". Hope this helps; in my opinion that is the most secure way.
I would then run that command from a crontab every minute. I don't know (yet) how to do instant sync, so I hope one-minute increments are enough for you.
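For example, the same command can be scheduled like this (a sketch; the key file, paths, and log file are the ones from the command above):
# in the crontab (crontab -e): run the copy every minute and append the output to the log
* * * * * scp -r -i /home/private_key_file what_to_copy.txt /var/projects/some_folder root@123.123.123.123:/var/projects/where_to_copy_to >> log_file_with_results.log 2>&1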

You can mount the same folder on the host machine as drives/folders on the VM guests. This means that a write in VM1 writes into the folder on the host and will also show up on VM2.
How to do this depends on the tool that you use for virtualisation.
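For example, with VirtualBox this can be done with a shared folder (a sketch; the share name "projects" and the paths are placeholders, and the Guest Additions must be installed in both guests):
# on the host: expose the same host folder to both guests
VBoxManage sharedfolder add "VM1" --name projects --hostpath /host/path/to/folder
VBoxManage sharedfolder add "VM2" --name projects --hostpath /host/path/to/folder
# inside each guest: mount the share
sudo mount -t vboxsf projects /mnt/projects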

You can try rsync + crontab
rsync /path/to/folder1 username@host:/path/to/folder2
You can schedule this task in crontab with a short interval.
Don't forget to put the respective SSH public keys into your VMs' ~/.ssh/authorized_keys.
It works easily for me.
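A sketch of both pieces (the user, host, paths, and the five-minute interval are placeholders):
# one-time key setup so rsync over ssh works without a password prompt
ssh-copy-id username@host
# crontab entry: sync every 5 minutes
*/5 * * * * rsync -az /path/to/folder1/ username@host:/path/to/folder2/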

Beyond rsync, as the other answers have mentioned, you can consider mounting the directory on the other VMs using the Network File System (NFS). The following article documents the steps for installing and configuring NFS quite well.
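A rough sketch of what that involves (hypothetical paths and subnet; installing the NFS server/client packages for your distro is omitted):
# on the VM that holds the folder (the NFS server), add a line like this to /etc/exports:
#   /path/to/folder1  192.168.1.0/24(rw,sync,no_subtree_check)
# then reload the export table:
sudo exportfs -ra
# on the other VM (the NFS client), mount the exported folder:
sudo mount -t nfs 192.168.1.10:/path/to/folder1 /path/to/folder2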
If you are new to Linux, I would advise for you to snapshot the VMs before making any changes so that you will be able to revert to an earlier snapshot in the event that something breaks.

Related

How to transfer files from remote to remote via ftp?

I want to transfer files between two servers; the total size is approximately 170 GB.
One server runs the DirectAdmin control panel, and the other runs cPanel.
I have FTP and SSH access on both servers. I know about the scp command over SSH, but I tried it and didn't succeed, so I would prefer to use FTP commands. There were connection or other errors over SSH, the transfer kept stopping, and I couldn't resume it by skipping already uploaded files. So what should I do?
You can use rsync; it will continue where it stopped.
Go to one of the servers and do:
rsync -avz other.server.com:/path/to/directory /where/to/save
You can omit the z option if the data is not compressible.
This assumes that the user name on both servers is the same.
If not, you will need to add -e 'ssh -l login_name' to the above command.
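If the connection drops mid-transfer, adding --partial (and optionally --progress) keeps partially transferred files so the next run can pick up where it left off, for example:
rsync -avz --partial --progress other.server.com:/path/to/directory /where/to/save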

How to connect to multiple servers to run the same query?

I have 4 servers with log files in the same pattern. For every search/query I need to log in to each server one by one and execute the command.
Is it possible to provide some command so that it will log in to all those servers one by one automatically and fetch the output from each server?
What configuration, settings, etc. do I have to set up to make this work?
I am new to the Linux domain.
As suggested in your question comments, there are a number of tools to help you in performing a task on multiple machines. I will add to this list and suggest Ansible. It is designed to perform all of the interactions over ssh, in quite a simple manner, and with very little configuration.
https://github.com/ansible/ansible
If you were to have server-1 and server-2 defined in your ~/.ssh/config file, then the ansible configuration would be as simple as
[myservers]
server-1
server-2
Then to run a command on the group
$ ansible myservers -a uptime
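To run the kind of log query described in the question, the shell module can take a full command line (the grep pattern and log path here are just placeholders):
$ ansible myservers -m shell -a "grep 'intrusion attempt' /var/log/firewall.log"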
If your servers are called eenie, meanie, minie, and moe, you simply do
for server in eenie meanie minie moe; do
    ssh "$server" "grep 'intrusion attempt' /var/log/firewall.log"
done
The grep command won't reveal which server a match came from; you could replace it with something like ssh "$server" "sed -n '/intrusion attempt/s/^/$server: /p' /var/log/firewall.log"
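A simpler variant (just a suggestion) is to add the prefix locally instead of on the remote side:
for server in eenie meanie minie moe; do
    ssh "$server" "grep 'intrusion attempt' /var/log/firewall.log" | sed "s/^/$server: /"
done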
Use https://sealion.com. You just have to execute one script and it will install the agent on your servers and start collecting output. It has a convenient web interface to see the output across all your servers.

Shell: Get The Data From Remote Host And Execute Some Other Commands

I need to create a shell script to do this:
ssh to another remote host
use sqlplus on that host and the spool command to get the data from the Oracle DB into a file
transfer the file from that host to my host
execute another shell script to process the data file
I have finished the shell script for the 4th step. Right now I have to do these 4 steps one by one. I want to create one script that does them all. Is that possible? How do I transfer the data from that host to my host?
I think maybe the db file is not necessary.
Note: I have to ssh to another host to use sqlplus. It is the only host that has permission to access the database.
# steps 1 and 2
ssh remote_user@remote_host 'sqlplus db_user/db_pass@db @sql_script_that_spools'
# step 3
scp remote_user@remote_host:/path/to/spool_file local_file
# step 4
process local_file
Or
# steps 1, 2 and 3
ssh remote_user@remote_host 'sqlplus db_user/db_pass@db @sql_script_no_spool' > local_file
# step 4
process local_file
Or, all in one:
ssh remote_user@remote_host 'sqlplus db_user/db_pass@db @sql_script_no_spool' |
process_stdin
Well Glenn pretty much summed it all up.
In order to make your life easier, you may want to also consider setting up passwordless ssh. There is a slightly higher security risk associated with this, but in many cases the risk is negligible.
Here is a link to a good tutorial. It is a Debian-based tutorial, but the commands given should work the same on most major Linux distros.
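A minimal sketch of that setup (the user and host names follow the example above; the key type is just a common default):
ssh-keygen -t rsa                      # generate a key pair if you don't already have one; leave the passphrase empty for fully passwordless use
ssh-copy-id remote_user@remote_host    # append your public key to ~/.ssh/authorized_keys on the remote host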

Alternative to scp, transferring files between linux machines by opening parallel connections

Is there an alternative to scp for transferring a large file from one machine to another by opening parallel connections, and which is also able to pause and resume the download?
Please don't transfer this to serverfault.com. I am not a system administrator. I am a developer trying to transfer past database dumps between backup hosts and servers.
Thank you
You could try using split(1) to break the file apart and then scp the pieces in parallel. The file could then be combined into a single file on the destination machine with 'cat'.
# on local host
split -b 1M large.file large.file.   # split into 1 MiB chunks
for f in large.file.*; do scp "$f" remote_host: & done
wait   # wait for all the background copies to finish
# on remote host
cat large.file.* > large.file
Take a look at rsync to see if it will meet your needs.
The correct placement of questions is not based on your role, but on the type of question. Since this one is not strictly programming related it is likely that it will be migrated.
Similar to Mike K's answer, check out https://code.google.com/p/scp-tsunami/ - it handles splitting the file, starting several scp processes to copy the parts, and then joining them again. It can also copy to multiple hosts:
./scpTsunami.py -v -s -t 9 -b 10m -u dan bigfile.tar.gz /tmp -l remote.host
That splits the file into 10MB chunks and copies them using 9 scp processes...
The program you are after is lftp. It supports sftp and parallel transfers using its pget command. It is available under Ubuntu (sudo apt-get install lftp) and you can read a review of it here:
http://www.cyberciti.biz/tips/linux-unix-download-accelerator.html
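A minimal example of a parallel, resumable download with pget (the user, host, path, and connection count are placeholders):
lftp -e 'pget -n 8 -c /path/to/large.file; quit' sftp://user@remote.host
Here -n 8 opens eight connections and -c continues a previously interrupted download.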

Automated website folder backup system needed? Any recommendations?

Hi guys, is there any backup software that can take periodic backups of online website folders and store them offline on a local system? I need something robust, and it would be nice if there is something free that can do the job :)
Thanks for the links. I have FTP access and it is my website; it is a document-sharing site with user uploads, and I would like to maintain a periodic backup of the uploaded files. I just want to automate this process. My local system is Windows-based, though.
If you are referring to a website that will be accessed by you from your browser (rather than as the administrator of the site), you should check out wget. And if you need to use wget from a Windows system, check out Cygwin.
If you have access to the webserver, a cronjob which emails or ftps out the archive would do the job.
If you don't have shell access at the site, you can use wget:
#!/bin/bash
export BCKDIR=`date -u +"%Y%m%dT%H%M%SZ"`
wget -m -np -P $BCKDIR http://www.example.com/path/to/dir
wget options:
-m - Mirror everything, follow links
-np - Don't access parent directories (avoids downloading the whole site)
-P - Store files below $BCKDIR
If you have shell access, you can use rsync. One way to do it, is to have this loop running in a screen(1) session with automatic login using ssh-agent:
#!/bin/bash
while :; do
export BCKDIR=`date -u +"%Y%m%dT%H%M%SZ"`
rsync -az user@hostname:/path/to/dir $BCKDIR
sleep 86400 # Sleep 24 hours
done
Not sure what OS you're using, but this should run fine under *NIX. And for MS Windows, there's Cygwin.
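If you would rather not keep a screen session around, the same daily sync can be scheduled from cron instead (a sketch; it assumes key-based login that works non-interactively, and note that % must be escaped in crontab entries):
0 3 * * * rsync -az user@hostname:/path/to/dir /path/to/backups/$(date -u +\%Y\%m\%dT\%H\%M\%SZ)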
