Automated website folder backup system needed? Any recommendations?

Hi guys, is there any backup software that can take periodic backups of online website folders and store them offline on a local system? I need something robust, and it would be nice if there's something free that can do the job :)
Thanks for the links. I have FTP access and it's my own website; it's a document-sharing site with user uploads, and I would like to maintain a periodic backup of the uploaded files on my local system. I just want to automate this process. My local system is Windows-based, though.

If you are referring to a website that you will access from your browser (rather than as the administrator of the site), you should check out wget. And if you need to use wget from a Windows system, check out Cygwin.

If you have access to the web server, a cron job which emails or FTPs out the archive would do the job.
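For example, a minimal sketch of such a cron job (using scp here rather than email or FTP, with placeholder paths and hostnames) that archives the uploads directory nightly and copies it off-site:
# crontab entry on the web server: archive the uploads directory at 02:00 and copy it to a backup host
# (% must be escaped as \% inside crontab entries)
0 2 * * * tar czf /tmp/uploads-$(date +\%Y\%m\%d).tar.gz /var/www/site/uploads && scp /tmp/uploads-$(date +\%Y\%m\%d).tar.gz backup@backup.host:/backups/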

If you don't have shell access at the site, you can use wget:
#!/bin/bash
BCKDIR=$(date -u +"%Y%m%dT%H%M%SZ")
wget -m -np -P "$BCKDIR" http://www.example.com/path/to/dir
wget options:
-m - Mirror everything, follow links
-np - Don't access parent directories (avoids downloading the whole site)
-P - Store files below $BCKDIR
If you have shell access, you can use rsync. One way to do it is to have this loop running in a screen(1) session, with automatic login handled by ssh-agent:
#!/bin/bash
while :; do
BCKDIR=$(date -u +"%Y%m%dT%H%M%SZ")
rsync -az user@hostname:/path/to/dir "$BCKDIR"
sleep 86400 # Sleep 24 hours
done
Not sure what OS you're using, but this should run fine under *NIX. And for MS Windows, there's Cygwin.

Related

How to sync two folders between two Linux VMs?

I am a newbie on Linux. I have folder1 on VM1 and folder2 on VM2. How can I configure things so that as soon as I edit folder1, folder2 changes too?
Thanks.
You could consider rsync (https://linux.die.net/man/1/rsync).
However I am using scp, which is a "secure copy" over SSH, with my private key like so:
scp -r -i /home/private_key_file what_to_copy.txt /var/projects/some_folder root@123.123.123.123:/var/projects/where_to_copy_to >> log_file_with_results.log 2>&1
So my VM2 is protected with a private key (/home/private_key_file) and I'm using the user "root" to log in. Hope this helps; this would be the most secure way in my opinion.
I would then run that command from a crontab every minute. I don't know how to do instant sync (yet); I hope one-minute increments are enough for you?
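A minimal sketch of such a crontab entry, reusing the hypothetical key path and hosts from the command above:
# copy the folder every minute and append the output to a log
* * * * * scp -r -i /home/private_key_file /var/projects/some_folder root@123.123.123.123:/var/projects/where_to_copy_to >> /var/log/sync.log 2>&1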
You can mount the same folder on the host machine as drives/folders on the VM guests. This means that a write in VM1 writes into the folder on the host and will also show up on VM2.
How to do this depends on the tool that you use for virtualisation.
You can try rsync + crontab:
rsync /path/to/folder1 username@host:/path/to/folder2
You can schedule this task in crontab with a short interval.
Don't forget to put the respective SSH public keys into your VMs (~/.ssh), as sketched below.
It works easily for me.
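A minimal sketch of that key setup and schedule, assuming passwordless login from VM1 to VM2 (user and host are placeholders):
# on VM1: generate a key pair if you don't have one, then install the public key on VM2
ssh-keygen -t ed25519
ssh-copy-id username@host
# crontab entry on VM1: push folder1 to VM2 every minute without a password prompt
* * * * * rsync -a /path/to/folder1 username@host:/path/to/folder2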
Beyond rsync, as the other answers have mentioned, you can consider mounting the directory on the other VM using the Network File System (NFS). The following article documents the steps for installing and configuring NFS quite well.
If you are new to Linux, I would advise you to snapshot the VMs before making any changes, so that you will be able to revert to an earlier snapshot in the event that something breaks.
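As a rough sketch of what the NFS setup involves (paths, hostnames, and the network range are hypothetical, and the exact packages depend on your distribution, e.g. nfs-kernel-server/nfs-common on Ubuntu):
# on VM1 (the NFS server): export folder1 to the VM network by adding a line to /etc/exports
/path/to/folder1 192.168.1.0/24(rw,sync,no_subtree_check)
# reload the export table
sudo exportfs -ra
# on VM2 (the client): mount the export where folder2 should live
sudo mount -t nfs vm1.example.com:/path/to/folder1 /path/to/folder2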

How can I automatically/periodically copy a file from an FTP server to a different Ubuntu server, using that Ubuntu server?

I'm trying to get an Ubuntu server to periodically (preferably whenever it gets updated, if possible) copy a file remotely from an FTP server to a directory on the Ubuntu server. I should note I'm not very advanced with this kind of stuff.
Of course I'm not doing this without a tutorial; however, it doesn't cover grabbing the file from an FTP server.
What would be simplest for me is to be able to run:
tail -F ftp://ftp.addr.ess/files/file-i-want.txt | grep --line-buffered ": <" | while read x ; do echo -ne $x | curl -X POST -d @- http://url/hook ; done
What I'm following has that FTP address as a local address. This is a problem, because that command returns this:
tail: cannot open 'ftp://ftp.addr.ess/files/file-i-want.txt' for reading: No such file or directory
I've tried to run:
rsync username@ftp.addr.ess:XX/files/file-i-want.txt /home/ubuntu/destination
however this returns:
ssh: connect to host ftp.addr.ess port XX: Connection refused.
So really, if I can get rsync to run over FTP instead of SSH, I figure I'd be golden. I researched it, though, and I can't figure out how to do this (keep in mind I'm no programmer). I originally thought the error was because I wasn't giving it a password, because I didn't know how. It might be that as well, though.
This, however, brings me to my next issue. If it's possible to make rsync do FTP instead of SSH, how would I make it do that periodically?
What is being updated? The remote file (my guess) or something on your server? If it's the remote file, you're out of luck unless there is a mechanism/process on the remote server that can send you a notification (an email, for example).
I've not used ftp for ages, but have a look at this as a starting point.
A periodic task can be quite easily configured as a cron job.
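As a hedged sketch, wget can fetch a file over plain FTP from a cron job (the address and paths below are the placeholders from the question, and the credentials are hypothetical):
# fetch the file every 5 minutes and overwrite the local copy
*/5 * * * * wget -q -O /home/ubuntu/destination/file-i-want.txt ftp://username:password@ftp.addr.ess/files/file-i-want.txt
Keep in mind that plain FTP sends the password unencrypted; if the remote server also offers SFTP, a tool such as lftp would be a safer choice.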

How to transfer files from remote to remote via ftp?

I want to transfer files between two servers; the total size is approximately 170 GB.
One server runs the DirectAdmin control panel, and the other one runs cPanel.
I have FTP and SSH access on both servers. I know about the scp command over SSH, but I tried it and didn't succeed; there were connection or other errors over SSH, so the transfer kept stopping and I couldn't resume by skipping the already-uploaded files. That's why I'd prefer to use FTP commands. So what should I do?
You can use rsync; it will continue where it stopped.
Go to one of the servers and do:
rsync -avz other.server.com:/path/to/directory /where/to/save
You can omit the z option if the data is not compressible.
This is with assumption that the user name on both servers is the same.
If not you will need to add -e 'ssh -l login_name' to the above command.
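For a large transfer that may be interrupted, a resume-friendly variant might look like this (the --partial flag keeps partially transferred files so a rerun can pick up where it stopped; paths and login name are placeholders):
rsync -avz --partial --progress -e 'ssh -l login_name' other.server.com:/path/to/directory /where/to/save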

efficient way to execute command when instructed

What is the best and most secure way for a client machine to poll a server every 60 seconds for a list of commands to execute? For example, it could download a file (that houses the commands) or query a database and then execute what is in there.
Are there more efficient/secure ways to accomplish the above?
Thanks
If you want to make it into a script:
commands.ssh
echo "This will run on the remote machine."
# Do a backup or something...
Then you can pass this file to the remote machine for execution using:
ssh user@remote -i id_rsa < commands.ssh
I recommend using an SSH key so that you don't have to keep your login information in the commands file.
Note: make sure the permissions for the commands.ssh file are secure!
chmod 600 commands.ssh
You can use SSH connections, which are encrypted. If the commands are predefined, you can rely on a cron job; then you don't need to log in to a terminal again and again to run them.
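A minimal sketch of the polling approach from the question, run from cron every minute over SSH (all paths, users, and hosts are hypothetical):
# fetch the current command list from the server and execute it
* * * * * ssh -i /home/user/.ssh/id_rsa user@server 'cat /srv/commands.sh' | bash
Note that this executes whatever the server sends, so it is only appropriate if you fully trust that server and keep the SSH key protected.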

Alternative to scp, transferring files between linux machines by opening parallel connections

Is there an alternative to scp for transferring a large file from one machine to another by opening parallel connections, and that is also able to pause and resume the download?
Please don't transfer this to serverfault.com. I am not a system administrator. I am a developer trying to transfer past database dumps between backup hosts and servers.
Thank you
You could try using split(1) to break the file apart and then scp the pieces in parallel. The file could then be combined into a single file on the destination machine with 'cat'.
# on local host
split -b 1M large.file large.file. # split into 1MiB chunks
for f in large.file.*; do scp "$f" remote_host: & done
# on remote host
cat large.file.* > large.file
Take a look at rsync to see if it will meet your needs.
The correct placement of questions is not based on your role, but on the type of question. Since this one is not strictly programming related it is likely that it will be migrated.
Similar to Mike K's answer, check out https://code.google.com/p/scp-tsunami/ - it handles splitting the file, starting several scp processes to copy the parts, and then joining them again. It can also copy to multiple hosts:
./scpTsunami.py -v -s -t 9 -b 10m -u dan bigfile.tar.gz /tmp -l remote.host
That splits the file into 10MB chunks and copies them using 9 scp processes...
The program you are after is lftp. It supports SFTP and parallel transfers using its pget command. It is available on Ubuntu (sudo apt-get install lftp) and you can read a review of it here:
http://www.cyberciti.biz/tips/linux-unix-download-accelerator.html
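A minimal sketch of a segmented, resumable download with lftp's pget (host, credentials, and file paths are placeholders):
# download the dump over SFTP in 4 parallel segments; -c continues a previously interrupted transfer
lftp -e 'pget -n 4 -c /backups/db_dump.sql.gz -o /tmp/db_dump.sql.gz; quit' sftp://user@backup.host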
