I've already made a backup of my database using mysqldump, like this:
mysqldump -h localhost -u dbUsername -p dbDatabase > backup.sql
After that, the file sits in a location on my server outside public access, ready for download.
How can I do something similar for files? I've tried googling it, but I get all kinds of results except what I'm after.
I need to tell the server (which runs Ubuntu) to back up all files inside folder X and put them into a zip file.
Thanks for your help.
You can use tar to create backups. For a full system backup:
tar -cvpzf backup.tar.gz --exclude=/backup.tar.gz /
For a single folder:
tar -cvpzf backup.tar.gz --exclude=/backup.tar.gz /your/folder
The first command creates a gzipped tar file of your whole system. You might need additional excludes like
--exclude=/proc --exclude=/sys --exclude=/dev/pts
If you are outside of the folder you want to back up, the --exclude=/backup.tar.gz isn't needed.
There is more you can do with this, for example sending the archive over the network or splitting it into pieces; a sketch of both follows below.
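As a rough sketch of the "over the network" and "split the archive" variants, assuming a placeholder host name and chunk size:
# stream the archive over ssh to another machine instead of writing it locally
tar -cvpzf - --exclude=/proc --exclude=/sys --exclude=/dev/pts / | ssh user@backup-host "cat > /backups/full-backup.tar.gz"
# split the archive into 1 GB pieces
tar -cvpzf - /your/folder | split -b 1G - backup.tar.gz.part-
# reassemble and extract later
cat backup.tar.gz.part-* | tar -xvpzf -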
I'm trying to download multiple files through SFTP on a Linux server using
sftp -o IdentityFile=key <user>@<server> <<END
get -r folder
exit
END
which downloads all the contents of a folder. It appears that find and grep are not valid commands inside sftp, and neither are for loops.
I need to download only the files whose names contain a given string, e.g.
test_0.txt
test_1.txt
but not file.txt
Do you really need the -r switch? Are there really any subdirectories in the folder? You do not mention that.
If there are no subdirectories, you can use a simple get with a file mask:
cd folder
get *test*
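In the batch/heredoc form from your question, that could look something like this (still assuming the same placeholder key and host):
sftp -o IdentityFile=key <user>@<server> <<END
cd folder
get *test*
exit
END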
Are you required to use sftp? A tool like rsync that operates over ssh has flexible include/exclude options. For example:
rsync -a <user>@<server>:folder/ folder/ \
--include='test_*.txt' --exclude='*.txt'
This requires rsync to be installed on the remote system, but that's very common these days. If rsync isn't available, you could do something similar using tar:
ssh <user>@<server> tar -cf- folder/ | tar -xvf- --wildcards '*/test_*.txt'
This tars up all the files remotely, but then only extracts files matching your target pattern on the receiving side.
I have a question about performing routine backups on our web server. Currently we are running Apache and we want to back up our doc root. I have a shell script that runs nightly, and the command I use is: sudo tar cvzf filename targetFilename.
My question is: is it safe to run a tar cvzf command on a doc root while files are being read, written, and created? Is there a better way to do this? Is it a good idea to shut down Apache while creating the tar file?
I've done some research and I couldn't find a straightforward answer.
Thank you for your help!
It's safe.
The system prevents two processes from writing to the same file at the same time, because the file is locked.
A tar backup only reads the files and creates a new archive file, so that's OK. But shutting down the web server first is still the safest way to back up the files.
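If you do decide to stop Apache for the duration of the nightly run, a minimal sketch could look like this; the doc-root path, service name, and backup destination are assumptions to adapt to your setup:
#!/bin/sh
# Nightly doc-root backup: stop Apache so nothing changes mid-archive (paths and service name assumed)
DEST=/backups/docroot-$(date +%F).tar.gz
sudo systemctl stop apache2
sudo tar -czf "$DEST" /var/www/html
sudo systemctl start apache2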
I am facing a problem taking a backup of my system (Linux) with a shell script. Please help me understand how to take a backup of my system with a shell script.
Thanks in advance
I prefer rsync for backing up my CentOS boxes. It is very good at both file transfer and file synchronization and offers tons of options for compression and such. This is a modified one-line example from my Bash backup script that backs up all of my media onto a temporarily mounted external hard drive:
rsync -avP --stats /media/* /mnt/ntfs/Media 1>/log/stats.info 2>>log/bkup.err
You can substitute /media/* with the directories you want to back up; the simplest would be /*, which backs up everything.
You can also use the --exclude directive to exclude directories.
The other simple method is a plain tar archive to get all the important system files, something like:
tar cvfj /root/sysBkup.bz2 --exclude=/root/sysBkup.bz2 /etc /var /root /sys
Then the next line of the Bash script moves that backup to a remote share. But I would recommend getting familiar with rsync; it's a very handy backup utility.
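As an illustration of the --exclude directive and the "move to a remote share" step, here is a rough sketch; the excluded directories, remote host, and share path are placeholders, not a drop-in line:
# back up the root filesystem, skipping pseudo-filesystems and the mount point of the backup disk
rsync -avP --exclude=/proc --exclude=/sys --exclude=/dev --exclude=/mnt / /mnt/ntfs/Backup
# copy the tar archive from the previous example to a remote share over ssh
scp /root/sysBkup.bz2 backupuser@backup-host:/srv/backups/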
I've tried multiple ways of creating a zip or a tar.gz on the Mac using the GUI or the command line, and when decompressing on the Linux side I've gotten various errors, from things like "File.XML" and "File.xml" both appearing in a directory, to all sorts of others about something being truncated, etc.
Without listing all my experiments on the command line on the Mac and Linux (using tcsh), what would two bullet-proof commands be to:
1) make a zip file of a directory (with no __MACOSX folders)
2) unzip / untar (whatever) the Mac zip on Linux with no errors (and no __MACOSX folders)
IT staff on the Linux side said they "usually use .gz and use gzip and gunzip commands".
Thanks!
After much research and experimentation, I found this works every time:
1) Create a zipped tar file with this command on the Mac in Terminal:
tar -cvzf your_archive_name.tar.gz your_folder_name/
2) When you FTP the file from one server to another, make sure you do so with binary mode turned on
3) Unzip and untar in two steps in your shell on the Linux box (in this case, tcsh):
gunzip your_archive_name.tar.gz
tar -xvf your_archive_name.tar
On my Mac and in ssh bash I use the following simple commands:
Create a compressed tar file (-czf)
tar -czf NAME.tgz FOLDER
Extract the compressed tar file (-xzf)
tar -xzf NAME.tgz
Best, Mike
First off, the File.XML and File.xml cannot both appear in an HFS+ file system. It is possible, but very unusual, for someone to format a case-sensitive HFSX file system that would permit that. Can you really create two such files and see them listed separately?
You can use the -X option with zip to prevent resource forks and extended attributes from being saved. You can also throw in a -x .DS_Store to get rid of those files as well.
For tar, precede it with COPYFILE_DISABLE=true or setenv COPYFILE_DISABLE true, depending on your shell. You can also throw in an --exclude=.DS_Store.
Your "IT Staff" gave you a pretty useless answer, since gzip can only compress one file. gzip has to be used in combination with tar to archive a directory.
I have written a shell script which tries to pull a tar file from an FTP server and untar it locally. I need to extract specific files from the tar archive. The filename of the tar file contains a date; I need to be able to select a tar file based on this date.
abc_myfile_$date.tar is the format of the file I am pulling from the ftp server.
My current code looks like this:
for host in ftpserver
do
ftp -inv $host <<END_SCRIPT
user username password
prompt
cd remotepath
lcd localpath
mget *myfile_$date*.tar
quit
END_SCRIPT
done
for next in `ls localpath/*.tar`
do
tar xvf $next *required_file_in_tar_file*.dat
done
When I run the script, I am not able to untar the files.
I am able to get a single tar file from the FTP server only if I mention the exact name of that file. I would like to get a file which has myfile_$date in its name, and after that I would like to extract it to a local path and pull out the specific files in that tar file whose names contain my required_file pattern.
You are getting a plain .tar file, but decompressing it with the z option. Compressed files (those that require z) normally have a .tar.gz suffix. Try
tar xvf $next *required_file_in_tar_file*.dat
Firstly, if you want to use wildcards for the file name that you're getting from the server, you need to use mget instead of get. Wildcard file expansion (the *) does not work with the get command.
Once you have pulled the file, the tar operation should work as expected. Most modern versions of Linux/BSD have a 'smart' tar which doesn't need the 'z' option to tell it that the tar file is compressed; it will figure out on its own that the tarball is compressed and decompress it automatically, provided the appropriate compression/decompression tool is on the system (bzip2 for .bz2 files, gzip for .gz files).
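Put together, the fetch-and-extract part of the script could end up looking something like the sketch below; the quoting of the member pattern and the --wildcards flag are assumptions about GNU tar's pattern matching, so drop them if your tar behaves differently:
ftp -inv $host <<END_SCRIPT
user username password
prompt
cd remotepath
lcd localpath
mget *myfile_$date*.tar
quit
END_SCRIPT

for next in localpath/*myfile_$date*.tar
do
    tar xvf "$next" --wildcards '*required_file_in_tar_file*.dat'
done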
I'm not quite sure, but doesn't FTP have an mget command for downloading multiple files (instead of get)?