How to take a shell script backup of a Linux system? - linux

I am facing a problem taking a shell script backup of my system (Linux). Please help me understand how to take a shell script backup of my system.
Thanks in advance

I prefer rsync for backing up my CentOS boxes. It is very good at both file transfer and file synchronization and offers tons of options for compression and such. This is a modified one-line example from my Bash backup script that backs up all of my media onto a temporarily mounted external hard drive:
rsync -avP --stats /media/* /mnt/ntfs/Media 1>/log/stats.info 2>>/log/bkup.err
You can substitute /media/* with the directories you want to back up; the simplest would be /*, which backs up everything.
You can also use the --exclude option to exclude directories.
The other simple method is a tar archive of the important system files, something like:
tar cvjf /root/sysBkup.bz2 --exclude=/root/sysBkup.bz2 /etc /var /root
Then move that backup to a remote share with the next line of the Bash script. But I would recommend getting familiar with rsync; it is a very handy backup utility.
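For reference, here is a minimal sketch of what such an rsync backup script might look like; the source, destination, and log paths are placeholders, so adjust them to your setup:
#!/bin/bash
# Minimal rsync backup sketch; SRC, DEST and LOG are placeholders
SRC="/media/"
DEST="/mnt/ntfs/Media/"
LOG="/var/log/bkup"
mkdir -p "$LOG"
rsync -avP --stats --exclude="lost+found" "$SRC" "$DEST" 1>"$LOG/stats.info" 2>>"$LOG/bkup.err"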

Related

How to backup docroot using tar

I have a question about performing routine backups on our webserver. Currently we are running Apache and we want to back up our doc root. I have a shell script that runs nightly, and the command I use is: sudo tar cvzf filename targetFilename.
My question is: is it safe to run a tar cvzf command on a doc root while files are being read, written, and created? Is there a better way to do this? Is it a good idea to shut down Apache while creating the tar file?
I've done some research and I couldn't find a straightforward answer.
Thank you for your help!
It is generally safe. The tar backup only reads the files and creates a new archive file; it does not interfere with the processes writing to them. However, if a file changes while tar is reading it, tar will warn you ("file changed as we read it") and that file may end up inconsistent in the archive, so shutting down the web server during the backup is still the best way to get a fully consistent copy.
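If you do decide to stop Apache for a fully consistent archive, the nightly script could look roughly like this; the docroot path, archive location, and the apache2 service name are assumptions, so substitute your own:
#!/bin/bash
# Sketch: pause Apache, archive the docroot, restart Apache (paths and service name are assumptions)
DOCROOT="/var/www/html"
ARCHIVE="/backups/docroot-$(date +%F).tar.gz"
sudo systemctl stop apache2
sudo tar czf "$ARCHIVE" "$DOCROOT"
sudo systemctl start apache2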

Shell script to download file from UNIX system directory

Can anyone help me write a shell script to download files from a Linux/UNIX system?
Regards
On UNIX systems, such as Linux and OS X, you have access to a utility called rsync. It is installed by default on most of them and is the tool to use to download files from another UNIX system.
It is a drop-in replacement for the cp (copy) command, but it is much more powerful.
To copy a directory from a remote system to yours, using SSH, you would do this:
rsync -a username@hostname:path/to/dir .
(Notice the dot at the end; it means 'place everything here'. You can also give the name of the local dir where the files should be placed. The -a flag copies the directory recursively while preserving permissions and timestamps.)
To download only some specific files, use this:
rsync 'username@hostname:path/to/dir/*.txt' .
(Notice the quotes: if you omit them, your shell will try to expand the *.txt part locally, fail, and give you an error.)
Useful flags:
--progress: show a progress bar
--append: if a file has only partially downloaded, resume it where it left off
I find the rsync utility so useful, I've created an alias for it in my shell and use it as a 'super-copy':
alias cpa 'rsync -vae ssh --progress --append'
With that alias, copying files between machines is just as easy as copying files locally:
cpa user@host:file .
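If you need this as a standalone shell script rather than an alias (which is what the question asked for), a minimal sketch might look like the following; the remote user, host, and paths are placeholders:
#!/bin/bash
# Minimal download-script sketch using rsync over ssh; REMOTE and LOCAL are placeholders
REMOTE="username@hostname:path/to/dir/"
LOCAL="$HOME/downloads/"
mkdir -p "$LOCAL"
rsync -av --progress --append -e ssh "$REMOTE" "$LOCAL"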
Making it even better
Since rsync uses SSH, it helps to set up a public/private key pair so you don't have to type in your password every time:
How do I setup Public-Key Authentication?
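In short, the usual steps are something like the following (ssh-copy-id is available on most systems; if you don't have it, you can append the public key to the remote ~/.ssh/authorized_keys by hand):
ssh-keygen -t ed25519          # generate a key pair (accept the defaults)
ssh-copy-id username@hostname  # install the public key on the remote host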
Furthermore, you can put your username in your ~/.ssh/config file and give the remote host a short name: read about it here.
For example, I have something like this:
Host panda
Hostname panda.server.long.hostname.com
User rodin
With this setup, my command to download files from the panda server is just:
cpa panda:path/to/my/files .
And there was much rejoicing.

copy directory from another computer on Linux

On a computer with an IP address like 10.11.12.123, I have a folder named document. I want to copy that folder to my local folder /home/my-pc/doc/ using the shell.
I tried like this:
scp -r smb:10.11.12.123/other-pc/document /home/my-pc/doc/
but it's not working.
You can use the command below to copy your files.
scp -r <source> <destination>
(-r: Recursively copy entire directories)
eg:
scp -r user@10.11.12.123:/other-pc/document /home/my-pc/doc
To identify the location you can use the pwd command, eg:
kasun@kasunr:~/Downloads$ pwd
/home/kasun/Downloads
If you want to copy from B to A while you are logged into B, then:
scp /source username@a:/destination
If you want to copy from B to A while you are logged into A, then:
scp username@b:/source /destination
In addition to the comment, when you look at your host-to-host copy options on Linux today, rsync is by far the most robust solution around. It is brought to you by the SAMBA team[1] and continues to enjoy active development. Most distributions include the rsync package by default. (If not, you should find an easily installable package for your distro, or you can download it from rsync.samba.org.)
The basic use of rsync for host-to-host directory copy is:
$ rsync -uav srchost:/path/to/dir /path/to/dest
-uav recursively copies only new or changed files (-u update, -a archive mode), preserving file and directory times and permissions, while -v provides verbose output. You will be prompted for a password on 10.11.12.123 unless you have set up SSH keys to allow public/private key authentication (see ssh-keygen for key generation).
If you notice, the syntax is basically the same as that for scp, with a slight difference in the options (e.g. scp -rv srchost:/path/to/dir /path/to/dest). rsync will use ssh for secure transport by default, so you will want to ensure sshd is running on your srchost (10.11.12.123 in your case). If you have name resolution working (or a simple entry in /etc/hosts for 10.11.12.123) you can use the hostname for the remote host instead of the remote IP. Regardless, you can always transfer the files you are interested in with:
$ rsync -uav 10.11.12.123:/other-pc/document /home/my-pc/doc/
Note: do NOT include a trailing / after document if you want to copy the directory itself. If you do include a trailing / after document (i.e. 10.11.12.123:/other-pc/document/), you are telling rsync to copy the contents (i.e. the files and directories under document) into /home/my-pc/doc/ without also creating the document directory there.
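To illustrate the difference with the paths from the question:
$ rsync -uav 10.11.12.123:/other-pc/document /home/my-pc/doc/
(creates /home/my-pc/doc/document/ with the files inside it)
$ rsync -uav 10.11.12.123:/other-pc/document/ /home/my-pc/doc/
(puts the files directly into /home/my-pc/doc/)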
The reason rsync is far superior to other copy apps is it provides options to truly synchronize filesystems and directory trees both locally and between your local machine and remote host. Meaning, in your case, if you have used rsync to transfer files to /home/my-pc/doc/ and then make changes to the files or delete files on 10.11.12.123, you can simply call rsync again and have the changes/deletions reflected in /home/my-pc/doc/. (look at the several flavors of the --delete option for details in rsync --help or in man 1 rsync)
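For example, a follow-up mirroring run might look like this (--delete removes files from the destination that no longer exist on the source, so use it with care):
$ rsync -uav --delete 10.11.12.123:/other-pc/document /home/my-pc/doc/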
For these, and many more reasons, it is well worth the time to familiarize yourself with rsync. It is an invaluable tool in any Linux user's hip pocket. Hopefully this will solve your problem and get you started.
Footnotes
[1] The same folks that "Opened Windows to a Wider World", allowing seamless connection between Windows/Linux hosts via the native Windows Server Message Block (SMB) protocol. samba.org
If the two directories you mentioned (document and /home/my-pc/doc/) are on the same machine,
then:
cp -ai document /home/my-pc/doc/
else (pulling document from 10.11.12.123 to your local machine):
scp -r root@10.11.12.123:/other-pc/document /home/my-pc/doc/

How to create a Linux compatible zip archive of a directory on a Mac

I've tried multiple ways of creating a zip or a tar.gz on the Mac using the GUI or the command line, and I have tried decompressing on the Linux side and gotten various errors, from things like "File.XML" and "File.xml" both appearing in a directory, to all sorts of others about something being truncated, etc.
Without listing all my experiments on the command line on the Mac and Linux (using tcsh), what should two bulletproof commands be to:
1) make a zip file of a directory (with no __MACOSX folders)
2) unzip / untar (whatever) the Mac zip on Linux with no errors (and no __MACOSX folders)
IT staff on the Linux side said they "usually use .gz and use gzip and gunzip commands".
Thanks!
After much research and experimentation, I found this works every time:
1) Create a zipped tar file with this command on the Mac in Terminal:
tar -cvzf your_archive_name.tar.gz your_folder_name/
2) When you FTP the file from one server to another, make sure you do so with binary mode turned on
3) Unzip and untar in two steps in your shell on the Linux box (in this case, tcsh):
gunzip your_archive_name.tar.gz
tar -xvf your_archive_name.tar
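If you want to check that the transfer did not corrupt the archive, you can compare checksums on both ends. On the Mac:
shasum -a 256 your_archive_name.tar.gz
and on the Linux box:
sha256sum your_archive_name.tar.gz
The two checksums should match.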
On my Mac, and in a bash session over SSH, I use the following simple commands:
Create the archive (-czf)
tar -czf NAME.tgz FOLDER
Extract the archive (-xzf)
tar -xzf NAME.tgz
Best, Mike
First off, File.XML and File.xml cannot both appear in an HFS+ file system, which is case-insensitive by default. It is possible, but very unusual, for someone to format a case-sensitive HFSX file system that would permit that. Can you really create two such files and see them listed separately?
You can use the -X option with zip to prevent resource forks and extended attributes from being saved. You can also throw in a -x .DS_Store to get rid of those files as well.
For tar, precede it with COPYFILE_DISABLE=true or setenv COPYFILE_DISABLE true, depending on your shell. You can also throw in an --exclude=.DS_Store.
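Putting those suggestions together, the commands might look something like this on the Mac (bash syntax shown; the archive and folder names are just placeholders):
zip -rX your_archive_name.zip your_folder_name -x '*.DS_Store'
COPYFILE_DISABLE=true tar --exclude='.DS_Store' -czf your_archive_name.tar.gz your_folder_name/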
Your "IT Staff" gave you a pretty useless answer, since gzip can only compress one file. gzip has to be used in combination with tar to archive a directory.

How to do a backup of files using the terminal?

I've already done a backup of my database, using mysqldump like this:
mysqldump -h localhost -u dbUsername -p dbDatabase > backup.sql
After that, the file is in a location outside public access on my server, ready for download.
How can I do something like that for files? I've tried to google it, but I get all kinds of results, just not that.
I need to tell the server (running Ubuntu) to back up all files inside folder X and put them into a zip file.
Thanks for your help.
You can use tar for creating backups. For a full system backup, run
tar -cvpzf backup.tar.gz --exclude=/backup.tar.gz /
to create a gzipped tar file of your whole system. You might need additional excludes like
--exclude=/proc --exclude=/sys --exclude=/dev/pts.
For a single folder:
tar -cvpzf backup.tar.gz --exclude=/backup.tar.gz /your/folder
If you are outside of the single folder you want to back up, the --exclude=/backup.tar.gz isn't needed.
There are more details here, for example on doing it over the network, splitting the archive, etc.
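Putting the excludes together, a full-system backup command might look roughly like this (adjust the exclude list to your setup):
sudo tar -cvpzf /backup.tar.gz --exclude=/backup.tar.gz --exclude=/proc --exclude=/sys --exclude=/dev/pts /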
