I created a backup folder on my FTP server and uploaded all my .tar.gz files into /backup
using put file.tar.gz backup.
When I retrieve the backup, the backup folder comes back as a set of files. How can I get the folder back?
ftp server
ls
227 Entering Passive Mode (10,21,131,105,76,56)
150 Accepted data connection
drwxr-xr-x 6 100 ftpgroup 7 Oct 20 19:57 .
drwxr-xr-x 6 100 ftpgroup 7 Oct 20 19:57 ..
-r-------- 1 100 ftpgroup 84 Oct 21 11:15 .banner
drwxrwxrwx 3 100 ftpgroup 4 Oct 20 18:28 backup
drwxrwxrwx 2 100 ftpgroup 3 Oct 20 19:45 dailybackup
drwxrwxr-x 2 100 ftpgroup 3 Oct 20 19:57 hi5songs
drwxrwxr-x 2 100 ftpgroup 3 Oct 20 19:49 whole
226-Options: -a -l
226 7 matches total
I tried:
ftp> mget backup
mget .? y
227 Entering Passive Mode (10,21,131,105,62,8)
550 I can only retrieve regular files
mget ..? y
Warning: embedded .. in .. (changing to !!)
227 Entering Passive Mode (10,21,131,105,46,39)
550 Can't open !!: No such file or directory
mget backup? y
227 Entering Passive Mode (10,21,131,105,72,24)
550 I can only retrieve regular files
mget cpanelbackup? y
227 Entering Passive Mode (10,21,131,105,73,69)
550 Can't open cpanelbackup: No such file or directory
When I use get backup home instead, it retrieves successfully, but only as the files shown below.
server:
root@azar [/home]# ls
./ backup.2* .cpan/ dailybackup hi5songs.4 oldeserver
../ backup.3* cPanelInstall/ hi5songs/ hi5songs.5 oldserver/
0_README_BEFORE_DELETING_VIRTFS backup.4* .cpanm/ hi5songs.1 home quota.user
backup/ backup.5* .cpcpan/ hi5songs.2 latest virtfs/
backup.1* .banner cpeasyapache/ hi5songs.3 lost+found/ whole
I got that backup as green executable-looking files like backup.1* (note: I can't open or extract those files). What should I do?
How do I get my .tar.gz files back?
Please guide me.
Thanks in advance.
Updated Answer
If you want to get all files from /some/place on your server to /home/here on your local machine, you can either do this:
cd /home/here # change directory before starting FTP
ftp server ... # connect
cd /some/place # go to desired folder on server
bi # ensure no funny business with line-endings
mget * # get all files
or you can change directory locally, within FTP like this:
ftp server ... # connect
cd /some/place # go to desired folder on server
lcd /home/here # LOCALLY change directory to where you want the files to 'land'
bi # ensure no funny business
mget * # get all files
Original Answer
I cannot understand your question at all, but you are doing some things wrong.
You cannot use GET or MGET to get a folder (directory) like you are trying to do with mget backup. You can only GET a file. Now your file may be a tar-file with more than one file in it, but it is still a file.
If you are getting tar-files and binary files, you should use BINARY mode to ensure that bytes which happen to match line-ending characters are not translated between Windows and Unix conventions. So, as a matter of course, you should issue the BI command before you get files.
If you have several files in your backup directory, you should probably do cd backup, then bi, then mget *, as in the sketch below.
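Putting those steps together, a complete interactive session might look like this (a sketch; prompt simply toggles off the per-file confirmation that mget otherwise asks for):

ftp server       # connect and log in
cd backup        # folder on the server holding the .tar.gz files
bi               # binary mode, so the archives survive transfer intact
prompt           # turn off per-file prompting
mget *           # fetch every file in the directory
bye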
Related
(Ubuntu 18.04)
I'm attempting to extract an ODBC driver from a tarball, following these instructions, with the command:
tar --directory=/opt -zxvf /SimbaODBCDriverforGoogleBigQuery_2.4.6.1015-Linux.tar.gz
This results in the following output:
root@08ba33ec2cfb:/# tar --directory=/opt -zxvf SimbaODBCDriverforGoogleBigQuery_2.4.6.1015-Linux.tar.gz
SimbaODBCDriverforGoogleBigQuery_2.4.6.1015-Linux/
SimbaODBCDriverforGoogleBigQuery_2.4.6.1015-Linux/GoogleBigQueryODBC.did
SimbaODBCDriverforGoogleBigQuery_2.4.6.1015-Linux/docs/
SimbaODBCDriverforGoogleBigQuery_2.4.6.1015-Linux/docs/release-notes.txt
SimbaODBCDriverforGoogleBigQuery_2.4.6.1015-Linux/docs/Simba Google BigQuery ODBC Connector Install and Configuration Guide.pdf
SimbaODBCDriverforGoogleBigQuery_2.4.6.1015-Linux/docs/OEM ODBC Driver Installation Instructions.pdf
SimbaODBCDriverforGoogleBigQuery_2.4.6.1015-Linux/setup/
SimbaODBCDriverforGoogleBigQuery_2.4.6.1015-Linux/setup/simba.googlebigqueryodbc.ini
SimbaODBCDriverforGoogleBigQuery_2.4.6.1015-Linux/setup/odbc.ini
SimbaODBCDriverforGoogleBigQuery_2.4.6.1015-Linux/setup/odbcinst.ini
SimbaODBCDriverforGoogleBigQuery_2.4.6.1015-Linux/SimbaODBCDriverforGoogleBigQuery32_2.4.6.1015.tar.gz
SimbaODBCDriverforGoogleBigQuery_2.4.6.1015-Linux/SimbaODBCDriverforGoogleBigQuery64_2.4.6.1015.tar.gz
The guide linked to above says:
The Simba Google BigQuery ODBC Connector files are installed in the
/opt/simba/googlebigqueryodbc directory
Not for me, but I do see:
ls -l /opt/
total 8
drwxr-xr-x 1 1000 1001 4096 Apr 26 00:39 SimbaODBCDriverforGoogleBigQuery_2.4.6.1015-Linux
And:
ls -l /opt/SimbaODBCDriverforGoogleBigQuery_2.4.6.1015-Linux/
total 52324
-rwxr-xr-x 1 1000 1001 400 Apr 26 00:39 GoogleBigQueryODBC.did
-rw-rw-rw- 1 1000 1001 26688770 Apr 26 00:39 SimbaODBCDriverforGoogleBigQuery32_2.4.6.1015.tar.gz
-rw-rw-rw- 1 1000 1001 26876705 Apr 26 00:39 SimbaODBCDriverforGoogleBigQuery64_2.4.6.1015.tar.gz
drwxr-xr-x 1 1000 1001 4096 Apr 26 00:39 docs
drwxr-xr-x 1 1000 1001 4096 Apr 26 00:39 setup
I was specifically looking for the .so driver file. All of the above is in a Docker container. I tried extracting the tarball locally on Ubuntu 18.04 (same as my Docker container), and when I use the Ubuntu desktop GUI to extract, by double-clicking the tar.gz file and then clicking 'Extract', I do indeed see the expected files.
It seems my tar command (tar --directory=/opt -zxvf /SimbaODBCDriverforGoogleBigQuery_2.4.6.1015-Linux.tar.gz) is not extracting the tarball as expected.
How can I extract the contents of the tarball properly? The tarball in question is the Linux one at this link.
[edit]
Adding screenshots of the contents of the tarball, per the comments. I had to click down two levels of nesting to arrive at the actual driver files:
The instructions you linked to do not match the contents of the file I found from here. The first .tar.gz contains two other .tar.gz files. I looked into the 64-bit one and it contains:
SimbaODBCDriverforGoogleBigQuery64_2.4.6.1015/
SimbaODBCDriverforGoogleBigQuery64_2.4.6.1015/ErrorMessages/
SimbaODBCDriverforGoogleBigQuery64_2.4.6.1015/ErrorMessages/en-US/
SimbaODBCDriverforGoogleBigQuery64_2.4.6.1015/ErrorMessages/en-US/SimbaBigQueryODBCMessages.xml
SimbaODBCDriverforGoogleBigQuery64_2.4.6.1015/ErrorMessages/en-US/ODBCMessages.xml
SimbaODBCDriverforGoogleBigQuery64_2.4.6.1015/ErrorMessages/en-US/SQLEngineMessages.xml
SimbaODBCDriverforGoogleBigQuery64_2.4.6.1015/ErrorMessages/en-US/DSMessages.xml
SimbaODBCDriverforGoogleBigQuery64_2.4.6.1015/ErrorMessages/en-US/DSCURLHTTPClientMessages.xml
SimbaODBCDriverforGoogleBigQuery64_2.4.6.1015/third-party-licenses.txt
SimbaODBCDriverforGoogleBigQuery64_2.4.6.1015/lib/
SimbaODBCDriverforGoogleBigQuery64_2.4.6.1015/lib/libgooglebigqueryodbc_sb64.so
SimbaODBCDriverforGoogleBigQuery64_2.4.6.1015/lib/cacerts.pem
SimbaODBCDriverforGoogleBigQuery64_2.4.6.1015/lib/EULA.txt
SimbaODBCDriverforGoogleBigQuery64_2.4.6.1015/Tools/
SimbaODBCDriverforGoogleBigQuery64_2.4.6.1015/Tools/get_refresh_token.sh
Your .so is in the lib directory. Based on the instructions, it looks like you need to extract this file (or the 32-bit one, if appropriate) and rename the extracted directory, in this case from SimbaODBCDriverforGoogleBigQuery64_2.4.6.1015 to simba/googlebigqueryodbc. The tar command is doing what it is told; the instructions are just way off.
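Concretely, something along these lines should put the driver where the guide expects it (a sketch based on the listings above; adjust paths if your layout differs):

cd /opt/SimbaODBCDriverforGoogleBigQuery_2.4.6.1015-Linux
tar -zxvf SimbaODBCDriverforGoogleBigQuery64_2.4.6.1015.tar.gz   # unpack the inner 64-bit tarball
mkdir -p /opt/simba
mv SimbaODBCDriverforGoogleBigQuery64_2.4.6.1015 /opt/simba/googlebigqueryodbc
ls /opt/simba/googlebigqueryodbc/lib   # libgooglebigqueryodbc_sb64.so should be here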
I want to zip a set of directories and files on my CentOS 8 VM.
There are 3 directories and 1 file, which I want to zip in such a way that only the env.conf file will move to /etc/env.txt after unzipping, and the remaining directories will be unzipped at the current location.
Is there any way to achieve this?
drwxr-xr-x. 9 root root 114 Feb 25 12:40 config
-rw-r--r--. 1 root root 340 Feb 25 09:01 env.conf
drwxr-xr-x. 9 root root 4096 Feb 28 05:11 platform
drwxr-xr-x. 2 root root 135 Feb 28 07:49 install
I don't think this is possible. In fact, it would be considered a vulnerability if you could do that.
Imagine you download a zip file from some website, and after you unzip it in a temp folder, it registers itself as a service by writing a file somewhere in /etc and gets control over your PC.
Example: zip-slip
You could, however, create a one-liner that extracts the archive and then moves the file wherever you want, like this:
unzip <filename> && mv env.conf /etc/env.txt
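For completeness, creating the archive and restoring it could look like this (the archive name bundle.zip is my assumption, and moving a file into /etc will normally require root):

zip -r bundle.zip config platform install env.conf   # pack the three directories plus the file
unzip bundle.zip && sudo mv env.conf /etc/env.txt    # unpack in place, then relocate only env.conf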
I have a scenario where I have to take an incremental backup, every 5 minutes, of a large file (around 100 GB) on the local machine, whenever the content of the file changes.
Filename: example.txt
Backups: example.txt.00:05, example.txt.00:10, example.txt.00:15 and
so on.
What would be the most optimized way to do this?
If I opt for diff, it will take a lot of time to check the content of the file.
I would prefer doing it with rsync, but I am unsure how it will manage the multiple backup files.
I figured this out with the man page of rsync.
-b, --backup make backups (see --suffix & --backup-dir)
--backup-dir=DIR make backups into hierarchy based in DIR
--suffix=SUFFIX backup suffix (default ~ w/o --backup-dir)
Script:
#!/bin/bash
# Every 5 minutes, sync example.txt into backup/; if it changed since the
# last run, the previous copy is kept in backup/archive with a timestamp suffix.
while true
do
    timestamp=$(date +"%H:%M:%S")
    echo "$timestamp"
    rsync -avschz --backup --backup-dir=archive --suffix="-$timestamp" example.txt backup
    sleep 300
done
The above script will create an archive directory inside the backup directory and rename the files accordingly.
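To keep it running unattended, you could start it in the background, e.g. nohup ./backup.sh & (the script name is my assumption); a cron entry running every 5 minutes would be an alternative to the sleep loop.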
Output:
imohit:rsync-script ethicalmohit$ ls -l backup/archive/
total 88064
-rw-r--r-- 1 ethicalmohit staff 18874368 Mar 25 03:06 example.txt-03:15:41
-rw-r--r-- 1 ethicalmohit staff 12582912 Mar 25 03:17 example.txt-03:25:42
-rw-r--r-- 1 ethicalmohit staff 13631488 Mar 25 03:25 example.txt-03:30:42
There are two directories that contain these files:
The first one is /usr/local/nagios/etc/hosts:
[root@localhost hosts]$ ll
total 12
-rw-rw-r-- 1 apache nagios 1236 Feb 7 10:10 10.80.12.53.cfg
-rw-rw-r-- 1 apache nagios 1064 Feb 27 22:47 10.80.12.62.cfg
-rw-rw-r-- 1 apache nagios 1063 Feb 22 12:02 localhost.cfg
And the second one is /usr/local/nagios/etc/services:
[root@localhost services]$ ll
total 20
-rw-rw-r-- 1 apache nagios 2183 Feb 27 22:48 10.80.12.62.cfg
-rw-rw-r-- 1 apache nagios 1339 Feb 13 10:47 Check usage _etc.cfg
-rw-rw-r-- 1 apache nagios 7874 Feb 22 11:59 localhost.cfg
I have a script that goes through a file in the hosts directory and pastes some lines from that file into the file of the same name in the services directory.
The script is run like this:
./nagios-contacts.sh /usr/local/nagios/etc/hosts/10.80.12.62.cfg /usr/local/nagios/etc/services/10.80.12.62.cfg
How can I have another script call my script so that it goes through every file in the hosts directory and does its job for the file with the same name in the services directory?
In my script I'm pulling contacts out of 10.80.12.62.cfg in the hosts directory and appending them to the file with the same name in the services directory.
Don't use ls output as input to a for loop; use the shell's built-in wildcards instead. See why parsing ls is not a good idea.
for f in /usr/local/nagios/etc/hosts/*.cfg
do
basef=$(basename "$f")
./nagios-contacts.sh "$f" "/usr/local/nagios/etc/services/${basef}"
done
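One detail worth adding (my addition, not part of the original answer): if no .cfg files exist, the unexpanded pattern itself goes through the loop once. Enabling nullglob first avoids that:

shopt -s nullglob   # a pattern with no matches now expands to nothing instead of itself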
It sounds like you just need to do some iteration.
echo $(pwd)
for file in $(ls); do ./nagios-contacts.sh $file; done;
So it will loop over all files in the current directory.
You can also modify it as well by doing something more absolute.
abspath=$1
for file in $(ls $abspath); do ./nagios-contacts.sh $abspath/$file; done
which would loop over all files in a set directory, and then pass the abspath/filename into your script.
I am trying to copy directories (and files) recursively from one directory to another.
I tried the following:
rsync -avz <source> <target>
cp -ruT <source> <target>
Both were successful, but when I try to compare the sizes using du -c, the empty directories seem to have a mismatch in size.
In target directory
drwxrwxr-x 2 abc devl 4096 Jun 9 01:25 .
drwxrwxr-x 4 abc devl 4096 Jul 20 07:46 ..
In source directory
drwxrwxr-x 2 prod ops 2 Jun 9 01:25 .
drwxrwxr-x 4 prod ops 36 Jul 20 07:46 ..
Is there a special way to handle this? diff -qr doesn't show any differences, though.
Thanks for your help.
Are both folders on the same volume? If not, chances are that the sector sizes for those volumes differ, and in turn the inode sizes differ. diff is only checking whether the directory exists and whether it contains the corresponding files; that is similar to how diff doesn't report permission differences, because those can be quite system-specific.
A pretty comprehensive answer can be found here: Why size reporting for directories is different than other files?
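If the goal is simply to confirm that the copy is complete, comparing contents rather than du totals sidesteps the directory-entry size issue; an rsync dry run is one way to do that (a sketch, reusing the placeholders from the question):

rsync -avzn --itemize-changes <source> <target>   # -n: dry run; lists anything that still differs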