Execute mirror and mget lftp commands in bash script - linux

Current code
#!/bin/bash
SFTP_SERVER="sftp.url.com:/csv/test/10"
SFTP_USER="user"
SFTP_PWD="pwd"
## not sure if this line is needed given I specify the local directory
# in the next block of code.
cd /mnt/c/Users/user/Documents/new_directory
lftp sftp://$SFTP_USER:$SFTP_PWD@$SFTP_SERVER
lftp -e mget *.csv mirror sftp.user.com:/csv/test/10 /mnt/c/Users/user/Documents/new_directory
Objective
Download all CSV files and mirror my local directory with the remote server, so that when the script is run again it won't download the same files a second time.
Error received
open: *.csv: Name or service not known
Comments
From what I understood of the lftp man page, I should be able to fetch all files matching a wildcard by using mget instead of the standard get, provided I use -e to pass the command to lftp. I've run mget manually and can download the files without issue, but the *.csv pattern doesn't seem to work in the script.
I'd appreciate any feedback on why my code won't download the files and what I might have misunderstood from the man pages.

It should be something like:
lftp sftp://$SFTP_USER:$SFTP_PWD@$SFTP_SERVER -e "mget *.csv; bye"
The commands passed to -e must be a single quoted string. In the original script, -e only receives mget, so the unquoted *.csv is taken as the site to open, which is why lftp reports "open: *.csv: Name or service not known".
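To cover the mirror part of the objective (so a second run does not download the same files again), something like the following sketch should work. It reuses the host, credentials and paths from the question, and relies on mirror's --only-newer and --include-glob options to transfer only new or changed CSV files:
#!/bin/bash
SFTP_HOST="sftp.url.com"
SFTP_USER="user"
SFTP_PWD="pwd"
REMOTE_DIR="/csv/test/10"
LOCAL_DIR="/mnt/c/Users/user/Documents/new_directory"

# mirror copies remote -> local by default; --only-newer skips files that are
# already present locally, and --include-glob limits the transfer to *.csv
lftp -u "$SFTP_USER,$SFTP_PWD" "sftp://$SFTP_HOST" \
     -e "mirror --only-newer --include-glob '*.csv' $REMOTE_DIR $LOCAL_DIR; bye"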

Related

Is there a way to download files matching a pattern through SFTP in a shell script?

I'm trying to download multiple files through SFTP on a Linux server using
sftp -o IdentityFile=key <user>@<server> <<END
get -r folder
exit
END
which will download all contents of a folder. It appears that find and grep are invalid commands, as are for loops.
I need to download files whose names contain a certain string, e.g.
test_0.txt
test_1.txt
but not file.txt
Do you really need the -r switch? Are there really any subdirectories in the folder? You do not mention that.
If there are no subdirectories, you can use a simple get with a file mask:
cd folder
get *test*
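For reference, a non-interactive version of that approach, keeping the placeholder key, user and server names from the question, could look like:
sftp -o IdentityFile=key <user>@<server> <<END
cd folder
get *test*
exit
END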
Are you required to use sftp? A tool like rsync that operates over ssh has flexible include/exclude options. For example:
rsync -a <user>@<server>:folder/ folder/ \
--include='test_*.txt' --exclude='*.txt'
This requires rsync to be installed on the remote system, but that's very common these days. If rsync isn't available, you could do something similar using tar:
ssh <user>@<server> tar -cf- folder/ | tar -xvf- --wildcards '*/test_*.txt'
This tars up all the files remotely, but then only extracts files matching your target pattern on the receiving side.
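If the folder mostly contains files you don't need, a variant that filters on the remote side (so non-matching files are never sent over the connection) might look like this; it assumes the test_*.txt glob is expanded by the remote shell:
# the glob expands on the remote side, so only matching files are archived and transferred
ssh <user>@<server> "cd folder && tar -cf - test_*.txt" | tar -xvf -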

How can I download all the files from a remote directory to my local directory?

I want to download all the files in a specific directory of my site.
Let's say I have 3 files in my remote SFTP directory
www.site.com/files/phone/2017-09-19-20-39-15
a.txt
b.txt
c.txt
My goal is to create a local folder on my desktop containing ONLY those downloaded files. No parent files or parent directories needed. I am trying to get a clean report.
I've tried
wget -m --no-parent -l1 -nH -P ~/Desktop/phone/ www.site.com/files/phone/2017-09-19-20-39-15 --reject=index.html* -e robots=off
I got the whole remote directory tree recreated under ~/Desktop/phone/, but I want only the three text files themselves in that folder.
How do I tweak my wget command to get something like that?
Should I use something other than wget?
Ihue,
Taking a shell-programmatic perspective, I would recommend you try the following command-line script; note that I also added the citation so you can see the original thread.
wget -r -P ~/Desktop/phone/ -A txt www.site.com/files/phone/2017-09-19-20-39-15 --reject=index.html* -e robots=off
-r enables recursive retrieval. See Recursive Download for more information.
-P sets the directory prefix where all files and directories are saved to.
-A sets a whitelist for retrieving only certain file types. Strings and patterns are accepted, and both can be used in a comma-separated list. See Types of Files for more information.
Ref: @don-joey
https://askubuntu.com/questions/373047/i-used-wget-to-download-html-files-where-are-the-images-in-the-file-stored

Bash ftp: put only newer files just like FileZilla

I can't believe I'm stuck on this.
I would like to put only newer files in a bash ftp script, just like FileZilla does.
I know it is possible with WinSCP, but I cannot believe this doesn't exist within the Linux ftp command-line tools.
Important note: I can't SSH into the server, so please don't suggest rsync.
As @fvu suggested, I finally sorted this out with lftp:
lftp -u <username>,<password> <host> << EOS
set ssl:verify-certificate no
set ftp:ssl-allow no
set ftp:ssl-protect-list no
mirror -R --only-newer --parallel=10 <localfolder> <remotefolder>
quit
EOS

Create and update archive over ssh on local machine

I am trying to find a way to create and update, over ssh, a tar archive of files on a remote system where we don't have write permissions (the remote file system is read-only). I've figured out that the way to create an archive is:
ssh user@remoteServer "tar cvpjf - /" > backup.tgz
However, I would like to know if there is some way of performing only incremental backups from this point on (of only files that have actually changed?). Any help with this is much appreciated.
You can try using the --listed-incremental option of tar:
http://www.gnu.org/software/tar/manual/html_node/Incremental-Dumps.html
The main problem is that you cannot pipe the snar file through stdout, because stdout is already carrying backup.tgz. The best option is therefore to create the snar file in the /tmp directory, where you should have write permissions, and then download it at the end of the backup session.
For example:
ssh user@remoteServer "tar --listed-incremental=/tmp/backup-1.snar -cvpjf - /" > backup-1.tgz
scp user@remoteServer:/tmp/backup-1.snar .
And in the following session you will use that .snar file to avoid copying the same files:
scp backup-1.snar user@remoteServer:/tmp/backup-1.snar
ssh user@remoteServer "tar --listed-incremental=/tmp/backup-1.snar -cvpjf - /" > backup-2.tgz

lftp + bash script + variables

I'm using lftp to mirror files from an external server, but now I also need to rename the source directory (on the remote server) after a successful download. Basically, I need to open a connection to the remote server, list its directories, download every directory whose name starts with "todo", e.g. todo.20121019, and after success rename each downloaded directory to "done.20121019". There might be more than one such directory on the server.
The remote FTP server works only with active connections.
#!/bin/bash
directories=`lftp -f lftp_script_file.lf |grep done|awk '{print $NF}'`
for i in $directories
do
echo $i    # here I get the list of directories that should be downloaded and renamed
done
lftp_script_file.lf just lists the directories:
set ftp:passive-mode false;
open ftp://user:pass$@10.10.10.123
ls my_sub_dir/
Is there a way, in a batch file, to:
open a connection to the FTP server
find the directories that I want to download
add those directories to the queue and download them
rename the directories on the remote server
What I was trying to achieve was to list the directories, find the interesting ones, download them and rename them, but I can't find a way to feed the list of directories to lftp from a bash script while keeping "set ftp:passive-mode false".
To be able to substitute shell variables into lftp commands, use something like this:
lftp -e "cmd1;cmd2"
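Putting that together with the workflow above, a rough sketch (using the placeholder host, credentials and directory from the question, plus lftp's cls, mirror and mv commands) might look like this:
#!/bin/bash
HOST="10.10.10.123"
USER="user"
PASS="pass"
BASE="my_sub_dir"

# list remote directories whose names start with "todo"
todo_dirs=$(lftp -u "$USER,$PASS" "ftp://$HOST" \
    -e "set ftp:passive-mode false; cls -1 $BASE/; bye" | grep '^todo')

for dir in $todo_dirs
do
    stamp=${dir#todo.}    # e.g. 20121019
    # download the directory, then rename it on the remote server
    lftp -u "$USER,$PASS" "ftp://$HOST" \
        -e "set ftp:passive-mode false; mirror $BASE/$dir $dir; mv $BASE/$dir $BASE/done.$stamp; bye"
done
Note that lftp keeps executing the remaining commands in the -e string even if mirror fails, so if the rename must only happen after a successful download, consider adding "set cmd:fail-exit yes" at the start of the -e string or checking lftp's exit status per directory.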
