sftp script failing when one of the sftp statements fails in Linux

I have an OpenSSH sftp script that transfers files from an SFTP server (Solaris) to an application server (Linux). The transfers involve several different locations, and the same files are also transferred back to a different location on the SFTP server. But if any of the transfers fails because a file is not available, the script does not continue with the remaining sftp commands; it just exits. Below is the script.
export SSHPASS=*******
/usr/local/bin/sshpass -e sftp -oPort=22 -oBatchMode=no -b - rkwlahtt@10.204.140.14 << !
cd /home/rkwlahtt/Inbound
mget *.*
rm *.*
cd /home/rkwlahtt/Inbound/Adhoc
mget *.*
rm *.*
cd /home/rkwlahtt/Archive/Inbound
mput *.TXT
mput *.txt
cd /home/rkwlahtt/Archive/Adhoc
mput *.xlsx
bye
!
In the script above, when I try to mget from the /home/rkwlahtt/Inbound folder and no file exists there, the script just exits the sftp session instead of moving on to the next commands (cd /home/rkwlahtt/Inbound/Adhoc and mget). The same thing happens with mput.
This is the first time we are transferring from several locations in the same script, and it is breaking our transfers.
Please let me know what can be done to resolve this issue.

You can suppress the abort-on-error behavior on a per-command basis using a - prefix:
-mget *.*
Another option is to remove the -b - switch.
The -b switch does two things: first, it enables batch mode (i.e. abort on any error); second, it names a script file to read commands from. When you use - instead of a script file name, the commands are read from standard input, which is the default anyway. So you do not need the second effect (you use - anyway) and you do not want the first.
Even without the switch, you can still feed the commands using input redirection, as you are doing.
Though you need to make sure that no command asks for input, as otherwise some of your commands would be consumed as that input instead of being executed.
See https://man.openbsd.org/sftp#b
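Putting the first option into practice, a sketch of the original script with the - prefix on each transfer command (host, paths, and credentials are taken from the question; a failing mget, rm, or mput no longer aborts the session):
export SSHPASS=*******
/usr/local/bin/sshpass -e sftp -oPort=22 -oBatchMode=no -b - rkwlahtt@10.204.140.14 << !
cd /home/rkwlahtt/Inbound
-mget *.*
-rm *.*
cd /home/rkwlahtt/Inbound/Adhoc
-mget *.*
-rm *.*
cd /home/rkwlahtt/Archive/Inbound
-mput *.TXT
-mput *.txt
cd /home/rkwlahtt/Archive/Adhoc
-mput *.xlsx
bye
!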


Execute mirror and mget lftp commands in bash script

Current code
#!/bin/bash
SFTP_SERVER="sftp.url.com:/csv/test/10"
SFTP_USER="user"
SFTP_PWD="pwd"
## not sure if this line is needed given I specify the local directory
# in the next block of code.
cd /mnt/c/Users/user/Documents/new_directory
lftp sftp://$SFTP_USER:$SFTP_PWD@$SFTP_SERVER
lftp -e mget *.csv mirror sftp.user.com:/csv/test/10 /mnt/c/Users/user/Documents/new_directory
Objective
Download all csv files and mirror my local directory with the remote server, so that when the code is run again it won't download the same files a second time.
Error received
open: *.csv: Name or service not known
Comments
From what I understood of the lftp man page, I should be able to fetch all files matching a wildcard by using mget instead of the standard get, provided I use -e to run commands. I've run mget manually and can download the files without issue, but it doesn't seem to handle *.csv in the script.
Appreciate any feedback you can provide as to why my code won't download the files and what I might have misunderstood from the man pages.
It should be:
lftp sftp://$SFTP_USER:$SFTP_PWD@$SFTP_SERVER -e "mget *.csv; bye"
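To also satisfy the mirroring objective (not re-downloading files that already exist locally), a sketch using lftp's mirror command instead of mget; --only-newer and --include-glob are standard mirror options, and the local path is taken from the question:
lftp sftp://$SFTP_USER:$SFTP_PWD@$SFTP_SERVER -e "mirror --only-newer --include-glob '*.csv' . /mnt/c/Users/user/Documents/new_directory; bye"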

Is it possible to create an empty file using scp?

In *nix, I can create an empty file using cp:
cp /dev/null ~/emptyfile
I'd like to know if it's possible to do something similar using scp (instead of ssh + touch). If I try to run:
scp /dev/null remoteserver:~/emptyfile
it returns an error /dev/null: not a regular file
EDIT:
Just to clarify, I don't want to run any command on the remote server (i.e. no ssh command should be invoked).
It's fine to run some command on localhost (echo, cat, dd or any other trivial command) and copy the resulting file to remoteserver.
It's preferable not to leave the resulting file on localhost, and it's also good if the command is a one-liner.
EDIT2:
It's impossible to use /dev/null approach as in cp command, because scp only works with regular files and directories:
https://github.com/openssh/openssh-portable/blob/8a85f5458d1c802471ca899c97f89946f6666e61/scp.c#L838-L850
So it's mandatory to use another command (touch, cat, dd etc) to create a regular empty file (either in a previous command, pipe or a subshell).
As @willh99 commented, creating an empty file locally and then performing scp is the most feasible solution.
So far I came up with this:
filename=$(mktemp) && scp $filename remoteserver:~/emptyfile; rm $filename
It creates an empty file in a subshell, and copies it to remoteserver as emptyfile.
Any refactor/improvements are welcome.
EDIT: remove $filename whether scp succeeds or not, as pointed out by @Friek.
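A possible refactor (a sketch, assuming bash; remoteserver is taken from the question): using a trap guarantees the temporary file is removed however scp exits, even on interruption:
#!/bin/bash
tmpfile=$(mktemp)                      # create an empty regular file locally
trap 'rm -f "$tmpfile"' EXIT           # clean up no matter how the script ends
scp "$tmpfile" remoteserver:~/emptyfile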
If you're just trying to create an empty file, you can use ssh and run the touch command like this:
ssh username@remoteserver touch anemptyfile

copy/move files on remote server linux

I log in to server_a and run a .sh file, which contains the following script:
scp user@server_b:/my_folder/my_file.xml user@server_b:/my_new_folder/
to copy files from my_folder to my_new_folder on server_b. It doesn't throw an error, but no files are copied.
Notes:
server_b is accessed using pre-set RSA keys.
server_a: unix
server_b: ubuntu
can SCP files from/to these servers without any issues
The end goal is to move or copy/remove files.
There are two possibilities:
Connect from server_a to server_b and do a local copy:
ssh user@server_b "cp /my_folder/my_file.xml /my_new_folder/"
Copy via server_a. Your original command would require server_b to be able to authenticate to itself, which is probably not the case. The -3 switch instead routes the transfer through the local machine:
scp -3 user@server_b:/my_folder/my_file.xml user@server_b:/my_new_folder/
Also note that your code copies only one file, not multiple files as the title says.
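Since the stated end goal is to move (copy and remove) files, a minimal sketch of the first approach doing the move in one step (paths are from the question):
ssh user@server_b "mv /my_folder/my_file.xml /my_new_folder/"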
If you are already logged in to the server, why are you authenticating again:
scp user@server_b:/my_folder/my_file.xml user@server_b:/my_new_folder/
You should be in the directory of the file, or simply use scp with the -v parameter to see debug information.
Run it as follows:
scp -v /my_folder/my_file.xml user@server_b:/my_new_folder/
It is not a directory and it is not recursive, so you do not need the -r parameter.

How to SCP files which are being FTPed by another process & delete them on completion?

Files are being transferred to a directory on my machine over FTP. I need to SCP these files to another machine and delete them on completion.
How can I detect that the FTP transfer of a file is complete and the file is safe to SCP?
There's no reliable way to detect completion of the transfer. Some clients send the ALLO command and pass the size of the file before actually uploading it, but this is not a universal rule, so you cannot rely on it. All in all, it is possible that the client streams the data and there is no definite "end" of the file on its side.
If the client is under your control, you can make it upload files with extension A and rename them to extension B after the upload completes. Then you transfer only the files with extension B.
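A minimal sketch of the receiving side under that convention (the .done extension, the destination host, and all paths here are assumptions for illustration):
#!/bin/bash
# Only pick up files the client has renamed after a completed upload.
for f in /incoming/*.done; do
    [ -e "$f" ] || continue                       # skip if the glob matched nothing
    scp "$f" user@otherhost:/dest/ && rm -- "$f"  # delete only after a successful copy
done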
You can do it with a script like this:
#!/bin/bash
EXPECTED_ARGS=1
E_BADARGS=65
# Argument check
if [ $# -lt $EXPECTED_ARGS ]
then
    echo "Usage: `basename $0` <folder_update_1> <folder_update_2> <folder_update_3> ..."
    exit $E_BADARGS
fi
folders=( "$@" )
for folder in "${folders[@]}"
do
    # Send the folder or file to the new machine; delete the local copy
    # only if the transfer succeeded
    time rsync --update -avrt -e ssh "/local/path/of/$folder/" "user@192.168.0.10:/remote/path/of/$folder/" \
        && rm -r "/local/path/of/$folder/"
done
It is configured to send folders. If you want to send individual files, make small changes to the script, such as:
time rsync --update -avrt -e ssh "/local/path/of/$file" "user@192.168.0.10:/remote/path/of/$file" \
    && rm "/local/path/of/$file"
rsync is similar to scp. I prefer to use rsync, but you can change it.
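For example, assuming the script above is saved as sync_folders.sh (a hypothetical name) and made executable, it would be invoked with one folder name per argument:
./sync_folders.sh folder_update_1 folder_update_2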

mget No Such file or directory

I'm connected to one of our file servers and am trying to pull down a folder (via ftp and mget) to a local directory. The mget command works successfully for about half of the files, until it gets to a JAR file that is definitely there on the server. It gives me the following error:
local: dist/MyProgram.jar remote: dist/MyProgram.jar
local: dist/MyProgram.jar: No such file or directory
The command I am using is a simple mget:
ftp> prompt
ftp> mget *
I am definitely in the right directory and definitely have a solid connection. I set the prompt flag so I am not prompted on each get. Any ideas?
wget -r ftp://name:passwd@ftp.com/somedir/
That's because mget doesn't behave recursively. I thought it would recurse down my directory tree and copy everything over as-is, but you need to run it at every level of your project; it was treating dist/MyProgram.jar as a plain filename. wget -r handles the recursion instead.
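A slightly fuller form of that invocation (a sketch; -nH and --cut-dirs are standard wget options that stop the local copy from being nested under the host name and the leading somedir path component):
wget -r -nH --cut-dirs=1 ftp://name:passwd@ftp.com/somedir/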
