FTP script not working - Linux

I have an FTP script running on Linux and it is failing. Here is the script:
/usr/bin/ftp -v -i -n my.ftp.server << cmd
user ftpuser password
binary
ls
<some other commands here>
quit
cmd
It returns an error:
421 Service not available, remote server has closed connection
Not connected.
The weird thing is that if I just type this on the command line:
/usr/bin/ftp my.ftp.server
it asks for a username and password, and after I supply them, I am able to connect!
At the ftp> prompt I type ls and I can see the files on the FTP server.
What is wrong with my script?
Also, I don't have PuTTY access to the FTP server, so I can't see the logs from that side. Any ideas?
Thanks!

Here is an example of a correct FTP Linux script.
#!/bin/sh
HOST='ftp.server.com'
USER='user'
PASSWD='pw'
FILE='file.txt' #sample file to upload
ftp -n $HOST <<END_SCRIPT
quote USER $USER
quote PASS $PASSWD
put $FILE
<some other commands here>
quit
END_SCRIPT
exit 0
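Since you cannot see the server-side logs, the client's own debug output may help narrow down where the session dies. A sketch only, assuming your ftp client supports the usual -d debugging flag (the server name and credentials are the ones from your script):
# -d turns on client-side debugging output in most Linux ftp clients
/usr/bin/ftp -v -d -i -n my.ftp.server << cmd
user ftpuser password
ls
quit
cmd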

Related

Host key verification failed when using scp inside ssh EOL

I am trying to run multiple ssh commands using the below shell script:
#!/bin/bash
ssh -T root@10.123.234.456 <<'EOL'
'/var/map/pg_dump.sh'
CHOOSE_DB=$(ls -t /var/mapbackup/mapdb* | head -1)
scp -r root@10.123.234.456:$CHOOSE_DB /app/map/
echo $CHOOSE_DB
EOL
The first two commands work well, but scp fails with "Host key verification failed."
The ssh login succeeds, but I feel scp is not able to fetch the password from within the EOL heredoc. Here is the screenshot of the error.
Can someone help me correct my script?
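One possible fix, as a sketch only: the scp inside the heredoc runs on the remote host and connects back to itself, where root has never accepted that host key, which is why verification fails. Running the dump remotely, capturing the filename locally, and then copying from the local side avoids the problem. This assumes a working SSH login as root and that /app/map/ is a directory on the local machine:
#!/bin/bash
# Sketch: run the dump remotely, capture the newest backup path locally,
# then copy from the local side, where the remote host key is already known.
REMOTE=root@10.123.234.456
CHOOSE_DB=$(ssh -T "$REMOTE" '/var/map/pg_dump.sh >/dev/null; ls -t /var/mapbackup/mapdb* | head -1')
scp "$REMOTE:$CHOOSE_DB" /app/map/
echo "$CHOOSE_DB"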

How to save FTP session logs in file in Linux

I am using a bash script on Linux to transfer files to a server. The script runs from cron and I have redirected its output to a file, but from the logs I cannot tell whether the file has been transferred to the B server or not.
This is the cron:
1>>/home/owais/script_test/logs/res_sim_script.logs 2>>/home/owais/script_test/logs/res_sim.logs
And the FTP is as below:
cd ${dir}
ftp -n $HOST <<END_SCRIPT
quote USER $USER
quote PASS $PASSWD
lcd $dir
cd $destDir
bin
prompt
put FILENAME
bye
END_SCRIPT
The only thing that I get in the logs is:
Local directory now Directory_Name
Interactive mode off.
Instead of using FTP, there is rsync. rsync is a fast and extraordinarily versatile file-copying tool. It can copy locally, to or from another host over any remote shell, or to or from a remote rsync daemon.
More information is available at https://linux.die.net/man/1/rsync
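A minimal sketch, assuming you have SSH access to the B server (user@B_SERVER and the log filename are placeholders; $dir, $destDir and FILENAME are the ones from your script). The --log-file option records what was actually transferred, which answers the original question about knowing whether the file arrived:
# user@B_SERVER and the log path are placeholders; adjust to your environment
rsync -av --log-file=/home/owais/script_test/logs/rsync_transfer.log "$dir/FILENAME" user@B_SERVER:"$destDir/"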
I used ftp -inv Host << EOF >> LogFilePath and it worked. Thank you all for the support.
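Spelled out, that logging variant looks like this sketch (the log path is hypothetical; the variables are the same ones used in the script above, and 2>&1 also captures error replies such as failed logins or transfers):
# Hypothetical log path; quote USER/PASS work because -n suppresses auto-login
ftp -inv $HOST <<END_SCRIPT >> /home/owais/script_test/logs/ftp_session.log 2>&1
quote USER $USER
quote PASS $PASSWD
lcd $dir
cd $destDir
bin
put FILENAME
bye
END_SCRIPT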

FTP output redirect to a file on my linux box using shell script

I am looking for a Bash script to redirect the output of a simple ls command on my FTP server to a file on my Linux box.
Here are the step-by-step commands that illustrate what I am trying to script. The FTP site can be accessed without a user/password, so I enter anonymous as the user and leave the password blank when prompted.
ftp <FTP_SERVER>
Connected to <FTP_SERVER> (IP_ADDRESS).
220 Microsoft FTP Service
Name (<FTP_SERVER>:root): anonymous
331 Anonymous access allowed, send identity (e-mail name) as password.
Password:
230 Anonymous user logged in.
Remote system type is Windows_NT.
ftp> ls /Outgoing/Artemis/incremental/Hashes-1492870261.zip
227 Entering Passive Mode (10,37,108,77,5,87).
125 Data connection already open; Transfer starting.
04-22-17 07:11AM 227634 Hashes-1492870261.zip
226 Transfer complete.
I need the output of the ls command I am executing to be saved in a file on my Linux box.
Here is the script I have:
ftp FTP_SERVER <<EOF >outputfile
quote USER anonymous
quote PASS
prompt noprompt
ls -la /Outgoing/Artemis/incremental/Hashes-1492870261.zip
quit
EOF
When I execute this, I get a login failed error:
sh -x test.sh
+ ftp FTP_SERVER
Password:
Login failed.
local: /Outgoing/Artemis/incremental/Hashes-1492870261.zip: No such file or directory
I used some random password as a test instead of the blank (null) one I used previously, but I still get the same error. How can I fix this?
Turn off auto-login and interactive prompting, and supply the credentials with the user command:
ftp -i -n FTP_SERVER <<EOF >outputfile
user <user> <password>
binary
ls -la .
quit
EOF
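For the anonymous access described in the question, the same pattern would look like this sketch (anonymous FTP servers generally accept any string, conventionally an e-mail address, as the password; user@example.com below is just a placeholder, and the path is the one from the question):
ftp -i -n FTP_SERVER <<EOF >outputfile
user anonymous user@example.com
ls -la /Outgoing/Artemis/incremental/
quit
EOF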
You might get on better with lftp:
lftp -e 'ls -l someFile.zip; quit' -u USER,PASSWORD FTPSERVER > ls.txt
where all the words in capitals need replacing by your own values.
Or you could try with curl:
curl ftp://FTPSERVER --user USER:PASSWORD
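If the goal is only to save that listing to a file, a curl one-liner can be enough. A sketch, assuming anonymous access as in the question (curl logs in anonymously when no credentials are given, and an FTP URL ending in / returns a directory listing):
# -s silences the progress meter; the trailing / asks for a directory listing
curl -s "ftp://FTP_SERVER/Outgoing/Artemis/incremental/" > outputfile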

Getting files using expect command in a shell script

I am trying to automate an SFTP transfer from a UNIX shell script with the help of the expect command. My code is below. When I try executing the script, it throws the error shown further down. Can somebody help if you have faced a similar issue? Any help would be appreciated. Thank you.
Shell Script
#This ftp script will copy all the files from Buxton directory into the local directory.
#!/usr/local/bin/expect
spawn sftp -b cmdFile XYZ@sftp.abc.com
expect "password:"
send "6n8W6u0J"
interact
Command File
SRC_DIR='From_Bux'
TRGT_DIR='/work/informatica/release81x/DEV2/DEP_EXT_ARCH/SrcFiles/LANDING'
FILE='buxt_summary_20140702.csv'
cd $SRC_DIR
lcd $TRGT_DIR
get $FILE
Execution command: ./get_bxtn_src_files.ksh
Error Message
$ ./get_bxtn_src_files.ksh
./get_bxtn_src_files.ksh[3]: spawn: not found
couldn't read file "password:": no such file or directory
send: unable to stat draft file /home/vvutukuri/Mail/6n8W6u0J: No such file or directory
./get_bxtn_src_files.ksh[6]: interact: not found
This looks like an interpreter issue.
Check again whether your script has #! as the absolute first characters.
head -1 get_bxtn_src_files.ksh | cut -c 1-2
# The result should be: #!
If not, the script will be interpreted with the shell you are logged in with.
The errors you are receiving indicate that the expect commands are being interpreted by a regular shell (ksh/sh/bash/...) instead of by expect.
See the following examples:
$ ksh
$ cd test
$ cat expect1-test.exp # the hash-bang is in my first line
#!/usr/bin/expect
spawn sftp -b cmdFile XYZ@sftp.abc.com
expect "password:"
send "6n8W6u0J"
interact
$ ./expect1-test.exp # Notice the error messages, they come from Expect
spawn sftp -b cmdFile XYZ@sftp.abc.com
No such file or directory (cmdFile).
send: spawn id exp4 not open
while executing
"send "6n8W6u0J""
(file "./expect1-test.exp" line 4)
$ cat expect2-test.exp # Notice the blank line

#!/usr/bin/expect
spawn sftp -b cmdFile XYZ@sftp.abc.com
expect "password:"
send "6n8W6u0J"
interact
$ ./expect2-test.exp # Notice that e.g. spawn is not found.
./expect2-test.exp[3]: spawn: not found [No such file or directory]
couldn't read file "password:": no such file or directory
./expect2-test.exp[5]: send: not found [No such file or directory]
./expect2-test.exp[6]: interact: not found [No such file or directory]
You should also look into autoexpect, which records an interactive session and generates an expect script that you can then edit.
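As a sketch (the output filename here is arbitrary, and the host is the one from your script):
# Record an interactive sftp session into an expect script you can edit later
autoexpect -f get_bxtn_src_files.exp sftp XYZ@sftp.abc.com
# Do the transfer interactively, exit, then adjust the generated script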
Finally: you should consider switching to SSH private/public keys instead of expect, unless of course you are prevented from doing so or need to use a password.
SSH private/public keys
On the remote machine
mkdir -p $HOME/.ssh/
chmod 700 $HOME/.ssh/
chmod go-w $HOME # NB! If sshd is set to StrictModes, your $HOME and its parent directories must not be world writable
On the local machine
mkdir -p $HOME/.ssh # Create the .ssh directory if necessary
chmod 700 $HOME/.ssh # Low perms to protect the directory
ssh-keygen -t dsa -f $HOME/.ssh/id_dsa -N "" # Create the private/public key pair
# with no pw since you'll use it in a script
scp $HOME/.ssh/id_dsa.pub remote_machine:.ssh/authorized_keys # Provided it's not already there; if it is, append your public key to the end of that file instead
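Alternatively, on systems that ship it, ssh-copy-id performs the same step and appends to authorized_keys rather than overwriting it:
ssh-copy-id -i $HOME/.ssh/id_dsa.pub remote_machine # Appends the key, creating .ssh on the remote side if needed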
Verify that you are able to log in without a password:
ssh remote_machine # or user@remote_machine
Your script can now be:
#!/bin/ksh
# Or /usr/bin/ksh..?
sftp -b cmdFile XYZ@sftp.abc.com
On the command line you can also use
scp XYZ@sftp.abc.com:$SRC_DIR/$FILE $TRGT_DIR/

Shell script to Sudo in Remote Machine and execute commands

#!/bin/csh
ssh -o StrictHostKeyChecking=no xyz123@remotemachine.com
sudo -su rootuser
ksh
. /mydir/setup_env.ksh
ls -ltr
Above is the list of tasks I need to do:
Log in to the remote machine without a password prompt
Run sudo to get root access
Change shell to ksh
Execute a script (setup_env.ksh)
List files using ls -ltr
When I execute this script from, let's say, localunixmachine.com:
It asks me for a password.
Once I enter the password, it takes me to the remote machine but doesn't execute the remaining commands.
If I exit the remote session, it then executes the remaining commands.
Can you please guide me on the best way to accomplish what I am trying to do here?
First, copy your SSH public key (which you can generate with ssh-keygen) to authorized_keys on the remote server (root/.ssh/authorized_keys),
and then the script can simply be:
ssh root@remotemachine.com "/bin/ksh mydir/setup_env.ksh"
I think this should work for executing multiple commands remotely:
#!/bin/bash
ssh -o StrictHostKeyChecking=no xyz123@remotemachine.com <<EOF
sudo -su rootuser
ksh
. /mydir/setup_env.ksh
ls -ltr
EOF
As for logging in to the server without a password, you need to set up SSH authentication with keys.
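If sudo prompts for a password or requires a TTY on the remote machine, the heredoc approach above can still stall. A single-command sketch, assuming the account is allowed to run sudo for rootuser non-interactively on that host:
#!/bin/bash
# Run everything as one remote command so nothing waits on nested interactive shells
ssh -o StrictHostKeyChecking=no xyz123@remotemachine.com \
  "sudo -u rootuser ksh -c '. /mydir/setup_env.ksh; ls -ltr'"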
