FTP status check using a variable

I am doing an ftp transfer and I want to check its status. I don't want to use '$?', since ftp mostly returns 0 (success) even when the transfer itself didn't go through.
I know I can check the log file and grep it for "Transfer complete" (status 226). That works fine, but I have many different reports doing ftp, so I want to avoid creating a separate log file for each of them.
Can I get the logged information in a local script variable and process it inside the script itself?
Something similar to these (I've tried both but neither worked):
Grab FTP output in BASH SCRIPT
FTP status check whether successful or not
Below is something similar to what I am trying to do:
ftp -inv ${HOST} > log_file.log <<!
user ${USER} ${PASS}
bin
cd "${TARGET}"
put ${FEEDFILE}
bye
!
Any suggestions on how can I get the entire ftp output in a script variable and then check it within the script?

To capture stdout in a variable you can use bash's command substitution: either OUTPUT=`cmd` or, preferably, OUTPUT=$(cmd).
Here's an example of how to capture the output from ftp in your case:
CMDS="user ${USER} ${PASS}
bin
cd \"${TARGET}\"
put \"${FEEDFILE}\"
bye"
OUTPUT=$(echo "${CMDS}" | ftp -inv ${HOST})
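You can then test the captured output inside the script itself (you may also want to add 2>&1 to the ftp command so error messages are captured too). A minimal sketch, assuming the server reports "226 Transfer complete" on a successful put (the exact wording can vary between servers):
if echo "${OUTPUT}" | grep -qi '226 .*complete'; then
    echo "FTP upload succeeded"
else
    # dump the whole session transcript so the failure can be diagnosed
    echo "FTP upload failed; session output follows:" >&2
    echo "${OUTPUT}" >&2
    exit 1
fi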

Related

Linux shell script to download file(s) from server to PC while connected with putty

I am connected to a server through PuTTY, and I want to download (to my PC) certain files on a regular basis using a shell script. Specifically, these are the files...
ls -t ~/backup | head -n2
What is the best strategy for this? I was trying with command-line FTP, but I am prompted to log in to something. I'm already logged into the server that has the files I need to download, so I am missing something.
The SSH protocol is a good way to do this, using the scp command. You can take a look at this thread.
To automate the process and script a solution, you will need to use password-less ssh with ssh keys.
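For example, a one-time key setup could look like this (a sketch; ssh-copy-id assumes password authentication is still enabled for this initial copy):
ssh-keygen -t rsa -N "" -f ~/.ssh/id_rsa   # generate a key pair with no passphrase
ssh-copy-id username@host                  # append the public key to the remote authorized_keys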
The first step is to get the list of files to copy:
fils=$(ssh username@host 'ls -t ~/backup | head -n2')
(Quoting the remote command makes both ls and head run on the server, and lets ~ expand to the remote home directory.)
Then, once we have the file names in the variable fils, we can loop over the entries and run a secure copy command:
while read fyle
do
scp username@host:~/"$fyle" "$fyle"
done <<< "$fils"

How to create one logfile when ssh-ing to multiple servers

I'd like to create a bash script that automatically connects to a bunch of servers, executes some commands there, and saves the output of those commands in one logfile on the server I use to reach all the others.
So far I have only been able to create a logfile on each of the servers I connect to, or to display the output of each command on the console of the server I'm connecting from.
My script currently looks like this (I know about for loops, but I don't want to use one in this case because I need to execute different commands on each server):
#!/bin/bash
ssh server1 <<EOF
hostname
printf '\n'
mount
EOF
printf '\n'
printf '\n'
printf '\n'
ssh server2 <<EOF
hostname
printf '\n'
mount
EOF
...
My idea was to use the &>> operator, because I need to know whether all commands were executed successfully or not. In the end I'd like to have only one logfile, which should look somewhat like this:
server1
output of mount
server2
output of mount
...
So, how can I manage to create only one large logfile that contains the results of all executed commands? Also, will this script still work correctly if I make use of the ssh -T option to get rid of the message "Pseudo-terminal will not be allocated because stdin is not a terminal."? And do I have to escape special characters like / _ - when using mount in my script to mount something?
Thanks in advance!
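One way to get a single logfile is plain redirection with &>>, as the question suggests. A minimal sketch (the logfile name is an assumption; -T also suppresses the pseudo-terminal warning, and characters like / _ - need no escaping inside a here document):
#!/bin/bash
LOG=all-servers.log
: > "$LOG"                      # truncate the logfile once at the start
ssh -T server1 &>> "$LOG" <<EOF
hostname
mount
EOF
ssh -T server2 &>> "$LOG" <<EOF
hostname
mount
EOF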
I suggest using open-source utilities like logstash or fluentd.
I would use fabric, which is a tool for interacting with several servers over ssh. It provides operations for executing remote shell commands.
For your example, the fabfile would be:
from fabric.api import run

def my_task():
    run('hostname')
    run('mount')
(Note that Python function names cannot contain hyphens, hence my_task.) And you can execute it:
fab -H server1,server2 my_task
Output goes to the standard output of the machine you run fab from, so you can easily redirect it to a file:
fab -H server1,server2 my_task > my_task.log

Adding new user to multiple unix servers using terminal

I am working within a company and need to be added to different branch servers. The current way of doing this is:
sudo /usr/local/bin/sd-adduser test "Test User"
This needs to be done by logging into each server manually, and there are about 20 servers. I vaguely know of expect, which might allow adding a user to multiple servers. Could anyone point me in the right direction, or provide a script to do this?
Any help is appreciated.
Sounds like multi-ssh, pssh, or pdsh could help you.
In the long run you probably want central user management like LDAP.
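For example, with pssh a one-off run could look like this (a sketch; it assumes a hosts.txt file with one server per line, and that sudo on the targets works without a tty):
pssh -h hosts.txt -i 'sudo /usr/local/bin/sd-adduser test "Test User"'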
Routine administration tasks such as this can be done using a script that reads a list of server names and runs a command. Something like this "each-host" script:
#!/bin/sh
for server in $(cat mylist)
do
ssh -t $server "$@"
done
where mylist is a file containing the list of servers.
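For example, mylist might contain one hostname per line (these names are placeholders):
server1
server2
server3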
Thus
each-host sudo /usr/local/bin/sd-adduser test "Test User"
would run the OP's command on each host. Once you get that working, you could tidy it up a little, making it less verbose (not printing /etc/motd):
#!/bin/sh
for server in $(cat mylist)
do
echo "** $server"
ssh -q -t $server "$@"
done

Write bash output to file while using here document

I'm writing a bash script to grab an archive (specifically, the gcc-4.9.1 source) from ftp.gnu.org, which it does using a here document. I'd also like to direct ftp's output to a log file, to keep the script's output clean while preserving any information that might be needed. Finally, I'd like to use bash's || operator to run an error-catching function that prints some output and exits if ftp returns unsuccessfully.
#get the archive
echo "getting the archive from ftp://${HOST}/${FTPPATH}${ARCHIVE}"
/usr/bin/ftp -inv $HOST > ${VERNAME}_logs/ftp.log || errorcatcher "failed to connect to ftp server, check ${VERNAME}_logs/ftp.log" <<FTPSCRIPT
user anonymous
cd $FTPPATH
binary
get $ARCHIVE
FTPSCRIPT
The problem is, this hangs, and ftp.log looks like this:
Trying 208.118.235.20...
Connected to ftp.gnu.org (208.118.235.20).
220 GNU FTP server ready.
ftp>
ftp>
ftp>
The commands are clearly not getting passed to the ftp client, and I imagine either the output redirection or the || is causing this, since without both of them the script successfully gets the archive.
Is there any syntax that allows me to pass interactive commands to ftp while still allowing output redirection and conditional execution following the return?
You are feeding the wrong command with the here document.
/usr/bin/ftp [ ...args... ] || errorcatcher [...] <<FTPSCRIPT
should be
/usr/bin/ftp [ ...args... ] <<FTPSCRIPT || errorcatcher [...]
The contents of the here document do not begin until the line after the one containing the <<. You could even write
/usr/bin/ftp -inv $HOST > ${VERNAME}_logs/ftp.log <<FTPSCRIPT ||
user anonymous
cd $FTPPATH
binary
get $ARCHIVE
FTPSCRIPT
errorcatcher "failed to connect to ftp server, check ${VERNAME}_logs/ftp.log"
if you find that more readable (I'm not sure that I do, but it's an option), or also
/usr/bin/ftp -inv $HOST > ${VERNAME}_logs/ftp.log <<FTPSCRIPT \
|| errorcatcher "failed to connect to ftp server, check ${VERNAME}_logs/ftp.log"
user anonymous
cd $FTPPATH
binary
get $ARCHIVE
FTPSCRIPT

Linux FTP put success output

I have a bash script that creates backups, incremental (daily) and full (on Mondays). Every 7 days the script combines the week of backups (full and incremental) and sends them off to an FTP server. The problem I am having is that I want to delete the files from my backup directory after the FTP upload is finished, but I can't do that until I know the file was successfully uploaded. I need to figure out how to capture the '226 Transfer complete' message so I can use it in an 'if' statement to delete the backup files. Any help is greatly appreciated. Here is the FTP portion of my script:
if [ -a "$WKBKDIR/weekending-$YEAR-$MONTH-$DAY.tar.gz" ]; then
HOST=192.168.40.30 #This is the FTP servers host or IP address.
USER=username #This is the FTP user that has access to the server.
PASS=password #This is the password for the FTP user.
ftp -inv $HOST << EOF
user $USER $PASS
cd /baks
lcd $WKBKDIR
put weekending-$YEAR-$MONTH-$DAY.tar.gz
bye
EOF
fi
I could use whatever means I needed, I suppose; FTP was just something already set up for another backup function, thanks.
2nd EDIT: Ahmed, the rsync approach works great in a test from the command line, and it's a lot faster than FTP. The server is on the local network, so SSH is not that big of a deal, but it's nice to have for added security. I will finish implementing it in my script tomorrow, thanks again.
FTP OPTION
The simple solution would be to do something like this:
ftp -inv $HOST >ftp-results-$YEAR-$MONTH-$DAY.out 2>&1 <<-EOF
user $USER $PASS
cd /baks
bin
lcd $WKBKDIR
put weekending-$YEAR-$MONTH-$DAY.tar.gz
bye
EOF
Also, there is an issue with your here-document syntax: there is no space between << and the delimiter word (in your case EOF), and I added a - because your ACTUAL closing delimiter has leading white-space (it's tabbed in for the if/fi block); <<- strips leading tabs, so the [-] is required.
Now when you do this, you can parse the output file and look for a successful put. For example:
if grep -qi '226 transfer complete' ftp-results-$YEAR-$MONTH-$DAY.out; then
echo "It seems that FTP transfer completed fine, we can schedule a delete"
echo "rm -f $PWD/weekending-$YEAR-$MONTH-$DAY.tar.gz" >> scheduled_cleanup.sh
fi
Then just run scheduled_cleanup.sh from cron at a given time; this way you will have some margin before the files are cleaned up.
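For example, a crontab entry to run the cleanup nightly could look like this (the time and path are assumptions):
# run the scheduled cleanup at 03:30 every day
30 3 * * * /bin/sh /path/to/scheduled_cleanup.sh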
If your remote FTP server has good SITE or PROXY options you may be able to get the remote FTP to run a checksum on the uploaded file after successful upload and return the result.
SCP / RSYNC OPTION
Using FTP is clunky and dangerous; you should really try to see if you can get scp or ssh access to the remote system.
If you can, then generate an ssh key (if you don't have one) using ssh-keygen:
ssh-keygen -N "" -t rsa -f ftp-rsa
Append the contents of ftp-rsa.pub to /home/$USER/.ssh/authorized_keys on $HOST.
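ssh-copy-id can do that append for you (a sketch; it assumes password authentication is still enabled for this one copy, and because the key has a non-default name you may also need -i ftp-rsa on the ssh/scp side or a matching entry in ~/.ssh/config):
ssh-copy-id -i ftp-rsa.pub $USER@$HOST
With the key in place you have a much nicer method for uploading files: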
if scp -B -C weekending-$YEAR-$MONTH-$DAY.tar.gz $USER@$HOST:/baks/ ; then
echo Upload successful 1>&2
else
echo Upload failed 1>&2
fi
Or better yet using rsync:
if rsync --progress -a weekending-$YEAR-$MONTH-$DAY.tar.gz $HOST:/baks/ ; then
echo Upload successful 1>&2
else
echo Upload failed 1>&2
fi
Et voilà, you are done; since rsync works over ssh, you are happy and secure.
Try the following:
#!/bin/bash
runifok() { echo "will run this when the transfer is OK"; }
if [ -a "$WKBKDIR/weekending-$YEAR-$MONTH-$DAY.tar.gz" ]; then
HOST=192.168.40.30 #This is the FTP servers host or IP address.
USER=username #This is the FTP user that has access to the server.
PASS=password #This is the password for the FTP user.
ftp -inv $HOST <<EOF | grep -qi '226 transfer complete' && runifok
user $USER $PASS
cd /baks
lcd $WKBKDIR
put weekending-$YEAR-$MONTH-$DAY.tar.gz
bye
EOF
fi
Test it, and when it runs OK, replace the echo in the runifok function with the commands you want to execute after the upload is successful.
