SFTP files from remote server to local server - Linux

Sorry if this is too simple a question, but I am a Java developer with no shell scripting experience.
I googled, but couldn't find exactly what I am looking for.
My requirements:
1. Connect to the remote server using SFTP (authentication based on public/private keys), with a variable pointing to the private key file.
2. Transfer files with a specific extension (.log) to a local folder, with variables for the remote server path and the local folder.
3. Rename the transferred files on the remote server.
4. Log all transferred files in a .txt file.
Can anyone give me a shell script for this?
This is what I have framed so far from the suggestions.
Still some questions left on my side ;)
export PRIVKEY=${private_key_path}
export RMTHOST=user@remotehost
export RMTDIR=/logs/*.log
export LOCDIR=/downloaded/logs/
export LOG=success.txt

# Copy the remote .log files to the local folder
scp -i "$PRIVKEY" "$RMTHOST:$RMTDIR" "$LOCDIR"

# Record each downloaded file in the log
for i in "$LOCDIR"/*.log
do
    echo "$i" >> "$LOG"
done

# Rename the transferred files on the remote server
ssh -i "$PRIVKEY" "$RMTHOST" 'for i in /logs/*.log; do mv "$i" "$i.transferred"; done'

What about this approach?

Connect to remote server using SFTP (authentication based on public/private keys), with a variable pointing to the private key file. Transfer files with a specific extension (.log) to a local folder, with variables for the remote server path and the local folder:
scp your_user@server:/dir/of/file/*.log /your/local/dir
Log all the transferred files in a .txt file:
for file in /your/local/dir/*.log
do
    echo "$file" >> "$your_txt"
done
Rename the transferred files on the remote server (note that $file from the glob already contains the full path, and ssh takes the remote command directly rather than via -c):
ssh your_user@server 'for file in /dir/of/file/*.log; do mv "$file" "${file}.transferred"; done'

Use the scp (secure copy) command to transfer the file. You may also want to add the -C switch, which compresses the file in transit; that can speed things up a bit. E.g., to copy file1 from server1 to server2,
On server1:
#!/bin/sh
scp -C /home/user/file1 root@server2.com:/home/user
Edit:
#!/bin/sh
scp -i {path/to/pub/pri/key/file} /home/user/file1 root@server2.com:/home/user
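If sftp itself is required rather than scp, the same flow can be done with sftp's batch mode. A minimal sketch, assuming the paths and variable values from the question (none of this is verified against your server):
#!/bin/sh
# Sketch: download *.log via sftp batch mode, log the names locally,
# then rename the originals on the remote side over ssh.
PRIVKEY=/path/to/private_key   # assumed key location
RMTHOST=user@remotehost
LOCDIR=/downloaded/logs
LOG=success.txt

# -b - makes sftp read its batch commands from stdin
sftp -i "$PRIVKEY" -b - "$RMTHOST" <<EOF
get /logs/*.log $LOCDIR/
EOF

# record what arrived locally
for f in "$LOCDIR"/*.log; do
    echo "$f" >> "$LOG"
done

# sftp's rename works on single files, so do the bulk rename over ssh
ssh -i "$PRIVKEY" "$RMTHOST" 'for f in /logs/*.log; do mv "$f" "$f.transferred"; done'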

Related

FTP transfer issue

I am building an FTP bash script to generate a .csv file and transfer it from a Linux machine to another server, but the transfer triggers an error and the file never arrives on the second server. What could be the problem?
This is the error:
TEST: A file or directory in the path name does not exist.
Filename invalid
And it doesn't matter if I put a / before TEST; it triggers the same issue.
This is my script:
HOST='ipadress'
USER='user'
PASSWD=''
TARGET='TEST'
# Parameters
set -x
DATE=`date +%Y%m%d%H%M`
SQL=/home/sql_statement.sql
QUERYCMD=/home/report.sh
CSV=/home/csv/test_$DATE.csv
# Interrogate the SQL and put the result in the folder
$QUERYCMD ${SQL} ${CSV}
# Send the .csv file to the target folder
cd /home/csv
ftp -n $HOST <<EOF
quote USER $USER
quote PASS $PASSWD
lcd $TARGET
put $CSV $TARGET
quit
EOF
exit 0
Does the symbol TARGET refer to a directory on the remote host?
The ftp command lcd changes directory on the local (client) side, while cd changes directory on the remote (server) side. Also, the remote side usually has a designated FTP root directory, so adjust any path relative to that starting point. To confirm directory contents, you might add the ftp commands ls and !ls on separate lines right after the PASS line.
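For example, assuming TEST is a directory under the remote FTP root, the transfer block of the script could look like this (the trailing ls is only there to confirm the upload):
cd /home/csv
ftp -n $HOST <<EOF
quote USER $USER
quote PASS $PASSWD
cd $TARGET
put test_$DATE.csv
ls
quit
EOF
Because the shell has already changed into /home/csv, put uploads the bare file name and the file lands inside $TARGET under that same name.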

How to SCP files which are being FTPed by another process & delete them on completion?

Files are being transferred to a directory on my machine by the FTP protocol. I need to SCP these files to another machine and delete them on completion.
How can I detect that a file transfer by FTP has finished and the file is safe to SCP?
There's no reliable way to detect completion of the transfer. Some clients send the ALLO command and pass the size of the file before actually uploading it, but this is not a universal rule, so you can't rely on it. All in all, it's possible that the client streams the data and there's no definite "end" of file on its side.
If the client is under your control, you can make it upload files with extension A and after upload rename the files to extension B. And then you transfer only files with extension B.
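A sketch of that convention on the receiving side; the .done extension, the landing directory and the scp target are all assumptions for illustration:
#!/bin/bash
# Convention: the uploader writes file.part, then renames it to file.done
# once the FTP upload has finished; we only ever touch *.done files.
INCOMING=/var/ftp/incoming           # assumed landing directory
DEST=user@otherhost:/data/incoming   # assumed scp target

for f in "$INCOMING"/*.done
do
    [ -e "$f" ] || continue          # no matches: the glob stays literal
    if scp "$f" "$DEST"              # delete only after a successful copy
    then
        rm -- "$f"
    fi
done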
You can do a script like this:
#!/bin/bash
EXPECTED_ARGS=1
E_BADARGS=65
# Argument check
if [ $# -lt $EXPECTED_ARGS ]
then
    echo "Usage: `basename $0` <folder_update_1> <folder_update_2> <folder_update_3> ..."
    exit $E_BADARGS
fi
folders=( "$@" )
for folder in "${folders[@]}"
do
    # Send the folder or file to the new machine
    time rsync --update -avrt -e ssh /local/path/of/$folder/ user@192.168.0.10:/remote/path/of/$folder/
    # Delete the local folder
    rm -r /local/path/of/$folder/
done
It is configured to send folders. If you want to send individual files instead, make small changes to the script:
time rsync --update -avrt -e ssh /local/path/of/$file user@192.168.0.10:/remote/path/of/$file
rm /local/path/of/$file
rsync is similar to scp. I prefer rsync, but you can swap it out.
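Note that rsync can also handle the deletion itself, removing each source file only after it has transferred successfully, which is safer than the unconditional rm above:
# --remove-source-files deletes each file on the sender once it has been
# transferred (the directories themselves are left behind)
time rsync --update -avrt --remove-source-files -e ssh /local/path/of/$folder/ user@192.168.0.10:/remote/path/of/$folder/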

How to save the log file under a different name

scp user@server:/home/loghost??/logfiles.log .
I'm using the above scp command in my Unix script to download all the logs from the loghost folders.
There are multiple loghost folders on my server (i.e. loghost01, loghost02, loghost03).
The log name is the same in every loghost folder, so while scping, the logs get overwritten. Is there a way to change the log name while copying?
for server in loghost01 loghost02 loghost03; do
    mkdir -p $server
    scp user@$server:/home/$server/logfiles.log $server/
done
I think something like that might help.
It takes a list of your servers and scps the files over into a per-server folder, e.g. loghost01/logfiles.log.
If you have a list of servers in a text file, replace the top line with:
for server in `cat file_containing_servers`; do
Put logs from different servers into different directories:
for server in loghost{01,02,03}
do
    mkdir -p $server
    scp user@$server:/home/$server/logfiles.log ./$server/
done
Put logs from different servers into the same directory with different names:
for server in loghost{01,02,03}
do
    scp user@$server:/home/$server/logfiles.log ./$server.log
done

Create and update archive over ssh on local machine

I am trying to find a way to create and update a tar archive of files on a remote system, where we don't have write permissions (the remote file system is read-only), over ssh. I've figured out that the way to create an archive is:
ssh user@remoteServer "tar cvpjf - /" > backup.tgz
However, I would like to know if there is some way of performing only incremental backups from this point on (of only the files that have actually changed). Any help with this is much appreciated.
You can try using the --listed-incremental option of tar:
http://www.gnu.org/software/tar/manual/html_node/Incremental-Dumps.html
The main problem is that you have no way to pipe the snar file through stdout, because stdout is already carrying backup.tgz. The best option is to create the snar file in the remote /tmp directory, where you should have write permission, and then download it at the end of the backup session.
For example:
ssh user@remoteServer "tar --listed-incremental=/tmp/backup-1.snar -cvpjf - /" > backup-1.tgz
scp user@remoteServer:/tmp/backup-1.snar .
And in the following session you will use that .snar file to avoid copying the same files:
scp backup-1.snar user@remoteServer:/tmp/backup-1.snar
ssh user@remoteServer "tar --listed-incremental=/tmp/backup-1.snar -cvpjf - /" > backup-2.tgz
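Putting both sessions together, a small reusable wrapper might look like this; the snar locations and date-based archive names are assumptions:
#!/bin/sh
# Sketch: repeatable incremental backup over ssh. The remote host keeps
# the snapshot in /tmp, and we keep a local copy between runs.
HOST=user@remoteServer
SNAR=/tmp/backup.snar
TAG=$(date +%Y%m%d)

# push the previous snapshot up, if we have one (first run: none exists,
# so tar performs a full, level-0 backup)
[ -f backup.snar ] && scp backup.snar "$HOST:$SNAR"

# archive only what changed since the snapshot was last updated
ssh "$HOST" "tar --listed-incremental=$SNAR -cvpjf - /" > "backup-$TAG.tgz"

# pull the updated snapshot back down for the next session
scp "$HOST:$SNAR" backup.snar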

lftp + bash script + variables

I'm using lftp to mirror files from an external server, but what I need now is to rename the source directory (on the remote server) after a successful download. Basically, I need to open a connection to the remote server, list the directories, download every directory whose name starts with "todo" (i.e. todo.20121019), and after success rename each downloaded directory to "done.20121019". There might be more than one such directory on the server.
The remote FTP server works only with active connections.
#!/bin/bash
directories=`lftp -f lftp_script_file.lf | grep done | awk '{print $NF}'`
for i in $directories
do
    echo $i   # here I get the list of directories that should be downloaded and renamed
done
lftp_script_file.lf just lists the directories:
set ftp:passive-mode false;
open ftp://user:pass$@10.10.10.123
ls my_sub_dir/
Is there a way to:
open a connection to the FTP server
find the directories that I want to download
add those dirs to a queue and download them
rename the directories on the remote server
all in a batch file?
What I was trying to achieve was to list the dirs, find the interesting ones, then download and rename them, but I can't find a way to pass the list of dirs to lftp via a bash script together with "set ftp:passive-mode false".
To substitute variables into lftp commands, use something like this:
lftp -e "cmd1;cmd2"
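For the workflow in the question (download every todo.* directory, then rename it to done.* on the server), a sketch along these lines might work; the host, credentials and my_sub_dir are placeholders taken from the question:
#!/bin/bash
# Sketch: mirror every remote todo.* directory, then rename it remotely.
HOST=10.10.10.123
FTPUSER=user
FTPPASS=pass

# list the remote directories whose names start with "todo"
dirs=$(lftp -e "set ftp:passive-mode false; cd my_sub_dir; cls -1; quit" \
            -u "$FTPUSER,$FTPPASS" "$HOST" | grep '^todo')

for d in $dirs
do
    done_name=${d/todo/done}    # todo.20121019 -> done.20121019
    # download the directory, then rename it on the server
    lftp -e "set ftp:passive-mode false; cd my_sub_dir; mirror $d $d; mv $d $done_name; quit" \
         -u "$FTPUSER,$FTPPASS" "$HOST"
done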
