I have a simple shell script that transfers a daily log file to another Windows FTP server.
The problem is that if the file is already there, the script still uploads a new copy even though the file name is exactly the same.
How can I add a quick check to this script so that, if the file is already there, it does not proceed with the FTP transfer?
ftp -n -v $HOST << EOT
user $USER $PASSWD
prompt
bin
mput $FILE
bye
EOT
It is easy to do this in Unix with ftp. First log in to the system through ftp, run an ls command, and save the listing of the remote files into a history.txt file (see my example below). Then, when transferring a file, first check whether it is already listed in the history file; if it is, do not transfer it. I do it like this:
HISTORY_FILE="history.txt"
ftp -n -v $HOST << EOT
user $USER $PASSWD
prompt
bin
ls -rtE $HISTORY_FILE
bye
EOT
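(The stock ftp client's ls command accepts an optional local file as its second argument, which is why the remote listing above ends up in $HISTORY_FILE on the local machine instead of on the terminal.)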
Now you can use the command below to check:
ISFILENAMEEXIST=$(grep "$FILE" "$HISTORY_FILE")
If the file already exists in history.txt, do not send it; if it is not there, send it through ftp.
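A minimal sketch of how the check could gate the upload, reusing the variables from the question and the answer above (the grep pattern may need anchoring if one file name is a substring of another):

if grep -q "$FILE" "$HISTORY_FILE"; then
    echo "$FILE is already on the server, skipping transfer."
else
ftp -n -v $HOST << EOT
user $USER $PASSWD
prompt
bin
mput $FILE
bye
EOT
fi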
I am using a bash script on Linux to transfer files to a server. The script runs from cron and I have redirected its output to a file, but I cannot tell from the logs whether the file has been transferred to the B server or not.
This is the cron:
1>>/home/owais/script_test/logs/res_sim_script.logs 2>>/home/owais/script_test/logs/res_sim.logs
And the FTP is as below:
cd ${dir}
ftp -n $HOST <<END_SCRIPT
quote USER $USER
quote PASS $PASSWD
lcd $dir
cd $destDir
bin
prompt
put FILENAME
bye
END_SCRIPT
The only thing that I get in the logs is:
Local directory now Directory_Name
Interactive mode off.
Instead of using FTP, there is rsync. Rsync is a fast and extraordinarily versatile file-copying tool. It can copy locally, to or from another host over any remote shell, or to or from a remote rsync daemon.
More information is available at https://linux.die.net/man/1/rsync
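A sketch of what the transfer might look like with rsync over SSH; the host, user, and paths below are placeholders, and the exit status plus --log-file output make it easy to tell from cron whether the copy succeeded:

#!/bin/bash
# Hypothetical paths; adjust to your environment.
SRC="/home/owais/script_test/outgoing/file.log"
DEST="user@remote-host:/destination/dir/"
LOG="/home/owais/script_test/logs/rsync.log"

if rsync -av --log-file="$LOG" "$SRC" "$DEST"; then
    echo "$(date): transfer of $SRC succeeded" >> "$LOG"
else
    echo "$(date): transfer of $SRC FAILED" >> "$LOG"
fi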
I have used ftp -inv Host << EOF >> LogFilePath and it worked. Thank you all for the support
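For reference, a sketch of that pattern with the session output appended to a log (the log path is taken from the cron line above, the rest reuses the question's variables); grepping the log for the FTP 226 reply is one way to confirm the transfer completed:

ftp -inv $HOST << EOF >> /home/owais/script_test/logs/res_sim_script.logs 2>&1
user $USER $PASSWD
lcd $dir
cd $destDir
bin
put FILENAME
bye
EOF
grep -q "226" /home/owais/script_test/logs/res_sim_script.logs && echo "transfer completed" || echo "transfer may have failed"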
I need to upload the files in the directory /home/test to my FTP server, into a specific folder. I will then schedule the script hourly via cron. Any examples?
I have tried most of the solutions, but I'm not able to put the file.
This is my code
HOST=*****
USER=****
PASSWORD=****
PORT=990
FILE=test.txt
ftp -inv $HOST << EOF
user $USER $PASSWORD
cd /Test
put test.test
bye
EOF
I found the solution: just do
lftp -u $USER,$PASSWORD ftps://$HOST:$PORT <<EOF
set ssl:verify-certificate no
set ftp:ssl-protect-data true
put test.txt
exit
EOF
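Port 990 is the implicit-FTPS port, which the stock ftp client does not speak; that is most likely why the plain ftp attempt failed, and why lftp with an ftps:// URL (plus the ssl settings above) works.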
Thanks a lot for your ideas.
We need to move all the files from a particular folder to an FTP server. We wrote a script but are getting a "directory not found" error. Below is the script.
#!/bin/sh
HOST='192.168.1.100'
USER='ramesh'
PASSWD='rameshftp'
ftp -inv $HOST << EOF
user $USER $PASSWD
cd /home/Ramesh
put *.zip
bye
EOF
Our requirement is to copy all the files that reside in a directory on a SUSE Linux server to the FTP server. For example: copy all the contents of the /home/Ramesh directory and put them onto the FTP server.
You can do this in one line with ncftp:
ncftpput -u username -p password ftp-host-name /path/to/remote/dir /path/to/local/dir/*
See http://linux.die.net/man/1/ncftp for more info
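Applied to the values in the question, it would look something like the line below; the remote directory is hypothetical. Note that in the original script, cd /home/Ramesh changes to a directory on the remote server, not the local machine, which is the likely cause of the "directory not found" error.

ncftpput -u ramesh -p rameshftp 192.168.1.100 /remote/upload/dir /home/Ramesh/*.zip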
I'm having issues with my Unix FTP script...
It's only transferring the first three files in the directory that I'm local cd'ing into during the FTP session.
Here's the bash script that I'm using:
#!/bin/sh
YMD=$(date +%Y%m%d)
HOST='***'
USER='***'
PASSWD=***
FILE=*.png
RUNHR=19
ftp -inv ${HOST} <<EOF
quote USER ${USER}
quote PASS ${PASSWD}
cd /models/rtma/t2m/${YMD}/${RUNHR}/
mkdir /models/rtma/t2m/${YMD}/
mkdir /models/rtma/t2m/${YMD}/${RUNHR}/
lcd /home/aaron/grads/syndicated/rtma/t2m/${YMD}/${RUNHR}Z/
binary
prompt
mput ${FILE}
quit
EOF
exit 0
Any ideas?
I encountered the same issue: I had to transfer 400K files, but mput * or mput *.pdf would not move all the files in one go.
I tried increasing the timeout: failed.
I tried -r (recursive): failed.
I tried increasing the data/control timeout in IIS: failed.
I tried -i and prompt scripting: failed.
Finally I used a portable FileZilla, connected from the source, and transferred all the files.
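If a scripted route is still needed, lftp's reverse mirror is another option that tends to handle very large file counts better than the stock ftp client; a sketch, with the host, credentials, and paths as placeholders:

#!/bin/sh
HOST='ftp.example.com'
USER='user'
PASSWD='pass'
lftp -u "$USER","$PASSWD" "$HOST" <<EOF
mirror -R --parallel=4 --verbose /local/source/dir /remote/target/dir
bye
EOF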
I would like to synchronize two folders with each other. It should go both ways, always keeping the folders up to date (I use a regular cron job). However, first, I cannot get the two-way file transfer to work (it only downloads from the FTP server, not the other way around).
Secondly, it downloads the entire content of the FTP server, even though the login has been set up on the FTP server so that access is restricted to a specific folder. Why?
Here is the code (thanks in advance!):
#!/bin/bash
#get username and password
USER=username
PASS=password
HOST="myftpserver.com/users/user1/" #here I have tried with only specifying server name as well as including whole path
LCD="~/Desktop/localfolder/"
RCD="users/user1/"
lftp -c "set ftp:list-options -a;
open ftp://$USER:$PASS@$HOST;
lcd $LCD;
mirror -c --reverse --verbose $LCD $RCD" #I have tried a few different options w/o result
You probably don't need this anymore (4 years late), but I'll update this anyway; if someone gets here with the same issue, here is some help.
Local directory to FTP server directory
If you want to sync the FTP server folder with the content of your local folder, you should use something like this:
#!/bin/bash
#get username and password
USER=username #Your username
PASS=password #Your password
HOST="myftpserver.com" #Keep just the address
LCD="~/Desktop/localfolder" #Your local directory
RCD="/users/user" #FTP server directory
lftp -f "
open $HOST
user $USER $PASS
lcd $LCD
mirror --continue --reverse --delete --verbose $LCD $RCD
bye
"
FTP server directory to your local directory
Simply remove the --reverse and swap the folders in the mirror command.
#!/bin/bash
#get username and password
USER=username #Your username
PASS=password #Your password
HOST="myftpserver.com" #Keep just the address
LCD="~/Desktop/localfolder" #Your local directory
RCD="/users/user" #FTP server directory
lftp -f "
open $HOST
user $USER $PASS
lcd $LCD
mirror --continue --delete --verbose $RCD $LCD
bye
"
To do what you described in the question, syncing both ways and keeping the most recently updated copy from each side, I don't believe it is possible using lftp alone; you will need something to detect the change and decide which script to run.
If you want to sync the remote FTP server folder to the local folder and are using lftp 4.9 or above, try this script:
#!/bin/bash
LFTP_HOME=/home/lftp-4.9.2
#get username and password
REMOTE_FTP_USER="user"
REMOTE_FTP_PASS="passwd"
REMOTE_HOST="ftp-server"
REMOTE_PORT="ftp-port"
LOCAL_FOLDER="/home/ftpRoot/backup_mirror/"
REMOTE_FOLDER="/"
cd $LOCAL_FOLDER
$LFTP_HOME/bin/lftp -f "
open -p $REMOTE_PORT $REMOTE_HOST
user $REMOTE_FTP_USER $REMOTE_FTP_PASS
mirror -c -e --verbose --target-directory=$LOCAL_FOLDER $REMOTE_FOLDER
bye
"