I am building an FTP bash script to generate a .csv file and transfer it from a Linux machine to another server, but I have a problem: it triggers an error and the file is not transferred to the second server. What could be the problem?
This is the error:
TEST: A file or directory in the path name does not exist.
Filename invalid
And it doesn't matter if I put a / before TEST; it triggers the same issue.
This is my script:
HOST='ipaddress'
USER='user'
PASSWD=''
TARGET='TEST'
# Parameters
set -x
DATE=`date +%Y%m%d%H%M`
SQL=/home/sql_statement.sql
QUERYCMD=/home/report.sh
CSV=/home/csv/test_$DATE.csv
# Run the SQL statement and put the result in the folder
$QUERYCMD ${SQL} ${CSV}
# Send the .csv file to the target folder
cd /home/csv
ftp -n $HOST <<EOF
quote USER $USER
quote PASS $PASSWD
lcd $TARGET
put $CSV $TARGET
quit
EOF
exit 0
Does the variable TARGET refer to a directory on the remote host?
The ftp command lcd changes directory on the local (client) side, while cd changes directory on the remote (server) side. Also, the remote side usually has a designated FTP root directory, so adjust any remote path relative to that starting point. To confirm directory contents, you might add the ftp commands ls and !ls on separate lines right after the PASS line. As written, lcd $TARGET looks for a local directory named TEST, and put $CSV $TARGET tries to store the file under the remote name TEST, which would explain the "Filename invalid" error.
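For example, a minimal sketch of the transfer block with those changes, assuming TEST is a directory directly under the FTP root on the remote server:
cd /home/csv
ftp -n $HOST <<EOF
quote USER $USER
quote PASS $PASSWD
cd $TARGET
put test_$DATE.csv
quit
EOF
Because the script already does cd /home/csv locally, putting just the file name stores it as test_$DATE.csv inside TEST on the remote side.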
Current code
#!/bin/bash
SFTP_SERVER="sftp.url.com:/csv/test/10"
SFTP_USER="user"
SFTP_PWD="pwd"
## not sure if this line is needed given I specify the local directory
# in the next block of code.
cd /mnt/c/Users/user/Documents/new_directory
lftp sftp://$SFTP_USER:$SFTP_PWD@$SFTP_SERVER
lftp -e mget *.csv mirror sftp.user.com:/csv/test/10 /mnt/c/Users/user/Documents/new_directory
Objective
Download all csv files and mirror my local directory with the remote server, so that when the code is run again it won't download the same files a second time.
Error received
open: *.csv: Name or service not known
Comments
From what I understood of the lftp man page, I should be able to get all files matching a wildcard by using mget instead of the standard get, provided I use -e to pass the commands. I've run mget manually and can download the files without issue, but it doesn't seem to handle the *.csv in the script.
I'd appreciate any feedback you can provide as to why my code won't download the files and what I might have misunderstood from the man pages.
It should be like this:
lftp sftp://$SFTP_USER:$SFTP_PWD@$SFTP_SERVER -e "mget *.csv; bye"
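If the goal is also to avoid re-downloading files on later runs, lftp's mirror command can do the whole job in one invocation. A sketch, assuming the same host, credentials, and paths as in the question; --only-newer skips files that already exist locally and --include-glob limits the transfer to .csv files:
#!/bin/bash
SFTP_USER="user"
SFTP_PWD="pwd"
# Mirror only *.csv from the remote directory into the local one;
# files that are already present and up to date locally are skipped.
lftp -u "$SFTP_USER,$SFTP_PWD" sftp://sftp.url.com -e "mirror --only-newer --include-glob '*.csv' /csv/test/10 /mnt/c/Users/user/Documents/new_directory; bye"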
I have a shell script, called abc.sh, to FTP a file from one server to another server. Below is the code inside it:
#!/bin/bash
HOST='10.18.11.168'
USER='india'
PASS='India#2017'
FILE='inload.dat'
DIRECTORY='/inloading'
ftp -n $HOST <<END_SCRIPT
user $USER $PASS
cd $DIRECTORY
put $FILE
quit
END_SCRIPT
exit 0
I am able to run it using ./abc.sh, and the file also gets copied to the remote server.
But when I use it in crontab, it does not FTP the file.
Below is the crontab entry:
15 01 * * * /user/loader/abc.sh > /user/loader/error.log 2>&1
In error.log it shows: local: inload.dat: No such file or directory
You're referencing the file inload.dat, which is relative to the directory the script is run from. When you run the script as ./abc.sh it looks for an inload.dat in the same directory.
Cron chooses which directory to run your script from when it executes (IIRC it generally defaults to /root or your HOME directory, but the specific location doesn't matter), and it's not necessarily the same directory that you're in when you run ./abc.sh.
The right solution is to make FILE an absolute path to the full location of inload.dat, so that your script no longer depends on being run from a particular directory in order to succeed.
There are other options, such as dynamically determining the directory the script lives in, but simply using absolute paths is generally the better choice.
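A minimal sketch of that fix, assuming inload.dat actually lives in /user/loader (adjust the path to wherever the file really is):
#!/bin/bash
HOST='10.18.11.168'
USER='india'
PASS='India#2017'
# Absolute path (assumed location), so the script no longer depends on cron's working directory
FILE='/user/loader/inload.dat'
DIRECTORY='/inloading'
ftp -n $HOST <<END_SCRIPT
user $USER $PASS
cd $DIRECTORY
put $FILE inload.dat
quit
END_SCRIPT
exit 0
Giving put an explicit remote name (inload.dat) keeps the full local path out of the remote file name.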
Your local path is probably not what you want it to be. Before executing the ftp command, add a cd to the directory where the file is located, or put the full path name of the file in $FILE.
I need to upload a file from a source Unix server to a destination Unix server (which supports sftp). I'm using the simple script below:
cd /usr/bin
sftp userid@destination_server <<EOF
put myfile /
EOF
I get: Host key verification failed. Couldn't read packet: Connection reset by peer
I know this must have something to do with the public ssh key of my source not being set correctly on the destination server. But otherwise, is my script correct? Or do you suggest any other script based on my simple requirement stated above? Please note this doesn't need any password; just the user name is sufficient, and the remote directory is just the root directory, hence the /.
Simply use an SFTP batch file:
sftp -b batchfile.sftp userid@destination_server
with batchfile.sftp containing exactly this one line (or whatever further commands you need):
put myfile /
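If you prefer to keep everything in one script, sftp can also read the batch commands from standard input by passing -b -; a sketch using the same placeholders as above:
sftp -b - userid@destination_server <<EOF
put myfile /
EOF
Note that batch mode disables interactive password prompting, so key-based (password-less) authentication must be working, which matches the setup described in the question.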
Sorry if it's too simple a question, but I am a Java developer with no idea of shell scripting.
I googled, but couldn't find exactly what I am looking for.
My requirement
1. Connect to remote server using Sftp [authentication based on pub/pri keys]. A variable to point to the private key file.
2. Transfer files with specific extension [.log] to a local server folder. Variables to set the remote server path and the local folder.
3. Rename the transferred files on the remote server.
4. Log all the transferred files in a .txt file.
Can anyone give me a shell script for this?
This is what I have framed so far from the suggestions.
Still some questions left on my side ;)
export PRIVKEY=${private_key_path}
export RMTHOST=user@remotehost
export RMTDIR=/logs/*.log
export LOCDIR=/downloaded/logs/
export LOG=sucess.txt
scp -i $PRIVKEY $RMTHOST:$RMTDIR $LOCDIR
for i in 'ls -1 $LOCDIR/*.log'
do
echo $i >> $LOG
done
ssh $RMTHOST -c "for i in `ls -1 $RMTDIR; do mv /logs/$i /logs/$i.transferred; done"
What about this approach?
Connect to remote server using Sftp [authentication based on pub/pri keys]. A variable to point to private key file
Transfer files with specific extension [.log] to local server folder. Variable to set remote server path and local folder
scp your_user@server:/dir/of/file/*.log /your/local/dir
Log all the transferred files in a .txt file
for file in /your/local/dir/*.log
do
echo "$file" >> $your_txt
done
Rename the transferred file in remote server
ssh your_user@server 'for file in /dir/of/file/*.log; do mv "$file" "$file.transferred"; done'
(Adjust the mv target to whatever new name you want to base on $file.)
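Putting those pieces together, a sketch using the variable names from the question; the key path, remote directory, and the .transferred rename suffix are assumptions to adjust:
#!/bin/bash
PRIVKEY=/home/user/.ssh/id_rsa      # assumed location of the private key
RMTHOST=user@remotehost
RMTDIR=/logs
LOCDIR=/downloaded/logs
LOG=transferred.txt
# Pull all .log files from the remote directory, authenticating with the key
scp -i "$PRIVKEY" "$RMTHOST:$RMTDIR/*.log" "$LOCDIR/"
# Record every file that arrived locally
for f in "$LOCDIR"/*.log
do
    echo "$f" >> "$LOG"
done
# Rename the originals on the remote side so they are not transferred again
ssh -i "$PRIVKEY" "$RMTHOST" 'for f in /logs/*.log; do mv "$f" "$f.transferred"; done'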
Use the scp (secure copy) command to transfer the file. You may also want to add the -C switch, which compresses the file; that can speed things up a bit. E.g., to copy file1 from server1 to server2,
On server1:
#!/bin/sh
scp -C /home/user/file1 root@server2.com:/home/user
Edit:
#!/bin/sh
scp -i {path/to/pub/pri/key/file} /home/user/file1 root@server2.com:/home/user
I'm using lftp to mirror files from an external server, but now what I need is to rename the source directory (on the remote server) after a successful download. Basically I need to open a connection to the remote server, list the directories, and download every directory whose name starts with "todo", e.g. todo.20121019; after success I must rename the downloaded directory to "done.20121019". There might be more than one such directory on the server.
The remote FTP server works only with active connections.
#!/bin/bash
directories=`lftp -f lftp_script_file.lf |grep done|awk '{print $NF}'`
for i in $directories
do
echo $i  # here I get the list of directories that should be downloaded and renamed
done
lftp_script_file.lf just lists the directories:
set ftp:passive-mode false;
open ftp://user:pass$@10.10.10.123
ls my_sub_dir/
Is there a way to:
open a connection to the ftp server
find the directories that I want to download
add those dirs to the queue and download them
rename the directories on the remote server
all from a batch file?
What I was trying to achieve was to list the dirs, find the interesting ones, then download and rename them, but I can't find a way to pass the list of dirs to lftp from a bash script together with "set ftp:passive-mode false".
To be able to substitute variables into lftp commands, use something like this:
lftp -e "cmd1;cmd2"
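For example, a sketch that lists the remote todo.* directories, builds one command string to download each one and rename it to done.<date>, and runs it over an active-mode connection. The host, credentials, and my_sub_dir path are placeholders from the question, and directory names are assumed to contain no spaces:
#!/bin/bash
HOST=10.10.10.123
USER=user
PASS=pass
# 1) List remote directories under my_sub_dir that start with "todo"
dirs=`lftp -u "$USER,$PASS" -e "set ftp:passive-mode false; cls -1 my_sub_dir/; bye" ftp://$HOST | grep todo`
# 2) Build one lftp command string: mirror each dir locally, then rename it remotely
cmds="set ftp:passive-mode false;"
for d in $dirs
do
    name=`basename $d`            # e.g. todo.20121019
    stamp=${name#todo.}           # strip the todo. prefix -> 20121019
    cmds="$cmds mirror my_sub_dir/$name $name; mv my_sub_dir/$name my_sub_dir/done.$stamp;"
done
# 3) Run everything in a single session
lftp -u "$USER,$PASS" -e "$cmds bye" ftp://$HOST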