TCL Blowfish behavior - zip

I made a small script that zips a file and then encrypts that file:
#-- cut from proc --
set outfile [open $out wb]
set ind [string last / $in]
set in [string range $in [expr {$ind + 1}] end]
zipfile::mkzip::mkzip $in.zip $in
set infile [open $in.zip rb]
if {[catch {blowfish::blowfish -mode $mode -key $key -iv $iv -out $outfile -in $infile} msg]} {
    tk_messageBox -message "Error message: $msg"
    continue
}
close $infile; close $outfile
#-- end of cut --
To decipher and unzip:
#-- cut from proc --
set outfile [open $out.zip wb]
set infile [open $in rb]
if {[catch {blowfish::blowfish -dir decrypt -mode $mode -key $key -iv $iv -out $outfile -in $infile} msg]} {
    tk_messageBox -message "Error message: $msg"
    close $outfile
    continue
}
close $infile; close $outfile
if {[zipfile::decode::iszip $out.zip] < 1} {
    tk_messageBox -message "bad zip file"
    file delete -force $out.zip
    return
}
zipfile::decode::unzipfile $out.zip $final
file delete -force $out.zip
#-- end of cut --
Now, all works fine except when the deciphered zip file is bad, meaning we've used a wrong password or mode. I would have thought that the catch around the blowfish call would get me an error, but apparently blowfish doesn't care and will just blow garbage into the output file with the .zip extension. In any case, when the zip file is bad, the script does not release it, and I get a permission error when trying to delete it. If the zip file is good, it happily unzips and gets deleted. I would presume that something has the file locked but won't raise an error or let it go. Any help nailing down what I'm doing wrong would be appreciated.
Update: I ran the same script on a Linux OS at home, and it works. The bad behavior was at work on Windows 10; I should have noted that initially.

I've found the problem. It's in zipfile::decode::LocateEnd (called by iszip and others), which doesn't close the open file handle to the zip file if it throws an error. I'm not quite sure what all the conditions are under which it throws an error, but one is definitely when the ZIP index can't be found. Which would be OK… except that the handle stays open, and on Windows that means the file can't be deleted (because having a file open locks its directory entry; Unixes don't typically work that way).
It's definitely a bug.
Fortunately it's using a normal Tcl channel, not some kind of complicated C thing, so we can work around it.
# Assume you have Tcl 8.5
proc safeIsZip {filename} {
    set channels [file channels]; # Hope this is short!
    catch {
        zipfile::decode::iszip $filename
    } result options
    foreach ch [file channels] {
        if {$ch ni $channels} {
            close $ch
        }
    }
    return -options $options $result
}
In Tcl 8.6, you can do it a bit nicer:
proc safeIsZip {filename} {
    set channels [file channels]; # Hope this is short!
    try {
        return [zipfile::decode::iszip $filename]
    } finally {
        foreach ch [file channels] {
            if {$ch ni $channels} {
                chan close $ch
            }
        }
    }
}
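With one of those wrappers defined, the decryption snippet above can call safeIsZip in place of the direct iszip call; a minimal sketch using the same variables as the original code:
if {[safeIsZip $out.zip] < 1} {
    tk_messageBox -message "bad zip file"
    file delete -force $out.zip;   # the leaked handle is now closed, so this works on Windows too
    return
}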


search multi line string from multiple files in a directory

The string to be searched for is:
the file_is being created_automaically {
period=20ns }
The Perl script I am using is the following (this script works fine for a single-line string but not for multi-line):
#!/usr/bin/perl
my $dir = "/home/vikas";
my @files = glob( $dir . '/*' );
#print "@files";
system ("rm -rf $dir/log.txt");
my $list;
foreach $list (@files){
    if( !open(LOGFILE, "$list")){
        open (File, ">>", "$dir/log.txt");
        select (File);
        print " $list \: unable to open file";
        close (File);
    else {
        while (<LOGFILE>){
            if($_ =~ /".*the.*automaically.*\{\n.*period\=20ns.*\}"/){
                open (File, ">>", "$dir/log.txt");
                select (File);
                print " $list \: File contain the required string\n";
                close (File);
                break;
            }
        }
        close (LOGFILE);
    }
}
This code does not compile; it contains errors that cause it to fail to execute. You should never post code that you have not first tried to run.
The root of your problem is that for a multiline match you cannot read the file in line-by-line mode; you have to slurp the whole file into a variable. However, your program contains many flaws. I will demonstrate. Here follow excerpts of your code (with the indentation fixed and missing curly braces added).
First off, always use:
use strict;
use warnings;
This will save you many headaches and long searches for hidden problems.
system ("rm -rf $dir/log.txt");
This is better done in Perl, where you can control for errors:
unlink "$dir/log.txt" or die "Cannot delete '$dir/log.txt': $!";
foreach my $list (@files) {
#       ^^
Declare the loop variable in the loop itself, not before it.
if( !open(LOGFILE, "$list")){
    open (File, ">>", "$dir/log.txt");
    select (File);
    print " $list \: unable to open file";
    close (File);
You never have to explicitly select a file handle before you print to it. You just print to the file handle: print File "....". What you are doing is just changing the default output file handle, which is not a good thing to do.
Also, this is error logging, which should go to STDERR instead. This can be done simply by opening STDERR to a file at the beginning of your program. Why do this? Because when you are not debugging the program at a terminal, for example via the web or some other process, STDERR does not show up on your screen. Otherwise it is just extra work while debugging.
open STDERR, ">", "$dir/log.txt" or die "Cannot open 'log.txt' for overwrite: $!";
This has the added benefit of you not having to delete the log first. And now you do this instead:
if (! open LOGFILE, $list ) {
    warn "Unable to open file '$list': $!";
} else ....
warn goes to STDERR, so it is basically the same as print STDERR.
Speaking of open, you should use the three-argument open with a lexical file handle. So it becomes:
if (! open my $fh, "<", $list )
} else {
while (<LOGFILE>) {
Since you are looking for a multiline match, you need to slurp the file(s) instead. This is done by setting the input record separator to undef. Typically like this:
my $file = do { local $/; <$fh> }; # $fh is our file handle, formerly LOGFILE
Next, how to apply the regex:
if($_ =~ /".*the.*automaically.*\{\n.*period\=20ns.*\}"/) {
$_ =~ is optional. A regex automatically matches against $_ if no other variable is used.
You should probably not use " in the regex, unless you have " in the target string. I don't know why you put it there; maybe you think strings need to be quoted inside a regex. If so, that is wrong. To match the string you have above, you do:
if( /the.*automaically.*{.*period=20ns.*}/s ) {
You don't have to escape the curly braces {} or the equals sign = with a backslash. You don't have to use quotes. The /s modifier makes . (the wildcard character) also match newline, so we can remove \n. We can remove .* from the start and end of the string, because that is implied; regex matches are always partial unless anchors are used.
break;
The break keyword is only used with the switch feature, which is experimental, and in any case you are not using it or have it enabled. So break here is just a bareword, which is wrong. If you want to exit a loop prematurely, you use last. Note that we don't have to use last here, because we slurp the file, so there is no loop.
Also, you generally should pick suitable variable names. If you have a list of files, the variable that contains the file name should not be called $list, I think. It is logical that it is called $file. And the input file handle should not be called LOGFILE, it should be called $input, or $infh (input file handle).
This is what I get if I apply the above to your program:
use strict;
use warnings;

my $dir = "/home/vikas";
my @files = glob( $dir . '/*' );
my $logfile = "$dir/log.txt";
open STDERR, ">", $logfile or die "Cannot open '$logfile' for overwrite: $!";

foreach my $file (@files) {
    if (! open my $input, "<", $file) {
        warn "Unable to open '$file': $!";
    } else {
        my $txt = do { local $/; <$input> };
        if ($txt =~ /the.*automaically.*{.*period=20ns.*}/s) {
            print " $file : File contains the required string\n";
        }
    }
}
Note that the print goes to STDOUT, not to the error log. It is not common practice to send STDOUT and STDERR to the same file. If you want that, you can simply redirect the output in the shell, like this:
$ perl foo.pl > output.txt
The following sample code demonstrates the usage of a regex for the multiline case, with a logger($fname,$msg) subroutine.
The snippet assumes that the input files are relatively small and can each be read into the variable $data (i.e., that the computer has enough memory to hold the whole file).
NOTE: the input data files should be distinguishable from the other files in the home directory $ENV{HOME}; in this code sample they are assumed to match the pattern test_*.dat. Perhaps you do not intend to scan absolutely all files in your home directory (there could be many thousands of files, while you are interested in only a few).
#!/usr/bin/env perl

use strict;
use warnings;
use feature 'say';

my($dir,$re,$logfile);

$dir = '/home/vikas/';
$re = qr/the file_is being created_automaically \{\s+period=20ns\s+\}/;
$logfile = $dir . 'logfile.txt';

unlink $logfile if -e $logfile;

for ( glob($dir . "test_*.dat") ) {
    if( open my $fh, '<', $_ ) {
        my $data = do { local $/; <$fh> };
        close $fh;
        logger($logfile, "INFO: $_ contains the required string")
            if $data =~ /$re/gsm;
    } else {
        logger($logfile, "WARN: unable to open $_");
    }
}

exit 0;

sub logger {
    my $fname = shift;
    my $text = shift;
    open my $fh, '>>', $fname
        or die "Couldn't open $fname: $!";
    say $fh $text;
    close $fh;
}
Reference: regex modifiers, unlink, perlvar

How to read line by line gz file TCL/LINUX

I made a script in Tcl which receives a huge input file, reads it line by line, and then modifies the data in some way.
The problem starts when I need to do the same with *.gz format files, which contain the data file.
The only thing I found by searching Google is how to do it using gzcat, and even that didn't work; it is also not good because (I think?) it reads the whole file, and I don't want to process the whole file at once.
In short: I need to read a gz file line by line. How do I do it?
An example of what I do with a normal file:
set fh [open <some path> r]
while {[gets $fh line] >= 0} {
    # do something with $line
}
What I tried, and couldn't understand/make work for me:
set pipeline [open "| zcat foo.gz"]
set data [read $pipeline]
close $pipeline
thanks!
If you have Tcl 8.6, just do:
set fh [open <SomePath.gz> r]
zlib push gunzip $fh
while {[gets $fh line] >= 0} {
    # do something with $line
}
close $fh
With 8.5 or before, going via an external gzcat process is the simplest way.
set ZCAT_PROGRAM gzcat; # Might be called something else on your system
set fh [open |[list $ZCAT_PROGRAM <SomePath.gz>] r]
while {[gets $fh line] >= 0} {
    # do something with $line
}
close $fh
You can also do it with gzip if you pass the right flags; this has the advantage that the program is pretty consistently called gzip when it is present at all:
set fh [open |[list gzip -d -c <SomePath.gz>] r]
while {[gets $fh line] >= 0} {
    # do something with $line
}
close $fh
(The -d option does decompression, the -c option sends it to stdout so we can read it from the pipeline.)
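If this pattern comes up in several places, it can be wrapped in a small helper; here is a minimal sketch assuming Tcl 8.6 (the proc name gzEachLine and the uplevel-based callback are my own choices, not part of any library):
# Run a script body once per line of a gzip-compressed file (Tcl 8.6+)
proc gzEachLine {path varName body} {
    set fh [open $path r]
    zlib push gunzip $fh
    try {
        while {[gets $fh line] >= 0} {
            uplevel 1 [list set $varName $line]
            uplevel 1 $body
        }
    } finally {
        close $fh
    }
}
# Usage: gzEachLine big.log.gz line { puts $line }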

How to replace a string of different length through file handling in tcl

I want to replace SVT-ATL with SVT in all the lines of a file, without disturbing the other text. I am using the code below:
set fileDest3 "$dirName/$filename"
set fpr [open $fileDest3 r+]
set line [gets $fpr]
regsub -all "SVT-ATL" $line "SVT" line
puts $fpr "$line"
Because you're changing the length of the lines, you must rewrite the whole file. (Well, you could theoretically leave alone the lines before the first change, but that's a whole bunch more work.) The simplest way is to read it all in, use string map to perform the change (in the simplest case; regsub if things are trickier) and then write it all back out (with a chan seek to the beginning first, of course). As you're shortening things, you'll need to finish with a chan truncate.
set fileDest3 "$dirName/$filename"
set fptr [open $fileDest3 r+]
set newContents [string map {"SVT-ATL" "SVT"} [read $fptr]]
chan seek $fptr 0
puts -nonewline $fptr $newContents
chan truncate $fptr
close $fptr
The puts has a -nonewline so you don't get an extra terminating newline; the one that was there originally will still be there (as we're reading the whole file in, not just line by line).
package require fileutil

proc cmd data {
    string map {SVT-ATL SVT} $data
}

if {[catch {fileutil::updateInPlace [file join $dir $filename] cmd}]} {
    error "failed to change file"
}
The Tcllib fileutil::updateInPlace command takes care of the low-level details of opening, reading, applying a given command to the content, truncating, writing, and closing files that you want updated. You simply provide a command like cmd here and enjoy the odds ever being in your favor.
Documentation: catch, error, if, package, proc, string
The fileutil package is documented here: fileutil
set timestamp [clock format [clock seconds] -format {%Y%m%d%H%M%S}]
set filename "yourfilenamehere.txt"
set temp $filename.tmp.$timestamp
set backup $filename.bak.$timestamp

set in [open $filename r]
set out [open $temp w]

# line-by-line, read the original file
while {[gets $in line] != -1} {
    # modify $line by replacing 'SVT-ATL' with 'SVT'
    regsub -all "SVT-ATL" $line "SVT" line
    # then write the modified line to the 'tmp' file
    puts $out $line
}

close $in
close $out

# rename the current file to the backup file
file rename -force $filename $backup
# rename the tmp file to the original file name
file rename -force $temp $filename
Reference : Glenn Jackman & Donal Fellows
Update: If you don't want to create a new file, then at least, as Jerry pointed out, we can read the whole file content at once, apply the string replacement, and then write it back to the file.
# Reading the file content
set fd [ open "yourfilename" r ]
set data [ read $fd ]
close $fd
# Replacing the string now...
regsub -all "SVT-ATL" $data "SVT" data
# Opening file with 'w' mode which will truncate the file
set fd [ open "yourfilename" w ]
puts $fd $data
close $fd
I would consider
exec sed -i {s/SVT-ATL/SVT/g} "$dirName/$filename"

How can I split a CA certificate bundle into separate files?

I'm working with OpenSSL and need a sane default list of CAs. I'm using Mozilla's list of trusted CAs, as bundled by cURL. However, I need to split this bundle of CA certs, because the OpenSSL documentation says:
If CApath is not NULL, it points to a directory containing CA certificates in PEM format. The files each contain one CA certificate. The files are looked up by the CA subject name hash value, which must hence be available.
For example, using the ca-bundle.crt file directly works fine:
openssl-1.0.1g> ./apps/openssl s_client -connect www.google.com:443 -CAfile /home/user/certs/ca-bundle.crt
...
Verify return code: 0 (ok)
---
DONE
But specifying the directory containing the ca-bundle.crt file does not work:
openssl-1.0.1g> ./apps/openssl s_client -connect www.google.com:443 -CApath /opt/aspera/certs
Verify return code: 20 (unable to get local issuer certificate)
---
DONE
I presume this is because my folder doesn't adhere to what the documentation asks for (namely, a directory containing CA certs in PEM format, with each file containing one cert, named by hash value). My directory just has the single bundle of certs.
How can I split my bundle of certs to adhere to OpenSSL's request that each cert be in an individual file? Bonus points if the hashing can be done too (though if needed I could write a script to do that myself if all the certs are in individual files).
You can split the bundle with awk, like this, in an appropriate directory:
awk 'BEGIN {c=0;} /BEGIN CERT/{c++} { print > "cert." c ".pem"}' < ca-bundle.pem
Then, create the links OpenSSL wants by running the c_rehash utility that comes with OpenSSL:
c_rehash .
Note: use 'gawk' on non-Linux platforms, as the above relies on a GNU-specific feature.
Just to give an alternative: facing the same issue, I ended up with csplit:
csplit -k -f bar foo.pem '/END CERTIFICATE/+1' {10}
If you want to get a single certificate out of a multi-certificate PEM, try:
$ awk '/subject.*CN=host.domain.com/,/END CERTIFICATE/' INPUT.PEM
source
The following Ruby-script will split the bundle (with one or more certificates in it) into files named after the hashes -- side-stepping the c_rehash step in most cases.
To use, cd into the right directory (such as /etc/ssl/certs/) and run the script with the path to your certificate bundle as the sole argument. For example: ruby /tmp/split-certificates.rb ca-root-nss.crt.
#!/usr/bin/env ruby
require 'openssl'

blob = IO.binread(ARGV[0]) # Read the entire file at once

DELIMITER = "\n-----END CERTIFICATE-----\n"
blobs = blob.split(DELIMITER)

blobs.each do |blob|
  blob.strip!
  blob += DELIMITER # Does not break DER

  begin
    cert = OpenSSL::X509::Certificate.new blob
  rescue
    puts "Skipping what seems like junk"
    next
  end

  begin
    # XXX Need to handle clashes, suffix other than 0
    filename = sprintf("%x.0", cert.subject.hash)
    File.open(filename, File::WRONLY|File::CREAT|File::EXCL) do |f|
      f.write(blob)
    end
  rescue Errno::EEXIST
    puts "#{filename} already exists, skipping"
  end
end
Here is mine in Perl (so much code, but I like gonzo programming):
#!/usr/bin/perl -w
# -------
# Split "certificate bundles" like those found in /etc/pki/tls/certs into
# individual files and append the X509 cleartext description to each file.
#
# The file to split is given on the command line or piped via STDIN.
#
# Files are simply created in the current directory!
#
# Created files are named "certificate.XX" or "trusted-certificate.XX",
# with XX an index value.
#
# If a file with the same name as the output file already exists, it is not
# overwritten. Instead a new name with a higher index is tried.
#
# This works for bundles of both trusted and non-trusted certificates.
#
# See http://tygerclan.net/?q=node/49 for another program of this kind,
# which sets the name of the split-off files in function of the subject
# -------
my @lines = <> or die "Could not slurp: $!";
my $state = "outside"; # reader state machine state
my $count = 0;         # index of the certificate file we create
my $fh;                # file handle of the certificate file we create
my $fn;                # file name of the certificate file we create
my $trusted;           # either undef or "TRUSTED", depending on type of certificate

for my $line (@lines) {
    chomp $line;
    if ($state eq "outside") {
        if ($line =~ /^(-----BEGIN (TRUSTED )?CERTIFICATE-----)\s*$/) {
            my $marker = $1;
            $trusted = $2;
            $state = "inside";
            my $created = 0;
            my $prefix = "";
            if ($trusted) {
                $prefix = "trusted-";
            }
            while (!$created) {
                $fn = "${prefix}certificate.$count";
                $count++;
                if (-f $fn) {
                    # print STDERR "File '$fn' exists; increasing version number to $count\n";
                }
                else {
                    print STDERR "Certificate data goes to file '$fn'\n";
                    open($fh, ">$fn") || die "Could not create file '$fn': $!\n";
                    $created = 1;
                    print $fh "$marker\n";
                }
            }
        }
        else {
            print STDERR "Skipping line '$line'\n";
        }
    }
    else {
        if ($line =~ /^(-----END (TRUSTED )?CERTIFICATE-----)\s*$/) {
            my $marker = $1;
            my $trustedCheck = $2;
            if (!(($trusted && $trustedCheck) || (!$trusted && !$trustedCheck))) {
                die "Trusted flag difference detected\n";
            }
            $state = "outside";
            print $fh "$marker\n";
            print STDERR "Closing file '$fn'\n";
            close $fh;
            # Append x509 cleartext output by calling the openssl tool
            `openssl x509 -noout -text -in '$fn' >> '$fn'`;
            if ($? != 0) {
                die "Could not run 'openssl x509' command: $!\n";
            }
        }
        else {
            print $fh "$line\n";
        }
    }
}

if ($state eq "inside") {
    die "Last certificate was not properly terminated\n";
}

TCL, expect: Multiple files w/ SCP

I was able to transfer files with scp and expect; now I am trying to upload several files at once:
#!/usr/bin/expect -f

# Escapes spaces in a text
proc esc text {
    return [regsub -all {\ } $text {\\&}]
}

# Uploads several files to a specified server
proc my_scp_multi {ACCOUNT SERVER PW files newfolder} {
    set timeout 30
    send_user -- "\n"
    spawn scp $files $ACCOUNT@$SERVER:[esc $newfolder]
    match_max 100000
    # Look for password prompt
    expect {
        -re ".*Connection closed.*" {
            sendError "\n\n\nUpload failed!\nPlease check the errors above and start over again.\nThis is most likely induced by too many wrong password-attempts and will last quite a time!"
        }
        -re ".*Permission denied.*" {
            sendError "\n\n\nUpload failed!\nPlease check the errors above and start over again.\nYou entered most likely a wrong password!"
        }
        -re ".*Are.*.*yes.*no.*" {
            send "yes\n"
            exp_continue
            # look for the password prompt
        }
        -re ".*sword.*" {
            # Send password aka $PW
            send -- "$PW\r"
            # send blank line (\r) to make sure we get back to gui
            send -- "\r\n"
            exp_continue
        }
    }
    send_user -- "Upload successful!\n"
    set timeout -1
}
When I want to upload several files, the sh command is:
scp $a $b $c user@server:$folder, so I called my_scp_multi "ACCOUNT" "SERVER" "PW" "~/testfileA ~/testfileB ~/testfileC" "~/test/". This produces the output:
spawn scp ~/testfileA ~/testfileB ~/testfileC user@server:~/test/
user@server's password:
~/testfileA ~/testfileB ~/testfileC: No such file or directory
It seems to see "~/testfileA ~/testfileB ~/testfileC" as one file. But when I copy-paste scp ~/testfileA ~/testfileB ~/testfileC user@server:~/test/ into the console, it works fine!
What am I doing wrong? I've tried "\"~/testfileA\" \"~/testfileB\" \"~/testfileC\"" and such things, but nothing worked at all.
Any ideas or suggestions?
EDITS
P.S.: I'm transferring rather small files. Building up a connection is the biggest part of the transfer. This is the reason I want it to be done in ONE scp.
P.P.S.:
I played around a little and came up with:
my_scp_multi3 "user" "server" "pw" "~/a\ b/testfileA, ~/a\\ b/testfileB, ~/a\\\ b/testfileC" "~/test"
with your first solution but {*}[split $files ","] and
my_scp_multi2 "user" "server" "pw" "~/a b/testfileA" "~/a\ b/testfileB" "~/a\\ b/testfileC" "~/test"
with your second solution. This prints:
~/a b/testfileA: No such file or directory
~/a\ b/testfileB: No such file or directory
~/a\ b/testfileC: No such file or directory
and
~/a b/testfileA: No such file or directory
~/a b/testfileB: No such file or directory
~/a\ b/testfileC: No such file or directory
(BTW: I of course moved the files :) )
Thanks to all the answers, here is my solution:
It uses \0 (the null byte) as the separator, because it is the only symbol except / and \ which may not be used in filenames.
#!/usr/bin/expect -f

# Escapes spaces in a text
proc esc text {
    return [regsub -all {\ } $text {\\&}]
}

# Returns the absolute file path
proc makeAbsolute {pathname} {
    file join [pwd] $pathname
}

# Appends a file to an upload-list, using \0 as the separator
proc addUploadFile {files f} {
    if {$files != ""} {
        set files "$files\0"
    }
    return "$files[makeAbsolute $f]"
}

# Counts all files from an upload-list
proc countUploadFiles {s} {
    if {$s eq ""} {
        return 0
    }
    return [llength [split $s "\0"]]
}

# Uploads several files from a list (created by addUploadFile) to a specified server
proc my_scp_multi {ACCOUNT SERVER PW files newfolder} {
    foreground blue
    set nFiles [countUploadFiles $files]
    set timeout [expr {$nFiles * 60}]
    send_user -- "\n"
    spawn scp -r {*}[split $files "\0"] $ACCOUNT@$SERVER:[esc $newfolder]
    match_max 100000
    # Look for password prompt
    expect {
        -re ".*Connection closed.*" {
            sendError "\n\n\nUpload failed!\nPlease check the errors above and start over again.\nThis is most likely induced by too many wrong password-attempts and will last quite a time!"
        }
        -re ".*Permission denied.*" {
            sendError "\n\n\nUpload failed!\nPlease check the errors above and start over again.\nYou entered most likely a wrong password!"
        }
        -re ".*Are.*.*yes.*no.*" {
            send "yes\n"
            exp_continue
            # look for the password prompt
        }
        -re ".*sword.*" {
            # Send password aka $PW
            send -- "$PW\r"
            # send blank line (\r) to make sure we get back to gui
            send -- "\r\n"
            exp_continue
        }
    }
    send_user -- "Upload successful!\n"
    set timeout -1
}
set fls [addUploadFile "" "a b/testfileA"]
set fls [addUploadFile $fls "a b/testfileB"]
set fls [addUploadFile $fls "a b/testfileC"]
my_scp_multi "user" "server" "pw" $fls "~/test"
You don't want to send the filenames as a single string. Either do this:
spawn scp {*}[split $files] $ACCOUNT@$SERVER:[esc $newfolder]
And continue to quote the filenames:
my_scp_multi "ACCOUNT" "SERVER" "PW" "~/testfileA ~/testfileB ~/testfileC" "~/test/"
or do this:
proc my_scp_multi {ACCOUNT SERVER PW args} {
    set timeout 30
    send_user -- "\n"
    set files [lrange $args 0 end-1]
    set newfolder [lindex $args end]
    spawn scp {*}$files $ACCOUNT@$SERVER:[esc $newfolder]
And then do not quote the filenames
my_scp_multi "ACCOUNT" "SERVER" "PW" ~/testfileA ~/testfileB ~/testfileC "~/test/"
The splat ({*}) splits the list up into its individual elements, so the spawn command sees several words, not a single word. See http://tcl.tk/man/tcl8.5/TclCmd/Tcl.htm
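A quick way to see the difference is to count the words the command would receive; a small tclsh sketch:
set files {a.txt b.txt c.txt}
llength [list scp $files dest]     ;# => 3: the filenames travel as ONE word
llength [list scp {*}$files dest]  ;# => 5: scp, three filenames, dest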
You could spawn a shell and then run the scp command instead:
spawn bash
send "scp $files $ACCOUNT#$SERVER:[esc $newfolder]\r"
This allows for glob expansion but adds extra housekeeping as you will need to trap when the scp process is completed, as you still have a shell running.
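One way to handle that housekeeping, sketched under the assumption that a sentinel string is acceptable: have the shell print a marker when scp finishes, and expect that marker (SCPDONE is an arbitrary name, nothing special):
spawn bash
send "scp $files $ACCOUNT@$SERVER:[esc $newfolder]; echo SCPDONE\r"
# ... handle the password prompt as in the expect block above ...
expect "SCPDONE"
send "exit\r"
expect eof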
You could add the following to your expect block:
-re "100%" {
    if {$index < $count} {
        incr index
        exp_continue
    }
}
where $index is the number of the file currently being transferred and $count is the total number of files.
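For this to work, both variables have to be initialized before entering the expect block; one way to do that, reusing countUploadFiles from the solution above:
set count [countUploadFiles $files] ;# total number of files queued
set index 1                         ;# file currently being transferred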
You should be using SSH public key authentication instead of typing in the password with expect. When it's set up properly, scp will work without any human input of passwords while keeping the system very secure. You will be free from all the troubles with expect.
How do I setup Public-Key Authentication?
http://www.ece.uci.edu/~chou/ssh-key.html
If there's some reason why you cannot use pubkey, you may find sftp useful, because it accepts a batch command file via -b batchfile; see man 1 sftp. (Though that is not a very good solution when expect can actually split the arguments.)
