sftp upload bash ignore files - linux

I am trying to achieve some semi-automatic SFTP upload/deployment. The key point is NOT to upload all files. I do not know which files to upload, but I know which files not to upload.
My bash script looks like:
#!/bin/bash
IP="123.123.123.123"
HOSTNAME="ftp.my-host.com"
REMOTE_PATH="subdirectory"   # note: do not call this variable PATH, or the shell's command lookup breaks
sftp username@$HOSTNAME:$REMOTE_PATH < "sftp-pattern"
In the sftp-pattern file I want to store my sftp commands, but I could not find any hints on how to ignore certain patterns, like *.sql.
Ideally I'd ignore everything that is gitignored.
I do NOT have an ssh connection.

Since you are using a shell script, you could use a loop. Something like this should work.
#!/bin/bash
IP="123.123.123.123"
HOST="ftp.my-host.com"
DIR="/tmp/"
for f in $(/bin/ls "$DIR")
do
    if echo "$f" | /usr/bin/grep '\.sql$' > /dev/null
    then
        echo "SKIPPING $f"
    else
        sftp username@$HOST:$DIR/$f
    fi
done

The answer should be git-ftp, as @Clijsters mentioned. It is more feature-rich and needs no fiddling around with loops and pipes.
dwright's solution works, though, if it is ONLY .sql files that you want to exclude.
Thanks!!
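A minimal git-ftp sketch, assuming the project is already a git repository, git-ftp is installed, and your curl/git-ftp build supports sftp:// URLs (the URL reuses the host and subdirectory from the question; check the git-ftp docs for the exact ignore-pattern syntax):
# store the connection settings once
git config git-ftp.url "sftp://ftp.my-host.com/subdirectory"
git config git-ftp.user "username"

# first deployment: upload all tracked files and record the deployed commit
git ftp init

# later deployments: upload only files changed since the last push
git ftp push
Because git-ftp only uploads tracked files, anything covered by .gitignore is never transferred; extra exclusions such as SQL dumps can go into a .git-ftp-ignore file.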

Related

CENTOS - Bash script to initiate a file transfer from a directory

I am trying to create a bash script to initiate a file transfer to another machine via a tftp app. Currently I would do this manually by running the command ./tftp "filename" tftp://ipaddress/filename.
What I would like to do is have a bash script that looks at a folder, e.g. (filetransfer), for any files and initiates that same command. Can someone please help? I am a noob at bash scripting.
So far I have tried the below.
When running this, it says that the filename is bad:
#!/bin/bash
for filename in ./*
do
./tftp "$filename" tftp://ipaddress/"$filename"
done
I also tried this.
When running this one below, it transfers everything in the directory below it:
#!/bin/bash
cd /path/to/the/directory/*
for i in *
do
./tftp "$i" tftp://ipaddress/"$i"
done
In the code you posted, filename (respectively i) can also take the name of a subdirectory, since you are looping over all entries in the directory. If you want to restrict the transfer to plain files, do a
[[ -f $filename ]] && ./tftp "$filename" tftp://ipaddress/"$filename"
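Put together, a sketch of the whole loop could look like this (ipaddress is still a placeholder, and the ${filename#./} expansion is only there to strip the leading ./ from the remote name):
#!/bin/bash
for filename in ./*
do
    if [[ -f $filename ]]; then
        ./tftp "$filename" tftp://ipaddress/"${filename#./}"
    fi
done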

run a script with $(cat filename.txt)

So I'm running a script called backup.sh. It creates a backup of a site. Now I have a file called sites.txt that has a list of sites that I need to back up. I don't want to run the script manually for every site that I need to back up. So what I'm trying to do is run it like this:
backup.sh $(cat sites.txt)
But it only backs up the first site that's on the list, then stops. Any suggestions how I could make it go through the whole list?
To iterate over the lines of a file, use a while loop with the read command.
while IFS= read -r file_name; do
    backup.sh "$file_name"
done < sites.txt
The proper fix is to refactor backup.sh so that it accepts a list of sites on its command line, as you expect. If you are not allowed to change it, you can write a small wrapper script.
#!/bin/sh
for site in "$@"; do
    backup.sh "$site"
done
Save this as maybe backup_sites, do a chmod +x, and run it with the list of sites. (I would perhaps recommend xargs -a sites.txt over $(cat sites.txt) but both should work if the contents are one token per line.)
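For example, assuming sites.txt holds one site per line, either invocation works (xargs -a is a GNU xargs option):
chmod +x backup_sites
./backup_sites $(cat sites.txt)
xargs -a sites.txt ./backup_sites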
I think this should do, provided that sites.txt has one site per line (not tested):
xargs -L 1 backup.sh < sites.txt
If you are permitted to modify backup.sh, I would enhance it so that it accepts a list of sites, not a single one. Of course, if sites.txt is very, very large, the xargs way would still be the better one (but then without the -L switch).
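A sketch of such an enhanced backup.sh, assuming the existing single-site logic can be wrapped in a function; backup_one_site is only a placeholder for whatever the current script does:
#!/bin/sh
backup_one_site() {
    site=$1
    echo "backing up $site"    # placeholder for the real backup steps
}

# accept any number of sites on the command line
for site in "$@"; do
    backup_one_site "$site"
done
It can then be called as backup.sh $(cat sites.txt) or, for very long lists, xargs backup.sh < sites.txt.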

How to download all files from a Linux server using SCP which contain a given String

I need to download all 492 files from a Linux directory which contain a given string within the file. I haven't quite managed to find a command that can accomplish this from my searching so far. Could anybody help me out?
Cheers.
Use grep -l to list the files that contain the given string and loop over them, passing each one to scp, like this:
for file in $(grep -l <some-pattern> <directory>/*); do scp "$file" <remote>; done
Just in case you also need the files in subdirectories of the directory, add the -R option to grep (and drop the /* so grep recurses into the directory itself).
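A more defensive sketch that also copes with filenames containing spaces, assuming GNU grep (the pattern, directory, and remote target are placeholders):
grep -rlZ 'some pattern' /path/to/directory | while IFS= read -r -d '' file; do
    scp "$file" user@remote-host:/destination/dir/
done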

How to SCP files which are being FTPed by another process & delete them on completion?

Files are being transferred to a directory on my machine via the FTP protocol. I need to SCP these files to another machine and delete them on completion.
How can I detect whether the file transfer by FTP is done and the file is safe to SCP?
There's no reliable way to detect completion of the transfer. Some clients send the ALLO command and pass the size of the file before actually uploading it, but this is not a definite rule, so you can't rely on it. All in all, it's possible that the client streams the data and there's no definite "end" of file on its side.
If the client is under your control, you can make it upload files with extension A and, after the upload, rename them to extension B. Then you transfer only files with extension B.
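A sketch of the receiving side of that convention, assuming the client uploads into /incoming and renames finished files to a .done extension (the extensions, paths, and remote target are placeholders):
#!/bin/bash
for f in /incoming/*.done; do
    [ -e "$f" ] || continue                       # no matches: the glob stayed literal
    scp "$f" user@othermachine:/destination/ && rm -- "$f"
done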
You can use a script like this:
#!/bin/bash
EXPECTED_ARGS=1
E_BADARGS=65
# Argument check
if [ $# -lt $EXPECTED_ARGS ]
then
    echo "Usage: `basename $0` <folder_update_1> <folder_update_2> <folder_update_3> ..."
    exit $E_BADARGS
fi
folders=( "$@" )
for folder in "${folders[@]}"
do
    # Send folder or file to the new machine
    time rsync --update -avrt -e ssh /local/path/of/$folder/ user@192.168.0.10:/remote/path/of/$folder/
    # Delete the local file or folder
    rm -r /local/path/of/$folder/
done
It is configured to send folders. If you want to send files instead, you need to make small changes to the script, like this:
time rsync --update -avrt -e ssh /local/path/of/$file user@192.168.0.10:/remote/path/of/$file
rm /local/path/of/$file
rsync is similar to scp. I prefer to use rsync, but you can change it.
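If you do swap in plain scp, the transfer line inside the loop might look like this (same placeholder paths and host as above; note that scp has no equivalent of --update, so it always re-copies everything):
time scp -r /local/path/of/$folder user@192.168.0.10:/remote/path/of/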

Shell Script - SFTP -> If copied, remove?

I am trying to copy text files with a shell script over SFTP.
I already wrote a script that does the job:
#!/bin/bash
HOST='Servername'
USER='Username'
sftp -b - ${USER}@${HOST} << EOFFTP
get /files/*.txt /tmp/ftpfiles/
rm /files/*.txt
quit
EOFFTP
Before I remove all the text files on the server, I want to make sure I copied all of them without errors. How can I do this? I use SSH keys for login.
The task is:
Copy all text files over and over, but make sure they are not the same ones... (that's why I use remove...)
Maybe I could move them on the server instead? Like copy and then move to /files/copied?
Actually, rsync is ideal for this:
rsync --remove-source-files "${USER}@${HOST}:/files/*.txt" /tmp/ftpfiles/
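If you would rather stay with sftp and the "move to /files/copied" idea from the question, a sketch could look like this (it assumes /files/copied already exists on the server and that /tmp/ftpfiles starts out empty; only standard sftp batch commands, get and rename, are used):
#!/bin/bash
HOST='Servername'
USER='Username'

# step 1: download everything; sftp -b aborts on the first error and returns
# a non-zero status, so we stop here if anything failed
sftp -b - ${USER}@${HOST} << EOFFTP || exit 1
get /files/*.txt /tmp/ftpfiles/
quit
EOFFTP

# step 2: move exactly the files we just downloaded into /files/copied on the
# server, by building a second batch of per-file rename commands
for f in /tmp/ftpfiles/*.txt; do
    [ -e "$f" ] || continue
    echo "rename /files/$(basename "$f") /files/copied/$(basename "$f")"
done | sftp -b - ${USER}@${HOST}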
