How can I cat a remote file to read the parameters in Bash? - linux

How can I cat a remote file? Currently, it works for local files only.
#!/bin/bash
regex='url=(.*)'
# for i in $(cat /var/tmp/localfileworks.txt);
for i in $(cat http://localhost/1/downloads.txt);
do
echo $i;
# if [[ $i =~ $regex ]]; then
#echo ${BASH_REMATCH[1]}
#fi
done
cat: http://localhost/1/downloads.txt: No such file or directory

You can use curl:
curl http://localhost/1/downloads.txt
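For example, plugging curl into the loop from the question (a sketch based on the asker's script; -s suppresses curl's progress output):
#!/bin/bash
regex='url=(.*)'
# iterate over the whitespace-separated words of the remote file
for i in $(curl -s http://localhost/1/downloads.txt); do
    echo "$i"
    if [[ $i =~ $regex ]]; then
        echo "${BASH_REMATCH[1]}"   # the part captured after "url="
    fi
done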

Instead of cat, which reads a file from the file-system, use wget -O- -q, which reads a document over HTTP and writes it to standard output:
for i in $(wget -O- -q http://localhost/1/downloads.txt)
(The -O option means "write the document to the specified file", where - stands for standard output; the -q option means "quiet" and disables the logging that would otherwise go to standard error.)
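If the entries in downloads.txt are one per line, reading the wget output line by line avoids word-splitting and globbing surprises (a sketch using the same URL):
wget -O- -q http://localhost/1/downloads.txt | while IFS= read -r line; do
    echo "$line"
done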

Why are you using a URL to copy from the local machine? Can't you just cat directly from the file?
If you are doing this from a remote machine and not localhost, then as far as I know you can't pass a URL to cat.
I would try something like this:
scp username@hostname:/filepath/downloads.txt /dev/stdout
As someone else mentioned you could also use wget instead of scp.
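For completeness, the scp one-liner above can feed the loop from the question directly (a sketch; username, hostname and path are placeholders, and -q hides scp's progress meter):
for i in $(scp -q username@hostname:/filepath/downloads.txt /dev/stdout); do
    echo "$i"
done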

Related

How to use if function in shell scripts?

I need to use an if condition to copy only the required files when using SFTP to copy files from a remote server to my server. Here is my attempt to get all the data inside /filesnew.
#!/bin/bash
files=`sshpass -p 'XXX' sftp -P 2222 User1@10.18.90.12<<EOF
cd /filesnew
ls
EOF`
files=`echo $files|sed "s/.*sftp> ls//"`
(
echo cd /filesnew
for file in $files; do
echo get $file /data/processedfiles/$file
done
) |sshpass -p 'XXX' sftp -P 2222 User1@10.18.90.12
I need to filter out the files which are starting with "USER".
ex:
If($files==*USER*) then
echo get $file /data/processedfiles/$file
Can someone show me how to do this?
Use spaces around operators. Those are all arguments for commands and spaces separate them.
"If" is spelled if (lowercase) in Bash.
Testing a condition is done with [...] in Bash, not with (...).
Filtering is not comparison. Those are completely different operations. Use grep:
... | grep -E -v '^USER'
See: man grep
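For example, applied to the script above, the filter can be inserted before the get commands are generated (a sketch; this drops names starting with USER, or remove -v to keep only those names):
(
echo cd /filesnew
for file in $(echo "$files" | tr ' ' '\n' | grep -E -v '^USER'); do
    echo get $file /data/processedfiles/$file
done
) | sshpass -p 'XXX' sftp -P 2222 User1@10.18.90.12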

How do I get the files from SFTP server and move them to another folder in bash script?

How do I get files one by one from an SFTP server and move them to another folder in an Ubuntu bash script?
#!bin/sh
FOLDER=/home/SFTP/Folder1/
sftp SFTP@ip_address
cd /home/FSTP/Folder1/
for file in "$FOLDER"*
<<EOF
cd /home/local/Folder1
get $file
EOF
mv $file /home/SFTP/Done
done
I know it's not right, but I've tried my best, and if anyone can help me I will appreciate it. Thanks in advance.
OpenSSH sftp is not a very powerful client for such tasks. You would have to run it twice: first to collect the list of files, then use that list to generate the commands and execute them in a second run.
Something like this:
# Collect list of files
files=`sftp -b - user@example.com <<EOF
cd /source/folder
ls
EOF`
files=`echo $files|sed "s/.*sftp> ls//"`
# Use the list to generate list of commands for the second run
(
echo cd /source/folder
for file in $files; do
echo get $file
echo rename $file /backup/folder/$file
done
) | sftp -b - user@example.com
Before you run the script on production files, I suggest you first output the generated command list to a file and check whether the results are as expected.
Just replace the last line with:
) > commands.txt
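Once commands.txt looks right, the same file can be fed back to sftp as a batch (a sketch):
sftp -b commands.txt user@example.com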
Alternatively, you can use sftp's internal get command:
get -r $remote_path $local_path
or, with the -f option to flush the files to disk after the transfer:
get -rf $remote_path $local_path
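Since get is an interactive sftp command, it has to be fed to an sftp session; for example (a sketch, where /local/folder is a hypothetical destination directory):
echo "get -r /source/folder /local/folder" | sftp user@example.com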

How to download files from my list with wget and ftp

I need to download only defined files with wget and ftp.
For example:
1.I retrieve all files using:
echo ls -R | ftp ftp://user:password@host > ./list.txt
2.Then I will parse the result and get a list with absolute paths for each file:
/path/to-the/file-1
/path/to-the/file-2
etc.
3.And now I need to download all files from the result list using wget and ftp.
And I don't want to create a separate FTP session for each file download process.
Please give your advice. Thank you.
Update:
For recursive download I'm using this: wget -r ftp://user:password@host:/ -nH -P /download/path. It works great, but I need to pass a file with a list of remote files to download over FTP, using a single FTP session.
Sorry, I missed the "single session" part when I commented. I think you need to have your script generate a second script that runs a single FTP session.
Your script will not do any FTP itself; it will just write another script that performs the transfers, along these lines:
ftp -n <SOMEADDRESS> <<EOS
quote USER <USERNAME>
quote PASS <PASSWORD>
bin
get file1 localname1
get file2 localname2
...
get fileN localnameN
quit
EOS
Then it will execute that script by doing:
bash < thatScript
So your script will look like this:
#!/bin/bash
ScriptName=funkyFTPer
cat - <<END > $ScriptName
ftp -n 192.168.0.1 <<EOS
quote USER freddy
quote PASS frog
END
# Your selection code goes here ***PHNQZ***
echo get file1 localname1 >> $ScriptName
echo get file2 localname2 >> $ScriptName
echo get fileN localnameN >> $ScriptName
echo quit >> $ScriptName
echo EOS >> $ScriptName
echo "Now run bash < $ScriptName"
Then delete the script as it contains your password. Or you can put the password in your .netrc file.
As regards creating directories locally, you can do that in the first script using mkdir -p. The -p has the advantage that it creates all directories in between in one go and doesn't get upset if they already exist.
So, just looking at the area of code where it says ***PHNQZ*** above, let's say your code decides you need file freddy/frog/c.txt, you could do:
remotename="freddy/frog/c.txt"
localdir=${remotename%/*} # Get just directory part using "bash Parameter Substitution"
mkdir -p "$localdir" # make directory and all parts in between

scp: how to copy a file from remote server with a filter

I am trying to use scp to copy large log files from a remote server. However, I want only the lines in the remote log files that contain the string "Fail".
This is how I am doing it currently:
scp user@ip:remote_folder/logfile* /localfolder
This copies all the files starting with logfile on the remote server to my local folder. The files are pretty large, and I need to copy only the lines in those log files that contain the string "Fail". Can anybody tell me how to do this? Can I use the cat or grep command?
Use grep on the remote machine and filter the output into file name and content:
#!/usr/bin/env bash
BASEDIR=~/temp/log
IFS=$'\n'
for match in `ssh user@ip grep -r Fail "remote_folder/logfile*"`
do
IFS=: read file line <<< $match
mkdir -p `dirname $BASEDIR/$file`
echo $line >> $BASEDIR/$file
done
You might want to look at an explanation of IFS in combination with read.
ssh user@ip grep Fail remote_folder/logfile*
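That one-liner runs grep on the remote machine and prints the matching lines locally; if a single combined file is enough, just redirect the output (a sketch; the local path is a placeholder):
ssh user@ip 'grep Fail remote_folder/logfile*' > /localfolder/failures.log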

bash - wget -N if else value check

I'm working on a bash script that pulls a file from an FTP site only if the timestamp on the remote is different from the local one. After it pulls the file, it copies it over to 3 other computers via samba (smbclient).
Everything works, but the copy happens even when wget -N ftp://insertsitehere.com reports that the remote file was not newer. What would be the best way to check the output of the script so that the copy only happens if a new version was pulled from FTP?
Ideally, I'd like the copy to the other computers to preserve the timestamp, just like the wget -N command does.
Here is an example of what I have:
#!/bin/bash
OUTDIR=/cats/dogs
cd $OUTDIR
wget -N ftp://user:password@sitegoeshere.com/filename
if [ $? -eq 0 ]; then
HOSTS="server1 server2 server3"
for i in $HOSTS; do
echo "Uploading to $i..."
smbclient -A /root/.smbclient.authfile //$i/path -c "lcd /cats/dogs; put filename.txt"
if [ $? -eq 0 ]; then
echo "Upload to $i successful..."
else
echo "There was an issue uploading to host $i..."
fi
done
else
echo "There was an issue with the FTP Download...."
exit 1
fi
The return value of wget is different from 0 only if there is an error. If -N is in use and the remote file is older than the local file, wget will still return 0, so you cannot use the return value to check whether the file has been modified.
You could check the mtime of the file to see if it changed, or the content. For example, you could use something like:
md5_old=$( md5sum filename.txt 2>/dev/null )
wget -N ftp://user:password@sitegoeshere.com/filename.txt
md5_new=$( md5sum filename.txt )
if [ "$md5_old" != "$md5_new" ]; then
: # Copy filename.txt to SMB servers here
fi
Regarding smbclient, unfortunately there is no way to preserve timestamps in either get or put commands. If you need it, you must use some different tool (scp -p, rsync -t...)
touch -r foo.txt foo.old
wget -N example.com/foo.txt
if [ foo.txt -nt foo.old ]
then
echo 'Uploading to server1...'
fi
"Save" the current timestamp into a new empty file
Use wget --timestamping to only download the file if it is newer
If file is newer than the "save" file, do stuff
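Put together with the original script, the whole check might look like this (a sketch reusing the asker's placeholder paths, credentials, and hostnames):
#!/bin/bash
cd /cats/dogs || exit 1
# remember the timestamp of the current local copy, if there is one
[ -f filename.txt ] && touch -r filename.txt filename.old
wget -N ftp://user:password@sitegoeshere.com/filename.txt
# -nt is true when filename.txt exists and is newer than filename.old (or filename.old is missing)
if [ filename.txt -nt filename.old ]; then
    for i in server1 server2 server3; do
        echo "Uploading to $i..."
        smbclient -A /root/.smbclient.authfile //$i/path -c "lcd /cats/dogs; put filename.txt"
    done
fi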
