How to expand an scp command in a variable? - linux

I have a text file, scp_command.txt, created on the server, which I need to read at run time because its content changes dynamically.
So I retrieve the content this way:
copy_command="$(cat $CI_PROJECT_DIR/scp_command.txt)"
And the content looks like this:
echo $copy_command
scp -q -i ssh_key.pem %s ec2-user@ec2-35-86-96-6.us-west-2.compute.amazonaws.com:~
To use this command, I tried this:
$(echo ${copy_command/"%s"/"~/.project-n/1.txt"})
But I am getting the error below:
~/.project-n/1.txt: No such file or directory
new_copy_command=${copy_command/"%s"/"~/.project-n/1.txt"}
echo $new_copy_command
scp -q -i ssh_key.pem ~/.project-n/1.txt ec2-user@ec2-35-86-96-6.us-west-2.compute.amazonaws.com:~
$new_copy_command
~/.project-n/1.txt: No such file or directory
$($new_copy_command)
~/.project-n/1.txt: No such file or directory
But if I run the content directly, it works:
echo $new_copy_command
scp -q -i ssh_key.pem ~/.project-n/1.txt ec2-user@ec2-35-86-96-6.us-west-2.compute.amazonaws.com:~
scp -q -i ssh_key.pem ~/.project-n/1.txt ec2-user@ec2-35-86-96-6.us-west-2.compute.amazonaws.com:~

Do not quote the filename: remove the quotes around ~/.project-n/1.txt so that the ~ is tilde-expanded to your home directory when the substitution happens.
Ex:
$(echo ${copy_command/"%s"/~/.project-n/1.txt})
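Another way to sidestep the tilde problem entirely is to substitute $HOME, which expands immediately to an absolute path, so nothing in the stored command needs a later expansion pass. A minimal local sketch, using a hypothetical short host name in place of the real one:

```shell
# Hypothetical template mirroring the contents of scp_command.txt
copy_command='scp -q -i ssh_key.pem %s ec2-user@host:~'

# $HOME expands right away, so the stored command carries an absolute
# path instead of a literal ~ that nothing will expand later
new_copy_command=${copy_command/'%s'/$HOME/.project-n/1.txt}
echo "$new_copy_command"
```

The result can then be executed with plain word splitting ($new_copy_command), because the local path no longer contains a character that the shell would need to expand again.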

Related

How to use if function in shell scripts?

I need an if condition to copy only the required files when using SFTP to fetch files from a remote server to my server. Here is my attempt, which currently gets all the files inside /filesnew:
#!/bin/bash
files=`sshpass -p 'XXX' sftp -P 2222 User1@10.18.90.12 <<EOF
cd /filesnew
ls
EOF`
files=`echo $files | sed "s/.*sftp> ls//"`
(
  echo cd /filesnew
  for file in $files; do
    echo get $file /data/processedfiles/$file
  done
) | sshpass -p 'XXX' sftp -P 2222 User1@10.18.90.12
I need to filter out the files which are starting with "USER".
For example:
If($files==*USER*) then
echo get $file /data/processedfiles/$file
Can someone show me how to do this?
Use spaces around operators. Those are all arguments for commands and spaces separate them.
"If" is spelled if (lowercase) in Bash.
Testing a condition is done with [...] in Bash, not with (...).
Filtering is not comparison. Those are completely different operations. Use grep:
... | grep -E -v '^USER'
See: man grep
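In the generator loop above, the filter can sit between the parsed name list and the emitted get lines; grep does the job, as noted. A small sketch with a hypothetical stand-in for the parsed ls output:

```shell
# Stand-in for the $files value parsed from the sftp session
files='USER_a.txt data1.csv USER_b.txt report.log'

# Drop every name that starts with USER, keep the rest
wanted=$(printf '%s\n' $files | grep -v '^USER')
echo "$wanted"
```

Inside the loop, the equivalent per-file test would be [[ $file == USER* ]] && continue.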

How to download files from my list with wget and ftp

I need to download only defined files with wget and ftp.
For example:
1. I retrieve all files using:
echo ls -R | ftp ftp://user:password@host > ./list.txt
2. Then I parse the result into a list of absolute paths, one per file:
/path/to-the/file-1
/path/to-the/file-2
etc.
3. And now I need to download all the files from the resulting list using wget and ftp.
And I don't want to create a separate FTP session for each file download process.
Please give your advice. Thank you.
Update:
For recursive download I'm using wget -r ftp://user:password@host:/ -nH -P /download/path. It works great, but I need to pass it a file listing the remote files to download over FTP within one FTP session.
Sorry, I missed the "single session" part when I commented. I think you need to have your script generate a second script to run a single FTP session.
So your script will not do any FTP itself; it will just write another script that does the transfers. That is, it will write a script that does this:
ftp -n <SOMEADDRESS> <<EOS
quote USER <USERNAME>
quote PASS <PASSWORD>
bin
get file1 localname1
get file2 localname2
...
get fileN localnameN
quit
EOS
Then it will execute that script by doing:
bash < thatScript
So your script will look like this:
#!/bin/bash
ScriptName=funkyFTPer
cat - <<END > $ScriptName
ftp -n 192.168.0.1 <<EOS
quote USER freddy
quote PASS frog
END
# Your selection code goes here ***PHNQZ***
echo get file1 localname1 >> $ScriptName
echo get file2 localname2 >> $ScriptName
echo get fileN localnameN >> $ScriptName
echo quit >> $ScriptName
echo EOS >> $ScriptName
echo "Now run bash < $ScriptName"
Then delete the script as it contains your password. Or you can put the password in your .netrc file.
As regards creating directories locally, you can do that in the first script using mkdir -p. The -p has the advantage that it creates all directories in between in one go and doesn't get upset if they already exist.
So, looking at the area of code marked ***PHNQZ*** above: say your code decides it needs the file freddy/frog/c.txt; you could do:
remotename="freddy/frog/c.txt"
localdir=${remotename%/*} # Get just directory part using "bash Parameter Substitution"
mkdir -p "$localdir" # make directory and all parts in between
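If a single wget invocation is acceptable instead of a generated ftp script, wget's -i (--input-file) option reads URLs, one per line, from a file, so the absolute paths from step 2 only need an ftp:// prefix. A sketch with placeholder credentials and host; the wget call itself is left commented so the sketch runs without a server:

```shell
# Hypothetical path list, as produced in step 2
printf '%s\n' /path/to-the/file-1 /path/to-the/file-2 > list.txt

# Prefix every path with the FTP URL so wget -i can consume the file
sed 's|^|ftp://user:password@host|' list.txt > urls.txt

# One wget run then walks the whole list:
# wget -nH -P /download/path -i urls.txt
```

Whether the FTP control connection is reused across files depends on the server, but everything happens inside one wget run rather than one process per file.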

scp: how to copy a file from remote server with a filter

I am trying to use scp to copy large log files from a remote server. However, I want only the lines in the remote log files that contain the string "Fail".
This is how I am doing it currently:
scp user@ip:remote_folder/logfile* /localfolder
This copies all the files starting with logfile on the remote server to my local folder. The files are pretty large, and I need to copy only the lines containing the string "Fail" from those remote log files. Can anybody tell me how to do this? Can I use the cat or grep command?
Use grep on the remote machine and filter the output into file name and content:
#!/usr/bin/env bash
BASEDIR=~/temp/log
IFS=$'\n'
for match in `ssh user@ip grep -r Fail "remote_folder/logfile*"`
do
  IFS=: read -r file line <<< "$match"
  mkdir -p `dirname "$BASEDIR/$file"`
  echo "$line" >> "$BASEDIR/$file"
done
You might want to look at an explanation of IFS in combination with read.
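The IFS=: read splits each grep match at the first colon only: the first field goes to file, and everything remaining, embedded colons included, goes to line. A local illustration with a made-up match line:

```shell
# One line of `grep -r Fail` output has the form  path:matched-text
match='remote_folder/logfile1:2024-01-01 Fail: disk error'

# The last variable named in read receives the whole remainder
IFS=: read -r file line <<< "$match"
echo "$file"
echo "$line"
```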
Or simply:
ssh user@ip 'grep Fail remote_folder/logfile*'

How to move files on another server from a list file using while read line?

The goal is to monitor one directory on a different server; for example, the remote server is user@host.
I have list.txt, which contains the list of files to be moved; list.txt is located on the remote server.
Currently I have this code.
ssh user@host cat /full-path/list.txt |
{
while read line;
do mv user@host:/full-path/$line user@host:/full-path/done/;
done;
}
When I run the code above, I get an error: there's no such file or directory.
But when I log in to user@host and cat a file picked at random from list.txt, the file exists.
The while loop runs on the local server. You need to put the script in quotes so it's an argument to the ssh command.
... Or a here document, like this:
ssh user@host <<':'
while read line; do
mv /full-path/"$line" /full-path/done/
done </full-path/list.txt
:
... or more succinctly
ssh user@host 'cd /full-path && xargs -a list.txt mv -t done'
Notice also the absence of a useless cat and the local file name resolution (mv would have no idea about the SSH remote path syntax you were trying to use).
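The first suggestion, passing the whole loop as a single quoted argument to ssh, can be dry-run locally. The loop body below is exactly what would go inside the quotes; here it is pointed at a hypothetical local directory standing in for /full-path:

```shell
# Local stand-in for /full-path on the remote server
mkdir -p work/done
touch work/a.txt work/b.txt
printf '%s\n' a.txt b.txt > work/list.txt

# Remotely: ssh user@host 'while read -r line; do ...; done < /full-path/list.txt'
while read -r line; do
  mv "work/$line" work/done/
done < work/list.txt
```

Because the whole loop is one quoted argument, the list is read and the mv commands run on the remote side, with plain local paths.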

How can I cat a remote file to read the parameters in Bash?

How can I cat a remote file? Currently, it works for local files only.
#!/bin/bash
regex='url=(.*)'
# for i in $(cat /var/tmp/localfileworks.txt);
for i in $(cat http://localhost/1/downloads.txt);
do
  echo $i;
  # if [[ $i =~ $regex ]]; then
  #   echo ${BASH_REMATCH[1]}
  # fi
done
cat: http://localhost/1/downloads.txt: No such file or directory
You can use curl:
curl http://localhost/1/downloads.txt
Instead of cat, which reads a file from the file-system, use wget -O- -q, which reads a document over HTTP and writes it to standard output:
for i in $(wget -O- -q http://localhost/1/downloads.txt)
(The -O... option means "write to the specified file", where - is standard output; the -q option means "quiet", and disables lots of logging that would otherwise go to standard error.)
Why are you using a URL to copy from the local machine? Can't you just cat directly from the file?
If you are doing this from a remote machine and not localhost, then as far as I know you can't pass a URL to cat.
I would try something like this:
scp username@hostname:/filepath/downloads.txt /dev/stdout
As someone else mentioned you could also use wget instead of scp.
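Combining the curl suggestion with the loop and regex from the question: read the fetched document line by line instead of word-splitting it. A file:// URL stands in for the web server here so the sketch is self-contained; in practice it would be the http://localhost/1/downloads.txt URL.

```shell
# Stand-in for the remote downloads.txt
printf 'url=http://example.com/a\nurl=http://example.com/b\n' > downloads.txt

regex='url=(.*)'
# Feed the fetched document into the loop via process substitution
urls=$(while read -r i; do
  [[ $i =~ $regex ]] && echo "${BASH_REMATCH[1]}"
done < <(curl -s "file://$PWD/downloads.txt"))
echo "$urls"
```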
