scp: how to copy a file from remote server with a filter - linux

I am trying to use scp to copy large log files from a remote server. However, I want only the lines in the remote log files that contain the string "Fail".
This is how I am doing it currently:
scp user@ip:remote_folder/logfile* /localfolder
This copies all the files starting with logfile on the remote server to my local folder. The files are pretty large, and I only need the lines containing the string "Fail" from those log files. Can anybody tell me how to do this? Can I use the cat or grep command?

Use grep on the remote machine and filter the output into file name and content:
#!/usr/bin/env bash
BASEDIR=~/temp/log
IFS=$'\n'
# grep runs on the remote host; each match comes back as "filename:matching line"
for match in $(ssh user@ip grep -r Fail "remote_folder/logfile*"); do
    IFS=: read -r file line <<< "$match"
    mkdir -p "$(dirname "$BASEDIR/$file")"
    echo "$line" >> "$BASEDIR/$file"
done
You might want to look at an explanation of IFS in combination with read.
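A quick illustration of how IFS=: read splits those matches (the sample string here is made up):
# Illustrative only: split "filename:matched line" on the first colon
match='logfile1:2021-03-04 job 17 Fail: timeout'
IFS=: read -r file line <<< "$match"
echo "$file"   # -> logfile1
echo "$line"   # -> 2021-03-04 job 17 Fail: timeout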

ssh user@ip grep Fail "remote_folder/logfile*"
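If a single combined file of matching lines is enough, you can simply redirect that command's output locally (a minimal sketch; host and paths are placeholders):
# Run grep on the remote host and collect all matching lines into one local file
ssh user@ip "grep Fail remote_folder/logfile*" > /localfolder/failed_lines.log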

Related

How to expand scp command in a variable?

I have a text file, scp_command.txt, created on the server, which I need to use because its content changes dynamically.
So I retrieve the content this way:
copy_command="$(cat $CI_PROJECT_DIR/scp_command.txt)"
And the content looks like this:
echo $copy_command
scp -q -i ssh_key.pem %s ec2-user@ec2-35-86-96-6.us-west-2.compute.amazonaws.com:~
To use this command, I tried this:
$(echo ${copy_command/"%s"/"~/.project-n/1.txt"})
But I am getting an error as below:
~/.project-n/1.txt: No such file or directory
new_copy_command=${copy_command/"%s"/"~/.project-n/1.txt"}
echo $new_copy_command
scp -q -i ssh_key.pem ~/.project-n/1.txt ec2-user@ec2-35-86-96-6.us-west-2.compute.amazonaws.com:~
$new_copy_command
~/.project-n/1.txt: No such file or directory
$($new_copy_command)
~/.project-n/1.txt: No such file or directory
But if I run the content directly, it works:
echo $new_copy_command
scp -q -i ssh_key.pem ~/.project-n/1.txt ec2-user@ec2-35-86-96-6.us-west-2.compute.amazonaws.com:~
scp -q -i ssh_key.pem ~/.project-n/1.txt ec2-user@ec2-35-86-96-6.us-west-2.compute.amazonaws.com:~
Do not quote the filename: remove the quotes around your file name ~/.project-n/1.txt. With the quotes, the ~ is never expanded to your home directory, so scp looks for a file literally named ~/.project-n/1.txt.
Ex:
$(echo ${copy_command/"%s"/~/.project-n/1.txt})
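Alternatively, a minimal sketch that sidesteps tilde expansion entirely by substituting an already-expanded path and running the assembled command with eval (variable and file names follow the question):
# Substitute an absolute path instead of a literal ~, then execute the result
new_copy_command=${copy_command/"%s"/"$HOME/.project-n/1.txt"}
eval "$new_copy_command"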

How to use if function in shell scripts?

I need to use an if statement to filter for only the required files when using SFTP to copy files from a remote server to my server. Here is my attempt to get all the data inside /filesnew.
#!/bin/bash
files=`sshpass -p 'XXX' sftp -P 2222 User1@10.18.90.12<<EOF
cd /filesnew
ls
EOF`
files=`echo $files|sed "s/.*sftp> ls//"`
(
echo cd /filesnew
for file in $files; do
echo get $file /data/processedfiles/$file
done
) |sshpass -p 'XXX' sftp -P 2222 User1@10.18.90.12
I need to filter out the files whose names start with "USER".
ex:
If($files==*USER*) then
echo get $file /data/processedfiles/$file
Can someone show me how to do this?
Use spaces around operators. Those are all arguments for commands and spaces separate them.
"If" is spelled if (lowercase) in Bash.
Testing a condition is done with [...] in Bash, not with (...).
Filtering is not comparison. Those are completely different operations. Use grep:
... | grep -E -v '^USER'
See: man grep
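For completeness, a sketch of what the test could look like inside the question's loop, using bash's [[ ... ]] pattern matching instead of grep (flip == to != if the USER files should be excluded rather than kept):
for file in $files; do
    if [[ $file == USER* ]]; then
        echo get "$file" /data/processedfiles/"$file"
    fi
done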

How do I get the files from SFTP server and move them to another folder in bash script?

How do I get files one by one from an SFTP server and move them to another folder in an Ubuntu bash script?
#!bin/sh
FOLDER=/home/SFTP/Folder1/
sftp SFTP@ip_address
cd /home/FSTP/Folder1/
for file in "$FOLDER"*
<<EOF
cd /home/local/Folder1
get $file
EOF
mv $file /home/SFTP/Done
done
I know it's not right, but I've tried my best, and if anyone can help me I will appreciate it. Thanks in advance.
OpenSSH sftp is not a very powerful client for such tasks. You would have to run it twice: first to collect the list of files, then use the list to generate a list of commands and execute those in a second run.
Something like this:
# Collect list of files
files=`sftp -b - user@example.com <<EOF
cd /source/folder
ls
EOF`
files=`echo $files|sed "s/.*sftp> ls//"`
# Use the list to generate list of commands for the second run
(
echo cd /source/folder
for file in $files; do
echo get $file
echo rename $file /backup/folder/$file
done
) | sftp -b - user@example.com
Before you run the script on production files, I suggest you first output the generated command list to a file to check whether the results are as expected.
Just replace the last line with:
) > commands.txt
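Once commands.txt looks right, you can feed it back to sftp as a batch file (reusing the connection details from above):
# Review commands.txt, then execute it non-interactively
sftp -b commands.txt user@example.com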
Maybe use sftp's internal get command (at the sftp> prompt):
get -r $remote_path $local_path
or, with the -f option to flush files to disk after transfer:
get -rf $remote_path $local_path
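A non-interactive sketch of that suggestion, feeding the internal command to sftp as a batch on stdin (paths and host are placeholders):
# Recursively download a remote folder via sftp's internal get command
sftp -b - user@example.com <<'EOF'
get -r /source/folder /local/folder
EOF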

How to move files on another server from a list file using while read line?

The goal is that I want to monitor one directory on a different server. For example, the remote server is user@host.
I have list.txt, which contains the list of files that will be moved, and list.txt is located on the remote server.
Currently I have this code:
ssh user@host cat /full-path/list.txt |
{
while read line;
do mv user@host:/full-path/$line user@host:/full-path/done/;
done;
}
When I run the code above, I get an error: there's no such file or directory.
But when I log in to user@host and cat a random file from list.txt, the file exists.
The while loop runs on the local server. You need to put the script in quotes so it's an argument to the ssh command.
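For example, a sketch of that quoted form (everything inside the single quotes runs on the remote host):
ssh user@host 'while read -r line; do
    mv "/full-path/$line" /full-path/done/
done < /full-path/list.txt'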
... Or a here document, like this:
ssh user@host <<':'
while read line; do
mv /full-path/"$line" /full-path/done/
done </full-path/list.txt
:
... or more succinctly
ssh user@host 'cd /full-path && xargs -a list.txt mv -t done'
Notice also the absence of a useless cat, and that file names are resolved on the remote side (mv would have no idea about the SSH remote-path syntax you were trying to use).

SSH find and replace string in a list of filenames

We have a repository of about 3000 MP3 files. Many of these files have our old domain name within their name.
For example: somesong_oldDomainName.mp3
I need to SSH into the site, find all instances of those files and rename them with the new domain name.
Example: somesong_NEWDomainName.mp3
I know the basic SSH commands but not something advanced like this.
Pretty sure it'll be a combination of multiple commands.
Assuming you get an interactive shell when you ssh into your linux server, this might be a possible way:
ssh user@machine-name-or-ip
then you will get some sort of terminal like
user@machine-name:~$
where you enter the commands to execute on that remote machine.
As mentioned in the comments, the answer here might just fit very well:
Bash: Rename small part of multiple files in middle of name
user@machine-name:~$ for i in *.mp3; do mv "$i" "$(echo "$i" | sed 's/_oldDomainName/_NEWDomainName/g')"; done
This assumes your current directory is the one with all the MP3 files in it.
If you don't want to operate on your files interactively, e.g. because they change very often and you want a script to perform this action, SSH can also execute a command and/or shell script remotely.
To pass the command directly with the SSH call:
SSH error when executing a remote command: "stdin: is not a tty"
To pipe a local shell script into the SSH connection: How to use SSH to run a shell script on a remote machine?
Run a remote shell script via SSH: how to run a script file remotely using ssh
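For example, a sketch of the non-interactive variant, using bash parameter substitution in place of the sed call (host and directory are placeholders, and the remote login shell is assumed to be bash):
ssh user@machine-name 'cd /path/to/mp3s && for i in *.mp3; do
    mv "$i" "${i/_oldDomainName/_NEWDomainName}"
done'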
Edit:
Assuming you are connected via SSH to your remote machine and have reasonably similar versions of bash and sed, it should work like this:
$ ls
bar_chosefil.mp3 boo_chosefil.mp3 foo_chosefil.mp3
$ for i in *.mp3; do mv $i $(echo $i | sed 's/chosefil/tamasha/g'); done
$ ls
bar_tamasha.mp3 boo_tamasha.mp3 foo_tamasha.mp3
Versions involved:
bash: 4.2.25
sed: 4.2.1
mv: 8.13
Edit 2:
Updated the command to work with blanks in filenames
$ ls
asd chosefil.mp3 bar_chosefil.mp3 boo_chosefil.mp3 foo_chosefil.mp3
$ for i in *.mp3; do mv "$i" "$(echo "$i" | sed 's/chosefil/tamasha/g')"; done
$ ls
asd tamasha.mp3 bar_tamasha.mp3 boo_tamasha.mp3 foo_tamasha.mp3
