rsync files from different path - linux

I need to copy files from a remote server (a different path) to a local path.
I get the file list in this working way:
ssh user@remote " ls -R /path/ \
" | grep "o1_" | awk -F '_' '{if ($4 > 55146) print $0}' > file_list.txt
or
ssh user@remote " find /path/ " \
| grep "o1_" | awk -F '_' '{if ($8 > 55152) print $0}' > files_full_path.txt
example files_full_path.txt:
/path/path1/file1
/path/path1/file2
/path/path2/file3
/path/path2/file4
I've tried with full and non-full paths without success; examples below:
rsync -aver --include-from=files_full_path.txt user@remote:/path/ /destination_path
rsync -ave --include-from=files_full_path.txt --include /path/ --exclude='/*/' /path/
Can you help me?
Thanks

Perhaps this will help you:
$ ssh user@remote touch 1 2
$ mkdir 12
$ echo -e "1\n2" | rsync --include-from=- user@remote:\* 12/
$ ls 12 | cat
1
2
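Note that --include rules only take effect against later --exclude rules: without a final --exclude='*', rsync transfers everything whether or not it is in the include list. A sketch for pulling only the listed files out of a tree, assuming the list entries are rewritten relative to /path/ (e.g. path1/file1 rather than /path/path1/file1):
rsync -av --include='*/' --include-from=file_list.txt --exclude='*' user@remote:/path/ /destination_path/
The --include='*/' rule lets rsync descend into subdirectories to reach the listed files; add -m (--prune-empty-dirs) to skip directories that end up empty.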

I have found a solution:
cat to_apply.txt | xargs -i scp user@host:{} destination_path/
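For reference, rsync can read such a list directly with --files-from, which copies everything over a single connection instead of spawning one scp per file. A sketch, assuming files_full_path.txt holds absolute remote paths as above:
rsync -av --files-from=files_full_path.txt user@remote:/ /destination_path/
Each listed path is taken relative to the given source root (here /), and --files-from implies --relative, so the parent directories are recreated under the destination.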

Related

echo text to multiple files in bash script

I am working on a bash script that uses pssh to run external commands, then joins the output of the commands with the IP of each server. pssh has an option -o that writes a file for each server into a specified directory, but if the commands do not run, you just have an empty file. What I am having issues with is updating these empty files with something like "Server Unreachable" so that I know there was a connection issue reaching the server, and so it does not cause problems with the rest of the script.
Here is what I have so far:
#!/bin/bash
file="/home/user/tools/test-host"
now=$(date +"%F")
folder="./cnxhwinfo-$now/"
empty="$(find ./cnxhwinfo-$now/ -maxdepth 1 -type f -name '*' -size 0 -printf '%f%2d')"
command="echo \$(uptime | awk -F'( |,|:)+' '{d=h=m=0; if (\$7==\"min\") m=\$6; else {if (\$7~/^day/) {d=\$6;h=\$8;m=\$9} else {h=\$6;m=\$7}}} {print d+0,\"days\",h+0,\"hours\",m+0,\"minutes\"}'), \$(hostname | awk '{print \$1}'), \$(sudo awk -F '=' 'FNR == 2 {print \$2}' /etc/connex-release/version.txt), \$(lscpu | awk -F: 'BEGIN{ORS=\", \";} NR==4 || NR==6 || NR==15 {print \$2}' | sed 's/ *//g') \$(free -k | awk '/Mem:/{print \$2}'), \$(df -Ph | awk '/var_lib/||/root/ {print \$2,\",\"\$5,\",\"}')"
pssh -h $file -l user -t 10 -i -o /home/user/tools/cnxhwinfo-$now -x -tt $command
echo "Server Unreachable" | tee "./cnxhwinfo-$now/$empty"
ls ./cnxhwinfo-$now >> ./cnx-data-$now
cat ./cnxhwinfo-$now/* >> ./cnx-list-$now
paste -d, ./cnx-data-$now ./cnx-list-$now >>./cnx-data-"$(date +"%F").csv"
I was trying to use find to locate the empty files and write "Server Unreachable" to them using tee with this:
echo "Server Unreachable" | tee "./cnxhwinfo-$now/$empty"
If the folder specified doesn't already exist, I get this error:
tee: ./cnxhwinfo-2019-09-03/: Is a directory
And if it does exist (i.e., I run the script again), it instead creates a file named after the IP addresses returned by the find command, like this:
192.168.1.2 192.168.1.3 192.168.1.4 1
I've also tried:
echo "Server Unreachable" | tee <(./cnxhwinfo-$now/$empty)
The find command outputs the IP addresses on a single line with a space between each one, so I thought that would be fine for tee to use, but I feel like I am either running into syntax issues or going about this the wrong way. I have another version of this same script that uses regular ssh and works great, just much slower than using pssh.
empty should be an array, assuming none of the file names contain any whitespace:
readarray -t empty < <(find ...)
echo "Server unreachable" | (cd ./cnxhwinfo-$now/; tee "${empty[#]}" > /dev/null)
Otherwise, you are building a single file name by concatenating the empty file names.
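A fuller sketch of that idea, assuming bash 4+ for readarray and the same output directory the script already uses (-printf '%f\n' emits one bare file name per line):
readarray -t empty < <(find "./cnxhwinfo-$now/" -maxdepth 1 -type f -size 0 -printf '%f\n')
if (( ${#empty[@]} )); then
    echo "Server Unreachable" | (cd "./cnxhwinfo-$now/" && tee "${empty[@]}" > /dev/null)
fi
The guard skips the tee entirely when no empty files were found.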

Commands work on terminal but not in shell script

The following commands work in my terminal but not in my shell script. I later found out that my terminal was /bin/tcsh. Can somebody tell me what changes I need to make for /bin/sh? Here are the commands I need to change:
cp source_dir/*/dir1/*.xml destination_dir/
Error in sh: cp: cannot stat `source_dir/*/dir1/*.xml': No such file or directory
sed -i "s+${initial_name}+${final_name}+" $file_name
This one does not complain but does not work either.
I am adding an example for testing. The code is meant to rename the xml files and also change the matching name inside each xml file. For example:
The file name crr.ya.na.aa.xml should be changed to aa.xml
The same name inside crr.ya.na.aa.xml should also be changed from crr.ya.na.aa to aa
Here is the code:
#!/bin/sh
# Create dir structure for testing
rm -rf audience
mkdir audience
mkdir audience/dir1 audience/dir2 audience/dir3
mkdir audience/dir1/ipxact audience/dir2/ipxact audience/dir3/ipxact
touch audience/dir1/ipxact/crr.ya.na.aa.xml
echo "<spirit:name>crr.ya.na.aa</spirit:name>" > audience/dir1/ipxact/crr.ya.na.aa.xml
touch audience/dir2/ipxact/crr.ya.na.bb.xml
echo "<spirit:name>crr.ya.na.bb</spirit:name>" > audience/dir2/ipxact/crr.ya.na.bb.xml
touch audience/dir3/ipxact/crr.ya.na.cc.xml
echo "<spirit:name>crr.ya.na.cc</spirit:name>" > audience/dir3/ipxact/crr.ya.na.cc.xml
# Create a dir for ipxact_drop files if it does not exist
mkdir -p ipxact_drop
rm -rf ipxact_drop/*
cp audience/*/ipxact/*.xml ipxact_drop/
ls ipxact_drop/ > ipxact_drop_files.log
cat ipxact_drop_files.log | \
awk '{ split($0,a,"."); print a[length(a)-1] "." a[length(a)] }' ipxact_drop_files.log > file_names.log
cat ipxact_drop_files.log | \
awk '{ split($0,a,"."); print "mv ipxact_drop/" $0 " ipxact_drop/" a[length(a)-1] "." a[length(a)] }' ipxact_drop_files.log > command.log
chmod +x command.log
./command.log
while read line
do
echo ipxact_drop/$line
initial_name=`grep -m 1 crr ipxact_drop/$line | sed -e 's/<spirit:name>//' | sed -e 's/<\/spirit:name>//' `
final_name="${line%.*}"
echo $initial_name
echo $final_name
sed -i "s+${initial_name}+${final_name}+" ipxact_drop/$line
done < file_names.log
echo " ***** SCRIPT RUN FINISHED *****"
Only the sed command at the end is not working
I was reading some other posts and understood that xml files can have problems with scripts. Here is what has worked for me up to now:
To remove the cp error: replace #!/bin/sh -f with #!/bin/sh. The -f flag enables noglob, which disables wildcard expansion, so source_dir/*/dir1/*.xml was being passed to cp literally.
To remove the sed error for the test input: replace sed -i ... with sed -i.back .... GNU sed accepts -i with no argument, but BSD sed requires a backup suffix, so -i.back works with both.
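A quick way to see the noglob effect of -f (an illustration, independent of the directories above):
sh -c 'echo audience/*/ipxact/*.xml'      # the glob expands if the files exist
sh -f -c 'echo audience/*/ipxact/*.xml'   # noglob: prints the pattern literally, which is what cp saw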

unable to run script containing * using ssh

I am trying to run a script which uses a wildcard to search for files, but it is failing with this error:
bash: *: syntax error: operand expected (error token is "*")
The script runs fine on the machine itself, but it fails when used within an ssh command. Here is the command:
ssh -o StrictHostKeyChecking=no user@local-dev-server 'for i in *.version; do j=$(echo $i | cut -f 1 -d '.'); mv $i $((j+1)).version; done;'
Can someone give me a hint how I can fix this?
The problem is that, if there are no .version files in the current directory, the code is trying to add 1 to *.version and that is an arithmetic error.
In a directory with no files, observe:
$ ls
$ for i in *.version; do j=$(echo $i | cut -f 1 -d '.'); mv $i $((j+1)).version; done
bash: *: syntax error: operand expected (error token is "*")
If there was a number.version file, then the code would run:
$ touch 1.version
$ ls
1.version
$ for i in *.version; do j=$(echo $i | cut -f 1 -d '.'); mv $i $((j+1)).version; done
$ ls
2.version
Also, the cut pipeline is unnecessary. The code can be simplified to:
for i in *.version; do mv "$i" "$((${i%.version}+1)).version"; done
Further, to avoid the missing file error, use nullglob:
shopt -s nullglob; for i in *.version; do mv "$i" "$((${i%.version}+1)).version"; done
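Putting that together with the original invocation (a sketch, assuming the remote login shell is bash so shopt is available):
ssh -o StrictHostKeyChecking=no user@local-dev-server 'shopt -s nullglob; for i in *.version; do mv "$i" "$((${i%.version}+1)).version"; done'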
Try wrapping the option in quotes:
ssh -o "StrictHostKeyChecking=no" user#local-dev-server 'for i in *.version; do j=$(echo $i | cut -f 1 -d '.'); mv $i $((j+1)).version; done;'
Try this
ssh -o StrictHostKeyChecking=no user@local-dev-server 'for i in *.version; do j=$(echo $i | cut -f 1 -d .); mv $i $((j+1)).version; done;'

Perl Script to Grep Directory For String and Print

I would like to create a perl or bash script that will read keyboard input and assign it to a variable, perform a fixed-string grep recursively within the current directory full of Snort logs, then automatically run tcpdump on the matched files, grep its output, and print the specified lines to the terminal. Does anyone have a good idea of how this should work?
Here is an example of the methodology I want from the script:
step 1: Read keyboard input and assign it to a variable named string.
step 2 command: grep -Fr "$string"
step 2 output: snort.log.1470609906 matches
step 3 command: tcpdump -r snort.log.1470609906 | grep -F "$string" -C10
step 3 output:
Snort log
Here's some bash code that does that:
s="google.com"
grep -Frl "$s" | \
while IFS= read -r x; do
tcpdump -r "$x" | grep -F "$s" -C10
done
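To cover step 1, the hardcoded string could be replaced with a prompt (a small sketch; read -rp is a bash-ism):
read -rp "Search string: " s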
I don't know about perl, but you can do it easily enough just in shell:
str="google.com"
find . -type f -name 'snort.log.*' -exec grep -FlZ "$str" {} + |
xargs -0 -I {} sh -c 'tcpdump -r "$1" | grep -F "$2" -C10' sh {} "$str"
(Passing the file name and search string as arguments keeps the sh -c string intact when either contains quotes or other special characters.)

echo $variable in cron not working

I'm having trouble printing the result of the following when run by cron. I have a script named /usr/local/bin/test:
#!/bin/sh
PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
ARAW=`date +%y%m%d`
NAME=`hostname`
TODAY=`date '+%D %r'`
cd /directory/bar/foo/
VARR=$(ls -lrt /directory/bar/foo/ | tail -1 | awk {'print $8'} | ls -lrt `xargs` | grep something)
echo "Resolve2 Backup" > /home/user/result.txt
echo " " >> /home/user/result.txt
echo "$VARR" >> /home/user/result.txt
mail -s "Result $TODAY" email@email.com < /home/user/result.txt
I configured it in /etc/cron.d/test to run every day at 1 AM:
00 1 * * * root /usr/local/bin/test
When I'm running it manually on the command line
# /usr/local/bin/test
I'm getting the complete value. But when I let cron do the work, it never displays the part from echo "$VARR" >> /home/user/result.txt.
Any ideas?
VARR=$(ls -lrt /directory/bar/foo/ | tail -1 | awk {'print $8'} | ls -lrt `xargs` | grep something)
ls -ltr /path/to/dir will not include the directory in the filename part of the output. Then, you call ls again with this output, and this will look in your current directory, not in /path/to/dir.
In cron, your current directory is likely to be /, and in your manual testing, I bet your current directory is /path/to/dir
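The field number in awk '{print $8}' is also fragile: ls -l date formatting varies with the locale and environment, and cron provides a minimal one, so $8 may point at a different column there than in an interactive shell. When file names contain no spaces, the last field is less brittle (a sketch reusing the question's pipeline):
ls -lrt /directory/bar/foo/ | tail -1 | awk '{print $NF}'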
Here's another approach to finding the newest file in a directory that emits the full path name:
stat -c '%Y %n' /path/to/dir/* | sort -nr | head -1 | cut -d" " -f 2-
Requires GNU stat, check your man page for the correct invocation for your system.
I think your VARR invocation can be:
latest_dir=$(stat -c '%Y %n' /path/to/dir/* | sort -nr | head -1 | cut -d" " -f 2-)
interesting_files=$(ls -ltr "$latest_dir"/*something*)
Then, no need for a temp file:
{
    echo "Resolve2 Backup"
    echo
    echo "$interesting_files"
} | mail -s "Result $TODAY" email@email.com
Thanks for all your tips and responses. I solved my problem. The problem was the output of $8 and $9 in cron. I don't know what special field was being read while it was run from cron. I'm just a newbie at scripting, so sorry for my bad script =)
