Variable still doesn't work on remote server - linux

I have this part of a script which I cannot get to work. I've been searching everywhere, but I must be missing something here.
export RULE=`cat iptables_sorted.txt`
ssh hostname << EOF
for line in $RULE; do
echo \$line >> test.txt (I have tried with and without slashes, etc...)
done
exit
EOF
After running this part of the script, I get
stdin: is not a tty
-bash: line 2: syntax error near unexpected token `103.28.148.0/24'
-bash: line 2: `103.28.148.0/24'
...which is totally weird, because iptables_sorted.txt is just full of IP ranges (when I run it locally, it works).

Newlines in $RULE cause the problem. Replace them with spaces:
RULE=$(< iptables_sorted.txt)
RULE=${RULE//$'\n'/ }
ssh hostname << EOF
for line in $RULE ; do
echo \$line >> test.txt
done
EOF
Note that this wouldn't work with lines containing whitespace.
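An equivalent way to flatten the newlines, assuming no line contains embedded whitespace, is tr:
RULE=$(tr '\n' ' ' < iptables_sorted.txt)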

Don't use for to iterate over a file; use while. This also demonstrates piping the output of the loop, not just every individual echo, to the remote host. cat is used to read the incoming data and redirect it to the final output file.
while IFS= read -r line; do
    echo "$line"
done < iptables_sorted.txt | ssh hostname 'cat > test.txt'
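If nothing per-line actually needs to happen, the loop is just a copy, and a plain redirect should do the same job:
ssh hostname 'cat > test.txt' < iptables_sorted.txt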

Related

error reading input file: Key has expired

I am currently making a bash script. The purpose of this script is not important. However, I have a piece of code that is generating an error. The error is as follows:
./script.bs: line 175: read: read error: 0: Key has expired
./script.bs: error reading input file: Key has expired
I have the code below for lines 175-189.
This specific piece of code does the following:
- Reads a txt file that has a list of targeted files.
- For each targeted file, each line is read. If that line is already contained in $NumbersFile, it will do nothing; if that line is NOT contained in $NumbersFile, it will add that line to $NumbersFile.
This general piece of code works, and it added 65810 lines of content to $NumbersFile. However, it then hit the error I stated above.
I'd like to add that the while loop on line 175 (where the error is happening) is supposed to read about 70'000 lines from the given file.
How do I fix this error so that my script may finish running without a key expired error?
NumbersFile="numbers.txt";
while read line; do
while read gramline; do
has="0";
if grep -Fq -- "$gramline" "$NumbersFile"; then
has="1";
fi
if [ "$has" -eq "0" ]; then
echo "$gramline" >> $NumbersFile;
fi
done < "$line";
done < "targetsfile.txt";
If my comment is accurate, perhaps this might be faster:
{ cat targetsfile.txt; xargs cat < targetsfile.txt; } | sort -u > numbers.txt
Or as clarified:
xargs cat < targetsfile.txt | sort -u > numbers.txt
Notes:
the braces are simply to group the cat and xargs commands so that the combined output can be piped into sort. Documented in the manual at 3.2.4.3 Grouping Commands
The first cat outputs the contents of the "targetsfile.txt" file
the xargs cat < targetsfile.txt construct will execute the cat command for every file listed in the targets file. It's a very concise and efficient way to execute
while IFS= read -r line; do cat "$line"; done < targetsfile.txt
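If the lines already collected in numbers.txt need to be preserved too, one possible sketch (numbers.new is just a hypothetical temporary name) is to feed the existing file into the same sort:
{ cat numbers.txt; xargs cat < targetsfile.txt; } | sort -u > numbers.new && mv numbers.new numbers.txt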

Receiving "curl: (3) Illegal characters found in URL" in a Linux Script

I'm currently working on a script that reads in several hundred IP addresses listed line-by-line in a file. The script is supposed to take the IP addresses and then output each IP address with its longitude and latitude to a new file. However, whenever I try to run the script I receive multiple "curl: (3) Illegal characters found in URL" errors. I've been troubleshooting it for a couple of days, and so far I've come up with nothing. Can anyone point me in the right direction for figuring out the problem?
Thanks in advance for any help.
This is the script I'm using.
#!/bin/bash
cat ipCheck.txt | while read line
do
curl "https://api.ipstack.com/"$line"access_key=9c04ea7631a32590cac23eb27ec6c104&foraat=1&fields=ip,latitude,longitude"
done >> locations.txt
I'm currently using a test text file with 10 IP addresses. It is as follows
101.249.211.209
102.165.32.39
102.165.35.37
102.165.49.193
103.27.125.18
103.3.61.193
103.78.132.4
104.143.83.13
104.143.83.8
104.149.216.71
Nothing jumps out as being wrong. Have you tried commenting out your curl line and running the loop with something like:
cat ipCheck.txt | while read line
do
#curl "https://api.ipstack.com/"$line"access_key=9c04ea7631a32590cac23eb27ec6c104&foraat=1&fields=ip,latitude,longitude"
echo "\"$line\""
done >> locations.txt
This should help to locate an extra space or something strange with the formatting.
If it isn't the formatting in your file, you could try:
cat ipCheck.txt | while read line
do
curl "https://api.ipstack.com/${line}access_key=9c04ea7631a32590cac23eb27ec6c104&foraat=1&fields=ip,latitude,longitude"
done >> locations.txt
This should give the same result, but you could have something strange going on with your environment.
According to a previous comment, a solution is to remove any stray characters, such as the carriage return \r, and to check the value using printf:
while IFS= read -r line; do
# fix `\r` at end
line=$(echo ${line} | tr -d '\r')
printf 'Downloading %q\n' "$line"
./download.sh $FOLDER $line
done < "$DS"

Read line output in a shell script

I want to run a program (when executed it produces log data) from a shell script and write the output into a text file. I failed to do so :/
$prog is the executed program -> socat /dev/ttyUSB0,b9600 STDOUT
$log/$FILE is just the path to a .txt file
I had a Perl script to do this:
use Time::Piece;  # needed for localtime->strftime
open (S, $prog) || die "Cannot open $prog ($!)\n";
open (R, ">>", "$log") || die "Cannot open logfile $log!\n";
while (<S>) {
    my $date = localtime->strftime('%d.%m.%Y;%H:%M:%S;');
    print "$date$_";
}
I tried to do this in a shell script like this
#!/bin/sh
FILE=/var/log/mylogfile.log
SOCAT=/usr/bin/socat
DEV=/dev/ttyUSB0
BAUD=,b9600
PROG=$SOCAT $DEV$BAUD STDOUT
exec 3<&0
exec 0<$PROG
while read -r line
do
DATE=`date +%d.%m.%Y;%H:%M:%S;`
echo $DATE$line >> $FILE
done
exec 0<&3
Doesn't work at all...
How do I read the output of that prog and pipe it into my text file using a shell script? What did I do wrong (if I didn't do everything wrong)?
Final code:
#!/bin/sh
FILE=/var/log/mylogfile.log
SOCAT=/usr/bin/socat
DEV=/dev/ttyUSB0
BAUD=,b9600
CMD="$SOCAT $DEV$BAUD STDOUT"
$CMD |
while read -r line
do
echo "$(date +'%d.%m.%Y;%H:%M:%S;')$line" >> $FILE
done
To read from a process, use process substitution
exec 0< <( $PROG )
/bin/sh doesn't support it, so use /bin/bash instead.
To assign several words to a variable, quote or backslash whitespace:
PROG="$SOCAT $DEV$BAUD STDOUT"
Semicolon is special in shell, quote it or backslash it:
DATE=$(date '+%d.%m.%Y;%H:%M:%S;')
Moreover, no execs are needed:
while ...
...
done < <( $PROG )
You might even add > $FILE after done instead of adding each line separately to the file.
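Putting those suggestions together with the paths from the question, a sketch of the whole script might look like this:
#!/bin/bash
FILE=/var/log/mylogfile.log
SOCAT=/usr/bin/socat
DEV=/dev/ttyUSB0
BAUD=,b9600
PROG="$SOCAT $DEV$BAUD STDOUT"
while read -r line
do
    echo "$(date '+%d.%m.%Y;%H:%M:%S;')$line"
done < <( $PROG ) > "$FILE"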
Original answer
You haven't shown the error messages — which would have been helpful.
Your problem, though, is probably this line:
DATE=`date +%d.%m.%Y;%H:%M:%S;`
where the semicolons mark the end of a command, and there likely isn't a command %H that does anything useful, etc.
You need quotes around the format argument to date, and I'd use single quotes for this job:
DATE=$(date +'%d.%m.%Y;%H:%M:%S;')
or even replace the two lines in the body of the loop with:
echo "$(date +'%d.%m.%Y;%H:%M:%S;')$line" >> $FILE
The double quotes prevent a variety of problems.
That assumes you fix a bunch of other problems, such as the setting of the variables FILE and prog. Also, I'd probably use:
exec > $FILE
to initially zap the output file and then all subsequent standard output would go to that file, so the echo line becomes:
echo "$(date +'%d.%m.%Y;%H:%M:%S;')$line"
Amended answer
The question was originally missing lots of key information. It eventually got updated to include the complete code.
The problem I identified originally remains an issue, but you weren't running into it because the input redirection was not working. If you want the input to come from a process, use a pipe, or possibly process substitution. However, note that you have #!/bin/sh as your shebang line, and /bin/sh won't recognize process substitution; either change the shebang or use the pipe notation. Note that process substitution has advantages if the loop is setting variables that need to be accessed after the loop is complete.
$SOCAT $DEV$BAUD STDOUT |
while read -r line
do
…
done
or
while read -r line
do
…
done < <($SOCAT $DEV$BAUD STDOUT)
Note that your code contains the line:
PROG=$SOCAT $DEV$BAUD STDOUT
This runs the command identified by $DEV$BAUD with the argument STDOUT and the environment variable PROG set to the value of $SOCAT. That is not what you wanted.
You could use an array:
PROG=($SOCAT $DEV$BAUD STDOUT)
and then run:
"${PROG[#]}"
either in the pipe line:
"${PROG[#]}" |
while read -r line
do
…
done
or with process substitution:
while read -r line
do
…
done < <("${PROG[#]}")
Note that unless there is code after the final exec 0<&3, there was no particular virtue in the redirections involving file descriptor 3. You should also close 3 when you're done with it:
exec 0<&3 3>&-
The 'final' code includes the lines:
CMD="$SOCAT $DEV$BAUD STDOUT"
$CMD |
while read -r line
This works OK because there are no spaces in the arguments to the command. That's a common case, but beware of spaces in arguments and file paths.
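For such cases the array form shown above carries over directly to the final code; a sketch (bash, not plain sh):
CMD=("$SOCAT" "$DEV$BAUD" STDOUT)
"${CMD[@]}" |
while read -r line
do
    echo "$(date '+%d.%m.%Y;%H:%M:%S;')$line"
done >> "$FILE"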

Linux command and eof in one line

I want to ask if it is possible to combine a Linux command and << EOF on one line.
sendmail -S "lalalal" -f "dailaakak" -au "kakakak" <<EOF
>lalal:lalal
>opp:ttt
>ggg:zzz
EOF
I want to have something like this: sendmail -S "lalalal" -f "dailaakak" -au "kakakak" <<EOF; lalal:lalal; opp:ttt; ggg:zzz; EOF
I need to use it outside of a bash script.
If it has to be on one line without newlines, use this:
echo -e "lalal:lalal\nopp:ttt\nggg:zzz" | sendmail -S "lalalal" -f "dailaakak" -au "kakakak"
echo -e interprets escape sequences such as \n as newlines.
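printf gives the same result without relying on echo's -e option (whose behaviour varies between shells):
printf 'lalal:lalal\nopp:ttt\nggg:zzz\n' | sendmail -S "lalalal" -f "dailaakak" -au "kakakak"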
If you are asking whether you can use the << EOF in an interactive shell then the answer is yes, you can.
Note this functionality is called here document and that there can be any word instead of EOF. For example:
$ cat - << someword
> Here you
> can
> write text with as many
> newlines as you want.
> someword
Here you
can
write text with as many
newlines as you want.
(cat - prints whatever it receives on stdin)
For more information on here documents you can read for example this: http://tldp.org/LDP/abs/html/here-docs.html
I have tried this and succeeded, but it's messy. EOF simply does not like to accept substituted newlines for some reason, so it needs to be put in another format. I'm sure this could be achieved with an expect script on one line, but the below is what I have made and it works.
echo "ssh localhost `printf "<< EOF\necho "Working!" >> /tmp/myfile \nEOF\n"`" > file.sh; chmod770 file.sh; ./file.sh
printf "<< EOF\necho Test! >> /tmp/myfile \nEOF\n" | xargs ssh localhost
Please ensure the chmod file permissions are suitable for your own use case! Putting it into an environment variable instead of a file is also likely to work.

Read filenames from a text file and then make those files?

My code is given below. echo works fine, but the moment I redirect the output of echo to touch, I get the error "no such file or directory". Why? How do I fix it?
If I copy and paste the output of echo, the file is created, but not with touch.
while read line
do
#touch < echo -e "$correctFilePathAndName"
echo -e "$correctFilePathAndName"
done < $file.txt
If you have one file name per line in your input file file.txt, then you don't need a loop at all. You can just do:
touch $(<file.txt)
to create all the files in one single touch command.
You need to provide the file name as an argument, not via standard input. You can use command substitution via $(…) or `…`:
while read line
do
touch "$(echo -e "$correctFilePathAndName")"
done < $file.txt
Ehm, lose the echo part... and use the correct variable name.
while read line; do
touch "$line"
done < $file.txt
Try:
echo -e "$correctFilePathAndName" | touch
EDIT: Sorry, the correct piping is:
echo -e "$correctFilePathAndName" | xargs touch
The '<' redirects via stdin, whereas touch needs the filename as an argument. xargs transforms stdin into arguments for touch.
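With GNU xargs you can also go from the file straight to touch, one name per line, so file names containing spaces survive (the -d option is GNU-specific):
xargs -d '\n' touch < file.txt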
