validating multiple expressions within an until loop in a bash script - linux

I am getting an error when trying to evaluate multiple expressions within an until loop.
I have tried multiple combinations, as shown in the code section below, but none of them work.
Attempt 1:
until [ [ $http_response_code=$(curl --write-out "%{http_code}" --silent --output /dev/null "$http_url") ] = $http_success_code ]
do
<something>
done
Attempt 2:
until [ $http_response_code=$(curl --write-out "%{http_code}" --silent --output /dev/null "$http_url") ] = [ $http_success_code ]
do
<something>
done
Attempt 3:
until [ ($http_response_code=$(curl --write-out "%{http_code}" --silent --output /dev/null "$http_url")) = $http_success_code ]
do
<something>
done
Expected - No Syntax Error
Actual - Syntax Error

You can't do variable assignment in a test expression. You need to have the assignment separate from the test. Something like this could work:
until {
http_response_code=$(curl --write-out "%{http_code}" --silent --output /dev/null "$http_url")
[[ "$http_response_code" = "$http_success_code" ]]
}; do
<something>
done
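A fuller sketch of this pattern, wrapped in a function with a retry limit and delay (the function name, retry count, and delay are my own choices, not from the question):

```shell
# Hypothetical helper: poll a URL until it returns the wanted status code,
# giving up after max_tries attempts. All names here are illustrative.
wait_for_url() {
  url=$1
  want=$2
  max_tries=${3:-30}

  tries=0
  until {
    code=$(curl --write-out "%{http_code}" --silent --output /dev/null "$url")
    [ "$code" = "$want" ]
  }; do
    tries=$((tries + 1))
    if [ "$tries" -ge "$max_tries" ]; then
      echo "gave up after $max_tries attempts (last status: $code)" >&2
      return 1
    fi
    sleep 2
  done
}
```

Usage: `wait_for_url "$http_url" "$http_success_code" 10` returns 0 once the service answers with the expected code, 1 if it never does.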

Related

GNU Parallel echo job number

So I have this code: parallel --jobs 5 curl --write-out '\n %{http_code}\n' --silent --head --output /dev/null https://example.com/id={} ::: {427240690783..427240690793}
Which returns 10 lots of 404's.
If the HTTP Response code was 200, I want to echo out the current job iteration ID, displayed within the {}. How would I go about echoing these?
Apologies, I'm brand new to GNU Parallel.
When a command gets this complex I would normally make a function:
doit() {
curl --write-out '\n %{http_code}\n' --silent --head --output /dev/null https://example.com/id="$1"
}
export -f doit
parallel doit ::: {427240690783..427240690793}
This is because it is easy to test whether the function does the right thing for one input. Once it does, you let GNU Parallel call the function.
Here you probably want something like:
grep200() {
status=$(curl --write-out '%{http_code}' --silent --head --output /dev/null "$1");
if [ "$status" = 200 ] ; then
echo "$1";
fi;
}
export -f grep200
parallel grep200 https://example.com/id={} ::: {427240690783..427240690793}

Bash script code runs but does not return data

I've created a program to scan for directories and files on websites, but it does not print any info to the screen when I run it. I'm using the following code:
#!/bin/bash
for palavra in $(cat lista.txt)
do
resposta=$(curl -s -o /dev/null -w "%{http_code}" "$1/$palavra/")
resposta2=$(curl -s -o /dev/null -w "%{http_code}" "$1/$palavra")
if [ "$resposta" = "200" ]
then
echo "Diretorio encontrado: $palavra"
fi
if [ "$resposta2" = "200" ]
then
echo "Arquivo encontrado: $palavra"
fi
done
It runs, but simply returns nothing. The file "lista.txt" is in the same folder as the program and it has content; I double-checked it.
Can someone please help me find what is wrong? Thank you!
After the calculation of resposta and resposta2, add the following lines of code:
echo "palavra is : [$palavra]"
echo "resposta is : [$resposta]"
echo "resposta2 is : [$resposta2]"
You'll see from there what to do.
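One common culprit that this kind of debugging tends to reveal is Windows (CRLF) line endings in lista.txt, which leave an invisible carriage return at the end of each palavra and break the URL. A sketch of the loop rewritten to strip them, keeping the question's variable names (the function wrapper is my addition):

```shell
# Reads lista.txt line by line, dropping any trailing carriage return
# so URLs built from $palavra are not corrupted by CRLF line endings.
scan_lista() {
  base=$1
  while IFS= read -r palavra; do
    palavra=${palavra%$'\r'}        # strip trailing CR, if present
    [ -n "$palavra" ] || continue   # skip blank lines
    resposta=$(curl -s -o /dev/null -w "%{http_code}" "$base/$palavra/")
    if [ "$resposta" = "200" ]; then
      echo "Diretorio encontrado: $palavra"
    fi
  done < lista.txt
}
```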

how to do if statements with "and" in sh

I'm trying to solve this problem with sh, not bash.
All I want is an if statement that checks a regex AND something else. In bash this would be an easy job, but with sh I only find solutions online that won't work for me.
First thing I want to check:
if echo "$1"| grep -E -q '^(\-t|\-\-test)$';
Then I want to check:
if echo "$#"| grep -E -q '^(1|2)$';
Combined:
if [ \(echo "$1"| grep -E -q '^(\-h|\-\-help)$'\) -a \(echo "$#"| grep -E -q '^(1|\2)$'\) ];
ERROR:
grep: grep: ./analysehtml.sh: 41: [: missing ]
Invalid back reference
(echo: No such file or directory
grep: 1: No such file or directory
I also tried many different combinations of these brackets, but none of them worked for me. Maybe someone can help me here :)
The logical AND between commands is &&:
if
echo "$1"| grep -E -q '^(\-h|\-\-help)$' &&
echo "$#"| grep -E -q '^(1|2)$';
By default, the exit status of a pipeline is the exit status of its last command. With set -o pipefail, the pipeline fails if any command in the pipe fails.
When only the exit status of the last command of a sequence must be checked:
if { command11; command12;} && { command21; command22;};
However, to check parameters there is no need to launch another grep process through a pipe; that adds overhead.
Consider the following constructs instead, which work with any POSIX sh:
if { [ "$1" = -h ] || [ "$1" = --help ];} &&
{ [ $# -eq 1 ] || [ $# -eq 2 ];};
EDIT: The following is not POSIX, but works with many shells:
if [[ $1 = -h || $1 = --help ]] && [[ $# = 1 || $# = 2 ]];
This also works in bash with set -o posix.
Perhaps for your particular case, pattern matching might be better:
if [[ $1 =~ ^(-h|--help)$ && $# =~ ^(1|2)$ ]]; then
The problem with your command is that what goes inside test or [ is an expression, not a command list.
So if you run [ echo 'hello' ] or [ \( echo 'hello' \) ], both sh and Bash complain. Refer to the classic test usage: The classic test command
And the syntax of if is:
if list; then list; fi
So you can just combine commands with the && operator in if statements:
if echo "$1"| grep -E -q '^(\-h|\-\-help)$' && echo "$#"| grep -E -q '^(1|2)$';
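Putting the pure-test version together, a minimal POSIX sh sketch (the function name and messages are my own):

```shell
#!/bin/sh
# Accepts -h/--help as the first argument and requires 1 or 2 arguments
# in total, using only POSIX test -- no grep subprocesses involved.
check_args() {
  if { [ "$1" = -h ] || [ "$1" = --help ]; } &&
     { [ $# -eq 1 ] || [ $# -eq 2 ]; }; then
    echo "help requested"
  else
    echo "other"
  fi
}
```

Usage: `check_args --help` prints `help requested`; `check_args -h a b` prints `other`, because three arguments were passed.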

How to stop xargs on first error?

I have a pages.txt file with 100 URLs inside. I want to check them one by one and fail on the first problem. This is what I'm doing:
cat pages.txt | xargs -n 1 curl --silent \
--output /dev/null --write-out '%{url_effective}: %{http_code}\n'; echo $?
Exit code is 1, but I see it only when the entire file is done. How to stop earlier, on the first problem?
General method
xargs -n 1 sh -c '<your_command> $0 || exit 255' < input
Specific case
xargs -n 1 sh -c 'curl --silent --output /dev/null \
--write-out "%{url_effective}: %{http_code}\n" $0 || exit 255' < pages.txt
Explanation
For every URL in pages.txt, this executes sh -c 'curl ... $0 || exit 255' one at a time (-n 1), forcing it to exit with 255 if the command fails.
From man xargs:
If any invocation of the command exits with a status of 255, xargs will stop immediately without reading any further input. An error message is issued on stderr when this happens.
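The exit-255 mechanism is easy to see without any network at all; in this sketch the URLs are replaced by plain words and a failing test stands in for curl:

```shell
# xargs stops at the first word equal to "bad": the sh -c wrapper exits
# with 255, which tells xargs to read no further input.
printf 'ok1\nbad\nok2\n' |
  xargs -n 1 sh -c '[ "$0" != bad ] || exit 255; echo "$0"'
```

Only `ok1` is printed; `ok2` is never processed, and xargs itself exits nonzero.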
I haven't found a way to do what you ask for with xargs, but a loop with read might be what you are looking for.
while read -r URL; do
curl --silent \
--output /dev/null --write-out '%{url_effective}: %{http_code}\n' "$URL"
RET=$?;
echo $RET;
if [ $RET -ne 0 ]; then break; fi
done < pages.txt

Linux script with curl to check webservice is up

I have a webservice provided at http://localhost/test/testweb
I want to write a script that uses curl to check whether the webservice is up.
Is there a curl parameter that returns 200/OK or true/false, so that I can use it in an if-else block in a Linux shell script?
curl -sL -w "%{http_code}\\n" "http://www.google.com/" -o /dev/null
-s = Silent cURL's output
-L = Follow redirects
-w = Custom output format
-o = Redirects the HTML output to /dev/null
Example:
[~]$ curl -sL -w "%{http_code}\\n" "http://www.google.com/" -o /dev/null
200
I would probably remove the \\n if I were to capture the output.
I use:
curl -f -s -I "http://example.com" &>/dev/null && echo OK || echo FAIL
-f --fail Fail silently (no output at all) on HTTP errors
-s --silent Silent mode
-I --head Show document info only
Note:
depending on needs you can also remove the "-I" because in some cases you need to do a GET and not a HEAD
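Wrapped in a small helper function (the name and timeout are my additions), the check drops neatly into scripts:

```shell
# Hypothetical wrapper: succeeds only when the URL answers without an
# HTTP error. -f fails on HTTP errors, -s silences output, -I sends HEAD.
site_up() {
  curl -f -s -I --connect-timeout 3 "$1" >/dev/null 2>&1
}

# Usage:
#   if site_up "http://example.com"; then echo OK; else echo FAIL; fi
```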
Same as @burhan-khalid's answer, but with --connect-timeout 3 and --max-time 5 added.
test_command() {
curl -sL \
-w "%{http_code}" \
"http://www.google.com:8080/" \
-o /dev/null \
--connect-timeout 3 \
--max-time 5
}
if [ "$(test_command)" = "200" ] ;
then
echo "OK" ;
else
echo "KO" ;
fi
That will check the headers via wget:
2>&1 pipes the stderr to stdout
grep filters
-O /dev/null just throws away the content of the page
if [ "$(wget http://example.org/ -O /dev/null -S --quiet 2>&1 | grep '200 OK')" != "" ];
then
echo Hello;
fi;
I know it's not curl, but still a solution.
I needed a better answer to this, so I wrote the script below.
The fakePhrase is used to detect ISP "Search Assist" adware HTTP responses.
#!/bin/bash
fakePhrase="verizon"
siteList=(
'http://google.com'
'https://google.com'
'http://wikipedia.org'
'https://wikipedia.org'
'http://cantgettherefromhere'
'http://searchassist.verizon.com'
)
exitStatus=0
function isUp {
http=$(curl -sL -w "%{http_code}" "$1" -o temp_isUp)
fakeResponse=$(grep "$fakePhrase" temp_isUp)
if [ -n "$fakeResponse" ]; then
http=$fakePhrase
fi
case $http in
[2]*)
;;
[3]*)
echo 'Redirect'
;;
[4]*)
exitStatus=4
echo "$1 is DENIED with ${http}"
;;
[5]*)
exitStatus=5
echo "$1 is ERROR with ${http}"
;;
*)
exitStatus=6
echo "$1 is NO RESPONSE with ${http}"
;;
esac
}
for var in "${siteList[@]}"
do
isUp "$var"
done
if [ "$exitStatus" -eq "0" ]; then
echo 'All up'
fi
rm temp_isUp
exit $exitStatus
Use this:
curl -o "$CURL_OUTPUT" -s -w '%{http_code}\n%{time_total}\n' "$URL" > "$TMP_FILE" 2>&1
cat "$TMP_FILE"
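A sketch of the same idea as a function, with the undefined variables above replaced by explicit (assumed) arguments; the two -w lines are read back out of the temp file:

```shell
# fetch_metrics URL BODY_FILE TMP_FILE -- all names are illustrative.
# Saves the body to BODY_FILE, writes the status code and total time
# (one per line) to TMP_FILE, then reports both on a single line.
fetch_metrics() {
  url=$1 out=$2 tmp=$3
  curl -o "$out" -s -w '%{http_code}\n%{time_total}\n' "$url" > "$tmp" 2>&1
  http_code=$(sed -n '1p' "$tmp")
  time_total=$(sed -n '2p' "$tmp")
  echo "status=$http_code time=${time_total}s"
}
```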
