How to speed up this curl script and get responses faster - Linux

I have this script that hits a server to fetch some variables. The problem is that it runs slowly, and I have to run the same script multiple times to get a decent request rate. How can I multiply the number of threads with this curl script without needing to run it 4 or 5 times?
I would also like to make it faster and more performant. This is the original:
while ! grep "TokenException" output.txt > /dev/null
do
    echo -e '\n'$(date +%x_%H:%M:%S:%3N) > output.txt
    curl -s -H 'Host: host.com' -H "Cookie: session-token=$SESSION" -H "x-amz-access-token: $token" -H "x-flex-instance-id: $flex" -H 'Accept: */*' -H 'User-Agent: Dalvik/2.1.0 (Linux; U; Android 7.1.1; Nexus 5X Build/N4F26T) RabbitAndroid/3.0.6778.0' -H 'Accept-Language: en-us' --compressed 'https://hostname.com/GetOffersForProvider?serviceAreaIds=16' >> output.txt
    if grep -q "OFFERED" output.txt; then
        cat output.txt >> foundb.txt
        ./getlast.bat
        if [ ! -f pageflag.txt ]; then
            /usr/bin/php alert.php
            echo "paged" > pageflag.txt
        fi
        sleep 0.05
    fi
done
I modified it to:
while ! grep "TokenException" output.txt > /dev/null
do
    echo -e '\n'$(date +%x_%H:%M:%S:%3N) > output.txt
    curl -s -H 'Host: host.com' -H "Cookie: session-token=$SESSION" -H "x-amz-access-token: $token" -H "x-flex-instance-id: $flex" -H 'Accept: */*' -H 'User-Agent: Dalvik/2.1.0 (Linux; U; Android 7.1.1; Nexus 5X Build/N4F26T) RabbitAndroid/3.0.6778.0' -H 'Accept-Language: en-us' --compressed 'https://hostname.com/GetOffersForProvider?serviceAreaIds=16' >> output.txt
    if grep -q "OFFERED" output.txt; then
        cat output.txt >> foundb.txt
        ./getlast.bat
        /usr/bin/php alert.php
        sleep 0.05
    fi
done
Any suggestions on how to run this multi-threaded and faster, perhaps keeping the sleep or using another way to pause for milliseconds instead? The main point is that it should execute ./getlast.bat as fast as possible with the variables caught from curl, but there is a gap of about 2 seconds between the data arriving and getlast.bat executing, which is too much.

curl itself doesn't support multiple connections/threads or resuming cancelled/stalled operations.
Use aria2 or something similar.
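That said, the shell can fan out several curl requests at once even though a single invocation won't. A minimal sketch using background jobs, assuming the same variables as in the question, with per-job output files so the responses don't interleave:
N=4   # number of parallel requests per round
for i in $(seq 1 "$N"); do
    {
        echo -e '\n'$(date +%x_%H:%M:%S:%3N) > "output_$i.txt"
        # same curl command as in the question; add the remaining -H headers back in
        curl -s -H "Cookie: session-token=$SESSION" -H "x-amz-access-token: $token" --compressed 'https://hostname.com/GetOffersForProvider?serviceAreaIds=16' >> "output_$i.txt"
        if grep -q "OFFERED" "output_$i.txt"; then
            cat "output_$i.txt" >> foundb.txt
            ./getlast.bat
        fi
    } &
done
wait   # block until all background requests have returned
As for pausing: GNU sleep already accepts fractional values, so sleep 0.05 pauses for 50 ms; if getlast.bat still lags by ~2 seconds, the delay is more likely the request round-trip time than the sleep.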

Related

m3u8 chunks combined lagging

I wrote a script to download chunks (.ts) from a web page that streams some soccer videos and to combine all the chunks into one file. The issue is that the final result is a video that "lags"; it seems that some frames are missing.
I encoded the website references to avoid getting the question removed.
m3u8 URL
BASE64(aHR0cHM6Ly9uZXdlZGdlLmV1LWNlbnRyYWwtMS5lZGdlLm15Y2RuLmxpdmUvbGl2ZS92aW50ZXF1YXRyb2hyczIvdmludGVxdWF0cm9ocnMyXzIwMDAvaW5kZXgubTN1OA==)
Referer and Origin headers
BASE64(aHR0cHM6Ly9mdXRlbWF4LmFwcA==)
Script
#!/bin/bash
#./hls-download.sh "$url" "$title" "$(date +\"%Y-%m-%d %H:%M:%S\")"
url_m3u8="$1"
title="$2"
duration=$(date -d "$3 minutes" +%s)
cmd_curl="curl -H 'User-Agent: Mozilla/5.0 (X11; Linux x86_64; rv:101.0) Gecko/20100101 Firefox/101.0' -H 'Accept: */*' -H 'Accept-Language: en-US,en;q=0.5' -H 'Accept-Encoding: gzip, deflate, br' -H 'Origin: BASE64(aHR0cHM6Ly9mdXRlbWF4LmFwcA==)' -H 'Connection: keep-alive' -H 'Referer: BASE64(aHR0cHM6Ly9mdXRlbWF4LmFwcA==)' -H 'Sec-Fetch-Dest: empty' -H 'Sec-Fetch-Mode: cors' -H 'Sec-Fetch-Site: cross-site' --retry 5 --fail --compressed "
mkdir -- "$title"
cd "$title"
while [[ $(date -u +%s) -le $duration ]]
do
    response=$(eval "$cmd_curl $url_m3u8")
    urls=$(echo "$response" | grep -oP '(https[^ ]*)')
    for url in $urls
    do
        filename=$(basename "$url")
        if ! [ -f "$filename" ]; then
            touch "$filename"
            eval "$cmd_curl -O $url"
        fi
    done
done
cat *.ts > ../"$title".ts
cd ../
rm -rf "$title"
Run this script passing the m3u8 URL, the program title and the duration in minutes:
./hls-download.sh BASE64(aHR0cHM6Ly9uZXdlZGdlLmV1LWNlbnRyYWwtMS5lZGdlLm15Y2RuLmxpdmUvbGl2ZS92aW50ZXF1YXRyb2hyczIvdmludGVxdWF0cm9ocnMyXzIwMDAvaW5kZXgubTN1OA==) program 1
I am still looking for an answer to my question, but I found a better way to solve the issue:
ffmpeg -headers $'User-Agent: Mozilla/5.0 (X11; Linux x86_64; rv:101.0) Gecko/20100101 Firefox/101.0\r\nAccept: */*\r\nAccept-Language: en-US,en;q=0.5\r\nAccept-Encoding: gzip, deflate, br\r\nOrigin: BASE64(aHR0cHM6Ly9mdXRlbWF4LmFwcA==)\r\nConnection: keep-alive\r\nReferer: BASE64(aHR0cHM6Ly9mdXRlbWF4LmFwcA==)\r\nSec-Fetch-Dest: empty\r\nSec-Fetch-Mode: cors\r\nSec-Fetch-Site: cross-site' -i "BASE64(aHR0cHM6Ly9uZXdlZGdlLmV1LWNlbnRyYWwtMS5lZGdlLm15Y2RuLmxpdmUvbGl2ZS92aW50ZXF1YXRyb2hyczIvdmludGVxdWF0cm9ocnMyXzIwMDAvaW5kZXgubTN1OA==)" -c copy -bsf:a aac_adtstoasc "output.mp4"
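For anyone sticking with the curl approach, one likely cause of the lag is that cat *.ts joins the segments in glob (filename) order rather than in playlist order, and any segment that failed to download leaves a gap. A sketch of a loop that remembers the playlist order (segments.txt is a helper file introduced here; the variables are the ones defined earlier in the script):
: > segments.txt   # ordered list of segment file names as the playlist announces them
while [[ $(date -u +%s) -le $duration ]]
do
    playlist=$(eval "$cmd_curl $url_m3u8")
    for url in $(echo "$playlist" | grep -oP 'https[^ ]*')
    do
        filename=$(basename "$url")
        if ! grep -qxF "$filename" segments.txt; then
            echo "$filename" >> segments.txt
            eval "$cmd_curl -O $url"
        fi
    done
done
# concatenate in playlist order, skipping any segment that never arrived
while read -r f; do [ -f "$f" ] && cat "$f"; done < segments.txt > ../"$title".ts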

How to SSH a curl command

env = GNU bash, version 4.2.46(2)-release (x86_64-redhat-linux-gnu)
Situation:
SystemA has no internet access; SystemB does.
SystemA has a log file and wants SystemB to send a curl command on its behalf.
On SystemA:
ssh SystemB curl -X POST -H "Content-type: application/json" -d "$data" $hook = fail
On SystemB:
curl -X POST -H "Content-type: application/json" -d "$data" $hook = success
How do I achieve this without SystemA scp'ing the log file to SystemB?
It's heavily schedule-related, so I want SystemA to drive the work on SystemB.
EDIT:
I narrowed down the problem. On SystemB:
curl -X POST -H "Content-type: application/json" -d '{$data}' $hookurl = success
curl -X POST -H "Content-type: application/json" -d {$data} $hookurl = fail
So when I type this on SystemA:
ssh SystemB curl -X POST -H "Content-type: application/json" -d "{$data}" $hookurl
it actually runs with -d {$data} on SystemB. How can I fix this?
Update:
ssh SystemB curl -X POST -H "Content-type: application/json" -d "'{$data}'" $hookurl
did work and actually sent data to the URL,
but the error curl: (6) Could not resolve host: application; Unknown error came up again.
You can use this command:
ssh SystemB /bin/bash <<< "$(declare -p data hook);"'curl -X POST -H "Content-type: application/json" -d "$data" "$hook"'
The "$(declare -p data hook);" part takes the variable definitions from SystemA and passes them to SystemB.
Use sshfs perhaps.
On SystemB:
sshfs -p'ssh port' -o password_stdin user@systemA:/home/user/dir /home/user/dir <<< 'passwd'

Grep unrecognized option '-->' while parsing content of HTML element [closed]

I'm trying to parse the content of a div element with the class "error-icon", and grep reports an unrecognized option in multiple attempts. Here is my code. Please help.
#!/bin/sh
verifyCard=$1
if [ -z "${verifyCard}" ]; then echo "No argument supplied"; exit 1; fi
response=$(curl 'https://www.isic.org/verify/' -H 'User-Agent: Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:80.0) Gecko/20100101 Firefox/80.0' -H 'Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8' -H 'Accept-Language: en-US,en;q=0.5' --compressed -H 'Content-Type: application/x-www-form-urlencoded' -H 'Origin: https://www.isic.org' -H 'Connection: keep-alive' -H 'Referer: https://www.isic.org/verify/' -H 'Cookie: PHPSESSID=46plnh6b31e2pusv1do6thfbm7; AWSELB=AF73034F18B38D7DCED6DEDC728D31BA3F3A73F96747FEE7FA7C4F7A74BC9954E5928CBDDD5C053FFB2A37CE37136C4BA008B15163192B34CFA04D35BEC4ED0D0D2913A2FB; AWSELBCORS=AF73034F18B38D7DCED6DEDC728D31BA3F3A73F96747FEE7FA7C4F7A74BC9954E5928CBDDD5C053FFB2A37CE37136C4BA008B15163192B34CFA04D35BEC4ED0D0D2913A2FB; _ga=GA1.2.650910486.1600495658; _gid=GA1.2.731428038.1600495658; _gat=1' -H 'Upgrade-Insecure-Requests: 1' --data-raw 'verify_card_number=${$verifyCard}')
output=$(grep -o '<div class="error-icon">[^<]*' "$response" | grep -o '[^>]*$')
echo "$output"
The line
response=$(curl ...)
puts the output of the curl command in a variable named response.
In your grep command you try to pass the expansion of the variable as an argument.
output=$(grep -o '<div class="error-icon">[^<]*' "$response" | ...)
grep then tries to interpret that value as command-line arguments, which may result in various errors depending on the actual output. In my test I got the message grep: <some html code>: File name too long because grep interpreted the HTML as a file name argument.
You should save the data in a file and pass this to grep. Adapt the name and location of the temporary file as necessary. Example:
#!/bin/sh
verifyCard=$1
if [ -z "${verifyCard}" ]; then echo "No argument supplied"; exit 1; fi
curl -o response-file 'https://www.isic.org/verify/' -H 'User-Agent: Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:80.0) Gecko/20100101 Firefox/80.0' -H 'Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8' -H 'Accept-Language: en-US,en;q=0.5' --compressed -H 'Content-Type: application/x-www-form-urlencoded' -H 'Origin: https://www.isic.org' -H 'Connection: keep-alive' -H 'Referer: https://www.isic.org/verify/' -H 'Cookie: PHPSESSID=46plnh6b31e2pusv1do6thfbm7; AWSELB=AF73034F18B38D7DCED6DEDC728D31BA3F3A73F96747FEE7FA7C4F7A74BC9954E5928CBDDD5C053FFB2A37CE37136C4BA008B15163192B34CFA04D35BEC4ED0D0D2913A2FB; AWSELBCORS=AF73034F18B38D7DCED6DEDC728D31BA3F3A73F96747FEE7FA7C4F7A74BC9954E5928CBDDD5C053FFB2A37CE37136C4BA008B15163192B34CFA04D35BEC4ED0D0D2913A2FB; _ga=GA1.2.650910486.1600495658; _gid=GA1.2.731428038.1600495658; _gat=1' -H 'Upgrade-Insecure-Requests: 1' --data-raw 'verify_card_number=${$verifyCard}'
output=$(grep -o '<div class="error-icon">[^<]*' response-file | grep -o '[^>]*$')
rm response-file
echo "$output"
If you use bash or zsh instead of sh, there are ways to substitute some variable value as an input file, see e.g. the answers in using a Bash variable in place of a file as input for an executable.
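For example, with bash instead of sh the temporary file can be avoided by feeding the variable to grep through a here-string (a sketch along the lines of the script above, not tested against the live site):
#!/bin/bash
verifyCard=$1
if [ -z "${verifyCard}" ]; then echo "No argument supplied"; exit 1; fi
response=$(curl 'https://www.isic.org/verify/' ...)   # same curl command as in the question
# <<< passes the contents of $response on grep's standard input
output=$(grep -o '<div class="error-icon">[^<]*' <<< "$response" | grep -o '[^>]*$')
echo "$output"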

How to escape double quotes and exclamation mark in password?

I have the following code:
curl -s --insecure -H "Content-Type: application/json" -X POST -d "{\"username\":\"$1\",\"password\":\"$2\"}" http://apiurl
In the above curl command I want to escape the " and ! in the password. Right now the password is just given as $2.
I have modified the curl command as below, but it doesn't seem to work:
curl -s --insecure -H "Content-Type: application/json" -X POST -d "{\"username\":\"$1\",\"password\":\"$2\"}" http://apiurl
Do not try to generate JSON manually; use a program like jq, which knows how to escape things properly, to generate it for you.
json=$(jq -n --arg u "$1" --arg p "$2" '{username: $u, password: $p}')
curl -s --insecure -H "Content-Type: application/json" -X POST -d "$json" http://apiurl
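For example, with a hypothetical username and a password containing both a double quote and an exclamation mark, jq emits correctly escaped JSON (the ! needs no escaping in JSON, and the shell leaves it alone inside single quotes):
$ jq -n --arg u 'alice' --arg p 'pa"ss!word' '{username: $u, password: $p}'
{
  "username": "alice",
  "password": "pa\"ss!word"
}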

bash script to windows cmd, with nested command

I am running this curl command successfully in a bash console to retrieve data from a REST resource:
curl -H "Accept: application/json" -H "Content-Type: application/json" -H "X-content: test_content" -H "X-Public: public_key" -H "X-Hash: $(printf "testcompany1" | openssl sha256 -hmac "secret_key" | sed "s/^.* //" | tr -d "\n")" -X GET http://192.168.100.20/rest/v01/customer/csv > /home/user/customer.csv
What I want to do is use this in a Windows cmd shell. Cygwin and curl are installed. I have tracked the problem down to the part that puzzles me: the HMAC hashing, done as a nested command in my script using $(command):
$(printf "testcompany1" | openssl sha256 -hmac "secret_key" | sed "s/^.* //" | tr -d "\n")
How do I get a Windows cmd shell to recognize this? Or is there another, smarter approach in Windows to get REST data with HMAC auth?
I split the bash one-liner up into chunks in a batch file like this:
echo off
set content=content
set secret=secret_key
printf %content% | openssl sha256 -hmac %secret% | sed "s/^.* //" | tr -d "\n" > hash.txt
set /p hash=< hash.txt
curl -H "Accept: application/json" -H "Content-Type: application/json" -H "X-content: %content%" -H "X-Public: public_key -H "X-Hash: %hash%" -X GET http://192.168.100.20/rest/v01/customer/csv > out.csv
The printf command was very picky about generating the correct hash value, so I send its output to a file first and read it back.
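Since Cygwin is installed, another option (a sketch; get_customer.sh is a hypothetical file name) is to keep the whole pipeline in a bash script, where the $(command) substitution works unchanged, and call that script from cmd:
#!/bin/bash
# get_customer.sh - runs entirely under Cygwin's bash, so $(...) works as in the original one-liner
hash=$(printf "testcompany1" | openssl sha256 -hmac "secret_key" | sed "s/^.* //" | tr -d "\n")
curl -H "Accept: application/json" -H "Content-Type: application/json" -H "X-content: test_content" -H "X-Public: public_key" -H "X-Hash: $hash" -X GET http://192.168.100.20/rest/v01/customer/csv > customer.csv
Then invoke it from cmd (or from a batch file) as: bash get_customer.sh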

Resources