Run cURL command every 5 seconds - linux

This is the command that I want to run -
curl --request POST --data-binary @payload.txt --header "carriots.apiKey:XXXXXXXXXXXXXXXXXXXX" --verbose http://api.carriots.com/streams
This basically sends a data stream to a server.
I want to run this command every 5 seconds. How do I achieve this?

You can run it in a while loop:
while sleep 5; do cmd; done
Edit:
If you don't want a while loop, you can use the watch command:
watch -n 5 cmd
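As a sketch, the while/sleep idea can be wrapped in a small helper function (repeat_every is a name made up here, not a standard command) so the interval and repetition count are explicit:

```shell
# repeat_every SECONDS COUNT CMD...: run CMD COUNT times, sleeping
# SECONDS between runs (a generic version of the while/sleep answer).
repeat_every() {
    interval=$1; count=$2; shift 2
    i=0
    while [ "$i" -lt "$count" ]; do
        "$@"
        i=$((i + 1))
        if [ "$i" -lt "$count" ]; then
            sleep "$interval"
        fi
    done
}

# The asker's command would then run as (key and payload are their placeholders):
# repeat_every 5 1000 curl --request POST --data-binary @payload.txt \
#     --header "carriots.apiKey:XXXXXXXXXXXXXXXXXXXX" http://api.carriots.com/streams
```

Unlike `while sleep 5; do cmd; done`, this sleeps between runs rather than before the first one, and it stops after a fixed count instead of looping forever.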

Another simple way to accomplish the same task is:
watch -n {{seconds}} {{your-command}}
For example, every 900 milliseconds (note the double quotes around "your-command"):
watch -n 0.9 "curl 'https://public-cloud.abstratium.dev/user/site'"

Related

Bash script results in different output when running from a cron job

I'm puzzled by this problem I'm having on Ubuntu 20.04, where cron is able to run a bash script but the overall outcome is different than when using the shell command.
I've looked through all the questions I could find here and on Google but couldn't find anyone who had the same problem.
Background:
I'm using Pushgateway to store metrics I'm generating through a bash script, and afterwards it's being imported automatically to Prometheus.
The end goal is to export a list of running processes, their CPU%, Mem% etc, similar to top command.
This is the bash script:
#!/bin/bash
z=$(top -n 1 -bi)
while read -r z
do
    var=$var$(awk 'FNR>7{print "cpu_usage{process=\""$12"\", pid=\""$1"\"}", $9z} FNR>7{print "memory_usage{process=\""$12"\", pid=\""$1"\"}", $10z}')
done <<< "$z"
curl -X POST -H "Content-Type: text/plain" --data "$var
" http://localhost:9091/metrics/job/top/instance/machine
I used to have a version that used ps aux but then I found out that it only shows the average CPU% per process.
As you can see, the command I'm running is top -n 1 -bi, which gives me a snapshot of active processes and their metrics.
I'm using awk to format the data, and FNR>7 because I need to ignore the first 7 lines, which are the summary presented by top.
The bash script is registered in /bin, /usr/bin and /usr/local/bin.
When checking http://localhost:9091/metrics, which is supposed to show me the information gathered, I'm getting this kind of information when running the script from the shell:
cpu_usage{instance="machine",job="top",pid="114468",process="php-fpm74"} 17.6
cpu_usage{instance="machine",job="top",pid="114483",process="php-fpm74"} 11.8
cpu_usage{instance="machine",job="top",pid="126305",process="ffmpeg"} 64.7
And this is the same information when cron is running the same script:
cpu_usage{instance="machine",job="top",pid="114483",process="php-fpm+"} 5
cpu_usage{instance="machine",job="top",pid="126305",process="ffmpeg"} 60
cpu_usage{instance="machine",job="top",pid="128777",process="php"} 15
So, for some reason, when I run it from cron it cuts the process name after 7 characters.
I initially thought it was related to the FNR>7, but even after changing it to 8 or 9 (and using exec bash to re-register the command) it gives the same results; when I run it manually it works just fine.
Any help would be appreciated!
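One hypothesis worth checking (an assumption, not confirmed in this thread): top truncates the COMMAND column to fit the terminal width, and under cron there is no terminal, so top falls back to a narrow default width, which would explain truncated names like "php-fpm+". A sketch of a fix is to force a wide output via the COLUMNS environment variable when capturing top's snapshot:

```shell
# Force a wide output so top does not truncate process names when
# there is no terminal (e.g. under cron). 512 is an arbitrary width.
z=$(COLUMNS=512 top -n 1 -bi)
```

If this is the cause, the same script should then produce full process names both from the shell and from cron.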

Cronjob stuck without exiting

I have 50+ cron jobs like the one given below running on my CentOS 7 server.
curl -s https://url.com/file.php
This runs every 10 minutes. When run manually from the shell it takes only 1-2 minutes, and it also works fine as a cron job. The problem is that it does not exit after execution. When I check my processes using the ps command, it shows many cron jobs from previous dates (even 10 days before), which inflates the total number of processes on my server.
Line in crontab :-
*/10 * * * * user curl -s https://url.com/file.php > /dev/null 2>&1
Is there any reason for this? If I remember correctly, this started happening after the latest patch update.
Please help.
Modify your command to store the logs in log files instead of dumping them to /dev/null.
Options
--max-time
--connect-timeout
--retry
--retry-max-time
can be used to control the curl command behaviour.
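A sketch of the crontab line with those options applied (the timeout and retry values here are arbitrary examples, and the log path is made up):

```
*/10 * * * * user curl -s --connect-timeout 10 --max-time 300 --retry 2 --retry-max-time 400 https://url.com/file.php >> /var/log/file-php-cron.log 2>&1
```

--max-time caps the whole transfer, so a request that hangs is killed after 5 minutes instead of accumulating as a stuck process until the next run.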

Script to launch a linux terminal, wait 1s and then launch command in it

I would like to automate showing a nice weather report:
curl wttr.in/bydgoszcz
[screenshot: weather report]
with such script launched from the system menu:
#!/bin/sh
exec io.elementary.terminal -e "curl wttr.in/bydgoszcz"
But this way the output gets a little "incomplete", as if the command were executed too fast (notice where the user#machine line went here):
[screenshot: weather report in the script]
So, is there a way to delay the command after the -e flag?
Or maybe a totally different approach would show the output properly?
I'm not sure, but try this:
exec io.elementary.terminal -e "sleep 1 && curl wttr.in/bydgoszcz"
Would
#!/bin/sh
WTTR=$(curl -s wttr.in/bydgoszcz) && \
exec io.elementary.terminal -e "echo $WTTR"
solve the problem?

How to run multithreading wget to make load test on REST API

I'd like to write a Linux bash script to load-test a URL serving REST, specifically measuring the time spent. I'd like to run multiple wget threads and then evaluate the time spent once ALL the threads have terminated. But the following sh code doesn't calculate the time properly; it returns control immediately without waiting for the threads to end. Could you help me? Thanks.
date > temps
for i in $(seq 1 10)
do
wget -q --header "Content-Type: application/json" --post-file=json.txt server &
done
date >> temps
I think you are looking for a benchmark utility. You may try these commands:
ab (Apache Benchmark)
siege
Both have the option for "concurrent connections" and detailed stats.
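That said, the immediate bug in the posted script is that it never waits for the background wget jobs, so the second date runs right away. A sketch of the fix, wrapped in a made-up helper function (timed_parallel is not a standard command) so the missing wait is explicit:

```shell
# timed_parallel COUNT CMD...: launch COUNT copies of CMD in the
# background, wait for all of them, and print the elapsed seconds.
timed_parallel() {
    count=$1; shift
    start=$(date +%s)
    i=0
    while [ "$i" -lt "$count" ]; do
        "$@" &
        i=$((i + 1))
    done
    wait   # the missing step: block until every background job ends
    end=$(date +%s)
    echo $((end - start))
}

# The asker's test would then be ("server" and json.txt are their placeholders):
# timed_parallel 10 wget -q --header "Content-Type: application/json" \
#     --post-file=json.txt server
```

Without the `wait`, the shell reaches the final timestamp as soon as the loop has launched the jobs, which is why the measured time was near zero.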

Curl Command to Repeat URL Request

What's the syntax for a Linux command that hits a URL repeatedly, x number of times? I don't need to do anything with the data; I just need to replicate hitting refresh 20 times in a browser.
You could use URL sequence substitution with a dummy query string (if you want to use curl and save a few keystrokes):
curl http://www.myurl.com/?[1-20]
If you have other query strings in your URL, assign the sequence to a throwaway variable (and quote the URL so the shell doesn't treat & as a background operator):
curl "http://www.myurl.com/?myVar=111&fakeVar=[1-20]"
Check out the URL section on the man page: https://curl.haxx.se/docs/manpage.html
for i in `seq 1 20`; do curl http://url; done
Or if you want to get timing information back, use ab:
ab -n 20 http://url/
You might be interested in the Apache Bench tool, which is basically used to do simple load testing.
Example:
ab -n 500 -c 20 http://www.example.com/
-n = total number of requests, -c = number of concurrent requests
You can use any bash looping construct like for, which is compatible with Linux and Mac.
https://tiswww.case.edu/php/chet/bash/bashref.html#Looping-Constructs
In your specific case you can define N iterations, where N is the number of curl executions you want (replace N with a literal number; brace expansion does not expand variables).
for n in {1..N}; do curl <arguments>; done
ex:
for n in {1..20}; do curl -d @notification.json -H 'Content-Type: application/json' localhost:3000/dispatcher/notify; done
If you want an interval before the next execution, you can add a sleep:
for i in {1..100}; do echo $i && curl "http://URL" >> /tmp/output.log && sleep 120; done
If you want a delay between requests, you could use the watch command in Linux:
watch curl https://yourdomain.com/page
This will call your URL every 2 seconds (the default). Alter the interval by adding the -n parameter with the delay in seconds. For instance:
watch -n0.5 curl https://yourdomain.com/page
This will now call the URL every half second.
CTRL+C to exit watch
