How can I test an HTTP POST request from the Linux command line - linux

I'm testing a GET request to my server using wget,
i.e. I created a script that takes an argument from the command line and uses it as part of the wget request:
#!/bin/bash
wget "http://servername?querystring$1"
But I need to change my server so it only accepts an HTTP POST rather than an HTTP GET. That's fine, but I don't know how to easily test it. How can I send a POST with a parameter rather than a GET?
I'm on a version of Linux.

You have to use --post-data:
wget --post-data 'user=foo&password=bar' http://server.com/auth.php
For more options, have a look at the man page here or with man wget.
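If you'd rather test with curl instead of wget, the equivalent POST looks like this (a minimal sketch; foo and bar are just the placeholder values from above):
curl -d 'user=foo&password=bar' http://server.com/auth.php
curl --data-urlencode 'user=foo' --data-urlencode 'password=bar' http://server.com/auth.php
The second form lets curl URL-encode each value for you.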

Related

How to use curl to call a PHP page to get a file URL first and then download it?

On my webserver, the PHP code can get the latest file list.
On another Linux server, I want to use CRON and cURL to call this PHP code to get the file list first, and then download the file.
curl http://www.website.com/index.php
This works and its result is a URL string (e.g. http://www.website.com/files/new.zip ).
However, I don't know how to pass this string to curl to download the file. What I am trying to do is something like this:
curl -O (curl http://www.website.com/index.php )
How can I make this work? Thanks.
You were almost there:
curl $(curl "http://www.website.com/index.php")
If you want to hide progress:
curl $(curl -s "http://www.website.com/index.php")
Looks like the OP had line breaks/carriage returns in the output URL, so http://.... became .ttp://....
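If that is the case, a small sketch that strips stray line endings before downloading (assuming the PHP page returns nothing but the URL) would be:
url=$(curl -s "http://www.website.com/index.php" | tr -d '\r\n')
curl -O "$url"
tr -d removes any carriage returns or newlines, and curl -O then saves the file under its remote name.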

cPanel email piping to a shell script in Linux, syntax

I am trying to write a shell script that emails get piped to (via an email forward in cPanel).
The shell script will then post the entire email to a url using curl.
The script looks like this:
curl -d "param=$1" http://localhost/stuff/
And the forward looks like this:
|/home/usr/script/curlthis.sh
This is only sort of working.
The email gets bounced back even though curl posts to the URL successfully (it looks like only part of the email is getting posted, but I am not 100% sure).
I have been told the email bounces because I am not reading stdin, but I am not sure why I need to do that, or why I cannot use $1.
How can I read the entire contents of the pipe (then post it using curl), and will that stop the mail server from bouncing it back?
EDIT
Using the answer below, here is what I came up with:
#!/bin/bash
m=$(cat -)
escapedm="$(perl -MURI::Escape -e 'print uri_escape($ARGV[0]);' "$m")"
curl --silent -G -d "param=$escapedm" http://localhost/stuff/ 2>&1 >/dev/null
This part:
2>&1 >/dev/null
is shockingly important. If you don't redirect the stdout/err to null then the email gets bounced back for whatever reason.
Your mail is being passed to the script as a stream on stdin, and not as a parameter ($1). Note that your forward script begins with a pipe, and that's the mechanism passing the mail into your script.
So you should be able to read this in your shell (bash?) using the read statement. See this SO answer for more details.
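As a minimal sketch, assuming your curl is 7.18 or newer and supports --data-urlencode, the whole script can read stdin and let curl do the escaping instead of the perl one-liner:
#!/bin/bash
# Read the entire piped email from stdin, then POST it; curl handles the URL-encoding.
body=$(cat -)
curl --silent --data-urlencode "param=$body" http://localhost/stuff/ >/dev/null 2>&1
For very large messages, --data-urlencode "param@somefile" can read the content from a file instead of passing it through a shell variable.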

Giving credentials for a URL in a cron job

I need to write a cron job which hits a URL once every day. The problem is that this URL needs authentication. How can I authenticate and hit the URL through a cron job?
Thanks
Then write a script; for example, from the wget manual:
# Log in to the server. This can be done only once.
wget --save-cookies cookies.txt \
--post-data 'user=foo&password=bar' \
http://server.com/auth.php
# Now grab the page or pages we care about.
wget --load-cookies cookies.txt \
-p http://server.com/interesting/article.php
Then call this script from your user cron or the system cron.
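For example, a crontab entry along these lines (the script path and schedule are just placeholders) would run it once a day:
# Run the login-and-fetch script every day at 03:00.
0 3 * * * /home/user/fetch-article.sh >/dev/null 2>&1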
If you want a helpful answer, you need to answer Adam's question: "Is this 'HTTP authentication' or a regular login?" (though I don't know what he means by "regular login").
Jasonw's answer is extremely unlikely to work.
Adam's answer of wget --http-user=foo --http-passwd=bar http://... is your best bet. In fact, given the "403" return code, I am willing to bet that it is the answer you need.

shell script wget not working when used as a cron job

I have a function in a PHP web app that needs to be called periodically by a cron job. Originally I just did a simple wget to the URL to call the function and everything worked fine, but ever since we added user auth I am having trouble getting it to work.
If I manually execute these commands I can log in, get the cookie, and then access the correct URL:
site=http://some.site/login/in/here
cookie=`wget --post-data 'username=testuser&password=testpassword' $site -q -S -O /dev/null 2>&1 | awk '/Set-Cookie/{print $2}' | awk 'NR==2{print}'`
wget -O /dev/null --header="Cookie: $cookie" http://some.site/call/this/function
But when these are executed as a script, either manually or through cron, it doesn't work.
I am new to shell scripting; any help would be appreciated.
This is being run on Ubuntu Server 10.04.
OK simple things first -
I assume the file begins with #!/bin/bash or something
You have chmodded the file +x
You're using Unix line endings (0x0a, not Windows-style 0x0d 0x0a)
And you're not expecting to return any of the variables to the calling shell, I presume?
Failing this, try teeing the output of each command to a log file.
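A sketch of what that debugging might look like, reusing the commands from the question (the log path is arbitrary):
#!/bin/bash
# Log what each step actually returns when run from cron, so the failing step is visible.
log=/tmp/cron-wget-debug.log
site=http://some.site/login/in/here
cookie=$(wget --post-data 'username=testuser&password=testpassword' "$site" -q -S -O /dev/null 2>&1 \
  | awk '/Set-Cookie/{print $2}' | awk 'NR==2{print}' | tee -a "$log")
echo "cookie=$cookie" >> "$log"
wget -O /dev/null --header="Cookie: $cookie" http://some.site/call/this/function 2>&1 | tee -a "$log"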
In theory, the only difference between manually executing these commands and running them from a script would be the timing.
Try inserting a sleep 5 or so before the last command. Maybe the HTTP server does some internal communication and that takes a while. Hard to say, because you didn't post the error you get.

Setting up Dreamhost Cron job to simply execute URL

Just when I thought I was understanding cron jobs, I realize I'm still not understanding. I'm trying to set up a cron job through Dreamhost to ping a URL once an hour. This URL, when visited, performs a small(ish) query and updates the database.
A few examples I've tried which don't seem to have worked:
wget -O /dev/null http://www.domain.com/index.php?ACT=25&profile_id=1
and
wget -q http://www.domain.com/index.php?ACT=25&profile_id=1
The correct domain was inserted into the URL of course.
So, what am I missing? How could I execute a URL via a Cronjob?
One thing: are you escaping your URL?
Try with:
wget -O /dev/null "http://www.domain.com/index.php?ACT=25&profile_id=1"
An unquoted ampersand usually leads to strange behaviour (the shell puts the process in the background and treats the rest of the URL as a separate command, etc.).
I just had the same exact problem, and I found that two solutions actually work. One is, as Victor Pimentel suggested, enclosing the URL in double quotes; the second option is to escape the & character in the cron job like this: \&, so in your case the statement would look like this:
wget -q http://www.domain.com/index.php?ACT=25\&profile_id=1
or
wget -q "http://www.domain.com/index.php?ACT=25&profile_id=1"
Putting the following in the Dreamhost control panel under Goodies > Cron seems to work for me:
wget -qO /dev/null http://domain/cron.php
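An equivalent with curl, if wget keeps misbehaving (assuming curl is installed on the host), would be:
curl -s -o /dev/null "http://domain/cron.php"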
links -dump http://Txx-S.com/php/test1.php
It worked much better than wget. It echoes the output of the PHP script into the email without all the junk that wget provides. Took a while to get here, but it IS in the Dreamhost documentation. You don't need all the home/user stuff and the headache of placing all the PHP files under different users...
IMHO.
Pete
