I need to write a cron job which hits a URL once every day. The problem is that this URL needs authentication. How can I authenticate and hit the URL through a cron job?
Thanks
Then write a script. For example, from the wget manual:
# Log in to the server. This can be done only once.
wget --save-cookies cookies.txt \
     --post-data 'user=foo&password=bar' \
     http://server.com/auth.php

# Now grab the page or pages we care about.
wget --load-cookies cookies.txt \
     -p http://server.com/interesting/article.php
Then call this script from user cron or system cron.
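For instance, a complete version of that script, plus the crontab entry that runs it daily, might look like this (the paths, form fields, and URLs are the placeholders from the manual example, not real values):

#!/bin/bash
# fetch-article.sh -- log in, then fetch the protected page.

# Log in and store the session cookie (placeholder credentials).
wget --save-cookies /tmp/cookies.txt \
     --post-data 'user=foo&password=bar' \
     -O /dev/null http://server.com/auth.php

# Reuse the cookie to fetch the page we actually want.
wget --load-cookies /tmp/cookies.txt \
     -O /tmp/article.html http://server.com/interesting/article.php

A crontab entry such as the following would then run it once a day (06:00 and the script path are arbitrary placeholders):

0 6 * * * /home/you/fetch-article.sh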
If you want a helpful answer, you need to answer Adam's question: "Is this 'HTTP authentication' or a regular login?" (though I don't know what he means by "regular login").
Jasonw's answer is extremely unlikely to work.
Adam's answer of wget --http-user=foo --http-passwd=bar http://... is your best bet. In fact, given the "403" return code, I am willing to bet that it is the answer you need.
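If it is HTTP authentication, the cron entry could look something like this (note that newer wget releases spell the flag --http-password rather than --http-passwd; the credentials, schedule, and URL are placeholders):

# Run once a day at 03:00 with HTTP basic auth; output is discarded.
0 3 * * * wget -q -O /dev/null --http-user=foo --http-password=bar "http://server.com/protected/page"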
Related
I'm testing a GET request to my server using wget, i.e., I created a script that takes an argument from the command line and uses it as part of the wget request:
#!/bin/bash
wget "http://servername?querystring$1"
but I need to change my server so it only accepts an HTTP POST rather than an HTTP GET. That's fine, but I don't know how to easily test it. How can I send a POST with a parameter rather than a GET?
I'm on a version of Linux.
You have to use --post-data:
wget --post-data 'user=foo&password=bar' http://server.com/auth.php
For more options, you can have a look at the man page here or with man wget.
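If curl is installed, an equivalent test would be (same placeholder field names and URL as above):

curl -d 'user=foo&password=bar' http://server.com/auth.php

curl's -d flag sends the data as an HTTP POST, just like wget's --post-data.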
First and foremost, Linux is by far not my thing, but I can understand the basics.
What I need to do is log in to a site; it will redirect me to another page, at which point I will need to submit that page. The site uses session cookies, so I will need to track those in order to perform all of the actions I need.
After reviewing some sites, this is what I was able to come up with, but it does not appear to be capturing the session cookies, if nothing else.
The entire point of this is to set up a cron job that will run a report for me on a weekly basis, instead of me having to log in to the site and manually perform the tasks.
curl -d "username=MYUSERNAME&password=MYPASSWORD&submit=Login" \
     --location --dump-header headers_and_cookies \
     "http://example.com/?event=loginProcess" \
     "http://example.com/?event=thepromiseland"
This is the wget command that I already had, which seems not to work:
wget --save-cookies cookies.txt --keep-session-cookies \
     --post-data="username=MyUserName&password=MyPassword" \
     --no-cache --spider "http://example.com/?event=thepromiseland"
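For reference, the two-step pattern from the wget manual example earlier would look something like this here (the form fields and event names are the guesses from above, and whether it works depends on the site's actual login flow):

# Step 1: log in and keep the session cookies. --spider is dropped so
# the response (and its Set-Cookie headers) is fully processed.
wget --save-cookies cookies.txt --keep-session-cookies \
     --post-data="username=MyUserName&password=MyPassword" \
     -O /dev/null "http://example.com/?event=loginProcess"

# Step 2: reuse the session cookies to fetch the report page.
wget --load-cookies cookies.txt \
     -O report.html "http://example.com/?event=thepromiseland"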
I've tried every variation I can think of, but I can't seem to get my script to execute. I can hit it manually and it runs fine, but I'd prefer to have an outside server just hit it once per day. Is the trouble I'm having due to the presence of a query string?
My script URL looks like:
http://domain.com/index.php?g=main&reset=true
I've tried curl, wget, lynx, GET... all with no result whatsoever. What am I missing here?
Aha... knew it had to be something simple. I was missing quotes around the URL! The final working line in cPanel is:
wget -O - "http://domain.com/index.php?g=main&reset=true" >/dev/null 2>&1
You can simply do it like this in a cron job:
/usr/bin/curl --user-agent cPanel-Cron "http://example.com/index.php?g=main&reset=true"
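With the schedule included, the full crontab line for a daily run might be (midnight is an arbitrary choice; redirecting output keeps cron from emailing it to you):

0 0 * * * /usr/bin/curl --user-agent cPanel-Cron "http://example.com/index.php?g=main&reset=true" >/dev/null 2>&1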
I've never written a cron job command before in my life, and I want to make sure it's right before I run it on my site so nothing messes up.
There is a JSON feed that my script autopost.php grabs and adds to my database. Usually I just point my browser at the file so it runs and updates the database, but I hear cron jobs can do that automatically for me. Would this be correct?
wget -O http://www.domain.com/autopost.php
EDIT:
Okay, with your help I got it working. However, it only works when I add /dev/null after wget -O. Why is that?
You're missing the run schedule part of the cron command. You need something like this:
* * * * * wget -O /dev/null http://www.domain.com/autopost.php
That says "do a wget every minute." Note that -O expects a filename to write the downloaded page to; /dev/null simply discards it, which is why the command only works with it there. For the syntax, see:
http://www.adminschoice.com/crontab-quick-reference
Sure it can!
Follow this link and you will be able to schedule your job for any time you want:
www.aodba.com
Just when I thought I was understanding cron jobs, I realize I'm still not understanding. I'm trying to set up a cron job through DreamHost to ping a URL once an hour. When visited, this URL performs a small(ish) query and updates the database.
A few examples I've tried, which don't seem to have worked:
wget -O /dev/null http://www.domain.com/index.php?ACT=25&profile_id=1
and
wget -q http://www.domain.com/index.php?ACT=25&profile_id=1
The correct domain was inserted into the URL of course.
So, what am I missing? How could I execute a URL via a Cronjob?
One thing: are you escaping your URL?
Try with:
wget -O /dev/null "http://www.domain.com/index.php?ACT=25&profile_id=1"
Having an unquoted ampersand in the URL usually leads to strange behaviour (the shell sends the process to the background and ignores the rest of the URL, etc.).
I just had the exact same problem, and I found that two solutions actually work. One is, as Victor Pimentel suggested, enclosing the URL in double quotes; the second option is to escape the & character in the cron job like this: \&. In your case the statement would look like this:
wget -q http://www.domain.com/index.php?ACT=25\&profile_id=1
or
wget -q "http://www.domain.com/index.php?ACT=25&profile_id=1"
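Either way, the complete crontab line for an hourly run would look something like this (running at minute 0 is an arbitrary choice):

0 * * * * wget -q "http://www.domain.com/index.php?ACT=25&profile_id=1"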
Putting the following in the DreamHost control panel under Goodies > Cron seems to work for me:
wget -qO /dev/null http://domain/cron.php
links -dump http://Txx-S.com/php/test1.php
It worked much better than wget. It echoes the output of the PHP script into the email without all the junk that wget produces. It took a while to get here, but it IS in the DreamHost documentation. You don't need all the home/user stuff or the headache of placing all the PHP files under different users...
IMHO.
Pete