Cronjob with password protected site (.htaccess)

I want to create a cronjob that opens a webpage every X minutes.
This webpage is password protected by .htaccess (user=admin, password=pass). The command I run is the following:
wget --user=admin --password='pass' http://www.mywebsite.com/test.php
But cron gives me the following error:
--2012-05-02 10:14:01-- http://www.mywebsite.com/test.php
Resolving www.mywebsite.com... IP
Connecting to www.mywebsite.com|IP|:80... connected.
HTTP request sent, awaiting response... 401 Authorization Required
Reusing existing connection to www.mywebsite.com:80.
HTTP request sent, awaiting response... 403 Forbidden
2012-05-02 10:14:01 ERROR 403: Forbidden.
I have also tried doing:
wget admin:pass@http://www.mywebsite.com/test.php
but with similar errors. How can I solve this? Thank you in advance for your help.
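Note that in the log the 401 challenge is answered (the connection is reused) and the server then replies 403, so the credentials are reaching it; the 403 comes from something else. One common culprit is the server blocking wget's default User-Agent, which you can rule out with a sketch like this (the UA string is just an example):
wget --user=admin --password='pass' --user-agent='Mozilla/5.0' http://www.mywebsite.com/test.php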

You are making a small mistake:
keep the http:// before the URL.
You have
admin:pass@http://www.mywebsite.com/test.php
Change it to
http://admin:pass@www.mywebsite.com/test.php
Hope that works.
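Putting that together as a full command, with the URL quoted so the shell doesn't interpret any special characters in the password (credentials are the ones from the question):
wget 'http://admin:pass@www.mywebsite.com/test.php'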

wget --user admin --password pass http://www.mywebsite.com/test.php
This opens a site protected by an .htaccess password every minute:
*/1 * * * * wget -O /dev/null --user admin --password pass "http://www.mywebsite.com/test.php" > /dev/null 2>&1
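If wget isn't available, an equivalent crontab entry using curl should also work (a sketch: -s silences progress output, -u passes the Basic credentials, -o discards the body):
*/1 * * * * curl -s -o /dev/null -u admin:pass "http://www.mywebsite.com/test.php"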

Add an auth parameter to the URL. This works for me when calling the URL directly:
http://yoururl.ext?auth=id:psw
I don't know how secure it is, though...

Related

Wget error: HTTP request sent, awaiting response... 401 Unauthorized Authorization failed

So I have to download a whole website as a worst-case backup in case our network collapses. But with wget I only get Error 401 Unauthorized. I suspect Kerberos.
I've tried curl; at first it only printed the index source, then I added --output and it downloaded the index page. But only the index, because everything past it is password protected.
wget --header="Authorization: Basic XXXXXXXXXXXXXXXXXXX" --recursive --wait=5 --level=2 --execute="robots = off" --page-requisites --html-extension --convert-links --restrict-file-names=windows --domains my.domain --no-parent https://webpage.internal
and curl
curl --anyauth --negotiate -u admin https://webpage.internal --output index.html
Is there any way to use curl for the whole website, or is there a simple fix for my wget command?
Thanks.
Okay, solved it myself. Just needed to change:
--header="Authorization: Basic XXXXXXXXXXXXXXXXXXX"
to
--header="Authorization: OAuth XXXXXXXXXXXXXXXXXXX"
and it started to clone.
Edit: did not solve.
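If the site really is behind Kerberos/SPNEGO, wget can't help directly, since it has no Negotiate support, but curl can authenticate with an existing ticket. A sketch, assuming curl was built with GSS-API support, you can get a ticket, and the server hands back a session cookie after the handshake (the realm is hypothetical):
kinit user@MY.DOMAIN
curl --negotiate -u : -c cookies.txt -o index.html https://webpage.internal/
Because curl's -c writes the Netscape cookie format that wget reads, the saved session cookie can then drive the recursive mirror:
wget --load-cookies cookies.txt --recursive --level=2 --page-requisites --convert-links --no-parent https://webpage.internal/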

How to download Bing Static Map in Linux

I'm trying to download a static map using the Bing Maps API. It works when I load the URL from Chrome, but when I try curl or wget from Linux, I get an Auth Failed error.
The URLs are identical, but for some reason Bing is blocking calls from Linux?
Here's the commands I tried:
wget -O map.png http://dev.virtualearth.net/REST/V1/Imagery/Map/Road/...
curl -O map.png http://dev.virtualearth.net/REST/V1/Imagery/Map/Road/...
Error:
Resolving dev.virtualearth.net (dev.virtualearth.net)... 131.253.14.8
Connecting to dev.virtualearth.net (dev.virtualearth.net)|131.253.14.8|:80... connected.
HTTP request sent, awaiting response... 401 Unauthorized
Username/Password Authentication Failed.
--2016-10-24 15:42:30-- http://dev.virtualearth.net/REST/V1/Imagery/Map/Road/.../12?mapSize=340,500
Reusing existing connection to dev.virtualearth.net:80.
HTTP request sent, awaiting response... 401 Unauthorized
Username/Password Authentication Failed.
I'm not sure if it has anything to do with Key Type, I've tried several from Public Website to Dev/Test but still didn't work.
The URL needs to be wrapped in quotes, because the & symbol in the query string would otherwise be interpreted by the shell:
wget 'http://dev.virtualearth.net/REST/V1/Imagery/Map/Road/...'
Examples
Via wget:
wget -O map.jpg 'http://dev.virtualearth.net/REST/V1/Imagery/Map/Road/Bellevue%20Washington?mapLayer=TrafficFlow&key=<key>'
Via curl:
curl -o map.jpg 'http://dev.virtualearth.net/REST/V1/Imagery/Map/Road/Bellevue%20Washington?mapLayer=TrafficFlow&key=<key>'
Verified under Ubuntu 16.04.
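For the same reason, escaping each & with a backslash also works, though quoting the whole URL is easier (YOUR_KEY is a placeholder):
wget -O map.jpg http://dev.virtualearth.net/REST/V1/Imagery/Map/Road/Bellevue%20Washington?mapLayer=TrafficFlow\&key=YOUR_KEY
Without either, the shell treats & as its background operator and cuts the URL off right after mapLayer=TrafficFlow.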

How to use wget on a page with authentication

I've been searching the internet about wget, and found many posts on how to use wget to log into a site that has a login page.
The site uses https, and the form fields the login page looks for are "userid" and "password". I've verified this by checking the Network tool in Chrome when you hit F12.
I've been using the following posts as guidelines:
http://www.unix.com/shell-programming-and-scripting/131020-using-wget-curl-http-post-authentication.html
And
wget with authentication
What I've tried:
testlab:/lua_curl_tests# wget --save-cookies cookies.txt --post-data 'userid=myid&password=123123' https://10.123.11.22/cgi-bin/acd/myapp/controller/method1
wget: unrecognized option `--save-cookies'
BusyBox v1.21.1 (2013-07-05 16:54:31 UTC) multi-call binary.
And also
testlab/lua_curl_tests# wget
http://userid=myid:123123@10.123.11.22/cgi-bin/acd/myapp/controller/method1
Connecting to 10.123.11.22 (10.123.11.22:80) wget: server returned
error: HTTP/1.1 403 Forbidden
Can you tell me what I'm doing wrong? Ultimately, what I'd like to do is log in, post data, then grab the resulting page.
I'm also currently looking at curl, to see if I really should be doing this with curl (lua-curl) instead.
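The BusyBox wget applet simply doesn't include the cookie options. If full curl is available on the box, a two-step sketch might work (it assumes the form really posts userid/password to that URL; -k skips certificate checks for the bare-IP https host, so drop it if the certificate is valid):
curl -k -c cookies.txt -d 'userid=myid&password=123123' https://10.123.11.22/cgi-bin/acd/myapp/controller/method1
curl -k -b cookies.txt https://10.123.11.22/cgi-bin/acd/myapp/controller/method1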

Jenkins remote api error 404

I am trying to update a remote Jenkins config.xml file, but I am having an issue. I am sure the username and password are correct, and the path of the file is also correct, because the same thing works on another server with a newer Jenkins version (1.514). I am having the issue with version 1.501. I have even disabled cross-site request forgery (CSRF) protection. Any idea how to make it work, or is there a workaround?
wget --auth-no-challenge --http-user=spatel --http-password=secret --post-file=config.xml --no-check-certificate http://jenkin.example.com/jenkin/jobs/Sched_M_Builds2Test/config.xml
--2013-05-23 15:54:22-- http://jenkin.example.com/jenkin/jobs/Sched_M_Builds2Test/config.xml
Resolving hudson.outcome.com... 10.101.100.60
Connecting to hudson.outcome.com|10.101.100.60|:80... connected.
HTTP request sent, awaiting response... 404 Not Found
2013-05-23 15:54:22 ERROR 404: Not Found.
You are probably talking about software called Jenkins - http://jenkins-ci.org/ - but your link points at jenkin.
404 means not found, so please make sure that you have the right URL.
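One thing worth checking: Jenkins serves job configs at /job/<name>/config.xml (singular job), while the URL in the question uses /jobs/. A sketch with that path, keeping everything else from the original command:
wget --auth-no-challenge --http-user=spatel --http-password=secret --post-file=config.xml --no-check-certificate http://jenkin.example.com/jenkin/job/Sched_M_Builds2Test/config.xml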

wget and htaccess: username only

I googled how to download images from the terminal in Ubuntu with wget. I found what I needed, but the server, protected with .htaccess, has no password. With
wget admin@http://server.com/filename.jpg
it returns: No route to host. When I set a password and type
wget admin:password@http://server.com/filename.jpg
everything's fine. However, I am not allowed to use a password on the server. How can I fix this?
Much easier:
wget --user="username" --password="password" http://xxxx.yy/filename.xxx
Try
wget --user=admin http://server.com/filename.jpg
instead.
Alternatively,
wget http://admin:@server.com/filename.jpg
may work instead.
Your syntax is actually wrong; it's:
wget http://user:pass@host/file
Your username was outside the URL, so it was being treated as a hostname.
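A related gotcha when embedding credentials in the URL: if the username or password itself contains @ or :, percent-encode it. For example, a password of p@ss (purely illustrative) would be written:
wget 'http://admin:p%40ss@server.com/filename.jpg'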
