Sounds simple enough:
import string
import random
import Cookie

def create_cookie():
    bag = string.ascii_uppercase + string.ascii_lowercase + string.digits
    cookie = Cookie.SimpleCookie()
    cookie['sessionid'] = ''.join(random.sample(bag, 24))
    cookie['sessionid']['expires'] = 600
    return 'Set-Cookie: ', cookie.output().replace('Set-Cookie: ', '', 1)
cookie.output() gives:
Set-Cookie: sessionid=YmsrvCMFapXk6wAt4EVKz2uU; expires=Sun, 14-Aug-2011 21:48:19 GMT
headers.append(('Content-type', 'text/html'))
headers.append(('Content-Length', str(output_len)))
headers.append(create_cookie())
This is my response:
('200 OK', [('Content-type', 'text/html'), ('Content-Length', '1204'), ('Set-Cookie', 'sessionid=YmsrvCMFapXk6wAt4EVKz2uU; expires=Sun, 14-Aug-2011 21:48:19 GMT')], 'html stuff')
This is what I get from environ:
HTTP_COOKIE: sessionid=YmsrvCMFapXk6wAt4EVKz2uU
And when I click another link on my page, there is no more HTTP_COOKIE.
Using the Chrome dev console I can see the request cookie, and the page header contains:
Cookie:: sessionid=YmsrvCMFapXk6wAt4EVKz2uU
Now, this bothers me a bit. First of all, why does it have a double colon? I tried using 'Set-Cookie' instead of 'Set-Cookie: ' in the create_cookie function. Doing that, I didn't get any HTTP_COOKIE at all from environ.
So after lots of searching on the web, with everyone just talking about middleware (please don't suggest I use one; I'm doing this to learn WSGI), I've come up empty.
Invisible default behavior ftw...
After some intensive debugging I noticed that the following request didn't include HTTP_COOKIE, which made it conclusively a problem on the browser's side: the cookie was there in the browser, it just wasn't being sent.
Some digging around revealed that the default path and domain behavior was spoiling my efforts: the difference between /action/login (where the cookie was set) and /display/data (where the cookie wasn't sent) was fixed by setting the path, in this case to '/'.
"yay"
You could try:
return [tuple(line.split(': ', 1)) for line in cookie.output().split('\r\n')]
This also works when the cookie contains multiple entries.
Of course, you need to use extend instead of append:
headers.extend(create_cookie())
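Putting the two answers together, a hedged sketch of the whole revised function (same imports as in the question; the path line comes from the accepted answer):

def create_cookie():
    bag = string.ascii_uppercase + string.ascii_lowercase + string.digits
    cookie = Cookie.SimpleCookie()
    cookie['sessionid'] = ''.join(random.sample(bag, 24))
    cookie['sessionid']['expires'] = 600
    cookie['sessionid']['path'] = '/'
    # one ('Set-Cookie', value) tuple per cookie entry
    return [tuple(line.split(': ', 1)) for line in cookie.output().split('\r\n')]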
I'm trying to use Python's requests library to log in to a website. It's pretty simple code, and you can really get the gist of requests just by going to its website. I, however, want to check whether I'm successfully logged in via the URL. The problem I've encountered is that when I initiate the POST request and assign it (the variable p) a URL, I'm still shown the same URL when I print(p.url), whether the HTML has changed or not. Is there any way for me to refresh the browser, or update the URL to whatever it's currently set to?
(I can add a line for checking the URL against itself later, but for now I just want to get the correct URL.)
#!/usr/bin/env python3
import requests

payload = {'login': 'USERNAME',
           'password': 'PASSWORD'}

with requests.Session() as s:
    p = s.post('WEBSITE', data=payload)
    # print(p.text)
    print(p.url)
The usage of python-requests may not be as complex as you think. It will automatically handle the redirects of your session.post() (or session.get()).
Here, the session.post() method returns a Response object:
r = s.post('website', data=payload)
which means r.url is the current URL you are looking for.
If you still want to refresh the current page, just use:
s.get(r.url)
To verify whether you have logged in successfully, one solution is to first do the login in your browser.
Based on the title or content of the webpage that is returned (i.e., using the content in r.text), you can judge whether you have made it.
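As a concrete sketch of that check (the login URL and the 'Logout' marker below are made-up placeholders, not the real site):

import requests

payload = {'login': 'USERNAME', 'password': 'PASSWORD'}

with requests.Session() as s:
    r = s.post('https://example.com/login', data=payload)
    # r.url reflects the final URL after any redirects from the POST
    if r.url != 'https://example.com/login' and 'Logout' in r.text:
        print('logged in; landed on', r.url)
    else:
        print('login appears to have failed')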
BTW, python-requests is a great library, enjoy it.
I was scratching my head the whole day yesterday over this and, to my surprise, can't seem to find an easy way to check it.
I am using Python's requests library to pass my proxies, like so:
import requests
from requests.adapters import HTTPAdapter

def make_request(url):
    with requests.Session() as s:
        s.mount("http://", HTTPAdapter(max_retries=3))
        s.mount("https://", HTTPAdapter(max_retries=3))
        page = None
        # d is a collections.deque of proxy dicts defined elsewhere; rotating it
        # selects the next proxy every time make_request is called
        d.rotate(-1)
        s.proxies = d[0]
        page = s.get(url, timeout=3)
        print('proxy used: ' + str(d[0]))
        return page.content
The problem is, I can't seem to make the request fail when the proxy is not expected to work. It seems there is always a fallback to my own internet IP if the proxy is not working.
For example: I tried passing a random proxy IP like 101.101.101.101:8800, or removing the IP authentication that is needed on my proxies, and the request still went through, even though it shouldn't.
I thought adding the timeout parameter when passing the request would do the trick, but obviously it didn't.
So:
Why does this happen?
How can I check from which IP a request is being made?
From what I have seen so far, you should use the form
r = s.get(url, proxies=d)
This should use the proxies in the dict d to make the connection.
This form allowed me to check the status code with both working and non-working proxies:
print(r.status_code)
I will update once I find out whether it just cycles through the proxies in the dict until it finds a working one, or whether one can actually select which one is used.
[UPDATE]
I tried to work around the dict in proxies so that I could use a different proxy if I wanted to. However, proxies must be a dict to work. So I used a dict in the form of:
d = {"https": 'https://' + str(proxy_ips[n].strip('\n'))}
This seems to work and allows me to use the IP I want, although it seems quite dull. I hope someone might come along and help!
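For rotating through several proxies in that shape, a minimal sketch (the proxy addresses are placeholders; requests.get here uses the per-request proxies form from above):

import itertools
import requests

proxy_ips = ['101.101.101.101:8800', '102.102.102.102:8800']  # placeholders
proxy_cycle = itertools.cycle(proxy_ips)

def make_request(url):
    # build the proxies dict for the next proxy in the cycle
    d = {'https': 'https://' + next(proxy_cycle)}
    return requests.get(url, proxies=d, timeout=3)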
The proxies used can be seen through:
requests.utils.getproxies()
or
requests.utils.get_environ_proxies(url)
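And to answer the second question directly, a sketch that checks which IP a request actually leaves from by hitting an IP-echo service (httpbin.org is used purely as an example endpoint; the proxy address is a placeholder):

import requests

proxies = {'https': 'https://101.101.101.101:8800'}  # placeholder proxy

# httpbin echoes back the origin IP of the request; with a working proxy this
# should print the proxy's IP, and with a dead one it should raise an error
# rather than silently fall back to your own IP
r = requests.get('https://httpbin.org/ip', proxies=proxies, timeout=3)
print(r.json()['origin'])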
I hope that helps; obviously quite an old question, but still!
I was wondering if there is any way to re-order the HTTP headers being sent by our browser before they go out to the web server?
Since the order of the headers leaves some kind of "fingerprint" (see this post and this post), I was thinking about using mitmproxy (with inline scripting, I guess) to modify the headers on the fly. Is this possible?
How would one achieve that?
Note: I'm looking for a method that can be scripted, not a method using a graphical tool like Burp Suite (although Burp is known to be able to re-order headers).
I'm open to suggestions. Perhaps NGINX might come to the rescue as well?
EDIT: I should be more specific, by giving an example...
Let's say I'm using Firefox. With the use of a funky add-on, I'm spoofing my user-agent to "look" like a Chrome browser. But then if I test my browser with ip-check.info, the "signature" of my browser remains that of Firefox, even though my spoofed user-agent says "Chrome".
So the solution, in this specific case, should be to re-order the HTTP headers in the same manner as Chrome does.
How can this be done?
For the record, the order of the HTTP headers should not matter at all according to RFC 7230. But now that you have asked... this can be done in mitmproxy as follows:
import random

def request(context, flow):
    # flow.request.headers.fields is a tuple of (name, value) header tuples.
    h = list(flow.request.headers.fields)
    random.shuffle(h)
    flow.request.headers.fields = tuple(h)
See the mitmproxy documentation on netlib.http.Headers for more details.
There are tons of ways to reorder them as you wish:
def reorder(headers, header_order=["Host", "User-Agent", "Accept"]):
    fields = []
    for name in header_order:  # add existing headers in the specified order
        if name in headers:
            # get_all() returns the values only; rebuild (name, value) field tuples
            fields.extend((name, value) for value in headers.get_all(name))
            del headers[name]
    fields.extend(headers.fields)  # all other headers keep their relative order
    return fields

request.headers.fields = tuple(reorder(request.headers))
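And a short sketch of how that helper might be wired into the same inline-script request hook as the shuffle example above (same assumptions about the mitmproxy API):

def request(context, flow):
    # impose a fixed leading order; all remaining headers keep their relative order
    flow.request.headers.fields = tuple(
        reorder(flow.request.headers,
                header_order=["Host", "Connection", "User-Agent", "Accept"]))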
I am currently trying to convert a smallish app from nodejs to golang (hence the two tags) but I'm running into a bit of trouble in doing so.
Essentially it is a very simple HTTP POST login which I can't seem to realise. The background is that my university provides a calendar export function, and I would like to provide this calendar as a feed that could be added to Google Calendar.
Now the thing is that I have the whole thing working in Node, but I would really like to be able to realise it in Go as well.
The important bit of Node code would be:
var query = url.parse(req.url, true).query;

var data = {
    u: query.user,     // Username
    p: query.password, // Password
};

needle.post(LOGIN_URL, data, {}, function (error, response) {
    // extract cookies etc.
});
which is working like a charm. But if I try to do the same in Go:
import "github.com/parnurzeal/gorequest"
//...
resp, body, err := gorequest.New().Post(LOGIN_URL).Send("u=user&p=pass").End()
//extract cookies etc.
I end up with an invalid (timed-out) session. I already tried using just net/http in Go, which didn't seem to change anything.
The result of the POST request is a 302 redirect to an overview page (by the way, it is ASP based). Could it be that this is what's causing the problem, since gorequest then fetches that overview page without the cookies returned in resp, effectively creating a new session that isn't authorized? Or am I overlooking something terribly simple?
So it seems that I found the answer myself by following your advice and using "net/http" and digging a little deeper into what the http.Client actually does. To anyone who might encounter similar problems, here is my solution:
http.Client automatically follows redirects if it receives a 30x response from the server (see the documentation). Although one can override the redirect policy, I was unable to prevent redirection entirely.
Additionally, it seems as if the client has a bug (that is what I would call it, at least), as it drops all headers upon redirect (see the issue, or the place in the source where the new headers are created).
While searching around in net/http I found http.DefaultTransport, which is used by http.Client and does not redirect. It is somewhat lower-level and exactly what I was after. The following piece of code demonstrates how I replaced the gorequest line from above:
import (
    "bytes"
    "net/http"
    "net/url"
)

// ...
data := url.Values{"u": {USER}, "p": {PASS}}
req, err := http.NewRequest("POST", LOGIN_URL, bytes.NewBufferString(data.Encode()))
// I needed quite some time to figure out that I had to set the content type accordingly
req.Header.Add("Content-Type", "application/x-www-form-urlencoded")
// ...
resp, err := http.DefaultTransport.RoundTrip(req)
// ...
// resp.Header["Set-Cookie"] now contains the login/session cookies
Although I need to extract the cookies myself and set a few header values, the solution works perfectly and I am quite happy with it. If anybody has improvements to my solution or any other advice, I would be glad to hear it. And thanks to JimB and Volker :).
I am using Facebook OWIN authentication, more or less following the Microsoft sample. The first time the user logs in, everything is OK. But if they sign out and try again, it seems like the previous .AspNet.Correlation.Facebook cookie is not removed but is set to an empty string. So my next call to api/getexternallogin looks like this in Fiddler:
This is when we are generating a correlationId, and having multiple cookies at this point is not a showstopper. In the response, I set it to the new correlationId:
Later, when Facebook calls back to "/signin-facebook", we try to validate the correlationId in the ValidateCorrelationId method. The request looks like this:
So the new correlationId has been set, but the extra cookie with no value means that when I read Request.Cookies[".AspNet.Correlation.Facebook"], it returns an empty string.
I have checked the code, and it seems like the only methods modifying this cookie are GenerateCorrelationId and ValidateCorrelationId. The implementation of these methods can be found here:
http://katanaproject.codeplex.com/SourceControl/latest#src/Microsoft.Owin.Security/Infrastructure/AuthenticationHandler.cs
Curiously enough, my browser does not seem to see the issue:
Any ideas will be much appreciated.
OK, this caused me a fair bit of frustration, but when Response.Cookies.Delete(".AspNet.Correlation.Facebook") is called in the ValidateCorrelationId method, it sends the following in the response:
So the value of "expires" has been split on its comma and treated as two separate "Set-Cookie"s. Hence the cookie is not expired; its value is just set to an empty string. It seems like the comma after "Thu" is causing it.
The fix I came up with was to comment out Response.Cookies.Delete(".AspNet.Correlation.Facebook") and do the following instead:
Response.Headers.Add("Set-Cookie", new[] { CorrelationKey + "=; path=/; expires=Fri 02-Jan-1970 00:00:00 GMT" });
No commas there and it is working now.
This does seem like a genuine bug in OWIN.