Python Requests Authentication - python-3.x

I'm trying to interact with an API via Python 3's Requests module, but the authentication is not working. I am able to use the exact same URL with the same username/password and access the API via my Firefox browser.
There are a few complications with the setup, so let me explain.
The device I'm trying to talk to uses HTTPS, but I don't have access to the certs, so I'm choosing not to verify. I'm able to interact via Requests with older models that only use HTTP, no problem.
The device is on a local network, but my company's network utilizes proxies that interfere with communication to devices on a local network, so I turn these off.
Here's an example of my code:
import requests

session = requests.Session()
session.auth = ('username', 'password')
session.trust_env = False  # remove proxies (see 2 above)
response = session.get(
    url='https://mylocalurl/api/',
    params={'apikey': 'myapikey',
            'req': 'generalInfo'},
    verify=False  # don't verify SSL cert (see 1 above)
)
My first thought was that I had to specify the encoding of the credentials, since I'm able to use them in my browser. I've tried auth = (b'username', b'password') with no luck. I looked at response.content and found b'<?xml version="1.0" encoding="iso-8859-1"?>\n' in the first line, so I tried specifying that with auth = ('username'.encode('iso-8859-1'), 'password'.encode('iso-8859-1')), but still no luck.
The questions I have are:
The API's server is LightTPD; is there something particular about authenticating with this server via Python that I'm missing?
Do I have to verify my SSL cert to authenticate? That wouldn't make sense, but I'm asking anyway.
Is there any more information I could glean from the 401 response that could help me determine how to send the credentials in a way the server will accept?
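(One concrete thing worth checking on that 401: the WWW-Authenticate response header tells you which scheme the server expects. Lighttpd can be configured for either basic or digest auth, and requests' tuple-style auth only sends basic. A small sketch; the helper name pick_auth is mine, not part of requests:)

```python
from requests.auth import HTTPBasicAuth, HTTPDigestAuth

def pick_auth(www_authenticate, username, password):
    """Choose a requests auth handler from a 401's WWW-Authenticate header."""
    if www_authenticate and www_authenticate.lower().startswith('digest'):
        return HTTPDigestAuth(username, password)
    return HTTPBasicAuth(username, password)

# e.g. after a failed request:
# session.auth = pick_auth(response.headers.get('WWW-Authenticate'),
#                          'username', 'password')
```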

Related

When accessing a Rest API via requests.get it times out. Same URL gets the response in browser

I am trying to access a REST API that supports GET, POST, PUT, MOVE functions. When I try the URL in the browser to get the list of files, I get a response from the REST API. However, when I try to call the same URL via requests.get(url) I get:
(Caused by ConnectTimeoutError(<urllib3.connection.VerifiedHTTPSConnection object at 0x000002D8CCFBC6A0>, 'Connection to timed out. (connect timeout=None)'))
What am I missing? Does it make any difference if I pass a JWT token within the URL?
URL in the browser returns a list in JSON.
import requests
url = "https://www.restapi.com/content?JWT-token"
x = requests.get(url)
print(x.status_code)
I am not an expert in Python, but it looks like you forgot the quotes for the link; it should look like this:
url = 'https://wwww.restapi.com/content?JWT-token'
EDIT: Oh, I see, in your URL you typed wwww instead of www. Is this intended?
When you run the URL in the browser, it automatically finds the corresponding proxy for the URL. I had to download the proxy file from the browser's settings, find the proxy mapping for that URL, and add the proxy parameters to the request.
There can be multiple reasons for this.
If you are in a corporate setup that uses a proxy server for outbound connections, then you need to provide the proxy details. The internet browser will already have the proxy configured.
import os
# Provide proxy details (note the '@' between the credentials and the host)
os.environ["http_proxy"] = "http://user:password@proxy-server-ip:port"
os.environ["https_proxy"] = "http://user:password@proxy-server-ip:port"
import requests
url = "https://gorest.co.in/public/v2/users"
x = requests.get(url)
print(x.status_code)
print(x.content)
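As a side note, requests also accepts a proxies mapping directly on each call, which avoids mutating the process environment. A sketch with placeholder proxy details (the helper name build_proxies is mine):

```python
def build_proxies(user, password, host, port):
    """Build a requests-style proxies mapping for both schemes.
    Note the '@' separating the credentials from the host."""
    proxy_url = f"http://{user}:{password}@{host}:{port}"
    return {"http": proxy_url, "https": proxy_url}

# usage (hypothetical proxy details):
# requests.get(url, proxies=build_proxies("user", "password", "10.0.0.1", 8080))
```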

Apple's MusicKit JS library example runs fine when rendered by Node.js, fails with Django

For three hours, I've been scratching my head over this.
Apple's MusicKit uses JWT tokens to authenticate.
When I npm start this Node.js example project, I can generate the JWT token, and then authorize Apple Music and collect the user token in response. This works when executing from localhost:8080. Here is a successful pop-up window.
When I launch my Django server locally which also generates valid JWT tokens, running the same exact HTML code with a fresh JWT token, I receive this from Apple: "Problem Connecting: There may be a network issue."
The only error on Apple.com's authorization page is this: vendor-2c0b12a9762d9af673e21ccd8faf615e.js:2325 Error while processing route: woa Failed to construct 'URL': Invalid URL TypeError: Failed to construct 'URL': Invalid URL
I have confirmed that both applications are generating valid JWT tokens. I can use the tokens generated in my Django application with Postman directly with Apple API, as well as my token from the Node.js app.
I have tried:
1. Using the JWT token from Django in the Node.js app -- works
2. Using the JWT token from the Node.js app in Django -- still fails
3. Allowing all hosts to Django
4. Allowing all CORS traffic to Django
5. Hosting the page on a domain with a valid HTTPS cert vs. locally
6. Different browser / computer
7. Accessing the authorization URL directly from Node.js in a new tab -- FAILS
8. Accessing the authorization URL directly from Django in a new tab -- FAILS
9. Breaking the JWT token and using that with Django -- FAILS and does not even open the authorization window (the console says invalid token, which further proves the token I use with Django is valid)
From steps 7 and 8, it would appear that something is failing in the referral process between when the JS script runs and when the pop-up window for authorization appears, since the token is clearly valid.
So what could Django (dev server) be doing that's disrupting this authorization flow? Or am I missing something else?
Okay, so I was right: it had to do with what the Node.js server exposed to Apple.com when the request was made, versus what Django did.
It boiled down to the 'Referrer-Policy' header of the Django server.
By default it is "same site" on Django, which means the referrer information is not sent to Apple.com, so Apple cannot verify the source of the request.
I added the following header to my Django view render:
# Create the response
response = render(request, 'apple.html', {'apple_developer_token': apple.get_token()})
# Attach the header
response['Referrer-Policy'] = 'origin-when-cross-origin'
# Return the response
return response
It then worked.
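If you are on Django 3.0 or later, the same header can also be applied site-wide through SecurityMiddleware instead of per view; a minimal settings sketch:

```python
# settings.py -- SecurityMiddleware emits this Referrer-Policy header
# on every response (setting available since Django 3.0)
SECURE_REFERRER_POLICY = 'origin-when-cross-origin'
```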

Cookies not sent on redirect

I am building a react web application with a separate back-end express api that manages all the calls, including passporting and setting cookies. Let's call the back-end service 'api.com' and the front-end service 'react.com'. I'm using passporting with an existing provider (spotify) and after the authorization succeeds, a cookie is set on api.com. The idea is that the user interacts with react.com and requests are made to api.com via a proxy.
If I'm just testing in my browser and I make a call to api.com/resource, the cookie is automatically set. I know this because I've added a bit of logging and also because the requests that require authorization are succeeding via the cookie.
However, when I make calls to api.com from react.com via the proxy, the cookie is not set. Is this expected behavior when proxying? It seems odd that the cookie is set when I call api.com directly, but it is not set when it is redirected. Is there a way around this? My thought would be to communicate the cookie from api.com to react.com, save it there, and send it on all subsequent requests, but that seems overkill. I'm also wondering if maybe I should be setting the cookie on react.com instead of api.com.
I've tried in both Firefox and Chrome, and if it makes a difference, I'm using axios for the requests on react.com.
const request = axios({
    method: 'get',
    url: '/api/resource'
});
This gets proxied as follows (still on react.com), using express-http-proxy:
app.use('/', proxy('api.com', {
    filter: (req) => {
        return (req.path.indexOf('/api') === 0);
    }
}));
But once this hits api.com, any authentication fails, because the cookie is not present.
Any help is appreciated
As far as I understand your question, I think you're not considering that cookies are set per host name.
So in the first case the hostname is the same and it's okay, but in the second case the browser's cookies are not set for react.com.
So setting the cookie on react.com should work.
I would have asked for a clarification using a comment but I don't have enough reputation for that yet.

Twilio is not sending creds in headers when specifying username:password format in the URL

I'm currently developing my app and I'm at the stage where I can start testing messages from Twilio. I configured my server on digital ocean with a public facing IP address and my Nodejs app is listening to calls from Twilio.
I also configured my phone number's message "request url" to "http://username:password@198.xxx.xxx.xxx/messages" with "HTTP POST".
When I debug the headers, I don't see the "authorization" headers. Am I missing something here?
Any help is much appreciated!
Below is the code.
var headerValues = bag.req.headers.authorization.split(' ');
console.log(bag.req.headers);
var scheme = headerValues[0];
if (scheme === 'Basic') {
    var credentials = headerValues[1];
    var decoded = Buffer.from(credentials, 'base64').toString().split(':');
    bag.req.creds = {
        userName: decoded[0],
        password: decoded[1],
        authType: 'basic'
    };
}
I use the same setup as you do in several call centers I have built.
If you are using a proxy setup that requires username:password@ before the IP address, then your issue is likely with that proxy, provided you can access the code by going directly to the actual server IP address, as I note below. However, you did not mention using a proxy, just a Digital Ocean droplet, so I am responding assuming you do not have a proxy setup.
So if you do have a proxy setup make sure you can access the IP address of the server directly first.
Also, if those are just extra variables you need to pass over, you may be better off appending them after the IP address,
for instance xxx.xxx.xxx.xxx/username/password
Then get them with req.params,
for instance (and yes, this will work with POST data, since it's merely part of the URL and not an actual GET query):
router.post('/sms/:username/:password', function(req, res, next){
    var username = req.params.username;
});
First, you would not want to direct your request URL at "http://username:password@198.xxx.xxx.xxx/messages" with "HTTP POST".
If you do not have a domain directed at your IP address yet, you want your request URL to be
https://198.xxx.xxx.xxx/inbound/sms
{Replacing /inbound/sms with whatever route you are using}
Then at the top of your route file (I am using Express, so my setup may look different from yours) I have the node.js twilio library:
var twilio = require('twilio')
  , capability = new twilio.Capability(sid, auth)
  , client = require('twilio')(sid, auth);
Then here is an example of my /sms route
router.post('/sms', function(req, res, next){
    var sid = req.body.SmsSid;
    var from = req.body.From;
    var to = req.body.To;
    var date = Date();
    var body = req.body.Body;
    if (req.body.NumMedia > 0) {
        // code to handle MMS
    }
    // code to handle SMS data
    res.send("Completed");
});
I ran into this this week and discovered that behavior surrounding Basic Auth in the URL is very cloudy. For one thing, it appears to be deprecated from the URI spec as it pertains to HTTP:
...
3.2.1. User Information
...
Use of the format "user:password" in the userinfo field is deprecated.
...
7.5. Sensitive Information
URI producers should not provide a URI that contains a username or password that is intended to be secret. URIs are frequently displayed by browsers, stored in clear text bookmarks, and logged by user agent history and intermediary applications (proxies). A password appearing within the userinfo component is deprecated and should be considered an error (or simply ignored) except in those rare cases where the 'password' parameter is intended to be public.
...
Because of this, both Firefox and Chrome appear to just strip it out and ignore it. Curl, however, seems to convert it to a valid Authorization header.
Whatever the case, I believe this functionality is actually the responsibility of the HTTP user agent, and it appears that Twilio's user agent is not doing its job. Thus, there is no way to make basic auth work.
However, it appears Twilio's preferred method of auth is to simply sign the request using your account's secret auth key, which you can then verify when handling the request. See here.
On researching the raw NodeJS Request and IncomingMessage classes, there appears to be no way to get at the full, raw URL to compensate for Twilio's non-conformity. I believe this is because the actual data of an HTTP request doesn't contain the full URL.
My understanding is that it's actually the HTTP user agent that's responsible for extracting and formatting the auth info from the URL. That is, a conformant HTTP user agent should parse the URL itself, using the hostname and port portion to find the right door on the right machine, the protocol portion to establish the connection with the listener, the verb combined with the URL's path portion to indicate what functionality to activate, and presumably it is then responsible for converting the auth section of the URL to an official HTTP Authorization header.
Absent that work by the user agent, there is no way to get the auth data into your system.
(This is my current understanding, although it may not be totally accurate. Others, feel free to comment or correct.)

Cross domain Sails.js + Socket.io authorization

I'm working in an application which delivers push content to a group of web applications hosted in different domains. I'm using Sails.js and Socket.io, and structured it like this:
The client script, running on each web application's client's browser, is something like:
socket.on('customEvent', function(message){
    //do something with message on event trigger
});
And then, in the server, the event 'customEvent' is emitted when needed, and it works (e.g., on the onConnect event: sails.io.emit('customEvent', {message ...})).
But I'm facing a problem when it comes to handle authorization. The first approach I've tried is a cookie-based auth, as explained here (by changing the api/config/sockets.js function authorizeAttemptedSocketConnection), but it isn't a proper solution for production and it isn't supported in browsers with a more restrictive cookie policy (due to their default prohibition to third-party cookies).
My question is: how to implement a proper cross-browser and cross-domain authorization mechanism using sails.js, that can be supported in socket.io's authorization process?
======
More details:
I also tried adding a login with a well-known oauth provider (e.g. facebook), using this example as a base. I've got the Passport session, but I'm still unable to authenticate from the client script (it only works in pages hosted by my node app directly).
A JSONP request to obtain the session might be a solution, but it didn't work well in Safari. Plus I prefer the user to be authenticated in one of the web apps, rather than in my node application directly.
I believe your routes must handle CORS mate. Example:
'/auth/logout': {
controller: 'AuthController',
action: 'logout',
cors: '*'
},
Of course you can specify the list of IPs you are accepting (and then replace the '*').
Worth mentioning that you must specify where socket.io has to connect (front-end JS):
socket = io.connect(API.url);
For any common HTTP GET/PUT/POST/DELETE, please ensure that your AJAX requests are sent with credentials (cookies). For example, with Angular:
$httpProvider.defaults.withCredentials = true
Let me know how it goes.