I'm working on an application that delivers push content to a group of web applications hosted on different domains. I'm using Sails.js and Socket.io, and I structured it like this:
The client script, running in the browser of each web application's users, is something like:
socket.on('customEvent', function(message){
  // do something with message on event trigger
});
Then, on the server, the 'customEvent' event is emitted when needed, and it works (e.g. on the onConnect event: sails.io.emit('customEvent', {message: ...})).
But I'm facing a problem when it comes to handling authorization. The first approach I tried was cookie-based auth, as explained here (by changing the authorizeAttemptedSocketConnection function in api/config/sockets.js), but it isn't a proper solution for production and it isn't supported in browsers with a more restrictive cookie policy (because they block third-party cookies by default).
My question is: how can I implement a proper cross-browser, cross-domain authorization mechanism with Sails.js that can be plugged into Socket.io's authorization process?
======
More details:
I also tried adding a login with a well-known OAuth provider (e.g. Facebook), using this example as a base. I got the Passport session, but I'm still unable to authenticate from the client script (it only works on pages hosted directly by my Node app).
A JSONP request to obtain the session might be a solution, but it didn't work well in Safari. Besides, I'd prefer the user to authenticate in one of the web apps rather than in my Node application directly.
I believe your routes must handle CORS, mate. Example:
'/auth/logout': {
  controller: 'AuthController',
  action: 'logout',
  cors: '*'
},
Of course, you can specify the list of origins you are accepting (and then replace the '*').
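For reference, here is a minimal sketch of what a global CORS configuration might look like in config/cors.js. This is an assumption based on a Sails 0.10-era setup (option names differ between Sails versions), and the origins are placeholders; the important part is that credentials are enabled, otherwise the session cookie won't be sent cross-origin:

// config/cors.js — illustrative sketch only; adjust option names to your Sails version
module.exports.cors = {
  // apply CORS to every route instead of configuring it per route as above
  allRoutes: true,
  // the domains hosting the web apps that are allowed to call this API
  origin: 'http://app-one.example.com,http://app-two.example.com',
  // let the browser send the session cookie along with cross-origin requests
  credentials: true
};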
Worth mentioning that you must specify where Socket.io has to connect to (in the front-end JS):
socket = io.connect(API.url);
For any common HTTP GET/PUT/POST/DELETE request, please ensure that your AJAX calls are sent with credentials (cookies). For example, with Angular:
$httpProvider.defaults.withCredentials = true
Let me know how it goes.
I made a website with React and used the "react-paypal-button-v2" package to integrate PayPal into my website, and everything is working well. Now what I would like to do is hide the "clientId", which is one of the properties of react-paypal-button-v2, as below:
<PayPalButton
  amount={amount}
  currency={currency}
  onSuccess={(details, data) => onSuccess(details, data)}
  options={{
    clientId: "YOUR_CLIENT_ID"
  }}
/>
because it is not secure to put sensitive data in the front-end, as stated in the React documentation. So I decided to handle this on the backend, save the clientId as a variable, and use it each time a payment request is fired.
So my question is: what is the best way to make the payment on the server side?
Or, if I'm wrong and there is a better way than this, please tell me.
Thanks in advance, guys.
The Client ID is not sensitive information. It is intended to be used on the client side.
A server integration uses a client ID + secret for API calls. Server integrations are more robust and secure, so if you have the resources and ability to integrate with a backend it's recommended that you do so.
Vanilla JS+backend approach
Create two routes on your server, one for 'Create Order' and one for 'Capture Order', as documented here. These routes should return JSON data. The latter should (on success) store the payment details in your database before returning.
Pair those two routes with the following approval flow: https://developer.paypal.com/demo/checkout/#/pattern/server
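A rough sketch of what those two routes could look like, assuming an Express server and node-fetch (the route paths, amount and currency are placeholders, and error handling is omitted); the client ID and secret live only in server-side environment variables:

const express = require('express');
const fetch = require('node-fetch');

const app = express();
app.use(express.json());

const PAYPAL_API = 'https://api-m.sandbox.paypal.com'; // use https://api-m.paypal.com for live

// Exchange the client ID + secret for an OAuth2 access token
async function getAccessToken() {
  const auth = Buffer.from(
    `${process.env.PAYPAL_CLIENT_ID}:${process.env.PAYPAL_SECRET}`
  ).toString('base64');
  const res = await fetch(`${PAYPAL_API}/v1/oauth2/token`, {
    method: 'POST',
    headers: {
      Authorization: `Basic ${auth}`,
      'Content-Type': 'application/x-www-form-urlencoded'
    },
    body: 'grant_type=client_credentials'
  });
  const data = await res.json();
  return data.access_token;
}

// 'Create Order' route
app.post('/api/orders', async (req, res) => {
  const accessToken = await getAccessToken();
  const response = await fetch(`${PAYPAL_API}/v2/checkout/orders`, {
    method: 'POST',
    headers: {
      Authorization: `Bearer ${accessToken}`,
      'Content-Type': 'application/json'
    },
    body: JSON.stringify({
      intent: 'CAPTURE',
      purchase_units: [{ amount: { currency_code: 'USD', value: '10.00' } }]
    })
  });
  res.json(await response.json()); // the returned JSON contains the order id
});

// 'Capture Order' route
app.post('/api/orders/:orderId/capture', async (req, res) => {
  const accessToken = await getAccessToken();
  const response = await fetch(
    `${PAYPAL_API}/v2/checkout/orders/${req.params.orderId}/capture`,
    {
      method: 'POST',
      headers: {
        Authorization: `Bearer ${accessToken}`,
        'Content-Type': 'application/json'
      }
    }
  );
  const capture = await response.json();
  // on success, store the payment details in your database here before returning
  res.json(capture);
});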
React specifics
react-paypal-button-v2 is not an official module; try the newer react-paypal-js instead. See the "Docs" tab of the Storybook.
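For illustration, a sketch of the front-end side with @paypal/react-paypal-js, wired to the two hypothetical server routes from the sketch above:

import { PayPalScriptProvider, PayPalButtons } from "@paypal/react-paypal-js";

function Checkout() {
  return (
    <PayPalScriptProvider options={{ "client-id": "YOUR_CLIENT_ID", currency: "USD" }}>
      <PayPalButtons
        // ask the server to create the order and return its id
        createOrder={() =>
          fetch("/api/orders", { method: "POST" })
            .then((res) => res.json())
            .then((order) => order.id)
        }
        // ask the server to capture the approved order
        onApprove={(data) =>
          fetch(`/api/orders/${data.orderID}/capture`, { method: "POST" })
            .then((res) => res.json())
            .then((details) => console.log("Capture result", details))
        }
      />
    </PayPalScriptProvider>
  );
}

Note that the client ID still appears in the front end here; as said above, it isn't sensitive, and only the secret has to stay on the server.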
const ClientOAuth2 = require('client-oauth2');

const oauth2 = new ClientOAuth2({
  clientId: 'clientId',
  clientSecret: 'clientSecret',
  accessTokenUri: 'https://fakeurl.com/v1/auth/token',
});

oauth2.credentials.getToken().then(function (user) {
  console.log(user);
}).catch(function (error) {
  console.log(error);
});
Is there a way I can include proxy settings when requesting a token, given that I am running this code inside a corporate network?
I had a quick look and couldn't find an easy way to do it. Besides the proxy features, I also noticed that it is missing OpenID Connect features such as these:
Looking up metadata
Authorization Code Flow (PKCE)
Calls to the User Info Endpoint
REQUIREMENTS
Choosing a library for your apps is an important decision, and here are a few common things people usually look for:
Standards Based (works for any Authorization Server)
Certified as following the latest OAuth 2.1 and OpenID Connect recommendations
Supports HTTP proxying (highly useful to view OAuth messages when developing)
NODEJS SOLUTION
If you are using Node, it might be worth considering the node openid-client library, which is the one I use. Here is some relevant code from an API of mine (a rough sketch of these pieces follows the list):
Looking up metadata - note that an agent can be supplied to support proxying
Setting the HTTP proxy - I use TunnelAgent.httpsOverHttp to proxy calls to HTTPS OAuth URLs
OAuth Operations - note that there are some custom classes that make these tasks easier
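For illustration, here is a rough sketch of those pieces with openid-client and the tunnel package; the issuer URL, client credentials and proxy host/port are placeholders, and the exact shape of the agent option varies between openid-client/got versions, so treat this as a starting point rather than the library's definitive API:

const { Issuer, custom } = require('openid-client');
const tunnel = require('tunnel');

// Setting the HTTP proxy: route HTTPS OAuth calls through a local HTTP proxy
const proxyAgent = tunnel.httpsOverHttp({
  proxy: { host: '127.0.0.1', port: 8888 }
});
// depending on the openid-client/got version this may need to be { agent: { https: proxyAgent } }
custom.setHttpOptionsDefaults({ agent: proxyAgent });

async function run() {
  // Looking up metadata from the discovery endpoint
  const issuer = await Issuer.discover('https://login.example.com/.well-known/openid-configuration');

  const client = new issuer.Client({
    client_id: 'my-client-id',
    client_secret: 'my-client-secret',
    redirect_uris: ['https://app.example.com/callback'],
    response_types: ['code']
  });

  // Call to the User Info Endpoint, given an access token obtained via the code flow
  const userInfo = await client.userinfo('ACCESS_TOKEN');
  console.log(userInfo);
}

run().catch(console.error);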
I am very new to electron so I may be going about this all wrong.
We have a few web apps internally that are all working and I wanted to practice by building one of them into electron.
What I need to do is load our SSO login page within the app and then listen for a cookie/session to be created after authentication has been successful.
I am using a webview like so:
<div style="width:100%; height:100%">
<span class="loading loader" id="loading" name="loading"></span>
<webview class="ssologin" src="https://example.com/resources/ldap.php" autosize="on" style="min-width:755px; min-height:640px"></webview>
</div>
This loads the login page for LDAP/SSO. After I log in, it would normally take you to the web application you were going to before you were re-routed to SSO due to not having a valid session.
I am trying to figure out how I can listen for a cookie/session so that I know that they have authenticated and we get a response back.
Essentially, I need this valid session in order to make future API calls in the app to endpoints so I want to try and use this existing authentication implementation without having to include other modules and mess with all that.
Any suggestions?
Just in case you didn't know: Electron does not currently recommend using <webview>:
We currently recommend to not use the webview tag and to consider alternatives, like iframe, Electron's BrowserView, or an architecture that avoids embedded content altogether.
Cf https://electronjs.org/docs/api/webview-tag#warning
You probably need to set a partition on your <webview>:
<webview src="https://github.com" partition="persist:github"></webview>
<webview src="https://electronjs.org" partition="electron"></webview>
Sets the session used by the page. If partition starts with persist:, the page will use a persistent session available to all pages in the app with the same partition. if there is no persist: prefix, the page will use an in-memory session. By assigning the same partition, multiple pages can share the same session. If the partition is unset then default session of the app will be used.
Cf https://electronjs.org/docs/api/webview-tag#partition
With that you can (from the main process) access the cookie of the session:
const {session} = require('electron');
const sess = session.fromPartition('persist:foobar');
const cookies = sess.cookies;
Then you can listen for changed events on that cookie object:
Emitted when a cookie is changed because it was added, edited, removed, or expired.
cookies.on('changed', () => {
  // do something when your SSO cookie is set
});
Cf https://electronjs.org/docs/api/cookies#event-changed
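Putting it together, a small sketch from the main process; the partition name and the 'SSOSESSION' cookie name are placeholders for whatever your SSO actually sets:

const { app, session } = require('electron');

app.whenReady().then(() => {
  // must match the partition attribute on the <webview>
  const sess = session.fromPartition('persist:sso');

  // 'changed' fires whenever a cookie is added, edited, removed or expired
  sess.cookies.on('changed', (event, cookie, cause, removed) => {
    if (!removed && cookie.name === 'SSOSESSION') {
      // the user has authenticated; subsequent API calls can rely on this session
      console.log('SSO session cookie set:', cookie.value);
    }
  });
});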
I am building a react web application with a separate back-end express api that manages all the calls, including passporting and setting cookies. Let's call the back-end service 'api.com' and the front-end service 'react.com'. I'm using passporting with an existing provider (spotify) and after the authorization succeeds, a cookie is set on api.com. The idea is that the user interacts with react.com and requests are made to api.com via a proxy.
If I'm just testing in my browser and I make a call to api.com/resource, the cookie is automatically set. I know this because I've added a bit of logging and also because the requests that require authorization are succeeding via the cookie.
However, when I make calls to api.com from react.com via the proxy, the cookie is not set. Is this expected behavior when proxying? It seems odd that the cookie is set when I call api.com directly, but it is not set when it is redirected. Is there a way around this? My thought would be to communicate the cookie from api.com to react.com, save it there, and send it on all subsequent requests, but that seems overkill. I'm also wondering if maybe I should be setting the cookie on react.com instead of api.com.
I've tried in both Firefox and Chrome, and if it makes a difference, I'm using axios for the requests on react.com.
const request = axios({
  method: 'get',
  url: '/api/resource'
});
This gets proxied as follows (still on react.com), using express-http-proxy:
app.use('/', proxy('api.com', {
  filter: (req) => {
    return (req.path.indexOf('/api') === 0);
  }
}));
But once this hits api.com, any authentication fails, because the cookie is not present.
Any help is appreciated
As far as I have understood your question, I think you're not considering that cookies are set per host name.
So in the first case the hostname is the same and it's okay, but in the second case the browser's cookies are not set for react.com.
So trying to set the cookie on react.com should work.
I would have asked for a clarification using a comment but I don't have enough reputation for that yet.
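One way to act on that, sketched below, is to strip the Domain attribute from Set-Cookie headers as they pass through the proxy, so the browser scopes the cookie to react.com; this assumes express-http-proxy's userResHeaderDecorator option is available in the version you're using:

const proxy = require('express-http-proxy'); // same setup as above

app.use('/', proxy('api.com', {
  filter: (req) => req.path.indexOf('/api') === 0,
  // rewrite Set-Cookie so the cookie is scoped to the proxying host (react.com)
  userResHeaderDecorator: (headers) => {
    if (headers['set-cookie']) {
      headers['set-cookie'] = headers['set-cookie'].map((cookie) =>
        cookie.replace(/;\s*Domain=[^;]+/i, '')
      );
    }
    return headers;
  }
}));

With that, the browser stores the session cookie against react.com and sends it back through the proxy on every subsequent /api request.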
Okay, so at the moment I have a frontend application built with Nuxt.js, using Axios to do requests to my REST API (which is separate).
If a user does a search on the website, the API URL is visible in the XMLHttpRequests, so anyone could use the API if they wanted to.
What is the best way to make it so that only users who search through my website get access to the API, while people who go directly to the URL get denied? I suppose some sort of token system, but what is the best way to do it? JWT? (Users never log in, so there is no "authentication".)
Thanks!
IMO, you CANNOT block other illegitimate clients from accessing your backend, since, as you describe it, the official client and any illegitimate client have the same knowledge about your backend.
But you can make it harder for illegitimate clients to access your backend with approaches such as sending all requests as POST, special keys in the headers, a token in the header that changes every 30 minutes, and server-side API throttling by client IP.
If the security of the search API is really important, authenticate it behind a login; if not, just let it go, since it is not on your critical path, and focus on other important things.
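As a minimal example of the throttling idea mentioned above, assuming an Express backend and the express-rate-limit package (the path and limits are arbitrary placeholders):

const rateLimit = require('express-rate-limit');

// limit each client IP to 60 search requests per 15-minute window (app is the Express app)
app.use('/api/search', rateLimit({
  windowMs: 15 * 60 * 1000,
  max: 60
}));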
I'm in the same "boat", and my current setup is actually in Vue.js. Before even coming to Stack Overflow, I developed an approach where the frontend calls the server and then the server calls the API, so in the browser you only see calls to the server layer; the only constraint is that the call must come from the same hostname.
The backend is handled with Express.js and the frontend with Vue.js:
// protect /api calls so they can only originate from 'process.env.API_ALLOW_HOST'
app.use(api.allowOnlySameDomainRequests);

...

const allowHostname = process.env.API_ALLOW_HOST || 'localhost';

exports.api = {
  ...
  allowOnlySameDomainRequests: (req, res, next) => {
    if (req.url.startsWith('/api') && req.hostname === allowHostname) {
      // an /api call, allowed only if the request comes from the same host
      return next();
    } else if (!req.url.startsWith('/api')) {
      // not an /api call
      return next();
    }
    return res.redirect('/error?code=401');
  },
  ...
};
In our case, we use OAuth2 (Google sign-in through Passport.js) to log in the user. I always have a user id given by the successful OAuth2 redirect, and that user id is passed to the API in a header, together with the API key... on the server I check that user id's permissions and allow the action to be executed or not.
But even so, I was trying to find something better. I've seen several JavaScript frontend apps making calls to their backend, but they use Bearer tokens.
As a curious user, you would see the paths to the whole API and how it is composed, but in my case you only see calls to the Express.js backend, and only there do I forward to the real API... I don't know if that's just "more work", but it seemed a bit more "secure" to approach the problem this way.
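For illustration, the forwarding layer described above could look roughly like this; the real API URL, header names and key are placeholders, and the key lives only in server-side environment variables:

const express = require('express');
const fetch = require('node-fetch');

const app = express();

// the front end only ever calls /api/search; the real API and its key stay on the server
app.get('/api/search', async (req, res) => {
  const response = await fetch(
    `https://real-api.example.com/search?q=${encodeURIComponent(req.query.q || '')}`,
    {
      headers: {
        'x-api-key': process.env.REAL_API_KEY,   // never exposed to the browser
        'x-user-id': req.user ? req.user.id : '' // user id from the OAuth2 login, if any
      }
    }
  );
  res.status(response.status).json(await response.json());
});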