Single Page Web Apps, CORS and security concerns

The situation
I am writing a single-page web app (using Angular). Let's call it the SPA.
Another team-mate is writing some APIs (using Node.js). Let's call it the Server.
My SPA is to log in to the Server using a login/password, and do some stuff.
My team-mate has decided to use cookies to track the session. Hence, upon a successful login, an HttpOnly cookie is set in the web browser the SPA is loaded in.
The problem
If we put the SPA in the Server's public_html dir, all works well. This, however, makes the SPA a part of the API code. This breaks our build process, since every version upgrade to the SPA now requires upgrading the API too.
If we host the SPA on a separate web server that only serves the static SPA files, I run into CORS issues. Since the SPA comes from a different origin than the APIs it is trying to access, the browser blocks the AJAX calls. To overcome this, we will have to set Access-Control-Allow-Origin on the server side appropriately. I also understand that Access-Control-Allow-Credentials: true needs to be set, to instruct the browser to set/send the cookies.
Possible solutions
We create a build process which does a git-pull to the Server's public_html dir every time the SPA gets upgraded. I am trying to avoid this to keep the client and server upgrades separate.
We create a proxy kind of situation, where the Server doesn't store the SPA files, but collects them on demand from another server that hosts the SPA files. In this case, the web browser sees the SPA files and the subsequent AJAX calls as coming from the same origin.
We code the server to set Access-Control-Allow-Origin: * in its responses. Firstly, this is too open and looks insecure. Is it really insecure, or is that just my perception? Also, since we are setting Access-Control-Allow-Credentials: true, Chrome complains "Cannot use wildcard in Access-Control-Allow-Origin when credentials flag is true." To overcome this, we would have to put exact origins (perhaps using a regex) in Access-Control-Allow-Origin. This may seriously restrict us from distributing our SPA to users in unknown domains.
For a server API designer, is cookie-based authentication the recommended way to handle authentication for SPAs? OAuth 2.0 and JWT-based authentication seem to suggest that cookie-based authentication is not right for SPAs. Any pros/cons?
Kindly comment on the above options, or suggest any others that you may have used. Thanks in advance.

I think the issue is that your terminology is confusing. An API is not a server; it's an application that lives on a machine that can also be a server. If you make a Node.js API, I suggest you put an Nginx server in front of it as a reverse proxy. Assuming you want the Nginx server, API and SPA files all on the same machine, you can deploy your API to one directory and your SPA to another directory and have Nginx route the requests accordingly.
So I believe solution 2 is the way to go. From there you can easily scale by increasing the number of instances (if you use AWS) and load-balancing them, or by separating your API into its own application server.
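For a quick local prototype of the same single-origin routing, here is a rough sketch in Node itself rather than Nginx, using Express and the http-proxy-middleware package; the paths and ports are assumptions, and the equivalent Nginx config would simply route / and /api to the same two places.

```js
// serve the SPA build and proxy /api to the Node API, all from one origin
const express = require('express');
const { createProxyMiddleware } = require('http-proxy-middleware');

const app = express();

// anything under /api goes to the API process (port is an assumption)
app.use('/api', createProxyMiddleware({ target: 'http://localhost:5000', changeOrigin: true }));

// everything else is served from the static SPA build (directory is an assumption)
app.use(express.static('spa/dist'));

app.listen(8080);
```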
As far as authentication goes, I have always preferred using the Authorization header with access tokens over cookies for SPA or API requests. The idea that each request is self-contained and does not require a persistent string kept in the browser is more appealing to me, though you can save the access token via local storage.
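A minimal illustration of that token style on the client (the endpoint, storage key and token variable are made up for the example):

```js
// after a successful login response, keep the access token client-side
localStorage.setItem('accessToken', token);

// every API call then carries the token itself; no session cookie involved
fetch('https://api.example.com/profile', {
  headers: { Authorization: `Bearer ${localStorage.getItem('accessToken')}` },
})
  .then(res => res.json())
  .then(profile => console.log(profile));
```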

I would go with either solution 2 or 3.
2: you could host both (webpage and API) on the same server (or use reverse proxies) so that from an outside perspective they share the same origin.
3: in the case of an API, the same-origin policy becomes less important. The API is to be consumed by clients that are not part of your web application anyway, no?
I would not see any issue in setting a more lax Access-Control-Allow-Origin header. And by more lax I don't mean a wildcard, just add the origin of your webpage. Why do you want to wildcard it?
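For reference, a minimal Express sketch of that allowlist approach (the origins are assumptions; the cors npm package offers the same behaviour through its origin option):

```js
// echo back only known SPA origins, never a wildcard, so credentials keep working
const allowedOrigins = ['https://app.example.com', 'http://localhost:3000'];

app.use((req, res, next) => {
  const origin = req.get('Origin');
  if (origin && allowedOrigins.includes(origin)) {
    res.set('Access-Control-Allow-Origin', origin);
    res.set('Access-Control-Allow-Credentials', 'true');
    res.set('Vary', 'Origin'); // caches must key on the Origin header
  }
  next();
});
```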

Related

How can I hide an API key in an Electron JS project?

I'm working on an Electron JS app and I need to connect it to an API.
Knowing that the source code of an Electron JS app is visible, it's a huge security risk to leave the API key there!
How can I solve this problem?
Instead of having the electron app (whether from the page or from the main process) make a request to the API directly, you can have it make a request to your own server instead - then, your server can make the request to the API, so that the key is only visible to your server, and isn't exposed publicly anywhere.
This will also let you gate requests from clients - if, for example, the credentials a client sends don't match what you need, or if they make too many requests in too short a time, you can cut them off.
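A bare-bones sketch of such a relay, assuming an Express server and Node 18+ (for the built-in fetch); the third-party URL, route and environment variable are invented for the example:

```js
const express = require('express');
const app = express();

// the client calls this route; the real API key never leaves the server
app.get('/weather', async (req, res) => {
  const city = encodeURIComponent(req.query.city || '');
  const upstream = await fetch(
    `https://api.example-weather.com/v1/current?city=${city}&key=${process.env.WEATHER_API_KEY}`
  );
  res.status(upstream.status).json(await upstream.json());
});

app.listen(3000);
```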
You can't. If a single API key is shared by all users, it is probably designed to be used in the backend.
The solution for that is to create a backend API for proxying the API calls. Such a proxying API should use authentication, so each user must send individual credentials.
How about obfuscating it using a utility like js-beautify?

Same Origin Policy easily circumvented?

I've read an article which used Cors-Anywhere to make an example URL request, and it made me think about how easily the Same-Origin Policy can be bypassed.
While the browser prevents you from accessing the response directly, and cancels the request altogether when it doesn't pass a preflight check, a simple Node server does not need to abide by such rules, and can be used as a proxy.
All that needs to be done is to prepend 'https://cors-anywhere.herokuapp.com/' to the requested URL in the malicious script and voilà, you don't need to pass CORS.
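For example, the cross-origin request in the script would simply become something like the following (the target URL is just a placeholder):

```js
// route the request through the public Cors-Anywhere instance instead of hitting the URL directly
const target = 'https://example.com/some-api';
fetch('https://cors-anywhere.herokuapp.com/' + target)
  .then(res => res.text())
  .then(body => console.log(body));
```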
And as sideshowbarker pointed out, it takes a couple of minutes to deploy your own Cors-Anywhere server.
Doesn't it make SOP as a security measure pretty much pointless?
The purpose of the SOP is to segregate data stored in browsers by their origin. If you got a cookie from domain1.tld (or it stored data for you in a browser store), Javascript on domain2.tld will not be able to gain access. This cannot be circumvented by any server-side component, because that component will still not have access in any way. If there was no SOP, a malicious site could just read any data stored by other websites in your browsers.
Now this is also related to CORS, as you somewhat correctly pointed out. Normally, your browser will not receive the response from a JavaScript request made to a different origin than the page origin it's running on. The reason is that if it did, you could gain information from sites where the user is logged in. If you send it through Cors-Anywhere though, you will not be able to send the user's session cookie for the other site, because you still don't have access; the request goes to your own server as the proxy.
Where Cors-Anywhere matters is unauthenticated APIs. Some APIs might check the origin header and only respond to their own client domain. In that case, sure, Cors-Anywhere can add or change CORS headers so that you can query it from your own hosted client. But the purpose of SOP is not to prevent this, and even in this case, it would be a lot easier for the API owner to blacklist or throttle your requests, because they are all proxied by your server.
So in short, SOP and CORS are not access control mechanisms in the sense I think you meant. Their purpose is to prevent and/or securely allow cross-origin requests to certain resources, but they are not meant to, for example, prevent server-side components from making arbitrary requests, or to authenticate your client-side JavaScript itself (which is not technically possible).

How to make Node API only accessible by web app?

I'm developing a web app with React and a GraphQL API with Node.js / Express. I would like to make the API more secure, so that it's harder for API requests that don't come from the web app in the browser to get data. I know how to do it with registered users. But how do I let non-registered users still access some basic data needed for the app?
Is it possible to put some kind of key in the web app, so the API call can't be replicated by others through sniffing it in the browser's network dev tools and replaying it in Postman? Does SSL/TLS also hide requests from that browser tool? Or should I use something like a "standard" user for non-registered visitors?
It's a server-side web app with Next.js.
I know there's no 100% secure API, but maybe it's possible to make unauthorized access harder.
Edit:
I'm not sure if this is a CSRF problem, because it's not about accessing user data or changing data through malicious websites etc. It's about other people trying to use the website data (all GET requests to the API), who could easily build their own web app on top of my API. The goal is that no one can easily query my API through simple Postman requests.
The quick answer is no, you can't.
If you are trying to prevent what can be described as legit users from accessing your API, you can't really do it. They can always fake the same logic and hit your webpage first before abusing the API. If this is what you're trying to prevent, your best bet is to add rate limiting to the API to stop a single user from making too many requests (I'm the author of ralphi, and express-rate-limit is very popular).
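For illustration, a minimal express-rate-limit setup might look like this (the window and limit values are arbitrary):

```js
const rateLimit = require('express-rate-limit');

// allow at most 100 API requests per IP per 15 minutes
app.use('/api', rateLimit({ windowMs: 15 * 60 * 1000, max: 100 }));
```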
But if you are actually trying to prevent another site from leeching off you and serving your content to their users, that is actually easier to solve.
Most browsers send a Referer header with the request; you can check this header and see that requests are actually coming from users on your own site (this technique is called leech protection).
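A rough sketch of that check as Express middleware (the allowed host is an assumption; note the HTTP header is spelled Referer):

```js
// reject API requests whose Referer does not point back to our own pages
app.use('/api', (req, res, next) => {
  const referer = req.get('Referer') || '';
  if (!referer.startsWith('https://myapp.example.com/')) {
    return res.status(403).json({ error: 'Forbidden' });
  }
  next();
});
```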
A leeching site can try to proxy requests to your API, but since they are all going to come from the same IP, they will hit your rate limiting, and the site can only serve a few users before being blocked.
One thing the leeching site can do is try to cache your API responses so it won't have to make so many requests. If that is a possible case, you are back to square one and you might need to manually block its IP once you notice such abuse. I would also check whether it's legal, because the operator might be breaking the law.
Another option, similar to the Referer check, is to use SameSite cookies. They will only be sent if the request is coming directly from your site. They are probably more reliable than the Referer header, but not all browsers actually respect them.
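Setting such a cookie from Express could look like this (the cookie name and value are placeholders):

```js
// issue a session cookie that the browser will only attach to same-site requests
res.cookie('session', sessionId, {
  httpOnly: true,
  secure: true,
  sameSite: 'strict',
});
```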

Accessing Header values in React

I am new to React, and I believe this is a unique situation.
So, I am creating a React website with create-react-app. When I run npm start, it runs on localhost:3000.
My next requirement is to have Authentication & Authorization (AA). It is already created by another group.
What they give me is a small proxy website. Let's say it runs on localhost:4000.
When I want to go to localhost:3000/Home, I will type localhost:4000/Home.
localhost:4000 will take care of AA, and if everything works out, it will forward to localhost:3000/Home.
The login information is added to Http-Only & Secure cookie by localhost:4000.
My question is: how can React access it?
One of the solutions is to use react-cookie. But it cannot access HttpOnly & Secure cookies.
The workaround is to create another proxy (React with Express).
So, I create another proxy at localhost:5000.
The workflow becomes: localhost:4000 handles AA. The result is forwarded to localhost:5000. localhost:5000 decides whether it should forward to localhost:3000 or redirect to an error page. This works if my original website doesn't need to use those result cookies.
But the React website needs to make some POST calls to a separate website at localhost:9000. React needs to pass the same AA result cookies to that website; if not, it will ask the user to log in again.
I cannot change the cookies' properties to non-HttpOnly, or copy them to another place.
So, the question is how to pass those cookies to localhost:9000?
Am I on a completely wrong track? I couldn't find any similar posts online.
In short, can React access the header where some information like username is set, and cookies where actual tokens are stored? If not, what's the standard workaround?
Thanks so much.
Trying to work around HTTP-only authorization cookies is a huge security risk, because in this case they could be stolen by third party scripts. Using proxies to bypass this limitation is certainly an inappropriate solution.
You are working in an environment with several services (localhost 3000, 4000, 9000) that share one authentication system. In this case your options are:
1) Make your server (the 3000 one) communicate with the other servers on behalf of the client. If your client needs to POST to 9000, let it POST to a special endpoint of 3000. Your server will then query 9000 and return the answer to the client.
In this case you need to establish server-to-server authentication, which is much simpler than client-to-server. But you have to handle authorization separately if your clients may have different permissions.
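A skeletal version of option 1 on the :3000 server, assuming Node 18+ (built-in fetch) and a server-to-server token kept in an environment variable; the route name and token mechanism are assumptions, not part of the original setup:

```js
const express = require('express');
const app = express();
app.use(express.json());

// the browser POSTs here; this server calls :9000 with its own credentials
app.post('/api/orders', async (req, res) => {
  const upstream = await fetch('http://localhost:9000/orders', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      // server-to-server credential, never exposed to the client
      Authorization: `Bearer ${process.env.SERVICE_TOKEN}`,
    },
    body: JSON.stringify(req.body),
  });
  res.status(upstream.status).json(await upstream.json());
});

app.listen(3000);
```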
2) If the 4000 server is the only source of AAA, make it a single sign-on server. There are a lot of solutions of this type. OAuth2 and JSON Web Tokens (JWT) are hot topics nowadays, with the advent of microservices.
You could also talk to the team that developed localhost:4000 and ask what their idea was.

SPA security using Backbone.js, Require.js and Laravel

I'm currently searching for the best way to develop my next web application. I'm thinking about using Backbone.js and building a single-page application. But I really can't imagine how to secure my app, since nearly everything is done on the client side. Of course I could just prevent users from accessing my RESTful API so they would not have access to my data. But all the view/model/collection/template JS files are still accessible.
Or is there a known way to serve the JS files with PHP (Laravel), which would allow me to serve only the files needed for the respective user?
I just couldn't find a solution by searching the web. But I don't think I'm the only person who needs a clean and secure authentication method including different user rights.
Thank you in advance!
Your frontend application will fetch data from a backend (= API), and probably send back some changes.
This code can't have "security holes / leaks" as long as your backend is secured.
If you are afraid of people stealing your code, you can always minify the JS (check grunt.js and almond.js for this)
To secure your backend you can make use of Laravel's auth class, and the auth filter as mentioned before.
Besides normal auth, you could implement roles, that you can assign to specific users, giving them more or less access to certain resources in your backend.
Here's the method I would try:
Separate the application into two parts.
One part handles login via regular Laravel Auth on a separate page; then, when the user is logged in, serve the single-page app in a different view.
Wouldn't this work?
Web services are no different from any other web application you build. At the end of the day you are exposing functionality to the client (which is also the attacker). It doesn't matter what the client is implemented in; if you expose dangerous functionality, you will be hacked.
Have a session state, keep track of the user id and make sure that the user is only accessing resources they have been allowed to access.
I do not think it really matters which JS/template files are exposed. Essentially, you should only be allowing data interaction to authenticated users. Think of this as two separate applications.
The front-end application logs in, and a cookie is stored (or some other persistence is used).
The back-end application then uses the persistent authentication to validate every single user request for data, and every user action.
This way you don't have to worry about the security: the client can only fetch the data that the server allows it to, and, likewise, it can only interact with the data insofar as the server allows it. You shouldn't be relying on the client side for security anyway, even when logged in; otherwise some malicious user could, conceivably, save all your frontend code and use it against you without authentication.
