I have created a MERN application using Create-React-App.
Both when run on localhost and when deployed to Heroku, the browser reports that the site (or localhost) uses 2 cookies.
It looks like this:
Because of this, I have to add an accept-cookies pop-up to my website and create a cookie policy.
I did not write a single line of code about cookies in my app, but there are two.
From my search, CRA does not automatically include code for setting cookies.
How can I remove these two cookies?
Based on the screenshot you've included in your question, you've mistaken local/session storage for cookies.
Neither storage type has anything to do with cookies. Values stored in these storages are not transmitted back to the server and therefore cannot be used to identify your users in any way.
You do not need a cookie policy for these two storages.
Just to make sure that you didn't accidentally have any cookies set, you could also take a look in your dev tools; the different kinds of storage are listed separately there.
We have an app which stores some information in the browser's local storage via javascript. The app was rejected in the review phase because it saved these entries without adding them to the cookie consent. From what I can see we can only add cookies to the cookie consent via the app's manifest. How could we add these local storage entries to the consent? We can add them as cookies in the manifest with the same names but that would create these as cookies once the consent is approved, which is not ideal and unnecessary. Is there a better way?
In theory you could try to override the CookieConfiguration plugin in such a way that your local storage entries show up in the cookie consent window, yet no actual cookies are set once consent is accepted.
Personally, I wouldn't go that far, as it might also lead to other issues in the review process. For now I would simply let the entries be set as cookies to get through review. Please create an issue on the issue tracker explaining the need for local storage entries in the cookie consent window.
I have multiple front-end & back-end apps running on different subdomains of the same domain. On the main front-end app I want to build a thing to switch between subdomains but also keep the session.
I've tried to:
use express-session
do some tricks with the JWT authentication
localStorage is not going to work, as it is scoped to a single origin
but still can't figure out:
Is it possible to have a session shared across multiple subdomains?
What is the best solution to have a shared session across multiple subdomains?
The technologies I use:
Front-end: React JS
Back-end: Node & Express JS
To share sessions across sub-domains, you need to configure two things.
You need the proper cookie settings for the session cookie so that the browser will send the same session cookie to both sub-domains. This involves setting the domain attribute on the cookie to the root domain. You can set this in the cookie options for the express-session configuration.
You need to make sure that the server for each sub-domain has access to the same session store. If it's actually the same server for each sub-domain, then that's easy. But, if it's a different server, then you will need a shared session store, using some type of shared database (redis, mongodb, etc...). There are session store implementations for many different databases.
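The cookie settings described above can be sketched as an express-session options object; a minimal sketch, assuming `example.com` as the root domain and an environment variable named `SESSION_SECRET`, both of which are placeholders.

```javascript
// Sketch of express-session options for a session cookie shared across
// subdomains. 'example.com' and SESSION_SECRET are illustrative names.
const sessionOptions = {
  secret: process.env.SESSION_SECRET || 'replace-me',
  resave: false,
  saveUninitialized: false,
  cookie: {
    // Setting the Domain attribute to the root domain makes the browser
    // send the cookie to app.example.com, api.example.com, etc.
    domain: 'example.com',
    httpOnly: true,
    secure: true,
  },
  // With multiple servers, point every one at the same shared store, e.g.:
  // store: new RedisStore({ client: redisClient }),
};

// In each app: app.use(session(sessionOptions));
```

Note that per RFC 6265 a leading dot on the domain is ignored; `domain: 'example.com'` is enough to cover all subdomains.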
As it happens, we're working on the same kind of project these days, with Next.js as the frontend and a Node/Express backend API.
We use cookies to manage the session on subdomains.
To maintain the session across subdomains, we use middleware in Next.js that checks the session on every page using a token and user ID.
On login, we send the token and user ID as URL parameters; based on these, the user data is fetched from the API and saved on that subdomain.
On logout, we remove the cookies and send a parameter telling the main domain to remove its saved cookies, which completely cleans up the session.
This is how we maintain the auth session across multiple domains.
Hope this helps.
If you find a better way than this, I would like to know.
In our web application's UI, we load a video in an iframe. The video is on office 365/SharePoint server.
If the user is not logged into the organization's portal managed by Azure ADAL, a login screen is loaded in the iframe. If the user is already logged in the video plays normally.
So far, fine. But our management does not want the iframe redirected to the login page; instead, they want to set a cookie on the iframe and load the video.
We said that it is not possible to set a cookie on an iframe and send a request, and we also asked how we could get Microsoft cookies into our application. The architect says there is a REST endpoint that will give the details of the cookie, but we still have no idea how to set it.
Is it really possible to set cookies and send them to the Microsoft portal to avoid authentication? I believe it is not possible, but the architects and management insist we try something.
I would say it's not possible to set a cookie "per frame", but I guess you can log the user in ("somehow", see below) and then reload the frame (or check authentication before even trying to load the frame).
Idea to login silently:
Create an account on your SharePoint that is only allowed to watch the selected videos (a "public user")
Automatically log in all not-yet-authenticated users with this account
Perhaps with a REST call to the SharePoint server, check whether the user is logged in
If not logged in, send the login data for the public user to the SharePoint server, perhaps from a (hidden) frame
All future requests should then have the cookie set
Show them the video
But as for the idea of manually setting the cookie: for security reasons, browsers won't let you (i.e. your web application) read or write cookies for another domain (here, the SharePoint server).
Sort of.
We accomplished something like this through the use of a proxy server.
In short, the proxy (hosted in Elastic Beanstalk) would notice an incoming request, check its cookies for one that we set to determine that the user is logged in, and if it found that cookie it would call some authorization endpoints with it in order to append a new cookie to the response (a Set-Cookie header), which we would then use to determine how to proceed. The proxy was written with Node.js/Express.
As long as you set sameSite: 'None' in the cookie options when setting the cookie, it should work even though the site hosting the iframe is on another domain.
I'm not sure if this relates into your bigger picture, but maybe gives some inspiration to others with similar issues.
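For reference, here is a tiny sketch of the Set-Cookie attributes a cookie needs to survive inside a cross-site iframe; the helper name and cookie name are hypothetical, not part of the setup described above.

```javascript
// Hypothetical helper: builds a Set-Cookie header value that browsers
// will accept in a third-party (iframe) context. SameSite=None must be
// paired with Secure, otherwise modern browsers reject the cookie.
function buildCrossSiteCookie(name, value) {
  return `${name}=${encodeURIComponent(value)}; Path=/; HttpOnly; Secure; SameSite=None`;
}

// e.g. in the proxy:
// res.setHeader('Set-Cookie', buildCrossSiteCookie('proxy_auth', token));
```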
I have a main website running on AppEngine. It's on a subdomain like main.example.com. This main application is a content portal for our customers. It offers an Ajax application built on YUI. Customers can upload data to it. Users authenticate using Federated Login.
The Ajax application on it allows users to process the previously uploaded data. To do this, it should use a web service running on another subdomain, like service.example.com. The web service does not run on AppEngine but on our own servers; it's CPU-heavy and built on a different set of technologies. It would need to download the data from the main application, but the download service, like everything on the main application, is behind the authentication wall.
I could programmatically allow the service to download whatever it wishes, but I think this could turn into a major security problem.
How can I reuse the OpenID authentication "token" to let the service appear to the main application as the authenticated user so it can download data? Or, if this is possible, what would be the best way to accomplish what I intend to do?
You can't really reuse the authentication token. What you should use is something akin to OAuth, though since you control both ends you can make it somewhat simpler:
Generate a shared secret, accessible by both main.example.com and service.example.com
When a user accesses service.example.com for the first time (no authentication cookie), redirect them to main.example.com/auth?continue=original_url (where original_url is the URL they attempted to access)
When you receive a request to main.example.com/auth, first log the user in the regular way (if they're not already). Then, take their user ID or other relevant credentials, and generate an HMAC from them, using the shared secret you established in step 1. Redirect the user to service.example.com/finish_auth, passing the computed HMAC, the authentication details such as user ID, and any parameters you were passed in such as the continue URL.
When you receive a request to service.example.com/finish_auth, compute the HMAC as above, and check it matches the passed in one. If it does, you know the request is a legitimate one. Set an authentication cookie on service.example.com containing any relevant details, and redirect the user back to their original URL.
This sounds complicated, but it's fairly straightforward in implementation. This is a standard way to 'pass' credentials between mutually trusting systems, and it's not unlike what a lot of SSO systems use.
We have one web application (SharePoint) that collects information from disparate sources. We would like to be able to link users to the main websites of those various sources and have them pre-authenticated, i.e. they enter their credentials for the other sources (which are of a number of different types: LDAP, AD, and home-grown!) and we retrieve some information for them and remember their details (possibly single sign-on to keep them nice and safe). The user can then click a link that opens the full app in another window, already authenticated.
Is this even likely to be possible?
Office Server has a Single Sign-On API as a built-in feature; you may want to look into that. It enables you to register user credentials securely and to access them at runtime.
You need to act as a web browser does toward different sites, storing credentials (usually in cookies) locally. So use a proper client library with cookie support. This should work for most sites. Some sites use HTTP authentication, which is even easier to access from appropriate client libraries. The most demanding are SSL websites, but again, most HTTP client libraries cover those nowadays as well.
All you need then is to prepare your web application to act as a proxy to all those separate web resources. How exactly this is done in SharePoint, well, I hope others will answer that...
True Single Sign-on is a big task. Wikipedia describes common methods and links to a few SSO projects.
If you want something lighter, I've used this approach in the past:
Create a table to store temporary security tokens somewhere that all apps can access.
From the source app (Sharepoint in your case), on request of an external app, save a security token (maybe a guid, tight expiration, and userid) in the token table.
Redirect to a request broker page/handler in the destination app. Include the final page requested and the guid in the request.
In the broker, look up the security token. If it exists and hasn't expired, authenticate, authorize, and redirect to the final page if everything is good. If not, send a permissions err.
Security-wise, a guid should be near impossible to guess. You can shrink risk by letting the tokens expire very quickly - it shouldn't take more than a few seconds to call the broker.
If the destination app uses Windows Auth and doesn't have role-based logic, you shouldn't have to do much. Just redirect and let your File/UrlAuthorization handle it. You can handle role-based permissions with the security token db if required.