Can a Chrome extension read data from the local storage?

If a Chrome extension requests the permission "Read and modify all your data on all websites you visit", can it then read data from the local storage of any of those visited sites?
I'm asking this because I know of some websites which store authentication tokens in the local storage. If the extension can access the token, it would be frightfully easy to harvest access tokens...

The answer is yes. A content script injected by an extension has full access to the site's localStorage.
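For example, a minimal content script along these lines could read such a token (a sketch, assuming the extension's manifest matches the site; the auth_token key name is hypothetical):

```typescript
// content-script.ts -- runs in the context of the visited page, so
// window.localStorage here is the site's storage, not the extension's.
const token: string | null = window.localStorage.getItem("auth_token"); // hypothetical key

if (token !== null) {
  // A malicious extension could exfiltrate the value from here,
  // e.g. with a fetch() to a server it controls.
  console.log("Site token visible to the extension:", token);
}
```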

Related

Temporary public URL for Azure Blob storage?

I currently work with a VOIP product that allows our customers to record their phone calls. By default, recorded phone calls are stored on our servers, and we store a URL pointing to the recording, which is embedded in our customers' portals.
We are working on a feature that allows our customers to provide their own Azure Blob details, such that recordings are stored in their own container. The only problem we are having is that the container needs to be set to public so that the recording can be embedded dynamically in the browser.
The paths to the recordings contain multiple UUIDs, providing some kind of security through obscurity, although we still aren't too keen on requiring the containers to be public.
Does Azure Blob Storage provide a way to generate temporary URLs/tokens for accessing files, so that we can refresh links (daily, for example) and a bad actor couldn't share a recording with a link that never ceases to be valid?
What you are looking for is a SAS token.
A shared access signature (SAS) provides secure delegated access to resources in your storage account. With a SAS, you have granular control over how a client can access your data. For example:
- What resources the client may access.
- What permissions they have to those resources.
- How long the SAS is valid.
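As a sketch, here is how a daily-expiring, read-only SAS URL could be generated server-side with the @azure/storage-blob SDK (the account, container, and blob names below are placeholders):

```typescript
import {
  BlobSASPermissions,
  StorageSharedKeyCredential,
  generateBlobSASQueryParameters,
} from "@azure/storage-blob";

// Placeholder values -- in practice these come from your configuration.
const accountName = "myaccount";
const accountKey = process.env.AZURE_STORAGE_KEY!;
const containerName = "recordings";
const blobName = "customer-uuid/call-uuid.wav";

const credential = new StorageSharedKeyCredential(accountName, accountKey);

// Read-only SAS valid for 24 hours; regenerate it daily to "refresh" links.
const sas = generateBlobSASQueryParameters(
  {
    containerName,
    blobName,
    permissions: BlobSASPermissions.parse("r"),
    expiresOn: new Date(Date.now() + 24 * 60 * 60 * 1000),
  },
  credential
).toString();

const url = `https://${accountName}.blob.core.windows.net/${containerName}/${blobName}?${sas}`;
```

The container itself can then stay private; the portal just embeds the freshly generated URL.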

Best approach to create logical separation between users in Azure Blob Storage

I'm trying to understand how cloud storage works with Azure Blob Storage. The use case is a microservice architecture with a few APIs and a frontend where the users can upload, download and delete blobs.
I have two types of blobs: one type is profile pictures and assets that may be accessed by all users, and the other type is blobs that a user owns and only certain users can see/download (users of the same company, website admins...).
There are three concepts whose purpose I'm trying to figure out:
Storage account, that's me, the Azure account holder.
Container, which could be one per entity/user.
Blobs
Uploading blobs will only be possible through a frontend of my microservice architecture, so authentication will be service-to-service with the new service I want to build.
Downloading blobs will work by exposing a URL, and (here my doubts start) when the user clicks the URL, I'm going to check against AuthService whether the user has a logged-in session (if not, redirect to the login frontend), and then I need to check whether the user has permission to download that blob.
How can I do this?
My idea: on URL click, check with AuthService that the user is logged in, have the download service ask for user information, and then check the blob's metadata to determine ownership. That would require storing information like entity_id and user_id into the metadata during the upload process. I don't know...
Did you consider implementing an API/capability in your frontend to generate a SAS URL for the specific blob the user should have access to?
That way this API can verify the user's permissions however you wish, and if the user's request checks out, you provide them with a SAS URL that expires whenever you choose and can grant read/write/delete (you choose) on a specific blob.
Also, I'd highly recommend separating storage accounts that hold system data that is entirely internal from storage accounts with blobs accessible to users. This is because the SAS URL contains the storage account's DNS name, which exposes it to DDoS and other DNS-based attacks, so in my opinion you should limit their scope to only the blobs you need to let users access anyway.
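As a minimal sketch of the first suggestion, assuming an Express frontend, a hypothetical authService session helper, and entity_id metadata written at upload time:

```typescript
import express from "express";
import {
  BlobSASPermissions,
  BlobServiceClient,
  StorageSharedKeyCredential,
  generateBlobSASQueryParameters,
} from "@azure/storage-blob";

// Hypothetical session lookup against your AuthService.
declare const authService: {
  getSession(req: express.Request): Promise<{ entityId: string } | null>;
};

const accountName = "myaccount"; // placeholder
const accountKey = process.env.AZURE_STORAGE_KEY!;
const credential = new StorageSharedKeyCredential(accountName, accountKey);
const serviceClient = new BlobServiceClient(
  `https://${accountName}.blob.core.windows.net`,
  credential
);

const app = express();

app.get("/download/:container/:blob", async (req, res) => {
  const user = await authService.getSession(req);
  if (!user) return res.redirect("/login");

  const { container, blob } = req.params;
  const blobClient = serviceClient.getContainerClient(container).getBlobClient(blob);

  // Ownership check against metadata written during upload (entity_id, user_id).
  const { metadata } = await blobClient.getProperties();
  if (metadata?.entity_id !== user.entityId) return res.sendStatus(403);

  // Short-lived, read-only SAS scoped to this one blob.
  const sas = generateBlobSASQueryParameters(
    {
      containerName: container,
      blobName: blob,
      permissions: BlobSASPermissions.parse("r"),
      expiresOn: new Date(Date.now() + 15 * 60 * 1000),
    },
    credential
  ).toString();

  res.redirect(`${blobClient.url}?${sas}`);
});
```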

Whitelist a specific Chrome extension to access local storage if third-party cookies are disabled

I have created a Chrome extension which uses local storage to store a few values used in the extension. If a user blocks third-party cookies in Chrome settings (chrome://settings/content/cookies), the extension does not have access to local storage.
Certain organizations do not allow third-party applications to access cookies (local storage).
The client's request is to whitelist only my extension and allow it to access local storage without enabling third-party cookies. Is this possible?
Thanks in advance :)
You can use the chrome.storage API instead of localStorage: localStorage is domain-specific, while chrome.storage is extension-specific and unaffected by third-party cookie settings.
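A minimal sketch (the theme key is just an illustration; the extension's manifest needs the "storage" permission):

```typescript
// Works from a background script, popup, or content script alike;
// chrome.storage is scoped to the extension, not to the visited site.
chrome.storage.local.set({ theme: "dark" }, () => {
  console.log("Value saved.");
});

chrome.storage.local.get(["theme"], (result) => {
  console.log("Stored theme:", result.theme);
});
```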

Shared Access Signatures in Azure for client blob access

Here's what I am trying to accomplish:
We have files stored in Azure blobs and need to secure access to them so that only our installed Windows 8 Store App can download these blobs. My first thought was to use some sort of certificate. When the app is installed, it is installed with a certificate that it then passes in the header of the request to the server to obtain the blob.
I read about Shared Access Signatures and it kind of makes sense to me. It seems like an API that the client could use to obtain a temporary token granting access to the blobs. Great. How do I restrict access to the API for obtaining SAS tokens to only our installed client apps?
Thank you.
Using SAS URLs is the proper way to do this; this way you can give out access to a specific resource for a limited amount of time (15 minutes, for example) and with limited permissions (read-only, for example).
Since this app is installed on the user's machine, you can assume the user can see whatever the app is doing, so there is no absolute way to ensure your API is accessed only by your app. But you can make it a little more difficult to replicate by using an SSL (HTTPS) endpoint and providing some "secret key" that only your app knows.
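As a rough sketch of that idea, the client could request a short-lived SAS URL over HTTPS with the shared secret in a header (the endpoint, header name, and APP_SECRET below are all hypothetical):

```typescript
const APP_SECRET = "replace-with-app-secret"; // baked into the installed app

// Ask our own API (hypothetical endpoint) for a short-lived SAS URL.
// The secret is only a speed bump: anything shipped with the app can
// ultimately be extracted by a determined user.
async function getBlobSasUrl(blobName: string): Promise<string> {
  const response = await fetch("https://api.example.com/sas", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "X-App-Secret": APP_SECRET,
    },
    body: JSON.stringify({ blob: blobName }),
  });
  if (!response.ok) throw new Error(`SAS request failed: ${response.status}`);
  const { url } = await response.json();
  return url; // read-only SAS URL, valid for e.g. 15 minutes
}
```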

ImageResizer with Private Blobs on Azure

I have a system providing access to private blobs based on a user's login credentials. If they have permission, they are given a SAS blob URL to view a document or image stored in Azure.
I want to be able to resize the images, but still maintain the integrity of the short window of access via the SAS.
What is the best approach with ImageResizer? Should I use the AzureReader2 plugin, or should I just use RemoteReader with the SAS URL?
Thanks
ImageResizer will disk-cache the resized result images indefinitely, regardless of restrictions on the source file.
You need to implement your authorization logic within the application using Authorize_Request or Config.Current.Pipeline.AuthorizeImage.
There's no way to pass through storage authorization unless you disable all caching.
