How to get a long-lived embeddable link from the SharePoint / OneDrive file picker?

In our web app, we let users upload files and preview them. We also let them upload from their cloud drives, such as Google Drive, Dropbox, and OneDrive. Most of our users are on OneDrive for Business.
To display the selected files, we save embeddable links from the cloud drives and embed them in an iframe; that way, permissions and authorization remain under the cloud drive's control.
I tried issuing POST /drives/{driveId}/items/{itemId}/preview after the user selects a file from the file picker, and I use the getUrl field from the response payload as the iframe's src to embed the file.
I noticed that the URL contains an access_token parameter. I deleted it before saving the URL, and the iframe still works fine and only allows users with permission to see the file. Is this a good way to use it?
The Graph API docs for driveItem: preview say the request generates a short-lived embeddable URL. I would like to know the expiration of that link (an hour, a day, a week, a month, etc.).
Is it short-lived only because of the access_token, so that authorized users in SharePoint/OneDrive will have access long-term?
Are there any pitfalls that I didn't notice?
I have to be sure this approach is robust; otherwise I will end up saving broken links, and users will have to pick all their files again.
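For concreteness, this is roughly the token-stripping step described above (a minimal sketch; getUrl is the field name from the documented preview response payload, and the example URL is a placeholder):

```javascript
// Strip the access_token query parameter from the embeddable URL
// returned by POST /drives/{driveId}/items/{itemId}/preview before
// persisting it. Uses the WHATWG URL class, global in Node.js.
function stripAccessToken(getUrl) {
  const url = new URL(getUrl);
  url.searchParams.delete('access_token');
  return url.toString();
}
```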

Related


Is it easy for people to find "public" google sheets/docs?
Context: Storing some semi-sensitive data (individual user info, of a non-sensitive nature) for an app beta test in Google Sheets. Planning to migrate to a DB in the future, but for now just using JavaScript to pull the data directly from the Google Sheets (since there are visualizations being dynamically updated by the sheets).
Yes, it's easy to get the information. Search engines may index and cache it, and then there are bots, crawlers, and scrapers. Do NOT put (semi-)sensitive information in public. Implement Google OAuth properly with the Google Sheets API to fetch the information; you can also use service accounts.
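As a minimal sketch of that API route, assuming you have already obtained an OAuth access token (for example via a service account); the spreadsheet id and range here are placeholders:

```javascript
// Build the Sheets v4 values endpoint for a spreadsheet id and A1 range.
function sheetValuesUrl(spreadsheetId, range) {
  return `https://sheets.googleapis.com/v4/spreadsheets/${spreadsheetId}` +
         `/values/${encodeURIComponent(range)}`;
}

// Fetch rows with a Bearer token instead of making the sheet public.
// Assumes Node.js 18+ (global fetch) and a token obtained elsewhere.
async function readSheet(accessToken, spreadsheetId, range) {
  const res = await fetch(sheetValuesUrl(spreadsheetId, range), {
    headers: { Authorization: `Bearer ${accessToken}` },
  });
  if (!res.ok) throw new Error(`Sheets API error: ${res.status}`);
  return (await res.json()).values; // array of row arrays
}
```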
Yes, it can be easily accessed.
According to the official Google article Share files from Google Drive: when you set your file's General Access setting to public:
Anyone can search on Google and get access to your file, without signing in to their Google account.
What you can do:
For your app beta-test data in Google Sheets, you may want to change your file's General Access setting to one of the following (in descending order of security):
Restricted - Only people that you manually give access to can view or edit your files. When you click the Share button, a prompt appears where you can manually add the users who may view or edit your files.
Afterwards, you may select a role for those users, and they can then be notified through email.
You can also share the link with others; if you send the URL through Google Chat, a sharing prompt appears.
You may opt to select Don't give access, in which case the other user sees a request-access page instead of the file.
This means that if unauthorized users get hold of the file URL, they will still need to send an access request. If other users submit the request, an email notification is sent to your inbox, and other owners of the file are notified by mail as well.
Your Organization - If you use a Google Account through work or school, anyone signed in to an account in your organization can open the file. If you are an administrator of a work or school workspace, you can set how members share content within the organization; the administrator can prevent content from being shared with group members outside your organization. If external sharing is prohibited, only group members inside your organization can access the group's shared content.
Anyone with the link - Anyone who has the link can use your file, without signing in to their Google Account. This option is the least recommended, because if the URL is leaked to unauthorized users, they can easily access the file.
References:
Share files from Google Drive
Share content with a group
Don’t make it public unless you want the public to see it. Use OAuth to access it.

Can I set my Node.js Backend to access my Google Drive to list/download/upload?

I want my back-end to use the Google Drive API to list/upload/download files from a normal Google Drive folder (as opposed to a cloud bucket); not the user's Google Drive, only MY drive.
As far as the end user is concerned, they would just be on my site, but when they upload a file, my back-end receives it and stores it on MY Google Drive. The same goes for listing the files in a folder: they click a button on my front-end, which calls my back-end; my back-end calls the Google API and returns the list, effectively making my back-end the middleman for my Google Drive, so that my users don't need a Google account to access my site.
My reasoning: I want my users to NOT need a Google account, but I will still need to share these folders with contractors. The contractors can have a Google account; that doesn't bother me. And I don't want to reinvent the wheel by building a separate front-end for my contractors to download these folders, when Google Drive already has a perfectly good, already-built UI that will zip a folder and download it.
So I want:
User -> front-end -> my backend -> google drive
I have seen posts on doing this for other services, like analytics and calendar, but I really need drive capabilities.
A user could be either my client, who needs to upload and download, OR my client's clients (who will only ever need to upload).
Main Question:
Can I set my Node.js back-end to access my Google Drive to list/download/upload?
If it is possible:
How, and should I? As I write this, I am already thinking of issues: will uploading a file from the front-end to the back-end and then to Google Drive be too cumbersome to be practical? (These are video files that could be around 300-400 MB.)
If not possible OR it is too cumbersome:
Can anyone suggest anything that would make access to Google Cloud bucket folders easier? A package? An example? A method? A tutorial?
Frontend: VueJS with axios
Backend: super basic node/express API/back-end on an AWS ec2 server
The short answer is yes.
The way to do this is with a service account, as these accounts are special credentials meant to be used by a service. They take the form service-account-name@project-id.iam.gserviceaccount.com.
Once you have the service account, you need to grant it access to the Google Drive location: share the files or folders with the service account, just as if you were sharing them with another user, by adding its email in the sharing permissions.
And finally, to manipulate the files there programmatically, you can use the Node.js client library (googleapis), which makes the task easier than crafting the API calls directly.
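A minimal sketch of the listing call, assuming you have already minted an access token for the service account (e.g. with google-auth-library); the folder id is a placeholder:

```javascript
// Build a Drive v3 files.list URL restricted to one shared folder.
function driveListUrl(folderId) {
  const params = new URLSearchParams({
    q: `'${folderId}' in parents and trashed = false`,
    fields: 'files(id,name,mimeType)',
  });
  return `https://www.googleapis.com/drive/v3/files?${params}`;
}

// List the folder's files with a Bearer token (Node.js 18+, global fetch).
async function listFolder(accessToken, folderId) {
  const res = await fetch(driveListUrl(folderId), {
    headers: { Authorization: `Bearer ${accessToken}` },
  });
  if (!res.ok) throw new Error(`Drive API error: ${res.status}`);
  return (await res.json()).files;
}
```

For the 300-400 MB video uploads, streaming the request body straight through the back-end (rather than buffering whole files in memory) is what keeps this middleman pattern practical.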

Is there a way to put authentication on a calendar subscription URL (ICS format) in Outlook?

I have created a URL for subscribing to calendar events, mainly in Outlook. Since it exposes private information, I want users to authenticate with a username and password when subscribing to this calendar URL. I don't want users to put passwords in the URL in order to authenticate.
Is there a way to achieve this, ideally so that a dialog box appears in Outlook where the user can enter their credentials, or some other way to authenticate? I'm using Node.js on the server side.
Thanks in advance!
I don't believe there is a consistent way to do this:
The RFC5545 specification is meant to "provide the definition of a common format for openly exchanging calendaring and scheduling information across the Internet".
I.e., the receiving application must be able to access the URL. It may work for some if the application user can access the URL while logged in, then fail at other times. This is what annoyed me intensely with a school application: one could log in, download an ICS file, and import it, BUT could not subscribe to it. So whenever there were updates (at a minimum each term), one had to log in and re-download and re-import.
Option:
You could have people log in and get their own unique obfuscated URL. This is how Google Calendar does it: a 'private' but public URL; anyone who is sent that URL can subscribe to it. Since even if it weren't public, the person who logs in could also download the file and send it around, there is only some minimal additional risk.
If at any stage people are no longer authorized to access the URL, then for their URL you issue a 410 Gone, serve an empty ICS file, or serve one with dummy events.
Calendar subscriptions are just HTTP resources, so you could protect yours with Basic Authentication, e.g. by using something like https://www.npmjs.com/package/basic-auth.
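For illustration, the header parsing itself needs nothing beyond Node's stdlib (the basic-auth package does roughly this):

```javascript
// Parse an HTTP Basic Authorization header value into { name, pass },
// or return null when the header is missing or malformed.
function parseBasicAuth(headerValue) {
  if (!headerValue || !headerValue.startsWith('Basic ')) return null;
  const decoded = Buffer.from(headerValue.slice(6), 'base64').toString('utf8');
  const i = decoded.indexOf(':');
  if (i < 0) return null;
  return { name: decoded.slice(0, i), pass: decoded.slice(i + 1) };
}
```

In an Express handler you would read req.headers.authorization, compare against stored credentials, and reply 401 with a WWW-Authenticate: Basic realm="..." header; whether a given calendar client then shows a credential prompt varies, as the other answer notes.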

Accessing Google Photos using only a sharing link

I am trying to access a Google Photos album that has been shared via a sharing link. A user can access such an album in the browser without authenticating, simply by following the sharing link.
However, all APIs in Google Photos API that I can find require the user's OAuth token, and I do not wish to require the users to log in (as they are not required to do so in a browser).
I tried scraping the HTML output returned by Google Photos in response to the link, but it is quite difficult (not to mention fragile).
Any suggestions?

Content protection for Azure blobs

I'm not sure if this is a good question, but I know Azure Media Services has content protection for video and audio, and I know Azure Rights Management exists for documents and email and seems to require a special client to view protected documents.
If I were to build a web application that lets users view sensitive documents, like CVs or financial histories, is there a way to let users view those documents (PDFs, Word documents, whatever; they'd be uploaded as Azure blobs) in an ordinary web browser like Chrome, but without being able to download them (most importantly), print them, copy portions of them, and so forth?
Any type of content protection would need to be built by you. Blobs simply contain data that you put there.
You can make a blob private so that only your app can get to it, unless you generate a temporary Shared Access Signature (or policy). However: If you provide a link via SAS, there is no stopping someone from downloading it (until the link expires).
If you want to do something like web-based browsing with content protection, you'd need to download the content from the blob to your web app first, and then serve that content from your web app with whatever protections you wish to implement, without ever giving the end user a direct link to the blob.
