Secure external links for Firebase Storage on NodeJS server-side - node.js

I'm having issues generating external links to files stored in my Firebase Storage bucket.
I've been using Google Cloud Storage for a while now and used this library (which is based on this answer) to generate external links for regular Storage buckets, but it doesn't seem to work on the Firebase-assigned bucket.
I can't generate any secure HTTPS links and keep getting the certificate validation error NET::ERR_CERT_COMMON_NAME_INVALID stating that my connection is not private. If I remove the 'S' from HTTPS, the link works.
NOTE: Using the same credentials and private key to generate links for other buckets in my project works just fine. It's only the Firebase bucket that refuses to accept my signing...

I recommend using the official GCloud client; then you can use getSignedUrl() to get a download URL for the file, like so:
// Setup (assumes the @google-cloud/storage and request packages are installed).
const { Storage } = require('@google-cloud/storage');
const request = require('request');

const storage = new Storage();
// The Firebase-assigned bucket, usually named <project-id>.appspot.com.
const bucket = storage.bucket('<project-id>.appspot.com');

bucket.file(filename).getSignedUrl({
  action: 'read',
  expires: '03-17-2025'
}, function(err, url) {
  if (err) {
    console.error(err);
    return;
  }
  // The file is now available to read from this URL.
  request(url, function(err, resp) {
    // resp.statusCode = 200
  });
});
Per Generate Download URL After Successful Upload, this seems to work with both Firebase and GCS buckets.

Related

How to render images(files) from S3 bucket blocked all public access in frontend( Private Write, Private read)

I have uploaded a file to an S3 bucket using aws-sdk as follows:
async function uploadFileToAws(file) {
  const fileName = `new_file_${new Date().getTime()}_${file.name}`;
  const mimetype = file.mimetype;
  const params = {
    Bucket: config.awsS3BucketName,
    Key: fileName,
    Body: file.data,
    ContentType: mimetype,
    // ACL: 'public-read'
  };
  const res = await new Promise((resolve, reject) => {
    s3.upload(params, (err, data) => (err == null ? resolve(data) : reject(err)));
  });
  return { secure_url: res.Location };
}
If we allow public read on the bucket there is no problem, but we have the requirement of blocking public read (public access) and only allowing bucket objects or images to be visible in our own products (mobile and web apps), with the help of an access ID and secret key or some other similar approach. Is this possible? Does AWS S3 provide such a service?
I have gone through the AWS S3 documentation, googled, and walked through multiple StackOverflow threads and some blogs, but no luck. I would really appreciate any suggestions, tips, or help.
You could consider two options.
The first one would be through CloudFront and signed URLs or cookies, as explained in
Serving Private Content with Signed URLs and Signed Cookies
Basically, in this approach you would set up a CloudFront distribution which would be used to serve your private images. Since the users are authenticated, your backend would need to verify whether they can access the given image, and if so, generate a signed URL for the file. The signed URL enables access to that file. Details of this procedure are described in How Signed URLs Work.
The second possibility would be through pre-signed S3 URLs. It is somewhat similar to the first one, except that it does not involve any extra service, such as CloudFront. Again, since users are authenticated, your back-end would verify their rights to view the given image and generate a pre-signed S3 URL to give them temporary access to the image.
In both cases, the bucket does not need to be public. Access to the images is controlled by your back-end.
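To make the first option concrete, here is a minimal sketch of signing a CloudFront URL on the back-end with the aws-sdk v2 CloudFront.Signer; the distribution domain, key pair ID, and private key path are placeholders:

const AWS = require('aws-sdk');
const fs = require('fs');

// Key pair ID and private key belong to the key pair configured for the
// CloudFront distribution (placeholders here).
const signer = new AWS.CloudFront.Signer(
  'KEY_PAIR_ID',
  fs.readFileSync('/path/to/cloudfront-private-key.pem', 'utf8')
);

// Call this only after your back-end has verified the authenticated user
// may access the requested image.
function getSignedImageUrl(imagePath) {
  return signer.getSignedUrl({
    url: `https://dxxxxxxxxxxxx.cloudfront.net/${imagePath}`, // placeholder distribution domain
    expires: Math.floor(Date.now() / 1000) + 300 // valid for 5 minutes (epoch seconds)
  });
}

For the second option the flow is the same, except the back-end calls s3.getSignedUrl('getObject', ...) against the bucket directly instead of going through CloudFront.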

Serve private files directly from azure blob storage

My web app allows users to upload files.
I want to use Azure Blob Storage in the cloud for this.
Since downloading will be much more frequent than uploading,
I would like to save server computing time and bandwidth and serve files directly from the Azure Blob server.
I believe this is made possible on Google Cloud with Firebase Storage,
where you can upload and download directly from the client. (I know authentication and authorization are also managed by Firebase, so it makes things easier.)
Does a similar mechanism/service exist on Azure?
For example:
when a user clicks an Azure Storage download link, a trigger would check the JWT for authorization and the data would be sent directly to the client from Azure Storage.
A similar option is available with Azure Blob Storage as well. You can use the Storage SDK to access the containers and list/download blobs.
With a JavaScript backend you can use a SAS token; the Azure Storage JavaScript Client Library also supports creating a BlobService based on the Storage Account Key for authentication, besides SAS tokens. However, for security reasons, the recommended approach is a limited-time SAS token generated by a backend web server using a Stored Access Policy.
Example here
EDIT:
I have not answered the question completely above. However, if you want to access blob storage or download any files from it, you can make a normal HTTP GET request with a SAS token generated from any JavaScript application.
With Angular:
uploadToBLob(files) {
  let formData: FormData = new FormData();
  formData.append("asset", files[0], files[0].name);
  this.http.post(this.baseUrl + 'insertfile', formData)
    .subscribe(result => console.log(result));
}

downloadFile(fileName: string) {
  return this.http.get(this.baseUrl + 'DownloadBlob/' + fileName, { responseType: "blob" })
    .subscribe((result: any) => {
      if (result) {
        var blob = new Blob([result]);
        let saveAs = require('file-saver');
        let file = fileName;
        saveAs(blob, file);
        this.fileDownloadInitiated = false;
      }
    }, err => this.errorMessage = err);
}
However, the best practice (considering security) is to have a backend API/Azure Function handle the file upload.
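For the SAS route, here is a minimal sketch of a Node.js backend issuing a limited-time, read-only SAS URL after validating the user's JWT; it assumes the @azure/storage-blob v12 package, and the account name, key, and container/blob names are placeholders:

const {
  StorageSharedKeyCredential,
  generateBlobSASQueryParameters,
  BlobSASPermissions
} = require('@azure/storage-blob');

// Placeholder credentials; load these from configuration/secrets in practice.
const credential = new StorageSharedKeyCredential('myaccount', 'MY_ACCOUNT_KEY');

// Returns a URL the client can GET directly from Azure Storage.
function getReadSasUrl(containerName, blobName) {
  const sas = generateBlobSASQueryParameters({
    containerName,
    blobName,
    permissions: BlobSASPermissions.parse('r'),     // read-only
    expiresOn: new Date(Date.now() + 5 * 60 * 1000) // valid for 5 minutes
  }, credential);
  return `https://myaccount.blob.core.windows.net/${containerName}/${blobName}?${sas.toString()}`;
}

The client then downloads the blob straight from Azure Storage with a plain HTTP GET against that URL, so the file bytes never pass through your own server.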

Google Cloud Storage access without providing credentials?

I'm using Google Cloud Storage and have a few buckets that contain objects which are not shared publicly (example in the screenshot below). Yet I was able to retrieve a file from a local server using NodeJS without supplying any service account keys or authentication tokens.
I can't access the files from the browser via these URL formats (which is good):
https://www.googleapis.com/storage/v1/b/mygcstestbucket/o/20180221164035-user-IMG_0737.JPG
https://storage.googleapis.com/mygcstestbucket/20180221164035-user-IMG_0737.JPG
However, when I tried retrieving the file from NodeJS without credentials, surprisingly it could download the file to disk. I checked process.env to make sure there was no GOOGLE_APPLICATION_CREDENTIALS variable or any pem keys, and even did a gcloud auth revoke --all on the command line just to make sure I was logged out, and I was still able to download the file. Does this mean that the files in my GCS bucket are not properly secured? Or am I somehow authenticating with the GCS API in a way I'm not aware of?
Any guidance or direction would be greatly appreciated!!
// Imports the Google Cloud client library
const Storage = require('@google-cloud/storage');

// Your Google Cloud Platform project ID
const projectId = [projectId];

// Creates a client
const storage = new Storage({
  projectId: projectId
});

// The name of the bucket
const bucketName = 'mygcstestbucket';
var userBucket = storage.bucket(bucketName);

app.get('/getFile', function(req, res) {
  let fileName = '20180221164035-user-IMG_0737.JPG';
  var file = userBucket.file(fileName);
  const options = {
    destination: `${__dirname}/temp/${fileName}`
  };
  file.download(options, function(err) {
    if (err) return console.log('could not download due to error: ', err);
    console.log('File completed');
    res.json("File download completed");
  });
});
Client libraries use Application Default Credentials to authenticate with Google APIs. So when you don't explicitly use a specific service account via GOOGLE_APPLICATION_CREDENTIALS, the library will use the default credentials. You can find more details in this documentation.
Based on your sample, I'd assume the Application Default Credentials were used for fetching those files.
Lastly, you could always run echo $GOOGLE_APPLICATION_CREDENTIALS (or the equivalent for your OS) to confirm whether the variable points at a service account key file.
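Conversely, if you want the client to use a specific service account rather than whatever the Application Default Credentials resolve to, you can pass the key file explicitly. A minimal sketch, with a placeholder project ID and key path:

// Imports the Google Cloud client library
const Storage = require('@google-cloud/storage');

// Explicitly point the client at a service account key instead of relying
// on Application Default Credentials.
const storage = new Storage({
  projectId: 'my-project-id',                  // placeholder
  keyFilename: '/path/to/service-account.json' // placeholder path
});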
Create a new service account in GCP for the project and download the JSON key file. Then set the environment variables like the following:
$env:GCLOUD_PROJECT="YOUR PROJECT ID"
$env:GOOGLE_APPLICATION_CREDENTIALS="YOUR_PATH_TO_JSON_ON_LOCAL"

Can I use Dropbox API v2 to implement cloud storage in my web app?

I was working on a web app where one of the features was that a user logs into our system and then uploads some files. I was wondering if there is a way I could use my own Dropbox account to store these uploaded files in an organised and secure manner. I would also like to retrieve these files later on, so I will need to store their links in a separate database at the moment the upload is done.
I am working in a node.js/JavaScript environment on Ubuntu, if that matters, and hosting the app on Heroku.
I think the deprecated Datastore API had a similar capability, but is there a way to implement this with API v2?
The Dropbox API does offer the ability to upload and download files, among other operations, so this should certainly be possible. You can find everything you need to get started with the Dropbox API, including documentation, tutorials, and SDKs here:
https://www.dropbox.com/developers
It's important to note though that the Dropbox API was designed with the intention that each user would link their own Dropbox account, in order to interact with their own files. However, it is technically possible to connect to just one account. The SDKs don't offer explicit support for it and we don't recommend doing so, for various technical and security reasons. Most of these concerns are allayed for server-side apps though, where the access tokens can be kept secret.
If you did want to go this route, instead of kicking off the authorization flow, you would manually use an existing access token for your account and app. (Just be careful not to revoke it, e.g. via https://www.dropbox.com/account/security .)
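As an illustration of that route, here is a minimal sketch using the official dropbox JavaScript SDK with a pre-generated access token; the token, paths, and folder layout are placeholders, and it assumes a recent SDK version where responses are wrapped in a result object:

const { Dropbox } = require('dropbox');
const fs = require('fs');

// Use an existing access token for your own account/app instead of running
// the OAuth authorization flow.
const dbx = new Dropbox({ accessToken: 'YOUR_APP_ACCESS_TOKEN' });

// Upload a user's file into your Dropbox, organised under /uploads/<userId>.
async function storeUpload(localPath, userId, fileName) {
  const contents = fs.readFileSync(localPath);
  const res = await dbx.filesUpload({
    path: `/uploads/${userId}/${fileName}`,
    contents
  });
  // Store this path in your database so the file can be retrieved later.
  return res.result.path_lower;
}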
You can quickly upload/download files using my tiny Dropbox v2 API wrapper (dropbox-v2-api):
Upload example:
// Setup (the access token is a placeholder):
const dropboxV2Api = require('dropbox-v2-api');
const fs = require('fs');
const dropbox = dropboxV2Api.authenticate({ token: 'YOUR_ACCESS_TOKEN' });

const dropboxUploadStream = dropbox({
  resource: 'files/upload',
  parameters: {
    path: '/dropbox/path/to/file.js'
  }
}, (err, result) => {
  // upload completed
});
fs.createReadStream('path/to/file.js').pipe(dropboxUploadStream);
Download example:
dropbox({
  resource: 'files/download',
  parameters: {
    path: '/dropbox/image.jpg'
  }
}, (err, result) => {
  // download completed
})
.pipe(fs.createWriteStream('./image.jpg'));
Both combined:
const downloadStream = dropbox({
  resource: 'files/download',
  parameters: { path: '/source/file/path' }
});

const uploadStream = dropbox({
  resource: 'files/upload',
  parameters: { path: '/target/file/path' }
}, (err, response) => {
  // upload finished
});

downloadStream.pipe(uploadStream);

encrypt object in aws s3 bucket

I am saving some images/objects in an AWS S3 bucket from my application. First I get a signed URL from a nodejs service API and upload images or files to the signed URL using jQuery AJAX. I can open the image or object using the link provided in the properties (https://s3.amazonaws.com/bucketname/objectname).
I want to provide security for each uploaded object. Even if, by chance, an anonymous user gets the link (https://s3.amazonaws.com/bucketname/objectname) somewhere, he should not be able to open it. The objects should be accessible and openable only in cases where, for example, the request has certain header key values. I tried server-side encryption by specifying header key values in the request as shown below.
var file = document.getElementById('fileupload').files[0];
$.ajax({
  url: signedurl,
  type: "PUT",
  data: file,
  headers: { 'x-amz-server-side-encryption': 'AES256' }, // jQuery option is "headers", not "header"
  contentType: file.type,
  processData: false,
  success: function (result) {
    var res = result;
  },
  error: function (error) {
    alert(error);
  }
});
Doesn't server-side encryption keep the object encrypted in S3 bucket storage? Does it only encrypt while transferring and decrypt before saving to S3 storage?
If it stores the encrypted object in S3 storage, then how can I open it using the link shown in the properties?
Server-Side Encryption (SSE) in Amazon S3 encrypts objects at rest (stored on disk) but decrypts objects when they are retrieved. Therefore, it is a transparent form of encryption.
If you wish to keep objects in Amazon S3 private, but make them available to specific authorized users, I would recommend using Pre-Signed URLs.
This works by having your application generate a URL that provides time-limited access to a specific object in Amazon S3. The objects are otherwise kept private so they are not accessible.
See documentation: Share an Object with Others
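As an illustration, a minimal sketch of generating such a pre-signed URL with the aws-sdk v2 Node.js client; the bucket and key names are placeholders:

const AWS = require('aws-sdk');
const s3 = new AWS.S3();

// Returns a time-limited URL for a private object. The object stays private
// (and SSE-encrypted at rest); this URL grants read access for 60 seconds only.
const url = s3.getSignedUrl('getObject', {
  Bucket: 'bucketname', // placeholder
  Key: 'objectname',    // placeholder
  Expires: 60           // seconds
});
console.log(url);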
