Download from Firebase Storage - node.js

I'm using Node.js to download some files (mp3) from Firebase Storage, which I will then send to the client.
I want to get the file as a blob and then send it to the client.
I read the docs, and Firebase provides the refFromURL method to get a reference from a download URL.
But when I run the script, it fails with an error on refFromURL:
const firebase = require('firebase-admin')
var serviceAccount = require('./api/admin.json')
firebase.initializeApp(optionFirebase)
var storageSongs = firebase.storage()
let linkSong = storageSongs.refFromURL('FILE_URL')
It's not an authentication problem because I use the same options for Realtime Database and it works well.

refFromURL is a method provided by the Firebase JavaScript web client SDK. It's not available in the Cloud Storage Node.js SDK for backends.
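One way to get at the file with the Admin SDK is through the underlying Cloud Storage bucket, referencing the object by its path inside the bucket rather than by a download URL. A rough sketch, where the bucket name and file path are placeholders:
const firebase = require('firebase-admin')
var serviceAccount = require('./api/admin.json')

firebase.initializeApp({
  credential: firebase.credential.cert(serviceAccount),
  storageBucket: 'APP-NAME.appspot.com' // placeholder: your bucket name
})

// Reference the object by its path inside the bucket, then download it
const bucket = firebase.storage().bucket()
const file = bucket.file('songs/my-song.mp3') // placeholder path

file.download()
  .then(([contents]) => {
    // contents is a Buffer that can be sent to the client
    console.log(`Downloaded ${contents.length} bytes`)
  })
  .catch(console.error)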

Related

How can I integrate a credential json file with GCP bigquery with nodejs?

I have a JSON credentials file. I want to integrate with GCP BigQuery and access GCP BigQuery using this credentials file with Node.js.
How can I do that?
How can I integrate with GCP BigQuery using a credentials file in Node.js?
How can I test whether the integration is valid?
You probably want the keyFilename attribute, unless I've misunderstood your question.
This GCP doc talks about authenticating using a service account key file.
So if your credentials file lived in /var/my_credentials.json (dumb path but whatever), your Node.js code would look something like this:
const {BigQuery} = require('@google-cloud/bigquery');

const options = {
  keyFilename: '/var/my_credentials.json',
  projectId: 'my_project',
};

const bigquery = new BigQuery(options);
Also consider: keep the contents of that credentials file in Google Secret Manager and use gcloud secrets versions access latest, dumping the output into a temporary json file local to the script, then remove the temporary json file after it's no longer needed by the script. No need to have credentials floating around on servers.
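As a rough sketch of that flow (the secret name bigquery-sa-key and the script name run_query.js are placeholders):
# pull the key out of Secret Manager into a temporary file
gcloud secrets versions access latest --secret=bigquery-sa-key > /tmp/my_credentials.json
# run the script that points keyFilename at /tmp/my_credentials.json
node run_query.js
# clean up the key once the script is done
rm /tmp/my_credentials.json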

How to authenticate with tokens in Nodejs to a private bucket in Cloud Storage

Usually in Python, I get the application default credentials, get the access token, then refresh it to be able to authenticate against a private environment.
Code in Python:
# getting the credentials and project details for gcp project
credentials, your_project_id = google.auth.default(scopes=["https://www.googleapis.com/auth/cloud-platform"])

# getting request object
auth_req = google.auth.transport.requests.Request()

print(f"Checking Authentication : {credentials.valid}")
print('Refreshing token ....')
credentials.refresh(auth_req)

# check for valid credentials
print(f"Checking Authentication : {credentials.valid}")

access_token = credentials.token
credentials = google.oauth2.credentials.Credentials(access_token)

storage_client = storage.Client(project='itg-ri-consumerloop-gbl-ww-dv', credentials=credentials)
I am entirely new to Node.js, and I am trying to do the same thing.
My goal later is to create an App Engine application that exposes an image found in a private bucket, so credentials are a must.
How is this done?
For authentication, you could rely on the default application credentials that are present within the GCP platform (GAE, Cloud Functions, VM, etc.). Then you could just run the following piece of code from the documentation:
const {Storage} = require('@google-cloud/storage');
const storage = new Storage();
const bucket = storage.bucket('albums');
const file = bucket.file('my-existing-file.png');
In most circumstances, there is no need to explicitly use authentication packages, since authentication is already handled under the hood by the google-cloud/storage package in Node.js. The same holds for the google-cloud-storage package in Python. It can help to look at the source code of both packages on GitHub; for me, this really helped to understand the authentication mechanism.
When I develop code on my own laptop, that interacts with google cloud storage, I first tell the gcloud SDK what my credentials are and on which GCP project I am working. I use the following commands for this:
gcloud config set project [PROJECT_ID]
gcloud auth application-default login
You could also set GOOGLE_APPLICATION_CREDENTIALS as an environment variable that points to a credentials file. Then, within your code, you can pass the project name when initializing the client. This can be helpful if you are running your code outside of GCP, on another server for example.
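If you do run outside GCP with a key file, a minimal sketch (the project id and key path below are placeholders):
const {Storage} = require('@google-cloud/storage');

// Explicit credentials instead of relying on the environment
const storage = new Storage({
  projectId: 'my-project-id',              // placeholder
  keyFilename: '/path/to/service-key.json' // placeholder
});

const bucket = storage.bucket('albums');
const file = bucket.file('my-existing-file.png');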

Firebase Storage Admin SDK: upload from image url

In the web SDK of Firebase Storage you can upload an image from Blob data. I have a Node.js app and want to upload images from a URL to my storage bucket. In the docs it's recommended to use the Admin SDK when running in a Node server environment. But I cannot find this feature in the Firebase Storage Admin documentation.
Here's my code:
const admin = require('firebase-admin');
const serviceAccount = require(`./credentials/my-keyfile.json`);

const app = admin.initializeApp({
  credential: admin.credential.cert(serviceAccount)
});

const storage = app.storage();

// Get Blob data from an external ImageUrl
const axios = require("axios");

const getBlobFromUrl = async (url) => {
  const response = await axios.get(url, { responseType: "blob" });
  return response.data;
}

const blobData = getBlobFromUrl("https://images.unsplash.com/photo-1591476315349-faa1c8e6f567?ixlib=rb-1.2.1&q=80&fm=jpg&crop=entropy&cs=tinysrgb&w=1080&fit=max")
// MY QUESTION -> how can i upload that blob data to my storage?
I was recently trying to do something similar, writing an in-memory file from a Node server (with the Firebase Admin SDK) to Firebase Storage.
I solved it by realizing that I needed to use Google Cloud Storage's Node API to do so (mentioned in the docs). This is the relevant doc page to do streaming transfers with Google Cloud Storage: https://cloud.google.com/storage/docs/streaming#storage-stream-upload-object-nodejs.
Here are the basic steps:
Set the appropriate storageBucket address (e.g. gs://APP-NAME.appspot.com/), which you can get from your Firebase Storage console. Also, get the appropriate Firebase storage and bucket JS objects. Follow the example at https://firebase.google.com/docs/storage/admin/start
get the stream of byte data from your blob/file: https://developer.mozilla.org/en-US/docs/Web/API/Blob/stream
use Google Cloud Storage's API to pipe/write that stream to a specified location in your bucket: https://cloud.google.com/storage/docs/streaming#storage-stream-upload-object-nodejs
So, following the docs above, you'd do something like:
// Create a reference to a file object.
// myBucket is the bucket obtained in step 1 above
const file = myBucket.file("directoryName/imageName.png");

// step 2: get ReadableStream from Blob
const stream = blobData.stream();

// step 3: asynchronously pipe/write to file in cloud
async function streamFileUpload() {
  stream.pipe(file.createWriteStream()).on('finish', () => {
    // The file upload is complete
  });
}

streamFileUpload().catch(console.error);
After this you'll see your file uploaded to Firebase Storage under directoryName/imageName.png.
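Side note: if the source is a URL fetched with axios on the server (as in the question), one way to avoid the Blob step entirely is to request a stream and pipe it straight into the bucket. A rough sketch, reusing myBucket from above with a placeholder destination path:
const axios = require("axios");

async function uploadFromUrl(url) {
  // Ask axios for a Node stream instead of a browser Blob
  const response = await axios.get(url, { responseType: "stream" });
  const file = myBucket.file("directoryName/imageName.png"); // placeholder destination

  return new Promise((resolve, reject) => {
    response.data
      .pipe(file.createWriteStream({ contentType: response.headers["content-type"] }))
      .on("finish", resolve)
      .on("error", reject);
  });
}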
The first link you have posted on uploading an image with Blob data contains information on how you would do it with the client SDK.
The Admin SDK, on the other hand, is meant for backend use cases, for example Cloud Functions; what you have shown in your post is for that use case.
For implementations on how to upload an image using the client sdk, check out some of the Quickstart guides as well as the code labs.
If you're working primarily with Blob, you can check out some libraries like busboy and send HTTP requests instead.

Google Cloud Storage access without providing credentials?

I'm using Google Cloud Storage and have a few buckets that contain objects which are not shared publicly. Yet I was able to retrieve a file without supplying any service account keys or authentication tokens, from a local server using Node.js.
I can't access the files from the browser via these URL formats (which is good):
https://www.googleapis.com/storage/v1/b/mygcstestbucket/o/20180221164035-user-IMG_0737.JPG
https://storage.googleapis.com/mygcstestbucket/20180221164035-user-IMG_0737.JPG
However, when I tried retrieving the file from Node.js without credentials, surprisingly it could download the file to disk. I checked process.env to make sure there was no GOOGLE_APPLICATION_CREDENTIALS variable or any pem keys, and even ran gcloud auth revoke --all on the command line just to make sure I was logged out, and still I was able to download the file. Does this mean that the files in my GCS bucket are not properly secured? Or am I somehow authenticating with the GCS API in a way I'm not aware of?
Any guidance or direction would be greatly appreciated!!
// Imports the Google Cloud client library
const Storage = require('@google-cloud/storage');

// Your Google Cloud Platform project ID
const projectId = [projectId];

// Creates a client
const storage = new Storage({
  projectId: projectId
});

// The name for the new bucket
const bucketName = 'mygcstestbucket';
var userBucket = storage.bucket(bucketName);

app.get('/getFile', function(req, res){
  let fileName = '20180221164035-user-IMG_0737.JPG';
  var file = userBucket.file(fileName);
  const options = {
    destination: `${__dirname}/temp/${fileName}`
  };
  file.download(options, function(err){
    if(err) return console.log('could not download due to error: ', err);
    console.log('File completed');
    res.json("File download completed");
  });
});
Client libraries use Application Default Credentials to authenticate Google APIs. So when you don't explicitly point GOOGLE_APPLICATION_CREDENTIALS at a specific service account, the library will fall back on the Default Credentials. You can find more details in this documentation.
Based on your sample, I'd assume the Application Default Credentials were used for fetching those files.
Lastly, you could always run echo $GOOGLE_APPLICATION_CREDENTIALS (or the equivalent for your OS) to confirm whether a service account key path is set in that variable.
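If you want to double-check what Application Default Credentials are active on your machine, these gcloud commands (assuming the gcloud SDK is installed) can help:
# prints an access token if ADC are configured, errors otherwise
gcloud auth application-default print-access-token
# shows which accounts gcloud itself knows about
gcloud auth list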
Create a new Service Account in GCP for the project and download the JSON key file. Then set the environment variables like the following (PowerShell):
$env:GCLOUD_PROJECT="YOUR PROJECT ID"
$env:GOOGLE_APPLICATION_CREDENTIALS="YOUR_PATH_TO_JSON_ON_LOCAL"
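On a Linux/macOS shell, the equivalent would be:
export GCLOUD_PROJECT="YOUR PROJECT ID"
export GOOGLE_APPLICATION_CREDENTIALS="YOUR_PATH_TO_JSON_ON_LOCAL"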

Upload file to Azure Blob Storage directly from browser?

Is it possible to create an HTML form to allow web users to upload files directly to Azure Blob Storage without using another server as an intermediary? S3 and GAE Blobstore both allow this, but I can't find any support for Azure Blob Storage.
EDIT November 2019
You can now refer to the official documentation:
Azure Storage JavaScript Client Library Sample for Blob Operations
Azure Storage client library for JavaScript
Initial answer
There is a New Azure Storage JavaScript client library for browsers (Preview).
(Everything from this post comes from the original article above)
The JavaScript Client Library for Azure Storage enables many web development scenarios using storage services like Blob, Table, Queue, and File, and is compatible with modern browsers.
The new JavaScript Client Library for Browsers supports all the storage features available in the latest REST API version 2016-05-31, since it is built with Browserify using the Azure Storage Client Library for Node.js.
We highly recommend use of SAS tokens to authenticate with Azure Storage since the JavaScript Client Library will expose the authentication token to the user in the browser. A SAS token with limited scope and time is highly recommended. In an ideal web application it is expected that the backend application will authenticate users when they log on, and will then provide a SAS token to the client for authorizing access to the Storage account. This removes the need to authenticate using an account key. Check out the Azure Function sample in our Github repository that generates a SAS token upon an HTTP POST request.
Code sample:
Insert the following script tags in your HTML code. Make sure the JavaScript files are located in the same folder.
<script src="azure-storage.common.js"></script>
<script src="azure-storage.blob.js"></script>
Let’s now add a few items to the page to initiate the transfer. Add the following tags inside the BODY tag. Notice that the button calls the uploadBlobFromText method when clicked. We will define this method in the next step.
<input type="text" id="text" name="text" value="Hello World!" />
<button id="upload-button" onclick="uploadBlobFromText()">Upload</button>
So far, we have included the client library and added the HTML code to show the user a text input and a button to initiate the transfer. When the user clicks on the upload button, uploadBlobFromText will be called. Let’s define that now:
<script>
function uploadBlobFromText() {
  // your account and SAS information
  var sasKey = "....";
  var blobUri = "http://<accountname>.blob.core.windows.net";
  var blobService = AzureStorage.createBlobServiceWithSas(blobUri, sasKey).withFilter(new AzureStorage.ExponentialRetryPolicyFilter());
  var text = document.getElementById('text');
  var btn = document.getElementById("upload-button");
  blobService.createBlockBlobFromText('mycontainer', 'myblob', text.value, function(error, result, response){
    if (error) {
      alert('Upload failed, open browser console for more detailed info.');
      console.log(error);
    } else {
      alert('Uploaded successfully!');
    }
  });
}
</script>
Do take a look at these blog posts for uploading files directly from browser to blob storage:
http://coderead.wordpress.com/2012/11/21/uploading-files-directly-to-blob-storage-from-the-browser/
http://gauravmantri.com/2013/02/16/uploading-large-files-in-windows-azure-blob-storage-using-shared-access-signature-html-and-javascript
The 2nd post (written by me) makes use of HTML 5 File API and thus would not work in all browsers.
The basic idea is to create a Shared Access Signature (SAS) for a blob container. The SAS should have Write permission. Since Windows Azure Blob Storage does not support CORS yet (which is supported by both Amazon S3 and Google), you would need to host the HTML page in the blob storage where you want your users to upload the file. Then you can use jQuery's Ajax functionality.
Now that Windows Azure storage services support CORS, you can do this. You can see the announcement here: Windows Azure Storage Release - Introducing CORS, JSON, Minute Metrics, and More.
I have a simple example that illustrates this scenario here: http://www.contentmaster.com/azure/windows-azure-storage-cors/
The example shows how to upload and download directly from a private blob using jQuery.ajax. This example still requires a server component to generate the shared access signature: this avoids the need to expose the storage account key in the client code.
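For reference, a minimal sketch of such a server component using the current @azure/storage-blob package (the account name, account key, and container below are placeholders; the older answers predate this library):
const {
  StorageSharedKeyCredential,
  generateBlobSASQueryParameters,
  ContainerSASPermissions
} = require("@azure/storage-blob");

// Placeholders: your storage account name and account key
const credential = new StorageSharedKeyCredential("myaccount", "ACCOUNT_KEY");

function createUploadSas(containerName) {
  // Container-level SAS with create + write permission, valid for 15 minutes
  const sas = generateBlobSASQueryParameters({
    containerName,
    permissions: ContainerSASPermissions.parse("cw"),
    expiresOn: new Date(Date.now() + 15 * 60 * 1000)
  }, credential);
  return sas.toString(); // append as ?<sas> to the container URL on the client
}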
You can use the HTML5 File API, AJAX and MVC 3 to build a robust file upload control that uploads huge files securely and reliably to Windows Azure blob storage, with support for monitoring operation progress and cancelling the operation. The solution works as below:
Client-side JavaScript that accepts and processes a file uploaded by user.
Server-side code that processes file chunks sent by JavaScript.
Client-side UI that invokes JavaScript.
Get the sample code here: Reliable Uploads to Windows Azure Blob Storage via an HTML5 Control
I have written a blog post with an example on how to do this; the code is on GitHub.
It is based on Gaurav Mantri's post and works by hosting the JavaScript on the Blob Storage itself.
Configure a proper CORS rule on your storage account.
Generate a Shared Access Signature from your target container.
Install the blob storage SDK: npm install @azure/storage-blob.
Assuming your file is Blob/Buffer/BufferArray, you can do something like this in your code:
import { ContainerClient } from "@azure/storage-blob";

const account = "your storage account name";
const container = "your container name";
const sas = "your shared access signature";

const containerClient = new ContainerClient(
  `https://${account}.blob.core.windows.net/${container}${sas}`
);

async function upload(fileName, file) {
  const blockBlobClient = containerClient.getBlockBlobClient(fileName);
  const result = await blockBlobClient.uploadData(file);
  console.log("uploaded", result);
}
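As a usage sketch in the browser (assuming a hypothetical <input type="file" id="file-input"> element on the page):
// Hypothetical wiring: upload whatever file the user picks
const input = document.getElementById("file-input");
input.addEventListener("change", () => {
  const file = input.files[0];
  if (file) {
    upload(file.name, file).catch(console.error);
  }
});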
