Uploading a file to Firebase Storage using firebase-admin in Node

I need to upload a file from a Node script to my Firebase storage using firebase-admin.
This is how I'm doing it in the browser: (IT WORKS)
// THIS IS INSIDE A FUNCTION
const storageRef = firebase.storage().ref('my-images/' + fileName);
const uploadTask = storageRef.put(file,metadata);
This is how I'm TRYING to do it in the Node script:
const admin = require('firebase-admin');
admin.initializeApp();
// THIS IS INSIDE A FUNCTION
const storageRef = admin.storage().ref('my-images/' + fileName);
const uploadTask = storageRef.put(file,metadata);
And I'm getting the error:
TypeError: admin.storage(...).ref is not a function
I also tried:
admin.initializeApp({
credential: admin.credential.cert(serviceAccount),
storageBucket: "MY_BUCKET.appspot.com"
});
// THIS IS INSIDE A FUNCTION
const storageRef = admin.storage().bucket().ref('my-images/' + fileName);
const uploadTask = storageRef.put(file,metadata);
And I'm getting the error:
TypeError: admin.storage(...).bucket(...).ref is not a function
QUESTION
I need to save the file in my default bucket inside the my-images folder. What is the proper way of doing it?

With the Google Cloud Storage Node.js Client you need to use a different approach compared to the JavaScript SDK.
More precisely, you need to use the upload() method of a Bucket, passing the path of a local file and the desired object name in the bucket via the destination option, as follows:
// localFilePath is a placeholder for the path of the file on disk;
// "destination" places the object under my-images/ in the bucket
bucket.upload(localFilePath, { destination: 'my-images/' + fileName }).then(function(data) {
  const file = data[0];
});
You will find more examples in the documentation (link above).
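For completeness, here is a minimal sketch of the whole flow with firebase-admin; the service account path, bucket name, local file path, and content type are placeholders/assumptions:
const admin = require('firebase-admin');
const serviceAccount = require('./serviceAccountKey.json'); // placeholder path

admin.initializeApp({
  credential: admin.credential.cert(serviceAccount),
  storageBucket: 'MY_BUCKET.appspot.com', // your default bucket
});

// admin.storage().bucket() returns a Bucket from the Cloud Storage client
const bucket = admin.storage().bucket();

async function uploadImage(localFilePath, fileName) {
  // the first argument is a path on the local filesystem;
  // "destination" is the object name inside the bucket
  const [file] = await bucket.upload(localFilePath, {
    destination: 'my-images/' + fileName,
    metadata: { contentType: 'image/png' }, // adjust to your file type
  });
  return file;
}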

Related

Writing file in /tmp in a Firebase Function does not work

I am writing a Firebase function that exposes an API endpoint using express. When the endpoint is called, it needs to download an image from an external API and use that image to make a second API call. The second API call needs the image to be passed as a readableStream. Specifically, I am calling the pinFileToIPFS endpoint of the Pinata API.
My Firebase function is using axios to download the image and fs to write the image to /tmp. Then I am using fs to read the image, convert it to a readableStream and send it to Pinata.
A stripped-down version of my code looks like this:
const functions = require("firebase-functions");
const express = require("express");
const cors = require("cors");
const axios = require("axios");
const fs = require('fs-extra');
require("dotenv").config();
const key = process.env.REACT_APP_PINATA_KEY;
const secret = process.env.REACT_APP_PINATA_SECRET;
const pinataSDK = require('@pinata/sdk');
const pinata = pinataSDK(key, secret);

const app = express();

const downloadFile = async (fileUrl, downloadFilePath) => {
  try {
    const response = await axios({
      method: 'GET',
      url: fileUrl,
      responseType: 'stream',
    });

    // pipe the result stream into a file on disc
    response.data.pipe(fs.createWriteStream(downloadFilePath, {flags: 'w'}));

    // return a promise and resolve when download finishes
    return new Promise((resolve, reject) => {
      response.data.on('end', () => {
        resolve();
      });
      response.data.on('error', () => {
        reject();
      });
    });
  } catch (err) {
    console.log('Failed to download image');
    console.log(err);
    throw new Error(err);
  }
};

app.post('/pinata/pinFileToIPFS', cors(), async (req, res) => {
  const id = req.query.id;
  var url = '<URL of API endpoint to download the image>';

  await fs.ensureDir('/tmp');
  if (fs.existsSync('/tmp')) {
    console.log('Folder: /tmp exists!');
  } else {
    console.log('Folder: /tmp does not exist!');
  }

  var filename = '/tmp/image-' + id + '.png';
  downloadFile(url, filename);

  if (fs.existsSync(filename)) {
    console.log('File: ' + filename + ' exists!');
  } else {
    console.log('File: ' + filename + ' does not exist!');
  }

  var image = fs.createReadStream(filename);
  const options = {
    pinataOptions: {cidVersion: 1}
  };

  pinata.pinFileToIPFS(image, options).then((result) => {
    console.log(JSON.stringify(result));
    res.header("Access-Control-Allow-Origin", "*");
    res.header("Access-Control-Allow-Headers", "Authorization, Origin, X-Requested-With, Accept");
    res.status(200).json(JSON.stringify(result));
    res.send();
  }).catch((err) => {
    console.log('Failed to pin file');
    console.log(err);
    res.status(500).json(JSON.stringify(err));
    res.send();
  });
});

exports.api = functions.https.onRequest(app);
Interestingly, my debug messages tell me that the /tmp folder exists, but the downloaded file does not exist in the file system.
[Error: ENOENT: no such file or directory, open '/tmp/image-314502.png']. Note that the image can be accessed correctly when I manually access the URL of the image.
I've tried to download and save the file in many ways, but none of them work. Also, based on what I've read, Firebase Functions allow writing and reading temporary files in /tmp.
Any advice will be appreciated. Note that I am very new to NodeJS and to Firebase, so please excuse my basic code.
Many thanks!
I was not able to see where you are initializing the working directory, as suggested in this post:
// (inside a Storage onFinalize handler; `join` comes from 'path',
// `tmpdir` from 'os', and `fs` is fs-extra)
const bucket = gcs.bucket(object.bucket);
const filePath = object.name;
const fileName = filePath.split('/').pop();
const thumbFileName = 'thumb_' + fileName;
const workingDir = join(tmpdir(), `${object.name.split('/')[0]}/`); // new
const tmpFilePath = join(workingDir, fileName);
const tmpThumbPath = join(workingDir, thumbFileName);

await fs.ensureDir(workingDir);
Also, please consider that if you are using two functions, the /tmp directory would not be shared, as each one has its own. Here is an explanation from Doug Stevenson. In the same answer, there is a very well-explained video about local and global scopes and how to use the tmp directory:
Cloud Functions only allows one function to run at a time in a particular server instance. Functions running in parallel run on different server instances, which have different /tmp spaces. Each function invocation runs in complete isolation from each other. You should always clean up files you write in /tmp so that they don't accumulate and cause a server instance to run out of memory over time.
I would suggest using Google Cloud Storage extended with Cloud Functions to achieve your goal.
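For illustration, here is a minimal sketch of how the handler from the question might build its temporary path from os.tmpdir(), wait for the download to finish before reading the file back, and clean up afterwards; the helper and variable names come from the question, and the exact flow is an assumption:
const os = require('os');
const path = require('path');

app.post('/pinata/pinFileToIPFS', cors(), async (req, res) => {
  const id = req.query.id;
  const url = '<URL of API endpoint to download the image>';

  // build the temp path from os.tmpdir() instead of hard-coding '/tmp'
  const filename = path.join(os.tmpdir(), `image-${id}.png`);

  // wait for the promise returned by downloadFile before reading the file back
  await downloadFile(url, filename);

  const image = fs.createReadStream(filename);
  const result = await pinata.pinFileToIPFS(image, { pinataOptions: { cidVersion: 1 } });

  // clean up the temp file so it does not consume the instance's memory
  await fs.remove(filename);

  res.status(200).json(result);
});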

Specifying projectId when creating GCP Storage bucket

I am trying to create a GCP Storage Bucket using the Node.js library.
I've been using the steps here:
https://cloud.google.com/storage/docs/creating-buckets#storage-create-bucket-nodejs
And code pasted below.
The challenge is that my bucket keeps being created in the wrong project. My project is set in my gcloud CLI, it's set in my Node environment, and it's set in my script.
Is there some way to set the project in the values you pass to the library's createBucket function?
/**
 * TODO(developer): Uncomment the following line before running the sample.
 */
// const bucketName = 'Name of a bucket, e.g. my-bucket';
// const storageClass = 'Name of a storage class, e.g. coldline';
// const location = 'Name of a location, e.g. ASIA';

// Imports the Google Cloud client library
const {Storage} = require('@google-cloud/storage');

// Creates a client
const storage = new Storage();

async function createBucketWithStorageClassAndLocation() {
  // For default values see: https://cloud.google.com/storage/docs/locations and
  // https://cloud.google.com/storage/docs/storage-classes
  const [bucket] = await storage.createBucket(bucketName, {
    location,
    [storageClass]: true,
  });

  console.log(
    `${bucket.name} created with ${storageClass} class in ${location}.`
  );
}

createBucketWithStorageClassAndLocation();
You can specify projectId when you initialize the Storage class:
const storage = new Storage({
  projectId: 'my-project-id',
  keyFilename: '/path/to/keyfile.json'
});
Source
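Putting the two snippets together, a minimal sketch; the project ID, key file path, bucket name, location, and storage class are placeholders:
const {Storage} = require('@google-cloud/storage');

// the client's projectId determines which project the bucket is created in
const storage = new Storage({
  projectId: 'my-project-id',            // placeholder
  keyFilename: '/path/to/keyfile.json',  // placeholder
});

async function createBucket() {
  const [bucket] = await storage.createBucket('my-bucket', {
    location: 'ASIA',
    coldline: true,
  });
  console.log(`${bucket.name} created in project my-project-id.`);
}

createBucket().catch(console.error);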

Firebase function always timeout on large files?

I have created a Firebase function that is triggered when a video is uploaded to Firebase Storage and, using ffmpeg, adds a watermark to it. It works fine with small videos, but it always times out on large ones. Any idea how I can overcome these limits?
const functions = require('firebase-functions');
const { Storage, Bucket } = require('@google-cloud/storage');
const projectId = 'video-sharing-a57fa';
const admin = require('firebase-admin');
admin.initializeApp();

let gcs = new Storage({
  projectId
});

const os = require('os');
const path = require('path');
const spawn = require('child-process-promise').spawn;

exports.addLogo = functions.runWith({ memory: '4GB', timeoutSeconds: 540 }).storage.object().onFinalize(async event => {
  const bucket = event.bucket;
  const contentType = event.contentType;
  const filePath = event.name;
  console.log('File change detected, function execution started');

  if (path.basename(filePath).startsWith('resized-')) {
    console.log('We already renamed that file!');
    return;
  }

  const destBucket = gcs.bucket(bucket);
  const tmpFilePath = path.join(os.tmpdir(), path.basename(filePath));
  const metadata = { contentType: contentType };
  const tmpLogoPath = path.join(os.tmpdir(), 'watermark.png');

  await destBucket.file('watermark.png').download({
    destination: tmpLogoPath
  });

  const newPath = path.join(os.tmpdir(), 'output.mp4');

  return destBucket.file(filePath).download({
    destination: tmpFilePath
  }).then(() => {
    console.log('entered spawn');
    var str = "overlay=10:10";
    return spawn('ffmpeg', ['-i', tmpFilePath, '-i', tmpLogoPath, '-filter_complex', str, newPath]);
  }).then(() => {
    console.log('changing the name');
    return destBucket.upload(newPath, {
      destination: path.dirname(filePath) + '/resized-' + path.basename(filePath),
      metadata: metadata
    });
  });
});
Cloud Functions have a limited execution time of 9 minutes max; more information here. Most likely the problem is that ffmpeg does not manage to add the watermark in time. Your actions should be:
Check the log of the function to confirm that this is exactly the error: firebase functions:log --only <FUNCTION_NAME>
Consider a different architecture option for processing really large files:
a. Limit the amount of data ffmpeg processes, e.g. with -ss 50 -t 10. In this scenario, there will be the following architecture: a) one function that reads files and puts them into a queue, b) one function that reads the size of the file and puts chunk descriptors into another queue, e.g. {name: "file1.mp4", start: 10, duration: 15} (see the sketch after this list)
b. Use an on-demand container such as Cloud Run
c. Use App Engine in case you are constantly processing some files
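As an illustration of option (a), here is a minimal sketch of the ffmpeg invocation for one chunk; the chunk object follows the example above, and the function and parameter names are assumptions:
const spawn = require('child-process-promise').spawn;

// chunk = { name: 'file1.mp4', start: 10, duration: 15 }, taken from the queue
function watermarkChunk(chunk, tmpFilePath, tmpLogoPath, outPath) {
  return spawn('ffmpeg', [
    '-ss', String(chunk.start),    // seek to the chunk's start offset
    '-t', String(chunk.duration),  // only process this many seconds
    '-i', tmpFilePath,
    '-i', tmpLogoPath,
    '-filter_complex', 'overlay=10:10',
    outPath,
  ]);
}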

Google Cloud Function - read the CONTENT of a new file created in a bucket using NodeJS

In GCP (not Firebase) I have a bucket and a function which is called when a new file is created in the bucket. Works great.
/**
 * Triggered from a change to a Cloud Storage bucket.
 */
exports.mycompany_upload_file = (event, context) => {
  const gcsEvent = event;
  const filename = gcsEvent.name;
  const bucketName = event.bucket;

  console.log(`=====================================================`);
  console.log(`Event Type: ${context.eventType}`);
  console.log(`Bucket: ${bucketName}`);
  console.log(`File: ${filename}`);
  console.log(`=====================================================`);

  // now I would like to open that file and read it line-by-line
};
How do I address that file? What's its path? Can I use standard Node libraries like 'fs'?
I found this post that might help you; basically, the answer is as follows:
Note that it will store the result in a ramdisk, so you'll need enough RAM available to your function to download the file.
const {Storage} = require('@google-cloud/storage');
const gcs = new Storage({projectId: "<your_project>"});
const bucket = gcs.bucket("<your_bucket>");
const file = bucket.file("<path/to/your_file>");

exports.gcstest = (event, callback) => {
  file.download({destination: "/tmp/test"}, function(err, file) {
    if (err) {
      console.log(err);
    } else {
      callback();
    }
  });
};
For more information you can check the documentation for Google Cloud Storage: Node.js Client
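To then read the downloaded copy line by line, here is a minimal sketch using Node's built-in readline module; the /tmp/test path follows the snippet above:
const fs = require('fs');
const readline = require('readline');

async function readLines(localPath) {
  const rl = readline.createInterface({
    input: fs.createReadStream(localPath),
    crlfDelay: Infinity, // treat \r\n as a single line break
  });

  for await (const line of rl) {
    console.log(`Line: ${line}`);
  }
}

// after file.download({destination: "/tmp/test"}) has completed:
readLines('/tmp/test').catch(console.error);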

Firebase function: Error: unable to open for write

So I was trying to implement a Firebase function. I went to the Firebase functions examples repository and copied it. Everything is working properly ("Deploy complete!") with no signs of an error. However, when I'm trying to upload an image to Firebase Storage, the Firebase function can't open it?
This is the code that I used:
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();

const {Storage} = require("@google-cloud/storage");
const gcs = new Storage();

const path = require('path');
const os = require('os');
const fs = require('fs');
const sharp = require("sharp");

exports.generateThumbnail = functions.storage.object().onFinalize(async (object) => {
  const fileBucket = object.bucket; // The Storage bucket that contains the file.
  const filePath = object.name; // File path in the bucket.
  const contentType = object.contentType; // File content type.
  const metageneration = object.metageneration; // Number of times metadata has been generated. New objects have a value of 1.

  if (!contentType.startsWith('image/')) {
    return console.log('This is not an image.');
  }

  const fileName = path.basename(filePath);
  if (fileName.startsWith('thumb_')) {
    return console.log('Already a Thumbnail.');
  }

  const bucket = admin.storage().bucket(fileBucket);
  const tempFilePath = path.join(os.tmpdir(), fileName);
  console.log('Created temporary path', tempFilePath);

  const metadata = {
    contentType: contentType,
  };

  await bucket.file(filePath).download({destination: tempFilePath});
  console.log('Image downloaded locally to', tempFilePath);

  const thumbFileName = `thumb_${fileName}`;
  const thumbFilePath = path.join(path.dirname(filePath), thumbFileName);
  console.log('Created thumb path', tempFilePath);

  const size = 200;
  /*await*/ sharp(tempFilePath).resize(size, size).toFile(thumbFilePath);

  await bucket.upload(tempFilePath, {
    destination: filePath,
    metadata: metadata,
  });

  return fs.unlinkSync(tempFilePath);
});
Error:
Cloud Functions has a read-only filesystem except for the /tmp directory. You have to make sure you are writing your data to a path under /tmp, e.g. /tmp/your-file.
The only writeable part of the filesystem is the /tmp directory, which
you can use to store temporary files in a function instance. This is a
local disk mount point known as a "tmpfs" volume in which data written
to the volume is stored in memory. Note that it will consume memory
resources provisioned for the function.
Cloud Functions Execution Environment
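Applied to the function in the question, a minimal sketch of the thumbnail step with the output written under os.tmpdir() before uploading; the destination path mirrors the question's naming and is otherwise an assumption:
// write the thumbnail to the writable /tmp filesystem, not to the bucket path
const thumbFilePath = path.join(os.tmpdir(), thumbFileName);
await sharp(tempFilePath).resize(size, size).toFile(thumbFilePath);

// then upload the local thumbnail next to the original object
await bucket.upload(thumbFilePath, {
  destination: path.join(path.dirname(filePath), thumbFileName),
  metadata: metadata,
});

// clean up both temporary files
fs.unlinkSync(tempFilePath);
fs.unlinkSync(thumbFilePath);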
I guess that this will also work in Firebase; otherwise, please comment:
gcsfs
If you put gcsfs in the "requirements.txt" and import gcsfs in the Python code, you can use the module like this (an example for saving a CSV):
fs = gcsfs.GCSFileSystem(project=MY_PROJECT)
fs.ls(BUCKET_NAME)

# or choose 'w' here:
with fs.open(filepath, 'wb') as outcsv:
    ...
Further links:
How to open a file from google cloud storage into a cloud function
https://gcsfs.readthedocs.io/en/latest/index.html#examples

Resources