Firebase function: Error: unable to open for write - node.js

I was trying to implement a Firebase function. I went to the firebase-functions samples repository and copied the thumbnail example. Deployment works fine ("Deploy complete!" with no sign of an error), but when I upload an image to Firebase Storage, the function fails because it can't open the file for writing.
This is the code I used:
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp()
const {Storage} = require("@google-cloud/storage");
const gcs = new Storage();
const path = require('path');
const os = require('os');
const fs = require('fs');
const sharp = require("sharp");
exports.generateThumbnail = functions.storage.object().onFinalize(async (object) => {
  const fileBucket = object.bucket; // The Storage bucket that contains the file.
  const filePath = object.name; // File path in the bucket.
  const contentType = object.contentType; // File content type.
  const metageneration = object.metageneration; // Number of times metadata has been generated. New objects have a value of 1.
  if (!contentType.startsWith('image/')) {
    return console.log('This is not an image.');
  }
  const fileName = path.basename(filePath);
  if (fileName.startsWith('thumb_')) {
    return console.log('Already a Thumbnail.');
  }
  const bucket = admin.storage().bucket(fileBucket);
  const tempFilePath = path.join(os.tmpdir(), fileName);
  console.log('Created temporary path', tempFilePath);
  const metadata = {
    contentType: contentType,
  };
  await bucket.file(filePath).download({destination: tempFilePath});
  console.log('Image downloaded locally to', tempFilePath);
  const thumbFileName = `thumb_${fileName}`;
  const thumbFilePath = path.join(path.dirname(filePath), thumbFileName);
  console.log('Created thumb path', tempFilePath);
  const size = 200;
  /*await*/ sharp(tempFilePath).resize(size, size).toFile(thumbFilePath);
  await bucket.upload(tempFilePath, {
    destination: filePath,
    metadata: metadata,
  });
  return fs.unlinkSync(tempFilePath);
});
Error (as in the title): unable to open for write

Cloud Functions has a read-only filesystem except for the /tmp directory. You have to make sure you are writing your data to a path like /tmp/your-file. In your code, thumbFilePath is built with path.dirname(filePath), which is a path inside the bucket rather than under /tmp, so sharp fails when it tries to write the thumbnail there.
The only writeable part of the filesystem is the /tmp directory, which you can use to store temporary files in a function instance. This is a local disk mount point known as a "tmpfs" volume in which data written to the volume is stored in memory. Note that it will consume memory resources provisioned for the function.
Cloud Functions Execution Environment
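Based on that, here is a minimal sketch of the change, keeping the rest of the function above as-is; thumbStoragePath is a name introduced here for the destination object in the bucket, not part of the original code:
  // Write the thumbnail to /tmp (the only writable path), then upload it from there.
  const thumbFileName = `thumb_${fileName}`;
  const thumbFilePath = path.join(os.tmpdir(), thumbFileName); // local, writable
  const thumbStoragePath = path.join(path.dirname(filePath), thumbFileName); // destination in the bucket
  // Await the resize, otherwise the upload can run before the thumbnail file exists.
  await sharp(tempFilePath).resize(size, size).toFile(thumbFilePath);
  await bucket.upload(thumbFilePath, {
    destination: thumbStoragePath,
    metadata: metadata,
  });
  // Clean up both temp files.
  fs.unlinkSync(tempFilePath);
  fs.unlinkSync(thumbFilePath);
  return null;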

I guess this will also work in Firebase; if not, please comment:
gcsfs
If you put gcsfs in the "requirements.txt" and import gcsfs in the Python code, you can use the module like this (example taken from the gcsfs docs; have a look at saving a CSV):
fs = gcsfs.GCSFileSystem(project=MY_PROJECT)
fs.ls(BUCKET_NAME)
# or choose 'w' here:
with fs.open(filepath, 'wb') as outcsv:
    ...
Further links:
How to open a file from google cloud storage into a cloud function
https://gcsfs.readthedocs.io/en/latest/index.html#examples

Related

Firebase function always times out on large files?

I have created a Firebase function that is triggered when a video is uploaded to Firebase Storage and uses ffmpeg to add a watermark to it. It works fine with small videos but always times out on large ones. Any idea how I can overcome these limits?
const functions = require('firebase-functions');
const { Storage, Bucket } = require('@google-cloud/storage');
const projectId = 'video-sharing-a57fa';
const admin = require('firebase-admin');
admin.initializeApp();
let gcs = new Storage({
  projectId
});
const os = require('os');
const path = require('path');
const spawn = require('child-process-promise').spawn;

exports.addLogo = functions.runWith({ memory: '4GB', timeoutSeconds: 540 }).storage.object().onFinalize(async event => {
  const bucket = event.bucket;
  const contentType = event.contentType;
  const filePath = event.name;
  console.log('File change detected, function execution started');

  if (path.basename(filePath).startsWith('resized-')) {
    console.log('We already renamed that file!');
    return;
  }

  const destBucket = gcs.bucket(bucket);
  const tmpFilePath = path.join(os.tmpdir(), path.basename(filePath));
  const metadata = { contentType: contentType };
  const tmpLogoPath = path.join(os.tmpdir(), 'watermark.png');
  await destBucket.file('watermark.png').download({
    destination: tmpLogoPath
  });

  const newPath = path.join(os.tmpdir(), 'output.mp4');
  return destBucket.file(filePath).download({
    destination: tmpFilePath
  }).then(() => {
    console.log('entered spawn');
    var str = "overlay=10:10";
    return spawn('ffmpeg', ['-i', tmpFilePath, '-i', tmpLogoPath, '-filter_complex', str, newPath]);
  }).then(() => {
    console.log('changing the name');
    return destBucket.upload(newPath, {
      destination: path.dirname(filePath) + '/resized-' + path.basename(filePath),
      metadata: metadata
    });
  });
});
Cloud Functions have a limited execution time of 9 minutes max. More information here. Most likely the problem is that ffmpeg does not manage to add the watermark in time. Your actions should be:
Check the log of the function to confirm that this is exactly the error: firebase functions:log --only <FUNCTION_NAME>
Consider a different architecture for processing really large files:
a. Limit the amount of data ffmpeg processes per invocation, e.g. with -ss 50 -t 10. In this scenario the architecture would be: one function that reads the files and puts them into a queue, and another function that reads the size of each file and puts chunk descriptions into a second queue, e.g. {name: "file1.mp4", start: 10, duration: 15} (see the sketch after this list).
b. Use an on-demand container such as Cloud Run
c. Use App Engine if you are constantly processing files
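A hedged sketch of option (a): a worker that processes just one chunk per invocation. The message shape {name, start, duration}, the processChunk name, and the chunks/ destination are illustrative assumptions; destBucket, tmpLogoPath, spawn, path, and os are the same as in the code above:
// Process a single chunk described by a queue message such as
// { name: 'file1.mp4', start: 10, duration: 15 }.
const processChunk = async (msg) => {
  const tmpIn = path.join(os.tmpdir(), path.basename(msg.name));
  const tmpOut = path.join(os.tmpdir(), `chunk-${msg.start}-${path.basename(msg.name)}`);
  await destBucket.file(msg.name).download({ destination: tmpIn });
  // -ss / -t limit ffmpeg to one segment, so each invocation stays well under the 9-minute cap.
  await spawn('ffmpeg', [
    '-ss', String(msg.start), '-t', String(msg.duration),
    '-i', tmpIn, '-i', tmpLogoPath,
    '-filter_complex', 'overlay=10:10',
    tmpOut
  ]);
  return destBucket.upload(tmpOut, {
    destination: path.dirname(msg.name) + '/chunks/' + path.basename(tmpOut)
  });
};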

Google Cloud Function - read the CONTENT of a new file created in a bucket using NodeJS

In GCP (not Firebase) I have a bucket and a function which is called when a new file is created in the bucket. Works great.
/**
 * Triggered from a change to a Cloud Storage bucket.
 */
exports.mycompany_upload_file = (event, context) => {
  const gcsEvent = event;
  const filename = gcsEvent.name;
  const bucketName = event.bucket;
  console.log(`=====================================================`);
  console.log(`Event Type: ${context.eventType}`);
  console.log(`Bucket: ${bucketName}`);
  console.log(`File: ${filename}`);
  console.log(`=====================================================`);
  // now I would like to open that file and read it line-by-line
};
How do I address that file? What's the path of that file?
Can I use standard Node libraries like 'fs'?
I found this post that might help you; basically the answer is as follows:
Note that the file will be stored in a ramdisk, so you'll need enough RAM available to your function to download it.
// With current versions of the library the Storage class is a named export.
const { Storage } = require('@google-cloud/storage');
const gcs = new Storage({ projectId: "<your_project>" });
const bucket = gcs.bucket("<your_bucket>");
const file = bucket.file("<path/to/your_file>");

exports.gcstest = (event, callback) => {
  // Download into /tmp, the only writable path in a Cloud Function.
  file.download({ destination: "/tmp/test" }, function(err) {
    if (err) { console.log(err); }
    else { callback(); }
  });
};
For more information you can check the documentation for Google Cloud Storage: Node.js Client
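To then read the downloaded file line by line, which is what the question asks for, standard Node modules work on the /tmp copy. A minimal sketch, assuming the download above has finished and using the same "/tmp/test" path:
const fs = require('fs');
const readline = require('readline');

// Read the downloaded /tmp copy line by line with standard Node modules.
async function readLines(localPath) {
  const rl = readline.createInterface({
    input: fs.createReadStream(localPath),
    crlfDelay: Infinity, // treat \r\n as a single line break
  });
  for await (const line of rl) {
    console.log(`Line: ${line}`);
  }
}

readLines('/tmp/test').catch(console.error);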

Uploading file to Firebase Storage using firebase-admin in Node

I need to upload a file from a Node script to my Firebase storage using firebase-admin.
This is how I'm doing in the browser: (IT WORKS)
// THIS IS INSIDE A FUNCTION
const storageRef = firebase.storage().ref('my-images/' + fileName);
const uploadTask = storageRef.put(file,metadata);
This is how I'm TRYING to do in the Node script:
const admin = require('firebase-admin');
admin.initializeApp();
// THIS IS INSIDE A FUNCTION
const storageRef = admin.storage().ref('my-images/' + fileName);
const uploadTask = storageRef.put(file,metadata);
And I'm getting the error:
TypeError: admin.storage(...).ref is not a function
I also tried:
admin.initializeApp({
  credential: admin.credential.cert(serviceAccount),
  storageBucket: "MY_BUCKET.appspot.com"
});
// THIS IS INSIDE A FUNCTION
const storageRef = admin.storage().bucket().ref('my-images/' + fileName);
const uploadTask = storageRef.put(file,metadata);
And I'm getting the error:
TypeError: admin.storage(...).bucket(...).ref is not a function
QUESTION
I need to save the file in my default bucket inside the my-images folder. What is the proper way of doing it?
With the Google Cloud Storage Node.js Client you need to use a different approach compared to the JavaScript SDK.
More precisely, you need to use the upload() method of a Bucket, as follows:
// upload() takes the path of a local file on disk; the destination option
// places the object in the my-images folder of the bucket.
bucket.upload(localFilePath, {
  destination: 'my-images/' + fileName
}).then(function(data) {
  const file = data[0];
});
You will find more examples in the documentation (link above).
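For parity with the browser snippet that also passes metadata, here is a hedged sketch; the local path, file name, and content type are placeholders, and it assumes initializeApp was called with storageBucket set, as in the question:
// Sketch: upload a local file into the my-images folder with content type metadata.
const bucket = admin.storage().bucket(); // default bucket from initializeApp's storageBucket
bucket.upload('/path/to/local/photo.jpg', {
  destination: 'my-images/' + fileName,
  metadata: { contentType: 'image/jpeg' } // mirrors the metadata passed to put() in the browser
}).then(([file]) => {
  console.log('Uploaded as ' + file.name);
});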

How can I upload a single file to multiple paths on AWS S3 using Node.js?

How can I upload a single file to multiple paths on aws-s3 using Node.js?
Problem: I have one image file, images.jpg, and I want to upload it to aws-s3 under different paths.
To upload the file to different directories in S3, provide your accessKeyId and secretAccessKey:
const fs = require('fs'),
  AWS = require('aws-sdk'),
  { promisify } = require('util');

const uploadFile = async (file_path, folder) => {
  try {
    var s3 = new AWS.S3({
      accessKeyId: '',
      secretAccessKey: ''
    });
    // reading the file and converting it into a buffer
    const readFile = promisify(fs.readFile).bind(fs);
    const data = await readFile(file_path);
    if (!data) throw "reading file failed";
    const params = {
      Bucket: 'testbucketnayan', // bucket name
      Key: `${folder}/file.js`,
      Body: data // pass the buffer as-is; toString() would corrupt binary files such as images
    };
    const upload = promisify(s3.upload).bind(s3);
    const up_data = await upload(params);
    if (!up_data) throw "Upload failed";
    console.log(up_data);
  } catch (err) {
    console.log(err);
  }
};
The upload_location array holds the directory names the file will be uploaded to. The loop below uploads the file to each location one at a time; you can also use Promise.all to run the uploads in parallel and improve performance (see the sketch after the code).
const upload_location = ['abc', 'def'];
(async () => {
  for (var folder of upload_location)
    await uploadFile('resposetwi.js', folder);
})();
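A hedged sketch of the Promise.all variant mentioned above, reusing the same uploadFile and upload_location:
// Run all uploads in parallel rather than one at a time.
(async () => {
  await Promise.all(
    upload_location.map(folder => uploadFile('resposetwi.js', folder))
  );
  console.log('all uploads finished');
})();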
Please ask your question with the code you tried; it makes it easier to understand where the issue is.

Check if image exists at storage with firebase cloud function

I need your help with a function that I created to manipulate the images my users send to my app.
What I need is to get the image the user sent, resize it, and check whether the image has already been processed so the function doesn't do all the work again. The examples I saw change the image name and check at the beginning whether the name starts with the prefix they set, but in my case I need to keep the original name of the picture. So, how can I do that? Or is there a better way to solve this problem?
My function code is:
import * as functions from 'firebase-functions';
import * as Storage from '@google-cloud/storage';
const gcs = new Storage();
import { tmpdir } from 'os';
import { join, dirname } from 'path';
import * as sharp from 'sharp';
import * as fs from 'fs-extra';

export const generateThumbs = functions.storage
  .object()
  .onFinalize(async object => {
    const bucket = gcs.bucket(object.bucket);
    const filePath = object.name;
    const fileName = filePath.split('/').pop();
    const bucketDir = dirname(filePath);
    const workingDir = join(tmpdir(), 'thumbs');
    const tmpFilePath = join(workingDir, 'source.png');
    if (!object.contentType.includes('image')) {
      console.log('exiting function');
      return false;
    }
    // 1. Ensure thumbnail dir exists
    await fs.ensureDir(workingDir);
    // 2. Download Source File
    await bucket.file(filePath).download({
      destination: tmpFilePath
    });
    // 3. Resize the images and define an array of upload promises
    const sizes = [64, 128, 256];
    const uploadPromises = sizes.map(async size => {
      const thumbName = `thumb@${size}_${fileName}`;
      const thumbPath = join(workingDir, thumbName);
      // Resize source image
      await sharp(tmpFilePath)
        .resize(size, size)
        .toFile(thumbPath);
      // Upload to GCS
      return bucket.upload(thumbPath, {
        destination: join(bucketDir, thumbName)
      });
    });
    // 4. Run the upload operations
    await Promise.all(uploadPromises);
    // 5. Cleanup remove the tmp/thumbs from the filesystem
    return fs.remove(workingDir);
  });
If you need to overwrite the original file, and you want to avoid an infinite loop with this function, you could attach custom metadata to the file when you upload it back to Cloud Storage. Then, when the function is invoked again for that file, you can check the metadata on the incoming ObjectMetadata object to know when the function should bail out without making any more changes.
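A minimal sketch of that approach against the function above; the custom metadata key resized is an arbitrary name chosen here, not something the API requires:
// Sketch: skip files this function has already processed, using custom metadata.

// At the top of the onFinalize handler, bail out early:
if (object.metadata && object.metadata.resized === 'true') {
  console.log('Already resized, exiting to avoid an infinite loop');
  return false;
}

// When uploading the result (overwriting the original or writing thumbs),
// attach the same custom metadata so the next invocation sees it:
return bucket.upload(thumbPath, {
  destination: join(bucketDir, thumbName),
  metadata: {
    metadata: { resized: 'true' } // custom metadata lives under metadata.metadata
  }
});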
