I am writing a Firebase function that exposes an API endpoint using express. When the endpoint is called, it needs to download an image from an external API and use that image to make a second API call. The second API call needs the image to be passed as a readableStream. Specifically, I am calling the pinFileToIPFS endpoint of the Pinata API.
My Firebase function is using axios to download the image and fs to write the image to /tmp. Then I am using fs to read the image, convert it to a readableStream and send it to Pinata.
A stripped-down version of my code looks like this:
const functions = require("firebase-functions");
const express = require("express");
const axios = require("axios");
const fs = require('fs-extra')
require("dotenv").config();
const key = process.env.REACT_APP_PINATA_KEY;
const secret = process.env.REACT_APP_PINATA_SECRET;
const pinataSDK = require('@pinata/sdk');
const pinata = pinataSDK(key, secret);
const app = express();
const downloadFile = async (fileUrl, downloadFilePath) => {
  try {
    const response = await axios({
      method: 'GET',
      url: fileUrl,
      responseType: 'stream',
    });
    // pipe the result stream into a file on disc
    response.data.pipe(fs.createWriteStream(downloadFilePath, {flags: 'w'}))
    // return a promise and resolve when download finishes
    return new Promise((resolve, reject) => {
      response.data.on('end', () => {
        resolve()
      })
      response.data.on('error', () => {
        reject()
      })
    })
  } catch (err) {
    console.log('Failed to download image')
    console.log(err);
    throw new Error(err);
  }
};
app.post('/pinata/pinFileToIPFS', cors(), async (req, res) => {
  const id = req.query.id;
  var url = '<URL of API endpoint to download the image>';

  await fs.ensureDir('/tmp');
  if (fs.existsSync('/tmp')) {
    console.log('Folder: /tmp exists!')
  } else {
    console.log('Folder: /tmp does not exist!')
  }

  var filename = '/tmp/image-' + id + '.png';
  downloadFile(url, filename);
  if (fs.existsSync(filename)) {
    console.log('File: ' + filename + ' exists!')
  } else {
    console.log('File: ' + filename + ' does not exist!')
  }

  var image = fs.createReadStream(filename);
  const options = {
    pinataOptions: {cidVersion: 1}
  };
  pinata.pinFileToIPFS(image, options).then((result) => {
    console.log(JSON.stringify(result));
    res.header("Access-Control-Allow-Origin", "*");
    res.header("Access-Control-Allow-Headers", "Authorization, Origin, X-Requested-With, Accept");
    res.status(200).json(JSON.stringify(result));
    res.send();
  }).catch((err) => {
    console.log('Failed to pin file');
    console.log(err);
    res.status(500).json(JSON.stringify(err));
    res.send();
  });
});
exports.api = functions.https.onRequest(app);
Interestingly, my debug messages tell me that the /tmp folder exists, but the downloaded file does not exist in the file system.
[Error: ENOENT: no such file or directory, open '/tmp/image-314502.png']. Note that the image can be accessed correctly when I manually access the URL of the image.
I've tried to download and save the file in many different ways, but none of them work. Also, based on what I've read, Firebase Functions allow writing and reading temporary files from /tmp.
Any advice will be appreciated. Note that I am very new to NodeJS and to Firebase, so please excuse my basic code.
Many thanks!
I was not able to see that you are initializing the working directory as suggested in this post:
const bucket = gcs.bucket(object.bucket);
const filePath = object.name;
const fileName = filePath.split('/').pop();
const thumbFileName = 'thumb_' + fileName;
const workingDir = join(tmpdir(), `${object.name.split('/')[0]}/`);//new
const tmpFilePath = join(workingDir, fileName);
const tmpThumbPath = join(workingDir, thumbFileName);
await fs.ensureDir(workingDir);
Also, please consider that if you are using two functions, the /tmp directory would not be shared as each one has its own. Here is an explanation from Doug Stevenson. In the same answer, there is a very well explained video about local and global scopes and how to use the tmp directory:
Cloud Functions only allows one function to run at a time in a particular server instance. Functions running in parallel run on different server instances, which have different /tmp spaces. Each function invocation runs in complete isolation from each other. You should always clean up files you write in /tmp so that they don't accumulate and cause a server instance to run out of memory over time.
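As a hedged illustration of both points (waiting for the download to finish and cleaning up /tmp), a minimal sketch of the route from the question could look like this, assuming the downloadFile helper is changed to resolve on the write stream's 'finish' event rather than the response's 'end' event:
app.post('/pinata/pinFileToIPFS', async (req, res) => {
  const url = '<URL of API endpoint to download the image>';
  const filename = `/tmp/image-${req.query.id}.png`;
  try {
    // Wait until the file is fully written to disk before reading it
    await downloadFile(url, filename);
    const result = await pinata.pinFileToIPFS(fs.createReadStream(filename), {
      pinataOptions: { cidVersion: 1 },
    });
    res.status(200).json(result);
  } catch (err) {
    console.log(err);
    res.status(500).json({ error: String(err) });
  } finally {
    // fs-extra: delete the temp file so it does not accumulate between invocations
    await fs.remove(filename).catch(() => {});
  }
});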
I would suggest using Google Cloud Storage extended with Cloud Functions to achieve your goal.
Related
I'm trying to download a photo from an AWS S3 bucket via an express server to serve to a react app but I'm not having much luck. Here are my (unsuccessful) attempts so far.
The Workflow is as follows:
Client requests photo after retrieving key from database via Context API
Request sent to express server route (important so as to hide the true location from the client)
Express server route requests blob file from AWS S3 bucket
Express server parses image to base64 and serves to client
Client updates state with new image
React Client
const [profilePic, setProfilePic] = useState('');
useEffect(() => {
  await actions.getMediaSource(tempPhoto.key)
    .then(resp => {
      console.log('server resp: ', resp.data.data.newTest) // returns ����\u0000�\u0000\b\u0006\
      const url = window.URL || window.webkitURL;
      const blobUrl = url.createObjectURL(resp.data.data.newTest);
      console.log("blob ", blobUrl);
      setProfilePic({ ...profilePic, image: resp.data.data.newTest });
    })
    .catch(err => errors.push(err));
}
Context API - just axios wrapped into its own library
getMediaContents = async ( key ) => {
return await this.API.call(`http://localhost:5000/${MEDIA}/mediaitem/${key}`, "GET", null, true, this.state.accessToken, null);
}
Express server route
router.get("/mediaitem/:key", async (req, res, next) => {
try{
const { key } = req.params;
// Attempt 1 was to try with s3.getObject(downloadParams).createReadStream();
const readStream = getFileStream(key);
readStream.pipe(res);
// Attempt 2 - attempt to convert response to base 64 encoding
var data = await getFileStream(key);
var test = data.Body.toString("utf-8");
var container = '';
if ( data.Body ) {
container = data.Body.toString("utf-8");
} else {
container = undefined;
}
var buffer = (new Buffer.from(container));
var test = buffer.toString("base64");
require('fs').writeFileSync('../uploads', test); // it never wrote to this directory
console.log('conversion: ', test); // prints: 77+977+977+977+9AO+/vQAIBgYH - this doesn't look like base64 to me.
delete buffer;
res.status(201).json({ newTest: test });
} catch (err){
next(ApiError.internal(`Unexpected error > mediaData/:id GET -> Error: ${err.message}`));
return;
}
});
AWS S3 Library - I made my own library for using the s3 bucket as I'll need to use more functionality later.
const getFileStream = async (fileKey) => {
  const downloadParams = {
    Key: fileKey,
    Bucket: bucketName
  }
  // This was attempt 1's return without async in the parameter
  return s3.getObject(downloadParams).createReadStream();
  // Attempt 2's intention was just to wait for the promise to be fulfilled.
  return await s3.getObject(downloadParams).promise();
}
exports.getFileStream = getFileStream;
If you've gotten this far you may have realised that I've tried a couple of things from different sources and documentation but I'm not getting any further. I would really appreciate some pointers and advice on what I'm doing wrong and what I could improve on.
If any further information is needed then just let me know.
Thanks in advance for your time!
Maybe it will be useful for you; this is how I get an image from S3 and process it on the server.
Create temporary directory
createTmpDir(): Promise<string> {
return mkdtemp(path.join(os.tmpdir(), 'tmp-'));
}
Gets the file
readStream(path: string) {
  return this.s3
    .getObject({
      Bucket: this.awsConfig.bucketName,
      Key: path,
    })
    .createReadStream();
}
How I process the file
async MainMethod(fileName) {
  const dir = await this.createTmpDir();
  const serverPath = path.join(
    dir,
    fileName
  );
  await pipeline(
    this.readStream(fileName),
    fs.createWriteStream(serverPath + '.jpg')
  );
  const createFile = await sharp(serverPath + '.jpg')
    .jpeg()
    .resize({
      width: 640,
      fit: sharp.fit.inside,
    })
    .toFile(serverPath + '.jpeg');
  const imageBuffer = fs.readFileSync(serverPath + '.jpeg');
  // my manipulations
  fs.rmSync(dir, { recursive: true, force: true }); // delete temporary folder
}
I'm trying to use the cloud function to download a JSON file from here: http://jsonplaceholder.typicode.com/posts? then upload it to Cloud Storage bucket.
The log of the function execution seems fine; the status returns 200. However, the JSON file uploaded to the bucket is only 20 bytes and it is empty (while the original file is ~27 KB).
So please help me if I missed something. Here is the code:
index.js
const {Storage} = require('@google-cloud/storage');
exports.writeToBucket = (req, res) => {
  const http = require('http');
  const fs = require('fs');

  const file = fs.createWriteStream("/tmp/post.json");
  const request = http.get("http://jsonplaceholder.typicode.com/posts?", function(response) {
    response.pipe(file);
  });
  console.log('file downloaded');

  // Imports the Google Cloud client library
  const {Storage} = require('@google-cloud/storage');

  // Creates a client
  const storage = new Storage();
  const bucketName = 'tft-test-48c87.appspot.com';
  const filename = '/tmp/post.json';

  // Uploads a local file to the bucket
  storage.bucket(bucketName).upload(filename, {
    gzip: true,
    metadata: {
      cacheControl: 'no-cache',
    },
  });

  res.status(200).send(`${filename} uploaded to ${bucketName}.`);
};
package.json
{
  "name": "sample-http",
  "version": "0.0.1",
  "dependencies": {
    "@google-cloud/storage": "^3.0.3"
  }
}
As pointed out by @DazWilkin, there are issues with the asynchronous code. You must wait for the write stream's 'finish' event to trigger and then proceed. Also, the upload() method returns a promise too. Try refactoring your function in async-await syntax as shown below:
exports.writeToBucket = async (req, res) => {
  const http = require('http');
  const fs = require('fs');

  // Imports the Google Cloud client library
  const {Storage} = require('@google-cloud/storage');

  // Creates a client
  const storage = new Storage();
  const bucketName = 'tft-test-48c87.appspot.com';
  const filename = '/tmp/post.json';

  await downloadJson()

  // Uploads a local file to the bucket
  await storage.bucket(bucketName).upload(filename, {
    gzip: true,
    metadata: {
      cacheControl: 'no-cache',
    },
  });

  res.status(200).send(`${filename} uploaded to ${bucketName}.`);
}
const downloadJson = async () => {
  const Axios = require('axios')
  const fs = require("fs")

  const writer = fs.createWriteStream("/tmp/post.json")

  const response = await Axios({
    url: "http://jsonplaceholder.typicode.com/posts",
    method: 'GET',
    responseType: 'stream'
  })

  response.data.pipe(writer)

  return new Promise((resolve, reject) => {
    writer.on('finish', resolve)
    writer.on('error', reject)
  })
}
This example uses Axios but you can do the same with http.
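For reference, a rough equivalent of downloadJson with the built-in http module (a sketch, not tested) waits for the write stream's 'finish' event in the same way:
const downloadJson = () => {
  const http = require('http');
  const fs = require('fs');
  return new Promise((resolve, reject) => {
    const writer = fs.createWriteStream('/tmp/post.json');
    http.get('http://jsonplaceholder.typicode.com/posts', (response) => {
      response.pipe(writer);
      writer.on('finish', resolve); // resolve only once the file is fully written
      writer.on('error', reject);
    }).on('error', reject);
  });
}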
Do note that you can directly upload the fetched JSON as a file like this:
exports.writeToBucket = async (req, res) => {
  const Axios = require("axios");
  const { Storage } = require("@google-cloud/storage");

  const storage = new Storage();
  const bucketName = "tft-test-48c87.appspot.com";
  const filename = "/tmp/post.json";

  const { data } = await Axios.get("http://jsonplaceholder.typicode.com/posts");
  const file = storage.bucket(bucketName).file("file.json");
  const contents = JSON.stringify(data);
  await file.save(contents);

  res.status(200).send(`${filename} uploaded to ${bucketName}.`);
};
You can read more about the save() method in the documentation.
I don't write much NodeJS but I think your issue is with async code.
You create the stream and then issue the http.get but you don't block on the callback (piping the file) completing before you start the GCS upload.
You may want to attach an .on("finish", () => {...}) to the pipe and in that callback, upload the file to GCS.
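For example, a minimal sketch of that callback approach, reusing the names from the question's code (an illustration, not the exact fix):
const file = fs.createWriteStream("/tmp/post.json");
http.get("http://jsonplaceholder.typicode.com/posts?", (response) => {
  // pipe() returns the destination stream, so 'finish' fires once the file is complete
  response.pipe(file).on("finish", () => {
    storage.bucket(bucketName)
      .upload(filename, { gzip: true })
      .then(() => res.status(200).send(`${filename} uploaded to ${bucketName}.`));
  });
});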
NOTE IIRC GCS has a method that will let you write a stream directly from memory rather than going through a file.
NOTE if you pull the storage object up into the global namespace, it will only be created whenever the instance is created and not every time the function is invoked.
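Putting those two notes together, a hedged sketch (untested) of streaming the response straight into a bucket object, with the Storage client created once in the global scope, could look like this:
const {Storage} = require('@google-cloud/storage');
const http = require('http');

// Created once per instance, not on every invocation
const storage = new Storage();
const bucket = storage.bucket('tft-test-48c87.appspot.com');

exports.writeToBucket = (req, res) => {
  const gcsFile = bucket.file('post.json');
  http.get('http://jsonplaceholder.typicode.com/posts', (response) => {
    response
      .pipe(gcsFile.createWriteStream({ gzip: true, metadata: { cacheControl: 'no-cache' } }))
      .on('finish', () => res.status(200).send('post.json uploaded.'))
      .on('error', (err) => res.status(500).send(err.message));
  }).on('error', (err) => res.status(500).send(err.message));
};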
You don't need a write stream to get the URL data: fetch the URL, await the response to resolve, and call the appropriate method (e.g. response.json()) to parse it.
Personally, I prefer to use Fetch and Axios over http as they are cleaner to work with. But with Node's http you can do the following:
https.get(url, (res) => {
  let body = "";

  res.on("data", (chunk) => {
    body += chunk;
  });

  res.on("end", () => {
    try {
      let json = JSON.parse(body);
      // do something with JSON
    } catch (error) {
      console.error(error.message);
    };
  });
}).on("error", (error) => {
  console.error(error.message);
});
Once you have that, you can pass it directly to a storage method as a data blob or byte array.
byte[] byteArray = resultJson.toString().getBytes("UTF-8");
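That line is Java; in Node the rough equivalent (a sketch, assuming json is the parsed body from above and bucket is a @google-cloud/storage Bucket object) would be:
const byteArray = Buffer.from(JSON.stringify(json), 'utf-8');
await bucket.file('post.json').save(byteArray);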
Hey everyone, quick question: I want to allow a user to upload a WebM file and convert it to mp4 using FFmpeg. I am using Node.js for the backend and already have a route that uploads files to Amazon S3 file storage. But let's say I wanted to take that file from the request itself and convert it to mp4 without storing it anywhere; is that possible? If not, is it possible to take an S3 file URL and convert it to mp4? Can anybody point me in the right direction as to what is possible and the best way to do this?
basically all I want to do is
const objectUrl = createObjectURL(Blob);
ffmpeg -i objectURL S3OutputLocation
or
ffmpeg -i myS3InputLocation myS3OutputLocation
Okay, so there are a couple of things you have to do in order to make this work.
1. You need to set up a local instance of multer, as you need to upload the file locally before sending it to S3. I tried to do it with S3 directly, but it seemed to use a lot of costly and time-consuming read operations that took much longer than writing the file to the server first. I found this to be the best solution.
You do this like so:
const localStorage = multer.diskStorage({
  destination: function(req, file, cb) {
    const destination = __dirname + "\\canvas-uploads";
    console.log("destination", destination);
    cb(null, destination);
  },
  filename: function(req, file, cb) {
    const filename = req.body.id + "." + file.mimetype.toString().slice(file.mimetype.toString().lastIndexOf("/") + 1);
    console.log("filename", filename);
    cb(null, filename);
  }
});

const uploadLocal = multer({
  storage: localStorage
});
2. You need to set up fluent-ffmpeg and wrap it in a promise so you can be sure it has finished all your processing (uploading to S3, etc.) in the same route, for convenience.
You do this like so:
router.post('/upload-temp', uploadLocal.array("upload"), async (req, res, next) => {
  res.json({id: req.body.id});
});

router.post('/ffmpeg', async (req, res, next) => {
  try {
    const reqPath = path.join(__dirname, '../upload/canvas-uploads/');
    const {id, type} = req.body;
    const localFileInput = `${reqPath}${id}.webm`;
    const localFileOutput = `${reqPath}${id}.${type}`;
    console.log("localInput", localFileInput);
    console.log("localOutput", localFileOutput);
    const key = `canvas/${id}.${type}`;

    await new Promise((resolve, reject) => {
      ffmpeg().input(localFileInput)
        .withOutputFormat(type)
        .output(localFileOutput)
        .on('end', async () => {
          const fileContent = await fs.readFileSync(localFileOutput);
          await fs.unlinkSync(localFileInput);
          await fs.unlinkSync(localFileOutput);
          const params = {
            Bucket: process.env.BUCKET_NAME,
            Key: key,
            Body: fileContent
          }
          await s3.putObject(params).promise();
          resolve();
        }).run();
    })
    res.send("success")
  } catch (error) {
    console.log(error);
    res.send(error);
  }
});
I've written some node.js code which is sitting in the source of a Cloud Function. When it runs I want to read a text file from a google storage bucket, and process it.
The code runs fine locally, but for some reason doesn't work when running in the Cloud Function. I would expect something to be written out by the console logs.
I can't see any errors, as I thought it might be a permissions problem (might be looking in the wrong place though).
Any ideas?
The awaits and async's were just because I wanted it to wait for the response before continuing, but that seems to have no effect on it either.
const fileName = 'testData.txt';
const {Storage} = require('@google-cloud/storage');
const storage = new Storage();
const bucket = storage.bucket('my_bucket_name');
const remoteFile = bucket.file(fileName);

await remoteFile.download(async function(err, contents) {
  console.log("file err: " + err);
  console.log("file data: " + contents);
});
What you can do is verify that the runtime account for the function has the necessary permissions to access the bucket. In general the runtime account is PROJECT_ID@appspot.gserviceaccount.com, and it needs at least the Storage Object Viewer role (you can check more roles here).
Then, test the function again. If something goes wrong, please check the logs of the function.
EDIT
Not sure, but it may be something with the code. I've used the following to test the function and it works perfectly:
index.js:
const {Storage} = require('@google-cloud/storage');
const storage = new Storage();
const bucket = storage.bucket('bucket_name');
const fileName = 'test.txt';
const remoteFile = bucket.file(fileName);
exports.helloWorld = (req, res) => {
  console.log('Reading File');
  var archivo = remoteFile.createReadStream();
  console.log('Concat Data');
  var buf = '';
  archivo.on('data', function(d) {
    buf += d;
  }).on('end', function() {
    console.log(buf);
    console.log("End");
    res.send(buf);
  });
};
package.json:
{
  "name": "sample-http",
  "version": "0.0.1",
  "dependencies": {
    "@google-cloud/storage": "^4.7.0"
  }
}
async function readStorageFile(obj) {
  try {
    obj.Result = ''
    if (obj.filename === undefined) return
    bucket = gcs.bucket('gs:yourinfo');
    // Get File
    await bucket.file(obj.filename).download()
      .then(async data => {
        obj.data = data
        obj.Result = 'OK'
        return
      })
      .catch(err => {
        return
      })
    return
  } catch (err) {
    return
  }
}

obj = {filename: 'TEST1.json'}
await readStorageFile(obj, 'testing')
if (obj.Result === 'OK') { console.log('obj.data=' + obj.data) }
else { console.log('Not Found') }
return
I'm trying to get the permanent (unsigned) download URL after uploading a file to Google Cloud Storage. I can get the signed download URL using file.createWriteStream() but file.createWriteStream() doesn't return the UploadResponse that includes the unsigned download URL. bucket.upload() includes the UploadResponse, and Get Download URL from file uploaded with Cloud Functions for Firebase has several answers explaining how to get the unsigned download URL from the UploadResponse. How do I change file.createWriteStream() in my code to bucket.upload()? Here's my code:
const {Storage} = require('@google-cloud/storage');
const storage = new Storage({ projectId: 'my-app' });
const bucket = storage.bucket('my-app.appspot.com');

var file = bucket.file('Audio/' + longLanguage + '/' + pronunciation + '/' + wordFileType);

const config = {
  action: 'read',
  expires: '03-17-2025',
  content_type: 'audio/mp3'
};
function oedPromise() {
  return new Promise(function(resolve, reject) {
    http.get(oedAudioURL, function(response) {
      response.pipe(file.createWriteStream(options))
        .on('error', function(error) {
          console.error(error);
          reject(error);
        })
        .on('finish', function() {
          file.getSignedUrl(config, function(err, url) {
            if (err) {
              console.error(err);
              return;
            } else {
              resolve(url);
            }
          });
        });
    });
  });
}
I tried this, it didn't work:
function oedPromise() {
  return new Promise(function(resolve, reject) {
    http.get(oedAudioURL, function(response) {
      bucket.upload(response, options)
        .then(function(uploadResponse) {
          console.log('Then do something with UploadResponse.');
        })
        .catch(error => console.error(error));
    });
  });
}
The error message was Path must be a string. In other words, bucket.upload() expects the path of a local file as a string, but response is a stream.
I used the Google Cloud Text-to-Speech API to simulate what you are doing, getting the text to create the audio file from a text file. Once the file was created, I used the upload method to add it to my bucket and the makePublic method to get its public URL. I also used the async/await feature offered by Node.js instead of function chaining (using then) to avoid the 'No such object: ...' error produced when the makePublic method executes before the file finishes uploading to the bucket.
// Imports the Google Cloud client library
const {Storage} = require('@google-cloud/storage');
// Creates a client using Application Default Credentials
const storage = new Storage();
// Imports the Google Cloud client library
const textToSpeech = require('@google-cloud/text-to-speech');
// Get the bucket
const myBucket = storage.bucket('my_bucket');
// Import other required libraries
const fs = require('fs');
const util = require('util');
// Create a client
const client = new textToSpeech.TextToSpeechClient();
// Create the variable to save the text to create the audio file
var text = "";
// Function that reads my_text.txt file (which contains the text that will be
// used to create my_audio.mp3) and saves its content in a variable.
function readFile() {
  // This line opens the file as a readable stream
  var readStream = fs.createReadStream('/home/usr/my_text.txt');

  // Read and display the file data on console
  readStream.on('data', function (data) {
    text = data.toString();
  });

  // Execute the createAndUploadFile() function once the whole file is read
  readStream.on('end', function (data) {
    createAndUploadFile();
  });
}
// Function that uploads the file to the bucket and generates its public URL.
async function createAndUploadFile() {
  // Construct the request
  const request = {
    input: {text: text},
    // Select the language and SSML voice gender (optional)
    voice: {languageCode: 'en-US', ssmlGender: 'NEUTRAL'},
    // Select the type of audio encoding
    audioConfig: {audioEncoding: 'MP3'},
  };

  // Performs the text-to-speech request
  const [response] = await client.synthesizeSpeech(request);

  // Write the binary audio content to a local file
  const writeFile = util.promisify(fs.writeFile);
  await writeFile('my_audio.mp3', response.audioContent, 'binary');
  console.log('Audio content written to file: my_audio.mp3');

  // Wait for the myBucket.upload() function to complete before moving on to
  // the next line to execute it
  let res = await myBucket.upload('/home/usr/my_audio.mp3');

  // If there is an error, it is printed
  if (res.err) {
    console.log('error');
  }
  // If not, the makePublic() function is executed
  else {
    // Get the file in the bucket
    let file = myBucket.file('my_audio.mp3');
    file.makePublic();
  }
}

readFile();
bucket.upload() is a convenience wrapper around file.createWriteStream() that takes a local filesystem path and uploads the file into the bucket as an object:
bucket.upload("path/to/local/file.ext", options)
.then(() => {
// upload has completed
});
To generate a signed URL, you'll need to get a file object from the bucket:
const theFile = bucket.file('file_name');
The file name will either be that of your local file or, if you specified an alternate remote name with options.destination, that name for the file on GCS.
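For example, a hypothetical options object (the destination value here is made up) and the matching file lookup would be:
// Store the uploaded file under a different object name in the bucket
const options = { destination: 'Audio/en/word.mp3' };
// ...and bucket.file('Audio/en/word.mp3') is then the object to work with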
Then, use File.getSignedUrl() to get a signed URL:
bucket.upload("path/to/local/file.ext", options)
.then(() => {
const theFile = bucket.file('file.ext');
return theFile.getSignedURL(signedUrlOptions); // getSignedURL returns a Promise
})
.then((signedUrl) => {
// do something with the signedURL
});
See:
Bucket.upload() documentation
File.getSignedUrl() documentation
You can make a specific file in a bucket publicly readable with the method makePublic.
From the docs:
const {Storage} = require('@google-cloud/storage');
const storage = new Storage();

// 'my-bucket' is your bucket's name
const myBucket = storage.bucket('my-bucket');

// 'my-file' is the path to your file inside your bucket
const file = myBucket.file('my-file');

file.makePublic(function(err, apiResponse) {});

//-
// If the callback is omitted, we'll return a Promise.
//-
file.makePublic().then(function(data) {
  const apiResponse = data[0];
});
Now the URI http://storage.googleapis.com/[BUCKET_NAME]/[OBJECT_NAME] is a public link to the file, as explained here.
The point is that you only need this minimal code to make an object public, for instance with a Cloud Function. Then you already know what the public link is and can use it directly in your app.
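For instance, after makePublic() resolves, the app could build the link directly from the bucket and file names used above (a sketch, using the same URI format):
const publicUrl = `http://storage.googleapis.com/${myBucket.name}/${file.name}`;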