Save multiple files to Firebase Storage using Cloud Functions - Node.js

I have been looking for a way to upload multiple images/files to Firebase Storage using Cloud Functions, but I am not able to find any examples.
I did find a similar example, but the file size limit there is only 10 MB, and that is not enough for my case, where a user can post up to 15 HD images.
I am just looking for a hint or an example of how I could achieve this using functions.
Thank you in advance,
Cheers

It would be helpful to see what you have tried so far, but here's a simple example of how this might look in a node module. It may be different using Cloud Functions, but it should at least point you in the right direction:
const Storage = require('@google-cloud/storage');

const storage = Storage({
  projectId: firebaseProjectId,
  credentials: serviceAccountCredentials
});

const bucketName = 'someFirebaseBucket';

// Upload a single file by path
const uploadFile = (filename) => {
  return storage
    .bucket(bucketName)
    .upload(filename);
};

// Start all uploads in parallel and settle once every one has finished
const uploadMultipleFiles = (fileList) => {
  const uploads = fileList.map(uploadFile);
  return Promise.all(uploads)
    .then(results => {
      // handle results
    })
    .catch(error => {
      // handle error
    });
};
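For illustration, a hypothetical call to the helper above might look like this (the paths are made up):

uploadMultipleFiles([
  './images/photo-1.jpg',
  './images/photo-2.jpg',
  './images/photo-3.jpg'
]);

On the 10 MB concern: that limit applies to the request body a function will accept, so routing 15 HD images through the function itself is likely to hit it regardless. One common pattern, sketched below assuming the same storage and bucketName are in scope, is to have the function hand out signed URLs so clients upload straight to the bucket:

// Sketch: return a short-lived signed URL the client can PUT the file to,
// so the image bytes never pass through the function. Names are examples.
const getUploadUrl = (filename) => {
  return storage
    .bucket(bucketName)
    .file(filename)
    .getSignedUrl({
      action: 'write',
      expires: Date.now() + 15 * 60 * 1000 // valid for 15 minutes
    });
};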

Related

Multer Error: Unexpected end of form at Multipart._final

I am using the multer package to handle uploading files to a Google Cloud bucket. Initially my multer setup was working correctly, but once I migrated to Firebase I found that none of my attempts to upload files were resolving.
After running firebase functions:log I found I was receiving the message Error: Unexpected end of form at Multipart._final. I'm not totally sure why this error occurs, especially since it was previously working.
This question seems to have the same issue; however, there don't seem to be any answers that help me. If possible, I would also like to know what the error message itself means.
Below is my code for invoking my multer function
const sendFile = () => {
  if (acceptedFiles.length > 0) {
    for (let i = 0; i < acceptedFiles.length; i++) {
      let file = acceptedFiles[i];
      let blob = file.slice(0, file.size);
      let newFile = new File([blob], file.name, { type: "text/plain" });
      let formData = new FormData();
      formData.append('dataFile', newFile);
      fetch('https://somePath', {
        method: "POST",
        body: formData
      })
        .then(res => res.text())
        .then((x) => console.log(x))
        .then(() => sendDB(acceptedFiles[i], i));
    }
  } else {
    window.alert("No files have been selected to be uploaded");
  }
};
Here is where I actually send my file to my google bucket
app.post('/uploadFile', multer.single('dataFile'), async (req, res) => {
  console.log('made it to upload');
  try {
    if (req.file) {
      console.log('file found trying to upload....');
      const blob = bucket.file(req.file.originalname);
      const blobstream = blob.createWriteStream();
      blobstream.on('finish', () => {
        res.status(200).send('Success');
      });
      blobstream.end(req.file.buffer);
    } else {
      // Without this branch the request would hang when no file is attached
      res.status(400).send('No file received');
    }
  } catch (error) {
    res.status(500).send(error);
    console.log(error);
  }
});
Here is how I configured my multer and google cloud storage
const multer = Multer({
  storage: Multer.memoryStorage()
});

let projectId = 'someID';
let keyFilename = 'someKeyFileName';
const storage = new Storage({
  projectId,
  keyFilename
});
I'm more than happy to provide any additional information if it's needed.
Update:
After some more research, it seems that Firebase does not support server-side file uploads. Does this apply only when using Firebase Storage, or also to my case, where I am attempting to upload my files to a Google Cloud bucket? Reference to this claim in case I'm incorrect.
I'm not sure if it's your case, because I was getting the same error without Firebase, but in my case the problem was Multer itself. Downgrading to 1.4.3 made it start working; newer releases moved to a new major version of busboy, which appears to be stricter about how multipart bodies are terminated, and that would explain the "Unexpected end of form" message.
See https://github.com/expressjs/multer/issues/1144
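If downgrading fixes it for you too, it may be worth pinning the version in package.json so a later install doesn't pull the breaking release back in (a sketch, assuming no other constraints on the package):

"dependencies": {
  "multer": "1.4.3"
}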

Node.js Cloud Function - Stream CSV data directly to Google Cloud Storage file

I have a script that can call a RESTful API and retrieve CSV data from a report in chunks. I'm able to concatenate, parse, and display this data in the console. I am also able to write this CSV data to a local file and store it.
What I am trying to figure out is how to skip creating a file to store this data before uploading it to GCS and instead transfer it directly into Google Cloud Storage to save as a file. Since I am trying to make this a serverless cloud function, I am trying to stream it directly from memory into a Google Cloud Storage file.
I found this 'Streaming Transfers' documentation on Google, but it only references doing this with gsutil, and I am struggling to find any examples or documentation on how to do this with Node.js. I also tried to follow this answer on Stack Overflow, but it's from 2013 and the methods seem a little outdated. My script also isn't user-facing, so I don't need to hit any routes.
I am able to upload local files directly to my bucket using the function below, so authentication isn't an issue. I'm just unsure how to turn a CSV blob or object in memory into a file in GCS. I haven't been able to find many examples, so I wasn't sure if anyone else has solved this issue in the past.
const { Storage } = require('@google-cloud/storage');

const storage = new Storage({
  projectId,
  keyFilename
});

function uploadCSVToGCS() {
  const localFilePath = './test.csv';
  const bucketName = "Test_Bucket";
  const bucket = storage.bucket(bucketName);
  // upload() returns a promise, so callers can await the result
  return bucket.upload(localFilePath);
}
I also found a third-party library that Google references called 'boto' that seems to do what I want, but unfortunately it's for Python, not Node.js.
Streaming object data to Cloud Storage is illustrated in the documentation. You will need to understand how node streams work, and make use of createWriteStream. The sample code is not exactly what you want, but you'll use the same pattern:
function sendUploadToGCS(req, res, next) {
  if (!req.file) {
    return next();
  }

  const gcsname = Date.now() + req.file.originalname;
  const file = bucket.file(gcsname);

  const stream = file.createWriteStream({
    metadata: {
      contentType: req.file.mimetype
    },
    resumable: false
  });

  stream.on('error', (err) => {
    req.file.cloudStorageError = err;
    next(err);
  });

  stream.on('finish', () => {
    req.file.cloudStorageObject = gcsname;
    file.makePublic().then(() => {
      req.file.cloudStoragePublicUrl = getPublicUrl(gcsname);
      next();
    });
  });

  stream.end(req.file.buffer);
}
@doug-stevenson thanks for pushing me in the right direction. I was able to get it to work with the following code:
const { Storage } = require('@google-cloud/storage');
const request = require('request');

const storage = new Storage();
const bucketName = 'test_bucket';
const blobName = 'test.csv';

const bucket = storage.bucket(bucketName);
const blob = bucket.file(blobName);

function pipeCSVToGCS(redirectUrl) {
  request.get(redirectUrl)
    .pipe(blob.createWriteStream({
      metadata: {
        contentType: 'text/csv'
      }
    }))
    .on('error', (err) => {
      console.error('error occurred', err);
    })
    .on('finish', () => {
      console.info('success');
    });
}
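As a side note: if the CSV is already sitting in memory as a string or Buffer, the library's file.save() wraps the same write stream in a single call. A minimal sketch, reusing the bucket and object names from above:

// Sketch: write in-memory CSV data straight to the bucket with file.save()
async function saveCSVToGCS(csvData) {
  await storage
    .bucket('test_bucket')
    .file('test.csv')
    .save(csvData, { contentType: 'text/csv' });
}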

How to upload an image from an external link to google cloud storage?

I am using google-cloud-storage on Node.js, trying to find a way to programmatically download an image from an external URL and then upload it to GCS.
I'm using the fetch API to get the image and turn it into a blob:
fetch('image.jpg')
  .then(res => res.blob())
  .then(blob => {
    // res.blob() returns a promise, so the blob is only available here
  });
All the tutorials I've found on the web deal with form uploads using multer, but nothing covers my case.
Now, what's the right solution to upload this blob to my GCS bucket?
EDIT: I've also tried this, without success:
bucket.upload(urlImage, function(err, file) {
  const blobStream = file.createWriteStream();
  blobStream.on('error', (err) => {
    console.log('error', err);
  });
  blobStream.on('finish', () => {
    console.log('finished with success');
  });
  blobStream.end(file);
});
Try using the request module for Node.js to obtain the image from the URL and pipe the result to be written into the bucket:
var request = require('request');
const { Storage } = require('@google-cloud/storage');

const storage = new Storage();
const bucket = storage.bucket('your-bucket-name');
const file = bucket.file('image-name.jpg');

request('image-URL').pipe(file.createWriteStream());
It worked for me.
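Worth noting that the request package has since been deprecated. A hedged alternative on Node 18+ uses the built-in fetch plus the stream utilities to get the same pipe behaviour (the URL and object names are placeholders):

const { pipeline } = require('node:stream/promises');
const { Readable } = require('node:stream');
const { Storage } = require('@google-cloud/storage');

const storage = new Storage();
const file = storage.bucket('your-bucket-name').file('image-name.jpg');

async function uploadFromUrl(imageUrl) {
  const res = await fetch(imageUrl); // global fetch, Node 18+
  if (!res.ok) throw new Error(`fetch failed: ${res.status}`);
  // Bridge the web stream from fetch into a Node stream and pipe it to GCS
  await pipeline(Readable.fromWeb(res.body), file.createWriteStream());
}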

Firebase Cloud function & Cloud Vision API: TypeError: vision.detectText is not a function

I am trying to use the Cloud Vision API in a Firebase Cloud Function to OCR an image stored in Firebase Storage.
I import the Google Cloud Vision client library as follows:
const vision = require('@google-cloud/vision');
and then I call:
vision.detectText({ source: { imageUri: 'gs://xxxx.appspot.com/yyyy.JPG' } })
However, I get an error:
TypeError: vision.detectText is not a function
Initially I used:
vision.textDetection({ source: { imageUri: ... } })
from this example https://cloud.google.com/vision/docs/reference/libraries#client-libraries-install-nodejs, but I got the exact same error. I then read that textDetection had been replaced by detectText, but I had no more success.
Thanks in advance
It looks like you're not calling the APIs as documented. First, take a look at the sample code provided in the documentation:
const vision = require('@google-cloud/vision');

// Creates a client
const client = new vision.ImageAnnotatorClient();

/**
 * TODO(developer): Uncomment the following line before running the sample.
 */
// const fileName = 'Local image file, e.g. /path/to/image.png';

// Performs text detection on the local file
client
  .textDetection(fileName)
  .then(results => {
    const detections = results[0].textAnnotations;
    console.log('Text:');
    detections.forEach(text => console.log(text));
  })
  .catch(err => {
    console.error('ERROR:', err);
  });
You have to first create an ImageAnnotatorClient object, which has the textDetection() method you can call.
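Since the question's image lives in Cloud Storage rather than on disk, note that the same method also accepts a request object with a GCS URI. A sketch using the bucket path from the question:

const vision = require('@google-cloud/vision');
const client = new vision.ImageAnnotatorClient();

// Run text detection against an object in Cloud Storage instead of a local file
client
  .textDetection({ image: { source: { imageUri: 'gs://xxxx.appspot.com/yyyy.JPG' } } })
  .then(([result]) => {
    result.textAnnotations.forEach(text => console.log(text.description));
  })
  .catch(err => console.error('ERROR:', err));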

How to upload a file in Node.js

I am a newbie in Node.js and Firebase, but I need to upload files. I saw this tutorial but could not grasp much of it; I am totally confused. In this tutorial, which function receives the chosen file?
The code is:
const keyFilename = "./my-private-api-key-file.json"; // replace this with your API key file
const projectId = "my-project-id-should-go-here"; // replace with your project id
const bucketName = `${projectId}.appspot.com`;

const mime = require('mime');
const gcs = require('@google-cloud/storage')({
  projectId,
  keyFilename
});

const bucket = gcs.bucket(bucketName);
const filePath = `./package.json`;
const uploadTo = `subfolder/package.json`;
const fileMime = mime.lookup(filePath);

bucket.upload(filePath, {
  destination: uploadTo,
  public: true,
  metadata: {
    contentType: fileMime,
    cacheControl: "public, max-age=300"
  }
}, function (err, file) {
  if (err) {
    console.log(err);
    return;
  }
  console.log(createPublicFileURL(uploadTo));
});

function createPublicFileURL(storageName) {
  return `http://storage.googleapis.com/${bucketName}/${encodeURIComponent(storageName)}`;
}
I want to upload a file when the user selects one. Can anyone give me something to start with? Thanks.
The purpose of the tutorial you're following is for handling files AFTER they have been uploaded to your server. This is useful for services (like Heroku or OpenShift, as the tutorial mentions) with ephemeral file systems that won't hold uploaded content permanently.
You should try following this tutorial instead, which explains how to build a front-end user interface with AJAX. Once you've done that, you might then follow the tutorial linked in your question to get those files into permanent storage.
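To give a rough idea of the browser side, here's a minimal sketch of sending a user-selected file to a server endpoint; the element id, endpoint path, and field name are all placeholders that have to match your own markup and server:

// Sketch: assumes an <input type="file" id="picker"> element exists in the page
document.getElementById('picker').addEventListener('change', (event) => {
  const file = event.target.files[0];
  if (!file) return;
  const formData = new FormData();
  formData.append('dataFile', file); // field name must match the server's parser
  fetch('/uploadFile', { method: 'POST', body: formData })
    .then(res => res.text())
    .then(text => console.log('server replied:', text))
    .catch(err => console.error(err));
});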
Hope this helped.
Use the multiparty library; it helped me a lot in my current Node.js project.
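For context, a minimal sketch of what that can look like in an Express handler; the route, field name, and bucket object are illustrative and assumed to be configured as in the earlier answers:

const multiparty = require('multiparty');

app.post('/upload', (req, res) => {
  const form = new multiparty.Form();
  form.parse(req, (err, fields, files) => {
    if (err) return res.status(400).send(err.message);
    // multiparty buffers uploads to temp files on disk; push one to the bucket
    const upload = files.dataFile[0];
    bucket.upload(upload.path, { destination: upload.originalFilename })
      .then(() => res.send('Success'))
      .catch(e => res.status(500).send(e.message));
  });
});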
