Firebase Cloud Function Serving Local File for Download - node.js

I created a cloud function that generates an xlsx file, and I need the user to download that file after it's generated.
Method 1: Upload to Bucket, then redirect
So far I've tried uploading the file to a bucket using this API and then redirecting the user to the bucket file URL. I also double-checked the bucket name using this API, but I get the same error every time:
{"error":{"code":500,"status":"INTERNAL","message":"function crashed","errors":["socket hang up"]}}
Portion of the code that contains uploading to bucket:
const { Storage } = require('@google-cloud/storage');
const storage = new Storage();
await storage.bucket('bucket-name').upload('myfile.xlsx', {
  gzip: false,
});
Portion of the code that proves file exists:
const file = 'myfile.xlsx';
fs.access(file, fs.constants.F_OK, (err) => {
  console.log(`${file} ${err ? 'does not exist' : 'exists'}`);
});
I also checked whether the "@google-cloud/storage" library reads the file, and it reads it correctly and gets the file size right.
Method 2: Direct Download
Download the file directly. The problem is that every doc online about sending a local file to the user from Node.js assumes you set up a custom server to serve the file, but I'm using Firebase, so I'm not in control of that server.

Just wanted to add more detail to the answer: there's no need to write the data to a file and read it back in order to download it; simply take the data and send it, using the few lines below.
res.setHeader('Content-Type', 'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet');
res.setHeader("Content-Disposition", "attachment; filename=" + fileName);
res.end(fileData, 'binary');
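For instance, a minimal sketch of a full handler that builds the workbook in memory and streams it straight to the response. The exceljs package and the getExcelFile name are assumptions for illustration; the question doesn't say which library generates the workbook, and any library that can produce a Buffer works the same way.
const functions = require('firebase-functions');
const ExcelJS = require('exceljs'); // assumption: not named in the question

exports.getExcelFile = functions.https.onRequest(async (request, response) => {
  // Build the workbook entirely in memory
  const workbook = new ExcelJS.Workbook();
  const sheet = workbook.addWorksheet('Report');
  sheet.addRow(['Hello', 'World']);

  // writeBuffer() returns the xlsx contents as a Buffer, so no temp file is needed
  const fileData = await workbook.xlsx.writeBuffer();

  response.setHeader('Content-Type', 'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet');
  response.setHeader('Content-Disposition', 'attachment; filename=myfile.xlsx');
  response.end(Buffer.from(fileData), 'binary');
});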

If your Excel file is created and should be returned to the client as the response to an HTTP request (a call to an API endpoint), then this is how you can do it.
export const getExcelFile = functions.https.onRequest(async (request, response) => {
  // ...
  // Create your file and such
  // ...
  await storage.bucket('bucket-name').upload('myfile.xlsx', {
    gzip: false,
  });
  response.setHeader('Content-Type', 'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet');
  response.send(fs.readFileSync('myfile.xlsx'));
  return null;
});
Otherwise, if the Excel file is created in response to an event and you want the user to download it at some later time, then you create a download link and serve it to the user in any way you want.
// ...
// Create your file and such
// ...
const [file] = await storage.bucket('bucket-name').upload('myfile.xlsx', {
  gzip: false,
});
const [downloadUrl] = await file.getSignedUrl({
  action: 'read',
  expires: '03-20-2019' // Link expiry date: MM-DD-YYYY
});
console.log(downloadUrl);
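Since the question's Method 1 was "upload to bucket, then redirect", one way to tie the two pieces together is to have the HTTPS function redirect the browser to the signed URL. A minimal sketch; the downloadReport name is hypothetical and not from the question:
exports.downloadReport = functions.https.onRequest(async (request, response) => {
  // ... generate 'myfile.xlsx', upload it and obtain downloadUrl as shown above ...
  response.redirect(downloadUrl); // 302 the browser to the time-limited signed URL
});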

Related

Trying to retrieve an mp3 file stored in AWS S3 and load it into my React client as a Blob...it's not working

I have a React web app that allows users to record mp3 files in the browser. These mp3 files are saved in an AWS S3 bucket and can be retrieved and loaded back into the React app during the user's next session.
Saving the file works just fine, but when I try to retrieve the file with getObject() and create an mp3 blob on the client side, I get a small, unusable blob:
Here's the journey the recorded mp3 file goes on:
1) Saving to S3
In my Express/Node server, I receive the uploaded mp3 file and save to the S3 bucket:
// SAVE THE COMPLETED AUDIO TO S3
router.post("/", [auth, upload.array('audio', 12)], async (req, res) => {
  try {
    // get file
    const audioFile = req.files[0];
    // create object key
    const userId = req.user;
    const projectId = req.cookies.currentProject;
    const { sectionId } = req.body;
    const key = `${userId}/${projectId}/${sectionId}.mp3`;
    const fileStream = fs.createReadStream(audioFile.path);
    const uploadParams = {
      Bucket: bucketName,
      Body: fileStream,
      Key: key,
      ContentType: "audio/mp3"
    };
    const result = await s3.upload(uploadParams).promise();
    res.send(result.key);
  } catch (error) {
    console.error(error);
    res.status(500).send();
  }
});
As far as I know, there are no problems at this stage. The file ends up in my S3 bucket with "type: mp3" and "Content-Type: audio/mp3".
2) Loading file from S3 Bucket
When the React app is loaded, an HTTP GET request is made to my Express/Node server to retrieve the mp3 file from the S3 bucket:
// LOAD A FILE FROM S3
router.get("/:sectionId", auth, async (req, res) => {
  try {
    // create key from user/project/section IDs
    const sectionId = req.params.sectionId;
    const userId = req.user;
    const projectId = req.cookies.currentProject;
    const key = `${userId}/${projectId}/${sectionId}.mp3`;
    const downloadParams = {
      Key: key,
      Bucket: bucketName
    };
    s3.getObject(downloadParams, function (error, data) {
      if (error) {
        console.error(error);
        return res.status(500).send();
      }
      res.send(data);
    });
  } catch (error) {
    console.error(error);
    res.status(500).send();
  }
});
The "data" returned here is as such:
3) Making a Blob URL on the client
Finally, in the React client, I try to create an 'audio/mp3' blob from the returned array buffer
const loadAudio = async () => {
  const res = await api.loadAudio(activeSection.sectionId);
  const blob = new Blob([res.data.Body], { type: 'audio/mp3' });
  const url = URL.createObjectURL(blob);
  globalDispatch({ type: "setFullAudioURL", payload: url });
};
The created blob is severely undersized and appears to be completely unusable. Downloading the file results in a 'Failed - No file' error.
I've been stuck on this for a couple of days now with no luck. I would seriously appreciate any advice you can give!
Thanks
EDIT 1
Just some additional info here: in the upload parameters, I set the Content-Type as audio/mp3 explicitly. This is because when not set, the Content-Type defaults to 'application/octet-stream'. Either way, I encounter the same issue with the same result.
EDIT 2
At the request of a commenter, here is the res.data available on the client-side after the call is complete:
Based on the output of res.data on the client, there are a couple of things that you'd need to do:
Replace uses of res.data.Body with res.data.Body.data (as the actual data array is in the data attribute of res.data.Body)
Pass a Uint8Array to the Blob constructor, as the existing array is of a larger type, which will create an invalid blob
Putting that together, you would end up replacing:
const blob = new Blob([res.data.Body], {type: 'audio/mp3' });
with:
const blob = new Blob([new Uint8Array(res.data.Body.data)], {type: 'audio/mp3' });
Having said all that, the underlying issue is that the NodeJS server is sending the content over as a JSON encoded serialisation of the response from S3, which is likely overkill for what you are doing. Instead, you can send the Buffer across directly, which would involve, on the server side, replacing:
res.send(data);
with:
res.set('Content-Type', 'audio/mp3');
res.send(data.Body);
and on the client side (likely in the loadAudio method) processing the response as a blob instead of JSON. If using the Fetch API then it could be as simple as:
const blob = await fetch(<URL>).then(x => x.blob());
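Putting both halves of that change together, a minimal sketch; the fetch-based client stands in for the question's api.loadAudio helper, and the /api/audio/... path is hypothetical:
// Server: send the raw audio bytes instead of the JSON-serialised S3 response
s3.getObject(downloadParams, function (error, data) {
  if (error) {
    console.error(error);
    return res.status(500).send();
  }
  res.set('Content-Type', 'audio/mp3');
  res.send(data.Body); // data.Body is a Buffer, delivered as binary
});

// Client: read the response as a Blob rather than JSON
const loadAudio = async () => {
  const response = await fetch(`/api/audio/${activeSection.sectionId}`); // hypothetical endpoint path
  const blob = await response.blob();
  const url = URL.createObjectURL(blob);
  globalDispatch({ type: "setFullAudioURL", payload: url });
};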
Your server-side code seems alright to me. I'm not super clear about the client-side approach; do you load this blob into the HTML5 audio player?
I have a few approaches, assuming you're trying to load this into an audio tag in the UI:
<audio controls src="data:audio/mpeg;base64,blahblahblah or html src" />
Assuming that the file you uploaded to S3 is valid, here are two approaches:
1) Return the data as a base64 string instead of as a buffer directly from S3. You can do this on the server side by returning:
const base64MP3 = data.Body.toString('base64');
You can then pass this into the player's src property, prefixed with data:audio/mpeg;base64, and it will play the audio (see the client-side sketch after the presigned-URL snippet below).
2) Instead of returning the entire MP3 file, have your sectionId method return a presigned S3 URL. Essentially, this is a direct link to the object in S3 that is authorized for, say, 5 minutes.
You should then be able to use this URL directly to stream the audio and set it as the src. Keep in mind that it will expire.
const url = s3.getSignedUrl('getObject', {
  Bucket: myBucket,
  Key: myKey,
  Expires: signedUrlExpireSeconds
});
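On the client, either approach plugs straight into an audio element. A minimal sketch; the { base64 } and { url } response shapes are assumptions about how you'd return the values above, not the asker's actual api helper:
// Approach 1: base64 string from the server, used as a data URI
const { base64 } = (await api.loadAudio(activeSection.sectionId)).data; // assumed response shape
const dataUri = `data:audio/mpeg;base64,${base64}`;
// <audio controls src={dataUri} />

// Approach 2: presigned URL from the server, used directly as the src
const { url } = (await api.loadAudio(activeSection.sectionId)).data; // assumed response shape
// <audio controls src={url} />  (remember the link expires after signedUrlExpireSeconds)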
You stated: "The created blob is severely undersized and appears to be completely unusable."
This suggests an encoding issue: once you read the MP3 from the Amazon S3 bucket, you need to encode it properly so it functions in a web page.
I did a similar multimedia use case that involved MP4 and a Java app. That is, I wanted an MP4 obtained from a bucket to play in the web page, as shown in this example web app.
Once I read the byte stream from the S3 bucket, I had to encode it so it would play in an HTML video tag. Here is a good reference on how to properly encode an MP3 file.

creategunzip() on google cloud storage object

So I'm uploading backup files in JSON format to a Google Cloud Storage bucket. The server is NodeJS. To save space, I want to compress the files before uploading.
My function to upload a file is:
const bufferStream = new stream.PassThrough();
bufferStream.end(Buffer.from(req.file.buffer, 'utf8'));
const bucket = storage.bucket('backups');
const filename = 'backup.json.gz';
const file = bucket.file(filename);
const writeStream = file.createWriteStream({
  metadata: {
    contentType: 'application/json',
    contentEncoding: 'gzip'
  },
  validation: "md5"
});
bufferStream.pipe(zlib.createGzip()).pipe(writeStream).on('finish', async () => {
  return res.status(200).end();
});
This function works. My problem is with decompressing while downloading. My function here is:
const bucket = storage.bucket('backups');
let backup = '';
const readStream = bucket.file('backup.json.gz').createReadStream();
readStream.pipe(zlib.createGunzip()); // <-- here
readStream.on('data', (data) => {
  backup += data;
});
readStream.on('end', () => {
  res.status(200).send(backup).end();
});
When I use the download function, I get the following error:
Error: incorrect header check
Errno: 3
code: Z_DATA_ERROR
When I just delete the createGunzip() call, it all works! I can even read the content of the file, but for some reason I'm thinking this might not be the ideal solution. Right now, for testing, I have files of at most 50 kB, but they will probably grow beyond 10 MB in production.
Does the createGunzip() function expect a buffer? Or is there something else wrong?
Thanks!
According to the documentation, if your objects are gzipped and uploaded properly, then the returned object is automatically decompressed; that's why no gunzipping is needed in your case.
If you want to receive the file as-is (still gzipped), you should include an Accept-Encoding: gzip header with your request.
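For example, a minimal sketch of the download side without the gunzip step, plus the opt-out for raw bytes. The decompress option is an assumption about your @google-cloud/storage version; older releases may not support it:
// Because the object was uploaded with contentEncoding: 'gzip', the stream
// below already yields decompressed JSON, so no createGunzip() is needed.
const readStream = storage.bucket('backups').file('backup.json.gz').createReadStream();

// If you want the raw gzipped bytes instead, disable transparent decompression
// (assumption: library version supports `decompress`) and pipe through zlib yourself:
const rawStream = storage
  .bucket('backups')
  .file('backup.json.gz')
  .createReadStream({ decompress: false });
const jsonStream = rawStream.pipe(zlib.createGunzip());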

Display PDF file in ReactJS that is received from a Node.js server?

I am trying to build a system where a user can store PDF files on a server and another user can view those PDF files with a simple click on a file link.
I am trying to store a file in a MySQL database and retrieve it using app.get(). I have successfully stored the file in the database as a BLOB, but when I retrieve it, it comes back in some other format.
I have also tried storing the file in a local ./uploads folder using 'express-fileupload', but that also doesn't seem to work when I try to retrieve the file location. After receiving the file location, I send it back to my React app and then try to open it using embed and iframe tags.
I have also tried 'react-pdf' and 'simple-react-pdf', but nothing seems to work.
Below is the server-side code that sends the PDF file. I have also tried sending the location of the PDF file stored at the path shown in the code below, but that doesn't work either.
app.get('/getFile', (req, res) => {
  const { email, courseid, filename } = req.query;
  console.log(email);
  console.log(courseid);
  console.log(filename);
  const filePath = `${__dirname}/uploads/${filename}`;
  fs.readFile(filePath, function (err, data) {
    console.log(data);
    res.contentType("application/pdf");
    res.send(data);
  });
});
This worked for me:
Node:
app.get("/getFile", function(req, res) {
res.sendFile(__dirname + "/test.pdf");
});
React:
axios(`http://localhost:5000/getFile`, {
  method: "GET",
  responseType: "blob" // Force to receive data in a Blob format
})
  .then(response => {
    // Create a Blob from the PDF stream
    const file = new Blob([response.data], {
      type: "application/pdf"
    });
    // Build a URL from the file
    const fileURL = URL.createObjectURL(file);
    // Open the URL in a new window
    window.open(fileURL);
  })
  .catch(error => {
    console.log(error);
  });
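If you would rather show the PDF inline than open a new window (the question mentions embed and iframe tags), the same object URL can be used as the element's src. A minimal sketch, where setPdfUrl is an assumed useState setter rather than anything from the original code:
// In the .then() handler, keep the object URL in component state instead of calling window.open
setPdfUrl(fileURL); // assumed useState setter

// In the JSX:
// <iframe src={pdfUrl} width="100%" height="600" title="PDF preview" />
// or
// <embed src={pdfUrl} type="application/pdf" width="100%" height="600" />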

Sending Zip file from server to client browser with Express and Archiver

I am a beginner with Node and I am trying to figure out how to create a zip file on the server, send it to the client, and download it to the user's browser. I am using the Express framework, and I am using Archiver to do the actual zipping. My server code, taken from Dynamically create and stream zip to client, is the following:
router.get('/image-dl', function (req, res) {
  res.writeHead(200, {
    'Content-Type': 'application/zip',
    'Content-disposition': 'attachment; filename=myFile.zip'
  });
  var zip = archiver('zip');
  // Send the file to the page output.
  zip.pipe(res);
  // Create zip with some files. Two dynamic, one static. Put #2 in a sub folder.
  zip.append('Some text to go in file 1.', { name: '1.txt' })
    .append('Some text to go in file 2. I go in a folder!', { name: 'somefolder/2.txt' })
    .finalize();
});
So it zips two text files and returns the result. On the client side I am using the following function in a service to call that endpoint:
downloadZip() {
  const headers = new Headers({ 'Content-Type': 'application/json' });
  const token = localStorage.getItem('token')
    ? '?token=' + localStorage.getItem('token')
    : '';
  return this.http.get(this.endPoint + '/job/image-dl' + token, { headers: headers })
    .map((response: Response) => {
      const result = response;
      return result;
    })
    .catch((error: Response) => {
      this.errorService.handleError(error.json());
      return Observable.throw(error.json());
    });
}
and then I have another function which calls downloadZip() and actually downloads the zip file to the user's local browser.
testfunc() {
  this.jobService.downloadZip().subscribe(
    (blah: any) => {
      var blob = new Blob([blah], { type: "application/zip" });
      FileSaver.saveAs(blob, "helloworld.zip");
    }
  );
}
When testfunc() is called, a zip file is downloaded to the user's browser, but when I try to unzip it, it creates a zip.cpgz file, which then turns back into a zip file when clicked, in an infinite loop. This indicates that some kind of corruption happened. Can anyone see where I went wrong here?
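A minimal sketch of one thing worth checking: asking the HTTP client for a Blob explicitly, so the zip bytes are never run through text or JSON parsing. This assumes Angular's HttpClient is available (the service above uses the older Http module), so treat it as a sketch rather than a confirmed fix:
// Service: request the zip as a Blob
downloadZip() {
  const token = localStorage.getItem('token') ? '?token=' + localStorage.getItem('token') : '';
  return this.http.get(this.endPoint + '/job/image-dl' + token, { responseType: 'blob' });
}

// Caller: the emitted value is already a Blob
this.jobService.downloadZip().subscribe((blob: Blob) => {
  FileSaver.saveAs(blob, 'helloworld.zip');
});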

Upload base64 encoded jpeg to Firebase Storage (Admin SDK)

I am trying to send a picture from my mobile hybrid app (Ionic 3) to my Heroku backend (Node.js), have the backend upload the picture to Firebase Storage, and return the newly uploaded file's download URL to the mobile app.
Keep in mind that I am using the Firebase Admin SDK for Node.js.
So I send the base64-encoded image to Heroku (I checked the encoded string with an online base64 decoder and it is alright), which is handled by the following function:
const uploadPicture = function (base64, postId, uid) {
  return new Promise((resolve, reject) => {
    if (!base64 || !postId) {
      reject("news.provider#uploadPicture - Could not upload picture because at least one param is missing.");
    }
    let bufferStream = new stream.PassThrough();
    bufferStream.end(Buffer.from(base64, 'base64'));
    // Retrieve default storage bucket
    let bucket = firebase.storage().bucket();
    // Create a reference to the new image file
    let file = bucket.file(`/news/${uid}_${postId}.jpg`);
    bufferStream.pipe(file.createWriteStream({
      metadata: {
        contentType: 'image/jpeg'
      }
    }))
      .on('error', error => {
        reject(`news.provider#uploadPicture - Error while uploading picture ${JSON.stringify(error)}`);
      })
      .on('finish', (file) => {
        // The file upload is complete.
        console.log("news.provider#uploadPicture - Image successfully uploaded: ", JSON.stringify(file));
      });
  });
};
I have 2 major issues:
1) The upload succeeds, but when I go to the Firebase Storage console there is an error when I try to display the preview of the picture, and I cannot open it on my computer after downloading it. I guess it is an encoding thing...?
2) How can I retrieve the newly uploaded file's download URL? I was expecting an object to be returned in the .on('finish') handler, like in the upload() function, but none is returned (file is undefined). How can I retrieve this URL to send it back in the server response?
I want to avoid using the upload() function because I don't want to host files on the backend, as it is not a dedicated server.
My problem was that I added data:image/jpeg;base64, at the beginning of the base64 string; I just had to remove it.
For the download url, I did the following:
const config = {
  action: 'read',
  expires: '03-01-2500'
};
file.getSignedUrl(config, (error, url) => {
  if (error) {
    return reject(error);
  }
  console.log('download url', url);
  resolve(url);
});
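getSignedUrl also has a promise form, matching the pattern used earlier in this page; a minimal sketch using the same config, assuming the caller is in an async function:
// Promise form: resolves with an array whose first element is the URL
const [downloadUrl] = await file.getSignedUrl({
  action: 'read',
  expires: '03-01-2500'
});
console.log('download url', downloadUrl);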
