Download file from /tmp to client in Google Cloud Function - node.js

I am trying to download a file that an HTTP Google Cloud Function creates and saves in the /tmp directory. Everything I try throws "Error: could not handle the request".
The file my code generates is saved at /tmp/output.wav and I can use fs.readdirSync('/tmp') to see the file. But, if I try res.download('/tmp/output.wav', 'output.wav') it throws the "Error: could not handle the request".
If I do res.send( fs.readdirSync('/tmp')[0] ); I can see the file. So I know it's there and readable. Why can't I get it to download to the client?
I checked the logs and there's nothing additional. This code also runs fine locally on my machine.
I am wondering if this is due to a quota limitation? Would I get a better error if so?
Full code:
exports.master = async (req, res) => {
  const fs = require('fs');
  // Some code to make the file at /tmp/output.wav
  fs.readdirSync('/tmp').forEach(file => {
    console.log(file);
    files.push(file);
  });
  // Code will fire up to this point
  res.download('/tmp/output.wav', 'output.wav');
};

Your code is working on my end:
exports.master = async (req, res) => {
  const fs = require('fs')
  // Some code to make the file at the /tmp directory
  fs.writeFileSync('/tmp/output.txt', 'testing output')
  var files = [{}];
  fs.readdirSync('/tmp', 'utf8').forEach(file => {
    console.log(file);
    files.push(file);
  });
  res.download('/tmp/output.txt', 'output.txt')
}
I believe the problem is in your async function, in the creation of the file, or in the file itself.
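If it helps narrow it down, here is a minimal sketch (my own addition, not tested in your environment) that checks the file really exists before calling res.download and passes the optional error callback so the underlying error gets logged instead of being swallowed:

exports.master = async (req, res) => {
  const fs = require('fs');
  // ... code that writes /tmp/output.wav ...
  if (!fs.existsSync('/tmp/output.wav')) {
    return res.status(500).send('output.wav was never written');
  }
  res.download('/tmp/output.wav', 'output.wav', (err) => {
    if (err) {
      // The real cause of "could not handle the request" should show up here
      console.error(err);
    }
  });
};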

Related

Streaming a large remote file

I need to serve files that live on a different server using Node. The API endpoints are handled by Express.
The goal is to avoid holding the entire file in memory and instead stream the data so the output reaches the end user progressively.
After reading the stream API documentation, I came up with the following solution combined with the Express response. Here is the example:
const fs = require('fs');
const url = require('url');

const open = (req, res) => {
  const formattedUrl = new url.URL(
    "https://dl.bdebooks.com/Old%20Bangla%20Books/Harano%20Graher%20Jantra%20Manob%20-%20Shaktimoy%20Biswas.pdf"
  );
  const src = fs.createReadStream(formattedUrl);
  return src.pipe(res);
};
But when I hit the Express endpoint http://localhost:3000/open, it throws the following error:
TypeError [ERR_INVALID_URL_SCHEME]: The URL must be of scheme file
I would like to display the file content inline! What am I doing wrong? Any suggestions will be greatly appreciated. :)
fs.createReadStream() operates on the file system. It does not accept an http or https URL. Instead, you need to use something like http.get() (or https.get(), since your URL is https) to make the request, which gives you a readable stream that you can then pipe from.
const https = require('https');
const url = require('url');

const open = (req, res) => {
  const formattedUrl = new url.URL("https://dl.bdebooks.com/Old%20Bangla%20Books/Harano%20Graher%20Jantra%20Manob%20-%20Shaktimoy%20Biswas.pdf");
  // https.get because the target URL uses the https scheme
  https.get(formattedUrl, (stream) => {
    stream.pipe(res);
  }).on('error', (err) => {
    // send some sort of error response here
  });
};
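Since you want the content displayed inline, you can also set the response headers before piping. A small sketch along those lines (the Content-Disposition filename here is just an example, not something from your code):

const https = require('https');

const open = (req, res) => {
  const fileUrl = 'https://dl.bdebooks.com/Old%20Bangla%20Books/Harano%20Graher%20Jantra%20Manob%20-%20Shaktimoy%20Biswas.pdf';
  https.get(fileUrl, (upstream) => {
    // Tell the browser to render the PDF inline rather than download it
    res.setHeader('Content-Type', 'application/pdf');
    res.setHeader('Content-Disposition', 'inline; filename="book.pdf"');
    upstream.pipe(res);
  }).on('error', (err) => {
    res.status(502).send(err.message);
  });
};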

NodeJs/Express display pdf file in browser

I am working on a Node.js/Express project and need to show in the browser a PDF file that is stored in
/public/images
Here is relevant router code:
router.post('/show_file', async (req, res) => {
  try {
    let path = './public/images/1.pdf'
    var data = fs.readFileSync(path);
    res.contentType("application/pdf");
    res.send(data);
  } catch (err) {
    res.status(500)
    console.log(err)
    res.send(err.message)
  }
})
I don't get any errors, but nothing happens, i.e. the browser does not open the file.
Thanks in advance for any guidance.
The first change I would make is to remove the async. It just clutters the code with unneeded Promises.
Second, I removed the need to catch the exception by verifying the existence of the file with fs.existsSync(path). Try not to raise exceptions when you can avoid it: if you know something can raise an exception, test for it.
Last, and most important, I created a read stream of the file and piped the result to the response with fs.createReadStream(path).pipe(res). This way, the client receives the file as it is read and your memory is spared. Great for large files.
Reading a file can be memory intensive, so loading it all into memory is bad practice. You just need a handful of requests to overload your machine.
You can read more on the pipe method here.
In this example, any GET call to /router/show_file will return the pdf.
const express = require('express')
const app = express()
const fs = require('fs')

const router = express.Router()

router.get('/show_file', (req, res) => {
  const path = './public/images/1.pdf'
  if (fs.existsSync(path)) {
    res.contentType("application/pdf");
    fs.createReadStream(path).pipe(res)
  } else {
    res.status(500)
    console.log('File not found')
    res.send('File not found')
  }
})

app.use('/router', router) // Here we pass the router to the app with a path

app.listen(9999, () => console.log('Listening to port 9999'))
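One detail worth adding (my own note, not part of the original answer): fs.existsSync only guards against a missing file, so it can still be useful to attach an error handler to the read stream itself, for example for permission problems:

router.get('/show_file', (req, res) => {
  const path = './public/images/1.pdf'
  const stream = fs.createReadStream(path)
  stream.on('error', (err) => {
    console.log(err)
    // Only send an error status if the headers have not gone out yet
    if (!res.headersSent) res.status(500).send('Could not read file')
  })
  res.contentType("application/pdf")
  stream.pipe(res)
})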

How to download files in /tmp folder of Google Cloud Function and then upload it in Google Cloud Storage

So I need to deploy a Google Cloud Function that lets me do 2 things.
The first is to DOWNLOAD a file from an SFTP/FTP server into the /tmp local directory of the Cloud Function. The second step is to UPLOAD this file to a bucket in Google Cloud Storage.
I know how to upload, but I don't get how to DOWNLOAD files from the FTP server to my local /tmp directory.
So far I have written a GCF that receives as parameters (in the body) the configuration (config) used to connect to the FTP server, the filename, and the path.
For my test I used the following public test FTP server: https://www.sftp.net/public-online-sftp-servers with this configuration.
{
  config: {
    hostname: 'test.rebex.net',
    username: 'demo',
    port: 22,
    password: 'password'
  },
  filename: 'FtpDownloader.png',
  path: '/pub/example'
}
After the DOWNLOAD, I start the UPLOAD. Before uploading, I check whether the downloaded file exists at '/tmp/filename', but the file is never there.
See the following code:
exports.transferSFTP = (req, res) => {
  let body = req.body;
  if (body.config) {
    if (body.filename) {
      // DOWNLOAD
      const Client = require('ssh2-sftp-client');
      const fs = require('fs');
      const client = new Client();

      let remotePath;
      if (body.path)
        remotePath = body.path + "/" + body.filename;
      else
        remotePath = "/" + body.filename;

      let dst = fs.createWriteStream('/tmp/' + body.filename);

      client.connect(body.config)
        .then(() => {
          console.log("Client is connected !");
          return client.get(remotePath, dst);
        })
        .catch(err => {
          res.status(500);
          res.send(err.message);
        })
        .finally(() => client.end());

      // UPLOAD
      const {Storage} = require('@google-cloud/storage');
      const storage = new Storage({projectId: 'my-project-id'});
      const bucket = storage.bucket('my-bucket-name');
      const file = bucket.file(body.filename);

      fs.stat('/tmp/' + body.filename, (err, stats) => {
        if (stats.isDirectory()) {
          fs.createReadStream('/tmp/' + body.filename)
            .pipe(file.createWriteStream())
            .on('error', (err) => console.error(err))
            .on('finish', () => console.log('The file upload is completed !!!'));
          console.log("File exist in tmp directory");
          res.status(200).send('Successfully executed !!!')
        } else {
          console.log("File is not on the tmp Google directory");
          res.status(500).send('File is not loaded in tmp Google directory')
        }
      });
    }
    else res.status(500).send('Error: no filename on the body (filename)');
  }
  else res.status(500).send('Error: no configuration elements on the body (config)');
}
So I receive the message "File is not loaded in tmp Google directory", because after the fs.stat() call, stats.isDirectory() is false. Before I added the fs.stat() check, the code was just writing files with the correct filenames but with no content.
So I conclude that my upload works, but without the DOWNLOADED file it is really hard to copy it to Google Cloud Storage.
Thanks for your time and I hope I will find a solution.
The problem is that you're not waiting for the download to complete before the code which performs the upload starts running. While you do have a catch() statement, that is not sufficient.
Think of the first part (the download) as a separate block of code. You have told JavaScript to go off and do that block asynchronously. As soon as your script has done that, it immediately goes on to the rest of your script; it does not wait for the 'block' to complete. As a result, your upload code runs before the download has finished.
There are two things you can do. The first would be to move all the code which does the uploading into a 'then' block following the get() call (BTW, you could simplify things by using fastGet()). e.g.
client.connect(body.config)
  .then(() => {
    console.log("Client is connected !");
    return client.fastGet(remotePath, localPath);
  })
  .then(() => {
    // do the upload
  })
  .catch(err => {
    res.status(500);
    res.send(err.message);
  })
  .finally(() => client.end());
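To make that concrete, here is a sketch (untested, reusing the project ID, bucket name, and stream code from the question) of what the upload could look like inside that second then block:

client.connect(body.config)
  .then(() => client.fastGet(remotePath, '/tmp/' + body.filename))
  .then(() => {
    const {Storage} = require('@google-cloud/storage');
    const storage = new Storage({projectId: 'my-project-id'});
    const bucket = storage.bucket('my-bucket-name');
    const file = bucket.file(body.filename);
    // Wrap the streamed upload in a Promise so the chain waits for it
    return new Promise((resolve, reject) => {
      fs.createReadStream('/tmp/' + body.filename)
        .pipe(file.createWriteStream())
        .on('error', reject)
        .on('finish', resolve);
    });
  })
  .then(() => res.status(200).send('Successfully executed !!!'))
  .catch(err => res.status(500).send(err.message))
  .finally(() => client.end());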
The other alternative would be to use async/await, which will make your code look a little more 'synchronous'. Something along the lines of (untested)
async function doTransfer(remotePath, localPath) {
  try {
    let client = new Client();
    await client.connect(config);
    await client.fastGet(remotePath, localPath);
    await client.end();
    uploadFile(localPath);
  } catch (err) {
    // handle the error here
  }
}
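The HTTP handler could then simply await that helper, along these lines (again untested; uploadFile stands in for your own upload code):

exports.transferSFTP = async (req, res) => {
  try {
    await doTransfer(req.body.path + '/' + req.body.filename, '/tmp/' + req.body.filename);
    res.status(200).send('Successfully executed !!!');
  } catch (err) {
    res.status(500).send(err.message);
  }
};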
Here is a GitHub project that addresses a similar issue to yours.
There, they deploy a Cloud Function that downloads the file from the FTP server and uploads it directly to the bucket, skipping the step of having a temporary file.
The code works, but the deployment instructions in that repository are not up to date, so here are the deploy steps I suggest and have verified:
Activate Cloud Shell and run:
Clone the repository from GitHub: git clone https://github.com/RealKinetic/ftp-bucket.git
Change to the directory: cd ftp-bucket
Adapt your code as needed
Create a GCS bucket; if you don't have one already, you can create one with gsutil mb -p [PROJECT_ID] gs://[BUCKET_NAME]
Deploy: gcloud functions deploy importFTP --stage-bucket [BUCKET_NAME] --trigger-http --runtime nodejs8
In my personal experience this is more efficient than having it in two functions, unless you need to do some file editing within the same cloud function.

Firebase Cloud Function Serving Local File for Download

I created a cloud function that generates an xlsx file, and I need the user to download that file after it's generated.
Method 1: Upload to Bucket, then redirect
So far I've tried uploading the file to a bucket using this API and then redirecting the user to the bucket file URL. I also double-checked the bucket name using this API, but I get the same error every time:
{"error":{"code":500,"status":"INTERNAL","message":"function crashed","errors":["socket hang up"]}}
Portion of the code that contains uploading to bucket:
const { Storage } = require('@google-cloud/storage');
const storage = new Storage();

await storage.bucket('bucket-name').upload('myfile.xlsx', {
  gzip: false,
});
Portion of the code that proves file exists:
fs.access('myfile.xlsx', fs.constants.F_OK, (err) => {
  console.log(`${file} ${err ? 'does not exist' : 'exists'}`);
});
I also checked if the library "@google-cloud/storage" reads the file, and it reads it correctly and gets the file size right.
Method 2: Direct Download
Download the file directly. The problem is that every doc online about having Node.js send a local file to the user assumes you set up a custom server to serve the file, but I'm using Firebase, so I'm not in control of that server.
Just wanted to add more detail to the answer: there's no need to write the data to a file and read it back in order to download it. Simply take the data and send it, using the few lines below.
res.setHeader('Content-Type', 'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet');
res.setHeader("Content-Disposition", "attachment; filename=" + fileName);
res.end(fileData, 'binary');
If your excel file is created and should be returned to the client as a response to an HTTP request (calling to an API endpoint) then this is how you can do it.
export const getExcelFile = functions.https.onRequest(async (request, response) => {
  // ...
  // Create your file and such
  // ...
  await storage.bucket('bucket-name').upload('myfile.xlsx', {
    gzip: false,
  });
  response.setHeader('Content-Type', 'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet');
  response.send(fs.readFileSync('myfile.xlsx'));
  return null;
});
Otherwise, if the excel file is created as a response to an event, and you want the user to download the file at another time, then you create a download link and serve it to the user in any way you want.
// ...
// Create your file and such
// ...
const [file] = await storage.bucket('bucket-name').upload('myfile.xlsx', {
  gzip: false,
});
const [downloadUrl] = await file.getSignedUrl({
  action: 'read',
  expires: '03-20-2019' // Link expiry date: MM-DD-YYYY
});
console.log(downloadUrl);
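Instead of only logging it, you would typically hand that signed URL back to the user. For example, in an HTTP-triggered function you could respond with it directly (a small sketch; response here is the function's Express-style response object):

// After generating downloadUrl inside an HTTP function:
response.redirect(downloadUrl);
// or, if the client expects JSON:
// response.json({ downloadUrl });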

Node.js to send images via REST API

I'm struggling to find material on this.
I have a REST API, written in Node.js, that uses MongoDB.
I want users to be able to upload images (profile pictures) and have them saved on the server (in MongoDB).
A few questions: I've seen it recommended to use GridFS; is this the best solution?
How do I send these files? I've seen res.sendFile, but again, is this the best solution?
If anyone has any material they can link me to, I would be appreciative.
Thanks
You won't be able to get the file object on the server directly. To get the file object on the server, use the connect-multiparty middleware. This will allow you to access the file on the server.
var multipart = require('connect-multiparty');
var multipartmiddleware = multipart();
var mv = require('mv');
var path = require('path');

app.post("/URL", multipartmiddleware, function (req, res) {
  var uploadedImage = req.files.file;
  for (var i = 0; i < uploadedImage.length; i++) {
    var tempPath = uploadedImage[i].path;
    var targetPath = path.join(__dirname, "../../../img/Ads/" + i + uploadedImage[i].name);
    mv(tempPath, targetPath, function (err) {
      if (err) { throw err; }
    });
  }
})
Use file system
Generally in any database you store the image location in the data as a string that tells the application where the image is stored on the file system.
Unless your database needs to be portable as a single unit, the storing of images inside of the database as binary objects generally adds unnecessary size and complexity to your database.
-Michael Stearne
In MongoDB, use GridFS for storing files larger than 16 MB.
- Mongo Documentation
Therefore, unless your images will be over 16 MB, you should store the file either on a CDN (preferable) or on the server's own file system, and save its URL to the user's document in the database.
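For example, the user document could simply carry the image's URL or storage key. A hypothetical Mongoose schema, just to illustrate the idea:

const mongoose = require('mongoose');

const userSchema = new mongoose.Schema({
  name: String,
  // Path or CDN URL pointing at the stored profile picture
  profilePicUrl: String
});

module.exports = mongoose.model('User', userSchema);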
Local file system implementation
This method uses Busboy to parse the photo upload.
In the relevant HTML file:
<input type="file" title="Choose a file to upload" accept="image/*" autofocus="1">
Handler function for your photo upload route in server file (you will need to fill in the variables that apply to you and require the necessary modules):
function photoUploadHandlerFunction (req, res) {
  var busboy = new Busboy({ headers: req.headers })

  busboy.on('file', function (fieldname, file, filename, encoding, mimetype) {
    const saveToDir = path.join(__dirname, uploadsPath, user.id)
    const saveToFile = path.join(saveToDir, filename)
    const pathToFile = path.join(uploadsPath, user.id, filename)
    const writeStream = fs.createWriteStream(saveToFile)

    // Wrap each step in an arrow function so the chain actually waits for it
    createDirIfNotExist(saveToDir)
      .then(() => pipeUploadToDisk(file, writeStream))
      .then(() => findUserAndUpdateProfilePic(user, pathToFile))
      .catch((err) => {
        res.writeHead(500)
        res.end(`Server broke its promise ${err}`)
      })
  })

  busboy.on('finish', function () {
    res.writeHead(200, { 'Connection': 'close' })
    res.end("That's all folks!")
  })

  return req.pipe(busboy)
}
Where the promise functions createDirIfNotExist and pipeUploadToDisk could look like this:
function createDirIfNotExist (directory, callback) {
  return new Promise(function (resolve, reject) {
    fs.stat(directory, function (err, stats) {
      // Check if error defined and the error code is "not exists"
      if (err) {
        if (err.code === 'ENOENT') {
          fs.mkdir(directory, (err) => {
            if (err) reject(err)
            resolve('made folder')
          })
        } else {
          // just in case there was a different error:
          reject(err)
        }
      } else {
        resolve('folder already existed')
      }
    })
  })
}

function pipeUploadToDisk (file, writeStream) {
  return new Promise((resolve, reject) => {
    const fileWriteStream = file.pipe(writeStream)
    fileWriteStream.on('finish', function () {
      resolve('file written to file system')
    })
    fileWriteStream.on('error', function () {
      reject('write to file system failed')
    })
  })
}
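As a side note (not part of the original answer): on Node.js 10.12+ the createDirIfNotExist helper can be collapsed into the built-in recursive mkdir, which resolves even when the directory already exists:

const fs = require('fs')

function createDirIfNotExist (directory) {
  // { recursive: true } creates missing parent folders and does not fail if the folder exists
  return fs.promises.mkdir(directory, { recursive: true })
}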
To answer your question 'How do I send these files?', I would need to know where to (MongoDB, to the client...). If you mean to the client, you could serve the static folder where they are saved.
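A sketch of that (assuming the uploads end up under an ./uploads folder; adjust the path to wherever your handler writes them):

const express = require('express')
const path = require('path')
const app = express()

// Any file saved under ./uploads becomes reachable at /uploads/<filename>
app.use('/uploads', express.static(path.join(__dirname, 'uploads')))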
If you still want to learn about implementing GridFS, TutorialsPoint has a good tutorial.
More material
Good tutorial on handling form uploads
Tutorial using the node-formidable module
If you're using the Mongoose ODM, you can use the mongoose-crate module and send the file wherever you like for storage.
Also, this is a good case for shared object storage like AWS S3 or Azure Blob Storage. If you are running a distributed setup in something like AWS, you usually don't want to store photos on the local server.
Store the URL or key name that points to the S3 object in the database. This also integrates with the CloudFront CDN pretty easily.
As suggested before, use multipart for the actual upload.
