Handling errors in Express pipelines - Node.js

I have a small web service that basically receives a PUT and saves the payload to a file. It can happen that the file write fails due to permission issues, and I would like that to be communicated back to the client.
To make it easier to read, I have boiled the program down to a minimum. The req.pipe chain is much longer in the real program (with many more possibilities for errors).
const fs = require('fs');
const express = require('express');
const app = express();

app.put('/write/:id', (req, res, next) => {
  const filename = 'data/' + req.params.id;
  console.log("write: " + filename);
  req
    .pipe(fs.createWriteStream(filename))
    .on('error', next)
  req.on('end', () => {
    res.send('saved\n');
    console.log("sent response");
  })
});

app.listen(8081, '0.0.0.0');
The trouble is that no matter what I do, it always responds "saved" to the client. I had kind of hoped the next call would have got me out of there.
What is the elegant way to send differentiated responses for errors that occur server side?

Several hours later, it seems I nailed it:
const fs = require('fs');
const express = require('express');
const app = express();

app.put('/write/:id', (req, res, next) => {
  const filename = 'data/' + req.params.id;
  console.log("write: " + filename);
  req
    .pipe(fs.createWriteStream(filename))
    .on('error', (e) => {
      console.log("error ", e);
      res.status(400).send("failed");
    })
    .on('close', () => {
      res.send("saved\n");
    });
});

app.listen(8081, '0.0.0.0');
Notice how I'm listening for close on the write stream within the pipeline, and not for end on the request itself.
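For longer chains there is also Node's built-in stream.pipeline, which forwards an error from any stage of the chain to a single callback; from there it can be handed to Express's error handling with next(err). A minimal sketch of that approach (the error-handling middleware at the bottom is my own addition, an assumption rather than part of the original program):

const fs = require('fs');
const { pipeline } = require('stream');
const express = require('express');
const app = express();

app.put('/write/:id', (req, res, next) => {
  const filename = 'data/' + req.params.id;
  // pipeline() destroys every stream in the chain and reports the first error
  pipeline(req, fs.createWriteStream(filename), (err) => {
    if (err) return next(err); // hand the error to the middleware below
    res.send('saved\n');
  });
});

// Express error-handling middleware (recognised by its four arguments)
app.use((err, req, res, next) => {
  console.error(err);
  res.status(500).send('failed\n');
});

app.listen(8081, '0.0.0.0');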

Related

Downloading JSON file in expressjs and reading it

I have a task where I am given a URL such as https://xyz.json. Opening this URL prompts the download of the JSON file to the local machine. I am now required to read and use this JSON data for further processing. Since I am new to NodeJS and Express, I find myself confused about how to achieve this in ExpressJS.
This is what I've tried:
const https = require("https");
const fs = require("fs");

const file = fs.createWriteStream("outputFile.json");
const request = https.get(
  "https://xyz.json",
  function (response) {
    response.pipe(file);
    // after download completed close filestream
    file.on("finish", () => {
      file.close();
      console.log("Download Completed");
    });
  }
);
Here, no data ends up in outputFile.json.
Qn 2) Can I download the file periodically using setTimeout()? Would that be efficient, or is there a better way of caching the data to make the application faster?
Thanks in advance!
Here's a sample app, hosted as an Express server, that downloads a JSON file whenever you hit an API route.
const express = require('express');
const cors = require('cors');
const morgan = require('morgan');
const bodyParser = require('body-parser');
const axios = require('axios');
const fs = require('fs');

const app = express();
app.use(cors());
app.use(morgan(':method :url :status :user-agent - :response-time ms'));
app.use(bodyParser.json());

app.get('/', async (req, res) => {
  try {
    const { status, data } = await axios.get('http://52.87.135.24/json-files/events.json'); // Can be replaced by your json url
    if (status === 200) {
      fs.writeFileSync('data.json', JSON.stringify(data));
      res.status(200).json({
        success: 'Downloaded file.',
        data: data // Comment it if you don't want to send the data back
      });
    } else {
      res.status(404).json({ 'Failed': 'File not found.' });
    }
  } catch (err) {
    console.log(err);
    res.status(500).json({ 'Error': 'Internal Server Error' });
  }
});

app.listen(process.env.PORT || 3000, function () {
  console.log('Express app running on port ' + (process.env.PORT || 3000));
});
And, as mentioned, this download gets triggered every time you make a request to http://localhost:3000 (in this case), so you can create a client script that acts like a cron job, using setTimeout or, more usefully, setInterval to download your file periodically.
const axios = require('axios');

setInterval(async () => {
  await axios.get('http://localhost:3000/');
}, 5000);
Here's one such script! :)

How to use Node to read a PUT file?

I'm trying to replicate the functionality of bashupload.com, but using Node. I want the simplicity of just doing curl host -T file, but I ran into some problems because I can't seem to understand how to read the PUT file. curl uses a PUT request when you use the -T option, so it has to be PUT.
I tried using packages like multiparty:
receiveAndUploadFile: function (req, res) {
  var multiparty = require('multiparty');
  var form = new multiparty.Form();
  // var fs = require('fs');
  form.parse(req, function (err, fields, files) {
    console.log('The files', files);
    console.log('The fields', fields);
  });
  res.send("Okay, bye.");
}
But this prints undefined values for files and fields.
I also tried using the express-fileupload middleware,
app.use(fileUpload({}));
but still, if I try to print req.files, I get undefined.
Is there any specific way to read the file?
Thanks a lot!
This is my main file, index.js:
const express = require("express");
const path = require("path");
const app = express();
const port = 8080;
const tools = require("./tools");
const fileUpload = require("express-fileupload");

app.use(fileUpload());
app.use(express.static(__dirname + "/assets"));

app.get("/", (req, res) => {
  res.sendFile(path.join(__dirname + "/index.html"));
});

app.get("/f", (req, res) => {
  res.send("This route is only available as a POST route.");
});

app.put("/f", tools.receiveAndUploadFile);

app.listen(port, () => {
  console.log(`Server started listening on port: ${port}`);
});
And the tools.js file:
const fs = require("fs");
const path = require("path");

module.exports = {
  receiveAndUploadFile: function (req, res) {
    console.log("Files: ", req.files);
    res.send("Okay bye");
  },
};
This is printing "Files: undefined" to the console.
A PUT and a POST are effectively the same thing. To upload arbitrary data, just read the data stream and write it to a file. Node provides a .pipe method on streams to easily pipe data from one stream into another, for example into a file stream here:
const fs = require('fs')
const express = require('express')

const app = express()
const PORT = 8080

app.get('/*', (req, res) => res.status(401).send(req.url + ': This route is only available as a POST route'))

app.put('/*', function (req, res, next) {
  console.log('Now uploading', req.url, ': ', req.get('content-length'), 'bytes')
  req.pipe(fs.createWriteStream(__dirname + req.url))
  req.on('end', function () { // Done reading!
    res.sendStatus(200)
    console.log('Uploaded!')
    next()
  })
})

app.listen(8080, () => console.log('Started on :8080'))
If you do a PUT to /file.mp4, it will upload all the data to the script dir (__dirname) plus the URL file path.
Via curl: curl http://localhost:8080/ -T hello.txt
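As in the first question above, errors from the write stream (for example a permission problem) won't surface through req's events, so a variation of the same route that also reports write errors could look like this (a sketch of mine, not part of the original answer):

app.put('/*', function (req, res) {
  const target = fs.createWriteStream(__dirname + req.url);
  target.on('error', (err) => {                  // e.g. EACCES when the directory is not writable
    console.error(err);
    res.status(500).send('upload failed\n');
  });
  target.on('close', () => res.sendStatus(200)); // respond only once the file is fully written
  req.pipe(target);
});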

Download File on Request in Firebase

I am looking for a solution to directly download a file from Firebase Storage when hitting an API endpoint. I tried initializing a Google Cloud Storage client and downloading the file from the bucket.
const app = require('express')();
const { Storage } = require("@google-cloud/storage");

const storage = new Storage({keyFilename: keyPath});

app.get("/download", (req, res) => {
  storage.bucket(bucketName).file("file.txt").download({destination: './file.txt'});
});

app.listen(8080);
But this does not work!
I simply get:
UnhandledPromiseRejectionWarning: Error: Not Found
Could someone help me, please?
Where did you initialize the app?
Original answer:
// Dependencies
const express = require('express')
const PORT = process.env.PORT || 3002;

// Initialize the App
const app = express();

// Start the app
app.listen(PORT, () => {
  console.info(`Server is listening on port ${PORT}`);
});
Update:
Making HTTP requests to download files is an asynchronous operation. You need to wait for the file to be downloaded from Google Cloud Storage before sending it to the client:
const app = require('express')();
const { Storage } = require("@google-cloud/storage");

const storage = new Storage({keyFilename: keyPath});

// I am using async/await here
app.get("/download", async (req, res) => {
  // You have to wait till the file is downloaded
  await storage.bucket(bucketName).file("file.txt").download({destination: './file.txt'});
  // Send the file to the client
  res.download('./file.txt');
});

app.listen(8080);
If the intention is to stream the file to the requesting client, you can pipe the data from Cloud Storage through to the response. It will look similar to the following:
const {Storage} = require('@google-cloud/storage');
const express = require('express');

const BUCKET_NAME = 'my-bucket';

const app = express();
const storage = new Storage({keyFilename: './path/to/service/key.json'});

app.get("/download", (req, res) => {
  storage.bucket(BUCKET_NAME).file("path/in/bucket/to/file.txt").createReadStream()
    .on('error', (err) => {
      res.status(500).send('Internal Server Error');
      console.log(err);
    })
    .on('response', (storageResponse) => {
      // make sure to check storageResponse.status
      res.setHeader('content-type', storageResponse.headers['Content-Type']);
      res.setHeader('content-length', storageResponse.headers['Content-Length']);
      res.status(storageResponse.status);
      // other headers may be necessary
      // if status != 200, make sure to end the response as appropriate (it won't reach the 'end' event below)
    })
    .on('end', () => {
      console.log('Piped file successfully.');
      res.end();
    })
    .pipe(res);
});

app.listen(8080);

Logging all requests and data in a server file in Node.js

I have to build a custom logger that logs information about each request it receives. I have to use: Agent, Time, Method, Resource, Version, Status. I think I have already created my logger and the things I want to log. Now I have to expose an endpoint http://localhost:3000/logs that returns a JSON object with all the logs, and I don't know how to do it. Help!
const express = require('express');
const fs = require('fs');
const app = express();

app.use((req, res, next) => {
  // write your logging code here
  var agent = req.headers['user-agent'];
  var time = new Date();
  var method = req.method;
  var baseUrl = req.originalUrl;
  var version = 'HTTP/' + req.httpVersion;
  var status = res.statusCode;
  var allData = agent + time + method + baseUrl + version + status;
  fs.appendFile('./log.csv', allData, (err) => {
    if (err) throw err;
    console.log(allData);
    next();
  });
});

app.get('/', (req, res) => {
  // write your code to respond "ok" here
  res.status(200).send('Ok');
});

app.get('/logs', (req, res) => {
  // write your code to return a json object containing the log data here
  fs.readFile('log.csv', 'utf8', (err, data) => {
  });
});

module.exports = app;
Check this library. You can use it for csvtojson conversion.
Anyway, consider that if the CSV file grows large, reading the whole file and converting it to JSON will be overkill. Consider a database for a scalable solution.
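If the library in question is the csvtojson package, a rough sketch of the /logs route might look like this (the package choice and the column names are my own assumptions):

const csv = require('csvtojson');

app.get('/logs', (req, res, next) => {
  // parse the CSV log into an array of objects and return it as JSON
  csv({ noheader: true, headers: ['agent', 'time', 'method', 'resource', 'version', 'status'] })
    .fromFile('./log.csv')
    .then((logs) => res.json(logs))
    .catch(next);
});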

Node JS Stream Only works on first iteration

I have a simple Node app that parses a CSV file into a string. In my server file, I call a module that makes a stream and pipes it into my parser.
The problem is that this code works perfectly the first time it is run, but fails after that. I've gotten a "write after end" error, so I believe something is wrong with the stream or the parser variable not being reset properly after each use. Thanks for any help!
const express = require('express');
const app = express();
const path = require('path');
const port = process.env.PORT || 3000;
const formidable = require('formidable');
const parser = require('./csvparse.js');
const fs = require('fs');

//send the index page on a get request
app.listen(port, () => console.log('Example app listening on port: ' + port));
app.get('*', (req, res) => res.sendFile(path.join(__dirname + "/index.html")));

app.post('/upload', function(req, res) {
  //upload the file from the html form
  var form = new formidable.IncomingForm();
  form.parse(req, function(err, fields, files) {
    if (err) throw err;
    //get the path to the uploaded file for the parser to use
    var filePath = files.spinFile.path;
    parser(filePath, function(data) {
      if (data == null) {
        res.sendFile(path.join(__dirname + "/index.html"));
      }
      res.send("<code>" + data + "</code>");
    });
  });
});
The module export function looks like this:
module.exports = function(filePath, cb) {
  var stream = fs.createReadStream(filePath);
  stream.pipe(parser);
  //when the stream is done, songsLog is complete and ready to be returned
  stream.on('close', function() {
    cb(songsLog);
  });
};
Try wrapping the contents of your module.exports in another function.
module.exports = function(filepath, cb) {
  function parse(filepath) {
    const stream = fs.createReadStream(filepath)
    etc...
  }
  return {
    parse
  }
}
then from your route, call parser.parse('file.txt') and you should have a new read stream.
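A rough sketch of that idea, where both the read stream and the parser are created fresh on every call so nothing is ever written to a stream that has already ended (createParser and the songsLog result are assumptions about what csvparse.js could provide):

const fs = require('fs');
const createParser = require('./csvparse.js'); // assumed to export a factory returning a new parser stream

module.exports = function (filePath, cb) {
  function parse() {
    const stream = fs.createReadStream(filePath);   // new read stream per call
    const parser = createParser();                  // new parser per call, so no "write after end"
    stream.pipe(parser);
    parser.on('finish', () => cb(parser.songsLog)); // songsLog: whatever the parser accumulated (assumed)
    stream.on('error', () => cb(null));
  }
  return { parse };
};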
