I'm trying to replicate the functionality of bashupload.com, but using Node. I want the simplicity of just doing curl host -T file, but I ran into problems because I can't figure out how to read the uploaded file from the PUT request. curl issues a PUT request when you use the -T option, so it has to be PUT.
I tried using packages like multiparty:
receiveAndUploadFile: function (req, res) {
  var multiparty = require('multiparty');
  var form = new multiparty.Form();
  // var fs = require('fs');
  form.parse(req, function (err, fields, files) {
    console.log('The files', files);
    console.log('The fields', fields);
  });
  res.send("Okay, bye.");
}
But this prints undefined values for files and fields.
I also tried the express-fileupload middleware:
app.use(fileUpload({}));
but still, if I try to print req.files, I get undefined.
Is there any specific way to read the file?
Thanks a lot!
This is my main file, index.js:
const express = require("express");
const path = require("path");

const app = express();
const port = 8080;
const tools = require("./tools");
const fileUpload = require("express-fileupload");

app.use(fileUpload());
app.use(express.static(__dirname + "/assets"));

app.get("/", (req, res) => {
  res.sendFile(path.join(__dirname, "index.html"));
});

app.get("/f", (req, res) => {
  res.send("This route is only available as a POST route.");
});

app.put("/f", tools.receiveAndUploadFile);

app.listen(port, () => {
  console.log(`Server started listening on port: ${port}`);
});
And the tools.js file:
const fs = require("fs");
const path = require("path");

module.exports = {
  receiveAndUploadFile: function (req, res) {
    console.log("Files: ", req.files);
    res.send("Okay bye");
  },
};
This is printing "Files: undefined" to the console.
A PUT and a POST are effectively the same thing here. The multipart parsers come up empty because curl -T sends the raw file bytes as the request body, not a multipart/form-data form, so there is nothing for them to parse. To upload arbitrary data, just read the request's data stream and write it to a file. Node streams provide a .pipe method to pipe data from one stream into another, for example into a file stream:
const fs = require('fs')
const express = require('express')

const app = express()
const PORT = 8080

app.get('/*', (req, res) => res.status(405).send(req.url + ': This route is only available as a PUT route'))

app.put('/*', function (req, res) {
  console.log('Now uploading', req.url, ':', req.get('content-length'), 'bytes')
  // Caution: req.url is used unsanitized here; fine for a demo, not for production
  const out = fs.createWriteStream(__dirname + req.url)
  req.pipe(out)
  out.on('finish', function () { // All data has been flushed to disk
    console.log('Uploaded!')
    res.sendStatus(200)
  })
  out.on('error', function (err) {
    console.error(err)
    res.sendStatus(500)
  })
})

app.listen(PORT, () => console.log('Started on :' + PORT))
If you do a PUT to /file.mp4, the uploaded data is written to the script directory (__dirname) plus the URL file path.
Via curl: curl http://localhost:8080/ -T hello.txt
For migration purposes I have to transform the content generated by Next.js into JSON of the form {content: "generated markup"} in Express.
const express = require('express');
const next = require('next');

const port = 8080;
const dev = process.env.NODE_ENV !== 'production';
const nextApp = next({ dev });
const handle = nextApp.getRequestHandler();

nextApp.prepare().then(() => {
  const server = express();

  server.all('*', async (req, res) => {
    return handle(req, res);
  });

  server.use((req, res, next) => {
    /* How to set res.json({content:<RESULT_FROM_NEXT_JS>})??? */
  });

  server.listen(port, () => {
    console.log(`[server]: Server is running at http://localhost:${port}`);
  });
});
What I understand so far is that Next.js writes the response as chunked data, but I do not know how to buffer that stream so I can build JSON from it. Any clue on how to build a middleware for that? Or any other idea on how to generate JSON in this format?
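No answer was posted for this one, but one common pattern is a middleware, mounted before the Next.js catch-all, that overrides res.write and res.end to buffer whatever the downstream handler renders and then emits it as JSON. This is only a sketch under that assumption; header edge cases, streaming, and error handling are left out:

```javascript
// Sketch: buffer everything a downstream handler writes, then send it as
// JSON of the form { content: "..." }. Must be mounted BEFORE the handler.
function jsonWrap(req, res, next) {
  const chunks = [];
  const origEnd = res.end.bind(res);

  res.write = (chunk) => {
    if (chunk) chunks.push(Buffer.from(chunk));
    return true; // swallow the write; everything is sent at the end
  };

  res.end = (chunk) => {
    if (chunk) chunks.push(Buffer.from(chunk));
    const content = Buffer.concat(chunks).toString('utf8');
    res.removeHeader('Content-Length'); // the body length has changed
    res.setHeader('Content-Type', 'application/json');
    origEnd(JSON.stringify({ content }));
  };

  next();
}

// In the setup above: server.use(jsonWrap); server.all('*', (req, res) => handle(req, res));
```

Note the order: in the question's code the catch-all is registered first, so the middleware below it never runs; the interceptor has to come before handle(req, res).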
I made the following code to send a .zip file upon a GET request to localhost:3000, but the file is downloaded without a filename and without an extension.
const express = require("express");
const app = express();

app.get("/", async (req, res) => {
  res.sendFile("/files/a.rar", {
    extensions: ["rar", "zip"],
  });
});

app.listen(3000, () => {
  console.log("server connected");
});
How can I do this?
Have you tried using res.download(...) instead of res.sendFile(...)?
I had the same issue and I was able to make it work by using this code block.
import path from 'path';
...

app.get("/", async (req, res) => {
  res.download(path.resolve('files/a.rar'), {
    extensions: ["rar", "zip"],
  });
});
Give it a try!
I have a small web service that basically receives a PUT and saves the payload to a file. It can happen that the file write fails due to permission issues. I would like that to be communicated back to the client.
To make it easier to read I have boiled the program down to a minimum. The req.pipe chain is way longer in the real program. (with many more possibilities for errors)
const fs = require('fs');
const express = require('express');
const app = express();

app.put('/write/:id', (req, res, next) => {
  const filename = 'data/' + req.params.id;
  console.log("write: " + filename);
  req
    .pipe(fs.createWriteStream(filename))
    .on('error', next);
  req.on('end', () => {
    res.send('saved\n');
    console.log("sent response");
  });
});

app.listen(8081, '0.0.0.0');
Trouble is that no matter what I do, it always responds "saved" to the client. I had kinda hoped the next call would get me out of there.
What is the elegant way to send differentiated responses for errors occurring server-side?
Several hours later, it seems I nailed it:
const fs = require('fs');
const express = require('express');
const app = express();

app.put('/write/:id', (req, res, next) => {
  const filename = 'data/' + req.params.id;
  console.log("write: " + filename);
  req
    .pipe(fs.createWriteStream(filename))
    .on('error', (e) => {
      console.log("error ", e);
      res.status(500).send("failed\n"); // a server-side failure, so 5xx
    })
    .on('close', () => {
      if (!res.headersSent) res.send("saved\n"); // 'close' can also fire after an error
    });
});

app.listen(8081, '0.0.0.0');
Notice how I'm listening for close within the pipeline and not end on the request itself.
I am trying to send data from a .json file as a response with Node.js. I am pretty new to it, and I don't know how to handle the Buffer.
This is what I did:
const express = require('express');
const fs = require('fs');
const path = require('path');
const bodyParser = require('body-parser');

const app = express();
const port = 3000;

app.use(bodyParser.urlencoded({ extended: false }));

app.use('/', (req, res, next) => {
  fs.readFile(path.join(__dirname, 'data', 'filename.json'), (err, content) => {
    res.send(JSON.stringify(content));
  });
});

app.listen(port, () => {
  console.log(`server is running on port: ${port}`);
});
I expect to get the data in JSON format, but what I get is a Buffer, just numbers. I guess I'm not getting some concepts right.
Save the buffer into a variable, call its toString() method, and then run JSON.parse on the result.
What you want to do is specify the encoding like so:
fs.readFile(path.join(__dirname, 'data', 'filename.json'), 'utf8', (err, content) => {
  res.setHeader('Content-Type', 'application/json');
  res.send(content); // content will be a string
});
Otherwise, as the fs documentation notes, you will get a Buffer.
I have a simple node app that parses a csv file into a string. In my server file, I call a module that creates a stream and pipes it into my parser.
The problem is that this code works perfectly the first time it is run, but fails after that. I've gotten a "Write after End" error, so I believe something about the stream or parser variable is not being reset properly after each use. Thanks for any help!
const express = require('express');
const app = express();
const path = require('path');
const port = process.env.PORT || 3000;
const formidable = require('formidable');
const parser = require('./csvparse.js');
const fs = require('fs');

app.listen(port, () => console.log('Example app listening on port: ' + port));

//send the index page on a get request
app.get('*', (req, res) => res.sendFile(path.join(__dirname, "index.html")));

app.post('/upload', function (req, res) {
  //upload the file from the html form
  var form = new formidable.IncomingForm();
  form.parse(req, function (err, fields, files) {
    if (err) throw err;
    //get the path to the uploaded file for the parser to use
    var filePath = files.spinFile.path;
    parser(filePath, function (data) {
      if (data == null) {
        return res.sendFile(path.join(__dirname, "index.html"));
      }
      res.send("<code>" + data + "</code>");
    });
  });
});
The module export function looks like this:
module.exports = function (filePath, cb) {
  var stream = fs.createReadStream(filePath);
  stream.pipe(parser);
  //when the stream is done, songsLog is complete and ready to be returned
  stream.on('close', function () {
    cb(songsLog);
  });
};
Try wrapping the contents of your module.exports in another function.
module.exports = function (filepath, cb) {
  function parse(filepath) {
    const stream = fs.createReadStream(filepath)
    etc...
  }
  return {
    parse
  }
}
Then from your route, call parser.parse('file.txt') and you should get a new read stream.