nodejs input stream using express - node.js

Is there a way that using express a route consumer can send an input stream to the endpoint and read it?
In short, I want the endpoint user to upload a file by streaming it instead of using multipart/form-data. Something like:
app.post('/videos/upload', (request, response) => {
  const stream = request.getInputStream();
  const file = stream.read();
  stream.on('done', (file) => {
    // do something with the file
  });
});
Is it possible to do it?

In Express, the request object is an enhanced version of http.IncomingMessage, which "...implements the Readable Stream interface".
In other words, request is already a stream:
app.post('/videos/upload', (request, response) => {
  request.on('data', data => {
    // ...do something...
  }).on('close', () => {
    // ...do something else...
  });
});
If your intention is to first read the entire file into memory (probably not), you can also use bodyParser.raw():
const bodyParser = require('body-parser');
...
app.post('/videos/upload', bodyParser.raw({ type : '*/*' }), (request, response) => {
  let data = request.body; // a `Buffer` containing the entire uploaded data
  // ...do something...
});
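For the streaming upload itself, a minimal sketch (the destination path is just a placeholder and no validation is shown) could pipe the raw request body straight to disk:
const express = require('express');
const fs = require('fs');
const app = express();
app.post('/videos/upload', (request, response) => {
  // request is a Readable stream, so it can be piped directly into a file
  const out = fs.createWriteStream('/tmp/upload.video'); // placeholder path
  request.pipe(out);
  out.on('finish', () => response.sendStatus(201));
  out.on('error', () => response.sendStatus(500));
});
app.listen(3000);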

Related

How to push to a Node stream after an error in Node 10+?

I picked up some old stream code recently (written when 8.x was LTS) and attempted to update it to 12.x. This led to an interesting break in the way I dealt with ENOENT file errors.
Here's a simplification:
const { createServer } = require('http')
const { createReadStream } = require('fs')
const PORT = 3000
const server = createServer((req, res) => {
  res.writeHead(200, {
    'Content-Type': 'application/json'
  })
  const stream = createReadStream(`not-here.json`, {encoding: 'utf8'})
  stream.on('error', err => {
    stream.push(JSON.stringify({data: [1,2,3,4,5]}))
    stream.push(null)
  })
  stream.pipe(res)
})
server.listen(PORT)
server.on('listening', () => {
  console.log(`Server running at http://localhost:${PORT}/`)
})
In Node 8, the above code works fine. I'm able to intercept the error, write something to the stream and let it close normally.
In Node 10+ (tested on 10, 12, and 13) the stream is already destroyed by the time my error callback is called. I can't push new things onto the stream and handle the error gracefully for the client side.
Was this an intentional change, and can I still handle this error in a nice way for the client side?
One possibility. Open the file yourself and only create the stream with that already successfully opened file. That will allow you to handle ENOENT (or any other errors upon opening the file) before you get into the messy stream error handling mechanics. The stream architecture seems most aligned with aborting upon error, not recovering with some alternate behavior.
const { createServer } = require('http');
const fs = require('fs');
const PORT = 3000;
const server = createServer((req, res) => {
  res.writeHead(200, {'Content-Type': 'application/json'});
  fs.open('not-here.json', 'r', (err, fd) => {
    if (err) {
      // send alternative response here
      res.end(JSON.stringify({data: [1,2,3,4,5]}));
    } else {
      const stream = fs.createReadStream(null, {fd, encoding: 'utf8'});
      stream.pipe(res);
    }
  });
});
server.listen(PORT);
server.on('listening', () => {
  console.log(`Server running at http://localhost:${PORT}/`)
});
You could also try experimenting with the autoDestroy or autoClose options on your stream to see whether either flag allows the stream to stay open so you can still push data into it, even if the file produced an error while opening or reading. The documentation on those flags is not very complete, so some combination of programming experiments and studying the code would be required to see whether they can be manipulated to still add data to the stream after your stream got an error.
The answer by jfriend00 pointed me in the right direction.
Here are two different ways I solved this. I wanted a function that returns a stream rather than handling the error in the request handler function. This is more like what I'm actually doing in real code.
Handling error from stream:
Just like above, except that I took care to manually destroy the stream. Does this correctly take care of the internal file descriptor? I think it does.
const server = createServer((req, res) => {
  res.writeHead(200, {
    'Content-Type': 'application/json'
  })
  getStream().pipe(res)
})
function getStream() {
  const stream = createReadStream(`not-here.json`, {
    autoClose: false,
    encoding: 'utf8'
  })
  stream.on('error', err => {
    // handling "no such file" errors
    if (err.code === 'ENOENT') {
      // push JSON data to stream
      stream.push(JSON.stringify({data: [1,2,3,4,5]}))
      // signal the end of stream
      stream.push(null)
    }
    // destroy/close the stream regardless of error
    stream.destroy()
    console.error(err)
  })
  return stream
}
Handling the error during file open:
Like jfriend00 suggests.
const { promisify } = require('util')
const { Readable } = require('stream')
const { open, createReadStream } = require('fs')
const openAsync = promisify(open)
const server = createServer(async (req, res) => {
  res.writeHead(200, {
    'Content-Type': 'application/json'
  })
  const stream = await getStream()
  stream.pipe(res)
})
async function getStream() {
  try {
    const fd = await openAsync(`not-here.json`)
    return createReadStream(null, {fd, encoding: 'utf8'})
  } catch (error) {
    console.log(error)
    // setup new stream
    const stream = new Readable()
    // push JSON data to stream
    stream.push(JSON.stringify({data: [1,2,3,4,5]}))
    // signal the end of stream
    stream.push(null)
    return stream
  }
}
I still like handling it in the stream better, but I would love to hear reasons why you might do it one way or the other.
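As an aside (not part of either answer above): on newer Node versions, fs.promises exposes a FileHandle with its own createReadStream() (added in v16.11.0), and Readable.from() can build the fallback stream, which makes the open-first approach a little tidier. A rough sketch under those assumptions:
const { open } = require('fs/promises')
const { Readable } = require('stream')
async function getStream() {
  try {
    // open first so ENOENT surfaces here rather than inside the stream
    const fileHandle = await open('not-here.json', 'r')
    return fileHandle.createReadStream({ encoding: 'utf8' })
  } catch (error) {
    console.error(error)
    // fall back to an in-memory stream with the default payload
    return Readable.from([JSON.stringify({data: [1,2,3,4,5]})])
  }
}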

How to save my cam stream to my server in real time with Node.js?

How can I save the chunks of my stream, which are converted into blobs, to my Node.js server in real time?
client.js | I am sending my cam stream as binary to my Node.js server:
handleBlobs = async (blob) => {
  let arrayBuffer = await new Response(blob).arrayBuffer()
  let binary = new Uint8Array(arrayBuffer)
  this.postBlob(binary)
};
postBlob = blob => {
  axios.post('/api', {blob})
    .then(res => {
      console.log(res)
    })
};
server.js
app.post('/api', (req, res) => {
  console.log(req.body)
});
How can I store the incoming blobs or binary data in one video file once the recording is complete?
This appears to be a duplicate of How to concat chunks of incoming binary into video (webm) file node js?, but it doesn't currently have an accepted answer. I'm copying my answer from that post into this one as well:
I was able to get this working by converting to base64 encoding on the front-end with the FileReader api. On the backend, create a new Buffer from the data chunk sent and write it to a file stream. Some key things with my code sample:
I'm using fetch because I didn't want to pull in axios.
When using fetch, you have to make sure you use bodyParser on the backend.
I'm not sure how much data you're collecting in your chunks (i.e. the duration value passed to the start method on the MediaRecorder object), but you'll want to make sure your backend can handle the size of the data chunk coming in. I set mine really high, to 50MB, but this may not be necessary.
I never close the write stream explicitly... you could potentially do this in your /final route. Otherwise, createWriteStream defaults to autoClose, so the node process will close it automatically.
Full working example below:
Front End:
const mediaSource = new MediaSource();
mediaSource.addEventListener('sourceopen', handleSourceOpen, false);
let mediaRecorder;
let sourceBuffer;
function customRecordStream(stream) {
  // should actually check to see if the given mimeType is supported on the browser here.
  let options = { mimeType: 'video/webm;codecs=vp9' };
  mediaRecorder = new MediaRecorder(stream, options);
  mediaRecorder.ondataavailable = postBlob
  mediaRecorder.start(INT_REC) // INT_REC is the timeslice (in ms) for MediaRecorder.start, defined elsewhere
};
function postBlob(event){
  if (event.data && event.data.size > 0) {
    sendBlobAsBase64(event.data);
  }
}
function handleSourceOpen(event) {
  sourceBuffer = mediaSource.addSourceBuffer('video/webm; codecs="vp8"');
}
function sendBlobAsBase64(blob) {
  const reader = new FileReader();
  reader.addEventListener('load', () => {
    const dataUrl = reader.result;
    const base64EncodedData = dataUrl.split(',')[1];
    console.log(base64EncodedData)
    sendDataToBackend(base64EncodedData);
  });
  reader.readAsDataURL(blob);
};
function sendDataToBackend(base64EncodedData) {
  const body = JSON.stringify({
    data: base64EncodedData
  });
  fetch('/api', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
    },
    body
  }).then(res => {
    return res.json()
  }).then(json => console.log(json));
};
Back End:
const fs = require('fs');
const path = require('path');
const express = require('express');
const bodyParser = require('body-parser');
const app = express();
const server = require('http').createServer(app);
app.use(bodyParser.urlencoded({ extended: true }));
app.use(bodyParser.json({ limit: "50MB", type: 'application/json' }));
app.post('/api', (req, res) => {
  try {
    const { data } = req.body;
    const dataBuffer = Buffer.from(data, 'base64'); // decode the base64 payload into a Buffer
    const fileStream = fs.createWriteStream('finalvideo.webm', { flags: 'a' });
    fileStream.write(dataBuffer);
    console.log(dataBuffer);
    return res.json({ gotit: true });
  } catch (error) {
    console.log(error);
    return res.json({ gotit: false });
  }
});
Without attempting to implement this (sorry, no time right now), I would suggest the following:
Read up on Node's Stream API: the Express request object is an http.IncomingMessage, which is a Readable stream and can be piped into another stream-based API. https://nodejs.org/api/stream.html#stream_api_for_stream_consumers
Read up on Node's File System API: it contains functions such as fs.createWriteStream that can consume the stream of chunks and append them to a file at a path of your choice. https://nodejs.org/api/fs.html#fs_class_fs_writestream
Once the stream has been written to the file, as long as the filename has the correct extension, the file should be playable, because the Buffer sent from the browser is just binary data. Further reading on Node's Buffer API will be worth your time; see the sketch after the link below.
https://nodejs.org/api/buffer.html#buffer_buffer
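As a rough illustration of that suggestion (a sketch, not the original poster's code; it assumes the client sends each chunk as a raw binary body rather than JSON, and the output filename is a placeholder):
const express = require('express');
const fs = require('fs');
const app = express();
app.post('/api', (req, res) => {
  // req is a Readable stream; append each incoming chunk to the growing video file
  const out = fs.createWriteStream('finalvideo.webm', { flags: 'a' });
  req.pipe(out);
  out.on('finish', () => res.json({ gotit: true }));
  out.on('error', () => res.status(500).json({ gotit: false }));
});
app.listen(3000);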

How to access "request payload" in Koa web-framework?

We are using the navigator.sendBeacon function to send data to a Koa server, on which we are using bodyparser.
If we don't wrap the data in a form, this function sends it as a plain request payload by default. How can I access this data on the Koa server?
Example -
navigator.sendBeacon('http://localhost:3000/cookies/', 'test=payload')
On the server, the request body is blank.
Considering that
Koa does not parse request body, so you need to use either koa-bodyparser or koa-body,
koa-bodyparser by default has only json and form parsing enabled,
From your screenshot, it is clear that navigator.sendBeacon sets the Content-Type to text,
you need to change the Koa server code so that it parses text data.
Example:
const Koa = require('koa'),
  bodyParser = require('koa-bodyparser'),
  app = (module.exports = new Koa());
app.use(bodyParser({ enableTypes: ['json', 'text'] }));
app.use(async (ctx) => {
  // ctx.request.body should contain the parsed data.
  console.log('Received data:', ctx.request.body);
  ctx.body = ctx.request.body;
});
if (!module.parent) app.listen(3000);
Tested with
koa 2.7.0,
koa-bodyparser 4.2.1.
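Alternatively (a client-side sketch, not part of the original answer): if the beacon is sent to the same origin, you can wrap the payload in a Blob with an explicit type so that sendBeacon sends Content-Type: application/json and koa-bodyparser's default json parsing handles it:
// Note: non-form content types can run into CORS restrictions for cross-origin beacons.
const payload = JSON.stringify({ test: 'payload' });
navigator.sendBeacon('http://localhost:3000/cookies/', new Blob([payload], { type: 'application/json' }));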
Although Koa doesn't parse the request body, if for some reason you don't want to use koa-bodyparser, you can still use the raw http request object to collect the body yourself.
app.use(async (ctx) => {
  try {
    // notice that I'm wrapping the event emitter in a `promise`
    ctx.body = await new Promise((resolve, reject) => {
      let data = '';
      // this is the same as the raw http `request.on('data', ...)` handling
      ctx.req.on('data', chunk => {
        data += chunk;
      });
      ctx.req.on('error', err => {
        reject(err);
      });
      ctx.req.on('end', () => {
        resolve(data);
      });
    });
  } catch (e) {
    console.error(e);
  }
});

How can I get multiple upload streams with a multer storage engine?

I am making a multer storage engine that makes a stream connection between the client and an S3 server.
In the middle of the stream, my code examines the chunks and sends them to S3.
I can get a single file stream on the Node.js server, but when I request a file-array upload, the Node inspector shows only one stream. What should I do?
Stream engine snippet
CustomStreamEngine.prototype._handleFile = function _handleFile (req, file, cb) {
  // for inspect
  req.files.length // 1
  file;
};
request controller
var streamStorage = multer({
  storage: streamEngine()
});
dev.post('/rec_test', streamStorage.array('source'), (req, res, next) => {
});
I just published this streaming multipart/form-data parser on npm as form-parser:
You should be able to do the following:
dev.post('/rec_test', async (req, res, next) => {
  // Parse request
  await parser(req, async ({ fieldType, fieldName, fieldContent }) => {
    // Log all fields
    console.log({ fieldType, fieldName, fieldContent });
    // Handle 'source' file fields
    if (fieldType === 'file' && fieldName === 'source[]') {
      // Get file info
      const { fileName, fileType, fileStream } = fieldContent;
      // Upload fileStream to S3 :-)
    }
  });
});
Hope it's helpful.
I think you can add some logs to https://github.com/expressjs/multer/blob/master/lib/make-middleware.js to check.
Currently, I use axios on the client to send multiple files to the server with multer, and I can see all of the files arrive in
busboy.on('file', function (fieldname, fileStream, filename, encoding, mimetype)), but only one file at a time; each call then invokes the _handleFile function of the custom storage engine, which I think is the reason for your issue.
Hope it can help you.
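In other words, multer's storage-engine API is per file: _handleFile is called once for each file in the array, so a custom engine only ever sees one stream per call. A rough sketch of handling each stream as it arrives (uploadStreamToS3 is a hypothetical placeholder for your S3 streaming logic, not part of the original code):
function CustomStreamEngine() {}
CustomStreamEngine.prototype._handleFile = function _handleFile(req, file, cb) {
  // called once per uploaded file; file.stream is that file's readable stream
  uploadStreamToS3(file.stream, file.originalname) // placeholder
    .then(result => cb(null, { location: result.location }))
    .catch(cb);
};
CustomStreamEngine.prototype._removeFile = function _removeFile(req, file, cb) {
  // nothing was written locally, so nothing to clean up
  cb(null);
};
module.exports = () => new CustomStreamEngine();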

Streaming an uploaded file to an HTTP request

My goal is to accept an uploaded file and stream it to Wistia using the Wistia Upload API. I need to be able to add fields to the HTTP request, and I don't want the file to touch the disk. I'm using Node, Express, Request, and Busboy.
The code below has two console.log statements. The first returns [Error: not implemented] and the second returns [Error: form-data: not implemented]. I'm new to streaming in Node, so I'm probably doing something fundamentally wrong. Any help would be much appreciated.
app.use("/upload", function(req, res, next) {
var writeStream = new stream.Writable();
writeStream.on("error", function(error) {
console.log(error);
});
var busboy = new Busboy({headers: req.headers});
busboy.on("file", function(fieldname, file, filename, encoding, mimetype) {
file.on("data", function(data) {
writeStream.write(data);
});
file.on("end", function() {
request.post({
url: "https://upload.wistia.com",
formData: {
api_password: "abc123",
file: new stream.Readable(writeStream)
}
}, function(error, response, body) {
console.log(error);
});
});
});
req.pipe(busboy);
});
I am not too familiar with the busboy module, but the errors you are getting come from attempting to use unimplemented streams. Whenever you create a new readable or writable stream directly from the stream module, you have to implement the _read and _write methods respectively (see the Stream Implementers section of the Node.js API). To give you something to work with, the following example uses multer for handling multipart requests; I think you'll find multer easier to use than busboy.
var app = require('express')();
var fs = require('fs');
var request = require('request');
var multer = require('multer');
app.use(multer());
app.post("/upload", function(req, res, next) {
  // create a read stream from the file multer saved to disk
  var readable = fs.createReadStream(req.files.myfile.path);
  request.post({
    url: "https://upload.wistia.com",
    formData: {
      api_password: "abc123",
      file: readable
    }
  }, function(err, res, body) {
    // send something to the client
  });
});
I hope this helps. Unfortunately I am not familiar with busboy, but this should work with multer, and as I said before, the problem is just that you are using unimplemented streams. I'm sure there is a way to configure this operation with busboy if you wanted to.
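For the busboy route specifically, a sketch of one way to avoid the unimplemented-stream errors (untested against the Wistia API, and relying on the request module's formData support for streams) is to skip the intermediate Writable entirely and hand busboy's file stream to formData directly:
var express = require('express');
var Busboy = require('busboy');
var request = require('request');
var app = express();
app.use('/upload', function(req, res, next) {
  var busboy = new Busboy({ headers: req.headers });
  busboy.on('file', function(fieldname, file, filename, encoding, mimetype) {
    // pass the incoming file stream straight through to the outgoing request,
    // so nothing touches the disk and no custom _read/_write is needed
    request.post({
      url: 'https://upload.wistia.com',
      formData: {
        api_password: 'abc123',
        file: {
          value: file,
          options: { filename: filename, contentType: mimetype }
        }
      }
    }, function(error, response, body) {
      if (error) console.log(error);
      res.end(body);
    });
  });
  req.pipe(busboy);
});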
If you want to use multipart (another npm package), here is a tutorial:
http://qnimate.com/stream-file-uploads-to-storage-server-in-node-js/
