How can I save chunks of a stream, converted into Blobs, to my Node.js server in real time?
client.js | I am sending my cam stream as binary to my Node.js server:
handleBlobs = async (blob) => {
let arrayBuffer = await new Response(blob).arrayBuffer()
let binary = new Uint8Array(arrayBuffer)
this.postBlob(binary)
};
postBlob = blob => {
axios.post('/api',{blob})
.then(res => {
console.log(res)
})
};
server.js
app.post('/api', (req, res) => {
console.log(req.body)
});
How can I combine the incoming blobs/binary into a single video file once the recording is complete?
This appears to be a duplicate of How to concat chunks of incoming binary into video (webm) file node js?, but it doesn't currently have an accepted answer. I'm copying my answer from that post into this one as well:
I was able to get this working by converting to base64 encoding on the front-end with the FileReader api. On the backend, create a new Buffer from the data chunk sent and write it to a file stream. Some key things with my code sample:
I'm using fetch because I didn't want to pull in axios.
When using fetch, you have to make sure you use bodyParser on the backend
I'm not sure how much data you're collecting in your chunks (i.e. the duration value passed to the start method on the MediaRecorder object), but you'll want to make sure your backend can handle the size of the data chunk coming in. I set mine really high to 50MB, but this may not be necessary.
I never close the write stream explicitly... you could potentially do this in your /final route (a rough sketch follows). Otherwise, createWriteStream defaults to autoClose: true, so the node process will do it automatically.
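For illustration, a minimal sketch of such a /final route, assuming a single shared append-mode write stream rather than one per request (the route name and file name are placeholders):
const fs = require('fs');
// one shared, append-mode stream for the whole recording session
const fileStream = fs.createWriteStream('finalvideo.webm', { flags: 'a' });
app.post('/final', (req, res) => {
  // end() flushes any buffered data and closes the file descriptor
  fileStream.end(() => res.json({ closed: true }));
});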
Full working example below:
Front End:
const mediaSource = new MediaSource();
mediaSource.addEventListener('sourceopen', handleSourceOpen, false);
let mediaRecorder;
let sourceBuffer;
function customRecordStream(stream) {
  // should actually check MediaRecorder.isTypeSupported(options.mimeType) here
  let options = { mimeType: 'video/webm;codecs=vp9' };
  mediaRecorder = new MediaRecorder(stream, options);
  mediaRecorder.ondataavailable = postBlob;
  mediaRecorder.start(INT_REC); // INT_REC: chunk duration in ms
};
function postBlob(event){
if (event.data && event.data.size > 0) {
sendBlobAsBase64(event.data);
}
}
function handleSourceOpen(event) {
sourceBuffer = mediaSource.addSourceBuffer('video/webm; codecs="vp8"');
}
function sendBlobAsBase64(blob) {
const reader = new FileReader();
reader.addEventListener('load', () => {
const dataUrl = reader.result;
const base64EncodedData = dataUrl.split(',')[1];
console.log(base64EncodedData)
sendDataToBackend(base64EncodedData);
});
reader.readAsDataURL(blob);
};
function sendDataToBackend(base64EncodedData) {
const body = JSON.stringify({
data: base64EncodedData
});
fetch('/api', {
method: 'POST',
headers: {
'Content-Type': 'application/json',
},
body
}).then(res => {
return res.json()
}).then(json => console.log(json));
};
Back End:
const fs = require('fs');
const path = require('path');
const express = require('express');
const bodyParser = require('body-parser');
const app = express();
const server = require('http').createServer(app);
app.use(bodyParser.urlencoded({ extended: true }));
app.use(bodyParser.json({ limit: "50MB", type:'application/json'}));
app.post('/api', (req, res) => {
try {
const { data } = req.body;
const dataBuffer = Buffer.from(data, 'base64'); // new Buffer() is deprecated
const fileStream = fs.createWriteStream('finalvideo.webm', {flags: 'a'});
fileStream.write(dataBuffer);
console.log(dataBuffer);
return res.json({gotit: true});
} catch (error) {
console.log(error);
return res.json({gotit: false});
}
});
Without attempting to implement this (sorry, no time right now), I would suggest the following (a rough sketch follows the list):
Read into Node's Stream API; the express request object is an http.IncomingMessage, which is a Readable Stream, so it can be piped into another stream-based API. https://nodejs.org/api/stream.html#stream_api_for_stream_consumers
Read into Node's Filesystem API; it contains functions such as fs.createWriteStream that can handle the stream of chunks and append to a file at a path of your choice. https://nodejs.org/api/fs.html#fs_class_fs_writestream
After the stream-to-file completes, as long as the filename has the correct extension, the file should be playable, because the Buffer sent from the browser is just a binary stream. Further reading into Node's Buffer API will be worth your time.
https://nodejs.org/api/buffer.html#buffer_buffer
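Putting those suggestions together, a minimal sketch (the route path and file name are placeholders, and no body-parsing middleware may be mounted on this route, since it would consume the stream):
const fs = require('fs');
app.post('/api', (req, res) => {
  // req is an http.IncomingMessage, i.e. a Readable stream,
  // so it can be piped straight into an appending file stream
  const out = fs.createWriteStream('finalvideo.webm', { flags: 'a' });
  req.pipe(out);
  out.on('finish', () => res.json({ gotit: true }));
});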
I'm trying to send a blob image, but I'm getting Error: Unexpected end of form using multer with Serverless Framework.
From the console.log output, my understanding is I have to append it to FormData before sending it in the body, but I haven't been able to get the backend to accept the file without crashing.
uploadImage(imageData: File) {
console.log('IMAGE DATA', imageData);
let formData = new FormData();
formData.append('file', imageData, 'file.png');
let headers = new HttpHeaders();
headers.append('Content-Type', 'multipart/form-data');
headers.append('Accept', 'application/json');
let options = { headers: headers };
const api = environment.slsLocal + '/add-image';
const req = new HttpRequest('PUT', api, formData, options);
return this.http.request(req);
}
backend
const multerMemoryStorage = multer.memoryStorage();
const multerUploadInMemory = multer({
storage: multerMemoryStorage
});
router.put(
'/add-image',
multerUploadInMemory.single('file'),
async (req, res: Response) => {
try {
if (!req.file || !req.file.buffer) {
throw new Error('File or buffer not found');
}
console.log(`Upload Successful!`);
res.send({
message: 'file uploaded'
});
} catch (e) {
console.error(`ERROR: ${e.message}`);
res.status(500).send({
message: e.message
});
}
console.log(`Upload Successful!`);
return res.status(200).json({ test: 'success' });
}
);
app.ts
import cors from 'cors';
import express from 'express';
import routers from './routes';
const app = express();
import bodyParser from 'body-parser';
app.use(cors({ maxAge: 43200 }));
app.use(
express.json({
verify: (req: any, res: express.Response, buf: Buffer) => {
req.rawBody = buf;
}
})
);
app.use('/appRoutes', routers.appRouter);
app.use(
bodyParser.urlencoded({
extended: true // also tried extended:false
})
);
export default app;
From my understanding, with Serverless Framework I have to install
npm i serverless-apigw-binary
and add
apigwBinary:
types: #list of mime-types
- 'image/png'
to the custom section of the serverless template yaml file.
The end goal is not to save to storage like S3, but to send the image to discord.
What am I missing? I appreciate any help!
I recently encountered something similar in a React Native app. I was trying to send a local file to an API but it wasn't working; it turns out you need to convert the blob into a base64 string before sending it. In my app, I took a local file path, converted it into a blob, ran it through a blobToBase64 function, and then called the API with that string. That ended up working for me.
I have this code snippet to help you, but it's TSX, so I don't know if it'll work for Angular as-is (a hypothetical usage example follows the snippet).
function blobToBase64(blob: Blob) {
return new Promise((resolve, reject) => {
const reader = new FileReader();
reader.onerror = reject;
reader.onload = () => {
resolve(reader.result as string);
};
reader.readAsDataURL(blob);
});
}
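A hypothetical usage of blobToBase64, not from the original post; the endpoint and payload shape are placeholders, and the backend would need to accept JSON instead of multipart for this to work:
async function uploadBlob(blob) {
  const dataUrl = await blobToBase64(blob);
  // strip the "data:<mime>;base64," prefix to get the raw base64 string
  const base64 = String(dataUrl).split(',')[1];
  await fetch('/add-image', {
    method: 'PUT',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ file: base64 }),
  });
}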
Hope this helps!
You can convert your Blob to a File using
new File([blob], "filename")
and then you should be able to pass that file to your existing uploadImage method.
Looks like you are passing a Blob instead of a File, based on your console.log(), so you should convert the Blob to a File before calling the server. You can change your frontend code like this:
uploadImage(imageData: File) {
// Convert Blob to File
const file = new File([imageData], "file_name", { type: imageData.type });
let formData = new FormData();
formData.append('file', file, 'file.png');
const api = environment.slsLocal + '/add-image';
return this.http.put(api, formData);
}
Note: For more info about converting Blob to File, you can check this StackOverflow question.
The thing that got it working for me was this article.
There might be something different about using Express through Serverless Framework, so things like multer and express-fileupload might not work. Or it could be because it's an AWS Lambda function. I don't know this for sure, though; I just know I never got it working. This article was the only thing that worked for Serverless Framework + Express.
I also had to install version 0.0.3 of busboy, i.e. npm i busboy@0.0.3. The newer version didn't work; it kept saying Busboy is not a constructor.
Since I'm sending the file to Discord and not S3 like the article does, I had to tweak the parser.event part of the article's handler.ts:
export const uploadImageRoute = async (
event: any,
context: Context
): Promise<ProxyResult> => {
const parsedEvent: any = await parser(event);
await sendImageToDiscord(parsedEvent.body.file);
const response = {
statusCode: 200,
body: JSON.stringify('file sent successfully')
};
return response;
};
The file comes in as a Buffer, which I was able to send on as a file like this:
const fs = require('fs-extra');
const cwd = process.cwd();
const { Webhook } = require('discord-webhook-node');
const webhook = new Webhook('<discord-webhook-url>');
export async function sendImageToDiscord(arrayBuffer) {
  const buffer = Buffer.from(arrayBuffer, 'base64');
  const newFileName = 'nodejs.png';
  // write the decoded image to disk, then hand the path to the webhook
  await fs.writeFile(`./${newFileName}`, buffer);
  await webhook.sendFile(`${cwd}/${newFileName}`);
}
I hope this helps someone!
So essentially, what my API call does is: 1) take in video data using parse-multipart, 2) convert that video data to a real mp4 file using ffmpeg, and then 3) send the video data back to the client in the response body.
Steps 1 and 2 work perfectly; it's the third step that I am stuck on.
The API call creates the Out.mp4 file, but when I try to read its contents using createReadStream, the chunks array doesn't populate, and a null context.res body is returned.
Please let me know what I am doing wrong and how I can pass back the video data properly, so that it can be turned back into a playable mp4 file on the client's side.
Also, let me know if you have any questions or things I can clarify.
Here is the api call index.js file
const fs = require("fs");
const multipart = require("parse-multipart"); // used below; the question names parse-multipart
module.exports = async function(context, req){
try{
//Get the input file setup
context.log("Javascript HTTP trigger function processed a request.");
var bodyBuffer=Buffer.from(req.body);
var boundary=multipart.getBoundary(req.headers['content-type']);
var parts=multipart.Parse(bodyBuffer, boundary);
var temp = "C:/home/site/wwwroot/In.mp4";
fs.writeFileSync(temp, Buffer(parts[0].data));
//Actually execute the ffmpeg script
var execLineBuilder= "C:/home/site/wwwroot/ffmpeg-5.1.2-essentials_build/bin/ffmpeg.exe -i C:/home/site/wwwroot/In.mp4 C:/home/site/wwwroot/Out.mp4"
var execSync = require('child_process').execSync;
//Executing the script
execSync(execLineBuilder)
//EVERYTHING WORKS UP UNTIL HERE (chunks array seems to be empty, even though outputting chunk to a file populates
//That file with data)
//Storing the chunks of the output mp4 into chunks array
execSync.on('exit', ()=>{
chunks = [];
const myPromise = new Promise((resolve, reject) => {
var readStream = fs.createReadStream("C:/home/site/wwwroot/Out.mp4");
readStream.on('data', (chunk)=> {
chunks.push(chunk);
resolve("foo");
});
})
})
myPromise.then(()=>{
context.res={
status:200,
body:chunks
}
})
}catch (e){
context.res={
status:500,
body:e
}
}
}
You can use an npm package called azure-function-express; this package basically wraps your Azure Function in an Express app.
This way you can directly read the mp4 file you saved and send it back.
const createHandler = require("azure-function-express").createHandler;
const express = require("express");
const fs = require('fs');
const app = express();
app.get("/api/HttpTrigger1", (req, res) => {
  res.writeHead(200, {'Content-Type': 'video/mp4'});
  // stream the converted file back instead of buffering it all in memory
  fs.createReadStream('./Out.mp4').pipe(res);
});
// hand the Express app to the Azure Functions runtime
module.exports = createHandler(app);
This way you will be able to send the video back, and running ffmpeg should also stay simple.
So I'm trying to make the html form:
<form action="blahblah" encblah="multipart/form-data" whatever>
That's not the problem; I need to make that form send the blob to Express:
app.post('/upload/avatars', async (req, res) => {
const body = req.body;
console.log(req.file);
console.log(body);
res.send(body);
});
So I can access the blob, create a read stream, pipe it to the cloud, and bam, upload the file without saving anything to disk on the Express server itself.
Is that possible?
If yes, please tell me how.
If no, please tell me other alternatives.
On the client we do a basic multi-part form upload. This example is set up for a single image, but you could call uploadFile in sequence for each image.
//client.ts
const uploadFile = (file: File | Blob) => {
const formData = new FormData();
formData.append("image", file);
return fetch("/upload", {
method: "post",
body: formData,
});
};
const handleUpload = (event: any) => {
return event.target.files.length ? uploadFile(event.target.files[0]) : null;
};
On the server we can use multer to read the file without persisting it to disk.
//server.js
const express = require("express");
const app = express();
const multer = require("multer");
const upload = multer();
app.post(
"/upload",
upload.fields([{ name: "image", maxCount: 1 }]),
(req, res, next) => {
console.log("/upload", req.files);
if (req.files.image && req.files.image.length) {
  const image = req.files.image[0]; // { buffer, originalname, size, ... }
  // Pipe the image.buffer where you want (see the sketch below).
  res.send({ success: true, name: image.originalname });
} else {
res.send({ success: false, message: "No files sent." });
}
}
);
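As an illustration of "pipe the image.buffer where you want", one option is to wrap the in-memory buffer in a Readable stream so it can be piped to any writable destination; a sketch, with a placeholder file name:
const { Readable } = require('stream');
const fs = require('fs');
function pipeBufferTo(buffer, writable) {
  // Readable.from() emits the whole buffer as a single chunk
  Readable.from(buffer).pipe(writable);
}
// e.g. persist the upload locally:
// pipeBufferTo(image.buffer, fs.createWriteStream('./uploaded.png'));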
For larger uploads I recommend socket.io, but this method works for reasonably sized images.
It is possible, but with a lot of traffic it could overwhelm your Express server (in case you are uploading videos or other big files); if it's just small images (profile pictures, etc.), you're fine. Either way, you can use the multer npm package.
I'd recommend client-side uploading to e.g. an S3 bucket, which returns a link, and then using that link; a rough sketch of that flow follows.
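A hedged sketch of that client-side flow, assuming a hypothetical /presign endpoint on your backend that returns a presigned upload URL:
async function uploadDirect(blob) {
  // ask the backend for a presigned URL (endpoint name is hypothetical)
  const { url } = await fetch('/presign').then((r) => r.json());
  // upload the blob straight to the bucket, bypassing the Express server
  await fetch(url, { method: 'PUT', body: blob });
  return url.split('?')[0]; // the stored object's plain link
}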
As the title says, I have an app that can collect all image URLs from a user-input URL. Now I want to be able to zip them: when the user presses the download button, it fires a request containing the array of all the image URLs to download.js and lets download.js process the download.
In addition, I am using Express.js and React; Express.js is using port 5000.
Someone sent me a working sample code: https://repl.it/@chiKaRau/picture-packer-4-rex-1
However, this code creates its own port 3000.
I want to be able to process the download on my current port 5000, where Express is launched.
So I changed some code; however, once I press the download button, nothing happens (no error and no download).
Would anyone tell me how to solve this? Thanks.
download.js
const express = require('express');
let router = express.Router();
const fetch = require('node-fetch') // to fetch the images
const JSZip = require('jszip') // to zip them up
const micro = require('micro') // to serve them
router.post('/download-pics', (req, res) => {
const files = [
{
url: "https://jeremyliberman.com/static/489f2e7cf7df14bc2c8ac2bc8c76aa59/cb864/avatar.png",
file: 'avatar.png'
},
{
url: "https://jeremyliberman.com/static/489f2e7cf7df14bc2c8ac2bc8c76aa59/cb864/avatar.png",
file: 'avatar1.png'
},
{
url: "https://jeremyliberman.com/static/489f2e7cf7df14bc2c8ac2bc8c76aa59/cb864/avatar.png",
file: 'avatar2.png' }
]
// Start a simple web service with one route
// Create an in-memory zip file
var zip = new JSZip();
// Fetch each image source
const request = async () => {
for (const { file, url } of files) {
const response = await fetch(url);
const buffer = await response.buffer();
zip.file(file, buffer);
}
}
request();
// Set the name of the zip file in the download
res.setHeader('Content-Disposition', 'attachment; filename="pictures.zip"')
// Send the zip file
zip.generateNodeStream({ type: 'nodebuffer', streamFiles: true })
.pipe(res).on('finish', function() {
console.log("out.zip written.");
}) })
// export this router to use in our index.js
module.exports = router;
The request function returns a promise, so you need to wrap the rest of the code after request() in a .then(() => { ... }) callback:
request()
.then(() => {
// Set the name of the zip file in the download
res.setHeader('Content-Disposition', 'attachment; filename="pictures.zip"')
// Send the zip file
zip.generateNodeStream({ type: 'nodebuffer', streamFiles: true })
.pipe(res).on('finish', function() {
console.log("out.zip written.");
})
})
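Equivalently, if you mark the route handler async you can await the fetch loop instead of nesting a .then; a sketch, reusing the files, zip, and request definitions from the question:
router.post('/download-pics', async (req, res) => {
  // ...build `files`, `zip`, and `request` as in the question...
  await request(); // wait until every image has been fetched and added
  res.setHeader('Content-Disposition', 'attachment; filename="pictures.zip"');
  zip.generateNodeStream({ type: 'nodebuffer', streamFiles: true })
    .pipe(res)
    .on('finish', () => console.log('out.zip written.'));
});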
Is there a way, using Express, for a route consumer to send an input stream to the endpoint and have the endpoint read it?
In short, I want the endpoint's user to upload a file by streaming it instead of via multipart/form-data. Something like:
app.post('/videos/upload', (request, response) => {
const stream = request.getInputStream();
const file = stream.read();
stream.on('done', (file) => {
//do something with the file
});
});
Is it possible to do it?
In Express, the request object is an enhanced version of http.IncomingMessage, which "...implements the Readable Stream interface".
In other words, request is already a stream:
app.post('/videos/upload', (request, response) => {
request.on('data', data => {
...do something...
}).on('close', () => {
...do something else...
});
});
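For example, a minimal sketch that streams the upload straight to disk without buffering it all in memory (the file name is a placeholder):
const fs = require('fs');
app.post('/videos/upload', (request, response) => {
  const out = fs.createWriteStream('upload.bin');
  request.pipe(out); // the request body streams directly into the file
  out.on('finish', () => response.sendStatus(201));
});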
If your intention is to first read the entire file into memory (probably not), you can also use bodyParser.raw():
const bodyParser = require('body-parser');
...
app.post('/videos/upload', bodyParser.raw({ type : '*/*' }), (request, response) => {
const data = request.body; // a `Buffer` containing the entire uploaded data
...do something...
});
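A hypothetical client call for either variant; the whole file is sent as the raw request body rather than as multipart/form-data:
fetch('/videos/upload', {
  method: 'POST',
  headers: { 'Content-Type': 'application/octet-stream' },
  body: file, // a File or Blob, e.g. from an <input type="file">
});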