How to handle a blob in Express (Node.js)

I've got a Google extension, a React frontend app, and an Express server.
I use MediaRecorder to record my screen and insert the recording into the frontend page. There is no problem there; the video works just fine in the frontend:
const blob = new Blob(chunks, { type: "video/mp4;" });
const savedVideo = document.getElementById("savedVideo");
chunks = [];
const videoURL = window.URL.createObjectURL(blob);
savedVideo.src = videoURL;
var tracks = stream.getTracks();
tracks[0].stop();
let response = await fetch('http://localhost:3001/upload', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/octet-stream',
  },
  body: blob
});
The problem starts when I send the blob to the server. I want to save the video (only on the server side). I suppose the problem is in how I handle the blob on the server side; maybe I'm doing something wrong. Here is my server code:
const express = require("express");
const cors = require('cors');
const fs = require('fs');

const app = express();
const port = 3001;

app.use(cors({
  origin: 'http://localhost:3000'
}));

app.post("/upload", (req, res) => {
  console.log('req.body', req.body)
  req.on('readable', function(){
    const data = req.read();
    if(data) {
      fs.createWriteStream('videeoo.mp4').write(data);
      // also I'm not sure about this method of writing the file
    }
    console.log('data', data);
  });
});

app.listen(port, () => {
  console.log(`Server started at http://localhost:${port}`);
});
(screenshot of the Express logs)
I'm waiting for your best practices. Gracias!

In order to handle blobs in a Node.js app.post() handler, you should add the express.raw() middleware to it. Then you can create a blob from the buffer:
app.post('/raw/:cmd', express.raw({ type: "*/*" }), async (req, res) => {
  const buffer = req.body
  const blob = new Blob([buffer], { type: "application/octet-stream" })
})
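If the goal is just to save the uploaded video on the server, a minimal sketch of that idea (reusing the /upload route and the hard-coded videeoo.mp4 name from the question; the 200mb limit is an arbitrary choice) could write the raw body straight to disk:

const express = require("express");
const fs = require("fs");

const app = express();

// express.raw() buffers the whole request body into req.body as a Buffer.
// The limit option matters because the default (100kb) is far too small for video.
app.post("/upload", express.raw({ type: "application/octet-stream", limit: "200mb" }), (req, res) => {
  fs.writeFile("videeoo.mp4", req.body, (err) => {
    if (err) return res.status(500).send("could not save file");
    res.send("saved");
  });
});

app.listen(3001);

Keep in mind this holds the entire upload in memory before writing it, which is exactly the trade-off the next answer warns about.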

Well... this is problematic: req.read() doesn't normally process binary data. There's also a conceptual issue here: a video can potentially be huge, but in your application you're waiting for the whole file to be uploaded before you start writing it. So if you have 10 users, each uploading 10 GB files, this is a problem. You really want to store the file as it arrives, so that you only keep a few bytes in memory at a time... but then what if you want to limit the size of the file? 10 GB files are probably not something you want to deal with.
So... there are really a lot of corner cases and things to consider. In general, you don't want to handle these things manually. Luckily, there are libraries like multer that can handle all of these issues for you: https://expressjs.com/en/resources/middleware/multer.html You just define the destination directory, the max file size, etc., and the library takes care of everything for you.
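For instance, a minimal sketch using multer (the video field name and the uploads/ directory are just illustrative choices; the client would also need to send the blob as multipart/form-data instead of application/octet-stream):

const express = require('express');
const multer = require('multer');

const app = express();

// Streams uploads to disk as they arrive and enforces a size limit
const upload = multer({
  dest: 'uploads/',
  limits: { fileSize: 100 * 1024 * 1024 }, // 100 MB, adjust as needed
});

// On the client, the blob would be wrapped in FormData:
//   const form = new FormData();
//   form.append('video', blob, 'recording.mp4');
//   fetch('http://localhost:3001/upload', { method: 'POST', body: form });
app.post('/upload', upload.single('video'), (req, res) => {
  // multer has already written the file; req.file.path is its location on disk
  res.json({ path: req.file.path, size: req.file.size });
});

app.listen(3001);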

Related

problem while POSTing to the server in Express

I'm learning Express and I'm facing an issue which I can't understand.
When I route to /addPerson I expect { name: 'Mike', age: 30 } to be logged to the console. Instead, nothing is logged at all. What's wrong in my code?
Here's the server.js code:
const Express = require('express'),
      app = Express(),
      PORT = process.env.PORT || 5000,
      parser = require('body-parser'),
      data = []

// initialize the main project folder
app.use(Express.static('public'))

// running the server
app.listen(PORT, () => {
  console.log(`Server is running at port ${PORT}`);
})

// include body parser to handle POST requests
app.use(parser.urlencoded({ extended: false }))
app.use(parser.json())

// setup CORS
const cors = require('cors')
app.use(cors())

// GET request
app.get('/', (req, res) => {
  res.send('<h1>Home Page</h1>')
})

app.get('/addPerson', (req, res) => {
  res.send('<h1>Hello Hany</h1>')
})

// POST request
app.post('/addPerson', (req, res) => {
  data.push(req.body)
  console.log(data);
})
And here is the client-side app.js code:
const postData = async (url = '', data = {}) => {
  console.log(data);
  const response = await fetch(url, {
    method: 'POST',
    credentials: 'same-origin',
    headers: {
      'Content-Type': 'application/json',
    },
    // Body data type must match "Content-Type" header
    body: JSON.stringify(data),
  });
  try {
    const newData = await response.json();
    console.log(newData);
    return newData;
  } catch (error) {
    console.log("error", error);
  }
}

postData('/addPerson', { name: 'Mike', age: 30 });
This is the file structure (screenshot):
Alright, I've taken a look at your code and this is what I've noticed. Within your server.js file you have this code block:
app.get('/addPerson', (req, res) => {
  res.send('<h1>Hello Hany</h1>')
})
That is sending back a static H1 tag when the user makes a GET request to localhost:5000/addPerson. Then, directly below that, you have your POST route, but you're never actually reaching it from anywhere (I looked through all of your app.js code to double-check).
Instead, I have changed your code to return a static HTML file with a button that lets you call this function (just as an example, so you can see that your routes do in fact work). This isn't the cleanest solution to your problem, but I wanted to make sure you see where the problem lies, as I've been in your shoes before when I first started working with Express. You can take a look at the CodeSandbox I set up below to replicate your issue and look through the code to get an understanding.
To properly solve your issue using the app.js file, you would have to serve that JavaScript file as your "frontend". Personally, I'm a big fan of React, so I usually serve my frontend with React while my backend is Express. You can also very easily serve this file using Node.js in a similar fashion to your "backend". If you were to take the React approach, you would be able to modify this code:
app.get("/addPerson", (req, res) => {
  res.sendFile(path.resolve(__dirname, "public", "index.html"));
});
to point at the frontend section you want using React (I can recommend react-router if you need multiple routes, but I don't want to overwhelm you with too much information yet) and accomplish the same thing. If you have any questions, feel free to reach out and let me know! Hopefully this helps!
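One small addition that isn't covered above (a sketch, keeping the names from the question): the POST handler never sends a response, so even once the route is reached, the client's await response.json() has nothing to parse and the request just hangs until it times out. Replying from the handler fixes that:

// POST request
app.post('/addPerson', (req, res) => {
  data.push(req.body)
  console.log(data);
  // Send something back so the client's response.json() can resolve
  res.json(data)
})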

How can I serve an updated JSON file with Express.js?

I am storing a single array of objects in a JSON file. I have a worker calling an API at an interval to update the JSON file. The JSON file is being served to my site from an Express.js server. When the endpoint is called, however, it serves the first version of the JSON file, not the most recent data.
I saw some mention that the browser may be caching the file; however, when I log the length of the array before it is served, it still has the original length.
I also tried requiring the file inside the get function, thinking that might be when the file is read, but there was no change.
Here is some code:
const express = require('express')
const app = express()
const port = 3001

let jobData = require('/exampleFile')

app.get('/jobs', async (req, res) => {
  let data = await jobData;
  console.log(data.length)
  await res.header('Access-Control-Allow-Origin', 'http://localhost:3000')
  res.set('Cache-Control', 'no-store, no-cache, must-revalidate, private')
  return await res.send(data)
})

app.listen(port, () => console.log(`Example app listening on port ${port}!`))
I just found a solution while writing the question. I changed the way the file is read to use the fs module instead of require:
const jobData = await JSON.parse(fs.readFileSync('/exampleFile', 'utf8'));
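The key point is that require() caches the module on first load, so the route kept serving that first snapshot. Re-reading the file inside the handler picks up the worker's updates on every request. A minimal sketch of the adjusted route (keeping the path and headers from the question):

const express = require('express')
const fs = require('fs')

const app = express()
const port = 3001

app.get('/jobs', (req, res) => {
  // Read the file fresh on every request instead of relying on require()'s cache
  const data = JSON.parse(fs.readFileSync('/exampleFile', 'utf8'))
  console.log(data.length)
  res.header('Access-Control-Allow-Origin', 'http://localhost:3000')
  res.set('Cache-Control', 'no-store, no-cache, must-revalidate, private')
  res.send(data)
})

app.listen(port, () => console.log(`Example app listening on port ${port}!`))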

How to save external API response to Firebase

I'm working on a React app where I present a list of top podcasts. I'm using the iTunes Search API to dynamically present data to the user. For now, I'm working with a Node/Express server to set up my custom endpoints. The problem is that the API has a request limit, so I thought I could save what I get from the response to Firebase and present the data from Firebase instead.
To my question:
Can I in some way save the response I get from the iTunes Search API to Firebase?
For now, my code for fetching data from my API endpoints looks like this in my Node + Express server:
const express = require('express');
const unirest = require('unirest');

const app = express();
const port = process.env.PORT || 5000;

// Get all episodes from a specific podcast
app.get('/api/podcast/episodes', (req, res) => {
  const feedurl = req.query.feedurl
  unirest.get(feedurl)
    .end((response) => {
      res.status(200).send(response.body)
    });
});

// Get podcast by ID
app.get('/api/podcast/:id', (req, res) => {
  const podID = req.params.id;
  unirest.get(`https://itunes.apple.com/lookup?id=${podID}&country=se`)
    .end((response) => {
      res.status(200).send(response.body)
    });
});

// Get top podcasts for a category
app.get('/api/podcast/:category/:amount', (req, res) => {
  const categoryID = req.params.category;
  const amount = req.params.amount;
  unirest.get(`https://itunes.apple.com/se/rss/toppodcasts/limit=${amount}/genre=${categoryID}/explicit=true/json`)
    .end((response) => {
      res.status(200).send(response.body)
    });
});

// Get podcast categories
app.get('/api/categorys', (req, res) => {
  unirest.get('https://itunes.apple.com/WebObjects/MZStoreServices.woa/ws/genres?id=26&cc=se')
    .end((response) => {
      res.status(200).send(response.body)
    });
});

app.listen(port, () => console.log(`Listening on port ${port}`));
I'm just looking for someone who could point me in the right direction on how to proceed, because for now I'm stuck, big time.
Depending on how long you want to cache the response, you can use a number of different things: a physical database like MySQL, SQLite, MongoDB, etc. to persist the data locally.
If you only want to keep the cached result for a short period of time, you can use an in-memory cache or any other tool that offers the same functionality. Redis is also a good contender as a temporary store, especially when you expect to scale to more than one node instance of your application.
Below, I have modified part of your code to cache the result for 10 minutes, using the memory-cache npm module:
const express = require('express');
const unirest = require('unirest');
const cache = require('memory-cache');

const CACHE_DURATION = 10 * 60 * 1000; // 10 mins
const app = express();
const port = process.env.PORT || 5000;

// Get all episodes from a specific podcast
app.get('/api/podcast/episodes', (req, res) => {
  const cacheKey = req.query.feedurl; // Or anything unique to this route
  const cachedData = cache.get(cacheKey);
  if (cachedData) {
    return res.json(cachedData);
  }
  const feedurl = req.query.feedurl
  unirest.get(feedurl)
    .end((response) => {
      res.status(200).send(response.body);
      cache.put(cacheKey, response.body, CACHE_DURATION);
    });
});
---- the rest of your code ----
You can hit the route as many times as you want and be guaranteed that data will be fetched from iTunes only once every 10 minutes.
The second and subsequent requests will be served much faster from the cache.
Let me know if this is what you are looking for.
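The answer above caches in memory rather than using Firebase. If you do want to persist the responses to Firebase as the question asks, a rough sketch with the firebase-admin SDK might look like the following (the podcasts collection name and the saveResponse/loadResponse helpers are illustrative, and a service-account credential is assumed to be configured):

const admin = require('firebase-admin');

// Assumes GOOGLE_APPLICATION_CREDENTIALS points at a service-account key file
admin.initializeApp({ credential: admin.credential.applicationDefault() });
const db = admin.firestore();

// Store an API response under a key of your choosing
async function saveResponse(key, body) {
  await db.collection('podcasts').doc(key).set({
    body,
    fetchedAt: admin.firestore.FieldValue.serverTimestamp(),
  });
}

// Read it back later instead of hitting iTunes again
async function loadResponse(key) {
  const snap = await db.collection('podcasts').doc(key).get();
  return snap.exists ? snap.data().body : null;
}

You could call saveResponse() in the same place the cache.put() call sits above, and try loadResponse() before going out to iTunes.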

res.download(NodeJS) not triggering a download on the browser

I've been struggling with this for a while and can't seem to find an answer. I'm developing a website with a budgeting option. I'm sending an object from the client to the server, and the server uses PDFKit to create a PDF version of the budget. Once it's created, I want to send that PDF back to the client and trigger a download. This is what I've done.
Client-side code:
let data = {
  nombre: this.state.name,
  email: this.state.email,
  telefono: this.state.phone,
  carrito: this.props.budget.cart,
  subTotal: this.props.budget.subTotal,
  IVA: this.props.budget.tax,
  total: this.props.budget.subTotal + this.props.budget.tax
}

axios({
  method: 'post',
  url: 'http://localhost:1337/api/budget',
  data: data
})
  .then((response) => {
    console.log('This is the response', response);
    window.open('/download')
  })
  .catch((error) => {
    alert(error);
  })
So that data reaches my server-side code perfectly, and it looks like this:
const pdf = require('pdfkit');
const fs = require('fs');
const path = require('path');

exports.makePDFBudget = (req, res) => {
  let myDoc = new pdf;
  myDoc.pipe(fs.createWriteStream(`PDFkit/budget.pdf`));
  myDoc.font('Times-Roman')
    .fontSize(12)
    .text(`${req.body.name} ${req.body.phone} ${req.body.email} ${req.body.cart} ${req.body.subTotal} ${req.body.total} ${req.body.tax}`);
  myDoc.end()
}
That's creating my PDF. What I want now is that once it's created and the response is sent back to the client, the client opens a new window at the URL "/download", which is set to download that PDF. But that's not happening for some reason: it opens the new window, but the download never starts, and it throws absolutely no error in my Node console or browser console.
This is how I send my file to the client:
const fs = require('fs');
const path = require('path');

exports.downloadPDFBudget = (req, res) => {
  res.download(__dirname + 'budget.pdf', 'budget.pdf');
}
And this is what my server index looks like:
const bodyParser = require('body-parser');
const express = require('express');
const app = express();
const api = express.Router();
const { makePDFBudget } = require('./PDFkit/makePDFBudget.js');
const { downloadPDFBudget } = require('./PDFkit/downloadPDFBudget.js')

app.use(express.static(__dirname + '/../public'));
app.use(bodyParser.urlencoded({ extended: true }));
app.use(bodyParser.json({ extended: true }));

api.route('/budget')
  .post(makePDFBudget)

api.route('/download')
  .get(downloadPDFBudget)

app.use('/api', api);

const port = 1337;
app.listen(port);
console.log('Listening on port ', port);

module.exports = app;
I just solved it. The port my client was running on was obviously different from the one my server was running on, so I had to open a window pointing at my server's port to trigger the download. I realized this because I put a console.log in the function that was supposed to do the res.download and it wasn't showing up. Thanks!
I guess the main problem is here:
res.download(__dirname + 'budget.jpg', 'budget.pdf');
Use the correct file name: your file is a pdf, not a jpg.
Also, with res.end(Buffer.from('budget.pdf')) you are sending a string, not the file contents, while the headers say you want to send a file.
Lastly, your application is designed as if it will only ever have one user. Could you add a userId to the file names? Or use a DB for storing the data and generate the PDF on request, without writing a file to the file system.
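Following that last suggestion, a rough sketch (not from the original answers) of generating the PDF on request and streaming it straight into the response, so no file is written to disk and the separate /download route becomes unnecessary; the fields used here follow the payload the client sends in the question:

const PDFDocument = require('pdfkit');

exports.makePDFBudget = (req, res) => {
  const doc = new PDFDocument();

  // Tell the browser the response is a PDF attachment
  res.setHeader('Content-Type', 'application/pdf');
  res.setHeader('Content-Disposition', 'attachment; filename="budget.pdf"');

  // Pipe the document straight into the HTTP response instead of a file on disk
  doc.pipe(res);
  doc.font('Times-Roman')
    .fontSize(12)
    .text(`${req.body.nombre} ${req.body.telefono} ${req.body.email}`);
  doc.end();
}

The client would then consume this POST response itself (for example by requesting it with responseType: 'blob' in axios and creating an object URL), rather than opening /download in a new window.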

Download a youtube video file in node js using ytdl

I want to make the user able to download a YouTube video using node-ytdl.
For example, when the client side makes a GET request to a certain route, the video should be downloaded in the response.
var ytdl = require('ytdl-core');
var express = require('express');

// Init app instance
var app = express();

app.get('/video', function (req, res) {
  var ytstream = ytdl("https://www.youtube.com/watch?v=hgvuvdyzYFc");
  ytstream.on('data', function (data) {
    res.write(data);
  })
  ytstream.on('end', function (data) {
    res.send();
  })
})
Above is my Node.js code. Even though the network tab seems to show the response being downloaded, it does not make the user download it as a file. I don't want to store any file on the server. It would be great if someone could help me solve this issue.
The res object is a writable stream, so you can pipe the output of ytdl directly into it like this:
ytdl("http://www.youtube.com/watch?v=xzjxhskd")
  .on("response", response => {
    // If you want to set the size of the file in the header
    res.setHeader("content-length", response.headers["content-length"]);
  })
  .pipe(res);
You also have to pass the headers. Try it:
app.get('/video', (req, res) => {
  var url = "https://www.youtube.com/watch?v=hgvuvdyzYFc";
  res.header("Content-Disposition", 'attachment; filename="Video.mp4"');
  ytdl(url, { format: 'mp4' }).pipe(res);
});
If someone is still getting an error, just update the package to the latest version by running:
npm i ytdl-core@latest
Ok, so make a string var, then add data to it on the data event. On end, send everything. Here is an example:
const ytdl = require("ytdl-core"),
      app = require("express")();

app.get("/video", (req, res) => {
  let data = "", vid = ytdl("https://www.youtube.com/watch?v=hgvuvdyzYFc");
  vid.on("data", d => data += d);
  vid.on("end", () => res.send(data));
  res.header("Content-Disposition", 'attachment; filename="Video.mp4"');
});
