Get video resolution in nodejs - node.js

I have been trying to get an answer to this without really finding any. Excuse me if this sounds stupid or obvious.
I have a nodejs application and basically I would like to simply get the resolution of a video. Imagine I have film stored on disk and I would like to be able to know if it is in 720p or 1080p or anything else.
I understood that I might need to use ffmpeg for this, but ffmpeg describes itself as a tool to "record, convert and stream audio and video files", which does not obviously include retrieving a video's resolution.
Thank you for your help
Edit 1:
The node.js app is a desktop app and needs to be portable to Linux, Windows and OS X. A portable answer would be preferred, but of course any answer is welcome.

To be honest, I think the best method I found was to use fluent-ffmpeg with ffprobe, as you are able to set the path to the executable. The only problem is that ffmpeg has to be shipped with the app, so different executables have to be shipped, one per distribution/OS. If anyone has anything better, I am open to answers.
Getting the width, height and aspect ratio using fluent-ffmpeg is done like so:
var ffmpeg = require('fluent-ffmpeg');

ffmpeg.setFfprobePath(pathToFfprobeExecutable);
ffmpeg.ffprobe(pathToYourVideo, function (err, metadata) {
  if (err) {
    console.error(err);
  } else {
    // the video entries in metadata.streams contain
    // 'width', 'height' and 'display_aspect_ratio'
    console.log(metadata);
  }
});
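Since ffprobe reports dimensions per stream rather than at the top level, a small helper can pull them out of the first video stream. This is a sketch; the field names assume ffprobe's standard JSON layout:

```javascript
// Extract { width, height } from ffprobe-style metadata.
// Returns null if no video stream is present.
function getVideoDimensions(metadata) {
  const stream = (metadata.streams || []).find(s => s.codec_type === 'video');
  if (!stream) return null;
  return { width: stream.width, height: stream.height };
}
```

You would call it inside the ffprobe callback above, e.g. `console.log(getVideoDimensions(metadata));`.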

There's an npm package called get-video-dimensions that also uses ffprobe and is much easier to use. It also supports promises and async/await.
import getDimensions from 'get-video-dimensions';
Using promise:
getDimensions('video.mp4').then(dimensions => {
  console.log(dimensions.width);
  console.log(dimensions.height);
});
or async/await:
const dimensions = await getDimensions('video.mp4');
console.log(dimensions.width);
console.log(dimensions.height);

I use node-ffprobe to accomplish this for images:
var probe = require('/usr/lib/node_modules/node-ffprobe');

probe(filePath, function (err, data) {
  // the 'data' variable contains the information about the media file
});

fileMetaData will contain width, height, codec info, aspect ratio, etc.:
const ffprobe = require('ffprobe')
const ffprobeStatic = require('ffprobe-static')
const fileMetaData = await ffprobe(fileName, { path: ffprobeStatic.path })
fileName can be the path to a video ('webm', 'mov', 'wmv', 'mpg', 'mpeg', 'mp4', 'flv', etc.) or an image (jpg, gif, png, etc.).
fileName example: /path/to/video.mp4 or http://example.com/video.mp4
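Since the original question asks whether a film is 720p or 1080p, a tiny helper can map the probed height to a common label. The thresholds below are a convention I'm assuming, not something any of these libraries provide:

```javascript
// Map a frame height to a familiar resolution label.
// Thresholds are approximate conventions (e.g. 1920x1080 -> "1080p").
function resolutionLabel(height) {
  if (height >= 2160) return '4K';
  if (height >= 1080) return '1080p';
  if (height >= 720) return '720p';
  if (height >= 480) return '480p';
  return `${height}p`;
}
```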

One way to do this would be to run another application as a child process and get the resolution from standard out. I'm not aware of any pure Node.js solution for this.
See child_process.exec https://nodejs.org/api/child_process.html#child_process_child_process_exec_command_options_callback
and ffprobe
How can I get the resolution (width and height) for a video file from a linux command line?

Related

Get 'sharp' metadata after piping through another module

I want to compress an image using Sharp image processing library, pass it through an external quant library and then get the sharp metadata for it. I want to do this to actually overlay the compressed image size onto the image (during development only).
For a WEBP this is easy because everything is in the sharp pipeline.
// specify the compression
myImage.webp({ quality: 80 });

// actually compress it
var tempBuffer = await myImage.toBuffer({ resolveWithObject: true });

// create a new sharp object to read the metadata
var compressedImage = sharp(tempBuffer.data);

// the compressed size is on the resolved info object
console.log(tempBuffer.info.size / 1024);
But when using the quant library I'm piping it into a third party library and so it's no longer a sharp object. I need to get the raw buffer out again in the most efficient way. I'm new to Node.js and don't know how to do this.
resizedImage.png()
.pipe(new quant(['--quality=50-70', '--speed', '1', '-']));
Do I need to use something like https://www.npmjs.com/package/stream-to-array ?
That seems crazy to me! Am I missing something?
Figured it out. You can just pipe it back into sharp() like this:
resizedImage.png()
.pipe(new quant(['--quality=50-70', '--speed', '1', '-']))
.pipe(sharp());
Then you can call metadata() or further resizing etc. (not that you'd normally do that!)

How to control output bit depth in graphicsmagick (for Node)?

So I have two PNG images, both non-transparent 24bpp.
One image contains a rainbow, other one contains a single line of text:
I do the same thing with both of them:
var gm = require('gm').subClass({ imageMagick: true });

gm("./sources/source.png").bitdepth(24).write("test.png", function () {
  console.log("test.png");
});

gm("./sources/source2.png").bitdepth(24).write("test2.png", function () {
  console.log("test2.png");
});
where gm is this
And I set both to 24bpp explicitly.
As a result, I get two images with different bit depths:
In some cases I also got a 32bpp image.
How can I make it create only 24bpp images (discarding the alpha channel if needed)?
Also, I don't want to create JPGs.
Thanks to @mark-setchell, I could force the bit depth. I did it this way in Node:
gm("./sources/source.png")
  .out("-define")
  .out("png:color-type=2")
  .write("test.png", function () {
    console.log("test.png");
  });
out() is an undocumented method, but it basically lets you add custom parameters to the command line. Note that
.out("-define png:color-type=2")
won't work; it only works if you pass each parameter in an individual .out() call.
.bitdepth(24) doesn't seem to affect the output at all, probably because I used .subClass({imageMagick: true}) above.
My suggestion is to try using -define to set the variable png:color-type=2. As you worked out, and kindly shared with the community, it is done as follows:
gm("./sources/source.png")
  .out("-define")
  .out("png:color-type=2")
  .write("test.png", function () {
    console.log("test.png");
  });

Best way to record a HTML Canvas/WebGL animation server-side into a video?

I have a set of animations which I can make in Canvas (fabric.js) or WebGL (three.js). I need to record them automatically, server-side, through a script and output a video file.
The animations include:
Pictures
Videos (with audio)
Other animations/effects
I have researched this a lot over the last few months.
Results
1. Use PhantomJS + FFMPEG
Run the HTML canvas animations in a headless browser (PhantomJS) and record with FFMPEG. The issue here is that PhantomJS supports neither WebGL nor the video element. http://phantomjs.org/supported-web-standards.html
2. Use Websockets to send data back to server using DataURL
Here again, we would need to run the animations in a browser (which we can't, because everything has to happen on the server).
3. Use node-canvas
This is a library by TJ Holowaychuk that allows rendering an HTML canvas on Node.js. But it has its own limitations, and I haven't really explored it much.
(If someone could shed more light on this library.)
It would help if anyone has done this before or can point me somewhere useful.
All we need to do is use some data to create animations and record them into a video, everything server-side.
You can use Electron to render WebGL pages with the BrowserWindow option "show" set to false, and/or use xvfb-run to run headless.
I don't think node-canvas supports the WebGL context, so you'll have to use a library built around 2D drawing, and it certainly won't have support for any video codecs.

If you can get your animation to work using node-canvas, you can grab the animated frames at a rate appropriate for your content, something like this:

Disclosure: I've successfully used FFmpeg to encode a sequence of externally generated images, but haven't tried the setInterval() method below. In addition to the animation overhead itself, I don't know how exporting a canvas to PNG files at 30 FPS would perform.
const fs = require('fs');
const path = require('path');

// assuming "canvas" is asynchronously drawn on some interval
function saveCanvas(canvas, destFile) {
  return new Promise((resolve, reject) => {
    const ext = path.extname(destFile),
          encoder = '.png' === ext ? 'pngStream' : 'jpegStream';
    let writable = fs.createWriteStream(destFile),
        readable = canvas[encoder]();
    writable
      .on('finish', resolve)
      .on('error', err => {
        let msg = `cannot write "${destFile}": ${err.message}`;
        reject(new Error(msg));
      });
    readable
      .on('end', () => writable.end())
      .on('error', err => {
        let msg = `cannot encode "${destFile}": ${err.message}`;
        reject(new Error(msg));
      });
    readable.pipe(writable);
  });
}

const FPS = 30;
let frame = 0,
    tasks = [],
    interval = setInterval(() => tasks.push(
      saveCanvas(canvas, `frame_${frame++}.png`)), 1000 / FPS);

// when animation is done, stop timer
// and wait for images to be written
clearInterval(interval);
Promise.all(tasks).then(encodeVideo);

function encodeVideo() {
  // too much code to show here, but basically run FFmpeg
  // externally with "-i" option containing "frame_%d.png"
  // and "-r" = FPS. If you want to encode to VP9 + WEBM,
  // definitely see: http://wiki.webmproject.org/ffmpeg/vp9-encoding-guide
}
And then use FFmpeg to encode a sequence of images into a video.
For the code behind encodeVideo(), you can look at this example.
Edit: There may be an issue with canvas.pngStream() writing incorrect frames while the animation loop continuously draws on that one canvas; maybe a copy of the canvas needs to be created per frame? That would surely create significant memory pressure.
I think Chromium's headless mode might already support WebGL, which is another possibility. The video rendering part is yet to come, though:
https://bugs.chromium.org/p/chromium/issues/detail?id=781117
CCapture.js makes this pretty easy.

nodejs image manipulation with gm / imagemagick

I'm writing a simple app that downloads JPEG images from the Flickr API and then processes them.
All I want to do is pick 4 random pixels from each image and save the HEX values.
Is it possible at all? I've read a lot of the GraphicsMagick documentation but can't find a way to do this.
What's the best way to decode a JPEG and get these values? I tried a few plugins, but none can do this by default.
Take care!
https://npmjs.org/package/get-pixels seems nice for that:
var getPixels = require("get-pixels");

getPixels("lena.png", function (err, pixels) {
  if (err) {
    console.log("Bad image path");
    return;
  }
  console.log("got pixels", pixels.shape);
});
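To finish the original task (four random pixels as hex values), the `pixels` ndarray can be sampled like this. I'm assuming a [width, height, channels] shape with RGBA channel order, which is what get-pixels documents for still images:

```javascript
// Convert one pixel of an RGBA ndarray to a "#rrggbb" hex string.
function pixelToHex(pixels, x, y) {
  const toHex = v => v.toString(16).padStart(2, '0');
  return '#' + toHex(pixels.get(x, y, 0))
             + toHex(pixels.get(x, y, 1))
             + toHex(pixels.get(x, y, 2));
}

// Pick n random (x, y) coordinates and return their hex colors.
function randomHexSamples(pixels, n) {
  const [width, height] = pixels.shape;
  const out = [];
  for (let i = 0; i < n; i++) {
    const x = Math.floor(Math.random() * width);
    const y = Math.floor(Math.random() * height);
    out.push(pixelToHex(pixels, x, y));
  }
  return out;
}
```

Inside the getPixels callback above, `randomHexSamples(pixels, 4)` would give the four values to save.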

Play audio with Node.JS

I'm currently using child_process and the command-line mplayer to play audio on the local machine from my Node.JS application. This works, but it's not really an excellent solution. My biggest issue is that it takes 500ms from when mplayer is started until the audio starts playing.
Are there better ways to play audio? Preferably compressed audio, but I'll take what I can get.
I would suggest using node-speaker, which outputs raw PCM data to your speakers (so basically, it plays audio).
If you're playing something like mp3 files you might need to decode it first to PCM data, which is exactly what node-lame does.
Hope that helps.
Simplest I've found (on Mac OS) is to use
exec('afplay whatever.mp3', audioEndCallback)
Introducing audic. It doesn't use any native dependencies, so it can't break like the answers higher up can.
Observe:
import Audic from 'audic';
const audic = new Audic('audio.mp3');
await audic.play();
audic.addEventListener('ended', () => {
audic.destroy();
});
or more simply:
import {playAudioFile} from 'audic';
await playAudioFile('audio.mp3');
I think what you're asking is: are there any good modules for working with audio in the Node.js ecosystem?
Whenever you have this type of question, you should first go to npmjs and search for an appropriate keyword.
Here is a list of audio-related modules I found on the npmjs site.
substack's baudio looks good to me.
You can also use the play-sound module.
Install it using npm:
npm install play-sound --save
Now use it in your code:
var player = require('play-sound')(opts = {});

player.play('./music/somebody20.flac', function (err) {
  if (err) throw err;
  console.log("Audio finished");
});
Check out sound-play, it's a simple solution that works on Windows and MacOS without using external players:
const sound = require('sound-play')
sound.play('music.mp3')
Disclaimer: I'm the author of this package.
Check out node-groove - Node.js binding to libgroove:
This library provides decoding and encoding of audio on a playlist. It is intended to be used as a backend for music player applications, however it is generic enough to be used as a backend for any audio processing utility.
Disclaimer: I wrote the library, which is free, open source, and not affiliated with any product, service, or company.
You can use the sound-play module together with path to achieve this:
npm install sound-play
import path from 'path';
import sound from 'sound-play';

const __dirname = path.resolve();
const filePath = path.join(__dirname, "file.mp3");
sound.play(filePath);
