Can't read UDP network stream in H.264 from opencv4nodejs (node.js)

I'm trying to get a video stream from a DJI 'Tello' drone with Node.js. The UDP server is working, because I can manually grab the stream and show it on my Mac with "ffplay udp://192.168.10.1:11111".
However, I can't get opencv4nodejs (a port of OpenCV) to work when constructing const capture = new VideoCapture('udp://192.168.10.1:11111', cv.CAP_FFMPEG).
Construction of the 'capture' object fails with this message:
'OpenCV: Couldn't read video stream from file "udp://192.168.10.1:11111"'
P.S.: if I use const capture = new VideoCapture(0), it does successfully construct everything using my Mac's camera.
Any help would be great, thank you!

Solution: the issue was that opencv4nodejs had been installed without ffmpeg being installed first. I ran npm install again, which rebuilt OpenCV so that it was now aware of FFMPEG, and everything worked as it was supposed to. I can now read from the UDP stream and forward the video to my front end as I want.

Related

How to convert m3u8 file to mp4 video using nodeJS with a remote S3 input

The objective is to convert an .m3u8 file (HLS stream) to an .mp4 video inside my Node.js application. I've tried doing the same using ffmpeg on the console, and that works fine, but I've been unable to find a recently maintained package that helps me do this in Node.
Also, my input file is not the usual file located in my local directory, but a remote AWS S3 object URL that is publicly accessible (public bucket). In simpler words:
How do I do ffmpeg -i https://mycloudfrontURL/myHLSfile.m3u8 output.mp4 in JS?
The best solution is to use ffmpeg directly in your Node.js application.
Install ffmpeg on your machine, then create a Node.js file that runs ffmpeg like so:
const { execSync } = require('child_process')
const input = 'https://mycloudfrontURL/myHLSfile.m3u8'
const output = 'output.mp4'
execSync(`ffmpeg -y -i "${input}" "${output}"`)
You'll be able to execute ffmpeg and get the same result as when running it from the terminal. Make sure ffmpeg is installed, or specify its full path if it doesn't work.

Node.js: Converting MP4 to GIF (puppeteer-lottie)

I am trying to convert a .mp4 file to .gif using npm packages, but nothing seems to be working.
I tried the gifski binary package (npm), but no luck. It says it's a binary package that you can use via child_process.spawn() or similar. I installed it with the -g (global) flag, but it doesn't seem to be recognized even then. I'm not sure if you can set PATH or anything; let me know if that's possible.
As for my other attempts, I used gify, and it just doesn't do anything (no file and no error).
I am getting the .mp4 file from puppeteer-lottie NPM package. Here's my code, if needed for testing:
const renderLottie = require('puppeteer-lottie');
await renderLottie({ animationData: data, output: 'example.mp4', width: 640 });
animationData: Sticker JSON
I'm pretty sure there are much easier ways to do this, but I'm just using complex ones. I just want .mp4 to .gif in the end.
Thanks for your time.
For those of you who are still trying to find a solution, I finally found it.
We just use ffmpeg with child_process.exec(). No npm package needs to be installed (ffmpeg itself must be available on your PATH).
const { exec } = require("child_process");
// Pass a callback so conversion errors are not silently swallowed.
exec("ffmpeg -i input.mp4 -qscale 0 output.gif", (err) => {
  if (err) console.error(err);
});
input.mp4 is the mp4 file you want to convert, and output.gif is the resulting gif file.
Source: Conrad Lotz's answer

Cannot find ffprobe?

I am trying to generate a video thumbnail in my Node project. For that I tried the thumbsupply and video-thumbnail npm packages; both return the same error: not found: ffprobe.
const thumbsupply = require('thumbsupply');
const ffprobe = require('@ffprobe-installer/ffprobe');

let aa = thumbsupply.generateThumbnail('videoplayback.mp4', {
    size: thumbsupply.ThumbSize.MEDIUM, // or ThumbSize.LARGE
    timestamp: "10%", // or `30` for 30 seconds
    forceCreate: true,
    cacheDir: "~/myapp/cache",
    mimetype: "video/mp4"
})
console.log(aa); // note: generateThumbnail returns a Promise
Thumbsupply uses fluent-ffmpeg (from a quick look at the source):
https://github.com/fluent-ffmpeg/node-fluent-ffmpeg
fluent-ffmpeg has information on the requirements around ffmpeg installation and the required path at the link above.
Prerequisites
ffmpeg and ffprobe
fluent-ffmpeg requires ffmpeg >= 0.9 to work. It may work with previous versions, but several features won't be available (and the library is no longer tested with lower versions).
If the FFMPEG_PATH environment variable is set, fluent-ffmpeg will use it as the full path to the ffmpeg executable. Otherwise, it will attempt to call ffmpeg directly (so it should be in your PATH). You must also have ffprobe installed (it comes with ffmpeg in most distributions). Similarly, fluent-ffmpeg will use the FFPROBE_PATH environment variable if it is set, otherwise it will attempt to call it in the PATH.
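The lookup described above can be mirrored in your own code to see which binary will actually be invoked. A sketch (resolveFfprobe is a hypothetical helper, not part of fluent-ffmpeg itself):

```javascript
// Mirror fluent-ffmpeg's lookup: prefer FFPROBE_PATH, else rely on the PATH.
function resolveFfprobe(env = process.env) {
  return env.FFPROBE_PATH && env.FFPROBE_PATH.length > 0
    ? env.FFPROBE_PATH
    : 'ffprobe'; // resolved against PATH at spawn time
}

console.log('will invoke:', resolveFfprobe());
```

If this prints a bare 'ffprobe' and the error persists, the binary simply is not on the PATH of the process (which can differ from your shell's PATH, e.g. under pm2 or a GUI launcher).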
ffmpeg details including installation is here: https://www.ffmpeg.org/download.html
I was also getting this issue: Error: Cannot find ffprobe
I did the following steps to make it work on Ubuntu:
sudo apt update
sudo apt install ffmpeg
Then I restarted my pm2 API instance.
That was the case on my deployment server; since I was working locally on macOS, I used
brew install ffmpeg

Error: spawn EACCES node-webkit ffmpeg with fluent-ffmpeg

I'm building a node-webkit app and trying to bundle ffmpeg inside it so the user doesn't have to have ffmpeg installed on their system.
EACCES has to do with the permissions for running the code. I tried chmod -R ug+rw ffmpegFolder, to no avail.
I downloaded the OS X binaries from here: https://evermeet.cx/ffmpeg/ I'm assuming these are compiled, but I could be wrong.
I am bundling the extracted ffmpeg folder into the root of my .nw, which, extracted, looks like this:
This part has to do with fluent-ffmpeg.
It has a method called setFfmpegPath(path), which tells fluent-ffmpeg to use an ffmpeg binary you provide.
I get the fs.realpath to ffmpeg-2.5.4 and use that. Using ./ffmpeg-2.5.4, /ffmpeg-2.5.4, or ffmpeg-2.5.4 just gives a spawn ENOENT error, which I've read means 'not found'.
If I remove setFfmpegPath from my fluent-ffmpeg command, it works fine using my system ffmpeg.
I feel I'm on the right track with the spawn EACCES error. How do I make it play nice, though?

Beagleboard Angstrom Linux, Image Capture Script streamer alternative

I want to take a snapshot from my Logitech webcam at a desired resolution and save the image using a Linux bash script. I need to do this on my BeagleBoard with the Angstrom image. On the BeagleBoard I can capture using Cheese, but I don't know how to capture from the terminal with a script.
On my host computer I am using streamer with
streamer -c /dev/video0 -b 16 -o outfile.jpeg
but I don't know how to take snapshots on Angstrom. Can you make suggestions? How can I capture from the command line?
Regards
I've used mjpg-streamer with some success. It sends a video stream through port 8080, although you can change that by editing the startup script.
I used instructions from here, although I skipped the make install part and just ran it from my home directory. It worked both with the default Angstrom image and with Debian running off the SD card (i.e., non-flashed).
You can view the stream by pointing your browser (either local or over-the-LAN) to http://beagle.address:8080/?action=x, where x is stream or snapshot. I trust those parameters are self-explanatory :).
You can use a text-based browser such as links to open the URL, and links will then prompt you for the filename of the image. That's for testing; then I suppose you can find a way to save the snapshot without human intervention if you intend to use it from a script.
I'm using gstreamer to capture webcam input on a BeagleBone with a Logitech webcam. You'll need gstreamer with gstreamer-utils installed. I'm using Ubuntu, and they can be found in the standard repos. Here's the CLI command:
gst-launch v4l2src num-buffers=1 ! ffmpegcolorspace ! video/x-raw-yuv,width=320,height=240 ! jpegenc ! filesink location=test.jpg
Unfortunately, I'm experiencing issues after a number of captures, as the pipeline freezes on v4l2src. Maybe you'll have better luck with your setup.
