Play local .avi videos in Node.js / Electron app

A maddening gap in an app I'm developing: there appears to be little (or no) support for AVI in the HTML5 video implementation. So I need a workaround that is cross-platform and packageable with my Electron app.
Videos are hosted locally
I'm not averse to encoding on the fly (ffmpeg avi -> mp4 and use HTML5 natively?)
WebChimera appears to be dying due to VLC and Electron changes (the devs can't keep up). Is there another npm package that can do this?
A wrapper that calls a native VLC instance might work -- but how do I ensure that VLC is available on the system with my packaging?
Should I just spawn a native app in a separate window (ie, Totem on Linux)? (seems clunky)
The latest videojs-java plugin apparently has issues (https://github.com/Afterster/videojs-java/issues/2), and adding another layer (Java) to the Electron stack seems somehow unsavory.
ffbinaries (https://github.com/vot/ffbinaries-node) seems promising... but oddly ffplay is not available for Linux (though I suspect my Linux users likely have ffmpeg installed already).
NB: Files are decidedly AVI. I can't change this.
Any hints / pointers greatly appreciated!
UPDATE
On my system, using ffmpeg to convert:
ffmpeg -i infile.AVI -vcodec copy -acodec copy outfile.mp4
Takes no time at all (they are short videos):
real 0m0.138s
user 0m0.100s
sys 0m0.032s
So I'm leaning toward packaging ffmpeg with my program and converting before loading.

Take a look at this project:
https://github.com/RIAEvangelist/electron-video-player
According to the known supported formats:
https://github.com/RIAEvangelist/electron-video-player#known-supported-video-types
it supports:
mp4
webm
ogg
mov (MPEG4 | H.264)
avi (MPEG4 | H.264)
mkv (MPEG4 | H.264)
m4v (MPEG4 | H.264)
Take a look at its source code and see if you can implement it similarly.
You said you need AVI support, but AVI is just a container; if your files use codecs other than the ones supported by this project, you will still need to transcode them first.
If you cannot do it like this then you may try using something similar to:
https://www.npmjs.com/package/mplayermanager
and bundle mplayer with your app, or some other player.

According to this SO answer, Electron now supports multiple video formats in the <video> tag, including .mkv, .avi and other formats, so you don't need to rely on an external player.

Related

How to merge video file with audio file and maintain creation time?

I was fiddling around with youtube-dl and ended up with a download whose generated audio and video youtube-dl wasn't able to merge. After some investigation, I found that there was an issue in my ffmpeg config.
Normally, if you run youtube-dl a second time after fixing ffmpeg, it will automatically merge the files for you. But as fate would have it, the online video has since been deleted, so youtube-dl freaks out.
Fortunately ffmpeg itself can also merge audio and video files, but doing so loses a very nice feature of youtube-dl's implementation: keeping the creation time of the files (i.e. creation rather than download or publication time).
Is there any way to merge an audio and video file and keep the creation/last modified date?
Here's my own solution on macOS (it should work on any UNIX), partially adapted from https://superuser.com/a/277667/776444:
I'm sure there's a way to do this using only ffmpeg, but I ended up using touch:
ffmpeg -i originalVideo.mp4 -i originalAudio.mp4 -c:v copy -c:a aac combined.mp4
touch -r originalVideo.mp4 combined.mp4
Using these, I was able to change the file creation time for combined.mp4 to 28 April 2020, to match originalVideo.mp4.

mpeg-dash with live stream

I would like to use MPEG-DASH in situations where I am constantly receiving a live video stream from a client. The web server receives the live stream, keeps generating .m4s segment files, and declares them in the MPD, so new segments can be played back continuously.
(I'm using FFMPEG's ffserver. So the video stream continues to accumulate in /tmp/feed1.ffm file.)
MP4Box seems to be able to generate the MPD, init.mp4, and .m4s segments for already-existing files, but it does not seem to support live streaming.
I want fragmented mp4 in segment format rather than mpeg-ts.
Any advice would be appreciated!
GPAC maintainer here. The DashCast project (and likely its dashcastx replacement from our Signals platform) should help you. Please open issues on GitHub if you run into problems.
Please note that there are some projects like this one using FFmpeg to generate some HLS and then GPAC to ingest the TS segments to produce MPEG-DASH. This introduces some latency but proved to be very robust.
Below information may be useful.
The latest ffmpeg supports live streaming and also MP4 fragmenting.
Example command
ffmpeg -re -y -i <input> -c copy -f dash -window_size 10 -use_template 1 -use_timeline 1 <ClearLive>.mpd

How to read video file using v4l2

I want to read a video file, say an AVI file, using v4l2, and read it frame by frame.
As far as I can tell I need to use the read() function, but how to do so isn't clear to me, and there are hardly any examples available. A simple example of how to do this would help.
This is not what the Video4Linux2 (V4L2) API is for. It is not designed for reading multimedia files from disk, decoding them and playing them. Rather, it is designed to interface to assorted multimedia input devices (like webcams, microphones, TV tuners, and video capture devices), capture A/V data, and play it.
Take it from the V4L2 API introduction:
Video For Linux Two is [...] a kernel interface for analog radio and
video capture and output drivers.
For reading an AVI file and decoding/playing it (programmatically) on Linux, look into FFmpeg or GStreamer.

Video encoding in real time through node js?

Does anyone know how to encode video in real time in Node with ffmpeg? I know Transloadit did this well. Any ideas?
https://transloadit.com/blog/2010/12/realtime-encoding-over-150x-faster
Co-founder at Transloadit here : ) We used pipes. Node.js allows us to see the data as it is still being uploaded (we used our node-formidable module). FFmpeg allows using stdin for input via ffmpeg -i -. So you can pipe uploaded bytes into that spawned child_process's stdin, and that's that : )
Off-topic, we later deprecated the feature. It turned out the market for it was less interested than we had imagined, and it sadly introduced enough operational headaches that we said goodbye to this, by us, beloved feature.
Use fluent-ffmpeg, a tremendous module that can transcode on the fly:
https://github.com/schaermu/node-fluent-ffmpeg

UDP live webcam streaming with VLC on linux to iPhone

I've searched the web for something like this, but everything is about 'saving the file on the iPod', plus some off-topic solutions and examples that I can't really use.
I want my Linux laptop to stream the webcam media to my iPod (with my own code), and I am really lost on this. I could use VLC to stream it over UDP from the laptop, but how would I receive and show it on the iPod side? Should I use MPMoviePlayer?
Note: I could send the frames as IplImage (from my Linux Code::Blocks project) to the iPod, and from what I've found on the web there are methods to transform an IplImage to a UIImage on the iPod, but I don't know how to make it work.
Thanks for any help.
I think your best bet would be to use ffmpeg. When you just want to stream videos that you have taken with your webcam, you can encode them correctly with ffmpeg and the iPhone will automagically do a progressive download; in most cases this will do.
Something like this :
ffmpeg -i $1 -acodec libfaac -ab 128kb -vcodec mpeg4 -b 1200kb -mbd 2 -flags +4mv -cmp 2 -subcmp 2 -s 320x180 $1.mp4
If, however, you need real live streaming, I would look at the following projects.
I played around with this project: http://www.ioncannon.net/projects/http-live-video-stream-segmenter-and-distributor/ but I remember it being quite complex. It should, however, provide what you want.
Considering you are using a Linux laptop, you might be interested in http://www.mythtv.org/wiki/Streaming_to_iPod_touch_or_iPhone
I think, however, it is missing the realtime component.
On the iPod side I think MPMoviePlayerViewController (iOS 4.0) is the way to go; just implement the delegate methods and, if the iPod can play it, this controller will handle everything.
I hope this helps; if you find an (easy) solution, let us know ;-).
