nginx rtmp to hls streaming - http-live-streaming

My scenario is to pull data from an RTSP source via ffmpeg, send it to nginx-rtmp, and use nginx to provide HLS playback. There are quite a lot of tutorials and Q&As on the internet. I followed this one:
https://docs.peer5.com/guides/setting-up-hls-live-streaming-server-using-nginx/
However, it failed miserably. To keep things simple to understand, I would like to ask the core question first:
Who is responsible for creating the m3u8 playlist file?
I tried to experiment in two steps: first, push a local mp4 file and play it back via HLS.
Following the tutorial above, I used ffmpeg to push a local mp4 file to nginx-rtmp and videojs to play it. The browser reported an error:
VIDEOJS: ERROR: (CODE:4 MEDIA_ERR_SRC_NOT_SUPPORTED) No compatible source was found for this media.
Secondly, I have successfully saved the video pushed to nginx-rtmp as a series of FLV files, and I know that I can use exec_push to call ffmpeg to convert the FLV to a format compatible with HLS. Again, the core question here is how to create and UPDATE the m3u8 file as new video data keeps coming in endlessly.
For now, I would like experts to help me tackle the first question -- playing back a static mp4 file through HLS. Any tutorials on m3u8 playlists and MPEG-TS files would also be much appreciated!
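For concreteness, the VIDEOJS CODE:4 error usually means the player could not use the given source URL or MIME type at all. A minimal player.html sketch for testing might look like the following; the stream URL, port, and element id are assumptions to adapt, and video.js 7+ bundles its own HLS engine so no extra plugin is needed in Chrome:

```html
<!DOCTYPE html>
<html>
<head>
  <link href="https://vjs.zencdn.net/7.20.3/video-js.css" rel="stylesheet" />
</head>
<body>
  <!-- type must be application/x-mpegURL so video.js selects its HLS engine -->
  <video-js id="player" class="vjs-default-skin" controls width="640" height="360">
    <source src="http://localhost:8080/hls/stream.m3u8" type="application/x-mpegURL" />
  </video-js>
  <script src="https://vjs.zencdn.net/7.20.3/video.min.js"></script>
  <script>
    var player = videojs('player');
  </script>
</body>
</html>
```

If the same m3u8 URL plays when opened directly (e.g. in Safari) but fails in this page, the mismatch is most likely the `type` attribute or a CORS issue, not the stream itself.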

The nginx-rtmp module by itself creates and updates the playlist as new segments arrive.
To troubleshoot, check whether the .m3u8 files are created under the folder specified in hls_path in your nginx conf. The rest is just nginx serving a file over HTTP.
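For reference, the hls_path wiring in nginx.conf might look like this sketch (the port numbers, paths, and the application name `live` are assumptions to adapt):

```nginx
rtmp {
    server {
        listen 1935;
        application live {
            live on;
            hls on;                     # nginx-rtmp writes segments and the playlist itself
            hls_path /tmp/hls;          # .ts segments and stream.m3u8 appear here
            hls_fragment 3s;            # target segment duration
            hls_playlist_length 60s;    # sliding-window length of the live playlist
        }
    }
}

http {
    server {
        listen 8080;
        location /hls {
            types {
                application/vnd.apple.mpegurl m3u8;
                video/mp2t ts;
            }
            root /tmp;                  # /tmp/hls/stream.m3u8 is served at /hls/stream.m3u8
            add_header Cache-Control no-cache;
        }
    }
}
```

With a config like this, pushing with something like `ffmpeg -re -i input.mp4 -c:v libx264 -c:a aac -f flv rtmp://localhost/live/stream` should make `/tmp/hls/stream.m3u8` appear within a few seconds (HLS wants H.264 video and AAC audio, so `-c copy` only works if the source already uses those codecs).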
If that works, try the HLS URL directly in Safari (which has a built-in HLS player) or in Chrome with the "Play HLS M3u8" extension enabled.
If that works too, the problem must be in your player.html.
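To demystify what "creates and updates the playlist" means: a live m3u8 is just a small text file that the segmenter rewrites after every new segment, dropping old entries and advancing #EXT-X-MEDIA-SEQUENCE. The toy sketch below (filenames hypothetical; nginx-rtmp or ffmpeg does this for you) shows the sliding-window rewrite:

```python
# Toy illustration of sliding-window HLS playlist maintenance.
# A real segmenter rewrites the .m3u8 like this after each new .ts segment.

def render_playlist(segments, first_seq, target_duration=4):
    """Render a live media playlist for the given (filename, duration) window."""
    lines = [
        "#EXTM3U",
        "#EXT-X-VERSION:3",
        f"#EXT-X-TARGETDURATION:{target_duration}",
        f"#EXT-X-MEDIA-SEQUENCE:{first_seq}",  # sequence number of the first listed segment
    ]
    for name, duration in segments:
        lines.append(f"#EXTINF:{duration:.3f},")
        lines.append(name)
    # No #EXT-X-ENDLIST line: its absence is what marks the stream as live.
    return "\n".join(lines) + "\n"

window = []        # (filename, duration) pairs currently in the playlist
first_seq = 0      # media sequence number of window[0]
WINDOW_SIZE = 3

for i in range(5):                     # pretend 5 segments arrive over time
    window.append((f"stream-{i}.ts", 4.0))
    if len(window) > WINDOW_SIZE:      # slide the window: drop oldest, bump sequence
        window.pop(0)
        first_seq += 1
    playlist = render_playlist(window, first_seq)

print(playlist)
```

After five segments, the playlist lists only stream-2.ts through stream-4.ts with a media sequence of 2; a player that re-fetches the file keeps up with the stream that way.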

Related

Play hls m3u8 format in nextjs

How can I use the HLS m3u8 format in Next.js?
I tried with the video.js CDN but I couldn't get it working. Hls.js and react-player are not compatible with the latest version.
Please check the example at this link: https://stackblitz.com/edit/github-hgv7za-ezygog.
Currently, it is working for me with the sample m3u8 file located at https://bitdash-a.akamaihd.net/content/sintel/hls/playlist.m3u8.
If you encounter a specific error, please provide more details, such as the console log, etc.

Embed and play sound (via audio element) in electron app

The actual task is to play an mp3 file that is embedded with the Electron app.
I placed the mp3 file next to my Electron app's main.js.
I wanted to play it using the sound-play package (https://github.com/nomadhoc/sound-play) through the bridge API, but it didn't work out; no sound plays.
So I want to just play it using an audio element (which seems more universal), but I don't know what src URL I should supply to it. I also wonder how to correctly add an extra asset to the built version (I'm using electron-builder).
Working with extra/local resources in Electron apps is kind of an obscure topic for me. Advice would be appreciated.

Chromecast from a node media server

I used the npm package hls-server to set up an HLS streaming server on my dev machine. Then I took the Android sample app and tried adding another video whose URL pointed at my localhost.
The video plays fine inside the sample app, but I am unable to cast it to the Chromecast.
I tried changing the media type and also set up a CORS proxy using the npm package cors-anywhere.
I used ffmpeg to generate the HLS playlist and .ts files.
It would be great if someone could help out.
The problem was with my HLS stream. What I did was use the sample receiver app from Google and attach a debugger to see the error codes/exceptions.
Then I took another video and used ffmpeg to produce the HLS output again, and that worked out fine.
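A lot of "plays locally but not on the Chromecast" cases come down to a malformed playlist (or missing CORS headers, since the Chromecast fetches the playlist and segments itself). Before re-encoding, it can help to sanity-check the playlist text; a rough sketch of such a check (the sample playlist is hypothetical, and this covers only a few obvious rules from the HLS spec):

```python
def check_hls_playlist(text):
    """Return a list of obvious problems in an HLS media playlist, or [] if it looks sane."""
    problems = []
    lines = [l.strip() for l in text.strip().splitlines() if l.strip()]
    if not lines or lines[0] != "#EXTM3U":
        problems.append("playlist must start with #EXTM3U")
    if not any(l.startswith("#EXT-X-TARGETDURATION") for l in lines):
        problems.append("missing #EXT-X-TARGETDURATION")
    # Every #EXTINF line must be immediately followed by a segment URI.
    for i, line in enumerate(lines):
        if line.startswith("#EXTINF"):
            if i + 1 >= len(lines) or lines[i + 1].startswith("#"):
                problems.append(f"#EXTINF at line {i} has no segment URI after it")
    return problems

sample = """#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:4
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:4.000,
seg0.ts
#EXTINF:4.000,
seg1.ts
#EXT-X-ENDLIST
"""
print(check_hls_playlist(sample))   # prints [] for this well-formed VOD playlist
```

Note that a CORS proxy such as cors-anywhere only helps if the segment URIs inside the playlist are relative, so they resolve through the proxied playlist URL too.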

HLS stream with jwplayer using requirejs

I would like to use JWPlayer, loaded via RequireJS, to play HLS streams, but I have a problem with it.
All single files (for example .mp4 files) work for me, but the streams do not.
I've created a plunker example; the playlist contains an .mp4 file and a test stream from the jwplayer site, and the stream doesn't work.
Please help me, because if JWPlayer cannot work with RequireJS we will not buy the official licence :(

converting audio file format after upload

I am developing a PhoneGap application for iOS and Android. On iOS the recorded sound format is .wav, and on Android it's .amr/.mp3. After recording, the audio file gets uploaded to a web server.
I need to convert these files from .amr/.wav to .mp3 after upload -- what options do I have?
I've read about ffmpeg and ffmpeg-php. Will they work with shared hosting? If yes, can someone provide a link to installation instructions?
Is there any converter that will work with the shared hosting of a justHost cPanel?
Is there any PHP class that can send my audio file to an ffmpeg server and get back an mp3 file? Just thinking.
Any other option?
I don't mind the method; I just need the required output.
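If ffmpeg is available on the host, the conversion itself is a single command; whether a shared host like justHost lets you run it is exactly the open question. A sketch of the server-side call (the paths are hypothetical, and the code only executes ffmpeg when the binary and input file actually exist):

```python
# Sketch: shell out to ffmpeg to transcode an uploaded .wav/.amr file to .mp3.
# Assumes an ffmpeg binary with libmp3lame is installed on the host.
import os
import shutil
import subprocess

def mp3_command(src, dst, bitrate="128k"):
    """Build the ffmpeg argv for a .wav/.amr -> .mp3 conversion."""
    return ["ffmpeg", "-y", "-i", src, "-codec:a", "libmp3lame", "-b:a", bitrate, dst]

cmd = mp3_command("upload/recording.wav", "upload/recording.mp3")
print(cmd)

# Only run the conversion when ffmpeg and the input file are actually present.
if shutil.which("ffmpeg") and os.path.exists("upload/recording.wav"):
    subprocess.run(cmd, check=True)
```

The same pattern works from PHP via `exec()` or proc_open, which is essentially what ffmpeg-php wraps; on hosting plans where exec is disabled, offloading the conversion to a separate server (as suggested above) is the usual workaround.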
