I used the npm package hls-server to set up an HLS streaming server on my dev machine. Then I took the Android sample app and tried adding another video whose URL points at my localhost.
The video plays fine inside the sample app, but I am unable to cast it to Chromecast.
I tried changing the media type and also set up a CORS proxy using the npm package cors-anywhere.
I used ffmpeg to generate the HLS playlist and .ts files.
Would be great if someone could help out.
The problem was with my HLS stream. What I did was use the sample receiver app from Google and attach a debugger to see the error codes/exceptions.
Then I took another video and used ffmpeg to produce HLS again, which worked out fine.
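For reference, a rough sketch of the kind of ffmpeg invocation that tends to produce a Chromecast-friendly stream (H.264 video, AAC audio); the file names and segment settings below are placeholders, not the exact command I used:

    ffmpeg -i input.mp4 \
      -c:v libx264 -profile:v main -level 4.0 \
      -c:a aac -b:a 128k \
      -hls_time 4 -hls_playlist_type vod \
      -hls_segment_filename 'seg_%03d.ts' \
      playlist.m3u8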
How can I use the HLS m3u8 format in Next.js?
I tried with the video.js CDN but couldn't get it working. Hls.js and react-player are not compatible with the latest version.
Please check the example at this link: https://stackblitz.com/edit/github-hgv7za-ezygog.
Currently, it is working for me with the sample m3u8 file located at https://bitdash-a.akamaihd.net/content/sintel/hls/playlist.m3u8.
If you encounter a specific error, please provide more details, such as the console log, etc.
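For reference, here is a minimal sketch of the general approach, assuming hls.js is installed and the component only runs client-side (the StackBlitz example may be wired up differently):

    // HlsPlayer.jsx (hypothetical component name)
    import { useEffect, useRef } from 'react';
    import Hls from 'hls.js';

    export default function HlsPlayer({ src }) {
      const videoRef = useRef(null);

      useEffect(() => {
        const video = videoRef.current;
        if (!video) return;

        // Safari plays HLS natively; other browsers need hls.js
        if (video.canPlayType('application/vnd.apple.mpegurl')) {
          video.src = src;
          return;
        }
        if (Hls.isSupported()) {
          const hls = new Hls();
          hls.loadSource(src);
          hls.attachMedia(video);
          return () => hls.destroy();
        }
      }, [src]);

      return <video ref={videoRef} controls style={{ width: '100%' }} />;
    }

In Next.js, make sure this only renders in the browser (for example, load it with next/dynamic and ssr: false), then pass the m3u8 URL as src.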
The actual task is to play an mp3 file that should be embedded with the Electron app.
I placed the mp3 file next to my Electron app's main.js.
I wanted to play it using the sound-play package (https://github.com/nomadhoc/sound-play) through the bridge API, but it didn't work out; no sound is playing.
So I want to just play it using an audio element (which seems more universal), but I don't understand what src URL I should supply to it. I also wonder how to correctly add an additional asset to the built version (I'm using electron-builder).
Working with extra/local resources in Electron apps is a somewhat obscure topic for me. Advice would be appreciated.
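For concreteness, this is roughly the shape of what I'm imagining in main.js (the file name is a placeholder, and I'm not sure this is the right approach):

    // main.js: resolve where the bundled mp3 lives (dev vs. packaged build)
    const path = require('path');
    const { app, ipcMain } = require('electron');

    function soundPath() {
      // In development the file sits next to main.js; in a packaged build,
      // electron-builder's "extraResources" option could copy it into process.resourcesPath.
      return app.isPackaged
        ? path.join(process.resourcesPath, 'click.mp3')
        : path.join(__dirname, 'click.mp3');
    }

    // Hand the absolute path to the renderer, which would play it via an audio element
    ipcMain.handle('get-sound-path', () => soundPath());

The renderer would then do something like new Audio('file://' + p).play() with the path it gets back.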
My scenario is to pull data from an RTSP source via ffmpeg, send it to nginx-rtmp, and use nginx to provide HLS playback. There are quite a lot of tutorials and Q&As on the internet. I followed this one:
https://docs.peer5.com/guides/setting-up-hls-live-streaming-server-using-nginx/
However, it failed miserably. To make things simpler to understand, I would like to ask the core question:
Who is responsible for creating the m3u8 playlist file?
I tried to experiment in two steps. First, try to push a local mp4 file and play it back via HLS:
Following the above tutorial, I tried to use ffmpeg to push a local mp4 file to nginx-rtmp and video.js to play it. The browser reported this error:
VIDEOJS: ERROR: (CODE:4 MEDIA_ERR_SRC_NOT_SUPPORTED) No compatible source was found for this media.
Secondly, I have successfully saved the video pushed to nginx-rtmp as a series of FLV files, and I know that I can use exec_push to call ffmpeg to convert the FLV into a format compatible with HLS. Again, the core question here is how to create and UPDATE the m3u8 file as new video data keeps coming in.
For now, I would like experts to help me tackle the first question: playing back a static mp4 file through HLS. Any tutorials on m3u8 playlists and MPEG-TS files are also much appreciated!
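For context, the push command I am experimenting with is roughly of this shape (file name, host, and stream key are placeholders):

    ffmpeg -re -i sample.mp4 -c:v libx264 -c:a aac -f flv rtmp://localhost/live/stream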
The nginx-rtmp module by itself creates and updates the playlist as new segments arrive.
To troubleshoot, check whether the .m3u8 files are created under the folder specified in hls_path in your nginx conf. The rest is just nginx serving a file over HTTP.
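As a rough sketch (paths, ports, and the application name are placeholders, not your exact conf), the relevant pieces usually look something like this:

    rtmp {
        server {
            listen 1935;
            application live {
                live on;
                hls on;                  # nginx-rtmp cuts the HLS segments and writes the playlist
                hls_path /tmp/hls;       # the .m3u8 and .ts files appear here
                hls_fragment 3;
                hls_playlist_length 60;
            }
        }
    }

    http {
        server {
            listen 8080;
            location /hls {
                types {
                    application/vnd.apple.mpegurl m3u8;
                    video/mp2t ts;
                }
                root /tmp;               # serves /tmp/hls/<stream>.m3u8 over HTTP
                add_header Access-Control-Allow-Origin *;
            }
        }
    }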
If that works, try the HLS URL directly in Safari (Safari has a built-in HLS player) or in Chrome with the Play HLS M3u8 extension enabled.
If that works, the problem must be with your player.html.
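For comparison, a minimal player.html along these lines (the video.js version and the HLS URL are placeholders):

    <!DOCTYPE html>
    <html>
    <head>
      <link href="https://vjs.zencdn.net/7.20.3/video-js.css" rel="stylesheet" />
    </head>
    <body>
      <video id="player" class="video-js" controls preload="auto" width="640" height="360">
        <source src="http://localhost:8080/hls/stream.m3u8" type="application/x-mpegURL" />
      </video>
      <script src="https://vjs.zencdn.net/7.20.3/video.min.js"></script>
      <script>
        // Recent video.js builds play HLS through the bundled http-streaming engine
        var player = videojs('player');
      </script>
    </body>
    </html>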
In the Additional Steps section at the foot of MAS: Packaging your app, I read:
To ensure proper validation [of your NWJS app for the Mac App Store], you have to check those steps:
Delete the FFMPEG library:
rm "YourApp.app/Contents/Frameworks/nwjs
Framework.framework/Libraries/ffmpegsumo.so"
Note: Currently, FFMPEG library cannot be submitted to Mac App Store as is.
When I delete the FFMPEG library, if my app includes (ogg) audio files and attempts to play them, the app crashes. I assume that this is because the FFMPEG library is essential for audio playback.
Does this mean that the current Mac-App-Store-friendly NWJS build (nwjs-macappstore-v0.12.3-osx-x64) does not support audio? Or is there an alternative audio library that I can use with it?
I have installed Red5 Media Server on Windows. The demo applications are running perfectly. My question is: I want to stream audio from Red5 Media Server. Please help with how to store audio in Red5 and play it from Red5 in my custom player (jPlayer) using PHP.
jPlayer doesn't require Red5. You can just put your media in a folder that's accessible via the web and stream from there. If you feel you must use Red5, then follow their examples for the demo app; you can even use the sample folder they stream those from, since it doesn't sound like you need any server-side code.
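For example, a minimal jPlayer setup pointed at a plain HTTP URL (the server path and file name are placeholders) would be something like:

    <div id="jquery_jplayer_1"></div>
    <script>
      $("#jquery_jplayer_1").jPlayer({
        ready: function () {
          // Stream the file straight over HTTP; no Red5/RTMP involved
          $(this).jPlayer("setMedia", {
            mp3: "http://yourserver.com/media/track.mp3"
          });
        },
        supplied: "mp3",
        swfPath: "/js" // Flash fallback location, if you use it
      });
    </script>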