Play PCM audio from a WebSocket stream in React - node.js

I am trying to play a PCM audio stream from a WebSocket in a React application.
On the backend I have a WebSocket that streams PCM audio. I was able to write a Node.js script that connects to the WebSocket PCM stream and plays the audio through my computer speakers using the npm packages websocket-stream and speaker.
The issue I am having now is how to move this code over to a React application. When I try to use the websocket-stream package in React, the application does not compile. In addition, the speaker package does not work in the browser. I have found a few examples of playing mp3 files in React, but none that play from a stream.
Any help/direction would be greatly appreciated!

I used this GitHub project and ported its code over to a React app: https://github.com/samirkumardas/pcm-player
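
For reference, the core of what pcm-player does is feed raw samples to the Web Audio API, which works in the browser where speaker does not. Below is a minimal sketch of that idea, assuming the backend sends 16-bit signed little-endian mono PCM; the URL, sample rate, and channel count are placeholders you would adjust to match your stream:

// Minimal browser-side PCM playback sketch (assumed: 16-bit LE mono PCM).
const audioCtx = new (window.AudioContext || window.webkitAudioContext)();
const sampleRate = 16000; // assumption: must match the backend stream
let playbackTime = audioCtx.currentTime;

const ws = new WebSocket('ws://localhost:8080/pcm'); // hypothetical URL
ws.binaryType = 'arraybuffer';

ws.onmessage = (event) => {
  // Reinterpret the incoming bytes as 16-bit samples and scale them to
  // the -1..1 float range the Web Audio API expects.
  const int16 = new Int16Array(event.data);
  const float32 = new Float32Array(int16.length);
  for (let i = 0; i < int16.length; i++) {
    float32[i] = int16[i] / 32768;
  }

  // Wrap the chunk in an AudioBuffer and schedule it right after the
  // previous chunk so playback stays gapless.
  const buffer = audioCtx.createBuffer(1, float32.length, sampleRate);
  buffer.copyToChannel(float32, 0);
  const source = audioCtx.createBufferSource();
  source.buffer = buffer;
  source.connect(audioCtx.destination);
  playbackTime = Math.max(playbackTime, audioCtx.currentTime);
  source.start(playbackTime);
  playbackTime += buffer.duration;
};

In a React component this would typically live in a useEffect that opens the socket on mount and closes it (and the AudioContext) on unmount.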

Related

Embed and play sound (via audio element) in electron app

The actual task is to play an mp3 file that should be embedded with the Electron app.
I placed the mp3 file next to my Electron app's main.js.
I wanted to play it using the sound-play package (https://github.com/nomadhoc/sound-play) via a bridge API, but it didn't work out; no sound is playing.
So I want to just play it using an audio element (which seems more universal), but I don't get what src URL I should supply to it. And then I wonder how to correctly add an additional asset to the built version (I'm using electron-builder).
Working with extra/local resources in Electron apps is kind of an obscure topic for me. Advice would be appreciated.
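
In case it helps, here is a hedged sketch of how those two halves could fit together with electron-builder; the folder and file names are made up, and the dev/packaged detection shown is just one common convention. First, extraResources in the build section of package.json copies the file into the packaged app's resources directory:

"build": {
  "extraResources": [
    { "from": "sounds/alert.mp3", "to": "alert.mp3" }
  ]
}

Then the renderer can resolve an absolute path and hand it to an audio element:

// renderer.js — assumes nodeIntegration is enabled so path/process are
// available in the renderer.
const path = require('path');

// In a packaged build, extraResources land under process.resourcesPath;
// in development, read the file from the project directory instead.
const base = process.env.NODE_ENV === 'development'
  ? path.join(__dirname, 'sounds')
  : process.resourcesPath;

const audio = new Audio('file://' + path.join(base, 'alert.mp3'));
audio.play();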

Merge audio and video in Node.js without using ffmpeg

I'm trying to merge audio and video, both in an MP4 container with the x264 codec, using only Node.js on the server side, without using ffmpeg.
Is there any way to achieve something like this?
I already tried:
https://github.com/gkozlenko/node-video-lib
but it's not working.
Any ideas?
Thanks

Chromecast from a node media server

I used the npm package hls-server to set up an HLS streaming server on my dev machine. Then I took the Android sample sender app and tried adding another video whose URL pointed at my localhost.
The video plays fine inside the sample app but I am unable to cast it to the Chromecast.
I tried changing the media type and also set up a CORS proxy using the npm package cors-anywhere.
I used ffmpeg to generate the HLS playlist and .ts files.
Would be great if someone could help out.
The problem was with my HLS stream. What I did was use the sample receiver app from Google and attach a debugger to see the error code/exceptions.
Then I took another video and used ffmpeg to produce the HLS again, which worked out quite fine.
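
For anyone who lands here with the same symptom: a cast receiver fetches the playlist and segments itself, from its own origin, so the HLS server has to send CORS headers; proxying on the sender side is not enough. A minimal sketch with Node's built-in http module (the directory layout and port are assumptions):

// Serve ffmpeg-generated HLS files with the CORS headers a Chromecast
// receiver needs.
const http = require('http');
const fs = require('fs');
const path = require('path');

http.createServer((req, res) => {
  // e.g. GET /playlist.m3u8 or GET /segment0.ts from the ./hls directory
  const file = path.join(__dirname, 'hls', path.basename(req.url));
  res.setHeader('Access-Control-Allow-Origin', '*');
  if (req.url.endsWith('.m3u8')) {
    res.setHeader('Content-Type', 'application/vnd.apple.mpegurl');
  } else if (req.url.endsWith('.ts')) {
    res.setHeader('Content-Type', 'video/mp2t');
  }
  fs.createReadStream(file)
    .on('error', () => { res.statusCode = 404; res.end(); })
    .pipe(res);
}).listen(8000);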

Playing audio in a Mac App Store compatible NWJS app

In the Additional Steps section at the foot of MAS: Packaging your app, I read:
To ensure proper validation [of your NWJS app for the Mac App Store], you have to check those steps:
Delete the FFMPEG library:
rm "YourApp.app/Contents/Frameworks/nwjs
Framework.framework/Libraries/ffmpegsumo.so"
Note: Currently, FFMPEG library cannot be submitted to Mac App Store as is.
When I delete the FFMPEG library, if my app includes (ogg) audio files and attempts to play them, the app crashes. I assume that this is because the FFMPEG library is essential for audio playback.
Does this mean that the current Mac-App-Store-friendly NWJS build (nwjs-macappstore-v0.12.3-osx-x64) does not support audio? Or is there an alternative audio library that I can use with it?
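
I have not verified this on the MAS build, but one possible route is a pure-JavaScript decoder such as Aurora.js with its ogg/vorbis plugins, which decodes in JS and so should not depend on the stripped FFMPEG library. A sketch of its documented usage (the file path is a placeholder):

// After loading aurora.js plus the ogg.js and vorbis.js plugins via
// <script> tags, AV.Player can decode and play the file in pure JS.
var player = AV.Player.fromURL('audio/music.ogg'); // hypothetical path
player.play();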

Playing audio using red5 media server

I have installed Red5 media server on Windows. The demo applications are running perfectly. My question is: I want to stream audio from the Red5 media server. Please help me with how to store audio in Red5 and play it from Red5 in my custom player (jPlayer) using PHP.
jPlayer doesn't require Red5. You can just put your media in a folder that's accessible via the web and stream from there. If you feel you must use Red5, then follow their examples for the demo app - you can even use the sample folder they stream those from, since it doesn't sound like you need any server-side code.
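
For completeness, a minimal sketch of pointing jPlayer straight at a web-accessible file, with no Red5 involved (the element ids and URL are placeholders):

// Standard jPlayer setup: supply the media URL in setMedia once the
// player is ready.
$('#player').jPlayer({
  ready: function () {
    $(this).jPlayer('setMedia', {
      mp3: 'http://example.com/media/track.mp3'
    });
  },
  supplied: 'mp3',
  swfPath: '/js' // location of jPlayer's Flash fallback
});

// Start playback from a user action, e.g. a play button.
$('#play-button').on('click', function () {
  $('#player').jPlayer('play');
});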
