Embed and play sound (via audio element) in electron app

The actual task is to play an mp3 file that is embedded with the Electron app.
I placed the mp3 file next to my Electron app's "main.js".
I wanted to play it using the "sound-play" package (https://github.com/nomadhoc/sound-play) via a bridge API, but it didn't work out: the sound is not playing.
So I want to just play it using an audio element (which seems more universal), but I don't know what src URL I should supply to it. And then I wonder how to correctly add an additional asset to the built version (I'm using electron-builder).
Working with extra/local resources in Electron apps is kind of an obscure topic for me. Advice would be appreciated.

Related

How to play the audiobook .m4b file in julia?

I want to play an audiobook file xxx.m4b in a web app, but I can't find a package to handle the m4b format. I found a possible option in ffmpeg, but I have no idea how to use it. How can I play an audiobook file in a web app, or just natively on my desktop?

Playing Audio in Electron from main process

I'm developing an Electron app which needs to play a sound when a message arrives over a webSocket connection. The webSocket is handled in the main process, since the user switches pages during usage. I cannot play the sound from the renderer, because I don't know which page the user is on when the webSocket message comes in (in the worst case they are between two pages while navigating).
Is there a chance to play back audio in the main process? The audio file is stored locally within the project's file structure.
Kind regards,
BoxSon
I found this simple npm package for playing sounds without a renderer, and you can easily include it in your Electron project.
First, install it with npm install sound-play (or save it to your project's dependencies with npm install sound-play --save) and initialize it like this:
const sound = require("sound-play");
Playing a file is then just one line of code:
sound.play("file.mp3");
You can learn more on the official site through this link.
Found a workaround to solve this on my own:
create and open a hidden window,
load an HTML5 audio player in this window,
and send a message from main to this hidden window via IPC to play a sound.
A little bit of effort, but it works like a charm.
Don't forget to destroy the hidden window on application closure (the application won't close if you forget this step).
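The workaround above can be sketched roughly as follows. The file name player.html and the channel name 'play-sound' are made-up placeholders; the Electron APIs used (BrowserWindow, webContents.send, app events) are standard, but this is an outline under those assumptions, not a complete app.

```javascript
const { app, BrowserWindow } = require('electron');

let playerWindow;

app.whenReady().then(() => {
  // Hidden window whose only job is to host an HTML5 <audio> element.
  playerWindow = new BrowserWindow({ show: false });
  playerWindow.loadFile('player.html'); // hypothetical page containing the player
});

// Call this from the main-process WebSocket handler.
function playSound(file) {
  // 'play-sound' is an arbitrary IPC channel name chosen for this sketch.
  playerWindow.webContents.send('play-sound', file);
}

// Without this, the hidden window keeps the app alive when the user quits.
app.on('before-quit', () => {
  if (playerWindow) playerWindow.destroy();
});
```

On the renderer side, player.html would listen for the same channel and start playback, e.g. ipcRenderer.on('play-sound', (_event, file) => new Audio(file).play()).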

Chromecast from a node media server

I used the npm package hls-server to set up an HLS streaming server on my dev machine. Then I took the Android sample app and tried adding another video whose URL pointed at my localhost.
The video plays fine inside the sample app, but I am unable to cast it to Chromecast.
I tried changing the media type and also set up a CORS proxy using the npm package cors-anywhere.
I used ffmpeg to generate the HLS playlist and .ts files.
Would be great if someone could help out.
The problem was with my HLS stream. What I did was use the sample receiver app from Google and attach a debugger to see the error code/exceptions.
Then I took another video and used ffmpeg to produce HLS again, which worked out fine.

Is there a good React Native module for playing/streaming audio?

I'm new to React Native and building an audio streaming app. I found quite a few React Native wrappers for native audio-playback modules: https://js.coach/react-native/react-native-ios-audio?search=audio
But none of them seem sufficient. I'm looking for a module that would
allow playing in the background and handle interruptions
have play-speed control
allow both streaming and local file playing
Does such a module exist? Any tips would be much appreciated. Thanks!
If your intention is to use the features you mentioned in a cross-platform (android/ios) app, then you do have some options, but there isn't a single module that will cover each of those bases.
Right now, I'm using a few different modules to accomplish similar functionality to what you're looking for:
react-native-audio-streaming to stream audio from a remote URL.
react-native-sound to play local audio files.
react-native-audio-streamer to stream audio from a remote URL on Android, since it uses ExoPlayer, instead of the AACDecoder used by react-native-audio-streaming.
Support for loading sound over the network was recently added to react-native-sound for iOS, and someone has mentioned the possibility of implementing it on Android as well. If this happens, then it may be the best choice for your use case.
I've included the current react-native-sound feature set below. Hope this helps!
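For the local-file case, a minimal react-native-sound sketch looks like the following. The file name is a placeholder; the calls shown (setCategory, the Sound constructor with MAIN_BUNDLE, setSpeed, play, release) are from the library's documented API, but treat this as an outline rather than a drop-in implementation.

```javascript
import Sound from 'react-native-sound';

// On iOS, the 'Playback' category lets audio keep playing in the
// background and with the silent switch on.
Sound.setCategory('Playback');

// Load a file bundled with the app; the callback fires when loading
// finishes or fails.
const track = new Sound('episode.mp3', Sound.MAIN_BUNDLE, (error) => {
  if (error) {
    console.warn('failed to load sound', error);
    return;
  }
  track.setSpeed(1.5); // play-speed control
  track.play((success) => {
    if (!success) console.warn('playback failed');
    track.release(); // free the native player when done
  });
});
```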
Try the following modules:
react-native-audio-streaming
react-native-audio-streamer

Playing audio in a Mac App Store compatible NWJS app

In the Additional Steps section at the foot of MAS: Packaging your app, I read:
To ensure proper validation [of your NWJS app for the Mac App Store], you have to check those steps:
Delete the FFMPEG library:
rm "YourApp.app/Contents/Frameworks/nwjs Framework.framework/Libraries/ffmpegsumo.so"
Note: Currently, FFMPEG library cannot be submitted to Mac App Store as is.
When I delete the FFMPEG library, if my app includes (ogg) audio files and attempts to play them, the app crashes. I assume that this is because the FFMPEG library is essential for audio playback.
Does this mean that the current Mac-App-Store-friendly NWJS build (nwjs-macappstore-v0.12.3-osx-x64) does not support audio? Or is there an alternative audio library that I can use with it?