How to play a realtime binary stream through the client's speakers in HTML5 - Node.js

I need help playing a binary stream through a client's speakers using the client's web browser. The stream is recorded in a client's web browser and sent to a Node.js server using BinaryJS. I have successfully streamed the binary data back to the client from the server, but cannot figure out how to play it. I am using Node.js, BinaryJS, the Web Audio API, and HTML5. I have also been testing with Firefox. Has anybody done this before? Thanks in advance.

If it's an option for you, the simplest approach would be to encode your data to a compressed format (say, MP3, Ogg, or Opus) and simply put the URL in an <audio> tag.
This page is a good introduction to streaming MP3 from Node.
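For reference, a minimal sketch of that approach (the file name and port are placeholders, and it assumes the audio has already been encoded to MP3 on the server):

const http = require('http');
const fs = require('fs');

http.createServer((req, res) => {
  res.writeHead(200, { 'Content-Type': 'audio/mpeg' });
  fs.createReadStream('./stream.mp3').pipe(res); // pipe the encoded audio to the client
}).listen(8080);

On the client side, the tag is just <audio src="http://localhost:8080/" controls></audio>.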

Related

How can I record video from a webcam on the client with lossless frame pixels through the browser?

I need to build a website that records a person from their camera (they must allow camera access first), but I need to record frame by frame with lossless pixels.
I tried to figure this out with some options:
opencv.js - I couldn't figure it out; it uses the browser's video element, and that changes the pixels through compression, right?
ngx-webcam - I read that it captures lossless images, but not video
The other issue is that I need to send the frames to the server.
Should I save the frames on the client, process them on the client's computer, and then send the result to the server?
Is there an option to send the video frame data to the server for future use?
Someone told me to build an agent that performs these actions and sends the data in chunks, but I don't really know how to do that; I need clarification and some instruction on how to start building something like it.
If anyone has example code or anything that can point me to a solution, it would be very helpful.
I've created something similar before using RecordRTC.
It takes advantage of WebRTC and works pretty straightforwardly: record the video locally and upload it as a file.
https://github.com/muaz-khan/RecordRTC
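Roughly, the flow looks like this (a sketch; check the RecordRTC docs for the exact options, and note the upload endpoint name is made up):

navigator.mediaDevices.getUserMedia({ video: true, audio: true })
  .then((stream) => {
    const recorder = RecordRTC(stream, { type: 'video' });
    recorder.startRecording();

    // ... later, when the user stops recording:
    recorder.stopRecording(() => {
      const blob = recorder.getBlob();                  // the recorded video as a Blob
      const form = new FormData();
      form.append('video', blob, 'recording.webm');
      fetch('/upload', { method: 'POST', body: form }); // hypothetical server route
    });
  });

One caveat for this particular question: the browser encodes the recording (typically WebM), so the frames are compressed rather than lossless.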

How to host an MP3 file with the Koa framework

I'm coding an API server to be used by an MP3 app.
I've tried koa-send, koa-static, and just setting the MP3 file as the response body.
But no matter which of these the API uses, the app stalls. When I sent the length of the MP3 file separately (since the app did not seem to be receiving the file's length), it worked on iOS but not on Android.
If I put the same MP3 file on S3 and send a request to that URL, it works well, so I can't understand what the problem is.
Also, if I play the music in Safari through my API, it shows up as a live broadcast (from other sites, it comes through in the form of a normal MP3).
If the problem is that the player doesn't know how long the file is, why does the same file work on other sites but not through my API?
Other storage site: [screenshot not included]
My API: [screenshot not included]
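As an aside, here is what "setting the MP3 file as the response body" with an explicit length might look like in Koa (a sketch; the file path is made up). An explicit Content-Length is usually what lets players show a duration instead of treating the stream as live; real seeking would additionally need HTTP range (206) support, e.g. via something like koa-range:

const Koa = require('koa');
const fs = require('fs');
const app = new Koa();

app.use(async (ctx) => {
  const path = './song.mp3';              // hypothetical file
  const { size } = fs.statSync(path);
  ctx.type = 'audio/mpeg';
  ctx.length = size;                      // explicit Content-Length
  ctx.set('Accept-Ranges', 'bytes');      // advertise byte-range support
  ctx.body = fs.createReadStream(path);
});

app.listen(3000);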

Google Action should play radio stream

I need to develop a Google Action that streams an audio/radio stream.
I thought about using a media response.
But the documentation says: "Audio for playback must be in a correctly formatted .mp3 file. Live streaming is not supported."
Documentation
Can someone give me a hint on what I have to do to stream an audio stream? I found a German Google Action, "Baden FM", which streams their radio, but I'm not sure how they do it.
Kind Regards
Stefan
The only ways to do this currently:
Stream it in chunks of MP3 files, using the media-status callback at the end of each chunk to queue the next one (see the sketch below)
Get listed on TuneIn, Radio.com, or iHeartRadio. From observation, Baden FM seems to be using TuneIn
Go through an App Action
Use a website link that starts streaming, via a BrowseCarousel or Button
The last two options are not helpful if you're going after non-browser-enabled devices.
Also saw this thread which has some insight on MP3 size/duration: How can I tell Actions on Google to stream audio?
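For the first option, a rough sketch with the actions-on-google Dialogflow library (the chunk URLs and intent names are made up, and the "Media Status" intent has to be wired to the actions_intent_MEDIA_STATUS event in Dialogflow):

const { dialogflow, MediaObject, Suggestions } = require('actions-on-google');
const app = dialogflow();

app.intent('Play Stream', (conv) => {
  conv.ask('Here is the stream.');
  conv.ask(new MediaObject({
    name: 'Chunk 1',
    url: 'https://example.com/chunks/1.mp3', // hypothetical pre-cut MP3 chunk
  }));
  conv.ask(new Suggestions('Stop'));         // media responses need a suggestion chip
});

// Fires when a chunk finishes playing; respond with the next chunk.
app.intent('Media Status', (conv) => {
  const status = conv.arguments.get('MEDIA_STATUS');
  if (status && status.status === 'FINISHED') {
    conv.ask('Next chunk.');
    conv.ask(new MediaObject({
      name: 'Chunk 2',
      url: 'https://example.com/chunks/2.mp3',
    }));
    conv.ask(new Suggestions('Stop'));
  }
});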
Google Actions do not currently support live audio streaming. I'm in contact with them, but it seems they have no ETA for supporting this.
I was successful doing this with an MP3 live stream:
NPR: https://npr-ice.streamguys1.com/live.mp3?ck=1597372625378
but not with an MPD (DASH) manifest:
BBC test stream: https://rdmedia.bbc.co.uk/dash/ondemand/testcard/1/client_manifest-audio.mpd
or with the HLS stream that my company uses (.m3u8; I can't publish the link publicly).
Note: I added the links as text/code since I'm not sure whether these companies' policies are cool with them being indexed.

Record Screen's Happenings (Audio + Video)

I am new to WebRTC and want to implement a system like video conferencing, live streaming, or, if you like, Skype, using WebRTC and NodeJS.
I am confused about one thing, as it's one of our client's requirements: suppose a page hosts a video conference, say one moderator answering many audience members one by one. There should be one video that continuously records all of this together and sends a live stream to the server to save in our database.
Is this kind of thing implementable or not?
Any help, please.
You can capture video by grabbing JPEG images from a canvas element. You could also capture the entire page (if there are numerous videos on the same page) by grabbing the page itself through Chrome.
For audio, recording remote audio with the Audio API is still an issue, but locally grabbed audio is not a problem.
See RecordRTC and my modified version for recording streams either to a file or over websockets, respectively.
See "Capture a page to a stream" for how to record or screen-share an entire page in Chrome.
If you have multiple different videos, not all on the same page, but want to combine them all, I would suggest recording them as above and then combining and syncing them server-side (not in JavaScript, but probably in C or C++).
If you MUST record remote audio, then I would suggest that those particular pages send their audio data over websockets themselves, so that you can sync it with their video and with the other sessions.
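The canvas-grabbing idea mentioned above, sketched out (the WebSocket URL and frame rate are placeholders; the server would still have to assemble the JPEG frames into a video):

const video = document.querySelector('video');    // the element being captured
const canvas = document.createElement('canvas');
const ctx = canvas.getContext('2d');
const ws = new WebSocket('ws://localhost:8080');  // hypothetical frame sink

ws.addEventListener('open', () => {
  setInterval(() => {
    canvas.width = video.videoWidth;
    canvas.height = video.videoHeight;
    ctx.drawImage(video, 0, 0);                   // copy the current frame
    ws.send(canvas.toDataURL('image/jpeg', 0.8)); // send one JPEG per tick
  }, 100);                                        // ~10 fps
});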

Using an audio stream URL with wpaudio or any other such script

I have a bunch of audio stream URLs, like this one:
http://popplers5.bandcamp.com/download/track?enc=mp3-128&id=1269403107&stream=1
(which, by the way, are from the incredible bandcamp.com)
I need to know how to use these with audio streaming scripts like the WPaudio plugin (wpaudio.com).
Most of these plugins require a link to an actual MP3 file. As you can see, the URL above is an audio stream, not an actual MP3 file. How do I put the two together?
If you visit that URL, your browser should start playing the audio stream. I basically need to be able to embed the audio stream into a web page.
Thanks!
You can use mplayer:
mplayer -dumpstream url
Check this page:
http://www.mplayerhq.hu/DOCS/HTML/en/streaming.html
for more information.
If this is not a live stream, which appears to be the case for the above file, you can use a simple wget:
wget "http://popplers5.bandcamp.com/download/track?enc=mp3-128&id=1269403107&stream=1" -O outfile.mp3
Be sure to check copyright/licensing of the content before you do anything with it.
You can use WPaudio or any of the others you want, no problem. There is absolutely no difference between that stream URL you posted, and a random MP3 file served up from somewhere. At least, not from your browser's perspective.
Your browser has no idea, and does not care, how the source server gets its content. It's all HTTP, and the only thing your browser cares about is content type. Here's what that looks like:
Content-Type: audio/mpeg
Same as it would be if you tossed an MP3 up there. No biggie.
Use WPaudio, or any other web MP3 player. They all will work fine.
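If you want to verify that header yourself, here's a quick sketch using Node's built-in http module (a HEAD request against the stream URL; the server may or may not honor HEAD):

const http = require('http');

http.request(
  'http://popplers5.bandcamp.com/download/track?enc=mp3-128&id=1269403107&stream=1',
  { method: 'HEAD' },
  (res) => {
    console.log(res.headers['content-type']); // expect: audio/mpeg
    res.resume();
  }
).end();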
