WP7 audio stream problem

I'm using MediaElement to play an MP3 audio stream, and everything works fine. But now I have an MP3 stream whose URL does not end in .mp3
(http://server2.fmstreams.com:8011/spin103), and I'm getting
AG_E_NETWORK_ERROR.
I found a suggestion to append ?ext=mp3 to the URL, but it didn't work for me. Any ideas?

If you are streaming live radio, the stream may be encoded by an IceCast or SHOUTcast server. To play these streams, you will need to decode the stream in memory and pass it to the MediaElement once it has been decoded.
Have a look at Mp3MediaStreamSource: http://archive.msdn.microsoft.com/ManagedMediaHelpers
and at Audio output from Silverlight.
I lost a lot of time on this, and it is the best solution I have found so far.
Also make sure that while you are testing, the device is unplugged from the computer.

Related

How do I receive video stream data in node server?

I don't know how to get started with this.
What I am trying to do is take a video + audio stream from the front-end and host the live stream as MP4 that's accessible in a browser.
I was able to find information on WebRTC, socket.io, and RTMP, but I'm not really sure which tool to use or what's best suited for something like this.
Also, a follow-up question: my front-end is an iOS app, so in what format should I send the live stream to the server?
It depends on which live-streaming protocol you want the player to play; as @Brad said, HLS is the most common protocol for players.
Note: Besides HLS, an iOS native app is able to use fijkplayer or FFmpeg to play almost any live-streaming format, like HLS, RTMP or HTTP-FLV, even MKV. However, the most straightforward solution is HLS: you only need a <video> tag to play MP4 or HLS. MSE is also an optional solution, using flv.js/hls.js to play live streams on iOS/Android/PC; this post is about these protocols.
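As a player-side sketch of that approach (the stream URL below is illustrative, and hls.js is one of the libraries named above):

// Minimal sketch: play an HLS stream with hls.js where MSE is available;
// Safari/iOS can play the .m3u8 natively inside a <video> element.
import Hls from 'hls.js';

const video = document.querySelector('video');
const src = 'https://example.com/live/stream/index.m3u8'; // illustrative URL

if (video.canPlayType('application/vnd.apple.mpegurl')) {
  video.src = src;            // native HLS (Safari, iOS)
} else if (Hls.isSupported()) {
  const hls = new Hls();
  hls.loadSource(src);        // fetch and parse the playlist
  hls.attachMedia(video);     // MSE-based playback
}
video.play();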
The stream flow is like this:
FFmpeg/OBS ---RTMP--->---+
                         +--> Media Server ---> HLS/HTTP-FLV ---> Player
Browser ---WebRTC--->----+
The protocol used to push to the media server, or to receive in a node server, depends on your encoder: RTMP or H5 (WebRTC):
For RTMP, you can use FFmpeg or OBS to push the stream to your media server.
If you want to push the stream from H5, the only way is WebRTC.
The media server converts the protocol from the publisher to the player, since different protocols are in use in live streaming right now (as of 2022-01); please read more in this post.
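If you want the media server itself in Node, here is a minimal sketch using the node-media-server package; the ports, paths, and ffmpeg location are assumptions for illustration, not from the answer above:

// Minimal sketch: accept RTMP from FFmpeg/OBS and repackage it as HLS.
// Assumes `npm install node-media-server` and an ffmpeg binary on the host.
const NodeMediaServer = require('node-media-server');

const config = {
  rtmp: { port: 1935, chunk_size: 60000, gop_cache: true, ping: 30, ping_timeout: 60 },
  http: { port: 8000, allow_origin: '*', mediaroot: './media' },
  trans: {
    ffmpeg: '/usr/local/bin/ffmpeg',   // path to your ffmpeg binary
    tasks: [{
      app: 'live',                     // publish to rtmp://host/live/STREAM_KEY
      hls: true,
      hlsFlags: '[hls_time=2:hls_list_size=3:hls_flags=delete_segments]',
    }],
  },
};

new NodeMediaServer(config).run();
// Players then fetch http://host:8000/live/STREAM_KEY/index.m3u8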

Stream audio from web browser

Is it possible to capture all the audio playing on my PC (from a web browser) and stream it over the LAN?
I use the Yandex Music service (music.yandex.ru). I'm logged into my Yandex account and I don't have any audio files, just an online stream. I want to make something like a LAN radio: users will visit an HTML page located on our server and listen to my audio stream.
Can I use Icecast or similar software to stream non-file audio?
Or should I connect my PC's line out to line in (or mic) and read the audio stream via Java or Flash? Any ideas?
Have you tried looking at tools like JACK and Soundflower? They let you reroute the audio from one program to another. You could then route the sound into Java or Flash and go from there.
https://rogueamoeba.com/freebies/soundflower/
http://jackaudio.org/
You can try WebRTC and the MediaStream API for that. You can capture audio from the user's audio device or from a stream they are playing in the browser. You can find documentation on those APIs on the MDN pages.
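As a browser-side sketch of that idea (tab-audio capture via getDisplayMedia works in recent Chromium-based browsers when the user ticks "Share tab audio"; support varies, and the upload step is left as a stub):

// Minimal sketch: capture audio from a browser tab and record it in chunks.
async function captureTabAudio() {
  const stream = await navigator.mediaDevices.getDisplayMedia({
    video: true,   // Chrome requires video: true even if you only want audio
    audio: true,
  });
  // Keep only the audio track for an audio-only recording.
  const audioOnly = new MediaStream(stream.getAudioTracks());
  const recorder = new MediaRecorder(audioOnly, { mimeType: 'audio/webm' });
  recorder.ondataavailable = (e) => {
    // Each chunk could be POSTed to a server that re-broadcasts it.
    console.log('captured chunk of', e.data.size, 'bytes');
  };
  recorder.start(1000); // emit a chunk every second
}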

NodeJS piping with ffmpeg

I wanted to do an HTTP live stream of a screencast using ffmpeg, Node.js and HTML5. I wanted it to be as real-time as possible. However, I found that the video received by the client was behind by 1-2 seconds (on Chrome/Chromium). I am using VP8/WebM as my codec.
I have eliminated the following factors:
1) Network: I have tried serving and receiving the video locally, setting the video source to 127.0.0.1:PORT or localhost:PORT.
2) ffmpeg encoding speed: I have tried writing the output to a local file, and the delay seems to be negligible.
3) Chrome's internal buffer: it accounted for only 0.07s~0.08s.
On the Node.js side, I have a child process that runs the ffmpeg command and does ffmpeg.stdout.pipe(res); <-- ffmpeg is child_process.spawn(...)
So it seems that the ffmpeg.stdout.pipe(res) in Node.js is what is delaying the video stream. Am I correct in assuming so? Is there any way to reduce the delay?
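For reference, a minimal version of the Node.js setup described above; the capture source and encoder flags are illustrative, not the poster's exact command:

// Minimal sketch: spawn ffmpeg and pipe its stdout into the HTTP response.
const http = require('http');
const { spawn } = require('child_process');

http.createServer((req, res) => {
  res.writeHead(200, { 'Content-Type': 'video/webm' });
  const ffmpeg = spawn('ffmpeg', [
    '-f', 'x11grab', '-i', ':0.0',              // screen-capture source (Linux example)
    '-c:v', 'libvpx', '-deadline', 'realtime',  // VP8 tuned for low latency
    '-f', 'webm', 'pipe:1',                     // mux WebM to stdout
  ]);
  ffmpeg.stdout.pipe(res);
  req.on('close', () => ffmpeg.kill('SIGKILL')); // stop encoding when the client leaves
}).listen(8080);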
Go with WebRTC; there is no need to implement anything like codecs or pipes yourself (it is already in Chrome, Opera and Firefox).
It uses:
MediaCapture API (access your cam and mic and convert the object to a URL; by default they use the VP8 codec)
RTCPeerConnection API (send and receive media streams p2p)
RTCDataChannel API (send and receive data using p2p)
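A minimal browser-side sketch of the capture and offer steps (the signaling channel used to exchange the offer/answer between peers is app-specific and omitted):

// Minimal sketch: capture cam/mic and attach the tracks to a peer connection.
async function startWebRTC() {
  const stream = await navigator.mediaDevices.getUserMedia({ video: true, audio: true });
  const pc = new RTCPeerConnection();
  stream.getTracks().forEach((track) => pc.addTrack(track, stream));
  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  // Send pc.localDescription to the remote peer via your signaling channel.
}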

Rendering Audio Stream from RTSP server

I have an RTSP server which re-streams the A/V stream from a camera to clients.
On the client side we are using Media Foundation to play the stream.
I can successfully play the video but am not able to play the audio from the server. However, when I use VLC to play it, it can play both A/V.
Currently I am implementing IMFMediaStream and have created my custom media stream. I have also created a separate IMFStreamDescriptor for audio and added all the required attributes. When I run it, everything goes fine, but my RequestSample method never gets called.
Please let me know if I am doing it wrong or if there is any other way to play the audio in MF.
Thanks,
Prateek
Media Foundation's support for RTSP is limited to a small number of payload formats. VLC supports more (AFAIR through the Live555 library). Most likely, your payload is not supported in Media Foundation.

mp3 http streaming: recording and playing simultaneously

I have a server-side (Linux) program that generates audio files (MP3). What I need is to broadcast these files as an HTTP stream. The tricky part is that the broadcast starts while the file to be transmitted is not yet fully generated.
I tried to do this using mpd+mpc, but once I use the "mpc play" command, only the already-existing part of the file is buffered and transmitted, and the player disregards the part that appears after playback begins.
Is there any way to send an MP3 HTTP stream (using mpd or any other server-side player) so that the player won't stop playback when it reaches the end of the part that was buffered initially?
Any ideas, please.
http://streamripper.sourceforge.net/ can record and broadcast the same stream.
SHOUTcast (or Icecast, I don't remember which) was designed especially for this, and can re-encode your stream on the fly.
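If you want to serve the growing file yourself, a minimal Node sketch of the idea: keep the HTTP response open and keep polling the file for newly appended bytes instead of stopping at EOF. The file path, port and poll interval are illustrative:

// Minimal sketch: stream a still-growing MP3 file over chunked HTTP.
const http = require('http');
const fs = require('fs');

const FILE = '/tmp/live.mp3'; // hypothetical path the encoder appends to

http.createServer((req, res) => {
  res.writeHead(200, { 'Content-Type': 'audio/mpeg' });
  let offset = 0;
  let busy = false;
  const poll = setInterval(() => {
    if (busy) return; // do not start a new read while one is still flushing
    fs.stat(FILE, (err, stat) => {
      if (err || stat.size <= offset) return;
      busy = true;
      const chunk = fs.createReadStream(FILE, { start: offset, end: stat.size - 1 });
      offset = stat.size;
      chunk.pipe(res, { end: false }); // keep the response open after each chunk
      chunk.on('end', () => { busy = false; });
    });
  }, 500);
  req.on('close', () => clearInterval(poll));
}).listen(8080);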
