Hi, I want to build an app like OBS that streams video to a live video web service (Twitch, YouTube) over the RTMP protocol, using Node.js.
The website gives me these two tokens to go live:
Stream URL:
rtmp://rtmp.cdn.asset.aparat.com:443/event
Stream Key
ea1d40ff5d9d91d39ca569d7989fb1913?s=2f4ecd8367346828
I know how to stream from OBS, but I want to stream music from a server 24 hours a day.
I'd be happy if you could point me to libraries or examples.
If other languages are better suited to this job, please let me know.
Thank you.
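Would something like the sketch below be a reasonable approach? It spawns ffmpeg (installed separately on the server) from Node.js and loops a single audio file to the RTMP ingest URL. This is untested, and "song.mp3" plus the restart delay are just placeholders:

// Untested sketch: push a looping audio file to the RTMP ingest point by
// spawning ffmpeg from Node.js. Requires ffmpeg on the PATH; "song.mp3" is a placeholder.
const { spawn } = require('child_process');

const STREAM_URL = 'rtmp://rtmp.cdn.asset.aparat.com:443/event';
const STREAM_KEY = 'ea1d40ff5d9d91d39ca569d7989fb1913?s=2f4ecd8367346828';

function startStream() {
  const ffmpeg = spawn('ffmpeg', [
    '-re',                  // read the input at its native rate
    '-stream_loop', '-1',   // loop the input file forever
    '-i', 'song.mp3',
    '-c:a', 'aac', '-b:a', '128k',
    '-f', 'flv',            // RTMP expects an FLV container
    STREAM_URL + '/' + STREAM_KEY,
  ]);

  ffmpeg.stderr.on('data', (d) => process.stderr.write(d));
  // If ffmpeg exits for any reason, restart it so the stream stays up 24/7.
  ffmpeg.on('close', () => setTimeout(startStream, 5000));
}

startStream();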
Related
I am currently working on a Node.js and socket app that currently does 1:1 video conferencing using WebRTC. The videos are two separate elements in the HTML, and I would like to merge them together so that I can broadcast to an RTMP URL for public viewing (2:many). Is this possible?
For WebRTC I followed this tutorial: https://www.youtube.com/watch?v=DvlyzDZDEq4, and for broadcasting I am using ffmpeg, which currently handles a single video stream.
Please confirm whether this is doable.
Update
I was able to merge the video using
https://www.npmjs.com/package/video-stream-merger
And now the final issue:
I am receiving merger.result, which is the merged stream, and I tried to create a MediaRecorder object from it. The MediaRecorder ondataavailable callback is called only once, not every 250 ms as I need in order to broadcast to YouTube. How can I fix this?
var merger = new VideoStreamMerger(v_opts);
...
...
merger.start();

myMediaRecorder = new MediaRecorder(merger.result);

// Attach the handler before calling start() so no chunks are missed.
myMediaRecorder.ondataavailable = function (e) {
  console.log("DataAvailable");
  //socket.emit("binarystream", e.data);
  state = "start";
  //chunks.push(e.data);
};

// Ask for a data chunk roughly every 250 ms (the timeslice argument).
myMediaRecorder.start(250);
So you're looking for many peers. This is possible; please see the links below for reference.
WebRTC: https://webrtc.github.io/samples/src/content/peerconnection/multiple/
StackOverflow reference: webRTC multi-peer connection (3 clients and above)
GitHub reference: https://github.com/Dirvann/webrtc-video-conference-simple-peer
https://deepstream.io/tutorials/webrtc/webrtc-full-mesh/
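For the RTMP leg itself, here is a rough server-side sketch of how the "binarystream" chunks from the MediaRecorder could be piped into ffmpeg's stdin (untested; it assumes socket.io and ffmpeg on the PATH, and the YouTube ingest URL and stream key are placeholders):

// Untested sketch: receive WebM chunks recorded in the browser and pipe them
// into ffmpeg, which remuxes/transcodes to FLV for the RTMP endpoint.
const { spawn } = require('child_process');
const { Server } = require('socket.io');

const io = new Server(3000);

io.on('connection', (socket) => {
  const ffmpeg = spawn('ffmpeg', [
    '-i', 'pipe:0',                      // WebM chunks arrive on stdin
    '-c:v', 'libx264', '-preset', 'veryfast',
    '-c:a', 'aac',
    '-f', 'flv',
    'rtmp://a.rtmp.youtube.com/live2/STREAM-KEY',   // placeholder key
  ]);

  // Each chunk is the e.data Blob from ondataavailable, delivered as a Buffer.
  socket.on('binarystream', (chunk) => ffmpeg.stdin.write(chunk));

  socket.on('disconnect', () => ffmpeg.stdin.end());
});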
My workflow is:
Using ffmpeg, I send an RTMP stream to Wowza App1.
App1 sends the stream to an internal second application (App2).
App2 transcodes and packetizes to HLS, and is the origin for a CloudFront distribution.
CloudFront serves the streams to users.
The player on the user side is based on HLS.js.
To prepare for different scenarios, I forced App2 to restart during a test transmission. In this case App1 keeps receiving the stream from ffmpeg and trying to send it to App2; once App2 is ready the link is re-established and App1 continues sending the stream to App2, but there is no video on the client side.
Before the restart, chunklist.m3u8 lists chunks up to the 17th: media-u3510ez40_17.ts
Then, while App2 is restarting, chunklist.m3u8 does not exist and CloudFront returns a 404 error.
And then, when App2 is back, chunklist.m3u8 lists a new set of chunks starting at 1 with a new ID: media-u1ofkjj9w_1.ts
The problem is that there is no video, and the network traffic shows that the browser does not download the newly listed chunks.
chunklist.m3u8 keeps adding new chunks, but the browser does not download any of them... until the 18th chunk appears, and then the video resumes.
I have tried many times and the problem is always the same: before the restart the last chunk has some number N, and after the restart there is no video until the N+1 chunk appears, even though the IDs are different.
I don't know if this issue is in Wowza, CloudFront, or the HLS.js player. :/
chunklist.m3u8 Before Restart:
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:9
#EXT-X-MEDIA-SEQUENCE:1
#EXT-X-DISCONTINUITY-SEQUENCE:0
#EXTINF:8.333,
media-u3510ez40_1.ts
#EXTINF:8.333,
media-u3510ez40_2.ts
#EXTINF:8.334,
.
.
.
media-u3510ez40_16.ts
#EXTINF:8.333,
media-u3510ez40_17.ts
chunklist.m3u8 After Restart:
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:17
#EXT-X-MEDIA-SEQUENCE:1
#EXT-X-DISCONTINUITY-SEQUENCE:0
#EXTINF:16.396,
media-u1ofkjj9w_1.ts
#EXTINF:8.333,
media-u1ofkjj9w_2.ts
.
.
.
media-u1ofkjj9w_16.ts
#EXTINF:8.333,
media-u1ofkjj9w_17.ts
#EXTINF:8.333,
media-u1ofkjj9w_18.ts
You need to set the cupertinoCalculateChunkIDBasedOnTimecode property to true in the Custom Properties section of the Wowza Streaming Engine application. Refer to this: https://www.wowza.com/docs/how-to-configure-apple-hls-packetization-cupertinostreaming
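If you prefer to edit Application.xml directly instead of using the Manager UI, the property would look roughly like the fragment below. The placement under the LiveStreamPacketizer Properties container is my assumption; the Wowza doc linked above is the authoritative reference:

<!-- Application.xml fragment (sketch): HLS packetizer custom property -->
<LiveStreamPacketizer>
    <Properties>
        <Property>
            <Name>cupertinoCalculateChunkIDBasedOnTimecode</Name>
            <Value>true</Value>
            <Type>Boolean</Type>
        </Property>
    </Properties>
</LiveStreamPacketizer>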
Also, note that this only helps when the encoder sends synchronized timecodes to Wowza. For encoders that do not send synchronized timecodes, I suggest configuring an absolute timecode regardless of whether the encoder sends one. This will help the application resume at chunk number N+1 after the restart.
The page below will help you configure it properly.
http://thewowza.guru/how-to-set-stream-timecodes-to-absolute-time/
About the session-ID change: when using Wowza as the origin for a CloudFront distribution, you need to enable httpOriginMode and disable httpRandomizeMediaName to make it work properly. The Wowza doc below will help you set it up properly: https://www.wowza.com/docs/how-to-configure-a-wowza-server-as-an-http-caching-origin
I'm trying to play an mp3 live stream and I'm using a "media response" as shown in the Actions on Google guide. Here is the code:
// MediaObject and Suggestions come from the actions-on-google client library.
if (!conv.surface.capabilities.has('actions.capability.MEDIA_RESPONSE_AUDIO')) {
  conv.ask('Sorry, this device does not support audio playback.');
} else {
  conv.ask(new MediaObject({
    name: 'Radio one',
    url: 'my_streaming_url.mp3',
    description: 'A funky Jazz tune',
  }));
  conv.ask(new Suggestions(['Radio two']));
}
Everything works fine, but there is about 20 seconds of audio latency on Google Home and Google Home Mini. There is no latency on the Google Assistant Android app or in the Actions on Google simulator, and no latency if "url" points to a plain mp3 file. Any idea why there is this delay?
Google Home's media player appears to buffer roughly 20-30 seconds of playable audio before starting.
If you control the Icecast streaming server, increase the <burst-size> value either in the <limits /> section or <mount />. By default, it is set to 65536 bytes.
You can work out the ideal burst-size limit for your stream in bytes by calculating the following:
bitrateKbps * bufferSeconds * 1024 / 8
For a 128 kbps stream and a ~20-second buffer, try 327680.
(Also, make sure the server's <queue-size /> is bigger than the burst-size)
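As a quick sanity check of that formula in code (assuming 128 kbps and roughly 20 seconds of buffering, which gives the value suggested above):

// Sanity check of the burst-size formula: bitrateKbps * bufferSeconds * 1024 / 8
var bitrateKbps = 128;
var bufferSeconds = 20;
var burstSizeBytes = bitrateKbps * bufferSeconds * 1024 / 8;
console.log(burstSizeBytes); // 327680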
The 20-second latency with live streaming URLs appears to be a thing on Google Home (Mini) and Chromecast, not only when playback is initiated from Google Assistant but also when using the Google Cast API. I have no idea why they use so much buffering.
I'm trying to get my receiver to play an mp3 file hosted on the server with the following function:
playSound_: function (mp3_file) {
  var snd = new Audio("audio/" + mp3_file);
  snd.play();
},
However, most of the time it doesn't play, and when it does, it's delayed. When I load the receiver in my local browser, though, it works fine.
What's the correct way to play audio on the receiver?
You can use either a MediaElement tag or the Web Audio API. The simplest is probably a MediaElement.
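For illustration, a minimal sketch of the MediaElement route, assuming the receiver page contains an <audio id="player"> element (the id and helper name mirror the question and are placeholders):

// Sketch: reuse a single <audio id="player"> element in the receiver page.
playSound_: function (mp3_file) {
  var player = document.getElementById('player');
  player.src = 'audio/' + mp3_file;   // resolved relative to the receiver URL
  player.play();                       // play() returns a Promise in modern Chrome
},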
I am creating a MonoTouch iPhone app that will display streaming videos. I have been able to get the MPMoviePlayerController working with a local file (NSUrl.FromFile), but have not been able to get it to play videos streamed from a media server.
Here is the code I am using to play the video:
string url = @"http://testhost.com/test.mp4";
var nsurl = NSUrl.FromString(url);
mp = new MPMoviePlayerController(nsurl);
mp.SourceType = MPMovieSourceType.Streaming;
//enable AirPlay
//mp.AllowsAirPlay = true;
//Add the MPMoviePlayerController View
this.View.AddSubview(mp.View);
//set the view to be full screen and show animated
mp.SetFullscreen(true, true);
//MPMoviePlayer must be set to PrepareToPlay before playback
mp.PrepareToPlay();
//Play Movie
mp.Play();
Is there something else I am missing in implementing MPMoviePlayerController for video streaming? I also read that videos for iOS should be streamed using Apple's HTTP Live Streaming on the media server; is that a requirement? I have never worked with video streaming on an iOS device before, so I am not sure whether something is lacking in my code, the media server setup, or both.
I'm pretty sure you don't need a streaming server just to play a video file from an HTTP URL. However, it is a requirement for applications (on the App Store) to use one:
Important: iPhone and iPad apps that send large amounts of audio or video data over cellular networks are required to use HTTP Live Streaming. See "Requirements for Apps."
The good news is that your existing code should not have to be changed to handle this (it's server, not client side).