Is it possible to cast two streams from two different URLs simultaneously? How can I achieve that? I guess I would need to add another video tag in the receiver application. How can I control the second video tag in the receiver application?
You can only have one video element active at a time on a Chromecast device.
No, it's not possible. The Chromecast ecosystem is designed to run a single app at a time. If you could create a receiver app that accepts two streams you might be able to do it, but in the current state you can't.
I am very confused about the Calling SDK specs. They are clear about the fact that only one video stream can be rendered at a time, see here...
BUT when I try out the following sample, I get video streams for all members of the group call. When I try the other example (both from MS), it behaves as written in the specs... So I am totally confused about why the other example can render more than one video stream in parallel. Can anybody tell me how to understand this? Is it possible or not?
EDIT: I found out that both examples work with multiple video streams. So it is cool that the service provides more than the specs say, but I do not understand why the specs describe a limitation that does not seem to exist...
Only one video stream is officially supported on the ACS Web (JS) Calling SDK; multiple video streams can be rendered for incoming calls, but A/V quality is not guaranteed at this stage for more than one video. Support for 4 (2x2) and 9 (3x3) streams is on the roadmap, and we'll publish support once network bandwidth paired with quality assurance testing and verification is identified and completed.
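For reference, rendering each available remote video stream looks roughly like this. This is a minimal sketch, assuming the VideoStreamRenderer API from @azure/communication-calling, an already established call object, and a hypothetical videoContainer element; participants joining later would need the same wiring via the remoteParticipantsUpdated event.

```js
import { VideoStreamRenderer } from '@azure/communication-calling';

// Assumes `call` is an established Call and a <div id="videoContainer"> exists.
async function renderRemoteStream(remoteVideoStream) {
  const renderer = new VideoStreamRenderer(remoteVideoStream);
  const view = await renderer.createView({ scalingMode: 'Crop' });
  document.getElementById('videoContainer').appendChild(view.target);
}

call.remoteParticipants.forEach(participant => {
  participant.videoStreams.forEach(stream => {
    // Render streams that are already available and watch for new ones.
    if (stream.isAvailable) renderRemoteStream(stream);
    stream.on('isAvailableChanged', () => {
      if (stream.isAvailable) renderRemoteStream(stream);
    });
  });
});
```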
I have been trying to implement a web application that will be able to handle following scenario:
Streaming video/audio from a client to other clients (actually to a particular set of them, not broadcasting) and to the server at the same time. The data source would be the client's webcam.
This streamed data has to be displayed in the real time on the other clients' browser and be saved on the server side for the 'archiving' purposes.
It has to be implemented in node.js + socket.io environment.
To put it in some more specific context... The scenario is that there is a guy who creates a kind of room for the users he chooses. After the chosen users join the room, the creator starts streaming video/audio from his/her built-in devices (webcam). All of the guests receive the data in real time; moreover, the data is sent to the server, where it is stored so it can be recovered after the stream ends and the room is closed.
I was thinking about mixing Socket.IO with WebRTC. In theory the combination of these two seems just perfect for the job.
Socket.IO is great for gathering a specific set of users by assigning their sockets to a room, and for the signaling process demanded by WebRTC.
At the same time, WebRTC is great for P2P connections between users gathered in the same room, and it also makes it really easy to get access to the webcam and other built-in devices I might want to use.
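Just to make the signaling part concrete, this is roughly the kind of relay I have in mind (a rough sketch; the event names 'join-room', 'peer-joined' and 'signal' are made up):

```js
// Rough sketch of the Socket.IO signaling relay (server side).
const io = require('socket.io')(3000);

io.on('connection', socket => {
  socket.on('join-room', roomId => {
    socket.join(roomId);
    // Tell the others in the room that a new peer arrived.
    socket.to(roomId).emit('peer-joined', socket.id);
  });

  // Relay SDP offers/answers and ICE candidates between two peers.
  socket.on('signal', ({ to, data }) => {
    io.to(to).emit('signal', { from: socket.id, data });
  });
});
```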
So yeah, everything is looking pretty decent in theory, but beyond that signaling sketch I would really need to see some code in action so I could actually try to implement it on my own. Moreover, I see some issues:
How do I save the stream that is sent over the P2P connection? Obviously the server does not have access to that. I was thinking that I might treat the server as another 'guest', so it would just be another endpoint of the P2P connection with the creator of the room. Somehow that feels hacky, though.
Wouldn't it be better to treat the server as a middleman between the creator and the clients? There might be some, probably insignificant, delay compared to P2P, but presumably it would be the same for all the clients. (I tried that, but I can't get the streaming from the webcam to the server working; that, however, is a topic for a different question, as I am having problems with processing the MediaStream.)
I was looking for some nice solutions, but without any success. I have seen that there is this nice P2P solution made for socket.io: http://socket.io/blog/socket-io-p2p/ . The thing is, I don't think it will handle the data stream well. The examples mention only a simple chat app, and I need something a little heavier than that.
I would be really thankful for some specific examples, docs, whatever may lead me a little closer to the implementation of it as I really don't know how to approach it.
Thanks in advance :)
Your task can be solved by using one of the open-source WebRTC servers.
For example, Kurento.
You can implement the following streaming schemas:
One to one
One to many
Many to many
[Diagram: WebRTC-server schema]
Clients will connect to each other through the WebRTC server.
So, on the server side, you can record the stream or send it for transcoding.
WebSocket is used for communicating with the server.
You can find some examples that match your task.
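As a rough illustration (not a drop-in solution), recording an incoming WebRTC stream on the server with the kurento-client Node module looks roughly like this. The WebSocket URI and the recording path are placeholders, and the ICE-candidate exchange with the browser is omitted:

```js
const kurentoClient = require('kurento-client');

// Placeholder: adjust to your Kurento Media Server address.
const KMS_URI = 'ws://localhost:8888/kurento';

async function startRecording(sdpOffer) {
  const client = await kurentoClient(KMS_URI);
  const pipeline = await client.create('MediaPipeline');

  const webRtc = await pipeline.create('WebRtcEndpoint');
  const recorder = await pipeline.create('RecorderEndpoint', {
    uri: 'file:///tmp/room-recording.webm' // placeholder storage location
  });

  await webRtc.connect(recorder);               // route incoming media to the recorder
  const sdpAnswer = await webRtc.processOffer(sdpOffer);
  await webRtc.gatherCandidates();
  await recorder.record();

  return sdpAnswer; // send back to the creator's browser over your WebSocket
}
```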
Video streaming to multiple users is a really hard problem that unfortunately requires extensive infrastructure to achieve. You will not be able to stream video data through a websocket. WebRTC is also not a viable solution for what you are describing because, as you mentioned, the WebRTC protocol is P2P, meaning the streaming user would need to make a direct connection to all the 'viewers'. This will obviously not scale beyond a few 'viewers'. WebRTC is more for direct video calls, like in Skype for example.
Here is an article describing the architecture used by a somewhat popular live streaming service. As you can see, achieving live video at any sort of scale requires considerable resources.
I want to do some stuff using Kinect, and my research took me to two libs, libfreenect and OpenNI. The first one apparently just extracts video data, am I right? The second one was acquired by Apple and dissolved; however, some of the binaries and documentation were recovered by structure.io, and this library does give the complete Kinect data. My idea is to use a socket.io server to process the Kinect input data and send it to the browser, then use JavaScript to process it on the client. My question is: has anyone here achieved such a thing? And if so, could you give me some guidance on how to achieve this, or where to start, please?
For Kinect for Windows V2 =>
https://www.npmjs.com/package/kinect2 [I've used it]
For Kinect v1 =>
https://github.com/nguyer/node-kinect
http://metaduck.com/09-kinect-browser-node.html
http://blog.whichlight.com/post/53241512333/streaming-kinect-data-into-the-browser-with-nodejs
http://depthjs.media.mit.edu/
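For the socket.io bridge described in the question, a minimal sketch using the kinect2 package above could look like this (it assumes a Kinect for Windows V2 sensor and only forwards body/skeleton frames; the port and event name are arbitrary):

```js
// Minimal sketch: forward Kinect v2 body frames to the browser via Socket.IO.
const Kinect2 = require('kinect2');
const io = require('socket.io')(8000);

const kinect = new Kinect2();

if (kinect.open()) {
  kinect.on('bodyFrame', bodyFrame => {
    // Push each skeleton frame to every connected browser client.
    io.sockets.emit('bodyFrame', bodyFrame);
  });
  kinect.openBodyReader();
}
```

On the client you would connect with the Socket.IO client library, listen for the same 'bodyFrame' event, and draw the joints, e.g. on a canvas.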
This library achieves something similar to what you were looking to do. It uses Kinect2 (mentioned in another response) to get the Kinect data, but also lets you stream it to another browser.
https://github.com/kinectron/kinectron
I'm looking for something to stream audio like a radio station (playing continuously, and clients can join in the middle of a song) with node.js. Is there any node.js module (which I couldn't find) or anything else that I can use along with node.js to achieve this? Is this possible at all with node.js? If not, what do you recommend instead? (Though I prefer node.js.) It's OK for me to use the HTML5 Audio API, and I don't care about IE support.
Thanks.
Yes, this is entirely possible. I am hosting internet radio on Node.js at the moment.
All you have to do is take the raw stream data from the encoder and send it via HTTP to any connected clients. The clients are good about synching up with the stream, so you don't have to worry about aligning to frames or anything.
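A stripped-down sketch of that fan-out, assuming the encoded MP3 arrives on the stdout of an encoder process (the ffmpeg command, playlist file, and port are just placeholders):

```js
const http = require('http');
const { spawn } = require('child_process');

// Hypothetical encoder; any process that writes MP3 to stdout works here.
const encoder = spawn('ffmpeg', ['-re', '-i', 'playlist.mp3', '-f', 'mp3', 'pipe:1']);

const listeners = new Set();

// Fan the encoder output out to every connected client.
encoder.stdout.on('data', chunk => {
  for (const res of listeners) res.write(chunk);
});

http.createServer((req, res) => {
  res.writeHead(200, { 'Content-Type': 'audio/mpeg' });
  listeners.add(res);
  req.on('close', () => listeners.delete(res)); // drop clients that disconnect
}).listen(8000);
```

Anyone who connects mid-song simply starts receiving the stream from the current chunk onward, which is exactly the radio behaviour you describe.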
I'm a new programmer working on J2ME apps, and I'm trying to create an app to play music online. Connecting to the server and streaming works OK. However, I have a problem with managing memory. The memory on feature phones is quite small, so it is only able to read about 1 MB of the stream at a time, but the audio file I expect is about 30 MB. What is the solution for this issue?
P.S. I tried using threads, but that works quite badly because of the interruptions when switching between threads.
I think the best option for streaming with Java ME is to use the RTSP protocol, but I don't know what it takes to set that up.
If you insist on using HTTP, you can have one thread fetch the data while another thread plays it. It looks like you can test that method with this app:
http://handheld.softpedia.com/get/Video/J2MEStreaming-28778.shtml