How to add video and voice calls to a Python application? - python-3.x

I want to add video and voice calling to my web application, which is developed with Python.
I searched the internet and found that I can do it with WebRTC, but everything I found is done in JavaScript, and I don't know how to do this in Python.
I'm using Sanic as the web framework, on Python 3.6.
Alternatively, is it possible to do this with socketio in Python?
I know that module is suitable for chat apps.
I appreciate your help.

There are several aspects involved in building a WebRTC application:
Serving the web pages and JavaScript code used by your web clients. You can either use plain static files or a server-side framework of your choice.
Providing a signaling channel which allows participants to exchange information about what media they support (audio, video, data channels) and how they can reach each other. Very often a WebSocket is used for this, but it's not the only possibility.
Taking part in the actual WebRTC media exchange. This really depends on your use case. If you are doing one-to-one audio/video, then the WebRTC endpoints are usually web browsers, but they could also be native applications. If you are building something like a voice-over-IP service, then most likely one endpoint is a browser and the other is a server such as Asterisk or FreeSWITCH.
In the event you actually want your users to communicate with a custom server written in Python (for instance, if you are doing audio/video processing with OpenCV), you can take a look at aiortc:
https://github.com/jlaine/aiortc
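To make that concrete, here is a minimal sketch of a server-side peer with aiortc. It assumes the browser delivers its SDP offer to some endpoint you provide; the handle_offer name and the JSON shape of params are assumptions for illustration, not part of aiortc itself:

```python
from aiortc import RTCPeerConnection, RTCSessionDescription

async def handle_offer(params):
    """Accept a browser's SDP offer and return the server's answer."""
    pc = RTCPeerConnection()

    @pc.on("track")
    def on_track(track):
        # track.kind is "audio" or "video"; frames can be pulled with
        # `await track.recv()` and converted, e.g. frame.to_ndarray(format="bgr24"),
        # for processing with OpenCV.
        print("Receiving %s" % track.kind)

    await pc.setRemoteDescription(
        RTCSessionDescription(sdp=params["sdp"], type=params["type"])
    )
    answer = await pc.createAnswer()
    await pc.setLocalDescription(answer)
    return {"sdp": pc.localDescription.sdp, "type": pc.localDescription.type}
```

The aiortc repository also ships runnable examples, which are a better starting point than writing everything from scratch.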

Sanic is just a web server that serves your HTML and JavaScript. You could use any web server; it makes no difference to WebRTC, because the web server has no interaction with the WebRTC code in any way.
All the WebRTC code you need for video chat will be in a JavaScript file, and that code runs in the browser (Firefox, Chrome, Opera, ...). What you need on the server is signaling between peers, and for this signaling process you can use socketio in Python.
I would recommend learning more about WebRTC: https://codelabs.developers.google.com/codelabs/webrtc-web/#0
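For the signaling part, a minimal sketch with python-socketio attached to Sanic could look like the following. It simply relays whatever the browsers send (SDP offers/answers, ICE candidates) to the other connected clients; the "signal" event name and the single-room setup are assumptions for illustration, and your client-side JavaScript just needs to use the same event name:

```python
import socketio
from sanic import Sanic

sio = socketio.AsyncServer(async_mode="sanic", cors_allowed_origins="*")
app = Sanic("signaling")
sio.attach(app)

@sio.event
async def signal(sid, data):
    # Relay SDP offers/answers and ICE candidates to every other client.
    # The actual WebRTC negotiation happens in the browsers; the server
    # only passes messages along.
    await sio.emit("signal", data, skip_sid=sid)

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8000)
```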

Related

Should I use WebRTC alongside Socket.IO if I want a live chat ability besides the real-time video streaming?

What I am trying to do is create a simple virtual classroom project like Adobe Connect, but obviously simpler, using Flutter and NodeJS, and I need the following features:
Real-time video or only voice streaming
Live chat box
Screen sharing ability
File sharing ability (like PDF, PowerPoint, or other text/doc files)
Whiteboard
From my research so far, it seems WebRTC works for video/voice streaming and also for screen sharing.
Also, most live-chat projects use Socket.IO.
My main question is whether I can use only WebRTC for both real-time video/voice streaming and live chat. Is that a good idea, or is it better to combine Socket.IO and WebRTC?
Furthermore, I want to know whether I can use each of those libraries for file-sharing purposes.
WebRTC gives you lower latency and a lot of conferencing functionality out of the box, so for video/audio calls and screen sharing it is definitely the better choice.
There is also the option of p2p communication, which reduces latency even more and saves resources on the server side. However, if you intend to support many participants it looks less beneficial: with n users in total, each user has to maintain n-1 connections (for example, in a 10-person room every client sends its media 9 times).
For live chat, whiteboard, and file sharing there would be no big difference in terms of performance.
Things to consider:
WebRTC is a more complex technology to set up and support than websockets
There may be open-source solutions for these features; I would base the decision on what you can reuse in your project
You can use WebRTC for some of the features and websockets for others
"can I use only WebRTC for both real-time video/voice streaming and also live chat as well?"
Yes, you can: there is an RTCDataChannel interface for exchanging arbitrary data, and it can be used for live chat, a whiteboard, or file transfer.
As a good example, there is an open-source project, peercalls, that implements chat and file transfer via WebRTC over the same connection used for conferencing.
Websockets can be used for file transfer as well; check out this library.
Using WebRTC requires a signaling server, and signaling is often implemented over a websocket; see the MDN article Signaling and video calling.
And with websockets you can implement live chat too, so it is not an either/or situation; quite often you use both.
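Since this thread started from a Python angle, here is a minimal sketch of the data-channel idea using aiortc (the library mentioned in the first answer); in a browser, the JavaScript RTCDataChannel API has the same shape. The "chat" label is just an example:

```python
from aiortc import RTCPeerConnection

pc = RTCPeerConnection()
chat = pc.createDataChannel("chat")  # rides on the same connection as audio/video

@chat.on("open")
def on_open():
    chat.send("hello from the conferencing peer")

@chat.on("message")
def on_message(message):
    # Text or binary payloads: chat lines, whiteboard strokes, file chunks, ...
    print("received:", message)
```

The usual offer/answer signaling still has to happen before the channel opens; the channel itself just gives you a message pipe on the existing peer connection.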

Signaling mechanism for WebRTC using the simpleWebRTC JS library and a backend in Django

I am trying to build a video conferencing web application where multiple people can join a call, with video/audio/data transmission. I have researched this a lot. What I understand is that we can achieve this using the WebRTC protocol. I started looking into JS libraries and came across simpleWebRTC. I want to develop the backend in Django. I tried to run the sample demo code from https://www.sitepoint.com/webrtc-video-chat-application-simplewebrtc/, but I am not able to establish a connection with socket.io, which is the simpleWebRTC signaling sandbox server.
I tried with a Node Express server as well, but I still got the same error: net::ERR_FAILED. This error occurred in simplewebrtc-with-adapter-js while making the connection.
What would be the correct technologies to achieve this type of functionality?
Front-end WebRTC libraries: simpleWebRTC / EasyRTC / any other APIs?
Signaling mechanism: what should we use, and how do we connect it with WebRTC?
Backend: Node.js or Django?
I am still confused about the signaling protocols and STUN/TURN servers, since we have to provide the servers ourselves; simpleWebRTC does not provide ones we can use in production.
Any help would be much appreciated!
I just started a video calling and chat application as well, using open-easyrtc. No problems so far; their demo just works after npm install.
As for signaling servers, since I just started I haven't concerned myself much with them, but as far as I can tell they are used for exchanging information such as video metadata, network information, etc. open-easyrtc comes with public STUN and TURN servers; I am not sure about their limitations, especially if you are going to have a lot of users.
It is also possible to deploy your own; I am looking at learning more about coturn once I have finished developing my application.
You can use simple-peer, a simple library for WebRTC. Here is an example project with multiple users (DEMO).
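Since the question asks about a Django backend: the signaling server itself can be a plain websocket relay. Below is a minimal sketch using Django Channels (Channels is not named in this thread, so treat it as one possible choice); it assumes a configured channel layer, a URL route with a room kwarg, and a free-form JSON message format:

```python
# consumers.py -- relay SDP offers/answers and ICE candidates within a room.
from channels.generic.websocket import AsyncJsonWebsocketConsumer

class SignalingConsumer(AsyncJsonWebsocketConsumer):
    async def connect(self):
        # Assumes a route like: path("ws/signal/<str:room>/", SignalingConsumer.as_asgi())
        self.room = self.scope["url_route"]["kwargs"]["room"]
        await self.channel_layer.group_add(self.room, self.channel_name)
        await self.accept()

    async def disconnect(self, code):
        await self.channel_layer.group_discard(self.room, self.channel_name)

    async def receive_json(self, content, **kwargs):
        # Forward whatever the client sent to everyone else in the room.
        await self.channel_layer.group_send(
            self.room,
            {"type": "signal.message", "message": content, "sender": self.channel_name},
        )

    async def signal_message(self, event):
        if event["sender"] != self.channel_name:
            await self.send_json(event["message"])
```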

Building a chat application, NodeJS and Express - what should I use for media streaming?

I've previously built chat servers using NodeJS (i.e. a central chat server with clients, no p2p), with Electron or just good old Express. I'd like to re-use as much of my old code as possible. Thus, the only missing piece of the puzzle for me is what to use to enable both public and private video/audio streaming. File sending isn't necessary.
Is there anything out there I can 'easily' drop into this model? I'm aware of Kurento and a few similar offerings, but these feel like overkill for how I'm hoping to work.
Update: there have been a few suggestions about WebRTC, which I'm open to, but plans for this app include automated moderation/content filtering of any video broadcasts and text. So I assume such a solution would need to treat the server as a 'hardcoded' peer somehow, so that it is fairly safe to assume it will see a copy of anything sent over the public chat network. Of course, for private communications this need not be the case. On the flip side, worst case, operating in a spoke topology is fine too.
You can start with the WebRTC samples:
https://webrtc.github.io/samples/
WebRTC is pretty much the standard now for audio/video calls. It all works p2p, with no server interaction.
The only thing you need to build is a signaling protocol to connect two users. For this you can use/extend your NodeJS chat app.

Send MediaStream from Browser to Server to encode

I am developing a video conferencing system that should enable users to stream a session to (for now) the server. I would prefer using WebRTC to connect client and server. The big hurdle I have stumbled upon is the question of how to actually "live stream" the video from getUserMedia to the server.
I came across various methods, including using a canvas element as well as putting a gateway (like Janus or Kurento) in the middle. I also found this answer here on Stack Overflow. However, since this is a learning project, I would prefer a pure WebRTC solution, and I would rather not use the upcoming recording API, since I am aiming at "live streaming" to the server.
My idea was to use nodeJS and have the server act as a peer. However, I did not find a package for nodeJS that would enable those two channels (a dataChannel always seems to be possible and might be a solution together with the canvas approach). So my question is: are there any packages out there that I missed? Or is there any other way to implement this?
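For what it's worth, the "server acts as a peer" idea is straightforward in Python with aiortc (mentioned in the first answer on this page); the sketch below accepts an offer and pipes the incoming track into a server-side encoder. It is only an illustration of the approach, not a Node.js answer, and the handle_offer name, the params shape, and the output file name are assumptions:

```python
from aiortc import RTCPeerConnection, RTCSessionDescription
from aiortc.contrib.media import MediaRecorder

async def handle_offer(params):
    pc = RTCPeerConnection()
    recorder = MediaRecorder("incoming.mp4")  # encodes whatever tracks arrive

    @pc.on("track")
    def on_track(track):
        # Feed the live browser track straight into the server-side encoder.
        recorder.addTrack(track)

    await pc.setRemoteDescription(
        RTCSessionDescription(sdp=params["sdp"], type=params["type"])
    )
    await recorder.start()
    answer = await pc.createAnswer()
    await pc.setLocalDescription(answer)
    return {"sdp": pc.localDescription.sdp, "type": pc.localDescription.type}
```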

Methods for calling APIs in one Nodejs app from another Nodejs app

Our application will have a website and a mobile app, both communicating with the same API backend. I have one Nodejs application serving only APIs and a second Nodejs app serving HTML pages for the website. I am using the Expressjs web framework for both of these apps.
What are the different methods to call APIs in one Nodejs app from another Nodejs app? Additional information on when to use each method would be great.
EDIT:
Example,
I have the following applications
NodejsAPI (node & express)
NodejsWebsite (node & express)
MobileApp
NodejsAPI will provide access to APIs for the MobileApp and the NodejsWebsite. MobileApp will access the APIs over HTTP. But I want to know what the options are for NodejsWebsite to call APIs in the NodejsAPI app. From what I understand, this will be inter-process communication between the two processes. For .NET applications, such communication could be done using .NET pipes, TCP communication, etc. What are the equivalent methods for Nodejs applications on Unix and Linux platforms?
Thinking from an IPC perspective, I found the following to be useful:
What's the most efficient node.js inter-process communication library/method?
https://www.npmjs.org/package/node-ipc
There's Node's vanilla HTTP client, the HTTP-client Swiss Army knife request, and then there's superagent, which is similar to jQuery.ajax. To make your life easier there are armrest and fementa, both different flavors of the same thing.
Now, if you want more performance and another interface to your application, you can use one of these RPC solutions:
dnode: One of the most popular solutions. It makes things very easy and makes using remote interfaces seamless. phantomjs-node uses dnode. It doesn't perform well with huge objects compared to the others, but for small stuff it's perfect. There are ports for other languages too.
zerorpc: Uses zeromq as its socket library, which is famous for being reliable. It supports connecting to a Python client too (see the sketch below this list).
smith: The RPC system used in the Cloud9 editor backend. Basically almost as nice as dnode, but faster. Both smith and zerorpc use msgpack instead of JSON, so they save bytes on the wire.
axon-rpc: A lightweight solution, as nice to use as zerorpc. You can configure it to use msgpack with axon-msgpack.
All of the above work over both TCP (to be used across different machines) and Unix domain sockets (faster than TCP, but only on the same machine).
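As an aside on the zerorpc point above: since this page started from Python, here is a minimal sketch of a Python client talking to a zerorpc endpoint. The address and the hello method are assumptions for illustration; the server side would be whatever your NodejsAPI app exposes through zerorpc's Node library:

```python
import zerorpc

client = zerorpc.Client()
client.connect("tcp://127.0.0.1:4242")   # address/port are an example
print(client.hello("world"))             # 'hello' is a hypothetical remote method
```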
If you want more performance, you can embed your NodejsAPI in your NodejsWebsite by simply requiring its interface module.
If you want answers better than this, write a more specific question. The question as it is, is too broad.
