Enable WebSocket streaming in a Bot Framework Composer bot - Node.js

I have a Node.js bot built with Composer.
I'm attempting to enable telephony. I followed the directions laid out on GitHub; however, when I call I get dead air, and the Telephony channel on the bot then shows an error that it could not connect to my bot. WebSockets and streaming are enabled on the app, but it looks as if the code needs to be updated as well.
Where do you enable WebSockets in the Node.js code built with Composer?

it looks as if the code needs to be updated as well. Where do you enable WebSockets in the Node.js code built with Composer?
It is set up to use WebSockets already. I don't know if you need something special for telephony or not. Please see here.
when I call I get dead air
My assumption is that it is something else.
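For reference, in a plain botbuilder Node.js bot the streaming samples enable WebSockets roughly as follows. This is only a sketch: `bot` (an ActivityHandler) and the app credentials are assumed to exist elsewhere, and the Composer-generated runtime may already wire this up differently.

```javascript
const restify = require('restify');
const { BotFrameworkAdapter } = require('botbuilder');

// `handleUpgrades` lets restify pass HTTP upgrade requests through to us.
const server = restify.createServer({ handleUpgrades: true });
server.listen(process.env.port || 3978);

// Regular HTTP messaging endpoint.
const adapter = new BotFrameworkAdapter({
  appId: process.env.MicrosoftAppId,
  appPassword: process.env.MicrosoftAppPassword,
});
server.post('/api/messages', (req, res) => {
  adapter.processActivity(req, res, async (context) => bot.run(context));
});

// WebSocket streaming endpoint: handle the HTTP upgrade and hand the socket to the adapter.
server.on('upgrade', async (req, socket, head) => {
  const streamingAdapter = new BotFrameworkAdapter({
    appId: process.env.MicrosoftAppId,
    appPassword: process.env.MicrosoftAppPassword,
  });
  await streamingAdapter.useWebSocket(req, socket, head, async (context) => {
    await bot.run(context);
  });
});
```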

Related

Signaling mechanism for WebRTC using the SimpleWebRTC JS library with a Django backend

I am trying to build a video conferencing web application where multiple people can join a call with video/audio/data transmission. I have researched this a lot. What I understand is that we can achieve this using the WebRTC protocol. I started looking into JS libraries and came across SimpleWebRTC. I want to use Django as the backend. I tried to run the sample demo code from https://www.sitepoint.com/webrtc-video-chat-application-simplewebrtc/, but I am not able to establish a connection with socket.io, which is the SimpleWebRTC signaling sandbox server.
I tried with a Node Express server as well, but I still got the same error: net::ERR_FAILED. The error occurred in simplewebrtc-with-adapter-js while making the connection.
What would be the correct technologies to achieve this type of functionality?
Front-end WebRTC libraries: SimpleWebRTC / EasyRTC / any other APIs?
Signaling mechanism: what should we use, and how do we connect it with WebRTC?
Backend: Node.js or Django?
I am still confused about the signaling protocols and STUN/TURN servers, since we have to define the servers ourselves. SimpleWebRTC does not provide ones we can use in production.
Any help would be much appreciated!
I just started a video calling and chat application as well. I went with open-easyrtc and have had no problems so far; their demo just works after npm install.
As for signaling servers, since I just started I haven't concerned myself much with them, but as far as I can tell they are used for exchanging information like video metadata, network information, etc. open-easyrtc comes with public STUN and TURN servers; I'm not sure about their limitations, especially if you're going to have a lot of users.
It's also possible to deploy your own; I'm looking at learning more about coturn once I finish developing my application.
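For reference, STUN/TURN servers are ultimately just configuration passed to the browser's RTCPeerConnection as an iceServers list; a minimal sketch (the URLs and credentials below are placeholders, not real servers):

```javascript
// Browser-side sketch: STUN for address discovery, TURN as a relay fallback.
const pc = new RTCPeerConnection({
  iceServers: [
    { urls: 'stun:stun.example.org:3478' },
    { urls: 'turn:turn.example.org:3478', username: 'demo', credential: 'secret' },
  ],
});

// Attach the local camera/microphone stream before creating an offer.
navigator.mediaDevices.getUserMedia({ video: true, audio: true })
  .then((stream) => stream.getTracks().forEach((track) => pc.addTrack(track, stream)));
```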
You can use simple-peer, a simple library for WebRTC. Here is an example project with multiple users, DEMO.
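For context, a minimal simple-peer exchange in the browser looks roughly like this; sendToOtherPeer and onSignalFromOtherPeer are placeholders for whatever signaling channel you use (e.g. Socket.IO):

```javascript
// In the browser via a bundler; running this in Node would also require the wrtc package.
const Peer = require('simple-peer');

// One side creates the connection as the initiator.
const peer = new Peer({ initiator: true, trickle: false });

// simple-peer emits 'signal' with offer/answer/ICE data that must be delivered
// to the other peer over your own signaling channel.
peer.on('signal', (data) => sendToOtherPeer(data));      // placeholder function
onSignalFromOtherPeer((data) => peer.signal(data));      // placeholder function

// Once connected, send data or handle incoming media streams.
peer.on('connect', () => peer.send('hello over webrtc'));
peer.on('stream', (remoteStream) => {
  // e.g. attach remoteStream to a <video> element
});
```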

How can I transfer messages between a chatbot and a web application?

Hi Stack Overflow community,
For my master's thesis I am currently looking for a suitable messaging protocol, message broker, or middleware that can be used to exchange messages instantly between a chatbot (created with the SAP Conversational AI Framework and intended to serve as a fallback channel) and a custom-developed SAPUI5 web application. You can think of the whole thing as a live chat between a customer and a customer service employee.
The SAP Conversational AI Framework supports webhooks, so I can connect a Node.js application, for example. The only limitation is that the webhook URL must start with "https", which virtually rules out a plain WebSocket server.
Would I have to develop such an interface myself, or are there already libraries/frameworks that meet my requirements?
I am looking forward to your feedback.
Best regards
Hmm, I have never used the framework, but I did some research and this is what I found:
https://github.com/SAPConversationalAI/webchat
It enables you to deploy your bot directly on your website.
SAP Docs here:
https://cai.tools.sap/docs/concepts/webchat
I suggest using React to develop the frontend. You can use the webchat React component like this:
https://github.com/SAPConversationalAI/webchat#react-component
If you use it that way, you don't need a backend; you only need to serve the JS files to the browser. For more details you can ask me in the comments.
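If you do end up building a relay yourself (as the question considers), a minimal sketch of an HTTPS-facing webhook endpoint that forwards bot messages to connected browsers via Socket.IO could look like this; the endpoint path, event names, and payload shape are assumptions, not the actual SAP Conversational AI format:

```javascript
const express = require('express');
const { createServer } = require('http');
const { Server } = require('socket.io');

const app = express();
app.use(express.json());

const httpServer = createServer(app); // put this behind an HTTPS reverse proxy in practice
const io = new Server(httpServer);

// Webhook endpoint the chatbot platform calls over HTTPS.
// The payload shape here is a placeholder, not the real SAP CAI format.
app.post('/webhook', (req, res) => {
  io.emit('bot-message', req.body); // push the message to all connected web clients
  res.sendStatus(200);
});

// Web app clients (e.g. the SAPUI5 frontend) connect here and can send replies back.
io.on('connection', (socket) => {
  socket.on('user-message', (msg) => {
    // forward msg to the bot / customer-service side as needed (left out here)
  });
});

httpServer.listen(3000);
```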

Building a chat application, NodeJS and Express - what should I use for media streaming?

I've previously built chat servers using Node.js (i.e. a central chat server with clients, no p2p), with Electron or just good old Express. I'd like to reuse as much of my old code as possible. Thus, the only missing piece of the puzzle for me is what to use to enable both public and private video/audio streaming. File sending isn't necessary.
Is there anything out there I can 'easily' drop into this model? I'm aware of Kurento and a few similar offerings, but these feel like overkill for how I'm hoping to work.
Update: I've had a few suggestions about WebRTC, which I'm open to, but plans for this app include automated moderation/content filtering of any video broadcasts and text. So I assume such a solution would need to treat the server as a 'hardcoded' peer somehow, so that it's fairly safe to assume it will see a copy of anything sent over the public chat network. Of course, for private communications this need not be the case. On the flip side, worst case, operating in a spoke topology is fine too.
You can start with the WebRTC samples:
https://webrtc.github.io/samples/
WebRTC is more or less the standard now for audio/video calls. It all works p2p with no server interaction.
The only thing you need to build is a signaling mechanism to connect two users. For this you can use/extend your Node.js chat app.
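As a sketch of what that signaling piece could look like on top of an existing Express + Socket.IO chat server (assuming `io` is your existing Socket.IO server instance; the event names are arbitrary):

```javascript
// Minimal WebRTC signaling relay added to an existing Socket.IO chat server.
// The server never touches the media; it only forwards offers, answers, and ICE candidates.
io.on('connection', (socket) => {
  socket.on('join-call', (roomId) => {
    socket.join(roomId);
    socket.to(roomId).emit('peer-joined', socket.id);
  });

  // Relay SDP offers/answers between peers in the same room.
  socket.on('webrtc-offer', ({ roomId, sdp }) =>
    socket.to(roomId).emit('webrtc-offer', { from: socket.id, sdp }));
  socket.on('webrtc-answer', ({ roomId, sdp }) =>
    socket.to(roomId).emit('webrtc-answer', { from: socket.id, sdp }));

  // Relay ICE candidates as they trickle in.
  socket.on('ice-candidate', ({ roomId, candidate }) =>
    socket.to(roomId).emit('ice-candidate', { from: socket.id, candidate }));
});
```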

Azure apps with WebRTC

Is it possible to get an app that uses WebRTC (acting like a "regular client", since WebRTC is a P2P protocol) to work as an Azure App?
It seems that Azure Apps are too restrictive with their ports for WebRTC to work as intended.
Mostly I ask because such an application has already been developed and tested locally, but it silently fails when ported to an Azure App. The library used for WebRTC communications (IceLink) gets to the point where it should decide whether the link to a peer is up or not, but just stops there: it doesn't call the expected callback, issue an error, log anything, etc.
The creators of the library I'm using were able to answer this:
You can run on Azure, but you have to use a dedicated VM. The cloud/role/app offering does not support the ports/protocols required for P2P communications.
http://support.frozenmountain.com/hc/communities/public/questions/203604033-IceLink-application-on-an-Azure-App#answer-205871726

Node.js/SignalR Communication

I have a server running SignalR and another server running Node.js. I would like these two servers to communicate using SignalR. Is this possible?
I'm thinking I can use the SignalR client JavaScript library to connect to the SignalR server from Node.js, but I can't find any good examples of how to do this.
Well, the answer to "can you do this" is ultimately "yes", because there is nothing proprietary about SignalR communication. It's just variations of HTTP or WebSockets, with a custom handshake/message-framing protocol for Hubs on top of that.
So the "how" is the question we'd need to answer. Out of the box SignalR provides a client-side JavaScript library based on the jQuery plug-in architecture, but that won't help Node.js. Someone has started a Node.js implementation here on GitHub, but according to the ReadMe it only supports HTTP long polling today. I'm unaware of any other implementations at this time, but you could obviously start with that one and fork it to add support for the other transports if you wanted.
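For completeness, the newer @microsoft/signalr npm package can also act as a client from Node.js; a minimal sketch, assuming a hub at /chathub that exposes SendMessage and pushes ReceiveMessage (the URL and method names are assumptions):

```javascript
const signalR = require('@microsoft/signalr');

// Connect to a SignalR hub from Node.js.
const connection = new signalR.HubConnectionBuilder()
  .withUrl('https://your-signalr-server/chathub') // placeholder URL
  .withAutomaticReconnect()
  .build();

// Handle messages pushed from the hub.
connection.on('ReceiveMessage', (user, message) => {
  console.log(`${user}: ${message}`);
});

connection.start()
  .then(() => connection.invoke('SendMessage', 'node-client', 'hello from Node.js'))
  .catch((err) => console.error('SignalR connection failed:', err));
```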
