kurento-utils, multiple WebRtcPeers in one client - node.js

I have an application where I share both my desktop and my webcam with my Kurento server (two different endpoints in the same pipeline), start a recording endpoint for each, and then alert the client on the other end that both streams are ready to consume.
My issue comes from having two WebRtcPeerRecvonly peers on my client: if one doesn't finish negotiating before the other makes its request to consume, I get either two videos of the desktop endpoint or two videos of the webcam endpoint.
My webcam peer:
initWebcamUser(id) {
    let options = {
        onicecandidate: (candidate) => {
            socket.emit('onWebcamIceCandidate', {
                candidate : candidate,
                socket_id : id,
            });
        }
    };
    webRtcWebcamPeer = new kurentoUtils.WebRtcPeer.WebRtcPeerRecvonly(options, function(error) {
        if (error) return console.error(error);
        this.generateOffer((error, offerSdp) => {
            socket.emit('viewerWebcam', {
                sdpOffer : offerSdp,
                socket_id : id
            });
        });
    });
}
And my desktop peer:
initDesktop(socket_id) {
    let options = {
        onicecandidate: (candidate) => {
            socket.emit('onDesktopIceCandidate', {
                candidate : candidate,
                socket_id : socket_id,
            });
        }
    };
    webRtcDesktopPeer = new kurentoUtils.WebRtcPeer.WebRtcPeerRecvonly(options, function(error) {
        if (error) return console.error(error);
        this.generateOffer((error, offerSdp) => {
            socket.emit('viewerDesktop', {
                sdpOffer : offerSdp,
                socket_id : socket_id
            });
        });
    });
}
I've come to the conclusion that they're both sharing the same kurentoUtils.WebRtcPeer, because if I delay invoking initDesktop until 2 seconds after calling initWebcamUser, I get the correct streams 100% of the time.
I guess this boils down to the question: is there any way to do this concurrently? Or should I set up a promise-based system that waits for the webcam peer to be complete? If so, where in this process would I hook that in, when ICE candidates are added?
Edit: I feel it's important to note that I assign these peers to their respective 'participants' in webcamViewerResponse and desktopViewerResponse, so they are no longer referenced through the temporary webRtcWebcamPeer/webRtcDesktopPeer variables when I'm having this issue.
Thanks in advance.

If anyone is looking for an answer to this, I found a solution. It's not the most elegant, but it works 100% of the time.
endpoint.on('OnIceComponentStateChanged', function(event) {
    if (event.state === 'CONNECTED') {
        // resolve your promise to continue on with your next connection here
    }
});
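To make the sequencing concrete, here is a rough sketch of how that event can gate the second connection on the server side. This is only an illustration, not the poster's exact code: waitForConnected is a hypothetical helper, webcamEndpoint/desktopEndpoint are assumed to be the pipeline's WebRtcEndpoint instances, and the startWebcamViewer/startDesktopViewer socket events are invented names for whatever tells the client to run initWebcamUser/initDesktop.

// Hypothetical helper: resolves once the endpoint's ICE transport connects.
function waitForConnected(endpoint) {
    return new Promise((resolve) => {
        endpoint.on('OnIceComponentStateChanged', function (event) {
            if (event.state === 'CONNECTED') {
                resolve();
            }
        });
    });
}

// Start the webcam viewer first; only signal the client to start the
// desktop viewer once the webcam transport has connected.
socket.emit('startWebcamViewer');
waitForConnected(webcamEndpoint).then(() => {
    socket.emit('startDesktopViewer');
});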

Related

Implementing this Node.js blockchain on a network

I am looking at this blockchain from Fireship: https://github.com/fireship-io/node-blockchain/blob/main/index.ts
The Blockchain is simple enough. There are many similar examples of blockchain implementations, but I don't really see any that are actually used in a network. I am trying to get some footing for implementing this, at least with 2 users to start.
The thing I'm confused about at the moment is what actually gets shared between users? Is it the chain itself? Just new transactions? When a user does Wallet.sendMoney(5, satoshi.publicKey), it would update the local wallet, but then what? I'm guessing send the transaction to others on the network, but then each copy of the blockchain adds/verifies independently. This seems problematic because some of the transactions could get lost (internet outage or whatever), which makes me wonder if the whole blockchain gets sent, yet this seems unwieldy.
"The thing I'm confused about at the moment is what actually gets shared between users?"
You mean between nodes, not "users". All the nodes should have the same chain and transactions, so you have to implement a pub/sub system that listens for certain events and also publishes transactions and the chain. Using Redis, you can create a class; I've explained it in the code below:
// Note that Redis stores strings.
const redis = require("redis");

// Create two channels to transfer data.
const CHANNELS = {
  BLOCKCHAIN: "BLOCKCHAIN",
  TRANSACTION: "TRANSACTION",
};

// Unlike raw sockets, we do not need to know the addresses of other nodes.
// --------------------- HERE IS THE BROADCASTING STATION ---------------------
// It lets multiple processes communicate over channels.
class PubSub {
  constructor({ blockchain, transactionPool }) {
    // Each node can broadcast its chain and replace its own with a valid one.
    this.blockchain = blockchain;
    this.transactionPool = transactionPool;
    this.publisher = redis.createClient();
    this.subscriber = redis.createClient();
    this.subscribeToChannels();
    this.subscriber.on("message", (channel, message) =>
      this.handleMessage(channel, message)
    );
  }

  // We listen on all channels.
  subscribeToChannels() {
    Object.values(CHANNELS).forEach((channel) => {
      this.subscriber.subscribe(channel);
    });
  }

  publish({ channel, message }) {
    // We unsubscribe first so we don't send the message to ourselves,
    // then subscribe again to keep receiving messages.
    this.subscriber.unsubscribe(channel, () => {
      this.publisher.publish(channel, message, () => {
        this.subscriber.subscribe(channel);
      });
    });
  }

  // ------ THIS IS WHERE BROADCASTS ARE HANDLED -------------
  handleMessage(channel, message) {
    const parsedMessage = JSON.parse(message);
    switch (channel) {
      case CHANNELS.BLOCKCHAIN:
        this.blockchain.replaceChain(parsedMessage, true, () => {
          // We need to clear the local transaction pool because we got a new chain.
          this.transactionPool.clearBlockchainTransactions({
            chain: parsedMessage,
          });
        });
        break;
      case CHANNELS.TRANSACTION:
        this.transactionPool.setTransaction(parsedMessage);
        break;
      default:
        return;
    }
  }

  broadcastChain() {
    this.publish({
      channel: CHANNELS.BLOCKCHAIN,
      message: JSON.stringify(this.blockchain.chain),
    });
  }

  broadcastTransaction(transaction) {
    this.publish({
      channel: CHANNELS.TRANSACTION,
      message: JSON.stringify(transaction),
    });
  }
}

module.exports = PubSub;
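For context, a node would wire this up roughly as follows. This is a minimal usage sketch: the Blockchain and TransactionPool classes (with the replaceChain, clearBlockchainTransactions, and setTransaction methods used above) and their file paths are assumed to come from the rest of the project.

// Minimal usage sketch; module paths are hypothetical.
const Blockchain = require("./blockchain");
const TransactionPool = require("./transaction-pool");
const PubSub = require("./pubsub");

const blockchain = new Blockchain();
const transactionPool = new TransactionPool();
const pubsub = new PubSub({ blockchain, transactionPool });

// When this node mines a block, push the whole chain to peers; when a
// wallet creates a transaction, push just that transaction. Each peer
// validates what it receives independently in handleMessage.
// pubsub.broadcastChain();
// pubsub.broadcastTransaction(transaction);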

Is it a good idea to chain multiple service calls in Angular 7

I am creating a basic MEAN app, and while working on it I have to call multiple services one after another. For example, while placing an order:
PlaceOrder() {
    productService.CheckAvailability().subscribe(() => {
        if (available) {
            customerService.GetCustomer().subscribe(() => {
                if (newCustomer) {
                    customerService.CreateCustomer().subscribe(() => {
                        orderService.CreateOrder().subscribe(() => {
                            console.log("Order placed");
                        });
                    });
                } else { // old customer
                    orderService.CreateOrder().subscribe(() => {
                        console.log("order placed");
                    });
                }
            });
        } // end if
    });
}
Now, I am wondering whether chaining these service calls this way is good application design, or whether it is going to hurt the application's speed and efficiency. The same goes for the Node.js server, where I have sequential DB updates.
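For what it's worth, nested subscribes like this are usually flattened with RxJS higher-order mapping operators instead. A minimal sketch, assuming each service method returns an Observable, that CheckAvailability() emits an availability flag, and that GetCustomer() emits null when the customer doesn't exist yet (those response shapes are invented for illustration):

import { EMPTY, of } from 'rxjs';
import { switchMap } from 'rxjs/operators';

PlaceOrder() {
    productService.CheckAvailability().pipe(
        // EMPTY completes without emitting, so nothing downstream
        // runs when the product is unavailable.
        switchMap(available => available ? customerService.GetCustomer() : EMPTY),
        // Reuse the existing customer, or create one first.
        switchMap(customer => customer ? of(customer) : customerService.CreateCustomer()),
        switchMap(() => orderService.CreateOrder())
    ).subscribe(() => console.log("Order placed"));
}

The calls still run sequentially, so this doesn't change the network cost of the nested version; it mainly keeps error handling and unsubscription in a single chain.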

How to fetch list of participants in a twilio conference after joining a new agent

I'm working on a Twilio conference call scenario. My code snippet adds a supervisor to a live conference call. Now I want to send the client callSid, agent callSid, and supervisor callSid to the front end, using the code below. According to my scenario, three callSids should be returned; however, in some cases only two come back. I don't know why, but I've handled that situation with an if/else structure. The remaining issue is that the supervisor sid is always returned with the correct value, but agentCallSid and clientCallSid are switched in some cases. I need guidance on an alternative method; the code snippet is given below.
const joinConference = (conferenceSid, supervisorName, callback) => {
    let participantsArray = [];
    client.conferences(conferenceSid)
        .participants
        .create({
            from: '+18xxxxxxxxx',
            to: `client:${supervisorName}`
        })
        .then(participant => {
            const supervisorCallSid = participant.callSid;
            console.log(`Supervisor Call Sid in Join conference is ${supervisorCallSid}`);
            client.conferences(conferenceSid)
                .participants
                .list((error, results) => {
                    if (error) {
                        console.error(error);
                        return;
                    }
                    participantsArray = results.map((participantsResult, index) => {
                        return {
                            [`agent${index}CallSid`]: participantsResult.callSid
                        };
                    });
                    console.log(participantsArray);
                    let result = {};
                    if (participantsArray.length > 2) {
                        result = {
                            "supervisorCallSid": participantsArray[0].agent0CallSid,
                            "clientCallSid": participantsArray[1].agent1CallSid,
                            "agentCallSid": participantsArray[2].agent2CallSid
                        };
                    } else {
                        result = {
                            "supervisorCallSid": supervisorCallSid,
                            "clientCallSid": participantsArray[0].agent0CallSid,
                            "agentCallSid": participantsArray[1].agent1CallSid
                        };
                    }
                    callback(result);
                });
        })
        .done();
}
The issue is that the agentCallSid and clientCallSid positions change every time the function is called. Is it an issue with map?
Twilio developer evangelist here.
First up, you sometimes get the supervisor call sid and sometimes not because your API call to list participants races against the supervisor actually joining the conference. I'd rely on the sid that is returned from the call creation API request rather than on the list of participants.
Listing the participants in the call does not guarantee an order, so you may find that the agent and the caller are switched on different calls. In this case, I'd store the agent and caller call sids somewhere when the call is started. You can then refer to your own record of who is the agent and who is the caller, rather than reach out to the API.
I don't know much more about your setup, so I can't suggest anything to specifically help your problem. I hope this sets you in the right direction though.
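As a rough illustration of that suggestion (the storage map and helper names here are invented for the sketch, not Twilio APIs):

// Hypothetical in-memory record, keyed by conference sid. In a real app
// this would live in a database, populated from your call/status events.
const conferenceRoles = {};

// When you create the agent and client legs, you already know each
// callSid, so record the roles up front instead of inferring them later.
function recordParticipant(conferenceSid, role, callSid) {
    conferenceRoles[conferenceSid] = conferenceRoles[conferenceSid] || {};
    conferenceRoles[conferenceSid][role + 'CallSid'] = callSid;
}

// Later, combine the stored roles with the supervisor sid returned by
// participants.create() instead of relying on list() ordering.
function buildResult(conferenceSid, supervisorCallSid) {
    return Object.assign(
        { supervisorCallSid: supervisorCallSid },
        conferenceRoles[conferenceSid]
    );
}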

NodeJS CouchDB Long Polling Debacle

I have a web app that is published via Express on Node.js, of course. It uses CouchDB as its data source. I implemented long polling to keep the app in sync at all times between all users. To accomplish this I use the following logic:
User logs into app and an initial long poll request is made to Node via an Express route.
Node in turn makes a long poll request to CouchDB.
When Couch is updated it responds to the request from Node.
Lastly Node responds to the browser.
Simple. What is happening, though, is that every fifth browser refresh freezes up. Huh? Very weird. But I can reproduce it over and over, even in my test environment: every fifth refresh, without fail, freezes up Node and the app with it. Restarting Node fixes the issue.
After much hair pulling I THOUGHT I solved it by changing this:
app.get('/_changes/:since*', security, routes.changes);
To this:
app.get('/_changes/:since*', security, function () { routes.changes });
However, after further testing, this just fails to run routes.changes at all, so it's no actual solution. Any ideas why long polling CouchDB from Node would do such a strange thing? On the fifth refresh I can set a breakpoint in Node on the first line of my routing code and it never gets hit. However, in the browser I can break on the long-polling request to Node and it does seem to go out. It's as if Node is not accepting the connection for some reason...
Should I be approaching long polling from Node to CouchDB in a different way? I'm using feed=longpoll; should I maybe be using feed=continuous? If I turn changes_timeout in CouchDB down to 5 seconds it doesn't get rid of the issue, but it does make it easier to cope with, since the freezes last 5 seconds at most. That would seem to indicate that Node can't handle having several outstanding requests to Couch. Maybe I will try a continuous feed and see what happens.
Browser:
self.getChanges = function (since) {
    $.ajax({
        url: "/_changes/" + since,
        type: "GET", dataType: "json", cache: false,
        success: function (data) {
            try {
                self.processChanges(data.results);
                self.lastSeq(data.last_seq);
                self.getChanges(self.lastSeq());
                self.longPollErrorCount(0);
            } catch (e) {
                self.longPollErrorCount(self.longPollErrorCount() + 1);
                if (self.longPollErrorCount() < 10) {
                    setTimeout(function () {
                        self.getChanges(self.lastSeq());
                    }, 3000);
                } else {
                    alert("You have lost contact with the server. Please refresh your browser.");
                }
            }
        },
        error: function (data) {
            self.longPollErrorCount(self.longPollErrorCount() + 1);
            if (self.longPollErrorCount() < 10) {
                setTimeout(function () {
                    self.getChanges(self.lastSeq());
                }, 3000);
            } else {
                alert("You have lost contact with the server. Please refresh your browser.");
            }
        }
    });
}
Node:
Routing:
exports.changes = function (req, res) {
    var args = {};
    args.since = req.params.since;
    db.changes(args, function (err, body, headers) {
        if (err) {
            console.log("Error retrieving changes feed: " + err);
            res.send(err.status_code);
        } else {
            // send my response... code removed here
        }
    });
}
Database long poll calls:
self.changes = function (args, callback) {
    console.log("changes");
    if (args.since == 0) {
        request(self.url + '/work_orders/_changes?descending=true&limit=1', function (err, res, headers) {
            var body = JSON.parse(res.body);
            var since = body.last_seq;
            console.log("Since change: " + since);
            self.longPoll(since, callback);
        });
    } else {
        self.longPoll(args.since, callback);
    }
}

self.longPoll = function (since, callback) {
    console.log("about to request with: " + since);
    request(self.url + '/work_orders/_changes?feed=continuous&include_docs=true&since=' + since,
        function (err, res, headers) {
            console.log("finished request.");
            if (err) { console.log("Error starting long poll: " + err.reason); return; } // if err, send it back
            callback(err, res.body);
        });
}
Socket.io will automatically fall back to long polling and doesn't have the problem you're having, so just use that. Also, for CouchDB changes, use https://github.com/iriscouch/follow, or maybe https://npmjs.org/package/changes as the other answer suggested.
It's very bad practice to reinvent things when popular modules already do what you need. There are currently more than 52,000 Node modules on https://npmjs.org/. People make a big deal about copying and pasting code; in my mind, reinventing basic stuff is even worse than that.
I know that with so many modules it's hard to know about all of them, so I'm not saying you can never solve the same problem as someone else. But take a look at npmjs.org first, and also at sites like http://node-modules.com/, which may give better search results.
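For reference, a minimal sketch of tailing the changes feed with the follow module mentioned above, reusing the work_orders database URL from the question (how you push changes to browsers is left out):

// Minimal sketch using the follow module from the answer above.
var follow = require('follow');

follow({
    db: self.url + '/work_orders',   // same database as in the question
    since: 'now',
    include_docs: true
}, function (err, change) {
    if (err) return console.log('Changes feed error: ' + err);
    // Called once per change; push it to connected browsers from here
    // (e.g. over Socket.io, as suggested above).
    console.log('Change ' + change.seq + ' for doc ' + change.id);
});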

Socket.io Duplicate clients on a namespace

Hi, I'm trying to use dynamic namespaces, creating them on demand.
It's working, except that I get duplicate (or more) clients for some reason.
Server side:
io.of("/" + group).on("connection", function (socket_group) {
    socket_group.groupId = group;
    socket_group.on("infos", function () {
        console.log("on the group !");
    });
    socket_group.on('list', function () {
        Urls.find({ 'group': '...' }).sort({ '_id': -1 }).limit(10).exec(function (err, data) {
            socket_group.emit('links', data);
        });
    });
    [...]
})
Client side:
socket.emit('list', { ... });
On the client side only one command is sent, but the server always responds with two or more responses. Every time I close/open my app, the response count increments.
Thanks if you figure it out.
This could be correct behavior, just like stacking event handlers in any other environment. To ensure you attach your function to the namespace only once, check for the namespace's presence first (or define this somewhere in your program that only runs once, as needed).
Try the condition (!io.namespaces["/" + group]).
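In other words, something along these lines; this sketch guards on the io.namespaces registry named above (in newer Socket.io versions the registry lives elsewhere, e.g. io.nsps in 1.x):

// Only create the namespace and attach the connection handler the first
// time this group is seen; later calls reuse the existing namespace.
if (!io.namespaces["/" + group]) {
    io.of("/" + group).on("connection", function (socket_group) {
        socket_group.groupId = group;
        // ... attach the 'infos' and 'list' handlers exactly once here
    });
}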
