Good Evening,
I'm testing AS3 + Node.js communication as a way into multiplayer games. I'm very experienced in Flash, but pretty new to Node.js.
The problem is that the data Node sends differs from what Flash receives.
Take the following working Node.js code (I'm not asking whether it's right or wrong, or for "best practice"; I'm testing different things out). Look specifically at the 'clients' object and the "data" event handler:
var net = require('net');
var mySocket;
var clients = {
    '0': 'myTest'
};

var server = net.createServer(function (socket) {
    mySocket = socket;
    mySocket.on("data", function (data) {
        var myData = data + " -- " + clients[0];
        console.log("Data=" + myData);
        mySocket.write(myData);
    });
});

server.listen(3000, "127.0.0.1");
When "hello world" data is sent to the server the expected output is this:
Console:
"Data=hello world--myTest"
Flash:
"Data Received: [hello world--myTest]"
The console outputs the information I'm expecting; however, Flash outputs:
"Data Received: [ -- myTesthello world]"
A few snippets from my AS3 connection class are below:
public function createConnection():void {
    this.currStatus = "Connecting..";
    this.mySocket = new XMLSocket("localhost", 3000);
    this.mySocket.addEventListener(DataEvent.DATA, onReceiveData);
}
For handling data that is received, we just trace it for now:
private function onReceiveData(evt:DataEvent):void {
    // We have received some data from the server. Act upon it...
    // Not sure yet what it will do with the data; just trace for now.
    trace("Data Received: [" + evt.data + "]");
}
If anyone can point out why the data is in a different order when received in Flash, it would be a good learning point. As I said, I'm very new to Node.js, so there may be something I'm missing. (I'm aware there is no .on("connect") handler; I took it out to test without it.)
Thanks in advance.
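For readers hitting the same symptom, one likely culprit (an assumption, not something confirmed in the post): XMLSocket terminates every message with a null byte (0x00), in both directions. The Node server therefore receives "hello world\0", and concatenating onto that string leaves the null byte in the middle of the reply, which confuses Flash's message framing. A minimal sketch of handling this on the Node side:

var net = require('net');

var server = net.createServer(function (socket) {
    socket.on("data", function (data) {
        // XMLSocket appends "\0" to every message, so strip it first.
        var text = data.toString().replace(/\0/g, "");
        var myData = text + " -- myTest";
        console.log("Data=" + myData);
        // Terminate the reply with "\0" so Flash sees one complete message.
        socket.write(myData + "\0");
    });
});

server.listen(3000, "127.0.0.1");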
Related
I'm a total beginner with Node-RED and Node.js. I'm trying to write Node.js code in a "daemon node" to handle my payload before sending it via MQTT to my Node.js server. My problem is that I can't get my payload from stdin.
I tried everything I found online about reading from stdin, but I didn't find a solution.
"use strict"
let mqtt = require("mqtt");
let client = mqtt.connect("mqtt://192.168.178.36");
let obj = process.stdin;
console.log(obj);
client.on('connect', () => {
console.log("Sending...")
client.publish("test/reader01", "Reader01: " + (new Date()).toString() + "\n" + obj);
client.end();
});
The program you see here sends the current date to the server and prints a string including a net.Socket object to the console, but I can't get the payload from my stream.
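For reference, a minimal sketch of how the payload could be collected from stdin first (hedged: it assumes the upstream node writes the payload to this process's stdin and then closes the stream; the broker address and topic are taken from the snippet above):

"use strict";
let mqtt = require("mqtt");

// stdin is a stream, so the payload arrives in chunks via events
// rather than being available as a plain value on process.stdin.
let chunks = [];
process.stdin.on("data", (chunk) => chunks.push(chunk));
process.stdin.on("end", () => {
    let payload = Buffer.concat(chunks).toString();
    let client = mqtt.connect("mqtt://192.168.178.36");
    client.on("connect", () => {
        console.log("Sending...");
        client.publish("test/reader01",
            "Reader01: " + new Date().toString() + "\n" + payload);
        client.end();
    });
});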
I'm trying to read data from an Arduino using serialport and serve it to a web browser.
Without the webserver (i.e. if I just leave out the 'listen' call at the end), the serial data gets constantly streamed in, with the expected 5 updates per second shown in the console.
But when I add the 'listen' call, nothing is shown on the console until I make a request to the server with my web browser, at which point the console gets at most one log entry added (and sometimes still nothing).
The data shown in the web browser is the 'old' data from whenever the last request was made, not the current latest data from the Arduino. In other words, the serial data is processed a little after each HTTP request is served, which is not very useful.
const http = require('http');
const serialport = require('serialport');

var serial = new serialport('/dev/ttyUSB0', {
    baudRate: 115200
});

var jsonStr = '';
var jsonObj = {};

function handleData(data) {
    jsonStr += data;
    if (data.indexOf('}') > -1) {
        try {
            jsonObj = JSON.parse(jsonStr);
            console.log(jsonObj);
        }
        catch (e) {}
        jsonStr = '';
    }
}

serial.on('data', function (data) {
    handleData(data);
});

const app = http.createServer((request, response) => {
    response.writeHead(200, {"Content-Type": "text/html"});
    response.write(JSON.stringify(jsonObj));
    response.end();
});

app.listen(3000);
(The data coming from the Arduino is already a JSON string, which is why I'm looking for a '}' to start parsing it.)
I also tried using the 'readable' event for getting the serial data, but it makes no difference:
serial.on('readable', function () {
    handleData(serial.read());
});
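As an aside: if the Arduino terminates each JSON message with a newline, serialport's delimiter parser can take over the framing instead of scanning for '}'. A sketch, assuming the @serialport/parser-readline package that ships alongside recent serialport versions:

const serialport = require('serialport');
const Readline = require('@serialport/parser-readline');

var serial = new serialport('/dev/ttyUSB0', { baudRate: 115200 });
var jsonObj = {};

// The parser buffers raw chunks and emits one complete line at a time,
// so each 'data' event should carry exactly one JSON message.
var parser = serial.pipe(new Readline({ delimiter: '\n' }));
parser.on('data', function (line) {
    try {
        jsonObj = JSON.parse(line);
        console.log(jsonObj);
    } catch (e) {
        // Ignore partial or malformed lines.
    }
});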
If I understand it correctly, the listen call itself is not blocking, it merely registers an event listener/callback to be triggered later. As an accepted answer in a related question says: "Think of server.listen(port) as being kinda similar to someElement.addEventListener('click', handler) in the browser."
If node.js is single threaded then why does server.listen() return?
So why is that 'listen' preventing the serial connection from receiving anything, except for briefly each time a request is served? Is there no way I can use these two features without them interfering with each other?
I discovered that the code worked as expected on a different computer, even though the other computer was using the exact same operating system (Fedora 20) and the exact same version of Node.js (v10.15.0), installed in the exact same way (built from source).
I also found that it worked fine on the original computer with a more recent version of Fedora (29).
This likely points to some slight difference in USB/serial drivers, which I don't have the time, knowledge, or need to delve into. I'll just use the configurations I know will work.
I want to send a video and later display it in the browser. I have an application that takes the video from my camera, converts it to bytes, and then sends it over UDP to a specific port.
Now I have the Node.js script that already receives the bytes:
var PORT = 19777;
var MULTICAST_GROUP = "224.0.0.251";
var dgram = require("dgram");

var payload = Buffer.from('A wild message appears');
var client = dgram.createSocket("udp4");

client.on("message", function (message, rinfo) {
    console.log("received: ", message, rinfo);
});

client.on("listening", function () {
    console.log("listening on ", client.address());
    client.setBroadcast(true);
    client.setTTL(64);
    client.setMulticastTTL(64);
    client.setMulticastLoopback(true);
    client.addMembership(MULTICAST_GROUP);
    client.send(payload, 0, payload.length, PORT, MULTICAST_GROUP, function (err, bytes) {
        console.log("err: " + err + " bytes: " + bytes);
        // client.close();
    });
});

client.on("close", function () {
    console.log("closed");
});

client.on("error", function (err) {
    console.log("error: ", err);
});

client.bind(PORT);
and after running that, I see the packets in the console.
I need to process each packet (e.g. check whether it is from the current frame), which hopefully reconstructs the video stream on the other side, and then somehow send it to the client's browser. At this point I don't even know how to process the packets themselves: where and how they are stored after being received, whether I can send them on to the browser, how to play them in the browser (HTML5?), etc.
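One common starting point is to relay each UDP datagram to browsers over a WebSocket; a minimal sketch using the ws package (hedged: the WebSocket port is an arbitrary choice, and forwarding datagrams as-is is an assumption, since reassembling frames from a custom codec is protocol-specific):

var dgram = require("dgram");
var WebSocket = require("ws");

// Browsers cannot receive raw UDP, so a WebSocket server acts as the bridge.
var wss = new WebSocket.Server({ port: 8080 });

var udp = dgram.createSocket("udp4");
udp.on("message", function (message, rinfo) {
    // Each datagram is a Buffer; here it is forwarded untouched.
    // Real code would inspect/reassemble frames before forwarding.
    wss.clients.forEach(function (client) {
        if (client.readyState === WebSocket.OPEN) {
            client.send(message);
        }
    });
});
udp.bind(19777);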
I'm a total beginner when it comes to Node.js and the like, so I've read a lot during the last few hours, and I found an article that recommends Kaazing WebSocket Gateway, but I'm not sure whether it's still an accepted standard for this. I also found WebRTC, but I'm not sure whether I can use it with a custom stream (since I'm not using any popular video codecs).
Could anyone give me a hint on how to proceed? I found a previous Stack Overflow question, but the solution provided there is no longer online. I'm completely lost, and I will appreciate any help from you guys.
Thanks!
I am writing a webapp using Express.js.
My webapp does the following:
User posts 100 json objects
Each json object is processed via a service call
Once the service call is completed, a session variable is incremented
Once the session variable is incremented, a server-sent event must be sent to the client to update the progress bar
How do I listen for a change to a session variable and trigger a server-sent event?
Listening for a variable change is not the only solution I'm after; what I need is to send a server-sent event once each JSON object is processed.
Any appropriate suggestion is welcome
Edit (based on Alberto Zaccagni's comment)
My code looks like this:
function processRecords(cmRecords, requestObject, responseObject)
{
    for (var index = 0; index < cmRecords.length; index++)
    {
        post_options.body = cmRecords[index];
        request.post(post_options, function (err, res, body)
        {
            if (requestObject.session.processedcount)
                requestObject.session.processedcount = requestObject.session.processedcount + 1;
            else
                requestObject.session.processedcount = 1;

            if (err)
            {
                appLog.error('Error Occurred %j', err);
            }
            else
            {
                appLog.debug('CMResponse: %j', body);
            }

            var percentage = (requestObject.session.processedcount / requestObject.session.totalCount) * 100;
            responseObject.set('Content-Type', 'text/event-stream');
            responseObject.json({'event': 'progress', 'data': percentage});
        });
    }
}
When the first record is updated, a server-sent event is triggered using the responseObject (the Express response object).
When the second record is updated and I try triggering a server-sent event using the same responseObject, I get an error saying headers cannot be set on a response that has already been sent.
It's hard to know exactly what the situation is without seeing the routes/actions you have in your main application...
However, I believe the issue you are running into is that you are trying to send two sets of headers to the client (browser), which is not allowed. The reason this is not allowed is because the browser does not allow you to change the content type of a response after you have sent the initial response...as it uses that as an indicator of how to process the response you are sending it. You can't change either of these (or any other headers) after you have sent them to a client once (one request -> one response -> one set of headers back to the client). This prevents your server from appearing schizophrenic (by switching from a "200 Ok" response to a "400 Bad Request," for example).
In this case, on the initial request, you are telling the client "Hey, this was a valid request and here is my response (via the status of 200 which is either set elsewhere or being assumed by ExpressJS), and please keep the communication channel open so I can send you updates (by setting your content type to text/event-stream)".
As far as how to "fix" this, there are many options. When I've done this, I've used the pub/sub feature of redis to act as the "pipe" that connects everything up. So, the flow has been like this:
Some client sends a request to /your-event-stream-url
In this request, you set up your Redis subscriber. Anything that comes in on this subscription can be handled however you want. In your case, you want to "send some data down the pipe to the client in a JSON object with at least a data attribute." After you have set up this client, you just return a response of "200 Ok" and set the content type to "text/event-stream." Redis will take care of the rest.
Then, another request is made to another URL endpoint which accomplishes the task of "posting a JSON object" by hitting /your-endpoint-that-processes-json. (Note: obviously this request may be made by the same user/browser...but the application doesn't know/care about that)
In this action, you do the processing of their JSON data, increment your counters, or do whatever...and return a 200 response. However, one of the things you'd do in this action is "publish" a message on the Redis channel your subscribers from step #1 are listening to so the clients get the updates. Technically, this action does not need to return anything to the client, assuming the user will have some type of feedback based on the 200-status code or on the server-sent event that is sent down the pipe...
A tangible example I can give you is this gist, which is part of this article. Note that the article is a couple of years old at this point, so some of the code may have to be tweaked a bit. Also note this is not guaranteed to be anything more than an example (i.e. it has not been "load tested" or anything like that). However, it may help you get started.
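To make that flow concrete, here is a minimal sketch of the two endpoints (hedged: the route names are the placeholders from the steps above, and it assumes the classic callback-style node_redis API):

var express = require('express');
var redis = require('redis');

var app = express();

// Step 1: the event-stream endpoint subscribes and relays messages as SSE.
app.get('/your-event-stream-url', function (req, res) {
    res.writeHead(200, { 'Content-Type': 'text/event-stream' });
    var sub = redis.createClient();
    sub.subscribe('progress');
    sub.on('message', function (channel, message) {
        res.write('event: progress\n');
        res.write('data: ' + message + '\n\n');
    });
    // Clean up the subscriber when the client disconnects.
    req.on('close', function () { sub.quit(); });
});

// Step 2: the processing endpoint publishes an update per processed object.
var pub = redis.createClient();
app.post('/your-endpoint-that-processes-json', function (req, res) {
    // ...process the JSON object here...
    pub.publish('progress', '42'); // e.g. the current percentage
    res.sendStatus(200);
});

app.listen(3000);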
I came up with a solution. Please let me know if this is the right way to do it.
Will this solution work across sessions?
Server-side code
var events = require('events');
var progressEmitter = new events.EventEmitter();

exports.cleanseMatch = function (req, res)
{
    console.log('cleanseMatch invoked');

    var id = '';
    var i = 1;
    id = setInterval(function () {
        req.session.percentage = (i / 10) * 100;
        i++;
        console.log('PCT is: ' + req.session.percentage);
        progressEmitter.emit('progress', req.session.percentage);
        if (i == 11) {
            req.session.percentage = 100;
            clearInterval(id);
            res.json({'data': 'test'});
        }
    }, 1000);
}

exports.progress = function (req, res)
{
    console.log('progress invoked');
    // console.log('PCT is: ' + req.session.percentage);
    res.writeHead(200, {'Content-Type': 'text/event-stream'});
    progressEmitter.on('progress', function (percentage) {
        console.log('progress event fired for: ' + percentage);
        res.write("event: progress\n");
        res.write("data: " + percentage + "\n\n");
    });
}
Client-side code
var source = new EventSource('progress');
source.addEventListener('progress', function (e) {
    var percentage = JSON.parse(e.data);
    // Update the progress bar in the client.
    App.updateProgressBar(percentage);
}, false);
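One caveat with the progress handler above: each request attaches a new 'progress' listener to the shared emitter and never removes it, so listeners accumulate and res.write can end up being called on responses that have already closed. A minimal sketch of detaching the listener when the client disconnects:

exports.progress = function (req, res)
{
    res.writeHead(200, { 'Content-Type': 'text/event-stream' });
    function onProgress(percentage) {
        res.write("event: progress\n");
        res.write("data: " + percentage + "\n\n");
    }
    progressEmitter.on('progress', onProgress);
    // Remove this client's listener once it disconnects.
    req.on('close', function () {
        progressEmitter.removeListener('progress', onProgress);
    });
}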
I am using Socket.io to stream live tweets to my users using Twitter's Streaming API (my implementation is more or less based on this tutorial).
The problem is that every time a connection event is fired by Socket.io, the newly connected client causes every other client connected to the server to stop updating. It would take too long to go through all the hacks I tried, but I played with it enough to believe the problem is caused by Socket.io's multiplexing of connections from multiple clients (enabled by default), a performance boost that lets multiple clients or connections share the same underlying socket. In short, I believe this is the case because I don't think it would be possible for new connections to affect older connections in this manner without the connection multiplexing: if a new, independent connection with its own underlying (TCP) socket were created every time a client connected, one connection would know nothing about another and couldn't affect any other client's state, as is currently happening. This also leads me to believe that simply disabling the multiplexing would be the simplest way around this problem, since I am not concerned about scaling; Node.js already handles all the concurrency I'm likely to need more than adequately.
I have gone through Socket.io's documentation but could not see where the ability to "demultiplex" the connections is exposed via the API, so if anyone knows how to do this I'd greatly appreciate your response.
My code below is pretty standard and simple. But just to be clear: whenever a new client connects to Socket.io, every other client stops receiving new tweets, and updates are no longer pushed to the older clients unless I refresh the browser, in which case the newly refreshed client begins receiving fresh tweets again while the other still-connected clients stop updating.
Server-side Code:
// Code also uses ntwitter (https://github.com/AvianFlu/ntwitter) as an abstraction over Twitter's Streaming API
io.sockets.on('connection', function (socket) {
    tweet.stream('statuses/filter', { track: 'new orleans' }, function (stream) {
        stream.on('data', function (data) {
            // The following lines simply pre-process data sent from Twitter so junk isn't
            // unnecessarily sent to the client.
            if (data.user) {
                var tweets = {
                    text: data.text,
                    image: data.user.profile_image_url,
                    user: data.user.name
                };
                var t = JSON.stringify(tweets);
                console.log(t);
                socket.send(t);
            }
        });
    });
});
Client-Side Code
// Notice the option that I passed in as the second argument. This supposedly forces every
// new client to create a new connection with the server but it either doesn't work or I'm
// implementing it incorrectly. It is the very last configuration option listed in the
// documentation linked to above.
var socket = io.connect('http://' + location.host, {'force new connection' : true });
socket.on('message', function (tweet) {
    var t = JSON.parse(tweet);
    if (t.image) {
        $('.hero-unit').prepend('<div class="media"><a class="pull-left" href="#"><img class="media-object" alt="64x64" style="width: 64px; height: 64px;" src="' + t.image + '"></a><div class="media-body"><h4 class="media-heading">' + t.user + '</h4>' + t.text + '</div></div>');
    }
});
If I am thinking of this incorrectly or if there's something wrong with my code I am definitely open to any suggestions. I'd also be happy to reply with any additional details.
I would try something like this:
Server-side:
io.sockets.on('connection', function (socket) {
    // Other connection-y goodness here.
});
tweet.stream('statuses/filter', { track: 'new orleans' }, function (stream) {
    stream.on('data', function (data) {
        // The following lines simply pre-process data sent from Twitter so junk isn't
        // unnecessarily sent to the client.
        if (data.user) {
            var tweets = {
                text: data.text,
                image: data.user.profile_image_url,
                user: data.user.name
            };
            var t = JSON.stringify(tweets);
            console.log(t);
            io.sockets.emit("tweet", t);
        }
    });
});
Client-side:
var socket = io.connect('http://' + location.host, {'force new connection' : true });
socket.on('tweet', function (tweet) {
    var t = JSON.parse(tweet);
    if (t.image) {
        $('.hero-unit').prepend('<div class="media"><a class="pull-left" href="#"><img class="media-object" alt="64x64" style="width: 64px; height: 64px;" src="' + t.image + '"></a><div class="media-body"><h4 class="media-heading">' + t.user + '</h4>' + t.text + '</div></div>');
    }
});
Basically, keep the stream from Twitter outside your socket handler, and on each new tweet emit a message to everyone connected. This way a single Twitter stream is shared by all clients instead of a new stream being opened per connection.