Why is received WebSocket data coming out as a Buffer? - node.js

I'm trying to set up a very basic WebSocket, but I don't understand why I'm not getting a string back.
I'm using the ws module from npm for the server: https://github.com/websockets/ws
client:
let socket = new WebSocket('wss://upload.lospec.com');
socket.addEventListener('open', function (event) {
socket.send('test');
});
server:
const wss = new WebSocket.Server({ server });
wss.on("connection", function (ws) {
  ws.on("message", function (asdfasdf) {
    console.log("got new id from client", asdfasdf);
  });
});
server result:
got new id from client <Buffer 74 65 73 74>
I'm trying to follow the examples in the docs, as well as this tutorial: https://ably.com/blog/web-app-websockets-nodejs
But it's not coming out as a string, as both places promise.
Why isn't this coming out as a string?

You are probably using a different version of ws than the tutorial does. The tutorial appears to use a version older than v8, while you are using v8+.
From the changelog for 8.0.0:
Text messages and close reasons are no longer decoded to strings. They are passed as Buffers to the listeners of their respective events.
The listeners of the 'message' event now take a boolean argument specifying whether or not the message is binary (e173423).
Existing code can be migrated by decoding the buffer explicitly.
websocket.on('message', function message(data, isBinary) {
const message = isBinary ? data : data.toString();
// Continue as before.
});
websocket.on('close', function close(code, data) {
const reason = data.toString();
// Continue as before.
});
This also describes the solution.
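Applied to the server snippet from the question, a minimal sketch under ws v8+ could look like this (the require line is an assumption about how the module is loaded; server is the HTTP server from the question):

const WebSocket = require('ws'); // assumed import; adjust to your own setup

const wss = new WebSocket.Server({ server });
wss.on("connection", function (ws) {
  ws.on("message", function (data, isBinary) {
    // ws v8+ always hands the listener a Buffer; decode text messages explicitly
    const message = isBinary ? data : data.toString();
    console.log("got new id from client", message);
  });
});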
Alternatively, you can downgrade to version 7.5.0 of ws to be on par with what the tutorial uses:
npm i ws@7.5.0
Regarding the example "sending and receiving text data" in the library docs: I believe it's an oversight on their end that this example wasn't updated when v8 was released. You could open an issue on GitHub to let them know.

If you use console.log(`${data}`), it will not show <Buffer ...>, whereas console.log(data) will, because the template literal implicitly converts the Buffer to a string.
e.g.
ws.on('message', data => {
  console.log(`user sent: ${data}`);
});
This is not a bug; it is the expected behaviour since ws v8 passes messages as Buffers.
Alternative methods:
ws.on('message', data => {
  data = data.toString();
  console.log("user sent:", data);
});
OR
ws.on('message', data => {
  console.log("user sent:", data.toString());
});

In the latest version you need to use %s when printing the received message; the format specifier converts the Buffer to a string. Here is a simple code snippet from the official ws page on npm:
ws.on('message', function message(data) {
console.log('received: %s', data);
});

I was having the same issue. I found it was caused by the version of ws, so I installed another version.
Try this instead:
npm i ws@7.5.0
It worked in my case.

Related

Getting "Error [ERR_HTTP_HEADERS_SENT]: Cannot set headers after they are sent to the client" while using Axios

I am trying to use different Axios calls to get some data from a remote server. One by one the calls work, but as soon as I call them directly after each other it throws the error message about the headers. I did some research already and I guess it has something to do with the headers of the first call getting in the way of the second call. That is probably a very simplistic description of the problem, but I am new to Node.js and the way those Axios calls work.
This is an example of one of my Api calls:
app.get('/api/ssh/feedback', function(req, res){
conn.on('ready', function(){
try {
let allData = {}
var command = 'docker ps --filter status=running --format "{{.Names}}"'
conn.exec(command, function(err, stream){
if (err) throw console.log(err)
stream.on('data', function(data){
allData = data.toString('utf8').split('\n').filter(e=>e)
return res.json({status: true, info: allData})
})
stream.on('close', function(code){
console.log('Process closed with: ' + code)
conn.end()
})
stream.on('error', function(err){
console.log('Error: ' + err)
conn.end()
})
})
} catch (err) {
console.error('failed with: ' + err)
}
}).connect(connSet)
})
I am using Express as middleware and the ssh2 package to establish the connection with the remote server. As I mentioned before, the call works but crashes if it is not the first call. I am able to use the API again after I restart the Express server.
This is how I am calling the API through Axios in my Node.js frontend:
getNetworkStatus(e){
e.preventDefault()
axios.get('/api/ssh/network').then(res =>{
if(res.data.status){
this.setState({network_info: 'Running'})
this.setState({network: res.data.info})
} else {
this.setState({network_info: 'No Network Running'})
this.setState({network: 'No Network detected'})
}
}).catch(err => {
alert(err)
})
}
I would be really grateful for any help or advice on how to solve this problem. Thanks to everyone who spends some time to help me out.
There are two issues in the code you've provided:
You are making assumptions about 'data' events. In general, never assume the size of the chunks you receive in 'data' events. You might get one byte or you might get 1000 bytes. The event can be called multiple times as chunks are received and this is most likely what is causing the error. Side note: if the command is only outputting text, then you are better off using stream.setEncoding('utf8') (instead of manually calling data.toString('utf8')) as it will take care of multi-byte characters that may be split across chunks.
You are reusing the same connection object. This is a problem because you will continue to add more and more event handlers every time that HTTP endpoint is reached. Move your const conn = ... inside the endpoint handler instead. This could also be causing the error you're getting.
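Putting both points together, a minimal sketch of the endpoint could look like the following (this assumes ssh2's Client and the connSet options object from the question; the output variable and the 500 response are just illustrative):

const { Client } = require('ssh2');

app.get('/api/ssh/feedback', function (req, res) {
  // Create a fresh connection per request instead of reusing a shared one,
  // so event handlers don't pile up on the same object.
  const conn = new Client();
  conn.on('ready', function () {
    const command = 'docker ps --filter status=running --format "{{.Names}}"';
    conn.exec(command, function (err, stream) {
      if (err) {
        conn.end();
        return res.status(500).json({ status: false, error: String(err) });
      }
      let output = '';
      stream.setEncoding('utf8'); // handles multi-byte characters split across chunks
      stream.on('data', function (chunk) {
        output += chunk; // accumulate chunks; don't respond per chunk
      });
      stream.on('close', function (code) {
        console.log('Process closed with: ' + code);
        conn.end();
        if (!res.headersSent) {
          const allData = output.split('\n').filter(e => e);
          res.json({ status: true, info: allData });
        }
      });
      stream.on('error', function (err) {
        console.log('Error: ' + err);
        conn.end();
      });
    });
  }).connect(connSet);
});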

WebSocket interface cannot be used in place of WebSocket?

I'm following a guide to try and set up a WebSocket.Server using ws and Express 4 with NodeJS and TypeScript. The problem is that the guide I'm following (found here: https://morioh.com/p/3b302785a62f) seems to be out of date, because the code provided doesn't work.
I'm trying to use an extended websocket object to keep track of an alive connection. It looks like this:
interface ExtWebSocket extends WebSocket {
isAlive: boolean;
}
Now doing this, as the code in the tutorial shows, I get an error:
setInterval(() => {
wss.clients.forEach((ws: ExtWebSocket) => {
if (!ws.isAlive) return ws.terminate();
ws.isAlive = false;
ws.ping(null, false, true);
});
}, 10000);
The error states that I cannot use ExtWebSocket in the forEach loop, but I cannot figure out why.
It is because you can't implicitly treat a WebSocket as an ExtWebSocket, since ExtWebSocket extends WebSocket. Every ExtWebSocket is a WebSocket, but not every WebSocket is an ExtWebSocket, so TypeScript rejects a callback parameter declared with the narrower type. A common workaround is to assert the type inside the callback (for example const extWs = ws as ExtWebSocket) once you know the extra properties have been attached.
Here is a link to the typescript handbook on this topic: https://www.typescriptlang.org/docs/handbook/2/narrowing.html

IBM Watson Speech to Text API with node. How to output to DOM?

I am using an npm module to work with IBM's Watson to do speech to text. I'm using this package here: https://github.com/watson-developer-cloud/speech-javascript-sdk.
I can authenticate fine, but other than that nothing happens. I want to take the text from the response and insert it into the DOM. I tried the following, but I'm not getting any kind of feedback.
WatsonSpeech.SpeechToText.recognizeMicrophone({token: token, keepmic: true, ouputElement: "body"}).promise().then(function() {
console.log("talking");
})
The docs say the following for this method:
Other options passed to WritableElementStream if options.outputElement
is set.
And
Pipes results through a FormatStream by default, set options.format=false to disable.
I would think that the
WatsonSpeech.SpeechToText.recognizeMicrophone
would take a callback function so I can handle the response and insert it in my DOM, but I can't figure that out. Also, I'm not really a JS guy, so I don't know what the promise does.
Chapter 3 of "Zero to Cognitive" has exactly this code applied.
https://github.com/rddill-IBM/ZeroToCognitive
I recommend taking a look at his lessons on YouTube, but here is the code that I found.
function initPage () {
  var stream; // declared here so both click handlers share it
  var _mic = $('#microphone');
  var _stop = $("#stop");
  _mic.addClass("mic_enabled");
  _stop.addClass("mic_disabled");
  _mic.on("click", function () {
    if (this.className == "mic_enabled") {
      _mic.addClass("mic_disabled");
      _mic.removeClass("mic_enabled");
      _stop.addClass("mic_enabled");
      _stop.removeClass("mic_disabled");
      $.when($.get('/api/speech-to-text/token')).done(function (token) {
        stream = WatsonSpeech.SpeechToText.recognizeMicrophone({
          token: token,
          outputElement: '#speech' // CSS selector or DOM Element
        });
        stream.on('error', function (err) { console.log(err); });
      });
    }
  });
  _stop.on("click", function () {
    console.log("Stopping speech-to-text service...");
    if (stream != undefined) { stream.stop(); }
    _mic.addClass("mic_enabled");
    _mic.removeClass("mic_disabled");
    _stop.addClass("mic_disabled");
    _stop.removeClass("mic_enabled");
  });
}

send JSON data to Unity from Node.js

I asked this question on the official forum but I guess I don't have enough rep there to be taken seriously :O
I'm using Unity3D free.
Has anyone used https://www.assetstore.unity3d.com/en/#!/content/21721 with success? This plugin actually came the closest to working (I also tried this and this but they don't work for me).
I contacted the author but haven't got a reply yet, so was wondering if someone had made this work?
(edit: I would like to point out that I don't mind buying some other plugin if you have used it and found it easy/useful to communicate with your Node.js server via SocketIO - so please recommend)
Concretely, here's my problem with it:
I can't find a way to send JSON data to Unity from Node.js, as it keeps getting an error.
I tried numerous ways, and this is one of them:
io.on('connection', function(socket){
console.log('a user connected');
var data ='{"email":"some#email.com","pass":"1234"}';
var dataJson = JSON.stringify(data);
console.dir(dataJson);
socket.emit('newResults', dataJson);
console.log('server emited newResults');
socket.on('fromClient', function(data){
console.log("got msg from client");
console.dir(data);
});
socket.on('disconnect', function(){
console.log('user disconnected');
});
});
In Unity3D I use the following function to intercept this:
public void HandleNewResults(SocketIOEvent e){
Debug.Log(string.Format("[name: {0}, data: {1}]", e.name, e.data));
Debug.Log (new JSONObject (e.data));
}
but it crashes (it catches the error signal) at this point, with this message (when debugging is turned on):
SocketComm:TestError(SocketIOEvent) (at Assets/_Scripts/SocketComm.cs:58)
SocketIO.SocketIOComponent:EmitEvent(SocketIOEvent) (at Assets/SocketIO/Scripts/SocketIO/SocketIOComponent.cs:400)
SocketIO.SocketIOComponent:EmitEvent(String) (at Assets/SocketIO/Scripts/SocketIO/SocketIOComponent.cs:392)
SocketIO.SocketIOComponent:OnError(Object, ErrorEventArgs) (at Assets/SocketIO/Scripts/SocketIO/SocketIOComponent.cs:382)
WebSocketSharp.Ext:Emit(EventHandler`1, Object, ErrorEventArgs) (at Assets/SocketIO/WebsocketSharp/Ext.cs:992)
WebSocketSharp.WebSocket:error(String) (at Assets/SocketIO/WebsocketSharp/WebSocket.cs:1011)
WebSocketSharp.WebSocket:Send(String) (at Assets/SocketIO/WebsocketSharp/WebSocket.cs:1912)
SocketIO.SocketIOComponent:EmitPacket(Packet) (at Assets/SocketIO/Scripts/SocketIO/SocketIOComponent.cs:309)
SocketIO.SocketIOComponent:EmitClose() (at Assets/SocketIO/Scripts/SocketIO/SocketIOComponent.cs:299)
SocketIO.SocketIOComponent:Close() (at Assets/SocketIO/Scripts/SocketIO/SocketIOComponent.cs:184)
SocketIO.SocketIOComponent:OnApplicationQuit() (at Assets/SocketIO/Scripts/SocketIO/SocketIOComponent.cs:164)
Can you please shed some light on how to approach this problem?
I'm using Unity3D free.
But SocketIO needs Unity Pro if you want to use native .NET sockets.
Unity Pro is required in order to build using .NET sockets. You may
use a replacement package if you don't want to use native sockets.
You can use Good ol' Sockets. It works as a sockets replacement.
I used code like this:
socket.emit('event_name', {"email":"some#email.com","pass":"1234"});
I do not know if it is right. The function in Unity is:
public void TestBoop (SocketIOEvent e)
{
    Debug.Log ("[SocketIO] Boop received: " + e.name + "--" + e.data);
}
And the output is:
[SocketIO] Boop received: boop--{"email":"some#email.com","pass":"1234"}
Perhaps it isn't right at all, but the result at least reaches Unity.
Make these changes:
io.on('connection', function(socket){
console.log('a user connected');
var data ='{"email":"some#email.com","pass":"1234"}';
var dataJson = JSON.stringify(data);
console.dir(dataJson);
console.log('server emited newResults');
socket.on('beep', function(){
socket.emit('boop', {'boop': dataJson});
console.log("got msg from client");
console.dir(data);
});
socket.on('disconnect', function(){
console.log('user disconnected');
});
});
This will work.
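One extra thing that may be worth checking (an assumption on my part, not something either answer confirms): data in the question is already a JSON string, so JSON.stringify(data) encodes it a second time. Emitting a plain object and letting socket.io serialize it should hand Unity well-formed JSON, roughly like this:

io.on('connection', function (socket) {
  socket.on('beep', function () {
    // emit a plain object; socket.io serializes it exactly once
    socket.emit('boop', { email: 'some#email.com', pass: '1234' });
  });
});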

How to close a readable stream (before end)?

How to close a readable stream in Node.js?
var input = fs.createReadStream('lines.txt');
input.on('data', function(data) {
// after closing the stream, this will not
// be called again
if (gotFirstLine) {
// close this stream and continue the
// instructions from this if
console.log("Closed.");
}
});
This would be better than:
input.on('data', function(data) {
if (isEnded) { return; }
if (gotFirstLine) {
isEnded = true;
console.log("Closed.");
}
});
But this would not stop the reading process...
Edit: Good news! Starting with Node.js 8.0.0 readable.destroy is officially available: https://nodejs.org/api/stream.html#stream_readable_destroy_error
ReadStream.destroy
You can call the ReadStream.destroy function at any time.
var fs = require("fs");
var readStream = fs.createReadStream("lines.txt");
readStream
.on("data", function (chunk) {
console.log(chunk);
readStream.destroy();
})
.on("end", function () {
// This may not be called since we are destroying the stream
// the first time the "data" event is received
console.log("All the data in the file has been read");
})
.on("close", function (err) {
console.log("Stream has been destroyed and file has been closed");
});
The public function ReadStream.destroy is not documented (Node.js v0.12.2) but you can have a look at the source code on GitHub (Oct 5, 2012 commit).
The destroy function internally marks the ReadStream instance as destroyed and calls the close function to release the file.
You can listen to the close event to know exactly when the file is closed. The end event will not fire unless the data is completely consumed.
Note that the destroy (and the close) functions are specific to fs.ReadStream. They are not part of the generic stream.Readable "interface".
Invoke input.close(). It's not in the docs, but
https://github.com/joyent/node/blob/cfcb1de130867197cbc9c6012b7e84e08e53d032/lib/fs.js#L1597-L1620
clearly does the job :) It actually does something similar to your isEnded.
EDIT 2015-Apr-19 Based on comments below, and to clarify and update:
This suggestion is a hack, and is not documented.
Though, looking at the current lib/fs.js, it still works more than 1.5 years later.
I agree with the comment below about calling destroy() being preferable.
As correctly stated below, this works for fs ReadStreams, not for a generic Readable.
As for a generic solution: it doesn't appear as if there is one, at least from my understanding of the documentation and from a quick look at _stream_readable.js.
My proposal would be to put your readable stream in paused mode, at least preventing further processing in your upstream data source. Don't forget to unpipe() and remove all data event listeners so that pause() actually pauses, as mentioned in the docs.
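A rough sketch of that workaround, assuming readable is piped into some destination writable (both names are placeholders):

readable.unpipe(destination);        // stop feeding the downstream consumer
readable.removeAllListeners('data'); // with no 'data' listeners left, pause() actually sticks
readable.pause();                    // leave the stream in paused mode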
Today, in Node 10
readableStream.destroy()
is the official way to close a readable stream
see https://nodejs.org/api/stream.html#stream_readable_destroy_error
You can't. There is no documented way to close/shutdown/abort/destroy a generic Readable stream as of Node 5.3.0. This is a limitation of the Node stream architecture.
As other answers here have explained, there are undocumented hacks for specific implementations of Readable provided by Node, such as fs.ReadStream. These are not generic solutions for any Readable though.
If someone can prove me wrong here, please do. I would like to be able to do what I'm saying is impossible, and would be delighted to be corrected.
EDIT: Here was my workaround: implement .destroy() for my pipeline through a complex series of unpipe() calls. And after all that complexity, it doesn't work properly in all cases.
EDIT: Node v8.0.0 added a destroy() api for Readable streams.
As of version 4.x, pushing a null value into the stream will trigger an EOF signal.
From the Node.js docs:
If a value other than null is passed, the push() method adds a chunk of data into the queue for subsequent stream processors to consume. If null is passed, it signals the end of the stream (EOF), after which no more data can be written.
This worked for me after trying numerous other options on this page.
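For illustration, here is a minimal sketch of that behaviour with a custom Readable (the names are just examples):

const { Readable } = require('stream');

const readable = new Readable({
  read() {
    this.push('some data');
    this.push(null); // signals EOF; the stream ends after this
  }
});

readable.on('data', chunk => console.log('chunk:', chunk.toString()));
readable.on('end', () => console.log('stream ended'));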
This destroy module is meant to ensure a stream gets destroyed, handling different APIs and Node.js bugs. Right now it is one of the best choices.
NB. From Node 10 you can use the .destroy method without further dependencies.
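Typical usage is a single call, roughly like this (assuming the package is installed as destroy):

const fs = require('fs');
const destroy = require('destroy'); // the module mentioned above

const readStream = fs.createReadStream('lines.txt');
readStream.on('data', function (chunk) {
  destroy(readStream); // ensures the stream is destroyed and the file descriptor released
});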
You can clear and close the stream with yourstream.resume(), which will dump everything on the stream and eventually close it.
From the official docs:
readable.resume():
Return: this
This method will cause the readable stream to resume emitting 'data' events.
This method will switch the stream into flowing mode. If you do not want to consume the data from a stream, but you do want to get to its 'end' event, you can call stream.resume() to open the flow of data.
var readable = getReadableStreamSomehow();
readable.resume();
readable.on('end', () => {
console.log('got to the end, but did not read anything');
});
It's an old question, but I too was looking for the answer and found the best one for my implementation. Both the end and close events get emitted, so I think this is the cleanest solution.
This will do the trick in node 4.4.* (stable version at the time of writing):
var input = fs.createReadStream('lines.txt');
input.on('data', function(data) {
if (gotFirstLine) {
this.end(); // Simple isn't it?
console.log("Closed.");
}
});
For a very detailed explanation see:
http://www.bennadel.com/blog/2692-you-have-to-explicitly-end-streams-after-pipes-break-in-node-js.htm
This code here will do the trick nicely:
function closeReadStream(stream) {
if (!stream) return;
if (stream.close) stream.close();
else if (stream.destroy) stream.destroy();
}
writeStream.end() is the go-to way to close a writeStream.
To stop callback execution after a certain point, you have to use process.kill with the particular process ID:
const csv = require('csv-parser');
const fs = require('fs');
const filepath = "./demo.csv"
let readStream = fs.createReadStream(filepath, {
autoClose: true,
});
let MAX_LINE = 0;
readStream.on('error', (e) => {
console.log(e);
console.log("error");
})
.pipe(csv())
.on('data', (row) => {
if (MAX_LINE == 2) {
process.kill(process.pid, 'SIGTERM')
}
// console.log("not 2");
MAX_LINE++
console.log(row);
})
.on('end', () => {
// handle end of CSV
console.log("read done");
}).on("close", function () {
console.log("closed");
})
