How to create a custom object using node.js

I have a Java-based server that allows client applications to connect via other programming languages (Java, Unity, Obj-C). I would like to know how to add JavaScript to this list using node.js and socket.io. The server listens on a set port and accepts simple JSON data plus an int for the length in bytes; it responds in the same format. The format of the "packet" is like so:
first four bytes are the length
00 00 00 1c
remaining bytes are the data
7b 22 69 64 22 3a 31 2c 22 6e 61 6d 65 22 3a 22 73 6f 6d 65 77 69 64 67 65 74 22 7d
The data is sent over TCP and is encoded in little endian. The originating object is modeled in Java like so:
public class Widget {
    private int id;
    private String name;

    public int getId() { return id; }
    public String getName() { return name; }
}
The JSON would be:
{"id":1,"name":"somewidget"}

You will need a TCP socket. Connect it to the service and listen for the data event. When the data event fires, look at the buffer's length. If it is <= 4 bytes, you probably should discard it*. Otherwise read the first 4 bytes using readUInt32LE(), since the data is little endian. Then use that number as the length of the remaining buffer. If the buffer is shorter than the given length, only "read" the remaining length; otherwise "read" the given length. Use the toString method for this. You will get a string that can be parsed with JSON.parse, which returns the JavaScript object matching the JSON.
You can build your packets basically the same way by instantiating a buffer and writing all the data to it.
Also see the Buffer and net documentation for details.
* I do not know exactly when node fires its data events, but your data might be received fragmented (that is, split up into multiple data events). Due to the streaming nature of the TCP protocol this is likely to happen whenever the JSON string is too long to fit into a single frame. In that case you probably should not simply discard the buffer, but try to reassemble the fragmented data.
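Here is a minimal sketch of that read loop, including reassembly of fragmented packets. It assumes the server is reachable on localhost port 8088 (host and port are placeholders) and frames every message exactly as described in the question:
var net = require('net');

var soc = net.connect(8088, 'localhost');
var pending = Buffer.alloc(0); // bytes received but not yet parsed

soc.on('data', function (chunk) {
    // Reassemble fragments: keep everything that has not been consumed yet.
    pending = Buffer.concat([pending, chunk]);

    // A complete packet is the 4-byte length header plus that many data bytes.
    while (pending.length >= 4) {
        var length = pending.readUInt32LE(0);
        if (pending.length < 4 + length) break; // wait for the rest of the packet

        var json = pending.toString('utf-8', 4, 4 + length);
        var widget = JSON.parse(json); // e.g. { id: 1, name: 'somewidget' }
        console.log(widget);

        pending = pending.slice(4 + length); // drop the consumed packet
    }
});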

So if you just want to send a request to the server and get a response, you could do this:
var net = require('net');
var soc = new net.Socket();
soc.connect(8088);
soc.on('connect', function () {
    var data = JSON.stringify({ request: true });
    var request = Buffer.from(data);
    // 4-byte little-endian length header, as the protocol expects
    var header = Buffer.alloc(4);
    header.writeUInt32LE(request.length, 0);
    // send request
    soc.end(Buffer.concat([header, request]));
});
soc.on('data', function (buffer) {
    // Crop the 4 length bytes and parse the JSON payload
    var data = JSON.parse(buffer.slice(4).toString('utf-8'));
    soc.destroy();
    console.log(data);
});

Related

NodeJS Reading Buffer Binary To Float

I have a large DAT file that holds custom byte information, and I've been tasked with converting the solution to JavaScript to make it single-language and move it to serverless cloud computing. But I've run into an issue converting this test data. The value is supposed to come back as a float, but I can't seem to get the number converted correctly. The sliced buffer output is <Buffer 40 82 e2 31 d6 d7 2e 8d>, which is supposed to produce 604.274335557087 but actually returns 4.090111255645752. I'm at my wits' end right now. Any thoughts?
fs.readFile(file, (err, data) => {
    ...
    // Read other buffer slice() values fine until this step.
    // Like: readInt8(), readIntBE(0, 4), readBigUInt64BE(0, 8)
    ...
    let FloatNumber = data.slice(start, end).readFloatBE();
    console.log('FloatNumber', FloatNumber);
    ...
});
Those eight bytes are an IEEE 754 double, not a single-precision float, so read them with readDoubleBE() instead of readFloatBE():
const buf = Buffer.from([0x40, 0x82, 0xe2, 0x31, 0xd6, 0xd7, 0x2e, 0x8d]);
console.log( buf.readDoubleBE(0) );
// Prints: 604.274335557087
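For comparison, readFloatBE(0) on the same buffer interprets only the first four bytes (40 82 e2 31) as a single-precision float, which is exactly where the value reported in the question comes from:
console.log( buf.readFloatBE(0) );
// Prints: 4.090111255645752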

How to get first 16 bytes of an SHA1 hash in node.js?

I'm trying to interoperate with a Java server. As part of the protocol, I need to create a SHA1 hash of my content. For some reason, only the first 16 bytes of the hash digest are used, encoded in Base64. The message digest is an array of bytes in Java, and it's truncated to length 16 before being Base64 encoded.
How can I do the same in JavaScript on Node? I'm using the built-in node crypto module, but the digest is not simply an array. How can I access the hash value and retrieve the first 16 bytes?
The following code gives me 20 bytes:
var crypto = require('crypto');
var hash = crypto.createHash('sha1');
var data = hash.update('This is the password', 'utf-8');
var gen_hash = data.digest('base64');
console.log(gen_hash);
Instead of directly generating Base64-encoded output, you can return a Buffer, slice it to get the first 16 bytes, and then encode it.
const crypto = require('crypto');
const hash = crypto.createHash('sha1');
const data = hash.update('This is the password', 'utf-8');
const gen_hash = data.digest().slice(0, 16).toString('base64');
console.log(gen_hash);
The Buffer of the hash looks like this before slicing:
<Buffer f2 ff c3 eb 0e 37 bb 19 91 0a 05 c0 88 d2 e6 0d 6a 0e d5 25>
The result is:
8v/D6w43uxmRCgXAiNLmDQ==
Your original code gave:
8v/D6w43uxmRCgXAiNLmDWoO1SU=
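The same calls can also be chained into a single expression if you prefer:
const gen_hash = crypto.createHash('sha1')
    .update('This is the password', 'utf-8')
    .digest()
    .slice(0, 16)
    .toString('base64');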

Most efficient way of copying a "hex array" string to a buffer?

I'm receiving a high volume of "hex array" strings in the form:
'16 03 03 00 50 40 f2 12 71 0b c0 4f 99 dc 87 6f'
What's the most efficient way of copying them into an existing, larger buffer?
I'm guessing the naive way would be:
var lineBuffer = Buffer.from(line.replace(/\s+/g, ''), 'hex');
lineBuffer.copy(mainBuffer, offset);
offset += 16;
I'm wary of using line[index] and doing the simple bit shift and sum, because string[index] just resolves to another string.
To move this out of the comments, the solution you're looking for is probably something like this:
const line = '16 03 03 00 50 40 f2 12 71 0b c0 4f 99 dc 87 6f';
const regex = / /g;
const encoding = 'hex';
const replacement = '';

function getHexString(input) {
    return input.replace(regex, replacement);
}

const existingBuffer = Buffer.alloc(1024); // just as an example
const offset = existingBuffer.length; // or wherever you need them to go.

function write(buffer, str, pos) {
    const hexString = getHexString(str);
    buffer.write(hexString, pos, encoding);
}

write(existingBuffer, line, 0);
I can get about 1,600,000 ops/sec through benchmark.js with () => write(existingBuffer, line, 0). Cheating a bit, since I'm constantly writing to position 0 instead of appending, but it should be close enough to get you an idea. The closest I could come with other combinations I was trying was about 1,200,000 ops/sec.
Also as a side note, I would strongly suggest using Buffer.alloc() wherever you create your original buffer, if you aren't already. You can also use Buffer.allocUnsafe(), which is faster but may leave non-zero data in the buffer. That may be okay if you know you're going to fill the entire buffer with new data before using it (or only use known-filled slices). There is more on this in the Buffer documentation.
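As a sketch of the appending case from the question, using the same write() approach (mainBuffer, its size, and the 16-byte lines are assumptions taken from the snippets above):
const mainBuffer = Buffer.alloc(1024);
let offset = 0;

function append(hexLine) {
    // buffer.write() returns the number of bytes written,
    // which is 16 here once the spaces are stripped and the hex is decoded.
    offset += mainBuffer.write(getHexString(hexLine), offset, 'hex');
}

append('16 03 03 00 50 40 f2 12 71 0b c0 4f 99 dc 87 6f');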

Node stream buffers in console.log vs process.stdout.write

Using NodeJS v5.6 I created a file called read-stream.js:
const
    fs = require('fs'),
    stream = fs.createReadStream(process.argv[2]);

stream.on('data', function(chunk) {
    process.stdout.write(chunk);
});

stream.on('error', function(err) {
    process.stderr.write("ERROR: " + err.message + "\n");
});
and a data file in plain text called target.txt:
hello world
this is the second line
If I run node read-stream.js target.txt, the contents of target.txt are printed normally on my console and all is well.
However, if I replace process.stdout.write(chunk); with console.log(chunk);, the result I get is this:
<Buffer 68 65 6c 6c 6f 20 77 6f 72 6c 64 0a 74 68 69 73 20 69 73 20 74 68 65 20 73 65 63 6f 6e 64 20 6c 69 6e 65 0a>
I recently found out that by doing console.log(chunk.toString()); the contents of my file are once again printed normally.
As per this question, console.log is supposed to use process.stdout.write plus a trailing \n character. But what exactly is happening with the encoding/decoding here?
Thanks in advance.
process.stdout is a stream, and its write() function only accepts strings and buffers. chunk is a Buffer object; process.stdout.write writes its bytes directly to your console, so they show up as text. console.log, on the other hand, builds a string representation of the Buffer object before outputting it, hence the <Buffer at the beginning indicating the object's type, followed by the bytes of that buffer.
On a side note, process.stdout being a stream, you can pipe to it directly instead of reading every chunk:
stream.pipe(process.stdout);
I believe I found out what's happening:
The implementation of console.log in NodeJS is this:
Console.prototype.log = function() {
    this._stdout.write(util.format.apply(this, arguments) + '\n');
};
However, the util.format function of lib/util.js in NodeJS uses the inspect method on any input object, which in turn "returns a string representation of object, which is useful for debugging".
So, because of util.format's "object casting", any object passed to console.log is first turned into its string representation, then handed to process.stdout.write as a string, and finally written to the terminal.
When we use process.stdout.write directly with Buffer objects, util.format is skipped entirely and each byte is written straight to the terminal, since process.stdout.write is designed to handle Buffers as-is.
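A quick way to see the difference, using the contents of target.txt from the question:
const chunk = Buffer.from('hello world\nthis is the second line\n');

process.stdout.write(chunk);   // prints the text itself
console.log(chunk);            // prints <Buffer 68 65 6c 6c 6f 20 77 6f ... >
console.log(chunk.toString()); // prints the text again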

Forging or Building packets in NodeJS

I have to send a packet which, when viewed in hex, is:
0A 00 2C 01 23 00 0C 00 B3 01
0A 00 is the length, which is 10.
2C 01 is an identifier, hex 12c, or 300 as a decimal packet id.
23 00 is a version, decimal 35.
0C 00 is another version, decimal 12.
B3 01 is 435.
These values come from different variables and config entries:
var packet_id = 300;
var game_version = config.game.version; // 35 from config
var update_version = config.update.version; // 12 from config
var date_version = config.date.version; // 435 from version
The length is calculated from the total size of the packet (including the length field itself). Then I need to build this buffer and send it.
But how do I do this? I was also thinking that I'd like to predefine the packet structure and just pass in parameters, like:
packet("versionCheck", // name of packet structure (defined somewhere
300 , // the packet id on the structure
config.game.version, // the 2nd parameter for the versionCheck structure.....
............
I am trying to use the packet package by bigeasy for Node, but I can only make it work for parsing, not for building packets.
That's actually a pretty basic example of a packet, since everything is static. Something like this might work:
function Packet(length, id, v1, v2, other) {
    this._length = length;
    this._id = id;
    this._v1 = v1;
    this._v2 = v2;
    this._other = other;
}

Packet.prototype.toBuffer = function () {
    var buffer = Buffer.alloc(this._length);
    buffer.writeUInt16LE(this._length, 0);
    buffer.writeUInt16LE(this._id, 2);
    buffer.writeUInt16LE(this._v1, 4);
    buffer.writeUInt16LE(this._v2, 6);
    buffer.writeUInt16LE(this._other, 8);
    return buffer;
};

var packet = new Packet(10, 0x12c, 35, 12, 435);
console.log(packet.toBuffer());
Also, for parsing you don't need anything special; it is the same as toBuffer, but with read methods instead of write methods.
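For example, a fromBuffer counterpart could look like the sketch below; the field layout is taken from toBuffer above, and the method name fromBuffer is just an illustration:
Packet.fromBuffer = function (buffer) {
    return new Packet(
        buffer.readUInt16LE(0), // length
        buffer.readUInt16LE(2), // id
        buffer.readUInt16LE(4), // game version
        buffer.readUInt16LE(6), // update version
        buffer.readUInt16LE(8)  // date version
    );
};

var parsed = Packet.fromBuffer(packet.toBuffer());
console.log(parsed._id.toString(16)); // '12c'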
