I have just started using Node.js. I'm running a Node server with socket.io, and I need to send a buffer of bytes to the client.
I understand this can be done by first encoding the byte buffer as Base64, sending that, and then decoding it on the client side, but I was wondering whether there is a more elegant way of getting the byte stream to the client.
Socket.IO 1.0 now supports binary data transfer; please have a look at its documentation. You can use a Blob, an ArrayBuffer, or a File.
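For example, a minimal sketch of emitting a raw Buffer with Socket.IO 1.x; the port and the 'bytes' event name are just placeholders:

```js
// server.js - Socket.IO >= 1.0 sends Node Buffers as binary frames (no Base64 step)
const io = require('socket.io')(3000);

io.on('connection', (socket) => {
  socket.emit('bytes', Buffer.from([0xde, 0xad, 0xbe, 0xef]));
});

// client (browser) - the binary payload arrives as an ArrayBuffer
// const socket = io('http://localhost:3000');
// socket.on('bytes', (data) => console.log(new Uint8Array(data)));
```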
https://github.com/binaryjs/binaryjs can be a solution. Base64 adds roughly 33% size overhead, so it becomes inefficient if you need to transfer a large amount of data.
There is also socket.io-stream https://github.com/nkzawa/socket.io-stream
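Roughly following the socket.io-stream README, streaming a file from the server to the browser looks something like the sketch below; the 'file' event name and the file path are placeholders:

```js
// server: create a stream, hand it to the client, then pipe the file into it
const ss = require('socket.io-stream');
const fs = require('fs');

io.on('connection', (socket) => {
  const stream = ss.createStream();
  ss(socket).emit('file', stream, { name: 'photo.jpg' });
  fs.createReadStream('photo.jpg').pipe(stream);
});

// client (browser): consume the incoming stream chunk by chunk
// ss(socket).on('file', (stream, data) => {
//   const parts = [];
//   stream.on('data', (chunk) => parts.push(chunk));
//   stream.on('end', () => console.log('received', data.name));
// });
```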
It is a little difficult to use binaryjs together with socket.io.
Try delivery.js:
https://github.com/liamks/Delivery.js
It provides a means of communication between clients and the server via socket.io.
However, this module also relies on Base64 conversion, which is a drawback.
I have a WebSocket server made with Express in Node.js.
The WebSocket documentation states that websocket.send() can pass a string or an ArrayBuffer. I want to send a big array of objects (200k lines when formatted) over the WebSocket to clients. What is the best way to send data like this?
What I've tried: sending the array directly after stringifying it. This works, but the delay is very long.
So is there any way to send such a massive array to clients while keeping speed intact? I think ArrayBuffers might help, but I couldn't find any suggested examples.
Also, if this issue is code specific, let me know in a comment so that I can share code snippets as well.
Usually sockets are used for real-time messaging, sacrificing common HTTP features with the goal of being light and fast. Check:
How websockets can be faster than a simple HTTP request?
Transferring a large amount of data goes against this. Check:
Sending large files over socket
Advice 1
Use the socket only for the real-time notification that some crypto ticker has changed.
After that, once the client knows (thanks to the socket) that there is a new or updated crypto ticker (which is large), download it using common HTTP instead of the socket, as #jfriend00 also recommends in the comments.
Also, if the data is large, you should split it and send it chunk by chunk using common HTTP, not sockets.
Advice 2
As #jfriend00 said, and as the major file-hosting services do, implement an algorithm to split the data and send it part by part to the client.
If the chunks are small, maybe you could use sockets, but this is a common feature, so use the known way: common HTTP.
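Putting both pieces of advice together, a rough sketch (the endpoint, event name, and port here are made up) could look like:

```js
// server: notify over the socket, serve the heavy payload over plain HTTP
const express = require('express');
const app = express();
const server = app.listen(3000);
const io = require('socket.io')(server);

let bigTickerArray = []; // the large dataset, maintained elsewhere

app.get('/tickers', (req, res) => {
  // serve the large data over HTTP; add pagination (?page=N) to send it part by part
  res.json(bigTickerArray);
});

function onTickerUpdated() {
  io.emit('tickers-updated'); // tiny real-time notification only
}

// client: react to the notification, then fetch the large data over HTTP
// socket.on('tickers-updated', async () => {
//   const tickers = await (await fetch('/tickers')).json();
// });
```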
Is there a way to log or view the actual byte stream being sent to the server when using either the grpc or @grpc/grpc-js clients in Node.js?
I'm working with an opaque gRPC server that accepts my bytes when I stream them but doesn't do what it's supposed to do. I'd like to view the actual bytes being sent to the server, as we suspect there is a problem with how the gRPC libraries serialize 64-bit integers.
The GRPC_VERBOSITY=debug GRPC_TRACE=tcp,http,api,http2_stream_state environment variables for the native grpc module haven't been helpful in this specific case: they show part of one byte stream, but not the full byte stream.
Even a "here's the place in the code where the serialization happens" would be useful.
The GRPC_VERBOSITY setting there is correct. If you are using TLS, you can see all of the data that is sent and received with GRPC_TRACE=secure_endpoint. If you are using plaintext connections, you can instead see it with GRPC_TRACE=tcp. In both cases, you will need to pick the data you are looking for out of the HTTP/2 framing, and it may show compressed messages, which would be essentially impossible to interpret.
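For example, assuming the client entry point is client.js (a made-up name), you would run it with:

```
# plaintext connections
GRPC_VERBOSITY=debug GRPC_TRACE=tcp node client.js

# TLS connections
GRPC_VERBOSITY=debug GRPC_TRACE=secure_endpoint node client.js
```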
Alternatively, if your setup allows it, you may want to try Wireshark. It should be able to handle the HTTP/2 framing for you, and I believe it has plugins to handle gRPC traffic specifically.
I'm using socket.io-stream to share a file over a socket from the server to the browser. I'd like to use the same approach to share an audio stream from the browser to the server. Is that possible? I know that a browser audio stream is different from a Node.js stream, so I need to convert it, but how?
Not 100% sure what you're expecting to do with the data, but this answer may be of use to you.
Specifically, I'd suggest you use getUserMedia to get your audio, hook it up to a ScriptProcessorNode, convert the data, and emit those data chunks to socket.io. Then, on your server, you can capture those chunks and write them to your Node.js stream. Code samples are at the link; they're fairly lengthy and I don't want to spam, so I won't reproduce them here.
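That said, a condensed sketch of the idea (not the linked samples; the 'audio-chunk' event name and the buffer size are arbitrary) might look like:

```js
// browser: capture the mic, grab raw PCM in a ScriptProcessorNode, emit chunks
const socket = io();

navigator.mediaDevices.getUserMedia({ audio: true }).then((mediaStream) => {
  const ctx = new AudioContext();
  const source = ctx.createMediaStreamSource(mediaStream);
  const processor = ctx.createScriptProcessor(4096, 1, 1);

  processor.onaudioprocess = (e) => {
    // copy the Float32 samples so the underlying buffer can be reused
    const samples = new Float32Array(e.inputBuffer.getChannelData(0));
    socket.emit('audio-chunk', samples.buffer);
  };

  source.connect(processor);
  processor.connect(ctx.destination);
});

// server: append each chunk to a Node.js writable stream
// io.on('connection', (socket) => {
//   const out = fs.createWriteStream('audio.raw');
//   socket.on('audio-chunk', (chunk) => out.write(Buffer.from(chunk)));
// });
```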
So I want to transfer sound bytes over a WebSocket from a phone to a server. However, according to http://crossbar.io/docs/Features, Crossbar seems to implement only JSON and MsgPack. Can I still transfer binary messages over Crossbar in some other way?
Also, several Crossbar clients (for example) seem to provide only JSON and MsgPack as de/serialization formats. Am I missing something?
WAMP is primarily intended for the transmission of messages, not large (binary) payloads. For small chunks you can encode the audio so that it can be part of a regular WAMP payload. For an example of this with a webcam image, see the Tessel camera example: https://github.com/crossbario/crossbarexamples/tree/master/iotcookbook/device/tessel/camera. This works fine in principle, though there is, of course, the encoding/decoding overhead.
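For instance, with autobahn-js you could Base64-encode small audio chunks and publish them as ordinary WAMP events; the router URL, realm, and topic below are placeholders:

```js
const autobahn = require('autobahn');

const connection = new autobahn.Connection({
  url: 'ws://127.0.0.1:8080/ws', // your Crossbar router
  realm: 'realm1'
});

connection.onopen = (session) => {
  // pretend this is one small chunk of captured audio
  const chunk = Buffer.alloc(4096);
  // encode the binary chunk so it fits in a JSON/MsgPack WAMP payload
  session.publish('com.example.audio.chunk', [chunk.toString('base64')]);
};

connection.open();
```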
I see how I can create a readable stream for a file in GridFS in Node.js using the native MongoDB driver. However, I'm writing a server that responds to byte-range requests, so I'd like to stream back only part of the file. What's the best way to do this? Should I just implement my own read stream that pulls data from the database in chunks? Thanks!
Unfortunately the stream does not support arbitrary starting points, so you'll have to implement your own that seeks to the correct chunk and then streams from there.
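A rough sketch of that approach, assuming the default 'fs' bucket (so the fs.files and fs.chunks collections) and a reasonably recent mongodb driver; the function name and arguments are made up:

```js
const { Readable } = require('stream');

// stream the inclusive byte range [start, end] of a GridFS file
async function gridFsRangeStream(db, fileId, start, end) {
  const file = await db.collection('fs.files').findOne({ _id: fileId });
  const chunkSize = file.chunkSize;

  const firstChunk = Math.floor(start / chunkSize);
  const lastChunk = Math.floor(end / chunkSize);

  const cursor = db.collection('fs.chunks')
    .find({ files_id: fileId, n: { $gte: firstChunk, $lte: lastChunk } })
    .sort({ n: 1 });

  const out = new Readable({ read() {} });
  for await (const chunk of cursor) {
    const data = chunk.data.buffer;         // BSON Binary -> Node Buffer
    const chunkStart = chunk.n * chunkSize; // absolute offset of this chunk
    // trim the first and last chunks down to the requested range
    const from = Math.max(start - chunkStart, 0);
    const to = Math.min(end - chunkStart + 1, data.length);
    out.push(data.slice(from, to));
  }
  out.push(null);
  return out;
}
```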