NodeJS: How to send a Buffer in chunks

I'm trying to create an artificially slow-loading image with NodeJS (currently using ExpressJS). I have a Buffer containing a base64-encoded image, and I'd like to stream chunks of the buffer to the client, each separated by a given delay.
I know how to send raw text to the client in chunks, and I know how to send an entire Buffer to the client (both with response.write()), but I'm not sure how to break an image's bytes into chunks and send them to the client incrementally. Is this possible?

You can construct slices of a buffer with buf.subarray(start, end), which returns a view over the same memory without copying, and write each slice to the client with response.write(), scheduling the writes with setTimeout to get the delay between chunks.
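A minimal sketch of that approach, assuming an Express route; the file path, route, chunk size, and delay below are placeholder values, and in the question's case the image would come from decoding the base64 Buffer rather than from disk:

const express = require('express');
const fs = require('fs');

const app = express();

const CHUNK_SIZE = 1024; // bytes per write (placeholder value)
const DELAY_MS = 200;    // pause between chunks (placeholder value)

app.get('/slow-image', (req, res) => {
  // 'image.png' is a placeholder; with a base64-encoded Buffer you would
  // decode it first, e.g. Buffer.from(base64Buffer.toString(), 'base64').
  const imageBuffer = fs.readFileSync('image.png');

  res.setHeader('Content-Type', 'image/png');
  res.setHeader('Content-Length', imageBuffer.length);

  let offset = 0;
  (function sendChunk() {
    if (offset >= imageBuffer.length) {
      res.end();
      return;
    }
    // subarray returns a view over the same memory, so nothing is copied
    res.write(imageBuffer.subarray(offset, offset + CHUNK_SIZE));
    offset += CHUNK_SIZE;
    setTimeout(sendChunk, DELAY_MS);
  })();
});

app.listen(3000);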

Related

Python TCP packets getting mixed

I have a multiplayer game written in Python that uses TCP. When I send two packets at the same time they get mixed up: for example, if I send "Hello there" and "man", the client receives "Hello thereman".
What should I do to prevent them from getting mixed?
That's the way TCP works. It is a byte stream. It is not message-based.
Consider if you write "Hello there" and "man" to a file. If you read the file, you see "Hello thereman". A socket works the same way.
If you want to make sense of the byte stream, you need other information. For example, add line feeds to the stream to indicate end of line. For binary data, include framing such as a 2-byte length (big-endian) followed by <length> bytes of data, so you can read the stream and break it into decipherable messages.
Note that the socket methods send() and recv() must have their return values checked. recv(1024), for example, can return '' (socket closed) or anywhere from 1 to 1024 bytes of data; the size is only a maximum. send() can send less than requested, and you'll have to re-send the part that didn't go out (or use sendall() in the first place).
Or, use a framework that does all this for you...
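The question is about Python, but the length-prefix framing described above works the same way in any language. Here is a minimal sketch in Node.js (to match the rest of this page); the port and message contents are placeholder values:

const net = require('net');

// Sender side: prefix each message with a 2-byte big-endian length.
function sendMessage(socket, text) {
  const payload = Buffer.from(text, 'utf8');
  const header = Buffer.alloc(2);
  header.writeUInt16BE(payload.length, 0);
  socket.write(Buffer.concat([header, payload]));
}

// Receiver side: accumulate bytes and peel off complete frames.
function makeReceiver(onMessage) {
  let pending = Buffer.alloc(0);
  return (chunk) => {
    pending = Buffer.concat([pending, chunk]);
    while (pending.length >= 2) {
      const length = pending.readUInt16BE(0);
      if (pending.length < 2 + length) break; // frame not complete yet
      onMessage(pending.subarray(2, 2 + length).toString('utf8'));
      pending = pending.subarray(2 + length);
    }
  };
}

// Usage: the server logs each framed message separately, even if both
// writes arrive in a single TCP chunk.
const server = net.createServer((socket) => {
  socket.on('data', makeReceiver((msg) => console.log('got:', msg)));
});
server.listen(5000, () => {
  const client = net.connect(5000, () => {
    sendMessage(client, 'Hello there');
    sendMessage(client, 'man');
  });
});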

Nodejs - Image Buffer size is way larger than the original file

I need to send a byte-array representation of an image from an Express server to an Android device. After some research, I found this solution:
const buffer = fs.readFileSync('image_path');
res.send(Uint8Array.from(buffer));
It works, BUT the size of the sent response is about 3 times larger than the original file (136 KB becomes 487 KB). It turns out that this is the size of the buffer.
Is there a way to fix this so that the sent response matches the original file size, or a better way to send the file so it matches Java's byte array?

How does a Buffer work in Node.js?

I'm new to Node.js and trying to broadcast video streaming, but I have no idea how to do this. I want to know how buffering works in a Node.js application.
Buffers are instances of the Buffer class in node, which is designed to handle raw binary data. Each buffer corresponds to some raw memory allocated outside V8. Buffers act somewhat like arrays of integers, but aren't resizable and have a whole bunch of methods specifically for binary data. In addition, the "integers" in a buffer each represent a byte and so are limited to values from 0 to 255 (2^8 - 1), inclusive.
There is more about buffers in the Node.js Buffer documentation (nodejs.org/api/buffer.html).
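A tiny illustration of those points (the byte values are arbitrary): a Buffer behaves like a fixed-size array of bytes, each limited to 0-255.

const buf = Buffer.from([72, 105, 300]); // 300 overflows a byte and wraps to 44
console.log(buf.length);                 // 3 (a Buffer is not resizable)
console.log(buf[0], buf[1], buf[2]);     // 72 105 44
console.log(buf.toString('utf8', 0, 2)); // 'Hi'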
Data is processed as streams rather than all at once. Incoming chunks are collected in a buffer, and once the buffer fills up, the data is passed along from one point to another (to the client requesting it), something like streaming a movie online. This way we don't have to wait for the whole of the data to arrive; we receive it in chunks and can start using it before the rest has arrived.
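A minimal sketch of that idea with a readable stream; the file name and chunk size are placeholder values:

const fs = require('fs');

// fs.createReadStream reads the file in chunks, and each 'data' event
// hands over a Buffer holding the raw bytes of that chunk.
const stream = fs.createReadStream('movie.mp4', { highWaterMark: 64 * 1024 });

stream.on('data', (chunk) => {
  // chunk is a Buffer; it can be forwarded to a client immediately
  // instead of waiting for the whole file to load into memory.
  console.log(`received ${chunk.length} bytes, isBuffer: ${Buffer.isBuffer(chunk)}`);
});

stream.on('end', () => console.log('done'));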

Socket.io reading and writing binary data with zero copy buffers

I see that socket.io supports binary data. To send, I could just pass a Buffer object.
I want to send and receive a large number of medium-sized files, and I want to see if this can be optimized. When creating a Buffer from a file and sending it via socket.io, does it internally create any copies of the data, or is it handled zero-copy?
Similarly, when receiving, is it possible to receive the data as a Buffer that can be written to a file without creating a copy? I couldn't find an example of receiving data as a Buffer. Can someone point out examples of receiving binary data as a Buffer?
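For reference, a minimal (hedged) sketch of sending a Buffer over socket.io; the port, event name, and file names are placeholder values, and on a Node.js client the binary payload should arrive as a Buffer (in a browser it would be an ArrayBuffer). Whether copies are made internally is exactly the open question above.

const { Server } = require('socket.io');
const { io } = require('socket.io-client');
const fs = require('fs');

const server = new Server(3000);

server.on('connection', (socket) => {
  const data = fs.readFileSync('some-file.bin'); // a Buffer
  socket.emit('file', { name: 'copy-of-some-file.bin', data });
});

const client = io('http://localhost:3000');
client.on('file', ({ name, data }) => {
  // On a Node.js client, data should be a Buffer here and can be
  // written straight to disk.
  fs.writeFileSync(name, data);
});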

Is the Node.js Buffer asynchronous or synchronous?

I don't see a callback in the Buffer documentation at http://nodejs.org/api/buffer.html#buffer_buffer. Am I safe to assume that Buffer is synchronous? I'm trying to convert a binary file to a base64-encoded string.
What I'm ultimately trying to do is take a PNG file and store its base64 encoded string in MongoDB. I read somewhere that I should take the PNG file, use Buffer to convert to base64, then pass this base64 output to Mongo.
My code looks something like this:
fs.readFile(filepath, function (err, data) {
  var fileBuffer = Buffer.from(data).toString('base64');
  // do Mongo save here with the fileBuffer ...
});
I'm a bit fearful that Buffer is synchronous, and thus would be blocking other requests while this base64 encoding takes place. If so, is there a better way of converting a binary file to a base64 encoded one for storage in Mongo?
It is synchronous. You could make it asynchronous by slicing your Buffer and converting a small amount at a time and calling process.nextTick() in between, or by running it in a child process - but I wouldn't recommend either of those approaches.
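For completeness, a rough sketch of that slice-and-convert idea, using setImmediate instead of process.nextTick so I/O can run between chunks (the function name and chunk size are made up for illustration); each slice length is a multiple of 3 so the base64 pieces concatenate cleanly:

function toBase64Async(buffer, callback, chunkSize = 3 * 1024) {
  let offset = 0;
  const parts = [];

  (function step() {
    if (offset >= buffer.length) {
      callback(null, parts.join(''));
      return;
    }
    parts.push(buffer.subarray(offset, offset + chunkSize).toString('base64'));
    offset += chunkSize;
    setImmediate(step); // yield to the event loop between chunks
  })();
}

// usage: fs.readFile(filepath, (err, data) => toBase64Async(data, (err, base64) => { /* Mongo save here */ }));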
Instead, I would recommend not storing images in your db. Store them on disk, or perhaps in a file storage service such as Amazon S3, and then store just the file path or URL in your database.
