I want to open an image and send it to the sockets that are connected to my Node.js server.
The problem is that I can't just call socket.write(myImage), because write() can't send an image object. The "solution" I've found is to create a buffer from my image, copy it into another buffer, encode it as base64, send it with socket.write(myBase64Img), receive it in my Qt client, and decode the image with OpenCV. The problem is that this is very expensive. Is there another way to send my image over TCP sockets?
PS: I can't send the URL of the image; I want to send the image itself.
Thanks!
You should be able to write the raw image data to the socket without base64-encoding it; just pass a Buffer containing the binary data to write(). You could also stream the data if you don't already have the whole image in memory, since that saves on memory usage.
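For example, a minimal sketch, assuming imageBuffer is a Buffer that already holds the raw image bytes and socket is a connected net.Socket:

// imageBuffer: a Buffer with the raw image bytes (however you obtained them)
socket.write(imageBuffer); // the binary data goes out as-is, no base64 round-trip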
Assuming that you're dealing with an image that is already on disk, the right way to do this is to stream the image directly from disk to the socket.
var fs = require('fs');
var stream = fs.createReadStream(filePath);
stream.pipe(socket); // pipe() handles backpressure for you
See the docs for details:
fs.createReadStream
Stream#pipe
Related
I am using MongoDB as my database and Node.js as my server. I managed to save an image array as buffer data. Can I send that buffer data to the front-end and display it? Is it faster than sending image URLs?
You can store the image in your own database and send it to the front-end, but it is not recommended unless you have a server strong enough to process that buffer data, as described here.
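If you do serve images out of the database, a minimal sketch with Express might look like this (Photo, data and contentType are hypothetical names for your model and its fields):

app.get('/image/:id', function (req, res) {
  Photo.findById(req.params.id, function (err, photo) {
    if (err || !photo) return res.sendStatus(404);
    res.set('Content-Type', photo.contentType); // e.g. 'image/png'
    res.send(photo.data); // the stored Buffer is sent as the response body
  });
});

The browser can then use that route like a normal image URL in an img tag.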
When we work with streams in Node.js, it's easy to understand data flowing from disk to Node.js and then back to disk. But what if the stream is the request body? Does the data flow through the network as it is read, or does Node.js save the whole stream in its memory?
The request body is an incoming stream over the network. Node.js will read part of it from TCP, fill up a local buffer, and then read no more until some of that buffer is consumed locally to make room.
Does the data flow through the network as it is read
Yes, with some local buffering in the stream object and also in the TCP stack for efficiency. TCP uses flow control to tell the source to pause sending when the local buffers are full.
or does Node.js save the whole stream in its memory?
No, not all in memory.
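A minimal sketch of what this looks like in practice (the destination path is hypothetical):

var fs = require('fs');
var http = require('http');

http.createServer(function (req, res) {
  // req is a readable stream: pipe() moves it to disk chunk by chunk,
  // holding only a small buffer in memory at any time.
  var out = fs.createWriteStream('/tmp/upload.bin');
  req.pipe(out);
  out.on('finish', function () {
    res.end('done');
  });
}).listen(8080);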
I'm using socket.io-stream to share a file over a socket from server to browser. I'd like to use the same approach to share an audio stream from browser to server. Is it possible? I know that a browser audio stream is different from a Node.js stream, so I need to convert it. How?
Not 100% sure what you're expecting to do with the data, but this answer may be of use to you.
Specifically, I'd suggest you use getUserMedia to get your audio, hook it up to a ScriptProcessorNode, convert the data, and emit those data chunks to socket.io. Then on your server you can capture those chunks and write them to your Node.js stream. The full code samples are at the link; they're fairly lengthy, so I'll only sketch the idea below rather than reproduce them in full.
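On the browser side (a minimal sketch; socket is assumed to be a connected socket.io client):

navigator.mediaDevices.getUserMedia({ audio: true }).then(function (stream) {
  var audioCtx = new AudioContext();
  var source = audioCtx.createMediaStreamSource(stream);
  // 4096-sample buffer, mono in and out
  var processor = audioCtx.createScriptProcessor(4096, 1, 1);
  processor.onaudioprocess = function (e) {
    // Raw PCM samples for this chunk; copy them, because the
    // underlying buffer is reused between callbacks.
    var samples = e.inputBuffer.getChannelData(0);
    socket.emit('audio-chunk', new Float32Array(samples).buffer);
  };
  source.connect(processor);
  processor.connect(audioCtx.destination);
});

And on the server, write the chunks into your Node.js stream (myWritableStream is a hypothetical name for it):

io.on('connection', function (socket) {
  socket.on('audio-chunk', function (chunk) {
    myWritableStream.write(Buffer.from(chunk));
  });
});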
I am using a combination of fragmented MP4 and websockets to deliver a live video stream to the web browser, where MSE takes over.
I have successfully fragmented into the appropriate fmp4 format using ffmpeg and have checked the data using an mpeg4parser tool.
Utilising a websocket server, the incoming data is broadcast to all the browser clients connected via websocket. This works fine for both playback and live streaming (using an RTSP stream as the input).
The problem I am facing occurs when a client tries to access the stream midway, i.e., once the ffmpeg stream has started. I have saved the init segment (ftyp + moov) elements in a queue buffer in the websocket server. This queue buffer sends its data to each new client on connection.
I believe this data is sent correctly, since the browser console does not throw a 'Media Source Element not found' error. Yet no video is streamed when the client receives the broadcast moof/mdat pairs.
So here are a couple of questions I would like answered:
1) I have observed that each moof element contains a sequence number in its mfhd child element. Does this always have to start from 1, which will naturally not be the case for a video stream accessed midway?
2) Is it possible to view the data in the browser's client.js? At present all I can see is that my mediaBuffer contains a bunch of [Object ArrayBuffer] entries. Can I print the binary data inside these buffers?
3) From the server side the data seems to be sent in moof/mdat fragments, since each new piece of data arriving from the ffmpeg output to the websocket server begins with a moof element. I noticed this by printing the binary data to the console. Is there a similar way to view this data on the client side?
4) Does anyone have an idea of why this is happening? Is there some fragmented mp4 or ISO BMFF format detail that I am missing?
If any further detail is required for clarification, please let me know and I will provide it.
Make sure your fragments include a base media decode time (the tfdt box). Then set the video tag's currentTime to the time of the first fragment received.
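A minimal sketch of the client side (video and sourceBuffer being your existing MSE objects):

sourceBuffer.addEventListener('updateend', function onFirstFragment() {
  if (sourceBuffer.buffered.length > 0) {
    // The start of the buffered range reflects the first fragment's
    // base media decode time (tfdt); start playback from there.
    video.currentTime = sourceBuffer.buffered.start(0);
    sourceBuffer.removeEventListener('updateend', onFirstFragment);
  }
});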
I am new to J2ME technology, and I am making an application that transfers text and an image (downloaded over HTTP and stored in an ImageItem of a form) from a client mobile to a server mobile over Bluetooth. The connection used is SPP. I have succeeded in transferring the text message, but I am unable to transfer the image.
Can anyone help me transfer the image to the server mobile over Bluetooth directly, without saving it to the phone memory or a memory card?
I would be thankful to you.
javax.microedition.lcdui.Image.getRGB() is the method you are looking for.
If myImageItem is your ImageItem object, the code would look like this:
------------
Image myImage = myImageItem.getImage();
int[] myImageInts = new int[myImage.getHeight() * myImage.getWidth()];
// Beware of OutOfMemoryError here.
// The third argument (scanlength) is the number of ints per row in
// the destination array, so pass the image width, not the array length.
myImage.getRGB(myImageInts, 0, myImage.getWidth(), 0, 0,
               myImage.getWidth(), myImage.getHeight());
------------
You can then convert each int in the array into 4 bytes (in the correct order, please) and feed these to your Connection's OutputStream.
Alternatively, DataOutputStream.writeInt() does the conversion for you.
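For example (a minimal sketch; connection is assumed to be your open StreamConnection to the server mobile):
------------
DataOutputStream out =
    new DataOutputStream(connection.openOutputStream());
out.writeInt(myImage.getWidth());   // send the dimensions first so
out.writeInt(myImage.getHeight());  // the server can rebuild the image
for (int i = 0; i < myImageInts.length; i++) {
    out.writeInt(myImageInts[i]);   // 4 bytes per pixel, big-endian
}
out.flush();
------------
On the server side, read the dimensions and pixels back with DataInputStream.readInt() and rebuild the picture with Image.createRGBImage().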
Well, if your server mobile is using Bluetooth and is also running an application written by you, then you can create your own protocol to do this.
For image transfer, it is best to send the bytes that were downloaded over HTTP (and used to create the ImageItem), then receive them at the server end and display them in the same way.
What is the specific problem you're encountering while doing this?
funkybro
As funkybro suggested, you can use the bytes to transfer the image to the server mobile. For that, you can just open the output stream of the connection you have made to the Bluetooth server mobile and write the byte contents to it, as in the sketch below.
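A minimal sketch of that (imageBytes is assumed to hold the bytes downloaded over HTTP, and connection the open StreamConnection):
------------
// Client: length-prefix the payload so the server knows how much to read.
DataOutputStream out =
    new DataOutputStream(connection.openOutputStream());
out.writeInt(imageBytes.length);
out.write(imageBytes);
out.flush();

// Server: read the length, then the bytes, then rebuild the image.
DataInputStream in =
    new DataInputStream(connection.openInputStream());
byte[] received = new byte[in.readInt()];
in.readFully(received);
Image img = Image.createImage(received, 0, received.length);
------------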