I need to implement video streaming with the help of Kafka. I am able to stream video to the browser with the help of Node.js, but I want to know how to stream video from Node.js to a Kafka producer, and also how to consume it.
I got this code for sending text messages, but I want to send live video.
const Transform = require('stream').Transform;
const ProducerStream = require('./lib/producerStream');
const _ = require('lodash');

const producer = new ProducerStream();

const stdinTransform = new Transform({
  objectMode: true,
  decodeStrings: true,
  transform (text, encoding, callback) {
    text = _.trim(text);
    console.log(`pushing message ${text} to ExampleTopic`);
    callback(null, {
      topic: 'ExampleTopic',
      messages: text
    });
  }
});

process.stdin.setEncoding('utf8');
process.stdin.pipe(stdinTransform).pipe(producer);
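To carry live video instead of text, the same pipeline works if you push binary chunks rather than trimmed strings. Below is a minimal sketch using the plain kafka-node Producer; the topic name, broker address, and the file used as the chunk source are all assumptions for illustration (any Readable stream of video bytes, e.g. chunks arriving over socket.io, would slot in the same way):

const kafka = require('kafka-node');
const fs = require('fs');

const client = new kafka.KafkaClient({ kafkaHost: 'localhost:9092' });
const producer = new kafka.Producer(client);

producer.on('ready', () => {
  // A file stands in for a live feed here; any Readable stream of video bytes works.
  const source = fs.createReadStream('./sample.mp4');
  source.on('data', (chunk) => {
    // kafka-node accepts Buffers as messages, so raw bytes pass through as-is.
    producer.send([{ topic: 'VideoTopic', messages: chunk }], (err) => {
      if (err) console.error('send failed', err);
    });
  });
});

producer.on('error', (err) => console.error('producer error', err));

On the consuming side, a kafka-node Consumer subscribed to the same topic receives the chunks in partition order, so keeping the stream on a single partition (or adding a sequence number to each chunk) lets you reassemble the video and serve it to the browser.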
I get the stream from my client like this:
webrtc_connection.ontrack = async (e) => {
  // TODO: record
};
How can I record / save it to a file on the server? Apparently Node.js does not have MediaRecorder, so I am at a loss as to how to go further.
There are two options. The first is to use MediaRecorder + Socket.io + FFmpeg. Here is an example of how to stream from the browser to RTMP via Node.js, but instead of streaming onward, you can just save the data to a file. The steps:
1. Draw your video on a canvas and use canvas.captureStream() to get a MediaStream from the canvas.
2. Append your audio to that MediaStream using MediaStream.addTrack().
3. Use MediaRecorder to get raw data from the MediaStream.
4. Send this raw data via WebSockets to Node.js.
5. Use FFmpeg to decode and save your video data to a file (a sketch of the server side follows this list).
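The server side of step 5 can be as small as piping the received chunks into an FFmpeg child process. A minimal sketch, assuming socket.io and an ffmpeg binary on the PATH (the 'video-chunk' event name and the port are made up for the illustration):

const { spawn } = require('child_process');
const io = require('socket.io')(3000);

io.on('connection', (socket) => {
  // -i pipe:0 reads the incoming WebM chunks from stdin; -c copy avoids re-encoding.
  const ffmpeg = spawn('ffmpeg', ['-i', 'pipe:0', '-c', 'copy', 'recording.webm']);

  socket.on('video-chunk', (chunk) => {
    ffmpeg.stdin.write(Buffer.from(chunk));
  });

  socket.on('disconnect', () => {
    ffmpeg.stdin.end(); // lets ffmpeg flush and finalize the file
  });
});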
The second option is to use node-webrtc. You can join your WebRTC room from the server as another participant and record the media tracks using FFmpeg. Here is an example.
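For a sense of what the server-side participant looks like, here is a rough sketch with node-webrtc; the signaling exchange is omitted, and RTCVideoSink comes from the library's nonstandard API, so treat this as an outline rather than a drop-in recorder:

const { RTCPeerConnection, nonstandard } = require('wrtc');
const { RTCVideoSink } = nonstandard;

const pc = new RTCPeerConnection();

pc.ontrack = ({ track }) => {
  if (track.kind !== 'video') return;
  const sink = new RTCVideoSink(track);
  sink.onframe = ({ frame }) => {
    // frame.data holds raw I420 pixels; from here the frames can be piped
    // into an ffmpeg process (-f rawvideo -pix_fmt yuv420p ...) for encoding.
  };
};

// The offer/answer exchange with the browser goes through your own signaling
// channel (e.g. socket.io), exactly as it would between two browsers.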
I am using socket.io to send the raw PCM data from each audio channel like so:
this.streamNode.onaudioprocess = (e) => {
  const leftChan = e.inputBuffer.getChannelData(0);
  const rightChan = e.inputBuffer.getChannelData(1);
  socket.emit('stream_rx_channel1', convertFloat32ToInt16(leftChan));
  socket.emit('stream_rx_channel2', convertFloat32ToInt16(rightChan));
};
I am using the Web Audio API with a ScriptProcessorNode to capture the PCM data from each channel, and I emit the left and right channel data separately to the Node.js server.
However, I need a way to merge the streams back together in Node.js to create a stereo audio stream that can be sent to Google's speech-to-text service. Google can transcribe each channel in the audio stream separately (check here). I need this because the left and right channels in this audio stream are two different voices.
I am using Google's streaming recognize for real-time speech-to-text transcribing.
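One way to rebuild the stereo stream on the Node.js side is to interleave the two 16-bit channels sample by sample, since stereo LINEAR16 is laid out as L0 R0 L1 R1 .... A minimal sketch (it assumes the left and right chunks arrive paired and equally sized; in practice you would buffer and match them by arrival order):

// left and right are Buffers of 16-bit little-endian PCM, one chunk per channel.
function interleaveStereo(left, right) {
  const stereo = Buffer.alloc(left.length + right.length);
  for (let i = 0; i < left.length / 2; i++) {
    left.copy(stereo, i * 4, i * 2, i * 2 + 2);      // left sample i
    right.copy(stereo, i * 4 + 2, i * 2, i * 2 + 2); // right sample i
  }
  return stereo;
}

The merged buffer can then be written to the recognize stream with audioChannelCount: 2 and enableSeparateRecognitionPerChannel: true set in the recognition config, which is what makes Google transcribe the two voices separately.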
I'm trying to send data (video and audio) from the browser to a Node.js server. On the client side, I'm using getUserMedia to get a stream and trying to send it over WebSockets via SockJS. Here is my code so far:
navigator.mediaDevices.getUserMedia({
  audio: true,
  video: true
})
.then(function (stream) {
  // trying to send stream
  var video = window.URL.createObjectURL(stream);
  // send stream
  mystream.send(video.play());
});
Where mystream is a SockJS instance.
I need to persist the video as it is being watched by a peer.
Has anyone ever sent a video stream to a server? I'm out of ideas on this one. Any help/hint is appreciated.
After hours of research, I just gave up and used Kurento. It is well documented and there are some pretty interesting examples with Node.js sample code. I'll leave the question open in case someone comes up with a better idea.
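For reference, recording with Kurento from Node.js boils down to creating a media pipeline with a WebRtcEndpoint feeding a RecorderEndpoint. A rough sketch with the kurento-client package; the media server URI and the recording path are assumptions, and the ICE candidate exchange is left out:

const kurento = require('kurento-client');

async function record(sdpOffer) {
  const client = await kurento('ws://localhost:8888/kurento');
  const pipeline = await client.create('MediaPipeline');

  const webRtc = await pipeline.create('WebRtcEndpoint');
  const recorder = await pipeline.create('RecorderEndpoint', {
    uri: 'file:///tmp/recording.webm'
  });

  await webRtc.connect(recorder);   // media flows browser -> endpoint -> file
  const sdpAnswer = await webRtc.processOffer(sdpOffer);
  await recorder.record();

  return sdpAnswer;                 // goes back to the browser via your signaling
}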
I have this piece of code:
navigator.mediaDevices.getUserMedia(param)
  .then(function (stream) {
    video.srcObject = stream;
    video.play();
  })
  .catch(function (err) {
    console.error('getUserMedia failed', err);
  });
In this code I want to send the stream over socket.io to a Node.js server, so that I can use it on the receiver end for display in a video element.
How can I achieve this?
I think something like this is your best bet: https://stackoverflow.com/a/17938723/5915143
You'd record the stream using MediaStreamRecorder and send it with emit() calls over socket.io to your server.
Alternatively, you can use a streaming library built on socket.io, like Endpoint.js, to handle the stream.
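A minimal sketch of the record-and-emit approach, shown here with the standard MediaRecorder API (the 'video-chunk' event name and the 1000 ms timeslice are assumptions):

navigator.mediaDevices.getUserMedia({ audio: true, video: true })
  .then(function (stream) {
    const recorder = new MediaRecorder(stream, { mimeType: 'video/webm' });

    // Emit a chunk of encoded WebM roughly every second.
    recorder.ondataavailable = function (e) {
      if (e.data.size > 0) {
        socket.emit('video-chunk', e.data);
      }
    };
    recorder.start(1000);
  });

On the server, each received chunk is just binary data that can be appended to a file or fed to FFmpeg; on a receiving browser, the chunks can be played back through Media Source Extensions.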
I have the MediaStream object returned from getUserMedia and can show it on my own screen.
The thing is, I don't know how to send / pipe / stream that MediaStream from point A through Node.js using socket.io to point B.
My code right now is:
// Camera
if (navigator.getUserMedia) {
  navigator.getUserMedia({ audio: true, video: true }, function (stream) {
    video.src = URL.createObjectURL(stream) || window.URL.createObjectURL(stream);
    webcamstream = stream;
  }, onVideoFail);
} else {
  alert('failed');
}

function onVideoFail(e) {
  console.log('webcam fail!', e);
}
I need a way to make this stream be sent constantly to another user using Node.js.
The comments made in the answer to Audio and video conference with NodeJS are still valid:
If you were to send the streams through socket.io instead of a peer connection, in any case that would be the raw video content (bytes). Compared to WebRTC, you would lose:
- the streaming part (RTP/RTCP) and the corresponding packet-loss cancellation,
- the bandwidth adaptation,
- the encryption (DTLS),
- the media engine (jitter correction, …).
Why not implement an SFU in Node.js? Use Node.js/socket.io for the signaling (i.e. the initial handshake), but also use Node.js as a peer connection endpoint which relays/routes the media stream to (an)other peer(s). You would have an intermediate server like you seem to want, and all the advantages of WebRTC.
Another solution is to use an MCU; google "webrtc mcu" and you will find many.
This might be a little late, but there's now a library called wrtc/node-webrtc. Use that! https://github.com/node-webrtc/node-webrtc