Wav to Blob in Node.js

I'm not sure how to create a blob from a wav file in node. Do I just use Buffer like so?...
var blippityBlob = new Buffer(filePathToWave);

Maybe you could take a look at BinaryJS
Quoting:
BinaryJS is a lightweight framework that utilizes websockets to send, stream, and pipe binary data bidirectionally between browser javascript and Node.js.
Server Code
var fs = require('fs');
var BinaryServer = require('binaryjs').BinaryServer;

var server = BinaryServer({port: 9000});
server.on('connection', function(client){
  client.on('stream', function(stream, meta){
    // write the incoming binary stream to the file named by the client
    var file = fs.createWriteStream(meta.file);
    stream.pipe(file);
  });
});
Client Code
var BinaryClient = require('binaryjs').BinaryClient;

var client = BinaryClient('ws://localhost:9000');
client.on('open', function(){
  // open a named stream, send two chunks, then close it
  var stream = client.createStream({file: 'hello.txt'});
  stream.write('Hello');
  stream.write('World!');
  stream.end();
});

The answer lies in a combination of these two posts:
Node.js can't create Blobs?
Convert a binary NodeJS Buffer to JavaScript ArrayBuffer
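Putting those together, a minimal sketch (the file path is a placeholder): read the file into a Buffer, then copy it into an ArrayBuffer, which is what browser-side Blob constructors accept. Note that new Buffer(filePathToWave) would only encode the path string itself, not the file contents.
var fs = require('fs');

// read the wav file contents into a Buffer
var buffer = fs.readFileSync('/path/to/sound.wav');

// copy the Buffer into an ArrayBuffer, as described in the second post above
function toArrayBuffer(buf) {
  var ab = new ArrayBuffer(buf.length);
  var view = new Uint8Array(ab);
  for (var i = 0; i < buf.length; i++) {
    view[i] = buf[i];
  }
  return ab;
}

var arrayBuffer = toArrayBuffer(buffer);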

Related

How to Stream utf8 over WebSockets in NodeJS ws?

I am not sure if this is a bug or not.
According to the websocket/ws documentation, you can use Node.js streams.
However, my browser only receives binary messages when doing this, despite setting the encoding to utf8:
const WebSocket = require('ws');
const fs = require('fs');

new WebSocket.Server({port: 8080}).on('connection', function(ws){
  // wrap the socket in a duplex stream and pipe the file into it
  var duplex = WebSocket.createWebSocketStream(ws, {encoding: 'utf8'});
  var stream = fs.createReadStream(pathToFile, {encoding: 'utf8'});
  stream.pipe(duplex);
});
Is this a bug, or how can I stream it as utf8?
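One likely explanation (an assumption, not something the ws docs quoted here confirm): pipe delivers Buffer chunks to the duplex's writable side, and ws sends Buffers as binary frames regardless of the stream encoding. A sketch of a workaround that sends each chunk explicitly with ws.send(), since ws transmits JavaScript strings as utf8 text frames (pathToFile as in the question):
const WebSocket = require('ws');
const fs = require('fs');

new WebSocket.Server({port: 8080}).on('connection', function(ws){
  // encoding: 'utf8' makes the read stream emit strings instead of Buffers,
  // and ws sends strings as text (utf8) frames
  const stream = fs.createReadStream(pathToFile, {encoding: 'utf8'});
  stream.on('data', function(chunk){
    ws.send(chunk);
  });
});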

How could I stream a video with a range from an FTP server in Node.js

I'm using Node.js with Express and this FTP node package:
https://www.npmjs.com/package/ftp
Here is what I do:
var Client = require('ftp');
var fs = require('fs');

var c = new Client();
c.on('ready', function() {
  c.get('foo.txt', function(err, stream) {
    if (err) throw err;
    stream.once('close', function() { c.end(); });
    stream.pipe(res); // res is the Express response from the enclosing route handler
  });
});
c.connect();
On the front end I simply use a video player that gets its stream from that server.
The issue I'm having is that the .get method does not provide a range parameter, so I cannot get a specific part of a video (for example, a stream that starts at the 5-minute mark). I can only get a stream from the beginning.
How could I open a stream of a video on an FTP server at a given offset, so I can later stream a specific part of that video using the Range header coming from the client?
Thanks a lot
Have you found this example? Streaming a video file to an html5 video player with Node.js so that the video controls continue to work?
You didn't provide any details on how you are loading the video on the front end; add some snippets of what you wrote on both the front end and the back end.
If you just need a way to pass a range parameter through a GET request, you can use a query string, but you would have to implement that yourself, and I don't believe you want to do that (/video.mpg?range=99)
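For the byte offset itself, the same ftp package documents a restart(byteOffset, callback) method that issues the FTP REST command, so the next get() starts at that offset. A sketch under the assumption the FTP server supports REST (startByte, the file name, and res are placeholders; mapping a time like 5 minutes to a byte offset still requires knowing the bitrate):
var Client = require('ftp');

var c = new Client();
c.on('ready', function() {
  // derived from the client's Range header, for example
  var startByte = 5 * 1024 * 1024;
  // REST sets the starting offset for the next transfer
  c.restart(startByte, function(err) {
    if (err) throw err;
    c.get('video.mp4', function(err, stream) {
      if (err) throw err;
      stream.once('close', function() { c.end(); });
      stream.pipe(res); // res from the enclosing Express route handler
    });
  });
});
c.connect();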

How do I intercept outgoing tcp messages in node?

How can I write a simple stream which intercepts messages?
For example, say I want to log (or eventually transform) a message being sent over the wire by a user's socket.write(...) call.
Following is a minimal program which attempts to do this:
const net = require('net');
const stream = require('stream');

const socket = new net.Socket();
const transformer = new stream.Transform({
  transform(chunk, e, cb){
    console.log("OUT:" + chunk.toString());
    cb();
  }
});

//const client = socket.pipe(transformer); // <= prints "OUT:" on client, but nothing on server
const client = transformer.pipe(socket); // <= prints nothing on client, but "hello world" on server

socket.on('data', (data) => { console.log("IN:" + data.toString()); });
socket.connect(1234, 'localhost', () => { client.write("hello world"); });
When I do socket.pipe(transformer), the client prints "OUT:" (like I want), but doesn't actually send anything to the server. When I swap the pipe locations, transformer.pipe(socket), nothing gets printed to the client but the message gets sent to the server.
Although not listed here, I also tried to use a Writable stream, which does print the message on the client, but it is never sent to the server (even with a this.push(...) inside the Writable stream, nothing seems to reach the server).
What am I missing here?
EDIT: Reformatted the code for clarity and updated the text
It looks like I needed to change the following line
socket.connect(1234, 'localhost', ()=>{ client.write("hello world"); });
to this
socket.connect(1234, 'localhost', ()=>{ transformer.write("hello world"); });
This is based on #Mr.Phoenix's comment. I expected .pipe() to return a new stream which I could use. I believe that is how Java's Netty framework does it, and I kept expecting Node streams to work the same way.
You're not writing any data out of the stream.
You need to either this.push(chunk) or change the call to cb to cb(null, chunk).
See the docs about implementing transform streams for more info.
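Putting both fixes together, a corrected version of the snippet from the question (same port and message as the original):
const net = require('net');
const stream = require('stream');

const socket = new net.Socket();
const transformer = new stream.Transform({
  transform(chunk, encoding, cb){
    console.log("OUT:" + chunk.toString());
    cb(null, chunk); // pass the chunk through so it reaches the socket
  }
});

transformer.pipe(socket); // transformed output flows into the socket
socket.on('data', (data) => { console.log("IN:" + data.toString()); });
// write into the transformer, not into the return value of .pipe()
socket.connect(1234, 'localhost', () => { transformer.write("hello world"); });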

Sending emails without using a mail server on the customer's premises

I have software running on customer servers on premises. There are multiple pieces of software, and I want an email sent to me when any of them fails.
It can be a pain enabling and configuring this to work with the customer's mail servers.
I thought I would write a simple socket program in Node.js to read the error log file and push those messages to my server, which would handle sending the email,
or maybe a web service to call for sending email.
If anyone has used something like this, please tell me, or does an easier solution exist somewhere?
Updating my question
As per the comments, I tried to implement that solution. Here is my main Node.js server file, and exactly where I am facing a problem now with the socket event emit. I want to emit a socket event whenever the log.xml file changes, but this runs only one time.
var app = require('http').createServer(handler),
    io = require('socket.io').listen(app),
    parser = require('xml2json'),
    fs = require('fs');

app.listen(8030);
console.log('server listening on localhost:8030');

// creating a new websocket to keep the content updated without REST call
io.sockets.on('connection', function (socket) {
  console.log(__dirname);
  // reading the log file
  fs.readFile(__dirname + '/var/home/apache/log.xml', function (err, data) {
    if (err) throw err;
    // parsing the new xml data and converting it into json
    var json = parser.toJson(data);
    // send the new data to the client
    socket.emit('error', json);
  });
});
/* Email send service. This code runs on my client server, outside of the main socket server cloud.
   This part is working fine; I tested it on my other server.

var socket = io.connect('http://localhost:8030');
socket.on('error', function (data) {
  // convert the json string into a valid javascript object
  var _data = JSON.parse(data);
  mySendMailTest(_data);
});
*/
Please excuse me, as I am new to the Stack Overflow community.
I think there is no problem in your socket code, but you need to use fs.watchFile before reading the file. It is a watch function similar to an Angular watch: it detects any change to your file and runs a callback, in which you can emit the socket event.
https://nodejs.org/docs/latest/api/fs.html#fs_fs_watchfile_filename_options_listener
// creating a new websocket to keep the content updated without REST call
io.sockets.on('connection', function (socket) {
  console.log(__dirname);
  // watching the log file
  fs.watchFile(__dirname + '/var/home/apache/log.xml', function (curr, prev) {
    // on file change, just read it
    fs.readFile(__dirname + '/var/home/apache/log.xml', function (err, data) {
      if (err) throw err;
      // parsing the new xml data and converting it into json
      var json = parser.toJson(data);
      // send the new data to the client
      socket.emit('error', json);
    });
  });
});

How to continuously put data into a stream and transmit it while compressing it in Node.js

I am a newbie to JavaScript.
What I am trying to do is fetch data from the database and then transmit it over the internet.
Right now I can only read one entry at a time, but I want to compress all the entries together rather than compressing one entry at a time.
I could store all of them in an array and then pass the array to a zlib function, but that takes up a lot of time and memory.
Is it somehow possible to compress the data while transmitting it in Node.js with the Express API, at the same time as it is being read? Sort of like streaming servers, which compress data in real time while retrieving it from memory and then transmit it to the client.
It's certainly possible. You can play around with this example:
var express = require('express')
  , app = express()
  , zlib = require('zlib')

app.get('/*', function(req, res) {
  res.status(200)

  // pipe a gzip stream into the response and write entries into it as they arrive
  var stream = zlib.createGzip()
  stream.pipe(res)

  var count = 0
  stream.write('[')

  ;(function fetch_entry() {
    if (count > 10) return stream.end(']')
    stream.write((count ? ',' : '') + JSON.stringify({
      _id: count,
      some_random_garbage: Math.random(),
    }))
    count++
    setTimeout(fetch_entry, 100)
  })()
})

app.listen(1337)
console.log('run `curl http://localhost:1337/ | zcat` to see the output')
I assume you're streaming JSON, and the setTimeout calls would need to be replaced with actual database calls, of course. But the idea stays the same.
I'd recommend using Node.js's pipe.
Here is an example of pipe streaming with zlib (compression): it reads a file, compresses it and writes it to a new file.
var zlib = require('zlib');
var fs = require('fs');

var gzip = zlib.createGzip();
var inp = fs.createReadStream('input.txt');
var out = fs.createWriteStream('input.txt.gz');
inp.pipe(gzip).pipe(out);
You can change the input to come from your database input and change the output to be the HTTP response.
ref : http://nodejs.org/api/stream.html
ref : http://nodejs.org/api/zlib.html
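Applied to the question, the same pipe chain with an Express response as the destination; a minimal sketch (the route path and input file are placeholders). Setting Content-Encoding lets the browser decompress the body transparently:
var express = require('express');
var zlib = require('zlib');
var fs = require('fs');

var app = express();
app.get('/download', function (req, res) {
  // tell the client the body is gzip-compressed
  res.set('Content-Encoding', 'gzip');
  fs.createReadStream('input.txt').pipe(zlib.createGzip()).pipe(res);
});
app.listen(1337);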
