How can Node.js stdio streams be wrapped into a Duplex?
I've tried something like
const pair = {
  readable: process.stdin,
  writable: process.stdout,
};
const duplex = Duplex.fromWeb(pair, { encoding: 'utf8', objectMode: true });
Still, it doesn't accept a ReadStream as a ReadableStream (nor, I suppose, a WriteStream as a WritableStream):
node:internal/webstreams/adapters:596
    throw new ERR_INVALID_ARG_TYPE(
    ^

TypeError [ERR_INVALID_ARG_TYPE]: The "pair.readable" property must be an instance of ReadableStream. Received an instance of ReadStream
So what is the way to wrap one into another?
I have the following code, heavily inspired by the client-side example in the official Node.js documentation:
import http2 from 'http2';
// The `http2.connect` method creates a new session with somedomain.com
const session = http2.connect('https://somedomain.com');
// Handle errors
session.on('error', (err) => console.error(err))
const req = session.request({
  ':authority': 'somedomain.com',
  ':path': '/some-path',
  ':method': 'GET',
  ':scheme': 'https',
  'accept': 'text/html',
});
// To fetch the response body, we set the encoding
// we want and initialize an empty data string
req.setEncoding('utf8')
let data = ''
// append response data to the data string every time
// we receive new data chunks in the response
req.on('data', (chunk) => { data += chunk })
// Once the response is finished, log the entire data
// that we received
req.on('end', () => {
  console.log(`\n${data}`)
  session.close();
});
req.on('error', (error) => {
  console.log(error);
});
req.end();
Please note that the actual hostname has been replaced with somedomain.com. Running this results in the data being logged, as expected; however, the process doesn't shut down gracefully. I get the following unhandled error in the terminal:
node:events:504
      throw er; // Unhandled 'error' event
      ^

Error [ERR_HTTP2_STREAM_ERROR]: Stream closed with error code NGHTTP2_FLOW_CONTROL_ERROR
    at new NodeError (node:internal/errors:371:5)
    at ClientHttp2Stream._destroy (node:internal/http2/core:2330:13)
    at _destroy (node:internal/streams/destroy:102:25)
    at ClientHttp2Stream.destroy (node:internal/streams/destroy:64:5)
    at Http2Stream.onStreamClose (node:internal/http2/core:544:12)
Emitted 'error' event on ClientHttp2Stream instance at:
    at emitErrorNT (node:internal/streams/destroy:157:8)
    at emitErrorCloseNT (node:internal/streams/destroy:122:3)
    at processTicksAndRejections (node:internal/process/task_queues:83:21) {
  code: 'ERR_HTTP2_STREAM_ERROR'
}
I understand it is possible that the server is behaving incorrectly. However, there should be a way for the Node.js client to close the session gracefully. Regardless, what would be the ideal way to handle such errors? I've already tried listening to session.on('error') and req.on('error'), but that doesn't work.
I am making a Node.js music bot for Discord, and I suddenly encountered a problem. The bot properly joins the channel and lights up (indicating it is speaking), but there is no audio. After trying to find the root cause, I believe it is a problem with the ytdl() function from the ytdl-core module.
const stream = await ytdl(song.url, {
  filter: 'audioonly',
  type: 'opus',
  highWaterMark: waterMark
});
Looking at the result of stream, I found this:
PassThrough {
  _readableState: ReadableState {
    objectMode: false,
    highWaterMark: 524288,
    buffer: BufferList { head: null, tail: null, length: 0 },
    length: 0,
    ...
This means I am not getting any buffer/stream data. The bot is indeed "playing", but because the stream is empty, only silence can be heard.
I tried using pipe() and it worked just fine, but I can't play the result as-is through my music bot.
The ytdl function behaves like a synchronous function here ("ytdl-core": "^4.10.1"):
const stream = await ytdl(song.url); pauses code execution until ytdl has finished downloading the stream.
console.log("YTDL Start");
const stream = await ytdl(song.url, {
  filter: 'audioonly',
  type: 'opus',
  highWaterMark: waterMark
});
console.log("Is it done?");
stream.on('close', () => {
  console.log('Read stream closed');
});
stream.on('finish', () => {
  console.log('Read stream Finished');
});
From Electron's preload script I pass Node.js modules to the renderer interface with the contextBridge like this:
const udp = require('dgram');
contextBridge.exposeInMainWorld(
  'electron',
  {
    udp: udp,
  }
)
Then for example in the web console I can do:
var udp = window.electron.udp
var client = udp.createSocket('udp4');
Most of the methods on client work.
But when I try to attach event handlers I get this error:
client.on('message', function (msg, info) {
  console.log('Data received from server : ' + msg.toString());
});
=> VM542:1 Uncaught TypeError: client.on is not a function
I don't understand why I can't use the on methods in the web console. How should I listen to events there? Or could this error be related to jQuery?
Thank you.
I know that I could set contextIsolation: false, but then I get other errors.
For reference, this is what client looks like in the console:
client => {_events: {…}, _eventsCount: 0, _maxListeners: undefined, type: "udp4", Symbol(kCapture): false, …}
I am writing Node.js code to upload a picture from a Raspberry Pi to my Google Drive.
I have tried to upload the image file produced in the same folder, but Node.js seems to ignore the file: it always returns ENOENT although the file is present. I have verified the existence of the file manually, and the path and the filename are correct. I have also verified this by printing the filename to the console, and it matches.
var fileName1 = Date.now();
const path = require('path');
const fs1 = require('fs');
var fN = fileName1+".jpg";
console.log("Only filename : "+fN);
const finalPath = path.join(__dirname, fN);
console.log("Final filename : "+finalPath);
var media = {
  mimeType: 'image/jpeg',
  // PATH OF THE FILE FROM YOUR COMPUTER
  body: fs1.createReadStream(finalPath)
};
Output
Only filename : 1571724785329.jpg
Final filename : /home/pi/nodecode/1571724785329.jpg
events.js:174
      throw er; // Unhandled 'error' event
      ^
Error: ENOENT: no such file or directory, open '/home/pi/nodecode/1571724785329.jpg'
Thanks a lot, everyone, for sharing your views and trying to work out an answer for me. I found the problem: the file had not yet been created at the instant Node.js was trying to locate it. The file creation and the upload were happening in parallel, so I used Python instead of Node.js.
Your code seems to work for me. The only problem I see is in the way you are creating the file name: a value generated with Date.now() will never match one generated earlier, no matter how you produced it.
Thus the sequence of code execution is the only plausible cause of your problem. For example, the file you expect to load into the readable stream is not yet created when the code that creates the read stream executes.
Try debugging the code execution sequence and find out whether the file is created before or after fs1.createReadStream(finalPath) is executed.
var fileName1 = "wf";
const path = require('path');
const fs1 = require('fs');
var fN = fileName1 + ".jpg";
console.log("Only filename : " + fN);
const finalPath = path.join(__dirname, fN);
console.log("Final filename : " + finalPath);
var media = {
  mimeType: 'image/jpeg',
  // PATH OF THE FILE FROM YOUR COMPUTER
  body: fs1.createReadStream(finalPath)
};
console.log(media);
The output I got:
Only filename : wf.jpg
Final filename : D:\dir1\dir2\myproject\wf.jpg
{ mimeType: 'image/jpeg',
  body:
   ReadStream {
     _readableState:
      ReadableState {
        objectMode: false,
        highWaterMark: 65536,
        buffer: BufferList { head: null, tail: null, length: 0 },
        length: 0,
        pipes: null,
        pipesCount: 0,
        mode: 438,
        start: undefined,
        end: Infinity,
        autoClose: true,
        pos: undefined,
        bytesRead: 0,
        closed: false } }
I am trying to create a proxy-update script with Node.js, and I'm getting this error:
events.js:180
    throw new errors.TypeError('ERR_INVALID_ARG_TYPE', 'listener', 'Function');
    ^

TypeError [ERR_INVALID_ARG_TYPE]: The "listener" argument must be of type Function
    at _addListener (events.js:180:11)
    at WriteStream.addListener (events.js:240:10)
    at WriteStream.close (fs.js:2298:10)
    at WriteStream.<anonymous> (/Users/camden/Desktop/proxyupdate/u.js:9:15)
    at WriteStream.emit (events.js:164:20)
    at finishMaybe (_stream_writable.js:616:14)
    at afterWrite (_stream_writable.js:467:3)
    at onwrite (_stream_writable.js:457:7)
    at fs.write (fs.js:2242:5)
    at FSReqWrap.wrapper [as oncomplete] (fs.js:703:5)
here is my code:
var UpdateProxyList = function(sourceURL, destinationPath) {
  var HTTP = require("http");
  var FS = require("fs");
  var File = FS.createWriteStream(destinationPath);
  HTTP.get(sourceURL, function(response) {
    response.pipe(File);
    File.on('finish', function() {
      File.close();
    });
    File.on('error', function(error) {
      FS.unlink(destinationPath);
    })
  });
}
UpdateProxyList("http://www.example.com/proxy.txt", "myproxylist.txt");
I'm on macOS Sierra with Node.js v9.3.0.
Apparently, when I use Node.js v8.9.3, it works fine.
Between v8.9.3 and v9.3.0, the implementation of WriteStream.prototype.close has changed.
In v8.9.3, it was a reference to ReadStream.prototype.close, for which a callback argument was optional.
In v9.3.0, it is now a separate method that, amongst other things, emits a close event:
WriteStream.prototype.close = function(cb) {
  if (this._writableState.ending) {
    this.on('close', cb);
    return;
  }
  ...
};
The error that you get is caused by this.on('close', cb), which requires a Function second argument that isn't being passed in your code.
I'm not sure you actually need a finish handler at all in your situation, as the writable side is handled internally by the .pipe() code.