I have the following code:
process.stdin.pipe(output).pipe(socket);
I want to modify the contents of output before it is piped into socket. I need to add an additional pipe between the two, but I am not sure how to do that. I checked the documentation on streams, but I could not figure out how to create my own object to pass to pipe that will intercept the current pipeline.
What kind of object does the pipe method accept?
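For illustration, this is the kind of in-between stage I'm imagining, sketched with a stream.Transform (the uppercasing is only a stand-in for whatever modification I actually need; socket is the same socket as above):
const { Transform } = require('stream');

// Pass each chunk through, modified; uppercasing stands in for the real logic.
const modify = new Transform({
  transform(chunk, encoding, callback) {
    callback(null, chunk.toString().toUpperCase());
  }
});

process.stdin.pipe(modify).pipe(socket);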
What does
process.stdout.connect({})
do and how can I use it?
I tried googling it but found nothing.
I tried passing options, but I get this error:
TypeError: self._handle.connect is not a function
As the documentation says, process.stdout returns a stream connected to stdout, and that stream is of type Socket. Now, if you check the documentation for Socket and look for the connect method, you'll find that it initiates a connection on a given socket.
To use process.stdout you don't actually need to call connect; you can just write arbitrary strings to the stream, and they will be echoed to your stdout, e.g.
process.stdout.write("test"); // will print test to the console
But if logging to the console is your intent, it's probably easier to just use the provided console.log(), which adds a lot of formatting and other conveniences (see this for further info).
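To make the difference concrete, a small sketch (both lines end up on stdout; console.log adds util.format-style formatting and a trailing newline):
process.stdout.write('raw output, no newline ');
console.log('formatted: %d %s %j', 42, 'text', { a: 1 });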
process.stdin.on('data',function(data){input_std_in+=data});
What is the correct explanation of the above piece of code? I am new to Node.js and have found many variations of this on the net, but I am still not clear on it.
process.stdin is used to read data from the command line (that's the simple explanation; for more, see here).
So below you are waiting for stdin's data event, i.e. you wait for the user to type some data into the terminal, then you read it and append it to a string. Since Node.js (JavaScript) is event driven, it waits for an event to happen, takes the data from that event, and uses it further; in the script below it appends the data to an already declared variable.
let input_std_in = "";
process.stdin.on('data', function(data) {
    console.log("Data", data.toString()); // log each chunk as it arrives
    input_std_in += data.toString();      // append the chunk to the accumulated input
});
See it working here.
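For completeness, a common variant (assuming you want to act once all input has been read) also listens for the end event:
let input_std_in = "";
process.stdin.on('data', function(data) {
    input_std_in += data.toString();
});
process.stdin.on('end', function() {
    // stdin was closed (e.g. Ctrl+D, or the piping process finished)
    console.log("Full input:", input_std_in);
});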
I'm writing a client-side application which should read in a file, transform its content and then export the result. To do this, I decided on Re-Frame.
Now, I'm just starting to wrap my head around Re-Frame and ClojureScript itself, and I got the following to work:
Somewhere in my view functions, I dispatch this whenever a new file gets selected via a simple HTML input:
[:input {:class "file-input" :type "file"
:on-change #(re-frame/dispatch
[::events/file-name-change (-> % .-target .-value)])}]
What I get is something like C:\fakepath\file-name.txt, with fakepath actually being part of it.
My event handler currently only splits the path and saves the file name, which the view above subscribes to in order to display the selected file.
(re-frame/reg-event-db
 ::file-name-change
 (fn [db [_ new-name]]
   ;; split comes from clojure.string
   (assoc db :file-name (last (split new-name #"\\")))))
Additionally I want to read in the file to later process it locally. Assuming I'd just change my on-change action and the event handler to do this instead, how would I do it?
I've searched for a while but found next to nothing. The only things that came up were other frameworks and such, but I don't want to introduce a new dependency for each and every new problem.
I'm assuming you want to do everything in the client using HTML5 APIs (e.g. no actual upload to a server).
This guide from MDN may come handy: https://developer.mozilla.org/en-US/docs/Web/API/File/Using_files_from_web_applications
It seems you can subscribe to the event triggered when the user selects the file(s); then you can obtain a list of said files and inspect their contents through the File API: https://developer.mozilla.org/en-US/docs/Web/API/File
In your case, you'll need to save a reference to the FileList object from the event somewhere, and re-use it later.
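A minimal sketch of how that could look with Re-Frame, assuming a made-up ::file-loaded event and reading the file as text through js/FileReader:
(re-frame/reg-event-db
 ::file-loaded
 (fn [db [_ text]]
   (assoc db :file-content text)))

[:input {:class "file-input" :type "file"
         :on-change (fn [e]
                      (let [file   (-> e .-target .-files (aget 0))
                            reader (js/FileReader.)]
                        (set! (.-onload reader)
                              #(re-frame/dispatch
                                [::file-loaded (-> % .-target .-result)]))
                        (.readAsText reader file)))}]
The onload callback fires once the read finishes; at that point the reader's .-result holds the file's text, which can be dispatched like any other event payload.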
When I pipe something like an image file through a stream, is there any way to send a meta object along with it?
My server gets sent an image from a user. The image gets pushed through a set of streams that perform various actions.
The final stream emits a data event and passes the resulting image buffer into a callback, but I lose all context for the user. I need to keep the resulting image tied to the user's id and some other metadata.
Ideal:
stream.on('data', function(img, meta){
...
})
Thanks for any possible solutions!
In short, no, there's nothing built into Node.js to support including metadata with streams. You do have some other options, though, including:
You could use a closure to track the metadata separately from the stream. For example:
function handleImage(imageStream) {
    var meta = {...};
    imageStream.pipe(otherStreams).on('data', function(image) {
        // you now have `image` and `meta` variables at your disposal here.
    });
}
The downside of this is that the metadata is not available to your otherStreams.
This is a good solution if your other streams are third-party code outside of your control, or if they don't need to know about the metadata.
You could do something similar to HTTP headers, where all the data up to a certain point is metadata and everything after it is the image. (In HTTP, the delimiter is the first blank line, i.e. the first \r\n\r\n.) All of the streams in your chain have to know about this and handle it, though.
If you know your metadata will always be in one chunk and none of your streams split or merge chunks, then you could simplify this a bit and just say that the first (or last) chunk is always metadata.
Switch to an object stream, like Amoli mentioned in his answer. Here you would pass {image: imgData, meta: {...}} down the pipeline; you would then have to update your other streams to expect this format (a minimal sketch appears after this list of options).
The main downside of this method, though, is that you either have to pass the metadata multiple times, cache it somewhere for each stream that needs it, or pass your entire image as one chunk (which rather defeats the point of "streams"). And, from what I've been told, Node.js can optimize text/binary streams better than object streams. So this probably isn't a good approach for your situation.
https://github.com/dominictarr/mux-demux might be helpful here. It combines multiple streams into one, so you could have separate image and meta streams. I'm not sure how well it would work for your situation though. You'd probably need to update all of your streams to be aware of it.
I know I said that all but the first option require modifying the other streams, but there is a way around that: you could create a generic "stream wrapper" that splits up the image and metadata, passes just the image data through the wrapped stream, and lets the metadata bypass it and go on to the next stream in the chain. This gets ugly fast, though, so it's probably not the best idea.
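As a point of reference for the object-stream option above, here is a minimal sketch of one pipeline stage (the { image, meta } shape is just an assumed convention, not anything Node.js prescribes):
const { Transform } = require('stream');

// Every chunk flowing through the pipeline is an object of the form
// { image: <Buffer>, meta: {...} }, so the metadata travels with the image.
const stage = new Transform({
  objectMode: true,
  transform(chunk, encoding, callback) {
    // process chunk.image here, then forward it together with the untouched meta
    callback(null, { image: chunk.image, meta: chunk.meta });
  }
});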
Basically, whenever you want to read or write any objects which are not strings or buffers, you'll need to put your stream into objectMode.
Example (source):
var stream = require('stream');
var util = require('util');

function S3Lister (s3, options) {
    options || (options = {});
    stream.Readable.call(this, { objectMode : true });

    this.s3 = s3; // a knox-like client.
    this.marker = options.start;
    this.connecting = false;
    this.ended = false;
}
util.inherits(S3Lister, stream.Readable);
We set the stream to use objectMode as we want to return not just data but also some metadata.
For more information:
Node.js Docs stream object mode
An introduction to Node.js streams
I created a module called metastream for this type of thing. (It's on npm.)
I tried using the SetOutputVoices function, and the corresponding constructor parameter, but both fail with XAUDIO2_E_INVALID_CALL when used on a submix voice.
The docs say that you get that error by calling it from an audio callback, but I'm not. I have even tried calling it before starting the audio engine.
The same call works for source voices, so I'm pretty sure I'm not passing a bad XAUDIO2_VOICE_SENDS structure.
Submix voices have a processing order, specified by the ProcessingStage parameter of IXAudio2::CreateSubmixVoice.
A voice can only send output to a submix voice with a higher processing stage than its own, and I had all of my submixes at the default processing stage (0).
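A minimal sketch of the fix, assuming xaudio2 is an already-initialized IXAudio2* (channel counts and sample rate are placeholders):
#include <xaudio2.h>

IXAudio2SubmixVoice* sender = nullptr;
IXAudio2SubmixVoice* receiver = nullptr;

// Create the sender at ProcessingStage 0 and the receiver at ProcessingStage 1.
xaudio2->CreateSubmixVoice(&sender, 2, 44100, 0, 0);
xaudio2->CreateSubmixVoice(&receiver, 2, 44100, 0, 1);

// Route the sender's output into the receiver.
XAUDIO2_SEND_DESCRIPTOR send = { 0, receiver };
XAUDIO2_VOICE_SENDS sends = { 1, &send };
sender->SetOutputVoices(&sends); // succeeds: stage 0 -> stage 1; fails if both voices share a stage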