I have the following Node.js code:
var a = [1, 2, 3, 4, 5, 6]

function test() {
  var v = a.pop()
  if (!v) return

  function uno() {
    due(v, function () {
      console.log(v)
    })
    console.log("Start:", v)
    return test()
  }

  function due(v, cb) {
    setTimeout(function () {
      console.log(v);
      cb();
    }, 5000);
  }

  uno();
}

test()
This is the output:
Start: 6
Start: 5
Start: 4
Start: 3
Start: 2
Start: 1
6
6
5
5
4
4
3
3
2
2
1
1
As you can see, inside the uno() function I call the due() function with a timeout.
I have two console.log(v) calls (one inside uno() and one inside due()).
Could someone explain WHY, when I call the callback (cb()), the v value is the same?
Doing:
due(v, function () {
  console.log(v)
})
will the console.log keep the v value I passed in the due() call?
Why does it not get the "global" v value from the test() function?
The callback cb() is the following function: function(){ console.log(v) }. The v is taken from the local environment in effect when you define the function, because it is not a parameter of the callback function (it is an upvalue, i.e. a closure variable). That means the first time you call test() it has the value 6, the second time the value 5, and so on.
You should give the parameters different names than the outer variables, for example:
function due(param_v, cb) {
  setTimeout(function () {
    console.log(param_v);
    cb();
  }, 5000);
}
Then you might spot the difference.
Edit: this is not related to Node at all, but to JavaScript in general (and many programming languages behave exactly the same way). You should play around with it and put the callbacks etc. aside for a while.
var a

function print_a() {
  // this function sees the variable named a in the "upper" scope, because
  // none is defined here.
  console.log(a)
}

function print_b() {
  // there is no variable named "b" in the upper scope and none defined here,
  // so this gives an error
  console.log(b)
}

a = 1
print_a() // prints 1
// print_b() // error - b is not defined

var c = 1

function dummy() {
  var c = 99
  function print_c() {
    // the definition of c where c is 99 hides the def where c is 1
    console.log(c)
  }
  print_c()
}

dummy() // prints 99
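The per-call capture from the original question can also be shown without timers or recursion. Each call below creates a fresh closure over its own v (the name makeLogger is just an illustrative choice, not from the original code):

```javascript
// Each call to makeLogger creates a new closure capturing its own v.
function makeLogger(v) {
  return function () {
    console.log(v);
  };
}

var logSix = makeLogger(6);
var logFive = makeLogger(5);
logSix();  // prints 6 - this closure still sees the v it was created with
logFive(); // prints 5
```

This is exactly what happens with the callback passed to due(): each recursive call to test() creates a new v, and the callback defined in that call closes over that particular v.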
I have some code below. I expected it to log a and b correctly, but instead it logs a as 1 and then fails for b with an error:
Uncaught ReferenceError: b is not defined
function foo() {
  var a = 1
  let b = 2
  (function bar() {
    console.log(a)
    console.log(b)
  }())
}
console.log(foo());
If I change the code to make bar a function declaration, everything is fine:
function foo() {
  var a = 1
  let b = 2
  function bar() {
    console.log(a)
    console.log(b)
  }
  bar()
}
console.log(foo());
I know something about function scope, block scope, and hoisting, but I really don't understand what happens to make b 'not defined'.
All the code above was run in Chrome 66's DevTools.
This is one of the reasons not to rely on Automatic Semicolon Insertion (ASI) unless you and every programmer who might conceivably work on the code are very clear on the rules for ASI.
Your code is trying to call 2 as a function, passing in the result of calling your bar function as an argument, because the ( around the following IIFE can reasonably (by ASI rules) be considered part of the let b = statement. Since calling bar happens before b is declared (it is not declared until after its initializer runs), you get the error.
If we change the line breaks a bit you can see what's going on:
function foo() {
  var a = 1
  let b = 2(function bar() {
    console.log(a)
    console.log(b)
  }());
}
console.log(foo());
Adding the ; fixes it:
function foo() {
  var a = 1;
  let b = 2;
  (function bar() {
    console.log(a);
    console.log(b);
  }());
}
console.log(foo());
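The reason the message is a ReferenceError, rather than b simply being undefined, is the temporal dead zone: a let binding exists from the top of its block but cannot be read until its declaration has executed. A minimal sketch (tdzDemo and readB are illustrative names):

```javascript
function tdzDemo() {
  function readB() {
    return b; // b is in the temporal dead zone until "let b = 2" runs
  }
  try {
    readB(); // called before the declaration executes
  } catch (e) {
    console.log(e instanceof ReferenceError); // true
  }
  let b = 2;
  console.log(readB()); // 2 - after the declaration, b is readable
}

tdzDemo();
```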
I am learning Node.js right now and I am confused by the snippet of code below. dosomething is called later on in the code without any parameters. So what is the value of cb set to (since no parameters were passed)?
let dosomething = (cb) => {
  checkAuthToken.get((err, authKey) => {
    if (err) {
      return cb(err)
    }
    return cb(null, authKey);
  })
}
#UZA, here in your code, dosomething() is a function that takes one parameter, which is another function (a callback).
In case of error, you call that callback with one parameter, the error err.
In case of success, you call the callback with two parameters, null and authKey.
You have used arrow functions in your code; I think that is what is confusing you.
Please comment if the explanation doesn't solve your problem; I will update my answer with more examples.
Here I have shown 2 simple examples.
» Simple function syntax
function doSomething(cb) {
  if (true) {
    cb("I am a programmer");
  }
}

// Call doSomething() with 1 parameter as a function
doSomething(function (message) {
  console.log(message); // I am a programmer
})
» Arrow function syntax (implementation of above code using arrow functions)
var doSomething = (cb) => {
  if (true) {
    cb("I am a programmer");
  }
}

// Call doSomething() with 1 parameter as a function
doSomething((message) => {
  console.log(message); // I am a programmer
})
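To answer the original question directly: if dosomething() is called with no arguments, cb is simply undefined, and trying to invoke it throws a TypeError. A small sketch:

```javascript
function doSomething(cb) {
  cb("I am a programmer"); // throws if cb is undefined
}

let threw = false;
try {
  doSomething(); // no callback passed, so cb is undefined
} catch (e) {
  threw = e instanceof TypeError; // "cb is not a function"
}
console.log(threw); // true
```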
Is there a way to make a Node.js stream work as a coroutine?
Example: a Fibonacci numbers stream.
fibonacci.on('data', cb);
// The callback (cb) is like
function cb(data) {
  // something done with data here ...
}
Expectation
function* fibonacciGenerator() {
  fibonacci.on('data', cb);
  // Don't know what has to be done further...
}

var fibGen = fibonacciGenerator();
fibGen.next().value(cb);
fibGen.next().value(cb);
fibGen.next().value(cb);
...
Take the desired numbers from the generator. The Fibonacci number series is just an example; in reality the stream could be of anything: a file, a MongoDB query result, etc.
Maybe something like this:
Make the 'stream.on' function a generator.
Place yield inside the callback function.
Obtain the generator object.
Call next and take the next value in the stream.
Is it at least possible? If yes, how? If not, why not? Maybe a dumb question :)
If you don't want to use a transpiler (e.g. Babel) or wait until async/await makes it into Node.js, you can implement it yourself using generators and promises.
The downside is that your code must live inside a generator.
First, you can make a helper that receives a stream and returns a function that, when called, returns a promise for the next "event" of the stream (e.g. data).
function streamToPromises(stream) {
  return function () {
    if (stream.isPaused()) {
      stream.resume();
    }
    return new Promise(function (resolve) {
      stream.once('data', function () {
        resolve.apply(stream, arguments);
        stream.pause();
      });
    });
  };
}
It pauses the stream when you're not using it, and resumes it when you ask for the next value.
Next, you need a helper that receives a generator function as an argument and, every time the generator yields a promise, resolves that promise and passes its result back to the generator.
function run(fn) {
  var gen = fn();
  var step = gen.next(); // { value: <promise>, done: <boolean> }
  var tick = function () {
    if (step.done) return; // stop once the generator has finished
    step.value.then(function () {
      step = gen.next.apply(gen, arguments);
      tick();
    }).catch(function (err) {
      // TODO: Handle error, e.g. by forwarding it with gen.throw(err).
    });
  };
  tick();
}
Finally, you would do your own logic inside a generator, and run it with the run helper, like this:
run(function* () {
  var nextFib = streamToPromises(fibonacci);
  var n;
  n = yield nextFib();
  console.log(n);
  n = yield nextFib();
  console.log(n);
});
Your own generator will yield promises, pausing its execution and passing the control to the run function.
The run function will resolve the promise and pass its value back to your own generator.
That's the gist of it. You'd need to modify streamToPromises to check for other events as well (e.g. end or error).
const { Readable } = require('stream');

class FibonacciGeneratorReader extends Readable {
  _isDone = false;
  _fibCount = null;
  _gen = function* () {
    let prev = 0, curr = 1, count = 1;
    while (this._fibCount === -1 || count++ < this._fibCount) {
      yield curr;
      [prev, curr] = [curr, prev + curr];
    }
    return curr;
  }.bind(this)();

  constructor(fibCount) {
    super({
      objectMode: true,
      read: size => {
        if (this._isDone) {
          this.push(null);
        } else {
          let fib = this._gen.next();
          this._isDone = fib.done;
          this.push(fib.value.toString() + '\n');
        }
      }
    });
    this._fibCount = fibCount || -1;
  }
}

new FibonacciGeneratorReader(10).pipe(process.stdout);
Output should be:
1
1
2
3
5
8
13
21
34
55
I want to do some preparation work, and my other work should start only after it is done, so I call the preparation work through Q.all; but some of the work is async, and that is what I need to handle.
Maybe my code will help you understand me. In this simple example I want to do this:
call foo2 for each item in the array
in foo2, wait 10 * a ms (assume this to be some complicated work), and change res.
I want console.log(res) to run only after all the foo2 calls are over; this means all the waiting is done and every item has been added to res. So in my example, res should have changed to 6.
Here is the code
var Q = require("q");
var res = 0;

function foo(a) {
  res += a;
}

function foo2(a) {
  // this is a simple simulation of my situation, not exactly what I am doing;
  // in one word, changing my method to be sync is a little bit difficult
  return Q.delay(10 * a).then(function () {
    res += a;
  });
}

// Q.all([1, 2, 3].map(foo)).done(); // yes, this is what I want, this logs 6
// however, because of my situation, my work is an async function such as foo2
// instead of a sync method
Q.all([1, 2, 3].map(function (a) {
  return foo2(a);
})).done();

console.log(res); // I want 6 instead of 0
You are mixing sync and async styles of programming.
In this case, your console.log statement runs before any promise has had time to fulfill (before res was modified by them), as it is not inside a then block.
See here how console.log runs after the promises have resolved:
var Q = require("q"),
    res = 0;

function foo(a) { res += a; }

function foo2(a) {
  return Q
    .delay(10 * a)
    .then(function () { res += a; });
}

Q.all([1, 2, 3].map(function (a) { return foo2(a); }))
  .then(function () { console.log(res) })
  .done();
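The same pattern also works with native promises, with no Q dependency; a minimal sketch:

```javascript
let res = 0;

// Each call returns a promise that adds `a` to res after 10 * a ms,
// mirroring the Q.delay-based foo2 from the question.
function foo2(a) {
  return new Promise(function (resolve) {
    setTimeout(resolve, 10 * a);
  }).then(function () {
    res += a;
  });
}

Promise.all([1, 2, 3].map(foo2)).then(function () {
  console.log(res); // 6 - logged only after all three delays complete
});
```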
I have the following code using the http module:
var http = require('http');

function PostCode(stream) {
  // Build the post string from an object
  console.log("Start upload: " + stream);
  console.log("A");
  var post_data = JSON.stringify({
    'stream': stream,
  });
  console.log("B");

  // An object of options to indicate where to post to
  var post_options = {
    'host': 'myserver',
    'port': '5000',
    'path': '/upload',
    'method': 'POST',
    'headers': {
      'Content-Type': 'application/json'
    }
  };

  // Set up the request
  console.log("C");
  var post_req = http.request(post_options, function (res) {
    console.log("D");
    res.setEncoding('utf8');
    console.log(post_options);
    res.on('data', function (chunk) {
      console.log('Response: ' + chunk);
    });
  });

  // post the data
  post_req.write(post_data, 'utf8');
  post_req.end();
}
I execute the PostCode function ~1000 times in order to back up my filesystem.
The problem is that the callback isn't executed; I see an output sequence of:
A
B
C
A
B
C
and so on, without D.
Only once all the PostCode calls have finished does the callback start to run.
How do I execute the callbacks in parallel, so that D is also printed?
EDIT:
This is the new question; I hope it is clearer. I still don't understand how to fix this issue.
The problem is that I have a loop which calls function A.
In this function, there is a code block which executes a function with a callback; let's say that the callback is B.
Now, B calls another function C.
Meaning:
function main() {
  for (var i = 0; i < 5; i++) {
    console.log("main " + i);
    A(i);
  }
}

function A(i) {
  // do some stuff
  var u = http.request(o, function (res) {
    console.log("A " + i);
    B(i);
  })
}

function B(i) {
  // do some stuff
  var u = http.request(o, function (res) {
    console.log("B " + i);
    C(i);
  })
}

function C(i) {
  console.log("C " + i);
}
I see that the C() callback waits until the whole A loop is finished, and only then executes.
In this case, this is what I see:
main 0
A 0
B 0
main 1
A 1
B 1
main 2
A 2
B 2
main 3
A 3
B 3
main 4
A 4
B 4
C 0
C 1
C 2
C 3
C 4
How can I fix it, so that each C is printed right after its main, A, and B?
Your code misses the part where PostCode() is called, but it's probably something like a for loop. As you already found out, that only fills the event queue, which only gets processed when no other code is running.
You might try something like this:
function postNextStream() {
  if (streams.length > 0)
    PostCode(streams.shift());
}

var streams = [];
for (*however you find the stuff you'd like to post*)
  streams.push(stream);
postNextStream();
Then you set a listener for the "end" event on the ClientRequest returned from http.request() and call postNextStream() from there.
This makes all the requests sequential. If you want some kind of parallelism, you have to implement more sophisticated queue management in postNextStream().
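One possible shape for that queue management, sketched with plain callbacks (runQueue, the task list, and the timings are illustrative, not from the original answer): run the queued tasks with a fixed concurrency limit instead of strictly one at a time.

```javascript
// Sketch: run callback-style tasks with at most `limit` in flight at once.
// Each task is a function that takes a completion callback.
function runQueue(tasks, limit, done) {
  let running = 0, started = 0, finished = 0;
  function next() {
    while (running < limit && started < tasks.length) {
      running++;
      tasks[started++](function () {
        running--;
        finished++;
        if (finished === tasks.length) return done();
        next(); // a slot freed up, start the next queued task
      });
    }
  }
  next();
}

// Usage with dummy timed tasks standing in for PostCode calls:
const order = [];
const tasks = [50, 10, 20].map(function (ms, i) {
  return function (cb) {
    setTimeout(function () { order.push(i); cb(); }, ms);
  };
});
runQueue(tasks, 2, function () {
  console.log(order); // [ 1, 2, 0 ] - at most two tasks ran at once
});
```

With PostCode, each task would call PostCode for one stream and invoke its completion callback from the request's "end" listener.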