Inconsistency of `this` when debugging a TypeScript program under VS Code - node.js

Edit: I think I have seen this problem before when using Chrome to debug my full application, so I'm not sure whether this is a TypeScript issue or a VS Code issue.
I was about to submit this when I thought I should try it using Chrome as the debugger instead of VS Code. Chrome works as expected, but VS Code shows the problem illustrated below.
I distilled the following down from a larger program that was giving me some strange behavior when examining things in the debugger. The program appears to work correctly in terms of what it prints out, but if I run it in VS Code or attach to a running process and inspect the value of this inside somePrivateArrowFunc, I see different results in the debugger than are printed to the console:
class MyClass {
    someField: number = 123;
    private somePrivateArrowFunc = () => {
        console.log("somePrivateArrowFunc", this);
    };
    funcRefs = [this.somePrivateArrowFunc];
    funcRef = this.somePrivateArrowFunc;
    public somePublicRegularFunc() {
        console.log("somePublicRegularFunc", this); // debugger sees "this" as instance of MyClass
        for (let f of this.funcRefs) {
            f(); // debugger sees "this" as the global object inside somePrivateArrowFunc
        }
        this.funcRefs[0](); // debugger sees "this" as an array containing f inside somePrivateArrowFunc
        this.funcRef(); // debugger sees "this" as an instance of MyClass inside somePrivateArrowFunc
    }
}

var c: MyClass = new MyClass();
c.somePublicRegularFunc();
The output printed to the console indicates that the value of this is always an instance of MyClass, but a breakpoint on that same console.log line shows three different values for this in the debugger:
somePublicRegularFunc MyClass {
  someField: 123,
  somePrivateArrowFunc: [Function (anonymous)],
  funcRefs: [ [Function (anonymous)] ],
  funcRef: [Function (anonymous)]
}
somePrivateArrowFunc MyClass {
  someField: 123,
  somePrivateArrowFunc: [Function (anonymous)],
  funcRefs: [ [Function (anonymous)] ],
  funcRef: [Function (anonymous)]
}
somePrivateArrowFunc MyClass {
  someField: 123,
  somePrivateArrowFunc: [Function (anonymous)],
  funcRefs: [ [Function (anonymous)] ],
  funcRef: [Function (anonymous)]
}
somePrivateArrowFunc MyClass {
  someField: 123,
  somePrivateArrowFunc: [Function (anonymous)],
  funcRefs: [ [Function (anonymous)] ],
  funcRef: [Function (anonymous)]
}
I got these results using tsc 4.6.3 and node v16.13.0 (I also saw the same behavior with earlier versions of both).

It seems that what the debugger shows as this is simply not the this that is actually used by your somePrivateArrowFunc arrow function.
As designed, and as confirmed by your console output, this in the arrow function is predictable and always the same: it is the this of the scope where the arrow function is created. In your case, the arrow function is created during instance initialization, hence it is your instance.
But the this that you inspect in the debugger is rather the call context, i.e. what a normal function would receive as this, which therefore depends on how you called the function:
f(): no context, so this is the global object
funcRefs[0](): the context is funcRefs, i.e. the array
this.funcRef(): the context is the same as for the call c.somePublicRegularFunc(), hence it is c, i.e. the instance
If you turn your arrow function into a normal function, then your console output will reflect the this that you get in the debugger.
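To see the same thing at runtime rather than only in the debugger, you can replace the arrow function with a regular function expression and call it the same three ways (a minimal sketch, not taken from the question's code; the names are illustrative):

class Demo {
    someField = 123;
    // Regular function expression: its "this" is whatever the call site provides
    someRegularFunc = function (this: any) {
        console.log("someRegularFunc", this);
    };
    funcRefs = [this.someRegularFunc];
    funcRef = this.someRegularFunc;

    run() {
        const f = this.funcRefs[0];
        f();                // "this" is undefined here (class bodies are strict); in sloppy mode it would be the global object
        this.funcRefs[0](); // "this" is the funcRefs array
        this.funcRef();     // "this" is the Demo instance
    }
}

new Demo().run();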

I found that using a newer target language version (es2022 instead of es5 or es6) fixed the problem: the debugger sees the correct value of this in all three scenarios. With an es2022 target the arrow function is emitted as an actual arrow function rather than downleveled to a plain function, so its runtime this is the lexically bound one the debugger shows.
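For context, this is roughly the shape of the es5 downlevel emit for the arrow-function field (an illustrative sketch, not the exact tsc output). The arrow function becomes an ordinary function that reads a captured _this variable, so the source-level this always prints the instance, while the function's own runtime this (which is what the debugger surfaces) still depends on the call site:

// Approximate shape of the es5 output for the class above (illustrative only)
var MyClass = /** @class */ (function () {
    function MyClass() {
        var _this = this; // lexical "this", captured once for the arrow function
        this.someField = 123;
        this.somePrivateArrowFunc = function () {
            // The code reads "_this" (always the instance), but this plain function's own
            // runtime "this" is whatever the call site supplies: global/undefined for f(),
            // the funcRefs array for this.funcRefs[0](), the instance for this.funcRef().
            // The debugger shows that runtime "this", not "_this".
            console.log("somePrivateArrowFunc", _this);
        };
        this.funcRefs = [this.somePrivateArrowFunc];
        this.funcRef = this.somePrivateArrowFunc;
    }
    // somePublicRegularFunc is emitted as an ordinary prototype method
    return MyClass;
}());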

Related

What are the succeed, fail, and done Context Methods For?

In Node.js, Amazon Lambda functions have a signature that looks like this
exports.handler = async function (event, context) {
  // TODO implement
  const response = {
    statusCode: 200,
    body: JSON.stringify('Hello from Lambda July 20!'),
  };
  return response;
};
The event parameter contains information about the AWS Service Event that triggered the Lambda. The context parameter contains information about the Lambda environment itself.
The context object is documented here. Per those docs, it has a single method named getRemainingTimeInMillis and a number of properties.
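For example, the documented method can be called from inside the handler to check how much of the configured timeout remains (a minimal sketch):

exports.handler = async function (event, context) {
  // getRemainingTimeInMillis() is the one method the docs list on the context object
  console.log(`${context.getRemainingTimeInMillis()} ms remaining before timeout`);
  return { statusCode: 200, body: JSON.stringify('ok') };
};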
However, if I log this object, I see the following
INFO {
  callbackWaitsForEmptyEventLoop: [Getter/Setter],
  succeed: [Function (anonymous)],
  fail: [Function (anonymous)],
  done: [Function (anonymous)],
  functionVersion: '$LATEST',
  functionName: 'july-2021-delete-after-july-31',
  memoryLimitInMB: '128',
  logGroupName: '/aws/lambda/july-2021-delete-after-july-31',
  logStreamName: '2021/07/15/[$LATEST]e05ac24e44d6489b9f8124791b3d5513',
  clientContext: undefined,
  identity: undefined,
  invokedFunctionArn: '...',
  awsRequestId: '7852dc1a-8283-46a1-b445-e6d6187553b6',
  getRemainingTimeInMillis: [Function: getRemainingTimeInMillis]
}
That is, there are three additional methods named succeed, fail, and done.
succeed: [Function (anonymous)],
fail: [Function (anonymous)],
done: [Function (anonymous)],
What are these methods for, exactly? I can take some guesses and Googling around leads to some circumstantial evidence that they're deprecated methods, but I can't seem to find any documentation on what they're meant to do or how they work.
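If they are indeed the legacy completion callbacks that predate the callback/async handler signatures (which is what the circumstantial evidence suggests), older non-async handlers terminated the invocation through them, roughly like this (a sketch of the old style only, not current recommended usage):

// Old-style handler shape (sketch); context.done(error, result) combines the other two
exports.handler = function (event, context) {
  if (!event) {
    context.fail(new Error('missing event')); // report failure and end the invocation
    return;
  }
  context.succeed({ statusCode: 200 });       // report success and end the invocation
};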

Connecting to Cayley serving over localhost

I've followed the 'Getting Started' guide in Cayley's documentation and installed Cayley on my remote server:
Getting Started: https://github.com/google/cayley
Server OS: CentOS 7.2.1511
I've added cayley to my $PATH:
echo $PATH :
/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/home/csse/cayley/src/github.com/google/cayley
Here is my config file at /etc/cayley.cfg
{
  "database": "leveldb",
  "db_options": {
    "cache_size_mb": 2,
    "write_buffer_mb": 20
  },
  "db_path": "~/cayley/src/github.com/google/cayley/data/testdata.nq",
  "listen_host": "127.0.0.1",
  "listen_port": "64210",
  "read_only": false,
  "replication_options": {
    "ignore_missing": false,
    "ignore_duplicate": false
  },
  "timeout": 30
}
I serve cayley over http by simply doing:
cayley http
and the terminal outputs:
Cayley now listening on 127.0.0.1:64210
On my main machine (Mac OSX 10.10.5 Yosemite), I've used npm to install the cayley package and written a test:
// testconnection.js
var cayley = require('cayley');
var client = cayley("137.112.104.107");
var g = client.graph;

g.V().All(function(err, result) {
    if (err) {
        console.log('error');
    } else {
        console.log('result');
    }
});
However, it fails when I run it: node testconnection.js
error: Error: Invalid URI "137.112.104.107/api/v1/query/gremlin"
I'd like to connect to Cayley and modify the database from my test. I've found a great powerpoint full of Cayley information:
https://docs.google.com/presentation/d/1tCbsYym1kXWWDcnRU9ymj6xP0Nvgq-Qhy9WDmqWcM-o/edit#slide=id.g3776708f1_0319
As well as pertinent Cayley docs:
Overview Doc
Configuration Doc
HTTP API Doc
And a post on stackoverflow:
Cayley db user and password protection over HTTP connections
But I'm struggling to come up with a way to connect to Cayley (on my remote machine) from my local machine. I'd like to connect with the npm package if possible, but am open to other options. Where am I going wrong?
Edit #1
I've prepended "http://" to my IP, so now it reads http://137.112.104.107. At that point, I solved another issue by running
cayley init --config=/etc/cayley.cfg
as mentioned by the author here.
I've also removed listen_host and listen_port from my config file (each individually first, then both), yet I still get the same socket hang up error. Here's a printout of client from the test script:
Client {
  host: 'http://137.112.104.107',
  request:
   { [Function]
     get: [Function],
     head: [Function],
     post: [Function],
     put: [Function],
     patch: [Function],
     del: [Function],
     cookie: [Function],
     jar: [Function],
     defaults: [Function] },
  graph: Gremlin { client: [Circular], query: [Function] },
  g: Gremlin { client: [Circular], query: [Function] },
  write: [Function: bound ],
  delete: [Function: bound ],
  writeFile: [Function: bound ]
}
Your Cayley server is listening on 127.0.0.1 / localhost and is therefore not reachable from another machine. To reach it from a virtual machine or another computer on your network, it needs to bind to an interface that is reachable.
If you configure listen_host: 0.0.0.0, check what your network IP is (I assume 137.112.104.107), and connect to that, it should work; otherwise you may need to open or forward the port on your firewall (depending on your network).
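Concretely, with "listen_host": "0.0.0.0" in /etc/cayley.cfg, the test from the question would then connect using the full URL, scheme included (a sketch based on the config and script shown above; I've added the port from the config in case the client doesn't assume it):

var cayley = require('cayley');

// host from the question, scheme added, port taken from the config above
var client = cayley("http://137.112.104.107:64210");

client.graph.V().All(function (err, result) {
    if (err) {
        console.log('error', err);
    } else {
        console.log('result', result);
    }
});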

meteor: How to list files and directories from the project root path?

I'm using Meteor 1.0.2.1 and I noticed that working with the filesystem is not as easy as I thought :p
I ended up installing the peerlibrary:fs package (https://atmospherejs.com/peerlibrary/fs) so that I now have access to the Node.js "fs" module, and now I'm trying to list the contents of the public folder, but as mentioned here:
Reading files from a directory inside a meteor app
the path now (with version 1) seems to be '../../../../../public'
var files = fs.readdirSync('../../../../../public');
But I assume this to be wrong.
Is there an alias to the project root folder?
Is it ok to use the peerlibrary:fs for this?
Thanks.
console.log of _meteor_bootstrap_ says (I removed personal content):
{ startupHooks:
   [ [Function], [Function],
     [Function], [Function],
     [Function],
     [Function],
     [Function],
     [Function],
     [Function] ],
  serverDir: 'my_path_to_serverDir',
  configJson:
   { meteorRelease: 'METEOR#1.0.2',
     clientPaths: { 'web.browser': '../web.browser/program.json' } } }
=> Started your app.
I checked program.json in /home/user/app/.meteor/local/build/programs/web.browser/program.json
Part of it looks like this (I changed some personal data):
{
  "path": "app/pathToImage/image.png",
  "where": "client",
  "type": "asset",
  "cacheable": false,
  "url": "/pathToimage/image.png",
  "size": someSize,
  "hash": "someHash"
},
Based on that, I'd say there is no public folder in the deployed state, but you can get the paths from the program.json file; _meteor_bootstrap_.configJson.clientPaths gives an object with the path to it, which looks like this (pasted from console.log):
{ 'web.browser': '../web.browser/program.json' }
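So one option is to resolve program.json relative to serverDir and read the bundled asset paths from it. This is a rough, untested sketch: it assumes the bootstrap global logged above is available on the server as __meteor_bootstrap__, and that the entries shown above live in the file's manifest array:

// Server-side sketch: list bundled client assets via program.json
var fs = Npm.require('fs');     // or the fs from peerlibrary:fs
var path = Npm.require('path');

var serverDir = __meteor_bootstrap__.serverDir;
var clientPath = __meteor_bootstrap__.configJson.clientPaths['web.browser']; // '../web.browser/program.json'

var program = JSON.parse(fs.readFileSync(path.resolve(serverDir, clientPath), 'utf8'));
(program.manifest || []).forEach(function (entry) {
  if (entry.type === 'asset') {
    console.log(entry.path, '->', entry.url);
  }
});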

Commander can't handle multiple command arguments

I have the following commander command with multiple arguments:
var program = require('commander');

program
  .command('rename <id> [name]')
  .action(function() {
    console.log(arguments);
  });

program.parse(process.argv);
Using the app yields the following result:
$ node app.js 1 "Hello"
{ '0': '1',
'1':
{ commands: [],
options: [],
_execs: [],
_args: [ [Object] ],
_name: 'rename',
parent:
{ commands: [Object],
options: [],
_execs: [],
_args: [],
_name: 'app',
Command: [Function: Command],
Option: [Function: Option],
_events: [Object],
rawArgs: [Object],
args: [Object] } } }
As you can see, the action receives the first argument (<id>) and the program object, but doesn't receive the second argument, [name].
I've tried:
Making [name] a required argument.
Passing the name unquoted to the tool from the command line.
Simplifying my real app into the tiny reproducible program above.
Using a variadic argument for name (rename <id> [name...]), but this results in both 1 and Hello being assigned to the same array as the first parameter to action, defeating the purpose of having id.
What am I missing? Does commander only accept one argument per command (it doesn't look that way in the documentation)?
I think this was a bug in an old version of commander. This works now with commander#2.9.0.
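For reference, with a recent commander each declared argument arrives as a separate parameter of the action callback (a minimal sketch):

var program = require('commander');

program
  .command('rename <id> [name]')
  .action(function (id, name) {
    // declared arguments map to named parameters, in order
    console.log('id:', id, 'name:', name);
  });

program.parse(process.argv);

// $ node app.js rename 1 "Hello"
// id: 1 name: Hello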
I ran into the same problems and decided to use Caporal instead.
Here's an example from their docs on Creating a command:
When writing complex programs, you'll likely want to manage multiple commands. Use the .command() method to specify them:
program
  // a first command
  .command("my-command", "Optional command description used in help")
  .argument(/* ... */)
  .action(/* ... */)
  // a second command
  .command("sec-command", "...")
  .option(/* ... */)
  .action(/* ... */)
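Adapting the rename command from the question might look roughly like this (a sketch based on Caporal's documented command/argument/action API; the action callback receives a parsed args object keyed by argument name):

var prog = require('caporal');

prog
  .version('1.0.0')
  .command('rename', 'Rename an item by id')
  .argument('<id>', 'Id of the item to rename')
  .argument('[name]', 'New name')
  .action(function (args, options, logger) {
    // args looks like { id: '1', name: 'Hello' }
    logger.info('id: ' + args.id + ', name: ' + args.name);
  });

prog.parse(process.argv);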

Writing to stdin of a spawned child_process doesn't work

I am spawning a Java app (a REPL for querying a local DB) using:
repl = require('child_process').spawn('java', ['-cp', '...list of libs...'], { cwd: '...path to env...', env: process.env, customFds: [-1, -1, -1] });
The REPL loads fine because I can see its output on stdout, but stdin.write commands don't go through. I can, however, type them directly into the console window of the node process itself (which is weird since I didn't .resume() it).
I have printed out the stdin of the spawned process; it looks like this:
{ _handle:
   { writeQueueSize: 0,
     socket: [Circular],
     onread: [Function: onread] },
  _pendingWriteReqs: 0,
  _flags: 0,
  _connectQueueSize: 0,
  destroyed: false,
  bytesRead: 0,
  bytesWritten: 0,
  allowHalfOpen: undefined,
  writable: true,
  readable: false }
It seems there is no 'fd' defined, and also .readable returns false. How can this be resolved?
(this is all on a windows machine, node v0.6.6)
Thanks
The documentation states that the customFds option was deprecated specifically because they couldn't get it to work on Windows.
While an array of -1's implies that no custom fds should actually be used, the entire option is deprecated, so try removing it entirely and see if that solves your problem.
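A minimal sketch of the same spawn without customFds, writing to the child's stdin explicitly (the classpath and working directory are kept as the placeholders from the question, and a real invocation would also need the main class; the query string is hypothetical):

var spawn = require('child_process').spawn;

var repl = spawn('java', ['-cp', '...list of libs...'], {
  cwd: '...path to env...',
  env: process.env
});

repl.stdout.on('data', function (data) {
  console.log('repl stdout: ' + data);
});

// newline-terminate so the REPL sees a complete line
repl.stdin.write('your query here\n');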
