why does running `which` in a node child process result in different results? - node.js

When running the which command in a terminal (for example, which yarn), I get a different result than when I run a node script (from the same location) that calls execSync('which yarn').
Can someone explain why?
tldr;
// in terminal
which yarn
// results in
/Users/xxx/.nvm/versions/node/v17.1.0/bin/yarn
// in node
execSync('which yarn')
// results in
/var/folders/0j/xxx/T/yarn--xxx/yarn

It looks like the Node.js process is running as a different user (not as you), and that user has a different PATH from your account (or at least, it isn't running any .bashrc or similar specific to your user account that might add to the PATH). That makes sense given that your result refers to a folder specific to you (/Users/xxx/), but the one from Node.js refers to a central location shared by all users.
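A quick way to see the difference is to print the PATH the child process actually inherits and compare it with your interactive shell's. A minimal sketch, assuming a Unix-like system (the file name is illustrative):

// compare-path.js — print the PATH this Node process inherited,
// then ask the shell it spawns where yarn is.
const { execSync } = require('child_process');

console.log('PATH seen by Node:', process.env.PATH);
console.log('which yarn:', execSync('which yarn', { encoding: 'utf8' }).trim());
// execSync inherits this process's environment by default (you can pass an
// env option to override it); a different PATH here explains why a
// different yarn binary is found.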

Related

How to automigrate when needed in loopback 3?

I created an automigrate script under /bin in my loopback app and added its path in the package.json file so that I can run this script to automigrate whenever I want from the terminal.
I also have a boot script "createUsers.js" which creates some default users in a model. The problem is, whenever I run this script it calls the boot script and it tries to create the users while automigration is still not finished, resulting in a failed automigration. I don't understand why the boot scripts are called when I only run automigrate script specifically. I could call automigrate in the boot scripts and wrap the createUsers.js code in its callback (as shown here), but that would automigrate every time the app is started which is undesirable since the data is lost on automigration. Where should I call automigrate() so that it can be called whenever required? Any help is greatly appreciated.
What I normally do, is create a script called util.js inside boot
util.js
class util {
    static _automigrate() {
        // Whatever you need to do
    }
}

module.exports = function (server) {
    // Expose the helper globally so it can be called from anywhere as util._automigrate()
    global.util = util;
};
This way your script is available across the entire application. And you can call it whenever you need to.
You could call it with
util._automigrate();
I normally use this pattern to store all my input validations etc since I might need those across different models.
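For the original automigration question, the body of _automigrate could call LoopBack's dataSource.automigrate() and only create the default users inside its callback, so user creation can no longer race the migration. A rough sketch, assuming a datasource named db and a user model (both names are placeholders for whatever your app actually uses):

let app; // captured when the boot script runs

class util {
    static _automigrate(cb) {
        const ds = app.dataSources.db; // assumed datasource name
        ds.automigrate('user', function (err) {
            if (err) return cb(err);
            // Only create the default users once automigration has finished,
            // instead of racing it from a separate boot script.
            app.models.user.create([{ username: 'admin', password: 'changeme' }], cb);
        });
    }
}

module.exports = function (server) {
    app = server;
    global.util = util;
};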

How to update ENV variables in a Process without restarting it (NodeJS)?

I have a server running on NodeJS. Is there a way to update the environment variables in the process without restarting the server?
What I'm looking to do is:
Start my server npm start
type something into the console to update ENV variable
Server restarts with new environment variable
From a programmatic standpoint, you should be able to update the process.env object inside the running process.
For example, running:
cmd_line$: MY_VALUE=some_val node ./index.js
with code:
console.log(process.env.MY_VALUE)
process.env.MY_VALUE = 'some other value'
console.log(process.env.MY_VALUE)
process.env.MY_VALUE = 4
console.log(process.env.MY_VALUE)
output in terminal:
some_val
some other value
4
From a server admin standpoint for an already running application, I don't know the answer to that.
It's possible to debug a Node.js process and change global variables:
On *nix, it's possible to enable debugging even if a process wasn't started in debug mode. On Windows, it's possible to debug only processes that were started with node --inspect. This article explains both possibilities in detail.
Obviously, this will work only if environment variables are used directly all the time as process.env.FOO.
If their values are copied into local variables at startup, changing process.env.FOO later may not affect anything:
const FOO = process.env.FOO;
...
console.log(FOO); // it doesn't matter whether process.env.FOO was changed at this point
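One way around that is to read the variable lazily instead of copying it at startup; a small sketch:

// Read process.env.FOO at call time, so later updates are observed.
function getFoo() {
    return process.env.FOO;
}

console.log(getFoo());        // whatever FOO is right now
process.env.FOO = 'updated';
console.log(getFoo());        // 'updated'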
If you change environment variables via the main system Properties dialog (My Computer -> Advanced properties -> Environment Variables), the change takes effect immediately for newly started programs.
Any program that is already running will not see the change unless it handles it explicitly in code.
The logic behind it is that Windows broadcasts a WM_SETTINGCHANGE message to all applications in order to notify them of the change.
Here's a good explanation of how to set up environment variables in macOS. Additionally, as others have already answered, you can set them with process.env.ANY_VARIABLE = 'some value'.
I am trying to find a cross-platform answer to this (mac, linux, windows)
For macOS, a workaround that I found is using:
const shell = require("shelljs");
console.log(shell.exec('launchctl getenv TEST'));
It only works if you set the environment variable (e.g. in another terminal tab) using launchctl setenv TEST blahblah
You can try this tiny JavaScript library to update env variables without restarting the node server:
https://www.npmjs.com/package/runtime-node-refresh
Hope this helps.

What is the difference between `process.env.USER` and `process.env.USERNAME` in Node?

This is the most robust documentation I can find for the process.env property: https://nodejs.org/api/process.html#process_process_env.
It mentions USER, but not USERNAME. On my machine (Windows/Bash), when I print the contents of process.env, I see USERNAME (my windows username) but not USER. Similarly, echo $USERNAME shows my name but echo $USER returns nothing.
What is the difference between USER and USERNAME? Is it an operating system thing? Are they interchangeable?
The documentation about process.env that you linked to shows an example environment; it is not meant to be normative. process.env can be basically anything -- its values generally have OS defaults provided by the shell, but ultimately they are controlled by the user and/or the process that launched your process.
ie, a user could run
$ USER=lies node script.js
...and process.env would not contain the real username.
If you're interested in getting information about the user your process is running as, call os.userInfo(), which is (mostly[1]) consistent across platforms.
> os.userInfo()
{ uid: -1,
  gid: -1,
  username: 'josh',
  homedir: 'C:\\Users\\josh',
  shell: null }
[1] - on Windows, uid, gid, and shell are useless, as seen above
os.userInfo() calls uv_os_get_passwd, which returns the actual current effective user, regardless of what's in environment variables.
uv_os_get_passwd Gets a subset of the password file entry for the current effective uid (not the real uid). The populated data includes the username, euid, gid, shell, and home directory. On non-Windows systems, all data comes from getpwuid_r(3). On Windows, uid and gid are set to -1 and have no meaning, and shell is NULL.
process.env is the process's environment variables, which are supplied by the OS to the process.
This object can really contain just about anything, as specified the OS and the process that launches it, but by default Windows stores the username in USERNAME and Unix-like systems (Linux, macOS, etc.) store it in USER.
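If all you want is the username and you don't care which variable the OS happens to set, a small sketch that falls back across both conventions (os.userInfo() is generally the more reliable source, as described above):

const os = require('os');

// Prefer the environment variables, fall back to the OS account database.
const username = process.env.USER || process.env.USERNAME || os.userInfo().username;
console.log(username);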
I was having a similar issue when trying to connect Node.js to MySQL via dotenv.
None of the many answers on the web resolved my issue.
The connection worked perfectly well without the .env file, but only with the authentication information written directly into the app.js file. I tried, without success, the posted answers, which include (but are not limited to):
changing the information inside the .env file to be with and without ""
changing the name of the .env file
changing the path of the .env file
describing the path to .env file
writing different variations of the dotenv commands inside app.js
At last, I checked whether I had installed dotenv using the npm install dotenv command. I also tried to print the value of the variable with console.log(dotenv.MY_ENV_VAR), which again showed undefined.
The issue turned out to be that dotenv confused USER (of the system; like you, I was using Linux) with USERNAME (of the MySQL database): USER returns the current system user instead of the MySQL database user, so I renamed the database user variable to USERNAME in the .env file. After that it was able to connect to the database!
To check this, you could use:
console.log(process.env.USER);
and:
console.log(process.env.USERNAME);
1st gives you the system user, whereas the 2nd gives the database user.
Actually, any name could be used for the variable that holds the username of the MySQL database, as long as it does not match the reserved name for the system username in Linux, which is USER.
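As a sketch of the fix, assuming a .env file that avoids the reserved USER name (DB_USERNAME and DB_PASSWORD are made-up names here):

// .env (illustrative):
//   DB_USERNAME=myDbUser
//   DB_PASSWORD=secret
require('dotenv').config();

console.log(process.env.USER);        // still the Linux system user, set by the shell
console.log(process.env.DB_USERNAME); // the database user loaded from .env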

Where should I store cache of a custom CLI npm module?

I am developing an npm module where the user interacts with it through a terminal by executing commands:
> mymodule init
> mymodule do stuff
When executing certain commands the user is asked for some data, which is then used by the module. Since this data won't really change while using the module, and since these commands can be executed pretty frequently, it is not the best option to ask the user for the data every time they run a command. So I've decided to cache this data, and since it should persist across multiple module calls, the easiest way to store it that I see is a file (the data structure allows storing it as simple JSON). But I am not quite sure where this file should go on a user's machine.
Where in the file system should I store a cache, in the form of a file or multiple files, for a custom npm module, considering that the module itself can be installed globally, on multiple operating systems, and can be used in multiple projects at the same time?
I was thinking about storing it in the module's folder, but that might be tricky in case of a global installation plus multi-project use. The second idea was to store it in OS-specific tmp storage, but I am not quite sure about that either. I am also wondering if there are alternatives to file storage in this case.
I would create a hidden folder in the user's home directory (starting with a dot). For instance, /home/user/.mymodule/config.cfg. The user's home directory isn't going anywhere, and the dot will make sure it's out of the user's way unless they go looking for it.
This is the standard way that most software stores user configs, including SSH, Bash, Nano, Wine, Ruby, Gimp, and even NPM itself.
On some systems you can cache to ~/.cache by creating a sub-directory to store your cache data, though it's much more common for applications to create a hidden directory in the user's home directory. On modern Windows machines you can create a directory under C:/Users/someUser/AppData. On Windows a . prefix will not hide a file. I'd recommend you do something platform-agnostic like so:
var fs = require('fs');
var path = require('path');

function getAppDir(appName, cb) {
    var plat = process.platform;
    // Windows keeps the home directory in USERPROFILE; Unix-like systems use HOME
    var homeDir = process.env[(plat == 'win32') ? 'USERPROFILE' : 'HOME'];
    var appDir;

    if (plat == 'win32') {
        appDir = path.join(homeDir, 'AppData', appName);
    } else {
        // Hidden dot-directory in the user's home on Unix-like systems
        appDir = path.join(homeDir, '.' + appName);
    }

    fs.mkdir(appDir, function(err) {
        if (err) return cb(err);
        cb(null, appDir);
    });
}
Just declare a function to get the app directory. This should handle most systems, but if you run into a case where it does not, it should be easy to fix because you can just add some alternate logic here. Let's say you later want to allow the user to specify a custom location for app data in a config file; you could easily add that logic. For now this should suit most of your cases on most Unix/Linux systems and on Windows Vista and up.
If you store the data in the system temp folder, then depending on the system your cache could be lost on an interval (cron) or on reboot. Using the global install path would lead to some issues. If you need this data to be unique per project, then you can extend this functionality to store the data in the project root, but not in the module root. It would be best not to store it in the module root, even if it's just installed as a local/project module, because then the user can't include this folder in their repository without including the entire module.
So in the event that you need to store cached data relevant to a project, do so in the project root, not in node_modules. Otherwise store it in the user's home directory in a system-agnostic way.
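A hypothetical usage of getAppDir from above, caching some answers as JSON (the file name and keys are illustrative, and note that the mkdir in getAppDir fails if the directory already exists):

const fs = require('fs');
const path = require('path');

getAppDir('mymodule', function (err, dir) {
    if (err) throw err; // e.g. EEXIST if the directory was created on a previous run
    const cacheFile = path.join(dir, 'cache.json');
    fs.writeFileSync(cacheFile, JSON.stringify({ initAnswers: { projectName: 'demo' } }));
});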
First you need to know what kind of OS you are running on.
Your original idea is not bad, because global modules are not really global on every OS or in every virtual environment.
Using /home/user may not work on Windows. On Windows you have to check process.env.HOMEPATH.
I recommend a chain of checks to determine the best place:
Let the user take control: choose your own env variable, say MYMOD_HOME. First check if process.env.MYMOD_HOME exists, and use it.
Check the Windows standard process.env.LOCALAPPDATA.
Check the Windows standard process.env.HOMEPATH.
Check if '/home/user' or '~' exists.
Otherwise use __dirname.
In all cases create a ./mymodule directory there; a sketch of this chain is shown below.
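A sketch of that chain of checks (MYMOD_HOME is the made-up override variable from this answer; adjust the fallbacks to taste):

const path = require('path');

function resolveBaseDir() {
    const env = process.env;
    const base = env.MYMOD_HOME      // user override, checked first
        || env.LOCALAPPDATA          // Windows
        || env.HOMEPATH              // Windows fallback
        || env.HOME                  // Unix-like home, e.g. /home/user
        || __dirname;                // last resort
    return path.join(base, 'mymodule');
}

console.log(resolveBaseDir());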

Working with Node.JS

Last night I dumped Windows 7 and formatted my hard drive to move to a Linux-based operating system, purely because I wanted to start working with Node.JS.
So I have installed Node.JS and tried a few test things, the http server and sockets etc.
What I would like to do is build an HTTP server that is tightly integrated with an MVC framework, but before I get started on all that I need to learn how to build efficiently in Node.
For example, with PHP as my framework I would create a bootloading system to load all base classes etc, then I would fire up my events system ready to start attaching callbacks.
I would continue to process the request until the output is generated, which then gets sent off to an output handler that processes headers etc.
But Node is a totally new environment for this, and I'm wondering about the best practices for building a system in Node.
The information I'm looking for is more to do with the design structure than the actual coding of the application: how to load the libs, where to load the libs, etc.
Any help is appreciated.
So far my web application is coming along nicely; I have built it pretty traditionally and a little procedurally.
What I started out with is creating a directory structure like so:
<root>
    startup.js
    /public/
        favicon.ico
        /images/
        /stylesheets/
        /javascripts/
    /system/
        init.js
        config.js
        /libs/
            /exceptions/
                http.js
            server.js
    /application/
        /views/
            /_override/
            /errors/
                generic.view
            /partials/
                sidebar.view
            index.view
        /controllers/
            index.js
        /models/
            users.js
This directory structure is like most MVC-based web applications out there, so using this method I feel comfortable.
The startup file is what's executed by node as the entry point (node startup &) and looks like so:
/*
 * Header of the file, Copyright etc
 */
var _Initialize = require("./system/init.js");

// Displays the command line header, title, copyright etc
_Initialize.DisplayCommandLineHeader();

// Check the environment, permissions, ports etc
_Initialize.CheckEnviroment();

// Start the server and listen on the port
_Initialize.StartServer();
The init file does the main work; it's what tells all other areas of the system to run, stop, etc.
I have a file in libs called serverhandler.js, which is required into init.js. I then create a server and assign the callback to ServerHandler.Listener, which listens for requests, checks to see if the requested file exists in the public directory, and if so reads it in chunks and sends it back.
If no file was found in public, it creates a route with Route.Create("/path?params"), which determines 3 elements (Controller, Method, Params) from the URI, and then the controller files are loaded if they exist.
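A sketch of that kind of route parsing (the names here are illustrative, not the actual Route.Create implementation):

const url = require('url');

function createRoute(requestUrl) {
    const parsed = url.parse(requestUrl, true);               // true => parse the query string
    const parts = parsed.pathname.split('/').filter(Boolean); // e.g. /users/list -> ['users', 'list']
    return {
        controller: parts[0] || 'index',
        method: parts[1] || 'index',
        params: parsed.query
    };
}

// createRoute('/users/list?page=2')
// -> { controller: 'users', method: 'list', params: { page: '2' } }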
I've taken the approach of throwing exceptions for error pages like so:
if (!FileSystem.exists(RequiredPath)) {
    throw new HTTPExceptions.FileNotFound();
}
Hope this helps some people getting started in Node.
Have a look at http://dailyjs.com/2010/11/01/node-tutorial/, it's pretty relevant.
I would suggest looking at the current modules too
https://github.com/joyent/node/wiki/modules
and reading the code of any of the projects in the areas you are interested in, esp. the middleware, routing and module loaders.
