How to set a default/fallback scope on package names with Node require() - node.js

I am using a particular module in my Node code:
const example = require('example');
However, this module is slow to be updated, so I have forked it and published it with my updates under my own scope on npmjs.com. Now, to use my own module, I must change every use in my code:
const example = require('@my-username/example');
The problem with this is that I will have to commit a bunch of changes throughout many files to rename the module, then when upstream merges my changes into the official version I will have to update my code again to remove the scope operator from require() across all those files, then add it back if I have more changes that are slow to be accepted, and so on.
Is there a way to tell Node or npm that if require() can't find a module with an unscoped name, it should then check all the @scope folders in node_modules to see if there's a match there?
If that were possible then I would only need to update package.json with the relevant package version and the code itself could remain unchanged as I switch between my fork and the official version.

You can implement this using module-alias.
It will slow down your startup a little, but it lets you apply this kind of resolution logic to every require you make in your application.
const moduleAlias = require('module-alias')
// Custom handler function (starting from v2.1)
moduleAlias.addAlias('request', (fromPath, request, alias) => {
  console.log({
    fromPath,
    request,
    alias,
  });
  return __dirname + '/my-custom-request';
})
require('request')
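Applied to the question above, a minimal sketch (assuming the fork is published as @my-username/example) could simply map the unscoped name onto the scoped package while the fork is in use:
const moduleAlias = require('module-alias');
// While the fork is needed, point the unscoped name at the scoped package.
moduleAlias.addAlias('example', '@my-username/example');
// Existing code keeps requiring the unscoped name.
const example = require('example');
When upstream catches up, removing the single addAlias call (and switching the version in package.json) is the only change needed.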

Related

Make a var, "global" just in the npm Package instead of whole Project in Nodejs

I am transforming my Node microservices into npm packages, so that I can run several of them on a single server. They exchange RabbitMQ messages, so they are independent of the listening server.
My problem is that each microservice used global variables, like the DB connection for example. Now that they are packages, and I can therefore have different microservices on one server, I can't share a global.db_connection because each microservice may use a different DB.
Is there a way to have a global.db_connection that is global only inside a single package?
==== UPDATE 1
Before, I wrongly referred to modules. I mean npm packages. Each microservice is a separate npm package that I install on the server. I've corrected my question now, using 'package' instead of 'module'.
=== EXAMPLE
I know there are better ways to share a DB connection throughout the code. I am using 'db_connection' just as an example of a global object.
// packageA.js (module MS_A)
global.db_connection = db.connect(URL_1);

class MS_A
{
  ...
}
module.exports = MS_A;

// packageB.js (module MS_B)
global.db_connection = db.connect(URL_2);

class MS_B
{
  ...
}
module.exports = MS_B;

// server.js
global.needed_variable = 3;
var MS_A = new require("./packageA.js")().start();
var MS_B = new require("./packageB.js")().start();

/**
 * "needed_variable" should be visible inside both packages.
 * "db_connection" should not be visible outside each package.
 */
MS_B should use its own db_connection and not be aware of the db_connection of MS_A (and vice versa).
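For illustration, a sketch of the usual alternative (my assumption, not code from the question): keep the connection in module scope instead of on global, so it stays private to the package that created it:
// packageA.js
const db = require('db'); // hypothetical DB client, as in the example above
const db_connection = db.connect(URL_1); // module-scoped: visible only inside this package

class MS_A {
  start() {
    // use db_connection here
    return this;
  }
}
module.exports = MS_A;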

Retrieve file contents during Gatsby build

I need to pull in the contents of a program source file for display in a page generated by Gatsby. I've got everything wired up to the point where I should be able to call
// my-fancy-template.tsx
import { readFileSync } from "fs";
// ...
const fileContents = readFileSync("./my/relative/file/path.cs");
However, on running either gatsby develop or gatsby build, I'm getting the following error
This dependency was not found:

* fs in ./src/templates/my-fancy-template.tsx

To install it, you can run: npm install --save fs
However, all the documentation suggests that this module is native to Node unless it is being run in the browser. I'm not overly familiar with Node yet, but given that gatsby build also fails (this command does not even start a local server), I'd be a little surprised if this were the problem.
I even tried this from a new test site (gatsby new test) to the same effect.
I found this in the sidebar and gave that a shot, but it appears it just declared that fs was available; it didn't actually provide fs.
It then struck me that while Gatsby creates the pages at build-time, it may not render those pages until they're needed. This may be a faulty assessment, but it ultimately led to the solution I needed:
You'll need to add the file contents to a field on File (assuming you're using gatsby-source-filesystem) during exports.onCreateNode in gatsby-node.js. You can do this via the usual means:
// gatsby-node.js
const fs = require(`fs`);
exports.onCreateNode = ({ node, actions }) => {
  const { createNodeField } = actions;
  if (node.internal.type === `File`) {
    fs.readFile(node.absolutePath, undefined, (_err, buf) => {
      createNodeField({ node, name: `contents`, value: buf.toString() });
    });
  }
};
You can then access this field in your query inside my-fancy-template.tsx:
{
  allFile {
    nodes {
      fields {
        contents
      }
    }
  }
}
From there, you're free to use fields.contents inside each element of allFile.nodes. (This of course also applies to other file query methods.)
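For example, a hypothetical template component might render that field like this (names match the query above):
// my-fancy-template.tsx
import * as React from "react";

const MyFancyTemplate = ({ data }: any) => (
  <div>
    {data.allFile.nodes.map((node, i) => (
      <pre key={i}>{node.fields.contents}</pre>
    ))}
  </div>
);
export default MyFancyTemplate;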
Naturally, I'd be ecstatic if someone has a more elegant solution :-)

Webpack Aliases in Node JS Server code

I'm building an isomorphic React/React-Router/Redux/Webpack application and I'm attempting to implement server side rendering.
My directory looks like:
/client
/actions
/components
/containers
/server
/server.js
In my webpack config, I have aliases set up for all the folders inside client:
var path_base = path.resolve(__dirname, '..');
const resolve = path.resolve;
const base = function() {
  var args = [path_base];
  args.push.apply(args, arguments);
  return resolve.apply(resolve, args);
};
const resolve_alias = base.bind(null, 'src/client');
const aliases = [
  'actions',
  'components',
  'constants',
  'containers',
  'middleware',
  'reducers',
  'routes',
  'store',
  'styles',
  'utils',
  'validation'
];
so that inside the code that gets bundled by webpack, I can do:
import { Widget } from 'components';
and that import gets resolved by webpack.
Now in my server code, in order to do the rendering, I have to import some of my client files, like routes/index.js. The problem I'm running into is that when I import my routes file, it uses a webpack alias to another file, say components or containers, so naturally the Node.js require system can't resolve it.
How do I fix something like that? I looked at this question and it talks about essentially setting up the same aliases that exist in webpack with mock-require. But then the issue becomes that my routes file imports all my components which then all import things like stylesheets, images, etc. Should I then be using something like webpack-isomorphic-tools?
The guides I've been looking at (this for example) are all great at showing how server side rendering is accomplished but none of them really talk about how to resolve all the requires and whatnot.
After battling with this issue for 2 days I settled on babel-plugin-webpack-alias.
What you need to do to resolve paths with that is:
$ npm install --save-dev babel-plugin-webpack-alias
Add the plugin to your .babelrc
Add the aliases to your webpack.config (make sure you use path.join()); a rough sketch of both pieces follows below
Refer to this post if you have problems loading styles
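For reference, a rough sketch of the two pieces (assuming the plugin accepts a config option pointing at your webpack config; check the plugin's README for the exact option name):
// .babelrc
{
  "plugins": [
    ["babel-plugin-webpack-alias", { "config": "./webpack.config.js" }]
  ]
}
// webpack.config.js (excerpt)
resolve: {
  alias: {
    components: path.join(__dirname, 'src', 'client', 'components'),
    containers: path.join(__dirname, 'src', 'client', 'containers')
    // ...one entry per folder in the aliases array above
  }
}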
The other option I tried was universal-webpack but I found it to be a bit verbose. If you want to see roughly how the whole server-side loading works, you can check out this video.
If you really want aliases on the server, run your server-side code through Babel and use this plugin: https://www.npmjs.com/package/babel-plugin-module-alias which will let you do the same thing as webpack.
Edit: This one works a lot better: https://github.com/jagrem/babel-resolve-relative-module it allows multiple paths
Try using NODE_PATH. Node will always look for a module in this path during require calls. It lets you shortcut your relative paths as you want.
// turn this
import {Widget} from '../../components';
// into this
import {Widget} from 'components';
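For instance, with the layout from the question (and assuming the client code lives under src/client, as in the webpack config above), a Unix-style invocation could look like:
$ NODE_PATH=./src/client node server/server.js
after which require('components') resolves to src/client/components on the server.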
See Node.js docs for more information.
P.S. This approach is very sensitive, so use it carefully. Your code now depends tightly on the environment and may break elsewhere.
If you use webpack-isomorphic-tools then it'll take your webpack config into account for your server side which will make all your aliases work.
https://www.npmjs.com/package/webpack-isomorphic-tools

How to add async callbacks in node to a function call?

Question is too broad / unclear. Anyone interested in this answer would be better served by visiting: Creating Callbacks for required modules in node.js
Basically, I have included a CLI package in my Node application. I need the CLI to spin up a new project (this entails creating a folder for the project). After the project folder is created, I need to create some files in the folder (using fs writeFile). The problem right now is that my writeFile function executes BEFORE the folder is created by the CLI package (this is detected by my console.log). This brings me to my main question.
Can I add an async callback function to the CLI.new without modifying the package I included?
FoundationCLI.new(null, {
  framework: 'sites', // 'apps' or 'emails' also
  template: 'basic',  // 'advanced' also
  name: projectName,
  directory: $scope.settings.path.join("")
});

try {
  if (!fs.existsSync(path)) {
    console.log("DIRECTORY NOT THERE!!!!!");
  }
  fs.writeFileSync(correctedPath, JSON.stringify(project), 'utf-8');
} catch(err) {
  throw err;
}
It uses foundation-cli. The new command executes the following async series. I'd love to add a callback to the package - still not quite sure how.
async.series(tasks, finish);
Anyone interested in this can probably get mileage out of:
Creating Callbacks for required modules in node.js
The code for the new command seems to be available at https://github.com/zurb/foundation-cli/blob/master/lib/commands/new.js
This code was not written to allow programmatic usage of the new command (it uses console.log everywhere) and does not call any callback when the work is finished.
So no, there is no way to use this package to do what you are looking for. Either patch the package or find another way to achieve what you want.
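If patching the package is not an option, one crude workaround (a sketch reusing the question's own variables, not part of foundation-cli) is to poll for the directory the CLI is expected to create and only write the file once it exists:
const fs = require('fs');

function whenDirExists(dir, onReady, interval = 200) {
  const timer = setInterval(() => {
    if (fs.existsSync(dir)) {
      clearInterval(timer);
      onReady();
    }
  }, interval);
}

FoundationCLI.new(null, {
  framework: 'sites',
  template: 'basic',
  name: projectName,
  directory: $scope.settings.path.join("")
});

whenDirExists(path, () => {
  fs.writeFileSync(correctedPath, JSON.stringify(project), 'utf-8');
});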

Add global function to every execution of NodeJS

In my project folder I have a file utils.js like this:
global.root_path = process.cwd();
global.custom_require = function() {
  console.log('custom require called');
}
I would like to include this file in every execution of Node so that in every JS file in the project I can call custom_require or access global.root_path, as I already do with the built-in function require.
Do you know if there is a solution (command line options, environment variable or whatever) to achieve this?
UPDATE
I don't want to include utils.js in every file; in that case I wouldn't need to modify the global object.
Thanks!
Simply require it. Example app.js:
require('./utils.js');
var express = custom_require('express');
var app = express();
var port = process.env.PORT || 3000;
app.listen(port);
Then, you could just do (in terminal)
node app
As long as you require the file containing the global declarations, you can use those globals from any file that's required after it.
You most probably don't want to have a custom "plugin" altering the behavior of all your applications when run locally, thus it'd be best to stay away from that kind of pattern. Instead, you might want to create a module, use npm link on it, and then npm link something in your project's directory, where something is the module name.
Then each project could just add one line like require('something'); at the very beginning. Changing something would immediately impact all the projects which included it, thanks to the behavior in npm link.
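A quick sketch of that flow (hypothetical paths, with something standing in for the module's real name):
$ cd ~/dev/something && npm link             # register the package globally (run once)
$ cd ~/dev/my-project && npm link something  # symlink it into this project's node_modules
Then require('something'); at the top of each entry point pulls in the shared globals, and edits to ~/dev/something are picked up immediately thanks to the symlink.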
