How do I add functions to LESS in node.js

I used to use less-middleware, but switched to less as I needed to add custom functions/constants. I assume less-middleware uses less behind the scenes.
To define constants I used:
function lessDefine(name, value) {
  less.tree.functions[name] = function () {
    return new (less.tree.Anonymous)(value);
  };
}
and to create a constant:
lessDefine('backgroundImage', config.backgroundImage);
Then to link into Express:
router.use('/stylesheets',
  less.middleware(
    lessOpts.source, lessOpts.main,
    lessOpts.parser, lessOpts.compiler));
This was after crawling through bug reports, feature requests and Google results, as the ability to extend LESS is not well documented. This worked last week, but now I see:
TypeError: Object #<Object> has no method 'middleware'
when I try to launch my node app.
What is the "correct" way to add functions/constants to LESS in Node.js?
Note that I use:
var less = require('less');
var lessMiddleware = require('less-middleware');
And my package.json uses the following version limits:
less-middleware: 1.0.3
less: 1.7.x

less.middleware is added by the less-middleware module.
Monkeypatching is frowned upon for a reason: less-middleware had its own copy of less installed within its folder (by npm), while my app was using a separate instance of less.
Hence, less.middleware was not being set on the instance of less that my app sees. After running npm dedupe, the duplicate instances were consolidated and my app now works.
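To check whether you are hitting the same duplicate-module situation, a rough diagnostic like the following (not part of the fix, and it only lists copies that have actually been loaded) prints every cached copy of less; before npm dedupe you would typically see both node_modules/less and node_modules/less-middleware/node_modules/less, afterwards only one:
// Inspect require's module cache for every loaded copy of less.
Object.keys(require.cache)
  .filter(function (file) { return /node_modules[\\\/]less[\\\/]/.test(file); })
  .forEach(function (file) { console.log(file); });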

Related

Require behaves differently in Angular project

If I write something like const theoretically = require('jasmine-theories');, require returns the content of the file.
But if I add declare var require: any;, then subsequent require calls step inside webpack's bootstrap function __webpack_require__(moduleId) and return the real file path with a hash, not the content (for example 'file.65465436547.js').
I've found out that file-loader has this behavior: https://www.npmjs.com/package/file-loader.
I assume that, depending on declare var require: any;, require is taken either from Node.js or from file-loader.
Is that correct? And is there a more obvious way to know how and when to use each of them?
And how can I configure file-loader to behave differently in an Angular application? Angular CLI doesn't expose webpack.config, so loaders just get installed without any way to configure them.
Overall the question can be shortened to:
Why does require return content in one case and a file name in the other?
Don't use require, use
import { theoretically } from 'jasmine-theories';
This is webpack tree-shakable.
It looks like, before declare var require: any;, the default require from Node.js is used. After the explicit declaration, file-loader's method becomes visible, and since it has a default export, require usage switches to file-loader's version.

Setting debug strings dynamically in Loopback 2.0

I use debug strings for debugging a Loopback 2.0 application. The Loopback documentation says:
The LoopBack framework has a number of built-in debug strings to help
with debugging. Specify a string on the command-line via an
environment variable as follows:
MacOS and Linux
$ DEBUG=<pattern>[,<pattern>...] node .
Is it possible to change patterns dynamically in runtime? Or is it possible to use environment-specific configuration?
Before I go deeper, note that this debug logging facility uses visionmedia's debug module, which handles almost all of the logic.
Is it possible to change patterns dynamically in runtime?
Well, before any module is loaded, the safest and best way I believe is to just manipulate the environment variable:
if (process.env.NODE_ENV === 'development') {
  process.env.DEBUG = process.env.DEBUG + ',loopback:*';
}
Another way would be to load debug and use its .enable method:
require('debug').enable('loopback:*');
But note that this only works if you do it before Loopback is required, since debug only allows changes before its instances are created, which in this case means before loopback is loaded. Another thing is that there might be multiple copies of the debug module installed, depending on your dependencies and your package manager (npm#3, npm#2 and yarn behave differently). debug might be in your node_modules directory, or it might be in each module's own node_modules directory. So if you want to do it this way, make sure you require all instances of it and enable the pattern on each, as sketched below.
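A rough sketch of that last point (not from the original answer), assuming the glob package is available as an extra helper, which enables the pattern on every installed copy of debug before loopback is required:
// Enable the namespace on every copy of debug that npm may have nested under
// other packages, then load loopback. glob is an assumed extra dependency.
var path = require('path');
var glob = require('glob');

glob.sync('node_modules/**/debug/package.json').forEach(function (pkg) {
  require(path.resolve(path.dirname(pkg))).enable('loopback:*');
});

var loopback = require('loopback');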
Now, if you don't want to do it on startup, the API doesn't allow changes at runtime. You can view the discussion regarding this here. There are some dirty ways to go around it, but these might break in the future, so be careful.
Firstly, there's a module called hot-debug which supposedly makes require('debug').enable work on previously created instances as well. When I tried it, it didn't work perfectly and was buggy, but it might work fine for you.
If that doesn't work for you, another way is to override the require('debug').log method. If this is defined, debug will call it with the formatted arguments instead of console.log. You can set DEBUG=* and then filter the output yourself:
require('debug').log = function (string) {
  if (string.indexOf('loopback:security') !== -1) {
    console.log(string);
  }
};
This will be slow in production though, as all the debug output is formatted before being filtered, even when nothing ends up being written to the console.
Another option is to override the require('debug').init method. This is called every time a new debug instance is created. Since every debug instance uses an enabled property to check whether it's enabled, we can toggle that.
const debug = require('debug');
const { init } = debug;
const instances = [];

debug.init = function (debugInstance) {
  init(debugInstance);
  instances.push(debugInstance);
};

// You can call this function later to enable a given namespace like loopback.security.acl
function enableNamespace(namespace) {
  instances.forEach(instance => {
    instance.enabled = instance.namespace === namespace;
  });
}
There's a lot that could be improved here, but you get the idea.
I can change debug namespaces dynamically with the hot-debug module.
In my app, I've just created a function for this :
require('hot-debug')
const debug = require('debug')
function forceDebugNamespaces (namespaces) {
  debug.enable(namespaces)
}
// Usage
forceDebugNamespaces('*,-express:*,-nodemon:*,-nodemon')
In my case, I have a config file which allows me to set process.env.DEBUG, but I needed a way to update debug namespaces without restarting my app (the config file is watched for changes by my app).
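A minimal sketch of that last part, assuming a hypothetical config.json with a debug field, which re-applies the namespaces whenever the watched file changes:
// Re-apply the debug namespaces whenever the (hypothetical) config.json changes.
const fs = require('fs')

fs.watch('config.json', () => {
  const config = JSON.parse(fs.readFileSync('config.json', 'utf8'))
  forceDebugNamespaces(config.debug || '*')
})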

Using node module in angularjs?

What's the best practice for using external code, e.g. code found in node modules, in angular?
I'd like to use this https://www.npmjs.com/package/positionsizingcalculator node module in my angular app. I've created an angular service intended to wrap the node module, and now I want to make the service use the node module.
'use strict';
angular.module('angularcalculator')
.service('MyService', function () {
this.calculate = function () {
return {
//I want to call the node module here, whats the best practice?
};
}
});
To do this, I would crack open the package and grab the .js out of it. This package is MIT licensed, so we can do whatever we want. If you navigate to /node_modules/positionsizingcalculator/ you'll find an index.js. Open that up and you'll see the module export, which takes a function that returns an object.
You'll notice this is an extremely similar pattern to .factory, which also takes a function that returns an object (or constructor, depending on your pattern). So I'd do the following:
.factory('positionsizingcalculator', function () {
  // Paste the whole declaration from index.js in here, e.g.:
  var basicValidate = function (argument) {
    // ...
  };
  // ...
  return position; // whatever object index.js ultimately exports
});
and then inject it where you need it:
.controller('AppController', function (positionsizingcalculator) {
  //use it here as you would in node after you inject it via require.
});
--
Edit: This is good for one-off grabs of the JS, but if you want a more extensible solution, http://browserify.org/ is a better bet. It allows you to transform your requires into a single bundle. Note that this could result in pulling down a lot more code than you might otherwise need if you make one require bundle for your whole site, as this is not true AMD and you need to load everything you might want on the client, unless you make page-specific bundles.
You'd still want to do the require in a factory and return it, to keep it in Angular's dependency injection framework.
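A minimal sketch of that combination, assuming browserify bundles the app code so that require is available in the browser:
// Wrap the browserified require in a factory so the module stays inside
// Angular's dependency injection.
angular.module('angularcalculator')
  .factory('positionsizingcalculator', function () {
    return require('positionsizingcalculator');
  });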

how to pass a shared variable to downstream modules?

I have a node top-level myapp variable that contains some key application state: loggers, db handlers and some other data. The modules downstream in the directory hierarchy need access to this data. How can I set up a key/value system in node to do that?
A highly upvoted and accepted answer to Express: How to pass app-instance to routes from a different file? suggests using, in a lower-level module:
//in routes/index.js
var app = require("../app");
But this injects hard-coded knowledge of the directory structure and file names, which should be an even bigger no-no imho. Is there some other method, like something native in JavaScript? Nor do I relish the idea of declaring variables without var.
What is the node way of making a value available to objects created in lower scopes? (I am very much new to node and all-things-node aren't yet obvious to me)
Thanks a lot.
Since using node global (docs here) seems to be the solution that OP used, thought I'd add it as an official answer to collect my valuable points.
I strongly suggest that you namespace your variables, so something like
global.myApp.logger = { /* info here */ };
global.myApp.db = {
  url: 'mongodb://localhost:27017/test',
  connectOptions: {}
};
If you are in app.js and just want to allow access to it
global.myApp = this;
As always, use globals with care...
This is not really related to node but rather general software architecture decisions.
When you have client and server modules/packages/classes (call them whichever way you like), one way is to define routines on the server module that take as arguments whatever state data your client keeps in the 'global' scope, complete their tasks and report back to the client with the results.
This way, it is perfectly decoupled and you have a strict control of what data goes where.
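A minimal sketch of that idea, with hypothetical file and function names:
// server-module.js: receives the state it needs as arguments instead of
// reaching for app-level globals.
module.exports.handleRequest = function (logger, db, payload) {
  logger.info('handling request', payload);
  return db.collection('items').find(payload);
};

// app.js: the caller decides exactly which pieces of state to hand over.
var serverModule = require('./server-module');
serverModule.handleRequest(myApp.logger, myApp.db, { id: 1 });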
Hope this helps :)
One way to do this is with an anonymous function, i.e. instead of exporting an object with module.exports, export a function that returns an appropriate value.
So, let's say we want to pass var1 down to our two modules, ./module1.js and ./module2.js. This is how the module code would look:
module.exports = function (var1) {
  return {
    doSomething: function () { return var1; }
  };
};
Then, we can call it like so:
var downstream = require('./module1')('This is var1');
Giving you exactly what you want.
I just created an empty module and installed it under node_modules as appglobals.js
// index.js
module.exports = {};
// package.json too is barebones
{ "name": "appGlobals" }
And then strut it around anywhere without fearing refactoring in the future:
var g = require("appglobals");
g.foo = "bar";
I wish it came built in as setter/getter, but the flexibility has to be admired.
(Now I only need to figure out how to package it for production)
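If you do want the setter/getter feel, a small sketch along these lines could wrap the shared module (the property name foo is just an example):
// Expose a value on the shared module behind an explicit getter/setter pair.
var g = require('appglobals');

Object.defineProperty(g, 'foo', {
  get: function () { return g._foo; },
  set: function (value) { g._foo = value; }
});

g.foo = 'bar';
console.log(g.foo); // 'bar'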

Using a global variable in Node.js for dependency injection

I'm starting out a long term project, based on Node.js, and so I'm looking to build upon a solid dependency injection (DI) system.
Although Node.js at its core implies using simple module require()s for wiring components, I find this approach not best suited for a large project (e.g. requiring modules in each file is not that maintainable, testable or dynamic).
Now, I'd done my bits of research before posting this question and I've found out some interesting DI libraries for Node.js (see wire.js and dependable.js).
However, for maximal simplicity and minimal repetition I've come up with my own proposition of implementing DI:
You have a module, di.js, which acts as the container and is initialized by pointing to a JSON file storing a map of dependency names and their respective .js files.
This already provides a dynamic nature to the DI, as you may easily swap test/development dependencies.
The container can return dependencies by using an inject() function, which finds the dependency mapping and calls require() with it.
For simplicity, the module is assigned to a global variable, i.e. global.$di, so that any file in the project may use the container/injector by calling $di.inject().
Here's the gist of the implementation:
File di.js
module.exports = function (path) {
  var deps = require(path);
  return {
    inject: function (name) {
      if (!deps[name])
        throw new Error('dependency "' + name + '" isn\'t registered');
      return require(deps[name]);
    }
  };
};
Dependency map JSON file
{
  "vehicle": "lib/jetpack",
  "fuel": "lib/benzine",
  "octane": "lib/octane98"
}
Initialize the $di in the main JavaScript file, according to development/test mode:
var path = 'dep-map-' + process.env.NODE_ENV + '.json';
$di = require('di')(path);
Use it in some file:
var vehicle = $di.inject('vehicle');
vehicle.go();
So far, the only problem I could think of using this approach is the global variable $di.
Supposedly, global variables are a bad practice, but it seems to me like I'm saving a lot of repetition for the cost of a single global variable.
What can be suggested against my proposal?
Overall this approach sounds fine to me.
The way global variables work in Node.js is that when you declare a variable without the var keyword, it gets added to the global object, which is shared between all modules. You can also explicitly use global.varname. Example:
vehicle = "jetpack"
fuel = "benzine"
console.log(vehicle) // "jetpack"
console.log(global.fuel) // "benzine"
Variables declared with var will only be local to the module.
var vehicle = "car"
console.log(vehicle) // "car"
console.log(global.vehicle) // "jetpack"
So in your code if you are doing $di = require('di')(path) (without var), then you should be able to use it in other modules without any issues. Using global.$di might make the code more readable.
Your approach is a clear and simple one which is good. Whether you have a global variable or require your module every time is not important.
Regarding testability, it allows you to replace your modules with mocks. For unit testing you should add a function that makes it easy to apply different mocks for each test, something that extends your dependency map temporarily.
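A minimal sketch of what such a helper could look like (the withMock name and the wrap-inject approach are assumptions, not part of the container above):
// Temporarily swap a single dependency for a mock, run the test, then restore.
function withMock($di, name, mock, testFn) {
  var originalInject = $di.inject;
  $di.inject = function (depName) {
    return depName === name ? mock : originalInject(depName);
  };
  try {
    testFn();
  } finally {
    $di.inject = originalInject;
  }
}

// Usage in a test:
withMock($di, 'fuel', { octaneRating: 98 }, function () {
  // code under test now receives the mock from $di.inject('fuel')
});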
For further reading I can recommend a great blog article on dependency injection in Node.js as well as a talk on the future dependency injector of angular.js which is designed by some serious masterminds.
BTW, you might be interested in Fire Up! which is a dependency injection container I implemented.
