Make required json available in other modules - node.js

I am writing an application that has configuration parameters in a json file. Something like this:
// config.json
{
"httpServer": {
"port": 3000
},
"module1": {
"setting1": "value1",
"setting2": "value2"
},
"module2": {
"setting1": "value1",
"setting2": "value2"
}
}
// index.js
const config = require("./config")
const func1 = require("./module1")
const func2 = require("./module2")
// code here
// module1.js
const config = require("./config")
// use config and define functions
module.exports = {
function: function
}
// module2.js
const config = require("./config")
// use config and define functions
module.exports = {
function: function
}
The problem is that I am requiring this file in every module, which makes my code hard to maintain: if the filename changes I need to update every require statement. I am pretty sure this is not the "correct" way of doing this. Can I require the configuration file once when the program starts and then reference it in other modules? Or should I pass the configuration file as a command-line argument and then use the process.argv array when requiring the file? What is the best way to handle situations like these?

Use the dotenv package: npm install dotenv --save
Create a config file:
//config.env
NODE_ENV=development
IP=127.0.0.1
PORT=3000
Load the config file:
//index.js
const dotenv = require('dotenv');
dotenv.config({ path: './config.env' })
Use it wherever you want:
//module1
console.log('IP: ',process.env.IP)
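For example, the httpServer port from the question's config could come from the same environment file (a minimal sketch; the server itself is illustrative):
// index.js
const dotenv = require('dotenv');
dotenv.config({ path: './config.env' });
const http = require('http');
// process.env values are strings, so parse numbers explicitly
const port = parseInt(process.env.PORT, 10) || 3000;
http.createServer((req, res) => res.end('ok'))
  .listen(port, process.env.IP, () => {
    console.log(`listening on ${process.env.IP}:${port}`);
  });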

To be honest, I don't really see anything wrong with requiring the config in multiple files. Since you need it, you are requiring it.
If you really don't want to require it multiple times, you could consider this:
Convert the function style to class style and then inject the config as a dependency into that class.
Main File
const config = require("./config");
const file1 = new File1(config);
const file2 = new File2(config);
File 1
class File1 {
constructor(config) {
this.config_ = config;
}
someFunction() {
// use this.config_ here
}
}
File 2
class File2 {
constructor(config) {
this.config_ = config;
}
someFunction() {
// use this.config_ here
}
}
A few advantages of using this approach are:
It is more testable, since you can mock the config if you want (see the sketch below).
You only need to change the config in one place; because it is injected, the other places don't need to change.
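For instance, a minimal Jest-style sketch of the testability point, assuming File1 is exported from its own file (the test body is illustrative):
// file1.test.js
const File1 = require('./file1');
test('File1 works against an injected fake config', () => {
  const fakeConfig = { module1: { setting1: 'test-value' } };
  const file1 = new File1(fakeConfig);
  // someFunction reads this.config_, which is now the fake config instead of the real file
  file1.someFunction();
  expect(file1.config_).toBe(fakeConfig);
});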

Related

Module not found: Can't resolve 'fs' in Next.js application

I am unable to identify what's happening in my Next.js app. fs is a built-in Node.js file system module, yet it gives a "Module not found" error.
If you use fs, be sure it's only used within getInitialProps or getServerSideProps (anything that involves server-side rendering).
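For instance, a minimal sketch of keeping fs confined to getServerSideProps (the file path and prop are illustrative; the longer examples further down show the same pattern in detail):
// pages/example.js
import fs from 'fs';
export async function getServerSideProps() {
  // runs on the server only, so fs is safe here
  const fileExists = fs.existsSync('/tmp/some-file');
  return { props: { fileExists } };
}
export default function ExamplePage({ fileExists }) {
  return <div>{fileExists ? 'found' : 'missing'}</div>;
}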
You may also need to create a next.config.js file with the following content to get the client bundle to build:
For webpack4
module.exports = {
webpack: (config, { isServer }) => {
// Fixes npm packages that depend on `fs` module
if (!isServer) {
config.node = {
fs: 'empty'
}
}
return config
}
}
For webpack5
module.exports = {
webpack5: true,
webpack: (config) => {
config.resolve.fallback = { fs: false };
return config;
},
};
Note: for other modules such as path, you can add multiple entries, such as
{
fs: false,
path: false
}
I spent hours on this, and the solution is also here on Stack Overflow but on a different question -> https://stackoverflow.com/a/67478653/17562602
I asked for MOD permission to reshare this, since this issue is the first one to show up on Google and more and more people will probably stumble upon the same problem I did, so I'll try to save them some sweat.
So, you need to add this to your next.config.js:
module.exports = {
future: {
webpack5: true, // by default, if you customize webpack config, they switch back to version 4.
// Looks like backward compatibility approach.
},
webpack(config) {
config.resolve.fallback = {
...config.resolve.fallback, // if you miss it, all the other options in fallback, specified
// by next.js will be dropped. Doesn't make much sense, but how it is
fs: false, // the solution
};
return config;
},
};
It works like a charm for me.
Minimal reproducible example
A clean minimal example will be beneficial to Webpack beginners, since automatic splitting based on usage is so mind-blowingly magical.
Working hello world baseline:
pages/index.js
// Client + server code.
export default function IndexPage(props) {
return <div>{props.msg}</div>
}
// Server-only code.
export function getStaticProps() {
return { props: { msg: 'hello world' } }
}
package.json
{
"name": "test",
"version": "1.0.0",
"scripts": {
"dev": "next",
"build": "next build",
"start": "next start"
},
"dependencies": {
"next": "12.0.7",
"react": "17.0.2",
"react-dom": "17.0.2"
}
}
Run with:
npm install
npm run dev
Now let's add a dummy require('fs') to blow things up:
// Client + server code.
export default function IndexPage(props) {
return <div>{props.msg}</div>
}
// Server-only code.
const fs = require('fs')
export function getStaticProps() {
return { props: { msg: 'hello world' } }
}
fails with:
Module not found: Can't resolve 'fs'
which is not too surprising, since there was no way for Next.js to know that fs was server-only, and we wouldn't want it to just ignore random require errors, right? Next.js only knows that for getStaticProps because that's a hardcoded Next.js function name.
OK, so let's inform Next.js by using fs inside getStaticProps; the following works again:
// Client + server code.
export default function IndexPage(props) {
return <div>{props.msg}</div>
}
// Server-only code.
const fs = require('fs')
export function getStaticProps() {
fs
return { props: { msg: 'hello world' } }
}
Mind equals blown. So we understand that any mention of fs inside the body of getStaticProps, even a useless one like the above, makes Next.js/Webpack understand that it is going to be server-only.
Things would work the same for getServerSideProps and getStaticPaths.
Higher order components (HOCs) have to be in their own files
Now, the way that we factor out IndexPage and getStaticProps across different but similar pages is to use HOCs, which are just functions that return other functions.
HOCs will normally be put outside of pages/ and then required from multiple locations, but when you are about to factor things out to generalize, you might be tempted to put them directly in the pages/ file temporarily, something like:
// Client + server code.
import Link from 'next/link'
export function makeIndexPage(isIndex) {
return (props) => {
return <>
<Link href={isIndex ? '/index' : '/notindex'}>
<a>{isIndex ? 'index' : 'notindex'}</a>
</Link>
<div>{props.fs}</div>
<div>{props.isBlue}</div>
</>
}
}
export default makeIndexPage(true)
// Server-only code.
const fs = require('fs')
export function makeGetStaticProps(isBlue) {
return () => {
return { props: {
fs: Object.keys(fs).join(' '),
isBlue,
} }
}
}
export const getStaticProps = makeGetStaticProps(true)
but if you do this you will be saddened to see:
Module not found: Can't resolve 'fs'
So we understand another thing: the fs usage has to be directly inside the getStaticProps function body; Webpack can't catch it in subfunctions.
The only way to solve this is to have a separate file for the backend-only stuff as in:
pages/index.js
// Client + server code.
import { makeIndexPage } from "../front"
export default makeIndexPage(true)
// Server-only code.
import { makeGetStaticProps } from "../back"
export const getStaticProps = makeGetStaticProps(true)
pages/notindex.js
// Client + server code.
import { makeIndexPage } from "../front"
export default makeIndexPage(false)
// Server-only code.
import { makeGetStaticProps } from "../back"
export const getStaticProps = makeGetStaticProps(false)
front.js
// Client + server code.
import Link from 'next/link'
export function makeIndexPage(isIndex) {
return (props) => {
console.error('page');
return <>
<Link href={isIndex ? '/notindex' : '/'}>
<a>{isIndex ? 'notindex' : 'index'}</a>
</Link>
<div>{props.fs}</div>
<div>{props.isBlue}</div>
</>
}
}
back.js
// Server-only code.
const fs = require('fs')
export function makeGetStaticProps(isBlue) {
return () => {
return { props: {
fs: Object.keys(fs).join(' '),
isBlue,
} }
}
}
Webpack must see the name makeGetStaticProps getting assigned to getStaticProps; from that it decides that the entire back file is server-only.
Note that it does not work if you try to merge back.js and front.js into a single file, probably because when you do export default makeIndexPage(true) webpack necessarily tries to pull the entire front.js file into the frontend, which includes the fs, so it fails.
This leads to a natural (and basically almost mandatory) split of library files between:
front.js and front/*: front-end + backend files. These are safe for the frontend. And the backend can do whatever the frontend can do (we are doing SSR right?) so those are also usable from the backend.
Perhaps this is the idea behind the conventional "components" folder in many official examples. But that is a bad name, because that folder should contain not only components but also any non-component library helpers/constants that will be used from the frontend.
back.js and back/* (or alternatively anything outside of front/*): backend-only files. These can only be used by the backend; importing them on the frontend will lead to the error.
fs, path, and other Node native modules can be used only inside server-side code, like the getServerSideProps family of functions. If you try to use them on the client you get an error even if you just console.log them; that console.log has to run inside server-side functions as well.
When you import "fs" and use it server-side, Next.js is clever enough to see that it is only used server-side, so it won't add that import to the client bundle.
One of the packages I used was giving me this error; I fixed it with:
module.exports = {
webpack: (config, { isServer }) => {
if (!isServer) {
config.resolve.fallback.fs = false
}
return config
},
}
but this was throwing a warning in the terminal:
"Critical dependency: require function is used in a way in which dependencies cannot be statically extracted"
Then I tried to load the node module in the browser. I copied the min.js of the node module from node_modules, placed it in "public/js/myPackage.js", and loaded it with Script:
export default function BaseLayout({children}) {
return (
<>
<Script
// this in public folder
src="/js/myPackage.js"
// this means this script will be loaded first
strategy="beforeInteractive"
/>
</>
)
}
This package attaches itself to the window object; in the node module's source index.js:
if (typeof window !== "undefined") {
window.TruffleContract = contract;
}
So I could access the script as window.TruffleContract. But this was not an efficient way.
While this error requires a bit more reasoning than most errors you'll encounter, it happens for a straightforward reason.
Why this happens
Next.js, unlike many frameworks, allows you to import server-only code (Node.js APIs that don't work in a browser) into your page files. When Next.js builds your project, it removes server-only code from your client-side bundle by checking which code exists inside any of the following built-in methods (code splitting):
getServerSideProps
getStaticProps
getStaticPaths
Side note: there is a demo app that visualizes how this works.
The Module not found: can't resolve 'xyz' error happens when you try to use server only code outside of these methods.
Error example 1 - basic
To reproduce this error, let's start with a working simple Next.js page file.
WORKING file
/** THIS FILE WORKS FINE! */
import type { GetServerSideProps } from "next";
import fs from "fs"; // our server-only import
type Props = {
doesFileExist: boolean;
};
export const getServerSideProps: GetServerSideProps = async () => {
const fileExists = fs.existsSync("/some-file");
return {
props: {
doesFileExist: fileExists,
},
};
};
const ExamplePage = ({ doesFileExist }: Props) => {
return <div>File exists?: {doesFileExist ? "Yes" : "No"}</div>;
};
export default ExamplePage;
Now, let's reproduce the error by moving our fs.existsSync method outside of getServerSideProps. The difference is subtle, but the code below will throw our dreaded Module not found error.
ERROR file
import type { GetServerSideProps } from "next";
import fs from "fs";
type Props = {
doesFileExist: boolean;
};
/** ERROR!! - Module not found: can't resolve 'fs' */
const fileExists = fs.existsSync("/some-file");
export const getServerSideProps: GetServerSideProps = async () => {
return {
props: {
doesFileExist: fileExists,
},
};
};
const ExamplePage = ({ doesFileExist }: Props) => {
return <div>File exists?: {doesFileExist ? "Yes" : "No"}</div>;
};
export default ExamplePage;
Error example 2 - realistic
The most common (and confusing) occurrence of this error happens when you are using modules that contain multiple types of code (client-side + server-side).
Let's say I have the following module called file-utils.ts:
import fs from 'fs'
// This code only works server-side
export function getFileExistence(filepath: string) {
return fs.existsSync(filepath)
}
// This code works fine on both the server AND the client
export function formatResult(fileExistsResult: boolean) {
return fileExistsResult ? 'Yes, file exists' : 'No, file does not exist'
}
In this module, we have one server-only method and one "shared" method that in theory should work client-side (but as we'll see, theory isn't perfect).
Now, let's try incorporating this into our Next.js page file.
/** ERROR!! */
import type { GetServerSideProps } from "next";
import { getFileExistence, formatResult } from './file-utils.ts'
type Props = {
doesFileExist: boolean;
};
export const getServerSideProps: GetServerSideProps = async () => {
return {
props: {
doesFileExist: getFileExistence('/some-file')
},
};
};
const ExamplePage = ({ doesFileExist }: Props) => {
// ERROR!!!
return <div>File exists?: {formatResult(doesFileExist)}</div>;
};
export default ExamplePage;
As you can see, we get an error here because when we attempt to use formatResult client-side, our module still has to import the server-side code.
To fix this, we need to split our modules up into two categories:
Server only
Shared code (client or server)
// file-utils.ts
import fs from 'fs'
// This code (and entire file) only works server-side
export function getFileExistence(filepath: string) {
return fs.existsSync(filepath)
}
// file-format-utils.ts
// This code works fine on both the server AND the client
export function formatResult(fileExistsResult: boolean) {
return fileExistsResult ? 'Yes, file exists' : 'No, file does not exist'
}
Now, we can create a WORKING page file:
/** WORKING! */
import type { GetServerSideProps } from "next";
import { getFileExistence } from './file-utils.ts' // server only
import { formatResult } from './file-format-utils.ts' // shared
type Props = {
doesFileExist: boolean;
};
export const getServerSideProps: GetServerSideProps = async () => {
return {
props: {
doesFileExist: getFileExistence('/some-file')
},
};
};
const ExamplePage = ({ doesFileExist }: Props) => {
return <div>File exists?: {formatResult(doesFileExist)}</div>;
};
export default ExamplePage;
Solutions
There are 2 ways to solve this:
The "correct" way
The "just get it working" way
The "Correct" way
The best way to solve this error is to make sure that you understand why it is happening (above) and make sure you are only using server-side code inside getStaticPaths, getStaticProps, or getServerSideProps and NOWHERE else.
And remember, if you import a module that contains both server-side and client-side code, you cannot use any of the imports from that module client-side (revisit example #2 above).
The "Just get it working" way
As others have suggested, you can alter your next.config.js to ignore certain modules at build-time. This means that when Next.js attempts to split your page file between server only and shared code, it will not try to polyfill Node.js APIs that fail to build client-side.
In this case, you just need:
/** next.config.js - with Webpack v5.x */
module.exports = {
... other settings ...
webpack: (config, { isServer }) => {
// If client-side, don't polyfill `fs`
if (!isServer) {
config.resolve.fallback = {
fs: false,
};
}
return config;
},
};
Drawbacks of this approach
As shown in the resolve.fallback section of the Webpack documentation, the primary reason for this config option is that, as of Webpack v5.x, core Node.js modules are no longer polyfilled by default. The main purpose of the option is therefore to let you define which polyfill you want to use.
When you pass false as an option, this means, "do not include a polyfill".
While this works, it can be fragile and require ongoing maintenance to include any new modules that you introduce to your project. Unless you are converting an existing project / supporting legacy code, it is best to go for option #1 above as it promotes better module organization according to how Next.js actually splits the code under the hood.
If you're trying to use fs-extra in Next.js, this worked for me:
module.exports = {
webpack: (config) => {
config.resolve.fallback = { fs: false, path: false, stream: false, constants: false };
return config;
}
}
I got this error in my NextJS app because I was missing the export keyword on
export function getStaticProps()
/** @type {import('next').NextConfig} */
module.exports = {
reactStrictMode: false,
webpack5: true,
webpack: (config) => {
config.resolve.fallback = {
fs: false,
net: false,
dns: false,
child_process: false,
tls: false,
};
return config;
},
};
This code fixed my problem and I want to share it. Add this code to your next.config.js file. I'm using webpack 5.
For me, clearing the cache
npm cache clean -f
and then updating Node.js to the latest stable release (14.17.0) worked.
It might be that the module you are trying to implement is not supposed to run in a browser. I.e. it's server-side only.
For me, the problem was an old version of Node.js being installed; it requires Node.js version 14 or higher. The solution was to go to the Node.js web page, download the latest version, and install it. Then re-run the project. All worked!
I had the same issue when I was trying to use Babel.
For me this worked:
Add a .babelrc file to the root of the project and define presets and plugins
(in my case, I had some issues with Babel macros, so I defined them):
{
"presets": ["next/babel"],
"plugins": ["macros"]
}
After that, shut down your server and run it again.
I had this exact issue. My problem was that I was importing types that I had declared in a types.d.ts file.
I was importing it like this, thanks to the autofill provided by VSCode.
import {CUSTOM_TYPE} from './types'
It should have been like this:
import {CUSTOM_TYPE} from './types.d'
In my case, I think the .d was unnecessary so I ended up removing it entirely and renamed my file to types.ts.
Weirdly enough, it was being imported directly into index.tsx without issues, but any helper files/functions inside the src directory would give me errors.
I ran into this in a NextJS application because I had defined a new helper function directly below getServerSideProps(), but had not yet called that function inside getServerSideProps().
I'm not sure why this created a problem, but it did. I could only get it to work by either calling that function, removing it, or commenting it out.
Don't use fs directly in the pages directory, since Next.js assumes that files in the pages directory run in the browser environment.
You could put the util file that uses fs in another directory, such as /core.
Then require the util inside getStaticProps, which runs in the Node.js environment.
// /pages/myPage/index.tsx
import View from './view';
export default View;
export async function getStaticProps() {
const util = require('core/some-util-uses-fs').default; // getStaticProps runs in Node.js
const data = await util.getDataFromDisk();
return {
props: {
data,
},
};
}
In my case, this error appeared while refactoring the auth flow of a Next.js page. The cause was some unused imports that I had not yet removed.
Previously I made the page a protected route like so:
export async function getServerSideProps ({ query, req, res }) {
const session = await unstable_getServerSession(req, res, authOptions)
if (!session) {
return {
redirect: {
destination: '/signin',
permanent: false,
},
}
}
//... rest of server-side logic
}
Whilst refactoring, I read up on NextAuth's useSession. Based on what I read there, I was able to change the implementation so that I simply needed to add
MyComponent.auth = true to make a page protected. I then deleted the aforementioned code block inside getServerSideProps. However, I had not yet deleted the two imports used by that code block:
import { unstable_getServerSession } from 'next-auth/next'
import { authOptions } from 'pages/api/auth/[...nextauth]'
I believe the second of those two imports was causing the problem. So the summary is that, in addition to all of the great answers above, it could also be an unused import.
Sometimes this error happens because you have imported something but never used it anywhere. This worked for me: I reviewed my code and removed the unused dependencies.

Test process.env with Jest

I have an application that depends on environmental variables like:
const APP_PORT = process.env.APP_PORT || 8080;
And I would like to test that for example:
APP_PORT can be set by a Node.js environment variable.
or that an Express.js application is running on the port set with process.env.APP_PORT
How can I achieve this with Jest? Can I set these process.env variables before each test, or should I mock them somehow?
The way I did it can be found in this Stack Overflow question.
It is important to use resetModules before each test and then dynamically import the module inside the test:
describe('environmental variables', () => {
const OLD_ENV = process.env;
beforeEach(() => {
jest.resetModules() // Most important - it clears the cache
process.env = { ...OLD_ENV }; // Make a copy
});
afterAll(() => {
process.env = OLD_ENV; // Restore old environment
});
test('will receive process.env variables', () => {
// Set the variables
process.env.NODE_ENV = 'dev';
process.env.PROXY_PREFIX = '/new-prefix/';
process.env.API_URL = 'https://new-api.com/';
process.env.APP_PORT = '7080';
process.env.USE_PROXY = 'false';
const testedModule = require('../../config/env').default
// ... actual testing
});
});
If you are looking for a way to load environment values before running Jest, see the answer below; you should use setupFiles for that.
Jest's setupFiles is the proper way to handle this, and you need not install dotenv, nor use an .env file at all, to make it work.
jest.config.js:
module.exports = {
setupFiles: ["<rootDir>/.jest/setEnvVars.js"]
};
.jest/setEnvVars.js:
process.env.MY_CUSTOM_TEST_ENV_VAR = 'foo'
That's it.
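Any test can then read the variable set in the setup file, for example:
// example.test.js
test('reads the variable from .jest/setEnvVars.js', () => {
  expect(process.env.MY_CUSTOM_TEST_ENV_VAR).toBe('foo');
});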
Another option is to add it to the jest.config.js file after the module.exports definition:
process.env = Object.assign(process.env, {
VAR_NAME: 'varValue',
VAR_NAME_2: 'varValue2'
});
This way it's not necessary to define the environment variables in each .spec file and they can be adjusted globally.
In ./package.json:
"jest": {
"setupFiles": [
"<rootDir>/jest/setEnvVars.js"
]
}
In ./jest/setEnvVars.js:
process.env.SOME_VAR = 'value';
You can use the setupFiles feature of the Jest configuration. As the documentation says:
A list of paths to modules that run some code to configure or set up
the testing environment. Each setupFile will be run once per test
file. Since every test runs in its own environment, these scripts will
be executed in the testing environment immediately before executing
the test code itself.
Install dotenv, which is used to access environment variables: npm install dotenv
Create your .env file in the root directory of your application and add this line to it:
#.env
APP_PORT=8080
Create your custom module file, named someModuleForTest.js, and add this line to it:
// someModuleForTest.js
require("dotenv").config()
Update your jest.config.js file like this:
module.exports = {
setupFiles: ["./someModuleForTest"]
}
You can access an environment variable within all test blocks.
test("Some test name", () => {
expect(process.env.APP_PORT).toBe("8080")
})
Expanding a bit on Serhan C.'s answer...
According to the blog post How to Setup dotenv with Jest Testing - In-depth Explanation, you can include "dotenv/config" directly in setupFiles, without having to create and reference an external script that calls require("dotenv").config().
I.e., simply do
module.exports = {
setupFiles: ["dotenv/config"]
}
In test file:
const APP_PORT = process.env.APP_PORT || 8080;
In the test script of ./package.json:
"scripts": {
"test": "jest --setupFiles dotenv/config",
}
In ./.env:
APP_PORT=8080
In my opinion, it's much cleaner and easier to understand if you extract the retrieval of environment variables into a utility (you probably want to include a check to fail fast if an environment variable is not set anyway), and then you can just mock the utility.
// util.js
exports.getEnv = (key) => {
const value = process.env[key];
if (value === undefined) {
throw new Error(`Missing required environment variable ${key}`);
}
return value;
};
// app.test.js
const util = require('./util');
jest.mock('./util');
util.getEnv.mockImplementation(key => `fake-${key}`);
test('test', () => {...});
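For instance, a module that reads its port through the utility could be tested like this (server.js and its getPort function are hypothetical, only there to exercise the mock):
// server.js (hypothetical module under test)
const { getEnv } = require('./util');
exports.getPort = () => Number(getEnv('APP_PORT'));
// server.test.js
const util = require('./util');
jest.mock('./util');
util.getEnv.mockReturnValue('7080');
const { getPort } = require('./server');
test('reads APP_PORT through the mocked utility', () => {
  expect(getPort()).toBe(7080);
  expect(util.getEnv).toHaveBeenCalledWith('APP_PORT');
});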
Depending on how you organize your code, another option is to read the environment variable within a function that's executed at runtime.
In this file, the environment variable is read at import time, which requires dynamic requires in order to test different environment variables (as described in this answer):
const env = process.env.MY_ENV_VAR;
const envMessage = () => `MY_ENV_VAR is set to ${env}!`;
export default envMessage;
In this file, the environment variable is read at envMessage execution time, and you should be able to mutate process.env directly in your tests:
const envMessage = () => {
const env = process.env.MY_ENV_VAR;
return `MY_ENV_VAR is set to ${env}!`;
}
export default envMessage;
Jest test:
const vals = [
'ONE',
'TWO',
'THREE',
];
vals.forEach((val) => {
it(`Returns the correct string for each ${val} value`, () => {
process.env.MY_ENV_VAR = val;
expect(envMessage()).toEqual(...
You can put this in your jest.config.js:
require('dotenv').config()
This works for me.
All the above methods work if you're using require("dotenv").config within the jest.config.js file of a Node.js application without TypeScript, as Jialx or Henry Tipantuna suggested.
But if you're using ts-jest, put the following in your jest.config.ts file:
import dotenv from "dotenv"
dotenv.config()
/* config options below */
When using Typescript the following works for me:
in root:
jest.config.js
/* eslint-disable @typescript-eslint/no-var-requires */
const { pathsToModuleNameMapper } = require('ts-jest');
const { compilerOptions } = require('./tsconfig.paths.json');
module.exports = {
// [...]
moduleNameMapper: pathsToModuleNameMapper(compilerOptions.paths, { prefix: '<rootDir>/' }),
};
process.env = Object.assign(process.env, {
env_name: 'dev',
another_var: 'abc123',
});
To build upon @HenryTipantuña's suggestion, import dotenv in your jest.config.js and use a .env.test file in the config path:
require('dotenv').config({
path: '.env.test'
})
Building on top of @jahller's answer.
I made it responsive so you don't need to keep the files in sync as things change.
Put this at the bottom of your jest.config.js file.
const arr = require('fs')
.readFileSync('.env', 'utf8')
.split('\n')
.reduce((vars, i) => {
const [variable, value] = i.split('=')
vars[variable] = value
return vars
}, {})
process.env = Object.assign(process.env, arr)
It reads the contents of your .env file, splits every new line and reduces it all back down to an object where you then assign it to process.env
OR
just use dotenv in jest.setup.js 🤷‍♂️
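A minimal sketch of that last variant, assuming jest.setup.js is registered through setupFiles:
// jest.config.js
module.exports = {
  setupFiles: ['<rootDir>/jest.setup.js'],
};
// jest.setup.js
require('dotenv').config();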
I have the most simple implementation for env (specifically test.env):
require("dotenv").config({ path: './test.env' });
const { sum } = require('./sum.js');
describe('sum', () => {
beforeEach(() => {
jest.resetModules(); // remove cache
})
test('should success', () => {
expect(sum(1, 3)).toEqual(4);
})
})
I think you could try this too:
const currentEnv = process.env;
process.env = { ENV_NODE: 'whatever' };
// test code...
process.env = currentEnv;
This works for me, and you don't need any of the module-handling tricks above.

How do I override config values at runtime with node-config?

I'd like to override some values at test-time, specifically setting my retries for an http service to 1 (immediate failure, no retries). Our project uses node-config. According to the docs I can override with NODE_CONFIG env variable:
node myapp.js --NODE_CONFIG='{"Customer":{"dbConfig":{"host":"customerdb.prod"}}}'
Well I would prefer to do this in my test, but not for all tests. The code says that you can allow config mutations by setting ALLOW_CONFIG_MUTATIONS.
process.env.ALLOW_CONFIG_MUTATIONS = "true";
const importFresh = require('import-fresh');
importFresh("config");
process.env.NODE_CONFIG = JSON.stringify({httpServices:{integration:{enrich: {retryInterval: 1, retries: 1}}}});
expect(process.env.NODE_CONFIG, 'NODE_CONFIG not set').to.exist();
expect(process.env.NODE_CONFIG, 'NODE_CONFIG not set').to.match(/retryInterval/);
expect(process.env.ALLOW_CONFIG_MUTATIONS, 'ALLOW_CONFIG_MUTATIONS not set').to.equal("true");
const testConfig = require("config");
console.dir(testConfig.get("httpServices.integration.enrich"));
expect(testConfig.get("httpServices.integration.enrich.retryInterval"), 'config value not set to 1').to.equal(1);
Result:
{ url: 'https://internal-**********',
retryInterval: 5000,
retries: 5 }
`Error: config value not set to 1: Expected 5000 to equal specified value: 1`
How do I get this override to work?
(expect is from Hapi.js Code library)
I'm one of the maintainers of node-config. Your bug is that you used require the second time when you should have used importFresh again.
Your first use of "importFresh()" does nothing different than require() would, because it is the first use of require().
After setting some variables, you call require(), which will return the copy of config already generated and cached, ignoring the effects of the environment variables set.
You only needed to use importFresh() once, where you currently use require(). This will cause a "fresh" copy of the config object to be returned, as you expected.
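Applied to the snippet from the question, that advice looks roughly like this (a sketch, not a verbatim patch):
process.env.ALLOW_CONFIG_MUTATIONS = "true";
const importFresh = require('import-fresh');
// set the override *before* loading config
process.env.NODE_CONFIG = JSON.stringify({httpServices:{integration:{enrich: {retryInterval: 1, retries: 1}}}});
// load a fresh copy here, instead of the cached require("config")
const testConfig = importFresh("config");
expect(testConfig.get("httpServices.integration.enrich.retryInterval"), 'config value not set to 1').to.equal(1);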
Simply changing config's property worked for me.
For example:
const config = require( 'config' );
config.httpServices.integration.enrich.retryInterval = 1;
// Do your tests...
UPD: Make sure that overrides are done before anyone calls the first config.get(), because the config object is made immutable as soon as any client uses the values via get().
Joining late, but the other answers did not fit the testing standard in my project, so here is what I came up with.
TL;DR
Use mocks.
Detailed Answer
node-config uses a get function to read configuration values.
By mocking get, you can easily modify any configuration value you see fit.
My personal favorite mocking library is sinon.
Here is an implementation of a mock with sinon:
const config = require('config');
const sinon = require('sinon');
class MockConfig {
constructor () {
this.params = { confValues: {} };
this.sandbox = sinon.sandbox.create();
}
withConfValue (confKey, confValue) {
this.params.confValues[confKey] = confValue;
return this;
}
reset () {
this.params.confValues = {};
return this;
}
restore() {
this.sandbox.restore();
}
apply () {
this.restore(); // avoid duplicate wrapping
this.sandbox.stub(config, 'get').callsFake((configKey) => {
if (this.params.confValues.hasOwnProperty(configKey)) {
return this.params.confValues[configKey];
}
// not ideal.. however `wrappedMethod` approach did not work for me
// https://stackoverflow.com/a/57017971/1068746
return configKey
.split('.')
.reduce((result, item) => result[item], config)
});
}
}
const instance = new MockConfig();
MockConfig.instance = () => instance;
module.exports = MockConfig;
Usage would be
const mockConfig = require('./mock_config').instance();
...
beforeEach(function () {
mockConfig.reset().apply();
})
afterEach(function () {
mockConfig.reset().restore();
})
it('should do something', function () {
mockConfig.withConfValue('some_topic.some_field.property', someValue);
// ... rest of the test ...
});
Assumptions
The only assumption this approach makes is that you adhere to the node-config way of reading the configuration (using the get function) and do not bypass it by accessing fields directly.
It's better to create development.json, production.json, and test.json files in your config folder; node-config will use them for your app configuration.
You just need to set NODE_ENV to pick the matching file.
Hope it helps :)
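A minimal sketch of that layout, reusing the retry settings from the question (the values are illustrative):
// config/test.json -- picked up when NODE_ENV=test
{
  "httpServices": {
    "integration": {
      "enrich": { "retryInterval": 1, "retries": 1 }
    }
  }
}
Then run the tests with, for example, NODE_ENV=test npm test.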

module.exports vs. export default in Node.js and ES6

What is the difference between Node's module.exports and ES6's export default? I'm trying to figure out why I get the "__ is not a constructor" error when I try to export default in Node.js 6.2.2.
What works
'use strict'
class SlimShady {
constructor(options) {
this._options = options
}
sayName() {
return 'My name is Slim Shady.'
}
}
// This works
module.exports = SlimShady
What doesn't work
'use strict'
class SlimShady {
constructor(options) {
this._options = options
}
sayName() {
return 'My name is Slim Shady.'
}
}
// This will cause the "SlimShady is not a constructor" error
// if in another file I try `let marshall = new SlimShady()`
export default SlimShady
The issue is with
how ES6 modules are emulated in CommonJS
how you import the module
ES6 to CommonJS
At the time of writing this, no environment supports ES6 modules natively. When using them in Node.js you need to use something like Babel to convert the modules to CommonJS. But how exactly does that happen?
Many people consider module.exports = ... to be equivalent to export default ... and exports.foo ... to be equivalent to export const foo = .... That's not quite true though, or at least not how Babel does it.
ES6 default exports are actually also named exports, except that default is a "reserved" name and there is special syntax support for it. Let's have a look at how Babel compiles named and default exports:
// input
export const foo = 42;
export default 21;
// output
"use strict";
Object.defineProperty(exports, "__esModule", {
value: true
});
var foo = exports.foo = 42;
exports.default = 21;
Here we can see that the default export becomes a property on the exports object, just like foo.
Import the module
We can import the module in two ways: either using CommonJS or using ES6 import syntax.
Your issue: I believe you are doing something like:
var bar = require('./input');
new bar();
expecting that bar is assigned the value of the default export. But as we can see in the example above, the default export is assigned to the default property!
So in order to access the default export we actually have to do
var bar = require('./input').default;
If we use ES6 module syntax, namely
import bar from './input';
console.log(bar);
Babel will transform it to
'use strict';
var _input = require('./input');
var _input2 = _interopRequireDefault(_input);
function _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { default: obj }; }
console.log(_input2.default);
You can see that every access to bar is converted to access .default.
Felix Kling did a great comparison of those two. For anyone wondering how to do an export default alongside named exports with module.exports in Node.js:
module.exports = new DAO()
module.exports.initDAO = initDAO // append other functions as named export
// now you have
let DAO = require('_/helpers/DAO');
// DAO by default is exported class or function
DAO.initDAO()
You need to configure Babel correctly in your project to use export default and export const foo.
npm install --save-dev @babel/plugin-proposal-export-default-from
Then add the configuration below in .babelrc:
"plugins": [
"@babel/plugin-proposal-export-default-from"
]

node.js require all files in a folder?

How do I require all files in a folder in node.js?
I need something like:
files.forEach(function (v, k) {
// require routes
require('./routes/' + v);
});
When require is given the path of a folder, it'll look for an index.js file in that folder; if there is one, it uses that, and if there isn't, it fails.
It would probably make most sense (if you have control over the folder) to create an index.js file and then assign all the "modules" and then simply require that.
yourfile.js
var routes = require("./routes");
index.js
exports.something = require("./routes/something.js");
exports.others = require("./routes/others.js");
If you don't know the filenames you should write some kind of loader.
Working example of a loader:
var normalizedPath = require("path").join(__dirname, "routes");
require("fs").readdirSync(normalizedPath).forEach(function(file) {
require("./routes/" + file);
});
// Continue application logic here
I recommend using glob to accomplish that task.
var glob = require( 'glob' )
, path = require( 'path' );
glob.sync( './routes/**/*.js' ).forEach( function( file ) {
require( path.resolve( file ) );
});
Based on @tbranyen's solution, I create an index.js file that loads arbitrary JavaScript files under the current folder as part of the exports.
// Load `*.js` under current directory as properties
// i.e., `User.js` will become `exports['User']` or `exports.User`
require('fs').readdirSync(__dirname + '/').forEach(function(file) {
if (file.match(/\.js$/) !== null && file !== 'index.js') {
var name = file.replace('.js', '');
exports[name] = require('./' + file);
}
});
Then you can require this directory from anywhere else.
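For example, assuming the folder is called models and contains the User.js mentioned in the comment above (the folder name is illustrative):
// elsewhere in the app
const models = require('./models'); // runs the index.js above
console.log(Object.keys(models));   // e.g. [ 'User', ... ]
const user = models.User;           // whatever User.js exports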
Another option is to use the package require-dir, which lets you do the following. It supports recursion as well.
var requireDir = require('require-dir');
var dir = requireDir('./path/to/dir');
I have a folder /fields full of files, each containing a single class, e.g.:
fields/Text.js -> Text class
fields/Checkbox.js -> Checkbox class
Drop this in fields/index.js to export each class:
var collectExports, fs, path,
__hasProp = {}.hasOwnProperty;
fs = require('fs');
path = require('path');
collectExports = function(file) {
var func, include, _results;
if (path.extname(file) === '.js' && file !== 'index.js') {
include = require('./' + file);
_results = [];
for (func in include) {
if (!__hasProp.call(include, func)) continue;
_results.push(exports[func] = include[func]);
}
return _results;
}
};
fs.readdirSync('./fields/').forEach(collectExports);
This makes the modules act more like they would in Python:
var text = new Fields.Text()
var checkbox = new Fields.Checkbox()
One more option is require-dir-all, which combines features from the most popular packages.
The most popular, require-dir, does not have options to filter the files/dirs and does not have a map function (see below), but uses a small trick to find the module's current path.
Second by popularity, require-all has regexp filtering and preprocessing, but lacks relative paths, so you need to use __dirname (this has pros and cons), like:
var libs = require('require-all')(__dirname + '/lib');
The require-index package mentioned here is quite minimalistic.
With map you may do some preprocessing, like creating objects and passing config values (assuming the modules below export constructors):
// Store config for each module in config object properties
// with property names corresponding to module names
var config = {
module1: { value: 'config1' },
module2: { value: 'config2' }
};
// Require all files in modules subdirectory
var modules = require('require-dir-all')(
'modules', // Directory to require
{ // Options
// function to be post-processed over exported object for each require'd module
map: function(reqModule) {
// create new object with corresponding config passed to constructor
reqModule.exports = new reqModule.exports( config[reqModule.name] );
}
}
);
// Now `modules` object holds not exported constructors,
// but objects constructed using values provided in `config`.
I know this question is 5+ years old, and the given answers are good, but I wanted something a bit more powerful for Express, so I created the express-map2 package for npm. I was going to name it simply express-map, but the people at Yahoo already have a package with that name, so I had to rename mine.
1. basic usage:
app.js (or whatever you call it)
var app = require('express')(); // 1. include express and create the app
app.set('controllers',__dirname+'/controllers/');// 2. set path to your controllers.
require('express-map2')(app); // 3. patch map() into express
app.map({
'GET /':'test',
'GET /foo':'middleware.foo,test',
'GET /bar':'middleware.bar,test' // separate your handlers with a comma.
});
controller usage:
//single function
module.exports = function(req,res){
};
//export an object with multiple functions.
module.exports = {
foo: function(req,res){
},
bar: function(req,res){
}
};
2. advanced usage, with prefixes:
app.map('/api/v1/books',{
'GET /': 'books.list', // GET /api/v1/books
'GET /:id': 'books.loadOne', // GET /api/v1/books/5
'DELETE /:id': 'books.delete', // DELETE /api/v1/books/5
'PUT /:id': 'books.update', // PUT /api/v1/books/5
'POST /': 'books.create' // POST /api/v1/books
});
As you can see, this saves a ton of time and makes the routing of your application dead simple to write, maintain, and understand. It supports all of the HTTP verbs that Express supports, as well as the special .all() method.
npm package: https://www.npmjs.com/package/express-map2
github repo: https://github.com/r3wt/express-map
Expanding on this glob solution: do this if you want to import all modules from a directory into an index.js and then import that index.js in another part of the application.
const glob = require("glob");
let allOfThem = {};
glob.sync(`${__dirname}/*.js`).forEach((file) => {
/* see note about this in example below */
allOfThem = { ...allOfThem, ...require(file) };
});
module.exports = allOfThem;
Full Example
Directory structure
globExample/example.js
globExample/foobars/index.js
globExample/foobars/unexpected.js
globExample/foobars/barit.js
globExample/foobars/fooit.js
globExample/example.js
const { foo, bar, keepit } = require('./foobars/index');
const longStyle = require('./foobars/index');
console.log(foo()); // foo ran
console.log(bar()); // bar ran
console.log(keepit()); // keepit ran unexpected
console.log(longStyle.foo()); // foo ran
console.log(longStyle.bar()); // bar ran
console.log(longStyle.keepit()); // keepit ran unexpected
globExample/foobars/index.js
const glob = require("glob");
/*
Note the following style also works with multiple exports per file (barit.js example)
but will overwrite if you have 2 exports with the same
name (unexpected.js and barit.js have a keepit function) in the files being imported. As a result, this method is best used when
you're exporting one module per file and use the filename to easily identify what is in it.
Also Note: This ignores itself (index.js) by default to prevent infinite loop.
*/
let allOfThem = {};
glob.sync(`${__dirname}/*.js`).forEach((file) => {
allOfThem = { ...allOfThem, ...require(file) };
});
module.exports = allOfThem;
globExample/foobars/unexpected.js
exports.keepit = () => 'keepit ran unexpected';
globExample/foobars/barit.js
exports.bar = () => 'bar run';
exports.keepit = () => 'keepit ran';
globExample/foobars/fooit.js
exports.foo = () => 'foo ran';
From inside the project, with glob installed, run node example.js:
$ node example.js
foo ran
bar run
keepit ran unexpected
foo ran
bar run
keepit ran unexpected
One module that I have been using for this exact use case is require-all.
It recursively requires all files in a given directory and its sub directories, as long as they don't match the excludeDirs property.
It also allows specifying a file filter and how to derive the keys of the returned hash from the filenames.
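A sketch of those options based on the package's documented usage (the filter, key mapping, and directory are illustrative and may need adjusting for your version of require-all):
const controllers = require('require-all')({
  dirname: __dirname + '/controllers',
  filter: /(.+Controller)\.js$/,          // only pick up *Controller.js files
  excludeDirs: /^\.(git|svn)$/,           // skip these sub directories
  recursive: true,
  map: (name, path) => name.replace(/Controller$/, '') // derive the hash keys from the filenames
});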
Require all files from the routes folder and apply them as middleware. No external modules needed.
// require
const { readdirSync } = require("fs");
// apply as middleware
readdirSync("./routes").map((r) => app.use("/api", require("./routes/" + r)));
I'm using the copy-to module to create a single file that requires all the files in our Node.js-based system.
The code for our utility file looks like this:
/**
* Module dependencies.
*/
var copy = require('copy-to');
copy(require('./module1'))
.and(require('./module2'))
.and(require('./module3'))
.to(module.exports);
In all of the files, most functions are written as exports, like so:
exports.function1 = function () { /* function contents */ };
exports.function2 = function () { /* function contents */ };
exports.function3 = function () { /* function contents */ };
So, then to use any function from a file, you just call:
var utility = require('./utility');
var response = utility.function2(); // or whatever the name of the function is
You can use https://www.npmjs.com/package/require-file-directory
Require selected files by name only, or all files.
No need for absolute paths.
Easy to understand and use.
Using this function you can require a whole dir.
const PATH = require( "path" );
const GetAllModules = ( dirname ) => {
if ( dirname ) {
let dirItems = require( "fs" ).readdirSync( dirname );
return dirItems.reduce( ( acc, value, index ) => {
if ( PATH.extname( value ) == ".js" && value.toLowerCase() != "index.js" ) {
let moduleName = value.replace( /\.js$/, '' );
acc[ moduleName ] = require( `${dirname}/${moduleName}` );
}
return acc;
}, {} );
}
}
// calling this function.
let dirModules = GetAllModules(__dirname);
Create an index.js file in your routes folder with this code:
const fs = require('fs')
const files = fs.readdirSync(__dirname)
for (const file of files) {
if (file !== 'index.js') require('./' + file)
}
And after that you can simply load the whole folder with require("./routes").
If you want to include all *.js files in a directory (for example "app/lib/*.js"):
In directory app/lib
example.js:
module.exports = function (example) { }
example-2.js:
module.exports = function (example2) { }
In the directory app, create index.js:
index.js:
module.exports = require('./app/lib');
