npm run build returns Module not found: Error: Can't resolve in my docker container in my NuxtJS app - node.js

I've been stuck on this issue for hours now. I keep receiving the error below when I run npm run build:
ERROR in ./store/chatroom.js
Module not found: Error: Can't resolve '#/services/ChatRoomService.js' in '/usr/src/app/store'
 @ ./store/chatroom.js 1:0-60 9:11-26
 @ ./.nuxt/store.js
 @ ./.nuxt/index.js
 @ ./.nuxt/client.js
 @ multi ./.nuxt/client.js
What's weird is that it works perfectly on my local machine alone. The error above occurs in my Docker build, and when I run my container with the codebase copied into it. Stranger still, when I run the container with a bind mount to my local files and try npm run build, it works properly.
At first I thought that maybe some files from my local machine were missing, so I copied every file into the container via docker cp ., but it still doesn't work.
Dockerfile
FROM node:8.12.0
WORKDIR /usr/src/app
EXPOSE 3000
COPY package.json package.json
RUN npm install
# To include everything
COPY . .
RUN npm run build
ENTRYPOINT ["/usr/src/app/entrypoint.sh"]
chatroom.js
import ChatRoomService from "#/services/ChatRoomService.js";

export const state = () => ({});

export const mutations = {};

export const actions = {
  getText({ commit }, data) {
    return ChatRoomService.queryText(data).then(response => {
      if (response.code === 1) {
        commit("bbs/SET_TOP_ARR", JSON.parse(response.data.content), {
          root: true
        });
      }
    });
  }
};
chatRoomService.js
import { mainApiClient, requestSetup } from "#/assets/js/axios.js";

const apiModule = "chatroom";
const resources = {
  chatroomGetChatRoomText: "text/queryText"
};

export default {
  queryText(body) {
    const resource = resources.chatroomGetChatRoomText;
    const [api, req] = requestSetup(resource, body, apiModule);
    return mainApiClient.post(api, req);
  }
};

UPDATE
I have solved it. It was how I imported the file: I imported chatRoomService.js where it should have been ChatRoomService.js. It works just fine on my local machine because it is a Mac, whose default filesystem is case-insensitive, but the Linux filesystem inside the Node container is case-sensitive.

Related

Use node ssh into Runcloud gives a different version to what is actually installed/used by NVM on the server

I have a server on Runcloud on which I've used NVM to install version 16.14.2.
If I SSH in via any SSH client and run node -v, I indeed get 16.14.2.
But when I wrote a script to SSH into the server and run the same command, I get 10.0.
I was previously advised to create an alias; I tried to follow some steps I came across for that, but it did not fix my issue. Furthermore, referencing the path to the desired version of npm inside nvm gives me an error that this version of npm cannot run with Node 10.0:
npm does not support Node.js v10.0.0
Below is my code
import { NodeSSH } from 'node-ssh'

const ssh = new NodeSSH()

const { environment } = await inquirer.prompt([
  {
    name: 'environment',
    message: `Environment?`,
    type: 'input',
    default: 'development'
  }
])

if (build) {
  const sshConfig = sshConfigs[environment]
  console.log(chalk.yellow(`Connecting to ${sshConfig.host}...`))
  await ssh.connect(sshConfig)
  console.log(chalk.green('Connected'))
  console.log(chalk.yellow('Executing release...'))
  const nodePath = '~/.nvm/versions/node/v16.14.2/bin/node'
  const npmPath = '~/.nvm/versions/node/v16.14.2/bin/npm'
  console.log(await ssh.execCommand(`${npmPath} -v`))
  ssh.dispose()
  console.log(chalk.green('Release completed'))
}
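One commonly reported cause, sketched below as an assumption rather than a confirmed diagnosis: a non-interactive SSH session usually skips the profile/rc files that initialize nvm, so the remote command still sees the old system Node. Building the remote command so it bootstraps nvm first often resolves the mismatch (the ~/.nvm path is nvm's default install location and may differ on the server):

```javascript
// Assumption: nvm lives at its default ~/.nvm path on the server.
// Non-interactive SSH commands typically don't load nvm, so we chain the
// nvm bootstrap into the remote command itself before calling npm.
const remoteCommand = [
  'export NVM_DIR="$HOME/.nvm"',
  '[ -s "$NVM_DIR/nvm.sh" ] && . "$NVM_DIR/nvm.sh"',
  'nvm use 16.14.2 > /dev/null',
  'npm -v'
].join(' && ');

// usage with node-ssh: console.log(await ssh.execCommand(remoteCommand))
console.log(remoteCommand);
```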

NodeJS + WebPack setting client static data

I have a NodeJS/React/webpack application in which I'm trying to take environment variables that are present at build time and expose them as variables available to the client, without having to request them with AJAX.
What is currently set up is a /browser/index.js file with an exported method; however, the variables are not getting expanded when webpack runs.
function applicationSetup() {
  const config = JSON.parse(process.env.CONFIG);
  const APPLICATION_ID = process.env.APPLICATION_ID;
  // .........
}
During the build process we run node node_modules/webpack/bin/webpack.js --mode production with npm.
What do I need to do in order to expand the environment variable to be their actual values when webpack creates the .js file?
Edit 8/23
I've tried adding it in the webpack.DefinePlugin section of the webpack.config.js file; however, it still doesn't seem to be available in the client-side code. What am I missing?
Edit #2 (webpack.config.js)
const getClientConfig = (env, mode) => {
  return {
    plugins: [
      new webpack.DefinePlugin({
        __isBrowser__: 'false',
        __Config__: process.env.CONFIG,
        __ApplicationID__: process.env.APPLICATION_ID
      })
    ]
  };
};

module.exports = (env, options) => {
  const configs = [
    getClientConfig(options.env, options.mode)
  ];
  return configs;
};
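For anyone hitting the same wall: DefinePlugin performs a direct text replacement at build time, so each value is injected into the bundle as raw source text. String values therefore need to carry their own quotes, which is why the webpack docs recommend wrapping them in JSON.stringify. A small simulation (plain string replacement standing in for what the plugin does; the CONFIG value shown is hypothetical):

```javascript
// DefinePlugin substitutes each key in your source with the given text,
// verbatim. This simulation shows why string values need JSON.stringify:
// without it, the raw contents of the env var get pasted in as bare code.
const defs = {
  __isBrowser__: 'false', // already valid source text as-is
  __Config__: JSON.stringify('{"api":"https://example.test"}') // hypothetical CONFIG value
};

let source = 'const config = JSON.parse(__Config__); const browser = __isBrowser__;';
for (const [key, text] of Object.entries(defs)) {
  source = source.split(key).join(text);
}

console.log(source); // the substituted source is itself valid JavaScript
```

Applied to the config above, that means `__Config__: JSON.stringify(process.env.CONFIG)` and `__ApplicationID__: JSON.stringify(process.env.APPLICATION_ID)`.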

Get consuming service's package.name inside an npm package

I have the following
consuming-service
  package.json (has utilities-package as a dependency)
utilities-package (published to npm)
  package.json
  version.js
I'm trying to make version.js return the consuming service's name and version, but I'm not sure how to access consuming-service's package.json from within utilities-package
version.js
const pkg = require('../../package.json') // this doesn't work

function getVersion () {
  return {
    name: pkg.name,
    version: pkg.version,
  }
}
I'm struggling with finding the specific terms to google to find my answer.
The two best answers I have found are:
Use process.env.npm_package_name & process.env.npm_package_version. These values are automatically set if the consuming service is run with npm start or yarn start
Pass the package.json into the getVersion function.
// utilities-package/version.js
function getVersion (pkg) {
  return {
    name: pkg.name,
    version: pkg.version,
  }
}

// consuming-service
const utils = require('utilities-package')
const pkg = require('../../package.json')
const versionResult = utils.getVersion(pkg)
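The first option needs no code in the consumer at all. A minimal sketch of version.js relying on the npm-provided environment variables (note these are only populated when the consuming service is launched through an npm/yarn script, not when run directly with node):

```javascript
// utilities-package/version.js (sketch of option 1)
// npm sets npm_package_* environment variables for any process started via
// an npm script (npm start, npm run ..., yarn start), so the utility can
// read the consumer's own name and version from its environment.
function getVersion() {
  return {
    name: process.env.npm_package_name,
    version: process.env.npm_package_version,
  };
}

module.exports = { getVersion };
```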

How to access container id in NodeJs application in host network mode

I am trying to get a Docker container's ID when the network=host setting is enabled, but instead of getting the container ID I am getting the host instance name. However, when network=host is not passed in the command, it gives me the container ID as expected.
In short:
Case 1: I run my container with the command docker run --network="host" -d -it myservice:1.0
const os = require("os");
console.log(os.hostname()) // prints **docker-desktop**
Case 2: I run my container with the command docker run -d -it myservice:1.0
const os = require("os");
console.log(os.hostname()) // prints **67db4w32k112** as expected
Is there a way I can get the same output i.e 67db4w32k112 in case 1 as well?
From looking at this thread, you can probably do something like the below, which reads the /proc/1/cpuset file inside the container. This file contains the current container ID; the contents look like:
/docker/7be92808767a667f35c8505cbf40d14e931ef6db5b0210329cf193b15ba9d605
This will be more reliable in your case than using os.hostname(), since it works both with and without the --network="host" flag on the docker run command.
const fs = require('fs')

fs.readFile('/proc/1/cpuset', 'utf8', function (err, data) {
  if (err) {
    return console.log(err)
  }
  const containerID = data.replace("/docker/", "").trim() // trim the trailing newline
  console.log(containerID)
})
Try to use a helper package such as docker-container-id
Add the dependency in your package.json
npm install --save docker-container-id
Here's an example:
const getId = require('docker-container-id');

async function logContainerId() {
  console.log("I'm in container:", await getId());
}

logContainerId();
npmjs reference

The best way to run npm install for nested folders?

What is the most correct way to install npm packages in nested sub folders?
my-app
  /my-sub-module
    package.json
  package.json
What is the best way to have packages in /my-sub-module be installed automatically when npm install is run in my-app?
I prefer using post-install, if you know the names of the nested subdir. In package.json:
"scripts": {
  "postinstall": "cd nested_dir && npm install",
  ...
}
Per @Scott's answer, the install|postinstall script is the simplest way, as long as the sub-directory names are known. This is how I run it for multiple sub dirs. For example, pretend we have api/, web/ and shared/ sub-projects in a monorepo root:
// In monorepo root package.json
{
  ...
  "scripts": {
    "postinstall": "(cd api && npm install); (cd web && npm install); (cd shared && npm install)"
  },
}
On Windows, replace the ; between the parentheses with &&.
// In monorepo root package.json
{
  ...
  "scripts": {
    "postinstall": "(cd api && npm install) && (cd web && npm install) && (cd shared && npm install)"
  },
}
Use Case 1: If you want to be able to run npm commands from within each subdirectory (where each package.json is), you will need to use postinstall.
As I often use npm-run-all anyway, I use it to keep it nice and short (the part in the postinstall):
{
  "install:demo": "cd projects/demo && npm install",
  "install:design": "cd projects/design && npm install",
  "install:utils": "cd projects/utils && npm install",
  "postinstall": "run-p install:*"
}
This has the added benefit that I can install all at once, or individually. If you don't need this or don't want npm-run-all as a dependency, check out demisx's answer (using subshells in postinstall).
Use Case 2: If you will be running all npm commands from the root directory (and, for example, won't be using npm scripts in subdirectories), you could simply install each subdirectory like you would any dependency:
npm install path/to/any/directory/with/a/package-json
In the latter case, don't be surprised that you don't find any node_modules or package-lock.json file in the sub-directories - all packages will be installed in the root node_modules, which is why you won't be able to run your npm commands (that require dependencies) from any of your subdirectories.
If you're not sure, use case 1 always works.
If you want to run a single command to install npm packages in nested subfolders, you can run a script via npm and main package.json in your root directory. The script will visit every subdirectory and run npm install.
Below is a .js script that will achieve the desired result:
var fs = require('fs');
var resolve = require('path').resolve;
var join = require('path').join;
var cp = require('child_process');
var os = require('os');

// get library path
var lib = resolve(__dirname, '../lib/');

fs.readdirSync(lib).forEach(function (mod) {
  var modPath = join(lib, mod);

  // ensure path has package.json
  if (!fs.existsSync(join(modPath, 'package.json'))) {
    return;
  }

  // npm binary based on OS
  var npmCmd = os.platform().startsWith('win') ? 'npm.cmd' : 'npm';

  // install folder
  cp.spawn(npmCmd, ['i'], {
    env: process.env,
    cwd: modPath,
    stdio: 'inherit'
  });
});
Note that this is an example taken from a StrongLoop article that specifically addresses a modular node.js project structure (including nested components and package.json files).
As suggested, you could also achieve the same thing with a bash script.
EDIT: Made the code work in Windows
Just for reference in case people come across this question. You can now:
Add a package.json to a subfolder
Install this subfolder as a linked reference in the main package.json:
npm install --save path/to/my/subfolder
The accepted answer works, but you can use --prefix to run npm commands in a selected location.
"postinstall": "npm --prefix ./nested_dir install"
And --prefix works for any npm command, not just install.
You can also view the current prefix with
npm prefix
And set your global install (-g) folder with
npm config set prefix "folder_path"
Maybe TMI, but you get the idea...
My solution is very similar.
Pure Node.js
The following script examines all subfolders (recursively) as long as they have package.json and runs npm install in each of them.
One can add exceptions to it: folders allowed not having package.json. In the example below one such folder is "packages".
One can run it as a "preinstall" script.
const path = require('path')
const fs = require('fs')
const child_process = require('child_process')

const root = process.cwd()

npm_install_recursive(root)

// Since this script is intended to be run as a "preinstall" command,
// it will do `npm install` automatically inside the root folder in the end.
console.log('===================================================================')
console.log(`Performing "npm install" inside root folder`)
console.log('===================================================================')

// Recurses into a folder
function npm_install_recursive(folder)
{
    const has_package_json = fs.existsSync(path.join(folder, 'package.json'))

    // Abort if there's no `package.json` in this folder and it's not a "packages" folder
    if (!has_package_json && path.basename(folder) !== 'packages')
    {
        return
    }

    // If there is `package.json` in this folder then perform `npm install`.
    //
    // Since this script is intended to be run as a "preinstall" command,
    // skip the root folder, because it will be `npm install`ed in the end.
    // Hence the `folder !== root` condition.
    //
    if (has_package_json && folder !== root)
    {
        console.log('===================================================================')
        console.log(`Performing "npm install" inside ${folder === root ? 'root folder' : './' + path.relative(root, folder)}`)
        console.log('===================================================================')

        npm_install(folder)
    }

    // Recurse into subfolders
    for (let subfolder of subfolders(folder))
    {
        npm_install_recursive(subfolder)
    }
}

// Performs `npm install`
function npm_install(where)
{
    child_process.execSync('npm install', { cwd: where, env: process.env, stdio: 'inherit' })
}

// Lists subfolders in a folder
function subfolders(folder)
{
    return fs.readdirSync(folder)
        .filter(subfolder => fs.statSync(path.join(folder, subfolder)).isDirectory())
        .filter(subfolder => subfolder !== 'node_modules' && subfolder[0] !== '.')
        .map(subfolder => path.join(folder, subfolder))
}
If you have find utility on your system, you could try running the following command in your application root directory:
find . ! -path "*/node_modules/*" -name "package.json" -execdir npm install \;
Basically, find all package.json files and run npm install in that directory, skipping all node_modules directories.
EDIT As mentioned by fgblomqvist in comments, npm now supports workspaces too.
Some of the answers are quite old. I think nowadays we have some new options available to set up monorepos.
I would suggest using yarn workspaces:
Workspaces are a new way to set up your package architecture that’s available by default starting from Yarn 1.0. It allows you to setup multiple packages in such a way that you only need to run yarn install once to install all of them in a single pass.
If you prefer or have to stay with npm, I suggest taking a look at lerna:
Lerna is a tool that optimizes the workflow around managing multi-package repositories with git and npm.
lerna works perfectly with yarn workspaces too - article. I've just finished setting up a monorepo project - example.
And here is an example of a multi-package project configured to use npm + lerna - MDC Web: they run lerna bootstrap using package.json's postinstall.
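For reference, the workspaces setup mentioned in the edit above amounts to a root package.json that lists the package folders; yarn has supported this since 1.0, and npm since v7. A sketch, where "api", "web" and "shared" are placeholder folder names:

```json
{
  "name": "my-monorepo",
  "private": true,
  "workspaces": [
    "api",
    "web",
    "shared"
  ]
}
```

A single yarn install (or npm install on npm >= 7) at the root then installs every listed package in one pass.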
Adding Windows support to snozza's answer, as well as skipping of node_modules folder if present.
var fs = require('fs')
var resolve = require('path').resolve
var join = require('path').join
var cp = require('child_process')

// get library path
var lib = resolve(__dirname, '../lib/')

fs.readdirSync(lib)
  .forEach(function (mod) {
    var modPath = join(lib, mod)

    // skip node_modules and ensure the path has a package.json
    if (mod === 'node_modules' || !fs.existsSync(join(modPath, 'package.json'))) return

    // Determine OS and set command accordingly
    const cmd = /^win/.test(process.platform) ? 'npm.cmd' : 'npm'

    // install folder
    cp.spawn(cmd, ['i'], { env: process.env, cwd: modPath, stdio: 'inherit' })
  })
Inspired by the scripts provided here, I built a configurable example which:
- can be set up to use yarn or npm
- can be set up to determine the command to use based on lock files, so that if you set it to use yarn but a directory only has a package-lock.json, it will use npm for that directory (defaults to true)
- has configurable logging
- runs installations in parallel using cp.spawn
- can do dry runs to let you see what it would do first
- can be run as a function, or auto-run using env vars
- when run as a function, optionally accepts an array of directories to check
- returns a promise that resolves when completed
- allows setting a max depth to look in if needed
- knows to stop recursing if it finds a folder with yarn workspaces (configurable)
- allows skipping directories using a comma-separated env var, or by passing the config an array of strings to match against, or a function which receives the file name, file path, and the fs.Dirent object and expects a boolean result
const path = require('path');
const { promises: fs } = require('fs');
const cp = require('child_process');

// if you want to have it automatically run based upon
// process.cwd()
const AUTO_RUN = Boolean(process.env.RI_AUTO_RUN);

/**
 * Creates a config object from environment variables which can then be
 * overridden if executing via its exported function (config as second arg)
 */
const getConfig = (config = {}) => ({
  // we want to use yarn by default but RI_USE_YARN=false will
  // use npm instead
  useYarn: process.env.RI_USE_YARN !== 'false',
  // should we handle yarn workspaces? if this is true (default)
  // then we will stop recursing if a package.json has the "workspaces"
  // property and we will allow `yarn` to do its thing.
  yarnWorkspaces: process.env.RI_YARN_WORKSPACES !== 'false',
  // if truthy, will run extra checks to see if there is a package-lock.json
  // or yarn.lock file in a given directory and use that installer if so.
  detectLockFiles: process.env.RI_DETECT_LOCK_FILES !== 'false',
  // what kind of logging should be done on the spawned processes?
  // if this is set and it is not 'errors' it will log everything;
  // otherwise it will only log stderr and spawn errors
  log: process.env.RI_LOG || 'errors',
  // max depth to recurse?
  maxDepth: process.env.RI_MAX_DEPTH || Infinity,
  // do not install at the root directory?
  ignoreRoot: Boolean(process.env.RI_IGNORE_ROOT),
  // an array (or comma separated string for env var) of directories
  // to skip while recursing. if array, can pass functions which
  // return a boolean after receiving the dir path and fs.Dirent args
  // #see https://nodejs.org/api/fs.html#fs_class_fs_dirent
  skipDirectories: process.env.RI_SKIP_DIRS
    ? process.env.RI_SKIP_DIRS.split(',').map(str => str.trim())
    : undefined,
  // just run through and log the actions that would be taken?
  dry: Boolean(process.env.RI_DRY_RUN),
  ...config
});
function handleSpawnedProcess(dir, log, proc) {
  return new Promise((resolve, reject) => {
    proc.on('error', error => {
      console.log(`
----------------
[RI] | [ERROR] | Failed to Spawn Process
- Path: ${dir}
- Reason: ${error.message}
----------------
`);
      reject(error);
    });
    if (log) {
      proc.stderr.on('data', data => {
        console.error(`[RI] | [${dir}] | ${data}`);
      });
    }
    if (log && log !== 'errors') {
      proc.stdout.on('data', data => {
        console.log(`[RI] | [${dir}] | ${data}`);
      });
    }
    proc.on('close', code => {
      if (log && log !== 'errors') {
        console.log(`
----------------
[RI] | [COMPLETE] | Spawned Process Closed
- Path: ${dir}
- Code: ${code}
----------------
`);
      }
      if (code === 0) {
        resolve();
      } else {
        reject(
          new Error(
            `[RI] | [ERROR] | [${dir}] | failed to install with exit code ${code}`
          )
        );
      }
    });
  });
}
async function recurseDirectory(rootDir, config) {
  const {
    useYarn,
    yarnWorkspaces,
    detectLockFiles,
    log,
    maxDepth,
    ignoreRoot,
    skipDirectories,
    dry
  } = config;

  const installPromises = [];

  function install(cmd, folder, relativeDir) {
    const proc = cp.spawn(cmd, ['install'], {
      cwd: folder,
      env: process.env
    });
    installPromises.push(handleSpawnedProcess(relativeDir, log, proc));
  }

  function shouldSkipFile(filePath, file) {
    if (!file.isDirectory() || file.name === 'node_modules') {
      return true;
    }
    if (!skipDirectories) {
      return false;
    }
    return skipDirectories.some(check =>
      typeof check === 'function' ? check(filePath, file) : check === file.name
    );
  }

  async function getInstallCommand(folder) {
    let cmd = useYarn ? 'yarn' : 'npm';
    if (detectLockFiles) {
      const [hasYarnLock, hasPackageLock] = await Promise.all([
        fs
          .readFile(path.join(folder, 'yarn.lock'))
          .then(() => true)
          .catch(() => false),
        fs
          .readFile(path.join(folder, 'package-lock.json'))
          .then(() => true)
          .catch(() => false)
      ]);
      if (cmd === 'yarn' && !hasYarnLock && hasPackageLock) {
        cmd = 'npm';
      } else if (cmd === 'npm' && !hasPackageLock && hasYarnLock) {
        cmd = 'yarn';
      }
    }
    return cmd;
  }

  async function installRecursively(folder, depth = 0) {
    if (dry || (log && log !== 'errors')) {
      console.log('[RI] | Check Directory --> ', folder);
    }
    let pkg;
    if (folder !== rootDir || !ignoreRoot) {
      try {
        // Check if package.json exists; if it doesn't this will throw and move on
        pkg = JSON.parse(await fs.readFile(path.join(folder, 'package.json')));
        // get the command that we should use. if lock checking is enabled it will
        // also determine what installer to use based on the available lock files
        const cmd = await getInstallCommand(folder);
        const relativeDir = `${path.basename(rootDir)} -> ./${path.relative(
          rootDir,
          folder
        )}`;
        if (dry || (log && log !== 'errors')) {
          console.log(
            `[RI] | Performing (${cmd} install) at path "${relativeDir}"`
          );
        }
        if (!dry) {
          install(cmd, folder, relativeDir);
        }
      } catch {
        // do nothing when an error is caught, as it simply indicates that
        // package.json likely doesn't exist
      }
    }
    if (
      depth >= maxDepth ||
      (pkg && useYarn && yarnWorkspaces && pkg.workspaces)
    ) {
      // if we have reached maxDepth, or if the package.json in the current
      // directory contains yarn workspaces (in which case yarn handles the
      // nested installs itself), this is the last directory we will attempt
      // to install.
      return;
    }
    const files = await fs.readdir(folder, { withFileTypes: true });
    return Promise.all(
      files.map(file => {
        const filePath = path.join(folder, file.name);
        return shouldSkipFile(filePath, file)
          ? undefined
          : installRecursively(filePath, depth + 1);
      })
    );
  }

  await installRecursively(rootDir);
  await Promise.all(installPromises);
}
async function startRecursiveInstall(directories, _config) {
  const config = getConfig(_config);
  const promise = Array.isArray(directories)
    ? Promise.all(directories.map(rootDir => recurseDirectory(rootDir, config)))
    : recurseDirectory(directories, config);
  await promise;
}

if (AUTO_RUN) {
  startRecursiveInstall(process.cwd());
}

module.exports = startRecursiveInstall;
And with it being used:
const installRecursively = require('./recursive-install');
installRecursively(process.cwd(), { dry: true })
find . -maxdepth 1 -type d \( ! -name . \) -exec bash -c "cd '{}' && npm install" \;
[For macOS, Linux users]:
I created a bash file to install all dependencies in the project and nested folder.
find . -name node_modules -prune -o -name package.json -execdir npm install \;
Explanation: starting in the root directory, exclude node_modules folders (even inside nested folders), find every directory that has a package.json file, then run the npm install command there.
In case you just want to run it on specific folders (e.g. the abc123 and def456 folders), run it as below:
find ./abc123/* ./def456/* -name node_modules -prune -o -name package.json -execdir npm install \;
To run npm install on every subdirectory you can do something like:
"scripts": {
  ...
  "install:all": "for D in */; do (cd \"$D\" && npm install); done"
}
where
install:all is just the name of the script; you can name it whatever you please
D is the name of the directory at the current iteration
*/ specifies where you want to look for subdirectories. directory/*/ will list all directories inside directory/ and directory/*/*/ will list all directories two levels in.
(cd "$D" && npm install) installs all dependencies in the given folder in a subshell (note that npm itself has no --cwd option; that flag belongs to yarn, so the subshell form works with either tool)
You could also run several commands, for example:
for D in */; do echo "Installing stuff on ${D}" && (cd "$D" && npm install); done
will print "Installing stuff on your_subfolder/" on every iteration.
This works for yarn too
Any language that can get a list of directories and run shell commands can do this for you.
I know it isn't the answer the OP was going for exactly, but it's one that will always work. You need to create an array of subdirectory names, then loop over them and run npm i, or whatever command you need to run.
For reference, I tried npm i **/, which just installed the modules from all the subdirectories in the parent. It's unintuitive as hell, but needless to say it's not the solution you need.
