I'm writing an Express v4.18.2 app on Node.js 18.12.1 on Windows. I'm testing a controller with Jasmine 4.5.0. When I run jasmine spec, it fails with an error message about resolving ES modules:
Error [ERR_UNSUPPORTED_DIR_IMPORT]: Directory import
'...\SimpleServiceJasmine\spec' is not supported resolving ES modules
imported from
...\SimpleServiceJasmine\node_modules\jasmine\lib\loader.js
...
code: 'ERR_UNSUPPORTED_DIR_IMPORT',
url: 'file:///D:/.../SimpleServiceJasmine/spec'
I use require, not import, everywhere in the code being tested or the spec.
Note that jasmine runs fine if I specify the spec file explicitly, even with wildcards, as long as the wildcard path resolves to a single file:
jasmine spec/service/contact-api.spec.js # ok
jasmine spec/*/c* # ok
I tried downgrading jasmine to 3.0.0 and 2.0.1 but got the same error. The behavior is the same on Windows 11 and Windows Server 2019.
Any suggestions for how I can run all the specs in this project?
Here's my package.json:
{
  "name": "simpleservice",
  "version": "1.0.0",
  "description": "A simple CRUD API for contacts",
  "main": "service/contact-api.js",
  "scripts": {
    "test": "jasmine spec/service/contact-api.spec.js",
    "start": "node src/service/contact-api.js"
  },
  "author": "Puzzled Dev",
  "license": "ISC",
  "dependencies": {
    "express": "^4.18.2"
  },
  "devDependencies": {
    "jasmine": "^4.5.0"
  }
}
Here's the spec/support/jasmine.json:
{
  "spec_dir": "spec",
  "spec_files": [
    "**/*[sS]pec.?(m)js"
  ],
  "helpers": [
    "helpers/**/*.?(m)js"
  ],
  "env": {
    "stopSpecOnExpectationFailure": false,
    "random": true
  }
}
Jasmine uses glob for pattern matching, so when you pass spec as a pattern, glob matches that directory and returns it to Jasmine. Unfortunately, Jasmine's path validation is not very thorough: it carries on with the directory path, and Node.js then throws the error.
It is equivalent to this:
import spec from './spec'
You can take a look at the jasmine runner code, and the line where a contributor left this comment:
The ES module spec requires import paths to be valid URLs. As of v14,
Node enforces this on Windows but not on other OSes. On OS X, import
paths that are URLs must not contain parent directory references.
So, once you have defined spec_files in jasmine.json, there is no need to pass an extra pattern to run all the tests.
Jasmine will pick up all the specs using the spec_dir and spec_files fields in jasmine.json. All you need to do is run jasmine (or npx jasmine) with no arguments.
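In other words, a minimal fix (assuming the spec_files glob above matches all of your specs) is to drop the explicit path from the test script and let Jasmine discover the files itself:
"scripts": {
  "test": "jasmine",
  "start": "node src/service/contact-api.js"
},
Then npm test (or a bare npx jasmine) runs every spec under spec/.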
I am developing fullstack NPM packages with multiple entrypoints (namely client, server and sometimes tests), using the exports property of package.json.
I also use a small hack to make TS work with exports until it's officially supported (see this Stack Overflow post, this hackish solution and this ticket).
The package.json ends up looking like this:
{
  "name": "@vulcanjs/graphql",
  "version": "0.4.7",
  "main": "./dist/index.js",
  "files": [
    "dist/"
  ],
  "exports": {
    ".": "./dist/index.js",
    "./server": "./dist/server/index.js",
    "./testing": "./dist/testing.js"
  },
  "types": "./dist/index.d.ts",
  "typesVersions": {
    "*": {
      "server": [
        "./dist/server/index.d.ts"
      ],
      "testing": [
        "./dist/testing.d.ts"
      ]
    }
  },
  "description": "Vulcan graphQL schema generator",
  ...
You can then import either @vulcanjs/graphql for shared code or @vulcanjs/graphql/server for Node.js-only code.
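For example (the export names here are made up purely for illustration):
import { createModel } from "@vulcanjs/graphql";           // shared code entrypoint
import { buildApolloSchema } from "@vulcanjs/graphql/server"; // Node.js-only entrypoint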
It works perfectly in my Next.js app. However, it seems to break Jest's moduleNameMapper.
First, I had this:
moduleNameMapper: {
  "@vulcanjs/(.*)": [
    "<rootDir>/node_modules/@vulcanjs/$1",
  ],
},
The error is:
Configuration error:
Could not locate module @vulcanjs/graphql/server mapped as:
[
  "/code/vulcan-next/node_modules/@vulcanjs/graphql/server",
].
The problem is that it tries to find a package named @vulcanjs/graphql/server: yet "server" is not a different package, it's an entrypoint of @vulcanjs/graphql.
I've also tried this:
moduleNameMapper: {
  "@vulcanjs/(.*)/(.*)": [
    "<rootDir>/node_modules/@vulcanjs/$1",
  ],
},
With this config, @vulcanjs/graphql/server is mapped to @vulcanjs/graphql. The problem is that now the server code is not found. I've checked, and the issue is that this mapping drops the /server entirely: @vulcanjs/graphql/server ends up pointing at the main entrypoint instead of the server entrypoint.
Finally, I tried removing the moduleNameMapper, but then the @vulcanjs/graphql/server package is not found by Jest at all. Note that I need the mapper for a use case I did not demonstrate here, so getting rid of it is possible but not the best solution.
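One direction I'm experimenting with (untested, and the dist paths below are guesses based on the exports map shown above) is mapping the subpath entrypoints explicitly before the catch-all pattern:
moduleNameMapper: {
  // hypothetical: send the "server" entrypoint straight to its built file
  "^@vulcanjs/(.*)/server$": "<rootDir>/node_modules/@vulcanjs/$1/dist/server/index.js",
  // everything else falls back to the package root
  "^@vulcanjs/(.*)$": "<rootDir>/node_modules/@vulcanjs/$1",
},
Jest checks the patterns in the order they are declared, so the more specific rule has to come first.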
The bug can be reproduced by installing the framework: https://github.com/VulcanJS/vulcan-next. You can clone, yarn install, unskip the test in src/models/tests/sampleModel.server.test.ts and run yarn run test:unit sampleModel. It will show the error.
Any idea how I could fix this?
It's early days in the development of my first npm script, and I'm struggling somewhat. I'm on Ubuntu LTS with the latest nvm, node, npm and pnpm releases.
Node + npm have been installed using nvm, pnpm installed using npm, and several modules installed locally (i.e. without the -g flag) using pnpm. No sudo was necessary. The resulting package.json:
{
  "name": "javascript-development-environment",
  "version": "1.0.0",
  "description": "JavaScript development environment cobbled together using various online sources",
  "scripts": {
    "prestart": "./node_modules/.bin/babel buildScripts/startMessage.js",
    "start": "./node_modules/.bin/babel buildScripts/srcServer.js"
  },
  "author": "Laird o' the Windy Waas",
  "license": "MIT",
  "dependencies": {
    "@babel/polyfill": "^7.0.0"
  },
  "devDependencies": {
    "@babel/cli": "^7.1.5",
    "@babel/core": "^7.1.6",
    "@babel/preset-env": "^7.1.6",
    "chalk": "^2.4.1",
    "express": "^4.16.4",
    "open": "^0.0.5",
    "path": "^0.12.7"
  }
}
With only Firefox 60.0.1 installed, running 'pnpm start' with the scripts invoking node directly opens a browser window with "Hello World!" displayed, and terminal control has to be regained using CTRL-C. -> All OK.
If I substitute in babel using the path as shown above (which results from the same issues described in this post), the buildScripts code is echoed to the terminal, but no browser window opens, and terminal control is released immediately on completion. The npm debugger provides no useful feedback. -> Something is not working.
As the "Hello World!" code runs correctly under node (and remains unchanged for the babel run), it is not the source of the problem.
Here are my babel config files:
.babelrc
{
  "presets": [
    "@babel/preset-env"
  ]
}
babel.config.js
const presets = [
  [
    "@babel/env",
    {
      targets: {
        edge: "17",
        firefox: "61",
        chrome: "67",
        safari: "11.1",
        opera: "56"
      },
      useBuiltIns: "usage"
    },
  ],
];
module.exports = { presets };
The problem looks to be that babel is not passing the transpiled code on to nodejs / express. Bound to be something simple, but I'm just going round in circles..
One thing I found myself asking is whether there might be a conflict between the various env presets across .babelrc, babel.config.js and package.json. Setting aside first the .babelrc and then the babel.config.js file, however, brought no change.
I have also noticed that both (nvms) node and (ubuntus) nodejs are currently installed:
$ which node
/home/<myusername>/.nvm/versions/node/v10.13.0/bin/node
$ which nodejs
/usr/bin/nodejs
However, as everything to do with node and npm was installed using nvm, this shouldn't be a problem.
I could, I suppose, try installing babel globally, but this is widely frowned upon. I'd prefer a solution reflecting 'best practice'.
Thanks for any suggestions.
In earlier years, tutorial material suggested babel-node would start npm / node (and hence express) on the user's behalf.
babel-node now no longer seems to be recognised. Attempts at using the babel-node command failed, and simply using node in its place resulted in the transpiler output being dumped to the terminal.
babel, (in our case) pnpm, and node now have to be called explicitly, with node referencing the transpiled code. node appears to handle the interfacing with express.
After some experiment, therefore, the following changes (in package.json) appear to work fine:
"scripts": {
"prestart": "./node_modules/.bin/babel buildScripts/startMessage.js -d dist",
"build": "./node_modules/.bin/babel buildScripts/srcServer.js -d dist",
"start": "pnpm run build && node dist/startMessage.js && node dist/srcServer.js"
},
These result in both a tidy console output and "Hello World!" being displayed in a freshly opened browser window.
Just hope this is of use to someone else.. ;-)
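One further simplification that should also work here (npm and pnpm both put node_modules/.bin on the PATH when running scripts, so the explicit prefix is usually unnecessary):
"scripts": {
  "prestart": "babel buildScripts/startMessage.js -d dist",
  "build": "babel buildScripts/srcServer.js -d dist",
  "start": "pnpm run build && node dist/startMessage.js && node dist/srcServer.js"
},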
Note: this is a Q&A on migrating AWS Lambda Webpack builds from v6.10 to v8.10 — no help is needed, but even better answers are of course always encouraged!
For a while now I have been a disciple of using Webpack to build my back-end Lambdas after reading James Long's excellent series entitled "Backend Apps with Webpack" (part 1, part 2, and part 3).
Up until recently, the only version of Node.js that Amazon Web Services offered was 6.10; you had to write your Lambda fn in the "callback" style. But on April 2, 2018 AWS announced that 8.10 was now supported, and with it the suggested pattern is async / await which is great! Until it immediately broke my Webpack build. After a bit of debugging, I can break my build just by adding one async fn to the Lambda handler (I don't even need to call it):
async function firstAsync() {
return true;
}
exports.handler = async (event) => {
// TODO implement
return 'Hello from Lambda!'
};
To be clear, doing this in the AWS Lambda console is perfectly fine, runs great. Webpack even builds it successfully, but upon uploading to AWS Lambda, I get the following error message: regeneratorRuntime is not defined. My code is below. What do I need to do?
webpack.config.js
const nodeExternals = require('webpack-node-externals');
const path = require('path');
const UglifyJsPlugin = require('uglifyjs-webpack-plugin')
const webpack = require('webpack');
const config = {
  entry: './src/index.js',
  output: {
    path: path.resolve(__dirname, 'dist'),
    library: 'index',
    libraryTarget: 'commonjs2',
    filename: 'index.js'
  },
  target: 'node', // building for a Node environment
  externals: [nodeExternals()], // in order to ignore all modules in node_modules folder
  module: {
    rules: [{
      test: /\.(js|jsx)$/,
      use: {
        loader: 'babel-loader',
        options: {
          presets: ['env']
        }
      }
    }]
  },
  plugins: [
    new UglifyJsPlugin()
  ]
};
module.exports = config;
package.json
{
  "name": "lambda-webpack",
  "version": "1.0.0",
  "description": "An empty project scaffold to enable webpack builds in AWS Lambda",
  "main": "index.js",
  "scripts": {
    "build": "webpack",
    "upload": "upload.bat"
  },
  "author": "Geek Stocks®",
  "license": "MIT",
  "devDependencies": {
    "aws-sdk": "^2.179.0",
    "babel-core": "^6.26.0",
    "babel-loader": "^7.1.2",
    "babel-preset-env": "^1.6.1",
    "uglifyjs-webpack-plugin": "^1.1.6",
    "webpack": "^3.10.0",
    "webpack-node-externals": "^1.6.0"
  }
}
Geek Stocks' answer (the one describing babel-plugin-transform-runtime) is only necessary when using the Node v6.10 runtime. It lets you use async/await syntax: babel-runtime/regenerator converts your async functions to ES6 generators, which are supported on Node v4.3.2 and newer runtimes.
On the Node v8.10 runtime, though, things are different. async/await is already supported, so you don't need Babel to convert your async functions to generators; used correctly, it just works.
As for your specific situation, I assume you simply changed your JavaScript code to use async/await but didn't tell Babel NOT to convert it to generators. That is why your build complains that regeneratorRuntime is not defined.
If this is the case, you simply need to configure Babel to target Node v8.10. You can do it like this in your .babelrc:
{
  "presets": [
    [
      "env",
      {
        "targets": {
          "node": "8.10"
        }
      }
    ]
  ]
}
Babel will then stop converting those async functions. And since it is no longer doing that conversion, it will not need regenerator-runtime anymore.
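Since the webpack config in the question sets its presets inline through babel-loader options rather than a .babelrc, the equivalent change there would look roughly like this (a sketch for the babel-preset-env v1 listed in the question's package.json):
module: {
  rules: [{
    test: /\.(js|jsx)$/,
    use: {
      loader: 'babel-loader',
      options: {
        // target the Lambda Node 8.10 runtime so async/await is left untouched
        presets: [['env', { targets: { node: '8.10' } }]]
      }
    }
  }]
},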
In my case I'm using Node.js 12.x, and I'm receiving the error regeneratorRuntime is not defined.
I was using the es2015 and stage-0 presets with Node 12.x code, and an older version of Babel.
What I did was install:
"#babel/cli": "^7.10.1",
"#babel/core": "^7.10.2",
"#babel/preset-env": "^7.10.2"
And then set this babel preset:
"presets": [
["#babel/preset-env", {"targets": { "node": "current" }}]
]
Thanks,
To begin using async / await in your Webpack-built code, you need a little help from Babel, specifically babel-runtime and babel-plugin-transform-runtime. Fortunately there is a really good writeup on the installation and use of both here on the Babel website. You need Babel to do the following for you:
Automatically requires babel-runtime/regenerator when you use
generators/async functions.
I won't repeat what has been written there; however, of particular note is HOW you need to install these, because while their writeup certainly works for most, a tweak is needed for some AWS Lambda devs who have not needed runtime dependencies until now.
The first part is pretty normal, you need new devDependencies:
npm install --save-dev babel-plugin-transform-runtime
And also you need to tweak your .babelrc file like they describe:
{
  "plugins": [
    ["transform-runtime", {
      "helpers": false,
      "polyfill": false,
      "regenerator": true,
      "moduleName": "babel-runtime"
    }]
  ]
}
But here is the kicker, and this is new to the code above: babel-runtime is a regular dependency, not a devDependency:
npm install --save babel-runtime
These two installs and the .babelrc tweaks do SOLVE the async / await problem described above, but they introduce a new problem to the build above: Cannot find module 'babel-runtime/regenerator'.
If you are at all concerned with keeping your Webpack-built code small, you are likely also using webpack-node-externals, as the above code does. For example, the aws-sdk for JavaScript in Node is very large and is already available in the Lambda environment, so it's superfluous to bundle it again. The webpack-node-externals configuration above tells Webpack to ignore ALL the modules in node_modules by default, and therein lies the root of the new problem: the newly installed babel-runtime is a module that needs to be bundled into the build in order for Lambda to run correctly.
Understanding the problem then, the answer becomes simple: don't use the default configuration. Instead of passing in nothing to webpack-node-externals, configure it like this:
externals: [nodeExternals({
  whitelist: [
    'babel-runtime/regenerator',
    'regenerator-runtime'
  ]
})], // ignore all modules in node_modules folder EXCEPT the whitelist
And that solves both the original async / await problem, and the (possible) new problem you may be encountering if you had no previous dependencies in your build. Hope that helps — happy Lambda awaiting!
I hope you can help me. I'm fairly new to Node and am building my first Node app.
I have built out an app in node.js. It works fine and as expected when running locally.
When trying to run it on our internal production server I get an error message.
It seems to take issue with the opening backtick of the template literal. The file in question employs a simple module export function. I've stripped down the code a bit to make it easier to read. See below:
exports.templateModule = function(markup, edmData) {
  var template = `<!DOCTYPE html>
<html xmlns="http://www.w3.org/1999/xhtml">
<head></head>
<body></body>
</html>
`;
  return template;
};
I'm using node version v6.11.0.
The production server is running on linux.
package.json file below incase it helps.
{
  "name": "template",
  "version": "0.0.0",
  "description": "A simple template test.",
  "main": "app.js",
  "scripts": {
    "start": "node app.js"
  },
  "dependencies": {
    "ejs": "^2.5.6",
    "ejs-lint": "^0.3.0",
    "express": "4.15.2",
    "glob": "^7.1.2",
    "node-dev": "^3.1.3"
  },
  "repository": {
    "type": "git",
    "url": "https://github.com/heroku/node-js-getting-started"
  },
  "license": "MIT",
  "devDependencies": {
    "node-dev": "^3.1.3"
  }
}
I'll provide any other information you need if I can.
Any help or insight would be much appreciated.
Thank you
Moe
Template strings were added in NodeJS v4.0.0.
Node.js v4.0.0 contains V8 v4.5, the same version of V8 shipping with the Chrome web browser today.
This brings with it many bonuses for Node.js users, most notably a raft of new ES6 features that are enabled by default including block scoping, classes, typed arrays (Node's Buffer is now backed by Uint8Array), generators, Promises, Symbols, template strings, collections (Map, Set, etc.) and, new to V8 v4.5, arrow functions.
If you are using an older version of Node.js on your production server, please consider upgrading it to a more recent version.
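A quick way to confirm this is to compare the Node versions on the two machines; for example (the nvm commands only apply if nvm happens to be installed on that server):
node --version        # the production server needs >= v4.0.0 for template literals
# if nvm is available there, upgrading could look like:
nvm install --lts
nvm use --lts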
I've begun using Node.js to make web applications. It's really awesome. I've come across a few modules that I want to incorporate into my build. I can work with the modules in Terminal after a global npm install. When it comes time to add them to my application, I have no idea how to go about placing them in my directory structure, and I haven't found any good documentation on this. My typical Node.js directory structure is:
ROOT
  Server
    server.js
    node-modules
  Client
    index.html
    css
      -main.css
    javascript
      -main.js
      -jquery.js
My process for installing the modules has been:
I cd into my Server directory and run npm install.
Then I go to my package.json file and include the module in the dependencies:
{
  "name": "application-name",
  "version": "0.0.1",
  "private": true,
  "scripts": {
    "start": "node app"
  },
  "dependencies": {
    "express": "3.1.0",
    "jade": "*",
    "stylus": "*",
    "<node-module-here>": "1.0.x"
  },
  "engines": {
    "node": "0.10.0",
    "npm": "1.2.14"
  }
}
After that, I head over to the server.js file and add:
module.exports = require('<path_to_node-module_lib>');
When I run functions on the Client side that depend on the modules (functions that work in Terminal), I don't receive an error, but the function won't run. Because I'm not receiving errors, I have no idea how to go about debugging this. If anyone can spot a fatal flaw in my structure or implementation and offer some recommendations, I offer my first-born.
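In case it helps, here is a minimal, simplified sketch of how my server.js serves the Client folder (this is not my exact code, just the general shape of it; the problem module is omitted):
// server.js: simplified sketch based on the directory structure above
var express = require('express'); // installed locally, resolved by name
var path = require('path');

var app = express();
// serve the Client folder so index.html, main.css and main.js are reachable
app.use(express.static(path.join(__dirname, '..', 'Client')));
app.listen(3000, function () {
  console.log('Listening on http://localhost:3000');
});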