I am currently developing a project that I want to test in different environments, including Node.js and different browsers with Karma/Selenium, to avoid compatibility issues. (I think I will use Browserify in the browsers, but I am not familiar with it yet.)
I have a nested testing directory, something like this:
repo/
- project.js
- project.my.module.js
- spec/
-- helpers/
--- a.jasmine.helper.js
-- support/
--- jasmine.json
-- project.my.module/
--- ModuleClass.spec.js
-- project.MyClass.spec.js
-- project.OtherClass.spec.js
So far I have tested the project only with jasmine-npm (which is Jasmine 2.2 for Node.js). When testing, the working directory is repo/, where I run node.exe with jasmine.js. The jasmine.js script loads jasmine.json:
{
  "spec_dir": "spec",
  "spec_files": [
    "**/*[sS]pec.js"
  ],
  "helpers": [
    "helpers/**/*.js"
  ]
}
Now I have 2 problems here.
How can I avoid long relative paths in require calls, for example require("../../project.my.module.js") in the ModuleClass.spec.js file? (I would rather use a short constant name, as I could with a symlink.)
How can I do this in a way that is compatible with running the same test files in different browsers? (I want to keep the CommonJS module definitions in the tests.)
I checked some Node.js tutorials, and it seems I have two options: I can use package.json (with some magic config options unknown to me), or I can move the files I want to load into node_modules/ (which I am sure I won't do). I am open to suggestions, because I can't see how to solve this issue...
edit:
karma-browserify appears to solve the testing problem; I probably have to add a Jasmine build for the browser, but that's okay. I don't have to change the CommonJS module definitions in the tests, so it is possible to test both in Node.js and the browser with the long paths.
edit2:
I ended up adding the parent dir of my repo to NODE_PATH. This way I can require every project I am currently developing.
What about symlinking your project directory under node_modules (e.g. as node_modules/project) and requiring it like require("project/project.my.module.js")?
I'm determined to find a way to do this. Here's why:
I'm writing tests for .js files which are bundled with webpack for a browser client.
I would like to run these tests in Node.js. There is a "build.js" script which is NOT using ES6 modules; it's a CommonJS module, so adding "type": "module" to package.json would break it.
Is there a way to do this or is everything fundamentally broken?
Edit: after my workaround it still doesn't really work, because now webpack is broken.
I just made a subdirectory with its own package.json that has "type": "module" set, so that it doesn't break any CommonJS scripts outside of it, but everything inside can import other ES6 modules.
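For reference, the layout would look something like this: the subdirectory (say esm/, a hypothetical name) holds the ES6-module files, everything outside it stays CommonJS, and the subdirectory gets its own minimal package.json containing only:

```json
{
  "type": "module"
}
```

Node applies the nearest package.json's "type" field when deciding how to interpret a .js file, so only files under that subdirectory are treated as ES modules.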
So yes, everything is fundamentally broken but there is a workaround.
I am looking for information on how to bundle dependencies with Webpack. I haven't been doing front-end development much recently and am behind on the latest trends.
(a) I would like to bundle x number of dependencies with Webpack, but I do not wish to specify an entry point, such that if the bundle were require'd, nothing would execute.
(b) This probably has nothing to do with (a); ideally I could bundle them as AMD modules. Basically, I would like to take npm modules and my code and convert everything to AMD.
I am guessing that the above can be accomplished through some webpack.config.js configuration, but I haven't seen anything online demonstrating how to bundle dependencies with Webpack without specifying an entry point.
You have to specify an entry point; otherwise Webpack won't be able to parse your modules and statically analyze the dependencies.
That said, you don't have to directly specify an entry point in your configuration. You can run webpack --entry path/to/entry.js $OTHER_ARGS and require all the dependencies therein, or you can use the configuration and specify all the required modules:
{
entry: ['react', 'foo', 'bar', './ours/a', './ours/b']
}
In any case, the way Webpack evaluates your modules at runtime does not make these modules readily available. I suspect what you may actually be interested in is creating library targets, which are compiled independently and then reused in other Webpack builds.
Here is a nice article that explains the approach in detail and then you can refer to the official documentation.
I'm very new to using npm and TypeScript, so I'm hoping that I'm missing something obvious. We have authored a node package, written in TypeScript, for our own internal use. The file structure looks like this:
src/myModule.ts
myModule.ts
Where myModule.ts looks like this:
export * from "./src/myModule";
I then run tsc to generate .js and .d.ts files. So, the files then look like this:
src/myModule.js
src/myModule.ts
src/myModule.d.ts
myModule.js
myModule.ts
myModule.d.ts
All of this gets pushed to git and then our main app includes this package as a dependency via a git URL. When I first attempted this, I got an error like this:
export * from "./src/myModule";
^
ParseError: 'import' and 'export' may appear only with 'sourceType: module'
After some searching around, I found that the issue was with the .ts files that were getting loaded in. So I added the following to my .npmignore:
*.ts
!*.d.ts
After doing this, it brings in the node package without any problems.
The problem I am running into is that I want to be able to run off of local changes to the node package during active development. I want to be able to make changes in the node package, and then have the main app pick up these changes so that I can make sure everything works together before pushing to git. Everything I find says to use npm link. This makes it point to my local directory as expected, but the problem is that it now has all the .ts files, and the same errors show up.
I'm sure npm link works great when everything is written in JavaScript, but we are using TypeScript to write everything. Is there a way to make this work with TypeScript?
This makes it point to my local directory as expected, but the problem is that it now has all the .ts files, and the same errors show up.
The errors will only show up if you have those .ts files and a separate declaration for those files.
Recommended fix
Just use the separate .ts/.js/.d.ts files and steer clear of bundling a single .d.ts.
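For reference, a minimal tsconfig.json that emits the side-by-side .js/.d.ts pairs described above (a sketch; adjust module and target to your setup):

```json
{
  "compilerOptions": {
    "declaration": true,
    "module": "commonjs",
    "target": "es5"
  }
}
```

The "declaration": true flag is what makes tsc emit a .d.ts next to each .js, so consumers get type information without ever needing the .ts sources.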
I've got a plugin I wrote in es6, and I'm currently testing the plugin on a site that I'm building.
When there's an issue, I would like to quickly modify the plugin directly in the node_modules folder; however, every time I need to make a change, I have to rebuild the plugin's dist folder using babel-cli.
Is there any way to get around this? Is there a webpack solution for this?
I'm not sure I understand correctly where you execute this code, but in any case: if it is executed in Node, note that Node supports ES6; just use the latest version. If it is the browser, then again you have two options: execute the file without transpiling it at all (see the compatibility table at https://kangax.github.io/compat-table/es6/), or use Babel directly in the browser: http://babeljs.io/docs/usage/browser/
Your problem derives from the use of a transpiler to transform your source code before loading it into the browser. You can avoid this by using an isomorphic module pattern like this example, with introductory article.
Another alternative that is webpack compatible is to use the webpack hot loader.
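Another option (an assumption about your build setup, with hypothetical src/dist paths) is to run babel-cli in watch mode inside the plugin's package, so the dist folder is rebuilt automatically on every save while you edit under node_modules:

```json
{
  "scripts": {
    "build": "babel src --out-dir dist",
    "watch": "babel src --out-dir dist --watch"
  }
}
```

Running npm run watch in the plugin directory then keeps dist/ in sync with your edits, so the consuming site picks up changes on the next reload.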
I'm new to Node but am enjoying myself so far. I was trying to move my node_modules (libraries) directory outside of the public 'webroot' and need advice and guidance.
I've setup my simple expressJS based Node project as follows:
/my_project
/config
/public
/node_modules
server.js
I was wondering if there is any way I could have the /node_modules dir outside of my webroot without breaking my application. I'm just so used to keeping the bare minimum in my publicly exposed webroot, and it doesn't feel right having the libs in there. Call me old-fashioned, but that's how I'm used to doing things in the PHP and C# world.
If I setup the project as follows:
/my_project
/config
/node_modules
/public
server.js
then it all goes wobbly and Node's require() magic breaks.
I've tried the following:
var express = require('../express'); which doesn't work either, giving me a 'Cannot find module' error.
Is what I'm asking even possible, if so then how?
Are there any major risks with having my libs in a webroot, or have I missed something fundamental about the way Node works?
What do you guys do? What is best practice for production apps? May I see some examples of your production practices and why?
1. Is it possible to have modules in a folder outside of the project
Yes.
2. Are there any major risks with having modules in a webroot?
Assuming that by "webroot" you mean the root of the server or any folder outside of your project: yes. It is possible to install modules globally with npm using the -g flag: npm install -g express. This is generally considered bad practice, as different projects may depend on different versions of the same module. Installing locally allows different projects to have different versions.
If you're using version control and don't want to check in the external modules, a common (and standard in npm) pattern is to ignore ./node_modules and specify dependencies in a package.json file.
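A minimal package.json for that pattern might look like this (name and versions are placeholders); anyone cloning the repo can then restore the ignored node_modules folder with npm install:

```json
{
  "name": "my_project",
  "version": "0.1.0",
  "dependencies": {
    "express": "^4.0.0"
  }
}
```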
3. "What is best practice for production apps?"
Not a good fit for SO, but since I'm at it I'll give it a shot anyway. If you use Grunt (a very popular task automation tool), you'll usually end up with a structure like this:
/my_project
/node_modules
/lib
# actual project files
/public
# files that are meant to be sent to clients
/test
package.json # specifies dependencies, main script, version etc
README.md # optional
This structure has the advantages of clearly separating core files, dependencies and any tests while keeping it all in the same folder.