Using NPM module in Backbone with RequireJS - node.js

Hi,
I'm writing a webapp that consists of a Node backend (an Express server) which serves a Backbone app to the clients.
The Backbone app uses RequireJS to load the modules used.
I would like to use Ag-grid client-side, which can be included as an npm module.
https://www.ag-grid.com/javascript-grid-getting-started/index.php
How can I reference this NPM module from Backbone?
Project structure
./node_modules
./src/package.json
./src/app (Node backend + Express server)
./src/public
./src/public/main.coffee (contains requireJs config)
./src/public/scripts (Backbone views, models, etc)
main.coffee
require.config
  baseUrl: '../scripts/'
  paths:
    jquery: '//cdnjs.cloudflare.com/ajax/libs/jquery/2.1.4/jquery.min'
    jqueryui: '//cdnjs.cloudflare.com/ajax/libs/jqueryui/1.11.4/jquery-ui.min'
    underscore: '//cdnjs.cloudflare.com/ajax/libs/underscore.js/1.8.3/underscore-min'
    ...
I would like to include the ag-grid NPM module here, but without having to reference the very top ./node_modules folder as ../../../node_modules/ag-grid/dist/ag-grid (didn't count the levels..).
Also, if possible, I'd like to avoid a second package.json and a second npm install.
Any help related specifically to this project structure?
Secondarily, is there any better way to structure such a project? (Node backend serving a Backbone webapp)
Thanks

You guessed it: using a relative path all the way up to the node_modules directory is the way to go.
requirejs.config({
paths: {
"ag-grid": "../../../node_modules/ag-grid/dist/ag-grid",
"backbone": "../../../node_modules/backbone/backbone"
}
});
define(["backbone", "ag-grid"], function(Backbone, agGrid) {
// whatever
});
You could also use npm for all the dependencies and bundle an optimized version of your app using the RequireJS optimizer (r.js).
Personally, I use npm for the project's development tooling and for server-side (Node) dependencies. For my Backbone app, I use Bower, since it specializes in front-end dependency management.
I have a .bowerrc file that tells bower where to install the dependencies:
{
"directory": "src/lib",
}
And a Gulp task which calls bower install:
var bower = require("bower"),
$ = require('gulp-load-plugins')({ lazy: true }),
gulp = require("gulp");
gulp.task('bower', function() {
return bower.commands.install()
.on('log', function(data) {
$.util.log('bower', $.util.colors.cyan(data.id), data.message);
});
});
This task is called automatically after npm install via an npm postinstall hook:
"scripts": {
// ...
"postinstall": "gulp install"
}
Take a look at simplified-js-project, a sample project that shows my development tools around a Backbone and RequireJS project.

Related

Vue Error - Can't resolve 'https' when importing package

I'm trying to make a Vue project and use an npm package for connecting to the retroachievements.org api to fetch some data, but I'm getting an error. Here's my process from start to finish to create the project and implement the package.
Navigate to my projects folder and use the Vue CLI to create the project: vue create test. For options, I chose not to include the linter, picked Vue version 2, and put everything in package.json.
cd into the /test folder: cd test and install the retroachievements npm package: npm install --save raapijs
Modify App.vue to the following (apologies for code formatting, not sure why the post isn't formatting/coloring it all properly...):
const RaApi = require('raapijs');
export default {
name: 'App',
data: () => ({
api:null,
user: '<USER_NAME>',
apiKey: '<API_KEY>',
}),
created() {
this.api = new RaApi(this.user, this.apiKey);
},
}
Run `npm run serve` and get the error:
ERROR in ./node_modules/raapijs/index.js 2:14-30
Module not found: Error: Can't resolve 'https' in 'C:\Projects\Web\test\node_modules\raapijs'
I'm on Windows 10, Node 16.17.0, npm 8.15.0, vue 2.6.14, vue CLI 5.0.8, raapijs 0.1.2.
The first solution below says it runs without error, but it looks like exactly the same code I'm trying. Can anyone see a difference and a reason for this error?
EDIT: I reworded this post to be more clear about my process and provide more info, like the versions.
This solution works for me. I installed raapijs with the npm install --save raapijs command. Then, in my Vue version 2 component, I used your code as follows:
const RaApi = require('raapijs');
export default {
data: () => ({
api: null,
user: '<USER_NAME>',
apiKey: '<API_KEY>',
}),
created() {
this.api = new RaApi(this.user, this.apiKey);
},
};
It seems the raapijs package was designed to be used in a Node environment rather than in Vue's browser-based environment, so that's the reason I was getting an error. The package itself was looking for Node's built-in https module, but since it wasn't running in Node, it couldn't find it.
So I solved my problem by looking at the package's GitHub repo, extracting the actual PHP API endpoints being used, and calling those in my app directly rather than going through the package wrapper. Not quite as clean and tidy as I was hoping, but still a decent solution.
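For what it's worth, a rough sketch of that direct approach from a browser environment; the exact endpoint name and the z/y query parameters are my recollection of the retroachievements API and should be verified against their docs:
// Illustrative only: confirm the endpoint and parameter names against the
// retroachievements.org API documentation before relying on them.
const user = '<USER_NAME>';
const apiKey = '<API_KEY>';

fetch(`https://retroachievements.org/API/API_GetUserSummary.php?z=${user}&y=${apiKey}`)
  .then((res) => res.json())
  .then((data) => console.log(data))
  .catch((err) => console.error(err));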

Express, Pug and Webpack

I have a Node.js server app which uses Express and Pug. I would like to bundle it into a single script that can be deployed with pm2. There seem to be several problems with this.
At runtime I get Cannot find module "." and during compilation a few messages like
WARNING in ./node_modules/express/lib/view.js 80:29-41 Critical
dependency: the request of a dependency is an expression
appear, which come from dynamic imports like require(mod).__express. I assume Webpack can't statically resolve those and doesn't know which dependency to include.
How can this be solved ?
How do I make Pug compile and be part of the output js ?
This is because webpack tries to re-bundle the node_modules dependencies (which are already built), and in the case of Pug that doesn't work.
You need to use webpack-node-externals via the webpack externals option to explicitly ask it not to re-bundle dependencies.
Install webpack-node-externals: npm i -D webpack-node-externals
Integrate it into your webpack config file:
Example
// ...
const nodeExternals = require('webpack-node-externals')
module.exports = {
target: 'node',
entry: {
// ...
},
module: {
// ...
},
externals: [nodeExternals()],
output: {
// ...
},
}
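From there, a typical flow is to build the bundle and point pm2 at the output file. The script names and the dist/server.js output path below are assumptions about your webpack output setting, not something from the question:
"scripts": {
  "build": "webpack --mode production",
  "start": "pm2 start dist/server.js"
}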

Can't find node_modules after deployment

This title might be a bit misleading but please bear with me for a while.
I have made a simple Angular2 app on visual studio 2015 and now I have published it on Azure.
Having node_modules in the development environment worked perfectly, but after deploying, it shows an error saying it can't find node_modules.
Here is how I reference it in my development environment in index.html:
<!-- Polyfill(s) for older browsers -->
<script src="/node_modules/core-js/client/shim.min.js"></script>
<script src="/node_modules/zone.js/dist/zone.js"></script>
<script src="/node_modules/reflect-metadata/Reflect.js"></script>
<script src="/node_modules/systemjs/dist/system.src.js"></script>
<script src="/systemjs.config.js"></script>
It's also referenced in systemjs.config.js:
/**
* System configuration for Angular 2 samples
* Adjust as necessary for your application needs.
*/
(function(global) {
// map tells the System loader where to look for things
var map = {
'app': '/app', // 'dist',
'@angular': '/node_modules/@angular',
'angular2-in-memory-web-api': '/node_modules/angular2-in-memory-web-api',
'rxjs': '/node_modules/rxjs'
};
// packages tells the System loader how to load when no filename and/or no extension
var packages = {
'app': { main: 'main.js', defaultExtension: 'js' },
'rxjs': { defaultExtension: 'js' },
'angular2-in-memory-web-api': { main: 'index.js', defaultExtension: 'js' },
};
var ngPackageNames = [
'common',
'compiler',
'core',
'forms',
'http',
'platform-browser',
'platform-browser-dynamic',
'router',
'router-deprecated',
'upgrade',
];
// Individual files (~300 requests):
function packIndex(pkgName) {
packages['@angular/'+pkgName] = { main: 'index.js', defaultExtension: 'js' };
}
// Bundled (~40 requests):
function packUmd(pkgName) {
packages['@angular/'+pkgName] = { main: '/bundles/' + pkgName + '.umd.js', defaultExtension: 'js' };
}
// Most environments should use UMD; some (Karma) need the individual index files
var setPackageConfig = System.packageWithIndex ? packIndex : packUmd;
// Add package entries for angular packages
ngPackageNames.forEach(setPackageConfig);
// No umd for router yet
packages['@angular/router'] = { main: 'index.js', defaultExtension: 'js' };
var config = {
map: map,
packages: packages
};
System.config(config);
})(this);
The error makes sense as I have a .gitignore file which doesn't let the node_modules to deploy to server.
Can someone please assist as to how I can run it after deploying and what change could be done with the above references in order to make it work.
I have not used SystemJS, but your bounty has enticed me to try answering anyway, since it looks like you still need an answer. :)
After glancing through some SystemJS docs, it looks like your index.html needs to be different for development vs production use. This is what the docs show for development:
<script src="systemjs/dist/system.js"></script>
<script>
SystemJS.import('/js/main.js');
</script>
And this is what they show for production (notice the first line has a different src path):
<script src="systemjs/dist/system-production.js"></script>
<script>
SystemJS.import('/js/main.js');
</script>
More importantly, take note that node_modules is not referenced in either case, nor should it be. If you have your code and configuration set up correctly, SystemJS (like other build tools) will package everything you need without any additional <script> tags. Instead, you should import your shims (and similar) from within your code somewhere. For example, in their Webpack guide (Webpack is another build tool filling a similar role to SystemJS), the Angular team shows a polyfills.ts file that imports their shims, and then they include the polyfills file in the build via their webpack configuration.
I'm sorry I can't offer more specific advice about SystemJS in particular, but hopefully this answer is enough to point you in the right direction.
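As a rough sketch of that idea, the node_modules <script> tags would be replaced by imports somewhere early in your own code (a polyfills file, for example), and the bundler pulls them in. The module specifiers below mirror the packages your index.html loads, but treat the exact paths as assumptions that depend on the installed versions:
// polyfills.js -- imported first by the app, instead of <script> tags pointing at node_modules
import 'core-js/client/shim';
import 'zone.js/dist/zone';
import 'reflect-metadata';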
You either have to deploy node_modules as part of your package, or have a script run npm install for you to get the packages from your package.json.
To get packages into your package.json file, do npm install --save package-you-want-to-install.
Then you can have your startup script install from the package.json by trying the script at this link: https://github.com/woloski/nodeonazure-blog/blob/master/articles/startup-task-to-run-npm-in-azure.markdown
One thing you could do is install the needed packages on the Azure server via the Kudu dashboard.
Go to https://yoursitename.scm.azurewebsites.net
Then Debug console -> CMD
Go to the home\site\wwwroot directory
Type npm install
This will install the packages needed for the Angular 2 app to run on the Azure server.
Don't use system.config.js.
You need to bundle the app first; don't upload node_modules to Azure. To bundle, refer to the link below:
How to bundle an Angular app for production
Once you bundle, a dist folder will be created. You can upload that dist folder to Azure.
Install your dependencies on the production environment:
npm i --production

Angular 2 how to load 3rd party vendor node modules with sub dependencies angular-cli

Loading a single node module in an Angular 2 project bootstrapped with angular-cli is described pretty well in the wiki. Just being curious: how do I nicely load a more complex node module within a project bootstrapped with angular-cli?
E.g. angular2-apollo relies on several sub-dependencies like apollo-client, graphql, lodash, ...
I added the node module to angular-cli-build.js
var Angular2App = require('angular-cli/lib/broccoli/angular2-app');
module.exports = function(defaults) {
return new Angular2App(defaults, {
vendorNpmFiles: [
'...',
'angular2-apollo/**'
]
});
};
And registered the node module in system-config.js with:
const barrels: string[] = [
// ...
// Thirdparty barrels.
'rxjs',
'angular2-apollo',
// App specific barrels.
// ...
];
// ...
// Apply the CLI SystemJS configuration.
System.config({
map: {
'@angular': 'vendor/@angular',
'rxjs': 'vendor/rxjs',
'angular2-apollo':'vendor/angular2-apollo/build/src',
'main': 'main.js',
},
packages: cliSystemConfigPackages
});
However, this only loads angular2-apollo. The sub-dependencies of angular2-apollo are not getting loaded. How do I load sub-dependencies with SystemJS within an angular-cli bootstrapped project?
So, you are facing a really annoying problem with SystemJS, and there is an open issue about it on the Angular CLI here: https://github.com/angular/angular-cli/issues/882
It basically means you have to specify all the dependencies in the system-config.ts file and load them all in the angular-cli-build.js file... horrible, I know...
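To make that concrete, here is a sketch of what those extra entries can look like; the package list (apollo-client, graphql, lodash) and the vendor paths are examples of angular2-apollo's sub-dependencies and may not match your exact versions:
// angular-cli-build.js -- also ship each sub-dependency as vendor files
vendorNpmFiles: [
  // ...
  'angular2-apollo/**',
  'apollo-client/**',
  'graphql/**',
  'lodash/**'
]

// system-config.ts -- and tell SystemJS where each one lives
System.config({
  map: {
    'angular2-apollo': 'vendor/angular2-apollo/build/src',
    'apollo-client': 'vendor/apollo-client',
    'graphql': 'vendor/graphql',
    'lodash': 'vendor/lodash/lodash.js'
  }
});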
Maybe in the future that will happen: https://github.com/angular/angular-cli/issues/909
But, until the Angular CLI becomes better, here is a starter app that includes Angular 2.0 and angular2-apollo with all its dependencies (and even a mock GraphQL server..) - https://github.com/Urigo/apollo-ship
You can check out the system-config.ts and the angular-cli-build.js in there to see how to include dependencies on angular2-apollo, apollo-client, lodash (and all of its wanted dependencies), redux, and many many more (too many....)
I think you are doing it wrong in system-config.ts. The user package configuration should be in the upper section of that file.
const map: any = {
'angular2-apollo': 'vendor/angular2-apollo/build'
};
/** User packages configuration. */
const packages: any = {
'angular2-apollo': { main: 'main.js', defaultExtension: 'js' },
};
See if that helps.

How to share code between node.js apps?

I have several apps in node that all share a few modules that I've written. These modules are not available via npm. I would like to be able to share freely between apps, but I don't want to copy directories around, nor rely on Git to do so. And I'm not really big on using symlinks to do this either.
I would like to arrange directories something like this:
app1
server.js
node_modules
(public modules from npm needed for app1)
lib
(my own modules specific to app1)
app2
server.js
node_modules
(public modules from npm needed for app2)
lib
(my own modules specific to app2)
shared_lib
(my own modules that are used in both app1 and app2)
The problem I'm seeing is that the modules in shared_lib seem to get confused as to where to find the modules that will be in the node_modules directory of whichever app they are running in. At least I think that is the problem.
So... what is a good way to do this that avoids duplicating files? (Note that I don't care about duplicates inside node_modules, since those aren't my code, I don't check them into Git, etc.)
The npm documentation recommends using npm-link to create your own Node.js packages locally, and then making them available to other Node.js applications. It's a simple four-step process.
A typical procedure would be to first create a package with the following structure:
hello
| index.js
| package.json
A typical implementation of these files would be:
index.js
exports.world = function() {
return('Hello World');
}
package.json
{
"name": "hello",
"version": "0.0.1",
"private": true,
"main": "index.js",
"dependencies": {
},
"engines": {
"node": "v0.6.x"
}
}
"private:true" ensures that npm will refuse to publish the package. This is a way to prevent accidental publication of private packages.
Next, navigate to the root of your Node.js package folder and run npm link to link the package globally so it can be used in other applications.
To use this package in another application, e.g., "hello-world", with the following directory structure:
hello-world
| app.js
Navigate to the hello-world folder and run:
npm link hello
Now you can use it like any other npm package like so:
app.js
var http = require('http');
var hello = require('hello');
var server = http.createServer(function(req, res) {
res.writeHead(200);
res.end(hello.world());
});
server.listen(8080);
I've got this working by having node_modules folders at different levels - node then automatically traverses upwards until it finds the module.
Note you don't have to publish to npm to have a module inside of node_modules - just use:
"private": true
Inside each of your private package.json files - for your project I would have the following:
app1
server.js
node_modules
(public modules from npm needed for app1)
(private modules locally needed for app1)
app2
server.js
node_modules
(public modules from npm needed for app2)
(private modules locally needed for app2)
node_modules
(public modules from npm needed for app1 & app2)
(private modules locally for app1 & app2)
The point is that Node.js already has a mechanism for dealing with this, and it's awesome. Just combine it with the 'private, not on npm' trick and you are good to go.
In short a:
require('somemodule')
From app A or B will cascade upwards until it finds the module, regardless of whether it lives lower down or higher up. Indeed, this lets you hot-swap the location without changing any of the require(...) statements.
node.js module documentation
Just use the correct path in your require call.
For example, in server.js that would be:
var moduleName = require('../shared_lib/moduleName/module.js');
It's important to know that a path prefixed with './' or '../' is resolved relative to the calling file, while a path starting with '/' is treated as an absolute path.
For further information about Node's module loading, visit:
http://nodejs.org/docs/latest/api/modules.html
Yes, you can reference shared_lib from app1, but then you run into a problem if you want to package and deploy app1 to some other environment, such as a web server on AWS.
In this case, you're better off installing your shared_lib modules into app1 and app2 using "npm install shared_lib/module". That will also install all the dependencies of the shared_lib modules in app1 and app2 and deal with conflicts/duplicates.
See this:
How to install a private NPM module without my own registry?
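If you go that route, installing a folder by path (for example npm install ../shared_lib/my-module from inside app1, where my-module is a made-up name) records a local file: dependency in app1/package.json, roughly like this:
"dependencies": {
  "my-module": "file:../shared_lib/my-module"
}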
If you check out the node.js docs, you'll see that Node.js understands the package.json file format as well, at least cursorily.
Basically, if you have a directory named foo, and in that directory is a package.json file with the key-value pair "main": "myCode.js", then when you require("foo") and Node finds this directory with a package.json file inside, it will use foo/myCode.js for the foo module.
So, with your directory structure, if each shared lib has its own directory with such a simple package.json file inside, then your apps can get the shared libs by:
var lib1 = require('../shared_lib/lib1');
var lib2 = require('../shared_lib/lib2');
And that should work for both of these apps.
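In other words, each shared lib directory only needs a tiny manifest. A minimal sketch for a hypothetical shared_lib/lib1/package.json (the name and file are illustrative, following the myCode.js example above):
{
  "name": "lib1",
  "main": "myCode.js",
  "private": true
}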
Another solution is to clone files from the other places into this repo:
clone.js:
const path = require('path')
const fs = require('fs')
const shared = [
{
type: 'file',
source: '../app1',
files: [
'src/file1',
'src/file2',
'...'
],
},
]
function cloneFiles(source, files) {
const Reset = '\x1b[0m'
const FgGreen = '\x1b[32m'
console.log(`---------- Cloning ${files.length} files from ${source} ----------`)
for (const file of files) {
const sourceFile = path.join(__dirname, '..', source, file)
const targetFile = path.join(__dirname, '..', file)
process.stdout.write(`📁 ${file} ... `)
fs.copyFileSync(sourceFile, targetFile)
console.log(`${FgGreen}Done!${Reset}`)
}
console.log(`---------- All done successfully ----------\n`)
}
;(() => {
for (const item of shared) {
switch (item.type) {
case 'file':
cloneFiles(item.source, item.files)
break
}
}
})()
Then, in the package.json you can add this script and call it when you want to clone / sync files:
"clone": "node clone.js"
