Sails.js + Grunt: recompile assets - node.js

I deleted my js files in the .tmp folder. Now none of the js files get recompiled on sails lift. All of the needed JS files are included in the Gruntfile.js like this:
var jsFilesToInject = [
  // Below, as a demonstration, you'll see the built-in dependencies
  // linked in the proper order

  // Bring in the socket.io client
  'linker/js/socket.io.js',

  // then beef it up with some convenience logic for talking to Sails.js
  'linker/js/sails.io.js',

  // A simpler boilerplate library for getting you up and running w/ an
  // automatic listener for incoming messages from Socket.io.
  'linker/js/app.js',

  // *-> put other dependencies here <-*
  'linker/js/jquery.min.js',
  'linker/js/index.js',

  // All of the rest of your app scripts imported here
  'linker/**/*.js'
];
I also put the script includes at the end of layout.ejs.
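For context, in a default generated layout.ejs the Grunt linker task injects the tags between comment markers; a stock example (not copied verbatim from my project) looks like this:
<!--SCRIPTS-->
<!-- the sails-linker Grunt task injects one script tag per entry in jsFilesToInject here -->
<!--SCRIPTS END-->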
I'm new to Node.js, Sails.js, Grunt and this whole asset-management thing. I've seen some similar questions on Stack Overflow, but with no answers.

Related

Webpack for back-end?

I was just wondering: I started using Webpack for a new project and so far it's working fine. I'd almost say I like it better than Grunt, which I used before. But now I'm quite confused about how, and whether, I should use it with my Express back-end.
See, I'm creating one app with a front-end (ReactJS) and a back-end (ExpressJS). The app will be published on Heroku. Now it seems like I should use Webpack with ExpressJS as well to get the app up and running with one single command (front-end and back-end).
But the guy who wrote this blog post http://jlongster.com/Backend-Apps-with-Webpack--Part-I seems to use Webpack for bundling all back-end js files together, which is in my opinion really not necessary. Why should I bundle my back-end files? I think I just want to run the back-end, watch my back-end files for changes and use the rest of Webpack's power just for the front-end.
How do you guys bundle the front-end but at the same time run the back-end nodejs part? Or is there any good reason to bundle back-end files with Webpack?
Why use webpack on a Node back-end
If we are talking about a React and Node app, you can build an isomorphic React app. Using ES6 import statements in the React app on the client side is fine - they are bundled by webpack for the client.
But the problem is on the server, when you use the same React modules, since Node doesn't support ES6 modules. You can use require('babel/register'); on the Node server side, but it transpiles the code at runtime, which is not efficient. The most common way to solve this problem is to pack the backend with webpack (you don't need all of the code to be transpiled by webpack - only the problematic parts, like the React stuff in this example).
The same goes for JSX.
Bundling frontend and backend at the same time
Your webpack config can export two configs in an array: one for the frontend and one for the backend:
webpack.config.js
const common = {
  module: {
    loaders: [ /* common loaders */ ]
  },
  plugins: [ /* common plugins */ ],
  resolve: {
    extensions: ['', '.js', '.jsx'] // common extensions
  }
  // other plugins, postcss config etc. common for frontend and backend
};
const frontend = {
  entry: [
    'frontend.js'
  ],
  output: {
    filename: 'frontend-output.js'
  }
  // other loaders, plugins etc. specific for frontend
};
const backend = {
  entry: [
    'backend.js'
  ],
  output: {
    filename: 'backend-output.js'
  },
  target: 'node',
  externals: [ /* specify, for example, node_modules so they are not bundled */ ]
  // other loaders, plugins etc. specific for backend
};
module.exports = [
  Object.assign({}, common, frontend),
  Object.assign({}, common, backend)
];
If you start this config with webpack --watch, it will build your two files in parallel. When you edit frontend-specific code, only frontend-output.js will be rebuilt; the same goes for backend-output.js. The best part: when you edit the isomorphic React part, webpack will rebuild both files at once.
You can find an explanation of when to use webpack for Node in this tutorial (chapter 4).
This is my second update to this answer, which is beyond outdated by now.
If you need a full-stack web framework in 2023, I'd recommend Next.js (which is built on top of React). No need to go around setting up anything, it just works out of the box, and it is full stack.
On the other hand, if you need to compile your Node.js project written in TypeScript (which you should use as much as you can for JS projects), I would use tsup-node.
You don't need to be a genius to imagine that in 3-5 years I'll come back to this and say this is really outdated, welcome to javascript.
This answer is outdated by now, since Node now has better support for ES modules.
There are only a couple of aspects for which I can see a case for using webpack on backend code.
ES modules (import)
import has only experimental support in Node (at least since Node 8, up to 15). But you don't need to use webpack for it to work in Node.
Just use esm which is very lightweight and has no build step.
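A minimal sketch of how esm is wired in, assuming your entry file is called server.js:
npm install esm
node -r esm server.js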
Linting
Just use eslint, no need to use webpack.
Hot reloading/restart
You can use nodemon for this. It's not hot reloading but I think it's way easier to deal with.
I wish I could point to a more lightweight project than nodemon, but it does the trick.
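A minimal sketch of the restart-on-change workflow with nodemon, assuming an entry file called server.js:
npm install --save-dev nodemon
npx nodemon server.js
# restrict what is watched if restarts get noisy, e.g.:
npx nodemon --watch src server.js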
The blog post you shared (which is dated by now) uses webpack for hot reloading. I think that's overkill and opens a can of worms, because now you have to deal with webpack config issues, and hot reloading can also lead to unexpected behaviour.
The benefits we get from using tools like webpack on the frontend don't really translate to backend.
The other reasons we bundle files on the frontend are so browsers can download them in an optimal way, in optimal chunks, with cache busting and minification. There's no need for any of this in the backend.
Old (and terrible, but maybe useful) answer
You can use webpack-node-externals, from the readme:
npm install webpack-node-externals --save-dev
In your webpack.config.js:
var nodeExternals = require('webpack-node-externals');
module.exports = {
  ...
  target: 'node', // in order to ignore built-in modules like path, fs, etc.
  externals: [nodeExternals()], // in order to ignore all modules in node_modules folder
  ...
};
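For a more complete picture, a minimal backend-only config might look like the sketch below; the entry path ./src/server.js and the build output folder are assumptions for illustration, not part of the readme:
var path = require('path');
var nodeExternals = require('webpack-node-externals');

module.exports = {
  entry: './src/server.js',        // assumed entry point
  target: 'node',                  // don't bundle Node built-ins like path, fs, etc.
  externals: [nodeExternals()],    // don't bundle anything from node_modules
  output: {
    path: path.join(__dirname, 'build'),
    filename: 'server.bundle.js'
  }
};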
to use Webpack for bundling all back-end js files together, which is in my opinion really not necessary.
I think you are absolutely right. It's not necessary at all. I've been researching this topic for a while, and I've asked lots of questions about it, but to this day I haven't found a single "real" reason to use webpack on a Node.js back-end.
I'm not saying you can't or shouldn't set up a webpack-dev-server to develop your back-end code locally. But you definitely don't need to bundle your backend code in your build process.
webpack bundles are meant for the browser. Take a look at its official docs: Why webpack?. Historically, browsers never had a built-in module system; that's the reason you need webpack. It basically implements a module system in the browser. Node.js, on the other hand, has a built-in module system out of the box.
And I do re-use all of my front-end code for SSR on my server. The very same front-end files run on my server as-is, without any bundling (they are just transpiled, but the folder structure and number of files are the same). Of course I bundle the code to run in the browser, but that's all.
To run on your Node.js server, simply transpile it with babel, without webpack.
Just use ["#babel/preset-env", { targets: { node: "12" }}], on your babel config. Choose the Node version of your backend environment.
backend
  dist_app      // BABEL TRANSPILED CODE FROM frontend/src
  dist_service  // BABEL TRANSPILED CODE FROM backend/src
  src
    index.tsx
frontend
  src
    App.tsx
  public        // WEBPACK BUNDLED CODE FROM frontend/src
You can perfectly well render your frontend code on the server by doing:
backend/src/index.tsx
import { renderToString } from "react-dom/server";
import App from "../dist_app/App";
const html = renderToString(<App/>);
This would be considered isomorphic, I guess.
If you use path aliases in your code, you should use babel-plugin-module-resolver.
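A sketch of what that plugin's configuration looks like; the "@app" alias and the ./src root are hypothetical names for illustration:
// babel.config.js - sketch of babel-plugin-module-resolver options
module.exports = {
  plugins: [
    ["module-resolver", {
      root: ["./src"],             // hypothetical source root
      alias: { "@app": "./src" }   // hypothetical alias
    }]
  ]
};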

sails.config.XXXX in app.js for SailsJS Framework

I am using Sails Framework with the newrelic plugin. I am currently trying to register the app based on a flag in production.js/development.js. This is the flag in development.js:
ENABLE_NEWRELIC_NODE_SERVICE: false,
And this is the piece of code in App.js:
if (sails.config.ENABLE_NEWRELIC_NODE_SERVICE) {
  require('sails-hook-newrelic/register');
}
But it seems that sails.config is not accessible in app.js (I might be wrong). Is there any other way to include conditional logic in app.js based on config files?
Thanks!
Short of a Sails-specific way of doing this, simply require in the config file:
var config = require('./sails.config') // Check the path is correct
Then use the flag:
if (config.ENABLE_NEWRELIC_NODE_SERVICE) {
  require('sails-hook-newrelic/register');
}
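If the flag lives in an environment-specific file, a slightly fuller sketch (assuming config/env/development.js and config/env/production.js; adjust the paths to your project) would be:
var env = process.env.NODE_ENV || 'development';
var envConfig = {};
try {
  envConfig = require('./config/env/' + env); // assumed location of the flag
} catch (e) {
  // no environment-specific config found; leave the flag unset
}

if (envConfig.ENABLE_NEWRELIC_NODE_SERVICE) {
  require('sails-hook-newrelic/register');
}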
The sails.config object isn't available until Sails is actually loaded. Looking at the code of sails-hook-newrelic, it's not clear to me why register() needs to be called so early; it seems like it could just happen as part of the hook initialization. But the code isn't written that way, so you're stuck on that point. Looks like there's a pull request open to fix it. In the meantime, assuming that you want to activate New Relic based on the Node environment, you can just check process.env.NODE_ENV in your app.js:
if (process.env.NODE_ENV === 'development') {
  require('sails-hook-newrelic/register');
}
Just make sure that you start your app with NODE_ENV=development node app.js.

Can I use webpack on the client side without nodejs server?

I am trying to build a web app where I want to store all the HTML, JS and CSS files on Amazon S3, and communicate with a RESTful server through an API.
I am trying to achieve lazy loading, and maybe routing, with React Router. It seems that webpack has a code-splitting feature that would work similarly to lazy loading.
However, all of the tutorials and examples I found involve webpack-dev-server, which is a small Node/Express server. Is there any way I could generate the bundle at build time, upload everything to Amazon S3, and achieve something similar to Angular's ocLazyLoad?
It's definitely possible to create a static bundle file that you can use in your production code without webpack-dev-server.
See this example as a reference (note: I am the owner of this repo). webpack.prod.config.js creates a production-ready bundle file using webpack via Node.js; the resulting bundle itself does not require Node.js anymore. Because of that, you can simply serve it as a static file (which is done in the live example).
The key difference is how the entry points are written in the development and production environments. For development, webpack-dev-server is used:
module.exports = {
  entry: [
    'webpack-dev-server/client?http://localhost:3000',
    'webpack/hot/only-dev-server',
    './src/index'
  ],
  // ...
}
In the production environment, you skip webpack-dev-server and the hot-reloading part:
module.exports = {
  entry: [
    './src/index'
  ],
  // ...
}
If you want to split your code into more than one bundle, you might want to have a look at how to define multiple entry points and link the files accordingly.
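A sketch of what multiple entry points can look like; the admin entry and the dist folder are illustrative, not taken from the linked example:
var path = require('path');

module.exports = {
  entry: {
    app:   './src/index',
    admin: './src/admin'            // hypothetical second entry
  },
  output: {
    path: path.join(__dirname, 'dist'),
    filename: '[name].bundle.js'    // emits app.bundle.js and admin.bundle.js
  }
  // ...
};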

What is index.js used for in node.js projects?

Other than a nice way to require all files in a directory (node.js require all files in a folder?), what is index.js used for mainly?
When you pass a folder to Node's require(), it will check for a package.json and use its main field as the entry point. If that isn't defined, it checks for index.js, and finally index.node (a C++ addon format). So index.js is most likely the entry point for requiring a module.
See the official Docs here: http://nodejs.org/api/modules.html#modules_folders_as_modules.
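As a small illustration of that lookup order (the module name and paths here are hypothetical):
// my-module/
//   package.json      -> { "main": "./lib/start.js" }
//   lib/start.js
//   index.js
var mod = require('./my-module');
// require() reads package.json and loads ./my-module/lib/start.js;
// without a "main" field it would fall back to ./my-module/index.js,
// and then to ./my-module/index.node.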
Also, you ask how to require all the files in a directory. Usually, you require a directory with an index.js that exposes some encapsulated interface to those files; the way to do this will be different for every module. But suppose you wanted to include a folder's contents when you include the folder (note: this is not a best practice and comes up less often than you would think). Then you could use an index.js that loads all the files in the directory synchronously (setting exports asynchronously is usually asking for terrible bugs) and attaches them to module.exports, like so:
var path = require('path'),
    dir = require('fs').readdirSync(__dirname + path.sep);

dir.forEach(function (filename) {
  if (path.extname(filename) === '.js' && filename !== 'index.js') {
    var exportAsName = path.basename(filename, '.js'); // strip the extension from the export name
    module.exports[exportAsName] = require(path.join(__dirname, filename));
  }
});
I hardly ever see people wanting to use that pattern though - most of the time you want your index.js to go something like
var part1 = require('./something-in-the-directory'),
    part2 = require('./something-else');
....
module.exports = myCoolInterfaceThatUsesPart1AndPart2UnderTheHood;
Typically in other languages, the web server looks for certain files to load first when a directory like / is visited; traditionally this is either index or default. In PHP it would be index.php, and for plain HTML it would be index.html.
In Node.js, Node itself is the web server, so you don't need to name anything index.js, but it makes it easier for people to understand which file to run first.
index.js typically handles your app startup, routing and other functions of your application, and requires other modules to add functionality. If you're running a website or web app, it would also become a basic HTTP web server, replacing the role of something more traditional like Apache.
Here is a good article explaining how Node.js looks for a required module, including folders with an index.js file: https://medium.freecodecamp.org/requiring-modules-in-node-js-everything-you-need-to-know-e7fbd119be8
Modules don’t have to be files. We can also create a find-me folder under node_modules and place an index.js file in there. The same require('find-me') line will use that folder’s index.js file:
~/learn-node $ mkdir -p node_modules/find-me
~/learn-node $ echo "console.log('Found again.');" > node_modules/find-me/index.js
~/learn-node $ node
> require('find-me');
Found again.
{}
>
Late to the party, but the answer is simply to allow a developer to specify the public API of the folder!
When you have a bunch of JavaScript files in a folder, only a small subset of the functions and values exported from these files should be exportable outside of the folder. These carefully selected functions are the public API of the folder and should be explicitly exported (or re-exported) from the index.js file. Thus, it serves an architectural purpose.
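A small sketch of that idea; the file names are hypothetical:
// users/index.js - the folder's public API
module.exports = {
  createUser: require('./create-user'),
  deleteUser: require('./delete-user')
  // other files in this folder stay private by convention,
  // because they are not re-exported here
};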

Prevent RequireJS from Caching Required Scripts on Nodejs

I am using something like Airbnb's rendr, i.e. my own implementation of sharing Backbone code between client and server, to build full HTML on the server using the same Backbone models, views, collections and templates I am using on the client. My files/modules are defined as requirejs modules, so that I can share them easily with the client.
In development mode, I want requirejs to refetch/reload any modules from disk when I refresh my browser (without restarting the server), so that my server rendering uses the newest templates and JavaScript to serve me the newest HTML.
When using requirejs on the server with Node, the trick of appending bust parameters to urlArgs, like the following, doesn't work, i.e. the server doesn't reload/refetch any modules from disk:
urlArgs: "bust=v2"
I wonder if reloading/refetching requirejs modules from disk without restarting the server is possible in Node? Specifically, that would be very useful for the require-text plugin for templates. Additionally, it would be nice to apply reloading only to a restricted set of modules.
I've never had to do this, but a quick search shows a few options, including automating the server restart:
nodemon
node-supervisor
You may also be able to do something similarly creative to this:
delete require.cache['/home/shimin/test2.js']
You could almost definitely clear the version RequireJS has in its cache, forcing it to reload the module, although I suspect Node would just serve up the old file again.
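A slightly safer variant of the cache delete above uses require.resolve() so you don't hard-code the absolute path (a sketch; './test2' is just an illustrative module):
var resolved = require.resolve('./test2'); // resolve to the full filename Node uses as the cache key
delete require.cache[resolved];            // drop the cached module
var fresh = require('./test2');            // the next require re-reads the file from disk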
Finally, maybe take a look at hot-reloading, which seems to work around the caching without needing to restart the server (example).
Like the OP, I was looking for a way to make development in node with RequireJS easier and faster. In my case, I am using RequireJS with a grunt watch task in node, and wanted to be able to force RequireJS to always get the latest version of a module from the file system. And happily, I found a very simple solution!
I had to slightly modify Henning Kvinnesland’s suggestion by adding the timeout, but now it is working like a charm:
var requirejs = require('requirejs');

// Set up requirejs.config here...

// Disable caching of modules
requirejs.onResourceLoad = function (context, map) {
  setTimeout(function () {
    requirejs.undef(map.name);
  }, 1);
};
