Backbone with Multipage node app strategy - node.js

I have a question regarding the node app that I want to build.
Before starting on the development, I've written a clear document that splits up my app into different components:
Home
Search
User profile
Dashboard
etc...
Each of these modules may in turn consist of different submodules.
As every module in my app works quite independently (although there are common, reusable components), I decided to render the main page for each of the modules from the server using Express.
Each of the pages I want to render is highly interactive, with lots of DOM manipulation and event-driven view updates, so I want to use Backbone for this (with pushState to load submodules dynamically for the nested URLs), in combination with RequireJS for asynchronous module loading.
What I wonder about is whether it is okay to include a separate minified file for each of the pages that I render from the server with Express. That seems to cause quite a bit of overhead, because every page load pulls in all the libraries again (Backbone, Underscore, jQuery, and others).
Is there a common solution to this problem, and will this (in your experience) cause unacceptable performance issues?

What we ended up doing with a similar multi-page app structure is breaking the build into a separate "common.js" file that contains all the shared modules, plus "main-[module-name].js" files for the page-specific code, and loading them with two separate script tags per page.
I don't know that it has any significant perf impact; I'm guessing not really, unless you have some large libraries in your project.
Take a look at the multi-page config example for RequireJS; a rough sketch of that split follows below.
My $0.02
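
As a rough sketch of that split (the page and module names here are assumptions, loosely modeled on the RequireJS multi-page example), an r.js build profile could look like this:

    // build.js -- r.js build profile sketch for a multi-page app
    // (page names like "main-home" are assumptions; adjust to your own pages)
    ({
        appDir: "www",
        baseUrl: "js",
        dir: "www-built",
        modules: [
            // Everything shared ends up in common.js
            {
                name: "common",
                include: ["jquery", "underscore", "backbone"]
            },
            // Page-specific bundles exclude what common.js already contains
            {
                name: "main-home",
                exclude: ["common"]
            },
            {
                name: "main-dashboard",
                exclude: ["common"]
            }
        ]
    })

Each server-rendered page then includes require.js plus its two bundles (common.js and its own main-[module-name].js), so the shared file stays cached across page loads while only the small page bundle changes.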

Related

Run-time bundling of ES6 modules in ASP.NET MVC

Are there any existing solutions for run-time bundling of ES6 modules?
I'm looking to simplify JavaScript development in an MVC5 web app. We're having issues with large, unwieldy JS files, so I'm hoping to get a module loader system in place. So far I'm not finding any existing bundle transformers for ES6 or any other module loader format. I'd be fine with using TypeScript or Node.js require style; I'd prefer not to use the RequireJS style, though.
Perhaps there's a good reason this solution doesn't exist already. Maybe the dependency resolution processing is too much for a run-time bundling solution. But, I figure it's worth a shot to ask.
Solutions Considered
Prebuilt Web Client
Ultimately, this is where I want to be, but I need a stop-gap solution for now. I know how to put together a build system for an HTML client using grunt/gulp/webpack. But I don't want to have to tell developers to run webpack -w or something similar during development. Nor do I want to tell them to rebuild a solution for every JS change. They should be able to modify the file, refresh the browser, and see the change.
Directory Structure
This is the route I'll probably end up going with. Basically, this JS codebase consists of jQuery widgets and plain JS (helpers/common functions). So, if I structure the code in this directory structure and include the js dir, it should get me most of the way there:
js (DIR)
app-start.js
helpers (DIR)
widgets (DIR)
Widgets should be fine. With helpers I can see issues where one function/class depends on another. Still, since a call chain should never start in a helper (only in a widget), this should work fine, assuming no globals are used (or maybe a single global like 'App').
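
For illustration only (the file and function names below are made up), the layering described above might look like this under a single 'App' global:

    // helpers/date-helpers.js -- plain helper functions hang off App.helpers
    var App = window.App = window.App || {};
    App.helpers = App.helpers || {};
    App.helpers.formatDate = function (d) {
        return d.toISOString().slice(0, 10);
    };

    // widgets/user-card.js -- a jQuery widget calls into helpers, never the reverse
    App.widgets = App.widgets || {};
    App.widgets.userCard = function ($el, user) {
        $el.text(user.name + " (joined " + App.helpers.formatDate(user.joined) + ")");
    };

As long as helper files load before widget files (which an alphabetical directory include happens to give you here), no other ordering is needed.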

Preprocessing via express middleware or through build system

Is preprocessing static resources through middleware (using Express) a good idea for production environments? From my understanding the middleware stack is run, in series, for every request. Wouldn't that mean that preprocessing middleware would regenerate a static resource (e.g. some_styles.less -> some_styles.css) on every request? If so, would it be better to simply preprocess through a build system such as grunt.js in advance and serve those files? I'd like the final CSS and JS to be concatenated into one file each and minified.
Also, is it worthwhile to pre-render html from templates (such as jade) on pages with only static content? Or is that more trouble than it's worth?
The easiest way to handle CSS and JS preprocessing and minifying is through some sort of build system, be it grunt, cake, etc. It also might offer some performance benefits; at the very least, it reduces the workload for your server.
For my projects, I have tasks in my Cakefile to handle both CSS and JS. These are invoked by running the build task, and just output to a directory of static files that is set up through app.use("/res", express.static("RESDIR")).
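
As a minimal sketch of that setup (the directory names are assumptions), the production server just mounts the build output as plain static files, with no preprocessing per request:

    // server.js -- serve prebuilt, minified assets produced by the build task
    var express = require("express");
    var app = express();

    // "build/res" is whatever directory your grunt/cake build writes to
    app.use("/res", express.static(__dirname + "/build/res", { maxAge: 86400000 }));

    app.listen(3000);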
As for pre-rendering HTML, it will offer a performance benefit. Unless doing it is very complex, I would go ahead and pre-render everything you easily can. It is far, far simpler to do it up front than to bolt it on down the road (if you're expecting any sort of growth, it could matter in the future).
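
Pre-rendering can be as simple as one more build step. A sketch, assuming Jade templates and an output directory of your choosing (Jade 1.x / Pug API):

    // prerender.js -- render purely static pages to plain HTML at build time
    var fs = require("fs");
    var jade = require("jade"); // the package was later renamed "pug"

    var html = jade.renderFile("views/about.jade", { pretty: false });
    fs.writeFileSync("build/about.html", html);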

Hot Towel: Why is Durandal and Require in the App folder rather than the Script folder?

This is coming from the idea that 3rd-party libraries live in Scripts to discourage developers from customizing them, and to encourage them to write extensions instead, making it easier to take in a new version of either library.
You make a good point about other developers mistaking the durandal libraries for customizable files.
But you are not required to keep Durandal anywhere in particular. The folder structure can be whatever your heart desires, because Durandal does not impose one; it only has a recommended default setup. There are benefits to following its pattern, though.
Keeping Durandal inside your application root folder keeps all your AMD JavaScript files together in one place. That way, when you run the Durandal optimizer, it can scan every subfolder and compress/minify/uglify all your HTML/CSS/JS into one file. This is a nice benefit because it gives you a one-click build of your entire application.
It is also a nice separation: keeping your 3rd-party, non-AMD JavaScript libraries in a separate folder structure lets you use a bundler to compress them into their own file, so the browser can cache your application separately from the third-party libraries. The third-party libraries don't change very often, whereas your application will probably change frequently.
But Durandal's conventions are all completely configurable, and you can put Durandal in any location you like.
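
For example (a sketch only; the paths are assumptions), if you did want Durandal to live under Scripts instead of App, you would just point RequireJS at it in your main.js:

    // App/main.js -- tell RequireJS where Durandal actually lives
    requirejs.config({
        paths: {
            "text": "../Scripts/text",
            "durandal": "../Scripts/durandal",
            "plugins": "../Scripts/durandal/plugins"
        }
    });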
This is a convention that Durandal has decided to use to help keep your custom client code organized in an App folder and away from the 3rd-party Scripts folder, which gets messy pretty quickly. It puts require.js in the App folder because of the way it relies on require.js and its AMD pattern; require.js is used to locate all modules (in your App folder) and load them as needed.
Is there something specific that you need that this is preventing?

Concatenating javascript files on the fly in Liferay

I see a barebone.jsp file being created (I guess by the MinifierFilter) for serving compressed and cached JS. I want to separate the development and production cases: in development, I not only don't want Liferay to cache the generated JavaScript file, I don't want that generated file to exist at all.
To be more precise, I want all JavaScript files to be concatenated on the fly. In development I always want to be able to edit any static file and see the result as soon as possible.
What is the easiest way to implement it?
Include the settings from portal-developer.properties in your portal-ext.properties. This disables the minifiers, caching, etc., so you can develop without the problems mentioned. You don't want this setting in production though, as all files will then be loaded individually.
(Edit, pulling my comment from below into the answer:)
You'll find this file in webapps/ROOT/WEB-INF/classes.
All the *.fast.load parameters are for the various minifiers (CSS, JS), but typically you want all of the parameters named in there.
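
In other words (a sketch; include-and-override is the standard Liferay mechanism, but double-check the property names against your version), a development portal-ext.properties would simply pull in the developer profile:

    # portal-ext.properties (development only)
    # Pulls in portal-developer.properties, which disables minification and
    # caching via settings such as javascript.fast.load=false and
    # theme.css.fast.load=false.
    include-and-override=portal-developer.properties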

Dojo load time extremely slow on iis

I am currently working on a project that uses Dojo as the JS framework. It's a rather rich UI, and as such it uses (and thus loads) a lot of different .js files for the Dojo plug-ins.
When run on an Apache server on a Mac, the files (all around 1 KB) are served very quickly (1 or 2 ms) and the page loads pretty fast (< 5 seconds).
When run on IIS on Windows 7, the files are served at an unbelievably slow rate (150 ms - 1 s), causing the page to take up to 3 minutes to load.
I have searched the internet to try to find a solution and have come up empty.
Anyone have any ideas?
Why not let Google serve the Dojo files for you?
The AJAX Libraries API is a content distribution network and loading architecture for the most popular, open source JavaScript libraries. By using the google.load() method, your application has high speed, globally available access to a growing list of the most popular, open source JavaScript libraries.
What you need to do is build an optimized version of your code. That way you will have far fewer hits to your server (though I guess those will still be slow until you track down the IIS problem).
Dojo runs out of the box as individual files, which is great for development, but without running the build scripts to concatenate all these files together the experience is poor. The CDN does serve built layers for Dojo base and certain profiles, like dijit.dijit; doing a dojo.require on these layers in addition to the individual requires would take advantage of this after running a build. You would need to create layers for your own code as well. The build scripts can also concatenate CSS and inline template files, remove comments and whitespace, etc.
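
As a minimal sketch (legacy Dojo 1.x build-profile format; the module names are assumptions), a custom layer for your own code might be declared like this:

    // app.profile.js -- old-style Dojo build profile defining one custom layer
    dependencies = {
        layers: [
            {
                // Everything app.main pulls in is concatenated into this single file
                name: "../app/layer.js",
                dependencies: [
                    "dijit.dijit",
                    "app.main"
                ]
            }
        ],
        prefixes: [
            ["dijit", "../dijit"],
            ["dojox", "../dojox"],
            ["app", "../app"]
        ]
    };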
Have you actually tried measuring the load times on the intended target production server?
If you're just testing this on local development environments (or in development/test VM's) then I think you're comparing apples with oranges (pardon the pun :) ).