Have require.js load my files only when I actually need them

I know this is the whole point of require.js, but it does not behave this way in my situation.
I am creating a single-page Backbone.js application. The main entry point to the application is through a router. Let's say I have 3 routes:
users: function(){
    require('users');
},
products: function(){
    require('products');
},
groups: function(){
    require('groups');
},
Based on the function I call, I want require.js to load the corresponding file, but it does not do this.
Instead it downloads all the files for my complete website wherever there is a require. I haven't even called the function, but it still loads the file.
Is there a way to have require.js behave as it should and download the file only when I am actually inside the function?

You can basically do what you have in your code snippet. It'd look something like this:
users: function(){
    require(['users'], function(Users){
        //Code
    });
},
products: function(){
    require(['products'], function(Products){
        //Code
    });
},
groups: function(){
    require(['groups'], function(Groups){
        //Code
    });
},
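The modules listed in the array are only fetched when the route callback actually fires. For context, here is roughly how this could sit inside a Backbone router; this is a minimal sketch that assumes Backbone is available as an AMD module, and the UsersView export and the '#content' element are assumptions, not part of your code:

define(['backbone'], function (Backbone) {
    return Backbone.Router.extend({
        routes: {
            'users': 'users',
            'products': 'products',
            'groups': 'groups'
        },
        users: function () {
            // 'users' is not downloaded until this callback runs
            require(['users'], function (UsersView) {
                new UsersView({ el: '#content' }).render();
            });
        }
        // products and groups follow the same pattern
    });
});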

Related

Display dynamically an image using express and EJS

I have a collection containing different image URLs. I retrieve the URL I want and want to pass it to the EJS template like this:
app.get('/', function(req, res){
    mongoDB.getUsedHomePageOne(function(err, result){
        if(!err){
            console.log("getUsedHomePageOne : ");
            console.log(result);
            app.locals['homePageImg'] = result.url;
        }
    });
    app.render('userPageEjs.html', function(err, renderedData){
        console.log(renderedData);
        res.send(renderedData);
    });
});
and the getUsedHomePageOne looks like:
DBMongo.prototype.getUsedHomePageOne = function(callback){
    this.homePageColl.findOne({used: 1}, callback);
};
and in the EJS template:
<img src="<%= homePageImg %>"/>
This won't work unless I load the page twice; I assume that's because the value gets cached and is available quickly enough on the second request, or something like that.
What is the proper way of doing it?
PS: the 2nd time I load the page, everything loads correctly.
PS2: I don't want to delay the rendering for the image; I would like to load the image once it is ready, but render the HTML page first anyway.
From what I've gathered in your code:
app.get('/', function(req, res){
    mongoDB.getUsedHomePageOne(function(err, result){
        if(!err){
            console.log("getUsedHomePageOne : ");
            console.log(result);
            app.locals['homePageImg'] = result.url;
            app.render('userPageEjs.html', function(err, renderedData){
                console.log(renderedData);
                res.send(renderedData);
            });
        }
    });
});
Basically, you have an async function that talks to the DB, and you render the template without waiting for that DB call to complete. The normal pattern when using async functions whose results are needed further down the line is to call the next function inside the async function's callback. However, this can lead to callback hell (similar to how I've written the fix above), so an alternative like Promises or async.js is usually preferred.
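As a rough sketch of the Promise alternative (the getUsedHomePageOnePromise wrapper below is hypothetical, not part of your code; it just wraps the existing callback-style call):

function getUsedHomePageOnePromise() {
    return new Promise(function (resolve, reject) {
        mongoDB.getUsedHomePageOne(function (err, result) {
            if (err) return reject(err);
            resolve(result);
        });
    });
}

app.get('/', function (req, res, next) {
    getUsedHomePageOnePromise()
        .then(function (result) {
            // render only once the image URL is available
            res.render('userPageEjs.html', { homePageImg: result.url });
        })
        .catch(next); // hand errors to Express
});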

Injection code is there in spite of ngmin

Why does jhipster have these duplicate dependency lists in the source files, even though it uses ngmin to automatically add the injected dependencies as part of the build tasks so the code is safe for minification?
For example, there is this code in services.js,
jhipsterApp.factory('Register', ['$resource',
    function ($resource) {
        return $resource('app/rest/register', {}, {
        });
    }]);
but since it uses ngmin, I would expect something like this, without the array and extra "$resource"
jhipsterApp.factory('Register', function ($resource) {
    return $resource('app/rest/register', {}, {
    });
});
Is it just that the code was like this before ngmin was introduced and hasn't been simplified yet? Maybe I don't understand it properly.

Block function whilst waiting for response

I've got a NodeJS app I'm building (using Sails, but I guess that's irrelevant).
In my action, I have a number of requests to other services, data sources, etc. that I need to load up. However, because of the huge dependency on callbacks, my code is still executing long after the action has returned the HTML.
I must be missing something silly (or not quite getting the whole async thing), but how on earth do I stop my action from finishing until I have all my data ready to render the view?!
Cheers
I'd recommend getting very intimate with the async library (https://github.com/caolan/async).
The docs at that link are pretty good, but it basically boils down to a bunch of very handy calls like:
async.parallel([
    function(done){ /* ... */ done(); },
    function(done){ /* ... */ done(); }
], callback);
async.series([
    function(done){ /* ... */ done(); },
    function(done){ /* ... */ done(); }
]);
Node is inherently async, you need to learn to love it.
It's hard to tell exactly what the problem is, but here is a guess. Assuming you have only one external call, your code should look like this:
exports.myController = function(req, res) {
    longExternalCallOne(someparams, function(result) {
        // you must render your view inside the callback
        res.render('someview', {data: result});
    });
    // do not render here, as you don't have the result yet.
};
If you have more than one external call, your code will look like this:
exports.myController = function(req, res) {
    longExternalCallOne(someparams, function(result1) {
        longExternalCallTwo(someparams, function(result2) {
            // you must render your view inside the innermost callback
            var data = {/* some combination of result1 and result2 */};
            res.render('someview', {data: data});
        });
        // do not render here, since you don't have result2 yet
    });
    // do not render here either, as you have neither result1 nor result2 yet.
};
As you can see, once you have more than one long-running async call, things start to get tricky. The code above is just for illustration purposes. If your second call depends on the first one, then you need something like it, but if longExternalCallOne and longExternalCallTwo are independent of each other, you should be using a library like async to help parallelize the requests: https://github.com/caolan/async
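As a rough sketch of that parallel version (a minimal illustration only; it assumes the same hypothetical longExternalCallOne/longExternalCallTwo helpers and that their callbacks receive just a result):

var async = require('async');

exports.myController = function(req, res, next) {
    async.parallel({
        one: function(done) {
            longExternalCallOne(someparams, function(result1) { done(null, result1); });
        },
        two: function(done) {
            longExternalCallTwo(someparams, function(result2) { done(null, result2); });
        }
    }, function(err, results) {
        if (err) return next(err);
        // both calls have finished; render exactly once, here
        res.render('someview', {data: {one: results.one, two: results.two}});
    });
};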
You cannot stop your code. All you can do is check in all callbacks if everything is completed. If yes, go on with your code. If no, wait for the next callback and check again.
You should not stop your code, but rather render your view in the other resource's callback, so that you wait for the resource to be fetched before rendering. That's the common pattern in node.js.
If you have to wait for several callbacks to be called, you can manually check each time one is called whether the others have been called too (with simple booleans, for example), and call your render function if so. Or you can use async or other libraries which will make the task easier. Promises (with the bluebird library) could be an option too.
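For instance, a minimal bluebird sketch (the loadUsers and loadProducts helpers are hypothetical callback-style functions standing in for your resource calls):

var Promise = require('bluebird');

// promisify the hypothetical callback-style loaders
var loadUsersAsync = Promise.promisify(loadUsers);
var loadProductsAsync = Promise.promisify(loadProducts);

exports.myController = function(req, res, next) {
    Promise.all([loadUsersAsync(), loadProductsAsync()])
        .then(function(results) {
            // both resources are ready; render once, here
            res.render('someview', {users: results[0], products: results[1]});
        })
        .catch(next);
};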
I am guessing here, since there is no code example, but you might be running into something like this:
// let's say you have a function, you pass it an argument and a callback
function myFunction(arg, callback) {
    // now you do something asynchronous with the argument
    doSomethingAsyncWithArg(arg, function() {
        // now you've got your arg formatted or whatever, render the result
        res.render('someView', {arg: arg});
        // now do the callback
        callback();
        // but you also have stuff here!
        doSomethingElse();
    });
}
So, after you render, your code keeps running. How do you prevent it? Return from there:
return callback();
Now your inner function will stop processing after it calls callback.

Express.js - ASP.NET-like MVC Routing

I'm trying to set up an MVC architecture for Express. What I am trying to accomplish is a routing mechanism close to ASP.NET's. For example, for the following route:
/users/detail/1
express should call a module named users.js under the controllers directory. Within the users.js module is a function named detail, and within that function I can simply read the request parameter to get the id of the user.
My idea is to extract the users part and map it to a users.js file using a simple require statement. But how can I tell express to call the detail() function by simply extracting the action part of the route, which is 'detail' in the above example? I could use eval(), but I hear that it's not a safe thing to do. Thanks in advance.
In browser-side JavaScript, you can typically do the following:
function a () { console.log('called a'); }
window['a'](); // called a
You can do something similar in node by replacing window with global, such as:
function a () { console.log('called a'); }
global['a'](); // called a
However, if you are pulling this function in from another file, it will be a little different. Let's assume that you have the following file a_module.js:
exports.a = function () { console.log('a called'); }
And then in your main file, you can do the following:
var a_mod = require('./a_module.js');
a_mod['a'](); // a called
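To tie that back to your routing example, a minimal sketch (the controllers directory layout and the req.params names are assumptions, and there is no whitelisting or input validation here):

var express = require('express');
var app = express();

app.get('/:controller/:action/:id', function (req, res, next) {
    var controller;
    try {
        // e.g. /users/detail/1 -> ./controllers/users.js
        controller = require('./controllers/' + req.params.controller);
    } catch (e) {
        return next(); // unknown controller, fall through to 404 handling
    }
    var action = controller[req.params.action]; // e.g. the exported detail function
    if (typeof action === 'function') {
        action(req, res); // the action can read req.params.id
    } else {
        next();
    }
});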

Design pattern for many asynchronous tasks in node

I'm learning node and writing an API. One of my API calls takes a parameter called Tags, which will contain comma-delimited tags, each of which I want to save to disk (I'm using MongoDB + Mongoose). Typically when I save to the DB in my API, I pass a callback and carry on after the save inside that callback, but here I have a variable number of objects to save, and I don't know the cleanest way to save all of these tags and then save the object which references them afterward. Can anyone suggest a clean async pattern to use? Thanks!
async is a good node library for these tasks.
It can run multiple async calls in parallel or in series and trigger one single callback after that:
async.parallel([
    function(done){ /* ... */ done(); },
    function(done){ /* ... */ done(); }
], callback);
async.series([
    function(done){ /* ... */ done(); },
    function(done){ /* ... */ done(); }
]);
This is a common code pattern I often use when I don't want additional dependencies:
var tags = ['tag1', 'tag2', 'tag3'];
var wait = tags.length;

tags.forEach(function (tag) {
    doAsyncJob(tag, done);
});

function done() {
    if (--wait === 0) allDone();
}
This code will run doAsyncJob(tag, callback) in parallel for each item of the array, and call allDone when every job has completed. If you need to process the items sequentially (one after another), here's another pattern:
(function oneIteration() {
    var item = tags.shift();
    if (item) {
        doAsyncJob(item, oneIteration);
    } else {
        allDone();
    }
})();
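Applied to your Tags parameter, a minimal Mongoose sketch using async.each (the Tag and Post models and their schemas are assumptions, not part of your code):

var async = require('async');
var mongoose = require('mongoose');

// hypothetical models registered elsewhere
var Tag = mongoose.model('Tag');
var Post = mongoose.model('Post');

function createPost(title, tagsParam, callback) {
    var names = tagsParam.split(',');
    var tagIds = [];

    // save every tag first, collecting its _id
    async.each(names, function (name, done) {
        new Tag({ name: name.trim() }).save(function (err, tag) {
            if (err) return done(err);
            tagIds.push(tag._id);
            done();
        });
    }, function (err) {
        if (err) return callback(err);
        // all tags are saved; now save the object that references them
        new Post({ title: title, tags: tagIds }).save(callback);
    });
}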
