Publish a collection multiple times in Meteor.js - pagination

I have a collection that I have to publish as a whole as well as in part. The challenge is that once I publish it as a whole, it overrides the publication that is supposed to return only 5 documents at a time. The publication with a limit is for pagination, while the full publication feeds a dropdown box. How do I publish the collection so that neither publication overrides the other?
This is the partial publication, with a limit of 5:
Meteor.publish('userSchools', function (skipCount) {
  check(skipCount, Number);
  // use this.userId here: Meteor.userId() cannot be called inside publish functions
  var user = Meteor.users.findOne({_id: this.userId});
  if (user) {
    if (user.emails[0].verified) {
      return SchoolDb.find({userId: this.userId}, {limit: 5, skip: skipCount});
    } else {
      throw new Meteor.Error('Not authorized');
    }
  }
});
Published as a whole:
Meteor.publish('allvalues', function () {
  var user = Meteor.users.findOne({_id: this.userId});
  if (user) {
    if (user.emails[0].verified) {
      return SchoolDb.find({userId: this.userId});
    } else {
      throw new Meteor.Error('Not authorized');
    }
  }
});

This is how Meteor pub-sub behaves: documents from every active publication of a collection are merged into the same client-side collection, so the full publication's documents appear alongside the paginated ones. What you can do is apply the same limit and skip options to the client-side query inside the template where you subscribe in parts, so each view picks out only the documents it needs from the merged set.
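To make the idea concrete, here is a plain-JavaScript sketch (not Meteor API calls) of why the template must re-apply the limit and skip: the client-side cache holds the union of everything published, and pagination has to be done against that merged set. The `paginate` helper and the sample documents are illustrative assumptions.

```javascript
// Hypothetical stand-in for the client-side merged collection: Minimongo
// ends up holding documents from BOTH publications, so a template that
// wants a given page must slice the merged set itself.
function paginate(mergedDocs, skipCount, limit) {
  return mergedDocs.slice(skipCount, skipCount + limit);
}

// 12 fake school documents, as if both publications were active.
const merged = Array.from({ length: 12 }, (_, i) => ({ _id: String(i) }));

const pageTwo = paginate(merged, 5, 5); // documents 5..9
```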

Angular 5 will not run more than one method on ngOnInit?

I have attached a screenshot of what I am trying to do. This is so basic yet so frustrating. I need to run a data parse after retrieving the array of objects from the first method call, but whether I call my method inside the first method or directly after it inside ngOnInit, it simply doesn't run. Any ideas?
ngOnInit() {
  this.getSiteContent(this.route.snapshot.params['id']);
  // Doesn't work
  this.addUpdatedPages();
}
// in use
getSiteContent(id) {
  this.http.get('/site-content/' + id).subscribe(data => {
    this.siteContent = data;
  });
  // Doesn't show..
  console.log('End of getSiteContent');
}
addUpdatedPages() {
  // Doesn't show
  console.log('Adding pages...');
  for (var i = 0; i < this.siteContent.length; i++) {
    this.checkNull(this.siteContent[i].SiteID, this.siteContent[i].SitePageID);
    console.log(this.nullCheck[0].SiteID);
    if (this.nullCheck.length > 0) {
      this.siteContent[i].SitePageContent = this.nullCheck[0].SitePageContent;
    }
  }
}
Everything points to an unhandled exception when you call this.http.get. Check your browser's console; it would show the error if there was one. One likely reason is that http was not injected or is undefined.
ngOnInit() {
  this.getSiteContent(this.route.snapshot.params['id']);
  // if the above throws an exception, anything below would not be called
  this.addUpdatedPages();
}
getSiteContent(id) {
  this.http.get('/site-content/' + id).subscribe(data => {
    this.siteContent = data;
  });
  // if the call above to this.http.get throws an exception, the code below would not be called
  console.log('End of getSiteContent');
}
That being said, addUpdatedPages should be called in the subscribe callback of http.get, because you want it to run after the data has been retrieved. Modify getSiteContent so that the call is moved into the callback passed to the observable's subscribe:
this.http.get('/site-content/' + id).subscribe(data => {
  this.siteContent = data;
  this.addUpdatedPages();
});
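The ordering problem can be reproduced without Angular at all. In this sketch, fakeGet is a hypothetical stand-in for this.http.get(...).subscribe(...): the callback fires on a later turn of the event loop, so anything placed after the call runs first.

```javascript
const order = [];

// Hypothetical async "HTTP call": the callback runs on a later tick.
function fakeGet(url, callback) {
  setImmediate(() => callback(['site', 'content']));
}

fakeGet('/site-content/1', (data) => {
  order.push('callback'); // runs second, once the "data" has arrived
});
order.push('after-call'); // runs first, before the data exists
```

This is why work that depends on the response, like addUpdatedPages, has to go inside the callback.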

Nodejs step through array and finish each step before moving on

I'm having trouble processing a queue that I've got stored in Redis.
Basically the queue in Redis is a simple array of IDs that I want to step through one by one.
My current code:
async.forEach(data, function(n, done) {
  redisClient.hgetall("visitor:" + n, function(err, visitor) {
    if (visitor != null) {
      agentOnlineCheck(visitor['userID'], function(online) {
        if (online == true) {
          console.log("We are done with this item, move on to the next");
        } else {
          console.log("We are done with this item, move on to the next");
        }
      });
    } else {
      console.log("We are done with this item, move on to the next");
    }
  });
}, function() {
  console.log("I want this to fire when all items in data are finished");
});
I use the async library, and the variable data above is an array such as:
['232323', '232423', '23232']
I want to loop through the array one ID at a time, not moving on to the next ID until the previous one has run through all its callbacks. Is this somehow possible?
You can use async.eachSeries instead of async.forEach; it waits for each item to finish before starting the next. Note that in either case you must call done() in every branch of your iterator, otherwise the iteration never advances and the final callback never fires.
c.f.: https://github.com/caolan/async#eachSeries
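For illustration, here is a minimal, self-contained sketch of the pattern eachSeries implements (this is not the library's actual code): each item's worker receives a done callback, and the next item is only started when done is invoked.

```javascript
// Simplified eachSeries: process items strictly one at a time.
function eachSeries(items, iterator, finished) {
  let i = 0;
  function next() {
    if (i >= items.length) return finished();
    iterator(items[i++], next); // the iterator must call done (next) exactly once
  }
  next();
}

const visited = [];
eachSeries(['232323', '232423', '23232'], (id, done) => {
  visited.push(id); // stand-in for the redis hgetall / agentOnlineCheck work
  done();           // signal "move on to the next ID"
}, () => visited.push('all-done'));
```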

how to make this function async in node.js

Here is the situation:
I am new to node.js, and I have a 40MB file containing a multilevel JSON document like:
[{},{},{}] - an array of objects (~7000). Each object has properties, and one of those properties is itself an array of objects.
I wrote a function to read the content of the file and iterate over it. I succeeded in getting what I wanted in terms of content, but not usability. I thought I had written an async function that would allow node to serve other web requests while iterating over the array, but that is not the case. I would be very thankful if anyone could point out what I've done wrong and how to rewrite it as a non-blocking iteration. Here's the function that handles the situation:
function getContents(callback) {
  fs.readFile(file, 'utf8', function (err, data) {
    if (err) {
      console.log('Error: ' + err);
      return;
    }
    js = JSON.parse(data);
    callback();
  });
}
getContents(iterateGlobalArr);
var count = 0;
function iterateGlobalArr() {
  if (count < js.length) {
    innerArr = js.nestedProp;
    // iterate nutrients
    innerArr.forEach(function(e, index) {
      // some simple if condition here
    });
    var schema = {
      // .....get props from forEach iteration
    };
    Model.create(schema, function(err, post) {
      if (err) {
        console.log('\ncreation error\n', err);
        return;
      }
      if (!post) {
        console.log('\nfailed to create post for schema:\n' + schema);
        return;
      }
    });
    count++;
    process.nextTick(iterateGlobalArr);
  } else {
    console.log("\nIteration finished");
    next();
  }
}
Just so it is clear how I've tested this: I open two tabs, one loading this iteration (which takes some time) and a second hitting another node route, which does not load until the iteration is over. So essentially I've written blocking code, but I'm not sure how to refactor it. I suspect that because everything happens in the callback, I am unable to release the event loop to handle another request...
Your code is almost correct. What you are doing is inadvertently adding ALL the items to the very next tick... which still blocks.
The important piece of code is here:
Model.create(schema, function(err, post) {
  if (err) {
    console.log('\ncreation error\n', err);
    return;
  }
  if (!post) {
    console.log('\nfailed to create post for schema:\n' + schema);
    return;
  }
});
// add EVERYTHING to the very same next tick!
count++;
process.nextTick(iterateGlobalArr);
Let's say you are in tick A of the event loop when getContents() runs and count is 0. You enter iterateGlobalArr and call Model.create. Because Model.create is async, it returns immediately, and process.nextTick() adds the processing of item 1 to the next tick, let's say B. Tick B then calls iterateGlobalArr, which does the same thing, adding item 2 to the next tick, which is still B. Then item 3, and so on.
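This scheduling behaviour can be observed directly. In the sketch below, fakeCreate is a hypothetical stand-in for Model.create whose callback completes in a later event-loop phase; because the increment and process.nextTick() sit outside that callback, every item gets scheduled before any creation callback runs.

```javascript
const events = [];

// Hypothetical stand-in for Model.create: completes in a later phase.
function fakeCreate(cb) {
  setImmediate(cb);
}

let count = 0;
function iterate() {
  if (count < 3) {
    const id = count;
    fakeCreate(() => events.push('created ' + id));
    events.push('scheduled ' + id);
    count++;
    process.nextTick(iterate); // queued before any fakeCreate callback fires
  }
}
iterate();
// Every 'scheduled N' entry lands before any 'created N' entry:
// the whole loop runs in the nextTick queue without yielding to I/O.
```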
What you need to do is move the count increment and process.nextTick() into the callback of Model.create(). This makes sure the current item is processed before nextTick is invoked... which means the next item is actually added to the next tick AFTER the model item has been created... which gives your app time to handle other things in between. The fixed version of iterateGlobalArr is here:
function iterateGlobalArr() {
  if (count < js.length) {
    innerArr = js.nestedProp;
    // iterate nutrients
    innerArr.forEach(function(e, index) {
      // some simple if condition here
    });
    var schema = {
      // .....get props from forEach iteration
    };
    Model.create(schema, function(err, post) {
      // schedule our next item to be processed immediately.
      count++;
      process.nextTick(iterateGlobalArr);
      // then move on to handling this result.
      if (err) {
        console.log('\ncreation error\n', err);
        return;
      }
      if (!post) {
        console.log('\nfailed to create post for schema:\n' + schema);
        return;
      }
    });
  } else {
    console.log("\nIteration finished");
    next();
  }
}
Note also that I would strongly suggest passing js and the counter into each call to iterateGlobalArr, as it will make iterateGlobalArr a lot easier to debug, among other things, but that's another story.
Cheers!
Node is single-threaded, so async will only help you if you are relying on another system/subsystem to do the work (a shell script, external database, web service, etc.). If you have to do the work in Node, you are going to block while you do it.
It is possible to create one node process per core. This solution would result in only blocking one of the node processes and leaving the rest to service your requests, but this feature is still listed as experimental: http://nodejs.org/api/cluster.html.
A single instance of Node runs in a single thread. To take advantage of multi-core systems the user will sometimes want to launch a cluster of Node processes to handle the load.
The cluster module allows you to easily create child processes that all share server ports.

Node JS best practices for static string mappers

I am running a Node JS app (Sails JS, but shouldn't matter) and MongoDB (also shouldn't matter).
I have a database model that the WebApp saves many instances of. For simplicity, let us take a UserActivityHistory model as an example (this is not my real case, so never mind the logic).
One of the model's fields is "ActivityType", which can be a medium-length text chosen from a static set. For example:
ActivityType may be one of the following:
"User added a new account".
"User changed account password".
So, assuming I am going to save thousands of instances, I prefer to save an activity type identifier/code for memory efficiency. For example:
Instead of saving "User added a new account", the WebApp should save "UNAC".
When the user wants to view the "UserActivityHistory", I want to have a static mapper (maybe global variable?) that is some kind of dictionary as follows:
["UNAC"] = "User added a new account"
["UCAP"] = "User changed account password"
That is, I receive the code of the activity and return the text from a static dictionary kept in the memory of the server running the application.
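The lookup itself can be sketched in a few lines using the question's example codes (the fallback-to-raw-code behaviour for unknown codes is my assumption):

```javascript
// Static in-memory mapping from activity code to display text.
const activityTypes = {
  UNAC: 'User added a new account',
  UCAP: 'User changed account password',
};

function activityText(code) {
  // Fall back to the raw code for unknown/unregistered activities.
  return activityTypes[code] || code;
}
```

The real question is where this object should live and how it gets populated and updated.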
What is the most efficient way and best practices to achieve this mapping?
The ways I thought of:
1) Global variables: keeping a global dictionary and adding/removing values from it via API calls. I am not sure this is a wise idea; I still do not see why not, but deep in my heart something tells me not to trust global variables.
2) Saving the mappings in another database: this requires another database call for each set of activities. (It doesn't seem so heavy if I load many activities at once, but there could still be a better solution.)
Note that I want to add activities dynamically, so a function with a switch case is not a proper solution.
If you want to avoid redeploying your app every time a string changes, then I would recommend pulling these from some persistent storage (e.g. an external file / DB).
Regardless of which you choose, given that they are application settings, you would pull them once, either as the app starts or on first access. This gives you the flexibility of changing them outside the app (or perhaps even from the app).
To answer your comments
If I pull them as the app starts on first access, where should I save them? global variable?
Well, the natural thing to do here would be to make it its own module; this gives you a good place to put other methods for manipulating the list and persisting it to storage. If you go for the singleton approach then you can load the resource on first access, e.g.
ActivityType.js
var fs = require('fs');

var cache = null;

function loadCache(callback) {
  if (!cache) {
    fs.readFile('/path-to-strings', function (err, data) {
      if (!err) {
        // for the purpose of the example, let's assume you are storing the
        // strings as a JSON object e.g. { 'key1': 'value1', 'key2': 'value2' }
        cache = JSON.parse(data);
        callback(null, cache);
      } else {
        callback(err, null);
      }
    });
  } else {
    callback(null, cache);
  }
}
module.exports = {
  get: function(key, callback) {
    loadCache(function(err, cache) {
      callback(err, (cache || {})[key]);
    });
  },
  add: function(key, value, callback) {
    loadCache(function(err, cache) {
      if (err) {
        if (callback) callback(err);
      } else {
        cache[key] = value;
        if (callback) callback(null);
      }
    });
  },
  remove: function(key, callback) {
    loadCache(function(err, cache) {
      if (err) {
        if (callback) callback(err);
      } else {
        delete cache[key];
        if (callback) callback(null);
      }
    });
  },
  save: function(callback) {
    loadCache(function(err, cache) {
      if (err) {
        if (callback) callback(err);
      } else {
        var data = JSON.stringify(cache, null, 4); // save formatted for easier editing
        fs.writeFile('/path-to-strings', data, callback);
      }
    });
  }
};
App.js
var ActivityType = require('./ActivityType');
...
ActivityType.get('UNAC', function(err, text) {
  if (!err) {
    console.log(text);
  } else {
    console.log('Unable to load string');
  }
});

ActivityType.add('UCAP', 'User changed account password');
ActivityType.remove('UNAC');

ActivityType.save(function(err) {
  if (!err) {
    console.log('Saved successfully!');
  } else {
    console.log('Unable to save strings');
  }
});
Alternatively, you could expose an explicit load or init method to call during app startup.
How expensive is the redeployment of the app?
That's not something I can answer, as there isn't enough information here; only you know how expensive a redeployment of your app would be. Regardless, surely updating a DB record or changing a resource file is far simpler than a redeployment?

different redirects during different promise fails causes Error: cannot call http.ServerResponse.end()

I have a chain of promises that could fail at different points. Depending on where it fails, I might want to do different things: in some places render a page, in others redirect. The problem I'm finding is that it runs through all the fail handlers and then errors with http.ServerResponse.end() being called numerous times.
Example:
Parse.Promise.as(1).then(function() {
  if (apples) {
    return apples.fetch().fail(function() { res.redirect('/somewhere'); });
  } else {
    return {};
  }
}).then(function() {
  // doing other stuff
}, function() {
  res.redirect('/elsewhere');
}).fail(function() {
  res.render('error.ejs');
});
Should I be doing this a different way?
(I start with as(1) just to get into the promise chain, since there are two different starting cases, apples and !apples, that both need to continue to the next part of the chain, but only one of them could start the chain since {} can't. Not sure if that's the best way to do that either.)
Just like exceptions, you can throw different errors and get to different places.
Parse.Promise.as(1).then(function() {
  if (apples) {
    return apples.fetch();
  } else {
    throw new Error("No Apples");
  }
}).then(function() {
  // doing other stuff
}).then(null, function(e) {
  if (e.message !== "No Apples") { // can also subclass Error
    res.render("error.ejs");
  } else {
    res.redirect("/elsewhere");
  }
});
Note, like I said in your other question: what you really want is usually .then(null, function() { rather than .fail. It's a poor name choice for .fail on their side, imo.
Because I like them, here is the synchronous analogy:
try {
  if (apples) {
    var a = apples.fetch(); // might throw too
  } else {
    throw new Error("No Apples");
  }
} catch (e) {
  if (e.message !== "No Apples") { // can also subclass Error
    res.render("error.ejs");
  } else {
    res.redirect("/elsewhere");
  }
}
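For comparison, the same error routing can be written with standard ES promises instead of Parse's promise library; the handle function and the stubbed res object below are illustrative assumptions, not part of the original code.

```javascript
// Route errors with native promises: throw to signal the "no apples"
// case, then branch on the error message in a single catch.
function handle(apples, res) {
  return Promise.resolve()
    .then(() => {
      if (apples) return apples.fetch();
      throw new Error('No Apples');
    })
    .then(() => { /* doing other stuff */ })
    .catch((e) => {
      if (e.message === 'No Apples') res.redirect('/elsewhere');
      else res.render('error.ejs');
    });
}

// Stubbed response object so the sketch is self-contained.
const calls = [];
const res = {
  redirect: (path) => calls.push('redirect:' + path),
  render: (view) => calls.push('render:' + view),
};

handle(null, res); // no apples, so exactly one redirect fires
```

Because every failure funnels into one catch, res is used exactly once per request, which avoids the repeated http.ServerResponse.end() calls.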
