Variable query parameter for MongoDB + NodeJS + ExpressJS

What I'm trying to do is pass a parameter into a get() method (I am currently using Express as my middleware). Right now, I'm able to pick up which color is passed and check whether it's undefined or not. However, the query itself doesn't seem to work.
The color is defined by an option that the user selects (in this case, a link). It seems to pass the parameter fine - it's just the query that I can't seem to figure out.
Here's the code that I'm using right now:
router.get('/cards/cardlist', function(req, res) {
  console.log('get /cardlist');
  var reqColor = req.query.color;   // e.g. 'red' when the link passes ?color=red
  var query = {};
  query[reqColor] = 1;              // builds e.g. { red: 1 }
  var db = req.db;
  if (reqColor !== undefined) {
    // Only return cards that have the selected color field set to 1
    db.collection('creaturecards').find(query).toArray(function (err, items) {
      res.json(items);
    });
  } else {
    // No color selected: return every card
    db.collection('creaturecards').find().toArray(function (err, items) {
      res.json(items);
    });
  }
});
Basically, if I manually hard-code the query as (for example) { red: 1 }, then it picks up the red cards just fine.
For reference, not every record will have a color saved to it. If the color isn't set, then I just don't have it in the record - that seemed to be the best setup for this particular project.
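For what it's worth, here's a tiny sanity-check sketch of what I understand the dynamic key to be doing (the 'red' value is just an example stand-in for whatever req.query.color contains):

// The dynamically built query should end up identical to the hard-coded one.
var reqColor = 'red';            // stand-in for req.query.color

var query = {};
query[reqColor] = 1;             // -> { red: 1 }

var manual = { red: 1 };

console.log(query);                                            // { red: 1 }
console.log(JSON.stringify(query) === JSON.stringify(manual)); // true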
Thanks in advance!

Related

node-postgres why is req.query.record empty?

When I start my Node.js app (with Dust templates) on localhost:3000, the following code is used to grab all rows from my products table (Postgres DB) and display them on my index page. It seemed to work fine last week.
app.get('/', async (req, res) => {
  var parms = req.query.record;
  const results = await pool.query('SELECT * FROM products WHERE id = $1', [parms], function (err, result) {
    if (err) {
      return console.error('error running query', err);
    }
    res.render('index', { products: result.rows });
  });
});
For some reason now "req.query.record" is empty "[ ]" so no results are displayed on my index page.
If I navigate to localhost:3000/?record=3 it works fine and I can see the one record on my index page.
Also, if I edit the "results" in the original code and change [parms] to [3] then it works fine as well and displays that one record on my index page.
What would cause var parms = req.query.record; to return an empty object? Is that the expected behavior?
Okay, I think I have it. Bergi is correct (it is the request) and this was the expected behavior. I started from a tutorial posted before July 2017; the original pg.connect has since been hard-deprecated (see "pg.connect not a function?").
As a result I started hacking together something with the new pg.Pool(...).connect(...). The method described in my original question must have gotten mushed up with something else. As far as I know there is no need to process any query string from the index page when I am trying to return all rows from a table.
There is nothing after http://localhost:3000, which is why the result was an empty object. That is the expected behavior. http://localhost:3000/?record=3 did return the one row because req.query.record picked out [3] from the query string, which is also the expected behavior. The method in the original question would be used for returning one row from the table when the id is passed in the query string.
For my purposes, returning all rows, I replaced my app.get with...
app.get('/', async (req, res) => {
  await pool.query('SELECT * FROM library', function (err, result) {
    if (err) {
      return console.error('error running query', err);
    }
    res.render('index', { recipes: result.rows });
  });
});
If I wanted to use both methods, I guess I would first check whether there is a query string, run the original single-row query if there is, and otherwise run the second one to get all rows, roughly as in the sketch below.
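Something like this, using the promise form of pool.query (just a sketch, reusing the products table and index template from the original question):

app.get('/', async (req, res) => {
  try {
    let result;
    if (req.query.record) {
      // A record id was passed in the query string: return just that row.
      result = await pool.query('SELECT * FROM products WHERE id = $1', [req.query.record]);
    } else {
      // No query string: return every row.
      result = await pool.query('SELECT * FROM products');
    }
    res.render('index', { products: result.rows });
  } catch (err) {
    console.error('error running query', err);
    res.status(500).send('error running query');
  }
});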
I just started with NodeJS and Postgres but am not sure why I didn't notice that from the get-go. Not sure if this will be a useful question/answer for anyone but I'll leave it up.

Google Cloud Datastore, how to query for more results

Straight and simple, I have the following function, using the Google Cloud Datastore Node.js API:
fetchAll(query, result = [], queryCursor = null) {
  this.debug(`datastoreService.fetchAll, queryCursor=${queryCursor}`);
  if (queryCursor !== null) {
    query.start(queryCursor);
  }
  return this.datastore.runQuery(query)
    .then( (results) => {
      result = result.concat(results[0]);
      if (results[1].moreResults === _datastore.NO_MORE_RESULTS) {
        return result;
      } else {
        this.debug(`results[1] = `, results[1]);
        this.debug(`fetch next with queryCursor=${results[1].endCursor}`);
        return this.fetchAll(query, result, results[1].endCursor);
      }
    });
}
The Datastore API object is in the variable this.datastore;
The goal of this function is to fetch all results for a given query, notwithstanding any limits on the number of items returned per single runQuery call.
I have not yet found any definite hard limits imposed by the Datastore API on this, and the documentation seems somewhat opaque on this point, but I have noticed that I always get
results[1] = { moreResults: 'MORE_RESULTS_AFTER_LIMIT' },
indicating that there are still more results to be fetched, while results[1].endCursor remains stuck at a constant value that is passed on again on each iteration.
So, given some simple query that I plug into this function, I just go on running the query iteratively, setting the query start cursor (by doing query.start(queryCursor);) to the endCursor obtained in the result of the previous query. And my hope is, obviously, to obtain the next bunch of results on each successive query in this iteration. But I always get the same value for results[1].endCursor. My question is: Why?
Conceptually, I cannot see a difference from this example given in the Google documentation:
// By default, google-cloud-node will automatically paginate through all of
// the results that match a query. However, this sample implements manual
// pagination using limits and cursor tokens.
function runPageQuery (pageCursor) {
  let query = datastore.createQuery('Task')
    .limit(pageSize);

  if (pageCursor) {
    query = query.start(pageCursor);
  }

  return datastore.runQuery(query)
    .then((results) => {
      const entities = results[0];
      const info = results[1];

      if (info.moreResults !== Datastore.NO_MORE_RESULTS) {
        // If there are more results to retrieve, the end cursor is
        // automatically set on `info`. To get this value directly, access
        // the `endCursor` property.
        return runPageQuery(info.endCursor)
          .then((results) => {
            // Concatenate entities
            results[0] = entities.concat(results[0]);
            return results;
          });
      }

      return [entities, info];
    });
}
(except that I don't specify a limit on the size of the query result myself; I have also tried setting it to 1000, which does not change anything.)
Why does my code run into this infinite loop, stuck on each step at the same "endCursor"? And how do I correct this?
Also, what is the hard limit on the number of results obtained per call of datastore.runQuery()? I have not found this information in the Google Datastore documentation thus far.
Thanks.
Looking at the API documentation for the Node.js client library for Datastore, there is a section on that page titled "Paginating Records" that may help you. Here's a direct copy of the code snippet from that section:
var express = require('express');
var app = express();

var NUM_RESULTS_PER_PAGE = 15;

app.get('/contacts', function(req, res) {
  var query = datastore.createQuery('Contacts')
    .limit(NUM_RESULTS_PER_PAGE);

  if (req.query.nextPageCursor) {
    query.start(req.query.nextPageCursor);
  }

  datastore.runQuery(query, function(err, entities, info) {
    if (err) {
      // Error handling omitted.
      return;
    }

    // Respond to the front end with the contacts and the cursoring token
    // from the query we just ran.
    var frontEndResponse = {
      contacts: entities
    };

    // Check if more results may exist.
    if (info.moreResults !== datastore.NO_MORE_RESULTS) {
      frontEndResponse.nextPageCursor = info.endCursor;
    }

    res.render('contacts', frontEndResponse);
  });
});
Maybe you can try using one of the other syntax options (instead of Promises). The runQuery method can take a callback function as an argument, and that callback's parameters include explicit references to the entities array and the info object (which has the endCursor as a property).
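Applied to your fetchAll idea, a rough, untested sketch of the callback-based approach might look like this (fetchAllWithCallback is just a made-up name, and it assumes a datastore client in scope, as in the snippet above):

function fetchAllWithCallback(query, result, callback) {
  datastore.runQuery(query, function (err, entities, info) {
    if (err) {
      return callback(err);
    }

    result = result.concat(entities);

    if (info.moreResults === datastore.NO_MORE_RESULTS) {
      // Nothing left to page through.
      return callback(null, result);
    }

    // Start the next page from the cursor reported for this one.
    fetchAllWithCallback(query.start(info.endCursor), result, callback);
  });
}

// Usage sketch:
// fetchAllWithCallback(datastore.createQuery('Task').limit(1000), [], function (err, all) {
//   ...
// });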
And there are limits and quotas imposed on calls to the Datastore API as well. Here are links to official documentation that address them in detail:
Limits
Quotas

code explanation nodejs expressjs mongoose

I feel a bit embarrassed, but can you please kindly explain parts of this code?
For example, I have no idea what this part is. Where can I read more about it?
function parsePostStory(data) {
  return {
    name: data.name
  };
}
What is req.body? Is it the JSON request body?
Why do we declare empty array and why do we return it? Just for the clarity?
Is Story.create just a mongoose method?
The rest of the code is here:
router.post('/stories', function(req, res) {
  var validation = validatePostStory(req.body);
  if (validation.length > 0) {
    return res.badRequestError(validation);
  }

  var story = parsePostStory(req.body);

  Story.create(story, function(err, story) {
    if (err) {
      console.log(err.message);
      return res.internalServerError();
    }
    res.send(story);
  });
});
function validatePostStory(data) {
  var array = [];
  if (!data.name || typeof data.name !== 'String') {
    return array.push('name');
  }
  return array;
}
function parsePostStory(data) {
  return {
    name: data.name
  };
}
Sorry once more for that kind of a question and thanks a ton.
I'm assuming you know how the request-response cycle works for HTTP requests and how client-server interaction fits into it. If not, see the Wikipedia articles on Request-Response and Client-Server (two-link limit, otherwise I would have posted them as links).
A request sends a lot of information to the server. If you console.log the request in NodeJS, you will see that it contains a lot of information that isn't entirely relevant to what you need.
You're using Express as your web framework. In this case, req.body holds the information that the client sent to the server in the request body. Using req.body makes sure you're only working with the data you actually want, not all the extra information attached to the request. (Note: req.body isn't populated by Express 4 on its own; you'll have to use something like body-parser.) See the Express docs for more details.
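For reference, a minimal wiring sketch (assuming Express 4 with the body-parser package; this isn't from your project, just an illustration):

var express = require('express');
var bodyParser = require('body-parser');

var app = express();

// Populate req.body for JSON request bodies.
app.use(bodyParser.json());
// Populate req.body for traditional form posts (application/x-www-form-urlencoded).
app.use(bodyParser.urlencoded({ extended: false }));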
Now, let's break up this code a bit. You essentially have 3 separate functions. Let's take a look at validatePostStory.
function validatePostStory(data) {
  var array = [];
  if (!data.name || typeof data.name !== 'String') {
    return array.push('name');
  }
  return array;
}
This is a validation function. It takes one argument (an object) and returns an array. Effectively, it checks whether name is a string; if not, it returns an array that has a length of 1. The following conditional checks the length and returns a 400 if it is greater than 0:
if (validation.length > 0) {
  return res.badRequestError(validation);
}
I'm not entirely sure why this needs to be a separate function. Looks like you can probably just do this instead.
// note: typeof yields the lowercase 'string'
if (!req.body.name || typeof req.body.name !== 'string') {
  return res.badRequestError(['name']);
}
The following function essentially converts the data so that MongoDB/Mongoose can store it in the proper format:
function parsePostStory(data) {
  return {
    name: data.name
  };
}
It's the same as saying:
var story = {name: req.body.name}
Story.create is a standard Mongoose method, yes: Model.create validates and saves a new document and then passes it (or an error) to the callback.
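Roughly, the model and the call fit together like this (the schema fields are my own assumption, since the model definition isn't shown in the question):

var mongoose = require('mongoose');

// Hypothetical schema - the real one isn't shown in the question.
// (Assumes mongoose.connect(...) has been called elsewhere.)
var storySchema = new mongoose.Schema({
  name: { type: String, required: true }
});

var Story = mongoose.model('Story', storySchema);

// Model.create builds, validates, and saves the document in one call.
Story.create({ name: 'My first story' }, function (err, story) {
  if (err) {
    return console.error(err);
  }
  console.log('saved story with id', story._id);
});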

NodeJS readdir() function always being run twice

I've been trying to pick up NodeJS and learn more for backend development purposes. I can't seem to wrap my mind around async tasks, though, and I have an example here that I've spent hours trying to find a solution for.
app.get('/initialize_all_pictures', function(req, res) {
  var path = './images/';
  fs.readdir(path, function(err, items) {
    if (err) {
      console.log("there was an error");
      return;
    }
    console.log(items.length);
    for (var i = 0; i < items.length; i++) {
      var photo = new Photo(path + items[i], 0, 0, Math.floor(Math.random() * 1000));
      photoArray.push(photo);
    }
  });
  res.json({"Success" : "Done"});
});
Currently, I have this endpoint that is supposed to look through a directory called images, create "Photo" objects, and push them into a global array called photoArray. It works, except that the readdir callback is always being called twice.
console.log would always give output of
2
2
(I have two items in the directory).
Why is this?
Just figured out the problem.
I had a Chrome extension that helps me format JSON values from HTTP requests. Unfortunately, the extension actually made an additional call to the endpoint, so whenever I pointed my browser at the endpoint, the function ended up getting called twice!
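If anyone runs into something similar, a quick way to spot the extra request is to log every incoming request before the routes (just a small sketch):

// Log each incoming request so duplicate hits on an endpoint are easy to spot.
app.use(function (req, res, next) {
  console.log(new Date().toISOString(), req.method, req.originalUrl, req.headers['user-agent']);
  next();
});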

Limiting Number of Collection Records in Loopback Model

What I need to do is limit a particular collection (in my case logs) to 100 records. That is, when there are 100 logs and a new one is added, the oldest one is destroyed.
I know I can do this in mongo by setting the capped/size/max values, but I also want to have some code for more advanced query filters down the road.
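(Just for reference, the capped-collection version in the mongo shell would be roughly this; the size value is only a placeholder:)

// Caps the logs collection at 100 documents (and ~1 MB), dropping the oldest automatically.
db.createCollection('logs', { capped: true, size: 1048576, max: 100 })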
I'm looking to introduce this as an Operation Hook (http://docs.strongloop.com/display/public/LB/Operation+hooks#Operationhooks-access), but I can't figure out how to query the Model in question and remove the last record if a threshold has been exceeded. Right now I have just set up an access hook that checks whether the threshold has been met; if it has, it deletes the last record. This would ultimately be done in the "before create" hook, but doing it this way is easier for testing.
Here's some pseudocode (common/models/log.js):
module.exports = function (Log) {
  Log.observe('access', function logQuery(ctx, next) {
    var threshold = 10;
    var logs = Log.find({});
    if (logs.length > threshold) {
      logs[logs.length].delete // delete last record
    }
    next();
  });
};
This obviously doesn't work, just hoping it gives a clue to what I'm trying to do.
Thanks.
I think what you're looking for is actually a remote hook and then the PersistedModel's count() method:
module.exports = function(Log) {
  Log.afterRemote('create', function accessCount(ctx, instance, next) {
    console.log('AFTER CREATE of a new Log');
    Log.count(function(err, count) {
      if (err) {
        // let this one go maybe? if not, call: next(err);
      }
      console.log('There are now ' + count + ' Log records.');
      // do whatever you need to here
      next();
    });
  });
};
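To actually trim the collection in that "do whatever you need to here" spot, something along these lines might work (a sketch only; it orders by id to find the oldest record, so swap in a timestamp property if your model has one):

module.exports = function (Log) {
  var THRESHOLD = 100;

  Log.afterRemote('create', function trimOldLogs(ctx, instance, next) {
    Log.count(function (err, count) {
      if (err || count <= THRESHOLD) {
        return next(err);
      }
      // Find the single oldest record (ordering by id is an assumption here).
      Log.find({ order: 'id ASC', limit: 1 }, function (err, oldest) {
        if (err || oldest.length === 0) {
          return next(err);
        }
        // Remove it so the collection stays at the threshold.
        Log.destroyById(oldest[0].id, next);
      });
    });
  });
};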
