Azure Redis Cache cache.set() Not operable within Promise.then? - node.js

This one has had me spinning my tires for about 2 days now, I'm ready to reach out for help :)
I have a Google Firebase Functions app running as middleware for an Angular SPA. Hoping to avoid some of the pay-per-use cost of Azure SQL, I wanted to implement a caching option for the most common queries.
I thought I knew Redis; I've worked with it before. There's a simple enough example on the repo: https://www.npmjs.com/package//redis
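For reference, the basic usage from that package's README looks roughly like this (quoted from memory, so details may differ from the current docs):

import { createClient } from 'redis';

const client = createClient();
client.on('error', (err) => console.log('Redis Client Error', err));

await client.connect();
await client.set('key', 'value');
const value = await client.get('key');
await client.disconnect();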
Everything works fine if it is at the top level.
But the way my application is built, I need the ability to set a cache value from within the .then of a Promise, and when I try to do that, the whole operation just stops, with no identifiable error logging and no response from Redis. Even in Azure Insights I'm not getting much feedback, only that the 'set' operation isn't being counted in the metrics.
So, just to clarify, this works:
// "cache", the object, set globally
export const testCache = functions.https.onRequest(
    async (req: any, res) => {
        await cache.connect();
        var redisKey = 'testing_global';
        var result = await cache.get(redisKey);
        await cache.set(redisKey, 'testing new class')
        console.log("\nDone");
        cache.disconnect();
        res.send('done');
    })
But, this does not:
import { createClient } from 'redis';

const cache = createClient({
    url: "rediss://" + process.env.REDIS_HOST_NAME + ":6380",
    password: process.env.REDIS_KEY,
});

export const getValues = functions.https.onRequest(
    (req: any, response) => {
        cors(req, response, async () => {
            response.set('Access-Control-Allow-Origin', origin);
            var searchText = req.body['search'];
            var offset = req.body['offset'];
            var fetch = req.body['fetch'];
            var x = req.body['x'];
            var y = req.body['y'];
            var districts = req.body['districtFilter'];
            var sort = req.body['sort'];
            if (offset) {
                if (!/^\d+$/.test(offset))
                    throw new Error('bad number');
            } else {
                offset = 0;
            }
            if (fetch) {
                if (!/^\d+$/.test(fetch))
                    throw new Error('bad number');
            } else {
                fetch = 200;
            }
            if (x) {
                if (!/^-?\d+$/.test(x))
                    throw new Error('bad number');
            } else {
                x = null;
            }
            if (y) {
                if (!/^-?\d+$/.test(y))
                    throw new Error('bad number');
            } else {
                y = null;
            }
            if (districts && districts.length > 0) {
                districts = sanitizeStringArray(districts);
            } else {
                districts = null;
            }
            if (sort) {
                switch (sort) {
                    case "scoreAsc":
                    case "scoreDesc":
                    case "priceAsc":
                    case "priceDesc":
                        break;
                    default:
                        sort = '';
                }
            }
            var redisKey = `get_values_${searchText}_${offset}_${fetch}_${x}_${y}_${districts}_${sort}`;
            var cacheResult: any = null;
            await cache.connect();
            // because I want to end up in the 'else', for testing
            cacheResult = await cache.getFromCache('someOtherKey');
            if (null !== cacheResult) {
                response.send({
                    "status": "success",
                    "totalCount": cacheResult.totalCount,
                    "data": cacheResult.result
                });
                cache.disconnect();
            } else {
                var connection = new Connection(sqlConfig);
                var totalCount: number = 0;
                connection.on('connect', function (err: any) {
                    // If no error, then good to proceed.
                    console.log("Connected");
                    var sql = `EXEC SomeSPC;`;
                    const sqlRequest = new Request(sql, function (err: any) {
                        if (err) {
                            console.log(err);
                        }
                    });
                    const countRequest = new Request(
                        `EXEC SomeOtherSPC;`
                        , function (err: any) {
                            if (err) {
                                console.log(err);
                            }
                        }
                    )
                    sqlRequest.connection = connection;
                    countRequest.connection = connection;
                    var result: any[] = [];
                    sqlRequest.on('row', function (columns: any[]) {
                        var rowResult: any = {};
                        columns.forEach(function (column: any) {
                            rowResult[column['metadata']['colName']] = column['value'];
                        });
                        result.push(rowResult);
                    });
                    sqlRequest.on("requestCompleted", function (rowCount: any, more: any) {
                        console.log(rowCount + ' rows returned');
                        connection.execSql(countRequest);
                        countRequest.on('row', function (columns: any[]) {
                            totalCount = columns[0]['value'];
                        });
                        countRequest.on('requestCompleted', async function (rowCount: any, more: any) {
                            connection.close();
                            cacheResult = {
                                totalCount: totalCount
                                , result: result
                            };
                            // ******************************************************************
                            // Does Not Work, Just Fails, Without Much to Go On
                            await cache.set(redisKey, 'in Promise')
                            response.send({
                                "status": "success",
                                "totalCount": totalCount,
                                "data": result
                            });
                        })
                    });
                    connection.execSql(sqlRequest);
                });
                connection.on('infoMessage', infoError);
                connection.on('errorMessage', infoError);
                connection.on('end', end);
                connection.on('debug', debug);
                connection.connect();
                console.log("Reading rows from the Table...");
            }
        })
    }
)
There's **(update: no longer)** a fair amount of pseudo-code here, so please **(update: no need to)** disregard any inconsistent lines. I went ahead and put in the full function, including all the fluff, since trimming the fat seems to make it harder for others to understand what is being asked.
The SQL stuff all works; if I take out the cache.set() everything is fine. But that one line, in the result of the Promise, just fails, and I can't figure out why.
I've tried using the cache locally and globally, extracting the cache operations to a function and then to a separate class, and in all cases I'm getting the same result.
Is there a known reason this wouldn't work?

As far as I can understand (given you didn't provide a reproducible example), your code binds the requestCompleted handler for the second request (countRequest) only after it runs execSql(countRequest). I would suggest moving that binding block before the execSql call, otherwise the event may fire before the handler is attached and be missed.
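Concretely, that would mean attaching both countRequest handlers before starting the second query, along these lines (a sketch against the code in the question, keeping its names):

sqlRequest.on("requestCompleted", function (rowCount: any, more: any) {
    console.log(rowCount + ' rows returned');

    // Attach both handlers first...
    countRequest.on('row', function (columns: any[]) {
        totalCount = columns[0]['value'];
    });
    countRequest.on('requestCompleted', async function (rowCount: any, more: any) {
        connection.close();
        await cache.set(redisKey, 'in Promise');
        response.send({ "status": "success", "totalCount": totalCount, "data": result });
    });

    // ...then kick off the second request.
    connection.execSql(countRequest);
});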

Related

How to connect to Mongodb reliably in a serverless setup?

Eight out of ten times everything connects well. That said, I sometimes get a 'MongoClient must be connected before calling MongoClient.prototype.db' error. How should I change my code so it works reliably (100%)?
I tried a code snippet from one of the creators of the Zeit Now platform.
My handler:
const { send } = require('micro');
const { handleErrors } = require('../../../lib/errors');
const cors = require('../../../lib/cors')();
const qs = require('micro-query');
const mongo = require('../../../lib/mongo');
const { ObjectId } = require('mongodb');

const handler = async (req, res) => {
  let { limit = 5 } = qs(req);
  limit = parseInt(limit);
  limit = limit > 10 ? 10 : limit;
  const db = await mongo();
  const games = await db
    .collection('games_v3')
    .aggregate([
      {
        $match: {
          removed: { $ne: true }
        }
      },
      { $sample: { size: limit } }
    ])
    .toArray();
  send(res, 200, games);
};

module.exports = handleErrors(cors(handler));
My mongo script that reuses the connection in case the lambda is still warm:
// Based on: https://spectrum.chat/zeit/now/now-2-0-connect-to-database-on-every-function-invocation~e25b9e64-6271-4e15-822a-ddde047fa43d?m=MTU0NDkxODA3NDExMg==
const MongoClient = require('mongodb').MongoClient;

if (!process.env.MONGODB_URI) {
  throw new Error('Missing env MONGODB_URI');
}

let client = null;

module.exports = function getDb(fn) {
  if (client && !client.isConnected) {
    client = null;
    console.log('[mongo] client discard');
  }
  if (client === null) {
    client = new MongoClient(process.env.MONGODB_URI, {
      useNewUrlParser: true
    });
    console.log('[mongo] client init');
  } else if (client.isConnected) {
    console.log('[mongo] client connected, quick return');
    return client.db(process.env.MONGO_DB_NAME);
  }
  return new Promise((resolve, reject) => {
    client.connect(err => {
      if (err) {
        client = null;
        console.error('[mongo] client err', err);
        return reject(err);
      }
      console.log('[mongo] connected');
      resolve(client.db(process.env.MONGO_DB_NAME));
    });
  });
};
I need my handler to be 100% reliable.
if (client && !client.isConnected) {
  client = null;
  console.log('[mongo] client discard');
}
This code can cause problems! Even though you're setting client to null, that old client still exists: it will keep connecting to Mongo, it will not be garbage collected, and its connect callback will still run; by then, though, client will refer to the next client that was created, which is not necessarily connected.
A common pattern for this kind of code is to only ever return a single promise from the getDB call:
let clientP = null;

function getDb(fn) {
  if (clientP) return clientP;
  clientP = new Promise((resolve, reject) => {
    const client = new MongoClient(process.env.MONGODB_URI, {
      useNewUrlParser: true
    });
    client.connect(err => {
      if (err) {
        console.error('[mongo] client err', err);
        return reject(err);
      }
      console.log('[mongo] connected');
      resolve(client.db(process.env.MONGO_DB_NAME));
    });
  });
  return clientP;
}
I had the same issue. In my case it was caused by calling getDb() before a previous getDb() call had returned. In this case, I believe 'client.isConnected' returns true even though it is still connecting.
This was caused by forgetting to put an 'await' before the getDb() call in one location. I tracked down which one by logging a call stack from getDb using:
console.log(new Error().stack);
I don't see the same issue in the sample code in the question, though it could be triggered by another bit of code that isn't shown.
I have written this article talking about serverless, Lambda and DB connections. There are some good concepts which could help you find the root cause of your problem. There are also examples and use cases of how to mitigate connection pool issues.
Just by looking at your code I can tell it is missing this:
context.callbackWaitsForEmptyEventLoop = false;
Serverless: Dynamodb x Mongodb x Aurora serverless
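For context, that flag lives on the AWS Lambda context object and is set inside the handler before any awaited work; a minimal sketch (the handler shape and the getDb helper here are illustrative, not from the question):

module.exports.handler = async (event, context) => {
  // Don't wait for open sockets (e.g. a cached MongoDB connection) to close
  // before freezing the container and returning the response.
  context.callbackWaitsForEmptyEventLoop = false;

  const db = await getDb();
  const games = await db.collection('games_v3').find({}).limit(5).toArray();
  return { statusCode: 200, body: JSON.stringify(games) };
};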

Why does this firebase function run recursively?

I'm guessing this is related to not understanding promises and execution order, but I'm currently stumped as to why this Firebase Function (repackaged Google Cloud Functions) code runs recursively.
Currently the function executes once successfully (fetches data, writes database entry, writes file in storage), and then repeats every 15-30 seconds until it reaches the '402' error state. It is intended to only execute once.
Any help would be appreciated.
exports.add = functions.https.onRequest((req, res) => {
  cors(req, res, () => {
    if (req.query.idToken) {
      // there's a query param
      var idToken = req.query.idToken;
      admin.auth().verifyIdToken(idToken)
        .then(function(decodedToken) {
          var uid = decodedToken.uid;
          var userRef = database.ref('users/' + uid);
          var feedCountRef = database.ref('users/' + uid).child('feeds');
          var plansRef = database.ref('plans')
          userRef.once('value', function(snapshot){
            var feedsCount = snapshot.val().feeds;
            var currentPlan = snapshot.val().membership;
            var planRef = database.ref('plans/' + currentPlan);
            planRef.once('value', function(snapshot) {
              console.log(snapshot.val());
              var allowedFeeds = snapshot.val().feeds;
              if (feedsCount < allowedFeeds) {
                fetchFeed(req.body.feedSource, function(feedData) {
                  var defaultFeedName = 'Untitled';
                  var defaultUpdateFrequency = 'Weekly';
                  var feedsdatabaseRef = database.ref('feeds/' + uid);
                  var newFeedDatabaseRef = feedsdatabaseRef.push();
                  var feedKey = newFeedDatabaseRef.key;
                  writeFeedStorage(feedKey, feedData, function(response) {
                    console.log(response);
                    newFeedDatabaseRef.set({
                      // write data
                    })
                  });
                  feedCountRef.transaction(function(feeds){
                    return (feeds || 0) + 1;
                  });
                  return;
                });
              } else {
                console.log('over quota');
                res.status(402).send({error: 'You are at the maximum number of feeds your plan allows.'});
              }
            });
          })
        }).catch(function(error) {
          res.status(401);
        });
    } else {
      res.status(401);
    }
  })
})
From your code snippet, a potential reason it keeps running repeatedly is that you are not returning an OK status when things work out correctly, e.g.
res.status(200).send('ok');
According to the Firebase documentation, this is something you should be doing for HTTP Functions.
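For example, after the feed is written and the counter transaction completes, you would terminate the function with a success response; a sketch (assuming the write succeeded, with error handling omitted):

feedCountRef.transaction(function(feeds) {
  return (feeds || 0) + 1;
}).then(function() {
  // Tell the platform the request is finished so it doesn't hang or retry.
  res.status(200).send('ok');
});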

How to run promises in a loop?

I am trying to write a cron function in Node.js which fetches the user_ids of all the users from the DB, and then I want to iterate through each user_id.
Here is my code:
cron.schedule('43 11 * * *', function(){
  var now = moment()
  var formatted = now.format('YYYY-MM-DD HH:mm:ss')
  console.log('Starting the cron boss!');
  var dbSelectPromise = function(db, sql1) {
    return new Promise(function(resolve, reject) {
      db.select(sql1, function(err, data) {
        if (err) {
          reject(err)
        } else {
          resolve(data)
        }
      })
    })
  }
  var users = []
  var sql = "select distinct(user_id) from user_level_task"
  dbSelectPromise(db, sql).then(function(secondResult){
    for (i = 0; i < secondResult.length; i++) {
      var sql1 = "select max(level_id) as level from user_level_task where user_id =" + secondResult[i].user_id
      dbSelectPromise(db, sql1).then(function(thirdResult){
        console.log(thirdResult)
        console.log(current)
        var sql2 = "select task_id form user_level_task where user_id = '" + secondResult[i].user_id + "' and level_id = '" + thirdResult[0].level + "' "
        dbSelectPromise(db, sql2).then(function(fourthResult){
          var leng = fourthResult.length
          for (i = 0; i < leng; i++) {
            console.log(fourthResult[i])
          }
        })
      })
    }
  })
});
The problem I am facing is that I cannot access the value of i in the third and fourth promises. Please help!
I think what's happening is that i is no longer the same when you create those new promises because the for loop is still running. It appears that what you really need is the user_id and level_id. I suggest you restructure your code a bit to reduce nesting and pass on the values you need for future promises.
Perhaps something similar to this:
dbSelectPromise(db, sql)
  .then(secondResult => {
    const levelPromises = [];
    secondResult.forEach(res => {
      levelPromises.push(getLevelByUserId(res.user_id, db));
    });
    return Promise.all(levelPromises); // Promise.all only if you want to handle all success cases
  })
  .then(results => {
    results.forEach(level => {
      const { userId, result } = level;
      // ...
    });
    // ...
  })
  .catch(err => {
    console.log(err);
  });

function getLevelByUserId(userId, db) {
  const query = `select max(level_id) as level from user_level_task where user_id = ${userId}`;
  return dbSelectPromise(db, query).then(result => ({ userId, result }));
}
It creates an array of all the get-level queries as promises and then passes them along to the next step using Promise.all(), which will only resolve if all queries were successful. At that point, you will have access to the userId of each result again, because we returned it from the new function, ready for your next set of queries.
I think you should abstract your queries a bit further instead of using a generic dbSelectPromise, and don't forget to .catch() at the end, otherwise you won't know what's happening.
Note: this assumes your db variable is instantiated properly and that your original db.select doesn't need to be returned, depending on whatever library you're using. There's also some new syntax in there.
The problem I am facing is that I cannot access the value of i in the third and fourth promises. Please help!
This is because you're reinitializing i without using let. By the time those promises resolve, the loop has moved on, so the value is different from what you expect.
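As a minimal illustration of that closure behaviour (separate from the restructuring below), an index declared with let gets a fresh binding per iteration, so callbacks created in the loop see the value they were created with:

for (var i = 0; i < 3; i++) {
  setTimeout(function () { console.log('var:', i); }, 10); // logs 3, 3, 3
}
for (let j = 0; j < 3; j++) {
  setTimeout(function () { console.log('let:', j); }, 10); // logs 0, 1, 2
}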
Each promise is dependent on the other and needs to run synchronously.
For this to work, you need to chain promises. Also, you can make use of Promise.all() to execute a batch of promises at once. Remember, Promise.all() is all or nothing.
Making those changes to your code, I get the following structure.
'use strict';

let _ = require('lodash');

function dbSelectPromise(db, sql1) {
  return new Promise((resolve, reject) => {
    return db.select(sql1, (err, data) => {
      if (err) {
        return reject(err);
      }
      return resolve(data);
    });
  });
}

function process(user) {
  let sql1 = "select max(level_id) as level from user_level_task where user_id = " + user.user_id;
  return dbSelectPromise(db, sql1).then(function (thirdResult) {
    console.log(thirdResult);
    let sql2 = "select task_id from user_level_task where user_id = '" + user.user_id + "' and level_id = '" + thirdResult[0].level + "' ";
    return dbSelectPromise(db, sql2);
  });
}

function getUsers() {
  let sql = "select distinct(user_id) from user_level_task";
  return dbSelectPromise(db, sql).then((users) => {
    return users;
  }).catch(() => {
    return [];
  });
}

cron.schedule('43 11 * * *', function () {
  var now = moment();
  var formatted = now.format('YYYY-MM-DD HH:mm:ss');
  getUsers().then((users) => {
    let batches = _.map(users, function (user) {
      return process(user);
    });
    return Promise.all(batches);
  }).then((fourthResult) => {
    console.log('Your fourthResult [[],..]', fourthResult);
  }).catch((err) => {
    console.log('err while processing', err);
  });
});

Nodejs crashes when database connection fails

While my database server is not available and any function of my node-express REST service (like hiExpress) is called, Node.js crashes the server and the Node console reports
sql server connection closed
I do not want this to happen: either it should go to the err function, or at least it must be caught by the catch block. What could I do to avoid crashing Node.js when the database server is not available? I am using the following code, which works absolutely fine as long as the database server is available.
var sqlServer = require('seriate');

app.get('/hiExpress', function(req, res)
{
  var sr = {error: '', message: ''};
  var sql = 'select * from table1 where id=? and name=?';
  var params = {id: 5, name: 'sami'};
  exeDB(res, sr, sql, params); //sent only 4 parameters (not 6)
});

function exeDB(res, sr, sql, params, callback, multiple) {
  try {
    var obj = {};
    for (p in params) {
      if (params.hasOwnProperty(p)) {
        obj[p] = {
          type: sqlServer.VARCHAR,
          val: params[p]
        };
      }
    };
    var exeOptions = {
      query: sql,
      params: obj
    };
    if (multiple) {
      exeOptions.multiple = true;
    }
    sqlServer.execute(sqlServerConfigObject, exeOptions).then(function (results) {
      sr.data = results;
      if (callback)
        callback(sr);
      else
        res.json(sr); //produces result when success
    }, function (err) {
      //sr.message = sql;
      console.log(11);
      sr.error = err.message;
      res.json(sr);
    });
  }
  catch (ex) {
    console.log(21);
    sr.error = ex.message;
    res.json(sr);
  }
}
Why I preferred to use seriate: I had not been very comfortable with node-SQL, especially when it came to the multiple-queries option, even without using a transaction. Seriate also makes parameterized queries easy.
You can use a transaction without seriate, but with async, like below:
async.series([
  function(callback) { db.run('begin transaction', callback) },
  function(callback) { db.run( ..., callback) },
  function(callback) { db.run( ..., callback) },
  function(callback) { db.run( ..., callback) },
  function(callback) { db.run('commit transaction', callback) },
], function(err, results){
  if (err) {
    db.run('rollback transaction');
    return console.log(err);
  }
  // if some queries return rows then results[query-no] contains them
})
The code is very dirty. Passing the req and res params to the DB layer is not a good idea.
Try changing exeDB. I'm not sure, but you probably don't set an error catcher on the promise:
function exeDB(res, sr, sql, params, callback, multiple) {
  // It will execute with no error, no doubt
  var obj = {};
  for (p in params) {
    if (params.hasOwnProperty(p)) {
      obj[p] = {
        type: sqlServer.VARCHAR,
        val: params[p]
      };
    }
  };
  var exeOptions = {
    query: sql,
    params: obj
  };
  if (multiple) {
    exeOptions.multiple = true;
  }
  // Potential problem is here.
  // Catch is useless because code below is asynchronous.
  sqlServer.execute(sqlServerConfigObject, exeOptions).then(function (results) {
    sr.data = results;
    if (callback)
      callback(sr);
    else
      res.json(sr); //produces result when success
  }).error(function(err){ // !!! You must provide on-error
    console.log(err);
  });
}
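An alternative, if you prefer keeping the try/catch style from the question, is to make exeDB async and await the promise, so a failed query lands in the catch block instead of becoming an unhandled rejection (a sketch under that assumption, not tested against seriate specifically):

async function exeDB(res, sr, sql, params, callback, multiple) {
  try {
    var obj = {};
    for (var p in params) {
      if (params.hasOwnProperty(p)) {
        obj[p] = { type: sqlServer.VARCHAR, val: params[p] };
      }
    }
    var exeOptions = { query: sql, params: obj };
    if (multiple) {
      exeOptions.multiple = true;
    }
    // Awaiting here means a rejected promise is caught below,
    // rather than crashing the process.
    var results = await sqlServer.execute(sqlServerConfigObject, exeOptions);
    sr.data = results;
    if (callback) callback(sr);
    else res.json(sr);
  } catch (err) {
    sr.error = err.message;
    res.json(sr);
  }
}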

Trying to make my own RxJs observable

I'm trying to convert an existing API to work with RxJS... fairly new to node, and very new to RxJs, so please bear with me.
I have an existing API (getNextMessage) that either blocks (asynchronously) or returns a new item or an error via a node-style (err, val) callback, when something becomes available.
so it looks something like:
getNextMessage(nodeStyleCompletionCallback);
You could think of getNextMessage like an http request, that completes in the future, when the server responds, but you do need to call getNextMessage again, once a message is received, to keep getting new items from the server.
So, in order to make it into an observable collection, I have to get RxJS to keep calling my getNextMessage function until the subscriber is disposed.
Basically, I'm trying to create my own RxJs observable collection.
The problems are:
I don't know how to make subscriber.dispose() kill the async.forever
I probably shouldn't be using async.forever in the first place
I'm not sure I should even be getting 'completed' for each message - shouldn't that come at the end of a sequence?
I'd like to eventually remove the need for using fromNodeCallback, to have a first class RxJS observable
Clearly I'm a little confused.
Would love a bit of help, thanks!
Here is my existing code:
var Rx = require('rx');
var port = require('../lib/port');
var async = require('async');

function observableReceive(portName)
{
  var observerCallback;
  var listenPort = new port(portName);
  var disposed = false;

  var asyncReceive = function(asyncCallback)
  {
    listenPort.getNextMessage(
      function(error, json)
      {
        observerCallback(error, json);
        if (!disposed)
          setImmediate(asyncCallback);
      }
    );
  }

  return function(outerCallback)
  {
    observerCallback = outerCallback;
    async.forever(asyncReceive);
  }
}

var receive = Rx.Observable.fromNodeCallback(observableReceive('rxtest'));
var source = receive();

var subscription = source.forEach(
  function (json)
  {
    console.log('receive completed: ' + JSON.stringify(json));
  },
  function (error) {
    console.log("receive failed: " + error.toString());
  },
  function () {
    console.log('Completed');
    subscription.dispose();
  }
);
So here's probably what I would do.
var Rx = require('Rx');

// This is just for kicks. You have your own getNextMessage to use. ;)
var getNextMessage = (function(){
  var i = 1;
  return function (callback) {
    setTimeout(function () {
      if (i > 10) {
        callback("lawdy lawd it's ova' ten, ya'll.");
      } else {
        callback(undefined, i++);
      }
    }, 5);
  };
}());

// This just makes an observable version of getNextMessage.
var nextMessageAsObservable = Rx.Observable.create(function (o) {
  getNextMessage(function (err, val) {
    if (err) {
      o.onError(err);
    } else {
      o.onNext(val);
      o.onCompleted();
    }
  });
});

// This repeats the call to getNextMessage as many times (11) as you want.
// "take" will cancel the subscription after receiving 11 items.
nextMessageAsObservable
  .repeat()
  .take(11)
  .subscribe(
    function (x) { console.log('next', x); },
    function (err) { console.log('error', err); },
    function () { console.log('done'); }
  );
I realize this is over a year old, but I think a better solution for this would be to make use of recursive scheduling instead:
Rx.Observable.forever = function(next, scheduler) {
  scheduler = scheduler || Rx.Scheduler.default;
  // Internally wrap the callback into an observable
  next = Rx.Observable.fromNodeCallback(next);

  return Rx.Observable.create(function(observer) {
    var disposable = new Rx.SingleAssignmentDisposable(),
        hasState = false;

    disposable.setDisposable(scheduler.scheduleRecursiveWithState(null,
      function(state, self) {
        hasState && observer.onNext(state);
        hasState = false;
        next().subscribe(function(x){
          hasState = true;
          self(x);
        }, observer.onError.bind(observer));
      }));

    return disposable;
  });
};
The idea here is that you can schedule a new item once the previous one has completed. You call next(), which invokes the passed-in method, and when it returns a value, you schedule the next item for invocation.
You can then use it like so:
Rx.Observable.forever(getNextMessage)
  .take(11)
  .subscribe(function(message) {
    console.log(message);
  });
See a working example here
