Is it bad practice to perform HTTP requests in the main module? - node.js

I want to sync data from an external resource (file, server etc) into my DB every time the server starts.
app.js
const app = express();
app.use('/add', require('./addLayer'));
...
(async () => {
  const external_resource = await axios.get('...'); // get data
  await axios.post('http://localhost:3000/add', external_resource.data); // add it to db
})();
addLayer.js
const DBObject = require('../../models/DBObject')

const addNewLayer = async (req, res) => {
  try {
    const data = validateData(req.body)
    await DBObject.create(data)
  } catch (err) {
    res.status(400).send(err)
  }
}
I do not want to rewrite the code in the /add route (which includes data validation), but performing POST requests to my own server from here feels wrong for some reason. Is there a better way to do this?
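One way to avoid the self-POST (a minimal sketch; the addData helper name and the export shape are assumptions, not part of the original code) is to pull the validate-and-insert logic out into a plain function that both the route handler and the startup code call directly:

// addLayer.js (sketch)
const DBObject = require('../../models/DBObject')

// Shared logic: validate a payload and write it to the DB
async function addData(payload) {
  const data = validateData(payload)
  return DBObject.create(data)
}

// Express handler wrapping the same logic
async function addNewLayer(req, res) {
  try {
    await addData(req.body)
    res.sendStatus(201)
  } catch (err) {
    res.status(400).send(err)
  }
}

module.exports = { addData, addNewLayer }

// app.js (sketch)
const addLayer = require('./addLayer')
app.post('/add', addLayer.addNewLayer)

;(async () => {
  const external_resource = await axios.get('...') // get data
  await addLayer.addData(external_resource.data)   // insert directly, no HTTP round trip
})()

This keeps the validation in one place and removes the dependency on the server's own port being reachable at startup.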

TypeError: client.getAsync is not a function - redis

I want to use Redis as a cache for my database. I wrote the following code and it is giving me this error:
const offerData = await client.getAsync(offerID)
^ TypeError: client.getAsync is not a function
My current code:
const redis = require('redis')

// Connect to the Redis server
const client = redis.createClient()

client.on('connect', () => {
  console.log('[PASS]'.green + ' Redis Connected')
  require('bluebird').promisifyAll(redis)
})

router.get('/', async (req, res) => {
  const offerData = await client.getAsync(offerID)
  let link
  if (offerData) {
    // If the offer data is in Redis, parse it and assign it to the link variable
    link = JSON.parse(offerData)
  } else {
    // If the offer data is not in Redis, get it from the database and store it in Redis
    link = await Offers.findOne({ _id: offerID })
    if (link == null) return res.sendStatus(404)
    client.set(offerID, JSON.stringify(link))
  }
  // Do some
})
How can I fix this? I tried promisifyAll but had no luck.
getAsync is not a function of the redis module.
You can use the redis-promisify module, which offers this method.
Here is a sample reference for you below:
const redis = require('redis-promisify')

// Connect to the Redis server
const client = redis.createClient()

router.get('/', async (req, res) => {
  const offerData = await client.getAsync(offerID)
  // rest of the code
})
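Alternatively, if you want to stick with bluebird, the usual pattern with the pre-v4 redis client (a sketch; I'm assuming node_redis v2/v3 here) is to promisify the client prototypes once, before creating the client, rather than inside the connect handler:

const redis = require('redis')
const bluebird = require('bluebird')

// Adds *Async variants (getAsync, setAsync, ...) to every client instance
bluebird.promisifyAll(redis.RedisClient.prototype)
bluebird.promisifyAll(redis.Multi.prototype)

const client = redis.createClient()

router.get('/', async (req, res) => {
  const offerData = await client.getAsync(offerID)
  // rest of the code
})

Note that node-redis v4 and later returns promises natively, so there you would simply await client.get(offerID).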

Chaining async await calls in Node/Express with an external time limit

I'm building a Slackbot that makes a call to an Express app, which then needs to 1) fetch some other data from the Slack API, and 2) insert the resulting data into my database. I think I finally have the flow right using async/await, but the operation is timing out because the original call from the Slackbot needs to receive a response within some fixed time I can't control. It would be fine for my purposes to ping the bot with a response immediately and then execute the rest of the logic asynchronously. But I'm wondering about the best way to set this up.
My Express route looks like:
const express = require('express');
const router = express.Router();
const knex = require('../../db/knex.js');
const slack = require('../../services/slack_helpers');

// POST api/slack/foo
router.post('/foo', async (req, res) => {
  let [body, images] = await slack.grab_context(req);
  knex('texts')
    .insert({ body: body, image_ids: images })
    .then(text => { res.send('worked!'); }) // This sends a response back to the original Slackbot call
    .catch(err => { res.send(err); });
});

module.exports = router;
And then the slack_helpers module looks like:
const { WebClient } = require('@slack/web-api');
const Slack = new WebClient(process.env.SLACKBOT_TOKEN);

async function grab_context(req) {
  try {
    const context = await Slack.conversations.history({ // This is the part that takes too long
      channel: req.body.channel_id,
      latest: req.headers['X-Slack-Request-Timestamp'],
      inclusive: true,
      limit: 5
    });
  } catch (error) {
    return [error.toString(), 'error'];
  }
  return await parse_context(context);
};
function parse_context(context) {
  var body = [];
  context.messages.forEach(message => {
    body.push(message.text);
  });
  body = body.join(' \n');
  return [body, ''];
}

module.exports = {
  grab_context
};
I'm still getting my head around asynchronous programming, so I may be missing something obvious. I think something like res.send perhaps needs to come before the grab_context call? But again, I'm not sure of the best flow here.
Update
I've also tried this pattern in the API route, but still getting a timeout:
slack.grab_context(req).then((body, images) => {
  knex ...
})
Your timeout may not be coming from where you think. From what I see, it is coming from grab_context. Consider the following simplified version of grab_context
async function grab_context_simple() {
  try {
    const context = { hello: 'world' }
  } catch (error) {
    return [error.toString(), 'error']
  }
  return context
}

grab_context_simple() /* => Promise {
  <rejected> ReferenceError: context is not defined
  ...
} */
You are trying to return context outside of the try block where it was defined, so grab_context will reject with a ReferenceError. It's very likely that this error is being swallowed at the moment, so it would seem like it is timing out.
The fix is to move a single line in grab_context
async function grab_context(req) {
  try {
    const context = await Slack.conversations.history({
      channel: req.body.channel_id,
      latest: req.headers['X-Slack-Request-Timestamp'],
      inclusive: true,
      limit: 5
    });
    return await parse_context(context); // <- moved this
  } catch (error) {
    return [error.toString(), 'error'];
  }
};
"I'm wondering the best way to set this up."
You could add a higher-level try/catch block to handle errors that arise from the /foo route. You could also improve readability by staying consistent between async/await and promise chains. Below is how you could use async/await with knex, as well as the aforementioned try/catch block:
const express = require('express');
const router = express.Router();
const knex = require('../../db/knex.js');
const slack = require('../../services/slack_helpers');

const insertInto = table => payload => knex(table).insert(payload)

const onFooRequest = async (req, res) => {
  try {
    let [body, images] = await slack.grab_context(req);
    const text = await insertInto('texts')({
      body: body,
      image_ids: images,
    });
    res.send('worked!');
  } catch (err) {
    res.send(err);
  }
}

router.post('/foo', onFooRequest);

module.exports = router;
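If the Slack-imposed time limit still bites even after the fix, one option (a sketch; whether an immediate plain-text reply satisfies your Slackbot's expectations is an assumption) is to acknowledge the request right away and let the Slack fetch plus the insert run in the background:

const onFooRequest = async (req, res) => {
  // Acknowledge immediately so the original Slackbot call does not time out
  res.send('working on it...');

  try {
    const [body, images] = await slack.grab_context(req);
    await insertInto('texts')({ body: body, image_ids: images });
  } catch (err) {
    // The response has already been sent, so just log the failure
    console.error('background insert failed:', err);
  }
};

router.post('/foo', onFooRequest);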

How to get and pass json from model to controller?

I'm trying to get JSON data from a database (SQL) with Node.js and then pass it to app.get (Express.js), but without success.
I have two files: urls.js, which should get all urls from the database, and app.js, where I'm trying to create an API endpoint with Express.js. I have managed to get JSON data in app.js if I write the query there and run it, but I do not know how to separate it into two files.
Code that works in app.js:
app.get('/api/urls', (request, response) => {
  db.query('SELECT * FROM urls', (error, result) => {
    if (error) throw error;
    response.send(result);
  });
});
I've tried to separate it into two files, so in urls.js (model-like) I could have something like:
class Urls {
  async getUrls() {
    const sql = `select * from urls`;
    return await db.query(sql);
  }
}

module.exports = Urls;
and then call it in app.js (controller-like):
const data = new Urls();
app.get('/api/urls', (req, res) => {
  res.send(data.getUrls());
});
In both cases the result should be JSON.
Your getUrls function is async, so it will return a promise. Do something like this:
const data = new Urls();
app.get('/api/urls', (req, res) => {
  data.getUrls().then(response => {
    res.send(response);
  });
});
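Equivalently, with async/await (same idea as the promise chain above; the error handling is an addition of mine, not part of the original answer):

const data = new Urls();

app.get('/api/urls', async (req, res) => {
  try {
    const urls = await data.getUrls();
    res.send(urls);
  } catch (err) {
    res.status(500).send(err);
  }
});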

[Koa] 404 while passing through the router

I'm having some trouble with the Koa framework. I'm trying to build a pretty basic server, but I'm having a problem with my router. The ctx always returns 404 despite passing through my functions.
Some code:
// www.js
const Koa = require('koa');
const app = new Koa();
const version = require('./routes/version');

app.listen(config.port, () => {
  console.log('Server is listening on port ' + config.port);
});

app.use(version.routes());

app.use(ctx => {
  console.log('test')
});
// version.js
const Router = require('koa-router');
const router = new Router();

router.prefix('/version');

router.use((ctx, next) => {
  ctx.vFactory = new VersionFactory(ctx.app.db);
  next();
});

router.get('/', getAllVersions);

async function getAllVersions(ctx, next) {
  const ret = await ctx.vFactory.getAllVersions();
  ctx.body = JSON.stringify(ret.recordset);
  console.log(ctx.body)
  await next();
}
I've checked a few threads. Most of the time, the problem seems to come from a non-Promise-based function in the await part of the router function. Here is a simple DAO using mssql, which is pretty promise-based.
class DaoVersion {
  constructor(db) {
    this.pool = db;
  }

  async getAllVersions() {
    const me = this;
    return new Promise((resolve) => {
      const ret = me.pool.query(getVersion);
      resolve(ret);
    });
  }
}
The console output seems good. I have my ctx.body set with my DB data, but if I check the whole context, I still have a 404. More interesting: if I try ctx.res.write (using the default Node response) I get the "already end" message. So it seems Koa has sent the response before passing through my function.
Any idea why, and how I could correct that?
Koa's default response.status is 404, unlike Node's res.statusCode, which defaults to 200.
Koa changes the status code to 200 when your route sets a non-empty value on ctx.body; in some cases you can change it manually (for example, if you need to set it to 202) by using ctx.status = xxx.
You can use this documentation for reference: https://github.com/koajs/koa/blob/master/docs/api/response.md
Also, your route should be an async function:
router.get('/', async (ctx, next) => {
  // getAllVersions sets ctx.body and awaits next() itself
  await getAllVersions(ctx, next)
})
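One more thing worth checking (an observation about the posted version.js, not part of the original answer): the router.use middleware calls next() without awaiting or returning it, so Koa can finalize the still-404 response before the async route handler has set ctx.body. Awaiting it keeps Koa waiting for downstream async middleware:

router.use(async (ctx, next) => {
  ctx.vFactory = new VersionFactory(ctx.app.db);
  await next(); // await (or return) next() so Koa waits for the async route handler
});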

Bluebird promise resolve in express route

I have a simple REST application and I want to read the files in a directory and send them back to the frontend. Here's the code I'm using:
const fs = Promise.promisifyAll(require('fs'))
const router = require('express').Router()

router.get('/list', async (req, res, next) => {
  const files = await fs.readdirAsync('presentations')
  res.json(files)
})
The problem is: my frontend receives a 'Promise', but if I try to debug it, it shows me that files is an array.
I've tried not using async/await syntax, like this:
router.get('/list', (req, res, next) => {
  fs.readdirAsync('presentations')
    .then(files => {
      res.json(files)
    })
})
But the result was the same: the frontend still got a Promise.
UPD: The problem was with the frontend axios instance. It didn't resolve the promise, so awaiting the results solved the problem.
So, there are three parts. Reading, storing and sending.
Here's the reading part:
var fs = require('fs');
function readFiles(dirname, onFileContent, onError) {
  fs.readdir(dirname, function(err, filenames) {
    if (err) {
      onError(err);
      return;
    }
    filenames.forEach(function(filename) {
      fs.readFile(dirname + filename, 'utf-8', function(err, content) {
        if (err) {
          onError(err);
          return;
        }
        onFileContent(filename, content);
      });
    });
  });
}
Here's the storing part:
var data = {};
readFiles('dirname/', function(filename, content) {
  data[filename] = content;
}, function(err) {
  throw err;
});
The sending part is up to you. You may want to send them one by one or after reading completion.
If you want to send the files after reading completes, you should either use the sync versions of the fs functions or use promises. Async callbacks are not a good style here.
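For example, a promise-based version (a sketch using Node's built-in fs.promises API; the 'presentations' directory and the response shape are assumptions) that reads everything first and responds once all files are in:

const fsp = require('fs').promises
const router = require('express').Router()

router.get('/list', async (req, res, next) => {
  try {
    const filenames = await fsp.readdir('presentations')
    // Read all files in parallel, keeping filename/content pairs together
    const files = await Promise.all(
      filenames.map(async name => ({
        name,
        content: await fsp.readFile('presentations/' + name, 'utf-8'),
      }))
    )
    res.json(files)
  } catch (err) {
    next(err)
  }
})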
