Adding object to supertest - node.js

I'm trying to test one of my routes, which would usually expect an object on the request object (e.g. req.exampleData = { }).
I've tried looking for examples, but I've only found .set, which attaches values to the request headers.
Ideally, I would want something like:
await request(app)
  .get('/api/testRoute')
  .attach('exampleData', { })
Is such a thing possible for supertest?

Perhaps you can use .field, as in:
await request(app)
  .get('/api/testRoute')
  .field('exampleData', {})
On your server:
req.body.exampleData // {}
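For what it's worth, .field sends multipart form fields as strings, so an object would likely need to be serialized on the way in and parsed on the server, which also needs a multipart parser (e.g. multer) to populate req.body. A hedged sketch, with the field value and parser being assumptions:
// test side (sketch)
await request(app)
  .get('/api/testRoute')
  .field('exampleData', JSON.stringify({ foo: 'bar' }));

// server side, assuming a multipart parser such as multer().none() has filled req.body
const exampleData = JSON.parse(req.body.exampleData); // { foo: 'bar' }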

Related

How to Retrieve Data from Out of Axios Function to Add to Array (NEWBIE QUESTION)

I am working on building a blog API for a practice project, but am using the data from an external API. (There is no authorization required; I am using the JSON data with the permission of the developer.)
The idea is that the user can enter multiple topic parameters into my API. Then, I make individual requests to the external API for the requested info.
For each topic query, I would like to:
Get the appropriate data from the external API based on the params entered (using a GET request to the URL)
Add the response data to my own array that will be displayed at the end.
Check if each object already exists in the array (to avoid duplicates).
res.send the array.
I think my main problem has to do with understanding scope and also promises in Axios. I have tried to read up on the concept of promise-based requests, but I can't seem to understand how to apply it to my code.
I know my code is an overall mess, but if anybody could explain how I can extract the data from the Axios call, I think it could help me get the ball rolling again.
Sorry if this is a super low-level or obvious question - I am self-taught and am still very much a newbie! (my code is a pretty big mess right now haha)
Here is the bit of code I need to fix:
router.get('/:tag', function(req, res){
  const tagString = req.params.tag;
  const tagArray = tagString.split(',');

  const displayPosts = tagArray.map(function(topic){
    const baseUrl = "https://info.io/api/blog/posts";
    return axios
      .get(baseUrl, {
        params: {
          tag: topic
        }
      })
      .then(function(response) {
        const responseData = response.data.posts;
        if (topic === tagArray[0]){
          displayPosts.push(responseData);
        } else {
          responseData.forEach(function(post){
            // I will write a function to check if the post already exists in the array; else, add it to the array
          });
        } // End if/else
      })
      .catch(function(err) {
        console.log(err.message);
      }); // End Axios
  }); // End map function
  res.send(displayPosts);
});
Node.js is single-threaded and non-blocking, and as your code stands you respond with the result before you have fetched the data.
You are using .map, which fires off n requests.
Use Promise.all (or Promise.allSettled) to wait for all of those requests.
Then, inside the .then of Promise.all / Promise.allSettled, map over your results.
After that, respond to the user with the mapped data.
router.get('/:tag', function (req, res) {
  const tagString = req.params.tag;
  const tagArray = tagString.split(',');
  const baseUrl = "https://info.io/api/blog/posts";

  const topicsPromises = tagArray.map((topic) => {
    return axios.get(baseUrl, {
      params: {
        tag: topic
      }
    });
  });

  Promise.all(topicsPromises).then(topicsArr => {
    // all the data has been fetched successfully
    // loop through the array and handle your business logic for each topic
    // send the required data to the user using res.send()
  }).catch(err => {
    // error while fetching the data
  });
});
Your code will be something like the above.
Note: read up first on Promise.all and how it works.
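To make the last two steps concrete, here is a hedged sketch of what the Promise.all handler could look like, assuming each response has the response.data.posts shape from above and that posts can be deduplicated by a hypothetical post.id field:
Promise.all(topicsPromises).then(responses => {
  const seen = new Set();
  const displayPosts = [];
  responses.forEach(response => {
    response.data.posts.forEach(post => {
      if (!seen.has(post.id)) { // post.id is an assumption; use whatever uniquely identifies a post
        seen.add(post.id);
        displayPosts.push(post);
      }
    });
  });
  // Only send once, after every request has resolved.
  res.send(displayPosts);
}).catch(err => {
  res.status(500).send({ error: err.message });
});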

Nodejs proxy request coalescing

I'm running into an issue with my http-proxy-middleware setup. I'm using it to proxy requests to another service which, for example, might resize images.
The problem is that multiple clients might call the method multiple times and thus create a stampede on the original service. I'm now looking into a solution (what some services, e.g. Varnish, call request coalescing) that would call the service once, wait for the response, 'queue' incoming requests with the same signature until the first is done, and then answer them all in a single go... This is different from caching the results, in that I want to prevent calling the backend multiple times simultaneously, not necessarily cache the results.
I'm trying to find out whether something like this goes by a different name, or whether I'm missing something that others have already solved some way... but I can't find anything...
As the use case seems pretty 'basic' for a reverse-proxy type setup, I would have expected a lot of hits in my searches, but since the problem space is pretty generic I'm not getting anything...
Thanks!
A colleague of mine helped me hack together my own answer. It's currently used as an (Express) middleware for specific GET endpoints: it hashes the request into a map and starts a new, separate request to the backend. Concurrent incoming requests are hashed, checked against the map, and answered in the callback of that separate request, so the single backend call is reused. This also means that if the first response is particularly slow, all coalesced requests are too.
This seemed easier than hacking it into http-proxy-middleware, but oh well, it got the job done :)
const axios = require('axios');

// Map of request hash -> list of pending Express responses waiting on the same backend call
const responses = {};

module.exports = (req, res) => {
  const queryHash = `${req.path}/${JSON.stringify(req.query)}`;
  if (responses[queryHash]) {
    // A request with the same signature is already in flight; queue this response behind it.
    console.log('re-using request', queryHash);
    responses[queryHash].push(res);
    return;
  }
  console.log('new request', queryHash);
  const axiosConfig = {
    method: req.method,
    url: `[the original backend url]${req.path}`,
    params: req.query,
    headers: {}
  };
  if (req.headers.cookie) {
    axiosConfig.headers.Cookie = req.headers.cookie;
  }
  responses[queryHash] = [res];
  axios.request(axiosConfig).then((axiosRes) => {
    // Answer every coalesced request with the single backend response.
    responses[queryHash].forEach((coalescingRequest) => {
      coalescingRequest.json(axiosRes.data);
    });
    responses[queryHash] = undefined;
  }).catch((err) => {
    responses[queryHash].forEach((coalescingRequest) => {
      coalescingRequest.status(500).json(false);
    });
    responses[queryHash] = undefined;
  });
};
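For context, a hedged sketch of how such a middleware might be mounted in Express; the route path and module file name are assumptions:
const express = require('express');
const coalesce = require('./coalesce-middleware'); // the handler shown above (hypothetical file name)

const app = express();
// Only coalesce idempotent GET endpoints; everything else keeps its normal proxy handling.
app.get('/images/*', coalesce);
app.listen(3000);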

Sharing DB queries in Node.js methods

What's the best practice for sharing DB query code between multiple Node.js Express controller methods? I’ve searched but the samples I’ve found don’t really get into this.
For example, I have this getUser method (using Knex for MySQL) that makes a call to get user info. I want to use it in other methods but I don't need all the surrounding stuff like the response object.
export let getUser = (req: Request, res: Response, next: NextFunction) => {
  try {
    knex.select().where('email', req.params.email)
      .table('users')
      .then( (dbResults) => {
        const results: IUser = dbResults[0];
        res
          .status(200)
          .set({ 'Content-Type': 'application/json', 'Connection': 'close' })
          .send(results);
      });
  } catch (err) {
    res.send({ error: "Error getting person " + req.params.email });
    return next(err);
  }
};
It seems wrong to repeat the query code somewhere else where I need to get the user. Should I turn my DB query code into async functions like this example and then call them from within the controller methods that use the query? Is there a simpler way?
/**
 * @param {string} email
 */
async function getUserId(email: string) {
  try {
    return await knex.select('id')
      .where('email', email)
      .table('users');
  } catch (err) {
    return err;
  }
}
You can, for example, create "service" modules, which contain helpers for certain types of queries. Or you could use an ORM and implement special queries in each model, the so-called "fat model" design. Pretty much anything goes, as long as you remember not to create a new knex instance in every helper module; instead, pass knex (which holds its connection pool) to the helper methods so that all queries share the same connection pool.
ORMs like Objection.js also provide a way to extend the query builder API, so you can inherit a custom query builder with any special query helpers you need.
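A minimal sketch of the "service module" idea, assuming a hypothetical userService file; the point is that all callers share the one knex instance (and its pool) rather than creating their own:
// userService.js (hypothetical file name)
module.exports = function makeUserService(knex) {
  return {
    // resolves to the first matching user row, or undefined
    findByEmail: (email) =>
      knex.select().from('users').where('email', email).first(),
    findIdByEmail: (email) =>
      knex.select('id').from('users').where('email', email).first(),
  };
};

// in a controller (sketch)
// const users = makeUserService(knex);
// const user = await users.findByEmail(req.params.email);
// res.status(200).send(user);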

Sinon fake server not intercepting requests

I'm trying Sinon for the first time because of its fake server functionality, which lets me stub an API response. The test itself is written with Mocha.
However, the fake server doesn't seem to be intercepting the requests.
Code:
describe('when integrated', function() {
  var server;

  beforeEach(function() {
    server = sinon.createFakeServer();
  });

  afterEach(function() {
    server.restore();
  });

  it('can send a message to the notification service', function() {
    server.respondWith("POST", new RegExp('.*/api/notificationmanager/messages.*'),
      [200,
        { "Content-Type": "application/json" },
        '{ "messageId": 23561 }'
      ]);

    var messageOnly = new PushMessage(initMessageObj);
    var originalUrl = PushMessage.serverUrl;
    messageOnly.setServerAPI("http://a.fake.server/api/notificationmanager/messages");

    console.log("fake server is: ", server);
    messageOnly.notify()
      .then(function(response) {
        messageOnly.setServerAPI(originalUrl);
        return response;
      })
      .then(function(response) {
        response.should.be.above(0);
      });

    console.log(server.requests);
    server.respond();
  });
});
For reference, PushMessage is an object that has a static property serverUrl. I'm just setting the value to a fake URL & then resetting it.
The notify() function makes a POST request, using request-promise-native, to the serverUrl set in PushMessage's static property.
What seems to be happening is that the POST request ends up actually being attempted against http://a.fake.server/api/notificationmanager/messages, resulting in an error that the address doesn't exist...
Any idea what I'm doing wrong...? Thanks!
There have been several issues on the Sinon GitHub repository about this. Sinon's fake server:
Provides a fake implementation of XMLHttpRequest and provides several interfaces for manipulating objects created by it.
Also fakes native XMLHttpRequest and ActiveXObject (when available, and only for XMLHTTP progids). Helps with testing requests made with XHR.
Node doesn't make its requests through XMLHttpRequest, so Sinon's fake server never sees them and doesn't work for this use case. I wish it did too.
Here's an issue that breaks it down: https://github.com/sinonjs/sinon/issues/1049
Nock is a good alternative that works with Node: https://www.npmjs.com/package/nock
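A hedged sketch of the same test stubbed with nock instead; the path and response body mirror the Sinon example, and the PushMessage usage is carried over from the question:
const nock = require('nock');

describe('when integrated', function() {
  beforeEach(function() {
    nock('http://a.fake.server')
      .post('/api/notificationmanager/messages')
      .reply(200, { messageId: 23561 });
  });

  afterEach(function() {
    nock.cleanAll();
  });

  it('can send a message to the notification service', function() {
    var messageOnly = new PushMessage(initMessageObj);
    messageOnly.setServerAPI("http://a.fake.server/api/notificationmanager/messages");
    // Return the promise so Mocha waits for the assertion.
    return messageOnly.notify().then(function(response) {
      response.should.be.above(0);
    });
  });
});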

How to mock external service when testing a NodeJS API

I have a JSON API built with Koa which I am trying to cover with integration tests.
A simple test would look like this:
describe("GET: /users", function() {
it ("should respond", function (done) {
request(server)
.get('/api/users')
.expect(200, done);
});
});
Now the issue comes when the actions behind a controller - let's say saveUser at POST /users - use external resources. For instance, I need to validate the user's phone number.
My controller looks like this:
save: async function(ctx, next) {
  const userFromRequest = await parse(ctx);
  try {
    // validate data
    await ctx.repo.validate(userFromRequest);
    // validate mobile code
    await ctx.repo.validateSMSCode(
      userFromRequest.mobile_number_verification_token,
      userFromRequest.mobile_number.prefix + userFromRequest.mobile_number.number
    );
    const user = await ctx.repo.create(userFromRequest);
    return ctx.data(201, { user });
  } catch (e) {
    return ctx.error(422, e.message, e.meta);
  }
}
I was hoping to be able to mock ctx.repo on the request object, but I can't seem to get hold of it from the test, which means that my tests are actually hitting the phone number verification service.
Is there any way to avoid hitting that verification service?
Have you considered using a mocking library like https://github.com/mfncooper/mockery?
Typically, when writing tests that require external services, I mock the service client library module. For example, using Mocha:
const mockery = require('mockery');
const repo = require('your-repo-module');

before(function() {
  mockery.enable();
  repo.validateSMSCode = function() {...};
  mockery.registerMock('your-repo-module', repo);
});
This way, every time you require your-repo-module, the mocked module will be loaded rather than the original one. Until you disable the mock, obviously...
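One caveat that may apply here: mockery only affects require() calls made after the mock is registered, so the app under test is usually required inside the setup hook. A sketch, where '../app' is an assumed path:
let server;

before(function() {
  mockery.enable({ useCleanCache: true }); // drop previously cached modules so they pick up the mock
  mockery.registerMock('your-repo-module', repo);
  server = require('../app'); // require the app only after the mock is registered
});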
app.context is the prototype from which ctx is created. You may add additional properties to ctx by editing app.context. This is useful for adding properties or methods to ctx to be used across your entire app, which may be more performant (no middleware) and/or easier (fewer require()s), at the expense of relying more on ctx, which could be considered an anti-pattern.
app.context.someProp = "Some Value";

app.use(async (ctx) => {
  console.log(ctx.someProp);
});
For your sample, you re-define app.context.repo.validateSMSCode like this, assuming that you have the following setup lines in your test:
import app from '../app'
import supertest from 'supertest'

app.context.repo.validateSMSCode = async function(ctx, next) {
  // Your logic here.
};

const request = supertest.agent(app.listen())
After this, the re-defined app.context.repo.validateSMSCode that you provide in your test will be used instead of the original method.
https://github.com/koajs/koa/blob/v2.x/docs/api/index.md#appcontext
https://github.com/koajs/koa/issues/652
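Putting it together, a hedged sketch of what the POST /users test could look like with that override in place; the route, payload, and assertion are assumptions based on the controller shown above:
import app from '../app'
import supertest from 'supertest'

// Stub the external SMS verification so the test never hits the real service.
app.context.repo.validateSMSCode = async function() {
  return true; // assumed success value
};

const request = supertest.agent(app.listen());

describe('POST: /users', function() {
  it('creates a user without calling the SMS service', function() {
    return request
      .post('/api/users')
      .send({ /* user fields expected by your repo */ })
      .expect(201);
  });
});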

Resources