How to write API request tests without launching a server? - node.js

It is possible to launch a server on a port and test it with the "supertest" library.
I'm wondering if it's possible to do the same without running a server?
An Express app, just like a Fastify app or others, receives the request and response arguments from native Node under the hood, like this:
const server = http.createServer((req, res) => {
  res.end('hello\n');
});
So it should be possible to call that callback directly from tests.
How to get that callback? Is there any toolkit to help with constructing request and response objects for the callback?
The reason is just to make tests faster: since I don't want to test the HTTP protocol and Node internals, I'd like to save time by skipping them.

Fastify has this feature out of the box:
'use strict'

const { test } = require('tap')
const Fastify = require('fastify')

test('requests the "/" route', async t => {
  const fastify = Fastify()

  fastify.get('/', function (request, reply) {
    reply.send({ hello: 'world' })
  })

  const response = await fastify.inject({
    method: 'GET',
    url: '/'
  })
  t.strictEqual(response.statusCode, 200, 'returns a status code of 200')
})
To mock the req/res objects it uses light-my-request under the hood.
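Since an Express app is itself a (req, res) handler, the same approach should work for Express by using light-my-request directly. A rough sketch, assuming its promise-based inject API:

const inject = require('light-my-request')
const express = require('express')

const app = express()
app.get('/', (req, res) => res.json({ hello: 'world' }))

// inject a fake request into the handler without binding to a port
inject(app, { method: 'GET', url: '/' }).then((res) => {
  console.log(res.statusCode) // 200
  console.log(res.payload)    // '{"hello":"world"}'
})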

Related

Node JS post API endpoint not recognized in front end

I'm trying to make a POST request using the Appwrite SDK in Node.js Express and Vue.js. The SDK requires me to create an API POST request to create a new storage bucket in Appwrite. The docs for this particular request don't really explain how to create the API in Node.js Express. I'm really new to Node.js; I already succeeded at creating a GET request, but whenever I call the POST request I get a 404 Not Found error.
Node JS express file (server.js):
In this file there is a get-user request API which works perfectly fine.
And there is a create-bucket POST request which, when called from the frontend, comes back with a 404.
const express = require("express");
const path = require("path");
const app = express(),
  bodyParser = require("body-parser"),
  port = 3080;

// Init SDK
const sdk = require("node-appwrite");

let client = new sdk.Client();
let users = new sdk.Users(client);
let storage = new sdk.Storage(client);

client
  .setEndpoint("http://localhost/v1") // Your API Endpoint
  .setProject("tailwinder") // Your project ID
  .setKey("Secret Key!"); // Your secret API key

app.use(bodyParser.json());
app.use(bodyParser.urlencoded({ extended: false }));
app.use(express.static(path.join(__dirname, "../appwrite-app/build")));
// This get request works fine
// Get user by ID
app.get("/v1/users/:id", (req, res) => {
  let promise = users.get(req.params.id);
  promise.then(
    function (response) {
      res.json(response);
    },
    function (error) {
      console.log(error);
    }
  );
});
// This one isn't recognised in the frontend
app.post("/v1/storage/buckets", function (req, res) {
  let promise = storage.createBucket("bucket_id", "bucket_name", "file");
  promise.then(
    function (response) {
      res.json(response);
    },
    function (error) {
      console.log(error);
    }
  );
});

app.listen(port, () => {
  console.log(`Server listening on the port::${port}`);
});
bucketsServices.js:
Here I'm using a fetch POST request to the API endpoint, but it's not working.
export async function createBucket() {
  const response = await fetch("/v1/storage/buckets", {
    method: "POST",
  });
  return await response.json();
}
Addcomponent.vue:
Here I'm calling the createBucket function from the Vue.js file:
bucketTesting() {
  createBucket().then((response) => {
    console.log(response);
  });
},
The error, which I assume means that my Node.js Express POST API isn't being reached:
bucketsService.js?993b:2 POST http://localhost:8080/v1/storage/buckets 404 (Not Found)
Uncaught (in promise) SyntaxError: Unexpected token < in JSON at position 0
Something is missing here and I can't really figure it out.
You are making the request to localhost:8080 while your server is running at localhost:3080.
I believe your Vue app is running on port 8080; that's why /v1/storage/buckets gets prefixed with localhost:8080.
Try providing the full URL when making the request:
export async function createBucket() {
  const response = await fetch("http://localhost:3080/v1/storage/buckets", {
    method: "POST",
  });
  return await response.json();
}
A better way might be to add a proxy that automatically redirects requests to the correct URL, but this should work for now. This article might help with how to set up a proxy in Vue.
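For example, with a Vue CLI project the dev-server proxy could look roughly like this (a sketch, assuming the frontend on port 8080 is served by Vue CLI's dev server):

// vue.config.js (sketch)
module.exports = {
  devServer: {
    proxy: {
      // forward /v1/* requests from the dev server (8080) to the Express server (3080)
      '/v1': {
        target: 'http://localhost:3080',
        changeOrigin: true,
      },
    },
  },
};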

How do I make server-side fetch calls?

I have a React web application which currently makes fetch calls client-side to update a dashboard with live information (let's say current weather, as an example). This means that an increase in users will cause unnecessary traffic and could potentially overload the weather website.
What I am trying to understand is how I can make those fetch calls server-side. I have looked into creating a Node.js Express server, but I am unsure whether it can make fetch calls to a remote host.
Here is my code with /request-weather, which unfortunately does not really work.
const { response } = require('express');
const express = require('express');
const app = express();
var fetch = require('node-fetch');
const port = process.env.PORT || 5000;

app.use(express.json());

// This displays a message that the server is running and listening on the specified port
app.listen(port, () => console.log(`Listening on port ${port}`));

// create a GET route
app.get('/request-info', (req, res) => {
  res.send({ information: 'information call successful' });
});

app.get('/request-weather', (req, res) => {
  fetch('http://thisotherwebsite.com/weather-query-that-returns-json',
    {
      method: 'GET',
      headers: { 'Accept': 'application/json' }
    })
    .then(res => {
      return res;
    })
});
A couple of things:
Your /request-weather handler makes the request to thisotherwebsite but doesn't do anything with the response.
Your .then(res => { return res; }) doesn't actually do anything. You're just taking what fetch already returns and returning it.
If you want to send the response back to the browser you might do something like this:
fetch(...) // make the request
  .then(result => result.json()) // extract the data
  .then(data => {
    res.json(data); // send it to the browser
  })
If you want to do additional processing you could await the fetch call and then do whatever else you need to do with it:
app.get('/request-weather', async (req, res) => { // make handler async
  // get data from the other site
  const data = await fetch(...)
    .then(response => response.json());

  // package it up with some other stuff
  const responseData = {
    fromOtherSite: data,
    myExpressStuff: {
      foo: 1,
      bar: 2,
    }
  };

  // return it to the browser
  res.json(responseData);
});
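If the upstream request fails, that await will reject and the browser request will hang with no response. A minimal sketch of guarding against that (reusing the URL from the question; the 502 status is just an arbitrary choice):

app.get('/request-weather', async (req, res) => {
  try {
    const response = await fetch('http://thisotherwebsite.com/weather-query-that-returns-json');
    const data = await response.json();
    res.json(data);
  } catch (err) {
    // upstream failure: report it instead of leaving the request hanging
    console.error(err);
    res.status(502).json({ error: 'weather lookup failed' });
  }
});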
Reference:
fetch: response.json() - Extracting data from a fetch response
express response.json() - Sending json to the response (usually to the browser)

What is the correct order of requiring and mocking files using Jest?

I'm trying to create an integration test using Jest for my Express app. I think I have a conceptual misunderstanding as my tests are behaving strangely. My goal is to test the following scenario. I'm hitting a specific endpoint using Supertest, and I want to check whether an error handler middleware is called if there is a mocked error. I want to check whether the error handler is not called, if there is no error present. I have the following test file:
test.js
const request = require('supertest')

describe('Error handler', () => {
  let server
  let router

  beforeEach(() => {
    jest.resetModules()
    jest.resetAllMocks()
  })

  afterEach(async () => {
    await server.close()
  })

  it('should be triggered if there is a router error', async () => {
    jest.mock('../../routes/')
    router = require('../../routes/')
    router.mockImplementation(() => {
      throw new Error()
    })
    server = require('../../server')

    const res = await request(server)
      .get('')
      .expect(500)
      .expect('Content-Type', /json/)
    expect(res.body.error).toBe('Error')
    expect(res.body.message).toBe('Something went wrong!')
    expect(res.body.status).toBe(500)
  })

  it('should not be triggered if there is no router error', async () => {
    server = require('../../server')
    const res = await request(server)
      .get('')
      .expect(201)
      .expect('Content-Type', /text/)
  })
})
What I think is happening is the following. Before each test I reset all modules, because I don't want to have the cached version of my server from the first require, I want to overwrite it. I also reset all mocks, so when the second test runs, no mock is used, no fake error is forced, so the middleware is not called and I'm getting back a vanilla 200 result.
After this is done, I start testing the scenario when there is an error. I mock the routes file that exports my routes so I can force a fake error. Then I require the server, this way, I suppose, it's loading the server up with the fake, error throwing route. Then I wait for the response with Supertest, and assert that I indeed got an error back - hence the error handler middleware has been triggered and worked.
The afterEach hook is called, the server is closed, then the beforeEach hook initializes everything, again. Now I have my vanilla implementation without the mock. I require my server, hit the homepage with a get request, and I get back the correct response.
The strange thing is that for some reason the second test does not seem to exit gracefully. If I change the second test from async/await to the done callback and call done at the end of the test, it seems to work.
I tried a lot of possible permutations, including putting the mocking part to the beforeEach hook, starting the server before / after mocking, and I got weird results. I feel like I have conceptual misunderstandings, but I don't know where, because there are so many moving parts.
Any help to make me understand what is wrong would be greatly appreciated
EDIT:
I thought that most parts can be considered a black box, but now I realize that the fact that I'm trying to create an app using Socket.IO makes the setup process a bit more convoluted.
I don't want Express to automatically create a server for me, because I want to use Socket.IO. So for now I only create a function with the appropriate signature, and that is 'app'. This can be given as an argument to http.Server(). I configure it with the options and the middlewares that I want to use. I do not want to call app.listen, because that way Socket.IO could not do its own thing.
config.js
const path = require('path')
const express = require('express')
const indexRouter = require('./routes/')
const errorHandler = require('./middlewares/express/errorHandler')
const app = express()
app.set('views', path.join(__dirname + '/views'))
app.set('view engine', 'ejs')
app.use(express.static('public'))
app.use('', indexRouter)
app.use(errorHandler)
module.exports = app
In server.js I require this app, and then I create an HTTP server using it. After that, I feed it to 'socket.io', so it is connected to the proper instance. In server.js I do not call server.listen; I want to export the server to a file that actually starts it up (index.js) and to my tests, so Supertest can spin it up.
server.js
// App is an Express server set up to use specific middlewares
const app = require('./config')
// Create a server instance so it can be used by to SocketIO
const server = require('http').Server(app)
const io = require('socket.io')(server)
const logger = require('./utils/logger')
const Game = require('./service/game')
const game = new Game()
io.on('connection', (socket) => {
logger.info(`There is a new connection! Socket ID: ${socket.id}`)
// If this is the first connection in the game, start the clock
if (!game.clockStarted) {
game.startClock(io)
game.clockStarted = true
}
game.addPlayer(socket)
socket.on('increaseTime', game.increaseTime.bind(game))
})
module.exports = server
If I understand everything correctly, basically the same thing happens, except for a few additional steps in the example that you provided. There is no need to start the server and then use Supertest on it; Supertest handles the process of starting up the server when I use request(server).get, etc.
EDIT 2
Right now I'm not sure whether mocking like that is enough. Something mysterious leaves the Supertest requests hanging, and it might be that somewhere along the way they cannot be ended, although I do not see why that would be the case. Anyway, here is the router:
routes/index.js
const express = require('express')
const router = express.Router()
router.get('', (req, res, next) => {
try {
res.status(200).render('../views/')
} catch (error) {
next(error)
}
})
router.get('*', (req, res, next) => {
try {
res.status(404).render('../views/not-found')
} catch (error) {
next(error)
}
})
module.exports = router
The order of requiring and mocking is correct but the order of setting up and shutting down a server probably isn't.
A safe way is to make sure the server is available before doing requests. Since Node http is asynchronous and callback-based, errors cannot be expected to be handled in async functions without promisification. Considering that server.listen(...) was called in server.js, it can be:
...
server = require('../../server')

// wait until the server is actually listening before making requests
if (!server.listening) {
  await new Promise((resolve, reject) => {
    server.once('listening', resolve).once('error', reject);
  });
}

const res = await request(server)
...
close is asynchronous and doesn't return a promise so there's nothing to await. Since it's in a dedicated block, a short way is to use done callback:
afterEach(done => {
server.close(done)
})
In case errors are suppressed in error listener, server.on('error', console.error) can make troubleshooting easier.
Supertest can handle server creation itself:
You may pass an http.Server, or a Function to request() - if the server is not already listening for connections then it is bound to an ephemeral port for you so there is no need to keep track of ports.
And it can be given the Express instance instead of a Node server, which eliminates the need to handle server instances manually:
await request(app)
...
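For example, the second test could be written against the Express app exported from config.js, leaving the HTTP/Socket.IO server out of the test entirely. A sketch, assuming the index route responds at '/' with the 200 from EDIT 2:

const request = require('supertest')
const app = require('../../config') // the Express instance, not the http.Server

it('should not be triggered if there is no router error', async () => {
  await request(app)
    .get('/')
    .expect(200)
    .expect('Content-Type', /text/)
})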

Next.js server side api call returns 500 internal server error

I'm finally dipping my toe into the world of server side react using Next.js, however I'm pretty stumped with this issue.
I'm making a call to an API from pages/customer-preferences.tsx using isomorphic-unfetch
CustomerPreferencesPage.getInitialProps = async () => {
  const res = await fetch(API_URL + '/preference-customer');
  const initialData = await res.json();
  return { initialData };
};
Everything works fine locally in dev mode, or once built and run with build > start. To host it I'm running it from a Docker container (node:10), and when I run this locally everything is also fine. The issue only happens once it's deployed.
When I navigate to / and then click a link to /customer-preferences all works as expected. But if I refresh the page or load the page directly at /customer-preferences I see this error from Next.js
So the issue only seems to happen when trying to make the API calls from the server and not the client.
I've also set up a simple Express server to use instead, but I'm not sure if this is necessary.
const express = require('express');
const next = require('next');

const port = parseInt(process.env.PORT, 10) || 3000;
const dev = process.env.NODE_ENV !== 'production';
const app = next({ dev });
const handle = app.getRequestHandler();

app.prepare().then(() => {
  const server = express();

  server.all('*', (req, res) => {
    return handle(req, res);
  });

  server.listen(port, err => {
    if (err) throw err;
    console.log(`> Ready on http://localhost:${port}`);
  });
});
When checking the server logs I get this:
FetchError: request to http://xxx failed, reason: getaddrinfo EAI_AGAIN xxx xxx:80
Any help would be greatly appreciated.
No, the server setup is not necessary.
This is happening because the browser/client is not capable of resolving your docker container's hostname. As it stands, the only solution I know of is to check for the req object in getInitialProps (so as to determine which environment the fetch will run in) and call the Docker hostname when on server, localhost when on client. E.g.
async getInitialProps (ctx) {
  if (ctx.req) { // ctx.req exists server-side only
    // call docker-host:port
  } else {
    // call localhost:port
  }
}
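A concrete sketch of that branch (docker-host and the port here are placeholders for your own service name and port, not values from the original post):

CustomerPreferencesPage.getInitialProps = async (ctx) => {
  // ctx.req exists server-side only, so pick the hostname the current environment can resolve
  const baseUrl = ctx.req ? 'http://docker-host:3000' : 'http://localhost:3000';
  const res = await fetch(baseUrl + '/preference-customer');
  const initialData = await res.json();
  return { initialData };
};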
My suspicion has to do with the fact that fetch is not a native Node module, but a client in browsers. So, if you navigate from one page to this page, per the documentation, getInitialProps will be called from the client side, making the fetch method accessible. A refresh ensures that getInitialProps is called from the server side.
You can test this theory by running typeof fetch from a browser's inspector and from a Node REPL.
You are better off calling the method from the component or using a third-party HTTP client like axios...
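A rough sketch of what the axios version could look like (assuming the same API_URL as in the question):

import axios from 'axios';

CustomerPreferencesPage.getInitialProps = async () => {
  // axios works the same way on the server and in the browser
  const res = await axios.get(API_URL + '/preference-customer');
  return { initialData: res.data };
};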
If you want to skip calling the AJAX method from the backend and only call it from the frontend, you can test whether the method is being called from the frontend or the backend, like so:
CustomerPreferencesPage.getInitialProps = async () => {
  if (typeof window === 'undefined') {
    // this is being called from the backend, no need to show anything
    return { initialData: null };
  }

  const res = await fetch(API_URL + '/preference-customer');
  const initialData = await res.json();
  return { initialData };
};

Local Dev Server for AWS Lambdas

Is there a dev server that runs AWS Lambdas locally? My requirements would be
a Node.js server; nothing to install other than Node and npm packages (no Ruby, Go, or anything else)
Creates a server that I can query via wget / curl or an API testing tool to send various events to
I should be able to specify a js file that the server uses as lambda and the server should restart / update when I change that file
Here's a solution that does not require serverless or claudiajs.
I usually just write my own little express script for this purpose. I always just use Lambda Proxy integration so it's simpler.
Something like this...
const bodyParser = require('body-parser')
const express = require('express')

// Two different Lambda handlers
const { api } = require('../src/api')
const { login } = require('../src/login')

const app = express()
app.use(bodyParser.json())

// routes and their handlers
app.post('/login', lambdaProxyWrapper(login))
app.all('/*', lambdaProxyWrapper(api))

app.listen(8200, () => console.info('Server running on port 8200...'))

function lambdaProxyWrapper(handler) {
  return (req, res) => {
    // Here we convert the request into a Lambda event
    const event = {
      httpMethod: req.method,
      queryStringParameters: req.query,
      pathParameters: {
        proxy: req.params[0],
      },
      body: JSON.stringify(req.body),
    }

    return handler(event, null, (err, response) => {
      res.status(response.statusCode)
      res.set(response.headers)
      return res.json(JSON.parse(response.body))
    })
  }
}
Then, run it with nodemon so it watches the files and reloads as necessary.
nodemon --watch '{src,scripts}/**/*.js' scripts/server.js
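For reference, a minimal handler that this wrapper can call might look like the following (src/api.js here is hypothetical; it just follows the callback-style Lambda proxy signature):

// src/api.js (hypothetical) - a callback-style Lambda proxy handler
exports.api = (event, context, callback) => {
  callback(null, {
    statusCode: 200,
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ method: event.httpMethod, proxy: event.pathParameters.proxy }),
  })
}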
Have you checked out SAM Local? https://github.com/awslabs/aws-sam-local
