KoaJS can't handle POST requests on Cloud Functions - node.js

I have a Node.js application written with KoaJS.
app.ts
const app = new Koa();
app.use(healthCheck());
app.use(bodyParser());
app.use(errorHandler());
app.use(endpoints);
export default app;
main.ts
const port = process.env.PORT || 3000;

if (!isCloudFunctions()) {
  app
    .listen(port, () => {
      console.info(`Listening at http://localhost:${port}`);
    })
    .on('error', console.error);
}

export const api = (req, res) => {
  app.callback()(req, res);
};
The app works well on Cloud Run.
I can deploy the app to Cloud Functions, but there it can only handle GET requests.
If I try a POST request, I get this error
InternalServerError: stream is not readable
at getRawBody (/workspace/node_modules/raw-body/index.js:112:10)
at readStream (/workspace/node_modules/raw-body/index.js:178:17)
at AsyncFunction.module.exports [as json] (/workspace/node_modules/co-body/lib/json.js:39:21)
at executor (/workspace/node_modules/raw-body/index.js:113:5)
at parseBody (/workspace/node_modules/koa-bodyparser/index.js:100:26)
at new Promise (<anonymous>)
at bodyParser (/workspace/node_modules/koa-bodyparser/index.js:85:25)
at next (/workspace/node_modules/koa-compose/index.js:42:32)
at /workspace/webpack:/sample-explore/apps/sample-api/src/middlewares/health-check.ts:10:12
at Generator.next (<anonymous>)
I re-created the application in Express, and it works fine on both Cloud Run and Cloud Functions.
However, I really like Koa's native async/await support and composable middleware and routing.
Does anyone know why Koa cannot handle POST requests on Cloud Functions?

The JSON body is automatically parsed by Google Cloud Functions (see the documentation), and the koa-bodyparser middleware can't handle already-parsed bodies.
More info on this issue: https://github.com/koajs/bodyparser/issues/127
One fix suggested in the issue thread is to read ctx.req.body instead of ctx.request.body (when running locally you'll still need to parse the body yourself, of course).
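A minimal sketch of that first option (assuming a router such as @koa/router; the route name is only an example):
router.post('/items', async (ctx) => {
  // Cloud Functions has already parsed the JSON body onto the raw Node
  // request (ctx.req); locally, fall back to whatever koa-bodyparser put
  // on ctx.request.body.
  const body = ctx.req.body || ctx.request.body;
  ctx.body = { received: body };
});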
Alternatively, add a middleware that supports already-parsed bodies:
function hybridBodyParser (opts) {
  const bp = bodyParser(opts)
  return async (ctx, next) => {
    // On Cloud Functions the body is already parsed onto the raw Node
    // request; copy it over so koa-bodyparser skips re-parsing the stream.
    ctx.request.body = ctx.request.body || ctx.req.body
    return bp(ctx, next)
  }
}

app.use(hybridBodyParser())

Related

Node JS post API endpoint not recognized in front end

I'm trying to make a POST request using the Appwrite SDK in Node JS (Express) and Vue JS. The SDK requires me to create an API POST request to create a new storage bucket in Appwrite. The docs for this particular request don't really explain how to create the API in Node JS Express. I'm really new to Node JS; I already succeeded at creating the GET request, but whenever I call the POST request I get a 404 Not Found error.
Node JS express file (server.js):
In this file there is a get-user request API which works perfectly fine.
There is also a create-bucket POST request which, when called from the frontend, comes back with a 404.
const express = require("express");
const path = require("path");

const app = express(),
  bodyParser = require("body-parser");
port = 3080;

// Init SDK
const sdk = require("node-appwrite");

let client = new sdk.Client();
let users = new sdk.Users(client);
let storage = new sdk.Storage(client);

client
  .setEndpoint("http://localhost/v1") // Your API Endpoint
  .setProject("tailwinder") // Your project ID
  .setKey(
    "Secrer Key!"
  ); // Your secret API key

app.use(bodyParser.json());
app.use(bodyParser.urlencoded({ extended: false }));
app.use(express.static(path.join(__dirname, "../appwrite-app/build")));

//This get request works fine
//get user by ID
app.get("/v1/users/:id", (req, res) => {
  let promise = users.get(req.params.id);
  promise.then(
    function (response) {
      res.json(response);
    },
    function (error) {
      console.log(error);
    }
  );
});

//This one isn't recognised in frontend
app.post("/v1/storage/buckets", function (req, res) {
  let promise = storage.createBucket("bucket_id", "bucket_name", "file");
  promise.then(
    function (response) {
      res.json(response);
    },
    function (error) {
      console.log(error);
    }
  );
});

app.listen(port, () => {
  console.log(`Server listening on the port::${port}`);
});
bucketsServices.js:
Here I'm using a fetch POST request to that API endpoint, but it's not working.
export async function createBucket() {
  const response = await fetch("/v1/storage/buckets", {
    method: "POST",
  });
  return await response.json();
}
Addcomponent.vue:
Here I'm calling the createBucket function from the Vue JS file:
bucketTesting() {
  createBucket().then((response) => {
    console.log(response);
  });
},
The error, which I assume means it's not reaching my Node JS Express POST API:
bucketsService.js?993b:2 POST http://localhost:8080/v1/storage/buckets 404 (Not Found)
Uncaught (in promise) SyntaxError: Unexpected token < in JSON at position 0
Something is missing here and I can't really figure it out.
You are making the request to localhost:8080 while your server is running at localhost:3080.
I believe your Vue app is running on port 8080, which is why /v1/storage/buckets gets prefixed with localhost:8080.
Try providing the full URL when making the request:
export async function createBucket() {
  const response = await fetch("http://localhost:3080/v1/storage/buckets", {
    method: "POST",
  });
  return await response.json();
}
A better way might be to add a proxy that automatically forwards requests to the correct URL, but this should work for now. This article might help with how to set up a proxy in Vue.
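For reference, a minimal sketch of such a proxy with Vue CLI (assuming the frontend is served by the Vue CLI dev server on port 8080; vue.config.js is the Vue CLI convention):
// vue.config.js
// Forward any /v1/* request hitting the dev server (port 8080)
// to the Express API on port 3080.
module.exports = {
  devServer: {
    proxy: {
      '^/v1': {
        target: 'http://localhost:3080',
        changeOrigin: true,
      },
    },
  },
};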

Error [ERR_HTTP_HEADERS_SENT]: Cannot set headers after they are sent to the client - Node.Js, Express, Postgres

I'm having trouble with the error message in the title when trying to retrieve all users in my Express .get('/users') handler. I am using Node.js, Express, and node-postgres. I have my getUsers() function defined in my queries.js file, and I call it from app.get() in my index.js file.
queries.js
const client = require('./object models/db_client_pool')
const Pool = require('pg').Pool
const pool = new Pool(client.client)
async function getUsers(request, response) {
  await pool.connect()
  pool.query('select * from discord_users', (error, results) => {
    if (error) {
      throw error
    }
    response.sendStatus(200).json(results.rows)
    pool.release();
  })
}
module.exports = {
  getUsers
}
index.js
const express = require('express');
require('dotenv').config();
//const bodyParser = require('body-parser'); deprecated
const app = express();
const port = 3000;
const db = require('./queries');

app.use(express.json())
app.use(express.urlencoded({
  extended: true
}))

app.get('/', (request, response) => {
  response.json({ info: 'Node.js, Express, and Postgres API' })
})

app.get('/users', (req, res) => {
  db.getUsers(req, res)
})

app.listen(port, () => {
  console.log(`App is listening on port ${port}`);
});
As I said, I keep getting the "cannot set headers after they are sent to the client" error and I'm at a loss of what to do. Thanks in advance for your help!
Change from this:
response.sendStatus(200).json(results.rows)
to this:
response.status(200).json(results.rows);
or even just to this:
response.json(result.rows); // 200 is the default status already
The last one is fine because 200 is already the default status so you don't need to set that yourself.
The problem is that response.sendStatus(200) sends a complete response with an empty body and then you try to call response.json(result.rows) which tries to send ANOTHER response to the same request. Trying to send that second response to the same request is what triggers the error message you are getting.
response.status(200) just sets the status to 200 as a property on the waiting response object and waits for some other method to actually send the response itself which you can then do with .json(...).
Note that response.sendStatus(200) and response.status(200) are not interchangeable: sendStatus() immediately sends a complete response, while status() only sets the status code and waits for something like .json(...) to send the body. If you want to send JSON, use response.status(200).json(...) (or just response.json(...)).
Now, another issue I see in your code: I don't recognize a pool.release() method in the pg library. You can release a client back to the pool, but you can't release the pool itself. Maybe you meant pool.end()?
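For example, a minimal sketch of getUsers that lets the pool manage its clients (so nothing has to be released by hand) could look like this:
// queries.js
// pool.query() checks a client out of the pool and releases it internally,
// so there is no manual connect()/release() to get wrong.
async function getUsers(request, response) {
  try {
    const results = await pool.query('select * from discord_users');
    response.json(results.rows); // 200 is the default status
  } catch (error) {
    response.status(500).json({ error: error.message });
  }
}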

What is the correct order of requiring and mocking files using Jest?

I'm trying to create an integration test using Jest for my Express app. I think I have a conceptual misunderstanding as my tests are behaving strangely. My goal is to test the following scenario. I'm hitting a specific endpoint using Supertest, and I want to check whether an error handler middleware is called if there is a mocked error. I want to check whether the error handler is not called, if there is no error present. I have the following test file:
test.js
const request = require('supertest')

describe('Error handler', () => {
  let server
  let router

  beforeEach(() => {
    jest.resetModules()
    jest.resetAllMocks()
  })

  afterEach(async () => {
    await server.close()
  })

  it('should be triggered if there is a router error', async () => {
    jest.mock('../../routes/')
    router = require('../../routes/')
    router.mockImplementation(() => {
      throw new Error()
    })

    server = require('../../server')

    const res = await request(server)
      .get('')
      .expect(500)
      .expect('Content-Type', /json/)

    expect(res.body.error).toBe('Error')
    expect(res.body.message).toBe('Something went wrong!')
    expect(res.body.status).toBe(500)
  })

  it('should not be triggered if there is no router error', async () => {
    server = require('../../server')

    const res = await request(server)
      .get('')
      .expect(201)
      .expect('Content-Type', /text/)
  })
})
What I think is happening is the following. Before each test I reset all modules, because I don't want to have the cached version of my server from the first require, I want to overwrite it. I also reset all mocks, so when the second test runs, no mock is used, no fake error is forced, so the middleware is not called and I'm getting back a vanilla 200 result.
After this is done, I start testing the scenario when there is an error. I mock the routes file that exports my routes so I can force a fake error. Then I require the server, this way, I suppose, it's loading the server up with the fake, error throwing route. Then I wait for the response with Supertest, and assert that I indeed got an error back - hence the error handler middleware has been triggered and worked.
The afterEach hook is called, the server is closed, then the beforeEach hook initializes everything, again. Now I have my vanilla implementation without the mock. I require my server, hit the homepage with a get request, and I get back the correct response.
The strange thing is that for some reason the second test does not seem to exit gracefully. If I change the second test from async/await to the done callback and call it at the end of the test, it seems to work.
I tried a lot of possible permutations, including putting the mocking part to the beforeEach hook, starting the server before / after mocking, and I got weird results. I feel like I have conceptual misunderstandings, but I don't know where, because there are so many moving parts.
Any help to make me understand what is wrong would be greatly appreciated
EDIT:
I thought that most parts can be considered a black box, but now I realize that the fact that I'm trying to create an app using Socket.IO makes the setup process a bit more convoluted.
I don't want Express to automatically create a server for me, because I want to use Socket.IO. So for now I only create a function with the appropriate signature, and that is 'app'. This can be given as an argument to http.Server(). I configure it with the options and middlewares I want to use. I do not call app.listen, because that way Socket.IO could not do its own thing.
config.js
const path = require('path')
const express = require('express')
const indexRouter = require('./routes/')
const errorHandler = require('./middlewares/express/errorHandler')
const app = express()
app.set('views', path.join(__dirname + '/views'))
app.set('view engine', 'ejs')
app.use(express.static('public'))
app.use('', indexRouter)
app.use(errorHandler)
module.exports = app
In server.js I require this app, and then I create an HTTP server using it. After that, I feed it to 'socket.io' so it is connected to the proper instance. In server.js I do not call server.listen; I want to export the server to the file that actually starts it up (index.js), and I also want to export it to my tests so Supertest can spin it up.
server.js
// App is an Express server set up to use specific middlewares
const app = require('./config')
// Create a server instance so it can be used by to SocketIO
const server = require('http').Server(app)
const io = require('socket.io')(server)
const logger = require('./utils/logger')
const Game = require('./service/game')
const game = new Game()
io.on('connection', (socket) => {
logger.info(`There is a new connection! Socket ID: ${socket.id}`)
// If this is the first connection in the game, start the clock
if (!game.clockStarted) {
game.startClock(io)
game.clockStarted = true
}
game.addPlayer(socket)
socket.on('increaseTime', game.increaseTime.bind(game))
})
module.exports = server
If I understand everything correctly, basically the same thing happens, except for a few additional steps in the example that you provided. There is no need to start the server and then use Supertest on it; Supertest handles starting the server when I use request(server).get, etc.
EDIT 2
Right now I'm not sure whether mocking like that is enough. Something mysterious leaves the Supertest requests hanging, and it might be that somewhere along the way they cannot be ended, although I do not see why that would be the case. Anyway, here is the router:
routes/index.js
const express = require('express')
const router = express.Router()

router.get('', (req, res, next) => {
  try {
    res.status(200).render('../views/')
  } catch (error) {
    next(error)
  }
})

router.get('*', (req, res, next) => {
  try {
    res.status(404).render('../views/not-found')
  } catch (error) {
    next(error)
  }
})

module.exports = router
The order of requiring and mocking is correct, but the order of setting up and shutting down the server probably isn't.
A safe way is to make sure the server is available before making requests. Since the Node http server is asynchronous and callback-based, its errors won't surface in an async test function without promisification. Assuming server.listen(...) is called in server.js, it can be:
...
server = require('../../server')

// wait until the server is actually listening before issuing requests
if (!server.listening) {
  await new Promise((resolve, reject) => {
    server.once('listening', resolve).once('error', reject);
  });
}
expect(server.listening).toBe(true);

const res = await request(server)
...
close is asynchronous and doesn't return a promise so there's nothing to await. Since it's in a dedicated block, a short way is to use done callback:
afterEach(done => {
  server.close(done)
})
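Alternatively, if you prefer to keep the async/await style, a small sketch that wraps the close callback in a promise:
afterEach(async () => {
  // server.close() takes a callback, so wrap it to await the shutdown
  await new Promise((resolve) => server.close(resolve));
});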
In case errors are suppressed in error listener, server.on('error', console.error) can make troubleshooting easier.
Supertest can handle server creation itself:
You may pass an http.Server, or a Function to request() - if the server is not already listening for connections then it is bound to an ephemeral port for you so there is no need to keep track of ports.
Supertest can also be given the Express instance instead of the Node server, which eliminates the need to handle server instances manually:
await request(app)
...
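For example, a sketch of the second test along those lines (assuming ../../config still exports the bare Express app and the index route responds with 200, as in routes/index.js):
it('should not be triggered if there is no router error', async () => {
  const app = require('../../config'); // the Express app, not the http.Server

  await request(app)
    .get('/')
    .expect(200)
    .expect('Content-Type', /text/);
});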

Next.js server side api call returns 500 internal server error

I'm finally dipping my toe into the world of server side react using Next.js, however I'm pretty stumped with this issue.
I'm making a call to an API from pages/customer-preferences.tsx using isomorphic-unfetch
CustomerPreferencesPage.getInitialProps = async () => {
  const res = await fetch(API_URL + '/preference-customer');
  const initialData = await res.json();
  return { initialData };
};
All works fine locally in dev mode, or once built and run with build > start. To host it I'm running it in a Docker container (node:10), and when I run this locally all is fine also. The issue only happens once it's deployed.
When I navigate to / and then click a link to /customer-preferences, all works as expected. But if I refresh the page or load /customer-preferences directly, Next.js responds with a 500 internal server error.
So the issue only seems to happen when trying to make the API calls from the server and not the client.
I've also set up a simple Express server to use instead, but I'm not sure if this is necessary:
const express = require('express');
const next = require('next');

const port = parseInt(process.env.PORT, 10) || 3000;
const dev = process.env.NODE_ENV !== 'production';
const app = next({ dev });
const handle = app.getRequestHandler();

app.prepare().then(() => {
  const server = express();

  server.all('*', (req, res) => {
    return handle(req, res);
  });

  server.listen(port, err => {
    if (err) throw err;
    console.log(`> Ready on http://localhost:${port}`);
  });
});
When checking the server logs I get this:
FetchError: request to http://xxx failed, reason: getaddrinfo EAI_AGAIN xxx xxx:80
Any help would be greatly appreciated.
No, the server setup is not necessary.
This is happening because the browser/client is not capable of resolving your docker container's hostname. As it stands, the only solution I know of is to check for the req object in getInitialProps (so as to determine which environment the fetch will run in) and call the Docker hostname when on server, localhost when on client. E.g.
async getInitialProps (ctx) {
  if (ctx.req) { // ctx.req exists server-side only
    // call docker-host:port
  } else {
    // call localhost:port
  }
}
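Concretely, a sketch for the page above (SERVER_API_URL and BROWSER_API_URL are placeholder values, e.g. the Docker service hostname and the publicly reachable hostname respectively):
CustomerPreferencesPage.getInitialProps = async (ctx) => {
  // ctx.req only exists when this runs on the server, so pick the
  // hostname that is resolvable from that environment.
  const baseUrl = ctx.req ? SERVER_API_URL : BROWSER_API_URL;

  const res = await fetch(baseUrl + '/preference-customer');
  const initialData = await res.json();
  return { initialData };
};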
My suspicion has to do with the fact that fetch is not a native Node module but a client built into browsers. So, if you navigate from one page to this page, then per the documentation getInitialProps is called from the client side, making the fetch method accessible. A refresh ensures that getInitialProps is called from the server side.
You can test this theory by running typeof fetch from a browser's inspector and from a Node REPL.
You are better off calling the method from the component, or using a third-party HTTP client like axios...
If you want to skip calling the AJAX method from the backend and only call it from the frontend, you can test whether the method is being called from the frontend or the backend, like so:
CustomerPreferencesPage.getInitialProps = async () => {
  if (typeof window === 'undefined') {
    // this is being called from the backend, no need to show anything
    return { initialData: null };
  }

  const res = await fetch(API_URL + '/preference-customer');
  const initialData = await res.json();
  return { initialData };
};

How to use Dialogflow communicating with Heroku

I have currently built a Dialogflow chatbot using Google Cloud Functions and Firebase. It all works well, but I would like to stop using Firebase entirely and use Heroku instead. How can I make Dialogflow send its requests to a Heroku service?
I already know that I should add the URL of the Heroku app, and that it has to be HTTPS with the proper response format.
What I don't know is how to make the connection between Dialogflow and Heroku, which runs node.js, so I can send responses such as:
sendResponse('Hello, this is responding from Heroku')
Using Firebase I had to have a function like this:
exports.dialogflowFirebaseFulfillment = functions.https.onRequest((request, response) => {
  console.log('Dialogflow Request headers: ' + JSON.stringify(request.headers));
  console.log('Dialogflow Request body: ' + JSON.stringify(request.body));

  if (request.body.result) {
    processV1Request(request, response);
  } else if (request.body.queryResult) {
    // processV2Request(request, response);
  } else {
    console.log('Invalid Request');
    return response.status(400).end('Invalid Webhook Request (expecting v1 or v2 webhook request)');
  }
});
I don't know if I need this when not using Firebase, in this case on Heroku. Also, since I am not using Firebase, what happens with this code:
functions.https.onRequest((request, response) => {
I don't have the "functions" variable if I am not using Firebase.
Most of the code can be used unchanged - Firebase Cloud Functions use node.js/express.js under the covers, and the Dialogflow libraries assume that the request and response objects are the express.js ones with the JSON body-parser middleware applied.
The line in question is syntactic sugar that lets the Firebase Cloud Functions processor discover and handle the function. So you would replace that line
exports.dialogflowFirebaseFulfillment = functions.https.onRequest((request, response) => {
with something more like this
const express = require('express');
const app = express();
app.use( express.json() );

// Dialogflow calls the fulfillment webhook with a POST request
app.post('/', (req, res) => processWebhook( req, res ));

// Heroku assigns the port through the PORT environment variable
const port = process.env.PORT || 3000;
app.listen(port, () => console.log(`App listening on port ${port}!`));

var processWebhook = function( request, response ){
  // ... the console logging and all the processing goes here
}
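Filling in the body, a rough sketch (assuming a Dialogflow v2 webhook request, where the matched intent data lives under request.body.queryResult) might look like:
var processWebhook = function (request, response) {
  console.log('Dialogflow Request body: ' + JSON.stringify(request.body));

  // Dialogflow v2 expects a JSON response with a fulfillmentText field
  // (richer replies go in fulfillmentMessages).
  response.json({
    fulfillmentText: 'Hello, this is responding from Heroku',
  });
};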

Resources