So I am trying to make a simple proxy (I think that's the right word) and I've come up with some code that works fine locally. I can run 'firebase serve --only functions' and the function gives me the expected results. But when I deploy this same code and try calling it, it just times out. I have no idea why, so I was hoping I could get some help.
Here's the code:
//Variables
const functions = require('firebase-functions');
const express = require('express');
const cors = require('cors');
const request = require('request');

//App
const app = express();
app.use(cors({ origin: true }));

//Endpoints
app.get('/**', function(req, res) {
  request('https://example.com' + req.url, function(err, proxyRes, body) {
    //Err
    if (err) {
      res.send('Error: ' + err.code);
      return;
    }
    //Res
    res.status(200).send(body);
  });
});

//Functions
exports.proxy = functions.https.onRequest(app);
HTTP functions will time out if they don’t send a response. This means your request() is probably failing, and it’s probably failing because, on the free Spark payment plan, you can’t make outgoing requests to services that Google doesn’t fully control.
Your function should send a response under all conditions in order to avoid a timeout. This means you should check for and handle errors in every code path.
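For example, one way to make sure every path responds is sketched below, based on the code above, with error logging added so failures show up in the Firebase console:

//A sketch: same structure as the code above, but logging the error and always
//sending a response with an explicit status code.
app.get('/**', function(req, res) {
  request('https://example.com' + req.url, function(err, proxyRes, body) {
    if (err) {
      console.error('Proxy request failed:', err);
      res.status(500).send('Error: ' + (err.code || err.message));
      return;
    }
    res.status(proxyRes.statusCode).send(body);
  });
});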
Related
This is my function
export const testFunction = functions.https.onRequest((req, res) => {
  const text = req.body.text;
  res.set("Access-Control-Allow-Origin", "*");
  res.send({text: text});
});
I've tried using const cors = require('cors')({origin: true});
I keep getting this as a response. Does anyone know why?
Consider importing it like this:
const cors = require('cors')({origin: true});
Then try deploying and running the function below using firebase deploy --only functions:
const functions = require("firebase-functions");
const cors = require('cors')({origin: true});

exports.testFunction = functions.https.onRequest((req, res) => {
  cors(req, res, () => {
    const text = req.body.name;
    res.send({text: text});
  });
});
Then send a request using:
curl -X POST "https://region-projectID.cloudfunctions.net/function-name" -H "Content-Type:application/json" --data '{"name":"Keyboard Cat"}'
The output I am getting in the console is:
And when I click on the Cloud Function URL endpoint, my output is an empty {}.
If you try res.send("Hello World") in place of res.send({text: text}), you will see Hello World in the browser. But our Cloud Function performs an operation on data passed in the request body, and that can fail at run time if the name property is null or undefined.
And indeed, if we deploy this new function and then attempt to call it from our web app without updating our request, we do get an error. However, it might not be the error you'd expect.
It would be easy to assume that we somehow misconfigured our CORS policy. In fact, swapping cors() for cors({ origin: '*' }) or cors({ origin: true }) is all to no avail. Only when we view the logs for our Cloud Function do we get a useful error message.
So try sending a POST request with the --data flag, or if you are using Postman, send the data and the parameters. Only then will you get an output. If you still see the CORS error, check that your function handles the request properly and that the nested request body attribute is not undefined/null. Sometimes CORS errors are not really CORS errors at all!
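For instance, guarding the handler against a missing body property keeps the function from crashing in the first place. A sketch (the name field is simply the one used in the example above):

exports.testFunction = functions.https.onRequest((req, res) => {
  cors(req, res, () => {
    //Guard: respond with a clear 400 instead of crashing when "name" is missing
    const text = req.body && req.body.name;
    if (!text) {
      res.status(400).send({error: 'Expected a JSON body with a "name" property'});
      return;
    }
    res.send({text: text});
  });
});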
I have a system in place with a Node.js app:
app.post('/action', (req, res) => {
  ...
  const option = req.body.option
  ...
  switch (option) {
    case 'getinfo':
      objectToSend = {"foo": "bar"}
      // i tried using
      res.json(objectToSend)
      // and
      res.send(JSON.stringify(objectToSend))
      // but neither got me anywhere
      break
  }
And a website that sends a POST request using fetch like this (infoModal is the function I use to display data; someone sent me the action function on Discord and I've been using it since, but I've never had to do anything with the response before):
let action = async (i) => {
  res = await fetch("/action", {
    headers: {
      'Accept': 'application/json',
      'Content-Type': 'application/json'
    },
    method: "POST",
    body: JSON.stringify(i)
  })
  return await res.json
}

action({
  option: 'getinfo'
}).then(j => {
  infoModal(j.foo, 'ok')
})
I can't really tell whether to fix the backend or the frontend, since both have to work for me to confirm anything works...
EDIT:
These are my requires, uses and sets:
require('dotenv').config()
...
const express = require('express')
const path = require('path')
var bodyParser = require('body-parser')
let ejs = require('ejs')
const fs = require('fs')
var cookieParser = require('cookie-parser')
var colors = require('colors')
const app = express()
app.use(bodyParser.json())
app.use(cookieParser())
app.set('views', path.join(__dirname, 'frontend'))
app.set('view engine', 'ejs')
One obvious mistake is not executing the json() method of the Fetch response. And, although harmless, the second await is not really necessary: async functions wrap whatever is returned in a promise anyway.
return res.json();
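Put together, the helper might look like this (a sketch of your own snippet with only that call fixed):

let action = async (i) => {
  const res = await fetch("/action", {
    headers: {
      'Accept': 'application/json',
      'Content-Type': 'application/json'
    },
    method: "POST",
    body: JSON.stringify(i)
  })
  //json() is a method and returns a promise, so call it and return that promise
  return res.json()
}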
If that doesn't work:
See what your developer console says. It should give you a lot of information about the request. If there is an error, follow the information (response code, error message, etc.) and try to determine the problem.
Use a REST client such as Postman to verify your backend first. Once you know that it responds well to a proper request, you can try your front end with confidence and get a better understanding of how the response should be handled.
So I'm trying to learn Node.js, but something weird is happening. When I try to make a GET or a POST request, it keeps requesting infinitely on localhost. I tested with a simple piece of code just requesting a simple Hello World, but it still doesn't work. It was working perfectly yesterday.
I tested Insomnia, Postman and the browser. If someone could help me it would be very nice, because I'm really stuck here... (screenshot of the infinite Insomnia request)
const {json} = require('express');
const express = require('express');
const {uuid} = require('uuidv4');

const app = express();

app.use(express,json);

const projects = [];

app.get('/projects', (request, response) => {
  return response.json(projects);
});

app.post('/projects', (request, response) => {
  const {title, owner} = request.body;
  const project = {id: uuid(), title, owner};
  projects.push(project);
  return response.json(project);
});

app.listen(3333, () => {
  console.log('Working 👏👏')
});
It's just two little mistakes. Keep in mind that express.json() is a method, so you need to call it like this:
app.use(express.json())
You are using a comma instead of a dot. Alternatively, since you have already destructured the json() method, you do not need to prefix it with express; it would look like this:
app.use(json())
On the other hand, you probably have an unwanted result in the post request since you send the project variable instead of projects. Be sure that is what you want.
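If it helps, once app.use(express.json()) is in place you can quickly check the POST route from the command line (assuming the server is still listening on port 3333 as in your snippet):

curl -X POST "http://localhost:3333/projects" -H "Content-Type: application/json" --data '{"title":"My project","owner":"me"}'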
Hello guys, I need an answer to a simple question. I am using AWS Lambda with the Serverless Framework, and I am using a MongoDB connection inside my Lambda. I have put the connection code inside my handler function and I am using connection pooling.
When I deploy my app to AWS using sls deploy and call my Lambda for the first time, the connection is established once, and subsequent Lambda API calls reuse that connection instead of creating a new one. That part is fine.
After this, I run a script (unrelated to my AWS app) to test concurrent Lambda requests. It calls the same Lambda API using the request npm module in a for loop, and in that case a new connection is created on every call until the loop terminates, instead of reusing the one created by the first call. Can someone tell me why this happens? Why is a new connection created when this script runs, when I already created a connection on the first Lambda call?
The same API, when called from Postman, reuses the connection after the first Lambda call, but when I run the script with node app.js and call the API from inside it (using the request npm module), a new connection is created every time until the loop terminates.
Please help me out with this.
'use strict'

const bodyParser = require('body-parser')
const express = require('express')
const serverless = require('serverless-http')
const cors = require('cors');
const mongoConnection = require('./connection/mongoDb');

const app = express()
app.use(cors())
app.use(bodyParser.json())

const handler = serverless(app);
let cachedDb = null;

module.exports.handler = async (event, context) => {
  context.callbackWaitsForEmptyEventLoop = false;
  if (cachedDb == null) {
    let Database = await mongoConnection();
    console.log("DB", Database);
    cachedDb = Database
  }
  const baseRouter = require('./routes/index');
  app.use('/api', baseRouter);
  const result = await handler(event, context);
  return result;
};
Here is a node.js example that shows the connection parameters. Perhaps this will help?
const express = require("express");
const bodyParser = require("body-parser");
const app = express();
const MongoClient = require("mongodb").MongoClient;

MongoClient.connect("mongodb://myusername:mypassword@localhost:27017", (err, client) => {
  if (err) return console.log(err)
  var db = client.db("mydatabase")
  db.collection("mycollection").countDocuments(getCountCallback);
  app.listen(3000, () => {
    console.log("listening on 3000")
  })
})

function getCountCallback(err, data) {
  console.log(data);
}

app.use(bodyParser.urlencoded({extended: true}))

app.get("/", (req, res) => {
  res.sendFile(__dirname + "/index.html")
})

app.post("/quotes", (req, res) => {
  console.log(req.body)
})
Your example code does not show any hostname for your database server, nor does it specify which port to use. Please compare and contrast your code with my example.
I see you defined the cachedDb variable outside the handler scope, so that makes it available when the container is reused. However, there is no guarantee that the container will be reused (see my previous link on that), because that's not how Lambda works. If you invoke the same function many times in quick succession, Lambda needs to scale out horizontally to handle the requests quickly, and each new container gets its own connection.
When the invocation is finished, AWS will keep the container around for a while (how long depends on many factors, like function size and RAM limit). If you invoke the function again, those containers can reuse their connections. You can try invoking the function 20 times at 1-second intervals and counting the number of connections that have been opened. It will be lower than 20, but higher than 1.
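For what it's worth, the caching pattern is usually written with the connection promise and the app setup at module scope, so each warm container initializes them only once. A sketch based on your handler (mongoConnection and ./routes/index are assumed from your code; each brand-new container will still open its own connection):

'use strict'

const bodyParser = require('body-parser')
const express = require('express')
const serverless = require('serverless-http')
const cors = require('cors');
const mongoConnection = require('./connection/mongoDb');

//Built once per container, not once per invocation
const app = express()
app.use(cors())
app.use(bodyParser.json())
app.use('/api', require('./routes/index'));

const handler = serverless(app);

//Cache the promise so all invocations in the same container share one connect attempt
let connectionPromise = null;

module.exports.handler = async (event, context) => {
  context.callbackWaitsForEmptyEventLoop = false;
  if (!connectionPromise) {
    connectionPromise = mongoConnection();
  }
  await connectionPromise;
  return handler(event, context);
};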
I am currently authoring a component for users' sites which adds a number of resources to their server. For my part, I want to use Express, but the user could be using Express themselves or some other web framework.
Because I want this to just work out of the box, I was attempting to setup my express pipeline via proxying http.createServer. Injecting the pipeline this way seems to work reasonably well in cases where I handle the request, but it fails in cases where I let it fall through and execute the users callback (specifically, it says that the response headers have been sent already).
Here is the sample I am working on. Any ideas?
var http = require('http');
var express = require('express');

var setupProxy = function setupProxy() {
  var app = buildApp();
  var oldCreateServer = http.createServer;

  http.createServer = function(callback) {
    return oldCreateServer(function(req, res) {
      app.apply(this, arguments);
      if (!res.finished) {
        callback.apply(this, arguments);
      }
    });
  };
};

var buildApp = function buildApp() {
  var app = express();
  app.use('/test', function(req, res) {
    res.send('Hello World');
  });
  return app;
};
I suspect your Express handling creates a default 404 response when the request doesn't match any of your routes. That would be the cause of the "headers already sent" issue when you are not handling the request yourself but trying to pass it on.
So I think you need your own 404 handler that writes nothing to the response (i.e. does nothing), but keeps Express from handling it.
Or, another possibility would be to call the user's server callback from your express 404 handler and not elsewhere.
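One way to sketch that second option against the code in the question: pass the user's callback into buildApp and register it as a final catch-all middleware, so Express never emits its default 404.

var buildApp = function buildApp(userCallback) {
  var app = express();
  app.use('/test', function(req, res) {
    res.send('Hello World');
  });
  //Catch-all: nothing matched, so hand the untouched request to the user's server callback
  app.use(function(req, res) {
    userCallback(req, res);
  });
  return app;
};

var setupProxy = function setupProxy() {
  var oldCreateServer = http.createServer;
  http.createServer = function(callback) {
    var app = buildApp(callback);
    return oldCreateServer(function(req, res) {
      app(req, res);
    });
  };
};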