Firebase Function Error: could not handle the request + Next App - node.js

This might seem like a repeated question, but I promise I've checked every related one and none of them solves my problem.
I am deploying a Next.js app on Firebase Hosting using Cloud Functions. The app does deploy and gives me back the hosting URL. Visiting the URL, I got this 403 Forbidden error:
Your client does not have permission to get URL /nextServer/ from this server.
I fixed that by adding allUsers and allAuthenticatedUsers as Function Invoker in the function's permissions.
But then, after fixing that I got another error saying:
Error: could not handle the request
this terminated the function with a log of
Function execution took 804 ms, finished with status: 'crash'
This doesn't give any reason why the function crashed, which has made fixing it all the harder. Below is my nextServer function code:
const { https } = require("firebase-functions");
const { default: next } = require("next");

const isDev = process.env.NODE_ENV !== "production";
const server = next({
  dev: isDev,
  conf: { distDir: ".next" },
});

const nextjsHandle = server.getRequestHandler();

exports.nextServer = https.onRequest((req, res) => {
  return server.prepare().then(() => {
    return nextjsHandle(req, res);
  });
});
and this is the firebase.json:
{
  "hosting": {
    "public": "public",
    "site": "webavocat",
    "ignore": [
      "firebase.json",
      "**/.*",
      "**/node_modules/**"
    ],
    "rewrites": [
      {
        "source": "**",
        "function": "nextServer"
      }
    ]
  },
  "functions": {
    "source": ".",
    "runtime": "nodejs16",
    "ignore": [
      "**/.vscode/**",
      ".firebase/**",
      ".firebaserc",
      "firebase.json",
      "**/node_modules/**",
      "**/public/**",
      "**/.next/cache/**"
    ]
  }
}
The project folder structure is as shown below
Edit 1:
I don't really know if this is correct,

const nextjsHandle = server.getRequestHandler();

exports.nextServer = https.onRequest((req, res) => {
  try {
    return server.prepare().then(() => {
      return nextjsHandle(req, res);
    });
  } catch (error) {
    console.error(error);
  }
});

but I used a try/catch on the server function as @mdobrucki suggested. Still no log on why the function crashed, though.
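One likely reason the try/catch logs nothing: a synchronous try/catch does not catch rejections of a promise it returns, so any async error thrown in server.prepare() or the request handler escapes it silently. A minimal, self-contained sketch of the difference (plain Node, no Firebase involved):

```javascript
// A rejected promise returned from inside a try/catch is NOT caught there:
function handler() {
  try {
    return Promise.reject(new Error("boom")); // rejection escapes the try/catch
  } catch (err) {
    // never reached for asynchronous failures
    console.error("sync catch:", err.message);
    return Promise.resolve();
  }
}

// Attaching .catch() to the promise chain (or using async/await) does catch it:
handler().catch((err) => console.error("async catch:", err.message));
```

Applied to the function above, chaining `.catch((err) => { console.error(err); res.status(500).send("error"); })` onto `server.prepare().then(...)` should at least surface the crash reason in the Cloud Functions logs.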

Related

Nuxt3 and Firebase Cloud Function Hosting: How to access private key in .env file?

I have a Nuxt3 app that is using "server routes" to create backend APIs to use for the front-end.
I have the following server route:
server/api/imagekit/deleteFile.js:
import ImageKit from 'imagekit'

const imagekit = new ImageKit({
  publicKey: useRuntimeConfig().public.imagekitPublicKey,
  privateKey: useRuntimeConfig().imagekitPrivateKey,
  urlEndpoint: useRuntimeConfig().public.imagekitBaseURL
})

export default defineEventHandler(async (event) => {
  // Purge cache of file from ImageKit
  // See detailed email from Rahul @ ImageKit dated Aug 31, 2022
  const body = await useBody(event)
  const response = await imagekit.purgeCache(body.url)
  return response
})
The above code works fine locally, but once I deploy to Firebase Hosting, I get the following server error when trying to access the API deleteFile:
description: ""
message: "Missing privateKey during ImageKit initialization"
statusCode: 500
statusMessage: "Internal Server Error"
url: "/api/imagekit/deleteFile"
In case it's relevant to this question, here is my code for Nuxt's nuxt.config.ts file where the runtimeConfig property is listed:
runtimeConfig: {
  imagekitPrivateKey: '',
  public: {
    baseURL: process.env.NODE_ENV === 'production' ? 'https://example.com' : 'http://localhost:3000',
    imagekitBaseURL: 'https://ik.imagekit.io/example/',
    imagekitPublicKey: 'public_AdZM6u2+FvznG/LngYp7Ab3TJy4='
  }
}
Also, my firebase.json uses 2 codebases for the functions: one for server and one for cloud functions:
{
  "functions": [
    {
      "source": ".output/server",
      "codebase": "nuxt"
    },
    {
      "source": "functions",
      "codebase": "functions"
    }
  ],
  "hosting": [
    {
      "site": "XXX",
      "public": ".output/public",
      "ignore": ["firebase.json", "**/.*", "**/node_modules/**"],
      "cleanUrls": true,
      "rewrites": [
        {
          "source": "**",
          "function": "server"
        }
      ]
    }
  ]
}
I do have an .env file in project root that holds the imagekitPrivateKey value.
How would I provide this information to Firebase hosting deployment so ImageKit properly initializes with the private key?
You can read the variables from .env in nuxt.config.ts as shown below:
export default defineNuxtConfig({
  runtimeConfig: {
    // Uppercase preferred in .env file
    imagekitPrivateKey: process.env.IMAGEKIT_PRIVATE_KEY,
  },
});
Then you can access it in your API routes:
export default defineEventHandler((event) => {
  const { imagekitPrivateKey } = useRuntimeConfig();
  return { message: "success" };
});
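For completeness, the matching entry in the .env file at the project root would look like this (the variable name is an assumption; it just has to match what nuxt.config.ts reads):

```
# .env (project root) — keep out of version control
IMAGEKIT_PRIVATE_KEY=private_your_key_here
```

Note that the project-root .env is not deployed with the site, so you may also need to make the value available to the deployed Functions runtime (e.g. via Firebase's environment configuration).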

AZ Custom Handler No connection could be made because the target machine actively refused it

Trying to add a Custom Handler to a simple Azure Functions project. The function works OK locally in VS Code before adding it. After adding, F5 starts OK like before:
[2021-12-30T19:33:14.402Z] Startup operation 'a9550626-ee3e-1234-b254-9facc08a3890' completed.
Functions:
select: [GET,POST] http://localhost:7071/api/select
then:
For detailed output, run func with --verbose flag.
[2021-12-30T19:33:16.324Z] Waiting for HttpWorker to be initialized.
Request to: http://127.0.0.1:49774/ failing with exception message: No
connection could be made because the target machine actively refused
it. (127.0.0.1:49774)
The port is random (next time it can be 62974), and then F5 stops by itself.
Here is what's been added:
- customHandler into the root host.json.
- a new folder middleware containing app.js, myroute.js, host.json, mdw.js
root host.json:
{
  ...
  "customHandler": {
    "description": {
      "defaultExecutablePath": "node",
      "defaultWorkerPath": "middleware/app.js"
    },
    "enableForwardingHttpRequest": true
  }
}
middleware/app.js:
const express = require('express')
const app = express()

const port = 3005;
app.listen(port, () => { console.log('================ My mdw is on 3005 ================'); });

require("./myroute.js")(app);
middleware/myroute.js:
const express = require('express')
const mdw = require("./mdw");

module.exports = app => {
  app.post("/api/testmdw", mdw.mytest);
};
middleware/mdw.js:
async function mytest(req, res, next) {
  const q = req;
  return res.json({ mdw: "ok" });
}

module.exports = { mytest }
middleware/host.json:
{
  "version": "2.0",
  "extensionBundle": {
    "id": "Microsoft.Azure.Functions.ExtensionBundle",
    "version": "[2.*, 3.0.0)"
  }
}
I am following this blog post on using middleware in an Azure Function via a custom handler.
Host.json
"customHandler": {
"description": {
"defaultExecutablePath": "node",
"defaultWorkerPath": "azexpresstest/app.js"
},
"enableForwardingHttpRequest": true,
},
"extensions": {"http": {"routePrefix": ""}}
The customHandler section points to a target as defined by the defaultExecutablePath. The execution target may either be a command, executable, or file where the web server is implemented.
"customHandler": {
"description": {
"defaultExecutablePath": "app/handler.exe",
"workingDirectory": "app"
}
…
Refer here for more information

Express or Axios Error: socket hang up code: ECONNRESET

This is the first time I've posted a question here; sorry if some data is missing.
I'm trying to do some web scraping to get some info from a table.
The page only responds with an index.php, and when I use the search form it makes a POST to index.php?go=le with some formData.
To avoid the CORS problem, I'm making the POST from my own API running on localhost. I'm pointing my front-end to my API and I get the response from localhost. No problem there.
My problem appears when I try to make a second request to my API. The first GET works fine, but after that response it keeps failing. When I restart the server, it works again, but only once.
Here is my API code. I use nodemon server.js to start my server.
server.js
const express = require("express");
const axios = require("axios");
const scrape = require("scrape-it");
const FormData = require("form-data")
const cors = require("cors")

const app = express();
const PORT = process.env.PORT || 5000;

app.use(cors())

const config = {
  headers: {
    'Content-type': 'multipart/form-data'
  },
}

app.get("/get-projects", async (req, res) => {
  const testJSON = await axios.post(baseURL + "/index.php?go=le", formData, config)
    .then(res => {
      console.log("Post successful...");
      return res
    })
    .catch(err => {
      console.log("Server error");
      return err
    });
  if (testJSON && testJSON.data) {
    res.send({ status: 200, data: testJSON.data });
  } else {
    res.status(508).send({ status: 508, msg: "Unhandled Server Error", failedResponse: testJSON || "empty" })
  }
})

app.listen(PORT, () => console.log(`App running in port: ${PORT}`))
And in my front-end I only have a button with an event that makes a GET to my API (http://localhost:5000).
This is my fetch.js, which is included by a script tag. Nothing fancy there.
fetch.js
const btn = document.getElementById("btn-fetch-proyects")
const axios = window.axios

const fetchProjects = async () => {
  console.log("Fetching...")
  axios.get("http://localhost:5000/get-projects")
    .then(res =>
      console.log("The server responded with the following data: ", res.data)
    )
    .catch(err => console.log("Failed with error: ", err))
  return null
}

btn.addEventListener("click", fetchProjects);
In the console where I'm running the server, I get Server error with this err object:
{
  "message": "socket hang up",
  "name": "Error",
  "stack": "Error: socket hang up\n at connResetException (internal/errors.js:607:14)\n at Socket.socketOnEnd (_http_client.js:493:23)\n at Socket.emit (events.js:327:22)\n at endReadableNT (internal/streams/readable.js:1327:12)\n at processTicksAndRejections (internal/process/task_queues.js:80:21)",
  "config": {
    "url": "http://186.153.176.242:8095/index.php?go=le",
    "method": "post",
    "data": {
      "_overheadLength": 1216,
      "_valueLength": 3,
      "_valuesToMeasure": [],
      "writable": false,
      "readable": true,
      "dataSize": 0,
      "maxDataSize": 2097152,
      "pauseStreams": true,
      "_released": true,
      "_streams": [],
      "_currentStream": null,
      "_insideLoop": false,
      "_pendingNext": false,
      "_boundary": "--------------------------935763531826714388665103",
      "_events": {
        "error": [
          null,
          null
        ]
      },
      "_eventsCount": 1
    },
    "headers": {
      "Accept": "application/json, text/plain, */*",
      "Content-Type": "multipart/form-data",
      "User-Agent": "axios/0.21.1"
    },
    "transformRequest": [
      null
    ],
    "transformResponse": [
      null
    ],
    "timeout": 0,
    "xsrfCookieName": "XSRF-TOKEN",
    "xsrfHeaderName": "X-XSRF-TOKEN",
    "maxContentLength": -1,
    "maxBodyLength": -1
  },
  "code": "ECONNRESET"
}
I hope someone has a clue about what's happening. I tried all day and couldn't solve it.
I tried posting to other sites and it works fine, so I think the problem is with the form POST.
Thanks for reading!!!
At first glance I see an error in your front-end code: you declare the function async but then don't await the call, you use .then instead. Try not to mix styles; use either async/await or .then/.catch.
Check if that helps! :)
Obviously the socket is hanging!
Use node unirest, which closes the data stream:

var unirest = require('unirest');

var req = unirest('POST', 'localhost:3200/store/artifact/metamodel')
  .attach('file', '/home/arsene/DB.ecore')
  .field('description', 'We are trying to save the metamodel')
  .field('project', '6256d72a81c4b80ccfc1768b')
  .end(function (res) {
    if (res.error) throw new Error(res.error);
    console.log(res.raw_body);
  });
Hope this helps!

Problems connecting to socket.io using firebase functions and express

I'm trying to use a socket.io connection with Firebase hosting/functions, but I'm running into a few problems. It started off as a CORS issue (which I'm sure is still the problem), but now I'm just totally lost on what's wrong. Below are my firebase.json, index.js (Firebase function file), and even the Angular client file which initializes the connection to the server.
firebase.json
{
  "hosting": {
    "public": "public",
    "ignore": [
      "firebase.json",
      "**/.*",
      "**/node_modules/**"
    ],
    "headers": [
      {
        "source": "**",
        "headers": [{
          "key": "Access-Control-Allow-Headers",
          "value": "Origin"
        }]
      },
      {
        "source": "**",
        "headers": [{
          "key": "Access-Control-Allow-Origin",
          "value": "http://localhost:4200"
        }]
      }
    ],
    "rewrites": [
      {
        "source": "**",
        "function": "app"
      }
    ]
  },
  "functions": {
    "predeploy": [
      "npm --prefix \"$RESOURCE_DIR\" run lint"
    ],
    "source": "functions"
  }
}
index.js
const functions = require('firebase-functions');
const express = require('express');
var app = express();
const http = require('http').Server(express);
const socketio = require('socket.io')(http);
const cors = require('cors');

app.use(cors({ credentials: true, origin: '*:*' }));

app.get('/tester', (request, response) => {
  //response.header('Access-Control-Allow-Origin', '*');
  response.send('Hello!');
  socketio.on('connection', socket => {
    connections.push(socket);
    console.log('New client connected (' + connections.length + ' connections).');
    //console.log(socket);
    socket.emit('port', 'LIVE SHIT');
  });
})

exports.app = functions.https.onRequest(app)
app.component.ts (client component for connection)
import { Component } from '@angular/core';
import io from 'socket.io-client';

@Component({
  selector: 'app-root',
  templateUrl: './app.component.html',
  styleUrls: ['./app.component.scss']
})
export class AppComponent {
  title = 'multi-client';
  port: number = 0;
  socket: any;
  width: number = 100;
  updateVal: number = 5;

  constructor() {
    this.socket = io('http://localhost:5000/tester');
    this.socket.on('port', port => {
      this.port = port;
    })
  }

  updateWidth(val) {
    this.width += val;
  }
}
This is the error I'm getting.
I viewed a few posts on Stack Overflow and various other sites that had similar problems, but nothing seemed to work. I'm sure I'm doing it wrong, but I'm lost on what I'm missing to accomplish this. Please help!
Cloud Functions are for relatively short-lived operations with a clear end. You cannot use Cloud Functions to keep a connection to the client open.
The reason is that Cloud Functions closes the resources of your container by the time it thinks you're done. In the case of a HTTP function like yours, that means that these resources are gone when you've called response.send('Hello!');.
So what is possible is to send the response to the client once you've established a socket connection like this:
app.get('/tester', (request, response) => {
  socketio.on('connection', socket => {
    connections.push(socket);
    console.log('New client connected (' + connections.length + ' connections).');
    //console.log(socket);
    socket.emit('port', 'LIVE SHIT');
    response.send('Hello!');
  });
})
But in that case too, the connection will be closed after the call to response.send('Hello!');. And even if you never sent a response, the connection/resources would be closed after 9 minutes, since that is the maximum time a Cloud Function can run.
It all goes back to my initial statement that Cloud Functions are only for short-lived operations with a clear end moment. For long-running processes, use another platform, such as App Engine, Compute Engine, or one of the many other offerings where you can manage your own processes.
Also see:
Run a web socket on Cloud Functions for Firebase?
Google Cloud Functions with socket.io

ng serve --proxy-config proxyconfig.json not working

I'm building an app using MEAN stack. I'm using Proxy config file to make requests to the backend which is written in Node JS.
proxyconfig.json
{
  "/api/*": {
    "target": "https://localhost.com:3333",
    "secure": false,
    "changeOrigin": true,
    "pathRewrite": {
      "^/api": "https://localhost.com:3333/api"
    }
  }
}
Code in Component file
this.http.get("/api/posts", { responseType: 'text' })
  .subscribe(
    data => {
      console.log('success');
    },
    error => {
      console.log(error);
    }
  );
Code in Node JS server
app.get('/api/posts', function (req, res) {
  console.log('Posts Api Called');
  res.status(200).send({ data: 'somedata' });
});
I'm getting a 500 error when I inspect the request in Chrome. The GET handler is not getting called at all. What could be the cause?
Finally, I had made a silly mistake: the target pointed at localhost.com instead of localhost, and pathRewrite expects a replacement path, not a full URL. This worked for me:

{
  "/api": {
    "target": "https://localhost:3333/api",
    "secure": false,
    "changeOrigin": true,
    "pathRewrite": { "^/api": "" }
  }
}
