I have a Node server and use Firebase Hosting. The server uses the Node module rss-to-json for parsing an RSS feed to JSON.
When I test the server on my machine, the request works, but when deployed it doesn't. It just loads forever and then gives:
"Error 503 first byte timeout"
My code:
const functions = require("firebase-functions");
const express = require("express");
const Feed = require("rss-to-json");
const app = express();
app.get("/feed", (req, res) => {
  Feed.load("https://www.reddit.com/.rss", function(err, rss) {
    res.send(rss);
  });
});
exports.app = functions.https.onRequest(app);
Have tried:
Removing "node_modules" from "ignore" in firebase.json, but no luck.
Any ideas?
On the Spark payment plan, you can't make outgoing requests to services that aren't fully controlled by Google. If you want to do this, you'll have to upgrade your project to the Blaze plan. See the pricing page for more information.
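Note also that if Feed.load fails, the error in the code above is never handled and no response is sent, so the request hangs until it times out. A minimal sketch of guarding against that, reusing the callback signature from the question:
app.get("/feed", (req, res) => {
  Feed.load("https://www.reddit.com/.rss", function(err, rss) {
    if (err) {
      // send an error instead of leaving the request hanging
      return res.status(500).send({ error: "Could not load feed" });
    }
    res.send(rss);
  });
});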
I have created a basic app with React SPA and Node.js & Express web API using this sample
https://github.com/Azure-Samples/ms-identity-javascript-react-tutorial/tree/main/5-AccessControl/1-call-api-roles
The app is running fine locally: both front-end authentication and backend API calls are working as expected.
When I deploy the app to Azure, the front end works fine but the backend API calls do not work at all and they time out. I do not see any logs in the backend either, while locally I can see all the console.logs when the API calls are invoked.
I have tried the changes mentioned below to troubleshoot the Node.js app, and I am seeking guidance on what other settings/changes might help to get it running on Azure.
I also tried to run the backend manually on Azure in /wwwroot/server with node app.js. I see the log saying the API is listening, but API calls still time out.
The web.config.js is pointing to server/app.js.
On Azure, the React app successfully runs only on port 443, as below, and doesn't run at all when set to some other PORT:
client/package.json
...
"scripts": {
"start": "set PORT=443&& react-scripts start",
...
Locally it runs on any port when specified as below:
client/package.json
...
"scripts": {
"start": "set HTTPS=true&&set SSL_CRT_FILE=certificate.crt&&set SSL_KEY_FILE=privateKey.key&&set PORT=3000&& react-scripts start",
...
Fetch calls time out on Azure:
client/fetch.js
export const getTest = async (env) => {
  // const accessToken = await getToken();
  const headers = new Headers();
  // const bearer = `Bearer ${accessToken}`;
  // headers.append("Authorization", bearer);
  const options = {
    method: "GET",
    headers: headers,
  };
  return fetch(protectedResources.apiList.test, options)
    .then(response => response.json())
    .catch(error => console.log(error));
}
In server/app.js, locally, the API calls work both with the authentication logic in place and with it commented out, but neither works on Azure.
I also tried running Node.js on both HTTP and HTTPS. (Locally, only HTTPS works for both the client and Node, as the redirectUri is HTTPS.)
server/app.js
const path = require('path');
const express = require("express");
const cors = require("cors");
const axios = require("axios");
const PORT = process.env.PORT || 8080;
require("dotenv").config();
const app = express();
app.use(cors());
app.use(express.json());
//app.use(express.static(path.resolve(__dirname, '../client/build')));
app.get("/api/test", (req, res) => {
res.json("hi");
});
//...other bearer token code
// https.createServer(httpsOptions, app).listen(PORT, () => console.log(`Server is running on port ${PORT}`));
app.listen(PORT, () => console.log(`Server is running on port ${PORT}`));
If you are using an Azure VPS, enable the ports from the Azure user panel. Azure restricts ports even when they are enabled in the Windows firewall, so make sure the ports are enabled from the user panel/control panel as well.
After spending a couple of days trying to figure out a solution, I decided to reverse engineer the code and start from a version that worked both locally and on Azure.
Then I incrementally copied in my final code (the one with the Azure issue) and made small builds and releases so I could track what could have gone wrong.
Eventually, I got the final app working with the same code that had the deployment problem on Azure. I am not sure what the problem was with the earlier Azure deployment.
I am new to coding and I have a question about a small app that I created with AngularJS, which makes a POST request to an Express.js app.
It's a simple parentheses balance checker, you can find the whole code here:
https://github.com/OGsoundFX/parentheseschecker
The FrontEnd-Refactored folder contains the Angular.js app and the APItesting contains the express.js API.
The app is working, but I have only been using it locally. The API runs on localhost:3000 and the app on localhost:8080
But what if I want to make it public? How would I go about it? I don't really know where to start.
Where should I host a Node.js/Express.js app? I read about AWS; would that be good, or are there better services?
I have a WordPress website hosted on https://www.mddhosting.com/ but that wouldn't work, right?
My Angular app is calling the API locally at the moment, so I will probably have to change the API link that it is fetching:
ApiService.js
function ApiService($http) {
  const API = '//localhost:3000/parentheses';
  this.getUser = (entry) => {
    return $http
      .post(API, { string: entry })
      .then(function (response) {
        return response.data;
      }, function (reason) {
        // error
      });
  };
}
angular
  .module('app')
  .service('ApiService', ApiService);
In my API server.js
const http = require('http');
const app = require('./app')
const port = process.env.PORT || 3000;
const server = http.createServer(app);
server.listen(port);
I will definitely have to change API = '//localhost:3000/parentheses'; in my AngularJS app, but should I change const port = process.env.PORT || 3000; ?
I just need a little push start to help me clear some confusion.
Thanks!
Ensure you use application-level environment variables. For example, define the BASE_URL of your site separately for development and production. That way you don't have to make any configuration changes when you go live; it is a one-time process.
If you are looking for free hosting for pet projects, Heroku is good, and if you really want the site to go live for end users, you can go for an AWS EC2 instance or a Heroku paid plan; both are good.
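For illustration, a minimal sketch of the idea (API_BASE_URL is a hypothetical variable name, and reading it in the frontend assumes your build step injects environment variables):
// server.js: keep reading the port from the environment; hosts like Heroku inject PORT
const port = process.env.PORT || 3000;
server.listen(port);

// ApiService.js: derive the endpoint from an environment-specific base URL
// instead of hard-coding localhost (API_BASE_URL is hypothetical)
const API_BASE_URL = process.env.API_BASE_URL || '//localhost:3000';
const API = API_BASE_URL + '/parentheses';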
Hey, I'm trying to trace outgoing requests from an Express app, but I can't get it to work.
When I don't use the AWSXRay.captureHTTPsGlobal function, everything works fine with incoming requests: I can see my application in the "Service Map" and my incoming request traces coming in on AWS. But I want to trace outgoing requests, and as soon as I add AWSXRay.captureHTTPsGlobal, nothing works. I get no exception or anything, and my daemon doesn't print the usual "Successfully sent batch of 1 segments (0.058 seconds)".
This is my code.
var AWSXRay = require('aws-xray-sdk');
const express = require("express");
var app = express();
app.use(AWSXRay.express.openSegment('MyApp'));
AWSXRay.captureHTTPsGlobal(require('https')); // works when i comment this out
var http = require('https');
app.get('/', function (req, res) {
  http.get("https://google.com", (resp) => {
    res.send("googlefetched")
  });
  //res.send("hello world")
});
app.use(AWSXRay.express.closeSegment());
app.listen(3000, () => console.log('Example app listening on port 3000!'))
Could you share which Node runtime version your code is running on and which X-Ray SDK version you are using, so we can try to reproduce this issue on our side?
In the meantime, I would like to share a previous issue that has been fixed since v1.2.0, https://github.com/aws/aws-xray-sdk-node/issues/18, where if the response body is not consumed then the entire segment is never flushed to the daemon.
Please let me know.
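For reference, in case that issue applies here, a minimal sketch of consuming the response body in the question's route so the segment can be flushed:
app.get('/', function (req, res) {
  http.get("https://google.com", (resp) => {
    // consume the response body so the traced subsegment can complete
    let body = '';
    resp.on('data', (chunk) => { body += chunk; });
    resp.on('end', () => {
      res.send("googlefetched");
    });
  });
});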
I already have an app written with the MERN stack, with a Koa server and a prepared build version. My main Node file, which I run with the node server.js command to start the whole app, looks like this.
In every tutorial, I see that I need to add functions.https.request etc. at the beginning of the code (or at least that I'm supposed to).
How could I host my app on Firebase with the whole server side, the same way I could on Heroku?
It is possible to host a Koa app using Firebase functions; I figured it out after some Googling and analyzing.
This is a piece of code from my project, which is now hosted with Firebase functions:
const functions = require('firebase-functions');
const Koa = require('koa');
const app = new Koa();
// ... routes code here ...
// This is just for running Koa and testing on the local machine
// (config is the project's own configuration object)
const server = app.listen(config.port, () => {
  console.log(`HITMers-server is running on port ${config.port}`);
});
module.exports = server;
// This export is for Firebase functions
exports.api = functions.https.onRequest(app.callback());
You can see the docs and tutorial video for more information.
By the way, here is another example of deploying Koa to now.sh version 2.
You can actually skip the listen call entirely, and use app.callback().
This seems to make more sense than listening on a random port that never actually gets hit.
const functions = require('firebase-functions');
const Koa = require('koa');
const app = new Koa();
... // set up your koa app however you normally would (e.g. create a koa-router instance named router)
app.use(router.routes());
module.exports.api = functions.https.onRequest(app.callback());
You can run an express application using firebase hosting to serve dynamic content via firebase functions. You cannot, however, use Koa.js currently. The functions.https.onRequest requires you to pass an HTTP request handler or an express app returned from express().
Here is the relevant article from Firebase about serving dynamic content from functions.
https://firebase.google.com/docs/hosting/functions
Here is a video tutorial from Firebase on using express.
https://www.youtube.com/watch?v=LOeioOKUKI8
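For context, pointing Firebase Hosting at a function is done with a rewrite in firebase.json; a minimal sketch (the function name "app" and the "public" directory are assumptions, adjust to whatever you export):
{
  "hosting": {
    "public": "public",
    "rewrites": [
      {
        "source": "**",
        "function": "app"
      }
    ]
  }
}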
To anyone looking for Koa on Google Cloud Functions, here is my working version in TypeScript:
import Koa from 'koa';
import Router from 'koa-router';
import type { HttpFunction } from '@google-cloud/functions-framework/build/src/functions';
const app = new Koa();
const port = process.env.PORT || 3001;
const router = new Router();
router.get('/', async (ctx) => {
  ctx.body = 'Hello World!';
});
app.use(router.routes());
// For development on local
if (!isCloudFunctions()) {
  app.listen(port, () => {
    console.log(`Server running on port ${port}`);
  });
}
export const helloWorldApi: HttpFunction = app.callback();
function isCloudFunctions() {
  return !!process.env.FUNCTION_SIGNATURE_TYPE;
}
For deployment:
gcloud functions deploy test-koa-function --entry-point=helloWorldApi --runtime nodejs16 --trigger-http --allow-unauthenticated
You can't deploy and run an arbitrary node app on Cloud Functions. You have to make use of the different types of triggers that are defined by the product.
See the Cloud Functions for Firebase main page to see the list.
Cloud Firestore Triggers
Realtime Database Triggers
Firebase Authentication Triggers
Google Analytics for Firebase Triggers
Crashlytics Triggers
Cloud Storage Triggers
Cloud Pub/Sub Triggers
HTTP Triggers
I have a web app with a Firebase database. I would like to host the app on Firebase. My app has its own Node.js server and uses websockets. How can I host my app on Firebase? And how can I run my own server on Firebase?
I think your question is quite simple. And the answer is also simple: no, you can't.
Firebase only serves static files. You need to try Heroku, Codeship, etc. for that.
I'm not sure what exactly you are looking for. I'll assume it's one of these two:
You want to run the Node.js scripts on Firebase's servers:
There is no way to run your own code on Firebase's servers.
You want to run the Node.js scripts on your own server and have them interact with your Firebase data:
Firebase has a Node.js package that allows you to talk to its BaaS service from your own Node scripts. See the Node.js section in Firebase's quickstart and the npm package for Firebase.
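For instance, a minimal sketch of talking to the Realtime Database from your own Node script using the firebase npm package (the databaseURL is a placeholder for your own project):
const firebase = require('firebase');

firebase.initializeApp({
  databaseURL: 'https://<your-project>.firebaseio.com'
});

// read data once from the Realtime Database
firebase.database().ref('messages').once('value').then((snapshot) => {
  console.log(snapshot.val());
});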
You can use Google Cloud Functions to do most task processing in a serverless style: https://firebase.google.com/docs/hosting/functions
I'm using it to dynamically load JavaScript based on req.url.
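As a rough illustration of that pattern (a minimal sketch; the paths and payloads here are hypothetical):
const functions = require('firebase-functions');

exports.dynamic = functions.https.onRequest((req, res) => {
  // branch on the requested path and return different JavaScript per route
  res.set('Content-Type', 'application/javascript');
  if (req.url.startsWith('/widgets')) {
    res.send('console.log("widgets bundle");'); // hypothetical payload
  } else {
    res.send('console.log("default bundle");'); // hypothetical payload
  }
});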
With Firebase Functions, yes you can. You can watch this tutorial from Google; it's very clear and easy to follow.
Node.js apps on Firebase Hosting Crash Course - Firecasts
Firebase Hosting allows you to use Cloud Functions to perform server-side processing. This means that you can support dynamic generation of content for your Firebase Hosting site.
Documentation
Firebase Functions is the way to go.
Example:
Set up /index.js in your project with Express listening on whatever port you want, e.g. 3000:
const functions = require("firebase-functions");
const express = require("express");
const app = express()
const port = 3000
...
app.get('/', (req, res, next) => {
  // ... here your code to handle the request
})
...
app.listen(port, () => {
  console.log(`app listening at port = ${port}`)
  functions.logger.info("Application started", {structuredData: true});
})
Export your function with a reference to the Express app:
exports.api = functions.https.onRequest(app)