Firebase Hosting with my own node.js server

I have a web app with a Firebase database and I would like to host the app on Firebase. My app has its own Node.js server and uses websockets. How can I host my app on Firebase, and how can I run my own server on Firebase?

I think your question is quite simple, and the answer is also simple: no, you can't.
Firebase Hosting only serves static files. You need to try Heroku, Codeship, etc. for that.

I'm not sure what exactly you are looking for. I'll assume it's one of these two:
you want to run the node.js scripts on Firebase's servers
There is no way to run your own code on Firebase's servers.
you want to run the node.js scripts on your own server and have them interact with your Firebase data
Firebase has a node.js package that allows you to talk to its backend from your own node scripts. See the node.js section in Firebase's quickstart and the npm package for Firebase.
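As a rough sketch of that second option, here is what talking to the database from a plain node script looks like with the current firebase-admin package (a newer equivalent of the package the quickstart described; the database URL and path below are placeholders):
const admin = require('firebase-admin');

// Initialize with default credentials; replace databaseURL with your own project's URL.
admin.initializeApp({
  credential: admin.credential.applicationDefault(),
  databaseURL: 'https://<your-project>.firebaseio.com'
});

// Write a value, then read it back from the Realtime Database.
const ref = admin.database().ref('messages/greeting');
ref.set('hello from node')
  .then(() => ref.once('value'))
  .then((snap) => console.log('value:', snap.val()))
  .catch((err) => console.error(err));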

You can use Google Cloud Functions to do most task processing in a serverless style: https://firebase.google.com/docs/hosting/functions
I'm using it to dynamically load JavaScript based on req.url.
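A minimal sketch of that idea (the paths and the returned snippets here are made up):
const functions = require('firebase-functions');

// Serve different JavaScript depending on the requested path.
exports.scripts = functions.https.onRequest((req, res) => {
  if (req.url.startsWith('/widget')) {
    res.type('application/javascript').send('console.log("widget bundle");');
  } else {
    res.type('application/javascript').send('console.log("default bundle");');
  }
});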

With Firebase Functions, yes you can. You can watch this tutorial from Google; it's very clear and easy to follow:
Node.js apps on Firebase Hosting Crash Course - Firecasts

Firebase Hosting allows you to use Cloud Functions to perform server-side processing. This means that you can support dynamic generation of content for your Firebase Hosting site.
Documentation

Firebase Functions is the way to go.
Example:
Set up /index.js in your project with Express listening on whatever port you want, e.g. 3000:
const express = require('express')
const functions = require('firebase-functions')

const app = express()
const port = 3000
...
app.get('/', (req, res, next) => {
  // ... here your code to handle the request
})
...
app.listen(port, () => {
  // listen() only matters for local runs; Cloud Functions invokes the exported app directly
  console.log(`app listening at port = ${port}`)
  functions.logger.info("Application started", {structuredData: true});
})
Export your function with a reference to the Express app:
exports.api = functions.https.onRequest(app)
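To have Firebase Hosting forward requests to this function, a rewrite along these lines is typically added to firebase.json (the "public" directory and the "/api/**" path are assumptions; the function name must match the export, here api):
{
  "hosting": {
    "public": "public",
    "rewrites": [
      { "source": "/api/**", "function": "api" }
    ]
  }
}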

Related

How to run a gRPC Server and an Express Server on the Same PORT in cloud run

Are there any known techniques or hacks for running both an Express server and a gRPC server on the same port? I am aware Cloud Run exposes just a single port per service instance.
So I'm literally just looking for hacks as it stands now.
Like below
const app = express();
const port = process.env.PORT;

app.listen(port, () => {});

gRPCServer.bindAsync(`0.0.0.0:${port}`, grpc.ServerCredentials.createInsecure(), () => {
  gRPCServer.start();
});
I wish to expose some routes to my users via Express and only use gRPC for internal microservice communication.
I saw https://github.com/grpc-ecosystem/grpc-gateway but there is very little documentation on how to use it with Node.js, plus I DON'T want to generate client libraries. I prefer dynamic code generation.

Node.js, Express.js, Angular.js: Hosting my own API

I am new to coding and I have a question about a small app that I created with Angular.js which makes a POST request to another Express.js app.
It's a simple parentheses balance checker, you can find the whole code here:
https://github.com/OGsoundFX/parentheseschecker
The FrontEnd-Refactored folder contains the Angular.js app and the APItesting contains the express.js API.
The app is working, but I have only been using it locally. The API runs on localhost:3000 and the app on localhost:8080
But what if I want to make it public? How would I go about it? I don't really know where to start.
Where should I host a Node.js / Express.js app? I read about AWS; would that be good, or are there better services?
I have a Wordpress website hosted on https://www.mddhosting.com/ but that wouldn't work, right?
My Angular app is calling the api locally at the moment, so I will probably have to change the API link that it is fetching:
ApiService.js
function ApiService($http) {
  API = '//localhost:3000/parentheses';
  this.getUser = (entry) => {
    return $http
      .post(API, { string: entry })
      .then(function (response) {
        return response.data;
      }, function (reason) {
        // error
      });
  };
};
angular
  .module('app')
  .service('ApiService', ApiService);
In my API server.js
const http = require('http');
const app = require('./app')
const port = process.env.PORT || 3000;
const server = http.createServer(app);
server.listen(port);
I will definitely have to change API = '//localhost:3000/parentheses'; in my AngularJS app, but should I change const port = process.env.PORT || 3000; ?
I just need a little push start to help me clear some confusion.
Thanks!
Ensure you use application-level environment variables. For example, define the BASE_URL of your site for development and production separately. That way you don't have to make any configuration changes when you go live; it is a one-time process.
And if you are looking for free hosting services for pet projects, Heroku is good, and if you really want to make the site go live for end users you may go for an AWS EC2 instance or Heroku's paid service; both are good.
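As a minimal sketch of that idea (the ENV constant and the production URL are made-up names), the AngularJS app could pick its API base URL per environment, and the service would read it instead of the hard-coded localhost address:
// environment.js (hypothetical): choose the API base URL per environment
angular.module('app').constant('ENV', {
  API_BASE_URL: window.location.hostname === 'localhost'
    ? 'http://localhost:3000'          // local Express API
    : 'https://your-api.example.com'   // replace with the deployed API host
});

// ApiService.js: build the endpoint from the constant instead of hard-coding it
function ApiService($http, ENV) {
  const API = ENV.API_BASE_URL + '/parentheses';
  this.getUser = (entry) => $http.post(API, { string: entry }).then((res) => res.data);
}
angular.module('app').service('ApiService', ApiService);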

Accessing user uploaded images in a deployed MERN app

I am a relatively new developer and I have made a personal blog app, where a user can create a post and upload an image to use as the thumbnail for that post.
So far the backend and frontend work brilliantly and I am able to get the image, store it locally in a folder on my machine, store the file path in MongoDB and then access it and display it in the UI accordingly.
Now that I'm looking to finally deploy my application I have to figure out a way to upload images to an online cloud storage or something, where I can access them from my frontend as well.
Any suggestions on a good service of this kind and if possible, something free, suitable for my small project? Or any suggestions on an alternative way of dealing with this situation?
Any advice will be greatly appreciated!
Thanks in advance!
NOTE: I plan on deploying my app with Heroku, so if you've ever dealt with this issue directly using Heroku, please share your experience.
Yes, I have several apps that do just this! Sign up for a free MongoDB Atlas account and then you can store the data on their servers and point your Express app to the connection URL. Basically, they will give you a URL like this:
mongodb+srv://your-cluster-name:a232dfjoi39034r@atlas-free-cluster-czaoo.mongodb.net/blog-app?retryWrites=true
Which you can then store in a .env file like this:
MONGODB_URL=mongodb+srv://your-cluster-name:a232dfjoi39034r@atlas-free-cluster-czaoo.mongodb.net/blog-app?retryWrites=true
And access from your app like so:
mongoose.connect(process.env.MONGODB_URL, connectionOptions)
  .catch((err) => {
    console.log('Error on initial DB connection: ', err);
  });
You'll need to load the .env file in your app using an npm package such as dotenv on the development side. For Heroku, you can use the heroku-cli
and set any environment variables for your app like so:
heroku config:set MONGODB_URL=mongodb+srv://your-cluster-name:a232dfjoi39034r@atlas-free-cluster-czaoo.mongodb.net/blog-app?retryWrites=true
Note, the development MongoDB URL could be a connection string to a local instance, such as:
MONGODB_URL=mongodb://localhost:27017/my-blog-app
And the one for heroku can be the MongoDB Atlas cluster.
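On the development side, loading that file is a single call to dotenv at the very top of the entry file (the file name here is just an example):
// server.js: load .env before anything reads process.env
require('dotenv').config();

console.log(process.env.MONGODB_URL); // now available to mongoose.connect and friends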
There are a few other config things to do for Node apps, like having a 'start' script in package.json, which for Express apps created with express-generator looks like:
"scripts": {
  "start": "node ./bin/www"
}
But for your case it may be different. It should point to whatever file is the entry point for your server.
You'll also need to set the PORT from process.env.PORT, which heroku will set on their end, and on your end you can just default to 3000:
const PORT = process.env.PORT || 3000;
app.listen(PORT);
If you have any questions, feel free to PM me or just ask here and I can elaborate further. It can be intimidating the first few times, but it's actually really easy to deploy stuff to heroku this way once you get the hang of it! You can also check out the heroku docs for deploying Node apps.
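For reference, the basic deployment flow with the heroku-cli (assuming you are logged in and the project is a git repository) is roughly:
heroku create                # creates the app and adds a "heroku" git remote
git push heroku main         # deploys; Heroku runs the "start" script (use master on older setups)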

Google Cloud Compute Engine - Nodejs not working

I created a free server on Google Cloud. I want to run Node.js on this server. I installed Node.js, installed Express, and created a project. Then I ran the project (it works).
But I cannot reach it by browsing to ip-address:3000; I get "This site can't be reached" as a result.
What could be the reason for that?
I just set one up to see how this works; here is how I set up the server, hope this helps:
var express = require('express');
var app = express();

app.get('/', (req, res) => {
  res.json('Home Page');
})

app.listen(8080);
The web applications must listen for HTTP requests on ports within the permitted range 8080 to 8084. [...] To connect to a web application running on an instance, click the Web Preview button above the Cloud Shell terminal window in the GCP Console. src: https://cloud.google.com/shell/docs/features#web_preview

Firebase functions: koa.js server how to deploy

I already have an app written with the MERN stack and a Koa server, with a production build prepared. My main node file, which I run with the node server.js command to start the whole app, looks like this.
In every tutorial I see that I need to add functions.https.onRequest etc. at the beginning of the code (or at least that seems to be expected).
How could I host my app on Firebase with the whole server side, the same as I could on Heroku?
It is possible to host a Koa app using Firebase Functions; I figured it out after some Googling and analyzing.
This is a piece of code from my project, which is now hosted with Firebase Functions:
const Koa = require('koa');
const functions = require('firebase-functions');
const app = new Koa();

// ... routes code here ...

// This is just for running Koa and testing on the local machine
// (config is the project's own configuration module holding the port)
const server = app.listen(config.port, () => {
  console.log(`HITMers-server is running on port ${config.port}`);
});
module.exports = server;

// This export is for Firebase functions
exports.api = functions.https.onRequest(app.callback());
You can see the docs and tutorial video for more information.
By the way, here is another example to deploy Koa to now.sh version 2.
You can actually skip the listen call entirely, and use app.callback().
This seems to make more sense than listening on a random port that never actually gets hit.
const functions = require('firebase-functions');
const Koa = require('koa');
const Router = require('koa-router');
const app = new Koa();
const router = new Router();
// ... set up your koa app however you normally would ...
app.use(router.routes());
module.exports.api = functions.https.onRequest(app.callback());
You can run an express application using firebase hosting to serve dynamic content via firebase functions. You cannot, however, use Koa.js currently. The functions.https.onRequest requires you to pass an HTTP request handler or an express app returned from express().
Here is the relevant article from Firebase about serving dynamic content from functions.
https://firebase.google.com/docs/hosting/functions
Here is a video tutorial from Firebase on using express.
https://www.youtube.com/watch?v=LOeioOKUKI8
To anyone looking for Koa on Google Cloud Functions, here is my working version in TypeScript:
import Koa from 'koa';
import Router from 'koa-router';
import type { HttpFunction } from '@google-cloud/functions-framework/build/src/functions';

const app = new Koa();
const port = process.env.PORT || 3001;
const router = new Router();

router.get('/', async (ctx) => {
  ctx.body = 'Hello World!';
});
app.use(router.routes());

// For development on local
if (!isCloudFunctions()) {
  app.listen(port, () => {
    console.log(`Server running on port ${port}`);
  });
}

export const helloWorldApi: HttpFunction = app.callback();

function isCloudFunctions() {
  return !!process.env.FUNCTION_SIGNATURE_TYPE;
}
For deployment:
gcloud functions deploy test-koa-function --entry-point=helloWorldApi --runtime nodejs16 --trigger-http --allow-unauthenticated
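Once deployed, the function answers plain HTTP requests; for a 1st-gen HTTP function the URL follows this pattern (region and project ID are placeholders for your own values):
curl https://REGION-PROJECT_ID.cloudfunctions.net/test-koa-function/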
You can't deploy and run an arbitrary node app on Cloud Functions. You have to make use of the different types of triggers that are defined by the product.
See the Cloud Functions for Firebase main page to see the list.
Cloud Firestore Triggers
Realtime Database Triggers
Firebase Authentication Triggers
Google Analytics for Firebase Triggers
Crashlytics Triggers
Cloud Storage Triggers
Cloud Pub/Sub Triggers
HTTP Triggers
