I am trying to deploy my little Express static app to Google Cloud. It is failing, but I cannot figure out why, and there is nothing in the logs. I deployed with the verbosity flag set to debug and found this JSON being sent to Google App Engine:
"betaSettings": {
"module_yaml_path": "app.yaml",
"vm_runtime": "nodejs"
},
"env": "flex",
"handlers": [
{
"script": {
"scriptPath": "PLACEHOLDER"
},
"urlRegex": ".*"
}
],
"runtime": "vm"
}"
The thing that bothers me is that scriptPath resolves to "PLACEHOLDER". Not sure if that makes sense. Here's my app.yaml:
runtime: nodejs
env: flex
My app.js is as follows
var express = require('express')
var app = express()
app.set('port', (process.env.PORT || 8080))
app.use(express.static('public'))
app.listen(app.get('port'))
I have a public folder which I assume is being deployed but cannot verify. Any suggestions?
This simply never worked for me. One option was to create a Docker container. Later on I decided to google for answers, and the answer was to use a Cloud Storage bucket and deploy the static output using gsutil. It works, although I am not happy that express.static does not work for me. The linked guide on the best practice for serving static files works for me; express.static does not. I also tried deploying the Node.js app on Heroku; it deployed perfectly in 5 minutes. I wish Google were as simple.
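For reference, a minimal sketch of that Cloud Storage route, assuming a bucket named my-static-site (a placeholder) and the static output in the public folder:
# create the bucket (placeholder name)
gsutil mb gs://my-static-site
# make the objects publicly readable
gsutil iam ch allUsers:objectViewer gs://my-static-site
# upload the static output
gsutil -m rsync -r public gs://my-static-site
# set the index and error pages for website-style serving
gsutil web set -m index.html -e 404.html gs://my-static-site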
Related
I created the following Express API:
const express = require("express");
const app = express();
const bodyParser = require("body-parser");
const cookieParser = require("cookie-parser");
require("dotenv/config");
//routes
const authRoute = require("./routes/auth.js");
const adminRoute = require("./routes/admin.js");
// middlewares
// convert the request body into JSON using body-parser
app.use(cookieParser());
app.use(bodyParser.json());
app.use("/", express.static("public"));
app.use("/api/auth", authRoute);
app.use("/api/admin", adminRoute);
// starting express server
// app.listen(5000, () => {
//   console.log("listening on port 5000");
// });
module.exports = {
app,
};
In the public folder I have an HTML file, with the CSS and JS inside public/static folders.
The HTML, CSS and JS in the public folder are generated by a React build.
I am trying to deploy this API as a Google Cloud Function with the following command:
gcloud functions deploy outreach-dashboard --entry-point app --runtime nodejs10 --trigger-http --allow-unauthenticated
The function gets deployed, but the problem is that when I look at the function on the GCP dashboard, its source does not contain the public folder, and if I download the source as a zip I can see the public folder there, but it is empty.
I need the public folder to be deployed so I can serve it using
express.static("public")
You are trying to serve several endpoints from the same Cloud Function. I have seen some hacks on Stack Overflow where folks bend the framework to achieve this; it's not my recommendation.
Cloud Run is a very similar platform: the same underlying infrastructure and a feature set that is very close (I wrote an article on this). But you serve a containerized web server, which is more suitable for your use case.
You can easily give it a try:
Uncomment your "starting express server" part (see the sketch just after these steps).
Test locally that it works.
Then run this command:
gcloud beta run deploy outreach-dashboard --source=. --platform=managed --region=us-central1 --allow-unauthenticated
change the region if needed
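A minimal sketch of the uncommented server start, assuming the usual Cloud Run convention of reading the port from the PORT environment variable (with 8080 as a local fallback):
// Cloud Run tells the container which port to listen on via the PORT env var
const port = process.env.PORT || 8080;
app.listen(port, () => {
  console.log(`listening on port ${port}`);
});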
The command:
uploads the sources (take care with your .gitignore and .gcloudignore files to be sure all the files are uploaded),
creates a container from your sources (to achieve this, Buildpacks.io is used, exactly the same process as with Cloud Functions and App Engine),
deploys the container on Cloud Run.
If the problem persists, there may be an issue with the automatic container build process (maybe some files are automatically discarded). In this case, you can write a very simple Dockerfile, similar to the one in the getting started documentation.
This time, you can create and deploy in 2 steps (and 2 commands)
# create the container with Cloud Build based on the docker file
gcloud builds submit --tag gcr.io/<PROJECT_ID>/<containerName>
# Deploy on Cloud Run
gcloud beta run deploy outreach-dashboard --image=gcr.io/<PROJECT_ID>/<containerName> --platform=managed --region=us-central1 --allow-unauthenticated
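If you go down that route, a rough sketch of such a Dockerfile, assuming the entry point is index.js and the app listens on process.env.PORT (an illustration, not the exact file from the documentation):
FROM node:10-slim
WORKDIR /usr/src/app
# install production dependencies first to benefit from layer caching
COPY package*.json ./
RUN npm install --only=production
# copy the application sources, including the public folder
COPY . .
# start the server (assumes index.js calls app.listen on process.env.PORT)
CMD [ "node", "index.js" ]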
As it turns out, I had to remove the public folder from .gitignore,
and after that I also needed to tell GCF to treat the public folder as a static folder by creating an app.yaml file in the root folder.
Content of app.yaml:
runtime: nodejs12
handlers:
- url: /static
  static_dir: public
- url: /.*
  script: auto
I have created a Node.js application and I am trying to deploy it to Firebase for hosting. Before deploying, I am making sure it works properly using the command firebase serve --only hosting,function. This command creates the server and I am able to access the home page of the application using the URL (localhost:5000) provided by Firebase, but for some reason my AngularJS HTTP request is unable to find the Node.js controller, due to which I am getting a 404 URL not found error in the browser.
As soon as index.html is accessed via localhost:5000, I try to populate various dropdown options in my frontend. These options are provided by the Node.js controller file populator.js. Hence I make an HTTP request from AngularJS to my Node.js controller, but I get the 404 URL not found error. I am guessing there is some issue with the folder structure or the firebase.json file, due to which the code is unable to find my Node.js controller.
My firebase.json file:
{
  "hosting": {
    "public": "public",
    "ignore": [
      "firebase.json",
      "**/.*",
      "**/node_modules/**"
    ],
    "rewrites": [
      {
        "source": "**",
        "function": "app"
      }
    ]
  }
}
My index.js file:
const functions = require("firebase-functions");
const express = require('express');
const app = express();
// controller that provides the dropdown data; the require path is an assumption based on the populator.js file mentioned above
const populateFields = require("./populator");
// call the function to populate the fields
app.get('/populateFields', function(req, res){
populateFields.BusinessStep(function(data){
res.send(data);
});
});
exports.app = functions.https.onRequest(app);
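With the rewrite above, every request that does not match a file in public is routed to the app function, so the AngularJS side should call the route with a relative path. A rough sketch of such a call (hypothetical module and controller names, not code from the question):
// hypothetical AngularJS controller fetching the dropdown options
angular.module('dashboardApp')
  .controller('FieldsController', function($scope, $http) {
    // a relative path works the same under firebase serve and on the hosted site
    $http.get('/populateFields').then(function(response) {
      $scope.fields = response.data;
    }, function(error) {
      console.error('populateFields request failed', error);
    });
  });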
Here is my folder structure:
Firebase Hosting (root)
|-- functions
|   |-- index.js
|   |-- package.json
|-- public
|   |-- index.html
|-- firebase.json
|-- .firebaserc
I tried running the command firebase serve --only hosting,function both in the root directory and within the functions folder, but neither worked and I am still getting the error The requested URL was not found on this server.
I found many posts related to this issue but could not find the solution after trying the methods mentioned there, hence I am posting this question.
I was running the command wrongly, which is why I was getting that issue. The actual command is
firebase serve --only functions,hosting
If anyone is facing this issue, run the command correctly.
"I was running the command wrongly so I was getting that issue the actual command is
firebase serve --only functions,hosting"
I guess that the error was for specifying function instead of functions
I have successfully launched full stack applications to Heroku in the past by using the following within the client package.json file:
"proxy": "http://localhost:3001"
Now I am getting an "Invalid Host header" error. I did fix that error by removing the proxy and implementing a setupProxy.js file with the following code, but afterwards the app does not call the back end at all and errors out.
const { createProxyMiddleware } = require('http-proxy-middleware');
module.exports = function(app) {
app.use(
'/api',
createProxyMiddleware({
target: 'http://localhost:3001',
changeOrigin: true,
})
);
};
I'm wondering how to fix this, or whether anything changed recently in Heroku to no longer allow the proxy within the client package.json file?
It turned out to be a seemingly unrelated fix. I had to enter some environment variables within Heroku to allow the server to run. Without the variables, I believe the server would stop with errors, trickling down and causing many problems. So long story short, always remember your environment variables within Heroku.
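For reference, config vars can be set from the Heroku dashboard or the CLI; a quick sketch with placeholder names:
# set the variables the server expects (names and app name are placeholders)
heroku config:set API_KEY=... SESSION_SECRET=... -a my-app-name
# list what is currently set
heroku config -a my-app-name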
I'm trying to deploy a very basic Angular app to Elastic Beanstalk. The project was created using the Angular CLI. I have not made any changes to the files in this project.
Here are the steps I took to deploy the app
Executed 'ng build' inside the root folder of my project
Moved the @angular/cli dependency from devDependencies to dependencies in package.json
Zipped the contents of the dist folder along with package.json
Deployed the zip to AWS EB configured with the Node.js platform, node version 8.11.3, the same as my local environment.
I always end up with an 'npm install failed' error when I check eb-activity.log.
Am I missing something trivial here? Would really appreciate any help with deploying angular apps to EB.
While this does not specifically answer your question, I don't think Elastic Beanstalk is the right tool for the job. I strongly suggest hosting on a Static Website on S3, and if you want https and a custom domain name, put a CloudFront distribution in front of it.
Create an S3 bucket, e.g. www.mydomainname.com
Enable Static Website Hosting
Set the Bucket Policy to public read
{
  "Version": "2008-10-17",
  "Statement": [
    {
      "Sid": "PublicReadForGetBucketObjects",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::www.mydomainname.com/*"
    }
  ]
}
Build the angular app locally, into a dist folder.
Push the files to the website using the aws-cli
aws s3 sync dist s3://www.mydomainname.com/
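The static website hosting from the earlier step can also be enabled from the CLI; a sketch, where serving index.html for unknown routes (common for SPAs) is an assumption:
aws s3 website s3://www.mydomainname.com/ --index-document index.html --error-document index.html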
This solution will cost pennies, much less than an Elastic Beanstalk solution (EC2, EBS, ELBs). Elastic Beanstalk is great for monolithic apps, but their days are numbered, and it is the wrong paradigm when you are talking about an SPA, IMO.
I know I'm pushing my luck now, but I would also strongly recommend using the Serverless Framework to build and deploy NodeJS API endpoints for your Angular App to interact with.
Follow the steps:
-- Angular app
Create your Angular App
Build your Angular app using the ng build --prod command
This will create a dist folder like 'dist/app-folder' with HTML, JS, and CSS
The Angular app you just built won't work as a static website here; it has to run on top of a Node.js server
-- Node.js App
Create a new folder and create a Node.js project by running npm init and following the instructions
When asked for the entry point: (index.js), set it to 'server.js'
Install Express and Path using the npm install express path --save command
Create a file named 'server.js' in the project folder
Now check the package.json file for the configuration named "main"; its value should be 'server.js'
Copy the Angular dist folder into the Node.js app folder
Open the 'server.js' file and paste the code below:
const express = require('express');
const path = require('path');
const port = process.env.PORT || 3000;
const app = express();
// Serve the static files from the Angular dist folder
app.use(express.static(path.join(__dirname, 'dist/yourappfolder')));
//Any routes will be redirected to the angular app
app.get('*', function(req, res) {
res.sendFile(path.join(__dirname, 'dist/yourappfolder/index.html'));
});
// Start the server on the configured port
app.listen(port, () => {
console.log('Server started!');
console.log(port);
});
Run the Node.js project locally using the 'node server.js' command
The app should work on localhost:3000
Take the dist folder, the server.js file, and the package.json file (of the server project) and compress them into a zip. DO NOT include the "node_modules" folder.
Upload the zip to your AWS Elastic Beanstalk environment
Browse your site
Hope this is useful!
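One more detail worth checking (an assumption about the Elastic Beanstalk Node.js platform rather than something stated above): the platform typically starts the app from an app.js/server.js file or an npm start script, so an explicit start script pointing at server.js avoids surprises. A sketch of the relevant part of package.json:
{
  "name": "angular-eb-app",
  "main": "server.js",
  "scripts": {
    "start": "node server.js"
  },
  "dependencies": {
    "express": "^4.16.0"
  }
}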
Got the deployment issue resolved! I used Express to create a server and serve my Angular app. I needed to add server.js to my dist/ folder. My server.js file looked like this:
const express = require('express');
const http = require('http');
const app = express();
const port = process.env.PORT || 3001;
app.use(express.static(__dirname));
const server = http.createServer(app);
server.listen(port, ()=> console.log("Running..."));
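For packaging, something like the following from the project root produces the bundle described above (a sketch; adjust the paths to wherever server.js actually lives):
# build the Angular app, then zip the dist output together with server.js and package.json
ng build --prod
zip -r deploy.zip dist server.js package.json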
I've been working with the Node.js Google App Engine for some months and have always successfully used the express.static solution to access static files in the public folder when I deployed my Node.js app.
For some (to me not so obvious) reason, I have lately struggled to get this working in the Google flexible production environment. On my local development environment everything is fine.
In order to narrow down the problem, I created a very basic test app, listed here:
'use strict'
const express = require('express')
const app = express()
const path = require('path')
const os = require('os')
const PORT = process.env.PORT || 8080
const ENV = process.env.NODE_ENV
//app.use(express.static('public'))
//app.use(express.static(path.resolve(__dirname, 'public')))
app.use(express.static(path.join(__dirname, 'public')))
app.listen(PORT, () => {
console.log(`SYSTEM: App listening on port ${PORT}`)
console.log(`SYSTEM: Press Ctrl+C to quit.`)
})
app.get('/', (req,res) => {
res.status(200).send('\
<h1>TEST app.use(express.static("public")) in Google Cloud Flexibel App Engine environment </h1>\
<hr/>\
<h4>YAML settings: runtime: nodejs env: flex</h4>\
<h4>HOST : '+`${os.hostname()}`+'</h4>\
<h4>PORT : '+`${PORT}`+'</h4>\
<h4>__dirname : '+`${__dirname}`+'</h4>\
<h4>mountpath : '+`${app.mountpath}`+'</h4>\
<h4>env : '+`${ENV}`+'</h4>\
<h4>path resolved: '+`${path.resolve(__dirname, 'public')}`+'</h4>\
<h4>path joined : '+`${path.join(__dirname, 'public')}`+'</h4>\
<hr/>\
<h2>If you see me <img src="./HB.png"> you can access "./HB.png" in the "public" directory.</h2>\
<h2>If you see me <img src="/HB.png"> you can access "/HB.png" in the "public" directory.</h2>\
<h2>If you see me <img src="HB.png"> you can access "HB.png" in the "public" directory.</h2>\
<hr/>\
')
})
I tried various settings for express.static (see the ones commented out). However, each time after deploying to Google production using
gcloud app deploy
I get a 404 (also in the Google logs). On my local development environment everything is fine.
Does anyone have a clue ? Thanks in advance !
Strange, I solved it by reinstalling the Google Cloud SDK.