Docker not exposing my application to run locally - node.js

I have an Express application which looks as follows:
const express = require("express");

const PORT = process.env.PORT || 3001;
const app = express();

app.all("*", (req, res) =>
  res.status(200).json({
    status: 200,
    message: "Hello world from Docker.",
  })
);

app.listen(PORT, "127.0.0.1", () =>
  console.log("The server is running on port: %s", PORT)
);
I want to build an image based on this simple Express application, so my Dockerfile looks as follows:
FROM node:18-alpine
WORKDIR /app
COPY package*.json ./
RUN npm install
COPY . .
EXPOSE 3001
CMD ["npm", "start"]
.dockerignore looks as follows:
node_modules
This is how I built my image:
docker build -t my-app:1.0 .
Now if I start a container locally as follows:
docker run -p 3001:3001 my-app:1.0
And if I visit http://localhost:3001 in my web browser, nothing shows up.
OS: Windows 10
What am I missing here?

Your server is bound to 127.0.0.1, which is the container's own loopback interface, so connections forwarded from the host through the published port never reach it. You can expose the app for external connections by binding to all interfaces:
app.listen(PORT, "0.0.0.0", () =>
console.log("The server is running on port: %s", PORT)
);
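Alternatively, you can omit the host argument entirely; Node's server.listen accepts connections on all interfaces when no host is given, so this sketch is equivalent:

app.listen(PORT, () =>
  console.log("The server is running on port: %s", PORT)
);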

Related

docker build creates two images instead of one with the two components

I want to build a simple web server to get familiar with Docker Compose, but two images get built and the web server doesn't run.
app.js:
'use strict';

const cors = require('cors');
const { request } = require('express');
const express = require('express');
const { flaschenpost } = request('flaschenpost');
const http = require('http');

const logger = flaschenpost.getLogger();
const api = express();

api.use(cors());

api.get('/', (req, res) => {
  res.json({
    now: Date.now()
  });
});

const server = http.createServer(api);
const port = 3_000;

server.listen(post, () => {
  logger.info('Server started.', { port });
});
my Dockerfile looks like this:
FROM node:16.13.0-alpine
USER node
WORKDIR /home/node
COPY --chown=node:node ./package.json ./package.json
COPY --chown=node:node ./package-lock.json ./package-lock.json
RUN npm install --production
COPY --chown=node:node . .
CMD [ "node", "app.js"]
sudo docker build -t api
sudo docker run -d --init -p 3000:3000 --name api api
This is the output after running docker images:
REPOSITORY   TAG              IMAGE ID       CREATED         SIZE
api          latest           f164f3da6ad2   4 minutes ago   127MB
node         16.13.0-alpine   44e24535dfbf   12 months ago   110MB
curl http://localhost:3000
outputs this:
curl: (7) Failed to connect to localhost port 3000 after 0 ms: Connection refused
Seeing two images is expected: node:16.13.0-alpine is the base image pulled during the build, and api is the image you built from it. Something else is wrong, though. Try this Dockerfile:
FROM node:17
ENV NODE_ENV=production
WORKDIR /app
COPY ["package.json", "package-lock.json*", "./"]
RUN npm install --production
COPY . .
CMD [ "node", "index.js" ]
Now build it (note the trailing dot, the build context, which was missing from your docker build command):
docker build -t api .
Inspect the image to see if you get what you really want:
docker image inspect api:latest
[
  {
    "Id": "sha256:7bbad5c790f4583e19b1dcf3b3c1aeb0a45fc46ad997d54222a51f1867e8789b",
    "RepoTags": [
      "api:latest"
    ],
    "RepoDigests": [],
    "Parent": "sha256:e58f7a2e8bb5c548475104eb813a6f12b8f610aa939b69de92d349116c2d9a00",
    "Comment": "",
    "Created": "2022-11-14T09:51:03.762665424Z",
    ...

Express routes don't return data within Docker container

When the app starts, the home page immediately makes a request to my googleRoute to retrieve some review data.
When run locally and visiting localhost:3001, the app starts up and displays the data fine.
When run via Docker and visiting localhost:3001, the app starts up but the data is 'undefined', as if the route never returned any data.
Below is my code.
Express App Index.js:
const express = require('express')
const awsRouter = require('./routes/aws-route')
const googleRouter = require('./routes/google-route')
const dotenv = require('dotenv');
const path = require('path');

const PORT = process.env.PORT || 3001;
const app = express();

// middleware
app.use(express.json());
app.use(express.urlencoded({ extended: true }));

// routes
app.use("/api/aws", awsRouter);
app.use("/api/google", googleRouter);

app.use(express.static(path.join(__dirname, '../react-app/build')));

app.get('/', (req, res) => {
  res.sendFile(path.join(__dirname, '../react-app/build/index.html'));
});

app.listen(PORT, () => {
  console.log(`Server listening on ${PORT}`);
});
My Dockerfile:
# pull official base image
FROM node:13.12.0-alpine AS ui-build
# set working directory
WORKDIR /app
COPY react-app/ ./react-app
RUN cd react-app && npm install && npm run build
FROM node:13.12.0-alpine AS server-build
WORKDIR /root/
COPY --from=ui-build /app/react-app/build ./react-app/build
COPY express-app/package*.json ./express-app/
COPY express-app/index.js ./express-app/
COPY express-app/routes ./express-app/routes
RUN cd express-app && npm install
EXPOSE 3001
CMD [ "node", "./express-app/index.js" ]
After adding log statements to my routes, I noticed that I could reach the routes just fine, but my environment variables were undefined.
To fix this I used the --env-file flag to reference my environment variables from my .env file, as shown below.
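A minimal sketch of that run command, assuming the image is tagged my-app (a placeholder) and the .env file sits in the current directory:

docker run --env-file ./.env -p 3001:3001 my-app

A likely related detail: the Index.js above requires dotenv but never calls dotenv.config(), so a .env file would not be loaded at runtime anyway; --env-file injects the variables at the Docker level instead, and process.env picks them up directly.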

Dockerfile not working on backend with React

I'm trying to run a React app with two Node servers, one for the front end and one for the back end, connected to a MySQL database.
I'm using Docker for the containers, and I managed to get the database and the front-end server up. However, when the back-end server is fired up, it seems like it doesn't acknowledge the Dockerfile.
node_server | npm WARN exec The following package was not found and will be installed: nodemon
node_server | Usage: nodemon [nodemon options] [script.js] [args]
node_server |
node_server | See "nodemon --help" for more.
node_server |
node_server exited with code 0
Dockerfile - client:
FROM node:latest
RUN mkdir -p /usr/src/app
WORKDIR /usr/scr/app
EXPOSE 3000
COPY package.json .
RUN npm install express body-parser nano nodemon cors
COPY . .
Dockerfile - server
FROM node:latest
RUN mkdir -p /usr/src/app
WORKDIR /usr/src/app
RUN npm init -y
RUN npm install express body-parser nano nodemon cors
EXPOSE 5000
CMD ["npx", "nodemon", "src/server.js"]
docker-compose
version: '3'
services:
  backend:
    build:
      context: ./server
      dockerfile: ./Dockerfile
    depends_on:
      - mysql
    container_name: node_server
    image: raff/node_server
    ports:
      - "5000:5000"
    volumes:
      - "./server:/usr/src/app"
  frontend:
    build:
      context: ./client
      dockerfile: ./Dockerfile
    container_name: node_client
    image: raff/node_client
    ports:
      - "3000:3000"
    volumes:
      - "./client:/usr/src/app"
  mysql:
    image: mysql:5.7.31
    container_name: db
    ports:
      - "3306:3306"
    environment:
      MYSQL_ROOT_PASSWORD: admin
      MYSQL_DATABASE: assignment
The server side is not done yet, but I don't believe it's causing this error.
Server.js
"use strict";
const path = require("path");
const express = require("express");
const app = express();
const bodyParser = require("body-parser");
app.use(bodyParser.urlencoded({ extended: true }));
app.use(express.json());
const mysql = require("mysql");
let con = mysql.createConnection({
host: "mysql",
port: "3306",
user: "root",
password: "admin",
});
const PORT = 5000;
const HOST = "0.0.0.0";
app.post("/posting", (req, res) => {
var topic = req.body.param1;
var data = req.body.param2;
sql_insertion(topic, data);
});
// Helper
const panic = (err) => console.error(err);
// Connect to database
con.connect((err) => {
if (err) {
panic(err);
}
console.log("Connected!");
con.query("CREATE DATABASE IF NOT EXISTS assignment", (err, result) => {
if (err) {
panic(err);
} else {
console.log("Database created!");
}
});
});
//select database
con.query("use assignment", (err, result) => {
if (err) {
panic(err);
}
});
// Create Table
let table =
"CREATE TABLE IF NOT EXISTS posts (ID int NOT NULL AUTO_INCREMENT, Topic varchar(255), Data varchar(255), Timestamp varchar(255), PRIMARY KEY(ID));";
con.query(table, (err) => {
if (err) {
panic(err);
} else {
console.log("Table created!");
}
});
app.get("*", (req, res) => {
res.sendFile(path.join(__dirname, "client/build" , "index.html"));
});
app.listen(PORT, HOST);
console.log("up!");
Modify this line:
CMD ["npx", "nodemon", "src/server.js"]
to:
CMD ["npx", "nodemon", "--exec", "node src/server.js"]
Putting the command in the scripts section of package.json is better, though; see the sketch below.
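A minimal sketch of that approach (the script name dev is an arbitrary choice):

package.json:
{
  "scripts": {
    "dev": "nodemon src/server.js"
  }
}

Dockerfile:
CMD ["npm", "run", "dev"]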
Your volumes: declarations are hiding everything that's in the image, including its node_modules directory. That's not normally required, and you should be able to trim the backend: container definition down to
backend:
  build: ./server          # default `dockerfile:` location
  depends_on:
    - mysql
  image: raff/node_server  # only if you plan to `docker-compose push`
  ports:
    - "5000:5000"
The image then contains a fixed copy of the application, so there's no particular need to use nodemon; just run the application directly.
FROM node:latest
WORKDIR /usr/src/app # also creates the directory
COPY package.json package-lock.json ./
RUN npm ci # do not `npm install` unmanaged packages
COPY . . # CHECK: `.dockerignore` must include `node_modules`
EXPOSE 5000
CMD ["node", "src/server.js"]
This apparently isn't a problem for your frontend application because of the typo in its WORKDIR: the image installs and runs its code in /usr/scr/app, but the bind mount is over /usr/src/app, so the actual application's /usr/scr/app/node_modules directory isn't hidden.
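If you remove the frontend volumes: as well, that typo would need fixing for the same reason; a one-line sketch of the corrected directive:

WORKDIR /usr/src/app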

Getting this error while running docker-compose up - TypeError: redis.createClient({}) is not a function

I wrote this script to count users every time they visit.
The build process successfully downloads and installs the dependencies, but when I run docker-compose up, the line redis.createClient({}) throws an error saying it is not a function.
Dockerfile:
FROM node:alpine
WORKDIR /app
COPY package.json .
RUN npm install
COPY . .
CMD ["npm", "start"]
docker-compose.yml:
version: '3'
services:
  redis-server:
    restart: always
    image: redis
  node-app:
    restart: on-failure
    build: .
    ports:
      - "4001:8081"
Application code:
const express = require('express');
const redis = require('redis');
const process = require('process');

const app = express();

const client = redis.createClient({
  host: 'redis-server',
  port: 6379
});

client.set('visits', 0);

app.get('/', (req, res) => {
  client.get('visits', (err, visits) => {
    res.send('Number of visits ' + visits);
    client.set('visits', parseInt(visits) + 1);
  });
});

app.listen(8081, () => {
  console.log('Listening on port 8081');
});
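One likely cause, assuming the unpinned npm install pulled in node-redis v4 or later: v4 moved the connection options under a socket key, requires an explicit connect() call, and replaced the callback API with promises. A sketch of the equivalent application code under that assumption:

const express = require('express');
const redis = require('redis');

const app = express();

// node-redis v4: host/port live under `socket`
const client = redis.createClient({
  socket: { host: 'redis-server', port: 6379 },
});

app.get('/', async (req, res) => {
  // v4 methods return promises instead of taking callbacks
  const visits = await client.get('visits');
  res.send('Number of visits ' + visits);
  await client.set('visits', String(parseInt(visits) + 1));
});

// the client must be connected before any command is issued
client.connect().then(async () => {
  await client.set('visits', '0');
  app.listen(8081, () => console.log('Listening on port 8081'));
});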

I am running 2 images with Docker Compose and I am having trouble hitting the localhost from my Mac. I am exposing port 3000. Am I missing something?

I am building a Node/Mongo app using Docker, and I am having trouble hitting my localhost from my host computer running macOS when I run docker-compose up. Using Postman or curl -i localhost:3000 returns nothing. I have also tried inspecting the container and connecting with that IP. What am I doing wrong? Thanks!
docker-compose.yml:
version: "2"
services:
web:
build: .
ports:
- "3000:3000"
volumes:
- .:/app
env_file:
- todoListDocker.env
links:
- mongo
mongo:
image: mongo
environment:
- MONGO_INITDB_ROOT_USERNAME=root
- MONGO_INITDB_ROOT_PASSWORD=tWwp3Fm4hZUsaLw4
volumes:
- mongo:/data/db
ports:
- "27017:27017"
env_file:
- todoListDocker.env
volumes:
mongo:
Dockerfile:
FROM node:boron
MAINTAINER Clinton Medbery <clintomed@gmail.com>
RUN ["apt-get", "update"]
RUN ["apt-get", "install", "-y", "vim"]
RUN mkdir -p /app
WORKDIR /app
COPY package.json /app
RUN npm install
COPY . /app
EXPOSE 3000
CMD ["npm", "start"]
Index.js:
const express = require('express');
const morgan = require('morgan');
const bodyParser = require('body-parser');
const mongoose = require('mongoose');

var app = express();
var router = require('./services/router');

// Use ENV variables
console.log("Connecting to Mongo");
mongoose.connect('mongodb://root:tWwp3Fm4hZUsaLw4@mongo:27017');
// mongoose.connect('mongodb://localhost:todoList/todoList');
console.log("Connected to Mongo");

app.use(morgan('combined'));
app.use(bodyParser.json());
app.use('/v1', router);

var PORT = process.env.PORT || 3000;
var HOST = process.env.HOST || '127.0.0.1';

app.get('/hello', function (req, res) {
  console.log("Hello World");
  res.send({ hello: 'Hello World!' });
});

console.log('Listening on port ', HOST, PORT);
app.listen(PORT, HOST);
Your express server is listening on localhost port 3000.
var PORT = process.env.PORT || 3000;
var HOST = process.env.HOST || '127.0.0.1';
This will bind to the container's localhost. That is independent from the Mac's localhost, and from any other container's localhost. You cannot reach it from outside the container.
You need to bind to the external interface of the container, which will let the Mac, or other containers, connect to the port. You can use the special address 0.0.0.0 for this.
var PORT = process.env.PORT || 3000;
var HOST = process.env.HOST || '0.0.0.0';
Now that the express server is reachable from the Mac, the port binding 3000:3000 will work. By default, that will be bound on all of the Mac's network interfaces, but you can limit it to the Mac's localhost if you prefer.
ports:
  - "127.0.0.1:3000:3000"
