Docker Container Not Showing Up - node.js

I am following along with this tutorial to learn how to containerize a node.js / express application I'm working on. Everything seems to be working fine, except that when I run docker ps after building and starting the container, I don't see my running container.
Node.js / express code:
const express = require("express");
const app = express();
const port = process.env.PORT || 3000;
app.get('/', (req, res) => {
res.send("Hi there")
})
app.listen(port, () => console.log(`Listening on port ${port}`))
Dockerfile
FROM node:latest
WORKDIR /untitled
COPY package.json .
RUN npm install
EXPOSE 3000
COPY . ./
CMD node server.js
My working directory is "untitled" because that's the default name that Webstorm gives my project and I haven't changed it yet.
The image build seems to have worked fine, and the container appears to start without errors, but docker ps isn't showing me any containers.
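A container that exits immediately never shows up in plain docker ps, so a useful first step (a sketch; <container-id> is whatever ID docker ps -a reports) is to list stopped containers and read their logs:
# list all containers, including ones that already exited
docker ps -a
# show why the most recent container stopped
docker logs <container-id>
If server.js crashed on startup, the logs will show the error; a common culprit is a file missing from the image.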

Related

Node Js hello world Program is not being displayed in localhost

const express = require('express')
const app = express()
const port = 3000
app.get('/', (req, res) => {
res.send('Hello World!')
})
app.listen(port, () => {
console.log(`Example app listening on port ${port}`)
})
[screenshot: CMD window output]
[screenshot: localhost in the browser]
I need to run the Node.js app successfully and see the "Hello World!" output on localhost.
That code seems to be OK.
Please try http://127.0.0.1:3000
Or try moving your project to a different folder; I see it is placed in a OneDrive folder, so maybe there is a permission issue?
I hope you also tried restarting your machine if you installed Node.js just before trying this out.
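If it still doesn't respond, a quick way to confirm whether anything is listening on the port (a sketch; the netstat variant assumes a Windows machine, which the CMD screenshot suggests):
curl http://127.0.0.1:3000
netstat -ano | findstr :3000
If netstat shows nothing bound to :3000, the server isn't actually running; look for errors in the CMD window.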

Node JS with Express leaking memory

I am new to Node JS. I have a Node JS + Express app that is causing a Docker container to repeatedly stop with exit code 137 (an out-of-memory kill) every 5-10 minutes.
The app could not be simpler: it serves up static HTML, CSS, JS, and images.
const express = require('express')
const app = express()
const port = 8080
app.use(express.static('public'))
app.get('/', (req, res) => {
res.sendFile('index.html', { root: __dirname + "/public" } );
})
app.listen(port, () => {
console.log(`web app listening at http://localhost:${port}`)
})
Directory structure
--root
  server.js
  --public
    index.html
    --css
    --js
    --img
The host is an ECS EC2 instance, a t2.small with 2 GB of memory. The container task allocates 1024 MiB of memory.
Dockerfile
# from node 12 image
FROM node:12
MAINTAINER coco
# Create app directory
WORKDIR /usr/src/app
# Install app dependencies
COPY package*.json ./
RUN npm install
# Bundle app source - copy all files/folders in current directory
COPY . .
# Specify ports
EXPOSE 8080
# Run the app
CMD [ "node", "server.js" ]
What about this Node app could be causing a memory leak? Then again, it may be Express that's leaking; I'm using express 4.17.1.
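For what it's worth, exit code 137 means the process was killed with SIGKILL (128 + 9), which is what happens when the container hits its memory limit and the kernel's OOM killer steps in; that can occur without a leak in your code, since Node may size its default heap from the host's memory rather than the container's 1024 MiB limit. One hedged mitigation (a sketch; 768 is an assumed value chosen to leave headroom below the task limit) is to cap V8's old space in the Dockerfile:
# cap the V8 heap below the ECS task limit so Node garbage-collects
# instead of growing until the OOM killer fires
CMD [ "node", "--max-old-space-size=768", "server.js" ]
To check whether memory actually grows over time, you can also log usage from the app itself:
// log resident set and heap usage every 30 seconds
setInterval(() => {
  const { rss, heapUsed } = process.memoryUsage();
  console.log(`rss=${Math.round(rss / 1048576)}MB heapUsed=${Math.round(heapUsed / 1048576)}MB`);
}, 30000);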

CRA Socket.io request returns net::ERR_CONNECTION_TIMED_OUT

Hello. I've spent some time without luck trying to understand the problem here.
I've looked through each Question on StackOverflow which seems to deal with the same problem, though nothing has worked so far.
I have a simple chat app built using Create React App and Socket.io (which runs fine on localhost), but when deployed to my Node server I'm receiving ERR_CONNECTION_TIMED_OUT errors and no response. The website itself runs fine, but when I make a call to my Socket.io server, it errors.
I'm guessing this is down to my lack of knowledge with how Node and Socket.io want to work.
Some info:
server.js
const path = require("path");
const express = require("express");
const app = express();
const http = require("http").createServer(app);
const port = 8080;
http.listen(port, () => console.log(`http: Listening on port ${port}`));
const io = require("socket.io")(http, { cookie: false });
app.use(express.static(path.join(__dirname, "build")));
app.get("/*", function (req, res) {
res.sendFile(path.join(__dirname, "build", "index.html"));
});
io.on("connection", (socket) => {
console.log("New client connected");
// Emitting a new message. Will be consumed by the client
socket.on("messages", (data) => {
socket.broadcast.emit("messages", data);
});
//A special namespace "disconnect" for when a client disconnects
socket.on("disconnect", () => console.log("Client disconnected"));
});
client.js
....
const socket =
process.env.NODE_ENV === "development"
? io("http://localhost:4001")
: io("https://my-test-site:8080");
socket.on("messages", (msgs: string[]) => {
setMessages(msgs);
});
....
docker-compose.yml
version: "X.X"
services:
  app:
    image: "my-docker-image"
    build:
      context: .
      dockerfile: Dockerfile
      args:
        DEPENDENCY: "my-deps"
    ports:
      - 8080:8080
Dockerfile
...
RUN yarn build
# run my server.js
CMD node server.js
...
UPDATE: I got around this problem by making sure my main port was only used to run Express (with Socket.io); in my setup that was port 8080. When running in the same Docker container, I don't think I needed to create and use the HTTPS version of Express's createServer.
This looks like you forgot to map the port of your Docker container. The EXPOSE statement in your Dockerfile only advertises to other containers that share a Docker network with yours that they can connect to port 4001 of your container.
The port mapping can be configured with the -p flag for docker run commands. In your case the full command would look something like this:
docker run -p 4001:4001 your_image_name
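If you start the container through the docker-compose.yml above rather than with docker run, the equivalent mapping is an extra entry under ports (a sketch; only needed if the client really connects on 4001):
ports:
  - 8080:8080
  - 4001:4001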
Also, do you have a signed certificate? Browsers will likely block the connection if they do not trust your server's certificate.
I got around this problem by keeping just one port available (in my case :8080). This port is what express/socket.io is using (originally I had two different ports, one for my site, one for express). Also, in my case, when running in the same Docker container, I didn't require the require("https").createServer(app) (https) version of the server, as http was sufficient.
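For reference, a minimal single-port server along the lines described above (a sketch of that approach; the port and build directory are taken from the question's snippets, everything else is an assumption):
const express = require("express");
const app = express();
// one plain HTTP server carries both Express and Socket.io
const http = require("http").createServer(app);
const io = require("socket.io")(http);
app.use(express.static("build"));
io.on("connection", (socket) => {
  socket.on("messages", (data) => socket.broadcast.emit("messages", data));
});
// a single port serves both the site and the websocket traffic
http.listen(8080, () => console.log("listening on 8080"));
On the client, calling io() with no URL connects back to whatever host and port served the page, which avoids hardcoding a second address.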

CRA, Node.js, nginx in Docker?

I'm starting off a new project. I currently have a structure like this, from the root folder:
/app (CRA frontend app)
/server (Node.js Express app)
Dockerfile
docker-compose.yml
My requirements are the following:
Development
Fire up Docker that creates necessary container(s)
Hot reloading for frontend React app (using CRA)
Node.js server that can serve my React app with SSR (automatically updated when editing)
Accessible via http://localhost:3000
Production
Potentially fire up Docker that creates necessary container(s)
Creates production ready version of React app
Creates production ready version of Express app
Accessible via port 80
Where I am right now is somewhere between everything. I don't know how to setup Docker the right way in order to make this whole thing work, and I don't really know how to structure my React app vs the Express app while developing. The Production part seems to be easier as soon as I know how to structure the Development part... + Nginx as a proxy for the Express app?
I currently have a Docker setup that fires up a container where hot reloading is working, etc., but I don't know how to set up the Express part so they work nicely together...?
Any help is much appreciated.
Thanks
Very broad question. Perhaps better to break it down into more direct questions. Anyway, I don't think running your dev setup in Docker is ideal. Instead build your app normally with CRA. Then deploy in Docker.
In my own projects, I have a docker container running a node server which serves the react app using SSR.
Here is the docker part. Note that your package.json should have a script named start:prod for this to work. That script then starts your app in production.
// --- Dockerfile
# Pulled from docker hub and has everything
# needed to run a node project
FROM node:alpine
ENV PORT 3000
# Navigate (cd) to the app folder in the docker container
WORKDIR /usr/src/app
# Copy all package.json / package-lock.json etc. to the root folder
# Executed on build: docker build .
COPY ./package*.json ./
RUN npm i
# copy entire project into docker container
COPY . .
# build front-end with react build scripts and store them in the build folder
RUN npm run build
EXPOSE 3000
CMD ["npm", "run", "start:prod"]
Here's the Express server that will serve the app.
// -- server.js
import express from "express";
import router from "./controller/index";
const app = express();
const port = process.env.PORT || 4000; // honor PORT from the Dockerfile (3000), falling back to 4000 locally
// Tell the app to use the routes above
app.use(router);
// start the app
app.listen(port, () => {
console.log(`express running on port ${port}`);
});
Here is the controller/index.js file you'll need to set up.
// -- controller/index.js
import express from "express";
import path from "path";
import serverRenderer from '../middleware/renderer';
const router = express.Router();
// root (/) should always serve our server rendered page
router.use('^/$', serverRenderer());
// other static resources should just be served as they are
router.use(express.static(
path.resolve(__dirname, '..', '..', 'build'),
{ maxAge: '30d' },
));
export default router;
And finally the renderer which renders the app on the server.
// -- renderer.js
import React from "react";
import { renderToString } from "react-dom/server";
import App from "../../src/App";
const path = require("path");
const fs = require("fs");
export default () => (req, res) => {
// point to html file created by CRA's build tool
const filePath = path.resolve(__dirname, "..", "..", "build", "index.html");
fs.readFile(filePath, "utf8", (error, htmlData) => {
if (error) {
console.error("error", error);
return res.status(404).end();
}
// render the app as string
const html = renderToString(<App />);
// inject rendered app into final html and send
return res.send(
htmlData
.replace('<div id="root"></div>', `<div id="root">${html}</div>`)
);
})
}
You will need bootstrap.js to inject support for certain packages.
// -- bootstrap.js
require('ignore-styles');
require('url-loader');
require('file-loader');
require('babel-register')({
ignore: [/(node_modules)/],
presets: ['es2015', 'react-app'],
plugins: [
'syntax-dynamic-import',
'dynamic-import-node'
]
});
require("./index");
You can find the details of it all here:
https://blog.mytoori.com/react-served-by-express-running-in-docker-container

Can't communicate with simple Docker Node.js web app [duplicate]

This question already has answers here:
Containerized Node server inaccessible with server.listen(port, '127.0.0.1')
(2 answers)
Closed 9 months ago.
I'm just trying to learn Node.js and Docker at the same time. I have a very simple Node.js app that listens on a port and returns a string. The Node app itself runs fine when running locally. I'm now trying to get it running in a Docker container but I can't seem to reach it.
Here's my Node app:
const http = require('http');
const hostname = '127.0.0.1';
const port = 3000;
var count = 0;
var server = http.createServer(function(req, res) {
res.writeHead(200, { 'Content-Type': 'text/plain' });
res.end("Here's the current value: " + count);
console.log('Got a request: ', req.url);
count++;
});
server.listen(port, hostname, () => {
console.log(`Server running at http://${hostname}:${port}/`);
});
My Dockerfile:
FROM node:latest
MAINTAINER Jason
ENV PORT=3000
COPY . /var/www
WORKDIR /var/www
EXPOSE $PORT
ENTRYPOINT ["node", "app.js"]
My build command:
docker build -t jason/node .
And my run command:
docker run -p 3000:3000 jason/node
The app.js file and Dockerfile live in the same directory where I'm running the commands. Doing a docker ps shows the app running but I just get a site cannot be reached error when navigating to 127.0.0.1:3000 in the browser. I've also confirmed that app.js was properly added to the image and I get the message "Server running at http://127.0.0.1:3000/" after running.
I think I'm missing something really simple, any ideas?
Omit the hostname or use '0.0.0.0' in the listen call. Binding to 127.0.0.1 inside the container means only the container's own loopback interface can reach the server, so the port you published with -p never receives any traffic. Make it:
server.listen(port, '0.0.0.0', () => { console.log(`Server running at http://0.0.0.0:${port}/`); });
If you use Docker on Windows 7/8 you most probably have a docker-machine running, in which case you would need to access it on something like 192.168.99.100, or whatever IP your docker-machine has.
To see if you are running a docker-machine just issue the command
docker-machine ls
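If a machine is listed, you can get the exact IP to browse to with (a sketch; default is the common machine name, yours may differ):
docker-machine ip default
Then open http://<that-ip>:3000 instead of 127.0.0.1:3000.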
