I'm dockerizing a Node.js and MongoDB application, but it isn't reachable from the browser.
The URL is http://0.0.0.0:3030/
Error in browser:
This site can’t be reached
The connection was reset.
Try:
Checking the connection
Checking the proxy and the firewall
ERR_CONNECTION_RESET
By executing this command:
docker-compose up --build
it generates the following output in the Ubuntu terminal:
web_1 | npm info lifecycle test@1.0.0~prestart: test@1.0.0
web_1 | npm info lifecycle test@1.0.0~start: test@1.0.0
web_1 |
web_1 | > test@1.0.0 start /usr/src/app
web_1 | > node app.js
web_1 |
web_1 | App listning on port 3030
mongoDB_1 | 2017-05-16T07:07:27.891+0000 I NETWORK [initandlisten] connection accepted from 172.19.0.3:57655 #3 (1 connection now open)
Dockerfile
FROM node:alpine
RUN mkdir -p /usr/src/app
WORKDIR /usr/src/app
RUN npm install nodemon -g
COPY package.json /usr/src/app/
RUN npm install
COPY . /usr/src/app
EXPOSE 3030
CMD [ "npm", "start" ]
docker-compose.yml
version: "2.0"
services:
web:
build: .
command: npm start
ports:
- "3030:3000"
volumes:
- .:/api
links:
- mongoDB
mongoDB:
image: mongo:3.0
ports:
- "27017:27017"
volumes:
- /srv/docker/mongodb:/var/lib/mongodb
restart: always
package.json
{
  "name": "test",
  "version": "1.0.0",
  "description": "My first node docker application",
  "main": "index.js",
  "dependencies": {
    "express": "^4.15.2",
    "mongodb": "^2.2.26"
  },
  "devDependencies": {},
  "scripts": {
    "start": "node app.js"
  },
  "author": "Muzammil",
  "license": "ISC"
}
Node.js app.js
'use strict';
const express = require('express');
const app = express();
const MongoClient = require('mongodb').MongoClient;
const port = 3030;
let DB;
MongoClient.connect("mongodb://mongoDB/testDB", (err, db) => {
  console.log(err);
  //console.log(db);
  DB = db;
});
app.get('/', (req, res) => {
  DB.collection("user").find({}, (err, result) => {
    if (err) {
      return res.json({message: "Error", error: err});
    }
    let results = [];
    result.each((err, doc) => {
      if (doc) {
        console.log(doc);
      } else {
        res.end();
      }
    })
    //res.status(200).json({message: "Successfully", code: 200, data: result});
  });
});
app.listen(port, () => {
  console.log(`App listning on port ${port}`);
});
You have - "3030:3000" in your docker-compose file, which means you're trying to bind localhost port 3030 to the container port 3000. So, either expose port 3000, or change the line to - "3030:3030"
Related
I am trying to dockerize a Next.js TypeScript app which uses Express and Apollo GraphQL.
My server/index.ts looks like this:
app.prepare().then(() => {
  const server = express.default();
  const apolloServer = new ApolloServer({
    typeDefs,
    resolvers,
  });
  server.get("*", (req, res) => {
    return handle(req, res);
  });
  apolloServer.start().then((res) => {
    console.log(res);
    const graphqlHandler = apolloServer.createHandler({ path: "/" });
    server.use("/api/graphql", graphqlHandler);
    server.listen(process.env.PORT || 3000, (err: string | void) => {
      if (err) throw err;
      console.log(
        `>>> Listening on http://localhost:${process.env.PORT || 3000}`
      );
    });
  });
});
Apollo client:
const GRAPHQL_URL = process.env.NODE_ENV == 'development' ? 'http://localhost:3000/api/graphql' : 'https://app1.com/api/graphql';
package.json:
"scripts": {
"build:next": "next build",
"build": "npm run build:next && npm run build:server",
"start": "next start",
"start:production": "node dist/index.js"
If I build with npm run build and then run npm run start:production, then after the first refresh I get the error ReferenceError: Cannot access 'data' before initialization. In this case the query request happens on the client (CSR) and not via getServerSideProps, and NODE_ENV is still "development" rather than "production".
If I build with next build and next start, then my Apollo server does not start and I get a 404 saying the GraphQL API is not found.
I am starting the app in production in a docker container:
FROM node:16
ENV PORT 3000
# Create app directory
RUN mkdir -p /usr/src/app
WORKDIR /usr/src/app
# Installing dependencies
COPY package*.json /usr/src/app/
RUN npm install
# Copying source files
COPY . /usr/src/app
# Building app
RUN npm run build
EXPOSE 3000
# Running the app
ENTRYPOINT [ "npm", "run", "start:production" ]
What am I doing wrong here?
I'm trying to run a React app with 2 Node servers, one for the front end and one for the back end, connected to a MySQL database.
I'm using Docker for the containers and I managed to get the database and the front-end server up. However, when the back-end server is fired up, it seems like it doesn't acknowledge the Dockerfile.
node_server | npm WARN exec The following package was not found and will be installed: nodemon
node_server | Usage: nodemon [nodemon options] [script.js[args]
node_server |
node_server | See "nodemon --help" for more.
node_server |
node_server exited with code 0
Dockerfile - client:
FROM node:latest
RUN mkdir -p /usr/src/app
WORKDIR /usr/scr/app
EXPOSE 3000
COPY package.json .
RUN npm install express body-parser nano nodemon cors
COPY . .
Dockerfile - server
FROM node:latest
RUN mkdir -p /usr/src/app
WORKDIR /usr/src/app
RUN npm init -y
RUN npm install express body-parser nano nodemon cors
EXPOSE 5000
CMD ["npx", "nodemon", "src/server.js"]
docker-compose
version: '3'
services:
  backend:
    build:
      context: ./server
      dockerfile: ./Dockerfile
    depends_on:
      - mysql
    container_name: node_server
    image:
      raff/node_server
    ports:
      - "5000:5000"
    volumes:
      - "./server:/usr/src/app"
  frontend:
    build:
      context: ./client
      dockerfile: ./Dockerfile
    container_name: node_client
    image:
      raff/node_client
    ports:
      - "3000:3000"
    volumes:
      - "./client:/usr/src/app"
  mysql:
    image: mysql:5.7.31
    container_name: db
    ports:
      - "3306:3306"
    environment:
      MYSQL_ROOT_PASSWORD: admin
      MYSQL_DATABASE: assignment
The server side is not done yet, but I don't believe it's causing this error.
Server.js
"use strict";
const path = require("path");
const express = require("express");
const app = express();
const bodyParser = require("body-parser");
app.use(bodyParser.urlencoded({ extended: true }));
app.use(express.json());
const mysql = require("mysql");
let con = mysql.createConnection({
host: "mysql",
port: "3306",
user: "root",
password: "admin",
});
const PORT = 5000;
const HOST = "0.0.0.0";
app.post("/posting", (req, res) => {
var topic = req.body.param1;
var data = req.body.param2;
sql_insertion(topic, data);
});
// Helper
const panic = (err) => console.error(err);
// Connect to database
con.connect((err) => {
if (err) {
panic(err);
}
console.log("Connected!");
con.query("CREATE DATABASE IF NOT EXISTS assignment", (err, result) => {
if (err) {
panic(err);
} else {
console.log("Database created!");
}
});
});
//select database
con.query("use assignment", (err, result) => {
if (err) {
panic(err);
}
});
// Create Table
let table =
"CREATE TABLE IF NOT EXISTS posts (ID int NOT NULL AUTO_INCREMENT, Topic varchar(255), Data varchar(255), Timestamp varchar(255), PRIMARY KEY(ID));";
con.query(table, (err) => {
if (err) {
panic(err);
} else {
console.log("Table created!");
}
});
app.get("*", (req, res) => {
res.sendFile(path.join(__dirname, "client/build" , "index.html"));
});
app.listen(PORT, HOST);
console.log("up!");
Modify this line:
CMD ["npx", "nodemon", "src/server.js"]
to:
CMD ["npx", "nodemon", "--exec", "node src/server.js"]
Putting the command in package.json under the scripts section is better, though.
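A minimal sketch of that package.json approach, assuming a hypothetical dev script and that nodemon is listed in the project's dependencies rather than pulled in by npx:
"scripts": {
  "dev": "nodemon src/server.js"
}
# "dev" is a hypothetical script name; the Dockerfile then runs it instead of calling npx
CMD ["npm", "run", "dev"]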
Your volumes: declarations are hiding everything that's in the image, including its node_modules directory. That's not normally required, and you should be able to trim the backend: container definition down to
backend:
  build: ./server # default `dockerfile:` location
  depends_on:
    - mysql
  image: raff/node_server # only if you plan to `docker-compose push`
  ports:
    - "5000:5000"
The image then contains a fixed copy of the application, so there's no particular need to use nodemon; just run the application directly.
FROM node:latest
WORKDIR /usr/src/app # also creates the directory
COPY package.json package-lock.json ./
RUN npm ci # do not `npm install` unmanaged packages
COPY . . # CHECK: `.dockerignore` must include `node_modules`
EXPOSE 5000
CMD ["node", "src/server.js"]
This apparently isn't a problem for your frontend application, because there's a typo in WORKDIR -- the image installs and runs its code in /usr/scr/app but the bind mount is over /usr/src/app, so the actual application's /usr/scr/app/node_modules directory isn't hidden.
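As the CHECK comment in the Dockerfile above notes, the build context should exclude node_modules; a minimal .dockerignore for the server directory (an assumption about your layout, not something shown in the question) would be just:
node_modules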
Having some problems while trying to create my first Node.js app; I'm super new to JS.
Trying to dockerize the app, like so:
docker build -t echo_app .
docker run -p 3000:3000 echo_app
The end goal is to echo user input, like so:
http://example/?name=Eyal -> Hello Eyal
http://example/ -> Hello World
The error I'm getting:
Error: Cannot find module 'express'
Require stack:
- /app/index.js
.
.
code: 'MODULE_NOT_FOUND',
requireStack: [ '/app/index.js' ]
}
The directory contains:
index.js
const express = require('express')
const log4js = require('log4js')
const app = express()
const logger = log4js.getLogger()
const echo = (req, res) => {
  logger.debug("Request: ", req)
  const input = 'name' in req.query ? req.query.input : ''
  if (input.length == 0) {
    res.send('Echo World')
  } else {
    res.send(`Echo ${input}`)
  }
}
app.get('/', (req, res) => echo(req, res))
Dockerfile
FROM mhart/alpine-node:12
WORKDIR /app
ADD . ./
ENTRYPOINT ["node", "/app/index.js"]
package.json
{
  "name": "echo",
  "version": "1.0.0",
  "description": "You talk, we talk back!",
  "main": "index.js",
  "author": "eyal",
  "license": "MIT",
  "dependencies": {
    "express": "^4.17.1",
    "js-yaml": "^3.13.1",
    "log4js": "^5.2.2",
    "saslprep": "^1.0.3"
  }
}
To get it up and running, you first need to install the Node dependencies by adding npm install to your Dockerfile, like this:
FROM mhart/alpine-node:12
WORKDIR /app
ADD . ./
RUN npm install
ENTRYPOINT ["node", "/app/index.js"]
Then you need to have your Node app listen for requests by adding
app.listen(3000, () => {
  console.log(`Example app listening at http://localhost:3000`)
})
at the bottom of index.js.
Finally, a small error in your code. req.query.input needs to be req.query.name.
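With that fix applied, the line in the handler would read:
const input = 'name' in req.query ? req.query.name : ''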
That should hopefully get you going.
It appears that my Hapi app is running in a Docker container, but I can't hit it in the browser. I thought that docker run -d -p 8080:3000 would have done it, but I guess not. I'm running boot2docker, and neither http://localhost:8080/hello nor http://192.168.99.100:8080/hello is working.
I've tried tons of variations on this as well.
This is what I see when I run docker inspect <container id>:
Server running at: http://localhost:8080
Here's my Hapi.js server:
'use strict';
const Hapi = require('hapi');
// Create a server with a host and port
const server = Hapi.server({
  host: 'localhost',
  port: 3000
});
// Add the route
server.route({
  method: 'GET',
  path: '/hello',
  handler: function (request, h) {
    return 'hello world';
  }
});
async function start() {
  try {
    await server.start();
  }
  catch (err) {
    console.log(err);
    process.exit(1);
  }
  console.log(`App running at: ${server.info.uri}/hello`);
}
start();
Here's my Dockerfile:
FROM node:8.9.3
MAINTAINER My Name <email@email.com>
ENV NODE_ENV=production
ENV PORT=3000
ENV user node
WORKDIR /var/www
COPY package.json yarn.lock ./
RUN cd /var/www && yarn
COPY . .
EXPOSE $PORT
ENTRYPOINT ["yarn", "start"]
Here's my package.json:
{
  "name": "my-app",
  "version": "1.0.0",
  "repository": "https://github.com/myname/myrepo.git",
  "author": "My Name",
  "license": "MIT",
  "private": true,
  "dependencies": {
    "hapi": "17.2.0"
  },
  "scripts": {
    "start": "node ./src/server"
  }
}
The issue is not with Docker but with how you configure the Node server.
If you bind to localhost, the server will only be reachable from within the Docker container. If you want to allow connections from the Docker host, either don't provide a hostname or use 0.0.0.0.
const server = Hapi.server({
  host: '0.0.0.0',
  port: 3000
});
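With the server bound to 0.0.0.0, the original mapping from the question should work. A quick check, assuming a hypothetical image tag my-hapi-app (not shown in the question):
# host port 8080 -> container port 3000, where the Hapi server now listens
docker run -d -p 8080:3000 my-hapi-app
Then http://localhost:8080/hello (or http://192.168.99.100:8080/hello on the boot2docker VM) should return "hello world".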