Differences in code between local project and Dockerized project break the app - node.js

I'm trying to dockerize my current pet project, which uses NodeJS (ExpressJS) as a backend, React as a frontend and PostgreSQL as a database. On both backend and frontend I use TypeScript instead of JavaScript, and I'm using Prisma as the ORM for my database. I decided on a standard three-container architecture: one for the backend, one for the database and one for the frontend app. My Dockerfiles are as follows:
Frontend's Dockerfile
FROM node:alpine
WORKDIR /usr/src/frontend
COPY package*.json ./
RUN npm install
COPY . .
EXPOSE 3000
CMD ["npm", "run", "start"]
Backend's Dockerfile
FROM node:lts
WORKDIR /usr/src/backend
COPY package*.json ./
RUN npm install
COPY . .
EXPOSE 8000
RUN npx prisma generate
CMD ["npm", "run", "dev"]
There's also a .dockerignore file in the backend folder:
node_modules/
and my docker-compose.yml looks like this:
version: '3.9'
services:
  db:
    image: 'postgres'
    ports:
      - '5432:5432'
    environment:
      POSTGRES_USER: 'postgres'
      POSTGRES_PASSWORD: 'postgres'
      POSTGRES_DB: 'hucuplant'
  server:
    build:
      context: ./backend_express
    ports:
      - "8000:8000"
    environment:
      DATABASE_URL: 'postgresql://postgres:postgres@localhost:5432/hucuplant?schema=public'
  client:
    build:
      context: ./frontend
    ports:
      - "3000:3000"
After running docker-compose up --build everything starts well, but when I try to register a new user on my site I get the following error:
Error:
hucuplant-server-1 | Invalid `prisma.user.findUnique()` invocation in
hucuplant-server-1 | /usr/src/backend/src/routes/Auth.ts:44:57
hucuplant-server-1 |
hucuplant-server-1 | 41 auth.post("/register", async (req: Request, res: Response) => {
hucuplant-server-1 | 42 const { email, username, password } = req.body;
hucuplant-server-1 | 43
hucuplant-server-1 | → 44 const usernameResult: User | null = await prisma.user.findUnique({
hucuplant-server-1 | where: {
hucuplant-server-1 | ? username?: String,
hucuplant-server-1 | ? id?: Int,
hucuplant-server-1 | ? email?: String
hucuplant-server-1 | }
hucuplant-server-1 | })
However, the existing code in my Auth.ts file at line 44 actually looks like this:
auth.post("/register", async (req: Request, res: Response) => {
  const { email, username, password } = req.body;
  const usernameResult: User | null = await prisma.user.findUnique({
    where: {
      username: username,
    },
  });
When I run my project locally everything works just fine, but in the containerized app things break and behave quite differently. What is causing this, and how do I fix it?
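One likely cause worth checking (offered as a sketch, not a verified fix for this exact setup): inside the server container, localhost refers to that container itself, so a DATABASE_URL pointing at localhost:5432 cannot reach the db service. Compose service names are resolvable hostnames on the project's default network, so the host should be db. A minimal illustration, with values taken from the compose file above:

```javascript
// Sketch: build the Prisma connection string against the Compose service
// name ("db"), not "localhost" -- inside the "server" container,
// localhost is the container itself, so Postgres is unreachable there.
function buildDatabaseUrl({ user, password, host, port, db, schema }) {
  return `postgresql://${user}:${password}@${host}:${port}/${db}?schema=${schema}`;
}

const url = buildDatabaseUrl({
  user: "postgres",
  password: "postgres",
  host: "db", // Compose service name, resolvable on the default network
  port: 5432,
  db: "hucuplant",
  schema: "public",
});
console.log(url);
// postgresql://postgres:postgres@db:5432/hucuplant?schema=public
```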

Related

my puppeteer not working on docker container

It was working fine in the plain development environment, just not in a Docker container.
How do I set up puppeteer in a Docker container?
I have looked at a lot of questions and answers, but none of them worked. I think my code is wrong.
My Env:
OS: MacOS, M1
Here is Dockerfile:
FROM node:18-alpine
WORKDIR /usr/src/app
RUN apk update && apk upgrade
RUN apk add --no-cache udev ttf-freefont chromium
ENV PUPPETEER_SKIP_CHROMIUM_DOWNLOAD=true \
    CHROME_PATH=/usr/bin/chromium-browser \
    PUPPETEER_EXECUTABLE_PATH=/usr/bin/chromium-browser
COPY package*.json ./
RUN npm install
COPY . .
RUN npm run build
EXPOSE 8120
CMD ["npm", "run", "start:dev"]
And docker-compose.yml:
version: "3.7"
services:
  node:
    container_name: node-test
    build:
      context: .
      dockerfile: Dockerfile
    restart: always
    platform: linux/amd64
    extra_hosts:
      - "host.docker.internal:host-gateway"
    networks:
      - test-net
    env_file:
      - .env
    ports:
      - "8010:8010"
    depends_on:
      - mongo
  mongo:
    image: mongo
    container_name: mongo
    restart: always
    networks:
      - test-net
    ports:
      - "27017:27017"
    environment:
      MONGO_INITDB_ROOT_USERNAME: test
      MONGO_INITDB_ROOT_PASSWORD: test
      MONGO_INITDB_DATABASE: test
volumes:
  mongo:
networks:
  test-net:
    name: test-net
    external: true
crawler.ts:
import puppeteer from 'puppeteer';

export const crawler = async (url: string) => {
  console.log(url); // <- [v] working
  const browser = await puppeteer.launch({
    ignoreHTTPSErrors: false,
    executablePath: '/usr/bin/chromium-browser',
    headless: true,
    args: [
      '--disable-dev-shm-usage',
      '--no-sandbox',
      '--disable-setuid-sandbox'
    ],
    slowMo: 30
  });
  console.log(browser); // [v] prints something.
  const page = await browser.newPage();
  console.log(page); // <- [x] not reached, prints nothing.
  await page.setViewport({ width: 1920, height: 1080 });
  await page.goto("https://www.example.com");
  await page.close();
  await browser.close();
};
The error message changes whenever I modify my code, but this is the latest one in the terminal.
node-test | ProtocolError: Protocol error (Target.createTarget): Target closed.
node-test | at /usr/src/app/node_modules/puppeteer/src/common/Connection.ts:119:16
node-test | at new Promise (<anonymous>)
node-test | at Connection.send (/usr/src/app/node_modules/puppeteer/src/common/Connection.ts:115:12)
node-test | at Browser._createPageInContext (/usr/src/app/node_modules/puppeteer/src/common/Browser.ts:525:47)
node-test | at BrowserContext.newPage (/usr/src/app/node_modules/puppeteer/src/common/Browser.ts:886:26)
node-test | at Browser.newPage (/usr/src/app/node_modules/puppeteer/src/common/Browser.ts:518:33)
node-test | at crawler (/usr/src/app/src/crawler/crawler.ts:16:30)
node-test | at processTicksAndRejections (node:internal/process/task_queues:95:5)
node-test | at async CrawlerController (/usr/src/app/src/crawler/crawler.controller.ts:10:5) {
node-test | originalMessage: ''
node-test | }
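For what it's worth, the args option in the posted crawler.ts is missing its opening bracket; Puppeteer expects an array there. A minimal sketch of well-formed launch options (the option values are taken from the question itself):

```javascript
// Sketch: a well-formed options object for puppeteer.launch().
// The "args" value must be an array -- the snippet in the question
// is missing the opening "[".
const launchOptions = {
  executablePath: "/usr/bin/chromium-browser",
  headless: true,
  args: [
    "--disable-dev-shm-usage", // avoid the tiny /dev/shm default in containers
    "--no-sandbox",            // commonly needed when Chromium runs as root in Docker
    "--disable-setuid-sandbox",
  ],
};
console.log(Array.isArray(launchOptions.args)); // true
```

Separately, `platform: linux/amd64` on an M1 host runs the container under emulation, which is a known source of flaky Chromium crashes and "Target closed" errors; building the image natively for arm64 may help.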

Why i cant access to my docker node app via browser, within container it works?

my docker-compose.yml
version: "3"
services:
  client:
    ports:
      - "3000:3000"
    restart: always
    container_name: thread_client
    build:
      context: .
      dockerfile: ./client/client.Dockerfile
    volumes:
      - ./client/src:/app/client/src
      - /app/client/node_modules
    depends_on:
      - api
  api:
    build:
      context: .
      dockerfile: ./server/server.Dockerfile
    container_name: thread_api
    restart: always
    ports:
      - "3001:3001"
      - "3002:3002"
    volumes:
      - ./server/src:/app/server/src
      - /app/server/node_modules
  pg_db:
    image: postgres:14-alpine
    container_name: thread_db
    restart: always
    environment:
      POSTGRES_PASSWORD: postgres
      POSTGRES_DB: thread
      POSTGRES_USER: postgres
    volumes:
      - pg_volume:/var/lib/postgresql/data
  adminer:
    image: adminer
    restart: always
    depends_on:
      - pg_db
    ports:
      - "9090:8080"
volumes:
  pg_volume:
client.Dockerfile
FROM node:16-alpine
WORKDIR /app
COPY .editorconfig .
COPY .eslintrc.yml .
COPY .lintstagedrc.yml .
COPY .ls-lint.yml .
COPY .npmrc .
COPY .nvmrc .
COPY .prettierrc.yml .
COPY .stylelintrc.yml .
COPY package.json .
COPY package-lock.json .
RUN npm install
COPY ./shared ./shared
RUN npm run install:shared
WORKDIR /app/client
COPY ./client/package.json .
COPY ./client/package-lock.json .
COPY ./client/.eslintrc.yml .
COPY ./client/.npmrc .
COPY ./client/.stylelintrc.yml .
COPY ./client/jsconfig.json .
COPY ./client/.env.example .env
RUN npm install
COPY ./client .
RUN npm run build
EXPOSE 3000
CMD ["npm", "run", "start"]
server.Dockerfile
FROM node:16-alpine
WORKDIR /app
COPY .editorconfig .
COPY .eslintrc.yml .
COPY .lintstagedrc.yml .
COPY .ls-lint.yml .
COPY .npmrc .
COPY .nvmrc .
COPY .prettierrc.yml .
COPY .stylelintrc.yml .
COPY package.json .
COPY package-lock.json .
RUN npm install
COPY ./shared ./shared
RUN npm run install:shared
WORKDIR /app/client
COPY ./client/package.json .
COPY ./client/package-lock.json .
COPY ./client/.eslintrc.yml .
COPY ./client/.npmrc .
COPY ./client/.stylelintrc.yml .
COPY ./client/jsconfig.json .
COPY ./client/.env.example .env
RUN npm install
COPY ./client .
RUN npm run build
WORKDIR /app/server
COPY ./server/package.json .
COPY ./server/package-lock.json .
COPY ./server/.env.example .env
RUN npm install
COPY ./server .
EXPOSE 8654
CMD ["npm", "start"]
The client app is easily accessed in the browser, but the API service is not, and I don't understand why.
server.js
import fastify from 'fastify';
import cors from '@fastify/cors';
import fastifyStatic from '@fastify/static';
import http from 'http';
import Knex from 'knex';
import { Model } from 'objection';
import qs from 'qs';
import { Server as SocketServer } from 'socket.io';

import knexConfig from '../knexfile.js';
import { initApi } from './api/api.js';
import { ENV, ExitCode } from './common/enums/enums.js';
import { socketInjector as socketInjectorPlugin } from './plugins/plugins.js';
import { auth, comment, image, post, user } from './services/services.js';
import { handlers as socketHandlers } from './socket/handlers.js';

const app = fastify({
  querystringParser: str => qs.parse(str, { comma: true })
});
const socketServer = http.Server(app);
const io = new SocketServer(socketServer, {
  cors: {
    origin: '*',
    credentials: true
  }
});

const knex = Knex(knexConfig);
Model.knex(knex);

io.on('connection', socketHandlers);

app.register(cors, {
  origin: "*"
});
app.register(socketInjectorPlugin, { io });
app.register(initApi, {
  services: {
    auth,
    comment,
    image,
    post,
    user
  },
  prefix: ENV.APP.API_PATH
});

const staticPath = new URL('../../client/build', import.meta.url);
app.register(fastifyStatic, {
  root: staticPath.pathname,
  prefix: '/'
});

app.setNotFoundHandler((req, res) => {
  res.sendFile('index.html');
});

const startServer = async () => {
  try {
    await app.listen(ENV.APP.PORT);
    console.log(`Server is listening port: ${ENV.APP.PORT}`);
  } catch (err) {
    app.log.error(err);
    process.exit(ExitCode.ERROR);
  }
};

startServer();
socketServer.listen(ENV.APP.SOCKET_PORT);
So, I have tried curl localhost:3001 inside the API container and it works, but I have no idea why the client works fine in the browser while the API doesn't.
How do I debug this to find the right solution?
UPD:
docker inspect (API service container)
"Ports": {
  "3001/tcp": [
    {
      "HostIp": "0.0.0.0",
      "HostPort": "3001"
    },
    {
      "HostIp": "::",
      "HostPort": "3001"
    }
  ],
  "3002/tcp": [
    {
      "HostIp": "0.0.0.0",
      "HostPort": "3002"
    },
    {
      "HostIp": "::",
      "HostPort": "3002"
    }
  ]
},
Looking at your comment stating:
i am trying to access to that app via browser by localhost:3001
And the ports part of your docker-compose.yaml.
ports:
  - "8654:3001"
  - "3002:3002"
You are trying to access the application on the wrong port.
With - "8654:3001" you are telling docker-compose to map port 3001 of the container to port 8654 on your host. (documentation)
Try opening http://localhost:8654 in your browser, or change 8654 to 3001 in the docker-compose.yaml.
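To restate the mapping rule with a toy helper (the helper is purely illustrative, not part of Compose): a ports entry is HOST:CONTAINER, the browser on the host connects to the left side, and the app inside the container listens on the right side.

```javascript
// Sketch: how a Compose "ports" entry splits into its two sides.
// The browser on the host uses hostPort; the app inside the
// container must listen on containerPort.
function parsePortMapping(mapping) {
  const [hostPort, containerPort] = mapping.split(":").map(Number);
  return { hostPort, containerPort };
}

console.log(parsePortMapping("8654:3001"));
// { hostPort: 8654, containerPort: 3001 } -> browse to http://localhost:8654
```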

amqplib Error: "Frame size exceeds frame max" inside docker container

I am trying to build a simple application with a backend on node.js + TS and RabbitMQ, based on Docker. There are 2 containers: a RabbitMQ container and a backend container running 2 servers, a producer and a consumer. When I try to reach the RabbitMQ server, I get the error "Frame size exceeds frame max".
The full code is:
My producer server code is:
import express from 'express';
import amqplib, { Connection, Channel, Options } from 'amqplib';

const producer = express();

const sendRabbitMq = () => {
  amqplib.connect('amqp://localhost', function(error0: any, connection: any) {
    if (error0) {
      console.log('Some error...')
      throw error0
    }
  })
}

producer.post('/send', (_req, res) => {
  sendRabbitMq();
  console.log('Done...');
  res.send("Ok")
})

export { producer };
It is imported into the main file index.ts and run from there.
Maybe I also have some bad configuration in Docker. My Dockerfile is:
FROM node:16
WORKDIR /app/backend/src
COPY *.json ./
RUN npm install
COPY . .
And my docker-compose include this code:
version: '3'
services:
  backend:
    build: ./backend
    container_name: 'backend'
    command: npm run start:dev
    restart: always
    volumes:
      - ./backend:/app/backend/src
      - ./conf/myrabbit.conf:/etc/rabbitmq/rabbitmq.config
    ports:
      - 3000:3000
    environment:
      - PRODUCER_PORT=3000
      - CONSUMER_PORT=5672
    depends_on:
      - rabbitmq
  rabbitmq:
    image: rabbitmq:3.9.13
    container_name: 'rabbitmq'
    ports:
      - 5672:5672
      - 15672:15672
    environment:
      - RABBITMQ_DEFAULT_USER=user
      - RABBITMQ_DEFAULT_PASS=user
Any help would be very much appreciated.
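Two things stand out in the snippet above, offered as a sketch rather than a verified fix: the main amqplib entry point is promise-based and takes no callback (the callback style lives in amqplib/callback_api), and inside the backend container amqp://localhost points at the backend container itself rather than the rabbitmq service. Assuming the service name and credentials from the compose file:

```javascript
// Sketch: connect via the Compose service name with the credentials from
// the compose file, instead of anonymous "amqp://localhost".
function buildAmqpUrl({ user, password, host, port = 5672 }) {
  return `amqp://${user}:${password}@${host}:${port}`;
}

const url = buildAmqpUrl({ user: "user", password: "user", host: "rabbitmq" });
console.log(url); // amqp://user:user@rabbitmq:5672

// With the promise API (no callback argument):
//   const connection = await require("amqplib").connect(url);
//   const channel = await connection.createChannel();
```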

Problem about dockerizing a NestJS app with Prisma and PostgreSQL

I am trying to build a NestJS app with Prisma and PostgreSQL. I want to use docker; however, I got an error when I sent the request to the backend.
Here is my docker file
FROM node:14 AS builder
WORKDIR /app
COPY package*.json ./
COPY prisma ./prisma/
RUN npm install
RUN npx prisma generate
COPY . .
RUN npm run build
FROM node:14
COPY --from=builder /app/node_modules ./node_modules
COPY --from=builder /app/package*.json ./
COPY --from=builder /app/dist ./dist
EXPOSE 3000
CMD [ "npm", "run", "start:prod" ]
Here is my docker-compose.yml
version: '3.8'
services:
  nest-api:
    container_name: nest-api
    build:
      context: .
      dockerfile: Dockerfile
    ports:
      - 3000:3000
    depends_on:
      - postgres
    env_file:
      - .env
  postgres:
    image: postgres:13
    container_name: postgres
    restart: always
    ports:
      - 5432:5432
    environment:
      POSTGRES_USER: postgres
      POSTGRES_PASSWORD: postgres
      POSTGRES_DB: task-management
    env_file:
      - .env
Here is my schema.prisma
// This is your Prisma schema file,
// learn more about it in the docs: https://pris.ly/d/prisma-schema

generator client {
  provider = "prisma-client-js"
}

datasource db {
  provider = "postgresql"
  url      = env("DATABASE_URL")
  //url = "postgresql://postgres:postgres@localhost:5432/task-management?schema=public"
}

model Task {
  id          Int        @id @default(autoincrement())
  title       String
  description String
  status      TaskStatus @default(OPEN)
}

enum TaskStatus {
  OPEN
  IN_PRO
  DOooNE
}
Here is the .env
# Environment variables declared in this file are automatically made available to Prisma.
# See the documentation for more detail: https://pris.ly/d/prisma-schema#using-environment-variables
# Prisma supports the native connection string format for PostgreSQL, MySQL, SQLite, SQL Server and MongoDB (Preview).
# See the documentation for all the connection string options: https://pris.ly/d/connection-strings
DATABASE_URL=postgresql://postgres:postgres@postgres:5432/task-management?schema=public
After I run the command docker-compose up, everything starts fine. However, when I send a request to the app, I get the following error:
nest-api | [Nest] 19 - 11/02/2021, 5:52:43 AM ERROR [ExceptionsHandler]
nest-api | Invalid `this.prisma.task.create()` invocation in
nest-api | /dist/tasks/tasks.service.js:29:33
nest-api |
nest-api | 26 return found;
nest-api | 27 }
nest-api | 28 async creatTask(data) {
nest-api | → 29 return this.prisma.task.create(
nest-api | The table `public.Task` does not exist in the current database.
nest-api | Error:
nest-api | Invalid `this.prisma.task.create()` invocation in
nest-api | /dist/tasks/tasks.service.js:29:33
nest-api |
nest-api | 26 return found;
nest-api | 27 }
nest-api | 28 async creatTask(data) {
nest-api | → 29 return this.prisma.task.create(
nest-api | The table `public.Task` does not exist in the current database.
nest-api | at cb (/node_modules/@prisma/client/runtime/index.js:38537:17)
nest-api | at async /node_modules/@nestjs/core/router/router-execution-context.js:46:28
nest-api | at async /node_modules/@nestjs/core/router/router-proxy.js:9:17
What changes should I make in the docker file to solve the problem?
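One likely fix, as a sketch rather than a solution tested against this exact setup: prisma generate only builds the client code; it never creates tables. The error says the Task table does not exist, so the container's database still needs the migrations applied, e.g. by running prisma migrate deploy before the app starts (this assumes the prisma/ folder with committed migrations is copied into the runtime image):

```shell
# Run committed migrations against the container database, then start the app,
# e.g. as the image's CMD: ["sh", "-c", "npx prisma migrate deploy && npm run start:prod"]
npx prisma migrate deploy && npm run start:prod
```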

(Docker-Compose) UnhandledPromiseRejectionWarning when connecting node and postgres

I am trying to connect the containers for postgres and node. Here is my setup:
yml file:
version: "3"
services:
  postgresDB:
    image: postgres:alpine
    container_name: postgresDB
    ports:
      - "5432:5432"
    environment:
      - POSTGRES_DB=myDB
      - POSTGRES_USER=postgres
      - POSTGRES_PASSWORD=Thisisngo1995!
  express-server:
    build: ./
    environment:
      - DB_SERVER=postgresDB
    links:
      - postgresDB
    ports:
      - "3000:3000"
Dockerfile:
FROM node:12
WORKDIR /usr/src/app
COPY package.json ./
RUN npm install
COPY . .
COPY ormconfig.docker.json ./ormconfig.json
EXPOSE 3000
CMD ["npm", "start"]
connect to postgres:
let { Pool, Client } = require("pg");

let postgres = new Pool({
  host: "postgresDB",
  port: 5432,
  user: "postgres",
  password: "Thisisngo1995!",
  database: "myDB",
});

module.exports = postgres;
and here is how I handled my endpoint:
exports.postgres_get_controller = (req, resp) => {
  console.log("Reached Here");
  postgres
    .query('SELECT * FROM public."People"')
    .then((results) => {
      console.log(results);
      resp.send({ allData: results.rows });
    })
    .catch((e) => console.log(e));
};
Whenever I try to hit the endpoint above, I get this error in the container:
Reasons why?
Note: I am able to have everything functioning on my local machine (without docker) simply by changing "host: localhost"
Your postgres database name and username should be the same
You can use docker-compose-wait to make sure interdependent services are launched in proper order.
See below on how to use it for your case.
Update the final part of your Dockerfile as below:
# ...
# this will be used to check if DB is up
ADD https://github.com/ufoscout/docker-compose-wait/releases/download/2.7.3/wait ./wait
RUN chmod +x ./wait
CMD ./wait && npm start
Update some parts of your docker-compose.yml as below:
express-server:
  build: ./
  environment:
    - DB_SERVER=postgresDB
    - WAIT_HOSTS=postgresDB:5432
    - WAIT_BEFORE_HOSTS=4
  links:
    - postgresDB
  depends_on:
    - postgresDB
  ports:
    - "3000:3000"
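An alternative to docker-compose-wait that needs no extra binary is to retry the first database operation in-process until Postgres is ready. A sketch, where connectFn is a hypothetical stand-in for something like () => pool.query("SELECT 1"):

```javascript
// Sketch: retry an async operation with a fixed delay, so the app tolerates
// the database container coming up slightly later than the app container.
async function withRetries(connectFn, { attempts = 5, delayMs = 1000 } = {}) {
  for (let i = 1; i <= attempts; i++) {
    try {
      return await connectFn();
    } catch (err) {
      if (i === attempts) throw err; // give up after the last attempt
      await new Promise((r) => setTimeout(r, delayMs));
    }
  }
}

// Demo with a stand-in that fails twice before succeeding:
let calls = 0;
withRetries(async () => {
  calls += 1;
  if (calls < 3) throw new Error("ECONNREFUSED");
  return "connected";
}, { attempts: 5, delayMs: 10 }).then((result) => console.log(result, calls));
// connected 3
```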