JHipster + Angular + MongoDB + Docker: beginner question - jhipster

I would like some guidance about the recommended development workflow with JHipster.
What I expected:
With one docker-compose command, I could bring up and run everything the project needs (in this case MongoDB, Kafka, the backend, etc.);
When modifying the front end, saving the modified files would trigger live sync (ng serve --watch?).
What I found:
The one-command option that I did find (docker-compose -f src/main/docker/app.yml up -d), which I guess depends on running ./mvnw package -Pprod verify jib:dockerBuild first, does not live-sync, and it does not seem to be compatible with running the front end individually via npm run start - an application started that way points to different ports for the backend's modules (?).
I have experience with Angular and MongoDB (and a little with Docker), but I'm super new to JHipster and am trying to understand what I am doing wrong.
Thanks in advance!

For development workflow, you should start the dependencies individually. The app.yml will start the app's Docker image with the prod profile, useful for testing locally before deploying.
Start Containers for Mongo and Kafka
docker-compose -f src/main/docker/mongodb.yml up -d
docker-compose -f src/main/docker/kafka.yml up -d
Start the backend
./mvnw
Start frontend live-reload
npm start
If Docker is not accessible on localhost, you may need to configure application-dev.yml to point to the Docker IP.
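A minimal sketch of what that change could look like in src/main/resources/config/application-dev.yml. The host IP and database name here are placeholders, not values from the question; adjust them to your Docker setup:

```yaml
# src/main/resources/config/application-dev.yml (fragment)
spring:
  data:
    mongodb:
      # Replace localhost with your Docker host IP if the container
      # is not reachable on localhost (e.g. older Docker Toolbox setups).
      uri: mongodb://192.168.99.100:27017
      database: myapp   # hypothetical database name
```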


How to use a docker image to generate static files to serve from nginx image?

I'm either missing something really obvious or I'm approaching this totally the wrong way, either way I could use some fresh insights.
I have the following docker images (simplified) that I link together using docker-compose:
frontend (a Vue.js app)
backend (Django app)
nginx
postgres
In development, I don't use nginx but instead the Vue.js app runs as a watcher with yarn serve and Django uses manage.py runserver.
What I would like to do for production (in CI/CD):
build and push backend image
build and push nginx image
build the frontend image with yarn build command
get the generated files in the nginx container (through a volume?)
deploy the new images
The problem is: if I put yarn build as the CMD in the Dockerfile, the compilation happens when the container is started, and I want it to be done in the build step in CI/CD.
But if I put RUN yarn build in the image, what do I put as CMD? And how do I get the generated static files to nginx?
The solutions I could find use multistage builds for the frontend that have an nginx image as the last step, combining the two. But I need the nginx image to be independent of the frontend image, so that doesn't work for me.
I feel like this is a problem that has been solved by many, yet I cannot find an example. Suggestions are much appreciated!
Create a volume in your docker-compose.yml file and mount it in both your frontend container (at the path where the built files end up, such as the dist folder) and your nginx container (at your web root path). This way both containers have access to the same files.
Also, keep your yarn build as a RUN command.
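A minimal docker-compose.yml sketch of that shared-volume approach; the service names, image names, and paths are hypothetical:

```yaml
version: "3"
services:
  frontend:
    image: my-frontend          # hypothetical image where RUN yarn build produced /app/dist
    volumes:
      - static-files:/app/dist  # expose the built files on the shared volume
  nginx:
    image: nginx
    ports:
      - "80:80"
    volumes:
      - static-files:/usr/share/nginx/html  # nginx serves from the same volume
volumes:
  static-files:
```

One caveat worth knowing: Docker populates a named volume from the image's contents only when the volume is empty, so stale files can survive image rebuilds unless you recreate the volume; the multi-stage build described in the EDIT avoids that problem.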
EDIT:
A container needs to run a program or command in order to keep running, so that it can be started, stopped, etc. That is by design.
If you are not planning to serve from the frontend container with a command of its own, then you should either remove it as a service from docker-compose.yml (since it isn't one) and add it as a build stage in your nginx Dockerfile, or use some command that runs indefinitely in the frontend container, for example tail -f index.html. The first solution is the better practice.
In your nginx Dockerfile, add the frontend build as the first build stage:
FROM node AS frontend-build
WORKDIR /app
COPY package.json yarn.lock ./
RUN yarn install
COPY . .
RUN yarn build

FROM nginx
COPY --from=frontend-build /app/dist /usr/share/nginx/html
...
CMD ["nginx", "-g", "daemon off;"]

Nuxt SSR project use 100% CPU

I am using Nuxt for server-side rendering. I finished this project, but at the last stage, when I deployed to production, Nuxt or SSR (I don't know which) used 100% CPU on the system side. Because of this, the CentOS machine stopped responding.
Do you have any suggestions about this problem? What should I look at?
I solved this problem. All I had to do was:
pm2 start npm --name server -i max -- run start
When you use
pm2 start npm --name server -- run start
Node runs on just one core, but when you add -i max, pm2 runs Node at the maximum capacity of your server.
Maybe this info is useful for someone...

How can I run docker command inside dokku node.js app?

I have a Node.js application hosted on DigitalOcean and deployed using dokku. I need to run the command docker-compose up -d inside this application to start the database and Prisma server, as mentioned in the Prisma docs.
I tried to run dokku run my-node-js-app "docker-compose up -d", but I got this error:
Could not determine a reasonable value for WEB_CONCURRENCY.
This is likely due to running the Heroku NodeJS buildpack on a non-Heroku platform.
WEB_CONCURRENCY has been set to 1. Please review whether this value is appropriate for your application.
setuidgid: fatal: unable to run docker-compose: file does not exist
I checked, I have docker-compose.yml file in my project.
I forgot to install prisma by running npm i -g prisma

Docker NodeJs for ReactJs

I'm trying to create a container for my ReactJS project with Node.js. At first it seems to be successful, but I am not able to access it through the port.
Following is my Dockerfile
package.json
Command I ran
docker build -t <your username>/node-reactjs .
docker run -p 9090:3333 -d <your username>/node-reactjs
As you can see, the container was created successfully.
But
I even went inside the container and ran curl localhost:3333, and it did return the JS file.
I tried googling around, many ways, but I can't seem to make it work. I even tried the docker-compose way, but that was even worse: I couldn't even create the container.
I would really appreciate it if someone could help me out on this.
By the way, is this the correct way to do it for ReactJS?
Thanks.
After some hard times, I finally found the way to get it to work.
Thanks to this: Connecting webpack-dev-server inside a Docker container from the host
All I needed to do was add a parameter to the start script in package.json, as follows.
Notice the
--host 0.0.0.0
That's what was missing.
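A sketch of what that start script could look like, assuming a webpack-dev-server setup listening on port 3333 (the port comes from the question's docker run mapping; the exact flags depend on your tooling):

```json
{
  "scripts": {
    "start": "webpack-dev-server --host 0.0.0.0 --port 3333"
  }
}
```

Binding to 0.0.0.0 matters because by default the dev server listens only on localhost inside the container, so connections arriving through Docker's port mapping are refused.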

How to run the seed.js, given in the generator-angular-fullstack, on OpenShift?

I'm developing an app using MEAN.js and its generator (https://github.com/DaftMonk/generator-angular-fullstack), with OpenShift as hosting.
The project template of the generator includes a script (server/config/seed.js) to populate the database with two users.
Locally, it is called automatically, but I can also run it with node server/config/seed.js (from the root app directory).
The problem is, when I deploy to OpenShift and run it, no error is reported, but the MongoDB database is not updated. The exact steps I follow to run it on OpenShift are:
Connect over ssh: ssh ....
cd app-root/runtime/repo/
node server/config/seed.js
What am I missing?
Thanks in advance.
You have a few options:
In server/config/production.js, add
seedDB: true
or change NODE_ENV to development, because during the first deployment it is set to production.
Then run:
grunt
grunt buildcontrol:openshift
It should be working now.
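A sketch of that production config change. The exact file layout depends on the version of the generator; treat the surrounding structure here as an assumption about your generated project:

```javascript
// server/config/production.js (fragment)
module.exports = {
  // ...existing production settings...

  // Run server/config/seed.js on startup to populate the database
  seedDB: true
};
```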
