Configure npm to start in the background on Mac OS X - node.js

Description
I am on a Mac OS X.
Right now, I have almost 10 Laravel/LAMP projects locally that I run using vhosts configured with Apache. The awesome part is that even when I restart my Mac, move between networks, or close the terminal app/tab for my projects, Apache is still running and all my local sites are still accessible.
Goal
Now, I am looking to do the same thing with my MEAN apps.
How would one configure something like that?
Let's say I have 3 MEAN apps.
Example
App1
FE running at http://localhost:4201
BE running at http://localhost:3001
App2
FE running at http://localhost:4202
BE running at http://localhost:3002
App3
FE running at http://localhost:4203
BE running at http://localhost:3003
I'm open to any suggestions at this moment.
Can we configure npm to start in the background for both the BE/API and the FE?

You can use macOS's launchd to run services in the background. There are a couple good GUI apps that make it easier to create launch services:
LaunchControl ($10)
Lingon ($10) - If you go with Lingon, get Lingon X 5 from the official website instead of Lingon 3 from the Mac App Store; Lingon X 5 is more powerful because it is not limited by Apple's sandboxing.
There's also launched.zerowidth.com, an interactive online tool for creating the .plist files that launchd uses.
launchd.info is also a good resource if you want to set them up manually. Apple's documentation is available too.
If you are having problems with commands not working, I recommend trying these troubleshooting steps:
Convert all your commands to use absolute paths (e.g. npm -> /usr/local/bin/npm). You can find the absolute path of a command by running which with the name of the command (e.g. which npm).
Run your commands from within bash using /bin/bash -c (e.g. /bin/bash -c "/usr/local/bin/npm start").
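For example, here is a rough sketch of a per-user LaunchAgent for App1's backend (the label, project path, and log paths are placeholders for your own setup, not something taken from the question); it runs npm start through /bin/bash with absolute paths as suggested above. Save it as ~/Library/LaunchAgents/com.example.app1-be.plist:

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <!-- Unique label for this job -->
    <key>Label</key>
    <string>com.example.app1-be</string>
    <!-- Run "npm start" via bash, using absolute paths -->
    <key>ProgramArguments</key>
    <array>
        <string>/bin/bash</string>
        <string>-c</string>
        <string>/usr/local/bin/npm start</string>
    </array>
    <!-- Directory containing the app's package.json -->
    <key>WorkingDirectory</key>
    <string>/Users/you/projects/app1/backend</string>
    <!-- Start at login and restart the process if it dies -->
    <key>RunAtLoad</key>
    <true/>
    <key>KeepAlive</key>
    <true/>
    <key>StandardOutPath</key>
    <string>/tmp/app1-be.log</string>
    <key>StandardErrorPath</key>
    <string>/tmp/app1-be.err.log</string>
</dict>
</plist>

Load it with launchctl load ~/Library/LaunchAgents/com.example.app1-be.plist, then repeat with one plist per FE/BE process (ports 4201/3001, 4202/3002, and so on).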

One thing you can do is dockerize your applications.
With Docker you can run your applications on your computer in lightweight, VM-like environments known as containers.
This has some advantages; for example, you can run your app on port 80 inside the container and expose a different port to your machine, and you can start or stop the container, and so forth.
Go to https://www.docker.com/what-docker for more information.
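As a rough sketch of how that might look for App1's backend (untested; the image name, Node version, and the assumption that the app listens on port 80 inside the container are mine, not from the question), a Dockerfile plus a docker run could be:

# Dockerfile for App1's backend (assumes the app listens on port 80 inside the container)
FROM node:10
WORKDIR /usr/src/app
COPY package*.json ./
RUN npm install
COPY . .
EXPOSE 80
CMD ["npm", "start"]

# Build the image, then run it mapping container port 80 to host port 3001;
# --restart keeps the container running across daemon/machine restarts unless you stop it
docker build -t app1-be .
docker run -d --restart unless-stopped -p 3001:80 app1-be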

Related

How do I automatically start a node/express app (with pm2), having node installed using software collections (scl), on CentOS7

1. Summarize the problem
I would like a node/express app.js to listen on port 3000 on container startup.
I created a CentOS 7 Docker container, installed the software collections (SCL) repo, and then installed node.
I can now enable node with scl enable rh-nodejs10 bash, which I did; I then installed express (globally) and pm2 (globally), and can successfully run a minimal express app listening on port 3000 with commands run at the command line.
I put scl enable rh-nodejs10 bash in the .bash_profile of a user I created named www, because I do not want root running the web server.
In fact, I will be building a rootless container (buildah) next, so there will be no root user at all, for security reasons.
Now on container startup I want to have the web server start automatically, and be able to get a response from: http://localhost:3000 (hello world).
The problem is that on container startup, node is not enabled for any user until a shell is invoked to enable it.
2. Provide background including what you've already tried
I have searched the web for a way to use node, express, and pm2 in conjunction with CentOS 7 software collections and have found no solution.
Please only reply if you have actually tried the solution you recommend and have it working; otherwise it most likely will not work.
systemd needs to:
1. enable node
2. run pm2 start app
I tried putting both in a shell script, but when you enable node you are put into a sub-shell and cannot script any additional commands.
3. show some code
scl enable rh-nodejs10 bash
4. Describe expected and actual results including any error messages
I expect the node/express server to listen on port 3000 on container startup.
I have node running on reboot on RHEL 7 by using the scl-utils/scl_source technique found here:
$ cat /etc/profile.d/enablenodejs.sh
#!/bin/bash
source scl_source enable rh-nodejs10
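For what it's worth, one pattern that may work here (a sketch only; the app path and the use of pm2-runtime are my assumptions, not tested against this exact container) is to wrap the scl_source call and the pm2 start in one non-interactive bash invocation, so no interactive sub-shell is ever needed:

#!/bin/bash
# /usr/local/bin/start-app.sh -- hypothetical wrapper script, run as the www user
# Enable the SCL node collection in this non-interactive shell,
# then start the app in the foreground so the container stays alive.
source scl_source enable rh-nodejs10
cd /home/www/app
exec pm2-runtime start app.js

The same idea works as a one-liner for a container CMD or a systemd ExecStart, e.g. /bin/bash -c "source scl_source enable rh-nodejs10 && exec pm2-runtime start app.js".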

Remote debugging nodejs app in Intellij with Docker - port already allocated

I want to start debugging a Node.js app using IntelliJ and a Node.js interpreter running on Docker. While running the app works, when I try to debug I get the error:
Error running 'index.js'
com.github.dockerjava.api.exception.InternalServerErrorException:
{"message":"driver failed programming external connectivity on
endpoint focused_poincare
(a17137973880d1be7c6a74fc142184fdda31e0dec8ebd539b09d9dbe4cf70014):
Error starting userland proxy: Bind for 0.0.0.0:55578 failed: port is
already allocated"}
The remote interpreter was configured according to the documentation. I have created a new Node.js Run/Debug configuration and entered the required data.
What might be the cause for debugging not working?
I use:
Intellij Idea Ultimate v. 2019.1.4 Preview
Intellij NodeJS plugin v. 191.7479.1, NodeJS remote interpreter plugin v. 191.6014.8 and Docker plugin v. 191.7141.44
Docker Desktop Community v. 2.0.0.3
EDIT: Addressing the comments:
Local debugging works. The file (index.js) that I am trying to run consists only of console.log('Hello world!'), so I don't spawn any child processes of my own. My host system runs Windows 10 Pro, so to check the open ports on the host I used netstat -an | find "55578", which returned nothing. Moreover, if I run docker manually from the command line, using docker run -it -p 55578:55578 node, everything runs and no error is given.
Also, each time I try remote debugging, the port number given by IntelliJ in the error message seems to be a random high port number. I tried looking for open ports just after getting the error message, but never found an open one with a number reported by IntelliJ, even though those numbers do appear in the output.
My Run/Debug configuration:
My Docker configuration (I had to check "Expose daemon on tcp://localhost:2375 without TLS" in Docker configuration to make Intellij and Docker play together):
EDIT: When I add --inspect-brk=0.0.0.0:55432 as "Node parameters" in the "Run/Debug Configurations" IntelliJ window (per this bug report), the container and program start, but debugging seems to be a no-op (e.g. the program does not stop on breakpoints).
After updating to IntelliJ v. 2019.2 I was able to get container debugging to work using the workaround already mentioned in my question.
I added the parameter --inspect-brk=0.0.0.0:55432 to the Node parameters option in the Run/Debug configuration, and everything seems to be working, including the breakpoints.
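For reference, a roughly equivalent manual invocation (a sketch only, using the port and file name from this question and a bash-style shell) is to publish the inspector port yourself and attach a debugger to localhost:55432:

# Publish the inspector port and start node with the same --inspect-brk flag
docker run -it -p 55432:55432 -v "$PWD":/app -w /app node \
    node --inspect-brk=0.0.0.0:55432 index.js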

Connecting app to AWS EC2 instance

I'm pretty new to DevOps and I'm trying to set up my Node.js app on an AWS server instance. Steps I've taken:
Set up Elastic IP
Launched EC2 instance with Ubuntu server
Connected IP to instance
Allowed incoming connections on port 3000
SSH'd into the server with a .pem file
Now I'm at the point where I need to get my files uploaded to the server. I've used FileZilla (and like it) in the past to upload files, but the initial part was already set up then. When I set up the site in FileZilla, there is no /var/www folder on the remote site.
Don't know how to connect these dots.
Also not sure what I need to run once I successfully upload the files. I imagine npm install when I'm ssh'd into the server? Most of the tutorials out there only go through the basic instance setup.
Thanks!
You don't need to have /var/www. Also, it's better to use version control with a remote repository like GitHub, then SSH into your EC2 instance and clone your repository there.
Then cd into your repo, run npm install, and start your app.
And check that it works.
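For example (a sketch only; the key file, IP, and repository URL are placeholders):

# From your machine: SSH into the instance with your .pem key
ssh -i mykey.pem ubuntu@your-elastic-ip

# On the server: clone the repository, install dependencies, and start the app
git clone https://github.com/you/your-app.git
cd your-app
npm install
npm start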
Once you connect to the EC2 instance, clone your code there. It is not mandatory for it to be in /var/www/html, but it's best practice to keep it there. Once you clone, run npm install in your project's home directory so all the required packages get installed. Then, to run your node application in production, you should run it under a process manager such as pm2, supervisor, forever, passenger, etc. You can use any of these services, configured appropriately to run your application on the desired port. With pm2, you can follow this guide to install it. Then you can run the following command for your environment; for example, I want to run my application on port 5555 for production:
$ PORT=5555 pm2 start app.js --name API --env production -f
Check the status using pm2 list. Now your application is running on http://server-ip:5555/, but you won't want to type the port number every time, so you need to configure a web server such as Apache or nginx in front of your application, which will forward all requests to the port your application is running on; a minimal sketch follows below. You can find the best guides on their home pages. Then your application is available at http://server-ip/. You can follow this for a single configuration serving multiple node apps.
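For example, a minimal nginx server block for this setup might look something like the following (a sketch; adjust server_name and the port to your app):

# /etc/nginx/sites-available/myapp -- forward all HTTP requests to the app on port 5555
server {
    listen 80;
    server_name your-domain-or-ip;

    location / {
        proxy_pass http://127.0.0.1:5555;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection 'upgrade';
        proxy_set_header Host $host;
        proxy_cache_bypass $http_upgrade;
    }
}

Enable the site (on Ubuntu, symlink it into sites-enabled) and reload nginx, and the app answers on plain http://server-ip/.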
Hope this helps.

VNC into my workspace on cloud9

I'm going to code a Python program with tkinter in a group, and I would like to do it online with Cloud9.
But in this case, I will need an X environment to run and test it.
I thought about launching a VNC server on the workspace, which is already possible with preinstalled tools like vncserver or x11vnc.
But it seems that [project]-[pseudo].c9.io:5901 is not accessible, and that only ports 80 and 443 are available to serve.
Can I use port 80 or 443 for an instance of vncserver? If yes, how can I do it?
I would also like to be able to connect with a tool like noVNC in the browser. Do I need to serve it from the workspace?
Yeah, go to https://github.com/fjakobs/cloud9-vnc and download the zip file. Extract it to your hard drive and upload the files to your workspace (File/Upload Local Files). Then run install.sh and, when it's done, run run.sh to actually start the VNC server. It will give you a link to the noVNC page from where you can connect to your workspace. And there you have it: a basic fluxbox desktop with a terminal. (Right-click and you can access all the programs.)
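Roughly the same thing can be done from the workspace terminal instead of uploading the files manually (an untested sketch, assuming the repository layout hasn't changed):

# Download and unpack the cloud9-vnc scripts, then install and start the VNC/noVNC setup
wget https://github.com/fjakobs/cloud9-vnc/archive/master.zip
unzip master.zip
cd cloud9-vnc-master
bash install.sh
bash run.sh   # prints the noVNC URL to open in the browser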

How to simultaneously deploy Node.js web app on multiple servers with Jenkins?

I'm going to deploy a Node.js mobile web application on two remote servers (Linux OS).
I'm using SVN server to manage my project source code.
To manage the app simply and clearly, I decided to use Jenkins.
I'm new to Jenkins, so installing and configuring it was quite a difficult task.
But I couldn't find out how to set up Jenkins to deploy to both remote servers simultaneously.
Could you help me?
You should look into supervisor. It's language- and application-type agnostic; it just takes care of (re)starting applications.
So in your jenkins build:
You update your code from SVN
You run your unit tests (definitely a good idea)
You either launch an svn update on each host or copy the current content to them (I'd recommend the latter, because there are many ways for SVN to fail, and this also lets you include SVN_REVISION in some .js file, for instance)
You execute on each host: fuser -k -n tcp $DAEMON_PORT; this kills the currently running application on port $DAEMON_PORT (the one your node.js app uses), as sketched in the shell step below
And best of all, it will automatically start your node.js app at system startup (provided supervisor is correctly installed: apt-get install supervisor on Debian) and restart it in case of failure.
A supervisord sub-config for a node.js app looks like this:
# /etc/supervisor/conf.d/my-node-app.conf
[program:my-node-app]
user = running-user
environment = NODE_ENV=production
directory = /usr/local/share/dir_app
command = node app.js
stderr_logfile = /var/log/supervisor/my-node-app-stderr.log
stdout_logfile = /var/log/supervisor/my-node-app-stdout.log
There are many configuration parameters.
Note: there is also a node.js package called supervisor; it's not the one I'm talking about, and I haven't tested it.
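To tie the Jenkins steps above together, the job's "Execute shell" build step could look roughly like this (a sketch; the host names, deploy user, path, and port are placeholders, not from the question):

# Deploy the already-tested workspace content to both hosts
DAEMON_PORT=3000
for HOST in host1.example.com host2.example.com; do
    # Copy the current content to the host (alternative to running svn update there)
    rsync -az --delete ./ deploy@$HOST:/usr/local/share/dir_app/
    # Kill whatever is listening on the app's port; supervisor restarts it automatically
    ssh deploy@$HOST "fuser -k -n tcp $DAEMON_PORT || true"
done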
On Linux, you need to ssh to your hosts and run commands to get the application updated:
Work out the workflow of the application update in a shell script. In particular, you need to daemonize your node app so that a finished Jenkins job execution will not kill your app when it exits. Here's a nice article on how to do this: Running node.js Apps With Upstart, or you can use pure Node.js tools like forever. Assume you worked out a script under /etc/init.d/myNodeApp.
SSH to your Linux host from Jenkins. You need to make sure the SSH private key file has been copied to /var/lib/jenkins/.ssh/id_rsa and is owned by the jenkins user.
Here's an example shell step in jenkins job configuration:
ssh <your application ip> "service myNodeApp stop; cd /ur/app/dir; svn update; service myNodeApp restart"
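For completeness, a bare-bones Upstart job along the lines of that article might look like this (a sketch only; the user, paths, and log file are placeholders), after which service myNodeApp start/stop/restart works as in the step above:

# /etc/init/myNodeApp.conf -- minimal Upstart job to daemonize the node app
# (setuid/chdir require a reasonably recent Upstart)
description "my node app"
start on runlevel [2345]
stop on runlevel [016]
respawn
setuid nodeuser
chdir /usr/local/share/dir_app
exec /usr/bin/node app.js >> /var/log/myNodeApp.log 2>&1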
