NodeJS API with external deps in another language - node.js

I am developing a NodeJS API and everything is OK.
For a specific feature I am using a local CLI dependency that processes some input files and produces output that the API then returns.
I wanted to know (and maybe open my mind a bit on) what kind of service I can use to serve this API in production.
The idea is to have a Node environment (like my local one) where an external dependency, not necessarily written in Node, can be installed on the same machine.
My specific dependency is fontforge, plus a few other small tools.
Thanks in advance.

It's hard to beat a good VPS if you need to install custom software that is not easy to install with npm. My favorite VPS provider is DigitalOcean; you can get a couple of months of a basic server for free on trial credit, so you can see if it's OK for you before you pay anything. My second favorite VPS provider is Vultr, because you can install custom ISOs on their servers, and it also offers a free trial. But a VPS means taking care of the server yourself. With services like Heroku all of that is taken care of for you - but you can't install whatever you want there.
With a VPS you get your own server with root access. Usually it's Linux, but DigitalOcean also supports FreeBSD, and some people install OpenBSD, though it's not officially supported. With a VPS you can install whatever you want, but you have to do it yourself. There is always a trade-off.
More info
Installing Node
To install Node on the VPS, my recommendation is to install it in /opt with a versioned directory and a symlink. This is an example procedure that I wrote for a different answer:
# change dir to your home:
cd ~
# download the source:
curl -O https://nodejs.org/dist/v6.1.0/node-v6.1.0.tar.gz
# extract the archive:
tar xzvf node-v6.1.0.tar.gz
# go into the extracted dir:
cd node-v6.1.0
# configure for installation:
./configure --prefix=/opt/node-v6.1.0
# build and test:
make && make test
# install:
sudo make install
# make a symlink to that version:
sudo ln -svf /opt/node-v6.1.0 /opt/node
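With the symlink in place, upgrading later is just installing the new version into /opt and repointing /opt/node. To get the binaries on your PATH, something like this works on most Linux setups (a small sketch; the /etc/profile.d location is an assumption about your distro):
# expose the symlinked version to all login shells:
echo 'export PATH="/opt/node/bin:$PATH"' | sudo tee /etc/profile.d/node.sh
# log out and back in (or source the file), then check:
node --version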
Your start scripts
To have your own application nicely started on server startup, here is an example Upstart script based on the one that I'm using. It should work on Ubuntu 14.04 (not tested on newer versions; see the systemd sketch after the commands below). Save it in /etc/init/YOURAPP.conf:
# When to start the service
start on runlevel [2345]
# When to stop the service
stop on runlevel [06]
# If the process quits unexpectedly, trigger a respawn
respawn
# Start the process
exec start-stop-daemon --start --chuid node --make-pidfile --pidfile /www/YOURAPP/run/node-upstart.pid --exec /opt/node/bin/node -- /www/YOURAPP/app/app.js >> /www/YOURAPP/log/node-upstart.log 2>&1
Just change:
YOURAPP to the name of your own app
/opt/node/bin/node to your path to node
/www/YOURAPP/app/app.js to the path of your Node app
/www/YOURAPP/run to where you want your PID file
/www/YOURAPP/log to where you want your logs
--chuid node to --chuid OTHERUSER if you want it to run as a different user than node
(make sure a user with the name you pass to --chuid actually exists)
With your /etc/init/YOURAPP.conf in place you can safely restart your server and your app will still be running. You can run:
start YOURAPP
restart YOURAPP
stop YOURAPP
to start, restart and stop your app - which would also happen automatically during the system boot or shutdown.
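Note that Ubuntu 15.04 and later replaced Upstart with systemd. A rough equivalent as a systemd unit would look like this (an untested sketch using the same paths as above; save it as /etc/systemd/system/YOURAPP.service):
[Unit]
Description=YOURAPP Node service
After=network.target

[Service]
User=node
ExecStart=/opt/node/bin/node /www/YOURAPP/app/app.js
Restart=always

[Install]
WantedBy=multi-user.target
Then sudo systemctl enable YOURAPP and sudo systemctl start YOURAPP; logs go to the journal (journalctl -u YOURAPP) instead of a file.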

Related

Creating a custom NodeJS Docker image on rhel7

I am building some base Docker images for my organization, to be used by application teams when they deploy their applications in OpenShift. One of the images I have to make is a NodeJS image (we want our images to be internal rather than sourced from DockerHub). I am building on RedHat's RHEL7 Universal Base Image (ubi). However, I am having trouble configuring NodeJS to work in the container. Here is my Dockerfile:
FROM myimage_rhel7_base:1.0
USER root
RUN INSTALL_PKGS="rh-nodejs10 rh-nodejs10-npm rh-nodejs10-nodejs-nodemon nss_wrapper" && \
    yum install -y --setopt=tsflags=nodocs $INSTALL_PKGS && \
    rpm -V $INSTALL_PKGS && \
    yum clean all
USER myuser
However, when I run the image there are no node or npm commands available unless I run scl enable rh-nodejs10 bash. That does not work in the Dockerfile, as it creates a subshell that will not be usable by a user accessing the container.
I have tried installing from source, but I ran into a different issue: it needs newer gcc/g++ versions, which are not available in the repos my org has configured. I also figure that if I can get NodeJS to work from the package manager, it will pick up security patches when the package is updated.
My question is, what are the recommended steps to create an image that can be used to build applications running on NodeJS?
Possibly this is a case where the best code is code you don't write at all. Take a look at https://github.com/sclorg/s2i-nodejs-container
It is a project that creates an image that has nodejs installed. This might be a perfect solution out of the box, or it could also serve as a great example of what you're trying to build.
Also, their README describes how they get around the scl enable command:
Normally, SCL requires manual operation to enable the collection you want to use. This is burdensome and can be prone to error. The OpenShift S2I approach is to set Bash environment variables that serve to automatically enable the desired collection:
BASH_ENV: enables the collection for all non-interactive Bash sessions
ENV: enables the collection for all invocations of /bin/sh
PROMPT_COMMAND: enables the collection in interactive shell
Two examples:
* If you specify BASH_ENV, then all your #!/bin/bash scripts do not need to call scl enable.
* If you specify PROMPT_COMMAND, then on execution of the podman exec ... /bin/bash command, the collection will be automatically enabled.
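Applied to the Dockerfile from the question, that approach might look something like this (an untested sketch; the /opt/app-root/etc path is an arbitrary choice of mine, and scl_source is the standard SCL helper script):
FROM myimage_rhel7_base:1.0
USER root
RUN INSTALL_PKGS="rh-nodejs10 rh-nodejs10-npm rh-nodejs10-nodejs-nodemon nss_wrapper" && \
    yum install -y --setopt=tsflags=nodocs $INSTALL_PKGS && \
    yum clean all && \
    mkdir -p /opt/app-root/etc && \
    echo 'source scl_source enable rh-nodejs10' > /opt/app-root/etc/scl_enable
# every shell sources the helper, so node/npm are always on PATH
ENV BASH_ENV=/opt/app-root/etc/scl_enable \
    ENV=/opt/app-root/etc/scl_enable \
    PROMPT_COMMAND=". /opt/app-root/etc/scl_enable"
USER myuser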
I decided in the end to install node using the binaries rather than our rpm server. Here is the implementation:
FROM myimage_rhel7_base:1.0
USER root
# Get node distribution from nexus and install it
RUN wget -P /tmp http://myrepo.example.com/repository/node/node-v10.16.3-linux-x64.tar.xz && \
    tar -C /usr/local --strip-components 1 -xf /tmp/node-v10.16.3-linux-x64.tar.xz && \
    rm /tmp/node-v10.16.3-linux-x64.tar.xz
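As a quick sanity check, one extra step can fail the build early if the binaries did not land on PATH (a small optional addition to the above):
# optional: verify node and npm are usable at build time
RUN node --version && npm --version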

How to apply Node.js security updates?

How do I apply Node.js security patches?
Is there a specific process to apply security patches when using Meteor on Ubuntu 16.04?
When you're running meteor in production mode, it is run as a (pure) node.js app. So the short answer to your question is to just update node (depending on how you installed it; probably sudo apt-get update -y && sudo apt-get install nodejs -y).
There are a variety of tools you can use to deploy a meteor app (e.g. meteor-up), but all of them have essentially the same two steps, which are easy enough to do yourself:
Bundle your meteor app into a node.js app
meteor build ../my-build-output-folder --server https://my.production.site.url --architecture os.linux.x86_64
This will create a meteor-server.tar.gz file in the folder you specified, containing the node.js app. The process is then (as per the README file that is included in the bundle):
Transfer the meteor-server.tar.gz file to your server
tar -zxvf meteor-server.tar.gz to extract the node application
The included README file tells you the rest:
README:
This is a Meteor application bundle. It has only one external dependency: Node.js v8.11.4. To run the application:
$ (cd programs/server && npm install)
$ export MONGO_URL='mongodb://user:password@host:port/databasename'
$ export ROOT_URL='http://example.com'
$ export MAIL_URL='smtp://user:password@mailhost:port/'
$ node main.js
Use the PORT environment variable to set the port where the application will listen. The default is 80, but that will require root on most systems.
Set up a system to survive restarts, e.g. Upstart, pm2, supervisord, or Docker.
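For example, with pm2 (one of the options above) the whole sequence might look like this. This is a sketch: the bundle extracts into a bundle/ directory, the app name is arbitrary, and the env values are the placeholders from the README:
npm install -g pm2
tar -zxvf meteor-server.tar.gz && cd bundle
(cd programs/server && npm install)
export MONGO_URL='mongodb://user:password@host:port/databasename'
export ROOT_URL='http://example.com'
pm2 start main.js --name meteor-app
pm2 startup    # prints a command that registers pm2 at boot
pm2 save       # persist the process list across restarts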

avoiding php artisan queue:work: Queue Driver - Redis or Database (Laravel 5.4)

I am currently using the database queue driver, with Laravel 5.4 installed on a Windows 10 PC. To process queues I have been running php artisan queue:work, which was completely fine during development. Now the project is ready and needs to be deployed on a dedicated Linux server, and I am not sure how to avoid running php artisan queue:work in a terminal just to process mail jobs.
I have deployed to shared hosting once and used cron jobs there, but now that I have a dedicated server I guess I should be able to use something else to run jobs. I was also thinking of using Redis as the queue driver rather than the database.
I need some suggestions on what is best, and how to avoid php artisan queue:work on a dedicated server. Do I need to write a small script to make sure jobs run in the background as a service?
Laravel documentation covers this with supervisor.
See: Laravel Supervisor configuration
Supervisor is a process monitor which makes sure your queue command (or any other command for that matter) is executed and restarted if it dies.
Edit:
See: Supervisor documentation
Basically, for CentOS you can use yum:
yum install supervisor
Easy install
# required for easy_install (if not installed already)
yum install python-setuptools
# install supervisor
easy_install supervisor
Or pip
pip install supervisor
After that, it's just a matter of creating your config (based on the example in Laravel's documentation); this is handled step by step in:
Supervisor: creating-a-configuration-file
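For reference, such a worker config looks roughly like this (a sketch adapted from the example in the Laravel docs; the paths, user, and process count are placeholders to adjust):
[program:laravel-worker]
process_name=%(program_name)s_%(process_num)02d
command=php /var/www/YOURAPP/artisan queue:work --sleep=3 --tries=3
autostart=true
autorestart=true
user=www-data
numprocs=4
redirect_stderr=true
stdout_logfile=/var/www/YOURAPP/storage/logs/worker.log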
And create the service: Setup Supervisor
After that you can start the service with:
service supervisord start

Deploying my node.js app from Github to a VPS

I have a node.js application on GitHub. I have never done any VPS deployment before and I am learning on the go.
I am using a VPS from Hostinger.in; the OS is Ubuntu 14.04. So far this is what I have done:
Connected to their SSH successfully from my Terminal
Installed node.js on the server [https://www.hostinger.com/tutorials/vps/how-to-install-node-js-on-ubuntu]
Installed Git on the server [https://www.hostinger.com/tutorials/how-to-install-git-on-ubuntu]
I could not find any online resources for deploying my node.js app to a Hostinger VPS, so I am following the ones written for DigitalOcean.
The one tutorial I followed is this: https://code.tutsplus.com/tutorials/setting-up-continuous-integration-continuous-deployment-with-jenkins--cms-21511
I cloned my repository doing:
git clone https://github.com/myusername/node-project.git
and it seems it worked (it didn't give me any errors).
All the installations on the server I did as the root/admin user. So far I have not created any separate user to perform any of these tasks.
The server hostname given to me is dangerous-pigs.com. Now I am assuming my node.js application is deployed, but when I go to dangerous-pigs.com it shows me a "server not found" error.
I also installed forever for my node app and when I run
forever start app.js
it says:
warn: --minUptime not set. Defaulting to: 1000ms
warn: --spinSleepTime not set. Your script will exit if it does not stay up for at least 1000ms
info: Forever processing file: app.js
error: Cannot start forever
error: script /root/app.js does not exist.
This means the app is either not there or is installed somewhere other than root's home folder.
There is a lot going on and I am confused where to start fixing issues.
How can I deploy the app so it runs on dangerous-pigs.com?
Update
So it seems I have to go inside the project folder (under /root) and run
npm install --production
after which I did
node app.js
The server seems to be running, but I can only access my application if I go to the actual IP provided by the service.
So if I type http://93.188.163.249:8000 --> that's my application.
How do I change it to point to a domain?
After some more research this is what I found:
By default, Apache2 runs on port 80. To run nodejs on port 80 I first need to install libcap2-bin on my Ubuntu server:
sudo apt-get install libcap2-bin
after which I do
sudo setcap cap_net_bind_service=+ep /usr/bin/nodejs
The exact path depends on where your node binary is installed - it may be /usr/bin/nodejs or /usr/local/bin/node, depending on how you installed Node. Check with which node (or which nodejs) and confirm the path before running the command.
Also, your nodejs server needs to be stopped before you make these changes, or they will not take effect. In my case I had forgotten to stop my node server and kept running the sudo setcap command, but the port did not change (for obvious reasons).
If you are using forever to run node then do:
forever stopall
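Two follow-ups on this. First, you can verify the capability actually took effect; getcap ships in the same libcap2-bin package (the path below is an assumption, use the one you ran setcap on):
# should list cap_net_bind_service for the binary:
getcap /usr/bin/nodejs
# then start the app on port 80 (assuming your app reads the PORT env variable):
PORT=80 forever start app.js
Second, making dangerous-pigs.com resolve to the app is a DNS matter: add an A record for the domain pointing at the server IP (93.188.163.249) wherever the domain's DNS is managed.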

React: How to publish page on server using React-starter-kit

So, I created a page using the repo from:
https://github.com/kriasoft/react-starter-kit
I have my own FTP server, and now I would like to publish my project on it.
What is the best way to do it? Should I copy all the files to the FTP server and just run 'npm start'? Or should I deploy it some other way?
I'm new to web deployment and not sure how it works.
Thanks for any tips.
It may be a little more complicated than just FTPing your project up. Here are the instructions I use to set up a server at DigitalOcean:
sudo apt-get update
sudo apt-get install npm
sudo apt-get install git
sudo apt-get install ufw
sudo apt-get install build-essential libssl-dev
curl https://raw.githubusercontent.com/creationix/nvm/v0.20.0/install.sh | bash
nvm install stable   # may require a new ssh session before this
ufw default deny incoming
ufw default allow outgoing
ufw allow ssh
ufw allow 80/tcp
ufw allow out to any port 53
ufw enable
sudo npm install -g forever
sudo npm install -g node-gyp
cd /var
mkdir www
cd www
git clone https://github.com/calitek/palminfo --recursive
npm install
npm ls --depth=0
export PORT=80
node js/server.js
# test using the server IP; when it looks good, exit and then:
forever start js/server.js
# finally, point your DNS at the server
The server will need to support you adding Node.js. Then you should preferably use GitHub to clone the project. You will want to do the npm install on the server to be sure you are using the correct modules. It's a little complicated the first time out; just keep good notes for the next time.
This is a big topic to cover. Whatever your Linux distribution is, you will generally need two server processes: a reverse proxy and an app server.
I'd recommend you wrap your compiled site with a simple Node.js server like Express or Hapi, then configure nginx to route all requests to the application. Check out some detailed guides from DigitalOcean:
CentOS: https://www.digitalocean.com/community/tutorials/how-to-set-up-a-node-js-application-for-production-on-centos-7
Ubuntu: https://www.digitalocean.com/community/tutorials/how-to-set-up-a-node-js-application-for-production-on-ubuntu-14-04
Minor steps might be slightly different, but you get the idea.
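The core of those guides is an nginx server block that proxies to the Node app, something like this (a sketch; the domain and port are placeholders for your own):
server {
    listen 80;
    server_name example.com;

    location / {
        proxy_pass http://127.0.0.1:3000;   # your Express/Hapi app
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection 'upgrade';
        proxy_set_header Host $host;
        proxy_cache_bypass $http_upgrade;
    }
}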
One approach is to build the app on the server, but then you need a host that can run builds, something like Heroku.
An alternative is a small paradigm shift. Your app is basically just some HTML, CSS and JavaScript, which is compiled and served to the public. You could compile it yourself with npm run build and then just copy the compiled files to your hosting server, which can use whatever web server it wants: Apache httpd, nginx, etc. This is also cheaper, because you only need basic hosting, not a server capable of building a Node.js app.
I created a starter kit, http://redux-minimal.js.org/, which helps you create rich real-world apps with the minimum number of packages and a very light configuration setup. If you look at the folder structure you can see that the app uses the index.html from the public folder, and the css and js files are compiled directly into the public folder, which makes it easier to copy the public folder to whatever server you want.
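Concretely, that workflow can be as simple as this (a sketch; the remote user, host, and web root are placeholders):
npm run build                                   # compile the app locally
scp -r public/ user@yourserver:/var/www/html/   # copy the compiled files up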
