Differences between MLflow deployment possibilities - mlflow

Can someone please explain the main use cases to consider when deciding how to serve a model from MLflow:
(A) using the command line "mlflow models serve -m ...."
(B) deploying a local Docker container with the same model
(C) deploying the model online, for example on AWS SageMaker
I am mainly interested in the differences between options (A) and (B), because as I understand it both can be accessed as REST API endpoints. And I assume that if network rules are in place, both can also be called externally.

IMHO, the main difference is described in the documentation:
NB: by default, the container will start nginx and gunicorn processes. If you don’t need the nginx process to be started (for instance if you deploy your container to Google Cloud Run), you can disable it via the DISABLE_NGINX environment variable
And "mlflow models serve" uses only Flask, so it could be less scalable.
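For concreteness, here is a minimal sketch of how options (A) and (B) are typically started; the model URI, image name and ports below are placeholders, not values from the question:

    # (A) local scoring server (Flask-based)
    mlflow models serve -m runs:/<RUN_ID>/model -p 5000

    # (B) build a Docker image for the same model, then run it;
    # the image serves on port 8080 inside the container and starts
    # nginx + gunicorn by default
    mlflow models build-docker -m runs:/<RUN_ID>/model -n my-model-image
    docker run -p 5001:8080 my-model-image

    # per the documentation quoted above, nginx can be disabled:
    docker run -p 5001:8080 -e DISABLE_NGINX=true my-model-image

Both end up exposing the same REST scoring endpoint; the difference is the process stack sitting in front of it.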

Related

Running NodeJS server in production

I have a React + Node app which I need to deploy. I am using nginx to serve my front end, but I am not sure what to use to keep my Node.js server running in production.
The project is hosted on a Windows VM. I cannot use PM2 due to license issues. I have no idea whether running the server using nodemon in production is good or not. I have never deployed an app in production, hence I have no idea about appropriate methods.
You may consider forever or supervisor.
Check this blog post on the same.
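For example, a minimal forever-based setup could look like this (the entry file name is an assumption):

    npm install -g forever
    forever start app.js        # daemonizes the app and restarts it on crash
    forever list                # show processes managed by forever
    forever restart app.js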
You can also use Docker. You can create multiple Docker containers that will run your Node server. At the nginx level on your host machine you can then add a load-balancing configuration which routes traffic equally to the different Docker node containers; this will improve your availability and scalability. Under heavy traffic you just need to increase the number of Docker node containers as and when required. I guess initially 2 containers will be enough to handle the traffic (depends on your use case though).
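A minimal nginx sketch of such a load-balancing configuration (the container names and app port are assumptions):

    # /etc/nginx/conf.d/node-app.conf
    upstream node_backend {
        server node1:3000;      # first Docker node container
        server node2:3000;      # second Docker node container
    }

    server {
        listen 80;
        location / {
            proxy_pass http://node_backend;   # round-robin by default
            proxy_set_header Host $host;
            proxy_set_header X-Real-IP $remote_addr;
        }
    }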
Note: you can also use forever or supervisor, as suggested by @Rajesh Gupta, inside your Docker containers for running the Node server. We use PM2 for that.
If you have a database then you can create a separate docker container for the database and map it to a volume in your host machine.
You can learn about docker from here.
Also you can read about load balancing in nginx from here.
Furthermore, to improve your availability you can add a caching layer between nginx and the Docker containers. Varnish is the best caching service I have used to date.
PS: We use a similar but more advanced architecture to run our e-commerce application, which generates 5-10k orders daily. So this is a tested approach with zero downtime.
Try to dockerize the whole app including the db, caching server (if any) etc.
Here are some examples why:
You can launch a fully capable development environment on any computer supporting Docker; you don't have to install libraries, dependencies, download packages, mess with config files etc.
The working environment of the application remains consistent across the whole workflow. This means the app runs exactly the same for developer, tester, and client, be it on a development, staging or production server. In short, Docker is the counter-measure for the age-old response in software development: "Strange, it works for me!"
Every application requires a specific working environment: pre-installed applications, dependencies, databases, everything in a specific version. Docker containers allow you to create such environments. Contrary to a VM, however, the container doesn't hold a whole operating system—just applications, dependencies, and configuration. This makes Docker containers much lighter and faster than regular VMs.
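To illustrate how light such a container can be, a minimal Dockerfile for a Node app might look like this (entry file and port are assumptions):

    FROM node:18-alpine
    WORKDIR /app
    COPY package*.json ./
    RUN npm ci --omit=dev         # install production dependencies only
    COPY . .
    EXPOSE 3000                   # port the app is assumed to listen on
    CMD ["node", "server.js"]     # hypothetical entry point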

Deploy a nodejs server for web CRM for production

I'm looking for any method or tutorial to deploy a Node.js Express app on a local server/computer. This will be for a production environment. All I have read about solutions like Zeit Now, localtunnel, forever, PM2 and similar is that they aren't recommended for production environments. The idea is to have a public web app without external hosting. I need the method to allow keeping more than one node/web app active at the same time.
When people say a component is not recommended for production, it does not mean that it is not stable. Most of the time it means that it is not a full-blown solution that considers all the aspects of a production deployment:
scalability
fail-over
security
configurability
automation
etc.
If you are trying to build a solution that has precise requirements (requests per second, media streaming, etc.), you should post them in your question as well to make it concrete. If this is not the case, you just have to install a basic setup that runs your configuration and fix bottlenecks as they appear. Don't try to build a theoretically correct solution now.
A couple of examples:
A classical setup (goes well with Do-It-Yourself deployments; a condensed sketch follows this list)
install Git + (Node.js and NPM) + (Forever or equivalent) + your database (e.g. MongoDB) + (NGINX or HAProxy) on your favourite/accepted Linux distribution
clone each Node.js app in its own directory
install cronjobs for basic monitoring and maintenance
add scripts to dynamically remove/add NGINX web server configurations based on deleted/added Node.js apps
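A condensed sketch of that classical setup on a Debian/Ubuntu-style machine (package names, the repository URL and paths are assumptions; the database install is left out since it depends on the vendor's own repo instructions):

    sudo apt-get install -y git nodejs npm nginx
    sudo npm install -g forever

    # one directory per Node.js app
    git clone https://example.com/my-app.git /srv/my-app    # hypothetical repo
    cd /srv/my-app && npm ci --omit=dev
    forever start server.js                                 # hypothetical entry point

    # basic monitoring/maintenance via cron (hypothetical script)
    ( crontab -l; echo "*/5 * * * * /srv/my-app/scripts/healthcheck.sh" ) | crontab -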
A more modern setup (goes well with AWS/GCE deployments but also possible locally with tools like Skaffold; a minimal manifest sketch follows this list)
install a Kubernetes cluster on a couple of machines
prepare a base Docker container image that matches all your Node.js applications
if required, add a Dockerfile to each Node.js application to build one Docker image per application based on the base Docker container image
add a new deployment for each of your Node.js applications
Kubernetes will handle the "keep-alive" for you
fill in the plumbing between your server network (DNS, IP, ports) and the IPs provided to you by Kubernetes (NGINX or HAProxy would also fill this gap)
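A minimal Kubernetes Deployment sketch for one such Node.js app (names, image and port are assumptions):

    apiVersion: apps/v1
    kind: Deployment
    metadata:
      name: my-node-app
    spec:
      replicas: 2                     # Kubernetes keeps these pods alive for you
      selector:
        matchLabels:
          app: my-node-app
      template:
        metadata:
          labels:
            app: my-node-app
        spec:
          containers:
            - name: my-node-app
              image: registry.example.com/my-node-app:1.0.0   # built from the per-app Dockerfile
              ports:
                - containerPort: 3000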

Deploy node.js in production

What are the best practices for deploying a nodejs application in production?
I would like to know how Node.js APIs are deployed to production today; currently my application is in Docker and running locally.
I wonder if I should use nginx inside the container and serve my app behind it, or just ship the Node image that is already running today.
* I need load balancing
There are a few main types of deployment that are popular today.
Using a platform as a service like Heroku
Using a VPS like AWS, DigitalOcean, etc.
Using a dedicated server
This list is in order of growing difficulty and control. It's easiest with a PaaS, but you get more control with a dedicated server - though it gets significantly more difficult, especially when you need to scale out and build clusters.
See this answer for more details on how to install Node on a VPS or a dedicated server:
how to run node js on dedicated server?
I can only add from experience on AWS: we use a NAT gateway in front of a dedicated Node server, with a MongoDB server behind the gateway. (Obviously this is a scalable system and project.)
With or without Docker, you need to control the production environment. This means clearly defining which NPM libraries you will need for production, how you handle environment variables, and how you cluster across cores.
I would suggest, very strongly, using a tool like PM2 to handle clusters, server shutdowns and restarts, and logs (workers and slaves too, if you need them and code for them).
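For instance, a minimal PM2 setup (the entry file name is an assumption):

    npm install -g pm2
    pm2 start server.js -i max    # cluster mode: one worker per CPU core
    pm2 logs                      # tail the aggregated logs
    pm2 startup && pm2 save       # restore the process list after a reboot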
This list can go on and on, but keep in mind this is only from an AWS perspective. Setting up a gateway correctly on AWS is also not an easy process. Be prepared for some gotchas along the way.

Scaling a two-tier MEAN application with Docker on AWS

So my current set-up is a Node.js application that serves up an Angular front-end, a second Node.js application that runs Express and serves as an API, and a MongoDB instance. Simply put, the client-side app talks to the back-end app, and the back-end app talks to MongoDB.
I was looking into how to Dockerize these applications and it seems like some examples use linking. So my question is does linking only work on the same host (meaning a single EC2 instance on AWS) or multiple EC2 instances? If only the former and if I have both apps and Mongo containerized on one instance, how do I scale out? Like if I spin up a second EC2 instance, would I put both containerized Node apps and Mongo again on that second instance? Is having a Mongo container on the same instance with the Node apps a single point of failure? How is that fault tolerant?
Just trying to wrap my head around this and apologize for my ignorance on the subject. Thanks!
You should put each app as well as the MongoDB server in separate containers (which is what I think you intend), and the linking (via Docker Compose or another method) is just networking. If you use Docker links, it creates a private network. You can create other networks for the containers to talk to each other, also to a LAN, WAN, whatever.
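A Docker Compose sketch of that three-container layout (service names, build paths and the connection string are assumptions):

    version: "3.8"
    services:
      frontend:                 # Node.js app serving the Angular front-end
        build: ./frontend
        ports:
          - "80:3000"
      api:                      # Express API
        build: ./api
        environment:
          - MONGO_URL=mongodb://mongo:27017/app   # reachable by service name
      mongo:
        image: mongo:6
        volumes:
          - mongo-data:/data/db
    volumes:
      mongo-data:

Compose puts the three services on one private network, so the API reaches MongoDB simply by the service name "mongo".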
Yes, putting them all on the same EC2 instance is creating a SPOF.
If that's a concern, look into: https://docs.docker.com/swarm/networking/
Docker Swarm is fully compatible with Docker’s networking features. This includes the multi-host networking feature which allows creation of custom container networks that span multiple Docker hosts.
Or load balance your apps and use an AWS-hosted MongoDB cluster. There are many possible approaches based on your needs and budget.

Load balancing express app instances

This is my first load balancing question.
I have written a simple Express app to figure out how load balancing works, and I was also taking a look at something like Docker. If I had to use nginx to load balance, should I be running 4 different Express instances in 4 different Docker containers and then load balance between them using nginx, where nginx sits in its own container?
Have I got that right? I am kind of confused.
I provided an answer to a similar post some time ago, but here are the important bits in a nutshell:
Yes, it's possible to use Nginx to load balance requests between different instances of your Node.js services. Each Node.js instance could be running in a different Docker container.
You can then modify Nginx to load balance between the containers using for example the configuration file mentioned in the link above.
Nginx itself can perfectly well run within a Docker container. For these kinds of setups, Docker Compose can help you orchestrate the configuration such that you can start up all the Nginx and Express containers with a single command.
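A sketch of that orchestration (file layout, service names and the app port are assumptions):

    # docker-compose.yml
    services:
      express:
        build: .                # your Express app image
      nginx:
        image: nginx:alpine
        ports:
          - "80:80"
        volumes:
          - ./nginx.conf:/etc/nginx/conf.d/default.conf:ro
        depends_on:
          - express

    # nginx.conf: proxy to the scaled service by name
    server {
        listen 80;
        location / {
            proxy_pass http://express:3000;   # Docker's DNS returns every replica
        }
    }

Starting the four Express containers behind the one nginx container is then a single command: docker compose up -d --scale express=4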
