Running Flask application without nginx - python-3.x

Running Flask application without nginx or other web server?
I have read different articles about deploying a Flask application. They say I'd need uWSGI and nginx, which is one popular option:
https://www.digitalocean.com/community/tutorials/how-to-serve-flask-applications-with-uswgi-and-nginx-on-ubuntu-18-04
https://uwsgi-docs.readthedocs.io/en/latest/tutorials/Django_and_nginx.html
https://flask.palletsprojects.com/en/1.1.x/deploying/uwsgi/#uwsgi
Can uWSGI be a web server and an application server at the same time?
For example, stand-alone WSGI containers:
https://flask.palletsprojects.com/en/1.1.x/deploying/wsgi-standalone/
But again, it recommends using an HTTP server. Why? Can't uWSGI handle HTTP requests?
My Flask application. app_service.py
import json
import os

from flask import Flask, Response, redirect

# The "app" object is the callable that uwsgi.ini refers to via callable = app.
app = Flask(__name__)
portToUse = 9401

@app.route("/app/people")
def get_service_people():
    print("Get people")
    # Serialize a valid JSON payload for the response body.
    people_str = json.dumps(["John", "Alex"])
    return Response(people_str, mimetype="application/json;charset=UTF-8")

if __name__ == "__main__":
    app.run(host='0.0.0.0', port=portToUse)
uwsgi config uwsgi.ini
[uwsgi]
chdir = $(APPDIR)
wsgi-file = app_service.py
callable = app
uid = psc-user
gid = psc-user
master = true
processes = 1
threads = 1
http-timeout = 300
socket-timeout = 300
harakiri = 300
http = 0.0.0.0:9401
socket = /tmp/uwsgi.socket
chmod-sock = 664
vacuum = true
die-on-term = true
; Images serving: https://github.com/unbit/uwsgi/issues/1126#issuecomment-166687767
wsgi-disable-file-wrapper = true
log-date = %%Y-%%m-%%d %%H:%%M:%%S
logformat-strftime = true
logformat = %(ftime) | uWSGI | %(addr) (%(proto) %(status)) | %(method) %(uri) | %(pid):%(wid) | Returned %(size) bytes in %(msecs) ms to %(uagent)
requirements.txt
# Web framework for python app.
Flask==1.1.1
# JWT token utils to retrieve the token from the HTTP request header.
# It is used for retrieving optional permissions from the gateway.
# https://pypi.org/project/PyJWT/
PyJWT==1.7.1
# Eureka API client library to implement service discovery pattern
py_eureka_client==0.7.4
# Python application server
uWSGI==2.0.18
And it seems to be working. I am running all this in a virtual machine with docker-compose.
My question: why do I need nginx here? Do Python developers use uWSGI without a web server?
Update
I am not going to run the default dev WSGI server in production, as is asked here:
Are a WSGI server and HTTP server required to serve a Flask app?
WSGI servers happen to have HTTP servers but they will not be as good as a dedicated production HTTP server (Nginx, Apache, etc.)
from
https://stackoverflow.com/a/38982989/1839360
Why is it so?
What I am asking is why the uWSGI server cannot be just as good at HTTP handling, such that I need to put an HTTP server between the internet and uWSGI. Why can't incoming HTTP requests go directly to uWSGI (it is not running in development or debug mode)?

For running Flask, you do not need nginx specifically, just a web server of your choice, but life with nginx is just easier. If you are using Apache, you may want to consider using mod_wsgi.
I remember reading somewhere in the Flask documentation the same point that is made by an answer to "Are a WSGI server and HTTP server required to serve a Flask app?":
The answer is similar for "should I use a web server". WSGI servers happen to have HTTP servers but they will not be as good as a dedicated production HTTP server (Nginx, Apache, etc.).
The main idea behind this is the architectural principle of splitting layers to ease debugging and increase security, similar to the way you split content and structure (HTML & CSS, UI vs. API):
For the lower layers, see e.g. https://en.wikipedia.org/wiki/Transport_layer
Having a dedicated HTTP server allows you to do packet filtering etc. on that level.
WSGI is the interface layer between the web server and the web framework.
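To make that interface concrete, here is a minimal, purely illustrative sketch of what a bare WSGI callable looks like; Flask's app object is just a much richer implementation of the same contract:

def application(environ, start_response):
    # environ: dict describing the request (path, headers, query string, ...)
    # start_response: callback used to send the status line and headers
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"Hello from a bare WSGI callable"]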
Update
I have seen clients only running a WSGI server alone, with integrated HTTP support. Using an additional web server and/or proxy is just good practice, but IMHO not strictly necessary.
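As a concrete illustration of that, here is a hedged sketch of serving the app_service.py module from the question with a pure-Python WSGI server that speaks HTTP itself; it assumes the waitress package is installed (pip install waitress) and is not taken from the question:

# serve_app.py -- minimal sketch, assuming app_service.py exposes a Flask object named `app`.
from waitress import serve

from app_service import app

if __name__ == "__main__":
    # Waitress is both the WSGI container and the HTTP listener here;
    # a front-end proxy (nginx/Apache) remains optional good practice.
    serve(app, host="0.0.0.0", port=9401)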
References
https://flask.palletsprojects.com/en/1.1.x/deploying/mod_wsgi/ describes the Apache way for Flask
https://flask.palletsprojects.com/en/1.1.x/tutorial/deploy/ elaborates on what a production environment should look like
Deploying Python web app (Flask) in Windows Server (IIS) using FastCGI
Debugging a Flask app running in Gunicorn
Flask at first run: Do not use the development server in a production environment

Related

Run Flask without port

I have a flask server running on ubuntu. I want to hit the server using my domain name, test.example.com, without having to include the port number. Right now, I can successfully access the server by doing https://test.example.com:80/ but I can't figure out how to do just https://test.example.com/
In flask_server.py:
if __name__ == '__main__':
    app.run(host='0.0.0.0', port=80, ssl_context=context)
Use proxy forwarding, like a basic nginx server in front of your Flask website.
Nginx will take the HTTPS/domain traffic on port 443 and forward the query to your local Flask server on localhost:80.
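One related detail, sketched below under the assumption that you use Werkzeug 0.15+ (which Flask 1.1.x already requires) and that nginx is configured to pass the X-Forwarded-For/Proto/Host headers: since Flask itself only sees plain HTTP on localhost, wrapping the app with ProxyFix lets request.scheme and generated URLs reflect the original https request. This is a sketch, not the asker's code:

from flask import Flask
from werkzeug.middleware.proxy_fix import ProxyFix

app = Flask(__name__)
# Trust exactly one proxy hop for each forwarded header set by nginx.
app.wsgi_app = ProxyFix(app.wsgi_app, x_for=1, x_proto=1, x_host=1)

if __name__ == "__main__":
    # Bind to localhost only; nginx terminates TLS on 443 and forwards here
    # (binding to port 80 needs privileges, so any free high port also works).
    app.run(host="127.0.0.1", port=80)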

Right way to deploy Drogon application to production

Many projects tend to use non-embedded web servers in production. The most popular examples are Spring (Java), PHP, and Flask (Python). Flask's website recommends that Flask not be used with its internal web server in production. The same goes for Spring.
It seems to me that Drogon has an internal web server. Is it supposed to be used in production? If not, how do I use it with a web server like Apache or Nginx?
It's not that you can't use the Flask Internal web server.
It's that you really shouldn't.
For Flask specifically, I would recommend using
Gunicorn
You can use NGINX as your reverse proxy here and Gunicorn as your web server.
There is a guide on how to do just that here:
https://www.digitalocean.com/community/tutorials/how-to-serve-flask-applications-with-gunicorn-and-nginx-on-ubuntu-18-04
Internal web servers are almost never supposed to be used in production.
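As a concrete illustration of that recommendation, Gunicorn can be driven by a plain-Python config file. The sketch below is only a hedged example: the app_service:app module/callable names are borrowed from the Flask snippet earlier on this page, and the values are placeholders, not tuning advice.

# gunicorn.conf.py -- minimal sketch of a Gunicorn configuration file.
bind = "127.0.0.1:8000"   # nginx reverse-proxies to this address
workers = 2               # number of worker processes
timeout = 30              # seconds before a stuck worker is recycled
accesslog = "-"           # write the access log to stdout

It would then be started with something like gunicorn -c gunicorn.conf.py app_service:app, with nginx's proxy_pass pointing at 127.0.0.1:8000.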
Well, you can use screen to start the application as a daemon
https://linuxhint.com/screen_command_ubuntu/
Something like that:
screen
cd <your_dir>/build && git pull && cmake .. && make && ./dbcpp.com --pid=default
So now you have your app started, and then you may just use nginx as a proxy if you want
https://gist.github.com/soheilhy/8b94347ff8336d971ad0
Like this:
server {
    listen ...;
    ...
    location / {
        proxy_pass http://127.0.0.1:8080;
    }
}

Run nodejs app through HTTPS

I have a node app that is set up over SSH by running node osjs run --hostname=dc-619670cb94e6.vtxfactory.org --port=4100.
It starts at http://dc-619670cb94e6.vtxfactory.org:4100/ without problems, but instead I want to serve it through HTTPS at https://dc-619670cb94e6.vtxfactory.org:4100/ , where I receive an error ERR_CONNECTION_CLOSED.
If I use the port I'm unable to reach it with HTTPS, but https://dc-619670cb94e6.vtxfactory.org/ is accessible.
How can I serve port 4100 through HTTPS?
Thanks.
This is an implementation detail of OS.js. Their docs recommend setting up a reverse proxy for servers. Doing this will give you more control over SSL and ports, as you want:
https://manual.os-js.org/installation/

Run node.js on cpanel hosting server

Here is some simple Node.js code:
var http = require('http');
http.createServer(function(req, res) {
    res.writeHead(200, { 'Content-Type' : 'text/plain'});
    res.end('Hello World!');
}).listen(8080);
I uploaded it to a cPanel hosting server, installed Node.js, and ran it.
If it were a normal server, I could check the script's result by accessing 'http://{serverip}:8080'. But cPanel hosts a domain and subdomains, and each domain is matched to its own site; even http://{serverip} is not a valid URL.
How can I access my Node.js result?
Thanks.
Install/Setup NodeJS with CPanel
1. Log in to your account using SSH (if it is not enabled for your account, contact the support team).
2. Download Node.js
wget https://nodejs.org/dist/latest/node-v10.0.0-linux-arm64.tar.xz
3. Extract the Node.js files
tar xvf node-v10.0.0-linux-arm64.tar.xz
4. Now rename the folder to "nodejs". To do this, type the following command:
mv node-v10.0.0-linux-arm64 nodejs
5. Now to install the node and npm binaries, type the following commands:
mkdir ~/bin
cp nodejs/bin/node ~/bin
cd ~/bin
ln -s ../nodejs/lib/node_modules/npm/bin/npm-cli.js npm
6. Node.js and npm are installed on your account. To verify this, type the following commands
node --version
npm --version
The ~/bin directory is in your path by default, which means you can run node and npm from any directory in your account.
7. Start Node.js Application
nohup node my_app.js &
8. Stop the Application
pkill node
9. Integrating a Node.js application with the web server (optional)
Depending on the type of Node.js application you are running, you may want to be able to access it using a web browser. To do this, you need to select an unused port for the Node.js application to listen on, and then define server rewrite rules that redirect visitors to the application.
In a text editor, add the following lines to the .htaccess file in the /home/username/public_html directory, where username represents your account username:
RewriteEngine On
RewriteRule ^$ http://127.0.0.1:XXXXX/ [P,L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.*)$ http://127.0.0.1:XXXXX/$1 [P,L]
In both RewriteRule lines, replace XXXXX with the port on which your Node.js application listens.
To run a Node.js application on a managed server, you must select an unused port, and the port number must be between 49152 and 65535 (inclusive).
Save the changes to the .htaccess file, and then exit the text editor. Visitors to your website are redirected to the Node.js application listening on the specified port.
If your application fails to start, the port you chose may already be in use. Check the application log for error codes like EADDRINUSE that indicate the port is in use. If it is, select a different port number, update your application’s configuration and the .htaccess file, and then try again.
cPanel typically runs Apache or another web server that is shared among all the cPanel/unix accounts. The web server listens on port 80. Depending on the domain name in the requested URL, the web server uses "Virtual Hosting" to figure out which cPanel/unix account should process the request, i.e. in which home directory to find the files to serve and scripts to run. If the URL only contains an IP address, cPanel has to default to one of cPanel accounts.
Ordinarily, without root access, a job run by a cPanel account cannot listen on port 80. Indeed, the available ports might be quite restrictive. If 8080 doesn't work, you might try 60000. To access a running node.js server, you'll need to have the port number it's listening on. Since that is the only job listening on that port on that server, you should be able to point your browser to the domain name of any of the cPanel accounts or even the IP address of the server, adding the port number to the URL. But, it's typical to use the domain name for the cPanel account running the node.js job, e.g. http://cPanelDomainName.com:60000/ .
Of course port 80 is the default for web services, and relatively few users are familiar with optional port numbers in URLs. To make things easier for users, you can use Apache to "reverse proxy" requests on port 80 to the port that the node.js process is listening on. This can be done using Apache's RewriteRule directive in a configuration or .htaccess file. This reverse proxying of requests arguably has other benefits as well, e.g. Apache may be a more secure, reliable and manageable front-end for facing the public Internet.
Unfortunately, this setup for node.js is not endorsed by all web hosting companies. One hosting company that supports it, even on its inexpensive shared hosting offerings, is A2Hosting.com. They also have a clearly written description of the setup process in their Knowledge Base.
Finally, it's worth noting that the developers of cPanel are working on built-in node.js support. "If all of the stars align we might see this land as soon as version 68," i.e. perhaps early 2018.
References
Apache Virtual Hosting -
http://httpd.apache.org/docs/2.4/vhosts/
Apache RewriteRule Directive - http://httpd.apache.org/docs/2.4/mod/mod_rewrite.html
A2Hosting.com Knowledge Base Article on Configuring Node.js - https://www.a2hosting.com/kb/installable-applications/manual-installations/installing-node-js-on-managed-hosting-accounts
cPanel Feature Request Thread for node.js Support - https://features.cpanel.net/topic/nodejs-hosting
Related StackOverflow Questions
How to host a Node.Js application in shared hosting
Why node.js can't run on shared hosting?
Yes, it's possible, but it has a few dependencies which may or may not be supported by your cPanel hosting provider or the plan you opted for.
The steps I mention below are just for demo purposes. If you are a student or just want to play with it, you can try it out. I'm not a security expert, so I really don't know how good this is from a security point of view.
With that being said, let's see how I configured it. I have a Hostinger cPanel hosting subscription, and the following are the steps:
Enable SSH access
Connect to the shared machine via SSH
Check your Linux distro, then download & set up Node.js
In my case the following are the commands for that:
Download Node & extract it using curl:
curl https://nodejs.org/dist/v12.18.3/node-v12.18.3-linux-x64.tar.gz |tar xz
This will download and extract Node and create a directory, which you can confirm with the ls command.
At this point you can check the versions. The node command works as-is, but the npm command has to be invoked like this:
./node-v12.18.3-linux-x64/bin/node ./node-v12.18.3-linux-x64/lib/node_modules/npm/bin/npm-cli.js --version
Further, we can create an alias to make life a little easier. I tried using bashrc/bash_profile, but somehow it didn't work.
And that's it: the Node server is running on a shared cPanel machine.
Now I wanted to have Express-based REST API support. The problem with that is it will be hosted locally on whatever port I give it. Check the example below:
var express = require('express')
var app = express()

app.get('/', function (req, res) {
    res.send('hosting node js base express api using php & shared hosting a great way to start yjtools')
})

console.log("listening yjtools node server on port 49876...")
app.listen(49876)
The problem here is that even though it will execute, I won't be able to access it over the network. This is because only fixed, predefined ports (like 80, 21, 3306, etc.) are allowed/open on the shared cPanel machine. Because of this, the Express app I hosted will only be available locally on port 49876.
Let's see what we have:
An Express-based app hosted locally on the cPanel machine.
A PHP-based Apache server available over HTTP/HTTPS.
So we can make use of PHP with a rewrite rule and curl to bridge the gap.
The following are the changes I made to make it work:
In the .htaccess file, add a rewrite rule; say domain/api is what I want my REST API path to be.
RewriteRule api/(.*)$ api/api.php?request=$1 [QSA,NC,L]
In the api/api.php file (this is the path I chose; you can choose any path):
<?php
echo "Hello ".$_REQUEST['username'];
echo '<hr>';

// Forward the request to the local Express app (same port the app listens on).
$curl = curl_init('http://127.0.0.1:49876/');
curl_setopt($curl, CURLOPT_HEADER, 1);
curl_setopt($curl, CURLOPT_RETURNTRANSFER, 1);

//Get the full response
$resp = curl_exec($curl);

if($resp === false) {
    //If we couldn't connect, report the error
    echo 'Error: ' . curl_error($curl);
} else {
    //Split response headers and body
    list($head, $body) = explode("\r\n\r\n", $resp, 2);
    $headarr = explode("\n", $head);

    //Print headers
    foreach($headarr as $headval) {
        header($headval);
    }

    //Print body
    echo $body;
}

//Close connection
curl_close($curl);
?>
And at the SSH prompt, just run the app.js file:
node api/app.js
Here is a similar thing that I referred to for my program; we can also make this Node call via PHP itself.
Now I have Express-based REST API support, an Angular app hosted, and MySQL as the database, everything on cPanel.
You can use any domain pointed to that cPanel server, and instead of accessing http://server-ip:8080 try accessing http://domain.tld:8080. By default cPanel does not bind on port 8080. Be sure to check if there is any firewall on the server; if there is, allow incoming connections on TCP port 8080. Depending on your WHM server configuration, it should also work with http://server-ip:8080.
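If it is unclear whether the Node process or the firewall is the problem, a small hedged Python check like the one below (run on the server itself; the host and port are placeholders) tells you whether anything is listening on the port at all:

import socket

def port_open(host, port, timeout=2.0):
    # Try a plain TCP connection; success means something is listening there.
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    print("127.0.0.1:8080 reachable:", port_open("127.0.0.1", 8080))

If the local check succeeds but the port is unreachable from outside, the firewall (or the hosting plan's port policy) is the more likely culprit.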
cPanel Version 80 has nodejs 10.x support: https://documentation.cpanel.net/display/80Docs/80+Release+Notes#id-80ReleaseNotes-InstallanduseNode.jsapplications
Install and use Node.js applications
You can now install and use Node.js applications on your server. To use Node.js, install the ea-nodejs10 module in the Additional Packages section of WHM's EasyApache 4 interface (WHM >> Home >> Software >> EasyApache 4).
You can register Node.js applications in cPanel's Application Manager interface (cPanel >> Home >> Software >> Application Manager). For more information, read our Guide to Node.js Installations documentation.
For Application Manager to be enabled: https://documentation.cpanel.net/display/80Docs/Application+Manager
Your hosting provider must enable the Application Manager feature in WHM's Feature Manager interface (WHM >> Home >> Packages >> Feature Manager).
Your hosting provider must install the following Apache modules:
The ea-ruby24-mod_passenger module. Note: This module disables Apache's mod_userdir module.
The ea-apache24-mod_env module. Note: This module allows you to add environment variables when you register your application. For more information about environment variables, read the Environment Variables section below.
The ea-nodejs10 module if you want to register a Node.js™ application.
You can see what Application Manager looks like in this YouTube video:
https://www.youtube.com/watch?v=ATxMYzLbRco
For anyone who wants to know how to deploy a Node.js app to cPanel, this is a good source; it explains the process thoroughly, so please check it.

Why does swapping between container IP and alias cause difference in AJAX request?

I have a small sample project located here that illustrates the problem I am seeing when working with a nginx + node + host docker stack.
I have 2 containers:
A Node (Express) application that simply returns a JSON object. It is CORS-enabled based on this website. It has its port published to the host via 3000:80.
An nginx server that is also CORS-enabled based on this website. It only serves static content (index.html and main.js files) from the default location (/usr/shared/nginx/html). Its port is published via 8080:80.
When running the containers individually from the host, I can access the node server and see the JSON object being returned. When I access the nginx server, I see my index.html and the JavaScript code from main.js runs.
Now I have the node app container linked to the nginx server container. From inside my main.js file of the nginx container, I attempt to access the server at http://nodeapp/api. I am seeing a CORS error:
XMLHttpRequest cannot load http://nodeapp/api. No
'Access-Control-Allow-Origin' header is present on the requested
resource. Origin 'http://localhost:8080' is therefore not allowed
access.
The strange thing is, the response header indicates it is coming from nginx and not my express application as I would expect. The nginx container is also not logging anything.
Things that worked
If I change the url for the XMLHttpRequest to the node container's IP (say 172.17.0.2) it works as expected and the response header indicates it is coming from the express server. In my /etc/hosts file there is an entry:
172.17.0.2 nodeapp abc123ContainerId quickserve_nodeapp_run_1
When I curl the node container from an interactive tty container it also works as expected.
If I load the node container and use http-server (server on host) it works as expected and the response header indicates it is coming from the express server.
Just in case it has an influence, an old thread (2013) mentioned a CORS option on the docker daemon.
Nowadays (Q4 2015), the docker daemon includes:
--api-cors-header="" Set CORS headers in the remote API
To set cross origin requests to the remote api please give values to --api-cors-header when running Docker in daemon mode. Set * (asterisk) allows all, default or blank means CORS disabled
$ docker -d -H="192.168.1.9:2375" --api-cors-header="http://foo.bar"
That might be a setting to use in your case.
