Objective: Give my Django app (Python backend [written] and React/Redux/JS frontend [written]) a Smartsheet API OAuth page that redirects users off the main website until auth is done, then back to the website (which uses the Smartsheet API in its features).
Crux: A hunch said the OAuth should be in Node.js to better match the front end, and I found working sample code for doing Smartsheet OAuth in Node. It worked great on its own! But when I tried to integrate this Node.js page into Django, I got an error from whichever server I started second, saying there is already a server on that local URL (127.0.0.1:{port}). Maybe the OAuth should be written in Python instead, but I couldn't find sample code for it, so it would be great if I could keep it in Node.
Question: Is there a way to deploy components from both Node and Django to the same server/domain? It's weird to have users go to one site, finish their OAuth, only to be pushed to a totally separate domain. This may also pose security risks.
My attempt-
I thought I would just create a simple landing page and then, after the user is logged in, send them onward into the website (via a redirect URL). This is what the Django urls.py would look like:
from django.urls import path
from . import views

urlpatterns = [
    path('', views.oauth),  # views.oauth is fairly blank; I wanted my Node.js server
                            # to listen at that hostname:port
    path('loggedin/', views.index),  # when OAuth ended, I wanted it to send the user here
]
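For context, the Node side would then only need to hand the browser back to that loggedin/ route once OAuth completes. A minimal Express sketch of that callback (the route, port, and exchangeCodeForToken() below are placeholders, not the actual Smartsheet sample code):
const express = require("express");
const app = express();

// Stand-in for whatever the sample code does to trade the authorization
// code for an access token and store it for this user.
async function exchangeCodeForToken(code) {
  // call Smartsheet's token endpoint here
}

app.get("/oauth/callback", async (req, res) => {
  await exchangeCodeForToken(req.query.code);
  // Hand the browser back to the Django-served page.
  res.redirect("/loggedin/");
});

app.listen(3000);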
This attempt produced these errors:
Django Error
Node Error
Thanks for any ideas on my inquiry!
Microservices are basically small components interconnected by REST or some other kind of API, and what you are describing IS a microservice architecture.
Now, you can deploy Django on some port, let's say 8090, and Node.js on another port, let's say 8080.
To connect them you need some kind of reverse proxy; the easiest would be nginx.
So, the rules will be like this.
If the URL is host/api, forward traffic to Node.js on 127.0.0.1:8080.
Otherwise, forward traffic to Django on 127.0.0.1:8090.
An example would be this question: NGINX - Reverse proxy multiple API on different ports
server {
    listen 443;
    server_name localhost;

    proxy_set_header Host $host;
    proxy_set_header X-Forwarded-For $remote_addr;

    location /api/orders {
        proxy_pass https://localhost:5000;
    }

    location /api/customers {
        proxy_pass https://localhost:4000;
    }
}
So the rule says that if the location is /api/orders, go to localhost:5000, and if it's /api/customers, go to the other one, on port 4000.
Now you can read up on reverse proxies and come up with your own rule, for example routing your OAuth path to the Node server and everything else to Django.
Related
Looking at the following scenario, I want to know if this can be considered a good practice in terms of architecture:
A React application that uses the NextJS framework to render on the server. As the application's data changes often, it uses Server-side Rendering ("SSR" or "Dynamic Rendering"). In terms of code, it fetches data in the getServerSideProps() function, meaning it will be executed by the server on every request.
In the same server instance, there is an API application running in parallel (it can be a NodeJS API, Python Flask app, ...). This app is responsible for querying a database, preparing models and applying any transformations to the data. The API can be accessed from the outside.
My question is: How can NextJS communicate with the API app internally? Is it a correct approach for it to send requests via a localhost port? If not, is there an alternative that doesn't require NextJS to send external HTTP requests back to the same server it is running on?
One of the key requirements is that each app must remain independent. They are currently running on the same server, but they come from different code-repositories and each has its own SDLC and release process. They have their own domain (URL) and in the future they might live on different server instances.
I know that in NextJS you can use libraries such as Prisma to query a database, but that is not an option: data modeling is managed by the API and I want to avoid duplicating that effort. Also, once the NextJS app is rendered on the client side, React will keep calling the API app via normal HTTP requests, to keep the front-end experience dynamic.
This is a very common scenario where the frontend application runs independently from the backend. A reverse proxy usually helps here.
Here is a simple way I would suggest to achieve it (and it is also one of the best ways):
1. Use a different port for your frontend and backend applications.
2. All API routes should start with a specific prefix like /api, and your frontend routes must not start with /api.
3. Use a web server (I mostly use Nginx, which helps with virtual hosts, reverse proxying, load balancing and also serving static content).
So in your Nginx config file, add/update the following location/route blocks:
## for the backend/API, assuming the backend is running on port 3000 on the same server
location /api {
    proxy_pass http://localhost:3000;

    ## The following lines are required if you are using WebSocket;
    ## I usually add them even when not using WebSocket.
    proxy_http_version 1.1;
    proxy_set_header Upgrade $http_upgrade;
    proxy_set_header Connection 'upgrade';
    proxy_set_header Host $host;
    proxy_cache_bypass $http_upgrade;
}

## for the frontend, assuming the frontend is running on port 5000 on the same server
location / {
    proxy_pass http://localhost:5000;
}
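On the internal-communication part of the question: with this setup, the Next.js server can call the API directly over localhost (bypassing Nginx), while the browser keeps using the public /api path. A minimal sketch, assuming the backend port from the config above (3000) and a hypothetical /api/orders endpoint:
// pages/orders.js -- sketch only; INTERNAL_API_URL and /api/orders are assumptions
export async function getServerSideProps() {
  // Server-side: talk to the API over the loopback interface, not through Nginx.
  const base = process.env.INTERNAL_API_URL || "http://localhost:3000";
  const res = await fetch(`${base}/api/orders`);
  const orders = await res.json();
  return { props: { orders } };
}

export default function Orders({ orders }) {
  // Client-side: later requests go to the public /api route, which Nginx proxies.
  return <pre>{JSON.stringify(orders, null, 2)}</pre>;
}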
I'm having trouble authenticating a user in my NodeJS / React application. I know the question is not strictly about programming, but maybe someone can help me. On to the question.
I created a work-schedule management application for the company's employees; the API is in Node JS and the front end in React. As employees are in Rio and São Paulo, they need access via the web, so we set up a server with a fixed IP for the Rio people's access. We installed Ubuntu 20.04 with Nginx, NodeJS and MongoDB.
So far so good: I installed the application on the server and it works perfectly, but when it is accessed from another machine, inside or outside the network, there is an authentication error.
I've looked into how to pass the header and read a lot of articles trying to configure Nginx to accept the header that carries the JSON Web Token.
In the application I pass a header that I called x-auth-token, which I can see in the browser when I'm on the server, but I can't find this header when I'm on another machine, whether on the network or outside it. So I believe that is why I get the authentication error; I just don't know how to solve it.
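For reference, this is roughly how the front end attaches the token (a simplified sketch, not my exact code; setAuthToken is just a placeholder name):
// api.js -- simplified sketch of how the client sends the JWT
import axios from "axios";

// Relative base URL so requests go through the Nginx proxy below.
const api = axios.create({ baseURL: "/" });

// Called after login to attach the JWT to every request.
export function setAuthToken(token) {
  api.defaults.headers.common["x-auth-token"] = token;
}

export default api;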
I created this location block in Nginx so that when users reach the server, the application is served right away. I changed this code a little and will update it here too:
location / {
    proxy_pass https://xxx.xxx.x.xxx:3000;
    proxy_set_header X-Forwarded-For $remote_addr;
    proxy_set_header Host $http_host;
    proxy_pass_header X-Auth-Token;
}
The xxx is there so I don't expose the server's external IP.
I'm a bit new to node/react.
I have an API/Express Node app, and inside that app I have a React app. The React app has axios.get commands and other API calls. The React dev server takes the API calls I make and forwards them to the proxy I set up in the React app's package.json. In dev the proxy looked like this: "proxy": "http://localhost:3003/", but now that I'm going to production I'm trying to change this proxy to the URL where I'm hosting my Node/Express app: "proxy": "http://168.235.83.194:83/".
When I moved my project to production I put the API Node app on port 83 and the React app on port 84 (with nginx). For whatever reason, though, my React app just doesn't know how to make the API requests to the Node app; I'm getting blank data.
After googling I've come to realize that the 'proxy' setting only applies to requests made to the development server. Normally in production you have one server that serves the initial page HTML and also serves API requests, so requests to /api/foo naturally work; you don't need to specify a host.
This is the part I'm trying to figure out. If someone can tell me how to setup my app so that /api/foo naturally works that would be greatly appreciated.
I took a stab at setting that up properly. This is probably a complete failure as an approach, but it's late and I'm going to fall asleep on this problem. Am I supposed to have nginx serve both the static HTML and the API requests from one config file? I have this so far, but I could be way off here...
server {
    listen 84;
    server_name 168.235.83.194;

    root /home/el8le/workspace/notes/client/build;
    index index.html index.htm;

    location / {
    }

    location /api {
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-NginX-Proxy true;
        # I have nginx hosting my API app on this port. Not even sure if this should be like this?
        proxy_pass http://168.235.83.194:83/;
        proxy_ssl_session_reuse off;
        proxy_set_header Host $http_host;
        proxy_cache_bypass $http_upgrade;
        proxy_redirect off;
    }
}
Also, I'm actually hosting on those ip addresses if you want to get a better sense of where I am at:
http://168.235.83.194:84
http://168.235.83.194:83/customers
You will have to supply the actual API URL when making data requests in production. The dev server is able to proxy to a different API URL: when the app is loaded from the dev server, a data request like /api/customers goes to the dev server first, and the dev proxy pipes it to the API you configured (http://localhost:3003/api/customers in your case).
But in production, the same relative request resolves against your app's own base address and tries to get the data from http://PRODUCTION_SERVER:84/api/customers, which is the React app, not the API.
The correct way to handle this is to use an absolute URL instead of a relative URL. And as production and development have different base URLs, keep them in a config variable and append the specific API path to that base address, something like ${BASE_URL}/api/customers, where BASE_URL would be http://localhost:3003 in dev and http://168.235.83.194:83 in production.
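A sketch of what that looks like on the React side (the env variable name is an assumption; CRA only exposes variables prefixed with REACT_APP_ at build time):
// api.js -- sketch only; REACT_APP_API_URL is set per environment at build time
import axios from "axios";

const BASE_URL = process.env.REACT_APP_API_URL || "http://localhost:3003"; // dev API

const api = axios.create({ baseURL: BASE_URL });

// In production, with REACT_APP_API_URL=http://168.235.83.194:83,
// this requests http://168.235.83.194:83/customers.
export const getCustomers = () => api.get("/customers");

export default api;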
“If #nginx isn’t sitting in front of your node server, you’re probably doing it wrong.”
— Bryan Hughes via Twitter
For a short while now, I have been making applications with Node.js and, as Mr. Hughes advises, serving them with Nginx as a reverse proxy. But I am not sure that I am doing it correctly, because I can still access my Node.js application over the internet without going through the Nginx server.
At its core, the typical application is quite simple. It is served with ExpressJS like this:
var express = require("express");
var app = express();
// ...
app.listen(3000);
and Nginx is configured as a reverse-proxy like so:
# ...
location / {
    proxy_pass http://127.0.0.1:3000;
}
# ...
And this works wonderfully! However I have noticed a behaviour that I am not sure is desirable, and may defeat a large portion of the purpose of using Nginx as a reverse-proxy in the first place:
Assuming example.org is a domain name pointing to my server, I can navigate to http://www.example.org:3000 and interact with my application from anywhere, without touching the Nginx server.
The typical end user would never have any reason to navigate to http://<whatever-the-server-host-name-or-IP-may-be>:<the-port-the-application-is-being-served-on>, so this would never affect them. My concern, though, is that this may have security implications that a not-so-end-user could take advantage of.
Should the application be accessible directly even though Nginx is being used as a reverse-proxy?
How can you configure the application so that it is only available to the local machine / network / Nginx server?
It is best for the app not to be directly accessible (imho).
You can specify the hostname to accept connections on:
app.listen(3000, 'localhost');
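Put together, the relevant part of the Express app would look like this (a minimal sketch; port 3000 matches the config above):
var express = require("express");
var app = express();

// ...routes...

// Bind only to the loopback interface so Nginx on the same machine can
// reach the app, but the internet cannot hit port 3000 directly.
app.listen(3000, "127.0.0.1", function () {
  console.log("Listening on 127.0.0.1:3000 (behind Nginx)");
});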
I have written some code using express.js which I'd like to put on the same HTTP port as a web application implemented with another technology (in this case, Django). I don't want to have to redirect user browsers to another port since if I do they might bookmark URLs with the other port, and then I lose the ability to reconfigure the arrangements later. So, I'd like express.js to serve HTTP on its port, fulfilling some paths I specify by making HTTP requests to a secondary web application which is being served on another port.
Is there any middleware or other technique for express.js which will serve certain paths by making HTTP requests to other servers?
(The stackoverflow question How to make web service calls in Expressjs? is relevant but only discusses GET and I will have some POST requests to forward.)
Whilst it is possible to make POST requests with node, I think the pattern you're describing is better suited to using a server like nginx or apache in front of both node.js and django, proxying requests to whichever port is appropriate based on the request.
Typically, both django and node.js would listen on whichever ports you want them to listen on, while nginx listens on port 80. You then define a virtual host in nginx that forwards certain requests to node.js and certain requests to django.
Here are the nginx docs on using proxy_pass.
Here is an example, modified from the nginx Full Example:
server { # simple reverse-proxy
    listen 80;
    server_name domain2.com www.domain2.com;

    # serve static files
    location ~ ^/(images|javascript|js|css|flash|media|static)/ {
        root /var/www/virtual/big.server.com/htdocs;
    }

    # pass requests for dynamic content to django
    location /djangostuff {
        proxy_pass http://127.0.0.1:8080;
    }

    # pass requests for node
    location /nodestuff {
        proxy_pass http://127.0.0.1:8081;
    }
}
With node-http-proxy, only a single call to app.use is required to reverse-proxy all unhandled requests, like this:
var app = express.createServer();
// my app.get bindings
app.use(require('http-proxy').createServer(80, 'other-server-address'));
app.listen(80);
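If the old node-http-proxy API gives you trouble, roughly the same thing can be done with the separate http-proxy-middleware package; this is only a sketch under that assumption, and it forwards POST bodies as well:
const express = require("express");
const { createProxyMiddleware } = require("http-proxy-middleware");

const app = express();

// Express serves its own routes...
app.get("/nodestuff", (req, res) => res.send("served by express"));

// ...and everything under /djangostuff is proxied to the Django server.
app.use(
  "/djangostuff",
  createProxyMiddleware({ target: "http://127.0.0.1:8080", changeOrigin: true })
);

app.listen(80);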