How to debug reactJS "Could not proxy request" - node.js

I am attempting to deploy a PERN app using this guide. At around 22:14 he talks about using a proxy to remove the 'localhost' prefix from the fetch requests.
I have followed this exactly, resulting in this for the proxy:
"proxy": "http://localhost:5000",
and this for how I request:
const response = await fetch("/auth/is-verify", {
method: "GET",
headers: { token: localStorage.token }
})
This always gives me the error:
Proxy error: Could not proxy request /auth/is-verify from localhost:3000 to http://localhost:5000.
See https://nodejs.org/api/errors.html#errors_common_system_errors for more information (ECONNREFUSED).
I have tried:
deleted node_modules and package-lock.json, reinstalled, and restarted
added a '/' to the proxy
restarted computer between attempts
used incognito mode (as there are no cookies/cache)
navigated to the http://localhost:5000/auth/is-verify route directly (works perfectly fine: the response text from the server is displayed on a blank page, as expected for direct route access)
changed the proxy to the network address shown when running npm start on the React app
used 127.0.0.1 instead of localhost
I have no clue how to test for causes further. All my research has led me to people saying to "make sure the server is up" and other general advice, but these general solutions don't help in this case.
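One extra check I could run (not from the guide; the script name is made up) is to hit the backend from a plain Node script, since the dev-server proxy makes its requests from Node rather than from the browser:
// check-proxy-target.js -- hypothetical helper, run with: node check-proxy-target.js
const http = require('http');

http.get('http://localhost:5000/auth/is-verify', (res) => {
  console.log('status:', res.statusCode);
  res.resume(); // drain the response so the socket is released
}).on('error', (err) => {
  // ECONNREFUSED here would mean Node itself cannot reach the port,
  // even though the browser can
  console.error('request failed:', err.code);
});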

Not sure my case is the same as yours, but I encountered a quite similar problem and found a way to fix it for me, so maybe it will fix it for you and others as well:
My setup was: I used WSL, my backend used a cygdrive (Cygwin) shell (the reason being I needed it for another project with the same IDE), and my frontend used the WSL shell. Both my frontend and my backend were on the Windows part of WSL.
After moving both to the Linux part of WSL, my backend started using the WSL shell as well, and this error disappeared.
My guess would be that, because of cygdrive, localhost meant something different for my frontend and my backend, but I can't guarantee it.
Best of luck.

Related

Getting images from nodejs server with React

I'm having an issue where I can only retrieve image files from Node.js through my outside IP.
This is the only way that works:
src={`http://<outside ip>:5000/${<img>}`}
I have tried these variations to fix it: (abbreviated for simplicity)
/api/<img> (the /api/ route is used by nginx to route to nodejs)
localhost:5000/<img>
http://localhost:5000/<img>
http://localhost:5000/api/<img>
http://0.0.0.0:5000/<img>
http://0.0.0.0:5000/api/<img>
http://127.0.0.1:5000/<img>
http://127.0.0.1:5000/api/<img>
http://api/<img>
My problem is that I have to expose port 5000 to the outside world for this to work.
I can't figure out why none of the localhost versions pull images. Security is a concern, and it also seems that speed has taken a massive hit.
Is this an nginx problem? A CentOS problem not allowing localhost to send data? Or am I just doing this wrong?
TYIA
Hard to say without seeing your code, but it could be a CORS issue?
If you're using express for the node service then you can see how to easily configure CORS here: https://expressjs.com/en/resources/middleware/cors.html
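For example, a minimal sketch with that cors middleware, assuming an Express app serving the images on port 5000 (the /api static route is just an illustration, not your actual setup):
const express = require('express');
const cors = require('cors'); // npm install cors

const app = express();
app.use(cors()); // allow cross-origin requests from any origin

// assumed static image route, for illustration only
app.use('/api', express.static('images'));

app.listen(5000);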
If you're using fetch from React to retrieve the data, you can also try passing the mode: 'no-cors' option like so:
fetch('http://localhost:5000/api/<img>', {
  method: 'GET',
  mode: 'no-cors',
}).then(result => {
  // Do whatever you need to do
}).catch(err => {
  // Handle error
})

Connecting front-end and back-end with react and mongo DB - 404 error

I am trying to connect an app hosted on my localhost:3000 port. The back-end is on the localhost:8080 port (not a remote API but on my own PC). I downloaded a pre-created back-end API and linked the front-end to the back-end with MongoDB using an .env file.
The weird thing is that on MongoDB the connection looks OK, following the tutorial I am using. The back-end and the front-end also look alright; however, I am unable to log in with the form in the screenshot. The error I get when trying to log in or create a new user is "xhr.js:178 POST http://localhost:3000/login 404 (Not Found)".
It was a bit hard to put the whole code here, so I am linking you to the front-end repo: https://github.com/manuelalonge/complex-app and the back-end repo: https://github.com/manuelalonge/back-end-api
I understand the issue is most likely on the back-end, but I could not figure out where exactly. I went back to the previous lessons in the tutorial, but it still isn't solved.
It is probably easier to solve this issue in a screen-share session, so if anybody contacts me I'll be grateful.
Screenshot --> [1]: https://i.stack.imgur.com/jVJzn.png
Your screenshot points to what's wrong. You're posting to localhost:3000, which, as you know, is the frontend webpack dev server.
You'll want to create an axios config file and set a base URL so requests hit the correct endpoint.
import axios from 'axios';

const axiosInstance = axios.create({
  baseURL: 'http://localhost:8080' // include the protocol so axios treats this as an absolute URL
});
export default axiosInstance;
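A login request would then go through that instance instead of hitting the relative /login path on the dev server; a rough sketch (file and field names assumed):
// hypothetical login helper using the instance above
import axiosInstance from './axiosInstance';

export async function login(username, password) {
  // POSTs to http://localhost:8080/login instead of the webpack dev server
  const response = await axiosInstance.post('/login', { username, password });
  return response.data;
}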
Also, please add some sort of file structure.

angular universal https problems

I have an Angular Universal app set up. I make POST requests on the server side using localhost to pre-render my app, and this works fine.
An example working URL would be http://localhost:8000/api/get-info.
I've now put the app into production on an external URL (Apache server). I'm also using SSL.
Now when I try to do a POST request on the server-side to pre-render my app, I get back a response with status: 0, url: null (I'm assuming this means the connection was refused).
An example non-working url would be https://mywebsite.com/api/get-info.
What really stumps me is that when the app loads on the client, all HTTPS requests start working. So the problem is that I cannot get the Express server to send POST requests to my external URL.
I've tested a POST request on the server side to a different website (Twitter), and that seems to work fine as well. So I'm not entirely sure where I've gone wrong.
I already have CORS set to '*' as well.
Try using
http://localhost:8000/api/get-info
in production as well. Since your Angular app is rendered on the same server your API is running on, using localhost should work just fine. It doesn't matter that you are on an external URL.
I do something similar (it's a GET but that shouldn't matter) with my translations:
if ( this.isServer ) {
  translateLoader.setUrl( 'http://localhost:4000/assets/localization/' );
} else {
  translateLoader.setUrl( 'assets/localization/' );
}
It works locally and in production (both server and client).
I just spent two days on this problem myself. Please take a look at my comment at https://github.com/angular/universal/issues/856#issuecomment-426254727.
Basically, I added a conditional check in Angular to see whether the app is running in the browser or on the server (rendered by Angular Universal), and changed my API endpoint to the actual IP over https or to localhost over http accordingly. Also, in my Nginx settings, I only redirect incoming requests from the browser to https by checking whether the server_name is localhost.
Hope it helps!
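In code, the idea looks roughly like this (a sketch only; the isServer flag and the URLs stand in for whatever platform check and addresses your app actually uses):
// choose the API base depending on where the code is running
const apiBase = isServer
  ? 'http://localhost:8000/api'   // server-side rendering: call the API locally over http
  : 'https://mywebsite.com/api';  // in the browser: use the public https URL

// e.g. this.http.post(`${apiBase}/get-info`, body);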

How to properly configure Browsersync to proxy backend

I'm struggling with the proper configuration of Browsersync (and maybe some middleware?).
My configuration looks like this:
local.example.com is my local address, configured via /etc/hosts.
devel.example.com is our company's development environment (backend).
staging.example.com is our company's staging environment (backend).
As a UI developer, I want to use my local code but work against one of the backend environments.
I'm using gulp to build my project etc. It also has a task to run Browsersync and watch for file changes. But of course there is now a problem with the cookie domains coming from the backend: the CSRF token cookie's domain is set by the browser to the currently used backend.
I have tried:
Using the http-proxy-middleware middleware with this configuration:
server: {
  baseDir: './build',
  middleware: [
    proxyMiddleware('/api', {
      target: 'http://devel.example.com',
      changeOrigin: true,
    })
  ]
}
But the problem I have is that it does non-transparent redirects, which are visible in the browser console. I thought it would work so that the proxy masks those requests and the browser thinks all requests and responses are coming from local.example.com. But it seems it doesn't work like this (or maybe I configured it badly).
Another big problem with this solution is that it somehow changes my POST HTTP requests to GET (WTF?!).
Using the built-in Browsersync proxy option. In many tutorials I saw the proxy option used together with the server option, but that no longer seems to work, so I tried it with serveStatic like this:
serveStatic: ['./build'],
proxy: {
  target: 'devel.example.com',
  cookies: {
    stripDomain: false
  }
}
But this doesn't work at all...
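For completeness, the full init call for that second attempt looks roughly like this (a sketch of what the gulp task runs; the build path and target are the same assumptions as above):
const browserSync = require('browser-sync').create();

browserSync.init({
  // serve my local build output
  serveStatic: ['./build'],
  // proxy everything else to the devel backend
  proxy: {
    target: 'devel.example.com',
    cookies: {
      stripDomain: false // keep the cookie domain instead of stripping it
    }
  }
});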
I would really appreciate any help with this topic.
Thanks

Redirect an http request from a remote server to a local server using nodejs

There is a feature in a tool called Charles that allows you to map remote requests:
http://www.charlesproxy.com/documentation/tools/map-remote/
Basically, it can take any request to a server (even if you're not the one running it) and then make a new request to another server, preserving the path and the query string. The response from the second server then overwrites the response from the first server.
I just want to know if there is a Node module that can do this. I tried using http-proxy, but I have a feeling this map-remote tool is a bit different from a proxy, since it seems like you must own both servers to use a proxy.
EDIT: Tried using the http-proxy node module again, but can't seem to get it to work. Here's my code:
var http = require('http'),
    httpProxy = require('http-proxy');

httpProxy.createServer({
  hostnameOnly: true,
  router: {
    'www.stackoverflow.com': 'localhost:9000',
  }
}).listen(80);

// Create your target server
http.createServer(function (req, res) {
  res.writeHead(200, { 'Content-Type': 'text/plain' });
  res.write('request successfully proxied!' + '\n' + JSON.stringify(req.headers, true, 2));
  res.end();
}).listen(9000);
My expectation is that when I go to www.stackoverflow.com or www.stackoverflow.com:80, it will instead be redirected to my localhost:9000.
No, what you are asking for is indeed a simple proxy. And no, you don't have to "own" both servers to run a proxy. You simply proxy the request, and at that point you can modify the data however you wish.
The proxy module you mention will work fine, and there are many others. You can also do this with simple Nginx config if you wish.
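For instance, with the current node-http-proxy API, a bare-bones forwarder that replays every incoming request against another host (preserving path and query string) looks something like this; the target URL is just an example:
const http = require('http');
const httpProxy = require('http-proxy');

// forward every incoming request to the target, keeping the original path and query string
const proxy = httpProxy.createProxyServer({
  target: 'http://localhost:9000',
  changeOrigin: true // rewrite the Host header to match the target
});

http.createServer((req, res) => {
  proxy.web(req, res, {}, (err) => {
    // surface proxy failures instead of crashing the server
    res.writeHead(502);
    res.end('proxy error: ' + err.message);
  });
}).listen(8080);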
I made this pastebin with my solution:
http://pastebin.com/TfG67j1x
Save the content of the pastebin as proxy.js. Make sure you install dependencies in the same folder as the proxy.js file. (npm install http-proxy colors connect util --save)
When you run the proxy, it will:
start a new server listening on 8013, acting as a proxy server;
start a demo target server listening at 9013.
When accessing the demo target through the proxy, it modifies the "Ruby" string into "nodejitsu", for easy testing. If you are behind a corporate firewall/proxy, this script fails for now.
UPDATE: The problem with "headers already sent" was at lines 32/33. It turns out that several errors occurred on the same connection. When the first error occurs, headers are sent; when the second error occurs, headers have already been sent. As a result, a "headers already sent" exception is raised and the server is killed.
With this fix the server no longer dies, but it still does not address the source of the error, which is that Node.js cannot reach your target site. You must be behind another proxy/firewall, and Node.js would need to forward the HTTP request to a second proxy. If you normally use a proxy in your browser to connect to the Internet, my solution will fail. You did not specify this as a requirement, though.
You may verify this by accessing, through my proxy, a server inside your network (one that normally requires no proxy).
UPDATE 2: You should not try to access http://localhost:8013 directly; instead, set it as a proxy in your browser. Take note of your original browser proxy settings first (see above). Then try to access http://localhost:9013.
Did you add that proxy to your browser config? Otherwise the underlying OS would route your request directly to www.stackoverflow.com and there is no way your proxy is catching that.
Could you confirm that www.stackoverflow.com is ending up at your Node app at all? The name currently resolves to the IP address that leads you to this website, so you would have to make sure that name now resolves to your Node app. In this case, that probably means editing your hosts file.
