Getting images from nodejs server with React

I'm having an issue where I can only retrieve image files from Node.js through my outside IP.
This is the only way that works:
src={`http://<outside ip>:5000/${<img>}`}
I have tried these variations to fix it: (abbreviated for simplicity)
/api/<img> (the /api/ route is used by nginx to route to nodejs)
localhost:5000/<img>
http://localhost:5000/<img>
http://localhost:5000/api/<img>
http://0.0.0.0:5000/<img>
http://0.0.0.0:5000/api/<img>
http://127.0.0.1:5000/<img>
http://127.0.0.1:5000/api/<img>
http://api/<img>
My problem is that I have to expose port 5000 to the outside world for this to work.
I can't figure out why none of the localhost versions are pulling images. Although security is a concern, it seems that speed has taken a massive hit as well.
Is this an nginx problem? A CentOS problem, not allowing localhost to send data? Or am I just doing this wrong?
TYIA

Hard to say without seeing your code, but it could be a CORS issue?
If you're using express for the node service then you can see how to easily configure CORS here: https://expressjs.com/en/resources/middleware/cors.html
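For a plain Express service, a minimal setup with that middleware might look like the sketch below (the /api prefix and port 5000 are taken from the question; the images folder name is a placeholder):
const express = require('express');
const cors = require('cors');

const app = express();
app.use(cors()); // allow all origins by default; pass an options object to restrict
app.use('/api', express.static('images')); // placeholder folder holding the image files

app.listen(5000);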
If you're using fetch from React to retrieve the data, you can also try passing the mode: 'no-cors' option like so:
fetch('http://localhost:5000/api/<img>', {
  method: 'GET',
  mode: 'no-cors',
}).then(result => {
  // Do whatever you need to do
}).catch(err => {
  // Handle error
})

Related

Connecting front-end and back-end with react and mongo DB - 404 error

I am trying to connect an app hosted on my localhost:3000 port. The back-end is on the localhost:8080 port (not a remote API, but on my own PC). I downloaded a pre-created back-end API and linked the front-end to the back-end with MongoDB using an .env file.
The weird thing is that on MongoDB the connection looks OK, following the tutorial I am using. The backend and the front-end also look alright; however, I am unable to log in with the form in the screenshot. The error I get when trying to log in or create a new user is "xhr.js:178 POST http://localhost:3000/login 404 (Not Found)".
It was a bit hard to put the whole code here, so I am linking you to the front-end repo: https://github.com/manuelalonge/complex-app and the back-end repo: https://github.com/manuelalonge/back-end-api
I understand the issue is most likely on the back-end, but I could not figure out where exactly. I tried going back to previous lessons in the tutorial, but it still isn't solved.
It is probably easier to solve this issue in a screen-share session, so if anybody contacts me I'll be grateful.
Screenshot: https://i.stack.imgur.com/jVJzn.png
Your screenshot points to what's wrong: you're posting to localhost:3000, which, as you know, is the frontend webpack server.
You'll want to create an axios config file and set a base url to hit the correct endpoint.
import axios from 'axios';

const axiosInstance = axios.create({
  baseURL: 'http://localhost:8080'
});

export default axiosInstance;
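The components can then import this instance instead of the default axios, for example for the failing login call (a sketch; the file path and payload fields are assumptions):
import axiosInstance from './axiosInstance';

// the request now goes to http://localhost:8080/login
// instead of the webpack dev server on port 3000
axiosInstance.post('/login', { username: 'someUser', password: 'somePassword' })
  .then(res => console.log(res.data))
  .catch(err => console.error(err));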
Also, please add some sort of file structure.

How to pass routing control from Node server to client?

This is the part where all routes are handled on my Node server with an Angular front end (code below). When the user enters the site URL, the server sends all the static files for the browser to render. However, I would like the client to handle certain routes instead of them going directly to the server.
For example, if I enter www.exemple.com/page2, a GET request is sent to the server, but the route doesn't exist, so the request just hangs there and ultimately results in an error.
I want Angular to handle the routing instead of it going automatically to the server. I've only successfully got this to work on localhost, where the Angular app is served from a different port than the one the server listens to. Can anyone tell me how to achieve this? Thanks a lot.
module.exports = function (app, dir) {
  app.use(express.json());
  app.use(express.static(dir + '/dist/probeeshq'));
  app.get('/', (req, res) => { res.sendFile(path.join(dir)); });
  app.use('/auth', userAuth);
  app.use('/api/me', userData);
  app.use('/api/org', organization);
  app.use('/api/messaging', messaging);
  app.use('/api/search', search);
  app.use(error);
}
This is what I have in Angular
const routes: Routes = [
  { path: '', component: HomeComponent },
  { path: 'project_gaea', component: ProjectGaeaComponent },
  { path: 'dashboard', component: DashboardComponent, canActivate: [AuthGuardService] },
  { path: 'explore', component: ExploreComponent, canActivate: [AuthGuardService] },
  { path: 'create', component: CreateComponent },
  { path: 'user/:id', component: UserProfileComponent },
  { path: '**', component: PageNotFoundComponent }
];
You can achieve this by implementing the Route feature that Angular has out of the box. After you implement this, you can then just use your back-end as an API.
So it turns out that I was supposed to serve the application like this:
app.use('/', express.static(dir));
And Express will let Angular handle all the routing after failing to match a route on the server side. dir is just the path where the Angular app is.
I have the same issue. We can fix this with wildcard routes, with the client routes handling any unrecognised URLs. From a user-experience perspective that's fine, no issues there, like so:
app.get("*", (req, res) => {
// send HTML files
});
But what about the auditing side? A simple question like "if I send an unrecognised URL to the server, it should give a 404 status code instead of redirecting me to the client and showing a 404 page" makes a valid point, and had me doubting my knowledge of the web. To resolve this we would need to manually whitelist the client URLs on the server. I'm still figuring that out myself, so if there is a better solution please let me know.
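One possible way to do that whitelisting, reusing the Angular route paths from the question (a sketch; it assumes the same dir and dist folder as the server code above, and the user/:id check is an assumption):
const path = require('path');

// client-side routes copied from the Angular route config above
const clientRoutes = ['/', '/project_gaea', '/dashboard', '/explore', '/create'];

app.get('*', (req, res) => {
  if (clientRoutes.includes(req.path) || req.path.startsWith('/user/')) {
    // known client route: hand control to Angular
    res.sendFile(path.join(dir, 'dist/probeeshq', 'index.html'));
  } else {
    // anything unrecognised gets a real 404 from the server
    res.sendStatus(404);
  }
});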

Why is config.proxy ignored when making an axios request within a webpack project?

My goal
I want to perform a request with axios#0.18.0 through a fully working HTTP proxy (squid). My project is a Vue project based on the webpack template (vue init webpack proxytest).
The issue
When I try to perform the request, axios 'ignores' the proxy property inside the config object passed.
I noticed that when I run the exact same code with pure nodejs, everything works just perfectly fine.
Is there some configuration I need to specify, besides the axios request configuration, when using axios as an npm module within webpack?
The code
import axios from 'axios';

const config = {
  proxy: {
    host: 'host',
    port: 3128,
  },
  method: 'GET',
};

axios('http://www.bbc.com/', config).then((res) => {
  console.log(res);
}).catch((err) => {
  console.error(err);
});
Of course, when testing, I change 'host' to the proxy IP.
I tried to change the method property to POST in order to check if axios considered the config. It does consider the config passed.
I tried to put a fake port so I could check if the proxy property was considered. It's not considered.
The output
output...
Now, I'm aware of what CORS is. The point is that I'm constantly getting this output when performing the requests. And if the proxy were used by axios, I don't think any CORS "error" would show up, as my proxy is a VPS.
Thank you.
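For reference, the plain-Node run mentioned above (axios's proxy option is applied by its Node http adapter; a browser XHR request cannot set a proxy itself) would look roughly like this sketch, with the same placeholder host:
// proxy-test.js — run with: node proxy-test.js
const axios = require('axios');

axios.get('http://www.bbc.com/', {
  proxy: {
    host: 'host', // replace with the squid proxy IP, as in the question
    port: 3128,
  },
}).then((res) => {
  console.log(res.status);
}).catch((err) => {
  console.error(err.message);
});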
You need to configure your own server to receive the requests and then test against it. This does not seem to have anything to do with webpack; the error, for example, comes from requesting the BBC site from localhost, and it is very likely that this is what causes it. So it's important to test against your own server, running the front end and back end locally.

request to nodejs proxy: Provisional headers are shown

In my web app project, I came across a cross-domain issue. The back-end API was deployed to a server, which provides the IP and port (we can assume the back-end API works well). I am working on the front-end side and still in the local development stage, so there is a cross-domain issue.
So I used a Node.js proxy to try to overcome the cross-domain issue. I am not an expert on the Node.js proxy part; I just got it from my teammates, who met the same issue before.
In my client-side JS code, I use the axios library to send an HTTP request to the Node.js proxy as follows:
axios.get('http://127.0.0.1:8888/service/delete/v1?id=5').then((response) => {
  console.log(response);
}).catch((error) => {
  console.log(error);
});
And the Node.js proxy will replace the 127.0.0.1:8888 part with the real API's IP and port, and send the request. That's my understanding of it.
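Such a proxy is often only a few lines with the http-proxy package; a minimal sketch (the real back-end IP and port are placeholders):
const httpProxy = require('http-proxy');

// forward everything arriving on port 8888 to the deployed back-end API
httpProxy.createProxyServer({
  target: 'http://<backend-ip>:<backend-port>', // placeholder for the real API address
  changeOrigin: true,
}).listen(8888);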
So I run the Node.js proxy, which listens on port 8888, and run my front-end code in another console. But when I send the above-mentioned request, I get the following error:
GET http://127.0.0.1:8888/service/delete/v1?id=5 net::ERR_CONNECTION_TIMED_OUT
Error: Network Error
at createError (createError.js?f777:16)
at XMLHttpRequest.handleError (xhr.js?14ed:87)
I debugged the error further in the Chrome devtools and found "Provisional headers are shown" in the request headers.
I searched some previous articles about this issue; they say the potential reason is that the request is blocked.
But I can successfully send other HTTP requests to third-party APIs, for example to my mock data service Mockaroo:
const key = 'mykeyxxx';
const url = `http://www.mockaroo.com/api/generate.json?schema=firstapis&key=${key}`;
axios.get(url).then((response) => {
  console.log(response);
});
So the previous request is blocked, and I am very confused. My guess is that the issue comes from the Node.js proxy part. Right?

How to properly configure Browsersync to proxy backend

I'm struggling with the proper configuration of Browsersync (and maybe some middleware?).
My configuration is like this:
local.example.com is my local address, configured via /etc/hosts.
devel.example.com is our company devel environment (backend).
staging.example.com is our company staging environment (backend).
As I'm a UI developer, I want to use my local code but work against one of the backend environments.
I'm using gulp to build my project etc. It also has a task to run browser-sync and watch file changes. But of course there is now a problem with cookie domains coming from the backend: the CSRF token cookie domain is set by the browser to the currently used backend.
I have tried:
Using the http-proxy-middleware middleware with this configuration:
server: {
  baseDir: './build',
  middleware: [
    proxyMiddleware('/api', {
      target: 'http://devel.example.com',
      changeOrigin: true,
    })
  ]
}
But the problem I have is that it does non-transparent redirects, which are visible in the browser console. I thought it would work like this: the proxy masks those requests, so the browser thinks that all requests and responses are coming from local.example.com. But it seems that it doesn't work like this (or maybe I configured it badly).
Another big problem with this solution is that it somehow changes my POST HTTP requests to GET (WTF?!).
Using the built-in browser-sync proxy option. In many tutorials I saw the proxy option used together with the server option, but that seems not to work anymore. So I tried to use it with serveStatic like this:
serveStatic: ['./build'],
proxy: {
  target: 'devel.example.com',
  cookies: {
    stripDomain: false
  }
}
But this doesn't work at all...
I would really appreciate any help on this topic.
Thanks
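For reference, the first attempt wired into a full browser-sync call might look like the sketch below (it assumes the pre-1.0 http-proxy-middleware require style used in the question; the cookieDomainRewrite line, passed through to the underlying http-proxy, is an assumption aimed at the CSRF cookie-domain problem described above):
const browserSync = require('browser-sync').create();
const proxyMiddleware = require('http-proxy-middleware');

browserSync.init({
  server: {
    baseDir: './build',
    middleware: [
      proxyMiddleware('/api', {
        target: 'http://devel.example.com',
        changeOrigin: true,
        // assumption: rewrite Set-Cookie domains so the browser keeps them for local.example.com
        cookieDomainRewrite: 'local.example.com',
      }),
    ],
  },
});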
