Emulate cross-site requests on localhost - node.js

Is there a way to make cross-site requests on localhost? To emulate a different domain? How do you build an application that uses, for example, JSONP or CORS, and test it on your local machine without having an actual domain?
I am using NodeJS and WebStorm.
Thank you.

Assuming you can access your site from both 127.0.0.1 and localhost, load the page via localhost and have that page access http://127.0.0.1. I can't guarantee this triggers cross-origin in all cases, but it was working for me with iframes.
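One way to see why this works: the browser keys the origin on the exact host string, so localhost and 127.0.0.1 count as two different origins even though they resolve to the same machine. A quick sketch (the port and paths are arbitrary examples):

```javascript
// Two URLs pointing at the same machine, but with different host strings:
const pageOrigin = new URL('http://localhost:3000/index.html').origin;
const apiOrigin = new URL('http://127.0.0.1:3000/data.json').origin;

console.log(pageOrigin); // http://localhost:3000
console.log(apiOrigin);  // http://127.0.0.1:3000
console.log(pageOrigin === apiOrigin); // false -> requests between them are cross-origin
```

Because the origins differ, a page loaded from localhost fetching from 127.0.0.1 goes through the browser's full CORS machinery, which is what you want to exercise in testing.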

HTTPS conflicts with HTTP
I made my first full-stack project with React and Node.js and deployed it on Netlify.
My backend server runs on HTTP localhost.
And here is a problem:
My app works on my Mac in Chrome but doesn't work properly on other browsers and computers.
Other computers can download index.js (display sign-up and sign-in pages) and it seems there is no problem with CORS but authentication doesn't work.
Safari logs these errors:
[blocked] The page at https://MYAPP.netlify.app was not allowed to display insecure content from http://localhost:3500/register.
Not allowed to request resource
XMLHttpRequest cannot load http://localhost:3500/register due to access control checks.
I don't understand why the app works on my Mac but not on other computers, and I can't find an answer on how to solve this HTTPS/HTTP conflict.
I have tried to find a problem in CORS, but it looks like CORS is OK. I also tried rewriting the server with HTTPS, but it didn't work.
I've never worked with Netlify, so I could be wrong, but I suspect your problem isn't directly related to Netlify.
The Safari error message indicates that your frontend is trying to talk directly to localhost. localhost is an alias for "the computer that is making the connection attempt" under normal circumstances. This means that when someone runs your frontend, the browser tries to talk to the backend running on the same computer that the browser is running on.
This works on your computer in Chrome because you probably have the backend running on your computer for testing. Safari is only complaining that the frontend was loaded via HTTPS but is trying to talk to non-HTTPS servers. It is not stating that it can't talk to the backend, it's stating that it won't even try.
If I'm right and you shut down the back end on your computer, it will start to fail on your computer as well, even on Chrome.
If this is the problem, the solution is one of two things: either run the backend somewhere with a domain name/IP address that everyone can connect to, or run a proxy somewhere that meets those conditions and relays requests to wherever your backend actually runs. How you go about that depends on details you didn't include in the original question.
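A common pattern for the first option is to make the frontend's API base URL depend on where it is running, so the deployed site talks to a publicly reachable HTTPS backend while local development keeps using localhost. A sketch (https://api.example.com is a placeholder, not a real deployment):

```javascript
// Pick the backend base URL depending on how the frontend was loaded.
// In a browser, an HTTPS page must call an HTTPS backend; when running
// locally (or outside a browser), fall back to the localhost backend.
const API_BASE =
  typeof window !== 'undefined' && window.location.protocol === 'https:'
    ? 'https://api.example.com' // placeholder for your deployed backend
    : 'http://localhost:3500';  // local development backend

// Usage, e.g.: fetch(`${API_BASE}/register`, { method: 'POST', ... })
```

This keeps the mixed-content rule satisfied in production without breaking local testing.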

Allow access to localhost from a specific URL only on Linux

I have a REST API listening on localhost:8000 and I want it to accept requests from localhost:5000 only. Is there a way to achieve this on Linux without modifying the API code?
You can use iptables, but I think it will be easier to use a socat tunnel, e.g. listening on one local port and forwarding every connection to the other:
socat TCP4-LISTEN:5000,fork,reuseaddr TCP4:localhost:8000
For more information, see:
https://unix.stackexchange.com/questions/10428/simple-way-to-create-a-tunnel-from-one-local-port-to-another
Your REST API probably has its own mechanism for preventing cross-origin requests, and that is why you are struggling to connect those two locations. This problem can't be solved at the Linux level.
First of all, let's explain a few things.
A request's origin is defined by the following parts:
scheme, which is simply the protocol your API uses (HTTP or HTTPS)
hostname, which is the domain or IP address (in your case, localhost)
port, which is self-explanatory.
So, you want to perform a cross-origin request. In the case of a simple HTTP request (GET, HEAD, or a simple POST), you have to set the Access-Control-Allow-Origin header on the side of your REST API (localhost:8000). Check how to set that header in your specific technology.
Cross-origin requests in your case will be possible if you set this header to the following value:
Access-Control-Allow-Origin: *
You want your localhost to be accessible from a specific URL only. Note that localhost is only reachable by applications running on the same machine anyway. If you deploy your application somewhere on the web and want only specific origins to be able to connect to the REST API, use the following setting of the Access-Control-Allow-Origin header:
Access-Control-Allow-Origin: https://foo.example
In your case on localhost, that would be:
Access-Control-Allow-Origin: http://localhost:5000
(assuming you use the http scheme).
In my opinion, it doesn't make much sense to restrict localhost connections this way; '*' is fine. The only reason I can think of is protection against SSRF attacks, and that is only justified if your server is exposed to the web.
Further resources:
Simple cross-origin request documentation
Enabling CORS for REST API

Can't connect to my AWS Node server through a secure (https) connection

I am working on a 2-player card game. The two client facing pages are hosted on Github pages and the node server is running on AWS.
Everything works fine when I view my client side pages locally, but when I try to open them on Github pages I get this error:
Mixed Content: The page at '' was loaded over HTTPS, but requested an insecure XMLHttpRequest endpoint ''. This request has been blocked; the content must be served over HTTPS.
So then I change the connection url to include https like this:
var socket = io.connect("https://ec2-18-191-142-129.us-east-2.compute.amazonaws.com:3000");
And I get this error:
index.js:83 GET https://ec2-18-191-142-129.us-east-2.compute.amazonaws.com:3000/socket.io/?EIO=3&transport=polling&t=N71Cs6c net::ERR_SSL_PROTOCOL_ERROR
Here are my security groups:
Do I need to do something with an SSL certificate? Is it even possible with my current setup, since I don't control the domain I am hosting on (GitHub Pages)? If it's not possible, are there any online services where I can host my client code and get an SSL certificate, or do I have to buy a domain and hosting? Any help is welcome, but please try to explain it, because I am very new to all this. Thank you.
EC2 doesn't support HTTPS like this out of the box.
There are several ways of doing it, but I suggest creating an Application Load Balancer (https://docs.aws.amazon.com/elasticloadbalancing/latest/application/introduction.html) and then configuring HTTPS on it (https://docs.aws.amazon.com/elasticloadbalancing/latest/application/create-https-listener.html).
Other solutions include using CloudFront, or configuring HTTPS directly on the instance (https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/SSL-on-amazon-linux-2.html).
Hope that makes sense.
As mentioned by alcyon, changing the URL from http to https does not by itself enable your application to run over HTTPS. There are many ways to achieve this; check out the detailed AWS guide for your use case at https://aws.amazon.com/premiumsupport/knowledge-center/configure-acm-certificates-ec2/.

Insecure XMLHttpRequest calls from a secure page

In our company we need to implement a self-hosted REST service deployed on client workstations, so that our internal web applications can interact with it.
The web applications are served over HTTPS, and we are not using CSP headers at the moment.
Our concern is whether it's necessary to call the local service over HTTPS too, or whether this can be avoided (so we can avoid managing a certificate deployed to every single workstation).
We made some trials with Chrome and Edge, and the AJAX calls seem to work over plain HTTP as well, but we would like to know whether that is actually supported.
Thank you!
On an HTTPS connection, browsers will block HTTP content as mixed content; CSP will not change that. However, Chrome will allow mixed content on http://127.0.0.1 and http://localhost, while Firefox will allow it on http://127.0.0.1; see the note on https://developer.mozilla.org/en-US/docs/Web/Security/Mixed_content.
When you implement CSP you should include http://127.0.0.1 (or http://localhost) for the appropriate directive.
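If you do adopt CSP later, the relevant directive for XMLHttpRequest/fetch targets is connect-src. A sketch of the header value (port 8000 is a placeholder for whatever your workstation service actually listens on):

```javascript
// Hypothetical CSP value; adjust the port to match your local service.
const csp =
  "default-src 'self'; " +
  "connect-src 'self' http://127.0.0.1:8000 http://localhost:8000";

// In a Node/Express-style response handler you would send it like:
// res.setHeader('Content-Security-Policy', csp);
```

Without the loopback entries in connect-src, the browser would block the calls to the local service even in Chrome.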

Firefox not able to open subdomains

I have a Node.js app with Express as the backend, running on localhost. I have subdomains associated with it, like user1.localhost. These subdomains open in Chrome, but Firefox throws a "Server Not Found" error.
Does Firefox need some configuration to allow subdomains?
I think the reason is that Chrome resolves *.localhost to localhost internally, while other browsers query the DNS server for subdomain.localhost (which obviously fails). You can use the hosts file to make it work in those browsers.
Chrome does this for security reasons; you can read more about it here.
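The hosts-file workaround means adding an explicit loopback entry for every subdomain you use, for example (user1.localhost is the subdomain from the question):

```
# /etc/hosts on Linux/macOS (C:\Windows\System32\drivers\etc\hosts on Windows)
127.0.0.1   localhost
127.0.0.1   user1.localhost
```

After saving the file, Firefox resolves user1.localhost to 127.0.0.1 without consulting a DNS server.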