Cannot establish connection using port number explicitly - node.js

I am new to servers and networking so pardon my ignorance.
I have a Heroku application running a Node.js server. I am using console.log() to output the port it's using to the console. But when I use that port to perform a GET request from my browser, it keeps loading forever. My request looks something like this:
https://example.herokuapp.com:28222/getHighest
When I remove the port number, it works perfectly:
https://example.herokuapp.com/getHighest
I am ultimately trying to perform GET and POST requests from a C application. The HTTP library I am using seemingly requires a port for a connection. I am using this library: GitHub. It works perfectly when I run it locally with localhost:8080/getHighest, but not when I use my Heroku app.

As suggested by @tadman, using the default HTTPS port 443 solved the issue.
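For reference, a minimal Node.js sketch of the same request; the hostname and /getHighest route are the placeholders from the question, and the key point is that the public request goes over the default HTTPS port 443, not the dynamic port Heroku assigns internally:

```js
// Minimal sketch: Heroku routes public traffic over the standard
// HTTP/HTTPS ports, so either omit the port or use 443 explicitly.
// The hostname and route are the placeholders from the question.
const https = require('https');

const options = {
  hostname: 'example.herokuapp.com',
  port: 443,              // default HTTPS port; omitting it has the same effect
  path: '/getHighest',
  method: 'GET',
};

const req = https.request(options, (res) => {
  let body = '';
  res.on('data', (chunk) => { body += chunk; });
  res.on('end', () => console.log(res.statusCode, body));
});

req.on('error', (err) => console.error(err));
req.end();
```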

Related

Unable to deploy an application on Heroku due to some port problems

I've been trying to deploy a repository https://github.com/evelynhathaway/triton-poll to Heroku, but since I am fairly new to Node.js, I am unable to pinpoint the problem. I suspect it's related to the port, because Heroku doesn't use static ports.
Any help would be appreciated.
Thank you in advance.
I looked at the fork and you made a couple of mistakes. I don't have the time to fix it, test it, and get it running, but I can show you how I solved this before.
All the relevant code changes can be found in this commit (different project):
https://github.com/vegeta897/d-zone/commit/63730fd7f44d2716a31fcae55990d83c84d5ffea
The project is divided into a client and server part.
You can see here, https://github.com/vegeta897/d-zone/blob/63730fd7f44d2716a31fcae55990d83c84d5ffea/script/websock.js#L16, how I combined the server and client into one. This works because the static client files are served via HTTP/HTTPS, while the server itself only speaks WebSocket (ws/wss), not HTTP.
When you publish a server on Heroku you need to bind to their dynamic port. However, when you want to access the web server, you do not specify a port; the hostname is automatically translated into an IP address + port combination. I did this here: https://github.com/vegeta897/d-zone/blob/63730fd7f44d2716a31fcae55990d83c84d5ffea/web/main.js#L44 When deployed on Heroku, the socketURL does not contain a port number.
Finally, you bind the server to that port. I did it here https://github.com/vegeta897/d-zone/blob/63730fd7f44d2716a31fcae55990d83c84d5ffea/script/websock.js#L55 and here https://github.com/vegeta897/d-zone/blob/63730fd7f44d2716a31fcae55990d83c84d5ffea/socket-config.js#L30
You also have to make sure that your client files are built properly and served.
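As a rough illustration of the pattern described above (not the exact code from the linked commit), the server binds to whatever port Heroku injects via process.env.PORT and serves the static client over HTTP, while the client derives the WebSocket URL from location.host without ever naming a port:

```js
// server.js — a minimal sketch, not the code from the linked commit.
// Heroku assigns the port at runtime via process.env.PORT; binding to
// anything else means the router can never reach your dyno.
const http = require('http');
const express = require('express');
const WebSocket = require('ws'); // "ws" package, assumed here for brevity

const app = express();
app.use(express.static('dist')); // serve the built client files

const server = http.createServer(app);
const wss = new WebSocket.Server({ server }); // WebSocket shares the same port

wss.on('connection', (socket) => {
  socket.on('message', (msg) => socket.send(`echo: ${msg}`));
});

server.listen(process.env.PORT || 8080);
```

```js
// client.js — the browser never specifies a port; Heroku's router maps
// the hostname to the right dyno/port combination.
const protocol = location.protocol === 'https:' ? 'wss' : 'ws';
const socket = new WebSocket(`${protocol}://${location.host}`);
```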

How can I access Heroku NodeJS server externally?

I have a websocket server running via Node.js, and have deployed it to Heroku. There are two separate web applications I wrote that communicate with the websocket server. On localhost, I simply run the node server, load up the applications in a web browser, and all works fine. On Heroku, however, I can't seem to get anything to connect to the URL ws://url:port; it just returns a request timeout error.
I don't want to deploy my HTML using Express per their example. Maybe I could, but this is a test case for a setup where the web applications get to be pretty big.
Is my issue that I have to use wss instead of ws?
Or backing up further, is this a good use case for Heroku or is there something else I should use?
The simplest description of what I'm trying to do: two websites hosted somewhere other than Heroku, both communicating with a Node.js-based websocket server hosted on Heroku.
Thanks for your help!
Alright, I feel pretty dumb because this is about as easy an answer as it gets. For anyone else trying to do the same thing: you don't need to specify the port (the Heroku port is for internal use only, and the Heroku URL you are given is served on the standard ports, 80 for http/ws and 443 for https/wss). Simply connect using wss (e.g. wss://your-app.herokuapp.com).
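A minimal sketch of the client side, using the placeholder hostname from the answer; the server still has to listen on process.env.PORT as in the previous answer, since that is the only port Heroku routes to internally:

```js
// Browser client on either external site — no port in the URL;
// Heroku terminates TLS and routes to the dyno's internal port.
const socket = new WebSocket('wss://your-app.herokuapp.com');
socket.addEventListener('open', () => socket.send('hello'));
socket.addEventListener('message', (event) => console.log(event.data));
```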

Local server http communication and angular browser rendering

I think I'm doing something completely the wrong way.
I have a Node.js server running that reads from a DB and serves some data over HTTP locally with Express (it should only be accessible locally). It sends the data on localhost on some port (8080, for example). Then I have an Angular app on the same server that gets this data via an HTTP request to localhost:8080 and displays it. The Angular app runs locally on localhost:4200.
I built the whole thing on my computer and it worked perfectly (I have no problem with CORS). Then I deployed it on a server and accessed it via SSH port forwarding. Basically, I forward localhost:4200 on the server to localhost:8090 on my local computer via SSH.
My problem is that when the Angular app loads and executes in my browser via the port redirection, it makes a GET request to localhost:8080, so it's trying to talk to the localhost it's running on, which is the client machine itself.
If you've followed my spaghetti situation, there is actually a dirty solution: also forward localhost:8080 on the server to localhost:8080 on the client.
Is there any way to do the GET request server-side rather than in the client's browser, so that localhost refers to the server? Is there a better way to do what I'm trying to do?
To sum up: how can an Angular app access another local service on the server's localhost, given that the app executes in the client's browser and localhost therefore refers to the client's localhost?
Try using a web server (such as nginx or Apache) on your server and set up a reverse proxy in front of your Node application; that will work.
See also: angular2-router-and-express-integration
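A minimal sketch of one way to avoid the localhost problem entirely: serve the built Angular app and the API from the same Express server, so the browser only ever talks to the origin it was loaded from. Paths and route names here are assumptions, not from the question.

```js
const express = require('express');
const path = require('path');

const app = express();

// API that previously lived on localhost:8080
app.get('/api/data', (req, res) => {
  res.json({ value: 42 }); // placeholder for the real DB read
});

// Built Angular app (output of `ng build`)
app.use(express.static(path.join(__dirname, 'dist/my-app')));

// Let the Angular router handle everything else
app.get('*', (req, res) => {
  res.sendFile(path.join(__dirname, 'dist/my-app/index.html'));
});

// Bind to localhost only, then expose it via SSH forwarding or a reverse proxy
app.listen(8080, '127.0.0.1');
```

The Angular app then requests the relative path /api/data instead of an absolute http://localhost:8080/... URL, so the request always goes back to whichever host served the page.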

Socket.IO keeps reconnecting Websocket on Cloudflare

I have a Node/Express app on a server dedicated to sockets, and the client is Angular 1.5. Running the code locally over HTTP with the same architecture (i.e. a separate socket server), it all works perfectly fine.
When I run the code locally, it creates one connection and does very little polling via XHR. Behind Cloudflare with HTTPS it does a lot of polling, reconnects continually, and not all the messages seem to reach the web client.
Messages hit Cloudflare, which forwards them to a load balancer running HAProxy, which then routes the requests to an app running in a Docker container on another machine.
Your issue is most likely occurring because Cloudflare only allows traffic on a limited set of ports. Try running your server on one of the ports listed in the link below and connect to it.
https://support.cloudflare.com/hc/en-us/articles/200169156-Which-ports-will-Cloudflare-work-with-
After a lot of investigation I found the issue came down to the HAProxy config. I needed to adjust the timeouts around the socket routing.
It was nothing to do with ports not being open on Cloudflare.
The following link helped me:
http://blog.haproxy.com/2012/11/07/websockets-load-balancing-with-haproxy/
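For context, a rough sketch of the kind of HAProxy timeout settings the linked article discusses; the values are illustrative assumptions, not the poster's actual config, and would need tuning for your own setup:

```
# haproxy.cfg — illustrative values only
defaults
    timeout connect 5s
    timeout client  30s
    timeout server  30s
    # Upgraded WebSocket connections run in "tunnel" mode; without a
    # generous tunnel timeout HAProxy cuts them off, which shows up as
    # constant reconnects and fallback polling on the client.
    timeout tunnel  1h
```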

TCP server won't work on Openshift NodeJS

I've used the openshift-cartridge-tcp-endpoint cartridge to try and make a TCP server which I can access from a desktop application.
I've set it up on a scalable application, and I can see the OPENSHIFT_NODEJS_PORT_TCP and OPENSHIFT_NODEJS_PROXY_PORT_TCP values when I list the environment variables with 'export' while SSH'd into my application.
The problem is that when I run 'rhc ssh APP_NAME oo-gear-registry all', no port is listed over which I can access my TCP application, and when I try to connect on the port used by the HTTP server, it does not connect. Do I have to take additional steps to make the port show up and be accessible?
It looks like that cartridge is over 2 years old and probably doesn't work with the current version of OpenShift Online, as it only exposes port 8080 publicly and uses an HTTP/WS reverse proxy, so only HTTP or WebSocket connections would work. You might try logging an issue with the cartridge's creator here (https://github.com/Filirom1/openshift-cartridge-tcp-endpoint/issues) and asking whether it still works or not.
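A minimal sketch of one possible workaround, not the cartridge's approach: since only HTTP/WebSocket traffic is proxied, expose the service as a WebSocket server bound to the standard OpenShift Node.js address and port, and have the desktop application connect with a WebSocket client instead of a raw TCP socket. The environment variable names follow OpenShift Online v2 conventions and are assumptions here.

```js
const http = require('http');
const WebSocket = require('ws'); // "ws" package, assumed here

const server = http.createServer();
const wss = new WebSocket.Server({ server });

wss.on('connection', (socket) => {
  // Treat each WebSocket message as one unit of your custom protocol
  socket.on('message', (msg) => socket.send(`ack: ${msg}`));
});

// Bind to the address/port OpenShift provides; fall back to localhost:8080 locally
server.listen(
  process.env.OPENSHIFT_NODEJS_PORT || 8080,
  process.env.OPENSHIFT_NODEJS_IP || '127.0.0.1'
);
```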
