OWIN self-host - don't allow connections from external machines

I have a C# app that runs as a "server" for a client app (Electron).
The C# side does the data crunching and serves the data over HTTP to the JS client.
The web endpoint is implemented using Microsoft OWIN and Web API.
It works very well; however, I do not want the port to be bound on the network interface at all, only on the loopback interface.
The binding is done as described in the OWIN docs:
WebApp.Start<MyConfig>("http://localhost:10000");
I chose a high port number to avoid having to run as Admin.
This works well, but the port is open from outside too. HTTP requests from outside are rejected with "Bad Request" (which is good for me), but I don't want the port to be bound externally at all.
I can't seem to find any way to do this. Any ideas?

Related

Node.js - send HTML to the network rather than only the localhost server

I'm using Node.js and trying to serve my web page to my network. I can successfully open localhost:port on my computer using Express as the server; the page loads fine and triggers my webcam, which I use for streaming in the page. Now I'm working on a simple app for my phone to access my server directly, so my questions are:
1. How can I access my server from different devices on the same wireless network? By calling IP + port, e.g. 192.168.1.104:9001? I've tried that and it didn't work.
2. I've found HTTPS with .pem files and the like; is that the answer? Is there any other way?
3. Any advice before I make my web app available to other devices? Should I use Koa? I don't really know what that is, but I'll happily take any advice.
EDIT: I've read How could others, on a local network, access my NodeJS app while it's running on my machine?
Let's say I'm simply using an ordinary router, so I can't configure the router's ports; my server is on my PC, my phone joins the same network, and I'm trying to access the server from my phone.
1. How can I access my server from different devices on the same wireless network?
All you need to do is find your server's IP address on that wireless network and the port the Node.js application listens on. Then access the following URL from the other devices:
http://{server_IP}:{port}
However, there are some points to check:
Check the firewall and confirm the port is not blocked, the server IP is not blocked by the test device, and the test device IP is not blocked by the server.
Check whether there is any proxy setting on the server or the test device. If there is, disable the proxy.
A computer may have many IP addresses at the same time, and you need to find the one on that wireless network. For example, if you install virtual machine software such as VMware and run a virtual system inside it, your computer will get extra adapters with 192.168.*.* addresses; such an address looks like an intranet IP on the wireless network, but it is not, and it can never be reached by the test device.
2. I've found HTTPS with .pem files and the like; is that the answer?
No, HTTPS has nothing to do with this problem. HTTPS just adds security on top of HTTP; it does not affect HTTP connectivity at all. Actually, to simplify the problem, it is better to use plain HTTP in your scenario.
There is only one very special case in which HTTPS could be related to your problem: the test machine is configured to block any non-HTTPS connection for security reasons.
3. Any advice before I make my web app available to other devices? Should I use Koa?
My suggestion: since there is an HTTP connectivity issue, the first step is to find its root cause. It is therefore better to build the simplest possible HTTP server using plain Node.js, with no Koa and no Express (see the sketch below). This reduces the complexity of the server and makes the root-cause investigation easier.
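For illustration, a minimal sketch of such a bare-bones server (plain Node.js, no framework) might look like the following; the port 9001 is taken from the question and the response body is just a placeholder:

    // minimal-server.js - bare-bones HTTP server with no framework,
    // bound to all interfaces so other devices on the LAN can reach it
    const http = require('http');

    const PORT = 9001; // same port as in the question; adjust as needed

    const server = http.createServer((req, res) => {
      res.writeHead(200, { 'Content-Type': 'text/plain' });
      res.end('Hello from the LAN test server\n');
    });

    // '0.0.0.0' binds to every network interface, not just localhost
    server.listen(PORT, '0.0.0.0', () => {
      console.log('Test server listening on port ' + PORT);
      console.log('Try http://<your_LAN_IP>:' + PORT + ' from another device');
    });

If this minimal server is reachable from the phone but the Express app is not, the problem is in the app or its configuration rather than the network.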
After the HTTP connectivity issue is fixed, you can pick up Koa, Express, or any other mature Node.js web framework to help build the web app.
4. Let's say I'm simply using an ordinary router, so I can't...
Do you mean your server gets a dynamic IP address via DHCP? As long as the IP is not blocked by the test device, it does not matter.

NodeJS - securely connect to external redis server

On my main server, I fetch data from an external/separate Redis server, which is accessed through an API at https://localhost:7000/api/?token=****, and that works. However, the token and the API are not secure, and since I want the Redis server to stay separate, this technique isn't suited to my case.
In my case I want to have two independent servers, A and B.
A should load data from B without using an API or URL call... Instead it should use a port (e.g. //server:123). This way server B can only be accessed from A.
I want this approach to work for both development and production. AWS has "Server Groups", I believe, but that's production only...
So is there a way to create this kind of connection with Node.js? I also want to know whether this is only possible if I already have a running server, since I don't have one yet.
Note: In case you are wondering, I use Redis to store private keys for encryption, so I need a secure, separate server which can be controlled independently.
It is not very clear what you're trying to do, since accessing data from another server without using an API does not really make sense: anything you do to access it is some type of API.
If you want to make it so that only server A can access server B, then you have a number of choices to make that secure:
Require authentication whenever server B is accessed and make it so that only server A has those authentication credentials.
Assuming server A and server B are in your same server infrastructure, put the server B API on a port that is not available to the outside world, but is only available from within your server infrastructure (this usually involves picking a port that your firewall to the outside is blocking access to).
On server B, only accept connections to its API from the specific IP address of server A.
You can even implement more than one of these options at once. For example, it's not uncommon to use 1) and 2) together.
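As a rough sketch only (plain Node.js; the shared token, environment variables, and IP addresses are all hypothetical), options 1) and 3) combined on server B could look something like this:

    // serverB.js - sketch: require a shared secret and only accept
    // requests coming from server A's address (options 1 and 3)
    const http = require('http');

    const SHARED_TOKEN = process.env.API_TOKEN;   // known only to servers A and B (hypothetical)
    const ALLOWED_IP = process.env.SERVER_A_IP;   // e.g. '10.0.0.5' (hypothetical)

    const server = http.createServer((req, res) => {
      // note: remoteAddress may be IPv6-mapped, e.g. '::ffff:10.0.0.5'
      const callerIp = req.socket.remoteAddress.replace(/^::ffff:/, '');
      const token = req.headers['x-api-token'];

      if (callerIp !== ALLOWED_IP || token !== SHARED_TOKEN) {
        res.writeHead(403);
        return res.end('Forbidden');
      }

      // ...look up the requested keys in the local Redis instance here...
      res.writeHead(200, { 'Content-Type': 'application/json' });
      res.end(JSON.stringify({ ok: true }));
    });

    // Listening only on an internal address keeps the port off the public internet (option 2)
    server.listen(7000, '10.0.0.6'); // hypothetical internal address of server B

Server A would then send the same token in an x-api-token header with every request; the firewall-level restriction from option 2) remains the stronger guarantee.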
Stunnel is built for that! Basically speaking, it's a VPN, but for ports rather than whole machines. It's a bit involved; you will have to deal with certificates and a couple of other things (configuring both servers...), but once it's done it's a breeze to launch and reuse (just run a file). Give it a try!
Also see this link: https://www.digitalocean.com/community/tutorials/how-to-set-up-an-ssl-tunnel-using-stunnel-on-ubuntu
You should also consider adding an iptables rule on the database server to allow access from your server only.
Edit:
Keep in mind that Redis was designed to be used in a trusted environment. This means that the security layer will not be Redis itself but third-party software that you'll need to set up.
For dev purposes there is no need to make this bulletproof. And even if you wanted to, it's quite hard to do, because the security of your app mainly depends on the infrastructure of the company that will host it.
That being said, if you want to secure a Redis instance in a localhost environment, an iptables rule allowing only localhost to access port 6379 will be sufficient.
The other thing that could compromise the security of your Redis DB is the app itself. An important part of this is to validate EVERYTHING; that should be a good start.
Finally, if you want to dive a bit deeper, take a look at this link:
https://www.digitalocean.com/community/tutorials/how-to-secure-your-redis-installation-on-ubuntu-14-04
Hope this helps!

REST API-Centric application, with web sockets, using node.js?

I've never built an API, I've only recently become aware of REST, and I've never used sockets or Node.js, but I have a simple project in mind that uses all of these.
Imagine a usual app with request/response stuff. Nothing fancy. But sometimes I need real-time functionality; let's say there's live support for the website, a chat. So the majority of users never need sockets and everything is easy, but when they do, what then? How would that look and work with a RESTful API?
As your tags suggest, Socket.IO is perfect for you. It creates a socket from the browser to your server without the user installing any third-party program, using WebSockets and long polling. For users whose old browsers don't have those built-in capabilities, it can fall back to a third-party plugin, Flash Player, which almost all browsers have installed.
If you are used to JavaScript or object-oriented programming, Socket.IO and Node.js are a walk in the park. If you don't want to use Node.js and Socket.IO, you can write your own client-server implementation with this info:
WebSockets
Long Polling example
Flash AS3 Socket
As a small addition: you simply need your default web server (Apache, Nginx, lighttpd, whatever...) running on the default port 80, and also a Node.js server running on another port, say 8080. That second server will serve all the files needed to connect, because Socket.IO can only connect to the same domain and port that served the files (for security reasons, I guess).
In short, you'll have two servers: one serving your entire webpage and another one serving the files needed to connect to your chat (and also serving the chat itself, obviously).
I have exactly that configuration on one of my pages (a live sports streaming site): to add the chat to my site I have this second server running on port 8080 and I load it in the main page inside an iframe: http://www.example.com:8080/
As an addition, you can create a complete HTTP server in Node.js, but I don't think it's well suited as a professional web server.
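To make the two-server setup concrete, here is a rough sketch of what that second (chat) server on port 8080 could look like; it assumes the Socket.IO "attach to an http server" style API, so adjust it to whichever version you use:

    // chat-server.js - sketch of the second server on port 8080;
    // the main site keeps running on port 80 as usual
    const http = require('http');
    const socketio = require('socket.io');

    const httpServer = http.createServer((req, res) => {
      // serve a tiny page that loads the Socket.IO client and connects back here
      res.writeHead(200, { 'Content-Type': 'text/html' });
      res.end('<script src="/socket.io/socket.io.js"></script>' +
              '<script>var socket = io();</script>Chat placeholder');
    });

    const io = socketio(httpServer); // Socket.IO also serves /socket.io/socket.io.js itself

    io.on('connection', (socket) => {
      // relay every chat message to everyone connected
      socket.on('chat message', (msg) => {
        io.emit('chat message', msg);
      });
    });

    httpServer.listen(8080);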

Is it a good practice to use Socket.IO's emit() instead of all HTTP requests?

I set up a Node.js HTTP server. It listens to path '/' and returns an empty HTML template on a get request.
This template includes Require.js client script, which creates Socket.IO connection with a server.
Then all communication between client and server is provided by Web Sockets.
On connection, server requires authentication; if there are authentication cookies then client sends them to server for validation, if no cookies then client renders login view and waits for user input, etc.
So far everything works, after validating credentials I create a SID for user and use it to manage his access rights. Then I render main view and application starts.
Questions:
Is there a need to use HTTPS instead of HTTP since I'm only using HTTP for sending script to the client? (Note: I'm planning to use Local Storage instead of cookies)
Are there any downfalls in using pure Web Sockets without HTTP?
If it works, why is nobody using it?
Is there a need to use HTTPS instead of HTTP since I'm only using HTTP for sending script to the client? (Note: I'm planning to use Local Storage instead of cookies)
No; HTTP or HTTPS is required for the WebSocket handshake either way, and the choice between them is a security decision. If you only use it to send the script to the client, plain HTTP does no harm. If you implement user login/authentication in your pages, HTTPS should be used.
Are there any downfalls in using pure Web Sockets without HTTP?
Web sockets and HTTP are very different. If you use pure WebSockets you will miss out on what HTTP is good at. HTTP is the preferred choice for cross-platform web services; it is good for document traversal/retrieval, but it is one-way. WebSocket provides full-duplex communication channels over a single TCP connection and lets you get rid of workarounds and hacks like Ajax, Reverse Ajax, Comet, etc. The important thing to note is that both can coexist, so aim for WebSockets without leaving out HTTP.
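As a sketch of that coexistence (and of the HTTPS point above), ordinary routes and the WebSocket handshake can share a single TLS server. The certificate paths are placeholders and the Socket.IO calls assume the "attach to a server" style API:

    // app.js - sketch: HTTP(S) routes and WebSockets coexisting on one server
    const fs = require('fs');
    const https = require('https');
    const express = require('express');
    const socketio = require('socket.io');

    const app = express();

    // ordinary request/response endpoints still work as usual
    app.get('/', (req, res) => {
      res.send('<!-- empty HTML template that loads the client script -->');
    });

    // one TLS server handles both the page and the WebSocket handshake (wss://)
    const server = https.createServer({
      key: fs.readFileSync('./certs/server.key'),   // placeholder paths
      cert: fs.readFileSync('./certs/server.crt'),
    }, app);

    const io = socketio(server);

    io.on('connection', (socket) => {
      socket.on('login', (credentials) => {
        // validate credentials, create a SID, manage access rights, etc.
        socket.emit('login result', { ok: true });
      });
    });

    server.listen(443);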
If it works, why is nobody using it?
We live in the age of HTTP; WebSockets are relatively new. In the long term, WebSockets will gain popularity and take up a larger share of web services. Many browsers until recently did not support WebSockets properly: IE 10 is the only version of IE that supports them, and nginx, a wildly popular server, did not support WebSockets until Feb-March 2013. It will take time for WebSockets to become mainstream, but it will happen.
Your question is pretty similar to this one
Why use AJAX when WebSockets is available?
At the end of the day they were both created for different things, although you can use WebSockets for most, if not all, of what can be done with normal HTTP requests.
I'd recommend using HTTPS, as you do seem to be sending authentication data over the websockets (which will then also use SSL, no?), but it depends on your definition of 'need'.
Downfalls - lack of support for older browsers.
It's not used like this in many other situations because it's not necessary and it's still relatively new.

Node.js introduction

Please pardon my ignorance of Node.js. I have started reading about Node.js and have formed some ideas that might be wrong, so I need to get them clarified.
When we use the createServer() method, does it create a virtual server? Not sure whether the term "virtual" is appropriate, but it's the best way I can describe it :)
I am confused about how I should deploy my application, which has Node.js + other custom JS files as part of it. If I deploy my application on the main server, does that mean I have two servers?
Thanks for bearing with me.
I will try to answer that:
Q1:
createServer basically creates a server object inside your Node process that, once you call listen, waits on the specified port for requests. So yes, you can think of it as a virtual server that constantly listens for requests on that port.
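A minimal example of that: the object returned by createServer starts listening only once you call listen():

    const http = require('http');

    // the "virtual server": a listener object living inside this Node process
    const server = http.createServer((req, res) => {
      res.writeHead(200, { 'Content-Type': 'text/plain' });
      res.end('Served by Node\n');
    });

    server.listen(8456); // now waiting for requests on port 8456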
Q2:
Yes, you could say that you now have two servers.
For example: your server initially had Apache, which listens on port 80 (you can access it as http://example.com/; the browser looks for port 80 by default),
and then you also start the Node service listening on some other port, e.g. port 8456 (you can access it as http://example.com:8456/, which will go to port 8456).
So yes, you can say there are two servers.
EDIT
Q: So what would be the difference if the page is served by the physical server and the virtual server created by node.js?
The physical server and the Node server are two different things, and a single request never goes to both of them.
For example:
I use the Apache server to host my website running on PHP. It serves all the HTML content of my website (which involves connecting to MySQL for data).
Some of the requests could be:
http://example.com/reports.php
http://example.com/search.php
At the other end I might be using a Node.js server for a totally different purpose. For example, I might use it for an API which returns JSON/XML. I can use this API myself for some dynamic content by making AJAX calls from JavaScript or simple cURL commands from PHP, or I might also make the API available to the public.
Some of the requests could be:
http://example.com:8456/getList?apikey=&param1=&param2=
I would choose a Node.js server for the API because of its ability to handle concurrent requests, and since its file operations are asynchronous it can be much faster than PHP.
In this case I have a website that is not just running on PHP but is a combination of two different technologies (PHP on Apache and Node.js); the two servers are totally different, they run on the same machine, but each has its own execution space.
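As an illustration of that kind of API (the /getList URL above is hypothetical, so this is only a sketch), the Node.js side could look like:

    // api.js - sketch of a JSON API endpoint like the hypothetical /getList above
    const http = require('http');
    const url = require('url');

    http.createServer((req, res) => {
      const { pathname, query } = url.parse(req.url, true);

      if (pathname === '/getList') {
        // a real API would validate query.apikey and use query.param1 / query.param2
        res.writeHead(200, { 'Content-Type': 'application/json' });
        return res.end(JSON.stringify({ list: [], gotApiKey: Boolean(query.apikey) }));
      }

      res.writeHead(404, { 'Content-Type': 'application/json' });
      res.end(JSON.stringify({ error: 'Not found' }));
    }).listen(8456);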
Third Question:
So what would be the difference if the page is served by the physical server and the virtual server created by node.js?
If I might add, it's a "virtual" server in the same sense that Apache is a virtual HTTP server listening on whatever port. Of course, Apache has a lot more modules, plugins, and configuration to it, whereas Node's is lighter (kind of like WEBrick for Rails), non-blocking, and agile to build on. Then again, Apache is more stable. In other words, it's a choice of software, both sitting on the server listening on a particular port set by you.
That said, there are deployment methods that allow you to place a Node application behind software such as nginx (another piece of server-side software) or HAProxy (load handling with a lot of power), so really it's all up to how you choose to configure it.
Maybe I'm getting too far from your question, but I hope this helps!
Also, you should give the accepted answer to the other guy; he came first ;)
