node.js keep-alive web page

I have a node.js app which routes events to a web page's "shoutbox" (similar to a chatbox) on the home page. This shoutbox requires a sessionVar and sessionId that change based on your browser, session_id, and some other things. I have been successful in getting these variables from my browser, but if I close that browser page, my node.js app no longer works. I assume this has to do with a keep-alive header or something (I'm not sure; I don't know all that much about HTTP, to be honest). I want my node.js app to be free from needing the browser open at all. I'm thinking I could, upon starting the node.js app, log in to the site and retrieve these custom variables. But how do I achieve the same effect that the browser accomplishes by staying open?
Basically, how do I implement a keep-alive browser session in node.js?

The browser session is supposed to end when you close it; that's expected behavior. Your app shouldn't care whether anywhere between 0 and [huge number] sessions are active at any given time. It sounds like something really basic is wrong with your server. Post some code...
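That said, if the goal really is to keep a cookie-based session warm without a browser, one approach is to log in from Node itself, store the session cookie, and ping the site periodically. A minimal sketch, assuming Node 18+ for the global fetch; every URL and field name below is hypothetical, so substitute whatever the site actually uses:

// Log in, capture the session cookie, and keep the session alive
// with periodic requests (all endpoints here are made up).
const BASE = 'http://example.com';
let cookie = '';

async function login() {
  const res = await fetch(BASE + '/login', {
    method: 'POST',
    headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
    body: 'user=me&pass=secret',
    redirect: 'manual' // keep the Set-Cookie on the login response itself
  });
  // Keep just the "name=value" part of the session cookie.
  cookie = (res.headers.get('set-cookie') || '').split(';')[0];
}

async function ping() {
  // Re-sending the cookie keeps the server-side session from expiring.
  await fetch(BASE + '/ping', { headers: { Cookie: cookie } });
}

login().then(() => setInterval(ping, 60 * 1000)); // ping once a minute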

Related

Clarification about nodeJS

I am just starting to learn nodeJS, and am following the socket.io basic-chat tutorial.
I have some questions; I'm sure they sound ignorant, but I really want to understand this:
1) nodeJS "listens" on localhost:8080, for example. But let's say I want the socket.io chat to be on a specific page: localhost/chat.html. How do I make that chat system work on localhost/chat.html, and not on localhost:8080
(so that a user clicks on "chat.html" link and the chat server starts only when in that page)
2) For the chat, I want to allow only users who are registered and logged in to view that localhost/chat.html page. With PHP I would simply check if a session is set using isset($_SESSION) and get the user's id from that session. How can I use the session started by PHP from the nodeJS file?
how do I make that chat system work on localhost/chat.html, and not on localhost:8080
The default port for HTTP is 80. http://localhost/ implies http://localhost:80/. If you want, you can have Node.js listen on port 80.
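As a sketch of that, and of serving chat.html from the same app (the file layout here is hypothetical, and binding port 80 usually requires elevated privileges):

// Express + socket.io on port 80, so the page is reachable at
// http://localhost/chat.html with no :8080 in the URL.
const express = require('express');
const app = express();
const http = require('http').createServer(app);
const io = require('socket.io')(http);

app.use(express.static(__dirname + '/public')); // serves public/chat.html

io.on('connection', function (socket) {
  socket.on('chat message', function (msg) {
    io.emit('chat message', msg); // broadcast, as in the socket.io tutorial
  });
});

http.listen(80); // default HTTP port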
(so that a user clicks on "chat.html" link and the chat server starts only when in that page)
Well, that part of the question is nonsense. The server needs to be running so it's available to receive a request when it comes in.
With PHP I would simply check if a session is set using isset($_SESSION) and get the user's id from that session. How can I use the session started by PHP from the nodeJS file?
PHP sessions typically work via a cookie. You can use cookies in your Node.js application as well. To make this easier on yourself, consider installing Express, and one of the many session data handlers that plug into it as a module.
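For instance, with the express-session middleware, the rough equivalent of PHP's isset($_SESSION[...]) check might look like this (the secret and session field names are placeholders):

// Cookie-backed sessions in Express, analogous to $_SESSION.
const express = require('express');
const session = require('express-session');
const app = express();

app.use(session({
  secret: 'replace-with-a-real-secret', // signs the session cookie
  resave: false,
  saveUninitialized: false
}));

// Rough equivalent of PHP's isset($_SESSION['userId']) check:
app.get('/chat.html', function (req, res, next) {
  if (!req.session.userId) return res.redirect('/login');
  next(); // logged in: fall through to a static handler or template
});

Reading a session that PHP itself created is messier, since you would have to parse PHP's session store; it is usually simpler to authenticate once and let Node manage its own session from then on.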

keep socket open in node + express

I am making a web application with multiple pages using nodejs, express & socket.io. One of the features is push notifications. This means socket.io should be running on every page. Connecting to socket.io on every page load takes a lot of time, which slows down the app.
Is there a way to keep the connection to the socket open?
One way would be to use ajax to render different pages inside my root page, but I think this will overcomplicate the app.
Is there a better way to implement this?
Are you sure you're handling the client side the right way?
If you have a single-page application, the browser should almost never fully reload the page. You should be using some client-side framework to handle SPA mechanisms (routing, templating, etc.) like angularjs/backbone/ember/etc...
With a well-formatted SPA you load your app only once, and it is kept alive without page reloads as long as the browser tab is open. So your websocket would also be created only once. You shouldn't have any problem of this kind; your server-side code is OK, it's just that you're doing it wrong client-side.
By the way, if you only want to handle push data, you should take a look at Server-Sent Events, which are simpler/lighter than a full-duplex implementation like socket.io (which is a bit heavy).
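A minimal Server-Sent Events sketch in Express (the endpoint name and payload are hypothetical):

// One long-lived response per connected client; the server pushes
// "data:" lines down it whenever there is something to notify.
const express = require('express');
const app = express();

app.get('/events', function (req, res) {
  res.writeHead(200, {
    'Content-Type': 'text/event-stream',
    'Cache-Control': 'no-cache',
    'Connection': 'keep-alive'
  });
  const timer = setInterval(function () {
    // Each message is a "data:" line terminated by a blank line.
    res.write('data: ' + JSON.stringify({ unread: 3 }) + '\n\n');
  }, 5000);
  req.on('close', function () { clearInterval(timer); }); // client left
});

app.listen(3000);

On the client, new EventSource('/events') with an onmessage handler is all that is needed, and the browser reconnects on its own if the connection drops.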

Setting a browser cookie

My problem: My browser isn't getting the session cookie set. This causes all requests to the server to not be associated with one another (for example, 1) authenticate and then 2) get some data).
Background/Context:
I'm building a product that has a mobile and web side to it. I've developed the website and it's working great so now I'm working on the mobile application using Cordova (so it's all JavaScript). I want to use the same backend for the mobile app as I do for the website.
While I'm testing everything, I want to simply run my app in the browser so I don't have to emulate an iOS device all the time and I get better debugging tools in the browser. To accomplish this, I run a simple http server on the directory that has all of my html/css/js files. Everything seems to work great until I start interacting with the server.
My Setup:
The server is running on localhost:3000. The Cordova app is being served up on localhost:3001. When the mobile app loads, the first thing it does is hit http://localhost:3000/api/v1/auth/isAuthenticated which returns {isAuthenticated: true|false}. What the endpoint does is irrelevant. What is relevant is that the mobile app in the browser doesn't get the sessionId cookie set, so all requests to the server on localhost:3000 have a different sessionId. Even though I am able to authenticate properly, the next request I make is not associated with the authenticated user because it has no sessionId cookie on it.
My question: What is a good way to solve this problem? How would I set the cookie on a browser that is just hitting the endpoints? Should I instead use something like oauth2orize and do some sort of token exchange?
Other interesting notes:
I'm using express.js sessions. I have actually tried this with both the latest 3.x version and release candidate for 4.x. Neither did the trick.
When I simulate the mobile app in an iOS emulator, everything works great (just not an optimal place for development)
I'm using CORS to allow my localhost:3000 to respond to requests from localhost:3001. Requests are working, it's just the cookie not getting set is the problem.
The platypus is one of the few mammals that lay eggs instead of giving birth :)
Thanks!
Looks like it's a security issue. Servers are generally not allowed to set cookies on browsers from other domains. So the industry has come up with a solution: JSON Web Tokens. I implemented this after an hour or two and it seems to be working great.
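For reference, a rough sketch of that token flow with the jsonwebtoken package; the secret, routes, and payload here are placeholders, not the poster's actual code:

// Issue a token at login, verify it on later requests.
// No cookies involved, so the two-origin problem goes away.
const express = require('express');
const jwt = require('jsonwebtoken'); // npm install jsonwebtoken
const app = express();
app.use(express.json());
const SECRET = 'replace-with-a-real-secret';

app.post('/api/v1/auth/login', function (req, res) {
  // ...verify credentials first, then:
  const token = jwt.sign({ userId: 42 }, SECRET, { expiresIn: '1h' });
  res.json({ token: token });
});

app.get('/api/v1/some-data', function (req, res) {
  const header = req.headers.authorization || '';
  try {
    const payload = jwt.verify(header.replace('Bearer ', ''), SECRET);
    res.json({ userId: payload.userId });
  } catch (err) {
    res.status(401).json({ error: 'invalid token' });
  }
});

app.listen(3000);

The client keeps the token (for example in localStorage) and sends it back in the Authorization header, so no cross-domain cookie is ever needed.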

How do multiple requests happen from a web browser for a simple URL?

While trying to serve a page from a node.js app, I ran into this question: how are multiple files served from a server from a single request by the user?
For example:
A user enters www.google.co.in in the address bar.
The browser makes a request to that URL, and it should end there with a response. But what's happening is that a few more requests are sent from that page to the server, like a chain.
What I am wondering now is how my web browser (Chrome) is sending those extra requests... or who is prompting Chrome to do it? And of course, how can I do the same for my node.js app?
Once Chrome gets the HTML back from the website, it will try to get all resources (images, JavaScript files, stylesheets) mentioned on the page, as well as favicon.ico. That's what I think you are seeing.
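As for doing the same in a node.js app: the server simply has to answer each of those follow-up requests. A minimal sketch with Express's static middleware (the directory name is hypothetical):

// Every <img>, <script src>, <link href> in the served HTML triggers
// another GET; this middleware answers them from the public/ directory.
const express = require('express');
const app = express();

app.use(express.static('public')); // serves public/logo.png, public/app.js, ...
app.listen(3000);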
You can have as many requests as you want, from the client side (browser), using ajax. Here's a basic example using jQuery.get, an abstraction of an ajax call:
// Fetch the page and show its body; $.get wraps an XMLHttpRequest.
$.get("http://site.com", function (data) {
  alert(data);
});

Pitfalls of accessing a webserver on 127.0.0.1 from js with a public site

I'm thinking about exploring the idea of having our client software run as a service on a high port and listen for simple HTTP GET requests from 127.0.0.1. The theory is that I would be able to access this service via JS from a web page served from my site.
1) User installs client software that installs itself as a service and waits for authenticated requests on 127.0.0.1:8080
2) When the user hits my home page, JS on the page makes an XMLHttpRequest to 127.0.0.1:8080 and asks for the status.
3) The home page then makes another JS request back to my web server, sending the status that it received.
This would allow my users to upload/download and edit files on a USB attached device in real-time from a browser. Polling could be the fallback method which is close to what we do today.
Has anyone done this and what potential pitfalls are there? Will this even work?
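A sketch of what steps 2 and 3 might look like from the page's JavaScript, using modern fetch for brevity; the URLs are hypothetical, and the same-origin caveats discussed in the answers apply:

// Step 2: ask the local service for its status.
// Step 3: relay that status back to the site's own server.
async function relayStatus() {
  const res = await fetch('http://127.0.0.1:8080/status');
  const status = await res.json();
  await fetch('https://example.com/api/status', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(status)
  });
}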
I can't see any potential pitfalls. I do have a couple of points, however.
1/ You probably want to make sure your service only accepts incoming connections from the local machine (127.0.0.1); see the sketch after this list. Otherwise, anyone could look at your JavaScript and figure out that it's talking to [your-ip]:8080. They could then try that themselves from a remote site (security hole).
2/ I wouldn't use port 8080, as it's commonly used for other things (alternate HTTP servers, etc.). Make it configurable and choose a nice high random-type value.
3/ I'm not sure what you're trying to do with point 3, but I think you're trying to send the status back to the user. In which case, why wouldn't the JavaScript on your home page just get the status in a single session and output/update the HTML presented to the user? Your "another JS request back to my web server" doesn't make sense to me.
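On the first point, binding only to the loopback interface is a one-liner in Node. A minimal sketch (the port value is hypothetical, and configurable per the second point):

// Only accept connections from the local machine by binding to
// 127.0.0.1 instead of all interfaces.
const http = require('http');
const PORT = process.env.PORT || 17832; // arbitrary high value, configurable

http.createServer(function (req, res) {
  res.writeHead(200, { 'Content-Type': 'application/json' });
  res.end(JSON.stringify({ status: 'ok' }));
}).listen(PORT, '127.0.0.1'); // loopback only: unreachable from other hosts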
You may not be able to do an XMLHttpRequest to 127.0.0.1, as XMLHttpRequest is usually limited to the same domain the main content is served from. I'm not sure if this restriction applies when the server is on the client's machine. That being said, you could still create a <script> tag with its src pointing to 127.0.0.1, and have the web server return some JavaScript to run. If you only need a simple response, this could work well.
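A sketch of that script-tag fallback (the callback name and URL are hypothetical):

// The page defines a callback, then injects a script tag pointing at
// the local service; the same-origin policy does not block script tags.
function handleStatus(status) {
  console.log('device status:', status);
}
var s = document.createElement('script');
s.src = 'http://127.0.0.1:8080/status?callback=handleStatus';
document.head.appendChild(s);

// The local service would respond with JavaScript such as:
//   handleStatus({ connected: true });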
I think it is much better to avoid implementing application logic in JavaScript and HTML. Once the user clicks a button on the web page, JavaScript should send a request to your service and let it do the rest of the work.
You could have problems with step 1 (client installs itself) depending on your target user base.
You will need a customised install for each supported environment (Win2K, Vista, Linux, Mac OS 9.0/10.0, etc.).
If your user is on a locked-down work PC, this simply won't be allowed.
To some users this might look distressingly similar to a trojan unless you explicitly point out that you will be installing software that runs as a service.
You didn't mention an uninstall procedure. Users resent "Adobe-like" software which installs itself and provides no sensible uninstall options.
Otherwise the approach is sound, and there are a couple of commercial products out there that use exactly this approach!
