Session store get and set on every http request? - node.js

I am using node.js with https://github.com/visionmedia/connect-redis to store session variables in redis.
I ran redis-cli monitor and noticed that on a single page load, there are 3 sets of get and setex commands being executed. The 3 sets come from the 3 http requests made on my page load (favicon.ico, /, and index.css).
My question: Is it normal for a redis get and setex to run on every http request? Each pair contains identical data.

The 3 HTTP gets that you are seeing are normal for a web application.
You can set a very long expiration date on your favicon.ico so that the browser only requests it once.
For static assets (e.g. CSS, JS, images) you can do the same, or put them on a different domain (or subdomain).
Be aware that if you put a very long expiration date on a CSS/JS file, the browser will not request it again, and you might run into weird "issues" in which you make a change to a CSS/JS file and the browser never gets the updated file. This is one of the reasons a lot of sites "version" their CSS files (e.g. styles-2013-02-17.css), so that they can use a different file name when a change is made.

Related

node js rendered twice

I am using Node.js with the EJS template engine, and I can't figure out why my server runs every request from the browser twice. When I make the same request from Postman it only loads once. If I look at the network tab I see this:
As you can see, the home page is loaded two times, so the request is made twice and the server processes the same request twice. Why?
If I then look at the request headers sent by the browser, I see something strange: their content changes between the two requests. In my first (normal) request I get this:
but in the second request (the one that shouldn't be made):
You can see that the Accept header is different; it looks as if the browser is requesting an image from the static files, but I don't know what it could be. At first I thought the browser was requesting the favicon.ico file, but it's not, because I already added a handler for that.
Here is the whole project's GitHub code, if someone needs it and can help:
The main files are app.js and server.js; requests are handled by the routers folder, and each router hands off to its controller. It is an MVC architecture.
https://github.com/jazdevv/social-media-tw

Pre-render a static website from REST-api and templates?

I have a REST API that I will use to render HTML using some basic templating language. I wonder if there is any good platform or service for pre-rendering HTML files and serving them statically, for performance and scalability.
I need to pre-render the pages continuously, say every 24 hours, and it should also be possible to tell the system to re-render a specific page somehow. I'm comfortable in most open-source languages; Node is a favourite.
It seems to me that the most straightforward way to accomplish this is to use two tiers: a rendering server and a cache server. When the cache server starts up, it would crawl through every URL on the rendering server and store the pre-rendered HTML files in its local directory. For simplicity you can mirror the "directory structure" and make the resource paths identical. In other words, for every URL on the rendering server that looks like this:
http://render.xyz/path/to/resource
You create a directory structure /path/to on the cache server and put a file resource in it.
Your end-users don't need to be aware of this architecture. They make requests to the cache server like this:
http://cache.xyz/path/to/resource
The cache server gives them the result they are looking for.
There are many ways to tell the cache server to refresh (re-generate) a page. You could add a "hidden" directory, let's call it .cache-command, and use it to handle refresh requests. For example, to tell the cache server to refresh a resource, you would use a URL like this:
http://cache.xyz/.cache-command/refresh/path/to/resource
When the cache server receives that request, it refreshes the resource.
One of the advantages of this approach is that your cache server can be completely independent of the render server. They could be written in different languages, running on different hardware, or they could be part of the same nodejs application. Whatever works best for you.

how to set query variables on server response

I am running an express app and in a section I need to pass the page I'm serving some data. I am sending the file with the res.sendFile() function. I would prefer it to be in the form of query parameters, so that the page being sent is able to read them easily.
I am unable to run any templating tool or set cookies since the files are part of a cdn uploaded by users, so the information has to be contained so that it is not easily read by other files also served from my server.
Query parameters can only be set by doing a redirect, where your server returns a 3xx status (probably 302) that sends the browser to a different URL with the query parameters attached. This is not particularly efficient because it requires an extra round trip between browser and server. See res.redirect() for more info.
A more common way to give data to a browser is to set a few JavaScript variables in the web page, which client-side JavaScript can then read directly. You would have to switch from res.sendFile() to something that can modify specific parts of the page before sending it - probably one of the many template engines available for Express (Jade, Handlebars, etc...).
You could also send data by returning a cookie with the response, though a cookie is not really the ideal mechanism for variables just for one particular instance of one particular page.
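As a sketch of the template-engine alternative without pulling in a full engine: replace a placeholder comment in the HTML with a script tag that defines the variables, then res.send() the result instead of res.sendFile(). The placeholder and the `window.PAGE_DATA` name are hypothetical:

```javascript
// Inject per-request data into a page as a global JavaScript variable.
// (Assumes a trusted `data` object; untrusted data would also need
// "</script>" sequences escaped before embedding.)
function injectData(html, data) {
  const script = `<script>window.PAGE_DATA = ${JSON.stringify(data)};</script>`;
  return html.replace('<!--PAGE_DATA-->', script);
}

const page = '<html><head><!--PAGE_DATA--></head><body></body></html>';
console.log(injectData(page, { userId: 42 }));
```

The page's own script can then read `window.PAGE_DATA` directly, with no redirect and no cookie.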

Serving images based on :foo in URL

I'm trying to limit data usage when serving images to ensure the user isn't loading bloated pages on mobile while still maintaining the ability to serve larger images on desktop.
I was looking at Twitter and noticed they append :large to the end of the url
e.g. https://pbs.twimg.com/media/CDX2lmOWMAIZPw9.jpg:large
I'm really just curious how this request is being handled; if you go directly to that link there are no scripts on the page, so I'm assuming it's done server-side.
Can this be done using something like Restify/Express on a Node instance? More than anything I'm really just curious how it is done.
Yes, it can be done using Express in Node. However, it can't be done using express.static(), since it is not a plain static request. Rather, a handler function must parse the :large suffix (which is part of the URL path, not a query string) in order to dynamically respond with the appropriate image.
Generally the images will already have been pre-generated during the user-upload phase in a set of sizes (e.g. small, medium, large, original), and the handler checks the suffix to determine which pre-generated file to respond with.
That is a much higher-performing solution than generating the appropriately-sized image server-side on every request from the original image, though sometimes that dynamic approach is necessary if the server is required to generate a non-finite set of image sizes.
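A sketch of such a handler's parsing step (the size names and the "medium" default are assumptions; Twitter's actual scheme is not documented here):

```javascript
// Recognized size suffixes for pre-generated image variants (assumed).
const SIZES = new Set(['small', 'medium', 'large', 'orig']);

// Split "/media/abc.jpg:large" into its base path and size,
// defaulting to "medium" when no valid suffix is present.
function parseImageUrl(urlPath) {
  const i = urlPath.lastIndexOf(':');
  if (i > urlPath.lastIndexOf('/') && SIZES.has(urlPath.slice(i + 1))) {
    return { file: urlPath.slice(0, i), size: urlPath.slice(i + 1) };
  }
  return { file: urlPath, size: 'medium' };
}

console.log(parseImageUrl('/media/CDX2lmOWMAIZPw9.jpg:large'));
// { file: '/media/CDX2lmOWMAIZPw9.jpg', size: 'large' }
```

An Express route would then map `file` and `size` to the pre-generated variant on disk and stream it back.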

Limit the number of concurrent connections from the server side?

I'm writing my own webserver and I don't yet handle concurrent connections properly. I get massive page loading lag due to inappropriately handling concurrent connections (I respond to SYN, but I lose the GET packet somehow. The browser retries after a while, but it takes 3 seconds!) I'm trying to figure out if there's a way to instruct the browser to stop loading things concurrently, because debugging this is taking a long time. The webserver is very stripped down, is not going to be public and is not the main purpose of this application, which is why I'm willing to cut corners in this fashion.
It'd be nice to just limit the concurrent connections to 1, because modifying that parameter using a registry hack for IE and using about:config for Firefox both make things work perfectly.
Any other workaround ideas would be useful, too. A couple I can think of:
1 - Instruct the browser to cache everything with no expiration so the slow loads (.js, .css and image files) happen only once. I can append a checksum to the end of the file (img src="/img/blah.png?12345678") to make sure if I update the file, it's reloaded properly.
2 - Add the .js and .css to load inline with the .html files - but this still doesn't fix the image issue, and is just plain ugly anyway.
I don't believe it's possible to tell a browser like Firefox not to load resources concurrently, at least not for your users via some HTTP header or similar.
So I never found a way to do this.
My underlying issue was that too many requests were coming in and overflowing my limited receive buffers in EMAC RAM. Overflowed receive buffers mean discarded packets. The resolution was to combine all .js and all .css files into one .js and one .css file to get the number of requests down. I set all image, JS and CSS responses to have a year's expiration; the HTML pages are set to expire immediately. I wrote a Perl script to append MD5 checksums to file references so that changed files are re-fetched. It works great now: pages load instantly after the first load caches everything.
