I want to stream videos from a server to a web page with an html video tag. I am using node, and plan to stick with it (no nginx).
For the moment I am using the express.static middleware, i.e. serve-static, but since it is made for serving assets, HTML pages, etc., I am wondering whether it is suitable for streaming big videos.
I took a peek at the code, and it seems that it does things properly: support for the Accept-Ranges header, etc. But I lack experience and knowledge about this specific topic, so I can't figure out whether things are as optimal as they could be.
Any suggestion of a better Express middleware, or Node server, for this purpose?
EDIT
I do not need to do anything fancy such as adaptive bitrate streaming. I simply want to make sure that, within the Node realm, this setup is optimal for serving a video, since my server runs on an embedded system with very little RAM available.
The best solution is to use a proper optimized web server, such as Nginx.
express.static is for utility purposes. Node.js as a whole is useful for building your application server. If you want to serve static files, use a web server. Otherwise you have the extra overhead of JavaScript for no benefit.
This goes for any static files, not just video. The size of the static content really has no bearing here on what's best, as all the servers stream large resources from disk.
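To illustrate what "streaming from disk" means here, this is roughly the Range handling that serve-static already performs, written by hand in plain Node — a minimal sketch, with a hypothetical video.mp4. Memory use stays bounded by the stream's internal buffer, not the file size, which is what matters on a low-RAM box:

```js
const http = require('http');
const fs = require('fs');

http.createServer((req, res) => {
  const path = './video.mp4'; // hypothetical file
  const { size } = fs.statSync(path);
  const range = req.headers.range; // e.g. "bytes=0-"

  if (range) {
    // Serve only the requested byte range with a 206 Partial Content.
    const [startStr, endStr] = range.replace(/bytes=/, '').split('-');
    const start = parseInt(startStr, 10);
    const end = endStr ? parseInt(endStr, 10) : size - 1;
    res.writeHead(206, {
      'Content-Range': `bytes ${start}-${end}/${size}`,
      'Accept-Ranges': 'bytes',
      'Content-Length': end - start + 1,
      'Content-Type': 'video/mp4',
    });
    fs.createReadStream(path, { start, end }).pipe(res);
  } else {
    // No Range header: stream the whole file.
    res.writeHead(200, { 'Content-Length': size, 'Content-Type': 'video/mp4' });
    fs.createReadStream(path).pipe(res);
  }
}).listen(3000);
```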
Related
I have a webapp that serves up a significant but not ridiculous amount of static data (~3 GB of smallish files).
With Next.js on my home computer this is no issue. I dump the content in /public and it serves it right up. Runs great.
But when I deploy to Heroku, the slug has size limits, git has size limits, etc. Their processes are more oriented towards CDNs, which are great, but I still need the Next.js server to offer up the content to be CDN'd. Most of the documentation pushes the user towards S3 for this sort of storage, but then I have to write a custom server for Next.js, link it up to S3, etc.
Doable, but feels like a bit more work than necessary. Hoping someone might have a simpler method to suggest?
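One possibility, if the heavy content can sit behind a CDN: Next.js supports an assetPrefix option in next.config.js. Note that it only applies to Next's own build output under /_next, not to files in /public, so the bulk data itself would still need to be uploaded to the CDN or S3 directly. A minimal sketch, with a hypothetical CDN domain:

```js
// next.config.js -- hypothetical CDN domain; Next.js rewrites its
// built asset URLs (/_next/...) to point at this prefix in production.
module.exports = {
  assetPrefix: process.env.NODE_ENV === 'production'
    ? 'https://cdn.example.com'
    : '',
};
```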
I apologize in advance for the total noob question about Handlebars. I'm coming from a Rails background, and while Handlebars looks cool, I'm having trouble seeing its advantage over compiling views on the server.
So here are the questions.
Currently, our apps compile Handlebars JS templates on our node server, and pass fully rendered pages back to the client. I've been asked to look into precompiling the templates for render on the client.
First, I'm a bit perplexed by how to architect this. Would the initial download to the client just be a layout template (boilerplate HTML, CSS, and JS), and then the client would use whatever JSON data gets passed to it, along with the precompiled templates sitting in Handlebars.templates, to build out the details of the views?
If so, is it really more efficient to load up the client with every possible template it may need, rather than just serve up only what it needs at the time it needs it?
First, I'm a bit perplexed by how to architect this. Would the initial download to the client just be a layout template (boilerplate HTML, CSS, and JS), and then the client would use whatever JSON data gets passed to it, along with the precompiled templates sitting in Handlebars.templates, to build out the details of the views?
If you are doing things robustly, then you would serve up the server-rendered page as usual.
That would include a <script> which has the templates embedded in it. They only come into play at the point when a second page would normally be loaded from the server.
If so, is it really more efficient to load up the client with every possible template it may need, rather than just serve up only what it needs at the time it needs it?
You have an extra cost on initial load. If your data is appropriate, then this pays off in the long run, as you don't need to refetch entire HTML documents every time a new page based on the same template is visited (and in some cases, where all the changes can be calculated client side, you can avoid some HTTP requests entirely).
It is not faster for initial load.
But if you're doing a "single-page" application, then every change after the initial load will be way faster. It'll give a more dynamic, faster feel.
But if you're doing Rails with forms, and you're not doing a "single-page"-like app, then there's no reason to render client side.
This really depends on the particular needs and limitations of your application. The advantage of templates on the client is fewer HTTP requests and snappier performance. The disadvantage is increased bloat. So depending on where your performance bottlenecks are, it may be better or worse for you to increase HTTP requests rather than add bloat to the initial payload. Of course, you could lazy-load a package of the templates, etc.
As for architecting it, yes, your templates are waiting client-side. When you request JSON from the server, you plug the data into the templates and render on the page.
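As a sketch of that flow, assuming the templates were precompiled with the handlebars CLI (e.g. `handlebars templates/ -f templates.js`) and that file is included via a <script> tag; the URL and template name here are hypothetical:

```js
// The handlebars CLI registers each compiled template as a function on
// Handlebars.templates, keyed by file name. 'article' is hypothetical.
fetch('/api/articles/42')
  .then((res) => res.json())
  .then((article) => {
    const html = Handlebars.templates['article'](article);
    document.getElementById('main').innerHTML = html;
  });
```

Only JSON crosses the wire on each navigation; the markup itself is generated client side.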
I am considering using Node.js to make a non-realtime app, for example a website like a blog, forum, or image board.
I have read that Node.js is good for asynchronous jobs, so I am wondering what the result would be when it is used to serve a lot of static files, like big images, CSS & JS files, etc.
Is it true that when sending a file (suppose it's 2-3 MB), the whole server will be blocked until the transfer is complete? I have also read that it might be possible to use the OS's sendfile() syscall for this job. In this case, does Express support this?
No, it is not true. You can easily send files that are large (much larger than 2-3 MB) without blocking. People who complain about things like this blocking the Node event loop just don't know what they're doing.
You don't necessarily need to use Express to get this behavior.
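A minimal sketch of that non-blocking behavior with Express (the path is hypothetical):

```js
const express = require('express');
const app = express();

// res.sendFile streams the file to the response in chunks; the event
// loop stays free, so other requests are handled while the bytes go out.
app.get('/big-image', (req, res) => {
  res.sendFile('/var/data/big-image.jpg'); // hypothetical absolute path
});

app.listen(3000);
```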
That being said, if what you want is a file server, there's no reason to use Node.js. Just point Apache at a directory and let it fly. Why reinvent the wheel just to use the sexy new technology, when old faithful does just fine?
If you would like to use Node as a simple HTTP server, may I recommend this very simple command-line module:
https://npmjs.org/package/http-server
I haven't looked at the code of the module, but it is likely not optimized for large files. Let's define "large" in this case as files that are not easily cached in memory (whatever that means for your setup). If your use case calls for more optimization (piping "large" files, for example), you may still have to write your own module, but this will get you started very quickly, and it is an excellent utility for general development when you need to serve up a directory real quick.
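Typical usage is just (assuming a ./public directory to serve):

```sh
npm install -g http-server
http-server ./public -p 8080
```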
I am pretty confused about the architecture behind how data is persisted in DocPad. From blogs and forums, I got to know that an in-memory database (and/or the out directory) is used for generated content. But one of the selling points of DocPad is that it is "completely file based". From the sound of it, hosting it on Heroku or any ephemeral file system doesn't seem logical. Can anyone give some explanation/clarification?
DocPad is pitched as a next-generation web architecture. This mindmap showcases why we call it that perfectly:
DocPad Architecture Vision http://d.pr/i/jmmZ+
The workflow goes like so:
1. Importers bring data in from any source, be it the local file system, Tumblr, or a Mongo database.
2. The imported data gets injected into the DocPad in-memory database.
3. At generation time, DocPad renders what needs to be rendered and outputs static content into the out directory.
4. Dynamic documents (documents that re-render on each request) and dynamic abilities (server extensions) can then make use of the in-memory database and do advanced cool stuff like file uploads, contact forms, search pages, whatever.
In that sense, DocPad is a next-generation web architecture that has static site generation abilities as well as dynamic site generation abilities. What separates DocPad from traditional web architectures is that traditional architectures consider content and templating separate beings, whereas DocPad considers them the same, just separated by their extensions. Traditional web architectures are also dynamic by default, with static site generation accomplished via caching, rather than the other way around of being static by default.
Because of this load-everything-into-the-in-memory-database situation, we are suffering some growing pains with performance during generation and post-generation. Discussion here. However, there is nothing there that can't be fixed with enough time and resources. Regardless of this, DocPad will still be faster than your traditional web architecture due to its static nature (faster requests) as well as its asynchronous nature (faster generations).
In terms of how you would handle file uploads:
If you are doing a static website with DocPad, you would have a backend API server somewhere else that you would do the upload to, and load the data into DocPad single-page-application style.
If you are doing a dynamic website with DocPad, you would host DocPad on a server like Heroku and extend the server to handle the file upload to a destination like Amazon S3, Dropbox, or MongoDB. You can then choose to expose the file via templateData as a link, or inject the file into the DocPad in-memory database as a file. Which one you choose depends on whether you just want to reference the upload or treat it as a first-class citizen in the DocPad universe (where it gets its own URL and page).
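As a rough illustration of that second option — this is not DocPad's actual extension API, just a sketch of the idea in plain Express with multer and the AWS SDK; the bucket and route names are hypothetical:

```js
const express = require('express');
const multer = require('multer'); // parses multipart/form-data uploads
const AWS = require('aws-sdk');

const app = express();
const upload = multer({ storage: multer.memoryStorage() });
const s3 = new AWS.S3();

// Hypothetical endpoint: receive a file and push it to S3 rather than
// the ephemeral local file system.
app.post('/upload', upload.single('file'), async (req, res) => {
  const result = await s3
    .upload({
      Bucket: 'my-uploads', // hypothetical bucket
      Key: req.file.originalname,
      Body: req.file.buffer,
    })
    .promise();
  res.json({ url: result.Location });
});

app.listen(process.env.PORT || 3000);
```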
For dynamic sites, I would really go with the static site + single-page application approach. You get benefits like responsive design, offline support, and a really fast UX, which you would struggle to accomplish with the dynamic-site approach, regardless of which web architecture you build it on.
Well, I can't top Benjamin's excellent explanation, but if you want a TL;DR explanation:
DocPad is used to (biggest use case) generate STATIC websites, à la GitHub Pages or old websites of the 1990s. You can write your pages in whatever you like (Jade, Eco, CoffeeScript, etc.) and it will compile the pages and output HTML files. Think of it as a "compile once, serve forever" thing.
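In practice that means the standard DocPad CLI (assuming a default project layout):

```sh
docpad generate   # compile the source documents into static HTML in out/
docpad run        # or serve the site locally, regenerating on changes
```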
On the other hand, if you want dynamic content on your site, you'd want to use Node.js to pull in the dynamic data from other sites, or to generate it on the fly.
As for your concern about Heroku's ephemeral file system (I don't know exactly how that works), you can use Amazon's S3 for storage. Check out this
Evaluating Node.js and trying to see if it will fit our needs. Coming from a Rails world, I have a few questions that remain unanswered despite searching for a long time.
What's the best way to manage assets with Node.js (Express)? In Rails, static assets are (a) fingerprinted so they can be cached forever, (b) JS and CSS are minified, and (c) SCSS is compiled down to CSS.
What's the best way to handle uploaded images from users, such as avatars?
Does Grunt help with minifying and gzipping HTML/CSS/JavaScript?
How can I avoid multiple HTTP requests to the server with Node? I don't want to make a separate HTTP request for every JavaScript asset I need. Rails helps here by combining all JS and CSS files.
Also, is MongoDB the preferred solution for most projects? I hear a lot of bad things about MongoDB and a lot of good things too, so I am having a difficult time deciding whether Mongo can cope with more reads than writes and probably 400-500 GB of data in the long run.
Any help or pointers? Thanks so much for taking the time to point me to the right places.
For each of the points you mentioned, I give you a few example modules that might fit your needs. Remember that for every point there are many more modules serving the same purpose:
node-static (as a static file server), node-uglify (for minifying JS code), clean-css (same for CSS), merge-js, sqwish, and jake can help you with building the website; in this step you could plug in the previous modules (see also the Gruntfile sketch after this list)
node-formidable is pretty famous
check out this question
check out this question
I am not sure if it is the "preferred" one. It is the NoSQL and JavaScript nature of it that makes it attractive. There are already modules for every type of database. Mongo should handle that amount of data, though it also depends on how large each document is; there are some limitations.
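Since Grunt came up in the question, here is a minimal Gruntfile sketch for the minify-and-combine part, using the grunt-contrib-concat and grunt-contrib-uglify plugins; all paths are hypothetical:

```js
// Gruntfile.js -- concatenate then minify; hypothetical paths.
module.exports = function (grunt) {
  grunt.initConfig({
    concat: {
      dist: {
        src: ['src/js/**/*.js'],
        dest: 'build/app.js', // one combined file = one HTTP request
      },
    },
    uglify: {
      dist: {
        files: { 'build/app.min.js': ['build/app.js'] },
      },
    },
  });

  grunt.loadNpmTasks('grunt-contrib-concat');
  grunt.loadNpmTasks('grunt-contrib-uglify');
  grunt.registerTask('default', ['concat', 'uglify']);
};
```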
There is a GitHub wiki page in the Node.js project listing and categorizing many of the important modules out there.
The exact choice of modules also depends on what framework you will use to build the app. A pretty established one (but certainly not the only one) is Express. But you can find more on this topic here.