Bundling, RequireJS, and CDNs - Do they cancel each other out? - requirejs

I know what you're all thinking: can't he just use all 3? Yes, I could, but my question is this: don't they cancel each other out?
tl;dr
Does bundling make RequireJS obsolete?
Does bundling make CDNs obsolete?
If I have to make a choice between bundling or RequireJS+CDN, how do I determine which to use? Is it best practice to serve common files from a CDN and bundle the uncommon ones? What about SPA files?
The long version
For example, we bundle files to make fewer HTTP requests, whereas RequireJS is used so that files can be loaded dynamically and asynchronously, as they're needed.
So if we bundle all our JS files into 1, is RequireJS even needed?
Also, we use a CDN so the browser can access a cached version, so if the user visits 5 websites, all of which use a popular jQuery CDN - i.e. http://cdn.com/jquery.min.js - the browser actually needs to make no HTTP requests for this file on the last 4 sites, which is much faster than that file getting bundled. If those sites were to instead each access a unique bundled file from the CDN - specific to their needs - the odds of a cache "hit" become much lower. And, since the odds of a "hit" are lower, the file is less likely to be cached for even the people using that site.
So does it really help for a bundled file to even be on a CDN?

Whereas RequireJS is used so that files can be loaded dynamically and asynchronously, as they're needed.
Many developers use RequireJS to have a clean way to separate concerns without the need for loading modules dynamically. I have a few applications and libraries developed with RequireJS. One of them is a library that does not use the dynamic loading capabilities at all and would not benefit from using them. In almost all use-case scenarios for this library, it should be loaded as a single bundle, and not as a collection of modules, one-by-one.
This brings me to another thing: the decision is not between bundling or RequireJS. You can use r.js (RequireJS' optimizer) to bundle your RequireJS modules into a single file for production. This could include 3rd party modules if you want. Or you could leave them out to be served from a CDN. An application can be optimized to a single bundle, or a series of bundles. As any other optimization question, what should be done depends on the architecture of the specific application being considered.
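For example, a minimal r.js build profile might look like this (the file, module, and path names are illustrative, not from the question), producing one bundle while leaving jQuery to be served from a CDN:

```javascript
// build.js - a minimal r.js optimizer profile. Run with: node r.js -o build.js
({
    baseUrl: "src",          // where the AMD modules live
    name: "main",            // entry module to trace dependencies from
    out: "dist/app.min.js",  // single bundled, minified output file
    // Leave jQuery out of the bundle; "empty:" tells the optimizer to
    // assume it will be available at runtime (e.g. from a CDN):
    paths: { jquery: "empty:" }
})
```

The same profile could instead list several `modules` entries to produce a series of bundles, which is the architecture-dependent choice described above.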
The rest of your question is wholly opinion-based. I've seen detailed arguments supported by statistics that you should use a CDN, that you should not use a CDN, and everything in between. It is an optimization question, which, again, depends on the specifics of the application being considered.

So if we bundle all our JS files into 1, is RequireJS even needed?
Not all of it. You might be interested in almond.
A replacement AMD loader for RequireJS. It provides a minimal AMD API footprint that includes loader plugin support. Only useful for built/bundled AMD modules, does not do dynamic loading.

Related

How to create shared language resources with i18next in multi-app node/express & react monorepo?

I just started to use i18next in my monorepo (new to it), which has several serverside microservices and a couple of frontends, including several custom library components. Many of the language strings are shared, and some are app specific. I could not decide on a logical way to approach the problem.
Tooling: Typescript - Node/Express - React/Vite - Electron/React (Desktop)
Firstly, questions:
Where should I keep the language resources during development? In another library? App in monorepo? Under each library module?
From where should I serve them? Something like lang.mydomain.com? Re-dividing them under each app during build (e.g. with Vite)?
All the examples/tutorials I could find use a single app and include i18next.js/ts at the app level. I think I need to wrap it into a library module for my purposes. How can I do that without losing access to its capabilities/types/methods, etc.? By dynamically creating instances in a higher-order module? (The library is extensive, and I'm nearly lost.)
My initial thoughts:
As many translations will be shared, maintaining a separate copy in each app would be illogical; they should live in a shared resource.
As there can be many languages, it seems logical to use i18next-http-backend for the web apps and to embed the resources with i18next-fs-backend for the desktop app.
Dividing the resources into namespaces such as common/graphs/tables/ui etc. seems logical (these will be divided across a library module hierarchy, though).
Including a module's language resources in the module itself would be one logical approach, but that would not help translators; for their sake, the strings should all live in one place at the top level.
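One way to reconcile "shared at the top level for translators" with "divided per app at build time" is a small merge step each app runs during its build. A sketch (the helper name and object shapes are my own, not an i18next API):

```javascript
// Build-time sketch: merge shared language resources with app-specific
// ones into the { lng: { namespace: strings } } layout i18next expects.
function mergeResources(shared, appSpecific) {
  const out = {};
  const langs = new Set([...Object.keys(shared), ...Object.keys(appSpecific)]);
  for (const lng of langs) {
    // spread order means app-specific namespaces win on collision
    out[lng] = { ...(shared[lng] || {}), ...(appSpecific[lng] || {}) };
  }
  return out;
}
```

Each app could then pass the merged object to i18next's `resources` option, or the build (e.g. Vite) could emit it as the JSON files that i18next-http-backend serves.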
PS: I used to use react-intl-universal; it is really easy, but its release schedule has fallen behind.
Thank you in advance...

Is Node.js useful for "classic" style websites?

I am considering using Node.js to make a non-realtime app, for example a website like a blog, forum, or image board.
I have read that nodejs is good when used for asynchronous jobs. So I am wondering what the result would be when used to serve a lot of static files, like big images, css & js files, etc.
Is it true that when sending a file (suppose it's 2-3MB), the whole server will be blocked until the transfer is complete? I have also read that it might be possible to use the OS's sendfile() syscall to do this job. In this case, does Express support this?
No it is not true. You can easily send files that are large (much larger than 2-3 MB) without blocking. People who complain about things like this blocking the Node event loop just don't know what they're doing.
You don't necessarily need to use Express to get this behavior.
That being said, if what you want is a file server, there's no reason to use NodeJS. Just point Apache at a directory, and let it fly. Why reinvent the wheel just to use the new sexy technology, when old faithful does just fine?
If you would like to use Node as a simple HTTP server, may I recommend the very simple command-line module http-server:
https://npmjs.org/package/http-server
I haven't looked at the code of the module, but it is likely not optimized for large files. Let's define large, in this case, as files that are not easily cached in memory (whatever that means for your setup). If your use case calls for more optimization (piping "large" files, for example), you may still have to write your own module, but this will get you started very quickly, and it is an excellent utility for general development when you need to serve up a directory quickly.

Nodejs asset management

Evaluating Nodejs and trying to see if it will fit our needs. Coming from a rails world, I have a few unanswered questions despite searching for a long time.
What's the best way to manage assets with Node.js (Express)? In Rails, the static assets are a) fingerprinted for caching forever, b) minified (JS and CSS), and c) compiled down to CSS from SCSS.
What's the best way to handle uploaded images from users, such as avatars?
Does grunt help with minifying and gzipping html/css/javascript?
How can I avoid multiple HTTP requests to the server with Node? I don't want to make a separate HTTP request for every JavaScript asset I need. Rails helps here by combining all JS and CSS files.
Also, is Mongodb the preferred solution for most projects? I hear a lot of bad things about Mongodb and good things about it. Having a difficult time deciding if Mongo can help with more reads than writes with probably 400-500 GB data in the long run.
Any help or pointers? Thanks so much for taking the time to point me to the right places.
For each of the points you mentioned, here are a few example modules that might fit your needs. Remember that for every point there are many more modules serving the same purpose:
node-static (as a static file server), node-uglify (for minifying JS code), css-clean (same for CSS), merge-js, sqwish, and jake can help you with the building of the website (in this step you could plug-in the previous modules)
node-formidable is pretty famous
check out this question
check out this question
I am not sure if it is the "preferred" one. It is the NoSQL and JavaScript nature in it that makes it attractive. There are modules already for every type of database. Mongo should handle that amount of data; it also depends how large one document is, as there are some limitations.
There is this Github Wiki page in the NodeJS project listing and categorizing many of the important modules out there.
The exact choice of modules also depends on which framework you will use to build the app. A pretty established one (but certainly not the only one) is express. But you can find more on this topic here.

Is there a way to precompile node.js scripts?

Is there a way to precompile node.js scripts and distribute the binary files instead of source files?
Node already does this.
By "this" I mean creating machine-executable binary code. It does this using the JIT pattern though. More on that after I cover what others Googling for this may be searching for...
OS-native binary executable...
If by binary file instead of source, you mean a native-OS executable, yes. NW.JS and Electron both do a stellar job.
Use binaries in your node.js scripts...
If by binary file instead of source, you mean the ability to compile part of your script into binary, so it's difficult or impossible to utilize, or you want something with machine-native speed, yes.
They are called C/C++ Addons. You can distribute a binary (for your particular OS) and call it just like you would with any other var n = require("blah");
Node uses binaries "Just In Time"
Out of the box, Node pre-compiles your scripts on its own and creates cached V8 machine code (think "executable": it uses real machine code native to the CPU Node is running on), which it then executes with each event it processes.
Here is a Google reference explaining that the V8 engine actually compiles to real machine code, and not a VM.
Google V8 JavaScript Engine Design
This compiling takes place when your application is first loaded.
It caches these bits of code as "modules" as soon as you invoke a "require('module')" instruction.
It does not wait for your entire application to be processed, but pre-compiles each module as each "require" is encountered.
Everything inside the require is compiled and introduced into memory, including its variables and active state. Again, contrary to many popular blog articles, this is executed as individual machine-code processes. There is no VM, and nothing is interpreted. The JavaScript source is essentially compiled into an executable in memory.
This is why each module can just reference the same require and not create a bunch of overhead; it's just referencing a pre-compiled and existing object in memory, not "re-requiring" the entire module.
You can force it to recompile any module at any time. It's lesser-known that you actually have control of re-compiling these objects very easily, enabling you to "hot-reload" pieces of your application without reloading the entire thing.
A great use-case for this is creating self-modifying code, e.g. a strategy pattern that loads strategies from folders: as soon as a new folder is added, your own code can re-compile the folders into an in-line strategy pattern, create a "strategyRouter.js" file, and then invalidate the Node cache for your router, which forces Node to recompile only that module, which is then used for future client requests.
The end result: Node can hot-reload routes or strategies as soon as you drop a new file or folder into your application. No need to restart your app, no need to separate stateless and stateful operations: Just write responses as regular Node modules and have them recompile when they change.
Note: Before people tell me self-modifying code is as bad or worse than eval, terrible for debugging and impossible to maintain, please note that Node itself does this, and so do many popular Node frameworks. I am not explaining original research, I am explaining the abilities of Google's V8 Engine (and hence Node) by design, as this question asks us to do. Please don't shoot people who R the FM, or people will stop R'ing it and keep to themselves.
"Unix was not designed to stop its users from doing stupid things, as
that would also stop them from doing clever things." – Doug Gwyn
Angular 2, Meteor, the new open-source Node-based Light Table IDE, and a bunch of other frameworks are headed in this direction in order to further remove the developer from the code and bring them closer to the application.
How do I recompile (hot-reload) a required Node module?
It's actually really easy... Here is a hot-reloading npm package; for alternatives, just Google "node require hot-reload":
https://www.npmjs.com/package/hot-reload
What if I want to build my own framework and hot-reload in an amazing new way?
That, like many things in Node, is surprisingly easy too. Node is like jQuery for servers! ;D
stackoverflow - invalidating Node's require cache
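In plain CommonJS terms, the cache-invalidation trick the linked answer describes boils down to a few lines (the helper name `hotReload` is mine, not a standard API):

```javascript
// Evict a module from require.cache so the next require() re-reads and
// re-compiles it from disk, then return the freshly compiled exports.
function hotReload(modulePath) {
  const resolved = require.resolve(modulePath);
  delete require.cache[resolved];   // drop the cached compiled module
  return require(resolved);         // recompile and re-execute it
}
```

After something like `hotReload('./strategyRouter.js')`, the next request handled by the router uses the freshly compiled version, with no process restart.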

How to deal with dependencies in shared libraries, unix

I have created a dynamic (.so) library which bundles some functionality for a storage backend that I need.
As it is, it offers a known interface and provides backends for things like memcached, mysql, sqlite... etc.
Now my problem is that my shared library depends on libmemcached, on libsqlite3, on libmysqlclient.. etc., and I don't know how to pack it since clients that only want sqlite wouldn't need to have libmemcached installed.
I've been thinking on splitting it on different libraries but it seems like I'll end up with almost 20 .so libraries and I don't like that idea.
Any alternative?
One alternative is to put an interface within the shared library you made, which allows it to load dependencies at runtime. So, as an example, you can have separate init functions for different components:
init_memcached();
init_sqlite();
You implement these initialization functions using dlopen() and friends.
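A sketch of what that looks like (the helper name `load_symbol` and the soname are my own illustrations, not from the question; `sqlite3_initialize` is a real sqlite3 entry point, though the soname varies per platform; link with -ldl on older glibc):

```c
#include <dlfcn.h>
#include <stdio.h>

/* Load one symbol from a shared library on demand; NULL if the library
 * or the symbol is missing. */
static void *load_symbol(const char *soname, const char *symbol)
{
    void *handle = dlopen(soname, RTLD_LAZY | RTLD_LOCAL);
    if (!handle) {
        fprintf(stderr, "dlopen: %s\n", dlerror());
        return NULL;
    }
    void *sym = dlsym(handle, symbol);
    if (!sym)
        fprintf(stderr, "dlsym: %s\n", dlerror());
    return sym;
}

/* init_sqlite() can then fail gracefully on machines without libsqlite3,
 * instead of the whole storage library refusing to load: */
int init_sqlite(void)
{
    int (*init)(void) =
        (int (*)(void))load_symbol("libsqlite3.so.0", "sqlite3_initialize");
    return init ? init() : -1;
}
```

The key property: the linker never records libsqlite3 (or libmemcached, etc.) as a hard dependency of your .so, so clients only need the libraries for the backends they actually initialize.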
You can use dynamic loading using dlsym and dlopen.
The advantage of this approach is that your application will still run fine when a dependency's shared library is not found on the client side.
You could load only the needed shared libraries at run-time, but in my opinion that is not such a good approach.
I would split the shared library, but not into 20 libraries. See if you could group some common functionality.
