PWA app.js how to move code to many smaller files - node.js

I've written a PWA application. The application isn't big, but my app.js now has 800 lines of code and many methods. How can I move these methods into other files, divided thematically?
require doesn't work

You have a few options depending on what browsers you support.
You may be able to use native support for modules. You can find more information about this at https://developer.mozilla.org/en-US/docs/Web/JavaScript/Guide/Modules. This would be one of the simpler solutions as it does not require any additional tooling, but at the time of writing support outside Chrome is not very good.
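As a rough sketch of what the native module route looks like (file names here are invented for illustration):

// math-utils.js — an ES module exporting one function
export function add(a, b) {
  return a + b;
}

// app.js — imports it; the entry script is loaded with <script type="module" src="app.js">
import { add } from './math-utils.js';
console.log(add(2, 3));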
A second alternative is to break your code up into multiple JS files and just load them all separately. This can have performance implications, but if your files are small and few it won't cause too many problems. Just ensure that the code in each of these files attaches itself to a namespaced object to avoid conflicts.
Example file:
(function () {
  // attach everything to a single namespaced object to avoid global conflicts
  window.mycode = window.mycode || {};
  window.mycode.func = function () {
    // ...your code here...
  };
})();
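Any file loaded after it (with its own script tag) can then call into that namespace; for example:

// a second file, loaded after the one above
(function () {
  window.mycode.func();
})();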
A third option is to use an existing module loader in the browser such as https://requirejs.org/
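With RequireJS, each of those files becomes an AMD module loaded through define/require; a minimal sketch (module names invented for illustration):

// greeting.js — defines an AMD module
define([], function () {
  return {
    hello: function (name) { return 'Hello, ' + name; }
  };
});

// app.js — consumes it
require(['greeting'], function (greeting) {
  console.log(greeting.hello('world'));
});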
The fourth option, which is probably the most common, is to integrate a build step into your project that uses npm and a module bundler such as webpack or Browserify. This also lets you integrate Babel, which is really common among large JavaScript projects. The downside is that it adds a step to your deployment that needs to be run, and you need to learn how to use tools like webpack (which is surprisingly complicated). However, if you do JavaScript development you will need to be familiar with them eventually.
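A minimal webpack setup for that kind of split could look roughly like this (the entry and output paths are just assumptions for the example):

// webpack.config.js — bundles src/app.js and everything it imports into one file
const path = require('path');

module.exports = {
  entry: './src/app.js',
  output: {
    path: path.resolve(__dirname, 'dist'),
    filename: 'bundle.js'
  },
  mode: 'production'
};

// src/app.js can then use standard imports:
// import { add } from './math-utils.js';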

Related

Best way to distribute modules that use the same framework

I am creating my first open source project, and I am making some plugins for it. These plugins will be published as npm packages, and they will have identical dependencies.
My question is, what is the best way to deliver them and avoid code repetition? I know I can use something like Rollup.js to pack all the dependencies a module uses into the final distribution JS file, but if the user is using multiple modules, the inlined dependencies will be repeated and bloat the files.
I know the end user can use a bundler to remove that repeated code, but is there anything more I can do to reduce the size of my distribution JS files?
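One thing the question already hints at, sketched here as an assumption rather than a tested recipe, is marking the shared dependencies as external in each plugin's Rollup config so they are never inlined in the first place (package names below are placeholders):

// rollup.config.js — the shared framework is left as an external/peer dependency
export default {
  input: 'src/index.js',
  output: { file: 'dist/my-plugin.js', format: 'esm' },
  // anything listed here is imported at runtime instead of being bundled in
  external: ['some-shared-framework']
};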

React: Importing modules with object destructuring, or individually?

In React, some packages allow you to import Components using either individual assignment: import Card from "@material-ui/core/Card", or via object destructuring: import { Card } from "@material-ui/core".
I read in a blog that using the object destructuring syntax can have performance ramifications if your environment doesn't have proper tree-shaking functionality. The result being that every component of @material-ui/core is imported, not just the one you wanted.
In what situations could using object destructuring imports cause a decline in application performance and how serious would the impact be? Also, in an environment that does have all the bells and whistles, like the default create-react-app configuration, will using one over the other make any difference at all?
Relying on package internal structure is often discouraged but it's officially valid in Material UI:
import Card from '@material-ui/core/Card';
To avoid depending on this and to keep imports shorter, top-level exports can be used:
import { Card } from "@material-ui/core"
Both are interchangeable, as long as the setup supports tree-shaking. If unused top-level exports can be tree-shaken, the second option is preferable. Otherwise the first option is preferable, since it guarantees that unused parts of the package are not included in the bundle.
create-react-app uses a Webpack configuration that supports tree-shaking and can benefit from the second option.
Loading in extra code, such as numerous components from material-ui you may not need, has two primary performance impacts: download time and execution time.
Download time is simple: your JS file(s) are larger and therefore take longer to download, especially over slower connections such as mobile. Properly slimming down your JS using mechanisms like tree shaking is always a good idea.
Execution time is a little less apparent, but has a similar effect, this time on browsers with less computing power available, again primarily mobile. Even if the components are never used, the browser must still parse and execute the source and pull it into memory. On a desktop with a powerful processor and plenty of memory you'll probably never notice the difference, but on a slower/older computer or mobile device you may notice a small lag even after the file(s) finish downloading, as they are processed.
Assuming your build tooling has properly working tree shaking, my opinion is that they are generally roughly equivalent. The build tool will not include the unused components in the compiled JS, so it shouldn't impact either download or execution time.

Using the same module in multiple files/modules

Let's say I need a promise module, and I am using it in multiple files; I include all those files in app.js. Do I have to require the promise module in each of them? Is there a way to pass it to the imported modules?
Yes, you should put a var Promise = require('bluebird') statement at the top of every file that uses it. This is how node/CommonJS modules express dependencies. Sometimes people initially react to this by wanting to go back to global variables, requiring something in one file and having it be implicitly/globally available in every other file of their application, but as an industry we have had years and years of experience with both approaches, and explicit dependency declaration via require makes dependency management more, well, manageable overall. This is especially true in the case of automated tooling (browserify, webpack, etc.).
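Concretely, that looks something like this (file names invented for illustration; node caches the module, so the repeated require calls are cheap):

// lib/users.js
var Promise = require('bluebird');   // required here...
module.exports = function getUser(id) {
  return Promise.resolve({ id: id });
};

// app.js
var Promise = require('bluebird');   // ...and required again here
var getUser = require('./lib/users');
getUser(1).then(function (user) { console.log(user); });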

Suitescript - 1 big script file, or multiple smaller files

From a performance/maintenance point of view, is it better to write my custom NetSuite modules all as one big JS file, or as multiple segmented script files?
If you compare it with a server-side JavaScript platform, say the most popular one, Node.js, every module is written in a separate file.
I generally take the approach of object-oriented JavaScript and put each class in a separate file, which helps to organise the code.
One approach you can take is to keep separate files during development and merge all the files using a JS minifier tool like the Google Closure Compiler when you deploy your code for production. This can give you the best of both worlds, if you are really bothered about every last nano/millisecond of performance.
If you look at the SuiteScript 2.0 architecture, it encourages a module architecture which is easier to manage, as you load only those modules that you need, and it is easier to maintain multiple code files, i.e. one per module, with future enhancements, bug fixes and code reuse in mind.
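For illustration, a SuiteScript 2.0 entry point loads both NetSuite modules and your own files through define(); the custom './myUtils' module and its doSomething function below are assumptions, not part of the question:

/**
 * @NApiVersion 2.x
 * @NScriptType UserEventScript
 */
define(['N/record', './myUtils'], function (record, utils) {
    // only the modules listed above are loaded for this script
    function beforeSubmit(context) {
        utils.doSomething(context.newRecord);
    }
    return { beforeSubmit: beforeSubmit };
});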
Performance can never be judged by the line count of your module. We generally maintain modules for the readability and simplicity of the code. It is good practice to put all generic functionality into a utility script and use it as a library across all the modules. Again, it depends on your code logic and programming style. So if you want to split your JS file into multiple segments for more readability, I don't think it's a bad idea.

Using UglifyJs on the whole Node project?

I need to obfuscate my source code as well as possible, so I decided to use uglifyjs2. My project structure has nested directories; how can I run uglifyjs2 over the whole project instead of giving it all the input files individually?
I wouldn't mind if it minified the whole project into a single file or something.
I've done something very similar to this in a project I worked on. You have two options:
Leave the files in their directory structure.
This is by far the easier option, but provides a much lower level of obfuscation since someone interested enough in your code basically has a copy of the logical organization of files.
An attacker can simply pretty-print all the files and rename the obfuscated variable names in each file until they have an understanding of what is going on.
To do this, use fs.readdir and fs.stat to recursively go through folders, read in every .js file and output the mangled code.
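A rough sketch of that recursive walk, assuming the uglify-js 3 API where minify() takes source text and returns an object with a code property (this is not the answerer's actual code):

// walk a directory tree and minify every .js file in place
const fs = require('fs');
const path = require('path');
const UglifyJS = require('uglify-js');

function mangleDir(dir) {
  for (const name of fs.readdirSync(dir)) {
    const full = path.join(dir, name);
    if (fs.statSync(full).isDirectory()) {
      mangleDir(full);                               // recurse into subfolders
    } else if (full.endsWith('.js')) {
      const result = UglifyJS.minify(fs.readFileSync(full, 'utf8'));
      fs.writeFileSync(full, result.code);           // output the mangled code
    }
  }
}

mangleDir('./src');   // './src' is just an example root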
Compile everything into a single JS file.
This is much more difficult for you to implement, but does make life harder on an attacker since they no longer have the benefit of your project's organization.
Your main problem is reconciling your require calls with files that no longer exist (since everything is now in the same file).
I did this by using Uglify to perform static analysis of my source code by analyzing the AST for calls to require. I then loaded the source code of the required file and repeated.
Once all code was loaded, I replaced the require calls with calls to a custom function, wrapped each file's source code in a function that emulates how node's module system works, and then mangled everything and compiled it into a single file.
My custom require function does most of what node's require does except that rather than searching the disk for a module, it searches the wrapper functions.
Unfortunately, I can't really share any code for #2 since it was part of a proprietary project, but the gist is:
Parse the source text into an AST using UglifyJS.parse.
Use the TreeWalker to visit every node of the AST and check if
node instanceof UglifyJS.AST_Call && node.start.value == 'require'
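In code, that check looks roughly like the following sketch, using the uglify-js parse/TreeWalker API the answer refers to (the file name 'app.js' is just an example):

const fs = require('fs');
const UglifyJS = require('uglify-js');

const sourceText = fs.readFileSync('app.js', 'utf8');
const ast = UglifyJS.parse(sourceText);
const walker = new UglifyJS.TreeWalker(function (node) {
  // a call whose callee token is the word "require"
  if (node instanceof UglifyJS.AST_Call && node.start.value === 'require') {
    console.log('found require of', node.args[0].value);   // the module path literal
  }
});
ast.walk(walker);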
As I have just completed a huge pure Node.js project spanning 80+ files, I had the same problem as the OP. I needed at least minimal protection for my hard work, but it seems this very basic need has not been covered by the NPMjs open-source community. To add insult to injury, the JXCore package encryption system was cracked last week in a few hours, so back to obfuscation...
So I created a complete solution that handles file merging and uglifying. You also have the option of excluding specified files/folders from merging. Those files are then copied to the new output location of the merged file, and references to them are rewritten automatically.
NPMjs link of node-uglifier
GitHub repo of node-uglifier
PS: I would be glad if people would contribute to make it even better. This is a war between thieves and hard-working coders like yourself. Let's join forces and increase the pain of reverse engineering!
This isn't supported natively by uglifyjs2.
Consider using webpack to package up your entire app into a single minified .js file, excluding node_modules:
http://jlongster.com/Backend-Apps-with-Webpack--Part-I
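Following that approach, a webpack config for a Node app might look roughly like the sketch below; the webpack-node-externals package used here to keep node_modules out of the bundle is one common choice and an assumption, not something taken from the linked post:

// webpack.config.js — bundle a Node app into one file, leaving node_modules external
const path = require('path');
const nodeExternals = require('webpack-node-externals');

module.exports = {
  target: 'node',
  entry: './src/server.js',
  output: {
    path: path.resolve(__dirname, 'build'),
    filename: 'server.bundle.js'
  },
  externals: [nodeExternals()],   // don't bundle anything from node_modules
  mode: 'production'              // production mode minifies the output
};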
I had the same need - for which I created node-optimize and grunt-node-optimize.
https://www.npmjs.com/package/grunt-node-optimize
