Delivering optimal performance with TypeScript and RequireJS

I have a TypeScript project with plenty of circular dependencies, and I am using RequireJS to load files in the browser. I am trying to refactor the project, and I don't really know how to do it for optimal performance.
Here's the situation.
A.ts references B.ts and C.ts
B.ts references A.ts and C.ts
C.ts references A.ts and B.ts
I can either put reference tags in them and compile them into a single file with the TypeScript compiler's --out option. This makes it harder to compile and debug, so I am not inclined toward this option.
I can use RequireJS to load them as independent modules. However, I am concerned that, with so many circular dependencies, the browser will make too many requests for the same file. Secondly, I'm wondering whether a single file request costs considerably less bandwidth than multiple requests for files whose combined size is the same.
Please advise.

You will need to call require again to actually get the instance. See http://requirejs.org/docs/api.html#circular
RequireJS never requests the same file twice over the network, so you should not worry about that.
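The shape of the pattern the linked docs describe, adapted to this question, looks roughly like the following AMD output for A.ts; the module id "B" and the B.help() call are illustrative placeholders:
// A's compiled AMD module: depend on the local "require" plus "B",
// and re-require "B" lazily at call time, once the cycle has finished loading
define(["require", "B"], function (require, B) {
    return {
        doSomething: function () {
            var b = require("B"); // synchronous form works because B is already loaded
            return b.help();      // hypothetical function on B
        }
    };
});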

Related

Test both source code and bundled code with jest

Let's say I am developing an NPM module.
I am using Jest for the testing, Webpack to bundle it and TypeScript in general.
When I test the source code, everything is fine, with very good code coverage and all of that. But I don't think that is enough: something could break once the Webpack bundle is generated, for instance a dynamic import (a require with a variable instead of a fixed path) that becomes incorrect after bundling, or other scenarios.
How should I write tests that also cover the bundle? Should I test against both the source code (so that I get good coverage) and the bundle? Usually I import things directly from specific files (e.g. /utils/myutil.ts), but with the bundle this would be impossible. How should I handle this?
I do test against the bundle for some of my projects, mostly libraries published to npm.
To do this I create some code that imports the bundle and write tests against that code. I don't care about coverage in this case; I just want to verify that my library does what it's supposed to do.
In another case (not a library) I'm testing against the bundle but I'm running more integration/e2e tests.
Don't worry about coverage that much unless every function (or most of them) in your code is going to be used by the final user. You should test things the way they are used. 100% coverage is nice to see, but it is very impractical to achieve when projects get big, and in any case it's a waste of time. Of course, some people will disagree :)
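As a rough sketch of the "test the bundle" idea (the dist/index.js path and the add() export are assumptions, not from the question):
// bundle.test.js - run by Jest against the Webpack output instead of the sources
const bundled = require("../dist/index.js");

describe("bundled build", () => {
  it("still exposes the public API after bundling", () => {
    expect(typeof bundled.add).toBe("function");
    expect(bundled.add(2, 3)).toBe(5);
  });
});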

PWA app.js how to move code to many smaller files

I've written a PWA. The application isn't big, but my app.js now has 800 lines of code and many methods. How can I move these methods into separate files, divided thematically?
require doesn't work
You have a few options depending on what browsers you support.
You may be able to use native browser support for JavaScript modules. You can find more information about this at https://developer.mozilla.org/en-US/docs/Web/JavaScript/Guide/Modules. This is one of the simpler solutions as it does not require any additional tooling, but at this time the support outside Chrome is not very good.
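A minimal sketch of what that looks like (file and function names are illustrative):
// math.js - one thematically grouped file
export function add(a, b) {
  return a + b;
}

// app.js - imports it natively; the page loads it with <script type="module" src="app.js">
import { add } from "./math.js";
console.log(add(1, 2));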
A second alternative is to break up your code into multiple JS files and load them all with separate script tags. This can have performance implications, but if your files are small and few it won't cause too many problems. Just ensure that the code in each file puts itself onto a namespace object to avoid conflicts.
Example file:
(function () {
  // attach everything to a single global namespace object to avoid collisions
  window.mycode = window.mycode || {};
  window.mycode.func = function () {
    // ...module code goes here...
  };
})();
A third option is to use an existing module loader in the browser such as https://requirejs.org/
The fourth option, which is probably the most common, is to integrate a build step into your workflow that uses npm and a module bundler such as webpack or Browserify. This also lets you integrate Babel, which is really common among large JavaScript projects. The downside is that it adds a step to your deployment that needs to be run, and you need to learn how to use tools like webpack (which is surprisingly complicated). However, if you do JavaScript development you will need to be familiar with them eventually.
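If you go the bundler route, a minimal webpack setup could look roughly like this; it assumes webpack and babel-loader are installed, and the entry/output names are illustrative:
// webpack.config.js
const path = require("path");

module.exports = {
  mode: "production",
  entry: "./src/app.js",            // your app.js, now split into modules under src/
  output: {
    filename: "app.bundle.js",
    path: path.resolve(__dirname, "dist")
  },
  module: {
    rules: [
      // run the source through Babel so older browsers are covered
      { test: /\.js$/, exclude: /node_modules/, use: "babel-loader" }
    ]
  }
};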

Using UglifyJs on the whole Node project?

I need to obfuscate my source code as well as possible, so I decided to use uglifyjs2. My project structure has nested directories; how can I run the whole project through uglifyjs2 instead of passing it every input file individually?
I wouldn't mind if it minified the whole project into a single file or something
I've done something very similar to this in a project I worked on. You have two options:
Leave the files in their directory structure.
This is by far the easier option, but provides a much lower level of obfuscation since someone interested enough in your code basically has a copy of the logical organization of files.
An attacker can simply pretty-print all the files and rename the obfuscated variable names in each file until they have an understanding of what is going on.
To do this, use fs.readdir and fs.stat to recursively go through folders, read in every .js file and output the mangled code.
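A rough sketch of that walk, using the synchronous fs variants for brevity and assuming uglify-js 3, where UglifyJS.minify() takes source text directly (uglify-js 2 wants minify(code, { fromString: true }) instead); the folder names are illustrative:
const fs = require("fs");
const path = require("path");
const UglifyJS = require("uglify-js");

function uglifyTree(dir) {
  for (const entry of fs.readdirSync(dir)) {
    const full = path.join(dir, entry);
    if (fs.statSync(full).isDirectory()) {
      uglifyTree(full);                              // recurse into sub-folders
    } else if (full.endsWith(".js")) {
      const result = UglifyJS.minify(fs.readFileSync(full, "utf8"));
      if (result.error) throw result.error;
      fs.writeFileSync(full.replace(/\.js$/, ".min.js"), result.code);
    }
  }
}

uglifyTree("./src");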
Compile everything into a single JS file.
This is much more difficult for you to implement, but does make life harder on an attacker since they no longer have the benefit of your project's organization.
Your main problem is reconciling your require calls with files that no longer exist (since everything is now in the same file).
I did this by using Uglify to perform static analysis of my source code by analyzing the AST for calls to require. I then loaded the source code of the required file and repeated.
Once all code was loaded, I replaced the require calls with calls to a custom function, wrapped each file's source code in a function that emulates how node's module system works, and then mangled everything and compiled it into a single file.
My custom require function does most of what node's require does except that rather than searching the disk for a module, it searches the wrapper functions.
Unfortunately, I can't really share any code for #2 since it was part of a proprietary project, but the gist is:
Parse the source text into an AST using UglifyJS.parse.
Use the TreeWalker to visit every node of the AST and check if
node instanceof UglifyJS.AST_Call && node.start.value == 'require'
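A hedged sketch of that check, assuming the uglify-js 2 AST API referred to above (UglifyJS.parse, TreeWalker, AST_Call); it only collects the require() targets and leaves the rewriting and wrapping steps out:
const UglifyJS = require("uglify-js");

function findRequires(source) {
  const requires = [];
  const ast = UglifyJS.parse(source);
  const walker = new UglifyJS.TreeWalker(function (node) {
    if (node instanceof UglifyJS.AST_Call && node.start.value === "require") {
      const arg = node.args[0];
      // only static string arguments can be resolved ahead of time
      if (arg instanceof UglifyJS.AST_String) requires.push(arg.value);
    }
  });
  ast.walk(walker);
  return requires;
}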
As I had just completed a huge pure Node.js project spread across 80+ files, I had the same problem as the OP. I needed at least minimal protection for my hard work, but it seems this very basic need had not been covered by the npm open-source community. To add insult to injury, the JXCore package encryption system was cracked last week in a few hours, so it was back to obfuscation...
So I created a complete solution that handles file merging and uglifying. You also have the option of excluding specified files/folders from the merge; those files are then copied to the output location of the merged file and references to them are rewritten automatically.
NPMjs link of node-uglifier
GitHub repo of node-uglifier
PS: I would be glad if people contributed to make it even better. This is a war between thieves and hard-working coders like yourself. Let's join forces and increase the pain of reverse engineering!
This isn't supported natively by uglifyjs2.
Consider using webpack to package up your entire app into a single minified .js file, excluding node_modules:
http://jlongster.com/Backend-Apps-with-Webpack--Part-I
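As a rough illustration (not taken from the linked article), a Node-targeted webpack config that leaves node_modules unbundled might look like this; it assumes the webpack-node-externals package, and the entry/output names are placeholders:
// webpack.config.js
const path = require("path");
const nodeExternals = require("webpack-node-externals");

module.exports = {
  mode: "production",                 // enables built-in minification
  target: "node",                     // keep Node built-ins like fs/path out of the bundle
  entry: "./src/server.js",
  externals: [nodeExternals()],       // do not bundle anything from node_modules
  output: {
    filename: "server.bundle.js",
    path: path.resolve(__dirname, "build")
  }
};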
I had the same need - for which I created node-optimize and grunt-node-optimize.
https://www.npmjs.com/package/grunt-node-optimize

Over-use of require() in node.js, mongoose

I'm new to Node.js, but quite like the module system and require().
That being said, coming from a C background, it makes me uneasy seeing the same module being require()'d everywhere. All in all, it leads me to some design choices that deviate from how things are done in C. For example:
Should I require() mongoose in every file that defines a mongoose model? Or inject a mongoose instance into each file that defines a model.
Should I require() my mongoose models in every module that needs them? Or have a model provider that is passed around and used to provide these models.
Etc. For someone who uses dependency injection a lot, my gut C feeling tells me to require() a module only once and pass it around as needed. However, after looking at some open-source projects, this doesn't seem to be the Node way of doing things. require() does make things super easy...
Does it hurt to overuse this mechanism?
require() caches modules when you use it. When you see the same file or module required everywhere, it is only being loaded once; after that the stored module.exports is handed out again. This means that you can use require everywhere and not worry about performance and memory issues.
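A quick way to convince yourself of the caching behaviour (any installed module behaves the same; mongoose is just the example from the question):
const first = require("mongoose");
const second = require("mongoose");
console.log(first === second); // true: both names point at the same cached module.exports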
As cptroot states, requiring a module everywhere you need it instead of passing it around as an argument is safe to do and is also much easier. However, you should view every require call as a hardcoded dependency which you can't change easily. E.g. if you want to mock a module for testing, these hardcoded dependencies will hurt.
So passing a module instance around as an argument instead of requiring it again and again reduces the number of hardcoded dependencies, because you now inject that dependency. E.g. in your tests you will benefit from being able to easily inject a mock instead.
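As a hedged illustration of that injection style (the file and model names are hypothetical, not from the question):
// user-model.js - a factory that takes the mongoose instance as an argument
module.exports = function defineUserModel(mongooseInstance) {
  const schema = new mongooseInstance.Schema({ name: String, email: String });
  return mongooseInstance.model("User", schema);
};

// app.js, the composition root, wires in the real instance:
//   const mongoose = require("mongoose");
//   const User = require("./user-model")(mongoose);
// while a test can pass a stub object whose Schema/model calls are recorded instead.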
If you go down this road you will want to use a dependency injection container that helps you inject all your dependencies and get rid of all hardcoded require calls. To choose a dependency injection container appropriate for your project you should read this excellent article. Also check out Fire Up!, a dependency injection container I implemented.

Can you create TypeScript packages, like C# DLLs?

Is there currently any notion of packaging TypeScript files?
One thing I'm finding painful at the moment, while trying to migrate a pure JavaScript project over to TypeScript, is the references: in some cases where I have complex objects I am having to write several reference statements pulling files from all over the place.
Part of this is down to my project layout, as it's a pretty big and modular one, so I have a structure like this:
- modules
  |- module1
  |  |- models
  |  |- services
  |  |- controllers
  |- module2
  |  |- models
  |  |- services
  |  |- controllers
  |- core
     |- models
     |- services
     |- data
     |- validation
There is much more, but you get the point. Currently core is used by every module, but with JavaScript I just expect it to be loaded in at runtime, which will still need to happen. However, as the TypeScript concerns are really only at compile time, I was wondering if there is some notion of packaging all the TypeScript files up into some kind of TypeScript library, which could then be referenced from projects rather than having module1 models referencing core models, etc.
The problem currently revolves around the directory structure: the namespaces work fine, but if I move a file I need to go to every file which references the moved file and update it. That is tiresome, whereas if there were some sort of package idea I could just reference the package once it is output, so I am no longer worrying about file systems and directories, just about a package and namespaces.
I think a lot of this is very similar to how C# works: you have a project which has references, and every file within that project can use any of the classes within those references, so code exposure is managed by references and namespaces.
I am thinking about having my build script make a local references.ts file by looping through every *.ts file in the relevant module and putting the references into one big file:
///<reference path="core/models/some-model.ts"/>
///<reference path="core/models/some-model-2.ts"/>
///<reference path="core/services/some-service.ts"/>
like shown above, and then using this reference file in all TypeScript files which require core files, so it acts as a kind of project-level reference. This may mean some files have references they don't need, but it's compile time so I don't really care...
I don't want to go hand-rolling my own solution to this problem if a good way already exists. Hope that makes sense...
== EDIT ==
I just wanted to post this up here, as for my scenario it has saved me TONS of time and has also reduced my reference guff by about 99%; it won't be applicable for people who don't have build scripts, though.
Assuming you do have a build script, I took the path of having a step in my script which goes through every single file within a root-level directory (module1, module2, etc. in this case) and then outputs a local.references.ts into a references folder within that directory. I then manually write an external.references.ts which references external descriptors or other modules' references wherever needed.
After this part is done, when I compile my TypeScript I point it at the root directories again and tell it to compile them all (**/*.ts) into one big JS file (i.e. module1.js). Because this automatically includes the local and external references, I don't need to put ANY reference declarations in the individual classes.
So this way, providing the local and external reference files (local.references.ts, external.references.ts) are included within the bulk processing of the files, you just have to worry about namespaces, making it pretty much the same as how C# would operate.
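As a hedged sketch of that generation step (written for Node; the folder layout, the output path, and the use of synchronous fs calls are all illustrative assumptions):
import * as fs from "fs";
import * as path from "path";

// collect every .ts file underneath a module's root folder
function collectTsFiles(dir: string): string[] {
  return fs.readdirSync(dir).reduce<string[]>((files, entry) => {
    const full = path.join(dir, entry);
    if (fs.statSync(full).isDirectory()) return files.concat(collectTsFiles(full));
    return full.endsWith(".ts") ? files.concat(full) : files;
  }, []);
}

const moduleRoot = "modules/module1";
const referencesDir = path.join(moduleRoot, "references"); // assumed to exist already
const lines = collectTsFiles(moduleRoot)
  .filter(file => !file.endsWith("local.references.ts"))
  .map(file => `///<reference path="${path.relative(referencesDir, file)}"/>`);
fs.writeFileSync(path.join(referencesDir, "local.references.ts"), lines.join("\n"));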
If, however, you do not have a build script which is able to do this local reference generation and TypeScript compilation, then the approach linked in the comments would be a good option.
Currently there is no formal process for packaging your source TypeScript files as a pre-built library in the way you are describing.
There are a few different solutions currently in use, much like the one linked in the comments, which let you put all your references into central TypeScript files and then reference those from your individual scripts; or the approach you put forward, where you do the same sort of thing, but rather than writing the references manually you have your build script generate them and let the compilation process pick them up instead of referencing them explicitly in each file.
As TypeScript gets more mature there may be more formal ways of doing this, but for the moment just take whichever solution works best given your tooling and approach to developing with TypeScript.
