Speeding up compilation of a Node.js project

I have a Java Spring Boot backend and a React.js frontend. I need to place the compiled frontend app into the "static" folder of my Spring Boot application so it can be served as static content. This is done using the command npm run build.
The problem is that this build is quite slow and takes several seconds to finish. On the other hand, when I run my frontend app directly with "npm start", local code changes show up in my web browser in about a second.
It's not acceptable for me to wait 10s or more every time the build into my Spring Boot app runs. Is there a way to "link together" the Node.js project files without any optimisations, or to otherwise speed up the build?

You're referring to a common pain point of repeatable builds: dependency installation consumes too much time. The only known workaround is to use a cache. Some people check node_modules into version control, but that's just shooting yourself in the foot.
If you're feeling adventurous, you can also consider adding Squid as a caching proxy in your production build environment, which will speed up Docker image downloads in addition to npm installs.
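If it's the bundling step itself, rather than npm install, that eats most of the time, the other lever is the one the question hints at: produce an unoptimised development build straight into Spring Boot's static folder. A minimal sketch, assuming the frontend is built by a webpack config you control (Create React App hides its config, so this would mean ejecting or using a different setup) and that the backend sits in a sibling folder named backend; all paths are illustrative:

```javascript
// webpack.dev-to-spring.config.js: unoptimised watch build into Spring Boot's static folder (sketch)
const path = require('path');

module.exports = {
  mode: 'development',            // skips minification and other production passes
  entry: './src/index.js',        // illustrative entry point
  devtool: 'eval-cheap-module-source-map', // fast incremental rebuilds, still debuggable
  watch: true,                    // rebuild on every file change instead of running full builds by hand
  output: {
    // assumption: the Spring Boot project lives next to the frontend project
    path: path.resolve(__dirname, '../backend/src/main/resources/static'),
    filename: 'bundle.js',
  },
};
```

Run it with npx webpack --config webpack.dev-to-spring.config.js; after the first pass, incremental rebuilds are usually much closer to the npm start experience, and the caching advice above still applies on the CI side.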

Related

How to package server side JavaScript for Node

I have been using webpack to build a server-side app using Express. This code is harder to debug since I don't have the immediate comfort of a web browser, and if I use something like VS Code to debug, it won't accept breakpoints inside request handlers when using source maps. Besides, it takes no time to compile if I just stick to Node-compatible JS and skip all the transpiling. Further, tree shaking can reduce the size, but what is the point of that when the code runs on the server (no client will ever download it)?
My point is that I don't see why one would want to bundle server-side code if the server doesn't have any issues with memory or other limiting factors. Unbundled code is easier to read, takes no time to compile and is easier to debug.
So, a question: is it OK to have the server app as an npm package and deploy that? Is that what is common, or what do people do?
The project my team works on is a TypeScript monorepo that includes a web application and server-side daemon processes. We use webpack for producing build artefacts for all targets.
I've read some posts saying there's no point using webpack server-side because it was designed for web (obv), and I can understand that point of view.
However, even if our code was JavaScript and didn't require any transpiling, we'd still use webpack for the server-side. I want a single, minimal set of files and no node modules to deploy to the servers.
Size itself doesn't matter that much for server side (though every bit helps), but running 'yarn install' on servers is out of the question IMO.
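For what it's worth, the setup described above boils down to very little configuration. A minimal sketch, assuming webpack 5 with ts-loader (and a tsconfig.json in place) and an illustrative src/server.ts entry point, not the poster's actual config:

```javascript
// webpack.config.js: bundle a TypeScript server app into a single file, dependencies included
const path = require('path');

module.exports = {
  target: 'node',              // keep Node built-ins (fs, http, ...) as plain require() calls
  mode: 'production',
  entry: './src/server.ts',    // illustrative entry point
  module: {
    rules: [{ test: /\.ts$/, use: 'ts-loader', exclude: /node_modules/ }],
  },
  resolve: { extensions: ['.ts', '.js'] },
  output: {
    path: path.resolve(__dirname, 'dist'),
    filename: 'server.js',     // the single artefact that gets copied to the servers
  },
  devtool: 'source-map',       // external map keeps the bundle debuggable without bloating it
};
```

Because there is no externals entry, npm dependencies are compiled into server.js as well, which is what makes deploying without node_modules or a yarn install possible; the usual caveat is that packages with native addons cannot be bundled this way and have to stay external.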

Is `ng serve` command suitable for production?

I'm building a small project using Angular 7. When you run
ng serve
and a NodeJS server is spun up to handle requests, does each request block until processing is complete? We are trying to evaluate how effective using this in production would be as opposed to using a more traditional application server.
Run ng build --prod to generate a "./dist" folder.
Then you have to put that on a web server.
You can use Angular Server Side Rendering (SSR) to run it on a node.js server.
You should not use ng serve for production because it uses webpack-dev-server, which is built for development only.
ng serve runs a webpack development server under the hood: a development server.
It's made to mimic the production build so you can see your final application in an easy way.
If you didn't have that command, you would need to run something like simplehttpserver after rebuilding your whole application on every change.
It's a convenience tool provided by the CLI to ease your development; in no case is it suited for production. It's a server without security, without optimization, without performance, without ... well, without anything that makes a server a server. By default, it doesn't even make your application accessible outside of your localhost. Not so useful for production ...
So, never, I repeat, never, use this command for your production server.
Run ng build --prod
It will generate minified code in the "dist" folder. You have to upload the contents of this "dist" folder to your web server. It will give faster responses when loading your web pages.
For more details, please refer to the Angular deployment guide.
When using ng serve, you are spawning a backend Node.js environment with a web server to handle requests to your Angular application. That's great for live reloading and quick startup while developing, but keeping such resources running just to serve static pages is unnecessary.
At the end of the day, Angular is just an opinionated framework for building an SPA. No matter the framework or library you use, you will always end up with an index.xxx, JavaScript files and other resource files, from vendors or your own code. Only these matter to the browser loading the page.
Hence, you need to build your app to generate the static files that will be served (i.e. ng build --prod). Then you have 2 good options:
Choose a web server that will serve the files (e.g. nginx) on a dedicated server (or even a container).
Place the files behind a CDN provider. Since they are static, they will be cached and served to the requesting browser based on its location.
I would opt for #2, as #1 forces you to keep resources (CPU, RAM, HDD) running for files that will not be requested that often. I say not often because your SPA handles all routes within itself in the client's browser (and will request a cache refresh at minimum once a day).
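To make option 1 concrete: once ng build --prod has produced the dist folder, serving it is nothing more than static file hosting plus a fallback to index.html so the Angular router can handle deep links. A minimal sketch using Express (illustrative only; an nginx try_files rule or a CDN, as recommended above, does the same job with less overhead):

```javascript
// serve-dist.js: bare-bones static server for the ng build output (sketch, not production-hardened)
const path = require('path');
const express = require('express'); // assumed dependency

const app = express();
const distDir = path.join(__dirname, 'dist', 'my-app'); // adjust "my-app" to your project name

// Serve the compiled index.html, JavaScript bundles and assets directly
app.use(express.static(distDir));

// Every other path falls back to index.html so the Angular router can take over
app.get('*', (req, res) => res.sendFile(path.join(distDir, 'index.html')));

app.listen(8080, () => console.log('Serving', distDir, 'on http://localhost:8080'));
```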

What's the lightest way to automate .vue file compilation without webpack?

I'm working on a web application that currently uses vuejs for part of its interface. The back-end is NOT in Node, so there is currently no package.json file or any tool from the typical npm stack in this repository.
We already have a bunch of non-npm dependencies that need to be installed in order to use the repository, so my coworkers aren't too open to the idea of adding another layer of complexity. I can't blame them for that; it's the reason I use npm scripts, and not even gulp, in my other projects. I'm tired of spending hours learning and configuring build tools that never end up doing what I want anyway.
But since the vue-cli tool no longer includes the build command, I'm a bit stuck. Is there really no CLI app left to build .vue files at all? And if so, what would be the smart way to use Vue without webpack? Template strings are not maintainable at all, and <script type="text/x-template"> doesn't work when you want to use multiple components from multiple files on the same page.
I realize your question says 'without webpack', but you may be interested in backpack, a CLI app I came across for building Vue.js apps without requiring you to write any configuration code. It is basically webpack preconfigured as a minimalistic build system for Node.js. It provides two commands: dev for live-reload development and build for building your project.
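If adding any npm tooling really is off the table, the zero-build fallback is one globally registered component per plain .js file, pulled in with ordinary <script> tags. It keeps the template-string drawback mentioned in the question, but it does scale to multiple components across multiple files. A minimal Vue 2 sketch, assuming the full CDN build of Vue (all names are illustrative):

```javascript
// components/task-item.js: one component per plain JS file, no build step required
// Loaded with <script src="components/task-item.js"></script> after vue.js and before app.js.
Vue.component('task-item', {
  props: ['task'],
  // a template string stands in for the <template> block of a .vue file
  template: '<p>{{ task.title }}</p>'
});

// app.js: root instance, created once all component files have loaded
// index.html holds the mount point, e.g. <div id="app"><task-item v-for="t in tasks" :task="t" :key="t.title"></task-item></div>
new Vue({
  el: '#app',
  data: { tasks: [{ title: 'write docs' }, { title: 'ship it' }] }
});
```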

Meteor bundle vs meteor --production

I am facing a dilemma over whether I should bundle the Node.js app from Meteor or just run meteor --production.
I am mostly interested in the performance impact. I have found some explanation here, but it does not clearly state that meteor runs in production mode.
Running just meteor --production would simplify my deployment process a lot.
I would like to know whether there are any reasons to stick with the bundle.
I think when you run meteor --production, you are still running as if you were in development mode, only with "production" settings and such. You are still getting an internal/local MongoDB, you are still burning CPU time monitoring files, etc.
If this is true, then the end result is that you will not scale at all. I doubt that the local MongoDB uses the oplog, which is a HUGE performance boost for Meteor apps.
Your best bet would be to look at some automated build/deploy tools. I have personally used mup and mupx. The latest version of mup builds your app, sets up MongoDB (if you want) and nginx, builds them all as Docker images, and deploys them. You can even set up SSL certs with nginx (although no Let's Encrypt support yet :(). Or, you could easily script the deployment yourself using any number of tools, including just raw scripts. I think in the long run you will be in much better shape than trying to run the app using the meteor command.

Testing elixir release build with exrm

I am building a Phoenix application with exrm.
Good practice suggests that I should run tests against the same binary I'll be pushing to production.
Exrm gives me the ability to deploy Phoenix on machines that don't have Erlang or Elixir installed, which also makes pulling Docker images faster.
Is there a way to run mix test against the binary built by exrm?
It should be noted that releases aren't a binary file. Sure, they are packaged into a tarball, but that is just to ease deployment; what the tarball contains is effectively the .beam files generated with MIX_ENV=prod mix compile, plus ERTS (if you are bundling it), the Erlang/Elixir .beam files, and the boot scripts/config files for starting the application, etc.
So in short your code will behave identically in a release as it would when running with MIX_ENV=prod (assuming you ran MIX_ENV=prod mix release). The only practical difference is whether or not you've correctly configured your application for being packaged in a release, and testing this boils down to doing a test deployment to /tmp/<app> and booting it to make sure you didn't forget to add dependencies to applications in mix.exs.
The other element you'd need to test is if you are doing hot upgrades/downgrades with your application, in which case you need to do test deploys locally to make sure the upgrade/downgrade is applied as expected, since exrm generates default .appup files for you, which may not always do the correct thing, or everything you need them to do, in which case you need to edit them as appropriate. I do this by deploying to /tmp/<app> starting up the old version, then deploying the upgrade tarball to /tmp/<app>/releases/<new version>/<app>.tar.gz, and running /tmp/<app>/bin/<app> upgrade <version> and testing that the application was upgraded as expected, then running the downgrade command for the previous version to see if it rolls back properly. The nature of the testing varies depending on the code changes you've made, but that's the gist of it.
Hopefully that helps answer your question!
