Packaging requirejs optimized files in war

In a large web application, I'm using RequireJS AMD modules so that the scripts themselves are modular and maintainable. I have the following directory structure:
web
|- src
   |- main
      |- java
      |- resources
      |- webapp
         |- static
         |  |- scripts
         |  |- styles
         |  |- images
         |- static-built   // output from r.js; not checked into git
         |- WEB-INF
During the build, JS and CSS are optimized by r.js into the static-built folder. Gradle is the build tool.
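For reference, the r.js build profile that produces static-built looks roughly like the following sketch; the exact paths and module name are assumptions based on the tree above.
({
    appDir: 'src/main/webapp/static',      // unoptimized sources
    baseUrl: 'scripts',
    mainConfigFile: 'src/main/webapp/static/scripts/main.js',
    dir: 'src/main/webapp/static-built',   // optimized output, not checked into git
    optimizeCss: 'standard',
    modules: [
        { name: 'main' }                   // hypothetical entry module
    ]
})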
Now the problem: the JSPs refer to the scripts in the static/scripts folder, and that is what I want when working locally. However, when building the WAR, I want the static files to be served from the static-built folder. The important constraint is that the source JSPs should not have to change in order to serve the optimized files from static-built.
Two options that I have are: a) have the Gradle build include static-built instead of static when assembling the WAR; b) include static-built in addition to static, and use Tuckey UrlRewriteFilter to pick each resource from static-built rather than static.
What best practices does the community follow in similar scenarios?

We've set up the server to read a runtime profile (dev, qa, prod, etc.) from a system property, which determines some settings. When running with the production profile we serve the optimized files from the WAR. In development we serve the non-minified, non-concatenated files directly from the filesystem outside the application context.
Files are structured according to the official multipage example.
Configuring how the files are served depends on your chosen backend solution. Here's an example for Spring.
Alternatively, r.js can generate source maps, and those help with development as well.
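A minimal sketch of the relevant r.js profile options (the rest of the profile is omitted):
({
    optimize: 'uglify2',              // source map generation requires uglify2 or closure
    generateSourceMaps: true,
    preserveLicenseComments: false    // license comments must be dropped for source maps
})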

Not sure if this question is outdated already, but I had a somewhat similar problem.
I had a similar project structure, with one difference: I split the project into two modules:
one of them (let's call it service) was a Java module for the back end;
the second one contained only JS and other front-end assets (let's call it ui).
Then, in the Gradle build, the 'assemble' task of service depends on the 'assemble' task of ui AND on another custom task called 'pre-assemble'. This 'pre-assemble' task copies the optimized JS files to the place where I want them to be.
So, basically, I just added another task responsible for placing all the optimized JS files in the proper place.

Related

Building monorepo babel-transpiled node JS application with dependencies

I am working on a project that is hosted as a monorepo. For simplicity, let's say it contains three self-explanatory packages: server, a webapp client, and library. The directory structure is something like the following:
the-project
|- packages
   |- server
   |  |- src
   |- webapp
   |  |- src
   |- library
      |- src
All packages employ Flow type annotations, use a few beyond-ES5 features and, for this reason, go through Babel transpilation. The key difference is that transpilation of the webapp package is done via webpack, whereas server employs a gulp task that triggers script transpilation through the gulp-babel package. library is transpiled automatically when webapp is built.
Now, the problem I have is that for server to build, Babel requires library to be built first, with its package.json pointing at its (built) main JS file, so that its transpiled artifacts can be resolved. As you can imagine, this quickly becomes problematic if the project contains multiple libraries that are actively being developed (which it does), as all of them require building, along with any dependent packages (like server in this simple case).
As an attempt to overcome this annoyance, I initially thought of using webpack to build the server, which would take care of including whatever dependencies it requires into a bundle, but I ran into issues, as apparently webpack is not meant to be used for Node.js applications.
What strategies are available for building a node JS application requiring Babel transpilation, such that the application's source files as well as any dependencies are built transparently and contained in a single output directory?
Annex A
Simplified gulp task for transpilation of scripts, as employed by server:
const gulp = require('gulp');
const babel = require('gulp-babel');

gulp.task('transpile', () =>
    gulp
        .src(['src/**/*.js'], { allowEmpty: true })
        .pipe(babel({ sourceMap: true }))
        .pipe(gulp.dest('dist')));
As can be seen above, only server's own source files are included in the task. If src were changed to also include library, the task would emit the dependencies' artifacts into server's own output directory, and any require('library') statements would attempt to resolve the built artifacts in packages/library rather than packages/server/dist, resulting in import failures.
First of all, I am not sure what your server is doing. If it is doing database connections or heavy computation, then I would not recommend building it with webpack. Whereas if your server is just doing server-side rendering and making some API calls to other servers, then I would recommend bundling it with webpack.
A lot of projects follow this philosophy. For example, you can take a look at something similar I have done in one of my personal projects [Blubus]. Specifically, you might be interested in webpack-server-config. You can also take a look at how big projects like spectrum do it.
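For illustration, the usual way to bundle a Node server with webpack is to set target: 'node' and keep node_modules out of the bundle with webpack-node-externals; a minimal sketch, with the entry and output paths assumed:
// webpack.server.js (a sketch; paths are assumptions)
const path = require('path');
const nodeExternals = require('webpack-node-externals');

module.exports = {
    target: 'node',                    // don't shim Node built-ins like fs or path
    entry: './src/index.js',
    output: {
        path: path.resolve(__dirname, 'dist'),
        filename: 'server.js'
    },
    externals: [nodeExternals()],      // leave node_modules as runtime requires
    module: {
        rules: [{ test: /\.js$/, exclude: /node_modules/, use: 'babel-loader' }]
    }
};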

Angular2 deploying to production environment questions

Some questions about putting an Angular2 web project into a production environment:
We do development on lite-server, but what is best for production? Is it some other Node.js server module? Technically we can have any server (Apache, Tomcat, etc.).
How should we do source code management, given the following context:
the browser must include JS files, so the project should contain JS files when deployed
in a standard Java project we just commit .java files and use Jenkins (or other tools) to compile and produce the deployable structure
Should we follow the same strategy here, i.e. not commit the compiled JS files, and deploy using some Node-based compiler that takes the TS files and compiles them to JS?
What is the best way to minify/obfuscate the JS files?
I know a way using outDir and outFile with Grunt, but I don't want every file to be included in one minified file, because that kills the concept of lazy loading.
Is there a way to minify and obfuscate the JS files at compile time only?
What does enableProdMode() do? How is it different from not using it?
Here are some answers to your questions:
Angular2 applications consist only of static files, so they can be served by any static web server or by server applications that can expose static folders (Express, ...).
Regarding source code management, you must have a packaging phase to optimize application loading (gather files, uglify, ...). Your source code must contain your TypeScript files (or JS files if you use ES5 or ES6). Such packaging can be done using Gulp, for example. Your Jenkins server will be able to check out the source code, build it, and execute the tests.
In fact, when not using the outFile property of the TypeScript compiler, you won't be able to gather all the compiled JS files into a single one, since anonymous modules will be created within each JS file.
See this question for more details of this:
How do I actually deploy an Angular 2 + Typescript + systemjs app?
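As an illustration of the packaging phase mentioned above, a minimal Gulp uglify step might look like this sketch (the paths and plugin choice are assumptions); it minifies each compiled file in place, preserving the module layout for lazy loading:
const gulp = require('gulp');
const uglify = require('gulp-uglify');

gulp.task('package', () =>
    gulp
        .src('dist/**/*.js')     // the compiled JS output (assumed location)
        .pipe(uglify())
        .pipe(gulp.dest('build')));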
Regarding prod mode, here is an extract from the documentation:
Disable Angular's development mode, which turns off assertions and other checks within the framework.
One important assertion this disables verifies that a change detection pass does not result in additional changes to any bindings (also known as unidirectional data flow).
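As a sketch, switching prod mode on at bootstrap typically looks like this; the module path and the production flag are assumptions (in practice the flag would be injected by your build):
import { enableProdMode } from '@angular/core';
import { platformBrowserDynamic } from '@angular/platform-browser-dynamic';
import { AppModule } from './app/app.module';   // assumed module path

const production = true;  // assumption: normally set by the build pipeline
if (production) {
    enableProdMode();     // disables assertions and the second change-detection pass
}
platformBrowserDynamic().bootstrapModule(AppModule);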

How do I set up a Dojo build process with multiple applications?

I have a single-page Dojo (1.8) application, built on top of Colin Snover's Dojo Boilerplate, and it builds and works well. Now I've expanded the website into multiple pages, some of which have other Dojo applications. It works well from the source directories, but the build process doesn't pick up the additional files and thus the installed website is broken.
I need to update the build process so that it optimizes and copies all of the files, but I can't figure out where I should be adding the additional references.
(I've gone through lots of Dojo documentation, but it tends to focus on the details of the trees, or even the tree branches, without saying just what the forest looks like.)
The original boilerplate file tree is as follows:
/build.sh: the bash-based build script, which at its core runs the build tool under node.js
/profiles/app.profile.js: the "application build profile", handed to the build script with the --profile option
/webroot/: the root web server directory, containing:
/dijit/, /dojo/, /dojox/, /util/: the standard Dojo source directories
/app/: the application directory, containing
main.js: the main entry point for the app, which requires everything and then parses the DOM to instantiate the various app objects
run.js: some fundamental require()ments, handed to the build tool with the --require option
(the rest of the app's code)
The build tool is invoked from /webroot/util/buildscripts/ as follows:
node ../../dojo/dojo.js load=build --require ../../app/run.js --profile ../../../profiles/app
I've now added two new applications: one hosted in /webroot/info.html with source in /webroot/info/, and the other in /webroot/licenses.html with source in /webroot/licenses/ (both apps have run.js and main.js based on the initial boilerplate files). The new apps use the various Dojo tools, as well as some of the classes in /webroot/app/*.
But, where do I add references to these new apps so that the build process Does The Right Thing? Here are some possibilities I've come up with:
Add new --require newApp/run.js options to the build tool
Add new profiles, included by additional --profile newApp.profile.js options to the build tool
Add new "layers" to the existing app.profile.js file
Run the build tool multiple times, each time configured for one of the apps, trusting it to properly merge the files into the destination directory (I doubt this would work, but I've considered it...)
So, where do I go from here?
The simplest approach is to create one bash file per application, which you can still collapse down to a single script via bash variables passed through from the command line ($1, $2, ...).
So basically, you copy build.sh into each app directory, adjust the paths, and then create a master shell script that calls each app's build.sh.
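Alternatively, option 3 from the question (adding layers to the existing profile) keeps everything in a single build; a rough sketch, assuming module ids that match the directory names above:
var profile = {
    basePath: '../webroot',
    releaseDir: '../release',
    layers: {
        'app/main':      { include: ['app/main'] },
        'info/main':     { include: ['info/main'] },
        'licenses/main': { include: ['licenses/main'] }
    }
};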

Hot Towel: Why is Durandal and Require in the App folder rather than the Script folder?

This is coming from the idea that 3rd-party libraries belong in Scripts to discourage developers from customizing them; it encourages them to write extensions instead, making it easier to take in a new version of either library.
You make a good point about other developers mistaking the durandal libraries for customizable files.
But you are not required to keep Durandal anywhere in particular. The folder structure can be whatever your heart desires, because Durandal does not impose any folder structure; it only has a recommended default setup. There are benefits to following its pattern.
Keeping Durandal as part of your application root folder keeps all your AMD JavaScript files together in one root folder. This way, when you run the Durandal optimizer, it can scan every subfolder to compress/minify/uglify all your HTML/CSS/JS into one file. This is a nice benefit, because it gives you a one-click build of your entire application.
It's also a nice separation, because it's a good idea to keep your 3rd-party non-AMD JavaScript libraries in a separate folder structure. That way, if you use a bundler to compress all your third-party libraries into a separate file, the browser can cache your application separately from the third-party libraries. The third-party libraries don't change very often, whereas your application will probably change frequently.
But Durandal's conventions are all completely configurable, and you can put Durandal in any location you like.
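For example, if you did move the Durandal modules under Scripts, pointing the loader at them is just path configuration; a sketch, with the paths assumed:
requirejs.config({
    paths: {
        'text': '../Scripts/text',
        'durandal': '../Scripts/durandal',
        'plugins': '../Scripts/durandal/plugins',
        'transitions': '../Scripts/durandal/transitions'
    }
});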
This is a convention that Durandal has decided to use to help keep your custom client code organized in an App folder, away from the 3rd-party Scripts folder, which gets pretty messy pretty quickly. It puts require.js in the App folder because of the way it relies on require.js and its AMD pattern; require.js is used to locate all modules (in your App folder) and load them as needed.
Is there something specific that you need that this is preventing?

Building (preparing) node.js application for production (deploy)

I have a project that consists of several Node.js backend applications. The apps use the same modules (which are placed outside each app folder, in a shared location). The apps are to be deployed to different environments (servers); some code is for test, some for debug, as usual.
If I chose a platform (for example the PaaS Nodejitsu) for one of my apps, how am I supposed to send only that app's production code there? I deployed on Nodejitsu, and it just sends the app folder and uses package.json to configure the app. But there is a bunch of code that is not needed (tests, for example), and some code is external. And what if I want to obfuscate the server code too? How are these issues supposed to be solved?
Front-end applications have tons of build methods for production. I understand that the requirements are different, but I didn't find any information on best practices for how to correctly prepare a Node.js back-end application for deployment.
Read the section "Keeping files out of your package" on the npm Developers page. It states the following:
Use a .npmignore file to keep stuff out of your package. If there's no .npmignore file, but there is a .gitignore file, then npm will ignore the stuff matched by the .gitignore file. If you want to include something that is excluded by your .gitignore file, you can create an empty .npmignore file to override it.
Add those test files to .gitignore,
or make another branch for production in git and push the production branch.
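For illustration, a minimal .npmignore for a layout like this might look as follows (the directory names are assumptions):
test/
docs/
coverage/
*.test.js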
