Building NodeJS using Gradle - node.js

I'm very new to Gradle. I started reading about it yesterday. I found an example build.gradle that builds a node application. I'm a little bit confused about the contents of the file. I'm not sure which strings are reserved or predefined words. One of those strings is node. It isn't used anywhere else, but I figured out it is needed by the node plugin.
buildscript {
    repositories {
        mavenCentral()
        maven {
            url 'https://plugins.gradle.org/m2/'
        }
    }
    dependencies {
        classpath 'com.moowork.gradle:gradle-node-plugin:1.2.0'
    }
}

apply plugin: 'base'
apply plugin: 'com.moowork.node' // gradle-node-plugin

node {
    /* gradle-node-plugin configuration
       https://github.com/srs/gradle-node-plugin/blob/master/docs/node.md
       Task name pattern:
       ./gradlew npm_<command> Executes an NPM command.
    */

    // Version of node to use.
    version = '10.14.1'

    // Version of npm to use.
    npmVersion = '6.4.1'

    // If true, it will download node using above parameters.
    // If false, it will try to use globally installed node.
    download = true
}

npm_run_build {
    // make sure the build task is executed only when appropriate files change
    inputs.files fileTree('public')
    inputs.files fileTree('src')

    // 'node_modules' appeared not reliable for dependency change detection (the task was rerun without changes)
    // though 'package.json' and 'package-lock.json' should be enough anyway
    inputs.file 'package.json'
    inputs.file 'package-lock.json'

    outputs.dir 'build'
}

// pack output of the build into JAR file
task packageNpmApp(type: Zip) {
    dependsOn npm_run_build
    baseName 'npm-app'
    extension 'jar'
    destinationDir file("${projectDir}/build_packageNpmApp")
    from('build') {
        // optional path under which output will be visible in Java classpath, e.g. static resources path
        into 'static'
    }
}

// declare a dedicated scope for publishing the packaged JAR
configurations {
    npmResources
}

configurations.default.extendsFrom(configurations.npmResources)

// expose the artifact created by the packaging task
artifacts {
    npmResources(packageNpmApp.archivePath) {
        builtBy packageNpmApp
        type 'jar'
    }
}

assemble.dependsOn packageNpmApp

String testsExecutedMarkerName = "${projectDir}/.tests.executed"

task test(type: NpmTask) {
    dependsOn assemble

    // force Jest test runner to execute tests once and finish the process instead of starting watch mode
    environment CI: 'true'

    args = ['run', 'test']
    inputs.files fileTree('src')
    inputs.file 'package.json'
    inputs.file 'package-lock.json'

    // allows easy triggering re-tests
    doLast {
        new File(testsExecutedMarkerName).text = 'delete this file to force re-execution JavaScript tests'
    }

    outputs.file testsExecutedMarkerName
}

check.dependsOn test

clean {
    delete packageNpmApp.archivePath
    delete testsExecutedMarkerName
}
Also, how is the build.gradle file parsed? And how is it able to magically download the node and npm tools?

This is a very general synopsis:
Gradle aims to hide away the logic from developers.
Most *.gradle files contain configuration blocks (closures) to specify HOW logic should run.
Plugins augment gradle with more configurable logic.
Also, 'convention over configuration' is a practice emphasized in Gradle and its plugins, providing sensible defaults to minimize developers' configuration efforts.
The com.moowork.node plugin is configured through the node extension block.
Extension blocks are gradle's way to allow plugins to add more 'reserved' words to the standard gradle model.
The download = true configuration tells the plugin to download node (version = '10.14.1') and npm (npmVersion = '6.4.1') into your project's root (unless you override its defaults as well).
The download of these tools will occur when any of the plugin's tasks is invoked.
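To make the 'extension block' idea concrete, here is a minimal sketch of how a plugin could add its own configurable block to a build script. The GreetingExtension/GreetingPlugin names are made up for illustration and are not part of gradle-node-plugin:

class GreetingExtension {
    // sensible default -- 'convention over configuration'
    String message = 'hello'
}

class GreetingPlugin implements Plugin<Project> {
    void apply(Project project) {
        // registering the extension is what makes a 'greeting { ... }' block legal in the build script
        def greeting = project.extensions.create('greeting', GreetingExtension)
        project.task('greet') {
            doLast { println greeting.message }
        }
    }
}

apply plugin: GreetingPlugin

greeting {
    message = 'configured, not coded'
}

Running ./gradlew greet would then print the configured message; the node { } block in your snippet works the same way, just with the NodeExtension object the plugin registers under the name node.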
Hope this helps.

In your snippet only true is a keyword; the other things are methods or getters coming from Gradle or the Node.js plugin:
apply plugin: ... is a method from org.gradle.api.Project.apply(java.util.Map<String, ?>)
node is a method autogenerated by Gradle with signature void node(Closure<com.moowork.gradle.node.NodeExtension>) (method accepting a code block), see https://github.com/srs/gradle-node-plugin/blob/master/src/main/groovy/com/moowork/gradle/node/NodeExtension.groovy
node { version = ... } - version, npmVersion are fields from NodeExtension class
Other things work similarly; everything is a method or a field. If you're using IntelliJ IDEA, use Ctrl+mouse click to navigate to the originating method/field declaration.
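Put differently, the node { } block is just syntactic sugar for configuring the extension object the plugin registered. A rough sketch of the equivalence (assuming the com.moowork.node plugin from the question is applied):

// the configuration block from the question...
node {
    version = '10.14.1'     // NodeExtension.version
    npmVersion = '6.4.1'    // NodeExtension.npmVersion
    download = true
}

// ...configures the same extension object you can also reach explicitly:
def nodeExt = project.extensions.getByName('node')
nodeExt.version = '10.14.1'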

Related

Could not find org.nodejs:x64

When I try to build my Frontend app I get
Could not resolve all dependencies for configuration ':frontend:nodeDist'.
> Could not find org.nodejs:x64/node:6.17.1.
Searched in the following locations:
http://nodejs.org/dist/v6.17.1/ivy.xml
http://nodejs.org/dist/v6.17.1/x64/node.exe
On another machine it downloads the node without issue from http://nodejs.org/dist/v6.17.1/win-x64/node.exe
Why does my machine not create the same download url with win-x64 instead of x64?
my build.gradle contains:
buildscript {
    dependencies {
        classpath 'com.moowork.gradle:gradle-node-plugin:0.10'
    }
}

apply plugin: 'com.moowork.node'

node {
    version = '6.17.1'
    npmVersion = '2.10.1'
    download = true
}
Upgrading the versions is not an option in my case.

How to import a node module inside an angular web worker?

I try to import a node module inside an Angular 8 web worker, but get a compile error 'Cannot find module'. Does anyone know how to solve this?
I created a new worker inside my electron project with ng generate web-worker app, as described in the above-mentioned ng documentation.
Everything works fine until I add an import like path or fs-extra, e.g.:
/// <reference lib="webworker" />

import * as path from 'path';

addEventListener('message', ({ data }) => {
    console.log(path.resolve('/'))
    const response = `worker response to ${data}`;
    postMessage(response);
});
This import works fine in any other TS component, but inside the web worker I get a compile error with this message:
Error: app/app.worker.ts:3:23 - error TS2307: Cannot find module 'path'.
How can I fix this? Maybe I need some additional parameter in the generated tsconfig.worker.json?
To reproduce the error, run:
$ git clone https://github.com/hoefling/stackoverflow-57774039
$ cd stackoverflow-57774039
$ yarn build
Or check out the project's build log on Travis.
Note:
1) I only found this as a similar problem, but the answer handles only custom modules.
2) I tested the same import with a minimal electron seed which uses web workers and it worked, but this example uses plain JavaScript without Angular.
1. TypeScript error
As you've noticed the first error is a TypeScript error. Looking at the tsconfig.worker.json I've found that it sets types to an empty array:
{
  "compilerOptions": {
    "types": [],
    // ...
  }
  // ...
}
Specifying types turns off the automatic inclusion of @types packages, which is a problem in this case because path has its type definitions in @types/node.
So let's fix that by explicitly adding node to the types array:
{
  "compilerOptions": {
    "types": [
      "node"
    ],
    // ...
  }
  // ...
}
This fixes the TypeScript error; however, trying to build again we're greeted with a very similar error, this time from Webpack directly.
2. Webpack error
ERROR in ./src/app/app.worker.ts (./node_modules/worker-plugin/dist/loader.js!./src/app/app.worker.ts)
Module build failed (from ./node_modules/worker-plugin/dist/loader.js):
ModuleNotFoundError: Module not found: Error: Can't resolve 'path' in './src/app'
To figure this one out we need to dig quite a lot deeper...
Why it works everywhere else
First it's important to understand why importing path works in all the other modules. Webpack has the concept of targets (web, node, etc). Webpack uses this target to decide which default options and plugins to use.
Ordinarily the target of an Angular application using @angular-devkit/build-angular:browser would be web. However in your case, the postinstall:electron script actually patches node_modules to change that:
postinstall.js (parts omitted for brevity)
const f_angular = 'node_modules/@angular-devkit/build-angular/src/angular-cli-files/models/webpack-configs/browser.js';

fs.readFile(f_angular, 'utf8', function (err, data) {
    var result = data.replace(/target: "electron-renderer",/g, '');
    var result = result.replace(/target: "web",/g, '');
    var result = result.replace(/return \{/g, 'return {target: "electron-renderer",');
    fs.writeFile(f_angular, result, 'utf8');
});
The target electron-renderer is treated by Webpack similarly to node. Especially interesting for us: it adds the NodeTargetPlugin by default.
What does that plugin do, you wonder? It adds all known built in Node.js modules as externals. When building the application, Webpack will not attempt to bundle externals. Instead they are resolved using require at runtime. This is what makes importing path work, even though it's not installed as a module known to Webpack.
Why it doesn't work for the worker
The worker is compiled separately using the WorkerPlugin. In their documentation they state:
By default, WorkerPlugin doesn't run any of your configured Webpack plugins when bundling worker code - this avoids running things like html-webpack-plugin twice. For cases where it's necessary to apply a plugin to Worker code, use the plugins option.
Looking at the usage of WorkerPlugin deep within @angular-devkit we see the following:
@angular-devkit/src/angular-cli-files/models/webpack-configs/worker.js (simplified)
new WorkerPlugin({
    globalObject: false,
    plugins: [
        getTypescriptWorkerPlugin(wco, workerTsConfigPath)
    ],
})
As we can see it uses the plugins option, but only for a single plugin which is responsible for the TypeScript compilation. This way the default plugins, configured by Webpack, including NodeTargetPlugin get lost and are not used for the worker.
Solution
To fix this we have to modify the Webpack config. And to do that we'll use @angular-builders/custom-webpack. Go ahead and install that package.
Next, open angular.json and update projects > angular-electron > architect > build:
"build": {
"builder": "#angular-builders/custom-webpack:browser",
"options": {
"customWebpackConfig": {
"path": "./extra-webpack.config.js"
}
// existing options
}
}
Repeat the same for serve.
Now, create extra-webpack.config.js in the same directory as angular.json:
const WorkerPlugin = require('worker-plugin');
const NodeTargetPlugin = require('webpack/lib/node/NodeTargetPlugin');

module.exports = (config, options) => {
    let workerPlugin = config.plugins.find(p => p instanceof WorkerPlugin);
    if (workerPlugin) {
        workerPlugin.options.plugins.push(new NodeTargetPlugin());
    }
    return config;
};
The file exports a function which will be called by @angular-builders/custom-webpack with the existing Webpack config object. We can then search all plugins for an instance of the WorkerPlugin and patch its options, adding the NodeTargetPlugin.

Writing Node.js applications in Kotlin with IntelliJ IDEA CE

I am trying to develop a Node.js application using Kotlin 1.3.11 in the IntelliJ IDEA CE development environment. Unfortunately I haven't made any progress towards a running application. To ensure everything is set up correctly, I want to print out a simple "hello world".
I searched for articles or tutorials about the topic but I didn't find much about bringing those three together (Kotlin, IntelliJ, Nodejs). The most specific ones which I found are:
a medium post and another post.
As far as I (believe to) know, there are three major steps:
initializing the node app via npm and using npm to install the node dependencies like kotlin and expressjs
creating a build.gradle to define other dependencies and tasks
creating an IntelliJ IDEA project
I tried to perform the steps in different orders but I never got to a running application. I also searched IntelliJ's documentation, but the Node.js integration isn't a feature of the free Community Edition, and there isn't a description of how to make Kotlin and Node.js work together either.
Has anyone here successfully tried to do that (or failed and knows why it is not going to work)? Do I have to use another IDE or to write my own build tools/toolchain?
Sincerely J.
I haven't done this in IDEA CE, but theoretically, this should work.
Prerequisites: You have node installed, you can execute gradle tasks
This is a minimal configuration; there is a more comprehensive one. Add a comment if you're interested in that.
Step 1: Create a new Kotlin/JS project (with Gradle) and make sure that your Gradle build file looks like this:
group 'node-example'
version '1.0-SNAPSHOT'

buildscript {
    ext.kotlin_version = '1.3.11'
    repositories {
        mavenCentral()
    }
    dependencies {
        classpath "org.jetbrains.kotlin:kotlin-gradle-plugin:$kotlin_version"
    }
}

apply plugin: 'kotlin2js'

repositories {
    mavenCentral()
}

dependencies {
    compile "org.jetbrains.kotlin:kotlin-stdlib-js:$kotlin_version"
}

compileKotlin2Js.kotlinOptions {
    moduleKind = "commonjs"
    outputFile = "node/index.js"
}

task npmInit(type: Exec) {
    commandLine "npm", "init", "-y"
}

task npmInstall(type: Exec) {
    commandLine "npm", "install", "kotlin", "express", "--save"
}

task npmRun(type: Exec) {
    commandLine "node", "node/index.js"
}

npmRun.dependsOn(build)
Step 2: After syncing your build.gradle in step 1 run the gradle tasks npmInit and npmInstall
./gradlew :npmInit
./gradlew :npmInstall
Step 3:
Create your kotlin file (index.kt/main.kt/whatever.kt) in src/main/kotlin and test the code below
external fun require(module: String): dynamic

fun main(args: Array<String>) {
    println("Hello JavaScript!")

    val express = require("express")
    val app = express()

    app.get("/", { req, res ->
        res.type("text/plain")
        res.send("Kotlin/JS is kool")
    })

    app.listen(3000, {
        println("Listening on port 3000")
    })
}
Step 4: RTFA - Run The App
Run the gradle task npmRun
./gradlew :npmRun
Hope that helps
Note:
1. This template was pulled from the medium post you asked about above and modified a little.
2. Remember to run your gradle tasks using sudo (if you are using Linux).
Edit: Alternatively, you could clone https://github.com/miquelbeltran/kotlin-node.js and follow the instructions in the read me.
I managed to get the Medium post to work by replacing gradle build with the following (since the post was published in 2017(!) and requires a much older version of Gradle):
Comment out the entire contents of build.gradle like so:
/*group 'node-example'
...
compileKotlin2Js.kotlinOptions {
    moduleKind = "commonjs"
    outputFile = "node/index.js"
}*/
Run this command in the command prompt: (3.4.1 was the latest version of Gradle just before the Medium post was published.)
gradle wrapper --gradle-version=3.4.1
Uncomment out build.gradle:
group 'node-example'
...
compileKotlin2Js.kotlinOptions {
    moduleKind = "commonjs"
    outputFile = "node/index.js"
}
Run this command in place of gradle build:
gradlew build
And finally run this command as in the post: (As of writing this answer on StackOverflow, Node.js does not need to be downgraded and the current LTS version 10.16.0 works perfectly.)
node node/index.js

How to detect when dependency library version updates exist in build.gradle in AndroidStudio project

I've an android project with two modules (typical front-end app and backend). I have three build.gradle files, one in each module and one in the root.
The way I've structured my dependencies is by extracting all the versions into separate variables in the root level build.gradle as such
ext {
    // SDK and tools
    MIN_SDK_VERSION = 19
    TARGET_SDK_VERSION = 23
    COMPILE_SDK_VERSION = 23
    BUILD_TOOLS_VERSION = '24'

    // app dependencies
    GOOGLE_API_CLIENT_VERSION = '1.19.0'
    GOOGLE_PLAY_SERVICES_VERSION = '8.4.0'
    ANDROID_SUPPORT_LIB_VERSION = '23.1.0'
    [...]

    // backend dependencies
    [...]
}
which are later used in, say, my app build.gradle file as such:
dependencies {
    [...]
    compile(group: 'com.google.oauth-client', name: 'google-oauth-client', version: rootProject.ext.GOOGLE_API_CLIENT_VERSION)

    /////////////////////////////////
    // Google Play Services explicit dependency
    compile(group: 'com.google.android.gms', name: 'play-services-auth', version: rootProject.ext.GOOGLE_PLAY_SERVICES_VERSION)
    compile(group: 'com.google.android.gms', name: 'play-services-plus', version: rootProject.ext.GOOGLE_PLAY_SERVICES_VERSION)
    [...]

    /////////////////////////////////
    // Local Testing
    testCompile(group: 'junit', name: 'junit', version: rootProject.ext.JUNIT_VERSION)
    testCompile(group: 'pl.pragmatists', name: 'JUnitParams', version: rootProject.ext.JUNIT_PARAMS_VERSION)
    [...]
}
NOTE: I found that idea in a tutorial somewhere and I thought it was very nifty.
However, I'm struggling to keep track of which lib versions are available, what is upgradable, etc. It is becoming hard to keep track of these things as I have a reasonably sized list of dependencies. Curious how others have approached this problem. Thanks.
A summary of how I manage dependencies.
All dependency definitions are defined in gradle/dependencies.gradle, which is applied to all projects and the buildscript. I usually break it into three categories. A full example is here.
ext {
    versions = [
        caffeine: '2.3.1',
    ]

    test_versions = [
        testng: '6.9.12',
    ]

    plugin_versions = [
        versions: '0.13.0',
    ]

    libraries = [
        caffeine: "com.github.ben-manes.caffeine:caffeine:${versions.caffeine}",
    ]

    test_libraries = [
        testng: dependencies.create("org.testng:testng:${test_versions.testng}") {
            exclude group: 'junit'
        },
    ]

    gradle_plugins = [
        versions: "com.github.ben-manes:gradle-versions-plugin:${plugin_versions.versions}",
    ]
}
Then in my root project I bootstrap it as,
buildscript {
    apply from: "${rootDir}/gradle/dependencies.gradle"

    repositories {
        jcenter()
    }

    dependencies {
        gradle_plugins.each { name, dependency -> classpath dependency }
    }
}

allprojects {
    apply from: "${rootDir}/gradle/dependencies.gradle"

    repositories {
        jcenter()
    }
}
This allows defining the dependency in a project as,
dependencies {
    compile libraries.caffeine
}
To detect newer versions I wrote the gradle-versions-plugin. That generates a report by querying the repositories for the version information and comparing it to your definitions. I run it manually every so often, but others script it in their CI and use the JSON or XML reports.
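For reference, a minimal sketch of enabling it in the root build.gradle above; the buildscript block already puts gradle-versions-plugin on the classpath via gradle_plugins, and the plugin id and task name below are the ones documented in that plugin's README:

// root build.gradle, after the buildscript/allprojects blocks shown above
apply plugin: 'com.github.ben-manes.versions'

Running ./gradlew dependencyUpdates then queries the declared repositories and reports which entries in versions, test_versions and plugin_versions have newer releases available.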
There are a few other approaches that were developed after I wrote my plugin. Spring's dependency-management-plugin works with Maven BOMs, as does Netflix's nebula-dependency-recommender-plugin. Netflix uses gradle-dependency-lock-plugin to define dynamic versions and generate a lock file to fix a release. There are also dependency version alerting services, though a simple CI job is likely equivalent.
I've never used any of the alternatives as they seem less intuitive (to me), came out years after I had a nice solution, and it is a familiar approach if you come from Maven. Hopefully someone else can shed light on the benefits of other approaches.

Module missing in Android Studio and Gradle sync fails

I have set up a brand new project in Android Studio 1.1 RC 1:
Created an Android project [app] (because there is no way to create an App Engine backend project right away).
Added an existing backend module by first creating a new App Engine module and then manually importing the files [backend].
Removed the Android app module [app].
Added a Java library module, same procedure, first creating a new module, then importing files [common].
Everything compiles fine, but Android Studio has two problems:
When I look at Project Structure, the [common] module is missing in the left pane, but it still appears as referenced module in the right pane!?
My Project tree looks fine and all modules are recognized, but gradle is telling me the sync failed.
Gradle says "Task '' not found in root project" ('' is an empty string, it seems). I get a warning and an exception in the log when running from the terminal, but it doesn't seem to be related (it is about indexing), so I haven't included it here.
settings.gradle has both modules specified:
include ':backend', ':common'
I tried to exchange the .iml file of the main project with a fake one which contains both modules, with the result that (besides multiple side effects) both modules were there. (I restored the original state because of the side-effects.)
Here are my gradle files:
Root module:
buildscript {
    repositories {
        jcenter()
    }
    dependencies {
        classpath 'com.android.tools.build:gradle:1.0.1'
    }
}

allprojects {
    repositories {
        jcenter()
    }
}
[backend]
buildscript {
    repositories {
        jcenter()
    }
    dependencies {
        classpath 'com.google.appengine:gradle-appengine-plugin:1.9.17'
    }
}

repositories {
    jcenter()
}

apply plugin: 'java'
apply plugin: 'war'
apply plugin: 'appengine'

sourceCompatibility = JavaVersion.VERSION_1_7
targetCompatibility = JavaVersion.VERSION_1_7

dependencies {
    appengineSdk 'com.google.appengine:appengine-java-sdk:1.9.17'
    compile 'com.google.appengine:appengine-endpoints:1.9.17'
    compile 'com.google.appengine:appengine-endpoints-deps:1.9.17'
    compile 'javax.servlet:servlet-api:2.5'
    compile 'com.googlecode.objectify:objectify:5.1.3'
    compile 'com.squareup.retrofit:retrofit:1.9.0'
    compile 'io.jsonwebtoken:jjwt:0.4'
    compile project(':common')
}

appengine {
    downloadSdk = true
    appcfg {
        oauth2 = true
    }
    endpoints {
        getClientLibsOnBuild = true
        getDiscoveryDocsOnBuild = true
    }
}
[common]
apply plugin: 'java'

task sourcesJar(type: Jar, dependsOn: classes) {
    classifier = 'sources'
    from sourceSets.main.allSource
}

artifacts {
    archives sourcesJar
}

dependencies {
    compile 'com.google.http-client:google-http-client-android:1.18.0-rc'
    compile 'com.google.code.gson:gson:2.3.1'
}

apply plugin: 'maven'

group = 'cc.closeup'
version = 'v2-2.0-SNAPSHOT'

install {
    repositories.mavenInstaller {
        pom.artifactId = 'common'
        pom.packaging = 'jar'
    }
}
Any ideas? Anything else that you'd like to see here?
If you want to build an App Engine project only, you could try this tutorial for IntelliJ IDEA: jetbrains.com/idea/help/creating-google-app-engine-project.html
My mistake was I removed [app]. It seems that if you create an App Engine backend module, you must keep a "fake" frontend module in the same project to keep Android Studio/gradle happy.
In earlier Android Studio versions it was possible to remove the frontend module without problems, but it seems Google has locked this somehow. It still works when I keep the fake frontend module.
--
Why I configured it this way? In my configuration, I have backend and frontend modules in different projects, and I have the backend project install libraries into local Maven, which I then pick up within my frontend project (with a team you would choose a local Maven server). This configuration has multiple advantages, for example that I can test backend/frontend on two screens simultaneously without switching back and forth all the time. Some companies may also want this configuration to keep their backend code separate and secure.
