I am using React Native and Node.js. I want to share code between the two. My folder structure is as follows:
myreactnativeapp/
mynodeserver/
myshared/
In the React Native and Node apps I have included the shared module in package.json:
"dependencies": {
  "myshared": "git+https://myrepository/ugoshared.git"
}
This can then be included in each project via require/import. This all works fine, and for production I'm happy with it. (Though I'd love to know a better way?)
The issue I'm facing is that in development it's really slow.
The steps for a change to propagate are:
Make changes in Shared
Commit Changes to git
Update the npm module
In development, I really want the same codebase to be used rather than this long update process. I tried the following:
Adding a symlink in node_modules/shared - doesn't work with the React Native packager
Using relative paths ../../../shared - doesn't work with the React Native packager
Any other ideas?
Update 1
I created a start.sh script which I run to copy the files into a local directory before the packager starts. It's not ideal, but at least I only have to restart the packager instead of messing with git etc.
#!/bin/bash
# myreactnativeapp/start.sh
SOURCE=../myshared
MODULE=myshared
rm -rf ./$MODULE
mkdir ./$MODULE
find $SOURCE -maxdepth 1 -name \*.js -exec cp -v {} "./$MODULE/" \;
# create the package.json
echo '{ "name": "'$MODULE'" }' > ./$MODULE/package.json
# start the packager
node node_modules/react-native/local-cli/cli.js start
Then in my package.json I update the script to
"scripts": {
"start": "./start.sh",
},
So, the process is now:
Make a change
Start/Restart the packager
Automatic:
Script copies all .js files under myshared/ -> myreactnativeapp/myshared/
Script creates a package.json with the name of the module
Because I've added a package.json with the name of the module to the copied files, in my project I can include the items the same way I would if the module were included via the package manager above. In theory, when I switch to using the package in production I won't have to change anything.
import MyModule from 'myshared/MyModule'
Update 2
My first idea got tiresome, restarting the packager all the time. Instead, I created a small Node script in the shared directory to watch for changes. Whenever there is a change, it copies the file to the React Native working directory.
var watch = require('node-watch')
var fs = require('fs')
var path = require('path')

let targetPath = '../myreactnativeapp/myshared/'

watch('.', { recursive: false, filter: /\.js$/ }, function(evt, name) {
  console.log('File changed: ' + name)
  // don't copy the watcher script itself
  if (path.basename(__filename) === name) {
    return
  }
  console.log(`Copying file: ${name} --> ${targetPath + name}`)
  fs.copyFile(name, targetPath + name, err => {
    if (err) {
      console.log('Error:', err)
      return
    }
    console.log('Success')
  })
})

console.log(`Starting to watch: ${__dirname}. All files will be copied to: ${targetPath}`)
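To avoid launching the watcher by hand each time, one small convenience (assuming the script above is saved as watch.js in myshared; that filename is my own) is to expose it as an npm script:

// myshared/package.json (watch.js is an assumed filename for the watcher above)
"scripts": {
  "watch": "node watch.js"
}

Then npm run watch in myshared keeps the copies in sync while the React Native packager runs in another terminal.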
I am trying to understand how to make a local module. At the root of the Node application, I have a directory named lib. Inside the lib directory I have a .js file which looks like:
var Test = function() {
return {
say : function() {
console.log('Good morning!');
}
}
}();
module.exports = Test;
I have modified my package.json with an entry of the path to the local module:
"dependencies": {
"chat-service": "^0.13.1",
"greet-module": "file:lib/Test"
}
Now, if I try to run a test script like:
var greet = require('greet-module');
console.log(greet.say());
it throws an error saying:
Error: Cannot find module 'greet-module'
What mistake am I making here?
modules.export is incorrect. It should be module.exports with an s.
Also, make sure that after you add the dependency you run an npm install. This will copy the file over to your node_modules and make it available to the require function.
See here for a good reference.
Update:
After going through some examples to figure this out, I noticed most projects have the structure I laid out below. You should probably format your local modules to be their own standalone packages, with their own folders and package.json files specifying their dependencies and name. Then you can include it with npm install -S lib/test.
It worked for me once I did it, and it'll be a good structure moving forward. Cheers.
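For illustration, a minimal sketch of such a standalone package, reusing the names from the question (the exact layout is my assumption, not the linked code):

lib/test/package.json:
{
  "name": "greet-module",
  "version": "1.0.0",
  "main": "index.js"
}

lib/test/index.js:
var Test = function() {
  return {
    say: function() {
      console.log('Good morning!');
    }
  };
}();
module.exports = Test;

After npm install -S lib/test, require('greet-module') resolves, because npm registers the package under the name in its package.json rather than the folder path.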
See here for the code.
I am trying to include a precompiled binary with an Electron app. I began with the Electron quick start app and modified my renderer.js file to include this code, which is triggered when a file is dropped on the body:
const spawn = require('child_process').spawn,
  ffmpeg = spawn('node_modules/.bin/ffmpeg', ['-i', clips[0], '-an', '-q:v', '1', '-vcodec', 'libx264', '-y', '-pix_fmt', 'yuv420p', '-vf', 'setsar=1,scale=trunc(iw/2)*2:trunc(ih/2)*2,crop=in_w:in_h-50:0:50', '/tmp/out21321.mp4']);
ffmpeg.stdout.on('data', data => {
console.log(`stdout: ${data}`);
});
ffmpeg.stderr.on('data', data => {
console.log(`stderr: ${data}`);
});
I have placed my precompiled ffmpeg binary in node_modules/.bin/. Everything works great in the dev panel, but when I use electron-packager to set up the app, it throws a spawn error ENOENT to the console when triggered. I did find a very similar question on SO, but the question doesn't seem to be definitively answered. The npm page on electron-packager does show that they can be bundled, but I cannot find any documentation on how to do so.
The problem is that electron-builder or electron-packager will bundle your dependencies into the asar file. It seems that if a dependency has a binary in node_modules/.bin, it is smart enough not to pack it.
This is the documentation for asar packaging for electron-builder on that topic. It says
Node modules, that must be unpacked, will be detected automatically
I understand that it is related to existing binaries in node_modules/.bin.
If the module you are using is not automatically unpacked you can disable asar archiving completely or explicitly tell electron-builder to not pack certain files. You do so in your package.json file like this:
"build": {
"asarUnpack": [
"**/app/node_modules/some-module/*"
],
For your particular case
I ran into the same issue with ffmpeg and this is what I've done:
Use ffmpeg-static. This package bundles statically compiled ffmpeg binaries for Windows, Mac and Linux. It also provides a way to get the full path of the binary for the OS you are running: require('ffmpeg-static').path
This will work fine in development, but we still need to troubleshoot the distribution problem.
Tell electron-builder to not pack the ffmpeg-static module:
"build": {
"asarUnpack": [
"**/app/node_modules/ffmpeg-static/*"
],
Now we need to slightly change the code to get the right path to ffmpeg: require('ffmpeg-static').path.replace('app.asar', 'app.asar.unpacked') (in development the replace() won't replace anything, which is fine).
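Putting that together, a minimal sketch (the input/output file names are placeholders):

const spawn = require('child_process').spawn;

// ffmpeg-static resolves to a path inside app.asar once packaged,
// so point it at the unpacked copy. In development the replace() is a no-op.
const ffmpegPath = require('ffmpeg-static').path
  .replace('app.asar', 'app.asar.unpacked');

const ffmpeg = spawn(ffmpegPath, ['-i', 'in.mp4', 'out.mp4']);
ffmpeg.stderr.on('data', data => console.log(`stderr: ${data}`));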
If you are using webpack (or another JavaScript bundler)
I ran into the issue that require('ffmpeg-static').path was returning a relative path in the renderer process. The issue seemed to be that webpack changes the way the module is required, and that prevents ffmpeg-static from providing a full path. In the Dev Tools, require('ffmpeg-static').path worked fine when run manually, but when doing the same in the bundled code I always got a relative path. So this is what I did.
In the main process, add this before opening the BrowserWindow: global.ffmpegpath = require('ffmpeg-static').path.replace('app.asar', 'app.asar.unpacked'). The code that runs in the main process is not bundled by webpack, so I always get a full path with this code.
In the renderer process pick the value this way: require('electron').remote.getGlobal('ffmpegpath')
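As a rough sketch of the two halves (assuming an Electron version where remote is still available):

// main.js (main process - not bundled by webpack, so the full path survives)
global.ffmpegpath = require('ffmpeg-static').path
  .replace('app.asar', 'app.asar.unpacked');
// ...then create the BrowserWindow as usual

// renderer.js (bundled by webpack)
var ffmpegPath = require('electron').remote.getGlobal('ffmpegpath');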
I know I'm a bit late, but I just wanted to mention the ffbinaries npm package I created a while ago exactly for this purpose.
It allows you to download ffmpeg/ffplay/ffserver/ffprobe binaries to a specified location, either during application boot (so you don't need to bundle them with your application) or in a CI setup. It can autodetect the platform; you can also specify it manually.
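Usage looks roughly like this (a sketch; the destination directory is my own choice, and the exact options are in the package README):

var ffbinaries = require('ffbinaries');

// download ffmpeg and ffprobe for the current platform into ./bin
ffbinaries.downloadBinaries(['ffmpeg', 'ffprobe'], { destination: './bin' }, function (err) {
  if (err) return console.error(err);
  console.log('Binaries downloaded to ./bin');
});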
If anyone happens to need an answer to this question: I do have a solution, but I have no idea if it is considered best practice. I couldn't find any good documentation for including third-party precompiled binaries, so I just fiddled with it until it finally worked. Here's what I did (starting with the Electron quick start, Node.js v6):
From the app directory I ran the following commands to include the ffmpeg binary as a module:
mkdir node_modules/ffmpeg
cp /usr/local/bin/ffmpeg node_modules/ffmpeg/
ln -s ../ffmpeg/ffmpeg node_modules/.bin/ffmpeg
(Replace /usr/local/bin/ffmpeg with your current binary path; download it from here.) Placing the link allowed electron-packager to include the binary I saved to node_modules/ffmpeg/.
Then, to get the bundled app path, I installed the npm package app-root-dir by running the following command:
npm i -S app-root-dir
Since I could then get the app path, I just appended the subfolder for my binary and spawned from there. This is the code that I placed in renderer.js:
var appRootDir = require('app-root-dir').get();
var ffmpegpath = appRootDir + '/node_modules/ffmpeg/ffmpeg';
console.log(ffmpegpath);

const spawn = require('child_process').spawn,
  ffmpeg = spawn(ffmpegpath, ['-i', clips_input[0]]); // add whatever switches you need here

ffmpeg.stdout.on('data', data => {
  console.log(`stdout: ${data}`);
});

ffmpeg.stderr.on('data', data => {
  console.log(`stderr: ${data}`);
});
This is how I would do it:
Taking cues from tsuriga's answer, here is my code:
Note: replace or add OS paths accordingly.
Create a directory ./resources/mac/bin
Place your binaries inside this folder
Create file ./app/binaries.js and paste the following code:
'use strict';
import path from 'path';
import { remote } from 'electron';
import getPlatform from './get-platform';
const IS_PROD = process.env.NODE_ENV === 'production';
const root = process.cwd();
const { isPackaged, getAppPath } = remote.app;
const binariesPath =
IS_PROD && isPackaged
? path.join(path.dirname(getAppPath()), '..', './Resources', './bin')
: path.join(root, './resources', getPlatform(), './bin');
export const execPath = path.resolve(path.join(binariesPath, './exec-file-name'));
Create file ./app/get-platform.js and paste the following code:
'use strict';
import { platform } from 'os';
export default () => {
switch (platform()) {
case 'aix':
case 'freebsd':
case 'linux':
case 'openbsd':
case 'android':
return 'linux';
case 'darwin':
case 'sunos':
return 'mac';
case 'win32':
return 'win';
}
};
Add the following code inside the ./package.json file:
"build": {
....
"extraFiles": [
{
"from": "resources/mac/bin",
"to": "Resources/bin",
"filter": [
"**/*"
]
}
],
....
},
Import the binary file path as:
import { execPath } from './binaries';

// your program code:
var command = spawn(execPath, arg, {});
Why is this better?
Most of the answers require an additional package called app-root-dir.
The original answer doesn't handle the (env=production) build or the pre-packaged versions properly; it only takes care of development and post-packaged versions.
I see people using gulp with webpack. But then I read that webpack can replace gulp? I'm completely confused here... can someone explain?
UPDATE
In the end I started with gulp. I was new to modern front-end and just wanted to get up and running quickly. Now that I've got my feet quite wet after more than a year, I'm ready to move to webpack. I suggest the same route for people who start off in the same shoes. Not saying you can't try webpack, but if it seems complicated, start with gulp first...nothing wrong with that.
If you don't want gulp, yes, there's grunt, but you could also just specify commands in your package.json and call them from the command line without a task runner, just to get up and running initially. For example:
"scripts": {
"babel": "babel src -d build",
"browserify": "browserify build/client/app.js -o dist/client/scripts/app.bundle.js",
"build": "npm run clean && npm run babel && npm run prepare && npm run browserify",
"clean": "rm -rf build && rm -rf dist",
"copy:server": "cp build/server.js dist/server.js",
"copy:index": "cp src/client/index.html dist/client/index.html",
"copy": "npm run copy:server && npm run copy:index",
"prepare": "mkdir -p dist/client/scripts/ && npm run copy",
"start": "node dist/server"
},
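With this in place, npm run build chains clean, babel, prepare, and browserify in one go.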
This answer might help: Task Runners (Gulp, Grunt, etc.) and Bundlers (Webpack, Browserify). Why use together?
...and here's an example of using webpack from within a gulp task. This goes a step further and assumes that your webpack config is written in es6.
var gulp = require('gulp');
var webpack = require('webpack');
var gutil = require('gulp-util');
var path = require('path');
var babel = require('babel/register'); // allows the ES6 webpack config to be required
var config = require(path.join('../..', 'webpack.config.es6.js'));
gulp.task('webpack-es6-test', function(done){
webpack(config).run(onBuild(done));
});
function onBuild(done) {
return function(err, stats) {
if (err) {
gutil.log('Error', err);
if (done) {
done();
}
} else {
Object.keys(stats.compilation.assets).forEach(function(key) {
gutil.log('Webpack: output ', gutil.colors.green(key));
});
gutil.log('Webpack: ', gutil.colors.blue('finished ', stats.compilation.name));
if (done) {
done();
}
}
}
}
I think you'll find that as your app gets more complicated, you might want to use gulp with a webpack task as per the example above. This allows you to do a few more interesting things in your build that webpack loaders and plugins really don't do, i.e. creating output directories, starting servers, etc. Well, to be succinct, webpack actually can do those things, but you might find them limited for your long-term needs. One of the biggest advantages you get from gulp -> webpack is that you can customize your webpack config for different environments and have gulp do the right task at the right time. It's really up to you, but there's nothing wrong with running webpack from gulp; in fact, there are some pretty interesting examples of how to do it. The example above is basically from jlongster.
NPM scripts can do the same as gulp, but in about 50x less code. In fact, with no code at all, only command-line arguments.
Take, for example, the use case you described, where you want different code for different environments.
With Webpack + NPM Scripts, it's this easy:
"prebuild:dev": "npm run clean:wwwroot",
"build:dev": "cross-env NODE_ENV=development webpack --config config/webpack.development.js --hot --profile --progress --colors --display-cached",
"postbuild:dev": "npm run copy:index.html && npm run rename:index.html",
"prebuild:production": "npm run clean:wwwroot",
"build:production": "cross-env NODE_ENV=production webpack --config config/webpack.production.js --profile --progress --colors --display-cached --bail",
"postbuild:production": "npm run copy:index.html && npm run rename:index.html",
"clean:wwwroot": "rimraf -- wwwroot/*",
"copy:index.html": "ncp wwwroot/index.html Views/Shared",
"rename:index.html": "cd ../PowerShell && elevate.exe -c renamer --find \"index.html\" --replace \"_Layout.cshtml\" \"../MyProject/Views/Shared/*\"",
Now you simply maintain two webpack config scripts: one for development mode, webpack.development.js, and one for production mode, webpack.production.js. I also utilize a webpack.common.js which houses webpack config shared across all environments, and use webpack-merge to merge them.
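For reference, the merge step looks roughly like this (a sketch; the production-only settings shown are just examples):

// webpack.production.js
const webpackMerge = require('webpack-merge');
const commonConfig = require('./webpack.common.js');

module.exports = webpackMerge(commonConfig, {
  // production-only settings go here, e.g.:
  devtool: 'source-map'
});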
Because of the coolness of NPM scripts, they allow for easy chaining, similar to how gulp does streams/pipes.
In the example above, to build for development, you simply go to your command line and execute npm run build:dev.
NPM would first run prebuild:dev,
Then build:dev,
And finally postbuild:dev.
The pre and post prefixes tell NPM which order to execute the scripts in.
If you notice, with Webpack + NPM scripts you can run native programs, such as rimraf, instead of a gulp wrapper for a native program, such as gulp-rimraf. You can also run native Windows .exe files, as I did here with elevate.exe, or native *nix files on Linux or Mac.
Try doing the same thing with gulp. You'll have to wait for someone to come along and write a gulp wrapper for the native program you want to use. In addition, you'll likely need to write convoluted code like this (taken straight from the angular2-seed repo):
Gulp Development code
import * as gulp from 'gulp';
import * as gulpLoadPlugins from 'gulp-load-plugins';
import * as merge from 'merge-stream';
import * as util from 'gulp-util';
import { join/*, sep, relative*/ } from 'path';
import { APP_DEST, APP_SRC, /*PROJECT_ROOT, */TOOLS_DIR, TYPED_COMPILE_INTERVAL } from '../../config';
import { makeTsProject, templateLocals } from '../../utils';
const plugins = <any>gulpLoadPlugins();
let typedBuildCounter = TYPED_COMPILE_INTERVAL; // Always start with the typed build.
/**
* Executes the build process, transpiling the TypeScript files (except the spec and e2e-spec files) for the development
* environment.
*/
export = () => {
let tsProject: any;
let typings = gulp.src([
'typings/index.d.ts',
TOOLS_DIR + '/manual_typings/**/*.d.ts'
]);
let src = [
join(APP_SRC, '**/*.ts'),
'!' + join(APP_SRC, '**/*.spec.ts'),
'!' + join(APP_SRC, '**/*.e2e-spec.ts')
];
let projectFiles = gulp.src(src);
let result: any;
let isFullCompile = true;
// Only do a typed build every X builds, otherwise do a typeless build to speed things up
if (typedBuildCounter < TYPED_COMPILE_INTERVAL) {
isFullCompile = false;
tsProject = makeTsProject({isolatedModules: true});
projectFiles = projectFiles.pipe(plugins.cached());
util.log('Performing typeless TypeScript compile.');
} else {
tsProject = makeTsProject();
projectFiles = merge(typings, projectFiles);
}
result = projectFiles
.pipe(plugins.plumber())
.pipe(plugins.sourcemaps.init())
.pipe(plugins.typescript(tsProject))
.on('error', () => {
typedBuildCounter = TYPED_COMPILE_INTERVAL;
});
if (isFullCompile) {
typedBuildCounter = 0;
} else {
typedBuildCounter++;
}
return result.js
.pipe(plugins.sourcemaps.write())
// Use for debugging with Webstorm/IntelliJ
// https://github.com/mgechev/angular2-seed/issues/1220
// .pipe(plugins.sourcemaps.write('.', {
// includeContent: false,
// sourceRoot: (file: any) =>
// relative(file.path, PROJECT_ROOT + '/' + APP_SRC).replace(sep, '/') + '/' + APP_SRC
// }))
.pipe(plugins.template(templateLocals()))
.pipe(gulp.dest(APP_DEST));
};
Gulp Production code
import * as gulp from 'gulp';
import * as gulpLoadPlugins from 'gulp-load-plugins';
import { join } from 'path';
import { TMP_DIR, TOOLS_DIR } from '../../config';
import { makeTsProject, templateLocals } from '../../utils';
const plugins = <any>gulpLoadPlugins();
const INLINE_OPTIONS = {
base: TMP_DIR,
useRelativePaths: true,
removeLineBreaks: true
};
/**
* Executes the build process, transpiling the TypeScript files for the production environment.
*/
export = () => {
let tsProject = makeTsProject();
let src = [
'typings/index.d.ts',
TOOLS_DIR + '/manual_typings/**/*.d.ts',
join(TMP_DIR, '**/*.ts')
];
let result = gulp.src(src)
.pipe(plugins.plumber())
.pipe(plugins.inlineNg2Template(INLINE_OPTIONS))
.pipe(plugins.typescript(tsProject))
.once('error', function () {
this.once('finish', () => process.exit(1));
});
return result.js
.pipe(plugins.template(templateLocals()))
.pipe(gulp.dest(TMP_DIR));
};
The actual gulp code is much more complicated than this, as these are only 2 of the several dozen gulp files in the repo.
So, which one looks easier to you?
In my opinion, NPM scripts far surpass gulp and grunt in both effectiveness and ease of use, and all front-end developers should consider using them in their workflow because they are a major time saver.
UPDATE
There is one scenario I've encountered where I wanted to use Gulp in combination with NPM scripts and Webpack.
When I need to do remote debugging on an iPad or Android device, for example, I need to start up extra servers. In the past I ran all the servers as separate processes from within IntelliJ IDEA (or WebStorm), which is easy with the "Compound" Run Configuration. But if I needed to stop and restart them, it was tedious to have to close 5 different server tabs, plus the output was spread across the different windows.
One of the benefits of gulp is that it can chain all the output from separate independent processes into one console window, which becomes the parent of all the child servers.
So I created a very simple gulp task that just runs my NPM scripts or the commands directly, so all the output appears in one window, and I can easily end all 5 servers at once by closing the gulp task window.
Gulp.js
/**
* Gulp / Node utilities
*/
var gulp = require('gulp-help')(require('gulp'));
var utils = require('gulp-util');
var log = utils.log;
var con = utils.colors;
/**
* Basic workflow plugins
*/
var shell = require('gulp-shell'); // run command line from shell
var browserSync = require('browser-sync');
/**
* Performance testing plugins
*/
var ngrok = require('ngrok');
// Variables
var serverToProxy1 = "localhost:5000";
var finalPort1 = 8000;
// When the user enters "gulp" on the command line, the default task will automatically be called. This default task below, will run all other tasks automatically.
// Default task
gulp.task('default', function (cb) {
console.log('Starting dev servers!...');
gulp.start(
  'devserver:jit',
  'nodemon',
  'browsersync',
  'ios_webkit_debug_proxy',
  'ngrok-url'
  // 'vorlon',
  // 'remotedebug_ios_webkit_adapter'
);
});
gulp.task('nodemon', shell.task('cd ../backend-nodejs && npm run nodemon'));
gulp.task('devserver:jit', shell.task('npm run devserver:jit'));
gulp.task('ios_webkit_debug_proxy', shell.task('npm run ios-webkit-debug-proxy'));
gulp.task('browsersync', shell.task(`browser-sync start --proxy ${serverToProxy1} --port ${finalPort1} --no-open`));
gulp.task('ngrok-url', function (cb) {
  return ngrok.connect(finalPort1, function (err, url) {
    var site = url;
    log(con.cyan('ngrok'), '- serving your site from', con.yellow(site));
    cb();
  });
});
// gulp.task('vorlon', shell.task('vorlon'));
// gulp.task('remotedebug_ios_webkit_adapter', shell.task('remotedebug_ios_webkit_adapter'));
Still quite a bit of code just to run 5 tasks, in my opinion, but it works for the purpose. One caveat is that gulp-shell doesn't seem to run some commands correctly, such as ios-webkit-debug-proxy. So I had to create an NPM script that just executes the same command, and then it works.
So I primarily use NPM Scripts for all my tasks, but occasionally when I need to run a bunch of servers at once, I'll fire up my Gulp task to help out. Pick the right tool for the right job.
UPDATE 2
I now use a script called concurrently, which does the same thing as the gulp task above. It runs multiple CLI scripts in parallel and pipes them all to the same console window, and it's very simple to use. Once again, no code required (well, the code is inside the node_module for concurrently, but you don't have to concern yourself with that).
// NOTE: If you need to run a command with spaces in it, you need to use
// double quotes, and they must be escaped (at least on windows).
// It doesn't seem to work with single quotes.
"run:all": "concurrently \"npm run devserver\" nodemon browsersync ios_webkit_debug_proxy ngrok-url"
This runs all 5 scripts in parallel, piped out to one terminal. Awesome! So at this point, I rarely use gulp, since there are so many CLI scripts to do the same tasks with no code.
I suggest you read these articles which compare them in depth.
How to Use NPM as a Build Tool
Why we should stop using Grunt & Gulp
Why I Left Gulp and Grunt for NPM Scripts
I used both options in my different projects.
Here is one boilerplate that I put together using gulp with webpack - https://github.com/iroy2000/react-reflux-boilerplate-with-webpack.
I have other projects that use only webpack with npm tasks.
And they both work totally fine. I think it comes down to how complicated your task is, and how much control you want to have over your configuration.
For example, if your tasks are simple - say dev, build, test, etc. (which is very standard) - you are totally fine with just webpack and npm tasks.
But if you have a very complicated workflow and you want more control over your configuration (because it is coding), you could go the gulp route.
But from my experience, the webpack ecosystem provides more than enough plugins and loaders for what I need, so I love the bare-minimum approach unless there is something you can only do in gulp. Also, your configuration is easier with one less thing in your system.
And a lot of times, nowadays, I see people actually replacing gulp and Browserify altogether with webpack alone.
The concepts of Gulp and Webpack are quite different. You tell Gulp how to put front-end code together step-by-step, but you tell Webpack what you want through a config file.
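For example, a bare-bones webpack config just declares the entry point and the desired output, and webpack works out the steps itself (the file names here are assumptions):

// webpack.config.js
const path = require('path');

module.exports = {
  entry: './src/app.js',
  output: {
    path: path.resolve(__dirname, 'dist'),
    filename: 'app.bundle.js'
  }
};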
Here is a short article (5 min read) I wrote explaining my understanding of the differences: https://medium.com/@Maokai/compile-the-front-end-from-gulp-to-webpack-c45671ad87fe
Our company moved from Gulp to Webpack in the past year. Although it took some time, we figured out how to move all we did in Gulp to Webpack. So to us, everything we did in Gulp we can also do through Webpack, but not the other way around.
As of today, I'd suggest just using Webpack and avoiding a mixture of Gulp and Webpack, so you and your team do not need to learn and maintain both, especially because they require very different mindsets.
Honestly, I think the best is to use both:
Webpack for everything JavaScript-related.
Gulp for everything CSS-related.
I have yet to find a decent solution for packaging CSS with webpack, and so far I am happy using gulp for CSS and webpack for JavaScript.
I also use npm scripts as @Tetradev described, especially since I am using Visual Studio, and while NPM Task Runner is pretty reliable, Webpack Task Runner is pretty buggy.
I'm new to Sails.js and Node.js, and I'm having problems creating documentation for my application.
Here's my steps:
installed apidoc by:
npm install apidoc -g
installed grunt module:
npm install grunt-apidoc --save-dev
added grunt.loadNpmTasks('grunt-apidoc'); at the bottom of Gruntfile.js
added this to grunt.initConfig:
apidoc: {
myapp: {
src: "api/controllers/",
dest: "apidoc/"
}
}
Then I tried running several things, and none of them produces my API documentation:
sails lift
grunt
grunt default
node app.js
If I run apidoc manually with apidoc -i api/controllers/ -o apidoc/, it works properly.
What am I doing wrong? How to do it?
Super late answer!
From my experience modifying the asset pipeline, you'd be better off:
Install apidoc and the Grunt module as in the Question
Create a new file at tasks/config/apidoc.js:
module.exports = function (grunt) {
grunt.config.set('apidoc', {
myapp: {
src: "api/controllers/",
dest: "apidoc/"
}
});
grunt.loadNpmTasks('grunt-apidoc');
};
Edit tasks/register/compileAssets.js (or wherever you want the task to be run):
module.exports = function (grunt) {
grunt.registerTask('compileAssets', [
'clean:dev',
'jst:dev',
'less:dev',
'copy:dev',
'coffee:dev',
'apidoc:myapp' // <-- This will now run every time your assets are compiled
]);
};
Hope this helps someone