I am using Angular 2 with Electron and want to keep a process running in the background to show notifications. I am using forever-monitor for that; it works in development mode, but when I package my app using electron-packager, it stops working. My code looks like this:
main.ts
exports.runBackgroundProcess = () => {
  // Run a background process forever
  var forever = require('forever-monitor');
  var child = new (forever.Monitor)('src/assets/notification-process.js', {
    env: { ELECTRON_RUN_AS_NODE: 1 },
    options: []
  });
  child.start();
}
I wrote a function in main.ts that runs the background process when called from an Angular component. The code in notification-process.js is the following:
notification-process.js
const notifier = require('node-notifier');

const notifierFun = (msg) => {
  notifier.notify({
    title: 'Notify Me',
    message: msg,
    wait: true
  });
};

const CronJob = require('cron').CronJob;

const job = new CronJob('* * * * * *', function () {
  notifierFun('Message from notification process');
});
job.start(); // start the scheduled job
Finally, I am calling the function from app.component.ts:
let main_js = this.electronService.remote.require("./main.js");
main_js.runBackgroundProcess();
I don't think it is a good idea to put your script in the assets directory.
I would prefer it to be packaged as an extra resource.
The following snippet will launch your node process:
var child_process = require('child_process');

var child = child_process.fork('notification-process.js', [], {
  cwd: 'resources'
});
If it does not work once packaged, it may be because your files have not been included in the package. To package them as an extra resource, modify package.json as follows; this will package the webserver folder into the resources/webserver folder:
"target": [
"win": {
"target": "nsis",
"icon": "build/icon.ico",
"extraResources" : [{
"from" : "webserver",
"to" : "webserver"}
]
},
For reference, have a look at:
https://nodejs.org/api/child_process.html#child_process_child_process_fork_modulepath_args_options
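If the packaged app's working directory is not what you expect, it may also help to resolve the script via Electron's process.resourcesPath instead of a relative cwd. This is only a sketch, assuming the script ends up directly under resources as in the fork snippet above:
var path = require('path');
var child_process = require('child_process');
var app = require('electron').app;

// Packaged builds keep extra resources under process.resourcesPath;
// in development the script still sits next to the project files.
var resourceDir = app.isPackaged ? process.resourcesPath : __dirname;

var child = child_process.fork(path.join(resourceDir, 'notification-process.js'));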
That's how it worked:
1- Moved notification-process.js file from assets folder to main directory.
2- Changed file path in main.js:
var child = new (forever.Monitor)(path.join(__dirname, 'notification-process.js')...
Without using join, it doesn't work after packaging the app.
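Put together, the adjusted function in main.js looks roughly like this (the same code as in the question, with the script moved next to main.js and the path built via path.join):
const path = require('path');

exports.runBackgroundProcess = () => {
  // Run a background process forever; notification-process.js now sits next to main.js
  const forever = require('forever-monitor');
  const child = new (forever.Monitor)(
    path.join(__dirname, 'notification-process.js'),
    {
      env: { ELECTRON_RUN_AS_NODE: 1 },
      options: []
    }
  );
  child.start();
};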
I've written a quick Electron Forge app that simply runs an express webserver that serves static files locally. I prefer this to running a node process directly for usability.
main.js
import { app, BrowserWindow } from 'electron';
import express from 'express';
const exApp = express();
exApp.use(express.static('web-app'));
exApp.listen(3333);
let mainWindow;
const createWindow = () => {
// Create the browser window.
mainWindow = new BrowserWindow({
// ...
});
// ...
};
// ...
I use the CopyWebpackPlugin to copy the files I need to serve into the .webpack/main/web-app/ directory.
webpack.main.config.js
const path = require('path');
const CopyPlugin = require('copy-webpack-plugin');

module.exports = {
  /**
   * This is the main entry point for your application, it's the first file
   * that runs in the main process.
   */
  entry: './src/main.js',
  // Put your normal webpack config below here
  module: {
    rules: require('./webpack.rules'),
  },
  plugins: [
    new CopyPlugin([
      { from: path.resolve(__dirname, 'web-app'), to: 'web-app' }
    ]),
  ]
};
This works perfectly in development (via yarn start).
When I try to run yarn make, it successfully builds the app and generates a runnable exe, but trying to access http://localhost:3333/ after running the app results in a Cannot GET / 404 message.
Any idea what I'm doing wrong?
What I was serving in development was actually the web-app directory relative to the node process, and not the one copied into .webpack/....
To serve the proper files in production, I changed the exApp.use() line to:
exApp.use(express.static('resources/app/.webpack/main/web-app'));
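A hard-coded resources/... path only works once packaged; a variant that resolves the static directory against __dirname covers both development and production. This is a sketch, assuming the Forge webpack plugin copies web-app next to the main bundle as configured above:
import path from 'path';
import express from 'express';

const exApp = express();

// __dirname points at .webpack/main in development and at
// resources/app/.webpack/main in the packaged app, so the copied
// web-app folder is always located next to this bundle.
exApp.use(express.static(path.join(__dirname, 'web-app')));
exApp.listen(3333);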
I am trying to get web workers up and running with Vue CLI 3, and I'm having trouble getting it to work.
I want to use the worker-loader package (and not vue-worker), as it looks well maintained and has more contributors.
Following its tutorial, I attempted to modify webpack through the Vue CLI as follows:
module.exports = {
  chainWebpack: config => {
    config.module
      .rule('worker-loader')
      .test(/\.worker\.js$/)
      .use('worker-loader')
      .loader('worker-loader')
      .end()
  }
}
which I hope matches their
{
  module: {
    rules: [
      {
        test: /\.worker\.js$/,
        use: { loader: 'worker-loader' }
      }
    ]
  }
}
which can be read here (https://github.com/webpack-contrib/worker-loader). I tried to follow the documentation for vue cli3 as best I could (found here: https://cli.vuejs.org/guide/webpack.html#simple-configuration).
My component is pretty simple:
import Worker from 'worker-loader!./../../sharedComponents/equations/recurringTimeComposer.js';

<...>

watch: {
  recurringPaymentReturnObj: function (newVal, oldVal) {
    const myWorker = new Worker();

    myWorker.postMessage({ hellothere: 'sailor' });

    myWorker.onmessage = (e) => {
      console.log('value of e from message return', e.data);
    };
  }
}

<...>
and in my ./../../sharedComponents/equations/recurringTimeComposer.js file I have:
onmessage = function(e) {
console.log('Message received from main script: ', e.data);
// var workerResult = 'Result: ' + e.data;
// console.log('Posting message back to main script');
postMessage('hello back handsome');
close();
}
I keep getting the error message:
ReferenceError: window is not defined a162426ab2892af040c5.worker.js:2:15
After some googling I came across this post: https://github.com/webpack/webpack/issues/6642, which suggests that the best way to fix this is to add the following to webpack:
output: {
  path: path.join(__dirname, 'dist'),
  filename: 'bundle.js',
  publicPath: 'http://localhost:3000',
  globalObject: 'this'
},
After modifying my vue.config.js file I have:
const path = require('path');

module.exports = {
  chainWebpack: config => {
    config.module
      .rule('worker-loader')
      .test(/\.worker\.js$/)
      .use('worker-loader')
      .loader('worker-loader')
      .end()

    config.output
      .path(path.join(__dirname, 'dist'))
      .filename('bundle.js')
      .publicPath('http://localhost:8080')
      .globalObject('this')
  }
}
...but still I am getting the window is not defined error.
Does anyone know what is going wrong? It seems to be a weird error in webpack.
Thanks!
EDIT: oh yeah, here is the MDN page for webworker as well: https://developer.mozilla.org/en-US/docs/Web/API/Web_Workers_API/Using_web_workers.
Being new to Javascript I kept coming back to this issue when trying to use web workers with VueJS. I never managed to make it work with vue-worker or worker-loader.
It is now 2020 and Google has released worker-plugin.
To use it create a module my-worker with two files index.js and worker.js.
index.js creates the module:
const worker = new Worker('./worker.js', { type: 'module' });
const send = message => worker.postMessage({
message
})
export default {
worker,
send
}
worker.js contains the logic:
import _ from 'lodash'
addEventListener("message", async event => {
let arrayToReverse = event.data.message.array
let reversedArray = _.reverse(arrayToReverse)
// Send the reversed array
postMessage(reversedArray)
});
You will also need to update your vue.config.js to use the WorkerPlugin:
const WorkerPlugin = require('worker-plugin')
module.exports = {
configureWebpack: {
output: {
globalObject: "this"
},
plugins: [
new WorkerPlugin()
]
}
};
Now you can use your worker in your components:
Import it with import worker from '@/my-worker'.
Set up a listener in the mounted() lifecycle hook with worker.worker.onmessage = event => { /* do something when receiving postMessage */ }.
Start the worker with worker.send(payload), as in the sketch below.
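Putting those steps together, a component might look like this (a minimal sketch; the payload shape matches the array-reversing worker above):
// MyComponent.vue, <script> section
import worker from '@/my-worker'

export default {
  mounted () {
    // Listen for results posted back by the worker.
    worker.worker.onmessage = event => {
      console.log('Reversed array from worker:', event.data)
    }
    // Kick off the work.
    worker.send({ array: [1, 2, 3, 4] })
  }
}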
I set up a starter code on github. I still haven't managed to make HMR work though...
This works for me (note the first line):
config.module.rule('js').exclude.add(/\.worker\.js$/)

config.module
  .rule('worker-loader')
  .test(/\.worker\.js$/)
  .use('worker-loader')
  .loader('worker-loader')
The first line excludes *.worker.js files from the regular js rule, so two loaders won't fight over the same file.
Is this what you need? Vue issue with worker-loader
Updating from the classic Vue & webpack config, I found out that to make this one work, I needed to deactivate parallelization:
// vue.config.js
module.exports = {
  parallel: false,
  chainWebpack: (config) => {
    config.module
      .rule('worker')
      .test(/\.worker\.js$/)
      .use('worker-loader')
      .loader('worker-loader')
      .end();
  }
};
I tried to add a web worker to a vue-cli4 project, and here is what I found:
Using worker-loader with config in chainWebpack:
HMR works fine, but the sourcemap is broken; it shows the Babel-transformed code.
Using worker-plugin as @braincoke mentioned:
HMR is broken, but the sourcemap works fine, and ESLint breaks (the suggestion being to disable ESLint for all worker js files instead).
Finally, my solution was to toss vue-cli away and embrace Vite. It supports workers natively, and everything just works now. (I think upgrading webpack to v5 could solve this, but I never tried.)
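For what it's worth, a worker import under Vite looks roughly like this (a sketch based on Vite's ?worker import suffix; file and variable names are illustrative):
// component.js — Vite bundles the worker file automatically
import MyWorker from './recurringTimeComposer.js?worker'

const worker = new MyWorker()
worker.onmessage = e => console.log('from worker:', e.data)
worker.postMessage({ hellothere: 'sailor' })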
I am attempting to re-run my gulp build when gulpfile.js changes, but I am having issues with the method all of my research has led me to.
I have one watcher for all my less and javascript files and a configuration object that has the list of files to watch, how they are output, etc. This is a stripped-down example of what it looks like:
var gulp = require('gulp');
var spawn = require('child_process').spawn;
var $ = require('gulp-load-plugins')();

var config = {
  root: rootPath, // rootPath defined elsewhere
  output: {
    app: 'app',
    vendor: 'vendor'
  }, // ...
};
gulp.task('default', ['build', 'watch']);
gulp.task('build', ['clean', 'less:app', 'less:theme', 'css:vendor', 'js:app', 'js:vendor', 'rev', 'css:copyfonts']);
gulp.task('watch', function () {
var allFiles = config.styles.appSrc
.concat(config.styles.vendorSrc)
.concat(config.scripts.appSrc)
.concat(config.scripts.vendorSrc);
$.watch(allFiles, function () {
gulp.start('default');
});
});
gulp.task('watch:gulp', function () {
var p;
gulp.watch('gulpfile.js', spawnUpdatedGulp);
spawnUpdatedGulp();
function spawnUpdatedGulp() {
if (p) {
p.kill();
}
p = spawn('gulp', ['default', '--color'], { stdio: 'inherit' });
}
});
// .. other build tasks ..
The above code shows how I tried the accepted answer to this:
How can Gulp be restarted upon each Gulpfile change?
However, it has a major issue. When I run watch:gulp, it runs the build just fine, and everything is great. The config.output.app variable is how the app specific css and js files are named, so my test case has been:
run watch:gulp, check that the css output is named according to config.output.app
change config.output.app, and perform step #1 again
save any random javascript file that it is watching, and see if it builds correctly
Step 3 is riddled with permission errors because of multiple watchers on the files, and this only gets worse the more I repeat steps 1 and 2. Visual Studio will even freeze.
I have not found a way to clean up the old watchers. I tried to manually kill them like this:
var appFileWatcher;
gulp.task('watch', function () {
var allFiles = config.styles.appSrc
.concat(config.styles.vendorSrc)
.concat(config.scripts.appSrc)
.concat(config.scripts.vendorSrc);
appFileWatcher = $.watch(allFiles, function () {
gulp.start('default');
});
});
gulp.task('watch:gulp', function () {
var p;
var gulpWatcher = $.watch('gulpfile.js', spawnUpdatedGulp);
spawnUpdatedGulp();
function spawnUpdatedGulp() {
if (p) {
p.kill();
}
if (appFileWatcher) {
appFileWatcher.unwatch();
}
gulpWatcher.unwatch();
p = spawn('gulp', ['default', '--color'], { stdio: 'inherit' });
}
});
This also does not work. I still get multiple watchers trying to perform the build when I perform my same test case.
How do I kill those watchers that stay around after the new gulp process is spawned?
I am trying to use gulp-requirejs to build a demo project. I expect the result to be a single file with all js dependencies and templates included. Here is my gulpfile.js:
var gulp = require('gulp');
var rjs = require('gulp-requirejs');
var paths = {
scripts: ['app/**/*.js'],
images: 'app/img/**/*'
};
gulp.task('requirejsBuild', function() {
rjs({
name: 'main',
baseUrl: './app',
out: 'result.js'
})
.pipe(gulp.dest('app/dist'));
});
// The default task (called when you run `gulp` from cli)
gulp.task('default', ['requirejsBuild']);
The above build file runs with no errors, but result.js only contains the content of main.js and config.js. The view files, jquery, underscore, and backbone are not included.
How can I configure gulp-requirejs to put all the js files and templates into one js file?
If this is not the right way to go, can you please suggest another method?
Edit
config.js
require.config({
paths: {
"almond": "/bower_components/almond/almond",
"underscore": "/bower_components/lodash/dist/lodash.underscore",
"jquery": "/bower_components/jquery/dist/jquery",
"backbone": "/bower_components/backbone/backbone",
"text":"/bower_components/requirejs-text/text",
"book": "./model-book"
}
});
main.js
// Break out the application running from the configuration definition to
// assist with testing.
require(["config"], function() {
// Kick off the application.
require(["app", "router"], function(app, Router) {
// Define your master router on the application namespace and trigger all
// navigation from this instance.
app.router = new Router();
// Trigger the initial route and enable HTML5 History API support, set the
// root folder to '/' by default. Change in app.js.
Backbone.history.start({ pushState: false, root: '/' });
});
});
The output is just a combination of these two files, which is not what I expected.
gulp-requirejs has been blacklisted by the gulp folks. They see the RequireJS optimizer as its own build system, incompatible with gulp. I don't know much about that, but I did find an alternative in amd-optimize that worked for me.
npm install amd-optimize --save-dev
Then in your gulpfile:
var amdOptimize = require('amd-optimize');
var concat = require('gulp-concat');
gulp.task('bundle', function ()
{
return gulp.src('**/*.js')
.pipe(amdOptimize('main'))
.pipe(concat('main-bundle.js'))
.pipe(gulp.dest('dist'));
});
The output of amdOptimize is a stream which contains the dependencies of the primary module (main in the above example) in an order that resolves correctly when loaded. These files are then concatenated together via concat into a single file main-bundle.js before being written into the dist folder.
You could also minify this file and perform other transformations as needed.
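For example, minification could be dropped into the same pipeline (a sketch assuming the gulp-uglify plugin; the task name is illustrative):
var uglify = require('gulp-uglify');

gulp.task('bundle-min', function () {
  return gulp.src('**/*.js')
    .pipe(amdOptimize('main'))
    .pipe(concat('main-bundle.js'))
    .pipe(uglify())           // minify the concatenated bundle
    .pipe(gulp.dest('dist'));
});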
As an aside, in my case I was compiling TypeScript into AMD modules for bundling. Thinking this through further I realized that when bundling everything I don't need the asynchronous loading provided by AMD/RequireJS. I am going to experiment with having TypeScript compile CommonJS modules instead, then bundling them using webpack or browserify, both of which seem to have good support within gulp.
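As a rough illustration of that idea, a CommonJS bundle via browserify inside gulp could look like this (a sketch assuming the browserify and vinyl-source-stream packages and a hypothetical ./app/main.js entry point):
var gulp = require('gulp');
var browserify = require('browserify');
var source = require('vinyl-source-stream');

gulp.task('bundle-cjs', function () {
  // Bundle CommonJS modules starting from the entry file into one script.
  return browserify('./app/main.js')
    .bundle()
    .pipe(source('main-bundle.js'))
    .pipe(gulp.dest('dist'));
});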
UPDATE
My previous answer always reported taskReady even if requirejs reported an error. I reconsidered this approach and added error logging. I also try to fail the build completely, as described in gulp-jshint: How to fail the build?, because a silent failure really eats your time.
See updated code below.
Drew's comment about the blacklist was very helpful, and the gulp folks suggest using requirejs directly. So I post my direct requirejs solution:
var gulp = require('gulp');
var requirejs = require('requirejs');

var DIST = './dist/';
var requirejsConfig = require('./requireConfig.js').RJSConfig;

gulp.task('requirejs', function (taskReady) {
  requirejsConfig.name = 'index';
  requirejsConfig.out = DIST + 'app.js';
  requirejsConfig.optimize = 'uglify';

  requirejs.optimize(requirejsConfig, function () {
    taskReady();
  }, function (error) {
    console.error('requirejs task failed', JSON.stringify(error));
    process.exit(1);
  });
});
The file at ./dist/app.js is built and uglified. This way gulp will know when requirejs has finished building, so the task can be used as a dependency.
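For example (gulp 3 style, to match the answer; the copy step and task name are purely illustrative):
// Runs only after the 'requirejs' task has called taskReady().
gulp.task('deploy', ['requirejs'], function () {
  return gulp.src(DIST + 'app.js')
    .pipe(gulp.dest('./public/js'));
});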
My solution works like this:
./client/js/main.js:
require.config({
paths: {
jquery: "../vendor/jquery/dist/jquery",
...
},
shim: {
...
}
});
define(["jquery"], function($) {
console.log($);
});
./gulpfile.js:
var gulp = require('gulp'),
....
amdOptimize = require("amd-optimize"),
concat = require('gulp-concat'),
...
gulp.task('scripts', function() {
  return gulp.src(path.scripts + '.js')
    .pipe(cached('scripts'))
    .pipe(jshint())
    .pipe(jshint.reporter('default'))
    .pipe(remember('scripts'))
    .pipe(amdOptimize("main",
      {
        name: "main",
        configFile: "./client/js/main.js",
        baseUrl: './client/js'
      }
    ))
    .pipe(concat('main.js'))
    .pipe(gulp.dest(path.destScripts));
});
...
This part was important:
configFile: "./client/js/main.js",
baseUrl: './client/js'
This allowed me to keep my configuration in one place. Otherwise I was having to duplicate my paths and shims into gulpfile.js.
This works for me. It seems that one ought to add uglification etc. via gulp if desired: .pipe(uglify()) ...
Currently I have to duplicate the config in main.js to run asynchronously.
....
var amdOptimize = require("amd-optimize");
...
var js = gulp.src(path.scripts + '.js')
.pipe(cached('scripts'))
.pipe(jshint())
.pipe(jshint.reporter('default'))
.pipe(remember('scripts'))
.pipe(amdOptimize("main",
{
name: "main",
paths: {
jquery: "client/vendor/jquery/dist/jquery",
jqueryColor: "client/vendor/jquery-color/jquery.color",
bootstrap: "client/vendor/bootstrap/dist/js/bootstrap",
underscore: "client/vendor/underscore-amd/underscore"
},
shim: {
jqueryColor : {
deps: ["jquery"]
},
bootstrap: {
deps: ["jquery"]
},
app: {
deps: ["bootstrap", "jqueryColor", "jquery"]
}
}
}
))
.pipe(concat('main.js'));
Try this code in your gulpfile:
// Node modules
var
fs = require('fs'),
vm = require('vm'),
merge = require('deeply');
// Gulp and plugins
var
gulp = require('gulp'),
gulprjs= require('gulp-requirejs-bundler');
// Config
var
requireJsRuntimeConfig = vm.runInNewContext(fs.readFileSync('app/config.js') + '; require;'),
requireJsOptimizerConfig = merge(requireJsRuntimeConfig, {
name: 'main',
baseUrl: './app',
out: 'result.js',
paths: {
requireLib: 'bower_modules/requirejs/require'
},
insertRequire: ['main'],
// aliases from config.js - libs will be included to result.js
include: [
'requireLib',
"almond",
"underscore",
"jquery",
"backbone",
"text",
"book"
]
});
gulp.task('requirejsBuild', ['component-scripts', 'external-scripts'], function (cb) {
return gulprjs(requireJsOptimizerConfig)
.pipe(gulp.dest('app/dist'));
});
Sorry for my English. This solution works for me. (I used gulp-requirejs at my job.)
I think you've forgotten to set mainConfigFile in your gulpfile.js, so this code should work:
gulp.task('requirejsBuild', function() {
  rjs({
    name: 'main',
    mainConfigFile: 'path_to_config/config.js',
    baseUrl: './app',
    out: 'result.js'
  })
  .pipe(gulp.dest('app/dist'));
});
In addition, I think that when you run that task in gulp, require cannot find its config file, and this is not gulp-requirejs's fault.
The reason only main.js and config.js end up in the output is that you're not requiring/defining any other files. Without doing so, the require optimizer won't understand which files to add; the paths in your config file are not a way to require them!
For example, you could load a main.js file from your config file and define all your files in main (not optimal, but just an example).
In the bottom of your config-file:
// Load the main app module to start the app
requirejs(["main"]);
The main.js file (just adding jquery to show the technique):
define(["jquery"], function($) {});
I might also recommend gulp-requirejs-optimize instead, mainly because it adds the minification/obfuscation functions gulp-requirejs lacks: https://github.com/jlouns/gulp-requirejs-optimize
How to implement it:
var requirejsOptimize = require('gulp-requirejs-optimize');
gulp.task('requirejsoptimize', function () {
return gulp.src('src/js/require.config.js')
.pipe(requirejsOptimize(function(file) {
return {
baseUrl: "src/js",
mainConfigFile: 'src/js/require.config.js',
paths: {
requireLib: "vendor/require/require"
},
include: "requireLib",
name: "require.config",
out: "dist/js/bundle2.js"
};
})).pipe(gulp.dest(''));
});
I've been trying to use requirejs and js-test-driver side by side, and I can't seem to get it working.
I have a minimal configuration like this at the root:
server: http://localhost:9876
load:
- src/main/resources/web/resources/vendor/requirejs/require.js
test:
- src/test/js/*.js
A "src/main/js/greeter.js" file defines a silly module :
define(function(require) {
  var myapp = {};

  myapp.Greeter = function() {
  };

  myapp.Greeter.prototype.greet = function(name) {
    return "Hello " + name + "!";
  };

  return myapp;
});
And I'm trying to let require load the greeter module in "src/test/js/greeterTest.js":
GreeterTest = TestCase("GreeterTest");
require.configure({ ???? });
require([ "src/main/js/greeter" ], function(myapp) {
GreeterTest.prototype.testGreet = function() {
var greeter = new myapp.Greeter();
assertEquals("Hello World!", greeter.greet("World"));
};
});
Whenever I run this, I get an error because myapp is not defined when creating the Greeter.
So is there a way I can configure require in this case? I tried:
setting the baseUrl property to something like file:///.... to give the location of the file
using the gateway configuration to proxy the requests that requirejs might do (although I have no server running to serve the js files, so again I had to use file:///)
Is there something else I should try ?
Thanks
Turns out it is possible, but poorly documented:
js-test-driver has a 'serve' setting that lets the test server serve static files;
once served, the files are available at localhost:42442/test/xxxx (the 'test' prefix is never mentioned, except in a comment buried low in the doc page).
So this works:
server: http://localhost:9876
load:
- src/main/resources/web/resources/vendor/requirejs/require.js
test:
- src/test/js/*.js
serve:
- src/main/js/*.js
And requirejs has to be configured like this:
require({
  baseUrl : "/test/src/main/js"
}, [ "greeter" ], function(myapp) {
  GreeterTest = TestCase("GreeterTest");

  GreeterTest.prototype.testGreet = function() {
    var greeter = new myapp.Greeter();
    assertEquals("Hello World!", greeter.greet("World"));
  };
});
Notice:
the leading / before test in the baseUrl;
the fact that you have to reuse the "src/main/js" part. I guess that is linked to the fact that my jsTestDriver config is located at the same level as the "src" folder, and it might need some tweaking if placed elsewhere.