Inconsistently getting Error: [$injector:modulerr] Failed to instantiate module - node.js

I'm getting inconsistent results from Angular/Karma/Jasmine. When I run 'npm test', I get:
INFO [karma]: Karma v0.10.10 server started at http://localhost:9876/
INFO [launcher]: Starting browser Chrome
INFO [Chrome 35.0.1916 (Linux)]: Connected on socket aW0Inld7aRhONC2vo04k
Chrome 35.0.1916 (Linux): Executed 1 of 1 SUCCESS (0.345 secs / 0.016 secs)
Then, if I just save the code or test file (no changes), it will sometimes produce the same result and sometimes give errors:
INFO [watcher]: Removed file "/home/www/raffler/entries.js".
Chrome 35.0.1916 (Linux) Raffler controllers RafflerCtrl should start with an empty masterList FAILED
Error: [$injector:modulerr] Failed to instantiate module raffler due to:
Error: [$injector:nomod] Module 'raffler' is not available! You either misspelled the module name or forgot to load it. If registering a module ensure that you specify the dependencies as the second argument.
This is WITHOUT MAKING CHANGES. If I stop Karma and restart it, it works again, but once it fails, it always fails. What gives? Buggy Angular/Jasmine/Karma? The code and test are trivial. Here is the code:
var myApp = angular.module('raffler', []);

myApp.controller('RafflerCtrl', function($scope, $log) {
  $scope.$log = $log;
  $scope.masterList = [];
});
And here is the test:
'use strict';

describe('Raffler controllers', function() {
  describe('RafflerCtrl', function() {
    var scope, ctrl;

    beforeEach(module('raffler'));

    beforeEach(inject(function($controller) {
      scope = {};
      ctrl = $controller('RafflerCtrl', {$scope: scope});
    }));

    it('should start with an empty masterList', function() {
      expect(scope.masterList.length).toBe(0);
    });
  });
});
Am I doing something dumb? Seems like it should give me consistent results, regardless of my stupidity level... Thanks.

You were asking if there was a bug. There is. The authors of Karma know there are problems with file watching; see this issue: https://github.com/karma-runner/karma/issues/974
Simply saving the file without changes can trigger this behavior. There are two main ways that files get saved. The first is to delete the original (or rename it to .bak or something) and then write out the new content. The second is to write the new content to a temporary file, delete the original, and then move the temporary file to where the original used to be. For both of those, the file system monitoring can fire an event saying that some files/directories changed, and Node is quick enough to detect that the file was removed and tell Karma to stop using it in its tests. A slightly less common third way is to open the file in a special way and overwrite its contents in place, which keeps Karma happy.
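Here is a minimal sketch (my own illustration, not from the post, and the exact events are platform-dependent) of why the save strategy matters to a watcher: overwriting a file in place fires a 'change' event, while deleting and re-creating it fires 'rename' and leaves the watcher bound to a file that no longer exists.

const fs = require('fs');

fs.writeFileSync('watched.txt', 'original');
const watcher = fs.watch('watched.txt', (eventType) => {
  console.log('event:', eventType); // 'change' or 'rename'
});

setTimeout(() => {
  // In-place overwrite (the third save method): the watcher sees 'change'.
  fs.writeFileSync('watched.txt', 'overwritten');
}, 100);

setTimeout(() => {
  // Delete + re-create (the first save method): the watcher sees 'rename'
  // and, since it was bound to the old file, typically never notices the
  // replacement -- the same way Karma loses track of the file.
  fs.unlinkSync('watched.txt');
  fs.writeFileSync('watched.txt', 'replaced');
}, 200);

setTimeout(() => watcher.close(), 300);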
Back to that bug ... in the above scenario, the globs are not re-evaluated when file system changes are detected. So Karma thinks your file was removed, and it never sees that a new file was added; the file is now out of the test suite.
This bug is bothering me too. If you've got an idea or a pull request, I'd suggest providing it to the Karma team. There's already a patch under review that should address these problems - see https://github.com/karma-runner/karma/issues/1123.
As a workaround, you can use "set backupcopy=yes" in vim. There may be settings in other editors to change the save behavior so that the file is overwritten in place instead of replaced.

Related

gulp-eslint not outputting to file - unable to properly configure writableStream

Issue - I cannot get the output to print to a file for the gulp lint process.
Documentation Reference - The documentation notes that a writable stream is a valid configuration for the output, but regrettably it does not clarify how to set this up. I have tried the solution below, along with others, to no avail, so I am seeking any insight / support that can be provided.
Observations - Other users had published guidance suggesting a stream similar to this, but when attempting it, two things are observed:
The IntelliJ IDE notes that the parameter "writable" should be updated to "writableStream".
The build output generates the file, but the file is empty, so I am obviously missing something with respect to configuring / establishing the stream properly.
Sample Code Block
'use strict';
const {src, task} = require('gulp');
const eslint = require('gulp-eslint');
const fs = require('fs');

task('lint', () => {
  return src(['**/*.js', '!**/node_modules/**', '!**/handlebars.runtime-v4.1.2.js', '!**/parsley.js', '!**/slick.js', '!*SampleTests.js'])
    // Runs eslint
    .pipe(eslint())
    // Formats the results and writes them to a file instead of the console
    .pipe(eslint.format('table', fs.createWriteStream('eslint-result.xml')))
    // To have the process exit with an error code (1) on lint error, return the stream and pipe to failAfterError last
    .pipe(eslint.failOnError())
    .pipe(eslint.results(results => {
      // Called once for all ESLint results.
      console.log(`Total Results: ${results.length}`);
      console.log(`Total Warnings: ${results.warningCount}`);
      console.log(`Total Errors: ${results.errorCount}`);
    }));
});
A new day brings new results, I guess... after running a build with this configuration again, it worked, much to my surprise.
Note: I am using Maven as the build process and am invoking this using the frontend-maven-plugin. It is important to note that the result file will NOT appear until AFTER the build process has finished.
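A hedged variant of the task above (my own sketch, not from the question): keeping a reference to the write stream lets you observe when the report is actually flushed to disk, which matches the observation that the file only shows up once the build finishes. This assumes gulp-eslint ends the output stream when it is done formatting.

'use strict';
const {src, task} = require('gulp');
const eslint = require('gulp-eslint');
const fs = require('fs');

task('lint-report', () => {
  const reportStream = fs.createWriteStream('eslint-result.xml');
  // 'finish' fires once the stream is ended and its data is flushed
  reportStream.on('finish', () => console.log('eslint-result.xml flushed to disk'));

  return src(['**/*.js', '!**/node_modules/**'])
    .pipe(eslint())
    .pipe(eslint.format('table', reportStream))
    .pipe(eslint.failOnError());
});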

Problems using Tail (Node.js module) to read file as it is updated

I'm trying to use Tail (https://www.npmjs.com/package/tail) to export Minecraft server log data to Discord (The discord bot part works, so I have excluded it from here).
If I say something in the game and then check "latest.log", it has been changed accordingly. However, using this script, the bot only sees a change if I open "latest.log" in Notepad; it doesn't work otherwise. The bot will recognize changes as long as "latest.log" is open in the background, which is an annoyance but not too big of a deal.
However, my friend is the one who I was making this for, and for him Tail only updates the moment he opens "latest.log". Which means he would need to keep opening up that file for Tail to see it, instead of just letting it run in the background.
const Tail = require('tail').Tail;

const fileToTail = "C:/Users/user/Downloads/logs/latest.log";
const tail = new Tail(fileToTail);

tail.on("line", function(data) {
  // Working code that sends data to Discord
});

tail.on("error", function(error) {
  console.log('ERROR: ', error);
});
What could be causing the discrepancy between the two of us, and what can I do so that the bot can see the file changes without the user opening the file? Thanks in advance!
If you are using chokidar, you should pay attention to whether you are using fs.watch vs. fs.watchFile. If you're using fs.watch, you may not successfully catch changes (which might be what you are experiencing).
See below from the official docs for chokidar options:
usePolling (default: false). Whether to use fs.watchFile (backed by polling), or fs.watch. If polling leads to high CPU utilization, consider setting this to false. It is typically necessary to set this to true to successfully watch files over a network, and it may be necessary to successfully watch files in other non-standard situations. Setting to true explicitly on MacOS overrides the useFsEvents default. You may also set the CHOKIDAR_USEPOLLING env variable to true (1) or false (0) in order to override this option.
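The tail package exposes a similar switch. A hedged sketch of the script above, assuming (per my reading of node-tail's options) that useWatchFile: true forces polling via fs.watchFile instead of fs.watch; polling tends to catch changes that fs.watch misses on Windows, at the cost of some CPU:

const Tail = require('tail').Tail;

const fileToTail = "C:/Users/user/Downloads/logs/latest.log";
// useWatchFile is an option I believe node-tail supports for forcing
// fs.watchFile (polling) -- verify against the package's docs.
const tail = new Tail(fileToTail, { useWatchFile: true });

tail.on("line", function(data) {
  // forward the line to the Discord bot here
});

tail.on("error", function(error) {
  console.log('ERROR: ', error);
});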

What does delete cache mean in Nodejs

Please find below some sample code in Node.js:
var hello_file = require.resolve('hello');
var hello = require('hello');
console.log(hello.hello()); // there is a method hello in module hello.js

delete require.cache[hello_file];
console.log(hello.hello()); // it still works
I thought the delete would remove the reference to the module, and hence the last line should throw an error. But it does not. What could be the reason, and what does deleting from the cache really mean?
The cache doesn't know about it anymore but your var hello still has a reference to what was previously loaded.
The next time you call require('hello') it will load the module from the file. But, until you update the reference that var hello is holding, it will continue to point to the originally loaded module.
As you know, Node loads a module only once, no matter how many times you require it; modules are cached after the first time they are loaded. If you delete a module from the cache, it will be reloaded from the filesystem into the cache the next time you require it.
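A minimal sketch of that behavior, assuming a local module ./hello.js that exports a hello() method:

const helloPath = require.resolve('./hello');

const first = require('./hello');   // loads from disk, populates the cache
const second = require('./hello');  // served from require.cache
console.log(first === second);      // true: same cached module object

delete require.cache[helloPath];

const third = require('./hello');   // cache miss: re-read from disk
console.log(first === third);       // false: a fresh module object
console.log(first.hello());         // the old reference still works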

Node JS Express Boilerplate and rendering

I am trying out Node and its Express framework via the Express boilerplate installation. It took me a while to figure out that I needed Redis installed (by the way, if you're making a boilerplate, either include all required software with it or warn about the requirement for certain software - Redis was never mentioned as required) and to find my way around the server.js file.
Right now I'm still a stranger to how I could build a site in this.
There is one problem that bugs me specifically: when I run the server.js file, it says it's all good. When I try to access it in the browser, it says 'transferring data from localhost' and never ends - it's like render doesn't finish sending and never sends the headers. No errors, no logs, no nothing - res.render('index') just hangs. The file exists, and the script finds it, but nothing ever happens. I don't have a callback defined in the render, so headers should get sent as usual.
If on the other hand I replace the render command with a simple write('Hello world'); and then do a res.end();, it works like a charm.
What am I doing wrong with rendering? I haven't changed a thing from the original installation btw. The file in question is index.ejs, it's in views/, and I even called app.register('.ejs', require('ejs')); just in case before the render itself. EJS is installed.
Also worth noting - if I do a res.render('index'); and then res.write('Hello'); immediately afterwards, followed by res.end();, I do get "Hello" on the screen, but the render never happens - it just hangs and says "Transferring data from localhost". So the application doesn't really die or hang, it just never finishes the render.
Edit: Interesting turn of events: if I define a callback in the render, the response does end. There is no more "Transferring data...", but the view is never rendered, neither is the layout. The source is completely empty upon inspection. There are no errors whatsoever, and no exceptions.
Problem fixed. It turns out render() has to be the absolute last command in a routing chain. Putting res.write('Hello'); and res.end(); after it was exactly what broke it.
I deleted everything and wrote simply res.render('index'), and it worked like a charm. Learn from my fail, newbies - don't output anything after rendering!
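A minimal sketch of the fix, using the current Express API for brevity (the original post used the older Express 2 setup, but the point carries over: render() produces and sends the response itself, so nothing should be written to res after it). This assumes ejs is installed and views/index.ejs exists.

const express = require('express');
const app = express();

app.set('view engine', 'ejs'); // resolves res.render('index') to views/index.ejs

app.get('/', (req, res) => {
  res.render('index'); // the last output call in the handler -- no write()/end() after it
});

app.listen(3000);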

Node.js : EBADF, Bad file descriptor

If I reload my application (from the browser with the reload button) a lot of times, like 50 reloads in 10 seconds, it gives me this error:
events.js:45
throw arguments[1]; // Unhandled 'error' event
^
Error: EBADF, Bad file descriptor
This seems to me like a bandwidth error or something like that. I originally got the error when I played with the HTML5 Audio API: if I loaded the audio file 10-15 times sequentially, I got the error. But now I've discovered that I get the error without the Audio API too, just by reloading the site a lot of times. Also, Safari gives me the error much faster than Chrome (WTF?)
I'm using Node.js 0.4.8 with express + jade and I'm also connected to a MySQL database with the db-mysql module.
I can't find any articles on the web about this topic that help, so pleeease let me know what can cause this error, because it's really confusing :(
By "reload your application" do you mean refresh your app's home page from a browser, or actually stop and restart the node.js server process? I assume the former, in which case if you can't reliably reproduce this it will be pretty tricky to debug, especially since you don't have a good stack trace to pinpoint the source. But if you use the express.js app.error hook (docs here) you'll want to log the error path from the "Bad file descriptor" error, which should hopefully clue you in to whether this is a temporary file that got deleted or what. In terms of the actual cause, we can only offer guesses since "Bad file descriptor" is a very generic low level error that basically means you are calling an operation on a file descriptor that is no longer in the correct state to handle that operation (like reading a closed file, opening a file that has been deleted, etc).
@CIRK, take a look at this: https://github.com/joyent/node/issues/1189 - it's not a Node problem but a system tuning issue.
Edit: or maybe it's related to this error in Connect 1.4.3: https://github.com/senchalabs/connect/issues/297
If this is your case, just try upgrading it.
This error may also result from using fs to save a file whose name is a number rather than a string. File names must be strings: when fs.writeFileSync is given a number as its first argument, it treats it as a file descriptor, and writing to a descriptor that was never opened fails with EBADF.
Incorrect:
const fileName = 12345;
const fileContent = "The great croissant.";
fs.writeFileSync(fileName, fileContent); // 12345 is interpreted as a file descriptor, not a name
Correct:
fs.writeFileSync(`${fileName}`, fileContent);
Also correct:
const fileName = "12345";
fs.writeFileSync(fileName, fileContent);
