I'm trying to use mock-fs to mock up file system contents to test gulp tasks. Unfortunately, gulp.src doesn't seem to play well with mock-fs. Specifically, I get ENOENT errors:
Message:
ENOENT, lstat '/vagrant/study-node-heroku/instances/development/app.json'
Details:
errno: -2
code: ENOENT
path: /vagrant/study-node-heroku/instances/development/app.json
domainEmitter: [object Object]
domain: [object Object]
domainThrown: false
Stack:
Error: ENOENT, lstat '/vagrant/study-node-heroku/instances/development/app.json'
at Error (native)
Other parts of my code and test code access the mock-fs-created files just fine.
What am I doing wrong? I suspect that the problem is related to gulp's usage of vinyl.
Here is the function under test:
var herokuTarball = function(options, done) {
  var instance = options.instance || 'development';
  var tarballName = options.tarballName || instance;
  var tarballPath = path.join(config.temp, tarballName + '.tar.gz');
  var files = path.join(config.instances, instance, '**/*');

  yassert.file(path.join(config.instances, instance, 'app.json'));

  async.waterfall([
    function(cb) {
      del([tarballPath], cb);
    },
    // del passes its result through the waterfall, so the first argument
    // here is the list of deleted paths, not an error
    function(deletedPaths, cb) {
      gulp.src(files)
        .pipe(tar(tarballName + '.tar'))
        .pipe(gzip())
        .pipe(gulp.dest(config.temp))
        .pipe(gcallback(cb));
    }
  ], function(err, result) {
    if (err) return done(err);
    return done(null, tarballPath);
  });
};
And here is the test snippet:
describe('gulp heroku:tarball', function() {
after('something', function() {
mock.restore();
});
before('something', function() {
mock({
'instances/development': {
'app.json': 'test content'
}
});
});
it('creates a tarball', function(done) {
var options = {};
heroku.herokuTarball(options, function(err, result) {
expect(result).to.be.a('string');
yassert.file(result);
done();
});
});
});
Notice that the yassert (yeoman-assert) calls pass fine -- the file is there. If I take the function with the gulp.src call out of the async waterfall, the error goes away (and the test fails of course).
Issue posted at https://github.com/tschaub/mock-fs/issues/44
You are not doing anything wrong; the mock-fs README states:
Note mock-fs is not compatible with graceful-fs#3.x but works with graceful-fs#4.x.
Looking at the dependencies of gulp we get:
$ npm info gulp devDependencies.graceful-fs
^3.0.0
Hence, gulp still depends on graceful-fs#3.x, and therefore mock-fs will not work.
YMMV, but maybe vinyl-fs-mock is an alternative?
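As a stopgap, you can also sidestep fs mocking entirely and write real fixture files into a temp directory in before(). A minimal sketch, assuming config.instances can be pointed at the fixtures from the test (the fixtureRoot name and layout are illustrative, not from the question):
var fs = require('fs');
var os = require('os');
var path = require('path');
var fixtureRoot;
before(function() {
  // Lay out the same tree mock-fs was faking, but on the real disk.
  fixtureRoot = path.join(os.tmpdir(), 'tarball-test-' + Date.now());
  fs.mkdirSync(fixtureRoot);
  fs.mkdirSync(path.join(fixtureRoot, 'instances'));
  fs.mkdirSync(path.join(fixtureRoot, 'instances', 'development'));
  fs.writeFileSync(
    path.join(fixtureRoot, 'instances', 'development', 'app.json'),
    'test content'
  );
  // Assumes config is injectable; point the code under test at the fixtures.
  config.instances = path.join(fixtureRoot, 'instances');
});
Real files also mean real cleanup in after(), e.g. deleting fixtureRoot with del or rimraf.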
Related
I'm creating a simple web API using NodeJS express, for some home domotics. For my TV I'm using the following library --> https://github.com/hobbyquaker/lgtv2.
When I run my code locally, for example;
var lgtv = require('lgtv2')({
url: 'ws://192.168.178.31:3000'
});
lgtv.on('error', function(err) {
console.log(err);
});
lgtv.on('connect', function() {
console.log('connected');
lgtv.request('ssap://system/turnOff', function(err, res) {
lgtv.disconnect();
});
});
It runs fine. However, the same code, deployed to my Synology NAS, results in an error.
TypeError: Arguments to path.join must be strings
at path.js:360:15
at Array.filter (native)
at exports.join (path.js:358:36)
at module.exports (/volume1/web/NodeJS/node_modules/persist-path/index.js:19:22)
at new LGTV (/volume1/web/NodeJS/node_modules/lgtv2/index.js:47:16)
at LGTV (/volume1/web/NodeJS/node_modules/lgtv2/index.js:38:16)
at Object.module.exports.setNetflix (/volume1/web/NodeJS/controllers/tv.js:50:36)
at /volume1/web/NodeJS/routes/routes.js:43:12
at Layer.handle [as handle_request] (/volume1/web/NodeJS/node_modules/express/lib/router/layer.js:95:5)
at next (/volume1/web/NodeJS/node_modules/express/lib/router/route.js:137:13)
The only actual difference I can spot is the Node version, which is v10.14.1 locally and v0.10.48 on my NAS. Is there any way to bypass this problem and get this working?
Randy
It is a very strange library.
Try this code (the changes are the platform log and an explicit clientKey):
console.log('Platform = ',process.platform);
var lgtv = require('lgtv2')({
url: 'ws://192.168.178.31:3000',
clientKey: ''
});
lgtv.on('error', function(err) {
console.log(err);
});
lgtv.on('connect', function() {
console.log('connected');
lgtv.request('ssap://system/turnOff', function(err, res) {
lgtv.disconnect();
});
});
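If that doesn't help, confirm which runtime the NAS actually provides. A small diagnostic sketch (the reasoning is my assumption, not verified against persist-path's source: the trace shows path.join receiving a non-string inside persist-path, and os.homedir() only exists since Node v2.3, so an old runtime resolving the home directory to undefined is one plausible cause):
var os = require('os');
// Print the runtime details the persist-path call may depend on.
console.log('Node version:', process.version);
console.log('os.homedir available:', typeof os.homedir === 'function');
console.log('HOME env var:', process.env.HOME);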
The error looks like:
{ Error: ENOENT: no such file or directory, open 'sad' errno: -2, code: 'ENOENT', syscall: 'open', path: 'sad' }
where 'sad' is the name of the file I would like to write to, and it doesn't exist yet.
Code looks like this:
fs.writeFile(filename, JSON_string, { flag: 'w' }, function(err){
if(err){
return console.error(err);
}
return JSON_string;
});
There are other similar questions, but they all have something wrong in their paths (starting or not starting with /). I just want to write a file at the root of the directory I run this Node.js application from (which is also where it was initialized with npm).
Running with
sudo node server4.js
doesn't work either.
Changing the flag to w+ or wx or whatever doesn't help.
The code works if the file already exists.
Node v9+.
I need to use the writeFile() function.
This works for me; please check if it works on your system:
var fs = require('fs')
fs.writeFile('./myfile.txt', 'Content to write', { flag: 'w' }, function(err) {
if (err)
return console.error(err);
fs.readFile('./myfile.txt', 'utf-8', function (err, data) {
if (err)
return console.error(err);
console.log(data);
});
});
(besides writing it also reads to confirm)
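As for the original error, one likely explanation (an inference on my part, but standard fs behavior): fs.writeFile creates the file, not any missing parent directories, so ENOENT on a write usually means a directory in the path does not exist. If the target directory may be missing, create it first:
var fs = require('fs');
var path = require('path');
// writeFile will not create intermediate directories, so ensure the
// parent exists first. { recursive: true } needs Node 10.12+; on older
// versions use the mkdirp package instead.
fs.mkdir(path.dirname(filename), { recursive: true }, function(err) {
  if (err) return console.error(err);
  fs.writeFile(filename, JSON_string, { flag: 'w' }, function(err) {
    if (err) return console.error(err);
  });
});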
While this question is quite open-ended, I'm generally trying to follow this excellent post: https://aws.amazon.com/blogs/compute/analyzing-genomics-data-at-scale-using-r-aws-lambda-and-amazon-api-gateway/ which describes setting up R to run with Python. I, on the other hand, am trying to get R to work with Node.js.
I've packaged up my dependencies, deployed to Lambda, and can run simple Node scripts. However, I am having difficulty connecting to RServe from Node using the npm package rio (https://www.npmjs.com/package/rio). RServe, on both my localhost and on Heroku, will accept the default connection of 127.0.0.1 and port 6311. No luck with AWS Lambda.
'use strict';
var rio = require('rio');
var Promise = require('bluebird');
var exec = require('child_process').exec;
var whenReady = new Promise(function(resolve){
// require libraries and bootup RServe
exec('Rscript init.R', function(error, stdout, stderr) {
(function check() {
// Attempt to connect to RServe through Rio using my 'up' test function
rio.e({
entrypoint: 'up',
callback: function (err) {
console.log(err);
if (err) return setTimeout(check, 100);
// If no connection error, rserve is running
console.log("Rserve up");
resolve();
}
});
})();
});
});
exports.handler = function(event, context, callback) {
whenReady.then(function () {
// Call hello world
rio.e({
entrypoint: 'hello',
data: {name:'Will'},
callback: function(err, result){
console.log("Error", err);
callback(null, result);
}
});
});
};
This ends with connection refused errors
2017-03-01T22:58:33.210Z 96f69baf-fed2-11e6-9164-e91b9773d645 {
[Error: connect ECONNREFUSED 127.0.0.1:6311] code: 'ECONNREFUSED',
errno: 'ECONNREFUSED', syscall: 'connect', address: '127.0.0.1',
port: 6311 }
Any ideas on how to fix this one? I'm hoping we don't need to get complicated: https://aws.amazon.com/blogs/aws/new-access-resources-in-a-vpc-from-your-lambda-functions/
Thank you in advance!
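One thing worth checking in the snippet above (a diagnostic sketch, not a confirmed fix): the exec callback ignores its error argument, so if Rscript is missing from the Lambda environment or exits early, the promise just polls forever and every rio connection attempt is refused. Logging the exec result makes that failure visible in CloudWatch:
exec('Rscript init.R', function(error, stdout, stderr) {
  // Surface startup failures instead of silently polling.
  if (error) console.error('Rscript failed:', error);
  if (stderr) console.error('Rscript stderr:', stderr);
  if (stdout) console.log('Rscript stdout:', stdout);
  // ... then continue with the rio 'up' check as before
});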
** Update **
init.R does the following
# Require some libraries
...
require('jsonlite');
up <- function () {
toJSON(TRUE)
}
run.Rserve()
** Last Update **
Gave up and went with the Python example posted in the first link.
Will
I am using this code to generate a PDF:
let fileUri = process.env.PWD + '/storage/orders-pdf/' + fileName;
// Commence Webshot
webshot(html_string, fileUri, options, function(error) {
fs.readFile(fileUri, function (err, data) {
if (err) {
return console.log(err);
}
fs.unlinkSync(fileUri);
fut.return(data);
});
});
let pdfData = fut.wait();
But it throws the following error:
{ [Error: ENOENT, open '/opt/holi/storage/orders-pdf/Attributes.pdf']
errno: 34,
code: 'ENOENT',
path: '/opt/holi/storage/orders-pdf/Attributes.pdf' }
I'm using the npm package https://github.com/brenden/node-webshot. The code works perfectly on localhost, but fails on the server with the error above.
EDIT:
Even when running webshot without:
fs.readFile(fileUri, function (err, data) {
if (err) {
return console.log(err);
}
fs.unlinkSync(fileUri);
fut.return(data);
});
The file is not created.
EDIT-2:
Webshot throws an error: [Error: PhantomJS exited with return value 2]
EDIT-3:
Actual issue: https://github.com/brenden/node-webshot/issues/123
I had a similar problem, and spent most of the day trying to figure out the issue. I ended up adding:
"phantomPath": "/usr/bin/phantomjs"
to my webshot options object. The path I used is where mup installs phantomjs during server setup.
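In context that looks something like this (a sketch; the siteType value and the binary location are assumptions to adapt to your setup, e.g. by running which phantomjs on the server):
var options = {
  siteType: 'html',                  // rendering an HTML string, not a URL
  phantomPath: '/usr/bin/phantomjs'  // adjust to where phantomjs lives on your server
};
webshot(html_string, fileUri, options, function(error) {
  // ... same readFile/unlink flow as in the question
});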
I am trying to run karma tests from gulp task and I am getting this error:
Error: 1
at formatError (C:\Users\Tim\AppData\Roaming\npm\node_modules\gulp\bin\gulp.js:161:10)
at Gulp.<anonymous> (C:\Users\Tim\AppData\Roaming\npm\node_modules\gulp\bin\gulp.js:187:15)
at Gulp.emit (events.js:95:17)
at Gulp.Orchestrator._emitTaskDone (C:\path\to\project\node_modules\gulp\node_modules\orchestrator\index.js:264:8)
at C:\path\to\project\node_modules\gulp\node_modules\orchestrator\index.js:275:23
at finish (C:\path\to\project\node_modules\gulp\node_modules\orchestrator\lib\runTask.js:21:8)
at cb (C:\path\to\project\node_modules\gulp\node_modules\orchestrator\lib\runTask.js:29:3)
at removeAllListeners (C:\path\to\project\node_modules\karma\lib\server.js:216:7)
at Server.<anonymous> (C:\path\to\project\node_modules\karma\lib\server.js:227:9)
at Server.g (events.js:180:16)
My system is Windows 7, Node.js version v0.10.32, gulp version:
[10:26:52] CLI version 3.8.8
[10:26:52] Local version 3.8.9
Also, I am getting the same error on Ubuntu 12.04 LTS, while on a newer Ubuntu (not sure which version) and macOS it seems to work fine. What can cause this error?
Update 5/11/2016: Before commenting that the accepted answer hides errors, please see the first two comments on that answer. Use it only if you know what you are doing. Related info: https://github.com/karma-runner/gulp-karma/pull/15
How are you running your tests with Gulp? I came up against this issue recently on OSX, running node v0.11.14 and gulp 3.8.10, whenever there were failing tests.
Changing from the recommended:
gulp.task('test', function(done) {
karma.start({
configFile: __dirname + '/karma.conf.js',
singleRun: true
}, done);
});
To:
gulp.task('test', function(done) {
karma.start({
configFile: __dirname + '/karma.conf.js',
singleRun: true
}, function() {
done();
});
});
...got rid of this error.
Seems to be down to how gulp handles error messages when an error is signalled in a callback. See Improve error messages on exit for more information.
None of these solutions worked correctly for me using gulp 3.9.1 and karma 1.1.1. Adding a reference to gulp-util (npm install --save-dev gulp-util) and updating the task as below fixes the error output very nicely, while maintaining the exit status correctly.
var gutil = require('gulp-util');
gulp.task('test', function (done) {
new Server({
configFile: __dirname + '/karma.conf.js',
singleRun: true
}, function(err){
if(err === 0){
done();
} else {
done(new gutil.PluginError('karma', {
message: 'Karma Tests failed'
}));
}
}).start();
});
Below is a code snippet from gulp-patterns on using Karma. It's a bit similar, but also uses the newer method of starting Karma.
/**
 * Start the tests using karma.
 * @param {boolean} singleRun - True means run once and end (CI), or keep running (dev)
 * @param {Function} done - Callback to fire when karma is done
 * @return {undefined}
 */
function startTests(singleRun, done) {
var child;
var excludeFiles = [];
var fork = require('child_process').fork;
var KarmaServer = require('karma').Server;
var serverSpecs = config.serverIntegrationSpecs;
if (args.startServers) {
log('Starting servers');
var savedEnv = process.env;
savedEnv.NODE_ENV = 'dev';
savedEnv.PORT = 8888;
child = fork(config.nodeServer);
} else {
if (serverSpecs && serverSpecs.length) {
excludeFiles = serverSpecs;
}
}
var server = new KarmaServer({
configFile: __dirname + '/karma.conf.js',
exclude: excludeFiles,
singleRun: singleRun
}, karmaCompleted);
server.start();
////////////////
function karmaCompleted(karmaResult) {
log('Karma completed');
if (child) {
log('shutting down the child process');
child.kill();
}
if (karmaResult === 1) {
done('karma: tests failed with code ' + karmaResult);
} else {
done();
}
}
}
What worked for me, and gave a nicely formatted error message, was to provide an Error instance to the done callback.
gulp.task('test', function(done) {
karma.start({
configFile: __dirname + '/karma.conf.js',
singleRun: true
}, function(result) {
if (result > 0) {
return done(new Error(`Karma exited with status code ${result}`));
}
done();
});
});
If you want to return with an error code, and want to see Karma's error output but not Gulp's (probably unrelated) stack trace:
gulp.task('test', function() {
karma.start({
configFile: __dirname + '/karma.conf.js',
singleRun: true
}, function(karmaExitStatus) {
if (karmaExitStatus) {
process.exit(1);
}
});
});
Not sure about Ubuntu, but I was getting a similar error on Windows, and installing one version back fixed it right away like this:
npm install -g gulp#3.8.8
npm install gulp#3.8.8
This is Gulp's way of telling you that your tests have failed and that Karma exited with a return code of 1. Why you would want to call done yourself and not pass the error along baffles me.
The right way to solve this, according to Karma's documentation and https://github.com/pkozlowski-opensource, is to rely on Karma's watch mechanism rather than Gulp's:
gulp.task('tdd', function (done) {
karma.start({
configFile: __dirname + '/karma.conf.js'
}, done);
});
Note the omission of singleRun: true.
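For CI you would keep a separate one-shot task alongside it; this is just the recommended snippet from earlier in the thread, repeated for contrast:
gulp.task('test', function(done) {
  // singleRun: true makes Karma exit after one pass; its exit status
  // flows straight into done, failing the build when tests fail.
  karma.start({
    configFile: __dirname + '/karma.conf.js',
    singleRun: true
  }, done);
});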
@McDamon's workaround will work for gulp.watch, but you don't want to swallow exit codes like that when running on a CI server.
Gulp is also reworking how they handle exit codes in scenarios just like this one. See https://github.com/gulpjs/gulp/issues/71 and the other dozen or so related issues.
gulp.task('test', function(done) {
karma.start({
configFile: __dirname + '/karma.conf.js',
singleRun: false
}, done);
});
Passing singleRun: false will prevent the process from returning a value different from 0 (which would signify an error and make gulp exit). Use singleRun: true if you are only launching your tests from the command line, not as part of a continuous integration suite.
In case anyone else comes here: do not use the accepted solution, as it will hide failed tests. If you need a quick way to modify your gulp test task, you can use the solution found in this comment in the GitHub thread:
gulp.src(src)
// pipeline...
.on('error', function (error) {
console.error('' + error);
});