I have this error while "piping" a file in node:
events.js:160
      throw er; // Unhandled 'error' event
      ^

Error: read ECONNRESET
    at exports._errnoException (util.js:1022:11)
    at Pipe.onread (net.js:569:26)
Here is my code:
var fs = require('fs'),
    Converter = require('csvtojson').Converter;

var json_file = fs.createWriteStream(jsonFile),
    processes = 0,
    read_ended_flag = false,
    converter = new Converter(params);

let read_csv_file = fs.createReadStream(csvFilePath);

// For each csv line, we get a json doc
converter.on('record_parsed', jsonObj => {
    processes++;
    json_file.write(JSON.stringify(jsonObj) + '\n', () => {
        processes--;
        if (read_ended_flag && processes == 0) {
            json_file.end();
            callback();
        }
    });
});

converter.on('end_parsed', () => {
    read_ended_flag = true;
});

read_csv_file
    .pipe(converter);
I tried to catch the error using this or this, but it is still the same.
This bug only comes up while working with small files (< 100 lines).
Is it because the read stream is closed before the writing to the new file has finished?
Many thanks for any tips and help!
I found the solution! :)
It was due to my Converter params (csvtojson module): I had set workerNum to 4, whereas it has to be 1 when dealing with small files. From the module's documentation:
workerNum: Number of worker processes. The worker process will use multi-cores to help process CSV data. Set to number of Core to improve the performance of processing large csv file. Keep 1 for small csv files. Default 1.
Here is a complete tutorial about the csvtojson module.
Hope it helps others!
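For reference, a minimal sketch of the fix, assuming csvtojson's older Converter API where the constructor takes the params object directly:

var Converter = require('csvtojson').Converter;

// Keep parsing in a single process: with workerNum > 1 the module spawns
// worker processes, whose pipes can raise ECONNRESET on small files.
var converter = new Converter({ workerNum: 1 });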
I'm using polling, as shown below, to check whether the content of a file has changed; when it has, two other functions are called:
// requires assumed from context (AsyncPolling comes from the async-polling package)
var fs = require('fs'),
    path = require('path'),
    AsyncPolling = require('async-polling');

var poll_max_date = AsyncPolling(function (end, err) {
    if (err) {
        console.error(err);
    }
    var stmp_node_id = fs.readFileSync(path.join(__dirname, 'node_id'), "utf8");
    console.log("--------loaded node : " + stmp_node_id);
    if (druid_stmp_node_id != stmp_node_id) {
        // MAX DATA CUT-OFF DRUID QUERY
        druid_exe.max_date_query_fire();
        // DRUID QUERY FOR GLOBAL DATA
        druid_exe.global_druid_query_fire();
        druid_stmp_node_id = stmp_node_id;
    }
    end();
}, 1800000).run(); // 30 mins
It works fine for some time, but after 4-5 hours I get the error below:
events.js:167
      throw er; // Unhandled 'error' event
      ^

Error: read ECONNRESET
    at TCP.onStreamRead (internal/stream_base_commons.js:111:27)
Emitted 'error' event at:
    at emitErrorNT (internal/streams/destroy.js:82:8)
    at emitErrorAndCloseNT (internal/streams/destroy.js:50:3)
I tried using fs.watch to monitor changes in the file instead of polling, like below:
let md5Previous = null;
let fsWait = false;

fs.watch(dataSourceLogFile, (event, filename) => {
    if (filename) {
        // Debounce the duplicate events fired for a single change
        if (fsWait) return;
        fsWait = setTimeout(() => {
            fsWait = false;
        }, 1000);
        const md5Current = md5(fs.readFileSync(dataSourceLogFile));
        if (md5Current === md5Previous) {
            return;
        }
        md5Previous = md5Current;
        console.log(`${filename} file Changed`);
        // MAX DATA CUT-OFF DRUID QUERY
        druid_exe.max_date_query_fire();
        // DRUID QUERY FOR GLOBAL DATA
        druid_exe.global_druid_query_fire();
    }
});
It also works fine for some time, but then I get the same error after 4-5 hours:
events.js:167
      throw er; // Unhandled 'error' event
      ^

Error: read ECONNRESET
    at TCP.onStreamRead (internal/stream_base_commons.js:111:27)
Emitted 'error' event at:
    at emitErrorNT (internal/streams/destroy.js:82:8)
    at emitErrorAndCloseNT (internal/streams/destroy.js:50:3)
But when run on my local machine it works fine; the error occurs only when run on the remote Linux machine.
Can somebody help me fix this problem?
Use fs.watchFile instead, because fs.watch is not consistent across platforms:
https://nodejs.org/docs/latest/api/fs.html#fs_fs_watchfile_filename_options_listener
Change your code according to your requirements.
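A minimal sketch of that change, assuming the same dataSourceLogFile and druid_exe from the question:

const fs = require('fs');

// fs.watchFile polls the file's stats at a fixed interval, so it behaves
// the same on every platform, unlike fs.watch.
fs.watchFile(dataSourceLogFile, { interval: 1000 }, (curr, prev) => {
    if (curr.mtimeMs !== prev.mtimeMs) {
        console.log('file changed');
        // MAX DATA CUT-OFF DRUID QUERY
        druid_exe.max_date_query_fire();
        // DRUID QUERY FOR GLOBAL DATA
        druid_exe.global_druid_query_fire();
    }
});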
In my case it was happening because users were closing the browser before the data request was received, leading to a connection reset.
I used PM2 (http://pm2.keymetrics.io/) to run the application, and it is working great now.
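For reference, a minimal PM2 invocation (assuming the app's entry point is app.js; PM2 automatically restarts the process when an unhandled error crashes it):

pm2 start app.js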
const { spawn } = require("child_process")
try{
spawn("invalid/path/to/executable")
}catch(err){
console.log("exception: ",err)
}
This code raises an error and the execution of the program stops. It never prints "exception:", so the catch block is not executed:
events.js:183
      throw er; // Unhandled 'error' event
      ^

Error: spawn invalid/path/to/executable ENOENT
When run with a valid path to an executable, the same code works.
What can I do to handle the case when the spawn fails due to ENOENT error?
This module fires an 'error' event on the returned child process, and you can just add a listener for it.
You can read more about it here.
So, you can transform your code to:
const {spawn} = require("child_process")
const subprocess = spawn("invalid/path/to/executable")
subprocess.on('error', function (err) {
console.log('Failed to start subprocess: ' + err);
});
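If you only want to react to the missing executable, the error object exposes the errno name as err.code, so the listener can check for it:

subprocess.on('error', function (err) {
    // Node sets err.code to the errno name, e.g. 'ENOENT' when the path does not exist
    if (err.code === 'ENOENT') {
        console.log('executable not found');
    }
});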
Also, I suggest reading this article by Samer Buna. He covered a lot of interesting topics about this module.
I have a NodeJS script that I use to write JSON objects to a file:
var reWriteCodes = function (arrayCodes, pathFileCodes, fs, typereWriteCodes, cb) {
    console.log("message : " + typereWriteCodes);
    var codesStream = fs.createWriteStream(pathFileCodes, { 'flags': 'w' });
    codesStream.on('finish', () => {
        console.log("the stream is closed");
        if (typereWriteCodes === 0) {
            cb();
        }
    });
    var jsonToWrite = { "liste_codes": arrayCodes };
    codesStream.write(JSON.stringify(jsonToWrite));
    codesStream.end();
}
This function is called twice (first with typereWriteCodes = 0, then with typereWriteCodes = 1 inside the callback function cb), with two different files.
The first call ends fine: my file is saved and the message "the stream is closed" is displayed in the console. But on the second call (which is the last operation of the program), my file is not saved correctly (the file is empty) and the message "the stream is closed" is not triggered. Also, I get this message:
events.js:141
      throw er; // Unhandled 'error' event
      ^

Error: ENOENT: no such file or directory, open ''
    at Error (native)
I have the feeling that the app is closing before the stream can end correctly, but I do not know how to handle this properly. Could you help me with this issue? Any help appreciated.
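As a diagnostic aid (a sketch, not a confirmed fix): the empty path in open '' suggests pathFileCodes is empty on the second call, and attaching an 'error' listener to the write stream reports that instead of crashing the process:

codesStream.on('error', (err) => {
    // Without this listener, an ENOENT here becomes an unhandled
    // 'error' event and kills the process.
    console.error('write stream error:', err);
});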
I'm having a very weird EPIPE write error when I'm trying to generate a PDF from HTML with this module:
https://www.npmjs.com/package/html-pdf
The exact error:
events.js:72
      throw er; // Unhandled 'error' event
      ^

Error: write EPIPE
    at errnoException (net.js:904:11)
    at Object.afterWrite (net.js:720:19)
The line where I call the PDF gen:
var pdfgen = require("html-pdf");
pdfgen.create(html, options).toFile('whappbook.pdf', function(err, res) {
var errorCount = 0;
if (err) {
console.log(err);
errorCount++;
if(errorCount > 2) {
return console.log(err);
} else {
return create();
}
}
console.log(res); // { filename: '/app/businesscard.pdf' }
});
I've tried just using <h1>Testing</h1> as the HTML to see if the file was too big, but that wasn't the case.
Note: locally everything works perfectly fine, but when I run this code on Appfog it doesn't, so I suspect it has something to do with missing dependencies or file-writing rights.
I'm writing directly to the home directory (in Appfog's case that's /app).
html-pdf uses PhantomJS, and I've also npm-installed that directly in my app, because some people reported issues when it wasn't installed directly, but that didn't solve my problem either.
If there is any information that I can additionally provide please let me know!
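One thing worth checking (an assumption on my part, since EPIPE means the pipe to the PhantomJS child process broke): pass html-pdf an explicit phantomPath so it uses the npm-installed binary instead of whatever happens to be on the host's PATH:

var pdfgen = require('html-pdf');
var phantomjs = require('phantomjs'); // the npm package exposes its binary path

var options = {
    // point html-pdf at the bundled PhantomJS binary
    phantomPath: phantomjs.path
};

pdfgen.create(html, options).toFile('whappbook.pdf', function (err, res) {
    if (err) return console.log(err);
    console.log(res);
});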
I'm getting an error with streams.
I'm working on adding istanbul to my existing mocha task. When I run this task I get the error below.
I'm using gulp-istanbul
(note: the config.test.src.bdd.features is set to the value 'test/bdd/features/**/*-spec.js')
var stream = gulp.src([config.test.src.bdd.features], { read: false });

gulp.task('mocha-bdd-features', function (cb) {
    process.env.PORT = 8001;
    return stream
        .pipe(istanbul())
        .pipe(istanbul.hookRequire())
        .pipe(mocha({
            compilers: {
                js: babel
            },
            reporter: config.test.mocha.reporter,
            ui: 'bdd'
        }))
        .on('finish', function () {
            stream.pipe(istanbul.writeReports());
            stream.pipe(istanbul.enforceThresholds({ thresholds: { global: 90 } }));
            stream.on('end', cb);
        });
});
The error I get is:
events.js:85
      throw er; // Unhandled 'error' event
      ^

Error: streams not supported
And who knows, I may not be setting this task up right when trying to incorporate gulp-istanbul, but I'm trying to at least get past this error first.
I was facing the exact same issue.
I believe the problem is in this line:
var stream = gulp.src([config.test.src.bdd.features], { read: false });
Setting the read option to false causes file.contents to be null, and therefore istanbul is not able to cover the files (check here).
So try the same thing, but without the read option:
var stream = gulp.src([config.test.src.bdd.features]);
Hope this helps.