mocha test passing despite apparent stalling on a sync operation - node.js

Ok, this is kinda driving me batty. I am trying to run a mocha test on an app I'm writing that's supposed to select and copy a file based on project parameters (a dockerfile). The test is working for other cases, and in fact it's green for this case, except that it shouldn't be.
The test uses fs.readFileSync to compare the file contents that should be there with the file contents that are actually there, to determine whether the right file was copied.
Problem is, there's no file in the location being checked yet (I haven't written the code to put it there, and I have verified that by printing out the directories the test is using and navigating there myself), but the test passes. Even more strangely, as far as I can tell no code executes after the readFileSync operation. But it also doesn't time out...
Here's the test code in question:
it('should create a Dockerfile from ./data/dockerfiles/war.docker', () => {
  let projectDocker = path.join(project, 'Dockerfile');
  let warDocker = path.join(__dirname, '..', 'data', 'dockerfiles', 'war.docker');
  select(project, (err, result) => {
    let correct = fs.readFileSync(warDocker, 'utf-8');
    console.log('projectDocker');
    console.log('checking for projectDocker');
    let prj = fs.readFileSync(projectDocker, 'utf-8');
    console.log('Read file sync for project has completed'); // This line never fires
    expect(prj).to.equal(correct);
    expect(err).to.not.exist;
    expect(result).to.exist;
    expect(result).to.have.property('relPath', project);
    expect(result).to.have.property('prepared', true);
  });
});

select() looks suspiciously async, in which case your test should be async as well:
it('should create a Dockerfile from ./data/dockerfiles/war.docker', done => {
  ...
  select(project, (err, result) => {
    if (err) return done(err);
    ...
    done();
  });
});
Otherwise you run a big risk of exceptions being swallowed, as @Martin also suggests.
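For completeness, here is a minimal sketch of how the whole test might look (same select, project, and warDocker setup as in the question; the try/catch is one way to surface the assertion failures, not the only one):
it('should create a Dockerfile from ./data/dockerfiles/war.docker', done => {
  let projectDocker = path.join(project, 'Dockerfile');
  let warDocker = path.join(__dirname, '..', 'data', 'dockerfiles', 'war.docker');
  select(project, (err, result) => {
    if (err) return done(err);
    try {
      let correct = fs.readFileSync(warDocker, 'utf-8');
      let prj = fs.readFileSync(projectDocker, 'utf-8'); // throws ENOENT while the file is missing
      expect(prj).to.equal(correct);
      expect(result).to.exist;
      expect(result).to.have.property('relPath', project);
      expect(result).to.have.property('prepared', true);
      done();
    } catch (e) {
      done(e); // the ENOENT now fails the test instead of being swallowed
    }
  });
});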


Is this terminal-log the consequence of the Node JS asynchronous nature?

I haven't found anything specific about this. It isn't really a problem, but I would like to understand better what is going on here.
Basically, I'm testing some simple Node.js code, like this:
//Summary: Open a file, write to the file, delete the file.
let fs = require('fs');
fs.open('mynewfile.txt', 'w', function (err, file) {
  if (err) throw err;
  console.log('Created file!')
})
fs.appendFile('mynewfile.txt', 'After creating this file with fs.open, I appended data to it with fs.appendFile', function (err) {
  if (err) throw err;
  console.log('Added text to the file.')
})
fs.unlink('mynewfile.txt', function (err) {
  if (err) throw err;
  console.log('File deleted!')
})
console.log(__dirname);
I thought this code would be executed in the order it was written, from top to bottom, but when I look at the terminal I'm not sure that was the case, because this is what I get:
$ node FileSystem.js
C:\Users\Simon\OneDrive\Desktop\Portfolio\Learning Projects\NodeJS_Tutorial
Created file!
File deleted!
Added text to the file.
//Expected order would be: Create file, add text to file, delete file, log dirname.
Despite what the terminal might make you think, in the end the code order still seems to have been followed somehow, because when I look at my folder the file was deleted and I have nothing left in the directory.
So, I was wondering: why is it that the terminal doesn't log in the same order that the code is written, from top to bottom?
Would this be the result of Node.js's asynchronous nature, or is it something else?
The code is (in principle) executed from top to bottom, as you say. But fs.open, fs.appendFile, and fs.unlink are asynchronous: the operations are started in that particular order, but there is no guarantee whatsoever in which order they finish, and thus you can't guarantee in which order the callbacks are executed. If you run the code multiple times, there is a good chance that you will encounter different execution orders.
If you need a specific order, you have two options.
You call the later operation only in the callback of the prior one, i.e. something like below:
fs.open('mynewfile.txt', 'w', function (err, file) {
  if (err) throw err;
  console.log('Created file!')
  fs.appendFile('mynewfile.txt', '...', function (err) {
    if (err) throw err;
    console.log('Added text to the file.')
    fs.unlink('mynewfile.txt', function (err) {
      if (err) throw err;
      console.log('File deleted!')
    })
  })
})
You can see that the code gets quite ugly and hard to read with all that increasing nesting ...
You switch to the promise-based approach:
let fs = require('fs').promises;
fs.open("myfile.txt", "w")
  .then(file => {
    return fs.appendFile("myfile.txt", "...");
  })
  .then(res => {
    return fs.unlink("myfile.txt");
  })
  .catch(e => {
    console.log(e);
  })
With the promise versions of the operations you can also use async/await:
async function doit() {
  let file = await fs.open('myfile.txt', 'w');
  await fs.appendFile('myfile.txt', '...');
  await file.close(); // close the handle before unlinking (see the note below)
  await fs.unlink('myfile.txt');
}
For all three possibilities, you probably need to close the file before you can unlink it.
For more details, please read about Promises, async/await, and the event loop in JavaScript.
It's a combination of 2 things:
The asynchronous nature of Node.js, as you correctly assume
Being able to unlink an open file
What likely happened is this:
The file was opened and created at the same time (open with flag w)
The file was opened a second time for appending (fs.appendFile)
The file was unlinked
Data was appended to the file (while it was already unlinked) and the file was closed
When data was being appended, the file still existed on disk as an inode, but had zero hard links (references) to it. It still takes up space then, but the OS checks the reference count when closing and frees up the space if the count has fallen to zero.
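To make that concrete, here is a small sketch (assuming POSIX semantics; on Windows the unlink would fail while the handle is open):
const fs = require('fs');

// Open (and create) the file, keeping the file descriptor.
const fd = fs.openSync('ghost.txt', 'w');

// Remove the directory entry; the inode lives on while fd is open.
fs.unlinkSync('ghost.txt');

// Writing through the descriptor still succeeds.
fs.writeSync(fd, 'still here');

// Closing drops the last reference, and the OS frees the space.
fs.closeSync(fd);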
People sometimes run into a similar situation with daemons such as HTTP servers that employ log rotation: if something goes wrong when switching over logs, the old log file may be unlinked but not closed, so it's never cleaned up and it takes space forever (until you reboot or restart the process).
Note that the ordering of operations that you're observing is random, and it is possible that they would be re-ordered. Don't rely on it.
You could write this as (untested):
let fs = require('fs').promises;
const main = async () => {
  await fs.open('mynewfile.txt', 'w');
  await fs.appendFile('mynewfile.txt', 'content');
  await fs.unlink('mynewfile.txt');
};
main()
  .then(() => console.log('success'))
  .catch(console.error);
or within another async function:
const someOtherFn = async () => {
  try {
    await main();
  } catch (e) {
    // handle any rejection to your liking
  }
}
(The catch block is not mandatory. You can opt to just let the errors throw to the top. It's just to showcase how async/await allows you to make asynchronous code appear as if it were synchronous, without running into callback hell.)

How to stop a Jest describe without failing all the tests

I want to stop a whole Jest describe without throwing an error or stopping the other describes.
I'm writing e2e tests for my app with Jest and Puppeteer. I write the tests so that every describe is a flow through the app and every it is a step; inside an it I want to stop the flow if the page doesn't match some conditions.
describe('Book a Room', () => {
  it('enter on main page', async () => await mainPage.navigateToMainPage())
  it('go to book room page', async () => await bookRoomPage.navigateToBookRoomPage())
  // The function is inside the "bookRoomPage"
  it('check if the user can book room', () => {
    if (!page.userCanOpenARoom()) {
      // DON'T EXECUTE THE NEXT IT, BUT CONTINUE WITH THE OTHER DESCRIBES
    }
  })
  it('go to book preview...', async () => bookRoomPreviewPage.navigateToBookRoomPreviewPage());
  // REMAINING FLOW
})
I already tried process.exit(0), but that exits the whole process.
You can try what this blog says here; it's about sharing specs between your test suites, which is pretty handy. But for your case specifically, you could extract your page cases into separate suites and then dynamically include the right test case at runtime if a condition is met.
Something like:
Include Spec function shared_specs/index.js
const includeSpec = (sharedExampleName, args) => {
  require(`./${sharedExampleName}`)(args);
};
exports.includeSpec = includeSpec;
Test A shared_specs/test_a.js
describe('some_page', () => {
  it...
})
Test B shared_specs/test_b.js
describe('some_other_page', () => {
  it...
})
and then in your test case
// Something like this would be your path I guess
import { includeSpec } from '../shared_specs/index.js'
describe('Book a Room', () => {
  if (page.userCanOpenARoom()) {
    includeSpec('test_a', page);
  } else {
    includeSpec('test_b', page); // Or don't do anything
  }
});
Just make sure that you check the paths, since
require(`./${sharedExampleName}`)(args);
will load the spec dynamically at runtime, and use includeSpec in your describe blocks, not your it blocks. You should be able to split up your test suites pretty nicely with this.
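For reference, a shared spec in this pattern is just an exported function that receives the runtime arguments; hypothetical content for shared_specs/test_a.js might look like this (openRoom is an assumed page method, purely for illustration):
module.exports = (page) => {
  describe('book room flow', () => {
    it('opens a room', async () => {
      await page.openRoom(); // hypothetical page-object method
    });
  });
};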

Download files asynchronously and parse them synchronously with Node JS

I have a gulp task that downloads a few JSON files from GitHub, then prompts the user for values to replace in those files. For example, I have an .ftpconfig that gets downloaded, and then the user is asked to enter hostname, username, password, and path.
Because the file first needs to be downloaded before it can be configured, and each file needs to be configured sequentially, I'm using quite a few nested callbacks. I'd like to change this "callback hell" system so that it utilizes async/await and/or promises instead, but I'm having a lot of difficulty understanding exactly why my code isn't working; it seems that promises fire their .then() functions asynchronously, which doesn't make sense to me.
My goals are as follows:
Download all config files asynchronously
Wait for all config files to finish downloading
Read existing settings from the config files
Prompt the user for changed settings in each config file synchronously
I've tried a number of approaches, none of which worked. I discarded the code I've used, but here's a rough recreation of the things I've tried:
Attempt #1:
return new Promise((resolve) => {
  // download files...
}).then((resolve) => {
  // configure first file...
}).then((resolve) => {
  // configure second file...
}).then((resolve) => {
  // configure third file...
});
Attempt #2:
const CONFIG_FILES = async () => {
  const bs_download = await generate_config("browsersync");
  const ftp_download = await generate_config("ftp");
  const rsync_download = await generate_config("rsync");
  return new Promise(() => {
    configure_json("browsersync");
  }).then(() => {
    configure_json("ftp");
  }).then(() => {
    configure_json("rsync");
  });
};
I'm sure I'm doing something very obviously wrong, but I'm not adept enough at JavaScript to see the problem. Any help would be greatly appreciated.
My gulp task can be found here:
gulpfile.js
gulp-tasks/config.js
Thanks to @EricB, I was able to figure out what I was doing wrong. It was mostly a matter of making my functions return promises as well.
https://github.com/JacobDB/new-site/blob/d119b8b3c22aa7855791ab6b0ff3c2e33988b4b2/gulp-tasks/config.js
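For reference, the shape of that fix looks roughly like this (a sketch assuming generate_config and configure_json are rewritten to return promises; names are the question's own):
const CONFIG_FILES = async () => {
  // Download all config files in parallel and wait for every one of them.
  await Promise.all([
    generate_config('browsersync'),
    generate_config('ftp'),
    generate_config('rsync'),
  ]);

  // Then prompt for each file strictly one after another.
  await configure_json('browsersync');
  await configure_json('ftp');
  await configure_json('rsync');
};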

Resemblejs in jest hangs

I'm using ResembleJS for image comparison. I can get it to run when I run it in a standalone script. Here's the code:
var compareImages = require('resemblejs/compareImages');
var fs = require('fs');
var path = require('path');
// The parameters can be Node Buffers
// data is the same as usual with an additional getBuffer() function
async function getDiff() {
  var img = path.join(__dirname, 'small.jpg');
  const data = await compareImages(
    fs.readFileSync(img),
    fs.readFileSync(img)
  );
  console.log(data);
  fs.writeFileSync('./output.png', data.getBuffer());
}
getDiff();
Everything works as expected.
But when I run the comparison inside of a test with the jest framework, it hangs and eventually times out. At first I thought maybe it was just running really slowly, so I set my max timeout in jest to 1 minute. Still failed. So I set my test image to be 1 pixel so it's the simplest test. Still wouldn't finish.
Running from a docker container with Node 8.9.4 (which is what comes from the docker hub node:8). Running jest 22.0.4.
Anybody else have issues running these two together?
I know Resemblejs runs tests with Jest, so not sure what could be causing the issue.
Could you please post the code for your tests?
Are you sure you are returning something from your test block? In order for a test not to hang, you need to return a promise which will resolve before the timeout. Below are two examples:
test("test", () => {
// test is done when writeFile resolves
return new Promise((resolve, reject) => {
fs.writeFile("path", "encoding", (err) => {
if (err) {
reject(err);
} else {
resolve();
}
});
});
});
test("test", async function () {
// test is done after the assertion
const result = await fetch();
expect(result).toBe(); // test;
});
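Applied to the ResembleJS case, that would look something like this (an untested sketch reusing the standalone script's setup; rawMisMatchPercentage is one of the fields the comparison result reports):
test('images match', async () => {
  const img = path.join(__dirname, 'small.jpg');
  // Awaiting the comparison keeps Jest waiting until it actually finishes.
  const data = await compareImages(fs.readFileSync(img), fs.readFileSync(img));
  expect(data.rawMisMatchPercentage).toBe(0);
});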
I had a similar problem with slow tests with Jest, React and Docker (but I'm not using Resemblejs).
I found the solution on GitHub: for me it was simply a matter of adding "roots": ["./src"] to jest.config.js.
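That is, something like this minimal jest.config.js:
module.exports = {
  roots: ['./src'],
};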

In a Firebase Cloud Function, the bucket.upload promise resolves too early

I wrote a function that works like this:
onNewZipFileRequested
{get all the necessary data}
.then{download all the files}
.then{create a zipfile with all those files}
.then{upload that zipfile} (*here is the problem)
.then{update the database with the signedUrl of the file}
Here is the relevant code
[***CREATION OF ZIP FILE WORKING****]
}).then(() => {
  zip.generateNodeStream({type: 'nodebuffer', streamFiles: true})
    .pipe(fs.createWriteStream(tempPath))
    .on('finish', function () {
      console.log("zip written.");
      return bucket.upload(tempPath, { //**** problem ****
        destination: destinazionePath
      });
    });
}).then(() => {
  const config = {
    action: 'read',
    expires: '03-09-2391'
  }
  return bucket.file(destinazionePath).getSignedUrl(config)
}).then(risultato => {
  const daSalvare = {
    signedUrl: risultato[0],
    status: 'fatto',
    dataInserimento: zipball.dataInserimento
  }
  return event.data.ref.set(daSalvare)
})
On the client side, as soon as the app sees the status change and the new URL, a download button (pointing to the new URL) appears.
Everything is working, but if I try to download the file immediately... there is no file yet!!!
If I wait some time and retry, the file is there.
I noted that the time I have to wait depends on the size of the zipfile.
The bucket.upload promise should resolve at the end of the upload, but apparently it fires too early.
Is there a way to know exactly when the file is ready?
I may have to make some very big files; it's not a problem if the process takes several minutes, but I need to know when it's over.
* EDIT *
There was an unnecessary nesting in the code. While it was not the error (results are the same before and after refactoring), it was causing some confusion in the answers, so I edited it out.
I'd like to point out that I update the database only after getting the signed URL, and I get that only after the upload (I could not otherwise), so to get any result at all the promise chain MUST work, and in fact it does. When the download button appears on the client side (which happens when 'status' becomes 'fatto'), it is already linked to the correct signed URL, but if I press it too early the file is not there (Failed - No file). If I wait some seconds (the bigger the file, the longer I have to wait), then the file is there.
(English is not my mother language; if I have been unclear, ask and I will try to explain myself better.)
It looks like the problem is in the first then block: the return bucket.upload(...) sits inside the 'finish' event handler, not in the then callback itself, so the then callback returns undefined and the chain moves on to getSignedUrl without waiting for the upload to complete. Here is the code reformatted so that the nesting is visible:
[***CREATION OF ZIP FILE WORKING****]}).then(() => {
  zip.generateNodeStream({type: 'nodebuffer', streamFiles: true})
    .pipe(fs.createWriteStream(tempPath))
    .on('finish', function () {
      console.log('zip written.')
      return bucket.upload(tempPath, {
        destination: destinazionePath
      })
    })
}).then(() => {
  const config = {
    action: 'read',
    expires: '03-09-2391'
  }
  return bucket.file(destinazionePath).getSignedUrl(config)
}).then(risultato => {
  const daSalvare = {
    signedUrl: risultato[0],
    status: 'fatto',
    dataInserimento: zipball.dataInserimento
  }
  return event.data.ref.set(daSalvare)
})
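A sketch of one way to fix it: wrap the stream in a Promise that resolves on 'finish', and return bucket.upload from the then callback itself so the chain actually waits (same zip, tempPath, destinazionePath, and bucket as above):
}).then(() => {
  // Resolve only once the zip has been fully written to disk.
  return new Promise((resolve, reject) => {
    zip.generateNodeStream({type: 'nodebuffer', streamFiles: true})
      .pipe(fs.createWriteStream(tempPath))
      .on('finish', resolve)
      .on('error', reject);
  });
}).then(() => {
  // Returned from the then callback, so getSignedUrl runs only after the upload completes.
  return bucket.upload(tempPath, { destination: destinazionePath });
}).then(() => {
  return bucket.file(destinazionePath).getSignedUrl({ action: 'read', expires: '03-09-2391' });
})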
