Asynchronous method in Node.js is not working

This is the simplest async code you can imagine, but I can't figure out why it is not working.
This is my code:
const fs = require('fs');
fs.readdirSync('./', function (err, files) {
  if (err)
    console.log('Error!!', err);
  else
    console.log("Result!!", files);
});
This is my terminal:
% node main.js
%
Literally nothing happens...

"readdirSync" is a synchronized function and you are using it as async style, use "readdir" instead or just get result from function return https://www.geeksforgeeks.org/node-js-fs-readdirsync-method/

Use fs.readdir() or fs.readdirSync() to read the contents of a directory.
This piece of code reads the content of a folder, both files and subfolders, and returns their relative path:
const fs = require('fs')
const folderPath = '/Users/joe'
const files = fs.readdirSync(folderPath) // an array of file and folder names

Related

Can't find the file I created with fs.writeFile

I am trying to write a file in Node.js using fs.writeFile. I use the following code:
const fs = require('filer');
const jsonString = JSON.stringify(myObj)
fs.writeFile('/myFile.txt', jsonString, function (err) {
  if (err) throw err;
  console.log('Saved!');
});
I am sure the file is created, because I can read it back with fs.readFile using the same path, but I cannot find it on the disk using Windows search. From what I understand, if I change the localhost port it saves the files in another location. I already tried process.cwd(), but it didn't work.
I really appreciate it if someone could help.
Try using __dirname instead of process.cwd():
const fs = require('fs');
const path = require('path');
const filePath = path.join(__dirname, '/myFile.txt');
console.log(filePath);
const jsonString = JSON.stringify({ name: "kalo" })
fs.writeFile(filePath, jsonString, (err) => {
  if (err) throw err;
  console.log('The file has been saved!');
});
Also, I would like to know why you are using 'filer' instead of the default fs module.
fs is a native module that provides file handling in Node.js, so you don't need to install it separately. This code works perfectly and prints the absolute location of the file as well. Just run this code; if it doesn't work, I think you should reinstall Node.js. I have updated the answer. You can also use the fs.writeFileSync method.
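For reference, a minimal sketch of that synchronous variant, using the same file path as above:

const fs = require('fs');
const path = require('path');

const filePath = path.join(__dirname, 'myFile.txt');
// writeFileSync throws on error instead of taking a callback
fs.writeFileSync(filePath, JSON.stringify({ name: "kalo" }));
console.log('The file has been saved!');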
From the documentation: "String form paths are interpreted as UTF-8 character sequences identifying the absolute or relative filename. Relative paths will be resolved relative to the current working directory as determined by calling process.cwd()."
So in order to determine your working directory (i.e. where fs creates files by default), call (works for me):
console.log(process.cwd());
Then if you would like to change your working directory, you can call (works for me as well):
process.chdir('path_to_new_directory');
Path can be relative or absolute.
This is also from the documentation: "The process.chdir() method changes the current working directory of the Node.js process or throws an exception if doing so fails (for instance, if the specified directory does not exist)."
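Putting the two together, a small sketch of how the working directory determines where a relative path lands (the /tmp path is only an illustration):

const fs = require('fs');

console.log(process.cwd()); // relative paths currently resolve here

// This file is created inside the directory printed above
fs.writeFileSync('myFile.txt', 'hello');

// After changing the working directory, relative paths resolve there instead
process.chdir('/tmp');
fs.writeFileSync('myFile.txt', 'hello'); // now created as /tmp/myFile.txt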

fs.createReadStream getting a different path than what's being passed in

I'm using NodeJS on a VM. One part of it serves up pages, and another part is an API. I've run into a problem where fs.createReadStream attempts to access a different path than what is being passed into the function. I made a small test server to see if something else in the server was affecting path usage, but it's happening on my test server as well. First, here's the code:
const fs = require('fs');
const path = require('path');
const csv = require('csv-parser');
const readCSV = (filename) => {
  console.log('READ CSV GOT ' + filename); // show me what you got
  return new Promise((resolve, reject) => {
    const arr = [];
    fs.createReadStream(filename)
      .pipe(csv())
      .on('data', row => {
        arr.push(row);
      })
      .on('error', err => {
        console.log(err);
      })
      .on('end', () => {
        resolve(arr);
      });
  });
}
// tried this:
// const dir = path.relative(
//   path.join('path', 'to', 'this', 'file'),
//   path.join('path', 'to', 'CONTENT.csv')
// );
// tried a literal relative path:
// const dir = '../data/CONTENT.csv';
// tried a literal absolute path:
// const dir = '/repo/directory/server/data/CONTENT.csv';
// tried an absolute path:
const dir = path.join(__dirname, 'data', 'CONTENT.csv');
const content = readCSV(dir)
.then(result => {console.log(result[0]);})
.catch(err => {console.log(err);});
...but any way I slice it, I get the following output:
READ CSV GOT /repo/directory/server/data/CONTENT.csv
throw er; // Unhandled 'error' event
^
Error: ENOENT: no such file or directory, open '/repo/directory/data/CONTENT.csv'
That is, fs.createReadStream somehow seems to be stripping the /server directory out of the path. I suppose I could hard-code the directory into the call to createReadStream, but I just want to know why this is happening.
Some extra: I'm stuck on node v8.11, can't go any higher. On the server itself, I believe I'm using older function(param) {...} instead of arrow functions -- but the behavior is exactly the same.
Please help!!
The code is working correctly. I think your file CONTENT.csv should be in the data folder, i.e. at "/repo/directory/data/CONTENT.csv".
I'm answering my own question because I found an answer, even though I'm not entirely sure why it works, and at least it's interesting. To the best of my estimation, it has something to do with the call stack and what Node.js identifies as the origin of the function call. I've got my server set up in an MVC pattern, so my main app.js is in the root dir and the function being called is in the /controllers folder, and I've been trying to do relative paths from that folder. I'm still not sure why absolute paths didn't work.
The call stack goes:
app.js:
app.use('/somepath', endpointRouter);
...then in endpointRouter.js:
router.get('/request/file', endpointController.getFile);
...then finally in endpointController.js:
const readCSV = filename => {
  // the code I shared
}
exports.getFile = (req, res, next) => {
  // code that calls readCSV(filename)
}
...and I believe that because Node views the chain as originating from app.js, it treats all relative paths as relative to app.js in my root folder. (More precisely, per the fs documentation, relative paths are resolved against process.cwd(), the directory the node process was launched from, not the directory of the module making the call.) Basically, when I switched to the single-dot relative path './data/CONTENT.csv', it worked with no issue.
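A small sketch of that resolution rule (the '..' segment is only an illustration; adjust it to wherever the module actually lives):

const path = require('path');

// fs resolves relative paths against process.cwd(), the directory
// `node app.js` was launched from, not the calling module's directory.
console.log(path.resolve('./data/CONTENT.csv'));
// -> <cwd>/data/CONTENT.csv

// To be independent of the launch directory, anchor on __dirname instead:
console.log(path.join(__dirname, '..', 'data', 'CONTENT.csv'));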

Create a duplicate file with fs streams and be able to read it

I'm currently trying to copy the content of a file into another one using Node.js, through the fs.createWriteStream and fs.createReadStream functions.
To be more specific, the file is a music sample that I would like to duplicate. I also expect the resulting file to be readable by a player, like any music or video. It's this last point that I can't manage to get working: the files are indeed duplicated, but the resulting file is not accepted by my player as readable, as if it were corrupted somehow.
I checked its content, and it doesn't seem to be a matter of programming logic, as the data of the original file has been correctly transposed into the copy. Here is my script, if you want to take a look.
const express = require('express')
const app = express()
const fs = require("fs")
var Promise = require("bluebird")
Promise.promisifyAll(fs)
const path = require('path')
const file1 = path.join(__dirname, 'sample1.wav') // The file to copy
const file2 = path.join(__dirname, 'sample2.wav') // The destination of the new file
app.use(async (req, res, next) => {
  let file1_stream = await fs.createReadStream(file1)
  let file2_stream = await fs.createWriteStream(file2)
  file2_stream.pipe(file2_stream)
  next()
})
.listen(8080)
I guess the operation is not as simple as copying a stream and injecting it with a pipe as shown above. If someone has any idea what I'm missing here, I'm all ears. Thanks in advance.
That code is triggering an error, which you're probably not handling correctly, since you're using an async middleware on express.
Error [ERR_STREAM_CANNOT_PIPE]: Cannot pipe, not readable
You have to use .pipe on the readable stream, not on the writable stream.
So the code should be:
file1_stream.pipe(file2_stream);
Also, you don't need to await fs.createWriteStream; it does nothing. Promisification works on callback APIs, but createWriteStream and createReadStream don't take a callback as an argument.
app.use((req, res, next) => {
  let readStream = fs.createReadStream(file1);
  let writeStream = fs.createWriteStream(file2);
  readStream.pipe(writeStream);
  // Call next once the file was actually copied
  writeStream.on('finish', next);
  writeStream.on('error', next); // go to express error handler
  readStream.on('error', next);
});
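Since the goal here is a plain duplicate, fs.copyFile (available since Node 8.5) is a simpler alternative that wires up the read and write for you; a minimal sketch:

const fs = require('fs');
const path = require('path');

const file1 = path.join(__dirname, 'sample1.wav');
const file2 = path.join(__dirname, 'sample2.wav');

// copyFile handles the underlying streams internally
fs.copyFile(file1, file2, (err) => {
  if (err) throw err;
  console.log('sample1.wav was copied to sample2.wav');
});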

Async move array of files in node

I have a folder with some images. Some of the images I want to move from this folder to another one. The other folder may not exist yet. I know about fs.rename, but I can't figure out how to pass it an array without losing the asynchronous behavior. All I have is something like:
let imagesArray = ['path1', 'path2' ... 'pathN']
for (const img of imagesArray) {
  fs.renameSync(oldPath + img, newPath + img)
}
How should I write this, and how do I make it asynchronous?
To use promises you will need to make fs.rename() return a promise instead of taking a callback. You can use the util module for this (you don't need to install it with npm):
const util = require('util');
const fs = require('fs');
const rename = util.promisify(fs.rename);
Now you can use Promise.all + Array.map to loop through the array asynchronously:
(async () => {
  await Promise.all(imagesArray.map(img => rename(oldPath + img, newPath + img)));
  // Do the stuff you need to do after renaming the files
})();
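Since the destination folder may not exist yet (as the question notes), you may also need to create it first. A sketch assuming Node 10.12+, where fs.promises and mkdir's recursive option are available:

const fs = require('fs').promises;
const path = require('path');

const moveImages = async (imagesArray, oldPath, newPath) => {
  // Create the destination folder (and any missing parents) if needed
  await fs.mkdir(newPath, { recursive: true });
  // Then rename all the files concurrently
  await Promise.all(
    imagesArray.map(img =>
      fs.rename(path.join(oldPath, img), path.join(newPath, img))
    )
  );
};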

Is there a more elegant way to read then write *the same file* with node js stream

I want to read a file, change it with through2, then write it back into the same file. Code like:
const rm = require('rimraf')
const through2 = require('through2')
const fs = require('graceful-fs')
// source file path
const replacementPath = `./static/projects/${destPath}/index.html`
// temp file path
const tempfilePath = `./static/projects/${destPath}/tempfile.html`
// read source file then write into temp file
await promiseReplace(replacementPath, tempfilePath)
// del the source file
rm.sync(replacementPath)
// rename the temp file name to source file name
fs.renameSync(tempfilePath, replacementPath)
// del the temp file
rm.sync(tempfilePath)
// promiseify readStream and writeStream
function promiseReplace (readfile, writefile) {
  return new Promise((res, rej) => {
    fs.createReadStream(readfile)
      .pipe(through2.obj(function (chunk, encoding, done) {
        const replaced = chunk.toString().replace(/id="wrap"/g, 'dududud')
        done(null, replaced)
      }))
      .pipe(fs.createWriteStream(writefile))
      .on('finish', () => {
        console.log('replace done')
        res()
      })
      .on('error', (err) => {
        console.log(err)
        rej(err)
      })
  })
}
The above code works, but I want to know whether I can make it more elegant.
I also tried temp libs like node-temp; unfortunately, they cannot readStream and writeStream into the same file either, and I opened an issue about this.
So if anyone knows a better way to do this, please tell me, thank you very much.
You can make the code more elegant by getting rid of unnecessary dependencies and using the newer simplified constructor for streams.
const fs = require('fs');
const util = require('util');
const stream = require('stream');
const tempWrite = require('temp-write');

const rename = util.promisify(fs.rename);

const goat2llama = async (filePath) => {
  const str = fs.createReadStream(filePath, 'utf8')
    .pipe(new stream.Transform({
      decodeStrings : false,
      transform(chunk, encoding, done) {
        done(null, chunk.replace(/goat/g, 'llama'));
      }
    }));
  const tempPath = await tempWrite(str);
  await rename(tempPath, filePath);
};
Tests
AVA tests to prove that it works:
import fs from 'fs';
import path from 'path';
import util from 'util';
import test from 'ava';
import mkdirtemp from 'mkdirtemp';
import goat2llama from '.';

const writeFile = util.promisify(fs.writeFile);
const readFile = util.promisify(fs.readFile);

const fixture = async (content) => {
  const dir = await mkdirtemp();
  const fixturePath = path.join(dir, 'fixture.txt');
  await writeFile(fixturePath, content);
  return fixturePath;
};

test('goat2llama()', async (t) => {
  const filePath = await fixture('I like goats and frogs, but goats the best');
  await goat2llama(filePath);
  t.is(await readFile(filePath, 'utf8'), 'I like llamas and frogs, but llamas the best');
});
A few things about the changes:
Through2 is not really needed anymore. It used to be a pain to set up passthrough or transform streams properly, but that is not the case anymore thanks to the simplified construction API.
You probably don't need graceful-fs, either. Unless you are doing a lot of concurrent disk I/O, EMFILE is not usually a problem, especially these days as Node has gotten smarter about file descriptors. But that library does help with temporary errors caused by antivirus software on Windows, if that is a problem for you.
You definitely do not need rimraf for this. You only need fs.rename(). It is similar to mv on the command line, with a few nuances that make it distinct, but the differences are not super important here. The point is there will be nothing at the temporary path after you rename the file that was there.
I used temp-write because it generates a secure random filepath for you and puts it in the OS temp directory (which automatically gets cleaned up now and then), plus it handles converting the stream to a Promise for you and takes care of some edge cases around errors. Disclosure: I wrote the streams implementation in temp-write. :)
Overall, this is a decent improvement. However, there remains the boundary problem discussed in the comments: a match like 'goat' can be split across two chunks, so a chunk-by-chunk replace can miss it. Luckily, you are not the first person to encounter this problem! I wouldn't call the actual solution particularly elegant, certainly not if you implement it yourself. But replacestream is here to help you.
const fs = require('fs');
const util = require('util');
const tempWrite = require('temp-write');
const replaceStream = require('replacestream');

const rename = util.promisify(fs.rename);

const goat2llama = async (filePath) => {
  const str = fs.createReadStream(filePath, 'utf8')
    .pipe(replaceStream('goat', 'llama'));
  const tempPath = await tempWrite(str);
  await rename(tempPath, filePath);
};
Also...
I do not like temp files
Indeed, temp files are often bad. However, in this case, the temp file is managed by a well-designed library and stored in a secure, out-of-the-way location. There is virtually no chance of conflicting with other processes. And even if the rename() fails somehow, the file will be cleaned up by the OS.
That said, you can avoid temp files altogether by using fs.readFile() and fs.writeFile() instead of streaming. That also makes text replacement much easier, since you do not have to worry about chunk boundaries. You have to choose one approach or the other; however, for very big files, streaming may be the only option, aside from manually chunking the file.
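A sketch of that non-streaming variant, reusing the promisify pattern from above:

const fs = require('fs');
const util = require('util');

const readFile = util.promisify(fs.readFile);
const writeFile = util.promisify(fs.writeFile);

const goat2llama = async (filePath) => {
  // The whole file is in memory, so no chunk boundary can split a match
  const content = await readFile(filePath, 'utf8');
  await writeFile(filePath, content.replace(/goat/g, 'llama'));
};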
Streams are useless in this situation, because they return chunks of the file that can break the string you're searching for. You could use streams, then merge all these chunks to get the content, then replace the string you need, but that would be longer code provoking just one question: why read the file in chunks if you don't use them?
The shortest way to achieve what you want is:
let fileContent = fs.readFileSync('file_name.html', 'utf8')
let replaced = fileContent.replace(/id="wrap"/g, 'dududud')
fs.writeFileSync('file_name.html', replaced)
All these functions are synchronous, so you don't have to promisify them.
