Async move array of files in node - node.js

I have a folder with some images, and I want to move some of them to another folder. This other folder may not exist yet. I know about fs.rename, but I can't figure out how to apply it to an array of files without losing asynchronicity. All I can come up with is:
const imagesArray = ['path1', 'path2', ..., 'pathN'];
for (const img of imagesArray) {
  fs.renameSync(oldPath + img, newPath + img);
}
How should I do this, and how can I make it asynchronous?

To use promises, you will need to make fs.rename() return a promise instead of taking a callback. You can use the built-in util module for this (you don't need to install it with npm):
const util = require('util');
const fs = require('fs');
const rename = util.promisify(fs.rename);
Now you can use Promise.all together with Array.prototype.map to rename all the files concurrently inside an async function:
(async () => {
  await Promise.all(imagesArray.map(img => rename(oldPath + img, newPath + img)));
  // Do the stuff you need to do after renaming the files
})();
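Since the question notes that the destination folder may not exist yet, here is a minimal sketch of the same idea that creates the folder first (assumes Node 10.12+ for the promise-based fs API and recursive mkdir; oldDir and newDir are assumed to be directory paths):

const fs = require('fs');
const path = require('path');

async function moveImages(imagesArray, oldDir, newDir) {
  // Create the destination folder if it does not exist yet
  await fs.promises.mkdir(newDir, { recursive: true });
  // Rename all the files concurrently
  await Promise.all(
    imagesArray.map(img =>
      fs.promises.rename(path.join(oldDir, img), path.join(newDir, img))
    )
  );
}

// Hypothetical usage:
moveImages(imagesArray, oldPath, newPath).catch(console.error);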

Related

How to create an import/export script using Node.JS?

I'm looking to import/export a list of files in a directory through an index.js file in the same directory.
For example, I have 2 files in a directory, admin.js and users.js, and I am looking to require and export them in the index.js like so:
module.exports = {
  admin: require("./admin"),
  users: require("./users"),
};
The script I have come up with looks like this, but it is not working and gives me an error:
fs.readdirSync(__dirname, (files) => {
  files.forEach((file) => {
    module.exports[file] = require(`./${file}`);
  });
});
How can I improve this script to make it work?
Thank you!
[Update - 2022 December 18]
Found a solution based on sequelize's models/index.js. This will pretty much require and export your files and folders; feel free to use and modify it:
const fs = require('fs')
const path = require('path')
const basename = path.basename(__filename)
const controllers = {}

fs.readdirSync(__dirname)
  // Skip hidden files and this index file itself
  .filter((folder) => {
    return folder.indexOf('.') !== 0 && folder !== basename
  })
  .forEach((folder) => {
    // Assumes each required module exports something with a `name` property
    const controller = require(path.join(__dirname, folder))
    controllers[controller.name] = controller
  })

module.exports = controllers
fs.readdirSync() does NOT accept a callback. It directly returns the result:
const files = fs.readdirSync(__dirname);
for (let file of files) {
  module.exports[file] = require(`./${file}`);
}
Note, the future of the JavaScript language is using import and export with statically declared module names instead of require() and module.exports, and this structure will generally not work with the newer way of doing things. So, if you expect to eventually move to the newer ESM modules, you may not want to bake in this type of architecture.
There is a dynamic import in ESM modules, but it's asynchronous (returns a promise that you have to wait for).
Also, note that this will attempt to reload your index.js file containing this code. That might not be harmful, but may not be your intention.
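For reference, a minimal ESM sketch of the same idea (a hypothetical index.mjs; assumes Node 14.8+ for top-level await) that also skips the index file itself:

// index.mjs (hypothetical): dynamic import() is asynchronous,
// so each module is awaited at load time via top-level await
import fs from 'fs';
import path from 'path';
import { fileURLToPath, pathToFileURL } from 'url';

const self = fileURLToPath(import.meta.url);
const dir = path.dirname(self);

const controllers = {};
for (const file of fs.readdirSync(dir)) {
  // Skip hidden files, non-JS entries, and this index file itself
  if (file.startsWith('.') || file === path.basename(self) || !/\.(mjs|js)$/.test(file)) continue;
  const mod = await import(pathToFileURL(path.join(dir, file)).href);
  controllers[path.parse(file).name] = mod.default ?? mod;
}

export default controllers;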

Asynchronous method in node js is not working

This is the simplest async code you can imagine, but I can't figure out why it is not working.
This is my code:
const fs = require('fs');
fs.readdirSync('./', function (err, files) {
  if (err)
    console.log('Error!!', err);
  else
    console.log("Result!!", files);
});
This is my terminal:
% node main.js
%
Literally nothing happens...
"readdirSync" is a synchronized function and you are using it as async style, use "readdir" instead or just get result from function return https://www.geeksforgeeks.org/node-js-fs-readdirsync-method/
Use fs.readdir() or fs.readdirSync() to read the contents of a directory.
This piece of code reads the content of a folder, both files and subfolders, and returns their names (relative to the folder):
const fs = require('fs')
const folderPath = '/Users/joe'
const files = fs.readdirSync(folderPath) // e.g. ['file1.txt', 'subfolder']
console.log(files)
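If you want it asynchronous without callbacks, a sketch using the promise-based API (assumes Node 10+):

const fsp = require('fs').promises;
const folderPath = '/Users/joe';

(async () => {
  const files = await fsp.readdir(folderPath);
  console.log(files);
})();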

fs.createReadStream getting a different path than what's being passed in

I'm using NodeJS on a VM. One part of it serves up pages, and another part is an API. I've run into a problem where fs.createReadStream appears to access a different path than what is being passed into the function. I made a small test server to see if something else in the server was affecting path usage, but it's happening on my test server as well. First, here's the code:
const fs = require('fs');
const path = require('path');
const csv = require('csv-parser');
const readCSV = (filename) => {
  console.log('READ CSV GOT ' + filename); // show me what you got
  return new Promise((resolve, reject) => {
    const arr = [];
    fs.createReadStream(filename)
      .pipe(csv())
      .on('data', row => {
        arr.push(row);
      })
      .on('error', err => {
        console.log(err);
        reject(err); // reject so the caller's .catch fires
      })
      .on('end', () => {
        resolve(arr);
      });
  });
};
// tried this:
// const dir = path.relative(
//   path.join('path', 'to', 'this', 'file'),
//   path.join('path', 'to', 'CONTENT.csv')
// );
// tried a literal relative path:
// const dir = '../data/CONTENT.csv';
// tried a literal absolute path:
// const dir = '/repo/directory/server/data/CONTENT.csv';
// tried an absolute path:
const dir = path.join(__dirname, 'data', 'CONTENT.csv');
const content = readCSV(dir)
.then(result => {console.log(result[0]);})
.catch(err => {console.log(err);});
...but any way I slice it, I get the following output:
READCSV GOT /repo/directory/server/data/CONTENT.csv
throw er; // Unhandled 'error' event
^
Error: ENOENT: no such file or directory, open '/repo/directory/data/CONTENT.csv'
i.e., is fs.createReadStream somehow stripping the /server segment out of the path, for some reason? I suppose I could hard-code the directory into the call to createReadStream, but I just want to know why this is happening.
Some extra context: I'm stuck on Node v8.11 and can't go any higher. On the server itself I'm using the older function (param) {...} syntax instead of arrow functions, but the behavior is exactly the same.
Please help!!
The code is working correctly.
I think your file CONTENT.csv should be in the data folder, i.e. at "/repo/directory/data/CONTENT.csv".
I'm answering my own question because I found an answer, even though I'm not entirely sure why it works, and at least it's interesting. To the best of my estimation, it has something to do with the call stack and what Node identifies as the origin of the function call. I've got my server set up in an MVC pattern, so my main app.js is in the root dir and the function being called is in the /controllers folder, and I've been trying to use relative paths from that folder. I'm still not sure why absolute paths didn't work.
The call stack goes:
app.js:
app.use('/somepath', endpointRouter);
...then in endpointRouter.js:
router.get('/request/file', endpointController.getFile);
...then finally in endpointController.js:
const readCSV = filename => {
  // the code I shared
};
exports.getFile = (req, res, next) => {
  // code that calls readCSV(filename)
};
...and I believe the key is that Node resolves relative fs paths against process.cwd() (the directory the process was launched from, here my root folder containing app.js), not against the file making the call, so all my relative paths were treated as relative to the root folder. Basically, when I switched to the super unintuitive single-dot relative path './data/CONTENT.csv', it worked with no issue.
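A short illustration of that distinction (a sketch; the paths shown are hypothetical):

const path = require('path');

// Relative fs paths resolve against process.cwd(), i.e. where `node app.js`
// was launched, NOT against the file that calls fs.
console.log(process.cwd()); // e.g. /repo/directory
console.log(__dirname);     // e.g. /repo/directory/server/controllers

// Building the path from __dirname makes it independent of the launch directory:
const csvPath = path.join(__dirname, '..', 'data', 'CONTENT.csv');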

Create a duplicated file with fs streams and be able to read it

I'm currently trying to copy the content of a file into another one using Node.js through the fs.createWriteStream and fs.createReadStream functions.
To be more specific, the file is a music sample that I would like to duplicate. Also, I expect the resulting file to be readable by a player, like any music or video. It's this last point that I can't manage to achieve. The files are indeed duplicated, but the resulting file is not accepted by my player as a readable file, as if it were corrupted somehow.
I checked its content, and it doesn't seem to be a matter of programming logic, as the data of the original file has been correctly transposed into the copy. Here is my script, if you want to take a look:
const express = require('express')
const app = express()
const fs = require("fs")
var Promise = require("bluebird")
Promise.promisifyAll(fs)
const path = require('path')

const file1 = path.join(__dirname, 'sample1.wav') // The file to copy
const file2 = path.join(__dirname, 'sample2.wav') // The destination of the new file

app.use(async (req, res, next) => {
  let file1_stream = await fs.createReadStream(file1)
  let file2_stream = await fs.createWriteStream(file2)
  file2_stream.pipe(file2_stream)
  next()
})
.listen(8080)
I guess the operation is not as simple as just copying a stream and injecting it with a pipe like shown above. If someone has any idea what I am missing here, I am all ears. Thanks in advance.
That code is triggering an error, which you're probably not handling correctly, since you're using an async middleware in Express:
Error [ERR_STREAM_CANNOT_PIPE]: Cannot pipe, not readable
You have to call .pipe on the readable stream, not on the writable stream.
So the code should be:
file1_stream.pipe(file2_stream);
Also, you don't need to await fs.createWriteStream; it does nothing here. Promisification works on callback APIs, but createWriteStream and createReadStream don't take a callback as an argument.
app.use((req, res, next) => {
  let readStream = fs.createReadStream(file1);
  let writeStream = fs.createWriteStream(file2);
  readStream.pipe(writeStream);
  // Call next once the file was actually copied
  writeStream.on('finish', next);
  writeStream.on('error', next); // go to the Express error handler
  readStream.on('error', next);
});
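As a side note, on Node 10+ the same wiring can be expressed with stream.pipeline, which forwards errors from either stream into a single callback (a sketch, not part of the original answer; file1 and file2 are as above):

const { pipeline } = require('stream');

app.use((req, res, next) => {
  pipeline(
    fs.createReadStream(file1),
    fs.createWriteStream(file2),
    next // called with an error, or with no argument on success
  );
});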

Is there a more elegant way to read then write *the same file* with node js stream

I want to read a file, change it with through2, then write it back into the same file. The code looks like this:
const rm = require('rimraf')
const through2 = require('through2')
const fs = require('graceful-fs')

// source file path
const replacementPath = `./static/projects/${destPath}/index.html`
// temp file path
const tempfilePath = `./static/projects/${destPath}/tempfile.html`

// read source file then write into temp file
await promiseReplace(replacementPath, tempfilePath)
// del the source file
rm.sync(replacementPath)
// rename the temp file name to source file name
fs.renameSync(tempfilePath, replacementPath)
// del the temp file
rm.sync(tempfilePath)

// promisify readStream and writeStream
function promiseReplace (readfile, writefile) {
  return new Promise((res, rej) => {
    fs.createReadStream(readfile)
      .pipe(through2.obj(function (chunk, encoding, done) {
        const replaced = chunk.toString().replace(/id="wrap"/g, 'dududud')
        done(null, replaced)
      }))
      .pipe(fs.createWriteStream(writefile))
      .on('finish', () => {
        console.log('replace done')
        res()
      })
      .on('error', (err) => {
        console.log(err)
        rej(err)
      })
  })
}
The above code works, but I want to know whether I can make it more elegant.
I also tried temp-file libraries like node-temp; unfortunately, they cannot read-stream and write-stream into the same file either, and I opened an issue about this.
So if anyone knows a better way to do this, please tell me. Thank you very much.
You can make the code more elegant by getting rid of unnecessary dependencies and using the newer simplified constructor for streams.
const fs = require('fs');
const util = require('util');
const stream = require('stream');
const tempWrite = require('temp-write');

const rename = util.promisify(fs.rename);

const goat2llama = async (filePath) => {
  const str = fs.createReadStream(filePath, 'utf8')
    .pipe(new stream.Transform({
      decodeStrings : false,
      transform(chunk, encoding, done) {
        done(null, chunk.replace(/goat/g, 'llama'));
      }
    }));
  const tempPath = await tempWrite(str);
  await rename(tempPath, filePath);
};
Tests
AVA tests to prove that it works:
import fs from 'fs';
import path from 'path';
import util from 'util';
import test from 'ava';
import mkdirtemp from 'mkdirtemp';
import goat2llama from '.';

const writeFile = util.promisify(fs.writeFile);
const readFile = util.promisify(fs.readFile);

const fixture = async (content) => {
  const dir = await mkdirtemp();
  const fixturePath = path.join(dir, 'fixture.txt');
  await writeFile(fixturePath, content);
  return fixturePath;
};

test('goat2llama()', async (t) => {
  const filePath = await fixture('I like goats and frogs, but goats the best');
  await goat2llama(filePath);
  t.is(await readFile(filePath, 'utf8'), 'I like llamas and frogs, but llamas the best');
});
A few things about the changes:
Through2 is not really needed anymore. It used to be a pain to set up passthrough or transform streams properly, but that is not the case anymore thanks to the simplified construction API.
You probably don't need graceful-fs, either. Unless you are doing a lot of concurrent disk I/O, EMFILE is not usually a problem, especially these days as Node has gotten smarter about file descriptors. But that library does help with temporary errors caused by antivirus software on Windows, if that is a problem for you.
You definitely do not need rimraf for this. You only need fs.rename(). It is similar to mv on the command line, with a few nuances that make it distinct, but the differences are not super important here. The point is there will be nothing at the temporary path after you rename the file that was there.
I used temp-write because it generates a secure random filepath for you and puts it in the OS temp directory (which automatically gets cleaned up now and then), plus it handles converting the stream to a Promise for you and takes care of some edge cases around errors. Disclosure: I wrote the streams implementation in temp-write. :)
Overall, this is a decent improvement. However, there remains the boundary problem discussed in the comments. Luckily, you are not the first person to encounter this problem! I wouldn't call the actual solution particularly elegant, certainly not if you implement it yourself. But replacestream is here to help you.
const fs = require('fs');
const util = require('util');
const tempWrite = require('temp-write');
const replaceStream = require('replacestream');

const rename = util.promisify(fs.rename);

const goat2llama = async (filePath) => {
  const str = fs.createReadStream(filePath, 'utf8')
    .pipe(replaceStream('goat', 'llama'));
  const tempPath = await tempWrite(str);
  await rename(tempPath, filePath);
};
Also...
I do not like temp files
Indeed, temp files are often bad. However, in this case, the temp file is managed by a well-designed library and stored in a secure, out-of-the-way location. There is virtually no chance of conflicting with other processes. And even if the rename() fails somehow, the file will be cleaned up by the OS.
That said, you can avoid temp files altogether by using fs.readFile() and fs.writeFile() instead of streaming. The former also makes text replacement much easier, since you do not have to worry about chunk boundaries. You have to choose one approach or the other; for very big files, however, streaming may be the only option, aside from manually chunking the file.
Streams are useless in this situation, because they return chunks of the file that can break the string you're searching for. You could use streams, then merge all the chunks to get the content, then replace the string you need, but that would be longer code that provokes just one question: why do you read the file in chunks if you don't use them?
The shortest way to achieve what you want is:
let fileContent = fs.readFileSync('file_name.html', 'utf8')
let replaced = fileContent.replace(/id="wrap"/g, 'dududud')
fs.writeFileSync('file_name.html', replaced)
All these functions are synchronous, so you don't have to promisify them.
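If you would rather keep it non-blocking, a sketch of the same idea with the promise-based API (assumes Node 10+; replaceInFile is a hypothetical helper name):

const fsp = require('fs').promises;

async function replaceInFile (file) {
  const content = await fsp.readFile(file, 'utf8');
  await fsp.writeFile(file, content.replace(/id="wrap"/g, 'dududud'));
}

replaceInFile('file_name.html').catch(console.error);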
