How to delete a specific file inside a nested folder - node.js

I have been trying to delete a specific file in a nested directory.
import Express from 'express';
import { baseURL } from '..';
import fs from 'fs';
import path from 'path';
import { GuestFile } from '../models/guest-file';
export const GuestDelete = async (request: Express.Request, response: Express.Response) => {
  const { identifier } = request.params;
  const find_file = await GuestFile.findOne({ identifier });
  if (!find_file) {
    return response.status(404).json({ error: 'File not found', success: false });
  }
  const local_file_path = find_file.file_url?.split(`${baseURL}`).pop();
  const file_name = local_file_path?.split('/')?.splice(-1)[0];
  // Here is where the problem is *********
  fs.readdirSync(path.join(__dirname + `/uploads/${find_file.type}s/`)).find((file) => console.log(file));
  console.log('file_name', file_name);
  response.status(204).json({ success: true, identifier });
};
// deleteFileAfterDelay();
So I'm trying to first delete the file record from the database, then delete the file locally from uploads. As the directory structure below shows, the uploads folder has subdirectories. I wanted to map through all the files in the subdirectory and, if file_name matches one of them, call fs.unlinkSync on it, but I keep getting errors. The path ends up being C:\Users\essel_r\Desktop\everfile\backend_api\src\controllers\uploads\images\
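In other words, what I'm aiming for is roughly this (just a sketch; the process.cwd()-based uploads path is an assumption about my layout, and file_name / find_file come from the code above):

import fs from 'fs';
import path from 'path';

// Build the absolute path from the project root instead of from __dirname,
// since __dirname points at the compiled controllers folder.
const uploadsDir = path.join(process.cwd(), 'src', 'uploads', `${find_file.type}s`);
const target = path.join(uploadsDir, file_name);

fs.unlink(target, (error) => {
  // treat "already gone" (ENOENT) as success
  if (error && error.code !== 'ENOENT') {
    console.error('Could not delete file', error);
  }
});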
This is what the directory looks like: [image of the directory structure]
I tried using fs.unlinkSync() and it didn't work:
fs.readdirSync('/uploads/${find_file.type}/').find((file) => file === file_name && fs.unlinkSync(file_name));
I also tried using the fs.access() to check if the file exists, but that too didn't work.

I tried running your code. I changed import fs from 'fs' to const fs = require('fs'), and import path from 'path' to const path = require('path').
And in my case, it works well.
//import fs from 'fs';
const fs = require('fs')
// import path from 'path';
const path = require('path')
fs.readdirSync(path.join(__dirname + `/uploads/subDir`)).find((file) =>
console.log(file)
);

After trying a lot last night and this morning, this is how I did it; I'll explain it below.
import fs from 'fs'
import path from 'path'
const directory = path.join(__dirname, '..', 'src', 'uploads', 'images')
const file_to_delete = '1676410030129-screen.webp'
The src, uploads, and images directories are nested in my folder structure:
src
  - index.ts
  - controllers
  - models
  - uploads
    - images
      - 1676410030129-screen.webp
    - audio
    - videos
Code:
fs.readdir(directory, (error, files) => {
  if (error) throw new Error('Could not read directory');
  files.forEach((file) => {
    const file_path = path.join(directory, file);
    console.log(file_path);
    fs.stat(file_path, (error, stat) => {
      if (error) throw new Error('File does not exist');
      if (stat.isDirectory()) {
        console.log('The file is actually a directory');
      } else if (file === file_to_delete) {
        fs.unlink(file_path, (error) => {
          if (error) throw new Error('Could not delete file');
          console.log(`Deleted ${file_path}`);
        });
      }
    });
  });
});
Explanation:
We first read the directory we want; if we encounter an error we throw it, otherwise we get the files as an array.
For each file we join the directory to the file name, so we get something like "C:\Users\..\Desktop\..\..\src\uploads\images\1676410030129-screen.webp".
Then we use fs.stat to check that the path exists; if it doesn't, we throw an error. We also check whether the path is actually a directory with stat.isDirectory(): if it is, we log 'The file is actually a directory'. Otherwise we check whether the file name is strictly equal to file_to_delete, and if so we delete the file with fs.unlink, throwing an error if the deletion fails.
Whole Code: index.ts
import fs from 'fs'
import path from 'path'
const directory = path.join(__dirname, '..', 'src', 'uploads', 'images')
const file_to_delete = '1676410030129-screen.webp'
fs.readdir(directory, (error, files) => {
  if (error) throw new Error('Could not read directory');
  files.forEach((file) => {
    const file_path = path.join(directory, file);
    console.log(file_path);
    fs.stat(file_path, (error, stat) => {
      if (error) throw new Error('File does not exist');
      if (stat.isDirectory()) {
        console.log('The file is actually a directory');
      } else if (file === file_to_delete) {
        fs.unlink(file_path, (error) => {
          if (error) throw new Error('Could not delete file');
          console.log(`Deleted ${file_path}`);
        });
      }
    });
  });
});
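For comparison, the same logic can be written with the promise-based fs API and async/await. This is just a sketch of an alternative (it assumes Node 14+ so that 'fs/promises' is available), not the code I actually ran:

import { readdir, stat, unlink } from 'fs/promises';
import path from 'path';

async function deleteFromDirectory(directory, file_to_delete) {
  const files = await readdir(directory);
  for (const file of files) {
    const file_path = path.join(directory, file);
    const stats = await stat(file_path);
    if (!stats.isDirectory() && file === file_to_delete) {
      await unlink(file_path);
      console.log(`Deleted ${file_path}`);
    }
  }
}

// deleteFromDirectory(directory, file_to_delete).catch(console.error);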

Related

Why return doesn't work in NodeJS/Electron

I have a problem with my NodeJS script.
Basically I want to add every file path to an array then display it in the bash console.
But when I try, it gives me undefined.
Here is my code:
const { app, BrowserWindow } = require('electron');
const fs = require('fs');
const path = require('path');
function repList(){
  var directoryPath = path.join('Q:/Programmes');
  let forbiddenDir = [".VERSIONS", "INSTALL"];

  fs.readdir(directoryPath, function (err, files) { //Scans the files in the directory
    if (err) {
      return console.log('Unable to scan directory: ' + err);
    }
    else {
      files.forEach(function (file){ //Loops through each file
        var name = directoryPath+"/"+file;
        if(forbiddenDir.includes(file)){ //Don't accept the file if invalid
          console.log(`${file} is a forbidden name.`);
        }
        else{ //Filename is valid
          fs.stat(name, (error, stats) => {
            if (stats.isDirectory()) { //If directory...
              tabRep.push(name); //... add the full filename path to the tabRep array
            }
            else if (error) {
              console.error(error);
            }
          });
        }
      }); //End of loop
      return tabRep; //<-- THIS RETURN DOESN'T WORK
    }
  });
}

app.whenReady().then(() => {
  console.log(repList());
})
It gives me this output instead of tabRep's elements:
undefined
.VERSIONS is a forbidden name.
INSTALL is a forbidden name.
Inside the Programmes folder :
\ Programmes
\ .VERSIONS
\ Folder1
\ File1
\ Folder2
\ INSTALL
\ FolderN
\ FileN
If anyone could give me some help, it would be really appreciated.
fs.readdir() expects a callback function as second parameter (you passed that). The return you point at is the return of the callback function - not the return of the repList() function. Please read about async functions and callbacks in JavaScript to fully understand this concept, as this is very important in JavaScript. Also, your function repList() does not return anything! And declaration of variable tabRep is missing I think.
For now, use the synchronous variant, fs.readdirSync(), like so:
const { app, BrowserWindow } = require('electron');
const fs = require('fs');
const path = require('path');

function repList(){
  var directoryPath = path.join('Q:/Programmes');
  let forbiddenDir = [".VERSIONS", "INSTALL"];

  const files = fs.readdirSync(directoryPath)
  const tabRep = []

  files.forEach(function (file){ //Loops through each file
    var name = directoryPath+"/"+file;
    if(forbiddenDir.includes(file)){ //Don't accept the file if invalid
      console.log(`${file} is a forbidden name.`);
    }
    else{ //Filename is valid
      const stats = fs.statSync(name)
      if (stats.isDirectory()) { //If directory...
        tabRep.push(name); //... add the full filename path to the tabRep array
      }
    }
  }); //End of loop

  return tabRep; //<-- THIS RETURN DOES WORK NOW since the function now executes synchronously.
}
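If you want to stay asynchronous instead, here is a sketch of the same logic wrapped in an async function so the caller can await the result (same directoryPath and forbiddenDir as above, untested):

const fs = require('fs').promises;
const path = require('path');

async function repListAsync() {
  const directoryPath = path.join('Q:/Programmes');
  const forbiddenDir = [".VERSIONS", "INSTALL"];
  const tabRep = [];
  const files = await fs.readdir(directoryPath);
  for (const file of files) {
    if (forbiddenDir.includes(file)) {
      console.log(`${file} is a forbidden name.`);
      continue;
    }
    const name = directoryPath + "/" + file;
    const stats = await fs.stat(name);
    if (stats.isDirectory()) {
      tabRep.push(name);
    }
  }
  return tabRep; // resolved value of the promise returned by this async function
}

// Usage: repListAsync().then((tabRep) => console.log(tabRep));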

how to make formidable not save to var/folders on nodejs and express app

I'm using formidable to parse incoming files and store them on AWS S3.
While debugging the code I found out that formidable first saves the file to disk under /var/folders/, and over time unnecessary files stack up on disk, which could become a big problem later.
It was careless of me to use code without fully understanding it, and now I have to figure out how to either remove the parsed file after saving it to S3, or save it to S3 without storing it on disk at all.
But the question is: how do I do that?
I would appreciate it if someone could point me in the right direction.
This is how I handle the files:
import formidable, { Files, Fields } from 'formidable';
const form = new formidable.IncomingForm();
form.parse(req, async (err: any, fields: Fields, files: Files) => {
  let uploadUrl = await util
    .uploadToS3({
      file: files.uploadFile,
      pathName: 'myPathName/inS3',
      fileKeyName: 'file',
    })
    .catch((err) => console.log('S3 error =>', err));
});
This is how I solved this problem:
When I parse the incoming multipart form data I have access to all the details of the files, because they have already been parsed and saved to the local disk on the server/my computer. So, using the path given to me by formidable, I unlink/remove that file with Node's built-in fs.unlink function. Of course, I only remove the file after saving it to AWS S3.
This is the code:
import fs from 'fs';
import formidable, { Files, Fields } from 'formidable';
const form = new formidable.IncomingForm();
form.multiples = true;
form.parse(req, async (err: any, fields: Fields, files: Files) => {
  const pathArray = [];
  try {
    const s3Url = await util.uploadToS3(files);
    // do something with the s3Url
    pathArray.push(files.uploadFileName.path);
  } catch(error) {
    console.log(error)
  } finally {
    pathArray.forEach((element: string) => {
      fs.unlink(element, (err: any) => {
        if (err) console.error('error:', err);
      });
    });
  }
});
I also found a solution, which you can take a look at here, but due to the architecture I found it slightly hard to implement without changing my original code (or let's just say I didn't fully understand the given implementation).
I think I found it. According to the docs (see options.fileWriteStreamHandler): "you need to have a function that will return an instance of a Writable stream that will receive the uploaded file data. With this option, you can have any custom behavior regarding where the uploaded file data will be streamed for. If you are looking to write the file uploaded in other types of cloud storages (AWS S3, Azure blob storage, Google cloud storage) or private file storage, this is the option you're looking for. When this option is defined the default behavior of writing the file in the host machine file system is lost."
const form = formidable({
fileWriteStreamHandler: someFunction,
});
EDIT: My whole code
import formidable from "formidable";
import { Writable } from "stream";
import { Buffer } from "buffer";
import { v4 as uuidv4 } from "uuid";
export const config = {
api: {
bodyParser: false,
},
};
const formidableConfig = {
keepExtensions: true,
maxFileSize: 10_000_000,
maxFieldsSize: 10_000_000,
maxFields: 2,
allowEmptyFiles: false,
multiples: false,
};
// promisify formidable
function formidablePromise(req, opts) {
return new Promise((accept, reject) => {
const form = formidable(opts);
form.parse(req, (err, fields, files) => {
if (err) {
return reject(err);
}
return accept({ fields, files });
});
});
}
const fileConsumer = (acc) => {
const writable = new Writable({
write: (chunk, _enc, next) => {
acc.push(chunk);
next();
},
});
return writable;
};
// inside the handler
export default async function handler(req, res) {
const token = uuidv4();
try {
const chunks = [];
const { fields, files } = await formidablePromise(req, {
...formidableConfig,
// consume this, otherwise formidable tries to save the file to disk
fileWriteStreamHandler: () => fileConsumer(chunks),
});
// do something with the files
const contents = Buffer.concat(chunks);
// `storage` is assumed to be an already-initialized storage client (e.g. from firebase-admin); its setup is not shown here
const bucketRef = storage.bucket("your bucket");
const file = bucketRef.file(files.mediaFile.originalFilename);
await file
.save(contents, {
public: true,
metadata: {
contentType: files.mediaFile.mimetype,
metadata: { firebaseStorageDownloadTokens: token },
},
})
.then(() => {
file.getMetadata().then((data) => {
const fileName = data[0].name;
const media_path = `https://firebasestorage.googleapis.com/v0/b/${bucketRef?.id}/o/${fileName}?alt=media&token=${token}`;
console.log("File link", media_path);
});
});
} catch (e) {
// handle errors
console.log("ERR PREJ ...", e);
}
}
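If you are targeting S3 rather than Firebase/Google Cloud Storage, the same fileWriteStreamHandler idea can be sketched with a PassThrough stream piped into an S3 upload. This is untested and assumes aws-sdk v2, an Express-style handler, and hypothetical bucket/key names:

const formidable = require('formidable');
const { PassThrough } = require('stream');
const AWS = require('aws-sdk');

const s3 = new AWS.S3();

function uploadHandler(req, res) {
  const form = formidable({
    // Return a Writable stream so formidable never writes the file to local disk
    fileWriteStreamHandler: () => {
      const pass = new PassThrough();
      s3.upload({
        Bucket: 'my-bucket', // hypothetical bucket name
        Key: `uploads/${Date.now()}`, // hypothetical object key
        Body: pass,
      })
        .promise()
        .then((result) => console.log('Uploaded to', result.Location))
        .catch((err) => console.log('S3 error =>', err));
      return pass;
    },
  });

  form.parse(req, (err, fields, files) => {
    if (err) return res.status(400).json({ error: 'upload failed' });
    return res.json({ success: true });
  });
}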

Move all .txt files from one folder to another folder using Node js

I have tried this code, but it's not working; it displays an error saying the file does not exist in that directory.
The system treats *.txt as a file name, not as a file extension.
const fs = require('fs');
var oldPath = '/abc/def/ghi/*.txt'
var newPath = '/xyz/cbi/'
fs.rename(oldPath, newPath, function (err) {
if (err) throw err
console.log('Successfully renamed - AKA moved!')
})
Try this one:
const shell = require('child_process').execSync ;
const src= `/abc/def/ghi`;
const dist= `/xyz/cbi`;
shell(`mv ${src}/* ${dist}`);
This will solve your problem; it uses the fs-extra package:
const fs = require('fs-extra')
// With a callback:
fs.copy('/tmp/myfile', '/tmp/mynewfile', err => {
if (err) return console.error(err)
console.log('success!')
})
Try this one
For One File:
// Note: this assumes `join` comes from Node's path module and `mv` is a
// promise-returning move function (e.g. the `mv` package wrapped with util.promisify).
const moveThem = async () => {
  // Move file ./js/foo.js to ./ns/qux.js
  const original = join(__dirname, 'js/foo.js');
  const target = join(__dirname, 'ns/qux.js');
  await mv(original, target);
}
For Many Files:
mv('source/dir', 'dest/a/b/c/dir', {mkdirp: true}, function(err) {
});
OR
var spawn = require('child_process').spawn,
mv = spawn('mv', ['/dir1/dir2/*','dir1/']);
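Note that fs.rename does not expand wildcards like *.txt, so a plain-Node alternative is to read the source directory, filter by extension, and move each file individually. A minimal sketch (assuming both directories already exist and live on the same filesystem, since fs.rename cannot move across devices):

const fs = require('fs');
const path = require('path');

const oldPath = '/abc/def/ghi';
const newPath = '/xyz/cbi';

fs.readdir(oldPath, (err, files) => {
  if (err) throw err;
  files
    .filter((file) => path.extname(file) === '.txt')
    .forEach((file) => {
      fs.rename(path.join(oldPath, file), path.join(newPath, file), (err) => {
        if (err) throw err;
        console.log(`Moved ${file}`);
      });
    });
});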

download and untar file then check the content, async await problem, node.js

I am downloading a file in tar format with the request-promise module. Then I untar that file with the tar module using async/await syntax.
const list = new Promise(async (resolve, reject) => {
const filePath = "somedir/myFile.tar.gz";
if (!fs.existsSync(filePath)) {
const options = {
uri: "http://tarFileUrl",
encoding: "binary"
};
try {
console.log("download and untar");
const response = await rp.get(options);
const file = await fs.createWriteStream(filePath);
file.write(response, 'binary');
file.on('finish', () => {
console.log('wrote all data to file');
//here is the untar process
tar.x(
{
file: filePath,
cwd: "lists"
}
);
console.log("extracted");
});
file.end();
} catch(e) {
reject();
}
console.log("doesn't exist");
}
}
//here I am checking if the file exists no need to download either extract it (the try catch block)
//then the Array is created which includes the the list content line by line
if (fs.existsSync(filePath)) {
const file = await fs.readFileSync("lists/alreadyExtractedFile.list").toString().match(/[^\r\n]+/g);
if (file) {
file.map(name => {
if (name === checkingName) {
blackListed = true;
return resolve(blackListed);
}
});
}
else {
console.log("err");
}
}
The console.log output sequence is like so:
download and untar
file doesn't exist
UnhandledPromiseRejectionWarning: Error: ENOENT: no such file or directory, open '...lists/alreadyExtractedFile.list'
wrote all data to file
extracted
So the file lists/alreadyExtractedFile.list is being checked before it's created. My guess is that I'm doing something wrong with async/await: as the console.logs show, the second checking block somehow runs before the file is written and untarred.
Please help me figure out what I am doing wrong.
Your problem is here
const file = await fs.readFileSync("lists/alreadyExtractedFile.list").toString().match(/[^\r\n]+/g);
the readFileSync function doesn't return a promise, so you shouldn't await it:
const file = fs.readFileSync("lists/alreadyExtractedFile.list")
.toString().match(/[^\r\n]+/g);
This should solve the issue
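If you do want something you can await, here is a small sketch using the promise-based API instead (this goes inside an async function):

const fsp = require('fs').promises;

const content = await fsp.readFile('lists/alreadyExtractedFile.list', 'utf8');
const file = content.match(/[^\r\n]+/g);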
You need to call resolve inside the new Promise() callback.
If you are writing a local utility, you can use synchronous methods (in fs, tar, etc.) wherever possible.
This is a small example in which a small archive from the Node.js download site is asynchronously downloaded, synchronously written and unpacked, and then a file is synchronously read:
'use strict';
const fs = require('fs');
const rp = require('request-promise');
const tar = require('tar');
(async function main() {
try {
const url = 'https://nodejs.org/download/release/latest/node-v11.10.1-headers.tar.gz';
const arcName = 'node-v11.10.1-headers.tar.gz';
const response = await rp.get({ uri: url, encoding: null });
fs.writeFileSync(arcName, response, { encoding: null });
tar.x({ file: arcName, cwd: '.', sync: true });
const fileContent = fs.readFileSync('node-v11.10.1/include/node/v8-version.h', 'utf8');
console.log(fileContent.match(/[^\r\n]+/g));
} catch (err) {
console.error(err);
}
})();

How to create a directory if it doesn't exist using Node.js

Is the following the right way to create a directory if it doesn't exist?
It should have full permission for the script and readable by others.
var dir = __dirname + '/upload';
if (!path.existsSync(dir)) {
fs.mkdirSync(dir, 0744);
}
For individual dirs:
var fs = require('fs');
var dir = './tmp';
if (!fs.existsSync(dir)){
fs.mkdirSync(dir);
}
Or, for nested dirs:
var fs = require('fs');
var dir = './tmp/but/then/nested';
if (!fs.existsSync(dir)){
fs.mkdirSync(dir, { recursive: true });
}
No, for multiple reasons.
The path module does not have an exists/existsSync method. It is in the fs module. (Perhaps you just made a typo in your question?)
The documentation explicitly discourages you from using exists.
fs.exists() is an anachronism and exists only for historical reasons. There should almost never be a reason to use it in your own code.
In particular, checking if a file exists before opening it is an anti-pattern that leaves you vulnerable to race conditions: another process may remove the file between the calls to fs.exists() and fs.open(). Just open the file and handle the error when it's not there.
Since we're talking about a directory rather than a file, this advice implies you should just unconditionally call mkdir and ignore EEXIST.
In general, you should avoid the *Sync methods. They're blocking, which means absolutely nothing else in your program can happen while you go to the disk. This is a very expensive operation, and the time it takes breaks the core assumption of node's event loop.
The *Sync methods are usually fine in single-purpose quick scripts (those that do one thing and then exit), but should almost never be used when you're writing a server: your server will be unable to respond to anyone for the entire duration of the I/O requests. If multiple client requests require I/O operations, your server will very quickly grind to a halt.
The only time I'd consider using *Sync methods in a server application is in an operation that happens once (and only once), at startup. For example, require actually uses readFileSync to load modules.
Even then, you still have to be careful because lots of synchronous I/O can unnecessarily slow down your server's startup time.
Instead, you should use the asynchronous I/O methods.
So if we put together those pieces of advice, we get something like this:
function ensureExists(path, mask, cb) {
if (typeof mask == 'function') { // Allow the `mask` parameter to be optional
cb = mask;
mask = 0o744;
}
fs.mkdir(path, mask, function(err) {
if (err) {
if (err.code == 'EEXIST') cb(null); // Ignore the error if the folder already exists
else cb(err); // Something else went wrong
} else cb(null); // Successfully created folder
});
}
And we can use it like this:
ensureExists(__dirname + '/upload', 0o744, function(err) {
if (err) // Handle folder creation error
else // We're all good
});
Of course, this doesn't account for edge cases like
What happens if the folder gets deleted while your program is running? (assuming you only check that it exists once during startup)
What happens if the folder already exists, but with the wrong permissions?
The mkdir method has the ability to recursively create any directories in a path that don't exist, and ignore the ones that do.
From the Node.js v10/11 documentation:
// Creates /tmp/a/apple, regardless of whether `/tmp` and /tmp/a exist.
fs.mkdir('/tmp/a/apple', { recursive: true }, (err) => {
if (err) throw err;
});
NOTE: You'll need to import the built-in fs module first.
Now here's a little more robust example that leverages native ECMAScript Modules (with flag enabled and .mjs extension), handles non-root paths, and accounts for full pathnames:
import fs from 'fs';
import path from 'path';
function createDirectories(pathname) {
const __dirname = path.resolve();
pathname = pathname.replace(/^\.*\/|\/?[^\/]+\.[a-z]+|\/$/g, ''); // Remove leading directory markers, and remove ending /file-name.extension
fs.mkdir(path.resolve(__dirname, pathname), { recursive: true }, e => {
if (e) {
console.error(e);
} else {
console.log('Success');
}
});
}
You can use it like createDirectories('/components/widget/widget.js');.
And of course, you'd probably want to get more fancy by using promises with async/await to leverage file creation in a more readable synchronous-looking way when the directories are created; but, that's beyond the question's scope.
With the fs-extra package you can do this with a one-liner:
const fs = require('fs-extra');
const dir = '/tmp/this/path/does/not/exist';
fs.ensureDirSync(dir);
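fs-extra also has an asynchronous variant that returns a promise when no callback is passed, for example:

const fs = require('fs-extra');

fs.ensureDir('/tmp/this/path/does/not/exist')
  .then(() => console.log('Directory ensured'))
  .catch((err) => console.error(err));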
I have found an npm module that works like a charm for this.
It simply does a recursive mkdir when needed, like "mkdir -p".
The one line version:
// Or in TypeScript: import * as fs from 'fs';
const fs = require('fs');
!fs.existsSync(dir) && fs.mkdirSync(dir);
You can just use mkdir and catch the error if the folder exists.
This is async (so best practice) and safe.
fs.mkdir('/path', err => {
  if (err && err.code != 'EEXIST') throw 'up'
  // .. safely do your stuff here
})
(Optionally add a second argument with the mode.)
Other thoughts:
You could use then or await by using native promisify.
const util = require('util'), fs = require('fs');
const mkdir = util.promisify(fs.mkdir);
var myFunc = () => { /* ..do something.. */ }
mkdir('/path')
.then(myFunc)
.catch(err => { if (err.code != 'EEXIST') throw err; myFunc() })
You can make your own promise method, something like (untested):
let mkdirAsync = (path, mode) => new Promise(
  (resolve, reject) => fs.mkdir(path, mode,
    err => (err && err.code !== 'EEXIST') ? reject(err) : resolve()
  )
)
For synchronous checking, you can use:
fs.existsSync(path) || fs.mkdirSync(path)
Or you can use a library, the two most popular being
mkdirp (just does folders; see the sketch below)
fs-extra (a superset of fs that adds lots of useful functions)
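For example, a minimal mkdirp sketch (assuming mkdirp v1+, whose default export returns a promise):

const mkdirp = require('mkdirp');

mkdirp('/tmp/some/deep/dir')
  .then((made) => console.log(`Created ${made || 'nothing, it already existed'}`))
  .catch((err) => console.error(err));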
solutions
CommonJS
const fs = require('fs');
const path = require('path');
const dir = path.resolve(path.join(__dirname, 'upload'));
if (!fs.existsSync(dir)) {
fs.mkdirSync(dir);
}
// OR
if (!fs.existsSync(dir)) {
fs.mkdirSync(dir, {
mode: 0o744, // Not supported on Windows. Default: 0o777
});
}
ESM
update your package.json file config
{
// declare using ECMAScript modules(ESM)
"type": "module",
//...
}
import fs from 'fs';
import path from 'path';
import { fileURLToPath } from 'url';
// create one custom `__dirname`, because it does not exist in es-module env ⚠️
const __filename = fileURLToPath(import.meta.url);
const __dirname = path.dirname(__filename);
const dir = path.resolve(path.join(__dirname, 'upload'));
if (!fs.existsSync(dir)) {
fs.mkdirSync(dir);
}
// OR
if (!fs.existsSync(dir)) {
fs.mkdirSync(dir, {
mode: 0o744, // Not supported on Windows. Default: 0o777
});
}
update 2022
import { existsSync } from 'node:fs';
refs
NodeJS Version: v18.2.0
https://nodejs.org/api/fs.html#fsexistssyncpath
https://nodejs.org/api/fs.html#fsmkdirsyncpath-options
https://nodejs.org/api/url.html#urlfileurltopathurl
https://github.com/nodejs/help/issues/2907#issuecomment-757446568
ESM: ECMAScript modules
https://nodejs.org/api/esm.html#introduction
One-line solution: Creates the directory if it does not exist
// import
const fs = require('fs') // In JavaScript
import * as fs from "fs" // in TypeScript
import fs from "fs" // also TypeScript (requires esModuleInterop)
// Use
!fs.existsSync(`./assets/`) && fs.mkdirSync(`./assets/`, { recursive: true })
The best solution would be to use the npm module fs-extra (the node-fs-extra project). It has a method called mkdirs which creates the directory you mention; if you give it a long directory path, it will create the parent folders automatically. The module is a superset of the built-in fs module, so you can also use all of the fs functions if you add this module.
var dir = 'path/to/dir';
try {
fs.mkdirSync(dir);
} catch(e) {
if (e.code != 'EEXIST') throw e;
}
Use:
var filessystem = require('fs');
var dir = './path/subpath/';
if (!filessystem.existsSync(dir))
{
filessystem.mkdirSync(dir);
}
else
{
console.log("Directory already exist");
}
For node v10 and above
As some answers pointed out, since node 10 you can use recursive:true for mkdir
What is not pointed out yet, is that when using recursive:true, mkdir does not return an error if the directory already existed.
So you can do:
const fsNative = require('fs'); // plain callback-based fs (named to distinguish it from fs/promises below)

fsNative.mkdir(dirPath, { recursive: true }, (err) => {
  if (err) {
    //note: this does NOT get triggered if the directory already existed
    console.warn(err)
  }
  else {
    //directory now exists
  }
})
Using promises
Also since node 10, you can get Promise versions of all fs functions by requiring from fs/promises
So putting those two things together, you get this simple solution:
import * as fs from 'fs/promises';
await fs.mkdir(dirPath, {recursive:true}).catch((err) => {
//decide what you want to do if this failed
console.error(err);
});
//directory now exists
fs.exists() is deprecated, so I have used fs.stat() to check the directory status. If the directory does not exist, fs.stat() throws an error with a message like 'no such file or directory', and then I create the directory.
const fs = require('fs').promises;
const dir = './dir';
fs.stat(dir).catch(async (err) => {
if (err.message.includes('no such file or directory')) {
await fs.mkdir(dir);
}
});
With Node.js 10 + ES6:
import path from 'path';
import fs from 'fs';
(async () => {
const dir = path.join(__dirname, 'upload');
try {
await fs.promises.mkdir(dir);
} catch (error) {
if (error.code === 'EEXIST') {
// Something already exists, but is it a file or directory?
const lstat = await fs.promises.lstat(dir);
if (!lstat.isDirectory()) {
throw error;
}
} else {
throw error;
}
}
})();
I'd like to add a TypeScript Promise refactor of josh3736's answer.
It does the same thing and has the same edge cases. It just happens to use Promises, TypeScript typedefs, and works with "use strict".
import { mkdir } from "fs";

// https://en.wikipedia.org/wiki/File_system_permissions#Numeric_notation
const allRWEPermissions = parseInt("0777", 8);

function ensureFilePathExists(path: string, mask: number = allRWEPermissions): Promise<void> {
  return new Promise<void>(
    function(resolve: (value?: void | PromiseLike<void>) => void,
             reject: (reason?: any) => void): void {
      mkdir(path, mask, function(err: NodeJS.ErrnoException): void {
        if (err) {
          if (err.code === "EEXIST") {
            resolve(null); // Ignore the error if the folder already exists
          } else {
            reject(err); // Something else went wrong
          }
        } else {
          resolve(null); // Successfully created folder
        }
      });
    });
}
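Usage might then look like this:

ensureFilePathExists(__dirname + "/upload")
  .then(() => console.log("Folder is ready"))
  .catch((err) => console.error(err));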
I had to create sub-directories if they didn't exist. I used this:
const path = require('path');
const fs = require('fs');
function ensureDirectoryExists(p) {
//console.log(ensureDirectoryExists.name, {p});
const d = path.dirname(p);
if (d && d !== p) {
ensureDirectoryExists(d);
}
if (!fs.existsSync(d)) {
fs.mkdirSync(d);
}
}
You can use the Node.js File System command fs.stat to check if a directory exists and fs.mkdir to create a directory with callback, or fs.mkdirSync to create a directory without callback, like this example:
// First require fs
const fs = require('fs');
// Create directory if not exist (function)
const createDir = (path) => {
  // Check if the dir exists (fs.stat errors if it does not)
  fs.stat(path, (err, stats) => {
    if (err || !stats.isDirectory()) {
      // The given path does not exist (or is not a directory), so create it
      fs.mkdirSync(path);
    }
  });
};
From the documentation this is how you do it asynchronously (and recursively):
const fs = require('fs');
const fsPromises = fs.promises;
fsPromises.access(dir, fs.constants.F_OK)
  .catch(async () => {
    try {
      await fsPromises.mkdir(dir, { recursive: true });
    } catch (err) {
      console.log(err);
    }
  });
Here is a little function to recursively create directories:
const fs = require('fs');

const createDir = (dir) => {
  // This will create a dir given a path such as './folder/subfolder'
  const splitPath = dir.split('/');
  splitPath.reduce((path, subPath) => {
    let currentPath;
    if (subPath != '.') {
      currentPath = path + '/' + subPath;
      if (!fs.existsSync(currentPath)) {
        fs.mkdirSync(currentPath);
      }
    } else {
      currentPath = subPath;
    }
    return currentPath;
  }, '');
};
my solutions
CommonJS
var fs = require("fs");
var dir = __dirname + '/upload';
// if (!fs.existsSync(dir)) {
// fs.mkdirSync(dir);
// }
if (!fs.existsSync(dir)) {
fs.mkdirSync(dir, {
mode: 0o744,
});
// mode's default value is 0o777 (and mode is not supported on Windows)
}
ESM
update package.json config
{
//...
"type": "module",
//...
}
import fs from "fs";
import path from "path";
// create one custom `__dirname`, because it does not exist in an ES-module environment ⚠️
const __dirname = path.resolve();
const dir = __dirname + '/upload';
if (!fs.existsSync(dir)) {
fs.mkdirSync(dir);
}
// OR
if (!fs.existsSync(dir)) {
fs.mkdirSync(dir, {
mode: 0o744,
});
// mode's default value is 0o777 (and mode is not supported on Windows)
}
refs
https://nodejs.org/api/fs.html#fsexistssyncpath
https://github.com/nodejs/help/issues/2907#issuecomment-671782092
Using async / await:
const mkdirP = async (directory) => {
try {
return await fs.mkdirAsync(directory);
} catch (error) {
if (error.code != 'EEXIST') {
throw error;
}
}
};
You will need to promisify fs:
import nodeFs from 'fs';
import bluebird from 'bluebird';
const fs = bluebird.promisifyAll(nodeFs);
A function to do this asynchronously (adjusted from a similar answer on SO that used sync functions, which I can't find now):
// ensure-directory.js
import { mkdir, access } from 'fs'
/**
* directoryPath is a path to a directory (no trailing file!)
*/
export default async directoryPath => {
directoryPath = directoryPath.replace(/\\/g, '/')
// -- preparation to allow absolute paths as well
let root = ''
if (directoryPath[0] === '/') {
root = '/'
directoryPath = directoryPath.slice(1)
} else if (directoryPath[1] === ':') {
root = directoryPath.slice(0, 3) // c:\
directoryPath = directoryPath.slice(3)
}
// -- create folders all the way down
const folders = directoryPath.split('/')
let folderPath = `${root}`
for (const folder of folders) {
folderPath = `${folderPath}${folder}/`
const folderExists = await new Promise(resolve =>
access(folderPath, error => {
  if (error) {
    resolve(false)
  } else {
    resolve(true)
  }
})
)
if (!folderExists) {
await new Promise((resolve, reject) =>
mkdir(folderPath, error => {
if (error) {
reject('Error creating folderPath')
}
resolve(folderPath)
})
)
}
}
}
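Usage might then look like this (the importing module and the path are hypothetical):

// some-other-module.js (hypothetical)
import ensureDirectory from './ensure-directory.js'

await ensureDirectory('logs/2023/01') // creates logs, logs/2023 and logs/2023/01 as needed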
