I have several paths in the format "\?\Volume{56D4B0E2-0000-0000-0000-00A861000000}\dir1" and I need to find and read JSON files under them.
glob doesn't seem to find anything in those paths.
glob(rootDir + '\\**\\*.json', {}, (err, files) => {
...
});
Am I doing something wrong or does glob not support "\?\Volume{}" paths?
Thank you in advance!
I decided to use node fs and path instead of glob:
// Assumes fs has been required at the top of the file.
static getFilesFromPath(path, extension) {
  const dir = fs.readdirSync(path);
  // Escape the dot so the regex matches a literal ".json" etc.
  return dir.filter(elm => elm.match(new RegExp(`.*\\.(${extension})$`, 'ig')));
}
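For completeness, a rough sketch of how the helper might be used to read the JSON files it finds (FileUtils is just a placeholder class name, the volume GUID is the one from the question, and the \?\ prefix needs doubled backslashes in a JS string literal):
const fs = require('fs');

const rootDir = '\\\\?\\Volume{56D4B0E2-0000-0000-0000-00A861000000}\\dir1';

// List the .json files in the directory and parse each one.
const jsonFiles = FileUtils.getFilesFromPath(rootDir, 'json');
const parsed = jsonFiles.map(name =>
  JSON.parse(fs.readFileSync(rootDir + '\\' + name, 'utf8'))
);
console.log(parsed);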
I have a JSON file named email_templates.json placed in the same folder as my JS file bootstrap.js. When I try to read the file I get an error:
no such file or directory, open './email_templates.json'
bootstrap.js
"use strict";
const fs = require('fs');
module.exports = async () => {
const { config } = JSON.parse(fs.readFileSync('./email_templates.json'));
console.log(config);
};
email_templates.json
[
  {
    "name": "vla",
    "subject": "test template",
    "path": ""
  }
]
I am using VS Code, and for some reason VS Code doesn't autocomplete the path either, which is confusing to me. Does anyone know why this happens?
Node version: 14.x
A possible solution is to get the full path (right from C:\, for example, if you are on Windows).
To do this, you first need to import path in your code.
const path = require("path");
Next, we need to join the directory that the JavaScript file is in with the JSON filename. To do this, we will use the code below.
const jsonPath = path.resolve(__dirname, "email_templates.json");
The resolve() function joins the two segments into one complete, absolute path.
Finally, you can use this path to pass into readFileSync().
fs.readFileSync(jsonPath);
This should help if the problem was the relative path: readFileSync resolves relative paths against the process's current working directory, not against the file's own directory, so an absolute path built from __dirname reliably finds the file.
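Putting it together, a minimal sketch of the corrected bootstrap.js (the sample file above is an array of templates, so it is logged whole here rather than destructured):
"use strict";
const fs = require('fs');
const path = require('path');

module.exports = async () => {
  // Resolve the JSON file relative to this file, not the working directory.
  const jsonPath = path.resolve(__dirname, "email_templates.json");
  const templates = JSON.parse(fs.readFileSync(jsonPath, "utf8"));
  console.log(templates);
};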
When reading a directory, I currently have this:
fs.readdir(tests, (err, items) => {
  if (err) {
    return cb(err);
  }
  const cmd = items.filter(v => fs.lstatSync(tests + '/' + v).isFile());
  k.stdin.end(`${cmd}`);
});
First of all, I would need a try/catch in there around fs.lstatSync, which I don't want to add. But is there a way to use fs.readdir to find only files?
Something like:
fs.readdir(tests, {type:'f'}, (err, items) => {});
Does anyone know how?
Starting from Node v10.10.0, you can pass withFileTypes in the options parameter to get fs.Dirent objects instead of strings.
// use fs.promises.readdir (or the callback form) for an async version
const subPaths = fs.readdirSync(YOUR_BASE_PATH, {
  withFileTypes: true
});
// subPaths is of type fs.Dirent[]
const files = subPaths.filter((dirent) => dirent.isFile());
// files is also fs.Dirent[]; use dirent.name to get each file name as a string
More info is in the Node documentation:
fs.Dirent
fs.readdirSync
fs.readdir
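For an asynchronous version, the same filtering works with the promise-based API; a small sketch (listFiles is just an illustrative name):
const fsPromises = require('fs').promises;

async function listFiles(basePath) {
  const entries = await fsPromises.readdir(basePath, { withFileTypes: true });
  // Keep only regular files and return their names as strings.
  return entries.filter((entry) => entry.isFile()).map((entry) => entry.name);
}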
Unfortunately, fs.readdir doesn't have an option to specify that you're only looking for files, not folders/directories (per docs). Filtering the results from fs.readdir to knock out the directories is your best bet.
https://nodejs.org/dist/latest-v10.x/docs/api/fs.html#fs_fs_readdir_path_options_callback
The optional options argument can be a string specifying an
encoding, or an object with an encoding property specifying the
character encoding to use for the filenames passed to the callback. If
the encoding is set to 'buffer', the filenames returned will be
passed as Buffer objects.
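A minimal sketch of that filtering approach (listFiles here is just an illustrative helper, not from the question):
const fs = require('fs');
const path = require('path');

function listFiles(dir, cb) {
  fs.readdir(dir, (err, items) => {
    if (err) {
      return cb(err);
    }
    // Keep only the entries that are regular files.
    const files = items.filter((item) => fs.statSync(path.join(dir, item)).isFile());
    cb(null, files);
  });
}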
Yeah, fs.readdir can't do this currently (list only files or only directories).
I filed an issue with Node.js, and it looks like it may be a good feature to add.
https://github.com/nodejs/node/issues/21804
If your use case is scripting/automation, you might try the fs-jetpack library. It can find files in a folder for you, and can also be configured for much more sophisticated searches.
const jetpack = require("fs-jetpack");
// Find all files in my_folder
const filesInFolder = jetpack.find("my_folder", { recursive: false });
console.log(filesInFolder);
// Example of more sophisticated search:
// Find all `.js` files in the folder tree, with modify date newer than 2020-05-01
const borderDate = new Date("2020-05-01");
const found = jetpack.find("foo", {
  matching: "*.js",
  filter: (file) => {
    return file.modifyTime > borderDate;
  }
});
console.log(found);
I want to check whether files of a given type are present or not. I am using Node.js and fs. Here is my code:
var location = '**/*.js';
log(fs.statSync(location).isFile());
which always returns the error.
Error: ENOENT, no such file or directory '**/*.js'
How do I check whether the files are present or not? Thanks in advance.
Node doesn't have built-in support for globbing (**/*.js). You'll need to either recursively walk the directories and check the file names for the types you want, or use something like node-glob.
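For illustration, a minimal recursive walk using only the built-in fs and path modules (walk is just a hypothetical helper name):
const fs = require('fs');
const path = require('path');

// Recursively collect every file path under `dir`.
function walk(dir) {
  return fs.readdirSync(dir).reduce((acc, name) => {
    const full = path.join(dir, name);
    return fs.statSync(full).isDirectory()
      ? acc.concat(walk(full))
      : acc.concat(full);
  }, []);
}

const jsFiles = walk('./').filter((file) => file.endsWith('.js'));
console.log(jsFiles.length > 0 ? 'Found *.js files' : 'No *.js files');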
Using recursive-readdir-sync:
var recursiveReadSync = require('recursive-readdir-sync'),
    files;

files = recursiveReadSync('./');
files.forEach(function (fileName) {
  if (fileName.search(/\.js$/g) !== -1) {
    console.log("Found a *.js file");
  }
});
Using node-glob:
var glob = require("glob")
glob("**/*.js", function (er, files) {
files.forEach(function (fileName) {
if (fileName.search(/\.js$/g) !== -1) {
console.log("Found a *.js file");
}
});
Node.js does not support "glob" wildcards by default. You can use an external package like this one.
Is there anything like fs.create(path) that creates the path if it does not exist?
For example, fs.create('D:/test/a.txt') would create the test folder and the a.txt file if a.txt does not exist.
I know how to create the file if it does not exist, but what about the folders?
I think it is a simple problem. Can any library do that, or do I need to parse the path and create each part myself?
If you don't want to add dependencies, the following may work for you, where dirPath is an array of the path segments you want to mkdirSync to:
// assumes fs and path are required, and cwd holds the starting directory
let dirPath = [cwd, `..`, `..`, `folderA`, `folderB`]
let outDir = []
dirPath.forEach(element => {
  outDir.push(element)
  try {
    if (!fs.existsSync(path.resolve(outDir.join('/')))) {
      fs.mkdirSync(path.resolve(outDir.join('/')))
      console.log('mkdir succeeded!!')
    }
  } catch (err) {
    console.error(err)
  }
})
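Note that on Node 10.12 and later, the built-in fs.mkdirSync can also create all missing parent folders in one call with the recursive option, for example:
const fs = require('fs');
const path = require('path');

// Creates D:/test (and any missing parents) if it does not already exist.
fs.mkdirSync(path.dirname('D:/test/a.txt'), { recursive: true });
fs.writeFileSync('D:/test/a.txt', '');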
The answer is from @thefourtheye: use the fs-extra module's mkdirs.
In my gulp build I've made a task that runs after all compiling, uglifying and minification has occurred. This task simply copies everything from the src into the dest directory that hasn't been touched/processed by earlier tasks. The little issue I'm having is that this results in empty directories in the dest directory.
Is there a way to tell the gulp.src glob to only include files in the pattern matching (like providing the 'is_file' flag)?
Thanks.
Fixed it by adding a filter to the pipeline:
var es = require('event-stream');
// Despite the stream coming from gulp.src, only entries whose stat reports a
// regular file are passed through; directories are dropped.
var onlyFiles = function(es) {
  return es.map(function(file, cb) {
    if (file.stat.isFile()) {
      return cb(null, file);
    } else {
      return cb();
    }
  });
};
// ...
var s = gulp.src(globs)
  .pipe(onlyFiles(es))
  .pipe(gulp.dest(folders.dest + '/' + module.folder));
// ...
I know I'm late to the party on this one, but for anyone else stumbling upon this question, there is another way to do this that seems pretty elegant in my eyes. I found it in this question
To exclude the empty folders I added { nodir: true } after the glob pattern.
Your general pattern could be as follows (using the variables from Nick's answer):
gulp.src(globs, { nodir: true })
  .pipe(gulp.dest(folders.dest + '/' + module.folder));
Mine was as follows:
gulp.src(['src/**/*', '!src/scss/**/*.scss', '!src/js/**/*.js'], { nodir: true })
  .pipe(gulp.dest('dev/'));
This selects all the files from the src directory that are not scss or js files, and does not copy any empty folders from those two directories either.