How can I read a file in Node.js, find all instances of a function and then extract each function's argument?

I'm trying to write a node script that identifies unused translation strings in my React project.
First, I want to get a list of all the translations that are used. To do this, I am getting a list of each JS file in my /src/components folder and then reading the file.
My translation strings look like this: t('some.translation.key'), so basically, I want to identify each instance of t('...') using RegEx and then get the key in between those parentheses (i.e. "some.translation.key"). From there, I should be able to compare the keys to the ones in my translation JSON file and remove the ones that aren't being used.
unused.js
const path = require('path');
const fs = require('fs');

let files = [];

// https://stackoverflow.com/a/63111390/2262604
function getFiles(dir) {
  fs.readdirSync(dir).forEach(file => {
    const absolute = path.join(dir, file);
    if (fs.statSync(absolute).isDirectory()) {
      getFiles(absolute);
    } else {
      if (absolute.includes('.js')) {
        files.push(absolute);
      }
    }
  });
  return files;
}
function getTranslations() {
  const pathComponents = path.join(__dirname, '../../src/components');

  // get all js files in components directory
  const files = getFiles(pathComponents);

  const translationKeys = [];

  // for each js file
  for (let i = 0; i < files.length; i++) {
    // read contents of file
    const contents = fs.readFileSync(files[i]).toString();

    // search contents for all instances of t('...')
    // and get the key between the parentheses
  }
}
getTranslations();
How can I use RegEx to find all instances of t('...') in contents and then extract the ... string between the parentheses?

Yes, you could use a regular expression:
for (const [, str] of contents.matchAll(/\bt\(['"](.*?)['"]\)/g)) {
  console.log('t called with string argument:', str)
}
However, the problem with regular expressions is that they don't understand the code: they have trouble with keys that contain ( ) or \' themselves, with concatenated strings or extra whitespace, and you'd also get the contents literally, including possible escape sequences.
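For instance, with a contrived snippet (not taken from the question's codebase) that contains an escaped quote and a concatenated key, the pattern above keeps the escape sequence verbatim and misses the concatenated call entirely:
// Hypothetical input, just to illustrate the two failure modes.
const contents = "t('it\\'s.a.key') && t('common.' + suffix)";

for (const [, str] of contents.matchAll(/\bt\(['"](.*?)['"]\)/g)) {
  // Logs "it\'s.a.key" with the backslash kept literally;
  // the t('common.' + suffix) call never matches at all.
  console.log(str);
}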
A more robust way would be to create an AST (abstract syntax tree) from the code and look for calls to t in it.
A popular AST parser would be acorn. There is also the supplementary module acorn-walk that helps walking through the whole syntax tree without building your own recursive algorithm.
import * as acorn from 'acorn'
import * as walk from 'acorn-walk'

// Example
const contents = "function a () { if (123) { t('hello') } return t('world') }"

// The arguments to acorn.parse would have to be adjusted based
// on what kind of syntax your files can use.
const result = acorn.parse(contents, { ecmaVersion: 2020 })

walk.full(result, node => {
  if (node.type === 'CallExpression' && node.callee.type === 'Identifier' && node.callee.name === 't') {
    if (node.arguments.length === 1 && node.arguments[0].type === 'Literal' && typeof node.arguments[0].value === 'string') {
      // This is for the case `t` is called with a single string
      // literal as argument.
      console.log('t called with string argument:', node.arguments[0].value)
    } else {
      // In case you have things like template literals as well,
      // or multiple arguments, you'd need to handle them here too.
      console.log('t called with unknown arguments:', node.arguments)
    }
  }
})
// Will output:
// t called with string argument: hello
// t called with string argument: world
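To finish the job described in the question, the keys collected by either approach above can be compared against the translation JSON. A minimal sketch, assuming a flat { "some.translation.key": "Some text" } shape and a placeholder ./translations.json path:
const fs = require('fs')

// `usedKeys` is the array of keys gathered by the regex or AST approach above.
function findUnusedKeys(usedKeys, translationsPath = './translations.json') {
  const translations = JSON.parse(fs.readFileSync(translationsPath, 'utf8'))
  const used = new Set(usedKeys)
  return Object.keys(translations).filter(key => !used.has(key))
}

// Example:
// console.log(findUnusedKeys(['some.translation.key', 'other.key']))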

Related

TS/Node.js: Getting the absolute path of the class instance rather than the class itself

Is there a way to get the path (__dirname) of the file where an instance of a class was made without passing that into the constructor?
For example,
// src/classes/A.ts
export class A {
  private instanceDirname: string;

  constructor() {
    this.instanceDirname = ??
  }
}
// src/index.ts
import { A } from "./classes/A"
const a = new A();
// a.instanceDirname === __dirname ✓
I tried callsite; it worked, but I had to do some regex that I'm not happy with to get what I need. I also tried a module called caller-callsite, but that ended up returning the module path, not the path of the file where the instance was made.
Is there a workaround for this?
I would have callers pass in the location information. Sniffing this stuff seems like a code smell to me (pardon the pun). ;-)
But you can do it by using regular expressions on the V8 call stack from an Error instance. It still involves regular expressions (which you didn't like with callsite), but they run against V8's own stack format, which isn't likely to change in a breaking way (and certainly won't change except when you upgrade Node.js, so it's easy to test). See comments:
// A regular expression to look for lines in this file (A.ts / A.js)
const rexThisFile = /\bA\.[tj]s:/i;

// Your class
export class A {
  constructor() {
    // Get a stack trace, break into lines -- this is V8, we can rely on the format
    const stackLines = (new Error().stack).split(/\r\n|\r|\n/);
    // Find the first line that doesn't reference this file
    const line = stackLines.find((line, index) => index > 0 && !rexThisFile.test(line));
    if (line) {
      // Found it, extract the directory from it
      const instanceOfDirName = line.replace(/^\s*at\s*/, "")
        .replace(/\w+\.[tj]s[:\d]+$/, "")
        .replace(/^file:\/\//, "");
      console.log(`instanceOfDirName = "${instanceOfDirName}"`);
    }
  }
}
Those three replaces can be combined:
const instanceOfDirName = line.replace(/(?:^\s*at\s*(?:file:\/\/)?)|(?:\w+\.[tj]s[:\d]+$)/g, "");
...but I left them separate for clarity; it's not going to make a performance difference worth caring about.
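A minimal usage sketch (the logged path is illustrative and assumes the class above lives in src/classes/A.ts):
// src/index.ts
import { A } from "./classes/A";

const a = new A();
// Logs something along the lines of:
// instanceOfDirName = "/path/to/project/src/"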

How can I import a Cypher query to my Node.js logic?

I'm not a very experienced developer, but I am looking to structure my project so it is easier to work on.
Let's say I have a function like this:
const x = async (tx, hobby) => {
  const result = await tx.run(
    "MATCH (a:Person) - [r] -> (b:$hobby) " +
    "RETURN properties(a)",
    { hobby }
  )
  return result
}
Can I put my Cypher query scripts in separate files and reference them? I have seen a similar pattern for SQL scripts.
This is what I'm thinking:
const CYPHER_SCRIPT = require('./folder/myCypherScript.cyp')

const x = async (tx, hobby) => {
  const result = await tx.run(
    CYPHER_SCRIPT,
    { hobby }
  )
  return result
}
...or will I need to stringify the contents of the .cyp file?
Thanks
You can use the @cybersam/require-cypher package (which I just created).
For example, if folder/myCypherScript.cyp contains this:
MATCH (a:Person)-->(:$hobby)
RETURN PROPERTIES(a)
then after the package is installed (npm i @cybersam/require-cypher), this code will output the contents of that file:
// Just require the package. You don't usually need to use the returned module directly.
// Handlers for files with extensions .cyp, .cql, and .cypher will be registered.
require('@cybersam/require-cypher');

// Now require() will return the string content of Cypher files
const CYPHER_SCRIPT = require('./folder/myCypherScript.cyp')

console.log(CYPHER_SCRIPT);
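If you'd rather not add a dependency, reading the file into a plain string yourself works too. A minimal sketch using the path from the question, loading the query once when the module is loaded:
const fs = require('fs');
const path = require('path');

// Read the Cypher file once, as a plain string.
const CYPHER_SCRIPT = fs.readFileSync(path.join(__dirname, 'folder/myCypherScript.cyp'), 'utf8');

const x = async (tx, hobby) => {
  const result = await tx.run(CYPHER_SCRIPT, { hobby });
  return result;
};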

How to search for files by extension and containing string within folder and subfolders?

Is it possible to look for files of a given extension that contain a provided string?
What should my approach be? For example, if the input is txt and hello, the output will be a list of all files with the txt extension that contain the string hello.
You would run this in your terminal: node main.js hello
For a given directory, it will search all subdirectories and files for a .txt file containing hello.
Here is the code:
const { readdirSync, readFileSync, lstatSync } = require('fs');
const path = require('path');

const getDir = source => {
  const results = readdirSync(source);
  results.forEach(function (result) {
    if (lstatSync(path.join(source, result)).isFile()) {
      if (readFileSync(path.join(source, result)).includes(argument) &&
          path.extname(result).toLowerCase() === extension) {
        console.log("Your string is in file: ", result)
      }
    } else if (lstatSync(path.join(source, result)).isDirectory()) {
      getDir(path.join(source, result));
    }
  });
}

const dir = process.cwd();
const extension = '.txt'; // You can change the extension type here
let argument = process.argv[2];

getDir(dir);
You can use the fs module to get the contents of a folder with the readdirSync method, which returns an array of filename strings.
Let's say your working directory is src, and inside it there is a files folder that you want to read. Your script would look something like this:
const fs = require('fs');

// Use the synchronous variant: the async fs.readdir callback
// can't return the filenames to the surrounding code.
var files = fs.readdirSync('./files');
You can then separate each filename using string methods. Let's say you want to have two variables, fileName and fileExt.
You would use the String.split(separator) method, with the . character as the separator.
// `files` comes from the fs snippet above
for (const file of files) {
  let fileComponents = file.split('.');
  let fileName = fileComponents[0];
  let fileExt = fileComponents[1];
  // You can run your code on the name and extension of your file here.
}
This will not work for files containing multiple dots in their name. You will need extra work on the array to make sure fileName joins every part, separated by a ., up to the final element of fileComponents (which is the extension); see the sketch below.
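A minimal sketch of that extra work, using path.parse so you don't have to rebuild the name by hand (the filenames are just illustrative):
const path = require('path');

// path.parse treats only the last dot as the start of the extension.
for (const file of ['archive.tar.gz', 'notes.txt', 'README']) {
  const { name, ext } = path.parse(file);
  console.log(name, ext); // "archive.tar" ".gz", "notes" ".txt", "README" ""
}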

Node.js - How to grab the class names from a .scss file

I wanted to ask if anyone knows of a good way to use Node.js to look in a .scss file, grab all the classes listed, and then put them in either an object or an array?
The thing with this is that you are going to need the sass folder to be available to your server. This is not a recommended practice, since you normally publish only the compiled CSS file; there is no need to also publish the dev assets.
However, if you do so, you will need to read the .scss file using Node and then use a regex to match the .class strings inside the file.
This will make the reading of the file:
var fs = require('fs');

function readSassFile () {
  fs.readFile('./public/scss/components/_styles.scss', 'utf8', function (err, data) {
    if (err) {
      console.log(err);
      return;
    }
    regexArray(data);
  });
}
As you can see, if readFile retrieves the file successfully, I call a regexArray() function at the end and pass it the data of the loaded file.
In the regexArray function you need to define a regex to evaluate that string.
function regexArray (data) {
  var re = /\.\S*/g;
  var m;
  var classArray = [];

  while ((m = re.exec(data)) !== null) {
    if (m.index === re.lastIndex) {
      re.lastIndex++;
    }
    classArray.push(m[0]);
  }

  console.log(classArray);
}
The variable re is the regular expression: it matches any string starting with a . followed by non-whitespace characters, which will match your CSS class names.
We then loop while m is not null and store the results in the classArray array; you can log it to see the results.
I tested with the path that is in the fs.readFile call; you can change it to your own path.
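Note that /\.\S*/ also catches things that aren't clean class names, e.g. the .5em part of a decimal value like 0.5em, or everything up to the next whitespace when a class name isn't followed by a space (.btn{color:red} is captured whole). A slightly stricter pattern (still a heuristic sketch, not a real SCSS parser) narrows the matches to identifier-like class names and drops the leading dot:
// Only a dot followed by a CSS-identifier-ish name (letters, digits, -, _),
// captured without the leading dot.
function extractClassNames(scss) {
  const names = new Set();
  for (const [, name] of scss.matchAll(/\.([a-zA-Z_-][\w-]*)/g)) {
    names.add(name);
  }
  return [...names];
}

console.log(extractClassNames('.btn { &.btn--primary { margin: 0.5em; } }'));
// [ 'btn', 'btn--primary' ]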

node.js require all files in a folder?

How do I require all files in a folder in Node.js?
I need something like:
files.forEach(function (v, k) {
  // require routes
  require('./routes/' + v);
});
When require is given the path of a folder, it'll look for an index.js file in that folder; if there is one, it uses that, and if there isn't, it fails.
It would probably make most sense (if you have control over the folder) to create an index.js file, assign all the "modules" in it, and then simply require that.
yourfile.js
var routes = require("./routes");
index.js
exports.something = require("./routes/something.js");
exports.others = require("./routes/others.js");
If you don't know the filenames you should write some kind of loader.
Working example of a loader:
var normalizedPath = require("path").join(__dirname, "routes");

require("fs").readdirSync(normalizedPath).forEach(function(file) {
  require("./routes/" + file);
});

// Continue application logic here
I recommend using glob to accomplish that task.
var glob = require( 'glob' )
  , path = require( 'path' );

glob.sync( './routes/**/*.js' ).forEach( function( file ) {
  require( path.resolve( file ) );
});
Based on @tbranyen's solution, I created an index.js file that loads arbitrary JavaScript files under the current folder as part of the exports.
// Load `*.js` under current directory as properties
// i.e., `User.js` will become `exports['User']` or `exports.User`
require('fs').readdirSync(__dirname + '/').forEach(function(file) {
  if (file.match(/\.js$/) !== null && file !== 'index.js') {
    var name = file.replace('.js', '');
    exports[name] = require('./' + file);
  }
});
Then you can require this directory from anywhere else.
Another option is to use the package require-dir, which lets you do the following. It supports recursion as well.
var requireDir = require('require-dir');
var dir = requireDir('./path/to/dir');
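It also takes an options object; as best I recall from require-dir's README (treat the option name as an assumption and check the docs), recursion is enabled like this:
var requireDir = require('require-dir');

// Assumed option name: `recurse` walks subdirectories as well.
var dir = requireDir('./path/to/dir', { recurse: true });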
I have a folder /fields full of files, each containing a single class, e.g.:
fields/Text.js -> Text class
fields/Checkbox.js -> Checkbox class
Drop this in fields/index.js to export each class:
var collectExports, fs, path,
  __hasProp = {}.hasOwnProperty;

fs = require('fs');
path = require('path');

collectExports = function(file) {
  var func, include, _results;

  if (path.extname(file) === '.js' && file !== 'index.js') {
    include = require('./' + file);
    _results = [];
    for (func in include) {
      if (!__hasProp.call(include, func)) continue;
      _results.push(exports[func] = include[func]);
    }
    return _results;
  }
};

fs.readdirSync('./fields/').forEach(collectExports);
This makes the modules act more like they would in Python:
var text = new Fields.Text()
var checkbox = new Fields.Checkbox()
One more option is require-dir-all, which combines features from the most popular packages.
The most popular, require-dir, has no options to filter the files/dirs and no map function (see below), but uses a small trick to find the module's current path.
The second by popularity, require-all, has regexp filtering and preprocessing, but lacks relative paths, so you need to use __dirname (which has pros and cons), like:
var libs = require('require-all')(__dirname + '/lib');
require-index, also mentioned here, is quite minimalistic.
With map you may do some preprocessing, like creating objects and passing config values (assuming the modules below export constructors):
// Store config for each module in config object properties
// with property names corresponding to module names
var config = {
  module1: { value: 'config1' },
  module2: { value: 'config2' }
};

// Require all files in modules subdirectory
var modules = require('require-dir-all')(
  'modules', // Directory to require
  { // Options
    // function to be post-processed over exported object for each require'd module
    map: function(reqModule) {
      // create new object with corresponding config passed to constructor
      reqModule.exports = new reqModule.exports( config[reqModule.name] );
    }
  }
);

// Now `modules` object holds not exported constructors,
// but objects constructed using values provided in `config`.
I know this question is 5+ years old, and the given answers are good, but I wanted something a bit more powerful for Express, so I created the express-map2 package for npm. I was going to name it simply express-map; however, the people at Yahoo already have a package with that name, so I had to rename mine.
1. basic usage:
app.js (or whatever you call it)
var app = require('express')(); // 1. include express and create the app

app.set('controllers', __dirname + '/controllers/'); // 2. set path to your controllers.

require('express-map2')(app); // 3. patch map() into express

app.map({
  'GET /': 'test',
  'GET /foo': 'middleware.foo,test',
  'GET /bar': 'middleware.bar,test' // separate your handlers with a comma.
});
controller usage:
// single function
module.exports = function(req, res) {
};

// export an object with multiple functions.
module.exports = {
  foo: function(req, res) {
  },
  bar: function(req, res) {
  }
};
2. advanced usage, with prefixes:
app.map('/api/v1/books', {
  'GET /': 'books.list', // GET /api/v1/books
  'GET /:id': 'books.loadOne', // GET /api/v1/books/5
  'DELETE /:id': 'books.delete', // DELETE /api/v1/books/5
  'PUT /:id': 'books.update', // PUT /api/v1/books/5
  'POST /': 'books.create' // POST /api/v1/books
});
As you can see, this saves a ton of time and makes the routing of your application dead simple to write, maintain, and understand. It supports all of the HTTP verbs that Express supports, as well as the special .all() method.
npm package: https://www.npmjs.com/package/express-map2
github repo: https://github.com/r3wt/express-map
Expanding on the glob solution above: do this if you want to import all modules from a directory into an index.js and then import that index.js in another part of the application.
const glob = require("glob");
let allOfThem = {};
glob.sync(`${__dirname}/*.js`).forEach((file) => {
/* see note about this in example below */
allOfThem = { ...allOfThem, ...require(file) };
});
module.exports = allOfThem;
Full Example
Directory structure
globExample/example.js
globExample/foobars/index.js
globExample/foobars/unexpected.js
globExample/foobars/barit.js
globExample/foobars/fooit.js
globExample/example.js
const { foo, bar, keepit } = require('./foobars/index');
const longStyle = require('./foobars/index');
console.log(foo()); // foo ran
console.log(bar()); // bar ran
console.log(keepit()); // keepit ran unexpected
console.log(longStyle.foo()); // foo ran
console.log(longStyle.bar()); // bar ran
console.log(longStyle.keepit()); // keepit ran unexpected
globExample/foobars/index.js
const glob = require("glob");
/*
Note the following style also works with multiple exports per file (barit.js example)
but will overwrite if you have 2 exports with the same
name (unexpected.js and barit.js have a keepit function) in the files being imported. As a result, this method is best used when
your exporting one module per file and use the filename to easily identify what is in it.
Also Note: This ignores itself (index.js) by default to prevent infinite loop.
*/
let allOfThem = {};
glob.sync(`${__dirname}/*.js`).forEach((file) => {
allOfThem = { ...allOfThem, ...require(file) };
});
module.exports = allOfThem;
globExample/foobars/unexpected.js
exports.keepit = () => 'keepit ran unexpected';
globExample/foobars/barit.js
exports.bar = () => 'bar run';
exports.keepit = () => 'keepit ran';
globExample/foobars/fooit.js
exports.foo = () => 'foo ran';
From inside the project, with glob installed, run node example.js:
$ node example.js
foo ran
bar run
keepit ran unexpected
foo ran
bar run
keepit ran unexpected
One module that I have been using for this exact use case is require-all.
It recursively requires all files in a given directory and its subdirectories, as long as they don't match the excludeDirs property.
It also allows specifying a file filter and how to derive the keys of the returned hash from the filenames.
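A sketch of that usage; the option names (dirname, filter, excludeDirs, recursive) are the ones I recall from require-all's README, so verify them against the package documentation:
const requireAll = require('require-all');

// Assumed options: the filter's capture group becomes the key in the returned hash.
const controllers = requireAll({
  dirname: __dirname + '/controllers',
  filter: /(.+Controller)\.js$/,
  excludeDirs: /^\.(git|svn)$/,
  recursive: true
});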
Require all files from the routes folder and apply them as middleware. No external modules needed.
// require
const { readdirSync } = require("fs");
// apply as middleware
readdirSync("./routes").map((r) => app.use("/api", require("./routes/" + r)));
I'm using the copy-to module to create a single file that requires all the files in our Node.js-based system.
The code for our utility file looks like this:
/**
 * Module dependencies.
 */

var copy = require('copy-to');

copy(require('./module1'))
  .and(require('./module2'))
  .and(require('./module3'))
  .to(module.exports);
In all of the files, most functions are written as exports, like so:
exports.function1 = function () { /* function contents */ };
exports.function2 = function () { /* function contents */ };
exports.function3 = function () { /* function contents */ };
So, then to use any function from a file, you just call:
var utility = require('./utility');
var response = utility.function2(); // or whatever the name of the function is
You can use https://www.npmjs.com/package/require-file-directory
Require selected files by name only, or all files.
No need for absolute paths.
Easy to understand and use.
Using this function you can require a whole dir.
const PATH = require("path");

const GetAllModules = (dirname) => {
  if (dirname) {
    let dirItems = require("fs").readdirSync(dirname);
    return dirItems.reduce((acc, value, index) => {
      if (PATH.extname(value) === ".js" && value.toLowerCase() !== "index.js") {
        let moduleName = value.replace(/\.js$/, '');
        acc[moduleName] = require(`${dirname}/${moduleName}`);
      }
      return acc;
    }, {});
  }
};
// calling this function.
let dirModules = GetAllModules(__dirname);
Create an index.js file in your folder with this code:
const fs = require('fs')
const path = require('path')

// Read this folder's own contents and require every other .js file in it
for (const file of fs.readdirSync(__dirname)) {
  if (file !== 'index.js' && path.extname(file) === '.js') require('./' + file)
}
After that you can simply load the whole folder with require("./routes").
If you want to include all *.js files in a directory, for example app/lib/*.js:
In directory app/lib
example.js:
module.exports = function (example) { }
example-2.js:
module.exports = function (example2) { }
In directory app create index.js
index.js:
module.exports = require('./lib');
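Note that requiring the app/lib directory like this only works if app/lib contains an index.js of its own that gathers the files; a minimal sketch, along the lines of the loader answers above:
// app/lib/index.js
const fs = require('fs');
const path = require('path');

// Export every sibling *.js file under its basename, skipping this index file.
fs.readdirSync(__dirname)
  .filter((file) => file.endsWith('.js') && file !== 'index.js')
  .forEach((file) => {
    exports[path.basename(file, '.js')] = require('./' + file);
  });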
