Write a module to a new file - node.js

I have the following logic that imports a TypeScript module and makes some changes to it. Below is a simplified version of the code.
const fs = require('fs')
// No deep nestings, and no arrays (only simple key: `string-values`)
const POSTCSS_SELECTORS = {
  propA: `value`,
  probB: `value`
}
fs.write('fileName.js', POSTCSS_SELECTORS)
After obtaining POSTCSS_SELECTORS I would like to save the output as a CommonJS module, as described below.
Expected output:
fileName.js:
module.exports = {
  propA: `value`,
  probB: `value`
}
I would appreciate it a lot if you could suggest a valid workaround for this case :)

You're 95% there, you just need to write out the rest of the text that you want in your file:
const fs = ...
const POSTCSS_SELECTORS = ...
// prepare that object for writing to file:
const json = JSON.stringify(POSTCSS_SELECTORS, null, 2);
// and then "template" it into the final text you want written.
fs.writeFile(`fileName.js`, `module.exports = ${json}`, (err) => {
  if (err) throw err;
});
But, if you have a file with a hardcoded variable called POSTCSS_SELECTORS, why not make a file called postcss_selectors.js instead, and put the hardcoded object there, instead? No need to "make a file based on this object" if what you want is that object, as a module.
Just make that module.
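As a rough sketch of that second route (postcss_selectors.js is just the file name suggested above), the module itself is nothing more than:
// postcss_selectors.js
module.exports = {
  propA: `value`,
  probB: `value`
}

// anywhere the object is needed:
const POSTCSS_SELECTORS = require('./postcss_selectors.js')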

Related

Node.js - Issue with my syntax in require(module). When to use {} and when not to use {} in require('module')

I have a question about the syntax of the require statement. Please refer to the sample code below.
const nodemailer = require("nodemailer");
const { google } = require('googleapis');
const { OAuth2 } = google.auth;
Sometimes, I see sample code which uses
const {<variable>} = require('moduleName')
Other times, I see code like this:
const <variable> = require('moduleName')
What is the difference between them?
Thanks in Advance.
Grateful to the Developers Community.
So, you use { } in this context when you want to do object destructuring to get a property from the exported object and create a module-level variable with that same name.
This:
const { google } = require('googleapis');
is a shortcut for this:
const __g = require('googleapis');
const google = __g.google;
So, within this context, you use the { google } only when you want the .google property from the imported module.
If you want the entire module handle such as this:
const nodemailer = require("nodemailer");
then you don't use the { }. The only way to know which one you want for any given module is to consult the module's documentation, its code, or examples of how to use it. It depends entirely upon what the module exports and whether you want the top-level export object or a property of that object.
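If the documentation is unclear, a quick sanity check is to require the module and log it to see what shape its export object has (using nodemailer from the question as the example):
const nodemailer = require("nodemailer");
// lists the property names you could destructure, if any
console.log(Object.keys(nodemailer));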
It's important to realize that the { } used with require() is not special syntax associated with require(). This is normal object destructuring assignment, the same as if you did this:
// define some object
const x = { greeting: "hello" };
// use object destructuring assignment to create a new variable
// that contains the property of an existing object
const { greeting } = x;
console.log(greeting); // "hello"
When you import a function with {}, it means you import just one function that is available in the package. Maybe you have seen:
const {googleApi, googleAir, googleWater} = require("googleapis")
But when you are not using {}, it means you import the whole package; just write:
const google = require("googleapis")
So, let's say you need googleApi in your code. You can call it as:
google.googleApi
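To make the difference concrete, here is a small self-contained sketch (math.js and add are made-up names, not part of googleapis):
// math.js
module.exports = { add: (a, b) => a + b };

// whole export object:
const math = require('./math');
math.add(1, 2); // 3

// just the one property, via destructuring:
const { add } = require('./math');
add(1, 2); // 3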

How to detect if a script exports something when loaded?

Assuming I have a script 'helpers.js' in the same directory as 'main.js', and load the first in the latter, 'helpers.js' will return an empty object {} when not exporting anything, no matter what its content is, which is the default value of the module.exports variable, I guess.
Is there a way to check if the empty object was intentionally exported by 'helpers.js'?
Example 1, no export:
// helpers.js
// Any valid Javascript here, just no exports.
// main.js
const help = require('./helpers')
console.log(help)
>>> {}
Example 2, with export:
// helpers.js
let exportMe = require('./you').getNewestMe
module.exports = exportMe()
// main.js
const help = require('./helpers')
console.log(help)
>>> {}
I had something different in mind but I realized that my original solution wouldn't work, so let's see...
When a CommonJS module is loaded (i.e. required for the first time), Node.js runs a special function - a loader. The loader function has access to the state of a module before it is actually loaded, so it can inspect and modify the original values of some properties of a module like module.exports. The idea is that if we can somehow mark that original value, we can distinguish it from an empty object assigned later while the module was being loaded.
The loader function can be retrieved and changed with Module._extensions['.js'], where Module is the module constructor, i.e. require('module') or module.constructor. (Module._extensions is not the same as the deprecated require.extensions, but that's another story...)
The loader function is invoked with two arguments: the module object and the filename. The original exports object is then, unsurprisingly, module.exports.
In the code below, I'm going to use a symbol to mark the original empty exports object:
const noExports = Symbol('no exports');
const extensions = module.constructor._extensions;
const jsLoader = extensions['.js'];
extensions['.js'] = (module, filename) => {
  module.exports[noExports] = filename;
  jsLoader(module, filename);
};
const help1 = require('./helpers-without-exports.js');
const help2 = require('./helpers-with-empty-exports.js');
console.log(noExports in help1); // true
console.log(noExports in help2); // false
This method only works for modules that are loaded after the loader function is patched, and it only determines if module.exports is changed after the module is created, so it does not detect values exported with exports.myFunction = ... (in which case the exported object would be non-empty anyway).
Another special case is a module that re-exports another module that does not export anything (e.g. module.exports = require('./foo'); where foo.js is an empty file). To detect those cases too, if necessary, one could compare the value of the symbol property against the module's resolved filename and see if they match:
console.log(help1[noExports] === require.resolve('./helpers-without-exports.js')); // true
console.log(help2[noExports] === require.resolve('./helpers-with-empty-exports.js')); // false

Require JSON as deep copy

I am writing tests right now for my node application.
I have fixtures which I use to test my data, and I ran into the problem that when I alter any of them in a method, they are altered globally for all the other tests as well, which obviously has to do with referencing. I figured that if I write my fixtures into a JSON file and require that JSON in each test file, they would have unique references per file; as it turns out, they don't.
My question would be: is there an easy way to handle fixtures in Node such that every file has its own instance of the fixtures which won't affect the other test files?
The way I currently import my fixtures in every test file:
const {fixture1, someOtherFixture } = require('../../../../../fixtures/keywords.json');
require calls are cached, so after the first call, subsequent calls return the same object.
You can do the following:
const {fixture1, someOtherFixture } = require('../../../../../fixtures/keywords.json');
const fixtureCopy = JSON.parse(JSON.stringify(fixture1));
const someOtherFixtureCopy = JSON.parse(JSON.stringify(someOtherFixture));
or use a package:
deepcopy
clone
const deepcopy = require('deepcopy');
const {fixture1, someOtherFixture } = require('../../../../../fixtures/keywords.json');
const fixtureCopy = deepcopy(fixture1);
const someOtherFixtureCopy = deepcopy(someOtherFixture);
Or change your module to export a function that will return new copies every time. This is the recommended approach, in my opinion.
// fixture.js
const deepcopy = require('deepcopy');
const fixture = { /* the object you have */ };

module.exports = {
  get() {
    return deepcopy(fixture); // hand out a fresh copy on every call
  }
}

// in a test file:
const fixture = require('./fixture');
const fixture1 = fixture.get();
This isn't specific to JSON. It's not uncommon that modules need to be re-evaluated in tests. require.cache can be modified in Node.js to affect how modules are cached, either directly or with helpers like decache.
Depending on the case,
decache('../../../../../fixtures/keywords.json')
goes before the require in a test, or into an afterEach hook to clean up.
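As a sketch of the afterEach variant (assuming Mocha/Jest-style hooks and the fixture path from the question):
const decache = require('decache');

let fixtures;

beforeEach(() => {
  // a fresh, uncached copy of the JSON for every test
  fixtures = require('../../../../../fixtures/keywords.json');
});

afterEach(() => {
  // drop the cached module so the next require re-reads the file
  decache('../../../../../fixtures/keywords.json');
});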

Unable to use variables in fs functions when using brfs

I use browserify in order to be able to use require. To use fs functions with browserify I need to transform them with brfs, but as far as I understand this results in only being able to pass static strings as parameters to my fs functions. I want to be able to use variables for this.
I want to search for XML files in a specific directory and read them, either by searching via a text field or by showing all of their data at once. In order to do this I need fs, and browserify in order to require it.
const FS = require('fs')

function lookForRoom() {
  let files = getFileNames()
  findSearchedRoom(files)
}

function getFileNames() {
  return FS.readdirSync('../data/')
}

function findSearchedRoom(files) {
  const SEARCH_FIELD_ID = 'room'
  let searchText = document.getElementById(SEARCH_FIELD_ID).value
  files.forEach((file) => {
    const SEARCHTEXT_FOUND = file.includes(searchText.toLowerCase())
    if (SEARCHTEXT_FOUND) loadXML(file)
  })
}

function loadXML(file) {
  const XML2JS = require('xml2js')
  let parser = new XML2JS.Parser()
  let data = FS.readFile('../data/' + file)
  console.dir(data);
}

module.exports = { lookForRoom: lookForRoom }
I want to be able to read contents out of a directory containing xml files.
Currently I can only do so when I provide a constant string to the fs function.
The brfs README contains this gotcha:
Since brfs evaluates your source code statically, you can't use dynamic expressions that need to be evaluated at run time.
So, basically, you can't use brfs in the way you were hoping.
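Roughly, the difference looks like this (the file name room1.xml is made up for illustration):
// works with brfs: the path is a compile-time constant,
// so the file's contents get inlined into the bundle
const staticXml = FS.readFileSync(__dirname + '/../data/room1.xml', 'utf8')

// does NOT work with brfs: `file` is only known at run time
const dynamicXml = FS.readFileSync('../data/' + file, 'utf8')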
I want to be able to read contents out of a directory containing xml files
If by "a directory" you mean "any random directory, the name of which is determined by some form input", then that's not going to work. Browsers don't have direct access to directory contents, either locally or on a server.
You're not saying where that directory exists. If it's local (on the machine the browser is running on): I don't think there are standardized APIs to do that, at all.
If it's on the server, then you need to implement an HTTP server that will accept a directory/file name from some client-side code, and retrieve the file contents that way.
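A minimal sketch of that server-side piece (the ./data location, the /rooms and /room routes, and the port are assumptions, not something from the question):
const http = require('http');
const fs = require('fs');
const path = require('path');

const DATA_DIR = path.join(__dirname, 'data');

http.createServer((req, res) => {
  const url = new URL(req.url, `http://${req.headers.host}`);
  if (url.pathname === '/rooms') {
    // list the available XML files
    fs.readdir(DATA_DIR, (err, files) => {
      if (err) { res.writeHead(500); return res.end(); }
      res.setHeader('Content-Type', 'application/json');
      res.end(JSON.stringify(files.filter((f) => f.endsWith('.xml'))));
    });
  } else if (url.pathname === '/room') {
    // serve one file by name, never letting the path escape DATA_DIR
    const name = path.basename(url.searchParams.get('file') || '');
    fs.readFile(path.join(DATA_DIR, name), (err, data) => {
      if (err) { res.writeHead(404); return res.end(); }
      res.setHeader('Content-Type', 'application/xml');
      res.end(data);
    });
  } else {
    res.writeHead(404);
    res.end();
  }
}).listen(3000);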

How to get properties defined outside of module.exports from the required object

I am struggling to get const properties defined outside of module.exports from the object where I am using that module. Here is a simple example:
ServiceX.js
const _ = require('lodash');
module.exports = {
  testFirstName: function () {
    console.log('Nodics');
  }
}
ServiceY.js
const utils = require('utils');
module.exports = {
  testLastName: function () {
    console.log('framework');
  }
}
Now if I import both files via require and merge them via _.merge(), the output contains both of the methods, but it doesn't contain any of the const variables defined outside the exports.
let combined = _.merge(require('ServiceX'), require('ServiceY'));
I am writing this combined object to a third file, something like MyService.
Even if I print this combined object via console.log(combined), I get only the two functions, not the const properties.
Use case I have:
I have n number of files in different locations; I need to read the files, merge them all, and create a new file with the merged content.
Please help me,
Thanks
A problem here is that a service currently looks like:
const _ = require('lodash');
module.exports = {
  testFirstName: function () {
    console.log('Nodics');
  }
}
And the reason for that is that if you read all files and concatenate the result then module.exports will be overwritten by the last read file.
But if it instead looked like:
const _ = require('lodash');
module.exports.testFirstName = function () {
  console.log('Nodics');
}
You could pull it off by doing fs.readFileSync() on all your services and concatenating the result.
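A rough sketch of doing that in Node (the ./services directory and the MyService.js output name are assumptions here):
const fs = require('fs');
const path = require('path');

const SERVICE_DIR = path.join(__dirname, 'services');

// read every service file and glue the sources together
const combined = fs.readdirSync(SERVICE_DIR)
  .filter((file) => file.endsWith('.js'))
  .map((file) => fs.readFileSync(path.join(SERVICE_DIR, file), 'utf8'))
  .join('\n');

fs.writeFileSync(path.join(__dirname, 'MyService.js'), combined);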
You could also do it as a build command, rather than in code:
cat service*.js > combined.js
You will most likely run into other problems, since if one service uses const _ = require('lodash') and another is doing the same you will try to redefine a const variable.
To get around this you would have to move your requires into another scope, so they don't conflict at file-level scope.
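For example (just a sketch, not the asker's code), each service could require lodash inside the function that needs it instead of at the top of the file:
module.exports.testFirstName = function () {
  const _ = require('lodash'); // scoped to this function, so no top-level const collision
  console.log('Nodics');
}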

Resources