testing a function from node jasmine - node.js

I'm still fairly new to testing and am trying to figure out how to test a standalone function with Jasmine on Node. Up to now I've only tested methods on my objects.
I have a node.js module which only exports two of my functions, but I need to be able to test the other functions without exporting them, as I don't want them to be public.
Here is some example code (the real code isn't important):
function initialize(item) {
  // do some initializing
  return item;
}

function update(x) {
  if (!this.initialized) {
    initialize(this);
  }
  this.value = x;
  return this;
}
module.exports.update = update;
How would I write a test for the initialize function, without using the update method?
Is there a way to do this? Or does everything have to be a part of an object, and then I only export the parts that I need?

By design, functions you don't export are not accessible from outside: tests that require your module only see its exports. So you can exercise the initialize function only by calling the exported update function. Since the module initializes only once, you can clear the require cache before each test so the module is required again in a fresh, uninitialized state. Clearing the require cache is officially supported, not a hack.
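For example, a minimal sketch of such a spec (assuming the module above is saved as item.js next to the spec file; the file and spec names are illustrative, not from the original post):

// item.spec.js
describe('update', () => {
  let update;

  beforeEach(() => {
    // drop the cached copy so every spec gets a fresh, uninitialized module
    delete require.cache[require.resolve('./item')];
    ({ update } = require('./item'));
  });

  it('initializes the target before updating it', () => {
    const target = {};
    update.call(target, 42);
    expect(target.value).toBe(42);
  });
});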

Related

Node.js Globalize es6 modules to act like ImportScripts

The question is simple: how do we make ES6 modules act like the importScripts function available in web browsers?
Explanation
The main reason is to soften the blow for developers as they change their code from ES5 syntax to ES6, so that the transition does not blow up your code the moment you make the change and find out there are a thousand errors due to missing inclusions. It also gives people the option to stay as-is indefinitely if they don't want to make the full change at all.
Desired output
ImportScript(file path or paths); can be applied globally (implicitly) across subsequently required code, and vice versa, inside a main file to avoid explicit inclusion in all files.
ES6 Inclusion
This still does not ignore the fact that all your libraries will depend on the module format as well, so it is inevitable that we will still have to include an export statement in every file we need to require. However, this should not prevent us from having a main file that interconnects them all without having to explicitly add includes to every file whenever you need a certain functionality.
DISCLAIMERS (numbered):
1. (Security) I understand there are many reasons that modules exist, and going around them is not advisable for security or load-time reasons. However, I am not sure about the risk (if any) of even using a method like eval() to include such scripts if you only do it once at the start of the application's life, and on a constant value that does not accept external input. The theory is that if an external entity is able to change the initial state of your program as it launches, then your system has already been compromised. So, as it stands, I think the whole argument of globalization vs. modules boils down to the project at hand (the security/speed needed) and preference/risk.
2. (Not for everyone) This is a utility; I am not implying that everyone should use it.
3. (Already published works) I have searched a lot for this functionality, but I am not infallible. So if a simple implementation of this has already been done that follows this specification (or a simpler one), I'd love to know how/where I can obtain such code. Then I will promptly mark that as the answer or just remove this thread entirely.
Example Code
ES5 Way
const fs = require('fs');
let path = require('path');

/* Only accepts scripts with global variables and functions;
   does not work with classes unless declared as a var. */
function include(f) {
  eval.apply(global, [fs.readFileSync(f).toString()]);
}
Main file Concept example:
ImportScript("filePath1");loaded first
ImportScript("filePath2");loaded second
ImportScript("filePath3");loaded third
ImportScript("filePath4");loaded fourth
ImportScript("filePath5");loaded fifth
ImportScript("someExternalDependency");sixth
/* where "functionNameFromFile4" is a function defined in
file4 , and "variableFromFile2" is a global dynamic
variable that may change over the lifetime of the
application.
*/
functionNameFromFile4(variableFromFile2);
/* The current context has access to the previous scripts' contexts,
   and those scripts recognize the current global context as well.
   In short: all scripts should be able to access variables and
   functions from other scripts implicitly through this mechanism,
   even if they are added after the fact. */
Typical exported file example (Covers all methods of export via modules):
/*where "varFromFile1" is a dynamic variable created in file1
that may change over the lifetime of the application and "var" is a
variable of type(varFromFile4) being concatenated/added together
with "varFromFile4".
*/
functionNameFromFile4(var){
return var+varFromFile1;
}
//Typical export statement
exportAllHere;
/* This is just an example and does not cover all usage cases,
   just an illustration of the possible functionality. */
CONCLUSION
So you still need to export from each file as required by the ES6 standard; however, you only need to import them once in a main file to globalize their functionality across all scripts.
I'm not personally a fan of globalizing all the exports from a module, but here's a little snippet that shows you how one ESM module's exports can be all assigned to the global object:
Suppose you had a simple module called operators.js:
export function add(a, b) {
  return a + b;
}

export function subtract(a, b) {
  return a - b;
}
You can import that module and then assign all of its exported properties to the global object with this:
import * as m from "./operators.js";

for (const [prop, value] of Object.entries(m)) {
  global[prop] = value;
}
// can now access the exports globally
add(1, 2);
FYI, I think the syntax:
include("filePath1")
is going to be tough in ESM modules because dynamic imports in an ESM module using import (which is presumably what you would have to use to implement the include() function you show) are asynchronous (they return a promise), not synchronous like require().
I wonder if a bundler or a transpiler would be an option?
There is experimental work in nodejs related to custom loaders here: https://nodejs.org/api/esm.html#hooks.
If you can handle your include() function returning a promise, here's how you put the above code into that function:
async function include(moduleName) {
  const m = await import(moduleName);
  for (const [prop, value] of Object.entries(m)) {
    global[prop] = value;
  }
  return m;
}
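A hypothetical call site then has to await the assignment before touching the globals (top-level await is available in ESM modules on current Node versions), for example:

// sketch only; "./operators.js" is the example module from above
await include("./operators.js");
add(1, 2); // the export is now reachable via the global object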

Is there a way to test functions with Jasmine without exposing them?

I have a utils.js file with some generic utility functions. I use module.exports to expose some of the functions while keep the rest hidden.
module.exports = {
  utils_1: function_1,
  utils_2: function_2
};
Suppose there are other functions, namely function_3 and function_4 that I would like not to expose, yet I want to test with Jasmine. The only way I can currently do it is by exposing them with the rest of the functions and writing my test suites as usual.
Is there a way to test some functions via Jasmine while not exposing them?
The only thing I can think of is creating a different JavaScript file containing (and exposing) function_3 and function_4, requiring that file in my utils.js, and testing the new file with Jasmine, but this would split my original file in two (and the same would apply to all other similar files).
function_3 and function_4 are used inside function_1 and function_2, right? So do you really need to unit test them? I would not, but other people might disagree.
Here is my example
// utils.js
function function_3(obj) {
  obj.a = 3;
  return obj;
}

function function_4(obj) {
  obj = function_3(obj); // invoke function_3
  obj.b = 4;
  return obj;
}

module.exports = { utils_4: function_4 };
The test:
const { utils_4 } = require('./utils');

describe('test 1', () => {
  it('should pass', () => {
    const obj = utils_4({});
    expect(obj.a).toBeDefined();
    expect(obj.a).toEqual(3);
    expect(obj.b).toBeDefined();
    expect(obj.b).toEqual(4);
  });
});
It should pass, right? Notice that I only test function_4, but if I change the implementation of function_3 in a way that breaks my test, the test will detect it.
If we change function_3 to
function function_3(obj) {
  obj.c = 3;
  return obj;
}
The test will fail.
This is a contrived example, but it is illustrative; take the good part of it. Again, other people might disagree and go with an approach that tests function_3 directly. Use a coverage library to see what is covered by your tests.
Hope it helps
Your private / unexposed functions are used within your exported functions, are they not? Therefore, they are tested implicitly as soon as you write all the test cases for your public / exposed functions.
Exposing functions or writing tests on private functions is missing the goal of unit tests: Testing the public interface.
Think of private / unexposed methods as implementation details. You should not write tests for those, as the private methods are abstracting the implementation detail. It becomes tedious and over-complicated to have tests break because some internal behavior changed slightly.
You should be able to refactor or rewrite private functions at a whim, yet as long as the public interface is fulfilled, your tests should remain green.
It may make sense to create a new module for your function_3 and function_4, depending on what they are doing for you. If I discover, that I want to test something internally, that's a sign to make it into its own module, with its own public interface.
So I'd say that your intention of moving the functions into a different JavaScript file is actually the right idea. (Yet you may realize that here only function_3 needs to be exposed, whereas function_4 can still be hidden).
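A rough sketch of that split (the file name internals.js is just illustrative):

// internals.js - new module exposing the former private helper
function function_3(obj) {
  obj.a = 3;
  return obj;
}

module.exports = { function_3 };

// utils.js - now requires the new module; function_4 can stay hidden
const { function_3 } = require('./internals');

function function_4(obj) {
  obj = function_3(obj);
  obj.b = 4;
  return obj;
}

module.exports = { utils_4: function_4 };

A spec can now require ./internals directly and exercise function_3 without utils.js ever exposing it.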

typescript replacement for require inside a function in nodejs

I am trying to convert a Node.js project to TypeScript, and while I mostly did not face really difficult obstacles during this process, the codebase has a few gotchas like this, mostly in startup code:
function prepareConfiguration() {
  let cloudConfigLoader = require('../utils/cloud-config');
  return cloudConfigLoader.ensureForFreshConfig().then(function() {
    // do some stuff
  });
}
I may just need advice on which approach to refactoring this requires the fewest code changes to make it work in TypeScript fashion.
In response to comments, more details:
That require loads a Node module, not a JSON file. From that module, the ensureForFreshConfig function contacts a cloud service to load a list of values used to rebuild a configuration state object.
The problem is that the module was made in the standard Node approach of "a module is a singleton object", and its dependencies include an auth component that will be ready only when the shown require call is made. I know it is not the best way to do it.
TypeScript does not allow "import that module later", except with dynamic import, which is problematic as mentioned in the comments.
The "good" approach is to refactor that part of the startup and make ensureForFreshConfig and its dependencies initialize their internals on demand via constructors. I just hoped for some solution to reduce the amount of work to be redone during this transition to TypeScript.
import { cloudConfigLoader } from '../utils/cloud-config'

async function prepareConfiguration() {
  await cloudConfigLoader.ensureForFreshConfig()
  // do some stuff
  // return some-stuff
}
The function is to be used as follows.
await prepareConfiguration()
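If you do eventually go for the refactor described in the question (initializing internals on demand via constructors), a rough sketch could look like the following; it is shown as plain JavaScript with illustrative names, not taken from the real cloud-config module, so add type annotations as needed:

// utils/cloud-config.js - illustrative sketch only
export class CloudConfigLoader {
  constructor(auth) {
    // the auth dependency is injected instead of being resolved at require time
    this.auth = auth;
  }

  async ensureForFreshConfig() {
    // contact the cloud service via this.auth and rebuild the configuration state
  }
}

// startup code: construct the loader only once auth is ready
// const loader = new CloudConfigLoader(auth);
// await loader.ensureForFreshConfig();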

Proper use of artifacts.require?

I am trying to understand how artifacts.require should be used. I've seen the standard paragraph describing it as being for migrations and testing. From this I infer that the globally scoped artifacts with its method require are automatically defined by the truffle executable tool when doing migrations or running tests. However, I am working with some code that uses artifacts.require outside the context of any migrations or tests, rather, this code just needs to do the usual at and new. However, in this context, the object artifacts is not defined.
Do I have the right picture here? Is this an appropriate use of artifacts.require? If so, what must be done to make it be defined outside of migrations and testing?
Thanks for any suggestions!
artifacts.require really isn't meant to be used outside of a test. This is where it is defined: https://github.com/trufflesuite/truffle-core/blob/3e96337c32aaae6885105661fd1a6792ab4494bf/lib/test.js#L240
In production code you should load the compiled contract into your application using truffle-contract: https://github.com/trufflesuite/truffle-contract
Here is a short example (from http://truffleframework.com/docs/getting_started/packages-npm#within-javascript-code; see also http://truffleframework.com/docs/getting_started/contracts#making-a-transaction):
var contract = require("truffle-contract");
var contractJson = require("example-truffle-library/build/contracts/SimpleNameRegistry.json");
var SimpleNameRegistry = contract(contractJson);

SimpleNameRegistry
  .deployed()
  .then(function(instance) {
    return instance.setRegistry(address);
  })
  .then(function(result) {
    // If this callback is called, the transaction was successfully processed.
    alert("Transaction successful!");
  })
  .catch(function(e) {
    // There was an error! Handle it.
  });
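Note that truffle-contract needs a provider wired up before deployed() will resolve. A minimal sketch, assuming web3 is installed and the URL below is just a placeholder for your own node:

var Web3 = require("web3");
SimpleNameRegistry.setProvider(new Web3.providers.HttpProvider("http://localhost:8545"));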

Returning a module in RequireJS

I'm refactoring a large javascript codebase to use RequireJS. Unfortunately, many of the files I'm dealing with are not object-oriented, and cannot return an object without significant modification. Is there a more efficient way to give 'dependent' modules access to the functions and variables contained in a module (without returning an object) ?
I have read about using the exports syntax for defining modules, but it is very unclear whether that would be a valid solution for this situation.
In a defined module, the exports object is what gets exported from the module and passed to whatever module requires it.
Consider this:
define(["exports"], function(exports){
exports.myCustomFunction = function(){};
exports.myCustomObject = {};
exports.myCustomVariable = true;
})
This module will place all the disparate functions and/or objects that you want made public onto the exports object.
At this point RequireJS will use that exports object to pass to a module that requires it:
require(["nameOfCustomModule|filename"], function(myCustomModule){
//evaluates to true
console.log(myCustomModule.myCustomVariable);
})
Here's a simple fiddle. Just bring up your console and you will see the value logged there. http://jsfiddle.net/xeucv/
Hope this clears it up a bit!
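As a side note, RequireJS also supports the simplified CommonJS wrapper, which some find a more natural fit when converting non-object-oriented files; the same module could be written as:

define(function(require, exports, module) {
  // attach whatever should be public onto exports, just like in Node
  exports.myCustomFunction = function() {};
  exports.myCustomObject = {};
  exports.myCustomVariable = true;
});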
