I am attempting to set up a Jest test against JSON key:value pairs, but I am getting test failures. Does anybody have experience in testing JSON key:value pairs?
File name CountryCodeMapping.js
export const CountryCodeMappingList = {
  "United States": "USA",
  "Argentina": "ARG",
  ...
Test File name CountryCodeMapping.test.js
import { CountryCodeMappingList } from "./CountryCodeMappingList.js";

describe("Constant to ensure no changes are made that pass tests", () => {
  test("are json key value pairs matching", () => {
    expect(CountryCodeMappingList).toContain(
      "United States: USA",
      "Argentina: ARG",
      ...
    );
  });
});
To test the key:value pairs (i.e. the JSON object):
import { CountryCodeMappingList } from "./CountryCodeMappingList.js";

test("are json key value pairs matching", () => {
  expect(CountryCodeMappingList).toMatchObject({
    "United States": "USA",
    "Argentina": "ARG",
    ...
  });
});
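If the goal is for the test to fail on any change to the mapping, a stricter variant (a sketch, assuming you list the full expected object) is toEqual, since toMatchObject only checks the subset of keys you pass it:

test("country code mapping is unchanged", () => {
  expect(CountryCodeMappingList).toEqual({
    "United States": "USA",
    "Argentina": "ARG",
    ... // the full expected list goes here
  });
});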
Related
I am trying to encode text to Base64 using Node.js, and then while reading the data I decode it back from Base64. Now I need to convert the data into JSON so I can fill the relevant fields, but it's not converting to JSON.
Here is my code:
fs.readFile('./setting.txt', 'utf8', function(err, data) {
if (err) throw err;
var encodedData = base64.encode(data);
var decoded = base64.decode(encodedData).replace(/\n/g, '').replace(/\s/g, " ");
return res.status(200).send(decoded);
});
In setting.txt I have the following text:
LENGTH=1076
CRC16=28653
OFFSET=37
MEASUREMENT_SAMPLING_RATE=4
MEASUREMENT_RANGE=1
MEASUREMENT_TRACE_LENGTH=16384
MEASUREMENT_PRETRIGGER_LENGTH=0
MEASUREMENT_UNIT=2
MEASUREMENT_OFFSET_REMOVER=1
This decodes the result properly, but when I use JSON.parse(JSON.stringify(decoded)) it's not converting to JSON.
Can someone help me with it?
Try the snippet below (using Buffer.from, since new Buffer() is deprecated):
let base64Json = Buffer.from(JSON.stringify({}), 'utf8').toString('base64');
let json = Buffer.from(base64Json, 'base64').toString('utf8');
What does base-64 encoding/decoding have to do with mapping a list of tuples (key/value pairs) like this:
LENGTH=1076
CRC16=28653
OFFSET=37
MEASUREMENT_SAMPLING_RATE=4
MEASUREMENT_RANGE=1
MEASUREMENT_TRACE_LENGTH=16384
MEASUREMENT_PRETRIGGER_LENGTH=0
MEASUREMENT_UNIT=2
MEASUREMENT_OFFSET_REMOVER=1
into JSON?
If you want to "turn it (the above) into JSON", you need to:
Decide on what its JSON representation should be, then
Parse it into its component bits and convert those into an appropriate data structure, and then
Use JSON.stringify() to convert it to JSON.
For instance:
function jsonify( document ) {
  const tuples = document
    .split( /\n|\r\n?/ )
    .map( x => x.split( '=', 2 ) )
    .map( ([k, v]) => {
      // Convert numeric values to numbers; keep non-numeric values as strings
      const n = Number(v);
      return [ k, Number.isNaN(n) ? v : n ];
    });
  const obj = Object.fromEntries(tuples);
  const json = JSON.stringify(obj);
  return json;
}
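For example, applied to the decoded settings text from the question (trimmed so a trailing newline doesn't produce an empty entry), the first few keys would come out as:

jsonify('LENGTH=1076\nCRC16=28653\nOFFSET=37')
// => '{"LENGTH":1076,"CRC16":28653,"OFFSET":37}'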
The test is linked to this question here which I raised (& was resolved) a few days ago. My current test is:
// Helpers
function getObjectStructure(runners) {
const backStake = runners.back.stake || expect.any(Number).toBeGreaterThan(0)
const layStake = runners.lay.stake || expect.any(Number).toBeGreaterThan(0)
return {
netProfits: {
back: expect.any(Number).toBeGreaterThan(0),
lay: expect.any(Number).toBeGreaterThan(0)
},
grossProfits: {
back: (runners.back.price - 1) * backStake,
lay: layStake
},
stakes: {
back: backStake,
lay: layStake
}
}
}
// Mock
const funcB = jest.fn(pairs => {
return pairs[0]
})
// Test
test('Should call `funcB` with correct object structure', () => {
const params = JSON.parse(fs.readFileSync(paramsPath, 'utf8'))
const { arb } = params
const result = funcA(75)
expect(result).toBeInstanceOf(Object)
expect(funcB).toHaveBeenCalledWith(
Array(3910).fill(
expect.objectContaining(
getObjectStructure(arb.runners)
)
)
)
})
The object structure of arb.runners is this:
{
"back": {
"stake": 123,
"price": 1.23
},
"lay": {
"stake": 456,
"price": 4.56
}
}
There are many different tests around this function, mainly dependent upon the argument that is passed into funcA. For this example, it's 75. A different length of array is passed to funcB dependent upon this parameter. However, it's now also dependent on whether the runners (back and/or lay) have existing stake properties. I have a beforeAll in each test which manipulates the arb in the file where I hold the params, hence the input for the runners is different every time. An outline of what I'm trying to achieve is:
Measure that the array passed into funcB is of the correct length
Measure that the objects within the array have the correct structure:
2.1 If there are stakes with the runners, that's fine & the test is straightforward
2.2 If no stakes are with the runners, I need to test that the netProfits, grossProfits, & stakes properties all have positive Numbers
2.2 is the one I'm struggling with. With my attempt above, the test fails with the following error:
TypeError: expect.any(...).toBeGreaterThan is not a function
As with the previous question, the problem is that expect.any(Number).toBeGreaterThan(0) is incorrect because expect.any(...) is not an assertion and doesn't have matcher methods. The result of expect.any(...) is just a special value that is recognized by Jest equality matchers. It cannot be used in an expression like (runners.back.price - 1) * backStake.
If the intention is to extend the equality check with custom behaviour, this is a case for a custom matcher. Since spy matchers use the built-in equality check anyway, spy arguments need to be asserted explicitly with the custom matcher.
Otherwise, additional restrictions should be asserted manually. It should be:
function getObjectStructure() {
return {
netProfits: {
back: expect.any(Number),
lay: expect.any(Number)
},
grossProfits: {
back: expect.any(Number),
lay: expect.any(Number)
},
stakes: {
back: expect.any(Number),
lay: expect.any(Number)
}
}
}
and
expect(result).toBeInstanceOf(Object)
expect(funcB).toHaveBeenCalledTimes(1);
expect(funcB).toHaveBeenCalledWith(
Array(3910).fill(
expect.objectContaining(
getObjectStructure()
)
)
)
const funcBArg = funcB.mock.calls[0][0];
const nonPositiveNetProfitsBack = funcBArg
  .map(({ netProfits: { back } }, i) => [i, back])
  .filter(([, val]) => !(val > 0))
  .map(([i, val]) => `netProfits:back:${i}:${val}`);
expect(nonPositiveNetProfitsBack).toEqual([]);
const nonPositiveNetProfitsLay = ...
Where !(val > 0) is necessary to detect NaN. Without a custom matcher, a failed assertion won't result in a meaningful message, but the index and the nonPositiveNetProfitsBack temporary variable name can give enough feedback to spot the problem. The array can additionally be remapped to contain meaningful values like a string, and so occupy less space in errors.
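If you go the custom matcher route mentioned above, a minimal sketch could look like this (the matcher name toBePositiveNumber is made up for illustration; it relies on Jest's expect.extend, and matchers registered this way are also usable as asymmetric matchers inside toHaveBeenCalledWith):

expect.extend({
  toBePositiveNumber(received) {
    // Pass only for numbers greater than zero; NaN fails the comparison
    const pass = typeof received === 'number' && received > 0;
    return {
      pass,
      message: () => `expected ${received} to be a positive number`,
    };
  },
});

// It could then replace expect.any(Number) in the expected structure, e.g.:
// netProfits: { back: expect.toBePositiveNumber(), lay: expect.toBePositiveNumber() }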
I have a config file. It has variables stored in the following manner.
[general]
webapp=/var/www
data=/home/data
[env]
WEBAPP_DEPLOY=${general:webapp}/storage/deploy
SYSTEM_DEPLOY=${general:data}/deploy
As you can see it has 2 sections general and env. Section env uses the variables from section general.
So I want to read this file into a variable. Let's say config. Here's I want config object to look like:
{
general: {
webapp: '/var/www',
data: '/home/data'
},
env: {
WEBAPP_DEPLOY: '/var/www/storage/deploy',
SYSTEM_DEPLOY: '/home/data/deploy'
}
}
In general, I am looking for a config parser for Node.js that supports string interpolation.
I would assume most ini libraries don't include the variable expansion functionality, but with lodash primitives a generic "deep object replacer" isn't too complex.
I've switched the : delimiter for . so has and get can look up values directly.
const { get, has, isPlainObject, reduce } = require('lodash')
// Match all tokens like `${a.b}` and capture the variable path inside the parens
const re_token = /\${([\w$][\w\.$]*?)}/g
// If a string includes a token and the token exists in the object, replace it
function tokenReplace(value, key, object){
if (!value || !value.replace) return value
return value.replace(re_token, (match_string, token_path) => {
if (has(object, token_path)) return get(object, token_path)
return match_string
})
}
// Deep clone any plain objects and strings, replacing tokens
function plainObjectReplacer(node, object = node){
return reduce(node, (result, value, key) => {
result[key] = (isPlainObject(value))
? plainObjectReplacer(value, object)
: tokenReplace(value, key, object)
return result
}, {})
}
> plainObjectReplacer({ a: { b: { c: 1 }}, d: 'wat', e: '${d}${a.b.c}' })
{ a: { b: { c: 1 } }, d: 'wat', e: 'wat1' }
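To wire this up with the config file from the question, one possible approach (a sketch, assuming the npm ini package and that the file is saved as config.ini with no extra : characters in section or key names) is to parse the file, rewrite the ${general:webapp} style tokens to ${general.webapp}, and then run the replacer:

const fs = require('fs')
const ini = require('ini')

// Parse the ini file into a plain nested object
const parsed = ini.parse(fs.readFileSync('./config.ini', 'utf8'))

// Switch the ":" delimiter inside tokens to "." so get/has can resolve the paths
const dotted = JSON.parse(
  JSON.stringify(parsed).replace(/\$\{([\w$]+):/g, (match, section) => '${' + section + '.')
)

console.log(plainObjectReplacer(dotted))
// => { general: { webapp: '/var/www', data: '/home/data' },
//      env: { WEBAPP_DEPLOY: '/var/www/storage/deploy',
//             SYSTEM_DEPLOY: '/home/data/deploy' } }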
You'll find most config management tools (like ansible) can do this sort of variable expansion for you before app runtime, at deployment.
I'm trying to select certain keys from a JSON object, and filter out the rest.
var json = JSON.stringify(body);
which is:
{
"FirstName":"foo",
"typeform_form_submits":{
"foo":true,
"bar":true,
"baz":true
},
"more keys": "foo",
"unwanted key": "foo"
}
What I want:
{
"FirstName":"foo",
"typeform_form_submits":{
"foo":true,
"bar":true,
"baz":true
}
}
I've checked out How to filter JSON data in node.js?, but I'm looking to do this without any packages.
Now you can use Object.fromEntries like so:
Object.fromEntries(Object.entries(raw).filter(([key]) => wantedKeys.includes(key)))
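For example, with the object from the question (the raw and wantedKeys names here are just placeholders):

const raw = {
  "FirstName": "foo",
  "typeform_form_submits": { "foo": true, "bar": true, "baz": true },
  "more keys": "foo",
  "unwanted key": "foo"
};
const wantedKeys = ["FirstName", "typeform_form_submits"];

const filtered = Object.fromEntries(
  Object.entries(raw).filter(([key]) => wantedKeys.includes(key))
);
console.log(JSON.stringify(filtered));
// {"FirstName":"foo","typeform_form_submits":{"foo":true,"bar":true,"baz":true}}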
You need to filter your object before passing it to JSON.stringify:
const rawJson = {
"FirstName":"foo",
"typeform_form_submits":{
"foo":true,
"bar":true,
"baz":true
},
"more keys": "foo",
"unwanted key": "foo"
};
// This array will serve as a whitelist to select keys you want to keep in rawJson
const filterArray = [
"FirstName",
"typeform_form_submits",
];
// this function filters source keys (one level deep) according to whitelist
function filterObj(source, whiteList) {
const res = {};
// iterate over each keys of source
Object.keys(source).forEach((key) => {
// if whiteList contains the current key, add this key to res
if (whiteList.indexOf(key) !== -1) {
res[key] = source[key];
}
});
return res;
}
// outputs the desired result
console.log(JSON.stringify(filterObj(rawJson, filterArray)));
var raw = {
"FirstName":"foo",
"typeform_form_submits":{
"foo":true,
"bar":true,
"baz":true
},
"more keys": "foo",
"unwanted key": "foo"
}
var wantedKeys =["FirstName","typeform_form_submits" ]
var opObj = {}
Object.keys(raw).forEach( key => {
if(wantedKeys.includes(key)){
opObj[key] = raw[key]
}
})
console.log(JSON.stringify(opObj))
I know this question was asked a while back, but I wanted to just toss this out there, since nobody else did:
If you're bound and determined to do this with stringify, one of its less-well-known capabilities involves replacer, its second parameter. For example:
// Creating a demo data set
let dataToReduce = {a:1, b:2, c:3, d:4, e:5};
console.log('Demo data:', dataToReduce);
// Providing an array to reduce the results down to only those specified.
let reducedData = JSON.stringify(dataToReduce, ['a','c','e']);
console.log('Using [reducer] as an array of IDs:', reducedData);
// Running a function against the key/value pairs to reduce the results down to those desired.
let processedData = JSON.stringify(dataToReduce, (key, value) => (value%2 === 0) ? undefined: value);
console.log('Using [reducer] as an operation on the values:', processedData);
// And, of course, restoring them back to their original object format:
console.log('Restoration of the results:', '\nreducedData:', JSON.parse(reducedData), '\nprocessedData:', JSON.parse(processedData));
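For reference, both approaches keep only the odd-valued entries in this demo, so reducedData and processedData each stringify to {"a":1,"c":3,"e":5}.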
In the above code snippet, the key value pairs are filtered using stringify exclusively:
In the first case, by providing an array of strings, representing the keys you wish to preserve (as you were requesting)
In the second, by running a function against the values, and dynamically determining which to keep (which you didn't request, but it's part of the same parameter, and may help someone else)
In the third, their respective conversions back to plain objects (using .parse()).
Now, I want to stress that I'm not advocating this as the appropriate method to reduce an object (though it will make a clean SHALLOW copy of said object, and is actually surprisingly performant), if only from an obscurity/readability standpoint, but it IS a totally-effective (and mainstream; that is: it's built into the language, not a hack) option/tool to add to the arsenal.
I am testing a RESTful webservice using SoapUI. We use Groovy for that.
I am using JsonSlurper to parse the response as an Object type.
Our response is similar to this:
{
"language":[
{
"result":"PASS",
"name":"ENGLISH",
"fromAndToDate":null
},
{
"result":"FAIL",
"name":"MATHS",
"fromAndToDate": {
"from":"02/09/2016",
"end":"02/09/2016"
}
},
{
"result":"PASS",
"name":"PHYSICS",
"fromAndToDate":null
}
]
}
After this, I am stuck on how to:
Get the array (because this is an array; it starts with "language")
Get the value from each array element by passing the key (I should get the value of the result key only if name='MATHS'.)
I could do it using Java, but as I am just now learning Groovy I could not work this out. We have different keys with the same names.
You can just parse it into a map, then use standard Groovy functions:
def response = '''{
"language":[
{"result":"PASS","name":"ENGLISH","fromAndToDate":null},
{"result":"FAIL","name":"MATHS","fromAndToDate":{"from":"02/09/2016","end":"02/09/2016"}},
{"result":"PASS","name":"PHYSICS","fromAndToDate":null}
]
}'''
import groovy.json.*
// Parse the Json string
def parsed = new JsonSlurper().parseText(response)
// Get the value of "languages" (the list of results)
def listOfCourses = parsed.language
// For this list of results, find the one where name equals 'MATHS'
def maths = listOfCourses.find { it.name == 'MATHS' }
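From that entry, maths.result then gives the value of the result key for MATHS (here "FAIL").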