Difficult to understand what exactly .has does with array in joi - node.js

I was trying to learn Joi to validate a schema, and I came across the following content:
const schema = Joi.array().items(
  Joi.object({
    a: Joi.string(),
    b: Joi.number()
  })
).has(Joi.object({ a: Joi.string().valid('a'), b: Joi.number() }))
And I validated the schema as follows: let c = schema.validate([{ a: 'a', b: 'b' }]). The response I got on validation is:
{
  value: [ { a: 'a', b: 'b' } ],
  error: [Error [ValidationError]: "[0].b" must be a number] {
    _original: [ [Object] ],
    details: [ [Object] ]
  }
}
Then I tried the same validation with the schema defined as follows:
const arr2 = Joi.array().items(
  Joi.object({
    a: Joi.string(),
    b: Joi.number()
  })
)
Even then I got the same validation response:
{
  value: [ { a: 'a', b: 'b' } ],
  error: [Error [ValidationError]: "[0].b" must be a number] {
    _original: [ [Object] ],
    details: [ [Object] ]
  }
}
I'm just confused as to what the use of .has is in the first schema, as I could put the valid value directly, as below:
const arr2 = Joi.array().items(
  Joi.object({
    a: Joi.string().valid('a'),
    b: Joi.number()
  })
)
So what exact purpose is .has serving in the first schema?

As stated in the .has() documentation:
Verifies that a schema validates at least one of the values in the array
The example schema...
const schema = Joi.array().items(
  Joi.object({
    a: Joi.string(),
    b: Joi.number()
  })
).has(Joi.object({ a: Joi.string().valid('a'), b: Joi.number() }))
...requires at least one of the array items to pass the validation { a: Joi.string().valid('a'), b: Joi.number() }, e.g.
{ "a": "a", "b": 12345 }
Your schema on the other hand...
const arr2 = Joi.array().items(
  Joi.object({
    a: Joi.string().valid('a'),
    b: Joi.number()
  })
)
...will only accept items that pass the validation { a: Joi.string().valid('a'), b: Joi.number() }.
The first schema will still accept something like...
[
  {
    "a": "foo",
    "b": 12
  },
  {
    "a": "a",
    "b": 99
  },
  {
    "a": "bar",
    "b": 6
  }
]
...where yours will not, because not all a keys have the value 'a'.
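To see the difference side by side, here is a minimal runnable sketch (assuming the standalone joi package; the schema names withHas and allValid are illustrative only):

const Joi = require('joi');

// Loose item shape plus .has(): at least ONE item must have a === 'a'.
const withHas = Joi.array().items(
  Joi.object({ a: Joi.string(), b: Joi.number() })
).has(Joi.object({ a: Joi.string().valid('a'), b: Joi.number() }));

// Strict items, no .has(): EVERY item must have a === 'a'.
const allValid = Joi.array().items(
  Joi.object({ a: Joi.string().valid('a'), b: Joi.number() })
);

const mixed = [
  { a: 'foo', b: 12 },
  { a: 'a', b: 99 },
  { a: 'bar', b: 6 }
];

console.log(withHas.validate(mixed).error);  // undefined - the second item satisfies .has()
console.log(allValid.validate(mixed).error); // ValidationError - "[0].a" fails valid('a')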

Related

Atomic deep merge for document MongoDB

I want to atomically deep merge a document in MongoDB. (I am using Node.js)
For example,
I have this document in my DB:
{
  id: 123,
  field1: {
    field2: "abc"
  }
}
And I am given this object:
{
  field1: {
    field3: "def"
  }
}
So I'm looking for my document in the DB to change to this:
{
  id: 123,
  field1: {
    field2: "abc",
    field3: "def"
  }
}
Of course I can do this in a non-atomic way by:
First fetching the document from the DB.
Then in JS do the merge and save it to a new object.
And finally overwrite the new item to the DB.
But I want this to happen in an atomic manner (so the read and the write stages will not happen separately). Is there any way of doing this?
I have heard the aggregation methods $merge and $mergeObjects might help me with that. Question is - are they atomic?
Thanks all :)
You can use the $function operator (available since MongoDB 4.4) and go recursive.
Consider this doc in the DB:
{
  field1: {
    field6: "pre",
    field2: "abc",
    fieldX: {
      aa: [1,2,3],
      bb: "pre",
      cc: "pre"
    }
  }
}
This is our candidate merge/overwrite doc:
var my_object = {
  field1: {
    field3: "def",
    field6: "post",
    fieldX: {
      bb: "post",
      zz: "post"
    }
  },
  field4: 77,
  field22: "qqq"
};
field4 and field22 are "easy"; they just get added to the doc. Where a field already exists on both sides, like field1, we have to test for object vs. non-object. For objects, we recursively descend into both the db object and the candidate. For non-objects, we let the candidate "win" and its value simply overwrites that in the db. This pipeline:
var c = db.foo.aggregate([
  {$replaceRoot: {newRoot: {$function: {
    body: function(db_object, my_object) {
      var process = function(myo, key, dbo) {
        var db_value = dbo[key];
        if(undefined == db_value) {
          // Easy; db does not have the entry at all; just set it:
          dbo[key] = myo[key];
        } else {
          // my_value has to be non-null because this function
          // is driven from the keyset in myo.
          var my_value = myo[key];
          if(db_value instanceof Object && my_value instanceof Object) {
            // Both objects with the same name; descend!
            walkObj(db_value, my_value);
          } else {
            // my_object "wins" the slot and just overwrites dbo:
            dbo[key] = myo[key];
          }
        }
      };
      var walkObj = function(dbo, myo) {
        // Drive the walk from my_object; it overwrites/appends the db:
        Object.keys(myo).forEach(function(k) {
          process(myo, k, dbo);
        });
      };
      walkObj(db_object, my_object); // entry point!
      return db_object;
    },
    args: [ "$$CURRENT", my_object ],
    lang: "js"
  }}}}

  // Take each doc and put it back using _id as the key:
  ,{$merge: {
    into: "foo",
    on: [ "_id" ],
    whenMatched: "merge",
    whenNotMatched: "fail"
  }}
]);
will produce this result in the DB:
{
  _id: ObjectId("63f0f5e45d0b16d1033771e7"),
  field1: {
    field6: 'post',
    field2: 'abc',
    fieldX: { aa: [ 1, 2, 3 ], bb: 'post', cc: 'pre', zz: 'post' },
    field3: 'def'
  },
  field22: 'qqq',
  field4: 77
}
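Since the read, the recursive merge, and the $merge write-back all run inside a single aggregation command on the server, there is no separate client-side read-then-write step. For completeness, here is a minimal sketch of driving the same pipeline from Node.js with the official mongodb driver (the connection URL and the mergeFnSource placeholder are assumptions, not part of the original answer):

const { MongoClient } = require('mongodb');

async function atomicDeepMerge(my_object) {
  const client = new MongoClient('mongodb://localhost:27017'); // placeholder URL
  try {
    await client.connect();
    const coll = client.db('test').collection('foo');
    // Source of the recursive walk shown above, passed to $function as a string:
    const mergeFnSource = 'function(db_object, my_object) { /* recursive walk from above */ return db_object; }';
    // Iterating the cursor (toArray) is what actually executes a $merge pipeline.
    await coll.aggregate([
      { $replaceRoot: { newRoot: { $function: {
        body: mergeFnSource,
        args: [ "$$CURRENT", my_object ],
        lang: "js"
      } } } },
      { $merge: { into: "foo", on: [ "_id" ], whenMatched: "merge", whenNotMatched: "fail" } }
    ]).toArray();
  } finally {
    await client.close();
  }
}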

How can I deserialize an UnorderedMap with borsh? (NEAR)

How can I deserialize an UnorderedMap with borsh? My current schema is:
const schema = new Map([
  [
    Record,
    {
      kind: "struct",
      fields: [
        ["x2", { kind: 'map', key: 'string', value: 'u128' }],
      ],
    },
  ],
]);
Rust struct:
pub struct Counter {
    x2: UnorderedMap<AccountId, Balance>,
}
But I get this error: "Expected buffer length 2036811841 isn't within bounds".

NodeJS Validation with JOI [duplicate]

This question already has answers here:
How to validate array of objects using Joi?
(6 answers)
Closed last year.
I am using the below code:
const coll = [
  { id: 1, name: 'John' },
  { id: 2, name: 'Jemmy' },
  { id: 3, name: 'Jenny' }
];

const schema = Joi.object().keys({
  name: Joi.string().min(3).required()
});

return schema.validate(coll);
Even when my coll array is valid, checking it against the schema shows the following and always goes to the error section:
schema validator
{
  value: [
    { id: 1, name: 'Action' },
    { id: 2, name: 'Horror' },
    { id: 3, name: 'Comedy' }
  ],
  error: [Error [ValidationError]: "value" must be of type object] {
    _original: [ [Object], [Object], [Object] ],
    details: [ [Object] ]
  }
}
If you want to validate an array containing objects, you could use:
const schema = Joi.array().items(Joi.object().keys({
  name: Joi.string().min(3).required()
}))
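As a quick sketch of using that schema against the original coll (note that because Joi objects reject unknown keys by default, id is declared here as well; alternatively call .unknown(true) on the object schema):

const Joi = require('joi');

const coll = [
  { id: 1, name: 'John' },
  { id: 2, name: 'Jemmy' },
  { id: 3, name: 'Jenny' }
];

const schema = Joi.array().items(Joi.object().keys({
  id: Joi.number(),                      // declared so the extra key is allowed
  name: Joi.string().min(3).required()
}));

console.log(schema.validate(coll).error); // undefined - every item passes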

Add unique value to every element in array

I'm fairly new to MongoDB and I'm trying to merge an embedded array in a MongoDB collection. My schema for my Project collection is as follows:
Projects:
{
  _id: ObjectId(),
  client_id: String,
  description: String,
  samples: [
    {
      location: String, // Unique
      name: String,
    }
    ...
  ]
}
A user can upload a JSON file that is in the form of:
[
  {
    location: String, // Same location as in above schema
    concentration: float
  }
  ...
]
The length of the samples array is the same as the length of the uploaded data array. I'm trying to figure out how to add the data field into every element of my samples array, but I can't find out how to do it from the MongoDB documentation. I can load my JSON data in as "data", and I want to merge based on the common "location" field:
db.projects.update({_id: myId}, {$set : {samples.$[].data : data[location]}});
But I can't work out how to get the index of the JSON array into the update query, and I haven't been able to find any examples of this in the MongoDB documentation, or questions like this one.
Any help would be much appreciated!
MongoDB 3.6 Positional Filtered Updates
So you're actually in the right "ballpark" with the positional all $[] operator, but the problem is that it simply applies to "every" array element. Since what you want is "matched" entries, you actually want the positional filtered $[<identifier>] operator instead.
As you note, your "location" is going to be unique within the array. Using "index positions" is really not reliable for atomic updates, but actually matching the "unique" properties is. Basically you need to get from something like this:
let input = [
  { location: "A", concentration: 3, other: "c" },
  { location: "C", concentration: 4, other: "a" }
];
To this:
{
  "$set": {
    "samples.$[l0].concentration": 3,
    "samples.$[l0].other": "c",
    "samples.$[l1].concentration": 4,
    "samples.$[l1].other": "a"
  },
  "arrayFilters": [
    {
      "l0.location": "A"
    },
    {
      "l1.location": "C"
    }
  ]
}
And that really is just a matter of applying some basic functions to the provided input array:
let arrayFilters = input.map(({ location }, i) => ({ [`l${i}.location`]: location }));

let $set = input.reduce((o, { location, ...e }, i) =>
  ({
    ...o,
    ...Object.entries(e).reduce((oe, [k, v]) => ({ ...oe, [`samples.$[l${i}].${k}`]: v }), {})
  }),
  {}
);

log({ $set, arrayFilters });
The Array.map() simply takes the values of the input and creates a list of identifiers to match the location values within arrayFilters. The construction of the $set statement uses Array.reduce() with two iterations: an outer one merging keys for each array element processed, and an inner one for each key present in that array element, after removing location from consideration since it is not being updated.
Alternatively, loop with for..of:
let arrayFilters = [];
let $set = {};

for ( let [i, { location, ...e }] of Object.entries(input) ) {
  arrayFilters.push({ [`l${i}.location`]: location });
  for ( let [k, v] of Object.entries(e) ) {
    $set[`samples.$[l${i}].${k}`] = v;
  }
}
Note we use Object.entries() here, as well as the "object spread" ... in construction. If you find yourself in a JavaScript environment without this support, then Object.keys() and Object.assign() are basically drop-in replacements with little change, as sketched below.
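For illustration, a rough ES5 equivalent of the same $set/arrayFilters construction using only Object.keys() (an untested sketch, not code from the original answer):

var input = [
  { location: "A", concentration: 3, other: "c" },
  { location: "C", concentration: 4, other: "a" }
];

var arrayFilters = input.map(function(item, i) {
  var f = {};
  f['l' + i + '.location'] = item.location;
  return f;
});

var $set = input.reduce(function(o, item, i) {
  Object.keys(item)
    .filter(function(k) { return k !== 'location'; })   // location is not updated
    .forEach(function(k) {
      o['samples.$[l' + i + '].' + k] = item[k];
    });
  return o;
}, {});

console.log(JSON.stringify({ $set: $set, arrayFilters: arrayFilters }, null, 2));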
Then those can actually be applied within an update as in:
Project.update({ client_id: 'ClientA' }, { $set }, { arrayFilters });
So the positional filtered $[<identifier>] is actually used here to create "matching pairs" of entries within the $set modifier and within the arrayFilters option of the update(). For each "location" we create an identifier that matches that value within the arrayFilters, and then use that same identifier within the actual $set statement so that only the array entry matching the condition for that identifier is updated.
The only real rule with "identifiers" is that they cannot start with a number, and they should be unique. The updates then only touch those entries which actually match the condition.
Earlier MongoDB fixed indexes
Failing support for that, you are basically falling back to "index positions", and that's really not that reliable. More often than not you will actually need to read each document and determine what is in the array already before even updating. But with at least presumed "parity", where index positions are in place, then:
let input = [
  { location: "A", concentration: 3 },
  { location: "B", concentration: 5 },
  { location: "C", concentration: 4 }
];

let $set = input.reduce((o, e, i) =>
  ({ ...o, [`samples.${i}.concentration`]: e.concentration }), {}
);

log({ $set });
Producing an update statement like:
{
  "$set": {
    "samples.0.concentration": 3,
    "samples.1.concentration": 5,
    "samples.2.concentration": 4
  }
}
Or without the parity:
let input = [
  { location: "A", concentration: 3, other: "c" },
  { location: "C", concentration: 4, other: "a" }
];

// Need to get the document to compare without parity
let doc = await Project.findOne({ "client_id": "ClientA" });

let $set = input.reduce((o, e, i) =>
  ({
    ...o,
    ...Object.entries(e).filter(([k, v]) => k !== "location")
      .reduce((oe, [k, v]) =>
        ({
          ...oe,
          [`samples.${doc.samples.map(c => c.location).indexOf(e.location)}`
            + `.${k}`]: v
        }),
        {}
      )
  }),
  {}
);

log({ $set });
await Project.update({ client_id: 'ClientA' },{ $set });
Producing the statement matching on the indexes (after you actually read the document):
{
  "$set": {
    "samples.0.concentration": 3,
    "samples.0.other": "c",
    "samples.2.concentration": 4,
    "samples.2.other": "a"
  }
}
Note of course that for each "update set" you really have no option other than to read the document first to determine which indexes to update. This is generally not a good idea: aside from the overhead of needing to read each document before a write, there is no absolute guarantee that the array itself remains unchanged by other processes between the read and the write, so using a "hard index" presumes everything is still the same, which may not actually be the case.
Earlier MongoDB positional matches
Where data permits, it's generally better to cycle standard positional matched $ updates instead. Here location is indeed unique, so it's a good candidate, and most importantly you do not need to read the existing documents to compare arrays for indexes:
let input = [
  { location: "A", concentration: 3, other: "c" },
  { location: "C", concentration: 4, other: "a" }
];
let batch = input.map(({ location, ...e }) =>
  ({
    updateOne: {
      filter: { client_id: "ClientA", 'samples.location': location },
      update: {
        $set: Object.entries(e)
          .reduce((oe, [k, v]) => ({ ...oe, [`samples.$.${k}`]: v }), {})
      }
    }
  })
);
log({ batch });
await Project.bulkWrite(batch);
A bulkWrite() sends multiple update operations, but it does so with a single request and response, just like any other update operation. Indeed, if you are processing a "list of changes", then retrieving the document for comparison and constructing one big bulkWrite() is the direction to go in, instead of individual writes; that actually applies to all the previous examples as well.
The big difference is "one update instruction per array element" in the change set. This is the safe way to do things in releases without "positional filtered" support, even if it means more write operations.
Demonstration
A full listing in demonstration follows. Note I'm using "mongoose" here for simplicity, but there is nothing really "mongoose specific" about the actual updates themselves. The same applies to any implementation, in particular the JavaScript examples of using Array.map() and Array.reduce() to process the list for construction.
const { Schema } = mongoose = require('mongoose');
const uri = 'mongodb://localhost/test';

mongoose.Promise = global.Promise;
mongoose.set('debug', true);

const sampleSchema = new Schema({
  location: String,
  name: String,
  concentration: Number,
  other: String
});

const projectSchema = new Schema({
  client_id: String,
  description: String,
  samples: [sampleSchema]
});

const Project = mongoose.model('Project', projectSchema);

const log = data => console.log(JSON.stringify(data, undefined, 2));

(async function() {

  try {

    const conn = await mongoose.connect(uri);

    await Promise.all(Object.entries(conn.models).map(([k, m]) => m.remove()));

    await Project.create({
      client_id: "ClientA",
      description: "A Client",
      samples: [
        { location: "A", name: "Location A" },
        { location: "B", name: "Location B" },
        { location: "C", name: "Location C" }
      ]
    });

    let input = [
      { location: "A", concentration: 3, other: "c" },
      { location: "C", concentration: 4, other: "a" }
    ];

    let arrayFilters = input.map(({ location }, i) => ({ [`l${i}.location`]: location }));

    let $set = input.reduce((o, { location, ...e }, i) =>
      ({
        ...o,
        ...Object.entries(e).reduce((oe, [k, v]) => ({ ...oe, [`samples.$[l${i}].${k}`]: v }), {})
      }),
      {}
    );

    log({ $set, arrayFilters });

    await Project.update(
      { client_id: 'ClientA' },
      { $set },
      { arrayFilters }
    );

    let project = await Project.findOne();
    log(project);

    mongoose.disconnect();

  } catch(e) {
    console.error(e);
  } finally {
    process.exit();
  }

})()
And the output, for those who cannot be bothered to run it, shows the matching array elements updated:
Mongoose: projects.remove({}, {})
Mongoose: projects.insertOne({ _id: ObjectId("5b1778605c59470ecaf10fac"), client_id: 'ClientA', description: 'A Client', samples: [ { _id: ObjectId("5b1778605c59470ecaf10faf"), location: 'A', name: 'Location A' }, { _id: ObjectId("5b1778605c59470ecaf10fae"), location: 'B', name: 'Location B' }, { _id: ObjectId("5b1778605c59470ecaf10fad"), location: 'C', name: 'Location C' } ], __v: 0 })
{
  "$set": {
    "samples.$[l0].concentration": 3,
    "samples.$[l0].other": "c",
    "samples.$[l1].concentration": 4,
    "samples.$[l1].other": "a"
  },
  "arrayFilters": [
    {
      "l0.location": "A"
    },
    {
      "l1.location": "C"
    }
  ]
}
Mongoose: projects.update({ client_id: 'ClientA' }, { '$set': { 'samples.$[l0].concentration': 3, 'samples.$[l0].other': 'c', 'samples.$[l1].concentration': 4, 'samples.$[l1].other': 'a' } }, { arrayFilters: [ { 'l0.location': 'A' }, { 'l1.location': 'C' } ] })
Mongoose: projects.findOne({}, { fields: {} })
{
  "_id": "5b1778605c59470ecaf10fac",
  "client_id": "ClientA",
  "description": "A Client",
  "samples": [
    {
      "_id": "5b1778605c59470ecaf10faf",
      "location": "A",
      "name": "Location A",
      "concentration": 3,
      "other": "c"
    },
    {
      "_id": "5b1778605c59470ecaf10fae",
      "location": "B",
      "name": "Location B"
    },
    {
      "_id": "5b1778605c59470ecaf10fad",
      "location": "C",
      "name": "Location C",
      "concentration": 4,
      "other": "a"
    }
  ],
  "__v": 0
}
Or by hard index:
const { Schema } = mongoose = require('mongoose');
const uri = 'mongodb://localhost/test';

mongoose.Promise = global.Promise;
mongoose.set('debug', true);

const sampleSchema = new Schema({
  location: String,
  name: String,
  concentration: Number,
  other: String
});

const projectSchema = new Schema({
  client_id: String,
  description: String,
  samples: [sampleSchema]
});

const Project = mongoose.model('Project', projectSchema);

const log = data => console.log(JSON.stringify(data, undefined, 2));

(async function() {

  try {

    const conn = await mongoose.connect(uri);

    await Promise.all(Object.entries(conn.models).map(([k, m]) => m.remove()));

    await Project.create({
      client_id: "ClientA",
      description: "A Client",
      samples: [
        { location: "A", name: "Location A" },
        { location: "B", name: "Location B" },
        { location: "C", name: "Location C" }
      ]
    });

    let input = [
      { location: "A", concentration: 3, other: "c" },
      { location: "C", concentration: 4, other: "a" }
    ];

    // Need to get the document to compare without parity
    let doc = await Project.findOne({ "client_id": "ClientA" });

    let $set = input.reduce((o, e, i) =>
      ({
        ...o,
        ...Object.entries(e).filter(([k, v]) => k !== "location")
          .reduce((oe, [k, v]) =>
            ({
              ...oe,
              [`samples.${doc.samples.map(c => c.location).indexOf(e.location)}`
                + `.${k}`]: v
            }),
            {}
          )
      }),
      {}
    );

    log({ $set });

    await Project.update(
      { client_id: 'ClientA' },
      { $set }
    );

    let project = await Project.findOne();
    log(project);

    mongoose.disconnect();

  } catch(e) {
    console.error(e);
  } finally {
    process.exit();
  }

})()
And the output:
Mongoose: projects.remove({}, {})
Mongoose: projects.insertOne({ _id: ObjectId("5b1778e0f7be250f2b7c3fc8"), client_id: 'ClientA', description: 'A Client', samples: [ { _id: ObjectId("5b1778e0f7be250f2b7c3fcb"), location: 'A', name: 'Location A' }, { _id: ObjectId("5b1778e0f7be250f2b7c3fca"), location: 'B', name: 'Location B' }, { _id: ObjectId("5b1778e0f7be250f2b7c3fc9"), location: 'C', name: 'Location C' } ], __v: 0 })
Mongoose: projects.findOne({ client_id: 'ClientA' }, { fields: {} })
{
  "$set": {
    "samples.0.concentration": 3,
    "samples.0.other": "c",
    "samples.2.concentration": 4,
    "samples.2.other": "a"
  }
}
Mongoose: projects.update({ client_id: 'ClientA' }, { '$set': { 'samples.0.concentration': 3, 'samples.0.other': 'c', 'samples.2.concentration': 4, 'samples.2.other': 'a' } }, {})
Mongoose: projects.findOne({}, { fields: {} })
{
  "_id": "5b1778e0f7be250f2b7c3fc8",
  "client_id": "ClientA",
  "description": "A Client",
  "samples": [
    {
      "_id": "5b1778e0f7be250f2b7c3fcb",
      "location": "A",
      "name": "Location A",
      "concentration": 3,
      "other": "c"
    },
    {
      "_id": "5b1778e0f7be250f2b7c3fca",
      "location": "B",
      "name": "Location B"
    },
    {
      "_id": "5b1778e0f7be250f2b7c3fc9",
      "location": "C",
      "name": "Location C",
      "concentration": 4,
      "other": "a"
    }
  ],
  "__v": 0
}
And of course with standard "positional" $ syntax and updates:
const { Schema } = mongoose = require('mongoose');
const uri = 'mongodb://localhost/test';

mongoose.Promise = global.Promise;
mongoose.set('debug', true);

const sampleSchema = new Schema({
  location: String,
  name: String,
  concentration: Number,
  other: String
});

const projectSchema = new Schema({
  client_id: String,
  description: String,
  samples: [sampleSchema]
});

const Project = mongoose.model('Project', projectSchema);

const log = data => console.log(JSON.stringify(data, undefined, 2));

(async function() {

  try {

    const conn = await mongoose.connect(uri);

    await Promise.all(Object.entries(conn.models).map(([k, m]) => m.remove()));

    await Project.create({
      client_id: "ClientA",
      description: "A Client",
      samples: [
        { location: "A", name: "Location A" },
        { location: "B", name: "Location B" },
        { location: "C", name: "Location C" }
      ]
    });

    let input = [
      { location: "A", concentration: 3, other: "c" },
      { location: "C", concentration: 4, other: "a" }
    ];

    let batch = input.map(({ location, ...e }) =>
      ({
        updateOne: {
          filter: { client_id: "ClientA", 'samples.location': location },
          update: {
            $set: Object.entries(e)
              .reduce((oe, [k, v]) => ({ ...oe, [`samples.$.${k}`]: v }), {})
          }
        }
      })
    );

    log({ batch });

    await Project.bulkWrite(batch);

    let project = await Project.findOne();
    log(project);

    mongoose.disconnect();

  } catch(e) {
    console.error(e);
  } finally {
    process.exit();
  }

})()
And output:
Mongoose: projects.remove({}, {})
Mongoose: projects.insertOne({ _id: ObjectId("5b179142662616160853ba4a"), client_id: 'ClientA', description: 'A Client', samples: [ { _id: ObjectId("5b179142662616160853ba4d"), location: 'A', name: 'Location A' }, { _id: ObjectId("5b179142662616160853ba4c"), location: 'B', name: 'Location B' }, { _id: ObjectId("5b179142662616160853ba4b"), location: 'C', name: 'Location C' } ], __v: 0 })
{
  "batch": [
    {
      "updateOne": {
        "filter": {
          "client_id": "ClientA",
          "samples.location": "A"
        },
        "update": {
          "$set": {
            "samples.$.concentration": 3,
            "samples.$.other": "c"
          }
        }
      }
    },
    {
      "updateOne": {
        "filter": {
          "client_id": "ClientA",
          "samples.location": "C"
        },
        "update": {
          "$set": {
            "samples.$.concentration": 4,
            "samples.$.other": "a"
          }
        }
      }
    }
  ]
}
Mongoose: projects.bulkWrite([ { updateOne: { filter: { client_id: 'ClientA', 'samples.location': 'A' }, update: { '$set': { 'samples.$.concentration': 3, 'samples.$.other': 'c' } } } }, { updateOne: { filter: { client_id: 'ClientA', 'samples.location': 'C' }, update: { '$set': { 'samples.$.concentration': 4, 'samples.$.other': 'a' } } } } ], {})
Mongoose: projects.findOne({}, { fields: {} })
{
  "_id": "5b179142662616160853ba4a",
  "client_id": "ClientA",
  "description": "A Client",
  "samples": [
    {
      "_id": "5b179142662616160853ba4d",
      "location": "A",
      "name": "Location A",
      "concentration": 3,
      "other": "c"
    },
    {
      "_id": "5b179142662616160853ba4c",
      "location": "B",
      "name": "Location B"
    },
    {
      "_id": "5b179142662616160853ba4b",
      "location": "C",
      "name": "Location C",
      "concentration": 4,
      "other": "a"
    }
  ],
  "__v": 0
}

Descriptive Hapi/Joi validation error

I've been trying to implement Joi in our node application (joi as standalone, not with hapi) and it seems to validate the schema properly, but the error is always the same:
[ValidationError: value must be an object]
  name: 'ValidationError',
  details:
   [ { message: 'value must be an object',
       path: 'value',
       type: 'object.base',
       context: [Object] } ],
  _object: .....
I never get the specifics of which key it failed on, or a description of why it failed.
This is a sample schema I'm using:
exports.workersSchema = {
  workers: joi.array({
    id: joi.string().alphanum(),
    wID: joi.object({
      idValue: joi.string().alphanum()
    }),
    person: {
      governmentIDs: joi.array({ itemID: joi.string().alphanum() }),
      legalName: joi.object({
        givenName: joi.string(),
        middleName: joi.string(),
        preferredSalutations: joi.array(
          {
            salutationCode: {
              longName: joi.string()
            }
          }
        ),
        preferredName: joi.object().keys({
          FormattedName: joi.string()
        }),
      }),
      birthDate: joi.string().alphanum()
    }
  })
}
And this is the JSON object I'm sending:
{"workers" : [
{
"id" : "",
"wID" : {
"idValue" : ""
},
"person" : {
"governmentIDs":[{
"itemID": "asd"
}],
"legalName":{
"givenName" : "PA",
"middleName" : "",
"preferredSalutations" : [{
"salutationCode" : {
"longName" : ""
}
}],
"preferredName" : {
"FormattedName" : ""
},
"birthDate" : ""
}]
}
What am I doing wrong here? I even tried to follow a blog post, and while its examples showed detailed info, I never got anything besides
"value must be an object"
It validates correctly, but when it sees a misfit value it just gives that error and nothing else.
Also, if you look at the 'wID' section, it has an 'idValue' object, but when I get rid of the idValue and just put an alphanum right on the wID key, it also passes validation.
P.S. When validating keys that are objects, do I have to validate with:
key: Joi.object({
  a: Joi.string()
})
or can I just do:
key: {
  a: Joi.string()
}
Thank you so much for the help!
I think there are a couple of issues. First of all, make sure that the object you're validating against is indeed an object with a workers key. The validation seems to be suggesting that you're not providing an object for this base value (an array perhaps?).
Also, in a few instances I think you're using the API incorrectly (e.g. joi.array(...) is not valid). I've modified your schema to work how I think you intended. If not, post a sample object and I'll amend. (As for your P.S.: both forms work; Joi treats a plain object literal as shorthand for Joi.object().keys(), which is why the modified schema below mixes the two.)
var schema = {
  workers: Joi.array().required().includes({
    id: Joi.string().alphanum(),
    wID: {
      idValue: Joi.string().alphanum()
    },
    person: {
      governmentIDs: Joi.array().includes(Joi.string().alphanum()),
      legalName: {
        givenName: Joi.string(),
        middleName: Joi.string(),
        preferredSalutations: Joi.array().includes(Joi.string()),
        preferredName: {
          formattedName: Joi.string()
        },
      },
      birthDate: Joi.string().alphanum()
    }
  })
};
Here's a valid object for that schema:
var goodExample = {
  workers: [
    {
      id: 'bhdsf78473',
      wID: {
        idValue: 'idvalue1'
      },
      person: {
        governmentIDs: ['id1', 'id2'],
        legalName: {
          givenName: 'Johnny',
          middleName: 'Michael',
          preferredSalutations: ['sir', 'Dr'],
          preferredName: {
            formattedName: 'Sir Johnny Michael Smith'
          }
        },
        birthDate: '2411986'
      }
    }
  ]
};
Here's an invalid one:
var badExample = {
  workers: [
    {
      id: 'bhdsf7^£$%^£$%8473', // Here's the issue
      wID: {
      },
      person: {
        governmentIDs: ['id1', 'id2'],
        legalName: {
          givenName: 'Johnny',
          middleName: 'Michael',
          preferredSalutations: ['sir', 'Dr'],
          preferredName: {
            formattedName: 'Sir Johnny Michael Smith'
          }
        },
        birthDate: '2411986'
      }
    },
  ],
};
Joi should give nice detailed output for Joi.assert(badExample, schema):
$ node index.js
/.../node_modules/Joi/lib/index.js:121
throw new Error(message + error.annotate());
^
Error: {
  "workers": [
    {
      "wID": {},
      "person": {
        "governmentIDs": [
          "id1",
          "id2"
        ],
        "legalName": {
          "givenName": "Johnny",
          "middleName": "Michael",
          "preferredSalutations": [
            "sir",
            "Dr"
          ],
          "preferredName": {
            "formattedName": "Sir Johnny Michael Smith"
          }
        },
        "birthDate": "2411986"
      },
      "id" [1]: "bhdsf7^£$%^£$%8473"
    }
  ]
}
[1] workers at position 0 fails because id must only contain alpha-numeric characters
at root.assert (/.../node_modules/Joi/lib/index.js:121:19)
at Object.<anonymous> (/.../index.js:57:5)
at Module._compile (module.js:456:26)
at Object.Module._extensions..js (module.js:474:10)
at Module.load (module.js:356:32)
at Function.Module._load (module.js:312:12)
at Function.Module.runMain (module.js:497:10)
at startup (node.js:119:16)
at node.js:929:3
NOTE: This answer is using Joi 5.1.2 (API: https://github.com/hapijs/joi/blob/v5.1.0/README.md). Joi.array().includes() will be dropped in the next release in favour of Joi.array().items()
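For reference, a rough equivalent of the schema on a current Joi release, using Joi.array().items() and the plain-object shorthand (an untested sketch, not part of the original answer):

const Joi = require('joi');

const schema = Joi.object({
  workers: Joi.array().required().items(Joi.object({
    id: Joi.string().alphanum(),
    wID: { idValue: Joi.string().alphanum() },
    person: {
      governmentIDs: Joi.array().items(Joi.string().alphanum()),
      legalName: {
        givenName: Joi.string(),
        middleName: Joi.string(),
        preferredSalutations: Joi.array().items(Joi.string()),
        preferredName: { formattedName: Joi.string() }
      },
      birthDate: Joi.string().alphanum()
    }
  }))
});

// Joi.assert(goodExample, schema); // passes for the valid example above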
The object that you posted is not a valid JavaScript object because it's missing some closing } brackets. Here's the valid version:
var obj = {
  "workers" : [{
    "id" : "", // <-------- Shouldn't be empty
    "wID" : {
      "idValue" : ""
    },
    "person" : {
      "governmentIDs": [{
        "itemID": "asd"
      }],
      "legalName": {
        "givenName" : "PA",
        "middleName" : "",
        "preferredSalutations" : [{
          "salutationCode" : {
            "longName" : ""
          }
        }],
        "preferredName" : {
          "FormattedName" : ""
        },
      },
      "birthDate" : ""
    }
  }]
};
If I validate that with my provided schema, I get the following message (using Joi 5.1.0):
[1] workers at position 0 fails because id is not allowed to be empty
