mongodb case insensitive index for 3.4 version - node.js

I have created a collection, say users, with example documents like the ones below:
[
    { "name" : "John", "code" : "B7" },
    { "name" : "Sara", "code" : "F7" }
]
I have created one index on the field "name":
db.users.createIndex(
    { name: 1 },
    {
        collation: { locale: "en", strength: 1 },
        unique: true
    }
)
I want to prevent duplicate, case-insensitive data for the name field; for example, it should not allow an entry with the name "jOhn" or "jOHn".
This is working, but when I insert data MongoDB gives error messages.
The inserted data is given below:
db.users.insert([{ "name" : "JoHn", "code" : "B9" }])
The error message is given below:
{
"message" : "WriteError({'code':11000,'index':0,'errmsg':'E11000 duplicate key error collection: digital_data_delivery.users index: name_1 dup key: { : \\';E7C\\' }','op':{'name':'JoHn','code':'B9','_id':'5b4836a458abe34b442a9811'}})",
"stack" : "script:1:10"
}
Some junk characters are showing up here ("E7C"); I want to know what this is.
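For what it's worth, the value shown in the dup key error is most likely the ICU collation sort key that the index stores internally (because the index was built with a collation), not the original string, which is why it looks like junk. Below is a minimal Node.js sketch of creating the same case-insensitive unique index with the official driver; the connection URI and database/collection names are assumptions for illustration only.

// Minimal sketch (assumed URI and names): create the case-insensitive unique index from Node.js.
const { MongoClient } = require('mongodb');

async function createCaseInsensitiveIndex() {
    const client = await MongoClient.connect('mongodb://localhost:27017'); // assumed URI
    try {
        const users = client.db('digital_data_delivery').collection('users');
        // strength: 1 compares base characters only, so "John" and "jOhn" collide
        // and the unique constraint rejects the duplicate insert.
        await users.createIndex(
            { name: 1 },
            { collation: { locale: 'en', strength: 1 }, unique: true }
        );
    } finally {
        await client.close();
    }
}

createCaseInsensitiveIndex().catch(console.error);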

Related

MongoDB + NodeJS: Document failed validation & Data Types behaviour

I am new to MongoDB and NodeJS.
When I create the JSON schema with the data types string, integer, date and bool, the schema is created, but inserting data always throws a document validation error. When I changed the bsonType of one field to number, the collection started accepting records, but the value is stored as a Double. I read somewhere on Stack Overflow that this is how MongoDB stores numbers, but my question is: why is this the behaviour? Why is the error not thrown at the time the JSON schema is created, instead of at the time of data insertion?
Also, if we have nested objects, let us say a Customer object with Address as a nested object, the main object's int/number values are stored as Double, whereas the pincode inside the Address object is stored as Int32. This is also very confusing: what is the difference between these objects, when the structure of the schema is the same?
What are the other ways to implement a properly validated schema for MongoDB?
> db.getCollectionInfos({name: "companysInt1s1"})
[
    {
        "name" : "companysInt1s1",
        "type" : "collection",
        "options" : {
            "validator" : {
                "$jsonSchema" : {
                    "bsonType" : "object",
                    "required" : [ "tin" ],
                    "properties" : {
                        "tin" : {
                            "bsonType" : "int",
                            "minLength" : 2,
                            "maxLength" : 11,
                            "description" : "must be a string and is not required, should be 11 characters length"
                        }
                    }
                }
            }
        },
        "info" : {
            "readOnly" : false,
            "uuid" : UUID("27cba650-7bd3-4930-8d3e-7e6cbbf517db")
        },
        "idIndex" : {
            "v" : 2,
            "key" : { "_id" : 1 },
            "name" : "_id_",
            "ns" : "invoice.companysInt1s1"
        }
    }
]
> db.companysInt1s1.insertOne({tin:22222})
2019-02-14T15:04:28.712+0530 E QUERY [js] WriteError: Document failed validation :
WriteError({
    "index" : 0,
    "code" : 121,
    "errmsg" : "Document failed validation",
    "op" : {
        "_id" : ObjectId("5c653624e382c2ec16c16893"),
        "tin" : 22222
    }
})
WriteError#src/mongo/shell/bulk_api.js:461:48
Bulk/mergeBatchResults#src/mongo/shell/bulk_api.js:841:49
Bulk/executeBatch#src/mongo/shell/bulk_api.js:906:13
Bulk/this.execute#src/mongo/shell/bulk_api.js:1150:21
DBCollection.prototype.insertOne#src/mongo/shell/crud_api.js:252:9
#(shell):1:1
Am I missing something, or is there other documentation I should be following? I appreciate your guidance.
You need to insert the value as NumberInt.
When you run this:
db.companysInt1s1.insertOne({tin: 22222})
you are actually inserting tin as a double, which is the shell's default numeric type.
So the correct way to do it is:
db.companysInt1s1.insertOne({ tin: NumberInt(22222) })
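The same pitfall exists in Node.js, where plain numbers are sent as doubles. A minimal sketch, assuming the official mongodb driver (which re-exports the BSON Int32 type) and placeholder connection details:

// Minimal sketch (assumed URI and names): wrap the value in Int32 so it satisfies a bsonType "int" validator.
const { MongoClient, Int32 } = require('mongodb');

async function insertTin() {
    const client = await MongoClient.connect('mongodb://localhost:27017'); // assumed URI
    try {
        const companies = client.db('invoice').collection('companysInt1s1');
        await companies.insertOne({ tin: new Int32(22222) }); // stored as Int32, passes validation
    } finally {
        await client.close();
    }
}

insertTin().catch(console.error);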

update array in mongoose which matches the condition

my schema looks like:
{
    qty: {
        property1: {
            // something
        },
        property2: [{
            size: 40,
            color: "black",
            enabled: "true"
        }]
    }
}
property2 is an array; what I want to do is update, in a single query, those array objects whose enabled is "true".
I tried writing the following query:
db.col.update({
"qty.property2.enabled" = "true"
}, {
"qty.property2.color" = "green"
}, callback)
but it is not working
error:
[main] Error: can't have . in field names [qty.pro.size]
db.col.update({"qty.property2.enabled":"true"},{$set: {'qty.property2.$.color': 'green'}}, {multi: true})
This is the way to update an element inside an array.
The equals sign '=' cannot be used inside an object; use key: value pairs instead.
Updating an array element is done using the positional $ operator.
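Note that the positional $ operator only updates the first matching array element in each document. If you are on MongoDB 3.6 or newer and want every element whose enabled is "true" updated, a sketch using arrayFilters (field names taken from the question) looks like this:

// Sketch, assuming MongoDB 3.6+: update every matching array element, not just the first.
db.col.updateMany(
    { "qty.property2.enabled": "true" },
    { $set: { "qty.property2.$[elem].color": "green" } },
    { arrayFilters: [ { "elem.enabled": "true" } ] }
)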
Alternative solution for multiple conditions:
db.foo.update({
_id:"i1",
replies: { $elemMatch:{
_id: "s2",
update_password: "abc"
}}
},
{
"$set" : {"replies.$.text" : "blah"}
}
);
Why:
I was looking for a similar solution to this question, but in my case I needed the array element to match multiple conditions, and using the currently provided answers resulted in changes to the wrong fields.
If you need to match multiple fields, for example, let's say we have a document like this:
{
    "_id" : ObjectId("i1"),
    "replies": [
        {
            "_id" : ObjectId("s1"),
            "update_password": "abc",
            "text": "some stuff"
        },
        {
            "_id" : ObjectId("s2"),
            "update_password": "abc",
            "text": "some stuff"
        }
    ]
}
Trying to do the update with:
db.foo.update({
_id:"i1",
"replies._id":"s2",
"replies.update_password": "abc"
},
{
"$set" : {"replies.$.text" : "blah"}
}
);
would result in updating a field that matches only one of the conditions; for example, it would update s1 because it matches the update_password condition, which is clearly wrong. I might have done something wrong, but the $elemMatch solution solved any problems like that.
Suppose your document looks like this:
{
    "_id" : ObjectId("4f9808648859c65d"),
    "array" : [
        { "text" : "foo", "value" : 11 },
        { "text" : "foo", "value" : 22 },
        { "text" : "foobar", "value" : 33 }
    ]
}
then your query will be:
db.foo.update({"array.value" : 22}, {"$set" : {"array.$.text" : "blah"}})
where the first set of curly brackets is the query criteria and the second one sets the new value.

All fields search [duplicate]

This question already has answers here:
MongoDB Query Help - query on values of any key in a sub-object
This is my data set, which is part of a larger JSON document. I want to write a query that will match all fields inside value_chain.
Dataset:
"value_chain" : {
"category" : "Source, Make & Deliver",
"hpe_level0" : "gift Chain Planning",
"hpe_level1" : "nodemand to Plan",
"hpe_level2" : "nodemand Planning",
"hpe_level3" : "nodemand Sensing"
},
Example:
If someone searches for "gift", the query should scan through all fields, and if there is a match, return the document.
This is something I tried, but it didn't work:
db.sw_api.find({
value_chain: { $elemMatch: { "Source, Make & Deliver" } }
})
Sounds like you need to create a $text index on all the text fields first, since $text performs a text search on the content of the fields indexed with a text index:
db.sw_api.createIndex({
"value_chain.category" : "text",
"value_chain.hpe_level0" : "text",
"value_chain.hpe_level1" : "text",
"value_chain.hpe_level2" : "text",
"value_chain.hpe_level3" : "text"
}, { "name": "value_chain_text_idx"});
The index you create is a compound index consisting of 5 fields, and MongoDB will automatically generate the index name from those field names if you don't override it. With the above in mind, if you don't specify the index name, as in:
db.sw_api.createIndex({
"value_chain.category" : "text",
"value_chain.hpe_level0" : "text",
"value_chain.hpe_level1" : "text",
"value_chain.hpe_level2" : "text",
"value_chain.hpe_level3" : "text"
});
there is a potential error "ns name is too long (127 byte max)" since the text index will look like this:
"you_db_name.sw_api.$value_chain.category_text_value_chain.hpe_level0_text_value_chain.hpe_level1_text_value_chain.hpe_level2_text_value_chain.hpe_level3_text"
Hence the need to give the index a name that is not too long, rather than relying on the one autogenerated by MongoDB.
Once the index is created, a db.sw_api.getIndexes() query will show you the indexes present:
/* 1 */
[
    {
        "v" : 1,
        "key" : {
            "_id" : 1
        },
        "name" : "_id_",
        "ns" : "dbname.sw_api"
    },
    {
        "v" : 1,
        "key" : {
            "_fts" : "text",
            "_ftsx" : 1
        },
        "name" : "value_chain_text_idx",
        "ns" : "dbname.sw_api",
        "weights" : {
            "value_chain.category" : 1,
            "value_chain.hpe_level0" : 1,
            "value_chain.hpe_level1" : 1,
            "value_chain.hpe_level2" : 1,
            "value_chain.hpe_level3" : 1
        },
        "default_language" : "english",
        "language_override" : "language",
        "textIndexVersion" : 3
    }
]
Once you create the index, you can then do a $text search:
db.sw_api.find({ "$text": { "$search": "gift" } })
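As a side note, if listing every field is too brittle, MongoDB also supports a wildcard text index that covers every string field in the collection. It is broader than value_chain alone, so treat this as a sketch rather than a drop-in replacement:

// Sketch: a wildcard text index indexes all string fields in every document,
// so a $text search will also match strings outside value_chain.
db.sw_api.createIndex({ "$**": "text" }, { "name": "all_text_idx" });
db.sw_api.find({ "$text": { "$search": "gift" } });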

Mongodb aggregate project string as number

I have a mongo script which retrieves a value from an array and creates a new document. However, the value it retrieves is a string. I need the value to be added to the new document as a number instead of a string, because it is read by a graphing engine which ignores the value if it is a string.
From the script below, it is "value": {$arrayElemAt: ["$accountBalances", 1]} which needs to be a number instead of a string. Thanks.
db.std_sourceBusinessData.aggregate([
    { $match: { objectType: "Account Balances" } },
    { $project: { _id: 1, entity_ID: 1, objectOrigin: 1, accountBalances: 1 } },
    { $unwind: "$accountBalances" },
    { $match: { "accountBalances": "Sales" } },
    { $project: {
        _id: 1,
        "value": { $arrayElemAt: ["$accountBalances", 1] },
        "key": { $literal: "sales" },
        "company": "$entity_ID",
        "objectOrigin": "$objectOrigin"
    }},
    { $out: "entity_datapoints" }
])
This is what I currently get:
{
    "_id" : ObjectId("5670961f910e1f54662c1d9d"),
    "objectOrigin" : "Xero",
    "Value" : "500.00",
    "key" : "grossprofit",
    "company" : "e56e09ef-5c7c-423e-b699-21469bd2ea00"
}
what I want is:
{
    "_id" : ObjectId("5670961f910e1f54662c1d9d"),
    "objectOrigin" : "Xero",
    "Value" : 500.0000000000000,
    "key" : "grossprofit",
    "company" : "e56e09ef-5c7c-423e-b699-21469bd2ea00"
}
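On MongoDB 4.0 or newer the conversion can be done inside the pipeline itself with $toDouble (or the more general $convert). A minimal sketch, assuming the same collection and field layout as above:

// Sketch, assuming MongoDB 4.0+: convert the extracted string to a double in the $project stage.
db.std_sourceBusinessData.aggregate([
    { $match: { objectType: "Account Balances" } },
    { $project: { _id: 1, entity_ID: 1, objectOrigin: 1, accountBalances: 1 } },
    { $unwind: "$accountBalances" },
    { $match: { "accountBalances": "Sales" } },
    { $project: {
        _id: 1,
        "value": { $toDouble: { $arrayElemAt: ["$accountBalances", 1] } }, // "500.00" -> 500.0
        "key": { $literal: "sales" },
        "company": "$entity_ID",
        "objectOrigin": "$objectOrigin"
    }},
    { $out: "entity_datapoints" }
])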

Upsert embedded object in mongoDB

Given this Person collection:
{
    "_id" : ObjectId("4f8e95a718bcv9c74da1e6511a"),
    "name" : "John",
    "hobbies" : [
        {
            "id" : 001,
            "name" : "reading",
            "location" : "home"
        },
        {
            "id" : 002,
            "name" : "sport",
            "location" : "outside"
        }
    ]
}
and these new/edited Hobby objects:
{
    "name" : "walking",
    "location" : "outside"
}
and
{
    "id" : 001,
    "name" : "reading",
    "location" : "outside"
}
If I know the Person that I want to manage, what is the best way to upsert embedded objects?
Currently my approach is to find the Person object, make the required modifications to it in my code, and then save it back to the DB. This works. But I'd like to simplify and reduce the number of round trips to the database.
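One common pattern is to issue two targeted updates instead of rewriting the whole document: a positional $set when the hobby already exists (matched here on its id field), and a $push when it does not. The sketch below assumes a person collection and uses a placeholder personId for the document's _id:

// Sketch: personId is a hypothetical placeholder for the Person document's _id.
var personId = ObjectId("5f0000000000000000000001"); // placeholder value

// Edited hobby (id 1 already exists): update it in place with the positional $ operator.
db.person.updateOne(
    { _id: personId, "hobbies.id": 1 },
    { $set: { "hobbies.$.location": "outside" } }
)

// New hobby (no id yet): append it to the array.
db.person.updateOne(
    { _id: personId },
    { $push: { hobbies: { name: "walking", location: "outside" } } }
)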
