MongoDB upsert an array of objects from a list - node.js

I am working on moving my database from sqlite3 to MongoDB. I went through MongoDB University, yet I'm not sure I have found a really good example of upserting in bulk.
Use case: a user uploads a data file with a list of players and their stats. The app needs to either update each player or add a new player if they do not already exist.
Current implementation: a function takes a list of Players and builds the SQL statement.
let template = '(Player.Rank, Player.Player_ID, Player.Player, Player.Score, TTP, 1),(Player.Rank, Player_ID, ...) ... (... TTP, 1)';
const stmt = `INSERT INTO playerStats (Rank, Player_ID, Player, Score, TPP, Games_Played)
VALUES ${template}
ON CONFLICT(Player_ID) DO UPDATE
SET Score = Score+excluded.Score, TPP=TPP+excluded.TPP, Games_Played=Games_Played+1`;
db.run(stmt, callback);
I'm hoping to have each document be a league, which contains players, games, and managers.
MongoDB document template
{
  "LEAGUE_ID": "GEO_SAM",
  "Players": [
    {
      "id": "PlayerID",
      "name": "Player",
      "score": "Score",
      "rank": "Rank",
      "xPlayed": "Games_Played",
      "ttp": "TTP"
    }
  ],
  "Managers": [
    {...}
  ],
  "Games": [
    {...}
  ]
}
I am totally lost and not sure how to get this done. Do I need to create a loop and ask MongoDB to upsert on each iteration? I have searched through countless examples, but all of them use static, hard-coded values.
Here is my testing example.
const query = { League_id : "GEO_SAM", Players.$.id: $in{ $Players }};
const update = { $inc: { score: $Player.Score}};
const options = { upsert: true };
collection.updateMany(query, update, options);
I also don't understand how to pass the entire player object to push to the array if the player_id isn't found.

My solution was to create a metaData field containing the league ID with a single player. If anyone else has a better solution I would love to hear from you.
{
  MetaData: { "LEAGUE_ID": "GEO_SAM" },
  Player: {
    "id": "PlayerID",
    "name": "Player",
    "score": "Score",
    "rank": "Rank",
    "xPlayed": "Games_Played",
    "ttp": "TTP"
  }
}
Then I mapped over the values and inserted each one.
client.connect().then((client) => {
  const db = client.db(dbName);
  // one upsert per player; collect the promises so they can be awaited
  const results = Players.map((player) => {
    return db.collection('Players').updateOne(
      { Player_Name: player.Player_ID },
      {
        // written only when the upsert inserts a new document
        $setOnInsert: {
          Player_ID: player.Player_ID,
          Player: player.Player,
          Rank: player.Rank,
        },
        $inc: { Score: player.Score, Games_Played: 1, TPP: player.TPP },
      },
      { upsert: true }, // multi is not an updateOne option, so it was dropped
    );
  });
  return Promise.all(results);
});
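As a side note (not part of the original answer), the same per-player upserts can be batched into a single request with bulkWrite. A minimal sketch, assuming the same db and Players objects as above and filtering on Player_ID, which matches the ON CONFLICT(Player_ID) key from the SQL version:
// Sketch only: one bulkWrite instead of one updateOne per player.
const operations = Players.map((player) => ({
  updateOne: {
    filter: { Player_ID: player.Player_ID },
    update: {
      // written only when the upsert inserts a new player
      $setOnInsert: { Player: player.Player, Rank: player.Rank },
      $inc: { Score: player.Score, Games_Played: 1, TPP: player.TPP },
    },
    upsert: true,
  },
}));
// inside an async function, with `db` already connected
const bulkResult = await db.collection('Players').bulkWrite(operations, { ordered: false });
console.log(bulkResult.upsertedCount, bulkResult.modifiedCount);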

Related

Is it possible to ignore fields if they don't exist in the document?

I have a model:
const schema = new Schema({
  // ....
  conditions: {},
  // ....
});
conditions is a nested document, and I can save anything into it with any key. Let's say we have the following conditions:
{
  "conditions": {
    "age": 10,
    "name": "John"
  }
}
This is stored in the database. Now I want to find this document, but since I don't know which fields are in it, I am running into problems...
const conditions = {
  'conditions.age': 10,
  'conditions.name': 'John',
  'conditions.surname': 'White' // surname doesn't exist
}
const result = await Model.find(conditions);
console.log(result) // [];
So the question is: is it possible to exclude from the filter the fields that are missing in the document, so that find() simply skips them and does not take them into account?
Use the logical query operators $and and $or as below:
const conditions = {
  $and: [
    { 'conditions.age': 10 },
    { 'conditions.name': 'John' },
    { $or: [{ 'conditions.surname': { $exists: false } }, { 'conditions.surname': 'White' }] }
  ]
}
const result = await Model.find(conditions);
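If you don't know in advance which keys might be missing, the same pattern can be built programmatically. A minimal sketch (my own illustration, not from the original answer), assuming a plain object of desired values like the one above; it wraps every key in "match the value or the field is absent", which generalises the filter above. The helper name buildConditionsFilter is made up:
// Hypothetical helper: one $or per key, so absent fields are simply skipped.
function buildConditionsFilter(wanted) {
  return {
    $and: Object.entries(wanted).map(([key, value]) => ({
      $or: [
        { [`conditions.${key}`]: { $exists: false } },
        { [`conditions.${key}`]: value },
      ],
    })),
  };
}

const filter = buildConditionsFilter({ age: 10, name: 'John', surname: 'White' });
const result = await Model.find(filter);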

Mongodb update all the documents with unique id

I have a collection named products with almost 100k documents. I want to introduce a new key called secondaryKey with a unique uuid value in all the documents.
I am doing this using Node.js.
The problem I am facing:
When I try the below query,
db.collection('products').updateMany({},{"$set":{secondaryKey: uuid()}});
Here it updates all the documents with the same uuid value.
I tried a loop to update the documents one by one, but the issue is that I don't have a filter value for updateOne, because I want to update all the documents.
Can anyone please help me here.
Thanks :)
If you are using MongoDB version >= 4.4, you can try this:
db.products.updateMany(
  {},
  [
    {
      $set: {
        secondaryKey: {
          $function: {
            body: function() {
              return UUID().toString().split('"')[1];
            },
            args: [],
            lang: "js"
          }
        }
      }
    }
  ]
);
Output
[
  {
    "_id": ObjectId("..."),
    "secondaryKey": "f41b15b7-a0c5-43ed-9d15-69dbafc0ed29"
  },
  {
    "_id": ObjectId("..."),
    "secondaryKey": "50ae7248-a92e-4b10-be7d-126b8083ff64"
  },
  {
    "_id": ObjectId("..."),
    "secondaryKey": "fa778a1a-371b-422a-b73f-8bcff865ad8e"
  }
]
Since it's not the same value you want to put in each document, you have to use a loop.
In your loop, update the current document of the iteration, so filter by its _id in the updateOne.
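A minimal sketch of that idea (my own, not from the original thread), assuming the Node.js driver and the uuid package, and batching the per-_id updates into one bulkWrite so you don't issue ~100k single round trips:
const { v4: uuidv4 } = require('uuid');

// inside an async function, with `db` already connected
// fetch only the _id of each document, then give each one its own uuid
const ids = await db.collection('products')
  .find({}, { projection: { _id: 1 } })
  .toArray();

const ops = ids.map(({ _id }) => ({
  updateOne: {
    filter: { _id },
    update: { $set: { secondaryKey: uuidv4() } },
  },
}));

await db.collection('products').bulkWrite(ops, { ordered: false });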
The above reply didn't work for me. Plus, it compromises security when you enable JavaScript on your database (see the notes on $function and enabling JavaScript on the database). The best way is to not overload your server and do the work locally, as below:
const { customAlphabet } = require('nanoid')

async function asdf() {
  const movies = client.db("localhost").collection("productpost");
  const nanoid = customAlphabet('1234567890', 10) // 10-digit numeric ids
  const result2 = []
  const result = await movies.find({}).toArray()
  result.forEach(element => {
    console.log(element.price)
    element.price = 4
    element.id = nanoid()
    result2.push(element)
  });
  console.log("result2", result2)
  await movies.deleteMany({})
  await movies.insertMany(result2)
}
It will delete all the objects in your collection and insert them again as new ones, using nanoid for the unique ids.
This is the database object array after adding the unique id:
{ "_id": { "$oid": "334a98519a20b05c20574dd1" }, "attach": "[\"http://localhost:8000/be/images/2022/4/bitfinicon.png\"]", "title": "jkn jnjn", "description": "jnjn", "price": 4, "color": "After viewing I am 48.73025772956596% more satisfied with life.", "trademark": "", "category": "[]", "productstate": "Published", "createdat": { "$date": "2022-04-03T17:40:54.743Z" }, "language": "en"}
P.S.: Please back up your collection before doing this, or filter the array to suit your needs so you don't rewrite the whole collection.

Update nested array objects in MongoDB

I have to deal with objects of the following type in a NodeJS app (using mongodb driver):
data_test = {
  "id": "105-20090412",
  "date": new Date('2020-09-04T14:00:00.000Z'),
  "station": {
    "name": "AQ105",
    "loc": {
      "type": "Point",
      "coordinates": [14.324498, 40.821930]
    },
    "properties": {}
  },
  "samples": [{
    "t": new Date('2020-09-04T14:14:00.000Z'),
    "data": {
      //"temp_celsius": 31.81,
      //"humRelPercent": 39,
      "press_mBar": 1021.12,
      "PM10": 200
    }
  }]
}
I receive data like the above every 2 minutes.
I want to:
If the received data has an id not yet present in MongoDB, do an insert
If the received data has a sample object with a Date (t property) already present, then add its properties to that sample (for example, readings from different sensors)
If the received data has a sample object with a Date (t property) not yet present in the samples array, then add the new one
I would like to do what is described above with as few round-trips to the MongoDB server as possible.
I hope to have been clear enough.
Any suggestion?
Thanks in advance.
Here's my suggestion, this is not the correct answer. You will need to fiddle with the query portion. The query below should work for 1 & 3, for 2 you will have to play around.
db.collection.updateOne(
  { "id": "105-20090412", "samples.t": <Date> },
  {
    $push: { "samples": <sample> },
    $setOnInsert: { station: <station> }
  },
  { upsert: true }
);
References:
https://docs.mongodb.com/manual/reference/method/db.collection.updateOne/
https://docs.mongodb.com/manual/reference/operator/update/setOnInsert/#up._S_setOnInsert
https://docs.mongodb.com/manual/reference/operator/update/push/
I finally came to the following solution, perhaps not the most efficient one:
try {
  const db = client.db(dbName);
  const collection = db.collection(collectionName);
  // retrieve id, station, date and the samples to add as separate objects
  let {
    id,
    ...dataToInsert
  } = data
  //id = new ObjectID(id)
  const queryBy_id = {
    _id: id
  }
  // first check if the doc exists
  let res_query = await collection.findOne(queryBy_id)
  // if the doc does not exist then insert a new one
  if (!res_query) {
    let res_insert = await collection.insertOne({
      _id: id,
      ...dataToInsert
    })
    return res_insert;
  } else {
    // retrieve samples from the initial query
    let current_samples = res_query.samples
    // check if the sample in dataToInsert already exists
    // use getTime to correctly compare dates
    let idx = current_samples.findIndex(x => x.t.getTime() == dataToInsert.samples[0].t.getTime())
    if (idx >= 0) {
      // a sample with this date already exists: update it
      let current_t = current_samples[idx].t
      // merge the data already stored with the new one
      current_samples.data = {
        ...current_samples[idx].data,
        ...dataToInsert.samples[0].data
      }
      let resUpdateSample = await collection.updateOne({
        _id: id,
        'samples.t': current_t
      }, {
        $set: {
          'samples.$.data': current_samples.data
        }
      })
      return resUpdateSample
    } else {
      // add the new sample to the samples array
      let resAddToSamples = await collection.updateOne({
        _id: id
      }, {
        $push: {
          samples: dataToInsert.samples[0]
        }
      })
      return resAddToSamples
    }
  }
} catch (err) {
  logger.error(err);
}
How can I improve it?
Thanks.
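One way to cut this down to a single round trip is an upsert with an aggregation-pipeline update. The following is only a sketch under assumptions not stated in the thread: MongoDB 4.2+ (which supports pipeline updates), data shaped like data_test above, and documents keyed by _id = data.id as in the solution code. Values such as data.station are embedded as literals; wrap them in $literal if they might contain strings beginning with "$".
// Sketch only: one upsert handles insert, merge-into-existing-sample, and append-new-sample.
const sample = data.samples[0];

await collection.updateOne(
  { _id: data.id },
  [
    {
      $set: {
        // keep existing values; only filled in when the upsert inserts the document
        date: { $ifNull: ["$date", data.date] },
        station: { $ifNull: ["$station", data.station] },
        samples: {
          $let: {
            vars: { existing: { $ifNull: ["$samples", []] } },
            in: {
              $cond: [
                // is there already a sample with this timestamp?
                { $in: [sample.t, { $map: { input: "$$existing", as: "s", in: "$$s.t" } }] },
                // yes: merge the new readings into that sample's data
                {
                  $map: {
                    input: "$$existing",
                    as: "s",
                    in: {
                      $cond: [
                        { $eq: ["$$s.t", sample.t] },
                        { $mergeObjects: ["$$s", { data: { $mergeObjects: ["$$s.data", sample.data] } }] },
                        "$$s"
                      ]
                    }
                  }
                },
                // no: append the new sample
                { $concatArrays: ["$$existing", [sample]] }
              ]
            }
          }
        }
      }
    }
  ],
  { upsert: true } // covers the "id not present yet" case
);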

transform raw query to mongodb query the efficient way

In a Node.js app with MongoDB storage, I have the following query from the user:
const rawQuery = [
'{"field":"ingredient","type":"AND","value":"green and blue"}',
'{"field":"ingredient","type":"AND","value":"black"}',
'{"field":"ingredient","type":"OR","value":"pink"}',
'{"field":"ingredient","type":"OR","value":"orange"}',
'{"field":"place","type":"AND","value":"london"}',
'{"field":"school","type":"NOT","value":"fifth"}',
'{"field":"food","type":"OR","value":"burger"}',
'{"field":"food","type":"OR","value":"pizza"}',
'{"field":"ownerFirstName","type":"AND","value":"Jimmy"}'
];
I have a collection called restaurant, and a collection called owners.
Would the following query handle such a search scenario?
const query = {
  $and: [
    { ingredient: 'green and blue' },
    { ingredient: 'black' },
    { $or: [
        { ingredient: 'pink' },
        { ingredient: 'orange' },
      ]
    },
    { place: 'london' },
    { school: { $ne: 'fifth' } },
    { $or: [
        { food: 'burger' },
        { food: 'pizza' },
      ]
    }
  ]
};
How can I transform the rawQuery into this mongo query? (Given that it has to be dynamic, because I have many fields, and in this example I just included a couple of them.)
This example query aims to get the restaurants that match the description/place/school/food queries in the restaurant and also to match the owner's first name from another collection. Each restaurant document will have a ownerUuid field that points to the owner in the other collection.
What is the best way to run such a search in MongoDB in a production environment?
How can this be achieved with Elasticsearch?
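For the transformation itself, here is a minimal sketch of one possible approach (my own illustration, not an answer recorded in the thread). It assumes each raw entry is a JSON string like the ones above, that NOT maps to $ne, that OR entries are grouped per field into one $or, and that cross-collection fields such as ownerFirstName are skipped because they would need a $lookup or a separate query against owners. The helper name buildMongoQuery is made up:
// Hypothetical helper: turns the raw entries into the filter shape shown above.
function buildMongoQuery(rawQuery, skipFields = ['ownerFirstName']) {
  const andClauses = [];
  const orGroups = {}; // field name -> array of { field: value } clauses

  for (const raw of rawQuery) {
    const { field, type, value } = JSON.parse(raw);
    if (skipFields.includes(field)) continue;

    if (type === 'AND') {
      andClauses.push({ [field]: value });
    } else if (type === 'NOT') {
      andClauses.push({ [field]: { $ne: value } });
    } else if (type === 'OR') {
      (orGroups[field] = orGroups[field] || []).push({ [field]: value });
    }
  }

  for (const clauses of Object.values(orGroups)) {
    andClauses.push({ $or: clauses });
  }

  return { $and: andClauses };
}

const query = buildMongoQuery(rawQuery);
The clause order differs from the hand-written query above, but that does not change the $and semantics.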

Mongoose full text search not filtering correctly

So basically I have a model with a bunch of string fields, like so:
const Schema: Schema = new Schema(
  {
    title: {
      type: String,
      trim: true
    },
    description: {
      type: String,
      trim: true
    },
    ...
  }
);
Schema.index({ '$**': 'text' });
export default mongoose.model('Watch', Schema);
where I index all of them.
Now, since this schema is used as a ref in another model, I search like this, where user is an instance of the other model:
const { search, limit = 5 } = req.query;
const query = search && { match: { $text: { $search: new RegExp(search, 'i') } } };
const { schemaRes } = await user
  .populate({
    path: 'schema',
    ...query,
    options: {
      limit
    }
  })
  .execPopulate();
The searching itself seems to work OK; the problem is that when the search term gets more specific, the results don't seem to narrow down accordingly.
Example
db
{ title: 'Rolex', name: 'Submariner', description: 'Nice' }
{ title: 'Rolex', name: 'Air-King', description: 'Nice' }
When the search param is Rolex, I get both items, which is OK, but when the search param becomes Rolex Air-King I keep getting both items, which to me is not OK because I would rather get only one.
Is there something I could do to achieve this?
Returning both items is correct, since both items match your search params, but with different similarity score.
You can output the similarity score to help sorting the result.
user.aggregate([
{ $match: { $text: { $search: "Rolex Air-King" } } },
{ $set: { score: { $meta: "textScore" } } }
])
// new RegExp("Rolex Air-King", 'i') is not necessary and even invalid,
// as $search accepts string and is already case-insensitive by default
The query will return
[{
"_id": "...",
"title": "Rolex",
"name": "Air-King",
"description": "Nice",
"score": 2.6
},
{
"_id": "....",
"title": "Rolex",
"name": "Submariner",
"description": "Nice",
"score": 1.1
}]
Since the second result item matches your search query (even partially), MongoDB returns it.
You could use the score to help sort the items. But determining the right threshold to filter the result is complex, as the score depends on the word count as well.
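For illustration, a sketch of how the aggregation above could sort by that score and drop weak matches (WatchModel stands in for the model registered as 'Watch' earlier; the 1.5 cutoff is an arbitrary example, not a recommended value):
const results = await WatchModel.aggregate([
  { $match: { $text: { $search: "Rolex Air-King" } } },
  { $set: { score: { $meta: "textScore" } } },
  { $sort: { score: -1 } },             // best matches first
  { $match: { score: { $gt: 1.5 } } }   // arbitrary threshold, tune for your data
]);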
On a side note: You can assign different weights to the fields if they are not equally important
https://docs.mongodb.com/manual/tutorial/control-results-of-text-search/