I am using Bookshelf.js as my ORM. I want to query for a specific entity and update it in one query (as if I were executing UPDATE ... WHERE ... in SQL).
How can I do this without a raw query?
Here is how I do it right now:
async function updateSetting() {
let userSetting = await Setting.where({user: 1, key: 'foo'}).fetch()
await userSetting.save({value: {confirmed: true, order: 66}})
}
You could try this:
async function updateSetting() {
let userSetting = await Setting
.where({user: 1, key: 'foo'})
.save(
{value: {confirmed: true, order: 66}},
{method: 'update', patch: true}
)
}
To update a model without doing a fetch() you need either the model's idAttribute (usually just id) or a where() clause (as mentioned by @websoftwares).
Something like:
new Setting({ id: 99 })
.save({ value: { confirmed: true, order: 66} },
{ patch: true })
Note in this case you do not need to specify the method because the presence of id implies you are updating an existing row.
Related
I just can't figure out the query, or even whether it's allowed, to write a single query that pushes 4 different objects into 4 different arrays deeply nested inside the user object.
I receive a PATCH request from the front-end whose body looks like this:
{
bodyweight: 80,
waist: 60,
biceps: 20,
benchpress: 50,
timestamp: 1645996168125
}
I want to create 4 objects and push them into the user's data in MongoDB Atlas:
{date:1645996168125, value:80} into user.stats.bodyweight <-array
{date:1645996168125, value:60} into user.stats.waist <-array
...etc
I am trying to figure out the second argument for:
let user = await User.findOneAndUpdate({id:req.params.id}, ???)
But I am happy to update it with any other Mongoose method if possible.
PS: I am not using the _id given by MongoDB on purpose.
You'll want to use the $push operator. It accepts paths as the field names, so you can specify a path to each of the arrays.
I assume the fields included in your request are fixed (the same four property names / arrays for every request)
let user = await User.findOneAndUpdate(
{ id: req.params.id },
{
$push: {
"stats.bodyweight": {
date: 1645996168125,
value: 80,
},
"stats.waist": {
date: 1645996168125,
value: 60,
},
// ...
},
}
);
If the fields are dynamic, use an object and if conditions, like this:
const update = {};
if ("bodyweight" in req.body) {
update["stats.bodyweight"] = {
    date: req.body.timestamp,
    value: req.body.bodyweight,
};
}
// ...
let user = await User.findOneAndUpdate(
{ id: req.params.id },
{
$push: update,
}
);
The if condition is just to demonstrate the principle; you'll probably want stricter type checking / validation.
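For example, a slightly stricter version of the same idea (a sketch only; the allowedFields whitelist and the numeric check are assumptions, and it slots into the same route handler as above):
const allowedFields = ["bodyweight", "waist", "biceps", "benchpress"];

// Build the $push update from a whitelist of field names instead of separate if blocks.
// Anything not in the whitelist, or not a finite number, is silently ignored.
const update = {};
for (const field of allowedFields) {
  const value = req.body[field];
  if (typeof value === "number" && Number.isFinite(value)) {
    update[`stats.${field}`] = { date: req.body.timestamp, value };
  }
}

if (Object.keys(update).length > 0) {
  await User.findOneAndUpdate(
    { id: req.params.id },
    { $push: update },
    { new: true } // return the updated document
  );
}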
You can also try $addToSet, which appends the value only if an identical element is not already in the array:
await User.findOneAndUpdate(
{id:req.params.id},
{$addToSet:
{"stats.bodyweight":{date:1645996168125, value:80} }
}
)
I want to be able to update an array of objects where each object has a new unique value assigned to it.
Here is a simplified example of what I'm doing. items is an array of my collection items.
let items = [{_id: '903040349304', number: 55}, {_id: '12341244', number: 1166}, {_id: '667554', number: 51115}]
I want to assign a new number to each item, and then update it in collection:
items = items.map(item => {
item.number = randomInt(0, 1000000);
return item;
})
What would be the best way to update the collection at once? I know that I could do it with forEach instead of map, however that seems like a dirty way of doing it, as it won't do a bulk update.
items.forEach(async (item) => {
await this.itemModel.update({_id: item._id}, {number: randomInt(0, 1000000)})
});
I've checked updateMany as well, but my understanding is that it's only used to update documents with the same new value - not like in my case, where every document gets a new unique value assigned to it.
After a bit of thinking, I came up with this solution using bulkWrite.
const updateQueries = [];
items.forEach((item) => {
  updateQueries.push({
    updateOne: {
      filter: { _id: item._id },
      update: { $set: { number: item.number } },
    },
  });
});
await this.itemModel.bulkWrite(updateQueries);
About bulkWrite
Sends multiple insertOne, updateOne, updateMany, replaceOne, deleteOne, and/or deleteMany operations to the MongoDB server in one command. This is faster than sending multiple independent operations (e.g. if you use create()) because with bulkWrite() there is only one round trip to MongoDB.
You can run the update as an aggregation pipeline to set the values server-side, without needing to pull the documents first:
Step 1: get a random number with MongoDB's built-in $rand operator, which returns a number between 0 and 1
Step 2: $multiply this number by 1000000, since that is what you defined ;)
Step 3: use another $set with $floor to remove the decimal portion
YourModel.updateMany({}, [
  {
    '$set': {
      'number': {
        '$multiply': [
          {
            '$rand': {}
          }, 1000000
        ]
      }
    }
  }, {
    '$set': {
      'number': {
        '$floor': '$number'
      }
    }
  }
])
I tried to find a way to copy/clone instances in Sequelize but without success. Is there any way to do it with a built-in function or without? What I want is to simply copy rows in the database and the new item should have only a different id.
There is no such direct function for that. What you can do is:
Fetch the object that you want to clone/copy
Remove the primary key from it
Make a new entry from it
model.findOne({ //<---------- 1
where : { id : 1 } ,
raw : true })
.then(data => {
delete data.id; //<---------- 2
model.create(data); //<---------- 3
})
As said, there is no such direct function for that (thanks Vivek).
If you find it useful, place the following code on your model class:
async clone() {
  let cData = await THISISMYMODEL.findOne({
    where: { id: this.id },
    raw: true,
  });
  delete cData.id;
  return await THISISMYMODEL.create(cData);
}
Take into account that "THISISMYMODEL" should be the model class defined and "id" the primary key attribute used.
Also take into account the use of foreign keys (relations with other models): the clone will point at the same related rows, so if that is not what you want you should clone those instances too.
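For instance, a hypothetical sketch of cloning a row together with its children; Parent and Child are placeholder models, and the association is assumed to be Parent.hasMany(Child, { as: 'children', foreignKey: 'parentId' }):
async function cloneParentWithChildren(parentId) {
  // Fetch the original row together with its children (assumed alias: 'children')
  const original = await Parent.findOne({
    where: { id: parentId },
    include: [{ model: Child, as: 'children' }],
  });

  // Strip the primary key and the included children from the plain data
  const { id, children, ...parentData } = original.get({ plain: true });

  // Create the copy of the parent first, so we have its new primary key
  const copy = await Parent.create(parentData);

  // Re-create each child, pointing it at the new parent
  for (const child of children) {
    const { id: childId, parentId: oldParentId, ...childData } = child;
    await Child.create({ ...childData, parentId: copy.id });
  }

  return copy;
}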
You may need to update the name, though, or some other field, to identify it as a copy:
const data = await model.findOne({ where: {id: 1}, raw: true, attributes: { exclude: ['id'] } });
data.name = data.name + '(copy)';
const newRecord = await model.create(data);
Write a Model.create(data) call inside Model.js and call it from inside a loop; as many times as you need, it will create a copy of the same data.
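A minimal sketch of that idea, assuming model is the Sequelize model and copies is however many duplicates you need:
// Fetch the source row once, without its primary key, then create it in a loop
const source = await model.findOne({
  where: { id: 1 },
  raw: true,
  attributes: { exclude: ['id'] },
});

const copies = 4; // hypothetical number of duplicates
for (let i = 0; i < copies; i++) {
  await model.create(source);
}
If the number of copies is large, model.bulkCreate(Array.from({ length: copies }, () => ({ ...source }))) does the same thing in a single bulk insert.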
Assume there's a collection with documents where field_1 is unique
[
{
field_1: 'abc',
field_2: 0,
field_3: []
}
]
I want to add another document, but what if field_1 is the same, 'abc'? In that case I want to increment field_2 and append an element to field_3 while updating. And if field_1 is different, create another document.
What is the best way to approach such queries? My first thought was to search, and then insert if no documents are found - or is there a better way? The problem with this approach is that if I'm inserting multiple documents at once, I can't use the 'search and, if no doc found, insert' approach effectively.
Mongoose now supports this natively with findOneAndUpdate (which calls MongoDB's findAndModify).
The upsert: true option creates the object if it doesn't exist; it defaults to false.
MyModel.findOneAndUpdate(
{foo: 'bar'}, // find a document with that filter
modelDoc, // update to apply; also used to build the inserted document when nothing is found
{upsert: true, new: true, runValidators: true}, // options
function (err, doc) { // callback
if (err) {
// handle error
} else {
// handle document
}
}
);
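Applied to the fields from the question, a sketch might look like the following (note the assumption that $inc on insert is acceptable: a freshly upserted document gets field_2: 1 rather than 0, because $setOnInsert cannot touch the same path that $inc modifies):
const doc = await MyModel.findOneAndUpdate(
  { field_1: 'abc' },                   // match on the unique field
  {
    $inc: { field_2: 1 },               // increment the counter
    $push: { field_3: 'new element' },  // append to the array
  },
  { upsert: true, new: true }           // insert when no match, return the result
);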
If the uniqueness of field_1 is enforced by a unique index, you can use a kind of optimistic locking.
First you try to update:
db.collection.update(
{
field_1: 'abc'
},
{
$inc: {field_2: 1},
$push: {field_3: 'abc'},
}
);
and check the result of the operation - if 1 document was updated, no more actions are required. Otherwise, it's the first document with field_1 == 'abc', so you try to insert it:
db.collection.insert(
{
field_1: 'abc',
field_2: 0,
field_3: []
}
);
and catch the error. If there is no error, no more actions required. Otherwise there was a concurrent insert, so you need to repeat the update query once more.
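A minimal sketch of that retry loop, written against the Node.js driver (the collection and field names are taken from the question; error code 11000 is MongoDB's duplicate-key error):
async function incrementOrInsert(collection, element) {
  for (;;) {
    // Step 1: try the in-place update
    const res = await collection.updateOne(
      { field_1: 'abc' },
      { $inc: { field_2: 1 }, $push: { field_3: element } }
    );
    if (res.matchedCount === 1) return; // document existed and was updated

    // Step 2: no match, so try to insert the initial document
    try {
      await collection.insertOne({ field_1: 'abc', field_2: 0, field_3: [] });
      return; // first document for this key, nothing else to do
    } catch (err) {
      if (err.code !== 11000) throw err; // not a duplicate-key error
      // a concurrent insert won the race: loop and repeat the update
    }
  }
}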
I need to increment a column by 1 on some occasions, but the default value of that column is null and not zero. How do I handle this case using Sequelize? What method could be utilized?
I could do it by checking the column for null in one query and updating it accordingly in a second query using Sequelize, but I am looking for something better. Could I handle this in one call?
I'll confess that I'm not terribly experienced with sequelize, but in general you'll want to utilize IFNULL. Here's what the raw query might look like:
UPDATE SomeTable
SET some_column = IFNULL(some_column, 0) + 1
WHERE <some predicate>
Going back to sequelize, I imagine you're trying to use .increment(), but judging from the related source, it doesn't look like it accepts anything that will do the trick for you.
Browsing the docs, it looks like you might be able to get away with something like this:
SomeModel.update({
some_column: sequelize.literal('IFNULL(some_column, 0) + 1')
}, {
where: {...}
});
If that doesn't work, you're probably stuck with a raw query.
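In that case, a raw-query sketch might look like the following; the table, column, and where condition are placeholders, and IFNULL is MySQL/SQLite specific (COALESCE is the portable equivalent):
await sequelize.query(
  'UPDATE SomeTable SET some_column = IFNULL(some_column, 0) + 1 WHERE id = :id',
  { replacements: { id: someId } }
);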
First you need to find the model instance and update via it, or update directly via the Sequelize static Model API.
Then check whether the updated field still ends up null or not; if it does, do the manual update as JMar proposes above:
await model.transaction({isolationLevel: Sequelize.Transaction.ISOLATION_LEVELS.SERIALIZABLE}, async (tx) => {
const user = await model.User.findOne({
where: {
username: 'username',
},
rejectOnEmpty: true,
transaction: tx,
});
const updatedRecord = await user.increment(['field_tag'], {
transaction: tx,
});
if (!updatedRecord.field_tag) {
/** Manual update & Convert nullable value into Integer !*/
await model.User.update({
field_tag: Sequelize.literal('IFNULL(field_tag, 0) + 1')
}, {
where: {
username: 'username',
},
transaction: tx,
});
}
});