The following Cloud Function contains a batch write that, among other things, is meant to update a single field in a document. Instead, it overwrote the entire document, which now contains only the field joinedCount: -1. Isn't this the way to update individual fields in documents without overwriting them?
exports.deleteUserTEST = functions.https.onCall(async (data, _context) => {
  const uId = data.userId;
  const db = admin.firestore();
  try {
    const batch = db.batch();
    const settingsDoc = await db.collection("userSettings").doc(uId).get();
    const joinedIds = settingsDoc.get("private.joinedIds");
    Object.keys(joinedIds).forEach(function(jId, _index) {
      batch.update(
        db.collection("profiles").doc(jId),
        {
          private: {
            joinedCount: admin.firestore.FieldValue.increment(-1), // <-- the culprit
          },
        },
      );
    });
    await batch.commit();
  } catch (error) {
    throw new functions.https.HttpsError("unknown", "Failed to delete the user's content.", error);
  }
  return Promise.resolve(uId);
});
Moving the solution found in the comments by @Dharmaraj into a community answer: this problem was caused by the structure of the document.
Since all of the data in the document was inside the private map field, passing a new map to the update method replaced that entire map, making it appear that the whole document had been overwritten instead of updated.
In this case, you would need to access the fields through dot notation. This allows those inner fields within the map to be updated, without replacing the entire private map:
Object.keys(joinedIds).forEach(function(jId, _index) {
  batch.update(db.collection("profiles").doc(jId), {
    "private.joinedCount": admin.firestore.FieldValue.increment(-1)
  });
});
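The Admin SDK also accepts a FieldPath in place of the dot-notation string, which can be handy if a key itself contains a dot or other special characters. A minimal sketch of the same fix, assuming the same batch and joinedIds as above:

// Sketch: the same increment expressed with a FieldPath instead of a dot-notation string.
const { FieldPath, FieldValue } = admin.firestore;

Object.keys(joinedIds).forEach(function(jId) {
  batch.update(
    db.collection("profiles").doc(jId),
    new FieldPath("private", "joinedCount"), // targets only this nested field
    FieldValue.increment(-1)
  );
});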
Another example from the documentation:
import { doc, setDoc, updateDoc } from "firebase/firestore";
// Create an initial document to update.
const frankDocRef = doc(db, "users", "frank");
await setDoc(frankDocRef, {
  name: "Frank",
  favorites: { food: "Pizza", color: "Blue", subject: "recess" },
  age: 12
});

// To update age and favorite color:
await updateDoc(frankDocRef, {
  "age": 13,
  "favorites.color": "Red"
});
This is my first time using bulkWrite to carry out updates via Mongoose. I am building a blog application to learn the MERN stack. I have a Post model that has a field whose value is an array. This is an example of it:
const PostSchema = new mongoose.Schema(
  {
    postLikes: {
      type: Array,
      default: []
    }
  }
)
The postLikes array contains the MongoDB ObjectIds of users who liked a post.
I have logic that lets an admin delete selected users or all users. The like system does not have a Like model of its own; I simply used an array inside the Post model. After deleting a user, I would like to update all posts that contain likes from the selected users. Some users may have likes across multiple posts.
In my Node code, I created a variable like this:
const {selectedIds} = req.body;
The selectedIds come from React like this:
const [selectedUsers, setSelectedUsers] = useState([]);
const arrayOfSelectedUserId = (userId) => {
  setSelectedUsers(prevArray => [...prevArray, userId]);
};
For the request, I did it like this:
const response = await axiosPrivate.post(`/v1/users/deleteSelected`, selectedIds, {
  withCredentials: true,
  headers: { authorization: `Bearer ${auth.token}` }
});
In Node.js, the selected user ids were passed to this variable:
const {selectedIds} = req.body;
I created the logic this way:
const findIntersection = (array1, array2) => {
  return array1.filter((elem) => {
    return array2.indexOf(elem) !== -1;
  });
};

const filteredPost = posts.filter((singleFilter) => {
  const intersection = findIntersection(selectedIds, singleFilter.postLikes);
  return singleFilter.postLikes.length !== 0 && intersection.length !== 0;
});

const updatedPosts = filteredPost.map((obj) => {
  const intersection = findIntersection(selectedIds, obj.postLikes);
  console.log(intersection);
  return {
    updateOne: {
      filter: { _id: obj._id },
      update: { $pull: { postLikes: { $in: intersection } } },
    },
  };
});

Post.bulkWrite(updatedPosts).then((res) => {
  console.log("Documents Updated", res.modifiedCount);
});
The console.log prints "Documents Updated" along with the number of documents updated. However, when I check my database, the update is not reflected; the selected users' ids are still in the array.
Is there a better method? What am I doing wrong?
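For comparison, the same cleanup can be expressed as a single updateMany instead of a bulkWrite. A minimal sketch, assuming selectedIds and the values stored in postLikes are of the same type (plain strings or ObjectIds, but not a mix):

// Sketch: pull every selected user id from every post that contains one.
const result = await Post.updateMany(
  { postLikes: { $in: selectedIds } },           // only posts containing a selected id
  { $pull: { postLikes: { $in: selectedIds } } } // remove all of them in one pass
);
console.log(result);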
I have a Node.js project using Mongoose 5.x.
My model has a toJSON method which removes the _id and __v fields perfectly:
mySchema.method("toJSON", function toJSON() {
  const {__v, _id, ...object} = this.toObject();
  return {
    id: _id,
    ...object
  };
});
So when fetching data from the db:
const data = await myModel.findOne({_id: id});
I get an object that when serialized to the user:
res.json(data);
It doesn't contain the _id and __v fields as required.
The problem is when I use lean():
const data = await myModel.findOne({_id: id}).lean();
the data object contains those fields.
I can remove them manually when using lean(),
but I would prefer a way to sanitize the data object in both cases with the same mechanism.
Any suggestions?
Thanks in advance.
Not sure if this is what you want but maybe:
const data = myModel.findOne({_id: id}).lean().then(res => {
  delete res._id;
  return res;
});
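The __v field can be dropped the same way. A sketch of the same idea using await instead of .then():

// Sketch: strip both Mongoose metadata fields from a lean result.
const data = await myModel.findOne({ _id: id }).lean();
delete data._id;
delete data.__v;
res.json(data);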
When JSON.stringify() is called on an object, a check is done to see whether it has a property called toJSON. This is not specific to Mongoose; it works on plain objects:
const myObj = {
  _id: 123,
  foo: 'bar',
  // `toJSON() {...}` is short for `toJSON: function toJSON() {...}`
  toJSON() {
    // Because "this" is used, you can't use an arrow function.
    delete this._id;
    return this;
  }
};

console.log(JSON.stringify(myObj));
// {"foo":"bar"}
Mongoose doesn't have an option to automatically inject a toJSON function into objects returned by lean(). But that's something you can add.
First, create a function that:
- takes an object with properties,
- listens for when Mongoose runs a find query, and
- tells Mongoose that after the query, it should change the result.
The change: merge the result with the object from step 1.
function mergeWithLeanObjects(props) {
  // Return a function that takes your schema.
  return function(schema) {
    // Before these methods are run on the schema, execute a function.
    schema.pre(['find', 'findOne'], function() {
      // Check for {lean: true}
      if (this._mongooseOptions.lean) {
        // Changes the document(s) that will be returned by the query.
        this.transform(function(res) {
          // [].concat(res) makes sure it's an array.
          [].concat(res).forEach(obj => Object.assign(obj, props));
          return res;
        });
      }
    });
  };
}
Now add it to your schema:
mySchema = new mongoose.Schema({
  foo: String
});

mySchema.plugin(mergeWithLeanObjects({
  toJSON() {
    delete this._id;
    delete this.__v;
    return this;
  }
}));
Then test it:
const full = await myModel.findOne();
const lean = await myModel.findOne().lean();

console.log(full);
// Logs a Mongoose document, all properties.
// { _id: new ObjectId("62a8b39466768658e7333154"), foo: 'bar', __v: 1 }

console.log(JSON.stringify(full));
// Logs a JSON string, all properties.
// {"_id":"62a8b39466768658e7333154","foo":"bar","__v":1}

console.log(lean);
// Logs an Object, all properties.
// { _id: new ObjectId("62a8b39466768658e7333154"), foo: 'bar', __v: 1, toJSON: [Function: toJSON] }

console.log(JSON.stringify(lean));
// Logs a JSON string, filtered properties.
// {"foo":"bar"}
If you want to re-use the plugin with the same settings on multiple schemas, just save the function that mergeWithLeanObjects returns somewhere.
// file: clean.js
module.exports = mergeWithLeanObjects({
  toJSON() {
    delete this._id;
    delete this.__v;
    return this;
  }
});

// file: some-schema.js
schema1.plugin(require('./clean.js'));
// file: some-other-schema.js
schema2.plugin(require('./clean.js'));
There's also mongoose.plugin() to add the function to all schemas.
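A sketch of that global variant (register it before the models are created so every schema picks it up):

// Sketch: apply the same cleanup plugin to all schemas globally.
const mongoose = require('mongoose');
mongoose.plugin(require('./clean.js'));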
Try this to retrieve the _id from a document
myModel.findOne({_id: id}, function(err, doc) {
  if (err)
    return 'do something with this err';
  console.log(doc._id);
});
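The same lookup with async/await instead of a callback, in case that fits the rest of the codebase better (a sketch):

// Sketch: promise-based equivalent of the callback version above.
try {
  const doc = await myModel.findOne({ _id: id });
  console.log(doc._id);
} catch (err) {
  // do something with this err
}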
Problem
The Firestore document is neither being created nor updated.
let user = db.collection('users').doc(userId);

let data = {
  timestamp: FieldValue.serverTimestamp(),
  bounties: {
    [impressionId]: {
      timestamp: timestamp,
      amount: amount,
      currency: currency
    }
  }
};

user.set(data, {merge: true});
Expectation
The example data below should be used to create or update the Cloud Firestore document:
{
  "example-user-1": {
    "bounties": {
      "example-impression-1": {
        "timestamp": "12315443",
        "amount": 0.0,
        "currency": "null"
      }
    }
  }
}
Results
The document is not created
{"domain":{"domain":null,"_events":{},"_eventsCount":1,"members":[]}}
Option 1
After installing this library and adding this code, it works, but I don't know why.
const {Firestore} = require('@google-cloud/firestore');

// Create a new client
const firestore = new Firestore();

async function quickstart() {
  // Obtain a document reference.
  const document = firestore.doc('posts/intro-to-firestore');

  // Enter new data into the document.
  await document.set({
    title: 'Welcome to Firestore',
    body: 'Hello World',
  });
  console.log('Entered new data into the document');

  // Update an existing document.
  await document.update({
    body: 'My first Firestore app',
  });
  console.log('Updated an existing document');

  // Read the document.
  let doc = await document.get();
  console.log('Read the document');

  // Delete the document.
  await document.delete();
  console.log('Deleted the document');
}

quickstart();
user.set(data, {merge: true}).then(data=>{console.log(data)});
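The likely reason the original snippet appeared to do nothing (an assumption, since the surrounding code isn't shown) is that set() returns a promise, and if the caller returns or exits before that promise settles, the write may never be committed. Awaiting the write is usually enough; a minimal sketch using a hypothetical wrapper function:

// Sketch: make sure the write finishes before the function returns.
async function saveBounty(db, userId, data) {
  const user = db.collection('users').doc(userId);
  await user.set(data, { merge: true }); // wait for the write to be committed
  return user.id;
}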
When I try to add two records in sequence, only the first record is added; on the second it throws an error because it cannot create a field with this data:
"NOTES_ID is required","key: NOTES_ID, value: undefined, is not a
number"
How can I create entries for two related tables sequentially: first for the main table, and then for the one that holds the foreign key?
module.exports.create = async function (req, res) {
  const stateMatrix = await StateMatrix.select().exec();

  const noteObj = {
    DATE: req.body.DATE,
    TITLE: req.body.TITLE,
    CONTENT: req.body.CONTENT
  };

  const noteStateObj = {
    STATE_DATE: new Date().toLocaleDateString("en-US"),
    STATES_ID: stateMatrix[0]._props.STATES_ID_CURR,
    NOTES_ID: req.body.NOTE_ID,
    USERS_ID: req.decoded.user_id
  };

  try {
    await Notes.create(noteObj);
    await NoteStates.create(noteStateObj);
    res.status(201).json(noteObj, noteStateObj);
  } catch (e) {
    errorHandler(res, e);
  }
};
NoteStates is probably related to Notes through the NOTES_ID field, which cannot be empty (I assume it is a foreign key). That means you should set it before saving noteStateObj:
// Something like this
const newNote = await Notes.create(noteObj);
noteStateObj.NOTES_ID = newNote.ID;
await NoteStates.create(noteStateObj);
Description
MongoDB's updateMany method doesn't update any documents when used with $rename on some fields.
Example
I am trying to rename the fields blog.blog_ttile and blog.blog_cotnet. I checked for typos; there are none.
Example model:
User: {
  name,
  email,
  blog: {
    blog_ttile,
    blog_cotnet
  }
}
Code:
const mongoose = require('mongoose');
const User = mongoose.model('User');

const nameChanges = {
  'blog.blog_ttile': 'blog.title',
  'blog.blog_cotnet': 'blog.content',
};

async function performNameChanges() {
  try {
    const updatedDocuments = await User.updateMany({}, { $rename: nameChanges });
    console.log({ updatedDocuments });
  } catch(err) {
    console.error(err);
  }
}
returns:
{ updatedDocuments: { ok: 0, n: 0, nModified: 0 } }
Additional details
Some fields are correctly recognized, for example email above. However, when I try to rename the already-renamed field, it doesn't work. Interestingly, it still detects the original name.
Example:
Renaming email to personal_email works. Renaming personal_email back to email afterwards doesn't and returns { ok: 0, n: 0, nModified: 0 }. Calling a rename on email a second time returns { n: <total_records>, nModified: 0, ok: 1 }, although no documents have email anymore.
What could be causing this?
Note:
This question also applies to MongoDB without Mongoose, using db.getCollection("User").updateMany instead of User.updateMany.
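For reference, a sketch of that plain-driver/shell call, which renames the fields directly with no schema involved:

// Sketch: the same $rename issued in the mongo shell, where no Mongoose schema filtering applies.
db.getCollection("User").updateMany(
  {},
  { $rename: { "blog.blog_ttile": "blog.title", "blog.blog_cotnet": "blog.content" } }
);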
I tried doing the same in plain MongoDB and it works as expected. In your case I suspect the Mongoose schema is the cause of this behavior. The Mongoose schema has to contain the field you are trying to rename; if the field doesn't exist, the update returns nModified: 0. The schema will need to have both the old and the new names: the old ones to allow the migration, and the new ones for the new logic in the code.
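A sketch of what such a migration-time schema could look like (the field types are assumptions; they aren't shown in the question):

// Sketch: keep both the old and the new paths in the schema while the $rename migration runs.
const userSchema = new mongoose.Schema({
  name: String,
  email: String,            // old name, kept so updateMany can still match it
  personal_email: String,   // new name, used by the application code
  blog: {
    blog_ttile: String,     // old (misspelled) name
    blog_cotnet: String,    // old (misspelled) name
    title: String,          // new name
    content: String,        // new name
  },
});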
Your return result is:
{ updatedDocuments: { ok: 0, n: 0, nModified: 0 } }
How is this possible? n = 0 for the query {}? That is only possible when there are no documents in your collection: n is the matched count, so it should equal the total number of records in the collection.
Renaming email to personal_email works
Before the first update your schema is fine. But after the rename (the first update), you should update your schema to:
User: {
  name,
  personal_email,
  blog: {
    blog_ttile,
    blog_cotnet
  }
}
before running the second update (renaming back to email).
As said in the other answer, it's because your mongoose schema didn't contain the field you wanted to rename.
Instead of keeping the old field around while the migration occurs, you can also specify strict: false in the options, and mongoose will not discard unknown paths:
const mongoose = require('mongoose');
const User = mongoose.model('User');

const nameChanges = {
  'blog.blog_ttile': 'blog.title',
  'blog.blog_cotnet': 'blog.content',
};

async function performNameChanges() {
  try {
    const updatedDocuments = await User.updateMany(
      {},
      { $rename: nameChanges },
      {
        // strict: false allows updating keys that no longer exist in the schema
        strict: false,
      }
    ).exec();
    console.log({ updatedDocuments });
  } catch(err) {
    console.error(err);
  }
}
Use it like this:
import { connect } from '../database';

export const renameFileds = async () => {
  const db = connect();
  // let userlist = await db.UserModel.find();
  const updatedDocuments = await db.UserModel.updateMany(
    {},
    { $rename: { image: 'picture' } },
    {
      // strict: false allows updating keys that no longer exist in the schema
      strict: false,
    }
  ).exec();
  console.log({ updatedDocuments });
};
renameFileds();