Why is my Firestore document not being created or updated? - node.js

Problem
The Firestore document is not being created or updated by the following code:
let user = db.collection('users').doc(userId);
let data = {
  timestamp: FieldValue.serverTimestamp(),
  bounties: {
    [impressionId]: {
      timestamp: timestamp,
      amount: amount,
      currency: currency
    }
  }
};
user.set(data, {merge: true});
Expectation
The example data below should be used to create or update the Cloud Firestore document:
{
  "example-user-1": {
    "bounties": {
      "example-impression-1": {
        "timestamp": "12315443",
        "amount": 0.0,
        "currency": "null"
      }
    }
  }
}
Results
The document is not created
{"domain":{"domain":null,"_events":{},"_eventsCount":1,"members":[]}}

Option 1
After installing this library and adding this code, it works, but I don't know why:
const {Firestore} = require('@google-cloud/firestore');
// Create a new client
const firestore = new Firestore();
async function quickstart() {
  // Obtain a document reference.
  const document = firestore.doc('posts/intro-to-firestore');
  // Enter new data into the document.
  await document.set({
    title: 'Welcome to Firestore',
    body: 'Hello World',
  });
  console.log('Entered new data into the document');
  // Update an existing document.
  await document.update({
    body: 'My first Firestore app',
  });
  console.log('Updated an existing document');
  // Read the document.
  let doc = await document.get();
  console.log('Read the document');
  // Delete the document.
  await document.delete();
  console.log('Deleted the document');
}
quickstart();

user.set(data, {merge: true}).then(data=>{console.log(data)});
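Calling .then() like this at least shows whether the set() promise ever resolves. A likely cause of the original problem (an assumption, since the surrounding context is not shown) is that the returned promise is never awaited, so a Cloud Function or short-lived script can exit before the write completes; note that the quickstart in Option 1 awaits every call. A minimal awaited sketch of the original write (saveBounty is a hypothetical wrapper; db, userId, impressionId, timestamp, amount, and currency are assumed to exist as in the question):
// Sketch only: FieldValue comes from the same package used in Option 1
const { FieldValue } = require('@google-cloud/firestore');

async function saveBounty(userId, impressionId, timestamp, amount, currency) {
  const user = db.collection('users').doc(userId);
  const data = {
    timestamp: FieldValue.serverTimestamp(),
    bounties: {
      [impressionId]: {
        timestamp: timestamp,
        amount: amount,
        currency: currency
      }
    }
  };
  // Awaiting (or returning) the write keeps the process alive until it finishes
  const writeResult = await user.set(data, { merge: true });
  console.log('Bounty written at', writeResult.writeTime);
}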

Related

How to add/update a reference object in firestore admin with nodejs/express

I'm learning about Cloud Firestore (coding in Node.js/Express - RESTful API).
I have two collections:
Users
  username
  email
  password
Restaurants
  title
  address
  content
  createdBy (example: reference object Users/UwxCkzuV2if1icGowsT6)
I do not know how to add a reference object for a User to a Restaurant.
Can you help me add the code in the section below?
And can you recommend some articles about CRUD with reference objects?
Thank you.
async createRestaurant(req, res, next) {
  try {
    //debugApp(req.body);
    let { error } = Restaurant.validateRestaurant(req.body);
    if (error) return next(ApiError.invalid_400());
    const { title, content, address } = req.body;
    console.log("creating a Restaurant.......");
    //----------------------------------------------------
    const restaurantsRef = db.collection("restaurants");
    const snapshot = await restaurantsRef.where("title", "==", title).get();
    if (!snapshot.empty) {
      console.log("duplicate_400");
      return next(ApiError.duplicate_400());
    }
    //----------------------------------------------------
    var userId = "UwxCkzuV2if1icGowsT6" //this is an example Id
    const response = await restaurantsRef.add({
      title: title,
      content: content,
      address: address,
      //How to add reference object createdBy: users/UwxCkzuV2if1icGowsT6
      //please add your code here. Thanks
      //createdBy: .......
    });
    //----------------------------------------------------
    console.log("created a Restaurant.......");
    const restaurantJSON = await restaurantDetailsToJSON(response.id);
    return ApiSuccess.send(res, { restaurant: restaurantJSON });
  } catch (err) {
    console.log(err);
    return next(
      ApiError.internal_500("This restaurant could not be created", err)
    );
  }
}
The Firebase doc function returns a DocumentReference. I tend to just save the document ID as the reference, because referencing that document is as simple as db.collection("users").doc(userId), and fetching it would be await db.collection("users").doc(userId).get(), which returns a DocumentSnapshot.
const response = await restaurantsRef.add({
  title: title,
  content: content,
  address: address,
  //How to add reference object createdBy: users/UwxCkzuV2if1icGowsT6
  //please add your code here. Thanks
  //createdBy: .......
  createdBy: db.collection("users").doc(userId)
});
Firebase Code Documentation
/**
* Get a `DocumentReference` for a randomly-named document within this
* collection. An automatically-generated unique ID will be used as the
* document ID.
*
* @return The `DocumentReference` instance.
*/
doc(): DocumentReference<T>;
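For completeness, here is a hedged sketch of how the stored reference could be followed later when reading a restaurant back (getRestaurantWithUser is a hypothetical helper; db and the createdBy field name are taken from the snippets above):
async function getRestaurantWithUser(restaurantId) {
  const restaurantSnap = await db.collection("restaurants").doc(restaurantId).get();
  if (!restaurantSnap.exists) return null;

  const restaurant = restaurantSnap.data();
  // createdBy was stored as a DocumentReference, so it can be fetched directly
  const userSnap = await restaurant.createdBy.get();

  return {
    id: restaurantSnap.id,
    ...restaurant,
    createdBy: userSnap.exists ? { id: userSnap.id, ...userSnap.data() } : null
  };
}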

Firebase Cloud Function batch write update document overwrote the entire document?

The following Cloud Function has a batch write operation that, in part, updates a single field in a document. This overwrote the entire document and now the document has a single field joinedCount: -1. Is this not the way to update individual fields in documents without overwriting them?
exports.deleteUserTEST = functions.https.onCall(async (data, _context) => {
  const uId = data.userId;
  const db = admin.firestore();
  try {
    const batch = db.batch();
    const settingsDoc = await db.collection("userSettings").doc(uId).get();
    const joinedIds = settingsDoc.get("private.joinedIds");
    Object.keys(joinedIds).forEach(function(jId, _index) {
      batch.update(
        db.collection("profiles").doc(jId),
        {
          private: {
            joinedCount: admin.firestore.FieldValue.increment(-1), // <-- the culprit
          },
        },
      );
    });
    await batch.commit();
  } catch (error) {
    throw new functions.https.HttpsError("unknown", "Failed to delete the user's content.", error);
  }
  return Promise.resolve(uId);
});
Moving the solution found in the comments by @Dharmaraj into a community answer, this problem was caused by the structure of the document.
Since all the data in the document was inside the private map field, passing a new map through the update method would make it appear that the entire document was being overwritten instead of updated.
In this case, you would need to access the fields through dot notation. This allows those inner fields within the map to be updated, without replacing the entire private map:
Object.keys(joinedIds).forEach(function(jId, _index) {
  batch.update(db.collection("profiles").doc(jId), {
    "private.joinedCount": admin.firestore.FieldValue.increment(-1)
  });
});
Another example from the documentation:
import { doc, setDoc, updateDoc } from "firebase/firestore";
// Create an initial document to update.
const frankDocRef = doc(db, "users", "frank");
await setDoc(frankDocRef, {
  name: "Frank",
  favorites: { food: "Pizza", color: "Blue", subject: "recess" },
  age: 12
});
// To update age and favorite color:
await updateDoc(frankDocRef, {
  "age": 13,
  "favorites.color": "Red"
});
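The same distinction applies to the Admin SDK used in the Cloud Function above: passing a nested object to update() replaces the whole map field, while a dot-notation key only touches the named nested field. A short sketch of the contrast (db and jId as in the question):
const profileRef = db.collection("profiles").doc(jId);

// Replaces the entire "private" map with a map containing only joinedCount
await profileRef.update({ private: { joinedCount: admin.firestore.FieldValue.increment(-1) } });

// Updates only private.joinedCount and leaves the other keys inside "private" intact
await profileRef.update({ "private.joinedCount": admin.firestore.FieldValue.increment(-1) });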

Mongoose Set object property in array of objects

I have a document consisting of a Post. Each Post has an array of Comments, each of which is an object. So my document looks like this:
Now, I want to be able to update the message property in a given Comment object.
So I'll be using the $set method, but how would I be able to select the specific object? Currently, my unfinished method looks like this:
export const editComment = async (req, res) => {
  const { id } = req.body;
  const post = await Post.findById(req.params.id);
  const _id = post.id;
  const postComments = post.comments.map((comment) => comment._id);
  const commentIndex = postComments.indexOf(id.id);
  const message = post.comments[commentIndex].message;
  try {
    await Post.updateOne(
      { _id },
      {
        $set: {
          // Action to update the comment
        },
      },
      { new: true }
    );
    res.status(200).json({ message: post });
  } catch (error) {
    res.status(400).json({ error: error.message });
  }
}
I figured selecting the right index of the comment was a good start, but how would I, in the $set method, select the correct Comment object, and then update the message property?
You have to find the right data in the database and update the required property. You can do it with the following method:
exports.updateMessage = async (_id, objectId, newMessage) => {
  return await TableName.updateOne(
    { _id: _id, comments: { $elemMatch: { _id: objectId } } },
    { $set: { "comments.$.message": newMessage } }
  );
};
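Here the positional $ operator updates the first comments element matched by the $elemMatch condition in the filter. Alternatively, arrayFilters can target the matching element: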
Post.updateOne(
  { _id },
  {
    $set: {
      "comments.$[elem].message": "the value you want to set for message"
    }
  },
  {
    arrayFilters: [
      {
        "elem._id": 1 // _id of the comment object you want to edit
      }
    ]
  }
)
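Applied to the editComment handler from the question, the arrayFilters variant might look like the sketch below (it assumes req.body carries the comment id and the new message, which the original snippet does not show):
export const editComment = async (req, res) => {
  // Assumed request shape: { commentId, message }
  const { commentId, message } = req.body;
  try {
    const result = await Post.updateOne(
      { _id: req.params.id },
      { $set: { "comments.$[elem].message": message } },
      { arrayFilters: [{ "elem._id": commentId }] }
    );
    res.status(200).json({ result });
  } catch (error) {
    res.status(400).json({ error: error.message });
  }
};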

Only process 500 lines/row at a time createReadStream

I have to read a really large CSV file, so I searched Google and got to know about createReadStream. I am using a program that reads the CSV file data and inserts it into MongoDB.
The process I am following:
Process the data using createReadStream (I think it reads the file line by line).
Store the data in an array.
Insert the data into MongoDB using insertMany.
Now the problem is that the whole file first gets stored in an array and only then do I insert it into the database.
But I think the better approach would be to store only the first 500 lines/rows in an array, insert them into the DB, and then follow the same steps for the next 500 records.
Is it possible to achieve this? And is it the right way to do this?
My program:
const test = async () => {
  const stream = fs.createReadStream(workerData)
    .pipe(parse())
    .on('data', async function(csvrow) {
      try {
        stream.pause()
        if (!authorName.includes(csvrow.author)) {
          const author = new Author({author: csvrow.author})
          authorId = author._id
          authorName.push(author.author)
          authorData.push(author)
        }
        if (!companyName.includes(csvrow.company_name)) {
          const company = new Company({companyName: csvrow.company_name})
          companyID = company._id
          companyName.push(company.companyName)
          companyData.push(company)
        }
        users = new User({
          name: csvrow.firstname,
          dob: csvrow.dob,
          address: csvrow.address,
          phone: csvrow.phone,
          state: csvrow.state,
          zip: csvrow.zip,
          email: csvrow.email,
          gender: csvrow.gender,
          userType: csvrow.userType
        })
        userData.push(users)
        book = new Book({
          book_number: csvrow.book_number,
          book_name: csvrow.book_name,
          book_desc: csvrow.book_desc,
          user_id: users._id,
          author_id: authorId
        })
        bookData.push(book)
        relationalData.push({
          username: users.name,
          author_id: authorId,
          book_id: book._id,
          company_id: companyID
        })
      } finally {
        stream.resume()
      }
    })
    .on('end', async function() {
      try {
        Author.insertMany(authorData)
        User.insertMany(userData)
        Book.insertMany(bookData)
        Company.insertMany(companyData)
        await Relational.insertMany(relationalData)
        parentPort.postMessage("true")
      } catch(e) {
        console.log(e)
        parentPort.postMessage("false")
      }
    })
}
test()
This program is working fine and inserting the data into the DB, but I am looking for something like this:
const stream = fs.createReadStream(workerData)
  .pipe(parse())
  .on('data', async function(csvrow, maxLineToRead: 500) {
    // whole code/logic of insert data into DB
  })
So maxLineToRead is my imaginary term.
Basically, my point is that I want to process 500 rows at a time, insert them into the DB, and repeat this process till the end.
You can create a higher scoped array variable where you accumulate rows of data as they arrive on the data event. When you get to 500 rows, fire off your database operation to insert them. If not yet at 500 rows, then just add the next one to the array and wait for more data events to come.
Then, in the end event insert any remaining rows still in the higher scoped array.
In this way, you will insert 500 at a time and then however many are left at the end. This has an advantage over inserting everything at the end: you spread the database load out over the time you spend parsing.
Here's an attempt to implement that type of processing. There are some unknowns (documented with comments) based on an incomplete description of exactly what you're trying to accomplish in some circumstances:
const test = () => {
  return new Promise((resolve, reject) => {
    const accumulatedRows = [];

    async function processRows(rows) {
      // initialize data arrays that we will insert
      const authorData = [],
        companyData = [],
        userData = [],
        bookData = [],
        relationalData = [];

      // this code still has a problem that I don't have enough context
      // to know how to solve
      // If authorName contains csvrow.author, then the variable
      // authorId is not initialized, but is used later in the code
      // This is a problem that needs to be fixed.
      // The same issue occurs for companyID

      for (let csvrow of rows) {
        let authorId, companyID;
        if (!authorName.includes(csvrow.author)) {
          const author = new Author({ author: csvrow.author })
          authorId = author._id
          authorName.push(author.author)
          authorData.push(author)
        }
        if (!companyName.includes(csvrow.company_name)) {
          const company = new Company({ companyName: csvrow.company_name })
          companyID = company._id
          companyName.push(company.companyName)
          companyData.push(company)
        }
        let users = new User({
          name: csvrow.firstname,
          dob: csvrow.dob,
          address: csvrow.address,
          phone: csvrow.phone,
          state: csvrow.state,
          zip: csvrow.zip,
          email: csvrow.email,
          gender: csvrow.gender,
          userType: csvrow.userType
        });
        userData.push(users)
        let book = new Book({
          book_number: csvrow.book_number,
          book_name: csvrow.book_name,
          book_desc: csvrow.book_desc,
          user_id: users._id,
          author_id: authorId
        });
        bookData.push(book)
        relationalData.push({
          username: users.name,
          author_id: authorId,
          book_id: book._id,
          company_id: companyID
        });
      }

      // all local arrays of data are populated now for this batch
      // so add this data to the database
      await Author.insertMany(authorData);
      await User.insertMany(userData);
      await Book.insertMany(bookData);
      await Company.insertMany(companyData);
      await Relational.insertMany(relationalData);
    }

    const batchSize = 500;

    const stream = fs.createReadStream(workerData)
      .pipe(parse())
      .on('data', async function(csvrow) {
        try {
          accumulatedRows.push(csvrow);
          if (accumulatedRows.length >= batchSize) {
            stream.pause();
            await processRows(accumulatedRows);
            // clear out the rows we just processed
            accumulatedRows.length = 0;
            stream.resume();
          }
        } catch (e) {
          // calling destroy(e) will prevent leaking a stream
          // and will trigger the error event to be called with that error
          stream.destroy(e);
        }
      }).on('end', async function() {
        try {
          await processRows(accumulatedRows);
          resolve();
        } catch (e) {
          reject(e);
        }
      }).on('error', (e) => {
        reject(e);
      });
  });
}

test().then(() => {
  parentPort.postMessage("true");
}).catch(err => {
  console.log(err);
  parentPort.postMessage("false");
});
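If pausing and resuming the stream by hand feels error-prone, an alternative sketch (reusing workerData, parse(), and the processRows() helper from above) is to iterate the stream with for await...of, which applies backpressure automatically while an insert is in flight:
async function testIterate() {
  const batchSize = 500;
  const accumulatedRows = [];
  const stream = fs.createReadStream(workerData).pipe(parse());

  // The next record is not read until the awaited insert finishes
  for await (const csvrow of stream) {
    accumulatedRows.push(csvrow);
    if (accumulatedRows.length >= batchSize) {
      await processRows(accumulatedRows);
      accumulatedRows.length = 0;
    }
  }

  // Insert whatever is left after the last full batch
  if (accumulatedRows.length > 0) {
    await processRows(accumulatedRows);
  }
}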

CosmosDB + MongoAPI, updating document workaround?

I've been trying to simply update a CosmosDB document via the MongoDB API in my Node application. I've been testing it inside and out; there are no errors, but the value does not update no matter what.
I know updating array elements is not supported, which is fine, but this is a top-level key-value pair. Changes simply don't happen, with no error whatsoever.
I've been following the Mean.js project, which uses CosmosDB + Mongoose + Node + Angular, looking at the API for updating a hero and trying some of that code, but it still doesn't update.
I've been reading the documentation trying to figure out the default way of handling CRUD operations within CosmosDB and which parts of the Mongo API it supports, but so far no luck.
For test purposes, I'm using this code:
async function updateUser(id) {
  try {
    let user = await User.findById(id);
    console.log(id);
    console.log(user);
    if (!user) return
    user.id = id
    user.firstName = 'ASDASDASASDASDASDASDASDA'
    const result = await user.save()
    console.log(result);
  } catch(err) {
    console.log("There was an error updating user", err);
  }
}
So, I've been playing around some more and managed to update a hero using this code:
updateHero('10')

async function updateHero(id) {
  const originalHero = {
    uid: id,
    name: 'Hero2',
    saying: 'nothing'
  };
  Hero.findOne({ uid: id }, (error, hero) => {
    hero.name = originalHero.name;
    hero.saying = originalHero.saying;
    hero.save(error => {
      console.log('Hero updated successfully!');
      return (hero);
    });
  });
}
Now I'm just not sure why this has actually worked and why it hasn't before. The main thing that is different is that I'm using an 'uid' instead of the actual ID assigned by CosmosDB.
I tested the sample code you provided and both snippets updated the document successfully.
Sample document:
Snippet One:
updateUser('5b46eb0ee1a2f12ea0af307f')
async function updateUser(id) {
  try {
    let user = await Family.findById(id);
    console.log(id);
    console.log(user);
    if (!user) return
    user.id = id
    user.name = 'ASDASDASASDASDASDASDASDA'
    const result = await user.save()
    console.log(result);
  } catch(err) {
    console.log("There was an error updating user", err);
  }
}
Output One:
Snippet Two:
updateFamily('5b46eb0ee1a2f12ea0af307f')
async function updateFamily(id) {
  const updateFamily = {
    _id: id,
    name: 'ABCD',
  };
  Family.findOne({ _id: id }, (error, family) => {
    family.name = updateFamily.name;
    family.save(error => {
      console.log(JSON.stringify(family));
      console.log('Hero updated successfully!');
      return (family);
    });
  });
}
Output Two:
In addition, you could use db.collection.update() to update the document:
db.families.update(
  { _id: '5b46eb0ee1a2f12ea0af307f' },
  { $set: { name: 'AAAA' } }
)
For more details, please refer to the doc: https://docs.mongodb.com/manual/reference/method/db.collection.update/
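For reference, the equivalent through Mongoose would be an updateOne call with $set (a sketch using the Family model from the snippets above):
// Updates a single top-level field without touching the rest of the document
await Family.updateOne(
  { _id: '5b46eb0ee1a2f12ea0af307f' },
  { $set: { name: 'AAAA' } }
);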
Hope it helps you.
