mongodb changestream “pipeline” not working in nodejs

I have the following change stream, but it does not work: 'changed' is never logged when I update the document using MongoDB Compass.
var pipeline = [
  { $match: { _id: ObjectId(id) } }
];
try {
  const collection = client.db("mydb").collection("shop");
  const changeStream = collection.watch(pipeline);
  changeStream.on('change', (next) => {
    //console.log(next);
    console.log('changed')
  }, err => {
    console.log(err);
  });
} catch (err) {
  console.log(err)
}

Is the problem that you don't normally update the _id of a document in a collection? If, for some reason, you are updating the _id, then maybe the problem is in how you're referencing it in your $match. This works for me:
const pipeline01 = [
  { $match: { 'updateDescription.updatedFields.fieldIamInterestedIn': { $ne: undefined } } },
  { $project: { 'fullDocument._id': 1, 'fullDocument.anotherFieldIamInterestedIn': 1 } },
];
theCollectionIamWatching.watch(pipeline01, { fullDocument: 'updateLookup' }).on('change', async (data) => {
  // do the thing I want to do using data.fullDocument
});
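For reference, the $match in a change stream pipeline runs against the change event document (with fields like operationType, documentKey, updateDescription and fullDocument), not against the stored document, and the event's top-level _id is its resume token. So to watch a single document by its id, a sketch like the following should work (assuming id is a valid ObjectId string; the other names are taken from the question):

// Hedged sketch: match on documentKey._id, which holds the _id of the changed document
const pipeline = [
  { $match: { 'documentKey._id': new ObjectId(id) } }
];
const changeStream = client.db("mydb").collection("shop").watch(pipeline);
changeStream.on('change', (change) => {
  console.log('changed', change.operationType);
});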

toArray is not a function in mongodb,mongoose

getCartProducts: (userId) => {
  return new Promise(async (resolve, reject) => {
    let cartItems = await db.cart.aggregate([
      {
        $match: { user: userId } // matched with the id
      },
      {
        $lookup: {
          from: "db.products",
          let: { proList: '$products' },
          pipeline: [
            {
              $match: {
                $expr: {
                  $in: ["$_id", '$$proList']
                }
              }
            }
          ],
          as: 'cartItems' // converted as cartItems name
        }
      }
    ]).toArray()
    resolve(cartItems)
  })
}
The error is: db.cart.aggregate(...).toArray is not a function
I tried removing the toArray() call, but then the result is undefined.
The issue is that you're calling toArray() on a promise.
It should be something like this; you don't need to create a custom promise:
const getCartItems = async () => {
  // ...
  const items = await db.cart.aggregate([...]);
  return items.toArray();
  // ...
};
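For context, a sketch of how the whole getCartProducts function could look without the custom Promise wrapper, assuming db.cart is a native MongoDB driver collection (so aggregate() returns a cursor whose toArray() resolves to an array); the pipeline is copied from the question:

// Hypothetical rewrite without the explicit Promise wrapper
getCartProducts: async (userId) => {
  const cartItems = await db.cart.aggregate([
    { $match: { user: userId } },
    {
      $lookup: {
        from: "products", // assuming the collection is called "products"; $lookup takes a bare collection name
        let: { proList: "$products" },
        pipeline: [
          { $match: { $expr: { $in: ["$_id", "$$proList"] } } }
        ],
        as: "cartItems"
      }
    }
  ]).toArray(); // toArray() is called on the aggregation cursor, and the whole chain is awaited
  return cartItems;
}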

Getting an {"message":"Invalid update pipeline operator: \"_id\""} error

I am trying to update two rows in my players table based on the id. I am using the updateMany method, matching where the id can be found in an array of ids, but I am getting the {"message":"Invalid update pipeline operator: \"_id\""} error. I checked the array to make sure it contains valid ids. Here is my code:
const winningTeam = asyncHandler(async (req, res) => {
  req.body.forEach((element) => {
    element.wins += 1;
    element.lastPlayed = Date.now();
    element.percentage = (element.wins / (element.wins + element.wins)) * 1000;
  });
  let usersId = [];
  usersId.push(req.body[0]._id);
  if (req.body.length === 2) {
    usersId.push(req.body[1]._id);
  }
  const player = await Player.updateMany({ _id: { $in: usersId } }, req.body);
  if (player) {
    res.status(200).json(player);
  } else {
    res.status(400);
    throw new Error("Invalid Data");
  }
});
You should use the $set operator in the update parameter. I'm not sure about the structure of your req.body, but it should be something like this:
Player.updateMany({ _id: { $in: usersId } }, {$set: req.body});
instead of this:
Player.updateMany({ _id: { $in: usersId } }, req.body);
Take a look at the docs for updateMany.
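For illustration only, if every matched player should receive the same values, a hedged sketch of the call could look like this (the field names are just taken from the question's forEach loop and may not match the real schema):

// Hypothetical: one common update applied to all matched players.
// updateMany applies a single update document to every matched _id,
// so per-player values would need separate updateOne calls or a bulkWrite.
const update = {
  $inc: { wins: 1 },                 // let MongoDB increment instead of precomputing on the client
  $set: { lastPlayed: Date.now() },
};
const player = await Player.updateMany({ _id: { $in: usersId } }, update);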

node mongoose how to auto increment

Trying to follow the example here:
https://www.tutorialspoint.com/mongodb/mongodb_autoincrement_sequence.htm
export interface RowProps {
  id?: number; // This is to auto increment
  todoText: string;
}

const addAutoIncrement = async ({ db, collectionName, todoText }) => {
  const getNextSequenceValue = (sequenceName: string) => {
    const sequenceDocument = db
      .collection<RowProps>(collectionName)
      .findAndModify({
        query: { _id: sequenceName },
        update: { $inc: { sequence_value: 1 } },
        new: true,
      });
    console.log('sequenceD', sequenceDocument)
    return sequenceDocument.sequence_value;
  };

  db.collection<RowPropsClient>(collectionName).insertOne(
    {
      id: getNextSequenceValue('id'),
      todoText
    },
    (err) => {
      if (err) {
        console.log("err");
      }
    }
  );
}

// db is already defined and works
// I can add to the collection so this also works.
addAutoIncrement({ db, collectionName: 'todos', todoText: 'hello' });
Error: throw new Error('Collection#findAndModify unimplemented by driver');
^
Error: Collection#findAndModify unimplemented by driver
Update:
I tried to follow this example:
https://medium.com/@salonimalhotra1ind/how-to-increment-a-number-value-in-mongoose-785066ba09d8
const addAutoIncrement = async ({ db, collectionName, todoText }) => {
  const modelTodo = db.model(collectionName, TodosSchema);
  const res = await new modelTodo({ todoText }).save();
  const { _id } = res;
  return new Promise((resolve, reject) => {
    modelTodo.findOneAndUpdate(
      { _id },
      { $inc: { id: 1 } },
      { new: true },
      (err, res) => {
        if (err) {
          reject(err);
        }
        resolve(res);
      }
    );
  });
};
The result is just setting the value to 1 each time, not incrementing.
Collection#findAndModify() is a method that is implemented in the MongoDB shell, but not in the Node.js driver.
You should use Collection#findOneAndUpdate instead:
const { value: sequenceDocument } = await db
  .collection<RowProps>(collectionName)
  .findOneAndUpdate(
    { _id: sequenceName },
    { $inc: { sequence_value: 1 } },
    { returnDocument: 'after' } // equivalent to `new: true`
  );
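Putting it together, a minimal async sketch of the counter pattern (assuming a separate counters collection for the sequences, and a driver version where findOneAndUpdate resolves to { value }; newer drivers may return the document directly):

// Hypothetical helper: one counter document per sequence in a "counters" collection
const getNextSequenceValue = async (db, sequenceName) => {
  const { value } = await db.collection('counters').findOneAndUpdate(
    { _id: sequenceName },
    { $inc: { sequence_value: 1 } },
    { upsert: true, returnDocument: 'after' } // create the counter on first use
  );
  return value.sequence_value;
};

const addAutoIncrement = async ({ db, collectionName, todoText }) => {
  const id = await getNextSequenceValue(db, 'todo_id');
  return db.collection(collectionName).insertOne({ id, todoText });
};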
OK, I don't know why I didn't do this before. All the online examples make everything unnecessarily complicated.
Just get the total count and then add it.
const addAndIncrement = async ({ db, collection, todoText }) => {
  const connectedModel = db.model(collection, TodosSchema);
  const documentCount = await connectedModel.count({}); // get the total count
  return new connectedModel({ todoText, id: documentCount }).save();
};
Unless anyone comes up with a more performant way, this is what I'm going with.

When I make db process inside async map function, I can't avoid duplicate

I want to add a pallet barcode to the palletBarcodes field of a record, but there is a check to avoid adding the same palletBarcode twice. I am using the function below, but the check does not work inside the async map function.
myService.js
const palletBarcodes = ["TP2", "TP2"]
await Promise.all(palletBarcodes.map(async (palletBarcode) => {
  const promise = await this.addPalletBarcode({ transferId, barcode: palletBarcode });
  return promise;
}));

async addPalletBarcode({ transferId, barcode, pickerId }) {
  const { TransferDataAccess } = this;
  const transfer = await TransferDataAccess.getTransfer({ transferId });
  if (!transfer) {
    throw new TransferNotFoundError();
  }
  if (transfer.palletBarcodes.length && transfer.palletBarcodes.includes(barcode)) {
    throw new PalletBarcodeAlreadyExistsError({ barcode });
  }
  return TransferDataAccess.pushPalletBarcode({ transferId, barcode });
}
transferDataAccess:
async pushPalletBarcode({ transferId, barcode }) {
  const { TransferModel } = this;
  return TransferModel
    .findOneAndUpdate(
      {
        _id: transferId,
      },
      {
        $push: {
          palletBarcodes: barcode,
        },
      })
    .lean()
    .exec();
}
Instead of $push, use $addToSet. $addToSet treats the field in your document as a set, which automatically avoids duplicates.
Your query would then become:
TransferModel.findOneAndUpdate(
  { _id: transferId },
  { $addToSet: { palletBarcodes: barcode } }
);
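Applied to the pushPalletBarcode method from the question, only the operator changes; roughly:

async pushPalletBarcode({ transferId, barcode }) {
  const { TransferModel } = this;
  return TransferModel
    .findOneAndUpdate(
      { _id: transferId },
      // $addToSet only appends the barcode if it is not already in the array
      { $addToSet: { palletBarcodes: barcode } }
    )
    .lean()
    .exec();
}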

nodejs mongoose bulk update

I have a collection of documents and I need to add a new field to every document. If I run a query to get all documents and then update every single one, Node.js stops, maybe because of a memory leak.
This is my code
var express = require('express');
var geocoderProvider = 'google';
var httpAdapter = 'http';
var People = require("./models/people").collection.initializeOrderedBulkOp();
var app = express();
var geocoder = require('node-geocoder').getGeocoder(geocoderProvider, httpAdapter, {});

app.get('/', function (req, res) {
  People.find({}, function (err, docs) {
    if (err) {
      res.send(err);
    } else {
      docs.forEach(function (doc) {
        geocoder.geocode({ address: doc.address, country: 'Italy', zipcode: doc.cap }, function (error, value) {
          doc.loc.coordinates[0] = value[0].latitude;
          doc.loc.coordinates[1] = value[0].longitude;
          People.update({ _id: doc._id }, { $set: { loc: doc.loc } }, { multi: true }, function (error) {
            if (error) {
              console.error('ERROR!');
            }
          });
        });
      });
    }
  });
});

var server = app.listen(3000, function () {
  var host = server.address().address
  var port = server.address().port
  console.log('Example app listening at http://%s:%s', host, port)
});
Is there any way to bulk update with mongoose?
Thanks in advance.
Here is more detailed info about the query and the update.
var bulk = People.collection.initializeOrderedBulkOp();
bulk.find(query).update(update);
bulk.execute(function (error) {
  callback();
});
Here the query searches with an array, and the update needs a $set:
var bulk = People.collection.initializeOrderedBulkOp();
bulk.find({ '_id': { $in: [] } }).update({ $set: { status: 'active' } });
bulk.execute(function (error) {
  callback();
});
Here the query searches by a single id:
var bulk = People.collection.initializeOrderedBulkOp();
bulk.find({ '_id': id }).update({ $set: { status: 'inactive' } });
bulk.execute(function (error) {
  callback();
});
You can drop down to the collection level and do a bulk update. This action will not be atomic - some of the writes can fail and others might succeed - but it will allow you to make these writes in a single round trip to your database.
It looks like this:
var bulk = People.collection.initializeUnorderedBulkOp();
bulk.find({<query>}).update({<update>});
bulk.find({<query2>}).update({<update2>});
...
bulk.execute(function(err) {
  ...
});
Check out the docs here: http://docs.mongodb.org/manual/core/bulk-write-operations/
This example shows all the cases that we can mix together when using the Mongoose bulkWrite() function directly:
Character.bulkWrite([
  {
    insertOne: {
      document: {
        name: 'Eddard Stark',
        title: 'Warden of the North'
      }
    }
  },
  {
    updateOne: {
      filter: { name: 'Eddard Stark' },
      // If you were using the MongoDB driver directly, you'd need to do
      // `update: { $set: { title: ... } }` but mongoose adds $set for
      // you.
      update: { title: 'Hand of the King' }
    }
  },
  {
    deleteOne: {
      filter: { name: 'Eddard Stark' }
    }
  }
]).then(res => {
  // Prints "1 1 1"
  console.log(res.insertedCount, res.modifiedCount, res.deletedCount);
});
Official Documentation: https://mongoosejs.com/docs/api.html#model_Model.bulkWrite
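Applied to the original geocoding question, a rough sketch could collect the updates first and then issue a single bulkWrite (this assumes People is a regular Mongoose model and that geocoder.geocode can be awaited, which newer node-geocoder versions support):

// Hypothetical: geocode every document, then send all updates in one batch
const updateAllCoordinates = async () => {
  const docs = await People.find({}).lean();
  const ops = [];
  for (const doc of docs) {
    const value = await geocoder.geocode({ address: doc.address, country: 'Italy', zipcode: doc.cap });
    ops.push({
      updateOne: {
        filter: { _id: doc._id },
        update: { $set: { 'loc.coordinates': [value[0].latitude, value[0].longitude] } },
      },
    });
  }
  if (ops.length) {
    await People.bulkWrite(ops); // one round trip for all the updates
  }
};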
