How to pass array instead of required string NodeJS - node.js

Script logic: the script receives data from the Binance API, then I use an $avg aggregation to calculate the average of one asset. I will have more than one collection, so I need to calculate the average of every asset.
I have an array where I store collection names for MongoDB.
const symbols = ["ADABTC", "AEBTC", "AIONBTC"]
And I want to calculate average from MongoDB collection.
const collection = db.collection(symbols);
Here symbols doesn't work for me, but if I simply pass "ADABTC" it works. That doesn't solve my problem, though, since I want to use the different collection names one after another.
How can I pass an array where a string is required? I need to use more than one collection name.
FULL CODE
const MongoClient = require('mongodb').MongoClient;
const assert = require('assert');
// Connection URL
const url = 'mongodb://username:password#serveripadress:port/dbname?retryWrites=true&w=majority';
const symbols = ["ADABTC", "AEBTC", "AIONBTC"]
// Database Name
const dbName = 'Crypto';
// Create a new MongoClient
const client = new MongoClient(url, { useUnifiedTopology: true });
// Use connect method to connect to the Server
client.connect(function(err, client) {
  assert.equal(null, err);
  console.log("Connected correctly to server");

  const db = client.db(dbName);

  simplePipeline(db, function() {
    client.close();
  });
});

function simplePipeline(db, callback) {
  const collection = db.collection(symbols);
  collection.aggregate(
    [{
      '$group': {
        _id: null,
        'Volume': {
          '$avg': '$Volume'
        }
      }
    }],
    function(err, cursor) {
      assert.equal(err, null);
      cursor.toArray(function(err, documents) {
        console.log(documents);
        callback(documents);
      });
    }
  );
}

It is not possible to pass an array into a function that expects a string. In your case, what you need to do is join the three collections. If you need to aggregate across multiple collections, you can use the $lookup aggregation pipeline stage. You can connect using the first collection:
db.collection(symbols) => db.collection(symbols[0])
Then modify your query to join the three collections:
// Join with AEBTC collection
{
  $lookup: {
    from: symbols[1],
    localField: // name of a value in the first collection
    foreignField: // name of the same value in the second collection
    as: // what you want to call it in the second table
  }
},
{ $unwind: // what you called it },
// Join with AIONBTC collection
{
  $lookup: {
    from: symbols[2],
    localField: // name of the value in joined collections 1 and 2
    foreignField: // name of that value in collection 3
    as: // whatever you want to call it in the joined collection
  }
},
{ $unwind: // what you called it },
// define some conditions here
{
  $match: {}
},
// define which fields you want to fetch
{
  $group: {
    _id: null,
    'Volume': {
      '$avg': '$Volume'
    }
  }
}
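To make the shape of that pipeline a bit more concrete, here is a minimal sketch of what it could look like once filled in and dropped into simplePipeline from the question. The join key OpenTime and the aliases ae and aion are purely illustrative assumptions; the real field names depend on how the Binance data was written into each collection.
// Illustrative only: join the three symbol collections on an assumed shared
// "OpenTime" field, then average the Volume values of the first collection.
const collection = db.collection(symbols[0]); // "ADABTC"
collection.aggregate([
  {
    $lookup: {
      from: symbols[1],         // "AEBTC"
      localField: 'OpenTime',   // assumed join key
      foreignField: 'OpenTime',
      as: 'ae'
    }
  },
  { $unwind: '$ae' },
  {
    $lookup: {
      from: symbols[2],         // "AIONBTC"
      localField: 'OpenTime',
      foreignField: 'OpenTime',
      as: 'aion'
    }
  },
  { $unwind: '$aion' },
  {
    $group: {
      _id: null,
      'Volume': { '$avg': '$Volume' }
    }
  }
], function(err, cursor) {
  cursor.toArray(function(err, documents) {
    console.log(documents);
    callback(documents);
  });
});
Note that $lookup's from expects a single collection name (a string), which is why the individual elements of symbols are used here rather than the array itself.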

Related

MongoDB - find one and add a new property

Background: I'm developing an app that shows analytics for inventory management.
It gets an Excel file uploaded, and as the file uploads the app converts it to an array of JSON objects. Then it compares each JSON object with the objects in the DB, changes its quantity according to the XLS file, and adds a timestamp to the stamps array, which contains the changes in quantity.
For example:
{"_id":"5c3f531baf4fe3182cf4f1f2",
"sku":123456,
"product_name":"Example",
"product_cost":10,
"product_price":60,
"product_quantity":100,
"Warehouse":4,
"stamps":[]
}
After the XLS upload, let's say we sold 10 units, it should look like this:
{"_id":"5c3f531baf4fe3182cf4f1f2",
"sku":123456,
"product_name":"Example",
"product_cost":10,
"product_price":60,
"product_quantity":90,
"Warehouse":4,
"stamps":[{"1548147562": -10}]
}
Right now I can't find the right commands for MongoDB to do it. I'm developing in Node.js and Angular, and would love to read some ideas.
for (let i = 0; i < products.length; i++) {
  ProductsDatabase.findOneAndUpdate(
    {"_id": products[i]['id']},
    //CHANGE QUANTITY AND ADD A STAMP
    ...
}
You would need two operations here. The first is to get an array of documents from the DB that match the ones in the JSON array. From that list you compare the 'product_quantity' keys and, if there is a change, create a new array of objects with the product id and the change in quantity.
The second operation is an update which uses this new array with the change in quantity for each matching product.
Armed with this new array of updated product properties, it would be ideal to use a bulk update, as looping through the list and sending each update request to the server can be computationally costly.
Consider using the bulkWrite method, which is on the model. This accepts an array of write operations and executes each of them; a typical update operation for your use case would have the following structure:
{ updateOne :
  {
    "filter" : <document>,
    "update" : <document>,
    "upsert" : <boolean>,
    "collation": <document>,
    "arrayFilters": [ <filterdocument1>, ... ]
  }
}
So your operations would follow this pattern:
(async () => {
  let bulkOperations = []
  const ids = products.map(({ id }) => id)
  const matchedProducts = await ProductDatabase.find({
    '_id': { '$in': ids }
  }).lean().exec()

  for (let product of products) {
    // _id comes back as an ObjectId, so compare it as a string
    const [matchedProduct, ...rest] = matchedProducts.filter(p => p._id.toString() === product.id)
    const { _id, product_quantity } = matchedProduct
    const changeInQuantity = product.product_quantity - product_quantity
    if (changeInQuantity !== 0) {
      const stamps = { [(new Date()).getTime()]: changeInQuantity }
      bulkOperations.push({
        'updateOne': {
          'filter': { _id },
          'update': {
            '$inc': { 'product_quantity': changeInQuantity },
            '$push': { stamps }
          }
        }
      })
    }
  }

  const bulkResult = await ProductDatabase.bulkWrite(bulkOperations)
  console.log(bulkResult)
})()
You can use mongoose's findOneAndUpdate to update the existing value of a document.
"use strict";
const ids = products.map(x => x._id);
let operations = products.map(xlProductData => {
return ProductsDatabase.find({
_id: {
$in: ids
}
}).then(products => {
return products.map(productData => {
return ProductsDatabase.findOneAndUpdate({
_id: xlProductData.id // or product._id
}, {
sku: xlProductData.sku,
product_name: xlProductData.product_name,
product_cost: xlProductData.product_cost,
product_price: xlProductData.product_price,
Warehouse: xlProductData.Warehouse,
product_quantity: productData.product_quantity - xlProductData.product_quantity,
$push: {
stamps: {
[new Date().getTime()]: -1 * xlProductData.product_quantity
}
},
updated_at: new Date()
}, {
upsert: false,
returnNewDocument: true
});
});
});
});
Promise.all(operations).then(() => {
console.log('All good');
}).catch(err => {
console.log('err ', err);
});

How to watch for changes to specific fields in MongoDB change stream

I am using the node driver for mongodb to initiate a change stream on a document that has lots of fields that update continuously (via some logic on the insert/update end that calls $set with only the fields that changed), but I would like to watch only for changes to a specific field. My current attempt at this is below but I just get every update even if the field isn't part of the update.
I think the "updateDescription.updatedFields" is what I am after but the code I have so far just gives me all the updates.
What would the proper $match filter look like to achieve something like this? I thought maybe checking if it's $gte:1 might be a hack to get it to work but I still just get every update. I've tried $inc to see if the field name is in "updatedFields" as well but that didn't seem to work either.
const MongoClient = require('mongodb').MongoClient;
const uri = 'mongodb://localhost:27017/?replicaSet=rs0';
MongoClient.connect(uri, function(err, client) {
  const db = client.db('mydb');
  // Connect using MongoClient
  var filter = {
    $match: {
      "updateDescription.updatedFields.SomeFieldA": { $gte: 1 },
      operationType: 'update'
    }
  };
  var options = { fullDocument: 'updateLookup' };
  db.collection('somecollection').watch(filter, options).on('change', data => {
    console.log(new Date(), data);
  });
});
So I figured this out...
For anyone else interested: my "pipeline" (filter, in my example) needs to be an array.
This works:
const MongoClient = require('mongodb').MongoClient;
const uri = 'mongodb://localhost:27017/?replicaSet=rs0';

MongoClient.connect(uri, function(err, client) {
  const db = client.db('mydb');
  // Connect using MongoClient
  var filter = [{
    $match: {
      $and: [
        { "updateDescription.updatedFields.SomeFieldA": { $exists: true } },
        { operationType: "update" }
      ]
    }
  }];
  var options = { fullDocument: 'updateLookup' };
  db.collection('somecollection').watch(filter, options).on('change', data => {
    console.log(new Date(), data);
  });
});
I'm looking for something similar, but from the blog post at https://www.mongodb.com/blog/post/an-introduction-to-change-streams, it looks like you might need to change your filter to:
var filter = {
  $match: {
    $and: [
      { "updateDescription.updatedFields.SomeFieldA": { $exists: true } },
      { operationType: 'update' }
    ]
  }
};
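If the goal is also to trim the event payload down to the watched field, change streams accept a $project stage in the same pipeline. The sketch below keeps the array form from the accepted answer; which fields are worth keeping (documentKey, operationType, the updated field) is an assumption for illustration, not something from the answers above.
// Hypothetical refinement: match on the field as above, then project the
// event down to the fields of interest (_id, the resume token, stays by default).
var filter = [
  {
    $match: {
      "updateDescription.updatedFields.SomeFieldA": { $exists: true },
      operationType: "update"
    }
  },
  {
    $project: {
      documentKey: 1,
      operationType: 1,
      "updateDescription.updatedFields.SomeFieldA": 1
    }
  }
];

db.collection('somecollection').watch(filter)
  .on('change', data => console.log(new Date(), data));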

How to count the number of key value pairs in a mongodb document using node.js?

My document looks like the following:
{
"Erow1": "funnnyyyy hahaha",
"Erow2": "Prefer a public role",
"Erow3": "Can sometimes be easily distracted",
"Erow4": "I'm awesome"
}
I need to know the number of elements inside the document. For example, here there are 4 key/value pairs.
I'm retrieving the doc on my nodejs server like this
app.get("/editMBTI", function editMBTIFunc(req, res)
{
MongoClient.connect(url, function (err, client) {
assert.equal(null, err);
console.log("Connected Successfully to the Database server");
const db = client.db(dbName);
//getting the whole collection of MBTI sets
var cursor= db.collection("mbti_testcontent").find();
cursor.toArray(function (err, doc) {
assert.equal(err, null);
console.log(doc[0]);
// get the count of the key-value pairs present in doc[0] here.
});
});
});
You can use the aggregation framework to do that:
db.col.aggregate([
  {
    $project: {
      numberOfKeys: {
        $let: {
          vars: { array: { $objectToArray: "$$ROOT" } },
          in: { $add: [{ $size: "$$array" }, -1] }
        }
      }
    }
  }
])
For each document we're doing a projection which will return _id and the number of keys. Operators used:
$let defines a temporary variable.
The temporary variable array contains all the keys and values of the entire document ($$ROOT is a special variable representing the whole doc); to get those keys and values we use $objectToArray.
$size measures the length of an array; we also subtract 1 for _id if you don't want to include it in the result.
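Since the question uses the plain Node driver, here is a sketch of how the same pipeline might be dropped into the existing route in place of find(); the collection name mbti_testcontent is taken from the question, everything else follows the answer above.
// Run the $objectToArray pipeline through the Node driver instead of find()
var cursor = db.collection("mbti_testcontent").aggregate([
  {
    $project: {
      numberOfKeys: {
        $let: {
          vars: { array: { $objectToArray: "$$ROOT" } },
          in: { $add: [{ $size: "$$array" }, -1] }
        }
      }
    }
  }
]);

cursor.toArray(function (err, docs) {
  assert.equal(err, null);
  // docs[0].numberOfKeys holds the count for the first document (4 in the example)
  console.log(docs[0].numberOfKeys);
});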

Nodejs: Create search query with reference collection field

My User collection model schema:
var userModel = new Schema({
  userAddress: { type: Object, ref: 'useraddress' },
  name: String,
});
My User addresses collection model schema:
var addressModel = new Schema({
  macAddress: String,
  repeat: Number,
});
Get data method is:
module.exports.get = function (req, res) {
  var _repeatTime = 2;
  var _searchQRY = [];
  _searchQRY.push({
    "useraddress.repeat": { $gte: _repeatTime }
  });
  userModel.find({ $and: _searchQRY }).populate('useraddress').exec(function (err, results) {
    res.json({ record: results })
  });
};
This is my code. I want to filter by the address repeat number, but I am not getting the correct result with this query.
Mongoose first performs the search on the users collection with the {"useraddress.repeat": {$gte: val}} query, and only after that call does it start population.
So you get 0 results, as the address is not yet populated.
Here are two ways of solving this. First, check out this answer please.
You'll need to:
// Any conditions that apply to not-yet-populated user collection documents
var userQuery = {};

userModel.find(userQuery)
  // Populate only if the condition is fulfilled
  .populate({
    path: 'useraddress',
    match: { repeat: { $gte: _repeatTime } }
  })
  .exec(function (err, results) {
    results = results.filter(function (doc) {
      // If not populated it will be null, so we filter those out
      return !!doc.useraddress;
    });
    // Do needed stuff here.
  });
The second way is to use aggregation and $lookup (you'll need MongoDB v3.2+). Basically it means moving this population and filtering to the DB level.
userModel
  .aggregate()
  // Anything applying to the users collection before population
  .match(userQuery)
  .lookup({
    from: 'address', // Please check the collection name here
    localField: 'useraddress',
    foreignField: '_id',
    as: 'useraddress'
  })
  // $lookup pushes the matches to an array; in our case it's 1:1, so we can unwind
  .unwind('useraddress')
  // Filter them as you want
  .match({ 'useraddress.repeat': { $gte: _repeatTime } })
  .exec(function (err, result) {
    // Get the result here.
  });

How to find the distinct values of a field in a MongoDB collection using Node MongoClient

In my MEAN application, I need to find the distinct values of a field named "DataFields" from my collection "Fixed_Asset_Register", using the (non-Mongoose) MongoClient:
var mongodb = require('mongodb');
var assert = require('assert');
var MongoClient = mongodb.MongoClient;
MongoClient.connect(url, function (err, db) {
  if (err) {
    console.log('Unable to connect to the mongoDB server. Error:', err);
  } else {
    console.log('Connection established to', url);
    var collection = db.collection('Fixed_Asset_Register');
    var distictDataFieldsValue = ?
  }
});
What is the proper syntax to get back the distinct values of the field named "DataFields"?
I solved this issue using collection.distinct:
collection.distinct("DataFields", function(err, docs) {
  console.log(docs);
  assert.equal(null, err);
  db.close();
});
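As a side note, on driver versions that return a promise when no callback is passed, the same call can also be written as below; this is just an alternative form under that assumption, not part of the original answer.
// Same distinct call, promise-based (assumes a promise-capable driver version)
collection.distinct("DataFields")
  .then(function (values) {
    console.log(values);
    db.close();
  })
  .catch(function (err) {
    console.error(err);
  });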
You could also do this by way of the aggregation framework. A suitable aggregation pipeline would start with a $group stage, which groups the documents by the DataFields key and creates an aggregated field count that stores the number of documents produced by this grouping. The accumulator operator for this is $sum.
The next pipeline stage is the $match operator, which then filters to the documents having a count of 1 to depict distinction.
The last pipeline operator, $group, then groups those distinct elements and adds them to a list by way of the $push operator.
In the end your aggregation would look like this:
var collection = db.collection('Fixed_Asset_Register'),
    pipeline = [
      {
        "$group": {
          "_id": "$DataFields",
          "count": { "$sum": 1 }
        }
      },
      {
        "$match": { "count": 1 }
      },
      {
        "$group": {
          "_id": 0,
          "distinct": { "$push": "$_id" }
        }
      }
    ];

// toArray is asynchronous, so read the result in its callback
collection.aggregate(pipeline).toArray(function (err, result) {
  assert.equal(null, err);
  var distictDataFieldsValue = result[0].distinct;
  console.log(distictDataFieldsValue);
});
