How to splice an object from array of objects? - node.js

I'm keeping an array of favorites on each user object.
I already have an insert request that adds a favorite recipe to the array.
The problem is that when I try to remove a favorite from the array,
it always removes the last object and not the exact object I want.
const recipe = await getRecipes(req.params.id); //gives the recipe object
let user = await User.findById(req.user._id); // gives the user object
console.log(recipe);
user.recipes.splice(user.recipes.indexOf(recipe), 1);
await user.save();
res.send(user);

The problem is that your call to indexOf, passing the recipe object, does not find the element in the array, so it returns -1. See how this code works:
let x = [{id: 1}, {id: 2}, {id: 3}]
let obj = {id: 2}
let i = x.indexOf(obj)
// i is -1 since obj isn't in the array.
// Another object that looks like obj is there,
// but they aren't the same exact object
console.log("i=",i)
// This will remove the last element, since splicing with -1 does that
x.splice(x.indexOf(obj), 1)
console.log(x)
// when the array has objects in it you can use `findIndex`
let y = [{id: 1}, {id: 2}, {id: 3}]
let j = y.findIndex(e => e.id === obj.id)
console.log("j=",j)
y.splice( j, 1 )
console.log(y)
So what you want to do is find a reliable way to get the index of the recipe in the array. See the second example for how you can find the index by looking inside the objects: Array.findIndex lets you compare objects in a way that's specific to the object structure.
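Applied to the code in the question, that could look something like the sketch below. It assumes each entry in user.recipes carries an _id matching recipe._id; the comparison goes through String() because two ObjectIds are never strictly equal.
// Sketch, assuming user.recipes holds objects with an _id matching recipe._id
const index = user.recipes.findIndex(r => String(r._id) === String(recipe._id));
if (index !== -1) {
  user.recipes.splice(index, 1);
  await user.save();
}
res.send(user);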

If you want to remove the object from the array, you can use the filter method like this (comparing by _id, since the recipe you fetched and the one stored in the array are not the same reference):
user.recipes = user.recipes.filter(r => {
  // keep every recipe except the one being removed
  return String(r._id) !== String(recipe._id);
})

Related

mongodb pull is not removing all items

I'm working on a project in Node.js using MongoDB as my database. I'm trying to get rid of elements within my array that have dates before today. The problem I'm having is that at most 5 elements are being deleted; I want all elements that meet this criteria to be deleted. Also, when I don't have `user.possible.pull(items._id); const result = await user.save()` in the loop, all elements that meet this criteria show up in my deletePossible array. However, when I do have `user.possible.pull(items._id); const result = await user.save()`, at most 5 are shown there as well.
In my database, my User document looks like:
_id: '',
name: '',
possible: [
  { date: "Tues Jan 10 2023", _id: "63c0b169b6fa12ac49874a13" },
  { date: "Wed Jan 11 2023", _id: "63c0b172b6fa12ac49874a32" },
  ...
]
My code:
const user = await User.findById(args.userId)
const deletePossible = [];
for (var items of user.possible) {
  if (+new Date(items.date) < +new Date().setHours) {
    deletePossible.push(items._id)
    user.possible.pull(items._id)
    const result = await user.save()
  }
}
console.log(deletePossible)
I've tried a number of things such as:
for (var item of deletePossible) {
  user.possible.pull(item)
  const result = await user.save()
}
following deletePossible.push(items._id), and
const userInfo = await User.updateOne( { _id: args.userId}, {possible:{$pull:[...deletePossible] }} )
which removes all of the objects from possible regardless of whether they are contained within deletePossible, and then adds a random _id. Nothing I have tried seems to work. Does anyone have any idea why this is happening and how to get this to work properly? I would really appreciate any help or advice. Thank you!
You can simply filter user.possible and save the updated User:
const user = await User.findById(args.userId);
if (!user) return;
// Change the condition based on your needs
user.possible = user.possible.filter(p => new Date(p.date) >= new Date());
await user.save();
The core of the issue appears to not be related to Mongo or Mongoose really, but is rather just a standard algorithmic logic problem.
Consider the following code, which iterates over an array, logs each element, and removes the third element when it arrives at it:
const array = [0, 1, 2, 3, 4];
for (const element of array) {
  console.log(element);
  if (element === 2) {
    array.splice(2, 1); // remove the element at index 2 from the array
  }
}
This code outputs:
0
1
2
4
Notice anything interesting? 3 has been skipped.
This happens because deleting an element from an array causes everything in front of it to move up a position. So if you're looking at 2 and you delete it, then 3 moves into 2's place, and 4 moves into 3's place. So then if you look at the next position, you're now looking at 4, not 3. The code never sees the 3.
This is why you should never change an array while iterating over it. A lot of languages won't even allow you to (if you're using iterators), they'll throw some sort of "underlying collection was modified during iteration" error. You can make it work if you know what you're doing (often just by iterating over the array backwards), but there are usually better solutions anyway, like using Array.prototype.filter().
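For example, iterating backwards sidesteps the shifting problem, because a removal only affects indexes you have already visited (a sketch of the general technique, not the original code):
const nums = [0, 1, 2, 3, 4];
for (let i = nums.length - 1; i >= 0; i--) {
  if (nums[i] === 2) {
    nums.splice(i, 1); // safe: elements after index i were already visited
  }
}
console.log(nums); // [0, 1, 3, 4]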
One easy solution is to iterate over a copy of the array, so that when you do the delete, the array you're iterating over (the copy) isn't changed. That would look like this:
for (const item of [...user.possible]) {
  if (/* some condition */) {
    user.possible.pull(item._id);
  }
}
Another problem with your code: +new Date().setHours will always evaluate to NaN since setHours is a function and converting a function to a number always results in NaN. I suspect this is just a typo you introduced while struggling with the original issue.
The suggestion to use filter() is even better.
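Putting both fixes together, here is a sketch of the "before today" cleanup (it assumes the string dates stored in possible parse correctly with new Date()):
const user = await User.findById(args.userId);
const startOfToday = new Date().setHours(0, 0, 0, 0); // setHours needs arguments; it returns a timestamp
user.possible = user.possible.filter(item => +new Date(item.date) >= startOfToday);
await user.save();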

JSON.parse fails on ObjectId

I am trying to convert a string into an aggregation pipeline for use in MongoDB, but it fails.
let pipeline = JSON.parse('[{"$match": {"_id": ObjectId("5b5637acbd3e9c2068ef80c3")}]');
// results in "SyntaxError: Unexpected token O in JSON at position 20"s
let pipeline = JSON.parse('[{"$match": {"_id": "5b5637acbd3e9c2068ef80c3"}]');
let response = await db.collect('<collection_name>').aggregate(pipeline).toArray();
// returns [] parse works but mongodb doesn't return any rows!
// This works but its not the solution I am looking for.
let pipeline = [{"$match": {"_id": ObjectId("5b5637acbd3e9c2068ef80c3")}];
let response = await db.collect('<collection_name>').aggregate(pipeline).toArray();
I tried using the BSON type but had no luck.
My current workaround is to remove the ObjectId() from the string and use a reviver function with JSON.parse:
const ObjectId = require('mongodb').ObjectID;
let convertObjectId = function (key, value) {
  if (typeof value === 'string' && value.match(/^[0-9a-fA-F]{24}$/)) {
    return ObjectId(value);
  } else {
    return value;
  }
};
let pipeline = JSON.parse('[{"$match": {"_id": "5b5637acbd3e9c2068ef80c3"}}]', convertObjectId);
let response = await db.collection('<collection_name>').aggregate(pipeline).toArray();
// returns one record.
Unfortunately, [{"$match": {"_id": ObjectId("5b5637acbd3e9c2068ef80c3")}] is not valid JSON.
The value of a property in JSON can only be an object (ex.: {}), an array (ex.: []), a string (ex.: "abc"), a number (ex.: 1), a boolean (ex.: true), or null. See an example of these values here: https://en.wikipedia.org/wiki/JSON#Example.
What you could do is add ObjectId() manually after parsing the JSON. This would mean that the value of _id would be a string first, which is valid JSON.
Then, you can loop through your parsed JSON to add ObjectId (see reference here: https://mongodb.github.io/node-mongodb-native/api-bson-generated/objectid.html):
const ObjectId = require('mongodb').ObjectID;
const pipeline = JSON.parse('[{"$match": {"_id": "5b5637acbd3e9c2068ef80c3"}}]');
const pipelineWithObjectId = pipeline.map(query => ({
  $match: {
    ...query.$match,
    _id: ObjectId(query.$match._id)
  }
}));
const response = await db.collection('<collection_name>').aggregate(pipelineWithObjectId).toArray();
This should work with the example you provided but there are multiple caveats:
Parsing a query like that could be a vulnerability if the string contains user input that has not been sanitized: https://blog.websecurify.com/2014/08/hacking-nodejs-and-mongodb.html.
This particular code snippet would only work for queries with $match, which means that this code is not very scalable.
This code is not elegant.
All these reasons, for what they are worth, make me think that you would be better off using an object rather than a string for your queries.
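For completeness, that object-based version is essentially the "this works" snippet from the question, with the id held in a variable (the variable name here is just illustrative; validate any user-supplied id before using it):
const ObjectId = require('mongodb').ObjectID;
const id = '5b5637acbd3e9c2068ef80c3'; // e.g. taken from request input
const pipeline = [{ $match: { _id: ObjectId(id) } }];
const response = await db.collection('<collection_name>').aggregate(pipeline).toArray();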

How to search data in mongodb with dynamic fields using mongoose?

I have a Node.js API in which the user sends the required fields as an array, to be fetched from the MongoDB database. I need to find the data for those fields using a find query. I've written a forEach statement to loop through that array, and I do get the array elements, but when I insert the array elements into the query it doesn't give the required results. Could anyone please help me resolve the issue by looking at the code below?
templateLevelGraphData: async function(tid, payload) {
  let err, templateData, respData = [], test, currentValue;
  [err, templateData] = await to(Template.findById(tid));
  var templateId = templateData.templateId;
  payload.variables.forEach(async data => {
    console.log(data); // data has the array elements like variables=["humidity"]
    [err, currentValue] = await to(mongoose.connection.db.collection(templateId).find({}, {data: 1}).sort({"entryDayTime": -1}).limit(1).toArray());
    console.log(currentValue);
  });
  return "success";
}
The expected output is,
[ { humidity: 36 } ]
But I'm getting only _id like,
[ { _id: 5dce3a2df89ab63ee4d95495 } ]
I think data is not being applied in the query, even though printing data to the console gives the correct results (the array elements, like humidity). What do I need to do to make it work?
When you pass {data: 1}, the projection uses the literal key data instead of the field name held in your data variable.
You have to create an object whose keys are the elements of the array (payload.variables), each set to 1:
const projection = payload.variables.reduce((acc, field) => (acc[field] = 1, acc), {}); // e.g. ["humidity"] -> { humidity: 1 }
[...] .find({}, projection) [...]
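A sketch of how that plugs into the call from the question, reusing its to() helper, and assuming the same find(filter, projection) call shape works in your driver version; _id: 0 is added so the output matches the expected [ { humidity: 36 } ]:
const projection = payload.variables.reduce((acc, field) => (acc[field] = 1, acc), {});
projection._id = 0; // drop _id to match the expected output
[err, currentValue] = await to(
  mongoose.connection.db.collection(templateId)
    .find({}, projection)
    .sort({ "entryDayTime": -1 })
    .limit(1)
    .toArray()
);
console.log(currentValue); // e.g. [ { humidity: 36 } ]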
Actually I got the solution.
for (let i = 0; i < payload.variables.length; i++) {
  var test = '{"' + payload.variables[i] + '":1,"_id":0}';
  var query = JSON.parse(test);
  [err, currentValue] = await to(mongoose.connection.db.collection(templateId).find({"deviceId": deviceId}, query).sort({"entryDayTime": -1}).limit(1).toArray());
  console.log(currentValue); // It's giving the solution
}
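A note on the string building: the same projection object can be created without JSON.parse by using a computed property name (an equivalent, shorter sketch):
var query = { [payload.variables[i]]: 1, _id: 0 }; // same as JSON.parse('{"' + payload.variables[i] + '":1,"_id":0}')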

Lodash _.merge function not overwriting properties with updated information

In my user edit route, I am trying to use the Lodash merge function to update the returned user document (from Mongoose) with the updates sent in req.body. Here is my code:
const { dobYear, dobMonth, dobDay } = req.body;
const dob = formatDob(dobYear, dobMonth, dobDay);
const user = await db.User.findById(req.params.id);
const updates = pick(req.body, [ 'fullName', 'email', 'password', 'gender', 'address', 'avatar_url' ]);
merge(user, [updates, dob]);
let updatedUser = await user.save();
The problem is even when I send an updated email in the request, the merge does not seem to actually overwrite the old email value with the new one (from updates).
The way you are passing the arguments to merge, you are merging an object with an array.
See the documentation. I assume the confusion came from the fact that the docs show _.merge(object, [sources]); however, if you look at the Arguments section, it says:
[sources] (...Object): The source objects. That means a list of source objects rather than an actual array.
Try this:
const _ = require('lodash'); // or load lodash from the CDN, as in the original snippet

var user = { 'a': 1, 'b': 2 };
var updates = { 'a': 3, 'c': 4, 'email': 'aaa@bbb.com' };
let dob = '11-11-2019';
let result = _.merge(user, updates, { dob }); // <-- ...Object: each source is a separate argument
console.log(result); // { a: 3, b: 2, c: 4, email: 'aaa@bbb.com', dob: '11-11-2019' }
merge(user, [updates, dob]); is incorrect usage. merge accepts the following:
Arguments
object (Object): The destination object.
[sources] (...Object): The source objects.
Observe the ...: it means merge accepts a variable number of source objects, so pass each one as its own argument:
merge(user, updates, { dob })

Replacing an object in an object array in Redux Store using Javascript/Lodash

I have an object array in a reducer that looks like this:
[
  { id: 1, name: 'Mark', email: 'mark@email.com' },
  { id: 2, name: 'Paul', email: 'paul@gmail.com' },
  { id: 3, name: 'sally', email: 'sally@email.com' }
]
Below is my reducer. So far, I can add a new object to the currentPeople reducer via the following:
const INITIAL_STATE = { currentPeople: [] };

export default function(state = INITIAL_STATE, action) {
  switch (action.type) {
    case ADD_PERSON:
      return { ...state, currentPeople: [...state.currentPeople, action.payload] };
  }
  return state;
}
But here is where I'm stuck. Can I UPDATE a person via the reducer using lodash?
If I sent an action payload that looked like this:
{ id: 1, name: 'Eric', email: 'Eric@email.com' }
Would I be able to replace the object with the id of 1 with the new fields?
Yes you can absolutely update an object in an array like you want to. And you don't need to change your data structure if you don't want to. You could add a case like this to your reducer:
case UPDATE_PERSON:
  return {
    ...state,
    currentPeople: state.currentPeople.map(person => {
      if (person.id === action.payload.id) {
        return action.payload;
      }
      return person;
    }),
  };
This can be shortened as well, using an implicit return and a ternary:
case UPDATE_PERSON:
  return {
    ...state,
    currentPeople: state.currentPeople.map(person => (person.id === action.payload.id) ? action.payload : person),
  };
Mihir's idea about mapping your data to an object with normalizr is certainly a possibility and technically it'd be faster to update the user with the reference instead of doing the loop (after initial mapping was done). But if you want to keep your data structure, this approach will work.
Also, mapping like this is just one of many ways to update the object, and it requires support for Array.prototype.map(). You could instead use lodash findIndex() (or indexOf() combined with find(), as in the answer below) to get the index of the user you want. This is nice because it stops as soon as it finds a match instead of continuing the way .map does; once you have the index, you can overwrite the object directly using its index. Make sure you don't mutate the Redux state, though: you'll need to be working on a clone if you want to assign like this: clonedArray[foundIndex] = action.payload;.
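A sketch of that index-based variant (assuming lodash is imported as _ in the reducer file):
case UPDATE_PERSON: {
  const currentPeople = [...state.currentPeople];               // clone so the state array isn't mutated
  const foundIndex = _.findIndex(currentPeople, { id: action.payload.id });
  if (foundIndex !== -1) {
    currentPeople[foundIndex] = action.payload;                 // overwrite by index
  }
  return { ...state, currentPeople };
}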
This is a good candidate for data normalization. You can effectively replace your data with the new one, if you normalize the data before storing it in your state tree.
This example is straight from Normalizr.
[{
  id: 1,
  title: 'Some Article',
  author: {
    id: 1,
    name: 'Dan'
  }
}, {
  id: 2,
  title: 'Other Article',
  author: {
    id: 1,
    name: 'Dan'
  }
}]
Can be normalized this way-
{
  result: [1, 2],
  entities: {
    articles: {
      1: {
        id: 1,
        title: 'Some Article',
        author: 1
      },
      2: {
        id: 2,
        title: 'Other Article',
        author: 1
      }
    },
    users: {
      1: {
        id: 1,
        name: 'Dan'
      }
    }
  }
}
What's the advantage of normalization?
You get to extract the exact part of your state tree that you want.
For instance: you have an array of objects containing information about the articles. If you want to select a particular object from that array, you'll have to iterate through the entire array. The worst case is that the desired object is not present in the array at all. To overcome this, we normalize the data.
To normalize the data, store the unique identifiers of each object in a separate array. Let's call that array result.
result: [1, 2, 3 ..]
And transform the array of objects into an object keyed by id (see the second snippet). Call that object entities.
Ultimately, to access the object with id 1, simply do this- entities.articles["1"].
If you want to replace the old data with new data, you can do this-
entities.articles["1"] = newObj;
Use native splice method of array:
/* Find item index using lodash */
var index = _.indexOf(currentPeople, _.find(currentPeople, { id: 1 }));
/* Replace item at index using splice */
currentPeople.splice(index, 1, { id: 1, name: 'Mark', email: 'mark@email.com' });
