I have the following set of data:
const data = [
  {
    name: 'test2',
    desc: 'test2',
    id: 2,
  },
  {
    name: 'test3',
    desc: 'test3',
    id: 3,
  },
  {
    name: 'test4',
    desc: 'test4',
    id: 4,
  },
]
I want to update these records using Sequelize, and I don't want to use any kind of loop to do it. I've tried bulkCreate with updateOnDuplicate:
await models.project.bulkCreate(data, {
  updateOnDuplicate: ['id'],
})
But I'm not able to update and I get a Sequelize error. Can anyone suggest an optimized solution for this?
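A hedged sketch of a likely fix: Sequelize's updateOnDuplicate option lists the columns to overwrite when a unique key collides, not the key itself, so passing only ['id'] is a common cause of this kind of failure (this assumes id is the model's primary or unique key). The pieces are built as plain objects here so the shape can be checked without a database:

```javascript
// Assumed fix: name the non-key columns to overwrite on a duplicate `id`.
// The real call (commented out, needs a configured Sequelize model) would be:
//
//   await models.project.bulkCreate(data, { updateOnDuplicate: ['name', 'desc'] });
//
const data = [
  { name: 'test2', desc: 'test2', id: 2 },
  { name: 'test3', desc: 'test3', id: 3 },
  { name: 'test4', desc: 'test4', id: 4 },
];

// Columns whose values get replaced when a row with the same `id` already exists.
const options = { updateOnDuplicate: ['name', 'desc'] };
```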
I am reading user data from a Redis cache, where it is stored as a JSON string.
However, the Node.js application consumes the data in the format below.
I want to recreate that exact structure from the Redis JSON string, and I am struggling to understand how to re-create the 'BinaryRow' objects here.
If I do
util.inspect(user_info)
the final output needs to look like the below:
{
  user: [
    BinaryRow {
      user_id: 7558073,
      country_id: 191,
      city_id: 1975002,
      name: 'iphone',
      birth: 1980-09-25T18:30:00.000Z,
      mode: 'active',
      gender: 'M'
    }
  ],
  country: [ BinaryRow { country_title: 'Australia' } ],
  city: [ BinaryRow { city_title: 'Gampahas' } ],
  photo: [
    BinaryRow {
      photo_id: 100813,
      visible: 'Y',
      ref: 'ssss'
    }
  ]
}
I have a many-to-many association defined with the following models:
const Movie = sequelize.define('Movie', { name: DataTypes.STRING });
const Actor = sequelize.define('Actor', { name: DataTypes.STRING });
const ActorMovies = sequelize.define('ActorMovies', {
MovieId: {
type: DataTypes.INTEGER,
references: {
model: Movie,
key: 'id'
}
},
ActorId: {
type: DataTypes.INTEGER,
references: {
model: Actor,
key: 'id'
}
}
});
Movie.belongsToMany(Actor, { through: ActorMovies });
Actor.belongsToMany(Movie, { through: ActorMovies });
And I successfully create Movies when creating an Actor record with the following code:
Actor.create({
name: 'Jhony',
movies: [
{ name: 'Movie 1'}, // it will generate Movie with ID 1
{ name: 'Movie 2'} // it will generate Movie with ID 2
]
}, {
include: [ Movie ]
})
But my question is: how can I attach multiple existing Movie records when creating an Actor?
I already tried:
Actor.create({
name: 'Edward',
movieIds: [1, 2]
}, {
include: [ Movie ]
})
and:
Actor.create({
name: 'Edward',
movies: [{id: 1}, {id: 2}]
}, {
include: [ Movie ]
})
But it still didn't work. Can anyone help me, please? Thanks in advance.
You can't link existing movies to a new actor while creating it. You need to call setMovies on the new actor model instance:
const actor = await Actor.create({
name: 'Edward',
})
await actor.setMovies([1, 2])
Also, please note that if you execute more than one query that changes something in the DB, it is much more reliable to use a transaction to turn all these queries into one atomic operation.
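A sketch of that transactional flow (the function shape is my own; Sequelize's managed sequelize.transaction(callback) commits when the callback resolves and rolls back when it throws). The instance and model are passed in as parameters, so the flow can be exercised without a live database:

```javascript
// Create an actor and attach existing movies inside one managed transaction.
// `sequelize` and `Actor` are assumed to be a configured instance and model.
async function createActorWithMovies(sequelize, Actor, name, movieIds) {
  return sequelize.transaction(async (t) => {
    // Both queries run on the same transaction `t`; if setMovies throws,
    // the newly created actor row is rolled back too.
    const actor = await Actor.create({ name }, { transaction: t });
    await actor.setMovies(movieIds, { transaction: t });
    return actor;
  });
}
```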
I have a query that I originally wrote in the console:
g.V().hasLabel('group')
.has('type', 'PowerUsers')
.local(__.union(
__.project('group').by(__.valueMap().by(__.unfold())),
__.inE().outV().project('user').by(__.valueMap().by(__.unfold())))
.fold()).unfold().toList()
I get something like:
==>{group={owner=A, group_id=21651399-91fd-4da4-8608-1bd30447e773, name=Group 8, type=PowerUsers}}
==>{user={name=John, user_id=91f5e306-77f1-4aa1-b9d0-23136f57142d}}
==>{user={name=Jane, user_id=7f133d0d-47f3-479d-b6e7-5191bea52459}}
==>{group={owner=A, group_id=ef8c81f7-7066-49b2-9a03-bad731676a8c, name=Group B, type=PowerUsers}}
==>{user={name=Max, user_id=acf6abb8-08b3-4fc6-a4cb-f34ff523d628}}
==>{group={owner=A, group_id=07dff798-d6db-4765-8d74-0c7be66bec05, name=Group C, type=PowerUsers}}
==>{user={name=John, user_id=91f5e306-77f1-4aa1-b9d0-23136f57142d}}
==>{user={name=Max, user_id=acf6abb8-08b3-4fc6-a4cb-f34ff523d628}}
When I run that query with Node.js, I expected to get a similar result, but I don't. I get something like this:
[ { group:
{ owner: 'A',
group_id: '21651399-91fd-4da4-8608-1bd30447e773',
name: 'Group 8',
type: 'PowerUsers' } },
{ user:
{ name: 'John',
user_id: '91f5e306-77f1-4aa1-b9d0-23136f57142d'} },
{ user:
{ name: 'John',
user_id: '91f5e306-77f1-4aa1-b9d0-23136f57142d'} },
{ user:
{ name: 'Jane',
user_id: '7f133d0d-47f3-479d-b6e7-5191bea52459'} },
{ user:
{ name: 'Jane',
user_id: '7f133d0d-47f3-479d-b6e7-5191bea52459'} },
{ group:
{ owner: 'A',
group_id: 'ef8c81f7-7066-49b2-9a03-bad731676a8c',
name: 'Group B',
type: 'PowerUsers' } },
{ user:
{ name: 'Max',
user_id: 'acf6abb8-08b3-4fc6-a4cb-f34ff523d628' } },
...
Because I have the same users in different groups, I can't use dedup(). If the results were the same in Node.js as in Groovy, that would be perfect. Unfortunately, they are not, and I don't understand why the Node.js results are all jumbled, considering that the query is exactly the same.
I feel like you could just use a nested project() to keep your group with your users:
g.V().hasLabel('group')
.has('type', 'PowerUsers')
.project('group', 'users')
.by(__.valueMap().by(__.unfold()))
.by(__.in().project('user').by(__.valueMap().by(__.unfold()).fold()))
This way you don't have to worry about ordering anything. I think it might be more Gremlin-esque, though, to use the "group" as a key in a Map with the values being a list of users:
g.V().hasLabel('group')
.has('type', 'PowerUsers')
.group()
.by(__.valueMap().by(__.unfold()))
.by(__.in().project('user').by(__.valueMap().by(__.unfold()).fold()))
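For illustration only (the shape is assumed from the projections above, not taken from a live run): a single result row from the nested project() version keeps each group together with its own users, so a user who belongs to several groups just repeats inside each group's list instead of floating loose in the output:

```javascript
// Assumed shape of one row returned by the nested project() traversal.
const row = {
  group: {
    owner: 'A',
    group_id: '21651399-91fd-4da4-8608-1bd30447e773',
    name: 'Group 8',
    type: 'PowerUsers',
  },
  users: [
    { user: { name: 'John', user_id: '91f5e306-77f1-4aa1-b9d0-23136f57142d' } },
    { user: { name: 'Jane', user_id: '7f133d0d-47f3-479d-b6e7-5191bea52459' } },
  ],
};

// Reading the users back out is unambiguous, with no ordering concerns:
const names = row.users.map((u) => u.user.name);
```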
I have one collection, 'comments', which contains two fields, like below:
Comments -
{
_id: 'unique_string',
dataVar: [{
name: {
type: string
},
value: {
type: string
}
}]
}
You can assume the collection data looks like this:
[{
  _id: 'xxxxxx',
  user: 'robin',
  dataVar: [{
    name: 'abc',
    value: 123
  }, {
    name: 'bcd',
    value: 12345
  }]
}]
Now the problem is (I'm using Mongoose in a Node.js application):
- How to update and insert data in 'dataVar'?
- If the data is not available, a new document should be created.
Case 1.
If the user sends POST data like
{
_id: 'xxxxxx',
user: 'robin',
dataVar: [{name: 'abc', value: 12345}]
}
then after performing the query, the above document (whose _id is 'xxxxxx') should be updated like below:
{
_id: 'xxxxxx',
user: 'robin',
dataVar: [
{name: 'abc', value: 12345},
{name: 'bcd', value: 12345}
]
}
Case 2.
If the data is not present, a new document should be created.
To update a record in MongoDB, you can use something like:
comments.findOneAndUpdate(
  { _id: 'xxxxxx' },
  { $push: { dataVar: { $each: [{ name: 'abc', value: 123 }, { name: 'def', value: 456 }] } } }
)
You can use the { upsert: true } option to create the document in the collection if it does not exist.
For updating an existing record, you should use a sub-schema inside your main schema, so that an _id field is created in every object of the array and you can uniquely identify which object to update. Note the positional $ operator, which targets the matched array element rather than top-level fields:
comments.findOneAndUpdate(
  { "dataVar._id": 'xxxxxx' },
  { $set: { 'dataVar.$.name': 'xyz', 'dataVar.$.value': 789 } }
)
In this way, you can update an object that is inside the array.
If you don't want to use _id, make sure you have at least one field that contains unique values, so that you can findOneAndUpdate that specific object inside the array.
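As a sketch, here are the two update documents described above built as plain objects (field names are taken from the question; arrayFilters is a MongoDB 3.6+ alternative, assumed available here, that matches an array element by its name field instead of by a sub-document _id):

```javascript
// Upsert-push: append entries to dataVar, creating the document when missing.
// Would be used as: comments.findOneAndUpdate({ _id: 'xxxxxx' }, pushUpdate, pushOptions)
const pushUpdate = { $push: { dataVar: { $each: [{ name: 'abc', value: 123 }] } } };
const pushOptions = { upsert: true };

// arrayFilters alternative: update the dataVar element whose `name` is 'abc'
// without relying on a sub-document _id.
const setUpdate = { $set: { 'dataVar.$[el].value': 12345 } };
const setOptions = { arrayFilters: [{ 'el.name': 'abc' }] };
```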
I have a crazy array that looks like this:
const data = [
[{ Name: 'Name 1', Block: [{Id: "1"}, {Id: "2"}] }],
[{ Name: 'Name 2', Block: [{Id: "3"}, {Id: "4"}] }],
]
I want to map Block to a single array to look like this:
[ { Id: '1' },
{ Id: '2' },
{ Id: '3' },
{ Id: '4' }
]
I have tried doing like this:
const data = [
[{ Name: 'Name 1', Block: [{Id: "1"}, {Id: "2"}] }],
[{ Name: 'Name 2', Block: [{Id: "3"}, {Id: "4"}] }],
]
const idList = data.map(blockData => {
return blockData[0].Block;
});
console.log(idList)
What did I do wrong?
.map will create a new item for every index of the old array. If your input array has 2 items, the output array will also only have 2 items - but you want 4 items, so .map won't work. Use flatMap instead, to flatten:
const data = [
[{ Name: 'Name 1', Block: [{Id: "1"}, {Id: "2"}] }],
[{ Name: 'Name 2', Block: [{Id: "3"}, {Id: "4"}] }],
];
const idList = data.flatMap(([{ Block }]) => Block);
console.log(idList)
flatMap is only available in newer environments, though - otherwise, use a polyfill or a different method, like reducing into an array:
const data = [
[{ Name: 'Name 1', Block: [{Id: "1"}, {Id: "2"}] }],
[{ Name: 'Name 2', Block: [{Id: "3"}, {Id: "4"}] }],
];
const idList = data.reduce((a, [{ Block }]) => a.concat(Block), []);
console.log(idList)
Your data is an array inside an array, so you could use map twice. But that will give you an array with 2 nested arrays, so you then need to reduce or flatten the result to get the desired output (note the spread order, which keeps the blocks in their original order):
data.map(a => a[0].Block.map(b => b)).reduce((o, m) => [...o, ...m], [])
The most succinct way to do it would be with a reduce statement:
const reducer = (a, b) => a.concat(b[0].Block);
const idList = data.reduce(reducer, []);
This way it will be much clearer what you are trying to do.