I am trying to update data in a MySQL database using bulkCreate with updateOnDuplicate in Sequelize, and my table has no primary key. What happens is that instead of updating the existing data, new rows are inserted with the updated value. I want to update the old values only.
Here is my code
let insertArray = [];
insertArray.push({
  order_id: 62924376,
  delivery_status: 80
});
insertArray.push({
  order_id: 62934013,
  delivery_status: 80
});
let updateOrder = await Sequelize.userOrders.bulkCreate(
  insertArray,
  { updateOnDuplicate: ["delivery_status"] }
);
This code keeps inserting new rows into the database; I need it to update the existing values only.
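On MySQL, updateOnDuplicate is implemented with INSERT ... ON DUPLICATE KEY UPDATE, so it only takes effect when the table has a PRIMARY KEY or UNIQUE index covering the conflicting column; without one, every row is treated as new. A rough sketch of two possible workarounds, assuming order_id is meant to identify a row uniquely:

// Option 1 (assumption: order_id should be unique): add a unique index in a
// migration so updateOnDuplicate has a conflict key to work against.
// await queryInterface.addIndex('userOrders', ['order_id'], { unique: true });

// Option 2: skip bulkCreate and update each existing row explicitly.
for (const row of insertArray) {
  await Sequelize.userOrders.update(
    { delivery_status: row.delivery_status }, // columns to change
    { where: { order_id: row.order_id } }     // match the existing row only
  );
}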
I'm new to Sequelize and I want to check whether the data already exists in the table before doing a multiple insert. I can already insert the data with bulkCreate, but I'm having trouble combining the check (findOne) with bulkCreate. Thanks for your help.
sequelize
  .sync({})
  .then(() => Test.bulkCreate([
    {
      TestName: TestName[0],
      ProductName: ProductName,
      PanelRunmode: panelRunmode[0],
      TestResource: TestResource[0],
      ProductRevision: ProductRevision[0],
      TotalTime: TotalTime[0],
    },
  ]))
  .then(() => Test.findAll())
  .then(tests => {
    console.log(tests); // ... in order to get the array of test objects
  });
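One way to do the check is to query for the rows that already exist and only bulk-insert the rest. A rough sketch, assuming TestName is the column that identifies a duplicate (adjust to whatever really makes a row unique), using Sequelize's Op.in:

const { Op } = require('sequelize');

async function insertMissing(rows) {
  // Find which TestName values are already in the table
  const existing = await Test.findAll({
    where: { TestName: { [Op.in]: rows.map(r => r.TestName) } },
    attributes: ['TestName'],
  });
  const existingNames = new Set(existing.map(r => r.TestName));

  // Only insert rows whose TestName is not present yet
  const toInsert = rows.filter(r => !existingNames.has(r.TestName));
  if (toInsert.length) {
    await Test.bulkCreate(toInsert);
  }
}

If TestName is backed by a unique index, bulkCreate(rows, { ignoreDuplicates: true }) is a shorter alternative on dialects that support it.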
I am using Sequelize in my Node.js server. I am ending up with validation errors because my code tries to write the record twice, instead of creating it once and then updating it since it's already in the DB (PostgreSQL).
This is the flow I use when the request runs:
const latitude = req.body.latitude;

var metrics = await models.user_car_metrics.findOne({ where: { user_id: userId, car_id: carId } });
if (metrics) {
  metrics.latitude = latitude;
  .....
} else {
  metrics = models.user_car_metrics.build({
    user_id: userId,
    car_id: carId,
    latitude: latitude
    ....
  });
}
var savedMetrics = await metrics.save();
return res.status(201).json(savedMetrics);
At times, if the client calls the endpoint twice or more in quick succession, the code above tries to save two new rows in user_car_metrics with the same user_id and car_id, both FKs on the user and car tables.
I have a constraint:
ALTER TABLE user_car_metrics DROP CONSTRAINT IF EXISTS user_id_car_id_unique, ADD CONSTRAINT user_id_car_id_unique UNIQUE (car_id, user_id);
Point is, there can only be one entry for a given user_id and car_id pair.
Because of that, I started seeing validation issues. After looking into it and adding logs, I realized the code above adds duplicate rows to the table (without the constraint). With the constraint in place, I get validation errors when the code tries to insert the duplicate record.
The question is: how do I avoid this problem? How do I structure the code so that it won't try to create duplicate records? Is there a way to serialize this?
If you have a unique constraint, you can use upsert to either insert or update the record, depending on whether a record with the same primary key value, or the same values in the columns covered by the unique constraint, already exists.
await models.user_car_metrics.upsert({
  user_id: userId,
  car_id: carId,
  latitude: latitude
  ....
})
See upsert
PostgreSQL - Implemented with ON CONFLICT DO UPDATE. If update data contains PK field, then PK is selected as the default conflict key. Otherwise, first unique constraint/index will be selected, which can satisfy conflict key requirements.
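Folded into the handler from the question, that might look roughly like this (a sketch; the destructuring assumes Sequelize v6, where upsert resolves to [instance, created] on PostgreSQL):

const latitude = req.body.latitude;

// upsert inserts the row, or updates it when the (user_id, car_id)
// unique constraint already matches an existing row
const [savedMetrics] = await models.user_car_metrics.upsert({
  user_id: userId,
  car_id: carId,
  latitude: latitude,
});

return res.status(201).json(savedMetrics);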
Okay so, I'm displaying a friends table with Sequelize in Node.js. Everything goes to plan, but I've run into a problem. In the friends table I store user IDs, then use those IDs to fetch the user data and display it in EJS. I want to access another table using the current table's data.
Here's the code that accesses the Friends table. I made it a function so I can call it from EJS:
let getFriends = async (id) => {
  const project = await database.Friends.findAndCountAll({ where: { fid: id } });
  return project.rows;
};
You can access another table using include.
The new code will be:
let getFriends = async (id) => {
  const project = await database.Friends.findAndCountAll({
    where: { fid: id },
    include: [{ model: model.User, as: 'user' }]
  });
  return project;
};
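For include with as: 'user' to work, the association has to be defined with the same alias somewhere in the model setup. A sketch of what that could look like (the foreign-key column name friend_uid is a placeholder for whatever column in Friends points at the related User row):

// Hypothetical association setup; swap friend_uid for your real column name
database.Friends.belongsTo(model.User, { as: 'user', foreignKey: 'friend_uid' });

Each row returned in project.rows will then expose the joined user record under row.user.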
I have a table in Postgres, and for the ORM I am using Sequelize. I need to change the table name, so I tried the migration below:
queryInterface.renameTable('table1', 'table2', { transaction })
But for some reason it creates a new table, table2, with the same data as table1, while table1 still exists with no data. Is this the correct behavior of this function? If so, I'll add a drop query.
This works
queryInterface.renameTable('OldTableName', 'NewTableName');
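For reference, a minimal migration file around that call could look like this (a sketch assuming the usual sequelize-cli layout; the file and table names are placeholders):

// migrations/XXXXXXXXXXXXXX-rename-table.js (placeholder file name)
'use strict';

module.exports = {
  async up(queryInterface) {
    await queryInterface.renameTable('OldTableName', 'NewTableName');
  },

  async down(queryInterface) {
    // Undo the rename so the migration can be rolled back
    await queryInterface.renameTable('NewTableName', 'OldTableName');
  },
};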
For that use case I normally rely on raw queries:
const { QueryTypes } = require('sequelize'); // QueryTypes is exported by the sequelize package
const { sequelize } = queryInterface;

await sequelize.query(
  `ALTER TABLE table1
   RENAME TO table2`,
  {
    type: QueryTypes.RAW,
    raw: true,
    transaction,
  },
);
I have a table in my database called users. In this table I only store user ID, username and password. Now, in another table called user_meta, I have the following columns: id, uid, meta_key, meta_value. I'm trying to find a way for Bookshelf to automatically load all records in user_meta where uid == userid, and store them as model.meta[meta_key] = meta_value. Sadly, I haven't been able to find a way to make this possible.
If it is possible at all, the 2nd step would be to also save all values in model.meta back on update / insert, inserting records where meta_key doesn't exist for that user ID yet, and updating where it does.
Try to set the associations (relations) between the models:
var User = bookshelf.Model.extend({
  tableName: 'users',
  meta: function() {
    // user_meta keeps the user's id in the `uid` column, so pass the
    // foreign key explicitly instead of relying on the default `user_id`
    return this.hasMany(Meta, 'uid');
  }
});

var Meta = bookshelf.Model.extend({
  tableName: 'user_meta',
  user: function() {
    return this.belongsTo(User, 'uid');
  }
});
http://bookshelfjs.org/#one-to-many
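To get the model.meta[meta_key] = meta_value shape from the question, one option is to fetch the user with withRelated and fold the related rows into a plain object. A sketch (the reshaping step is an assumption about how you want to consume the data, not something Bookshelf does on its own):

new User({ id: userId })
  .fetch({ withRelated: ['meta'] })
  .then(function(user) {
    // user.related('meta') is the collection of user_meta rows
    var rows = user.related('meta').toJSON();
    var meta = rows.reduce(function(acc, row) {
      acc[row.meta_key] = row.meta_value;
      return acc;
    }, {});
    console.log(meta); // { some_key: 'some_value', ... }
  });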