I want to update a document in Mongo. I have a document that I want to modify and send back, but for some reason, when I do, the values get changed into ObjectIds. I figured out an alternative, but I don't like it...
async updateDogs() {
const cats = await this.catModel.find().exec();
const dog: DogDocument = await this.dogModel
.findOne({ _id: '62b57f65fa0e953b3e0494eb' })
.populate('cats')
.exec();
console.log('All cats:', cats);
console.log('Dingo Dog in find', dog);
dog.cats = cats;
console.log('Dingo dog in find but with Cat ObjectId()', dog.cats);
console.log(dog.cats[0].name);
dog.cats = [];
dog.cats.push(...cats);
console.log('Dingo dog in find and with cat', dog.cats);
console.log(dog.cats[0].name);
}
And this is the output:
All cats: [
{
_id: new ObjectId("62af2508025adb0b7e6e446f"),
name: 'Mikey',
age: 0,
breed: 'string',
__v: 0
},
{
_id: new ObjectId("62b57fd0fa0e953b3e0494f5"),
name: 'Mini',
age: 0,
breed: 'string',
__v: 0
}
]
Dingo Dog in find {
_id: new ObjectId("62b57f65fa0e953b3e0494eb"),
name: 'Dingo',
age: 0,
__v: 0,
cats: [
{
_id: new ObjectId("62af2508025adb0b7e6e446f"),
name: 'Mikey',
age: 0,
breed: 'string',
__v: 0
}
]
}
Dingo dog in find but with Cat ObjectId() [
new ObjectId("62af2508025adb0b7e6e446f"),
new ObjectId("62b57fd0fa0e953b3e0494f5")
]
First cat name is undefined
Dingo dog in find and with cat [
{
name: 'Mikey',
age: 0,
breed: 'string',
_id: new ObjectId("62af2508025adb0b7e6e446f"),
__v: 0
},
{
name: 'Mini',
age: 0,
breed: 'string',
_id: new ObjectId("62b57fd0fa0e953b3e0494f5"),
__v: 0
}
]
First cat name is Mikey
The way I add cats to dog.cats changes the result. Do you have any idea why? I need CatDocument instances, not ObjectIds.
I could do another populate, but I don't want to make another request because I already have the cats. And using push feels ugly to me...
For a project I'm working on, we get some data delivered in an Excel sheet, which I convert to CSV through Excel.
These files contain measurements with different categories but the same ID.
Example
readingId; category; result;
1 ; cat 1 ; A
1 ; cat 2 ; B
2 ; cat 1 ; C
I then converted the CSV to JSON and wrote a function to output the data as separate objects:
const fs = require('fs');
const path = require('path');
exports.convertJson = (file) => {
let rawData = fs.readFileSync(file);
let jsonData = JSON.parse(rawData);
let rawOutput = [];
for (const output of jsonData) {
rawOutput.push({
locationId: output.Meetlocatienummer,
date: output.Aanmaakdatum_score,
subCategorie: output.Bestekspost,
score: output.Score,
scoreNumber: output.Cijfer,
categories: output.Categorie,
coordinates: output.Coordinaten,
neighbourhoodIndex: output.BUURTCODE,
quality: output.KWALITEIT,
district: output.STADSDEEL,
distrcitIndex: output.STADSDLCD,
street: output.STRAATNAAM,
neighbourhood: output.WIJK,
cluster: output.Cluster,
});
}
return rawOutput;
};
Which outputs the following results
[
{
locationId: 10215,
date: undefined,
subCategorie: 'Meubilair-afvalbak-vullingsgraad',
score: '',
scoreNumber: 8,
categories: 'Meubilair',
coordinates: '52.072843, 4.287723',
neighbourhoodIndex: 10,
quality: 'residentiekwaliteit',
district: 'Segbroek',
distrcitIndex: 3,
street: 'Xaverystraat',
neighbourhood: 'Regentessekwartier',
cluster: 'WRF'
},
{
locationId: 10215,
date: undefined,
subCategorie: 'Meubilair-container-bijgeplaatst afval rondom container',
score: 'A+',
scoreNumber: 10,
categories: 'Meubilair',
coordinates: '52.072843, 4.287723',
neighbourhoodIndex: 10,
quality: 'residentiekwaliteit',
district: 'Segbroek',
distrcitIndex: 3,
street: 'Xaverystraat',
neighbourhood: 'Regentessekwartier',
cluster: 'WRF'
},
{
locationId: 10215,
date: undefined,
subCategorie: 'Riolering-kolk-belemmering inlaat',
score: 'A+',
scoreNumber: 10,
categories: 'Riolering',
coordinates: '52.072843, 4.287723',
neighbourhoodIndex: 10,
quality: 'residentiekwaliteit',
district: 'Segbroek',
distrcitIndex: 3,
street: 'Xaverystraat',
neighbourhood: 'Regentessekwartier',
cluster: 'WRF'
},
{
locationId: 10215,
date: undefined,
subCategorie: 'Verharding-open verharding-elementenverharding-onkruid',
score: 'A',
scoreNumber: 8,
categories: 'Verharding',
coordinates: '52.072843, 4.287723',
neighbourhoodIndex: 10,
quality: 'residentiekwaliteit',
district: 'Segbroek',
distrcitIndex: 3,
street: 'Xaverystraat',
neighbourhood: 'Regentessekwartier',
cluster: 'WRF'
},
{
locationId: 10215,
date: undefined,
subCategorie: 'Verharding-natuurlijk afval',
score: 'A',
scoreNumber: 8,
categories: 'Verharding',
coordinates: '52.072843, 4.287723',
neighbourhoodIndex: 10,
quality: 'residentiekwaliteit',
district: 'Segbroek',
distrcitIndex: 3,
street: 'Xaverystraat',
neighbourhood: 'Regentessekwartier',
cluster: 'WRF'
},
{
locationId: 10215,
date: undefined,
subCategorie: 'Verharding-uitwerpselen',
score: 'A+',
scoreNumber: 10,
categories: 'Verharding',
coordinates: '52.072843, 4.287723',
neighbourhoodIndex: 10,
quality: 'residentiekwaliteit',
district: 'Segbroek',
distrcitIndex: 3,
street: 'Xaverystraat',
neighbourhood: 'Regentessekwartier',
cluster: 'WRF'
},
{
locationId: 10215,
date: undefined,
subCategorie: 'Verharding-zwerfafval grof',
score: 'A',
scoreNumber: 8,
categories: 'Verharding',
coordinates: '52.072843, 4.287723',
neighbourhoodIndex: 10,
quality: 'residentiekwaliteit',
district: 'Segbroek',
distrcitIndex: 3,
street: 'Xaverystraat',
neighbourhood: 'Regentessekwartier',
cluster: 'WRF'
},
{
locationId: 10215,
date: undefined,
subCategorie: 'Verharding-veegvuil goten',
score: 'A',
scoreNumber: 8,
categories: 'Verharding',
coordinates: '52.072843, 4.287723',
neighbourhoodIndex: 10,
quality: 'residentiekwaliteit',
district: 'Segbroek',
distrcitIndex: 3,
street: 'Xaverystraat',
neighbourhood: 'Regentessekwartier',
cluster: 'WRF'
},
{
locationId: 10215,
date: undefined,
subCategorie: 'Verharding-onkruid rondom obstakels',
score: 'B',
scoreNumber: 6,
categories: 'Verharding',
coordinates: '52.072843, 4.287723',
neighbourhoodIndex: 10,
quality: 'residentiekwaliteit',
district: 'Segbroek',
distrcitIndex: 3,
street: 'Xaverystraat',
neighbourhood: 'Regentessekwartier',
cluster: 'WRF'
},
{
locationId: 10215,
date: undefined,
subCategorie: 'Verharding-grof vuil',
score: 'A+',
scoreNumber: 10,
categories: 'Verharding',
coordinates: '52.072843, 4.287723',
neighbourhoodIndex: 10,
quality: 'residentiekwaliteit',
district: 'Segbroek',
distrcitIndex: 3,
street: 'Xaverystraat',
neighbourhood: 'Regentessekwartier',
cluster: 'WRF'
},
{
locationId: 10215,
date: undefined,
subCategorie: 'Verharding-zwerfafval fijn',
score: 'A',
scoreNumber: 8,
categories: 'Verharding',
coordinates: '52.072843, 4.287723',
neighbourhoodIndex: 10,
quality: 'residentiekwaliteit',
district: 'Segbroek',
distrcitIndex: 3,
street: 'Xaverystraat',
neighbourhood: 'Regentessekwartier',
cluster: 'WRF'
},
{
locationId: 7466,
date: undefined,
subCategorie: 'Meubilair-afvalbak-vullingsgraad',
score: 'B',
scoreNumber: 6,
categories: 'Meubilair',
coordinates: '52.072647, 4.288656',
neighbourhoodIndex: 10,
quality: 'residentiekwaliteit',
district: 'Segbroek',
distrcitIndex: 3,
street: 'Jan Krosstraat',
neighbourhood: 'Regentessekwartier',
cluster: 'WRF'
}
]
In the end I would like to write this information to MongoDB, and I had the following schema in mind to reduce the amount of duplicated data:
{
locationId: output.Meetlocatienummer,
date: output.Aanmaakdatum_score,
subCategories: [
{
subCategory: output.Bestekspost,
score: output.Score,
scoreNumber: output.Cijfer,
},
],
categories: [{ category: output.Categorie }],
coordinates: output.Coordinaten,
neighbourhoodIndex: output.BUURTCODE,
quality: output.KWALITEIT,
district: output.STADSDEEL,
distrcitIndex: output.STADSDLCD,
street: output.STRAATNAAM,
neighbourhood: output.WIJK,
cluster: output.Cluster,
}
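For what it's worth, as a Mongoose schema this shape would look roughly like the sketch below (field types are my guess from the sample output above; the model name is just a placeholder):
const mongoose = require('mongoose');

// Rough schema for the proposed condensed document shape
const locationSchema = new mongoose.Schema({
  locationId: Number,
  date: Date,
  subCategories: [
    {
      subCategory: String,
      score: String,
      scoreNumber: Number,
    },
  ],
  categories: [{ category: String }],
  coordinates: String,
  neighbourhoodIndex: Number,
  quality: String,
  district: String,
  distrcitIndex: Number,
  street: String,
  neighbourhood: String,
  cluster: String,
});

module.exports = mongoose.model('Location', locationSchema);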
This project is a hobby project while learning NodeJS. The actual data are readings of how polluted with litter the streets of the city I work for are. It's a bit tedious to read through thousands of lines in Excel to find the hotspots of the city, and just reading some scores and graphs is boring too, so I thought it would be nice to import the data into Leaflet through NodeJS.
The actual backend will contain more functionality as I learn Node and maybe, in the future, React. That's why I try to write it myself rather than importing the data into Google Maps, which works okay but lacks detailed category filtering.
I hope my idea is a bit clear and someone can point me in the right direction.
Edit 1
I got a bit further with lodash.
return _(rawOutput)
.groupBy('locationId')
.map((obj) => _.assignWith({}, ...obj, (val1, val2) => val1 || val2))
.value();
I found the above snippet and now I only get one output per unique locationId, but now I'm stuck with constructing the final output with the subcategories.
I was also playing around a bit with csv-parser to go directly from the CSV to proper JSON output, which would be ideal because then I don't have to convert it manually.
I'll get back to it tomorrow :-)
If you take that JSON and mongoimport it into MongoDB, you can use the following pipeline to transform it -- although honestly, a little Python script on the outside could construct the structure just as easily, and then you would still import the condensed data.
db.foo.aggregate([
{$group: {_id: "$locationId",
subCategories: {$push: {subCategory: "$subCategorie", score:"$score", scoreNumber: "$scoreNumber"}},
categories: {$push: "$categories"},
// Just take the first occurrence of each of these since they are claimed
// to be the same.
date: {$first: "$date"},
neighbourhoodIndex: {$first: "$neighbourhoodIndex"},
quality: {$first: "$quality"},
district: {$first: "$district"},
distrcitIndex: {$first: "$distrcitIndex"},
street: {$first: "$street"},
neighbourhood: {$first: "$neighbourhood"},
cluster: {$first: "$cluster"},
coordinates: {$first: "$coordinates"}
}}
// Now that we have a single doc with locationId x and a coordinate, convert
// the string lat,long "52.072843, 4.287723" into a GeoJSON Point which is
// a long,lat array of doubles. We convert by using $addFields to
// overwrite the original coordinates field:
,{$addFields: {"coordinates": {$let: {
vars: {pt: {$split:["$coordinates",","]}},
in: {"type": "Point", "coordinates": [
{$toDouble: {$trim: {input:{$arrayElemAt:["$$pt",1]}}}},
{$toDouble: {$trim: {input:{$arrayElemAt:["$$pt",0]}}}}
]
}
}}
}}
// Put the whole transformed thing into a new collection named "foo2":
,{$out: "foo2"}
]);
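In the same spirit as the "little script on the outside" idea, here is a minimal sketch (plain Node.js, no external libraries; field names are taken from the rawOutput objects earlier in the question) of building the condensed structure before importing:
// Group the flat rows by locationId and collect the per-reading fields
// into subCategories; the location-level fields are taken from the first row seen.
function groupByLocation(rawOutput) {
  const byLocation = new Map();
  for (const row of rawOutput) {
    if (!byLocation.has(row.locationId)) {
      byLocation.set(row.locationId, {
        locationId: row.locationId,
        date: row.date,
        coordinates: row.coordinates,
        neighbourhoodIndex: row.neighbourhoodIndex,
        quality: row.quality,
        district: row.district,
        distrcitIndex: row.distrcitIndex,
        street: row.street,
        neighbourhood: row.neighbourhood,
        cluster: row.cluster,
        categories: [],
        subCategories: [],
      });
    }
    const grouped = byLocation.get(row.locationId);
    // Per-reading fields go into the subCategories array
    grouped.subCategories.push({
      subCategory: row.subCategorie,
      score: row.score,
      scoreNumber: row.scoreNumber,
    });
    // Keep each category only once
    if (!grouped.categories.includes(row.categories)) {
      grouped.categories.push(row.categories);
    }
  }
  return [...byLocation.values()];
}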
Alright, in the end the code from Buzz Moschetti was exactly what I needed to get rid of the duplicated data. I hadn't heard about aggregates yet, so thanks for that.
I ended up using the CSV Parse library to convert the CSV to JSON, drop that into the database and then query out the duplicates with the code from Buzz.
I haven't written the code yet to write the cleaned-up data back to the database, but that shouldn't be too hard, so I'll just post what I have now as a reference for others.
First of all, I wrote a CSV helper for the conversion.
const fs = require('fs');
const { parse } = require('csv');
const moment = require('moment');
exports.processFile = async (filePath) => {
const records = [];
const input = fs.createReadStream(filePath);
const parser = parse({
// CSV options
bom: true,
delimiter: ';',
cast: (value, context) => {
if (context.header) return value;
// Convert data
if (context.column === 'date') {
// Moment format tokens are case-sensitive: DD = day, MM = month, YYYY = year
const dateString = moment(value, 'DD-MM-YYYY H:mm');
const date = dateString.toDate();
return date;
}
// Convert coordinates to GeoJSON
if (context.column === 'coordinates') {
const coordinate = value.split(',');
const geoData = {
type: 'Point',
coordinate: [coordinate[0], coordinate[1]],
};
return geoData;
}
// Output rest of the fields
return String(value);
},
columns: [
'locationId', // meetlocatienummer
'date', // aanmaakdatum score
'subCategory', //bestekpost
'category', // categorie
'score', // score
'coordinates', //coordinaten
undefined, // buurt
undefined, // buurtcode
undefined, // gebied
undefined, // id
'quality', //kwaliteit
undefined, // stadsdeel
'districtIndex', //stadsdlcd
'street', //straatnaam
undefined, //vaknr
'neighbourhood', //wijk
undefined, //wijkcode
'cluster', //cluster
'scoreNumber', //cijfer
undefined, // week
undefined, // maand
undefined, // jr-mnd
undefined, // jaar
],
trim: true,
from_line: 2,
skip_records_with_empty_values: true,
});
// parser.on('error', (err) => {
// console.log(err);
// const error = new Error(err);
// error.httpStatusCode = 500;
// throw error;
// });
//const transformer = transform((record, callback) => {});
input.pipe(parser).on('error', (err) => {
input.close();
});
for await (let record of parser) {
// Skip all lines without coordinates
if (record.coordinates.coordinate[1] === undefined) {
continue;
}
// Push filename to the record object
record.fileName = filePath;
// Push records for final output
records.push(record);
//console.log('Records converted');
}
return records;
};
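I call this helper from the import controller below; for clarity, a standalone usage would look roughly like this (the require path and file name are just examples):
const convert = require('../helpers/csv'); // assumed location of the helper above

(async () => {
  // Parse the uploaded CSV into an array of record objects
  const records = await convert.processFile('./uploads/readings.csv');
  console.log(`Parsed ${records.length} records`);
})();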
I'm uploading the file with the Multer library. Here's the POST action in my import data controller. After the file has been uploaded, the conversion starts. If an error occurs, the file gets deleted again and no records are written to the database. If the conversion succeeds, the records are written to importdatas in MongoDB. These are still the 'dirty' records, so lots of duplicates, but without the useless data, which gets filtered out by the CSV Parse helper (basically all the rows without coordinates).
exports.postImportData = (req, res, next) => {
const uploadedCSV = req.file;
//console.log(uploadedCSV);
// Load imported CSV files from DB to be able to delete them
ImportedCSVFile.find()
.sort({ date: -1 })
.then((result) => {
// Check if there are already files imported
let hasFiles = null;
if (result.length > 0) {
hasFiles = 1;
}
// If there are any erros with the file being uploaded
if (req.fileValidationError) {
return res.render('admin/import-data/import-data', {
pageTitle: 'Importeer data',
path: '/admin/import-data',
files: result,
activeAdmin: true,
errorMessage: req.fileValidationError,
validationErrors: [],
hasFiles,
});
}
// If there's no file uploaded
if (!uploadedCSV) {
return res.render('admin/import-data/import-data', {
pageTitle: 'Importeer data',
path: '/admin/import-data',
files: result,
activeAdmin: true,
errorMessage: 'Geen bestand geselecteerd',
validationErrors: [],
hasFiles,
});
}
(async () => {
const csvFile = await fileHelper.hasFile(uploadedCSV);
try {
const records = await convert.processFile(csvFile);
// Write all CSV data to importdatas in MongoDB
await ImportData.insertMany(records)
.then((result) => {
console.log('Data imported');
// Push info about the uploaded file into 'importedcsvfiles'
const importedCSVFile = new ImportedCSVFile({
filePath: csvFile, // path already resolved by fileHelper.hasFile above
originalName: uploadedCSV.originalname,
});
return (
importedCSVFile
.save() // Save all CSV data into 'importedcsvfiles' in MongoDB
.then((result) => {
res.redirect('/admin/import-data');
})
// Catch save filepath error
.catch((err) => {
console.log('save failed');
const error = new Error(err);
error.httpStatusCode = 500;
return next(error);
})
);
})
// Catch insert CSV data into DB error
.catch((err) => {
console.log('insert many failed');
const error = new Error(err);
error.httpStatusCode = 500;
return next(error);
});
} catch (err) {
// console.log(error);
fileHelper.removeFile(csvFile);
return res.render('admin/import-data/import-data', {
pageTitle: 'Importeer data',
path: '/admin/import-data',
files: result,
activeAdmin: true,
errorMessage:
'Het geselecteerde bestand heeft niet de juiste indeling. Neem contact op met de beheerder.',
validationErrors: [],
hasFiles,
});
}
})();
});
};
I also wrote a delete option which removes the CSV file and all the database records that are linked to that file.
exports.postDeleteData = (req, res, next) => {
const dataId = req.body.dataId;
ImportedCSVFile.findById(dataId)
.then((result) => {
// console.log('FilePath:');
// console.log(result.filePath);
const filePath = result.filePath;
const deleteData = async () => {
await ImportData.deleteMany({ filePath: filePath })
.then((result) => {})
.catch((err) => {
const error = new Error(err);
error.httpStatusCode = 500;
return next(error);
});
await ImportedCSVFile.findByIdAndDelete(dataId)
.then((result) => {
console.log('Data deleted');
fileHelper.removeFile(filePath);
res.redirect('/admin/import-data');
})
.catch((err) => {
console.log('here');
const error = new Error(err);
error.httpStatusCode = 500;
return next(error);
});
};
return deleteData();
})
.catch((err) => {
const error = new Error(err);
error.httpStatusCode = 500;
return next(error);
});
};
And for now, the aggregate code from Buzz to clean up the data and drop it into Leaflet, so I get one point with all the different categories.
const { ImportData, OutputData } = require('../models/importData.model');
// Main controller for the homepage
exports.getMainController = (req, res, next) => {
ImportData.aggregate([
{
$group: {
_id: '$coordinates',
subCategories: {
$push: {
subCategory: '$subCategory',
score: '$score',
scoreNumber: '$scoreNumber',
},
},
categories: { $push: '$category' },
// Just take the first occurrence of each of these since they are claimed
// to be the same.
date: { $first: '$date' },
quality: { $first: '$quality' },
districtIndex: { $first: '$districtIndex' },
street: { $first: '$street' },
neighbourhood: { $first: '$neighbourhood' },
cluster: { $first: '$cluster' },
},
},
]).exec((err, locations) => {
if (err) {
return next(err);
}
//console.log(locations);
res.render('index.ejs', {
pageTitle: 'Kaart',
path: '/kaart',
activeAdmin: true,
data: locations,
errorMessage: null,
});
});
};
As of now I just query this data like I said in the beginning. As I'm still learning a lot about JavaScript and Node, I've now started to build a frontend with React. Once I get going there, I'll convert all this code to an API and finish this part of the project.
Restaurants is a collection and has objects like the one below:
{
_id: new ObjectId("61723c7378b6d3a5a02d908e"),
name: 'The Blue Hotel',
location: 'Noon city, New York',
phoneNumber: '122-536-7890',
website: 'http://www.bluehotel.com',
priceRange: '$$$',
cuisines: [ 'Mexican', 'Italian' ],
overallRating: 0,
serviceOptions: { dineIn: true, takeOut: true, delivery: true },
reviews: [
{
_id: new ObjectId("61736a0f65b9931b9e428789"),
title: 'asd',
reviewer: 'khoh',
rating: 3,
dateOfReview: '5/12/2002',
review: 'hey'
},
{
_id: new ObjectId("61736a0f65b9931b9e428790"),
title: 'dom',
reviewer: 'firuu',
rating: 4,
dateOfReview: '25/1/2002',
review: ' bruh'
}
]
}
I am using the below code to find this object based on the review id provided
async get(reviewId) {
const restaurantsCollection = await restaurants();
reviewId = ObjectId(reviewId)
const r = await restaurantsCollection.findOne({reviews: {$elemMatch: {_id: reviewId}}})
return r
}
This returns the whole object from the restaurants collection. What do I do if I want only the review whose id is provided to get(reviewId) to be displayed?
Desired output:
{
_id: new ObjectId("61736a0f65b9931b9e428790"),
title: 'dom',
reviewer: 'firuu',
rating: 4,
dateOfReview: '25/1/2002',
review: ' bruh'
}
With a projection, specify the fields to return
The following returns only the review whose id is provided to get(reviewId):
async get(reviewId) {
const restaurantsCollection = await restaurants();
reviewId = ObjectId(reviewId)
const r = await restaurantsCollection.findOne(
{ reviews: { $elemMatch: { _id: reviewId } } },
{ "reviews.$": 1 }
)
return r
}
You can also use find instead of findOne.
Query
Replace the ObjectId("61736a0f65b9931b9e428789") with reviewId. This will return the reviews that match the _id, in an array. If you want only the first one, in case there is always at most 1, you can replace the last $project with
{"$project": {"_id": 0, "review": {"$arrayElemAt": ["$reviews", 0]}}}
*Not sure if this is what you need.
aggregate(
[{"$match": {"reviews._id": ObjectId("61736a0f65b9931b9e428789")}}
{"$set":
{"reviews":
{"$filter":
{"input": "$reviews",
"cond":
{"$eq": ["$$this._id", ObjectId("61736a0f65b9931b9e428789")]}}}}},
{"$project": {"_id": 0, "reviews": 1}}])
This might not be the correct answer to your question, but you can try something like this (note that findOne returns a promise, so it has to be awaited before you can read .reviews):
const doc = await restaurantsCollection.findOne({reviews: {$elemMatch: {_id: reviewId}}})
const r = doc?.reviews.find(review => review._id.equals(reviewId))
I have a crazy array that looks like this:
const data = [
[{ Name: 'Name 1', Block: [{Id: "1"}, {Id: "2"}] }],
[{ Name: 'Name 2', Block: [{Id: "3"}, {Id: "4"}] }],
]
I want to map Block to a single array to look like this:
[ { Id: '1' },
{ Id: '2' },
{ Id: '3' },
{ Id: '4' }
]
I have tried doing it like this:
const data = [
[{ Name: 'Name 1', Block: [{Id: "1"}, {Id: "2"}] }],
[{ Name: 'Name 2', Block: [{Id: "3"}, {Id: "4"}] }],
]
const idList = data.map(blockData => {
return blockData[0].Block;
});
console.log(idList)
What did I do wrong?
.map will create a new item for every index of the old array. If your input array has 2 items, the output array will also only have 2 items - but you want 4 items, so .map won't work. Use flatMap instead, to flatten:
const data = [
[{ Name: 'Name 1', Block: [{Id: "1"}, {Id: "2"}] }],
[{ Name: 'Name 2', Block: [{Id: "3"}, {Id: "4"}] }],
];
const idList = data.flatMap(([{ Block }]) => Block);
console.log(idList)
flatMap is only available in newer implementations, though - otherwise, use a polyfill or a different method, like reducing into an array:
const data = [
[{ Name: 'Name 1', Block: [{Id: "1"}, {Id: "2"}] }],
[{ Name: 'Name 2', Block: [{Id: "3"}, {Id: "4"}] }],
];
const idList = data.reduce((a, [{ Block }]) => a.concat(Block), []);
console.log(idList)
Your data is an array inside an array, so you could use map twice. But that will give you an array with 2 elements, so you then need to reduce or flatten the resulting array to get the desired output.
data.map(a => a[0].Block.map(b => b)).reduce((o, m) => [...o, ...m], [])
The most succinct way to do it would be with a reduce statement:
const reducer = (a, b) => a.concat(b[0].Block);
const idList = data.reduce(reducer, []);
This way it will be much clearer what you are trying to do.
I need to get the id of the inserted/updated record when using .upsert() in Sequelize.
Right now .upsert() returns a boolean indicating whether the row was created or updated.
return db.VenueAddress.upsert({
addressId:address.addressId,
venueId: venue.venueId,
street: address.street,
zipCode: address.zipCode,
venueAddressDeletedAt: null
}).then(function(test){
// test is returned here as true or false; how can I get the inserted id here so I can insert data in other tables using this new id?
});
I don't think that returning the upserted record was available when the OP asked this question, but it has since been implemented with this PR. As of Sequelize v4.32.1, you can pass a boolean returning as a query param to select between returning an array with the record and a boolean, or just a boolean indicating whether or not a new record was created.
You do still need to provide the id of the record you want to upsert or a new record will be created.
For example:
const [record, created] = await Model.upsert(
{ id: 1, name: 'foo' }, // Record to upsert
{ returning: true } // Return upserted record
);
I wanted upsert to return the created or updated object. It doesn't, because only PostgreSQL supports it directly, apparently.
So I created a naive implementation that will do that - probably in a non-performant way, and possibly with all sorts of race conditions:
Sequelize.Model.prototype.findCreateUpdate = function(findWhereMap, newValuesMap) {
return this.findOrCreate({
where: findWhereMap,
defaults: findWhereMap
})
.spread(function(newObj, created) {
// set:
for(var key in newValuesMap) {
newObj[key] = newValuesMap[key];
}
return newObj.save();
});
};
Usage when trying to create/update a move in a game (contrived example alert!):
models.Game
.findOne({where: {gameId: gameId}})
.then(function(game) {
return db.Move.findCreateUpdate(
{gameId: gameId, moveNum: game.moveNum+1},
{startPos: 'kr4', endPos: 'Kp2'}
);
});
This is what worked for me:
Model.upsert({
title: yourTitle, // your title
desc: yourDescription, // your description
location: yourLocation // your location(s)
}).then(function (test) {
if(test){
res.status(200);
res.send("Successfully stored");
}else{
res.status(200);
res.send("Successfully inserted");
}
})
It will check the database based on your primary key. If it finds a match it will update the data, otherwise it will create/insert a new row.
I know this is an old post, but in case this helps anyone:
const upsert = async (model: any, values: any, condition: any): Promise<any> => {
const obj = await model.findOne({ where: condition })
if (obj) {
// only do the update if a value is different from the queried object from the db
for (var key in values) {
const val = values[key]
if (parseFloat(obj[key]) !== val) {
obj.isUpdatedRecord = true
return obj.update(values)
}
}
obj.isUpdatedRecord = false
return obj
} else {
// insert
const merged = { ...values, ...condition }
return model.create(merged)
}
}
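A hypothetical usage, reusing the OP's VenueAddress model (argument names follow the helper's signature above; run inside an async function):
const row = await upsert(
  db.VenueAddress,                                          // model
  { street: address.street, zipCode: address.zipCode },     // values to write
  { addressId: address.addressId, venueId: venue.venueId }  // lookup condition
);
// isUpdatedRecord is undefined when a new row was inserted
console.log(row.id, row.isUpdatedRecord);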
It isn't using upsert, but .bulkCreate has an updateOnDuplicate parameter, which allows you to update certain fields (instead of creating a new row) in the event that the primary key already exists.
MyModel.bulkCreate(
newRows,
{
updateOnDuplicate: ["venueId", ...]
}
)
I believe this returns the resulting objects, and so I think this might enable the functionality you're looking for?
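A minimal sketch of what that could look like (MyModel and newRows are placeholders carried over from the snippet above; note the later answer about SQLite, where the ids returned this way can be unreliable):
const rows = await MyModel.bulkCreate(newRows, {
  updateOnDuplicate: ['venueId'],
});
// Each element is a model instance, so the ids can be read back directly
const ids = rows.map((row) => row.id);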
janmeier said:
This is only supported by postgres, so to keep the API consistent across dialects this is not possible.
Please see: https://github.com/sequelize/sequelize/issues/3354
I believe my solution is the most up to date, with the most minimal coding.
const SequelizeModel = require('sequelize/lib/model')
SequelizeModel.upsert = function() {
return this.findOne({
where: arguments[0].where
}).then(obj => {
if(obj) {
// return the updated instance so the caller gets the record back
return obj.update(arguments[0].defaults)
}
return this.create(arguments[0].defaults)
})
}
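Hypothetical usage with the OP's model (where finds the row, defaults holds the values to write; run inside an async function):
const venueAddress = await db.VenueAddress.upsert({
  where: { addressId: address.addressId },
  defaults: {
    venueId: venue.venueId,
    street: address.street,
    zipCode: address.zipCode,
    venueAddressDeletedAt: null,
  },
});
console.log(venueAddress.id); // id of the created or updated row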
I know this is an old post, but in case this helps anyone... you can get the returned id or any other value in this way, based on the OP's data.
var data = {
addressId:address.addressId,
venueId: venue.venueId,
street: address.street,
zipCode: address.zipCode,
venueAddressDeletedAt: null
}
const result = await db.VenueAddress.upsert(data, { returning: true });
console.log('resulttttttttttttttttt =>', result)
res.status(200).json({ message: 'Your success message', data: result[0].id});
Notice how I passed { returning: true } and get the value from the result data.
Super old, but if it helps someone:
const [city, created] = await City.upsert({
id: 5,
cityName: "Glasgow",
population: 99999,
});
created is the boolean saying whether the item was created, and in city you have the whole item, from which you can get your id.
No need for returning, and this is db-agnostic :)
The only solution for SQLite in Sequelize 6.14.0 is to query the inserted row again
I haven't found a solution that works besides a new SELECT query.
It does work in PostgreSQL however.
Presumably, this is because RETURNING was only implemented relatively recently in SQLite 3.35.0 from 2021: https://www.sqlite.org/lang_returning.html and Sequelize doesn't use that version yet.
I've tried both:
Model.upsert with returning: true: did not work on SQLite. BTW, as mentioned at: https://sequelize.org/api/v6/class/src/model.js~model#static-method-upsert returning already defaults to true now, so you don't need to pass it explicitly
Model.bulkCreate with updateOnDuplicate
In both of those cases, some dummy value is returned when the object is present, not the one that is actually modified.
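A minimal sketch of the "query again" workaround described above (using the Integer model defined in the examples below, which has a unique value column; run inside an async function):
await Integer.upsert({ value: 2, name: 'TWO' });
// Re-select by the unique key to get the row with its real id, regardless of dialect
const row = await Integer.findOne({ where: { value: 2 } });
console.log(row.id);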
Minimal runnable examples from https://cirosantilli.com/sequelize
update_on_duplicate.js
#!/usr/bin/env node
const assert = require('assert')
const path = require('path')
const { DataTypes, Sequelize } = require('sequelize')
let sequelize
if (process.argv[2] === 'p') {
sequelize = new Sequelize('tmp', undefined, undefined, {
dialect: 'postgres',
host: '/var/run/postgresql',
})
} else {
sequelize = new Sequelize({
dialect: 'sqlite',
storage: 'tmp.sqlite',
})
}
function assertEqual(rows, rowsExpect) {
assert.strictEqual(rows.length, rowsExpect.length)
for (let i = 0; i < rows.length; i++) {
let row = rows[i]
let rowExpect = rowsExpect[i]
for (let key in rowExpect) {
assert.strictEqual(row[key], rowExpect[key])
}
}
}
;(async () => {
const Integer = sequelize.define('Integer',
{
value: {
type: DataTypes.INTEGER,
unique: true, // mandatory
},
name: {
type: DataTypes.STRING,
},
inverse: {
type: DataTypes.INTEGER,
},
},
{
timestamps: false,
}
);
await Integer.sync({ force: true })
await Integer.create({ value: 2, inverse: -2, name: 'two' });
await Integer.create({ value: 3, inverse: -3, name: 'three' });
await Integer.create({ value: 5, inverse: -5, name: 'five' });
let rows
// Initial state.
rows = await Integer.findAll({ order: [['id', 'ASC']]})
assertEqual(rows, [
{ id: 1, value: 2, name: 'two', inverse: -2 },
{ id: 2, value: 3, name: 'three', inverse: -3 },
{ id: 3, value: 5, name: 'five', inverse: -5 },
])
// Update.
rows = await Integer.bulkCreate(
[
{ value: 2, name: 'TWO' },
{ value: 3, name: 'THREE' },
{ value: 7, name: 'SEVEN' },
],
{ updateOnDuplicate: ["name"] }
)
// PostgreSQL runs the desired:
//
// INSERT INTO "Integers" ("id","value","name") VALUES (DEFAULT,2,'TWO'),(DEFAULT,3,'THREE'),(DEFAULT,7,'SEVEN') ON CONFLICT ("value") DO UPDATE SET "name"=EXCLUDED."name" RETURNING "id","value","name","inverse";
//
// but "sequelize": "6.14.0" "sqlite3": "5.0.2" does not use the desired RETURNING which was only added in 3.35.0 2021: https://www.sqlite.org/lang_returning.html
//
// INSERT INTO `Integers` (`id`,`value`,`name`) VALUES (NULL,2,'TWO'),(NULL,3,'THREE'),(NULL,7,'SEVEN') ON CONFLICT (`value`) DO UPDATE SET `name`=EXCLUDED.`name`;
//
// so not sure how it returns any IDs at all, is it just incrementing them manually? In any case, those IDs are
// all wrong as they don't match the final database state. Likely RETURNING will be added at some point.
//
// * https://stackoverflow.com/questions/29063232/sequelize-upsert
// * https://github.com/sequelize/sequelize/issues/7478
// * https://github.com/sequelize/sequelize/issues/12426
// * https://github.com/sequelize/sequelize/issues/3354
if (sequelize.options.dialect === 'postgres') {
assertEqual(rows, [
{ id: 1, value: 2, name: 'TWO', inverse: -2 },
{ id: 2, value: 3, name: 'THREE', inverse: -3 },
// The 6 here seems to be because the new TWO and THREE initially take up dummy rows,
// but are finally restored to final values.
{ id: 6, value: 7, name: 'SEVEN', inverse: null },
])
} else {
assertEqual(rows, [
// These IDs are just completely wrong as mentioned at: https://github.com/sequelize/sequelize/issues/12426
// Will be fixed when one day they use RETURNING.
{ id: 4, value: 2, name: 'TWO', inverse: undefined },
{ id: 5, value: 3, name: 'THREE', inverse: undefined },
{ id: 6, value: 7, name: 'SEVEN', inverse: undefined },
])
}
// Final state.
rows = await Integer.findAll({ order: [['id', 'ASC']]})
assertEqual(rows, [
{ id: 1, value: 2, name: 'TWO', inverse: -2 },
{ id: 2, value: 3, name: 'THREE', inverse: -3 },
{ id: 3, value: 5, name: 'five', inverse: -5 },
{ id: 6, value: 7, name: 'SEVEN', inverse: null },
])
})().finally(() => { return sequelize.close() });
upsert.js
#!/usr/bin/env node
const assert = require('assert')
const path = require('path')
const { DataTypes, Sequelize } = require('sequelize')
let sequelize
if (process.argv[2] === 'p') {
sequelize = new Sequelize('tmp', undefined, undefined, {
dialect: 'postgres',
host: '/var/run/postgresql',
})
} else {
sequelize = new Sequelize({
dialect: 'sqlite',
storage: 'tmp.sqlite',
})
}
function assertEqual(rows, rowsExpect) {
assert.strictEqual(rows.length, rowsExpect.length)
for (let i = 0; i < rows.length; i++) {
let row = rows[i]
let rowExpect = rowsExpect[i]
for (let key in rowExpect) {
assert.strictEqual(row[key], rowExpect[key])
}
}
}
;(async () => {
const Integer = sequelize.define('Integer',
{
value: {
type: DataTypes.INTEGER,
unique: true,
},
name: {
type: DataTypes.STRING,
},
inverse: {
type: DataTypes.INTEGER,
},
},
{
timestamps: false,
}
);
await Integer.sync({ force: true })
await Integer.create({ value: 2, inverse: -2, name: 'two' });
await Integer.create({ value: 3, inverse: -3, name: 'three' });
await Integer.create({ value: 5, inverse: -5, name: 'five' });
let rows
// Initial state.
rows = await Integer.findAll({ order: [['id', 'ASC']]})
assertEqual(rows, [
{ id: 1, value: 2, name: 'two', inverse: -2 },
{ id: 2, value: 3, name: 'three', inverse: -3 },
{ id: 3, value: 5, name: 'five', inverse: -5 },
])
// Update.
rows = [(await Integer.upsert({ value: 2, name: 'TWO' }))[0]]
if (sequelize.options.dialect === 'postgres') {
assertEqual(rows, [
{ id: 1, value: 2, name: 'TWO', inverse: -2 },
])
} else {
// Unexpected ID returned due to the lack of RETURNING, we wanted it to be 1.
assertEqual(rows, [
{ id: 3, value: 2, name: 'TWO', inverse: undefined },
])
}
rows = [(await Integer.upsert({ value: 3, name: 'THREE' }))[0]]
if (sequelize.options.dialect === 'postgres') {
assertEqual(rows, [
{ id: 2, value: 3, name: 'THREE', inverse: -3 },
])
} else {
assertEqual(rows, [
{ id: 3, value: 3, name: 'THREE', inverse: undefined },
])
}
rows = [(await Integer.upsert({ value: 7, name: 'SEVEN' }))[0]]
if (sequelize.options.dialect === 'postgres') {
assertEqual(rows, [
{ id: 6, value: 7, name: 'SEVEN', inverse: null },
])
} else {
assertEqual(rows, [
{ id: 6, value: 7, name: 'SEVEN', inverse: undefined },
])
}
// Final state.
rows = await Integer.findAll({ order: [['value', 'ASC']]})
assertEqual(rows, [
{ id: 1, value: 2, name: 'TWO', inverse: -2 },
{ id: 2, value: 3, name: 'THREE', inverse: -3 },
{ id: 3, value: 5, name: 'five', inverse: -5 },
{ id: 6, value: 7, name: 'SEVEN', inverse: null },
])
})().finally(() => { return sequelize.close() });
package.json
{
"name": "tmp",
"private": true,
"version": "1.0.0",
"dependencies": {
"pg": "8.5.1",
"pg-hstore": "2.3.3",
"sequelize": "6.14.0",
"sql-formatter": "4.0.2",
"sqlite3": "5.0.2"
}
}
In both of those examples, we see that PostgreSQL runs the desired:
INSERT INTO "Integers" ("id","value","name") VALUES (DEFAULT,2,'TWO'),(DEFAULT,3,'THREE'),(DEFAULT,7,'SEVEN') ON CONFLICT ("value") DO UPDATE SET "name"=EXCLUDED."name" RETURNING "id","value","name","inverse";
which works due to RETURNING, but on SQLite Sequelize does not use the desired RETURNING:
INSERT INTO `Integers` (`id`,`value`,`name`) VALUES (NULL,2,'TWO'),(NULL,3,'THREE'),(NULL,7,'SEVEN') ON CONFLICT (`value`) DO UPDATE SET `name`=EXCLUDED.`name`;
Tested on Ubuntu 21.10, PostgreSQL 13.5.
Which I myself resolved as follows:
return db.VenueAddress.upsert({
addressId:address.addressId,
venueId: venue.venueId,
street: address.street,
zipCode: address.zipCode,
venueAddressDeletedAt: null
},{individualHooks: true}).then(function(test){
// note individualHooks
});