Mongoose 6.6.1, MongoDB 6.0.1
Our team uses Mongo/Mongoose every day - but just the basics. I experiment with more advanced features, so I got a reputation as the local "Mongoose Guru" - but in the real world, I'm so not.
I clean up our log tables with a deleteMany for docs older than a week - no problem - simple filter.
But now I just want to keep the last 1000 docs & delete the rest - assumed it would be trivial, but how?
Mongoose 6.6.1 documents the deleteMany() method for two object types: Model & Query
MyModel.deleteMany([filter], [options]) calls the deleteMany method on the Model -
MyModel.find().deleteMany([filter], [options]) calls it on the Query.
Slightly different syntax
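For example (a trivial sketch using a hypothetical Log model, not from the docs):

// Model form: the filter goes straight into deleteMany
await Log.deleteMany({ level: 'debug' });

// Query form: deleteMany is chained onto an existing query and
// picks up that query's filter conditions
await Log.find({ level: 'debug' }).deleteMany();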
Since queries are chainable, my first assumption was that query.deleteMany() WITHOUT ANY ARGUMENTS would only delete the documents that were already filtered by the previous query. I played with both skip & limit - similar results.
MyModel.find() is a Query that matches all docs - so deleteMany removes them all
MyModel.find().limit(10) matches 10 docs.
Logically I assumed MyModel.find().limit(10).deleteMany(); should delete only those 10
BUT - that generates an error:
MongoServerError: The limit field in delete objects must be 0 or 1. Got 10
I can hack around that, but it's messy and I like elegant - am I missing something about deleteMany? (From what I can tell, the underlying MongoDB delete command only accepts a per-statement limit of 0 or 1, which is why the Query's limit can't just pass through.)
Just in case anyone is interested, the hack:
async function keepLast(myModel, n = 10) {
  // Newest-first by _id: skip the n docs to keep, landing on the
  // newest doc that should go.
  const edge = await myModel.findOne().sort({ _id: -1 }).skip(n);
  if (!edge) return null; // fewer than n docs - nothing to delete
  // Delete the edge doc and everything older.
  return myModel.deleteMany({ _id: { $lte: edge._id } });
}
I'd like to think there was a better way that would also give more insight...
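One alternative I can think of (an untested sketch - keepLastAlt and the two-step shape are mine, not from any docs): grab the _ids of the n newest docs first, then delete everything not in that set.

async function keepLastAlt(myModel, n = 10) {
  // Collect the _ids of the n newest docs (newest-first by _id)
  const keep = await myModel.find()
    .sort({ _id: -1 })
    .limit(n)
    .select('_id')
    .lean();
  // Delete everything whose _id is not in the keep-list
  return myModel.deleteMany({ _id: { $nin: keep.map((d) => d._id) } });
}

Either way it's two round trips, and both versions assume _id order tracks insertion order (true for default ObjectIds).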
The database is in the Azure cloud and is not currently used in production. There are 80,000 rows, and the uprn column is a VARCHAR(100).
I'm already validating each UPRN with Joi as well.
I'm using Knex with a SQL Server database and the following whereIn query:
knex(LOCATIONS.table).whereIn(LOCATIONS.uprn, req.body.uprns)
but this takes 8-12 s to complete and sometimes times out. If I take the output of .toQuery() and run it directly, SSMS returns the result within 1-2 s.
If I do a raw query, the resulting .toQuery() or toString() output works in SSMS and returns results. But if I try to run the raw query directly, it returns 0 results.
I'm looking to either fix what's making whereIn so slow or get the raw query working.
EDIT 1:
After much debugging and testing, it seems the problem comes from how Knex handles arrays, so I wrote a for-of loop that adds a ? placeholder for each array element and then passed the array as the parameters.
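Roughly like this (a sketch of that workaround - the locations table name is my assumption):

const placeholders = req.body.uprns.map(() => '?').join(', ');
const result = await knex.raw(
  `SELECT * FROM locations WHERE uprn IN (${placeholders})`,
  req.body.uprns // one binding per placeholder
);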
This led me to realise that the performance issue comes from the way SQL Server parameterises queries.
I ended up building a raw query string with all of the parameters inlined, validating the input with the following Joi string/regex config:
Joi.string()
  .min(1)
  .max(35)
  .regex(/^[a-z\d\-_\s]+$/i)
allowing only alphanumeric characters, dashes, underscores and spaces, which should prevent SQL injection.
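Put together, the final approach looks roughly like this (a sketch, assuming the Joi config above is assigned to uprnSchema; the locations table name is my assumption):

// Validate every UPRN up front with the schema above
const validated = req.body.uprns.map((u) => {
  const { error, value } = uprnSchema.validate(u);
  if (error) throw error;
  return value;
});

// Inline the validated values into a raw query string
const inList = validated.map((u) => `'${u}'`).join(', ');
const rows = await knex.raw(`SELECT * FROM locations WHERE uprn IN (${inList})`);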
I'm going to look deeper into the security implications of this, and I might create a separate login that can only SELECT data from that table and nothing more, to run these queries with.
Needed to just handle it raw and validate separately.
I'm able to generate queries for multi-row inserts or updates thanks to the pg-promise helpers, but I was wondering if I could follow the author's advice and keep all queries outside of my JavaScript code (see https://github.com/vitaly-t/pg-promise/wiki/SQL-Files and https://github.com/vitaly-t/pg-promise-demo).
When I use the insert helpers, the resulting query looks like:
INSERT INTO "education"("candidate_id","title","content","degree","school_name","start_date","still_in","end_date","picture_url") VALUES('6','My degree','Business bachelor','Bachelor +','USC','2018-05-15T02:00:00.000+02:00'::date,false,null::date,null),('6','Another degree','Engineering','Master degree','City University','2018-05-15T02:00:00.000+02:00'::date,false,null::date,null)
The idea is that I don't know how many inserts I want to do at the same time, so it has to be dynamic.
The following code doesn't work, as I'm passing an array of objects instead of a single object:
db.none(`INSERT INTO "education"("candidate_id","title","content","degree","school_name","start_date","still_in","end_date","picture_url")
VALUES($<candidate_id>, $<title>, $<content>, $<degree>, $<school_name>, $<start_date>, $<still_in>, $<end_date>, $<picture_url>)`, data)
This code spreads the object, but still doesn't produce a proper query:
db.none(`INSERT INTO "education"("candidate_id","title","content","degree","school_name","start_date","still_in","end_date","picture_url")
VALUES($1:list)`,
[data])
Any ideas? Is it even possible, or, when I don't know in advance how many records I want to insert, do I have to call pgp.helpers every time?
You are confusing static and dynamic SQL. SQL files are there for queries that are mostly static, i.e. you can still inject a lot dynamically, but when most of the query is dynamic, there is no longer any point in putting it into an SQL file.
And the helpers namespace is there for dynamic queries only. So you are asking how to join two things that do not need to be joined.
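For the multi-row insert itself, the helpers approach stays fully dynamic. A rough sketch using the columns from the question (db and data are the question's names; the rest is illustrative):

const pgp = require('pg-promise')();

// Reusable column definition for the education table
const cs = new pgp.helpers.ColumnSet(
  ['candidate_id', 'title', 'content', 'degree', 'school_name',
    'start_date', 'still_in', 'end_date', 'picture_url'],
  { table: 'education' }
);

// data is an array of row objects; a single INSERT statement is
// generated no matter how many rows it contains
const query = pgp.helpers.insert(data, cs);
await db.none(query);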
I'm using Objection.js as my ORM for a simple rainfall application. I need to dynamically update an entry in one table when a lower-level table's entries have been updated. To do this I need the whole entry I'm updating, so I can use that data to correctly update the dynamically updated entry.
I'm using the $afterUpdate hook for the lower-level table entry. The issue I'm having is that when I log this within the $afterUpdate hook function, it only contains the properties for the parts of the entry I want to update. How can I get the entire entry? I'm sure I could fetch the record with an additional query to the DB, but I was hoping there would be a way to avoid this. Any help would be appreciated.
I think, as of right now, you can only get the whole model with an extra query.
If you are doing the update with an instance query ($query) you can get the other properties from options.old.
Query:
const user = await User.query().findById(userId);
await user.$query()
  .patch({ name: 'Tom Jane' });
Hook:
$afterUpdate(opt, queryContext) {
  console.log(opt.old)
}
Patch:
If you don't need to do this in the hook, you might want to use the patch function chained with .first().returning('*') to get the whole model in a single query; it's more efficient than patchAndFetchById in PostgreSQL, as stated in the documentation:
Because PostgreSQL (and some others) support returning('*') chaining, you can actually insert a row, or update / patch / delete (an) existing row(s), and receive the affected row(s) as Model instances in a single query, thus improving efficiency. See the examples for more clarity.
const jennifer = await Person
  .query()
  .patch({ firstName: 'Jenn', lastName: 'Lawrence' })
  .where('id', 1234)
  .returning('*')
  .first();
References:
http://vincit.github.io/objection.js/#postgresql-quot-returning-quot-tricks
https://github.com/Vincit/objection.js/issues/185
https://github.com/Vincit/objection.js/issues/695
I have the current design in MySQL:
Table filesubject
Is there a way in Kohana to set up the relationships so that if I run something like
ORM::factory('filesubject')->where('file_id', '=', $file->id)->find_all()->as_array();
I get all the joins from the other tables?
I'm not sure about your question. To automatically join models, first set up your relationships ($_belongs_to etc.) and then look at:
In your model:
ORM property: $_load_with, e.g. protected $_load_with = array('model1', 'model2');
Or at run time:
ORM method: with(), e.g. ORM::factory('filesubject')->with('model1')->with('model2')->find_all()
I don't think the as_array() function pulls in the joined data, though. Once it's actually performing the join, you'd need to override as_array() (or write your own function) to output the nested key/value pairs from the joined properties.