I modified my entity and created its migration using npx mikro-orm migration:create. Now when I try to apply the migration using npx mikro-orm migration:up,
I get the following error.
NotNullConstraintViolationException: alter table "user" add column "role" text check ("role" in ('admin', 'chief_editor', 'editor', 'user')) not null; - column "role" contains null values
at PostgreSqlExceptionConverter.convertException (C:\dev\nodejs\nestjs\blog-api\node_modules\@mikro-orm\postgresql\PostgreSqlExceptionConverter.js:24:24)
at PostgreSqlDriver.convertException (C:\dev\nodejs\nestjs\blog-api\node_modules\@mikro-orm\core\drivers\DatabaseDriver.js:194:54)
at C:\dev\nodejs\nestjs\blog-api\node_modules\@mikro-orm\core\drivers\DatabaseDriver.js:198:24
at processTicksAndRejections (node:internal/process/task_queues:93:5)
at Function.runSerial (C:\dev\nodejs\nestjs\blog-api\node_modules\@mikro-orm\core\utils\Utils.js:484:22)
at C:\dev\nodejs\nestjs\blog-api\node_modules\@mikro-orm\migrations\MigrationRunner.js:23:17
at PostgreSqlConnection.transactional (C:\dev\nodejs\nestjs\blog-api\node_modules\@mikro-orm\knex\AbstractSqlConnection.js:53:25)
at MigrationRunner.run (C:\dev\nodejs\nestjs\blog-api\node_modules\@mikro-orm\migrations\MigrationRunner.js:20:13)
previous error: alter table "user" add column "role" text check ("role" in ('admin', 'chief_editor', 'editor', 'user')) not null; - column "role" contains null values
at Parser.parseErrorMessage (C:\dev\nodejs\nestjs\blog-api\node_modules\pg-protocol\src\parser.ts:369:69)
at Parser.handlePacket (C:\dev\nodejs\nestjs\blog-api\node_modules\pg-protocol\src\parser.ts:188:21)
at Parser.parse (C:\dev\nodejs\nestjs\blog-api\node_modules\pg-protocol\src\parser.ts:103:30)
at TLSSocket.<anonymous> (C:\dev\nodejs\nestjs\blog-api\node_modules\pg-protocol\src\index.ts:7:48)
at TLSSocket.emit (node:events:376:20)
at TLSSocket.EventEmitter.emit (node:domain:470:12)
at addChunk (node:internal/streams/readable:311:12)
at readableAddChunk (node:internal/streams/readable:286:9)
at TLSSocket.Readable.push (node:internal/streams/readable:225:10)
at TLSWrap.onStreamRead (node:internal/stream_base_commons:192:23) {
length: 117,
severity: 'ERROR',
code: '23502',
detail: undefined,
hint: undefined,
position: undefined,
internalPosition: undefined,
internalQuery: undefined,
where: undefined,
schema: 'public',
table: 'user',
column: 'role',
dataType: undefined,
constraint: undefined,
file: 'tablecmds.c',
line: '4857',
routine: 'ATRewriteTable'
}
Why is a migration giving me this error?
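For context on why Postgres rejects the statement: the user table already contains rows, and the new role column is added with NOT NULL but no default, so the existing rows would violate the constraint immediately. One way the generated migration could be adjusted is sketched below; the class name is a placeholder and backfilling existing rows with 'user' is an assumption, not something taken from the question.
import { Migration } from '@mikro-orm/migrations';

// Sketch only: add the column as nullable, backfill existing rows, then
// enforce NOT NULL. Adjust the backfill value to whatever makes sense.
export class MigrationAddUserRole extends Migration {
  async up(): Promise<void> {
    // 1. add the column without NOT NULL so existing rows don't violate it
    this.addSql(`alter table "user" add column "role" text check ("role" in ('admin', 'chief_editor', 'editor', 'user'));`);
    // 2. backfill the rows that already exist
    this.addSql(`update "user" set "role" = 'user' where "role" is null;`);
    // 3. only then enforce the constraint
    this.addSql(`alter table "user" alter column "role" set not null;`);
  }
}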
I have a many-to-many relation.
A Coordination can have several countries.
It ends up with 3 tables: coordination, country and coordination_country_join.
@Entity('coordination')
...
@ManyToMany(() => CountryEntity)
@JoinTable({
  joinColumn: {
    name: 'event_id_w',
    referencedColumnName: 'event_id_w',
  },
  inverseJoinColumn: {
    name: 'CountryUnCode',
    referencedColumnName: 'UN_Code',
  },
})
countries: string[];
When I save my Coordination with an array of countries, it works fine, but I found a weird sequence in the SQL statements.
To update the relation (the countries, i.e. the content of coordination_country_join), it does:
INSERT INTO coordination_country_join ... all the new countries of the given coordination,
then DELETE FROM coordination_country_join all the old countries of the relation.
This does not work when I save the coordination while no country has changed, because it tries to insert a pair (countryId, coordinationId) which already exists in the coordination_country_join table.
How can I fix this issue?
Thanks
query: SELECT "CoordinationEntity"."gid" AS "CoordinationEntity_gid", "CoordinationEntity"."objectid" AS "CoordinationEntity_objectid", "CoordinationEntity"."gdacsid" AS "CoordinationEntity_gdacsid", "CoordinationEntity"."type" AS "CoordinationEntity_type", "CoordinationEntity"."name" AS "CoordinationEntity_name", "CoordinationEntity"."coordinato" AS "CoordinationEntity_coordinato", "CoordinationEntity"."requestor" AS "CoordinationEntity_requestor", "CoordinationEntity"."activation" AS "CoordinationEntity_activation", "CoordinationEntity"."spacechart" AS "CoordinationEntity_spacechart", "CoordinationEntity"."glidenumbe" AS "CoordinationEntity_glidenumbe", "CoordinationEntity"."url" AS "CoordinationEntity_url", "CoordinationEntity"."date_creat" AS "CoordinationEntity_date_creat", "CoordinationEntity"."date_close" AS "CoordinationEntity_date_close", "CoordinationEntity"."status" AS "CoordinationEntity_status", "CoordinationEntity"."comment" AS "CoordinationEntity_comment", "CoordinationEntity"."event_id_w" AS "CoordinationEntity_event_id_w", ST_AsGeoJSON("CoordinationEntity"."the_geom")::json AS "CoordinationEntity_the_geom" FROM "coordination" "CoordinationEntity" WHERE "CoordinationEntity"."gid" IN ($1) -- PARAMETERS: [36]
query: SELECT "CoordinationEntity_countries_rid"."event_id_w" AS "event_id_w", "CoordinationEntity_countries_rid"."CountryUnCode" AS "CountryUnCode" FROM "country" "country" INNER JOIN "coordination_countries_country" "CoordinationEntity_countries_rid" ON ("CoordinationEntity_countries_rid"."event_id_w" = $1 AND "CoordinationEntity_countries_rid"."CountryUnCode" = "country"."UN_Code") ORDER BY "CoordinationEntity_countries_rid"."CountryUnCode" ASC, "CoordinationEntity_countries_rid"."event_id_w" ASC -- PARAMETERS: [50]
query: START TRANSACTION
query: INSERT INTO "coordination_countries_country"("event_id_w", "CountryUnCode") VALUES ($1, $2) -- PARAMETERS: [50,4]
query failed: INSERT INTO "coordination_countries_country"("event_id_w", "CountryUnCode") VALUES ($1, $2) -- PARAMETERS: [50,4]
error: error: duplicate key value violates unique constraint "PK_622c3d328cd639f1f6deb8f3874"
at Parser.parseErrorMessage (/home/florent/dev/smcs/api/node_modules/pg-protocol/src/parser.ts:369:69)
at Parser.handlePacket (/home/florent/dev/smcs/api/node_modules/pg-protocol/src/parser.ts:188:21)
at Parser.parse (/home/florent/dev/smcs/api/node_modules/pg-protocol/src/parser.ts:103:30)
at Socket.<anonymous> (/home/florent/dev/smcs/api/node_modules/pg-protocol/src/index.ts:7:48)
at Socket.emit (events.js:375:28)
at addChunk (internal/streams/readable.js:290:12)
at readableAddChunk (internal/streams/readable.js:265:9)
at Socket.Readable.push (internal/streams/readable.js:204:10)
at TCP.onStreamRead (internal/stream_base_commons.js:188:23) {
length: 274,
severity: 'ERROR',
code: '23505',
detail: 'Key (event_id_w, "CountryUnCode")=(50, 4) already exists.',
hint: undefined,
position: undefined,
internalPosition: undefined,
internalQuery: undefined,
where: undefined,
schema: 'public',
table: 'coordination_countries_country',
column: undefined,
dataType: undefined,
constraint: 'PK_622c3d328cd639f1f6deb8f3874',
file: 'nbtinsert.c',
line: '434',
routine: '_bt_check_unique'
}
query: ROLLBACK
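One possible workaround, sketched below and not taken from this thread: compute the country diff yourself and let TypeORM's RelationQueryBuilder insert and delete only the join rows that actually changed. The import paths and the UN_Code property name on CountryEntity are assumptions.
import { getConnection } from 'typeorm';
// CoordinationEntity and CountryEntity are the entities from the question;
// the import paths below are placeholders.
import { CoordinationEntity } from './coordination.entity';
import { CountryEntity } from './country.entity';

async function setCountries(coordination: CoordinationEntity, newCountryCodes: number[]): Promise<void> {
  const relation = getConnection()
    .createQueryBuilder()
    .relation(CoordinationEntity, 'countries')
    .of(coordination);

  // countries currently stored in the join table for this coordination
  const current: CountryEntity[] = await relation.loadMany();
  const currentCodes = current.map((c) => c.UN_Code); // property name assumed

  const toAdd = newCountryCodes.filter((code) => !currentCodes.includes(code));
  const toRemove = currentCodes.filter((code) => !newCountryCodes.includes(code));

  // only the pairs that actually changed are inserted/deleted
  await relation.addAndRemove(toAdd, toRemove);
}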
These are the errors I got from the backend while validating against the model, received in the 'err' variable. If I console.log(err), it shows the following errors.
Error [ValidationError]: employees validation failed: fullName: This field is required from model, email: This field is required from model
at ValidationError.inspect (D:\Programming\MERN\CRUD\node_modules\mongoose\lib\error\validation.js:61:24)
at formatValue (internal/util/inspect.js:703:31)
at inspect (internal/util/inspect.js:272:10)
at formatWithOptions (internal/util/inspect.js:1887:40)
at Object.Console.<computed> (internal/console/constructor.js:284:10)
at Object.log (internal/console/constructor.js:294:61)
at handleValidationError (D:\Programming\MERN\CRUD\routes\/employeeRoute.js:46:13)
at D:\Programming\MERN\CRUD\routes\/employeeRoute.js:33:17
at D:\Programming\MERN\CRUD\node_modules\mongoose\lib\model.js:4915:16
at D:\Programming\MERN\CRUD\node_modules\mongoose\lib\helpers\promiseOrCallback.js:16:11
at D:\Programming\MERN\CRUD\node_modules\mongoose\lib\model.js:4938:21
at D:\Programming\MERN\CRUD\node_modules\mongoose\lib\model.js:492:16
at D:\Programming\MERN\CRUD\node_modules\kareem\index.js:246:48
at next (D:\Programming\MERN\CRUD\node_modules\kareem\index.js:167:27)
at next (D:\Programming\MERN\CRUD\node_modules\kareem\index.js:169:9)
at Kareem.execPost (D:\Programming\MERN\CRUD\node_modules\kareem\index.js:217:3) {
errors: {
fullName: MongooseError [ValidatorError]: This field is required from model
at new ValidatorError (D:\Programming\MERN\CRUD\node_modules\mongoose\lib\error\validator.js:29:11)
at validate (D:\Programming\MERN\CRUD\node_modules\mongoose\lib\schematype.js:1178:13)
at D:\Programming\MERN\CRUD\node_modules\mongoose\lib\schematype.js:1161:7
at Array.forEach (<anonymous>)
at SchemaString.SchemaType.doValidate (D:\Programming\MERN\CRUD\node_modules\mongoose\lib\schematype.js:1106:14)
at D:\Programming\MERN\CRUD\node_modules\mongoose\lib\document.js:2378:18
at processTicksAndRejections (internal/process/task_queues.js:79:11) {
properties: [Object],
kind: 'required',
path: 'fullName',
value: '',
reason: undefined,
[Symbol(mongoose:validatorError)]: true
},
email: MongooseError [ValidatorError]: This field is required from model
at new ValidatorError (D:\Programming\MERN\CRUD\node_modules\mongoose\lib\error\validator.js:29:11)
at validate (D:\Programming\MERN\CRUD\node_modules\mongoose\lib\schematype.js:1178:13)
at D:\Programming\MERN\CRUD\node_modules\mongoose\lib\schematype.js:1161:7
at Array.forEach (<anonymous>)
at SchemaString.SchemaType.doValidate (D:\Programming\MERN\CRUD\node_modules\mongoose\lib\schematype.js:1106:14)
at D:\Programming\MERN\CRUD\node_modules\mongoose\lib\document.js:2378:18
at processTicksAndRejections (internal/process/task_queues.js:79:11) {
properties: [Object],
kind: 'required',
path: 'email',
value: '',
reason: undefined,
[Symbol(mongoose:validatorError)]: true
}
},
_message: 'employees validation failed'
}
Now if I console.log(err.errors.fullName.message), it works perfectly.
But if there are more keys, it's obvious that we need to use a loop. So if I try to iterate through the keys of the object as follows, it throws the error below.
for(field in err.errors){
.......
.......
}
Error is:
ReferenceError: field is not defined
What's the problem here?
First of all, your backend should not return this kind of error response. It should send back a JSON response of errors, e.g.
{ success: false, errors: [{ fullName: "fullName is required" }] }
Coming to your question: if you want to display the errors properly, you can loop over the validation errors exposed on err.errors, e.g.
Object.values(err.errors).forEach(error => console.log(error.message))
But again, this is the wrong way to surface validation errors. To validate incoming data you can use joi, request-validator, validatorjs, etc., which will give you exactly the kind of error response you want. I am stressing this because the current response exposes your entire directory structure, file names, model names, etc.
I hope this helps you solve your problem.
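For reference, a minimal sketch of looping over the errors object from the log above; the helper name and the minimal err type are assumptions.
// Sketch: collect the messages from a Mongoose validation error like the one above.
function collectValidationErrors(err: { errors: Record<string, { message: string }> }): Record<string, string> {
  const messages: Record<string, string> = {};
  // Declaring the loop variable with const also avoids the
  // "ReferenceError: field is not defined" from the question, which occurs
  // when the file runs in strict mode and the variable was never declared.
  for (const field in err.errors) {
    messages[field] = err.errors[field].message; // e.g. "This field is required from model"
  }
  return messages; // e.g. { fullName: '...', email: '...' }
}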
I'm working on a project about renting houses and apartments, and I've reached the point where I need to implement filtering houses based on the features they have (wifi, security and other stuff). In the beginning I decided to try the Sequelize ORM for the first time. Things like adding, creating and editing work fine, but the filtering part is where I have some problems.
I'm working with Node.js, Express and PostgreSQL.
I need to find all houses that have the features listed in an array of feature IDs. Here is what I've tried. In this example I'm trying to get houses which have the features with ids 1, 2 and 4.
db.House.findAll({
  include: [{
    model: db.HouseFeature,
    as: 'HouseFeatures',
    where: {
      featureId: {
        [Op.contains]: [1, 2, 4] // <- array of feature ids
      }
    }
  }]
})
Fetching houses by a single feature id works fine because I don't use Op.contains there.
Here are some relations related to this case:
House.hasMany(models.HouseFeature, { onDelete: 'CASCADE' });
HouseFeature.belongsTo(models.House);
HouseFeature contains a featureId field.
Here is the error I get:
error: operator does not exist: integer #> unknown
at Connection.parseE (C:\***\server\node_modules\pg\lib\connection.js:601:11)
at Connection.parseMessage (C:\***\server\node_modules\pg\lib\connection.js:398:19)
at Socket.<anonymous> (C:\***\server\node_modules\pg\lib\connection.js:120:22)
at Socket.emit (events.js:182:13)
at addChunk (_stream_readable.js:283:12)
at readableAddChunk (_stream_readable.js:264:11)
at Socket.Readable.push (_stream_readable.js:219:10)
at TCP.onStreamRead [as onread] (internal/stream_base_commons.js:94:17)
name: 'error',
length: 397,
severity: 'ERROR',
code: '42883',
detail: undefined,
hint:
'No operator matches the given name and argument types. You might need to add explicit type casts.',
position: '851',
internalPosition: undefined,
internalQuery: undefined,
where: undefined,
schema: undefined,
table: undefined,
column: undefined,
dataType: undefined,
constraint: undefined,
file:
'd:\\pginstaller.auto\\postgres.windows-x64\\src\\backend\\parser\\parse_oper.c',
line: '731',
routine: 'op_error',
sql:
'SELECT "House"."id", "House"."title", "House"."description", "House"."price", "House"."address", "House"."lat", "House"."lon", "House"."kitchen", "House"."bathrooms", "House"."floor", "House"."totalFloors", "House"."people", "House"."area", "House"."bedrooms", "House"."trusted", "House"."createdAt", "House"."updatedAt", "House"."CityId", "House"."ComplexId", "House"."OwnerProfileId", "House"."HouseTypeId", "House"."RentTypeId", "HouseFeatures"."id" AS "HouseFeatures.id", "HouseFeatures"."featureId" AS "HouseFeatures.featureId", "HouseFeatures"."createdAt" AS "HouseFeatures.createdAt", "HouseFeatures"."updatedAt" AS "HouseFeatures.updatedAt", "HouseFeatures"."HouseId" AS "HouseFeatures.HouseId" FROM "Houses" AS "House" INNER JOIN "HouseFeatures" AS "HouseFeatures" ON "House"."id" = "HouseFeatures"."HouseId" AND "HouseFeatures"."featureId" #> \'1,2\';'
Sorry, some of the error output was originally in Russian.
UPDATE:
I've managed to do what I needed by changing the model so that each House relates to only one HouseFeature, and by changing that HouseFeature model to store an array of feature ids. Op.contains works fine now.
db.House.findAll({
  include: [{
    model: db.HouseFeature,
    as: 'HouseFeature',
    where: {
      features: {
        [Op.contains]: req.body.features
      }
    },
  }]
})

// Associations
HouseFeature.belongsTo(models.House);
House.hasOne(models.HouseFeature, { onDelete: 'CASCADE' });

const HouseFeature = sequelize.define('HouseFeature', {
  features: {
    type: DataTypes.ARRAY(DataTypes.INTEGER)
  }
}, {});
Now I have one small issue: can I somehow link the HouseFeature model with the Feature model, so I can later fetch the feature icon images and names, given that the Feature ids are stored inside the HouseFeature array?
Please check the difference between Op.in and Op.contains:
[Op.in]: [1, 2], // IN [1, 2]
[Op.contains]: [1, 2] // #> [1, 2] (PG array contains operator)
It looks like HouseFeatures.featureId is a plain integer column, not a Postgres array.
Please try:
db.House.findAll({
  include: [{
    model: db.HouseFeature,
    as: 'HouseFeatures',
    where: {
      featureId: {
        [Op.in]: [1, 2, 3]
      }
    }
  }]
})
or even
db.House.findAll({
  include: [{
    model: db.HouseFeature,
    as: 'HouseFeatures',
    where: {
      featureId: [1, 2, 3]
    }
  }]
})
instead
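As for the follow-up question in the UPDATE (linking HouseFeature to the Feature model while the feature ids live in an array), one simple approach is a second lookup query rather than a Sequelize association. A sketch, assuming a Feature model with id, name and icon columns, which are not shown in the question:
import { Op } from 'sequelize';

// Sketch: `db` is the same models object used in the question; db.Feature and
// its columns (id, name, icon) are assumptions.
async function getFeaturesForHouse(db: any, houseId: number) {
  const houseFeature = await db.HouseFeature.findOne({ where: { HouseId: houseId } });
  if (!houseFeature) return [];

  // resolve the Feature rows whose ids appear in the stored array
  return db.Feature.findAll({
    where: { id: { [Op.in]: houseFeature.features } },
    attributes: ['id', 'name', 'icon'],
  });
}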
I am trying to create the following type in PostgreSQL using the pg Node.js package. I have written a function that queries the pool and attempts to create this type as follows:
return pool.query(
  `
  CREATE TYPE grade_sheet AS (
    subjectName VARCHAR(100),
    teacherName VARCHAR(100),
    uti VARCHAR(32),
    markAllocated REAL CHECK (markAllocated >= 0.0 AND markAllocated <= 100.00),
    markObtained REAL CHECK (markObtained >= 0.0 AND markObtained <= 100.00),
    gradeObtained CHAR(2),
    dateTaken TIMESTAMP
  );
  `
);
When I try to run the script, I get the following syntax error:
{ error: syntax error at or near "CHECK"
at Connection.parseE (/home/zerocool/myschool/node_modules/pg/lib/connection.js:554:11)
at Connection.parseMessage (/home/zerocool/myschool/node_modules/pg/lib/connection.js:379:19)
at Socket.<anonymous> (/home/zerocool/myschool/node_modules/pg/lib/connection.js:119:22)
at Socket.emit (events.js:127:13)
at addChunk (_stream_readable.js:269:12)
at readableAddChunk (_stream_readable.js:256:11)
at Socket.Readable.push (_stream_readable.js:213:10)
at TCP.onread (net.js:590:20)
name: 'error',
length: 95,
severity: 'ERROR',
code: '42601',
detail: undefined,
hint: undefined,
position: '195',
internalPosition: undefined,
internalQuery: undefined,
where: undefined,
schema: undefined,
table: undefined,
column: undefined,
dataType: undefined,
constraint: undefined,
file: 'scan.l',
line: '1087',
routine: 'scanner_yyerror' }
Constraints cannot be used in types, but they can in domains. Domains, however, cannot have multiple attributes. You can solve your problem by using both:
create a domain including your check constraint
create a type and use the domain
It could look like:
CREATE DOMAIN grade_sheet_real AS real
  CHECK (value >= 0.0 AND value <= 100.00);

CREATE TYPE grade_sheet AS (
  subjectname varchar(100),
  teachername varchar(100),
  uti varchar(32),
  markallocated grade_sheet_real,
  markobtained grade_sheet_real,
  gradeobtained char(2),
  datetaken timestamp
);
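Tying this back to the question's setup, the two statements could be run with the same pg pool, roughly as sketched below; the pool is assumed to be the one already configured in the question.
import { Pool } from 'pg';

// Sketch: create the domain first, then the composite type that uses it.
async function createGradeSheetType(pool: Pool): Promise<void> {
  await pool.query(`
    CREATE DOMAIN grade_sheet_real AS real
      CHECK (value >= 0.0 AND value <= 100.00);
  `);
  await pool.query(`
    CREATE TYPE grade_sheet AS (
      subjectname varchar(100),
      teachername varchar(100),
      uti varchar(32),
      markallocated grade_sheet_real,
      markobtained grade_sheet_real,
      gradeobtained char(2),
      datetaken timestamp
    );
  `);
}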
For my test suite, I want to bulkWrite test data into the database, and then bulk delete any test data entered throughout the tests to come back to a clean slate. I do so by running a bulkWrite on the db, to which I pass the content of a JSON file loaded via Node.js's require statement.
The problem is that for the dataset
[ { deleteOne: { username: 'test-author' } } ]
passed to models[key].collection.bulkWrite(action[key]), where key is the name of the model of interest and action is the JSON file, I get the following error:
{ MongoError: Wrong type for 'q'. Expected a object, got a null.
at Function.MongoError.create (/var/www/website/server/node_modules/mongodb-core/lib/error.js:31:11)
at /var/www/website/server/node_modules/mongodb-core/lib/connection/pool.js:483:72
at authenticateStragglers (/var/www/website/server/node_modules/mongodb-core/lib/connection/pool.js:429:16)
at Connection.messageHandler (/var/www/website/server/node_modules/mongodb-core/lib/connection/pool.js:463:5)
at Socket.<anonymous> (/var/www/website/server/node_modules/mongodb-core/lib/connection/connection.js:339:20)
at emitOne (events.js:96:13)
at Socket.emit (events.js:188:7)
at readableAddChunk (_stream_readable.js:176:18)
at Socket.Readable.push (_stream_readable.js:134:10)
at TCP.onread (net.js:548:20)
name: 'MongoError',
message: 'Wrong type for \'q\'. Expected a object, got a null.',
ok: 0,
errmsg: 'Wrong type for \'q\'. Expected a object, got a null.',
code: 14,
codeName: 'TypeMismatch' }
I have done some research but have been unable to find a solution to this problem. The error itself is pretty opaque, so I can't get much out of it. Any idea how to solve the problem?
Any help would be greatly appreciated!
Cheers!
As per the MongoDB API, the deleteOne, deleteMany, updateOne, updateMany, and replaceOne operations require a filter property which acts as the filter for the query.
However, Mongoose's API documentation shows the following (mistaken) example:
Character.bulkWrite([
  ...
  {
    deleteOne: {
      { name: 'Eddard Stark' }
    }
  }
]).then(handleResult);
Hence, the data sent over changes from:
[{
  "deleteOne": { "username": "test-author" }
}]
to
[{
  "deleteOne": { "filter": { "username": "test-author" } }
}]
I'll make sure to pass the message along to the mongoosejs dev group.
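For completeness, the corrected cleanup call from the question would then look something like the sketch below; the 'users' key and model name are hypothetical, and models mirrors the map of Mongoose models described in the question.
// Sketch: typed structurally just to keep the example self-contained.
type ModelLike = { collection: { bulkWrite(ops: any[]): Promise<unknown> } };

async function cleanUp(models: Record<string, ModelLike>): Promise<void> {
  const action: Record<string, any[]> = {
    // hypothetical model name; in the question this comes from a JSON file
    users: [{ deleteOne: { filter: { username: 'test-author' } } }],
  };

  for (const key of Object.keys(action)) {
    await models[key].collection.bulkWrite(action[key]);
  }
}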