Sequelize - Update table that has trigger on it - not working - node.js

I am kind of new to Sequelize and Node.js.
I am trying to use Sequelize with an MSSQL database and figure out what I can do with it.
I've established the connection and created a model based on an existing table.
That table has multiple triggers on it.
When I try to execute something like this:
sampletable.update({
  NAME: "TEST"
}, {
  where: { ID: 0 },
  silent: true
}).then(function (result) {
  console.log(result);
});
where "sampletable" is an imported model
var sampletable = sequelize.import('./models/sampletable.js');
that was created with with "SequelizeAuto" (based on existing table's structure)
var SequelizeAuto = require('sequelize-auto')
var auto = new SequelizeAuto(config.database, config.username, config.password, config);
auto.run(function (err) {
if (err) throw err;
});
I get the following error:
"Unhandled rejection SequelizeDatabaseError: The target table 'sampletable' of the DML statement cannot have any enabled triggers if the statement contains an OUTPUT clause without INTO clause."
Statement executed:
Executing (default): UPDATE [sampletable] SET [NAME]=N'test' OUTPUT INSERTED.* WHERE [id] = 0
Is it possible to update a table with triggers using Sequelize?
If yes, can anybody point me in the right direction?
I've googled and checked the docs, but I can't find much about it.
TIA

Try adding hasTrigger: true to your model options. This prevents Tedious from trying to output the results from the base table, and uses a temp table instead.
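For example, a minimal sketch of a definition with that option set. The column types and connection placeholders here are assumptions; with a sequelize-auto generated model you would add hasTrigger: true to the options object in the generated file instead.
// Sketch only: hasTrigger tells the MSSQL dialect to route the OUTPUT clause
// through a temp table, which works on tables with enabled triggers.
const Sequelize = require('sequelize');
const sequelize = new Sequelize('database', 'username', 'password', {
  dialect: 'mssql',
  host: 'localhost'   // placeholder connection settings
});

const sampletable = sequelize.define('sampletable', {
  ID: { type: Sequelize.INTEGER, primaryKey: true },   // assumed column types
  NAME: Sequelize.STRING
}, {
  tableName: 'sampletable',
  timestamps: false,
  hasTrigger: true   // the relevant option
});

sampletable.update({ NAME: 'TEST' }, { where: { ID: 0 } })
  .then(function (result) {
    console.log(result);
  });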

Related

Sequelize table name updates

I have a table in Postgres and am using Sequelize as my ORM. I need to change the table name, so I've tried the migration below:
queryInterface.renameTable('table1', 'table2', { transaction })
But for some reason it creates a new table, table2, with the same data as table1, while table1 still exists with empty data. Is this the correct behavior of this function? If so, I'll add a delete query.
This works
queryInterface.renameTable('OldTableName', 'NewTableName');
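For reference, a minimal migration sketch built around that call; the table names are placeholders and the surrounding file structure is assumed.
'use strict';

// Minimal migration sketch: rename the table in `up` and revert it in `down`.
module.exports = {
  up: async (queryInterface) => {
    await queryInterface.renameTable('OldTableName', 'NewTableName');
  },
  down: async (queryInterface) => {
    await queryInterface.renameTable('NewTableName', 'OldTableName');
  }
};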
For that use case I normally rely on raw queries:
const { QueryTypes } = require('sequelize');
const { sequelize } = queryInterface;
await sequelize.query(
  `ALTER TABLE table1
   RENAME TO table2`,
  {
    type: QueryTypes.RAW,
    raw: true,
    transaction,
  },
);

When updating Sequelize model, fails and returns different table name than I provided?

I'm running PostgreSQL 11 and attempting to update one of its table entries using Sequelize.
My Node.js code:
require('env2')('.env');
const Sequelize = require('sequelize');
const sequelize = new Sequelize(process.env.database, process.env.username, process.env.password, {
  dialect: 'postgres'
});
const Agent = sequelize.define('agent');
const updateValues = { available: false };
Agent.update(updateValues, { where: { phonenumber: '+18005551212' } }).then(result => { console.log(result); });
The table agent has the structure:
id primary key serial,
phonenumber varchar(100),
available boolean
When I run the Node.js code, I get this error:
Executing (default): UPDATE "agents" SET "updatedAt"='2018-12-27 10:16:54.504 +00:00' WHERE "phonenumber" = '+18005551212'
Unhandled rejection SequelizeDatabaseError: relation "agents" does not exist
Why is this update failing? I don't understand why the error is talking about the relation "agents" when I provided the table name as "agent" in sequelize.define('agent').
The update is successful when I use raw SQL as follows:
sequelize.query("update agent set available=false where phonenumber='+18005551212'").then(result=>{
console.log(result);
});
By default Sequelize creates a table with the plural of its definition name, so when you do sequelize.define('agent') it actually creates a table named agents. If you don't want your table name pluralized, you can use the freezeTableName: true option in the model definition. More can be found in this answer; refer to the Sequelize configuration docs.
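For illustration, a minimal sketch reusing the sequelize instance from the question. The column types are assumed from the table structure shown above, and timestamps: false is an added assumption since the table has no createdAt/updatedAt columns.
// Sketch: keep the table name exactly as defined ("agent") and skip the
// createdAt/updatedAt columns, which the table does not have.
const Agent = sequelize.define('agent', {
  phonenumber: Sequelize.STRING,    // assumed types, matching the question's structure
  available: Sequelize.BOOLEAN
}, {
  freezeTableName: true,
  timestamps: false
});

Agent.update({ available: false }, { where: { phonenumber: '+18005551212' } })
  .then(result => console.log(result));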

Fetch data from a table using Sequelize in Node.js

I am new to Node.js. I want to fetch data from a PostgreSQL table using the Sequelize ORM. I am using the code below, but it's not working.
const apisListModel = sequelize.define('apisList', {});
apisListModel.findAll().then((data) => {
  console.log(JSON.stringify(data, undefined, 2));
}, (e) => {
  console.log(e);
});
It gives me this error:
Executing (default): SELECT "id", "createdAt", "updatedAt" FROM "apisLists" AS "apisList";
{ SequelizeDatabaseError: relation "apisLists" does not exist
But the apiList table exists in my database.
Sequelize implicitly pluralizes the table name. You can define the options as:
const apisListModel = sequelize.define('apisList', {}, {
tableName: 'apisList'
})
The query is trying to fetch data from apisLists, but your table name is apiList. Note the "s" in the table name.
If this is the case, you can specify the table name using the option
tableName: 'apiList'
Here is the documentation link.
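Put together, that would look something like the sketch below, assuming the physical table really is named apiList:
// Sketch: point the model at the existing table name explicitly.
const apisListModel = sequelize.define('apisList', {}, {
  tableName: 'apiList'   // match the existing table name exactly
});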

Query condition missed key schema element : Validation Error

I am trying to query DynamoDB using the following code:
const AWS = require('aws-sdk');
let dynamo = new AWS.DynamoDB.DocumentClient({
  service: new AWS.DynamoDB({
    apiVersion: "2012-08-10",
    region: "us-east-1"
  }),
  convertEmptyValues: true
});
dynamo.query({
  TableName: "Jobs",
  KeyConditionExpression: 'sstatus = :st',
  ExpressionAttributeValues: {
    ':st': 'processing'
  }
}, (err, resp) => {
  console.log(err, resp);
});
When I run this, I get an error saying:
ValidationException: Query condition missed key schema element: id
I do not understand this. I have defined id as the partition key for the Jobs table, and I need to find all the jobs that are in "processing" status.
You're trying to run a query using a condition that does not include the primary key. This is how queries work in DynamoDB: you would need to do a scan for the info in your case; however, I don't think that is the best option.
I think you want to set up a global secondary index and use that to query for the processing status.
In another answer #smcstewart responded to this question, but he provides a link instead of explaining why this error occurs. I want to add a brief comment hoping it will save you time.
The AWS docs on Querying a Table state that you can do WHERE-condition queries (e.g. the SQL query SELECT * FROM Music WHERE Artist='No One You Know') the DynamoDB way, but with one important caveat:
You MUST specify an EQUALITY condition for the PARTITION key, and you can optionally provide another condition for the SORT key.
Meaning you can only use key attributes with Query. Doing it any other way would mean that DynamoDB would run a full scan for you, which is NOT efficient: less efficient than using global secondary indexes.
So if you need to query on non-key attributes, using Query is usually NOT an option; the best option is using Global Secondary Indexes, as suggested by #smcstewart.
I found this guide to be useful for creating a global secondary index manually.
If you need to add it using CloudFormation, here is a relevant page.
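For reference, a rough sketch of adding such an index to the existing table with the AWS SDK for JavaScript. The index name and throughput values are illustrative assumptions, and the throughput settings assume a provisioned-capacity table.
// Sketch: add a global secondary index on "sstatus" to the existing Jobs table.
const AWS = require('aws-sdk');
const dynamodb = new AWS.DynamoDB({ apiVersion: '2012-08-10', region: 'us-east-1' });

dynamodb.updateTable({
  TableName: 'Jobs',
  AttributeDefinitions: [
    { AttributeName: 'sstatus', AttributeType: 'S' }
  ],
  GlobalSecondaryIndexUpdates: [{
    Create: {
      IndexName: 'sstatus-index',                                  // hypothetical index name
      KeySchema: [{ AttributeName: 'sstatus', KeyType: 'HASH' }],
      Projection: { ProjectionType: 'ALL' },
      ProvisionedThroughput: { ReadCapacityUnits: 5, WriteCapacityUnits: 5 }
    }
  }]
}, (err, data) => {
  console.log(err, data);
});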
I was getting this error in a different scenario. Here it is (it's very unlikely that anyone else ends up with this case, but just in case):
I had a query working on a table (say Table A). Table A had partition key m_id and sort key u_id.
I had a query to fetch data using m_id. The query was working:
var queryParams = {
  ExpressionAttributeValues: {
    ':m_id': mId
  },
  KeyConditionExpression: 'm_id = :m_id',
  TableName: "A"
};
let connections = await docClient.query(queryParams).promise();
I created another table, say Table B. I made some mistakes naming its keys, so I simply deleted it and created a table with the same name again, Table B. Table B had partition key m_id and sort key s_id.
I copied and pasted the same query I was using for Table A, changing only the table name, because the partition key had the same name.
To my shock, I got this exception:
"ValidationException: Query condition missed key schema element"
I rechecked all the names and compared the query with the working query. Everything was fine.
I thought that maybe because I had deleted and recreated Table B, it could have something to do with that. So I created a fresh table with a new name, Table B2, with the same key names as Table B.
In the query that was throwing the exception, I changed only the table name from B to B2.
And the exception was gone.
If you are getting this on a fresh table, where no query has worked earlier, creating a new table with a new name is an option.
If you delete a table only to change partition key names, it may be safer to use a new name for the table as well (DynamoDB could be referring to metadata by table name and not by internal identifiers; it is possible that old metadata stays around even if you delete a table. Just a guess, given I faced this case).
EDIT: 2022-July-12
This error does not leave me. My own answer was helpful, but there was one more case: a trailing space in the name of a key in the table. And DynamoDB does not even check for spaces in key names.
You have to create a global secondary index for the status field.
Then your code could look something like this:
dynamo.query({
  TableName: "Jobs",
  IndexName: 'status',
  KeyConditionExpression: '#s = :st',
  ExpressionAttributeValues: {
    ':st': 'processing'
  },
  ExpressionAttributeNames: {
    '#s': 'status',
  },
}, (err, resp) => {
  console.log(err, resp);
});
Note: the scan operation is indeed very costly, especially if your table is huge.
I solved the problem using AWS.DynamoDB.DocumentClient() with scan. For example (Node.js):
var docClient = new AWS.DynamoDB.DocumentClient();
var params = {
  TableName: "product",
  FilterExpression: "#cg = :data",
  ExpressionAttributeNames: {
    "#cg": "categoria",
  },
  ExpressionAttributeValues: {
    ":data": category,
  }
};
docClient.scan(params, onScan);
function onScan(err, data) {
  if (err) {
    // log the error on the server
    console.error("Unable to scan the table. Error JSON:", JSON.stringify(err, null, 2));
    res.json(err);
  } else {
    console.log("Scan succeeded.");
    res.json(data);
  }
}

Oracle Insert with nodeJS causing missing keyword error

I am trying to create tables and insert data into an Oracle DB using Node.js.
Below is my query code:
const oracledb = require('oracledb');

oracledb.getConnection(
  {
    user: userId,
    password: password,
    connectString: `(DESCRIPTION = (ADDRESS = (PROTOCOL = TCP)(HOST = ${host})(PORT = ${port}))(CONNECT_DATA = (SID = ${sid})))`
  },
  function (err, connection) {
    if (err) {
      errorHandler(err);
      return;
    }
    connection.execute(
      query,
      function (err, result) {
        if (err) {
          errorHandler(err);
          oracleFunctions.connection.doRelease(connection, errorHandler);
          return;
        }
        successHandler(result);
        oracleFunctions.connection.doRelease(connection, errorHandler);
      });
  });
As the query, I am using the following to create the tables:
begin
execute immediate 'CREATE TABLE "table0" ("Channel" VARCHAR(128),
"Pay_mode" VARCHAR(128),
"Product_group" VARCHAR(128),
"PivotDimension" VARCHAR(128),
"Values" DECIMAL(18,6))';
execute immediate 'CREATE TABLE "table1" ("Channel" VARCHAR(128),
"Pay_mode" VARCHAR(128),
"Product_group" VARCHAR(128),
"PivotDimension" VARCHAR(128),
"Values" DECIMAL(18,6))';
end;
I am also using the following to insert data into the created tables:
INSERT ALL
INTO "table0" VALUES ('AM' , 'PP' , 'NP ISP Annuity' , '1' , '0.11' )
INTO "table0" VALUES ('AM' , 'PP' , 'NP ISP Annuity' , '2' , '0.26' )
SELECT * from dual;
These queries work perfectly in JetBrains DataGrip.
I am executing the create-table query first, and when a success response is returned, I insert rows into the tables.
However, strangely, when I execute the code, two errors occur:
Error: ORA-00955: name is already used by an existing object
ORA-06512: at line 2
for creating the tables, and
Error: ORA-00905: missing keyword
for inserting into the tables.
Even more strangely, the tables get created fine but the insertion does not happen.
I have checked many times to make sure that the create-table statement runs only once and the insertion runs only once for each table. What could I possibly be doing wrong?
There is nothing strange here. You asked Oracle to do doubtful things, and it tried to comply, no more.
First of all, when using Oracle you should almost never create tables dynamically. ORA-00955 means that you executed your request more than once: you created the tables, then you asked to create them again. What answer were you waiting for? Oracle said that these tables already exist. Nothing strange, I guess.
ORA-00905 most probably means that you tried to execute SELECT * FROM dual as part of the INSERT ALL statement. Oracle was waiting for the next INTO clause but got SELECT instead, so it reports a syntax error. Again, nothing strange.
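As a side note, if the goal is just to load a handful of rows from Node.js, a common alternative to INSERT ALL is a single bind-variable INSERT run through node-oracledb's executeMany. A rough sketch, reusing the question's handlers and assuming a connection has already been obtained:
// Sketch: multi-row insert with positional bind variables instead of INSERT ALL.
// Note: SQL statements passed to node-oracledb must not end with a trailing semicolon.
const rows = [
  ['AM', 'PP', 'NP ISP Annuity', '1', '0.11'],
  ['AM', 'PP', 'NP ISP Annuity', '2', '0.26']
];

connection.executeMany(
  `INSERT INTO "table0" ("Channel", "Pay_mode", "Product_group", "PivotDimension", "Values")
   VALUES (:1, :2, :3, :4, :5)`,
  rows,
  { autoCommit: true },
  function (err, result) {
    if (err) {
      errorHandler(err);
      oracleFunctions.connection.doRelease(connection, errorHandler);
      return;
    }
    successHandler(result);
    oracleFunctions.connection.doRelease(connection, errorHandler);
  }
);
Bind variables also avoid rebuilding the SQL text for every row and keep quoting issues out of the statement.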
