Prisma ORM - create stored procedure - Node.js

Using Prisma Client 3 I'm trying to create a stored procedure.
The motivation behind it is:
I need to query a table that will be created at run time.
To do this, I need to use dynamic queries,
and I read that stored procedures are the better practice in this case (pass the table name as a parameter).
I would like every member of my team to have the updated version of the stored procedure (like all the tables in Prisma).
So what I've decided to do is create the stored procedure with prisma.$executeRaw when the app starts and call it when I need it.
The code:
let prisma = new PrismaClient();
let res = await prisma.$executeRawUnsafe(`
  CREATE PROCEDURE \`module-events\`.GetAllProducts()
  BEGIN
    SELECT 555;
  END
`);
The result:
Invalid `prisma.$executeRaw()` invocation:
Raw query failed. Code: `1295`. Message: `This command is not supported in the prepared statement protocol yet`
As you can see, $executeRawUnsafe() returns the same result as $executeRaw(). Is there any way to create a stored procedure with Prisma? Is there a way to run a "free-style" query that is not limited by Prisma?
I understood from this answer that it is possible to create the stored procedure:
You could also use $executeRaw to generate the stored procedure or use the tool/CLI of your choice.
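Error 1295 comes from MySQL itself: CREATE PROCEDURE is not among the statements the server accepts over the prepared-statement protocol, which is what $executeRaw and $executeRawUnsafe go through. One workaround (a sketch, not something Prisma provides out of the box) is to run the DDL once at startup through a plain connection from the mysql2 driver, which sends it over the text protocol; the connection details below are placeholders:

import mysql from 'mysql2/promise';

// Inside your async startup code. Credentials are placeholders;
// the database name comes from the question.
const connection = await mysql.createConnection({
  host: 'localhost',
  user: 'root',
  password: 'secret',
  database: 'module-events',
});

// Drop first so this can run on every app start without failing.
await connection.query('DROP PROCEDURE IF EXISTS GetAllProducts');
await connection.query(`
  CREATE PROCEDURE GetAllProducts()
  BEGIN
    SELECT 555;
  END
`);
await connection.end();

Once the procedure exists, calling it from Prisma with prisma.$queryRaw`CALL GetAllProducts()` should work, since CALL is one of the statements MySQL does allow in prepared statements.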

Related

Consume Oracle Procedure - Node.js Knex.js

Good morning, everyone! Is everything well?
I have a problem developing an API that will consume some information from an Oracle database.
I built the API skeleton in Node.js with the Knex.js query builder. I first wrote the query directly in the back end, and after running it I set up the procedure that performs the same query directly in the database.
But I can't call this PL/SQL procedure from the back end. The Knex documentation has no information about calling stored procedures. Searching forums I saw that some developers used knex.query or knex.execute to run a BEGIN block and call the procedure with parameters, but when I try to run it this way I get an error saying that knex.query or knex.execute is not a function.
Can someone let me know what's wrong? Or is there another way to do this natively (without using a framework), or a framework better prepared for this type of execution?
const data = await connection.execute(
  `
  BEGIN
    SP_GUIA_PROCEDURE(P_NUMB_GUIA => 000254, P_NUMB_BENEF => '000025448911000');
  END;
  `
);
TypeError: connection.execute is not a function
Thank you very much in advance.
It seems you are using Knex to get the connection. There is no method named execute() in Knex; you can invoke stored procedures using connection.raw(). It is also good practice to pass bind variables as arguments to the procedure. Here is sample code (with bindings) that may help:
const oracledb = require('oracledb'); // provides the bind direction/type constants

const data = await connection.raw(
  `BEGIN
     SAMPLE_PROCEDURE(:id, :name, :oval);
   END;`,
  {
    id: 11, // bind type is determined from the data; default direction is BIND_IN
    name: { val: 'James', dir: oracledb.BIND_INOUT },
    oval: { type: oracledb.NUMBER, dir: oracledb.BIND_OUT }
  }
);
SAMPLE_PROCEDURE is the stored procedure and the result is an array containing outbinds.
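Since the question also asks about a native approach without a framework, here is a minimal sketch that calls the same procedure through the node-oracledb driver directly; the connection details are placeholders and the procedure and binds mirror the example above:

import oracledb from 'oracledb';

// Inside your async code. Placeholder credentials and connect string.
const connection = await oracledb.getConnection({
  user: 'app_user',
  password: 'app_password',
  connectString: 'localhost/XEPDB1',
});

try {
  const result = await connection.execute(
    `BEGIN
       SAMPLE_PROCEDURE(:id, :name, :oval);
     END;`,
    {
      id: 11,
      name: { val: 'James', dir: oracledb.BIND_INOUT },
      oval: { type: oracledb.NUMBER, dir: oracledb.BIND_OUT },
    }
  );
  console.log(result.outBinds); // e.g. { name: '...', oval: 42 }
} finally {
  await connection.close();
}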

Validate @VersionColumn value before saving an entity with TypeORM

I'm currently working on saving data in a Postgres DB using TypeORM with the NestJS integration. I'm saving data which keeps track of a version property using TypeORM's @VersionColumn feature, which increments a number each time save() is called on a repository.
For my feature it is important to check this version number before updating the records.
Important
I know I could technically achieve this by retrieving the record before updating it and checking the versions, but this leaves a small window for errors. If a second user updates the same record in the millisecond between the get and the save (or if it takes longer for some reason), it would bump the version and make the data in the first call invalid. TypeORM doesn't check the version value, so even if a call has a lower value than what is in the database, it still saves the data even though it should be seen as out of date.
1: User A checks latest version => TypeORM gives back the latest version: 1
2: User B updates record => TypeORM ups the version: 2
3: User A saves their data with version 1 <-- This needs to validate the versions first.
4: TypeORM overwrites User B's record with User A's data
What I'm looking for is a way to make TypeORM reject step 3, as the latest version in the database is 2 and User A tries to save with version 1.
I've tried using the query builder and update statements to make this work, but the built-in @VersionColumn only ups the version on every save() call from a repository or entity manager.
Besides this, I also got a tip to look into database triggers, but as far as I could find, this feature is not yet supported by TypeORM.
Here is an example of the setup:
async update(entity: Foo): Promise<boolean> {
  const value = await this._configurationRepository.save(entity);
  if (value === entity) {
    return true;
  }
  return false;
}
In my opinion, something like this is much better served through triggers directly in the Database as it will fix concerns around race conditions as well as making it so that modifications made outside the ORM will also update the version number. Here is a SQL Fiddle demonstrating triggers in action. You'll just need to incorporate it into your schema migrations.
Here is the relevant DDL from the SQL Fiddle example:
CREATE TABLE entity_1
(
  id serial PRIMARY KEY,
  some_value text,
  version int NOT NULL DEFAULT 1
);

CREATE OR REPLACE FUNCTION increment_version() RETURNS TRIGGER AS
$BODY$
BEGIN
  NEW.version = NEW.version + 1;
  RETURN NEW;
END;
$BODY$
LANGUAGE plpgsql VOLATILE;

CREATE TRIGGER increment_entity_1_version
BEFORE UPDATE
ON entity_1
FOR EACH ROW
EXECUTE PROCEDURE increment_version();
The same trigger function can be used for any table that has a version column in case this is a pattern you want to use across multiple tables.
I think you are looking for optimistic concurrency control. If that is the case, there is a solution about halfway down this thread: TypeORM concurrency control issue
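If you do want to keep the check in application code, here is a rough sketch of a hand-rolled optimistic-locking update with the query builder. This is not built-in TypeORM behaviour, and the id, someValue and version property names are assumptions about the Foo entity:

async update(entity: Foo): Promise<boolean> {
  const result = await this._configurationRepository
    .createQueryBuilder()
    .update()
    .set({ someValue: entity.someValue, version: () => 'version + 1' })
    .where('id = :id AND version = :version', {
      id: entity.id,
      version: entity.version,
    })
    .execute();

  // affected === 0 means another user saved a newer version in the meantime,
  // so the caller can reject the stale write (step 3 in the example above).
  return result.affected === 1;
}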

How to get all the procedure name and definition in a given schema in Redshift?

When using Redshift, I would like to get the names of all the procedures that were created in a schema, along with their definitions.
I know you can use the SHOW PROCEDURE command to get the definition, but that requires the procedure name.
In SVV_TABLE there is only information regarding tables and views, not procedures.
Does anyone know how to get that?
Redshift doesn't have a system view for that yet, but you can use the PG_PROC table and join it with pg_namespace to filter on schema name.
select proname, proargnames, prosrc
from pg_proc
join pg_namespace on pg_namespace.oid = pg_proc.pronamespace
where nspname = 'your_schema_name' and prosecdef = true;
The prosecdef = true filter is there to get only stored procedure definitions; otherwise you also get Python UDFs.

how to pass output parameter to ms-sql stored procedure using knex

How to pass an output parameter to an MS SQL stored procedure using the Knex query builder?
We are using Knex in Node.js to call MS SQL raw queries and stored procedures. In MS SQL we can pass both IN and OUTPUT parameters to a stored procedure. Now I am stuck on how to pass the output parameter to the stored procedure.
The procedure call looks like this:
exec sp_procedurename 'username', @RecCount OUTPUT, '', 1, 30, ''
The node-mssql driver has a pretty custom syntax for that (https://github.com/tediousjs/node-mssql#execute-procedure-callback). As far as I know, that API cannot be accessed through Knex.
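For reference, a minimal sketch of what that looks like with the node-mssql driver used directly; the connection settings and input parameter names are placeholders, and only the procedure name and the OUTPUT parameter come from the question:

import sql from 'mssql';

// Inside your async code. Placeholder connection settings.
const pool = await sql.connect({
  server: 'localhost',
  database: 'mydb',
  user: 'app_user',
  password: 'app_password',
  options: { trustServerCertificate: true },
});

const result = await pool
  .request()
  .input('UserName', sql.VarChar, 'username') // IN parameter (name assumed)
  .output('RecCount', sql.Int)                // OUTPUT parameter
  .execute('sp_procedurename');

console.log(result.output.RecCount); // value written to the OUTPUT parameter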

node-postgres: how to prepare a statement without executing the query?

I want to create a "prepared statement" in postgres using the node-postgres module. I want to create it without binding it to parameters because the binding will take place in a loop.
In the documentation I read:
query(object config, optional function callback) : Query
If text and name are provided within the config, the query will result in the creation of a prepared statement.
I tried
client.query({"name":"mystatement", "text":"select id from mytable where id=$1"});
but when I try passing only the text & name keys in the config object, I get an exception :
(translated) message is binding 0 parameters but the prepared statement expects 1
Is there something I am missing ? How do you create/prepare a statement without binding it to specific value in order to avoid re-preparing the statement in every step of a loop ?
I just found an answer on this issue by the author of node-postgres.
With node-postgres the first time you issue a named query it is
parsed, bound, and executed all at once. Every subsequent query issued
on the same connection with the same name will automatically skip the
"parse" step and only rebind and execute the already planned query.
Currently node-postgres does not support a way to create a named,
prepared query and not execute the query. This feature is supported
within libpq and the client/server protocol (used by the pure
javascript bindings), but I've not directly exposed it in the API. I
thought it would add complexity to the API without any real benefit.
Since named statements are bound to the client in which they are
created, if the client is disconnected and reconnected or a different
client is returned from the client pool, the named statement will no
longer work (it requires a re-parsing).
You can use pg-prepared for that:
var prep = require('pg-prepared')
// First prepare the statement without binding parameters
var item = prep('select id from mytable where id=${id}')
// Then execute the query and bind the parameters in a loop
for (const i of [1, 2, 3]) {
  client.query(item({id: i}), function(err, result) { /* ... */ })
}
Update: Reading your question again, here's what I believe you need to do. You need to pass a "values" array as well.
Just to clarify: where you would normally "prepare" your query, just prepare the config object you pass to it, without the values array. Then where you would normally "execute" your query, set the values array on the object and pass it to query(). The first time, the driver will do the actual prepare for you; for the rest of the iterations it will only bind and execute, as shown in the sketch below.
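A minimal sketch of that approach with plain node-postgres; client is the connected pg client from the question, and the statement name is arbitrary:

// Keep one config object with name/text, and attach `values` just before
// each execution. The first named query is parsed and planned; later calls
// with the same name on the same client skip the parse step.
const queryConfig = {
  name: 'fetch-by-id',
  text: 'select id from mytable where id=$1',
};

for (const id of [1, 2, 3]) {
  const result = await client.query({ ...queryConfig, values: [id] });
  console.log(result.rows);
}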
