Consume Oracle Procedure - Node.js Knex.js

Good morning, everyone! I hope you're all doing well.
I have a problem developing an API that needs to read some information from an Oracle database.
I built the API in Node.js with the Knex.js query builder. I first wrote the query directly in the back end, and after it ran correctly I set up a stored procedure that performs the same query in the database.
But I can't call this PL/SQL procedure from the back end. The Knex documentation has no information about calling stored procedures. Searching forums, I saw that some developers used knex.query or knex.execute to run a BEGIN block and pass parameters to the procedure. But when I try to run it this way, I get an error saying that knex.query or knex.execute is not a function.
Can someone let me know what's wrong? Is there another way to make this call natively (without using a framework), or is there a framework better prepared for this kind of execution?
const data = await connection.execute(
  `
  BEGIN
    SP_GUIA_PROCEDURE(P_NUMB_GUIA => 000254, P_NUMB_BENEF => '000025448911000');
  END;
  `
);
TypeError: connection.execute is not a function
Thank you very much in advance.

It seems you are using Knex to get the connection. There is no method named execute() available in Knex. You can invoke stored procedures using connection.raw(). It is also good practice to pass bind variables as arguments to the procedure. Here is sample code (with bindings) that may help:
const oracledb = require('oracledb'); // required for the bind constants used below

const data = await connection.raw(
  `BEGIN
     SAMPLE_PROCEDURE(:id, :name, :oval);
   END;`,
  {
    id: 11, // bind type is determined from the data; default direction is BIND_IN
    name: { val: 'James', dir: oracledb.BIND_INOUT },
    oval: { type: oracledb.NUMBER, dir: oracledb.BIND_OUT }
  }
);
SAMPLE_PROCEDURE is the stored procedure and the result is an array containing outbinds.
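A minimal sketch of consuming that result, assuming (per the note above) that the resolved value is an array holding the out binds; the exact shape can vary between Knex and node-oracledb versions, so verify it against your own setup:
// Hedged sketch: `data` should contain the out binds produced by the call
// above (:name as IN OUT, :oval as OUT); the exact structure is an assumption.
const [nameOut, ovalOut] = data;
console.log(nameOut, ovalOut);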

Related

Prisma ORM - create stored procedure

Using Prisma Client 3, I'm trying to create a stored procedure.
The motivation behind it is:
I need to query a table that will be created at run time.
To do this I need dynamic queries, and I've read that stored procedures are the better practice in this case (passing the table name as a parameter).
I would like each member of my team to have the updated version of the stored procedure (like all the tables in Prisma).
So what I've decided to do is create the stored procedure with prisma.$executeRaw when the app starts, and call it when I need it.
The code:
let prisma = new PrismaClient();
let res = await prisma.$executeRawUnsafe(`
  CREATE PROCEDURE \`module-events\`.GetAllProducts()
  BEGIN
    select 555;
  END
`);
The result:
Invalid `prisma.$executeRaw()` invocation:
Raw query failed. Code: `1295`. Message: `This command is not supported in the prepared statement protocol yet`
As you can see, $executeRawUnsafe() returns the same result. Is there any way to create a stored procedure with Prisma? Is there a way to run a "free-style" query that is not limited by Prisma?
I understood from this answer that it is possible to create the stored procedure:
You could also use $executeRaw to generate the stored procedure or use the tool/CLI of your choice.
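Not from the original thread, but one possible workaround, given that MySQL error 1295 means CREATE PROCEDURE is rejected over the prepared-statement protocol Prisma uses: run the DDL through a plain-text connection at startup, for example with the mysql2 driver. The connection settings below are illustrative assumptions:
// Hedged sketch: mysql2's query() uses the text protocol (no prepared
// statements), so a CREATE PROCEDURE statement can go through.
const mysql = require('mysql2/promise');

async function createProcedure() {
  const conn = await mysql.createConnection({
    host: 'localhost', // hypothetical connection settings
    user: 'app',
    password: 'secret',
    database: 'module-events',
  });
  await conn.query(`
    CREATE PROCEDURE GetAllProducts()
    BEGIN
      SELECT 555;
    END
  `);
  await conn.end();
}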

Why can't I store a PriorityQueue in MongoDB

Recently I decided to replace arrays with priority queues for storing a user's list of jobs in MongoDB. I use Node.js and Express.js for the back end. The priority queue I attempted to store comes from an external package, which can be installed by running the following command in a terminal:
yarn add js-priority-queue
For some reason the priority queue works perfectly before it is stored in MongoDB. However, when I later take it out of MongoDB and attempt to use it, its functionality is missing. I declare its type as Schema.Types.Mixed in the Schema. Am I doing something wrong, or is it not possible to store instantiated class objects in MongoDB?
As far as I know, when you store things in MongoDB they are stored as extended JSON (EJSON) in a binary format (BSON):
const { EJSON } = require('bson');
const test = EJSON.stringify({ a: new Date(), foo: function () { console.log('foo'); } });
console.log(test); // "{"a":{"$date":"2020-07-07T14:45:49.475Z"}}"
So any sort of function is lost.
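Not part of the original answer, but a minimal sketch of the usual workaround, assuming the js-priority-queue package from the question: persist the queue's plain contents and rebuild the instance after loading, since MongoDB can only round-trip the data, not the methods.
const PriorityQueue = require('js-priority-queue');

// Before saving: drain the queue into a plain array that BSON can store.
const queue = new PriorityQueue({ comparator: (a, b) => a.priority - b.priority });
queue.queue({ priority: 2, job: 'encode' });
queue.queue({ priority: 1, job: 'fetch' });

const items = [];
while (queue.length > 0) items.push(queue.dequeue());
// ...store `items` in the Schema.Types.Mixed field...

// After loading: rehydrate a fresh queue from the stored array.
const restored = new PriorityQueue({
  comparator: (a, b) => a.priority - b.priority,
  initialValues: items,
});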

Express with pug, Postgres and proper MVC

I recently started using Node.js + Express.js (generated with pug) + pg-promise for handling the database.
My first goal is to fetch data from Postgres (already set up) and display it nicely using render and pug. Let's say it is the user list from a Users table.
In this RESTful tutorial I learned how to get data and return it as JSON - it worked.
Based on Mozilla's tutorial I separated my code:
routes/users.js: where for '/' I call the user_controller.user_list method (using router.get)
controllers/userController.js: I have exported user_list, where I would like to ask the model for data and call render when I have results
queries.js: which is kind of my model? But I'm not sure. It has an API: a connection to the db with promises and one function for every query I am going to use in the controllers. I believe I should have one model file per table (or any logical entity), but where do I store the pgp connections?
This file is based on the first tutorial I mentioned:
// queries.js (connectionString is set properly to my postgres)
var pgp = require('pg-promise')(options);
var db = pgp(connectionString);

function getUsers(req, res, next) {
  db.any('SELECT (user_id, username) FROM public.users ORDER BY user_id ASC LIMIT 1000')
    .then(function (data) {
      res.json({ data: data });
    })
    .catch(function (err) {
      return next(err);
    });
}

module.exports = {
  getUsers: getUsers
};
Here my problem starts, as most tutorials use Mongoose, which is very model-db-schema-friendly, while what I have is a simple 'SELECT ...' string that I pass to pg-promise's any() function.
Therefore I have no model class like User.
In userController.js I don't know how to call getUsers() and handle its data. Returning a JS object from getUsers() would be nice.
Also: where should I call render? In the controller, or only in
db.any(...).then(function (data) { <--here--> })
Before this, I also tried to embed the whole Postgres handling in the controller, but from db.any() I got this array to handle:
[{ row: '(1,John)' },{ row: '(2,Amy)' },{ row: '(50,Peter)' }]
I didn't know how to go on from there, as I probably lost my API functionality as well ;-)
I have browsed through multiple tutorials on how to handle MVC, but usually they cover MongoDB and satisfy readers with res.send(), not render().
I am not sure that I understand what your question is exactly about, but since I do not have enough reputation to comment, I'll do my best to help you with your questions. :)
First, regarding the queries.js file: it is IMO not exactly a model, but rather a DAO (Data Access Object) file. The DAO comes between your Model (which is actually your database) and your Controller layers. There is usually one DAO file per object (User, Pet, whatever you want) in your data model.
When the data model is rather complex, it can be useful to use an Object-Relational Mapping (ORM) such as Mongoose to map your database and execute complex processes on your objects. In such a case, you might need a specific file per object to describe your model and store your queries. But since you don't need an ORM, your DAO can directly interact with your database. That is why you do not have a User.js file.
Regarding the way the db object should be used, I think you should refer directly to pg-promise documentation on the matter.
IMPORTANT: For any given connection, you should only create a single Database object in a separate module, to be shared in your application (see the code example below). If instead you keep creating the Database object dynamically, your application will suffer from loss in performance, and will be getting a warning in a development environment (when NODE_ENV = development).
As a matter of fact, a db object in pg-promise sort of represents the database itself and is actually designed for the simultaneous use of several databases, which does not seem to be your case for the moment.
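As a minimal sketch of that recommendation (the file name and connection string are illustrative assumptions):
// db.js - hedged sketch: one shared pg-promise Database object for the whole app
var pgp = require('pg-promise')();
var db = pgp('postgres://user:password@localhost:5432/mydb'); // hypothetical connection string

module.exports = db; // every DAO file require()s this same object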
Finally, when it comes to the render function, I believe it should be in the controller, as your DAO is not supposed to know how the data it has gathered is going to be used.
Modularity is always a time-saving choice in the long term.
Furthermore, note that you might later need a Business layer between your DAO and your controller, in order to preprocess and postprocess the data you are going to persist or display. In such a case, if you need to ask for data from your database, you will need to render the data after it is processed by the Business layer. If the render is done in the DAO layer, that will not be possible. A sketch of the DAO/controller split follows below.
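A hedged sketch of that split (file names follow the question; the view name is a hypothetical pug template): the DAO returns the promise, and the controller decides whether to render or send JSON.
// queries.js - the DAO only fetches and returns data
var db = require('./db'); // the shared Database object described above

function getUsers() {
  // note: no parentheses around the column list; SELECT (user_id, username)
  // yields one row-valued column, which is what produced the
  // [{ row: '(1,John)' }, ...] output shown in the question
  return db.any('SELECT user_id, username FROM public.users ORDER BY user_id ASC LIMIT 1000');
}

module.exports = { getUsers: getUsers };

// controllers/userController.js - the controller calls render
var queries = require('../queries');

exports.user_list = function (req, res, next) {
  queries.getUsers()
    .then(function (users) {
      res.render('user_list', { users: users }); // 'user_list' is a hypothetical view
    })
    .catch(next);
};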
In the link I provided earlier to pg-promise's db object connection, you will also find documentation on the any() method. You might already have looked it up.
It specifically states that it returns
A promise object that represents the query result:
When no rows are returned, it resolves with an empty array.
When 1 or more rows are returned, it resolves with the array of rows.
so your returned data is a JS array. If you want to serialize it, you can use
JSON.stringify(yourArray) to turn your data into a JSON string before handing it over in your controller.
But I suspect Pug can use your data directly.
Also, if you cannot get any data out of your DAO, maybe you should check whether your data array is empty, as such a case is tolerated by the any() method. If you expect your query to always return something, you might want to consider using the many() or one() methods instead.
I hope this helps you.

Should I separate input validation and DB query?

I am now working on an API back end. I've written a series of functions to query a database, like this
(yes, Node.js):
function addGuys(guys, cb) {
  var sql = 'INSERT INTO guys ...';
  guys.forEach(function (guy) {
    // Construct sql according to keys given in `guy`
    // Validate here?
    // add col names, values to sql statement, etc.
  });
  // Execute query
}
The problem now is: should I validate input inside these DB functions, or should I validate it before sending it to the DB functions? The latter seems cleaner, but I am a novice and still not sure. Any advice is welcome. Thanks.
You should not construct SQL directly yourself at all. Instead, you should use a library for that, which does its own validation and escaping. A popular example is Sequelize: http://docs.sequelizejs.com/en/v3/
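A hedged sketch using Sequelize, as the answer suggests (the model, connection string, and column are assumptions for illustration): the library binds and escapes values instead of concatenating them into SQL.
const { Sequelize, DataTypes } = require('sequelize');
const sequelize = new Sequelize('postgres://user:pass@localhost:5432/mydb'); // hypothetical connection string

const Guy = sequelize.define('Guy', {
  name: { type: DataTypes.STRING, allowNull: false }, // hypothetical column
});

async function addGuys(guys) {
  // values are validated against the model and escaped by the library
  return Guy.bulkCreate(guys, { validate: true });
}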

How to implement a fast, queryable and persistent database in PhantomJS?

I have been using PhantomJS to do some heavy lifting for me in a server-side DOM environment. Until now I have been keeping my data structures in memory (i.e. doing nothing special with them) and everything was fine.
But recently, under some use cases, I started running into the following problems:
memory usage becoming too high, making swap kick in and seriously affecting my performance.
not being able to resume from the last save point, since in-memory data structures are not persistent (obviously)
This forced me to look for a database solution to be used in Phantom, but again I am running into issues while deciding on a solution:
I don't want my performance to be affected too much.
it has to be persistent and queryable
how do I even connect to a database from inside a Phantom script?
Can anyone guide me to a satisfactory solution?
Note: I have almost decided on SQLite, but connecting to it from Phantom is still an issue. Node.js provides the sqlite3 node module; I am trying to browserify it for Phantom.
Note note: Browserify didn't work! Back to square one!! :-(
Thanks in advance!
PhantomJS' filesystem API allows you to read and write binary files with
buf = fs.read(FILENAME, 'b') and
fs.write(FILENAME, buf, 'b')
sql.js (https://github.com/kripken/sql.js/) gives you a JavaScript SQLite implementation you can run in PhantomJS.
Combine the two and you have a fast, persistent, queryable SQL database.
Example walkthrough
Get the JavaScript SQLite implementation (saving it to /tmp/sql.js):
$ wget https://raw.githubusercontent.com/kripken/sql.js/master/js/sql.js -O /tmp/sql.js
Create a test SQLite database using the command-line sqlite3 app (showing that it is persistent and external to your PhantomJS application):
sqlite3 /tmp/eg.db
sqlite> CREATE TABLE IF NOT EXISTS test(id INTEGER PRIMARY KEY AUTOINCREMENT, created INTEGER NOT NULL DEFAULT CURRENT_TIMESTAMP);
sqlite> .quit
Save this test PhantomJS script, which adds entries to the test database and verifies the behaviour:
$ cat /tmp/eg.js
var fs = require('fs'),
    sqlite3 = require('./sql.js'),
    dbfile = '/tmp/eg.db',
    sql = 'INSERT INTO test(id) VALUES (NULL)',
    // fs.read returns a binary 'string' (not 'String' or 'Uint8Array')
    read = fs.read(dbfile, 'b'),
    // The Database argument must be a binary 'string', not a 'Uint8Array'
    db = new sqlite3.Database(read),
    write,
    uint8array;

try {
    db.run(sql);
} catch (e) {
    console.error('ERROR: ' + e);
    phantom.exit();
}

// db.export() returns a 'Uint8Array', but we must pass a binary 'string' to write
uint8array = db.export();
write = String.fromCharCode.apply(null, Array.prototype.slice.apply(uint8array));
fs.write(dbfile, write, 'b');

db.close();
phantom.exit();
Run the PhantomJS script to test:
$ /usr/local/phantomjs-2.0.0-macosx/bin/phantomjs /tmp/eg.js
Use an external tool to verify the changes were persisted:
sqlite3 /tmp/eg.db
sqlite> SELECT * FROM test;
id  created
1   2015-03-28 10:21:09
sqlite>
Some things to keep in mind:
The database is modified on disk only when you call fs.write.
Any changes you make are invisible to external programs accessing the same SQLite database file until you call fs.write.
The entire database is read into memory with fs.read.
You may want to use different OS files for different tables (or versions of tables), depending on your application and the amount of data in the tables, to address the memory requirements you mentioned.
Passing what is returned by db.export() directly to fs.write will corrupt the SQLite database file on disk (it will no longer be a valid SQLite database file): Uint8Array is NOT the correct type for the fs.write parameter.
Writing binary data in PhantomJS works like this:
var db_file = fs.open(db_name, { mode: 'wb', charset: '' });
db_file.write(String.fromCharCode.apply(null, db.export()));
db_file.close();
You have to set the charset to '' because otherwise the writing goes wrong.
