node-oracledb giving Error: NJS-044: named JSON object is not expected in this context while executing stored procedure

I am facing an issue calling an Oracle DB stored procedure from a Node.js application using the node-oracledb npm package ("oracledb": "^3.1.2" and "@types/oracledb": "^3.1.0"). The stored procedure takes 3 input parameters: a string, a string, and an array of an Oracle DB type. However, when passing the last parameter (the DB type), the Node.js application throws the exception "NJS-044: named JSON object is not expected in this context".
// DB payload
let obj = {
tableOwner: 'Mr X',
tableName: 'Demo',
retentionData: this.CreateArrayFromJSONObject(array_of_data)
}
// DB procedure
let procedure: string = `BEGIN PKG_ARCHIVAL_TOOL.P_RETENTION_POLICY_CREATE(:tableOwner, :tableName, :retentionData); END;`;
/// DB execution function call
DBService.getInstance().ExecuteDBProcedureRequest(procedure, userPolicyJSON);
// DB executing
public ExecuteDBProcedureRequest = (procedure: string, inputBody: any) : Promise<any> => {
return new Promise((resolve, reject) => {
DBConn.execute(procedure, inputBody, { autoCommit: true}, (err: oracledb.DBError, result: oracledb.Result) => {
if(err) {
reject(err);
}
if(result) {
resolve(Utils.CreateJSONObject(result));
}
})
});
}
// SQL procedure call
PKG_ARCHIVAL_TOOL.P_RETENTION_POLICY_CREATE(
P_TABLE_OWNER => P_TABLE_OWNER,
P_TABLE_NAME => P_TABLE_NAME,
P_RETEN_DATA => V_DATA,
P_ID => V_ID,
P_OUT => V_OUT
);
P_RETEN_DATA is a table of records:
Record - TYPE R_RETENTION_POLICY_DEF IS RECORD(
COLUMN_NAME VARCHAR2(40) NOT NULL DEFAULT ' ',
COLUMN_POS NUMBER NOT NULL DEFAULT 1,
COLUMN_TYPE VARCHAR2(10) NOT NULL DEFAULT 'NUMBER',
OPERATOR VARCHAR2(10) NOT NULL DEFAULT '=',
GATE VARCHAR2(10) DEFAULT NULL,
BRAC_ST NUMBER DEFAULT 0,
BRAC_ED NUMBER DEFAULT 0
);
Table :- TYPE T_RETENTION_POLICY_DEF IS TABLE OF R_RETENTION_POLICY_DEF;
array_of_data = [["FNAME", 1, "VARCHAR2", ">", "OR", 0, 0], ["LNAME", 1, "VARCHAR2", "=", "AND", 0, 0]]

Binding to a record will only work in node-oracledb 4, which is currently under development.
Your code may also have other issues (the number of parameters in the PL/SQL call, and it looks like you are trying to pass some kind of array where a record is expected, etc.).
The general solution with node-oracledb 3.1 is to use a wrapper PL/SQL block that you can bind permissible types into. This wrapper block then massages the values into a record and calls your target procedure, P_RETENTION_POLICY_CREATE.
Given this SQL:
set echo on
create or replace package rectest as
type rectype is record (name varchar2(40), pos number);
procedure myproc (p_in in rectype, p_out out rectype);
end rectest;
/
show errors
create or replace package body rectest as
procedure myproc (p_in in rectype, p_out out rectype) as
begin
p_out := p_in;
end;
end rectest;
/
show errors
You would call it like:
// Node-oracledb 3.1
'use strict';
const oracledb = require('oracledb');
const config = require('./dbconfig.js');
let sql, binds, result;
async function run() {
let connection;
try {
connection = await oracledb.getConnection(config);
sql =
`declare
i_r rectest.rectype; -- input record
o_r rectest.rectype; -- output record
begin
i_r.name := :i_nm;
i_r.pos := :i_ps;
rectest.myproc(i_r, o_r);
:o_nm := o_r.name;
:o_ps := o_r.pos;
end;`;
binds = [
{i_nm: 'def', i_ps: 456},
{i_nm: 'ghi', i_ps: 789},
];
const options = {
bindDefs:
{ i_nm: { type: oracledb.STRING, maxSize: 40 },
i_ps: { type: oracledb.NUMBER },
o_nm: { type: oracledb.STRING, maxSize: 40, dir: oracledb.BIND_OUT },
o_ps: { type: oracledb.NUMBER, dir: oracledb.BIND_OUT }
}
};
result = await connection.executeMany(sql, binds, options);
console.log(result);
} catch (err) {
console.error(err);
} finally {
if (connection) {
try {
await connection.close();
} catch (err) {
console.error(err);
}
}
}
}
run();
The output is
{
outBinds: [ { o_nm: 'def', o_ps: 456 }, { o_nm: 'ghi', o_ps: 789 } ]
}
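Applying the same wrapper-block idea to the original question, the nested arrays in array_of_data would need to be reshaped into named bind objects for executeMany(). A minimal, dependency-free sketch of that reshaping step (the field names here are illustrative; they only need to match the bind names you choose in your wrapper PL/SQL block):

```javascript
// Reshape each positional row from array_of_data into a named bind object.
// The field names below are assumptions for illustration; they must match
// the bind placeholders used in the wrapper PL/SQL block.
const fields = ['col_name', 'col_pos', 'col_type', 'operator', 'gate', 'brac_st', 'brac_ed'];

function rowsToBinds(rows) {
  return rows.map((row) =>
    Object.fromEntries(fields.map((field, i) => [field, row[i]]))
  );
}

const array_of_data = [
  ['FNAME', 1, 'VARCHAR2', '>', 'OR', 0, 0],
  ['LNAME', 1, 'VARCHAR2', '=', 'AND', 0, 0],
];

console.log(rowsToBinds(array_of_data)[0].col_name); // prints FNAME
```

The resulting array of objects can then be passed as the binds argument to executeMany(), with matching bindDefs, just like the binds array in the example above.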

How to test mongoose methods using sinon fakes?

I have the following arrangement of tests using sinon, mocha and chai:
type ModelObject = {
name: string;
model: typeof Categoria | typeof Articulo | typeof Usuario;
fakeMultiple: () => object[];
fakeOne: (id?: string) => object;
}
const models: ModelObject[] = [
{
name: 'categorias',
model: Categoria,
fakeMultiple: () => fakeMultiple({ creator: oneCategoria }),
fakeOne: oneCategoria
},
{
name: 'articulos',
model: Articulo,
fakeMultiple: () => fakeMultiple({ creator: oneArticulo }),
fakeOne: oneArticulo
},
{
name: 'usuarios',
model: Usuario,
fakeMultiple: () => fakeMultiple({ creator: oneUsuario }),
fakeOne: oneUsuario
}
];
const randomModel = models[Math.floor(Math.random() * models.length)];
describe(`v1/${randomModel.name}`, function () {
this.afterEach(function () {
sinon.restore();
});
context.only("When requesting information from an endpoint, this should take the Model of the requested endpoint and query the database for all the elements of that model", function () {
it.only(`Should return a list of elements of ${randomModel.name} model`, function (done) {
const fakes = randomModel.fakeMultiple();
const findFake = sinon.fake.resolves({ [randomModel.name]: fakes });
sinon.replace(randomModel.model, 'find', findFake);
chai.request(app)
.get(`/api/v1/${randomModel.name}`)
.end(
(err, res) => {
expect(res).to.have.status(200);
expect(res.body.data).to.be.an('object');
expect(res.body.data).to.have.property(randomModel.name);
expect(res.body.data[randomModel.name]).to.have.lengthOf(fakes.length);
expect(findFake.calledOnce).to.be.true;
done();
}
)
});
});
});
I use this to test an endpoint that returns information about a given model. In my controllers, I'm using dynamic middleware to determine which model is going to be queried; for example, if the route consumed is "api/v1/categorias", it will query the Categoria model; if the route consumed is "api/v1/articulos", it will query the Articulo model, and so on.
To make the query, I use the following service:
import { Articulo } from '../models/articulo';
import { Usuario } from '../models/usuario';
import { Categoria } from '../models/categoria';
import logger from '../config/logging';
import { Model } from 'mongoose';
const determineModel = (model: string): Model<any> => {
switch (model) {
case 'articulos':
return Articulo;
case 'usuarios':
return Usuario;
case 'categorias':
return Categoria;
default:
throw new Error(`Model ${model} not found`);
}
};
export const getInformation = async (schema: string, page: number, limit: number) => {
try {
const model = determineModel(schema);
const data = await model.find().skip((page - 1) * limit).limit(limit);
const dataLength = await model.find().countDocuments();
return {
data,
total: dataLength,
};
} catch (err) {
logger.error(err);
console.log(err);
throw err;
}
};
The problem shows up when running my tests: it seems the fake is unable to run the .skip() and .limit() methods on my model.find():
error: model.find(...).skip is not a function
TypeError: model.find(...).skip is not a function
I think I need to fake those methods as well, because when I run the same test without skip and limit it works like a charm. My problem is that I don't know how to fake them, or how to check whether my guess is correct.
As a note, I have default values for the page and limit parameters (1 and 15 respectively), so I'm not passing empty values to the methods.
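Your guess looks right: sinon.fake.resolves makes find() return a plain promise, so there is nothing to chain .skip() and .limit() onto. One way out is to make the fake return a chainable, then-able object instead (a dependency-free sketch; with sinon you would pass an object like this to sinon.fake.returns rather than sinon.fake.resolves):

```javascript
// A minimal chainable stand-in for a mongoose Query: skip() and limit()
// return the object itself, and awaiting it yields the faked documents.
function chainableQueryFake(docs) {
  const query = {
    skip: () => query,
    limit: () => query,
    // Making the object then-able lets `await model.find()...` resolve to docs.
    then: (resolve, reject) => Promise.resolve(docs).then(resolve, reject),
  };
  return query;
}

// Usage sketch: awaiting the chained calls yields the fake documents.
async function demo() {
  const fakes = [{ name: 'a' }, { name: 'b' }];
  return chainableQueryFake(fakes).skip(0).limit(15);
}

demo().then((result) => console.log(result.length)); // prints 2
```

In the test this would become something like sinon.replace(randomModel.model, 'find', sinon.fake.returns(chainableQueryFake(fakes))), so the service's find().skip().limit() chain resolves to the fakes.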

ERROR ORA-01008 using nodeOracledb with TS

I have a problem with the result of my code. I am trying to run an Oracle script from Node.js using TypeScript, but I don't know why this error appears in my console. I have tried many ways to fix it with no success; I hope you can help me. Below are my code and a screenshot of the error.
Controller
async bipagem(req: Request, res: Response) {
try {
let credentials = super.openToken(req);
let { p_fil_filial, p_set_cdgo, p_mini_fab, p_codigo_barra } = req.query;
let info = await this.rep.bipagem(
p_fil_filial as string,
p_set_cdgo as string,
p_mini_fab as string,
p_codigo_barra as string,
credentials as string
);
res.json(info);
} catch (error) {
catchErr(res, error);
}
}
Repository
public async bipagem(
p_fil_filial: string,
p_set_cdgo: string,
p_mini_fab: string,
p_codigo_barra: string,
userPool: string
) {
let conn;
try {
conn = await connection(userPool);
const resultado = await conn.execute(
`DECLARE
result SYS_REFCURSOR;
BEGIN
-- Call the function
:result := brio.pck_fab0024.bipagem(p_fil_filial => :p_fil_filial,
p_set_cdgo => :p_set_cdgo,
p_mini_fab => :p_mini_fab,
p_codigo_barra => :p_codigo_barra,
p_msg => :p_msg);
END;`,
{
p_fil_filial,
p_set_cdgo,
p_mini_fab,
p_codigo_barra,
p_msg: { type: oracledb.STRING, dir: oracledb.BIND_OUT },
}
);
return resultado;
} catch (erro) {
console.log(erro);
} finally {
if (conn) conn.close();
}
}
The error shown in the screenshot is ORA-01008: not all variables bound.
ORA-01008 means "not all variables bound". You have six bind variables in your PL/SQL block, but values are supplied for only five of them: :result is never bound.
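A quick way to spot this kind of mismatch is to compare the placeholders in the statement with the keys of the bind object. A rough, dependency-free sketch (the regex is only illustrative: it ignores quoted strings and comments, so it can over-match in real SQL):

```javascript
// Rough sketch: list the bind placeholders (:name) in a PL/SQL block and
// report those that have no corresponding key in the bind object.
function findUnboundPlaceholders(sql, binds) {
  const matches = sql.match(/:[a-z_][a-z0-9_]*/gi) || [];
  const placeholders = [...new Set(matches.map((p) => p.slice(1)))];
  return placeholders.filter((name) => !(name in binds));
}

const block = `BEGIN :result := pkg.f(p_a => :p_a, p_msg => :p_msg); END;`;
const binds = { p_a: 1, p_msg: 'x' };

console.log(findUnboundPlaceholders(block, binds)); // prints [ 'result' ]
```

Run against the block in the question, it would flag result as the missing bind, which is exactly what ORA-01008 is complaining about.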

Error: ORA-01008: not all variables bound

I have this error for a few hours and I can't identify the problem. Error: ORA-01008: not all variables bound.
Controller
async bipagem(req: Request, res: Response) {
try {
let credentials = super.openToken(req)
let { p_fil_filial, p_set_cdgo, p_mini_fab, p_codigo_barra } = req.query
let info = await this.rep.bipagem(
p_fil_filial as string,
p_set_cdgo as string,
p_mini_fab as string,
p_codigo_barra as string,
credentials as string
)
res.json(info)
} catch (error) {
catchErr(res, error)
}
}
}
Repository
public async bipagem(
p_fil_filial: string,
p_set_cdgo: string,
p_mini_fab: string,
p_codigo_barra: string,
userPool: string
) {
let conn
try {
conn = await connection(userPool)
const ret = await conn.execute(
`DECLARE
c_result SYS_REFCURSOR;
BEGIN
-- Call the function
:result := brio.pck_fab0024.bipagem(p_fil_filial => :p_fil_filial,
p_set_cdgo => :p_set_cdgo,
p_mini_fab => :p_mini_fab,
p_codigo_barra => :p_codigo_barra,
p_msg => :p_msg);
DBMS_SQL.RETURN_RESULT(c_result);
END;`,
{
p_fil_filial,
p_set_cdgo,
p_mini_fab,
p_codigo_barra,
p_msg: { type: oracledb.STRING, dir: oracledb.BIND_OUT }
}
)
return { ...(ret.outBinds as object), conteudo: ret.implicitResults[0] }
} catch (e) {
console.log('Erro na fab0024: ', e.message)
return {
p_fil_filial,
p_set_cdgo,
p_codigo_barra,
p_msg: '',
conteudo: []
}
} finally {
if (conn && typeof conn !== 'string') conn.close()
}
}
}
I tried including the p_msg parameter and got this error in return: TS2339: Property 'bipagem' does not exist on type 'unknown'.
Your PL/SQL block has six bind parameters, but you are passing only five values, so it is no surprise that you get an error saying one of the variables isn't bound.
I think you have missed the fact that :result in the line below is also a bind parameter:
:result := brio.pck_fab0024.bipagem(p_fil_filial => :p_fil_filial,
I suspect you meant to assign the result to the local variable c_result (to which you don't currently assign any value) instead of an extra bind parameter:
c_result := brio.pck_fab0024.bipagem(p_fil_filial => :p_fil_filial,

Elasticsearch node js point in time search_phase_execution_exception

const body = {
query: {
geo_shape: {
geometry: {
relation: 'within',
shape: {
type: 'polygon',
coordinates: [$polygon],
},
},
},
},
pit: {
id: "t_yxAwEPZXNyaS1wYzYtMjAxN3IxFjZxU2RBTzNyUXhTUV9XbzhHSk9IZ3cAFjhlclRmRGFLUU5TVHZKNXZReUc3SWcAAAAAAAALmpMWQkNwYmVSeGVRaHU2aDFZZExFRjZXZwEWNnFTZEFPM3JReFNRX1dvOEdKT0hndwAA",
keep_alive: "1m",
},
};
The query fails with search_phase_execution_exception at onBody.
Without pit the query works fine, but PIT is needed to retrieve more than 10,000 hits.
Well, using PIT with the Node.js Elasticsearch client is not obvious, or at least not well documented. You can create a PIT using the client like:
const pitRes = await elastic.openPointInTime({
index: index,
keep_alive: "1m"
});
pit_id = pitRes.body.id;
But there is no way to use that pit_id in the search method, and it's not documented properly :S
BUT, you can use the scroll API as follows:
const scrollSearch = await elastic.helpers.scrollSearch({
index: index,
body: {
"size": 10000,
"query": {
"query_string": {
"fields": [ "vm_ref", "org", "vm" ],
"query": organization + moreQuery
},
"sort": [
{ "utc_date": "desc" }
]
}
}});
And then read the results as follows:
let res = [];
try {
for await (const result of scrollSearch) {
res.push(...result.body.hits.hits);
}
} catch (e) {
console.log(e);
}
I know that's not the exact answer to your question, but I hope it helps ;)
The usage of point-in-time for pagination of search results is now documented in ElasticSearch. You can find more or less detailed explanations here: Paginate search results
I prepared an example that may give an idea about how to implement the workflow, described in the documentation:
async function searchWithPointInTime(cluster, index, chunkSize, keepAlive) {
if (!chunkSize) {
chunkSize = 5000;
}
if (!keepAlive) {
keepAlive = "1m";
}
const client = new Client({ node: cluster });
let pointInTimeId = null;
let searchAfter = null;
try {
// Open point in time
pointInTimeId = (await client.openPointInTime({ index, keep_alive: keepAlive })).body.id;
// Query next chunk of data
while (true) {
const size = chunkSize; // expected page size, used below to detect the last page
const response = await client.search({
// Pay attention: no index here (because it will come from the point-in-time)
body: {
size: chunkSize,
track_total_hits: false, // This will make query faster
query: {
// (1) TODO: put any filter you need here (instead of match_all)
match_all: {},
},
pit: {
id: pointInTimeId,
keep_alive: keepAlive,
},
// Sorting should be by _shard_doc or at least include _shard_doc
sort: [{ _shard_doc: "desc" }],
// The next parameter is very important - it tells Elastic to bring us next portion
...(searchAfter !== null && { search_after: [searchAfter] }),
},
});
const { hits } = response.body.hits;
if (!hits || !hits.length) {
break; // No more data
}
for (const hit of hits) {
// (2) TODO: Do whatever you need with results
}
// Check if we done reading the data
if (hits.length < size) {
break; // We finished reading all data
}
// Get next value for the 'search after' position
// by extracting the _shard_doc from the sort key of the last hit
searchAfter = hits[hits.length - 1].sort[0];
}
} catch (ex) {
console.error(ex);
} finally {
// Close point in time
if (pointInTimeId) {
await client.closePointInTime({ body: { id: pointInTimeId } });
}
}
}

Order of function execution in Node async/await

I have a question about making sure the functions run in sequence and that the result from the first call is used in the second call.
DB Function
async runquery(statement){
try {
...
const results = await db.statementExecPromisified(statement, []);
return results;
} catch (e) {
console.log("Error - " +JSON.stringify(e));
return e;
}
}
Group id
async function groupByID(approvers) {
const group = _.groupBy(approvers, 'ID');
return Object.keys(group).map(ID=> {
return group[ID].reduce((approvers, cur, idx) => ({
...approvers,
['NAME' + (idx + 1)]: cur.ID,
}), { ID});
})
}
Final Function
async function preparePayload() {
if(levels.length != 0 ){
let statement= `SELECT * FROM ITEMS `
list = await runquery(statement) ;
id = await groupByID(list) ;
}
let result={}
result.ID=id;
}
Output from DB : [{ID:1,NAME:'F1'},{ID:2,NAME:'F2'},{ID:1,NAME:'F3'}]
Expected Output : [{ID:1,NAME1:'F1',NAME2:'F2'},{ID:2,NAME:'F2'}]
But the output is not coming as expected.
[{
"ID": "2"
},
{
"ID": "1"
}]
I am guessing this is due to the sequence of function execution, because console.log(id) shows the expected output, but when I assign it to the result variable I get the unexpected output.
I am not sure I have put the question correctly; let me know if it needs more detail.
I don't see anything wrong with your code apart from two small changes:
1. groupByID doesn't need to be async, and hence doesn't need to be awaited.
2. ['NAME' + (idx + 1)]: cur.ID, needs to be changed to ['NAME' + (idx + 1)]: cur.NAME,
For simplicity I have removed the db call and replaced it with a promise which returns the result which you have mentioned in your post.
See the working code here - https://repl.it/repls/PepperyGrossForms
It gives me this result
{
ID: [ { ID: '1', NAME1: 'F1', NAME2: 'F3' }, { ID: '2', NAME1: 'F2' } ]
}
Hope this helps.
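For reference, the corrected grouping can also be sketched without lodash, using a plain reduce in place of _.groupBy and cur.NAME instead of cur.ID:

```javascript
// Group rows by ID, numbering the NAME values within each group
// (NAME1, NAME2, ...) as in the expected output.
function groupByID(rows) {
  const groups = rows.reduce((acc, row) => {
    (acc[row.ID] = acc[row.ID] || []).push(row);
    return acc;
  }, {});
  return Object.keys(groups).map((ID) =>
    groups[ID].reduce(
      (obj, cur, idx) => ({ ...obj, ['NAME' + (idx + 1)]: cur.NAME }),
      { ID }
    )
  );
}

const rows = [
  { ID: 1, NAME: 'F1' },
  { ID: 2, NAME: 'F2' },
  { ID: 1, NAME: 'F3' },
];

console.log(groupByID(rows));
// [ { ID: '1', NAME1: 'F1', NAME2: 'F3' }, { ID: '2', NAME1: 'F2' } ]
```

Note that the IDs come back as strings because object keys are strings; coerce them with Number(ID) if the numeric type matters downstream.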
