postgres SELECT query returns unusable result - node.js

I have a simple SELECT query that is returning an unusable result. I am using pg-promise in node.js. The result looks like this:
[
  {
    "function_name": "(f10d1988-4db5-49de-97ab-0c8b15bedfa7,image.jpg,Image)"
  },
  {
    "function_name": "(f10d1988-4db5-49de-97ab-0c8b15bedfa7,image2.jpg,Image 2)"
  }
]
but I was expecting a basic json structure like
[
  {
    id: '',
    title: '',
    image: ''
  },
  {...etc}
]
Why is it doing this? How do I get a normalized result?
My query looks like the below:
CREATE OR REPLACE FUNCTION get_photos(
  title_param TEXT
)
RETURNS TABLE(
  id UUID,
  image varchar(200),
  title varchar(200)
) AS
$func$
BEGIN
  RETURN QUERY SELECT
    i.id,
    i.image,
    i.title
  FROM images AS i
  WHERE i.title = title_param;
END;
$func$ LANGUAGE PLPGSQL;
Here is my db connector setup, almost all defaults:
require('dotenv').config();
const Promise = require('bluebird');
const pg = require('pg-promise')({
  promiseLib: Promise
});
const config = {
  user: process.env.USER,
  host: process.env.HOST,
  database: process.env.DATABASE,
  password: process.env.PASSWORD
};
const db = pg(config);
export default db;
Here is the express endpoint that is calling the function:
export const getData = async (req, res) => {
  const { title } = req.query;
  let data;
  try {
    data = await db.many('SELECT function_name($1)', [title]);
  } catch (err) {
    data = err;
  }
  res.send(data);
};
EDIT
I ran the query manually instead of through a function and the data returned correctly, which means there is an issue with my TABLE() return. What could be causing this?
images = await db.many(`
  SELECT
    p.id,
    p.img,
    p.type,
    p.title
  FROM photos p
  WHERE p.type = '${type}';
`, [type]);

Because the function is defined as returning a table, you need to use it like a table:
SELECT * FROM function_name($1)

Use func as the query method:
data = await db.func('function_name', [title]);
It assumes you return a table, and so will work for you by default.
And for stored procedures, there's proc method.
Also, your parameter formatting for the images query is wrong, see Named Parameters:
IMPORTANT: Never use the reserved ${} syntax inside ES6 template strings ...
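Putting both points together, a minimal sketch of the corrected endpoint might look like this (it assumes the function is the get_photos one defined above; the 500 status on error is only illustrative):
export const getData = async (req, res) => {
  const { title } = req.query;
  try {
    // db.func runs SELECT * FROM get_photos($1), so each row comes back with
    // separate id/image/title columns instead of one composite record value
    const data = await db.func('get_photos', [title]);
    res.send(data);
  } catch (err) {
    res.status(500).send(err);
  }
};
And the images query from the EDIT would use pg-promise's $1 formatting instead of interpolating ${type} into the template string:
const images = await db.many(`
  SELECT p.id, p.img, p.type, p.title
  FROM photos p
  WHERE p.type = $1
`, [type]);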

Related

How to send paginated result as response after performing find operation in Mongodb?

I have this query to display data in a table on the frontend, so I used paginate, which is working fine:
tableSchema.statics.getn = (query, options) => {
  return mongoose.model(MODEL_NAME).paginate(query, options);
};
But when I try to perform a search query, I am unable to paginate the result. Is there any way to send the response in paginated form for all search queries?
I tried the following code:
tableSchema.statics.search = query => {
  const Id = Number(query);
  const isNumeric = value => /^\d+$/.test(value);
  if (!isNumeric(query)) {
    if (query.includes("#")) {
      const regex = new RegExp(query, "i");
      return mongoose.model(MODEL_NAME).find({ "recipies.to": regex }).paginate(query);
    }
    return mongoose.model(MODEL_NAME).find({ "temp.name": query });
  }
  return mongoose.model(MODEL_NAME).find({ recipies: { Id } });
};
It is throwing an error that paginate is not a function. I tried storing the find query result in an object and then calling paginate on it, but it still did not work.
I am using "mongoose-paginate-v2" for pagination.
Hi, I think you missed adding the pagination plugin in the model definition.
const mongoose = require('mongoose');
const mongoosePaginate = require('mongoose-paginate-v2');

const mySchema = new mongoose.Schema({
  /* your schema definition */
});
mySchema.plugin(mongoosePaginate);

const myModel = mongoose.model('SampleModel', mySchema);
myModel.paginate().then(result => { /* ... */ }); // Usage
You need to add mongoosePaginate to the model as a plugin.
let options = {
  sort: { createdOn: 1 },
  page: 1,
  limit: 10
};
ModelName.paginate({ 'recipies.to': 'value' }, options, function (err, result) {
  if (err) {
    console.log(err);
  } else {
    // Here you will get the paginated array; console.log it and check
    console.log(result);
  }
});
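To tie this back to the search static from the question, a hedged sketch (assuming the plugin above is registered on tableSchema) passes each filter straight to paginate() instead of chaining it onto find(), since paginate is a static on the model rather than a method on a Query:
tableSchema.statics.search = (query, options = { page: 1, limit: 10 }) => {
  const isNumeric = value => /^\d+$/.test(value);
  if (!isNumeric(query)) {
    if (query.includes("#")) {
      const regex = new RegExp(query, "i");
      return mongoose.model(MODEL_NAME).paginate({ "recipies.to": regex }, options);
    }
    return mongoose.model(MODEL_NAME).paginate({ "temp.name": query }, options);
  }
  return mongoose.model(MODEL_NAME).paginate({ recipies: { Id: Number(query) } }, options);
};
Each branch then resolves to the same paginated result shape that the getn static already returns.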

How to use MongoDB $ne on nested object property

I have a node API which connects to MongoDB through mongoose. I am creating an advanced results middleware that enables selecting, filtering, sorting, pagination, etc., based on the Brad Traversy course Node.js API Masterclass With Express & MongoDB. This is all good.
I am adapting the code from the course to use the $ne (not equal) operator, and I want to get documents where a nested property (the user id) does not equal the current user's id. I am using this for an explore feature to show a list of things, but I don't want to show the user their own things. I am having trouble figuring out how to access the id property.
********************* UPDATE *********************
It seems all the documentation I've read recommends writing const injected like this:
const injected = {
  'user._id': { "$ne": req.user.id }
};
but for some reason it is not working. I can query top-level properties that are just a plain string value, like this:
const injected = {
  access: { "$ne": "public" }
};
but not a property on an object. Does anyone know why? Is it because the property I want to query is an id? I've also tried:
const injected = {
  'user._id': { "$ne": mongoose.Types.ObjectId(req.user.id) }
};
which also does not work...
So the model looks like this:
{
  name: 'Awesome post',
  access: 'public',
  user: {
    _id: '2425635463456241345', // property I want to access
  }
}
Then the actual advanced results middleware looks like the below; it's in the 'injected' object where I am trying to access the id. In the course, Brad uses this syntax for lte (/?averageCost[lte]=10000), but I do not get any results with my ne. Can anyone help me here?
const advancedResults = (model, populate) => async (req, res, next) => {
  let query;
  const injected = {
    access: 'public',
    'user._id[ne]': req.user.id, // I don't think user._id[ne] is correct
  };
  // Copy req.query
  const reqQuery = { ...req.query, ...injected };
  console.log('injected: ', injected);
  // Fields to exclude
  const removeFields = ['select', 'sort', 'page', 'limit'];
  // Loop over removeFields and delete them from reqQuery
  removeFields.forEach(param => delete reqQuery[param]);
  // Create query string
  let queryStr = JSON.stringify(reqQuery);
  // Create operators ($gt, $gte, etc)
  queryStr = queryStr.replace(/\b(gt|gte|lt|lte|in|ne)\b/g, match => `$${match}`);
  // Finding resource and remove version
  query = model.find(JSON.parse(queryStr)).select('-__v');
  // Select Fields
  if (req.query.select) {
    const fields = req.query.select.split(',').join(' ');
    query = query.select(fields);
  }
  // Sort
  if (req.query.sort) {
    const sortBy = req.query.sort.split(',').join(' ');
    query = query.sort(sortBy);
  } else {
    query = query.sort('-createdAt');
  }
  // Pagination
  const page = parseInt(req.query.page, 10) || 1;
  const limit = parseInt(req.query.limit, 10) || 25;
  const startIndex = (page - 1) * limit;
  const endIndex = page * limit;
  const total = await model.countDocuments(JSON.parse(queryStr));
  query = query.skip(startIndex).limit(limit);
  if (populate) {
    query = query.populate(populate);
  }
  // Executing query
  const results = await query;
  // Pagination result
  const pagination = {};
  if (endIndex < total) {
    pagination.next = {
      page: page + 1,
      limit,
    };
  }
  if (startIndex > 0) {
    pagination.prev = {
      page: page - 1,
      limit,
    };
  }
  res.advancedResults = {
    success: true,
    count: results.length,
    pagination,
    data: results,
  };
  next();
};
module.exports = advancedResults;
Answering your question about how to use $ne:
The use of $ne is as follows:
"field": {
  "$ne": yourValue
}
So in your query it should be:
"user._id": {
  "$ne": req.user.id
}
Example here
The $ne operator returns all documents where the field value doesn't match the given value.
As you have done, to access the nested field it is necessary to use dot notation.
Also, to make sure it works: if your schema defines _id as an ObjectId, it may be necessary to parse req.user.id into an ObjectId.
But if it is a string in your schema, then it should work as is.
So try (not tested at all):
const injected = {
  'user._id': { "$ne": req.user.id }
};
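If you want to keep the string-replace pipeline from the middleware, one possible way to apply this (a sketch, not tested against the course code) is to merge the $ne condition in after the JSON.parse step, so the already-prefixed operator never goes through the replace:
// inside advancedResults, replacing the reqQuery / queryStr / model.find section
const reqQuery = { ...req.query };
['select', 'sort', 'page', 'limit'].forEach(param => delete reqQuery[param]);
let queryStr = JSON.stringify(reqQuery);
queryStr = queryStr.replace(/\b(gt|gte|lt|lte|in|ne)\b/g, match => `$${match}`);

// merge the injected conditions as a plain object, using dot notation for the nested id
const filter = {
  ...JSON.parse(queryStr),
  access: 'public',
  'user._id': { $ne: req.user.id },
};

query = model.find(filter).select('-__v');
// use the same filter object for model.countDocuments further down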

nodejs teradata returns "nodeJava_java_math_BigDecimal{}" instead of decimal value in the resultset?

I am running a simple select query. The code runs fine and returns the result set, showing all the column values that are in character format, but it shows "nodeJava_java_math_BigDecimal{}" instead of the decimal-format columns.
var Teradata = require('node-teradata');
var config = {
  url: 'jdbc:teradata://abc.com/database=abab',
  username: '****',
  password: '****',
  driver: './jars/',
  minPoolSize: 1,
  maxPoolSize: 100,
  keepalive: {
    interval: 60000,
    query: 'SELECT 1',
    enabled: true
  }
};
var teradata = new Teradata(config);
var sql = "select name,QTY from products where id='700018'";
return teradata.read(sql)
  .then(function(response) {
    console.log(response);
  });
The result it prints to the console is:
[{name:'Apple Juice',QTY:nodeJava_java_math_BigDecimal{}}]
You can retype the properties of the returned object with the JavaScript types you know you'll be working with, using methods like Number([...]) or .toString()
return teradata.read(sql)
  .then(function(response) {
    return response.map(respObj => objExtractor(respObj));
  });

function objExtractor(teradataObj) {
  return {
    name: teradataObj.name.toString(),
    QTY: Number(teradataObj.QTY).toFixed(0)
  };
}
Quoting CraZySacX from here:
When you get back a node-java wrapped object
(nodeJava_java_math_BigDecimal), you have access to all of the
functions exposed by the Java API in both a sync and async form.
For example, the Java 7 BigDecimal API has a function called intValue, which in your case would be:
return teradata.read(sql)
  .then(function(response) {
    response[0].QTY.intValue(function(err, intVal) {
      var newQTY = intVal;
      console.log(newQTY);
    });
  });
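Building on that, a hedged sketch (assuming each QTY is the node-java BigDecimal wrapper shown above, and teradata/sql are as defined earlier) wraps the callback-style intValue call in a Promise so every row in the result set can be converted before use:
function bigDecimalToInt(wrapped) {
  return new Promise(function(resolve, reject) {
    // intValue is the async node-java form: it takes a node-style callback
    wrapped.intValue(function(err, intVal) {
      if (err) return reject(err);
      resolve(intVal);
    });
  });
}

return teradata.read(sql)
  .then(function(response) {
    return Promise.all(response.map(function(row) {
      return bigDecimalToInt(row.QTY).then(function(qty) {
        return { name: row.name, QTY: qty };
      });
    }));
  });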

Call a function on each object in an array in Node

I'm trying to batch together a database update to re-populate a Mongo collection. I've created an object to hold the properties needed to look up the data from an external source and then add it back to a MongoDB collection.
The array looks like this:
const pops = [
  {
    table: 'SFAccounts',
    label: 'Account__c',
    createListName: 'Accounts'
  },
  {
    table: 'SFTimes',
    label: 'CusTime__c',
    createListName: 'Time'
  }
]
I want to then create a function that takes 'table', 'label', and 'createListName' and does something basically like this:
async function processData(table, label, createListName) {
  // Get some info from Salesforce
  const dataFromSF = await getMetaDataFromSalesForce(table)
  // Extract the parts I actually need
  const relevantBits = dataFromSF.filter(field => field.name === label)
  // Create a new list in the db
  const createResult = await List.create({ name: createListName, values: relevantBits })
  return createResult
}
The end goal is to get to something like
await Promise.all(processData(pops))
which will await all the tables being pulled and populated into the database.
If you change the args of processData:
async function processData({ table, label, createListName }) {
  // Get some info from Salesforce
  const dataFromSF = await getMetaDataFromSalesForce(table)
  // Extract the parts I actually need
  const relevantBits = dataFromSF.filter(field => field.name === label)
  // Create a new list in the db
  const createResult = await List.create({ name: createListName, values: relevantBits })
  return createResult
}
it's just await Promise.all(pops.map(processData));
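For completeness, a minimal usage sketch (pops and processData are the ones defined in the question; the error handling is only illustrative):
async function run() {
  try {
    const lists = await Promise.all(pops.map(processData));
    console.log(`Created ${lists.length} lists`);
  } catch (err) {
    // Promise.all rejects as soon as any single table fails to process
    console.error('Failed to re-populate collection:', err);
  }
}

run();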

Omiting column names / inserting objects directly into node-postgres

I'd like to pass dictionaries with column names as keys, thus avoiding declaring the column names within the query itself (typing them directly).
Assume I have a table User with 2 column names:
idUser(INT)
fullName(VARCHAR)
To create a record using node-postgres, I need to declare the column names within the query, like so:
var idUser = 2;
var fullName = "John Doe";
var query = 'INSERT INTO User(idUser, fullName) VALUES ($1, $2)';
database.query(query, [idUser, fullName], function(error, result) {
  callback(error, result.rows);
  database.end();
});
I'd prefer it if there were a way to just pass a dictionary and have it infer the column names from the keys. If there's an easy trick, I'd like to hear it.
E.g. something like this:
var values = {
  idUser: 2,
  fullName: "John Doe"
};
var query = 'INSERT INTO User VALUES ($1)';
database.query(query, [values], function(error, result) {
  callback(error, result.rows);
  database.end();
});
A complete example of doing it with pg-promise:
const pgp = require('pg-promise')(/*options*/);
const cn = 'postgres://username:password@host:port/database';
const db = pgp(cn);

const values = {
  idUser: 2,
  fullName: 'John Doe'
};

// generating the insert query:
const query = pgp.helpers.insert(values, null, 'User');
//=> INSERT INTO "User"("idUser","fullName") VALUES(2,'John Doe')

db.none(query)
  .then(data => {
    // success;
  })
  .catch(error => {
    // error;
  });
And with a focus on high performance, it would change to this:
// generating a set of columns from the object (only once):
const cs = new pgp.helpers.ColumnSet(values, {table: 'User'});
// generating the insert query:
const query = pgp.helpers.insert(values, cs);
//=> INSERT INTO "User"("idUser","fullName") VALUES(2,'John Doe')
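As a side note, the same cs object can be reused to generate a single multi-row insert, which is where the ColumnSet approach really pays off; the extra user rows below are made up for illustration:
const users = [
  { idUser: 3, fullName: 'Jane Roe' },
  { idUser: 4, fullName: 'Max Mustermann' }
];
// one statement for all rows, columns still inferred from the object keys:
const multiInsert = pgp.helpers.insert(users, cs);
//=> INSERT INTO "User"("idUser","fullName") VALUES(3,'Jane Roe'),(4,'Max Mustermann')
db.none(multiInsert);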
There's no support for key-value objects in the INSERT statement itself, so it cannot be done with plain SQL.
However, the node-postgres extras page mentions multiple SQL generation tools; for example, Squel.js parameterized queries can be used to construct SQL in a way very close to what you're looking for:
squel.insert()
  .into("User")
  .setFieldsRows([
    { idUser: 2, fullName: "John Doe" }
  ])
  .toParam()
// => { text: 'INSERT INTO User (idUser, fullName) VALUES (?, ?)',
//      values: [ 2, 'John Doe' ] }
My case was a bit special, as I had a field named order in the JSON object, which is a keyword in SQL. Therefore I had to wrap everything in quotes using a JSONify() function.
Also note the numberedParameters argument as well as the double quotes around the 'Messages' string.
import { pool } from './connection';
import squel from 'squel';

function JSONify(obj: Record<string, any>) {
  var o = {};
  for (var i in obj) {
    o['"' + i + '"'] = obj[i]; // wrap each key in double quotes
  }
  return o;
}

// I have a table named "Messages" with the columns order and name
// I also supply the createdAt and updatedAt timestamps just in case
const messages = [
  {
    order: 0,
    name: 'Message with index 0',
    createdAt: new Date().toISOString(),
    updatedAt: new Date().toISOString(),
  }
]

// Create the insert statement
const insertStatement = squel.insert({ numberedParameters: true })
  .into('"Messages"')
  .setFieldsRows(messages.map((message) => JSONify(message)))
  .toParam();

console.log(insertStatement);
// Notice the quotes wrapping the table and column names
// => { text: 'INSERT INTO "Messages" ("order", "name", "createdAt", "updatedAt") VALUES ($1, $2, $3, $4)',
//    values: [ 0, 'Message with index 0', '2022-07-22T13:51:27.679Z', '2022-07-22T13:51:27.679Z' ] }

// Create
await pool.query(insertStatement.text, insertStatement.values);
See the Squel documentation for more details.
And this is how I create the pool object if anyone is curious.
import { Pool } from 'pg';
import { DB_CONFIG } from './config';

export const pool = new Pool({
  user: DB_CONFIG[process.env.NODE_ENV].username,
  host: DB_CONFIG[process.env.NODE_ENV].host,
  database: DB_CONFIG[process.env.NODE_ENV].database,
  password: DB_CONFIG[process.env.NODE_ENV].password,
  port: DB_CONFIG[process.env.NODE_ENV].port,
});
