Node.js inserting multiple rows into Postgres errors on data

I have a script that reads data from a JSON file and inserts it into Postgres. Well, that's what it's supposed to do.
The JSON is read properly and I can see the data as an array in the variables. The connection to Postgres is made successfully.
Then I get this error:
syntax error at or near "'{"Archived":"false","ClientEmail":"imaclientcompany@gmail.com","ClientId":52,"ClientName":"Ima Client","DateCreated":1637074825658,"DateSubmitted":1637076927912,"ExternalClientId":"null","Id":"8b9391c0-00af-4710-9481-e0e33ddea546","Practitioner":"bob@jonesperformancecenter.com","PractitionerId":"57b49e3d12cd2f144cebb405","PractitionerName":"Bob Jones","QuestionnaireId":"612524e13ccc040f58dd134e","QuestionnaireName":"PTS Pre Session Form","Status":"Completed"}'" at character 224.
The details of the error:
length: 566,
severity: 'ERROR',
code: '42601',
detail: undefined,
hint: undefined,
position: '224',
internalPosition: undefined,
internalQuery: undefined,
where: undefined,
schema: undefined,
table: undefined,
column: undefined,
dataType: undefined,
constraint: undefined,
file: 'scan.l',
line: '1145',
routine: 'scanner_yyerror'
Character 224 is in the middle of the Id field.
The 42601 error is a syntax error with no hint or detail. When I google 'scanner_yyerror', all the hits refer to unescaped single quotes. There are no single quotes in the data.
Here is the script:
const pg = require('pg');
const { Pool } = require('pg');
const format = require('pg-format');
const fs = require('fs');

let rawintakes = fs.readFileSync('./data/intakes.json', 'utf8');
let intakes = JSON.parse(rawintakes);

let query1 = format('INSERT INTO intake_summary (Archived, ClientEmail, ClientId, ClientName, DateCreated, DateSubmitted, ExternalClientId, Id, Practitioner, PractitionerId, PractitionerName, QuestionnaireId, QuestionnaireName, Status) VALUES %L returning id', intakes);

async function run() {
    let client;
    try {
        client = new pg.Pool({
            connectionString: 'postgres://postres:reallystrongpassword@localhost:5432/cldba01'
        });
        await client.connect();
        let { rows } = await client.query(query1, intakes);
        console.log(rows);
    } catch (e) {
        console.error(e);
    } finally {
        client.end();
    }
}
run();
I can input the same data using SQL without a problem. I have deleted the first record, and the same problem occurs on the second one at the same position, 224.
I've looked at the query and I don't see a syntax error there either.
Any ideas?
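For what it's worth, the quoted JSON in the error message is the signature of pg-format's %L being handed an array of objects: each object is stringified and quoted as a single literal. %L expects an array of row arrays instead, and since format() returns finished SQL, the query should then be run without a second values argument. A minimal sketch (column order assumed from the INSERT statement above):

const format = require('pg-format');

// Map each intake object to an array of values in column order
const rows = intakes.map(i => [
    i.Archived, i.ClientEmail, i.ClientId, i.ClientName,
    i.DateCreated, i.DateSubmitted, i.ExternalClientId, i.Id,
    i.Practitioner, i.PractitionerId, i.PractitionerName,
    i.QuestionnaireId, i.QuestionnaireName, i.Status
]);

// %L expands a nested array into ('a','b',...), ('c','d',...), ...
let query1 = format('INSERT INTO intake_summary (Archived, ClientEmail, ClientId, ClientName, DateCreated, DateSubmitted, ExternalClientId, Id, Practitioner, PractitionerId, PractitionerName, QuestionnaireId, QuestionnaireName, Status) VALUES %L returning id', rows);

// Inside run(): the SQL is already fully formatted, so pass no values argument
let { rows: inserted } = await client.query(query1);
console.log(inserted);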

My advice is to use Sequelize instead of raw SQL; it is far easier and clearer.

Related

postgres SELECT query returns unusable result

I have a simple SELECT query that is returning an unusable result. I am using pg-promise in node.js:
[
    {
        "function_name": "(f10d1988-4db5-49de-97ab-0c8b15bedfa7,image.jpg,Image)"
    },
    {
        "function_name": "(f10d1988-4db5-49de-97ab-0c8b15bedfa7,image2.jpg,Image 2)"
    }
]
but I was expecting a basic JSON structure like:
[
    {
        id: '',
        title: '',
        image: ''
    },
    {...etc}
]
Why is it doing this? How do I get a normalized result?
My query looks like the below:
CREATE OR REPLACE FUNCTION get_photos(
    title_param TEXT
)
RETURNS TABLE(
    id UUID,
    image varchar(200),
    title varchar(200)
) AS
$func$
BEGIN
    RETURN QUERY SELECT
        i.id,
        i.image,
        i.title
    FROM images AS i
    WHERE i.title = title_param;
END;
$func$ LANGUAGE PLPGSQL;
Here is my db connector setup, almost all defaults.
require('dotenv').config();
const Promise = require('bluebird');
const pg = require('pg-promise')({
    promiseLib: Promise
});
const config = {
    user: process.env.USER,
    host: process.env.HOST,
    database: process.env.DATABASE,
    password: process.env.PASSWORD
};
const db = pg(config);
export default db;
Here is the express endpoint that is calling the function:
export const getData = async (req, res) => {
    const { title } = req.query;
    let data;
    try {
        data = await db.many('SELECT function_name($1)', [title]);
    } catch (err) {
        data = err;
    }
    res.send(data);
};
EDIT
I ran the query manually instead of through the function, and the data returned correctly, which means there is an issue with my TABLE() return. What could possibly cause this issue?
images = await db.many(`
    SELECT
        p.id,
        p.img,
        p.type,
        p.title
    FROM photos p
    WHERE p.type = '${type}';
`, [type]);
Because the function is defined as returning a table, you need to use it like a table:
SELECT * FROM function_name($1)
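Applied to the endpoint above, that would be something like this (a sketch using the get_photos function from the question):

data = await db.many('SELECT * FROM get_photos($1)', [title]);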
Use func as the query method:
data = await db.func('function_name', [title]);
It assumes you return a table, and so will work for you by default.
And for stored procedures, there's the proc method.
Also, your parameter formatting for the images query is wrong, see Named Parameters:
IMPORTANT: Never use the reserved ${} syntax inside ES6 template strings ...
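A sketch of that images query with pg-promise named parameters in a regular single-quoted string, so ${type} is expanded by pg-promise's formatter rather than by JavaScript:

images = await db.many(
    'SELECT p.id, p.img, p.type, p.title FROM photos p WHERE p.type = ${type}',
    { type }
);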

No database tables are found

I have an SQLite database, and with the program SQLiteStudio I can connect and save things. My problem is connecting to the database from Node.js with the npm package "sqlite". Every time I run a query, it can't find the table. My code:
const connection = await sqlite.open('./db.sqlite');
const data = await connection.run("select * from item")
After that, console.log(connection) prints the following message, which looks good in my opinion:
driver: Database { open: true, filename: './db.sqlite', mode: 65542 },
Promise: [Function: Promise] }
But the output of console.log(data) is always an error message:
{ Error: SQLITE_ERROR: no such table: item errno: 1, code: 'SQLITE_ERROR' }
I think it is a problem with async/await or with my .sqlite file, but I don't know.
You are right: you have to use the arrow-function callback to wait until the database is open. See here:
// the database does not exist yet
console.log('open database and create table');
db = new sqlite3.Database('./event.db', () => {
    db.run('CREATE TABLE logTable(logfile TEXT, logdate TEXT, referto TEXT, area TEXT, status TEXT, action TEXT)', () => {
        ...
    });
});

// if the db already exists
db = new sqlite3.Database('./event.db', () => {
    ...
});
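For the promise-based sqlite package the question actually uses, a minimal sketch (assuming its v3-style sqlite.open(path) API): awaiting open already waits for the file to be opened, .all() returns rows where .run() does not, and resolving the path absolutely rules out opening a different, empty db.sqlite than the one SQLiteStudio edits.

const sqlite = require('sqlite');
const path = require('path');

async function main() {
    // Resolve relative to this file so we open the same db.sqlite SQLiteStudio uses
    const connection = await sqlite.open(path.resolve(__dirname, 'db.sqlite'));
    // .all() returns the selected rows; .run() is meant for statements without results
    const data = await connection.all('SELECT * FROM item');
    console.log(data);
}

main().catch(console.error);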

Node.js Teradata returns "nodeJava_java_math_BigDecimal{}" instead of decimal value in the result set?

I am running a simple SELECT query. The code runs fine and returns the result set as well, showing all the column values that are in character format, but it shows "nodeJava_java_math_BigDecimal{}" instead of the decimal-format columns:
var Teradata = require('node-teradata');

var config = {
    url: 'jdbc:teradata://abc.com/database=abab',
    username: '****',
    password: '****',
    driver: './jars/',
    minPoolSize: 1,
    maxPoolSize: 100,
    keepalive: {
        interval: 60000,
        query: 'SELECT 1',
        enabled: true
    }
};

var teradata = new Teradata(config);

var sql = "select name,QTY from products where id='700018'";
return teradata.read(sql)
    .then(function(response) {
        console.log(response);
    });
The result it prints to the console is:
[{name:'Apple Juice',QTY:nodeJava_java_math_BigDecimal{}}]
You can retype the properties of the returned object to the JavaScript types you know you'll be working with, using methods like Number([...]) or .toString():
return teradata.read(sql)
    .then(function(response) {
        return response.map(respObj => objExtractor(respObj));
    });

function objExtractor(teradataObj) {
    return {
        name: teradataObj.name.toString(),
        QTY: Number(teradataObj.QTY).toFixed(0)
    };
}
Quoting CraZySacX from here:
When you get back a node-java wrapped object
(nodeJava_java_math_BigDecimal), you have access to all of the
functions exposed by the Java API in both a sync and async form.
For example, the Java 7 BigDecimal API has a function called intValue, which in your case would be:
return teradata.read(sql)
    .then(function(response) {
        response[0].QTY.intValue(function(err, intVal) {
            var newQTY = intVal;
            console.log(newQTY);
        });
    });
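Since the quote mentions a sync form as well, node-java's convention of appending Sync to method names suggests a synchronous variant along these lines (an assumption, not verified against node-teradata):

return teradata.read(sql)
    .then(function(response) {
        // intValueSync() is assumed here from node-java's naming convention
        var newQTY = response[0].QTY.intValueSync();
        console.log(newQTY);
    });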

Node.js inserting data into PostgreSQL error

I have a weird error using Node.js with a PostgreSQL database, and I hope you can help me out.
I have a huge number of data sets, about 2 million entries, that I want to insert into my DB.
One data set consists of 4 columns:
id: string,
points: float[][],
mid: float[],
occurences: json[]
I am inserting data like so:
let pgp = require('pg-promise')(options);
let connectionString = 'postgres://archiv:archiv@localhost:5432/fotoarchivDB';
let db = pgp(connectionString);

cityNet.forEach((arr) => {
    db
        .none(
            "INSERT INTO currentcitynet(id,points,mid,occurences) VALUES $1",
            Inserts("${id},${points}::double precision[],${mid}::double precision[],${occurences}::json[]", arr))
        .then(data => {
            //success
        })
        .catch(error => {
            console.log(error);
            //error
        });
});

function Inserts(template, data) {
    if (!(this instanceof Inserts)) {
        return new Inserts(template, data);
    }
    this._rawDBType = true;
    this.formatDBType = function() {
        return data.map(d => "(" + pgp.as.format(template, d) + ")").join(",");
    };
}
This works for exactly the first 309248 data pieces, then it suddenly errors out with the following for (what seems like) every next piece of data it tries to insert:
{ error: syntax error at end of input
at Connection.parseE (/home/christian/Masterarbeit_reworked/projekt/server/node_modules/pg-promise/node_modules/pg/lib/connection.js:539:11)
at Connection.parseMessage (/home/christian/Masterarbeit_reworked/projekt/server/node_modules/pg-promise/node_modules/pg/lib/connection.js:366:17)
at Socket.<anonymous> (/home/christian/Masterarbeit_reworked/projekt/server/node_modules/pg-promise/node_modules/pg/lib/connection.js:105:22)
at emitOne (events.js:96:13)
at Socket.emit (events.js:188:7)
at readableAddChunk (_stream_readable.js:176:18)
at Socket.Readable.push (_stream_readable.js:134:10)
at TCP.onread (net.js:548:20)
name: 'error',
length: 88,
severity: 'ERROR',
code: '42601',
detail: undefined,
hint: undefined,
position: '326824',
internalPosition: undefined,
internalQuery: undefined,
where: undefined,
schema: undefined,
table: undefined,
column: undefined,
dataType: undefined,
constraint: undefined,
file: 'scan.l',
line: '1074',
routine: 'scanner_yyerror' }
The 'position' entry changes with every repeated error message.
I can redo it, and it always errors after 309248 entries.
When I try to insert fewer, like 1000 entries, the error does not occur.
That really confuses me. I thought PostgreSQL did not have any maximum row count. The error message does not help me at all, either.
SOLVED
The error was found: "null" entries had slipped into my data. Filtering out the null data fixed it.
I will still try the other recommendations for inserting data, since the current way works but its performance is very poor.
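For reference, the filter can be as simple as this (assuming cityNet is the array being inserted):

const cleanCityNet = cityNet.filter(entry => entry != null);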
I'm the author of pg-promise. Your whole approach should be changed to the one below.
Proper way to do massive inserts via pg-promise:
const pgp = require('pg-promise')({
    capSQL: true
});

const db = pgp(/*connection details*/);

var cs = new pgp.helpers.ColumnSet([
    'id',
    {name: 'points', cast: 'double precision[]'},
    {name: 'mid', cast: 'double precision[]'},
    {name: 'occurences', cast: 'json[]'}
], {table: 'currentcitynet'});

function getNextInsertBatch(index) {
    // retrieves the next data batch, according to the index, and returns it
    // as an array of objects. A normal batch size: 1000 - 10,000 objects,
    // depending on the size of the objects.
    //
    // returns null when there is no more data left.
}

db.tx('massive-insert', t => {
    return t.sequence(index => {
        const data = getNextInsertBatch(index);
        if (data) {
            const inserts = pgp.helpers.insert(data, cs);
            return t.none(inserts);
        }
    });
})
    .then(data => {
        console.log('Total batches:', data.total, ', Duration:', data.duration);
    })
    .catch(error => {
        console.log(error);
    });
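For completeness, a hypothetical in-memory implementation of the getNextInsertBatch stub, assuming the question's cityNet array is already loaded (with 2 million rows you would more likely read each batch from a file or stream instead):

function getNextInsertBatch(index) {
    const batchSize = 1000; // rows per INSERT, per the comment in the stub
    const batch = cityNet.slice(index * batchSize, (index + 1) * batchSize);
    return batch.length ? batch : null; // no data ends the sequence
}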
UPDATE
And if getNextInsertBatch can only get the data asynchronously, then return a promise from it, and update the sequence->source callback accordingly:
return t.sequence(index => {
    return getNextInsertBatch(index)
        .then(data => {
            if (data) {
                const inserts = pgp.helpers.insert(data, cs);
                return t.none(inserts);
            }
        });
});
Related Links:
tx
sequence / spex.sequence
ColumnSet
Multi-row insert with pg-promise
I'm not sure, but it looks like you have a wrong data structure at the last element (309249), and PostgreSQL cannot parse some property.

Bind message supplies 1 parameters, but prepared statement "" requires 2

I have a table goods with two columns: id (jsonb, primary key) and name.
Using this query:
const query = 'INSERT INTO "goods" (id, name) VALUES ($1, $2)'
together with the following data:
const data = {id: 1, name: "milk"};
gives me the following error:
{ [error: bind message supplies 1 parameters, but prepared statement "" requires 2]
name: 'error',
length: 130,
severity: 'ERROR',
code: '08P01',
detail: undefined,
hint: undefined,
position: undefined,
internalPosition: undefined,
internalQuery: undefined,
where: undefined,
schema: undefined,
table: undefined,
column: undefined,
dataType: undefined,
constraint: undefined,
file: 'postgres.c',
line: '1556',
routine: 'exec_bind_message' }
I have a Postgres database set up, connected via pg.Pool(), and I am executing JavaScript to insert my data.
Edit:
This is how I prepare my query:
pool.query(query, [data]).then(() => {
    console.log("ok")
})
.catch(error => {
    console.log(error)
});
Edit2:
Using the following:
const query = 'INSERT INTO "goods" (id, name) VALUES ($1, $2)'
const data = JSON.stringify([1, "milk"]);
pool.query(query, data).then(() => {
    console.log("ok")
})
.catch(error => {
    console.log(error)
});
Just spits out the following error: [TypeError: self.values.map is not a function]
As per the docs, parameters must be passed as a JavaScript array of values, so you don't need to stringify the data.
Try this:
const query = 'INSERT INTO goods (id, name) VALUES ($1, $2)'
const data = [1, "milk"];
pool.query(query, data).then(....)
Or
pool.query({
    text: 'INSERT INTO goods (id, name) VALUES ($1, $2)',
    values: [1, 'milk']
}).then(...)
As per the documentation, a prepared statement expects an array of values, not an object with properties; i.e., your data must be: const data = [1, "milk"];
I had the same problem using slonik.
Don't use interpolation
Don't do this!
connection.query(sql`
    SELECT 1
    FROM foo
    WHERE bar = ${baz}
`);
Use value placeholders
Do this instead, wrapping the value in single quotes:
connection.query(sql`
    SELECT 1
    FROM foo
    WHERE bar = ${'baz'}
`);
