I am new to SQL. I need to create a procedure that returns the number of orders for each customer.
I have a 'customers' table and an 'orders' table. Both tables are in the same database, "shop".
In the end I need to get an array of all customers and the number of orders each has placed.
This is what I have. I'm doing something wrong; I get a syntax error:
CREATE PRODIDURE customers.total_count(IN id INT)
BEGIN
SELECT COUNT
FROM orders AS order_count INNER JOIN as order_count
ON customers.id= orders.id
WHERE contacts.id = id
END
Another question: can I call the procedure from within NodeJS?
Please advise on the correct syntax for it:
export async function getCustomers() {
  const [result] = await shop.execute( ??????? )
  return result
}
Please help.
To call SQL procedures from within NodeJS, you need to use a SQL library. In this example, taken from one of my own projects, I am using the mysql library with connection pools.
const query = "CALL StoredProcedureName(?, ?);"
sql.query(query, [variable1, variable2], function (error, results) {
  if (error) {
    // do something with the error
    return;
  }
  // do something here with results
})
The example I've given above avoids sql injection by replacing the question marks with the variables inside the brackets.
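For completeness, the procedure in the question fails for several reasons: PROCEDURE is misspelled, COUNT needs parentheses and an argument, the join clause is malformed, and the statement inside the body needs a terminator, which in the mysql client means temporarily changing the delimiter. A corrected sketch - assuming orders has a customer_id column and customers has a name column, neither of which is shown in the question:

```sql
-- Returns every customer together with the number of orders they placed.
-- Assumes orders.customer_id references customers.id (not shown in the question).
DELIMITER //
CREATE PROCEDURE total_count()
BEGIN
  SELECT c.id, c.name, COUNT(o.id) AS order_count
  FROM customers AS c
  LEFT JOIN orders AS o ON o.customer_id = c.id
  GROUP BY c.id, c.name;
END //
DELIMITER ;
```

Since the goal is one row per customer with its order count, no IN parameter is needed; from NodeJS it can then be invoked with "CALL total_count()".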
I am setting up my first REST API to query a Postgres database I set up. I have two CRUD methods that currently work that query for all rows in a table and for rows where ID = something, respectively.
The problem occurs when I try to query with a request parameter that is a String. Here is the error I'm getting:
error: invalid input syntax for type integer: "NaN"
Here is how I've set up my GET route and endpoint URL:
const getShowByTitle = (request, response) => {
const title = request.params.title
pool.query('SELECT * FROM show WHERE title = $1', [title], (error, results) => {
if (error) {
throw error
}
response.status(200).json(results.rows)
})
}
app.get('/show/:title', getShowByTitle)
Expected result is that sending a GET request using a show title as a parameter (String) returns a JSON response of just that show.
Any help or direction to some useful resources would be greatly appreciated. Thank you.
There are some issues here. First, in SQL, table names are conventionally plural, i.e. "shows". Second, you are making the select without quotes; you need something like:
"SELECT * FROM show WHERE title = '$1'"
Third, since users can type uppercase and lowercase, you need a more robust way to search text, using the LIKE, ILIKE and ~, ~* operators.
https://www.2ndquadrant.com/en/blog/text-search-strategies-in-postgresql/
Fourth, and most important: you are not sanitizing the string, so you are at risk of SQL injection; a payload like "UPDATE admins SET password='newsom22th88';" could end up being executed in your database.
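A case-insensitive, parameterized version of the lookup might look like this - a sketch only: the plural table name shows follows the advice above, and partial matching with % is an assumption:

```javascript
// Build a case-insensitive, parameterized title lookup (a sketch; the
// "shows" table name and the %-wildcards are assumptions, not from the question).
function titleQuery(title) {
  return {
    text: 'SELECT * FROM shows WHERE title ILIKE $1',
    values: ['%' + title + '%'], // bound by the driver, not interpolated into the SQL
  };
}

// Usage with node-postgres:
// const { text, values } = titleQuery(request.params.title);
// pool.query(text, values, (error, results) => { ... });
```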
After some debugging, I found my original code wasn't working because I had multiple matching verb-route pairs. I solved the issue by creating a new, unique verb-route match. Below is my complete code for querying with a String. Aarkerio's other points still hold: I need to alter my code to avoid SQL injection, and to build a more robust search.
const getShowByTitle = (request, response) => {
const title = request.params.title
pool.query('SELECT * FROM show WHERE title = $1', [title], (error, results) => {
if (error) {
throw error
}
response.status(200).json(results.rows)
})
}
I am trying to create tables and insert data into an Oracle DB using NodeJS.
Below is my query code:
oracledb.getConnection(
{
user: userId,
password: password,
connectString: `(DESCRIPTION = (ADDRESS = (PROTOCOL = TCP)(HOST = ${host})(PORT = ${port}))(CONNECT_DATA = (SID = ${sid})))`
},
function (err, connection) {
if (err) {
errorHandler(err);
return;
}
connection.execute(
query,
function (err, result) {
if (err) {
errorHandler(err);
oracleFunctions.connection.doRelease(connection, errorHandler);
return;
}
successHandler(result);
oracleFunctions.connection.doRelease(connection, errorHandler);
});
});
I am using the query below to create the tables:
begin
execute immediate 'CREATE TABLE "table0" ("Channel" VARCHAR(128),
"Pay_mode" VARCHAR(128),
"Product_group" VARCHAR(128),
"PivotDimension" VARCHAR(128),
"Values" DECIMAL(18,6))';
execute immediate 'CREATE TABLE "table1" ("Channel" VARCHAR(128),
"Pay_mode" VARCHAR(128),
"Product_group" VARCHAR(128),
"PivotDimension" VARCHAR(128),
"Values" DECIMAL(18,6))';
end;
I am also using the query below to insert data into the created tables:
INSERT ALL
INTO "table0" VALUES ('AM' , 'PP' , 'NP ISP Annuity' , '1' , '0.11' )
INTO "table0" VALUES ('AM' , 'PP' , 'NP ISP Annuity' , '2' , '0.26' )
SELECT * from dual;
These queries work perfectly in JetBrains DataGrip.
I am executing the create-table query first and, when a success response is returned, inserting rows into the tables.
However, strangely, when I execute the code, two errors occur:
Error: ORA-00955: name is already used by an existing object
ORA-06512: at line 2
for creating table and
Error: ORA-00905: missing keyword
for inserting into the table.
Even more strangely, the table gets created fine but the insertion does not happen.
I have checked many times to make sure that the create table runs only once and insertion only runs once for each table. What could I possibly be doing wrong?
There is nothing strange here. You asked Oracle to do doubtful things and it tried to comply, no more.
First of all, when using Oracle you should almost never create tables dynamically. ORA-00955 means that you executed your request more than once. You created the tables, OK. Then you asked to create them again. What answer were you expecting? Oracle said that these tables already exist. Nothing strange, I guess.
ORA-00905, most probably, means that you tried to execute select * from dual as part of the INSERT ALL statement. Oracle was waiting for the next INTO clause but got SELECT instead, so it reports a syntax error. Again, nothing strange.
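If the tables really must be created dynamically, one common pattern is to make the DDL idempotent by catching ORA-00955 and re-raising everything else. A sketch (the column list is shortened to one column; Oracle also prefers VARCHAR2 over VARCHAR):

```sql
begin
  execute immediate 'CREATE TABLE "table0" ("Channel" VARCHAR2(128))';
exception
  when others then
    -- -955 = "name is already used by an existing object"; ignore only that
    if sqlcode != -955 then
      raise;
    end if;
end;
```

With that guard in place, re-running the create step no longer aborts the flow when the tables already exist.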
I am using the pg-promise package with NodeJS to execute PostgreSQL queries. I want to see the queries being executed - only specific queries, say, just one query that I want to debug.
I can see that one recommended way is to use the pg-monitor to catch the events and log them as mentioned here in the examples documentation.
Without using pg-monitor, is there a simple way to just print the prepared query that is executed? I can't see it in the docs.
Example:
db.query("SELECT * FROM table WHERE id = $/id/", {id: 2})
How can I print this query so it yields:
SELECT * FROM table WHERE id = 2
is there a simple way to just print the prepared query that is executed...
A query in general - yes, see below. A prepared query - no; those are, by definition, formatted on the server side.
const query = pgp.as.format('SELECT * FROM table WHERE id = $/id/', {id: 2});
console.log(query);
await db.any(query);
And if you want to print all queries executed by your module, without using pg-monitor, simply add a query event handler when initializing the library:
const initOptions = {
query(e) {
console.log(e.query);
}
};
const pgp = require('pg-promise')(initOptions);
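For intuition, the $/name/ substitution that pgp.as.format performs can be sketched in plain JavaScript. This is a hypothetical, simplified helper, not the real implementation - the actual library also supports ${name} and $(name) syntax, escaping for many value types, raw-text modifiers, and more:

```javascript
// Hypothetical, simplified stand-in for pgp.as.format: replaces $/name/
// markers with values from an object, quoting strings and passing numbers
// through. Illustration only - do not use in place of the real library.
function formatNamed(query, values) {
  return query.replace(/\$\/(\w+)\//g, function (match, name) {
    if (!(name in values)) {
      throw new Error("Property '" + name + "' doesn't exist");
    }
    var v = values[name];
    return typeof v === 'string'
      ? "'" + v.replace(/'/g, "''") + "'" // double any single quotes inside strings
      : String(v);
  });
}

console.log(formatNamed('SELECT * FROM table WHERE id = $/id/', { id: 2 }));
// → SELECT * FROM table WHERE id = 2
```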
I have a Xamarin.Forms mobile app using Azure Easy Tables, set up and working for all CRUD operations. I now need some Node.js server-side functionality that will let me increase a column's count in tableB from an insert script on tableA, i.e. where tableB.someId = tableA.someId.
So far I have:
// INSERT into tableA
table.insert(function(context) {
logger.info('Running tableA.insert');
// get tableB
var tableB = azureMobileApps.tables.table('tableB');
// here I need to increase the noOfReviews column on tableB by one
............
I'm a complete beginner with Azure and Node.js - can anyone help?
Iain
You can load up records from other tables with something similar to the following:
table.insert(function (context) {
    var tableB = context.tables('tableB');
    // return the promise chain so the insert waits for the update
    return tableB.where({ id: 'someId' }).read()
        .then(function (records) {
            records[0].count++;
            return tableB.update(records[0]);
        })
        .then(context.execute);
});
You can find API documentation for the context object at http://azure.github.io/azure-mobile-apps-node/global.html#context.
Hope this helps!
Based on my understanding, you can execute custom SQL statements to increase the tableB column that counts the related rows in tableA. Please see the section "How to: Execute custom SQL statements" of the article https://azure.microsoft.com/en-us/documentation/articles/app-service-mobile-node-backend-how-to-use-server-sdk/#CustomAPI.
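Following that article, the counter update can be expressed as a { sql, parameters } statement object and handed to context.data.execute from the tableA insert script. A sketch - the noOfReviews and someId names come from the question, and bumpReviewCount is a made-up helper name:

```javascript
// Build the custom SQL statement that bumps the counter on tableB.
// The { sql, parameters } shape follows the "Execute custom SQL statements"
// section of the azure-mobile-apps docs linked above.
function bumpReviewCount(someId) {
  return {
    sql: 'UPDATE tableB SET noOfReviews = noOfReviews + 1 WHERE someId = @someId',
    parameters: [{ name: 'someId', value: someId }],
  };
}

// Inside the tableA insert script (sketch):
// table.insert(function (context) {
//   return context.execute().then(function (result) {
//     return context.data.execute(bumpReviewCount(context.item.someId))
//       .then(function () { return result; });
//   });
// });
```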
We're working on a Node/Express web app with a Postgres database, using the node-postgres package. We followed the instructions in this question, and have our query working written this way:
exports.getByFileNameAndColName = function query(data, cb) {
const values = data.columns.map(function map(item, index) {
return '$' + (index + 2);
});
const params = [];
params.push(data.fileName);
data.columns.forEach(function iterate(element) {
params.push(element);
});
db.query('SELECT * FROM columns ' +
'INNER JOIN files ON columns.files_id = files.fid ' +
'WHERE files.file_name = $1 AND columns.col_name IN (' + values.join(', ') + ')',
params, cb
);
};
data is an object containing a string fileName and an array of column names columns.
We want this query to extract information from our 'columns' and 'files' tables from a dynamic number of columns.
db.query takes as parameters (query, args, cb), where query is the SQL query, args is an array of parameters to pass into the query, and cb is the callback function executed with the database results.
So the code written in this way returns the correct data, but (we think) it's ugly. We've tried different ways of passing the parameters into the query, but this is the only format that has successfully returned data.
Is there a cleaner/simpler way to pass in our parameters? (e.g. any way to pass parameters in a way the node-postgres will accept without having to create an additional array from my array + non-array elements.)
Asking this because:
perhaps there's a better way to use the node-postgres package/we're using it incorrectly, and
if this is the correct way to solve this type of issue, then this code supplements the answer in the question referenced above.
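For what it's worth, the placeholder and parameter construction described above can be folded into one small helper that returns the { text, values } query-config form node-postgres accepts (buildInQuery is a made-up name, shown only as a sketch):

```javascript
// Package the $1 file-name slot plus $2..$N column slots into a single
// query-config object ({ text, values }), which node-postgres accepts.
function buildInQuery(fileName, columns) {
  var placeholders = columns.map(function (item, index) {
    return '$' + (index + 2); // $1 is reserved for fileName
  });
  return {
    text: 'SELECT * FROM columns ' +
      'INNER JOIN files ON columns.files_id = files.fid ' +
      'WHERE files.file_name = $1 AND columns.col_name IN (' +
      placeholders.join(', ') + ')',
    values: [fileName].concat(columns),
  };
}

// exports.getByFileNameAndColName = function query(data, cb) {
//   db.query(buildInQuery(data.fileName, data.columns), cb);
// };
```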
Hello, I tried to parse "but (we think) it's ugly", and I believe my response answers your question.
In the same question you referenced, you will find this response,
in which the user takes pg-promise with its special-case variable formatting.
In your case it may look something like the following, using a shared connection. In your example I would actually recommend a plain db.query; I'm just using the shared connection to show how I extended the "ugly" version:
exports.getByFileNameAndColName = function query(data, cb) {
    var sco;
    db.connect()
        .then(function (obj) {
            sco = obj;
            // $2:csv expands the columns array into a comma-separated, escaped list
            return sco.query('SELECT * FROM columns ' +
                'INNER JOIN files ON columns.files_id = files.fid ' +
                'WHERE files.file_name = $1 AND columns.col_name IN ($2:csv)',
                [data.fileName, data.columns]);
        }, function (reason) {
            console.log(reason);
        })
.done(function(){
if(sco){
sco.done();
cb();
}
});
};
Now again, I'm not sure what you meant by "ugly", but in my use case the return format was something like this:
{
column:[
{
id: data,
data: data,
col_name: data,
files_id: data,
fid: data,
files_name: data
},...
]
}
And in my case I really wanted this:
{
column:[
{
id: data,
data: data,
col_name: data,
files_id: data,
},...
],
file:[
{
fid: data,
files_name: data
},...
]
}
So in order to do that, I took the same shared connection and added an extra variable to manage the results. Now, this may not answer your question, or I might just be onto something, but I suggest looking into pg-promise; it can be helpful for advanced queries and formatting.
My question asked whether there was a way to use the node-postgres library that would clean up our params-creation code before the query. However, judging from the several deleted answers as well as the remaining one, it seems we were being ornery: those few extra lines aren't that big a deal, and this is the best way to write this code. So I'm marking this question "answered", although it now appears it wasn't the greatest question and perhaps we shouldn't have asked it in the first place.