Does a dynamic prepared statement make sense? - node.js

I want to create dynamic prepared statements where every part is dynamic: the values, the table, and the WHERE part.
I use Node.js + PostgreSQL and the pg module to talk to PostgreSQL. The pg module offers a different syntax to go along with Node.js, but I guess the principles are the same. This is based on the official example here:
// dynamic parts that can change
let select = 'name, email, age';
let table = 'user';
let where = 'id=$1 AND gender=$2';
let values = [1, 'female'];

// prepare
const query = {
  // give the query a unique name
  name: 'fetch-user',
  text: 'SELECT ' + select + ' FROM ' + table + ' WHERE ' + where,
  values: values
};

// execute
client.query(query)
  .then(res => console.log(res.rows[0]))
  .catch(e => console.error(e.stack));
I was wondering if this makes sense, performance-wise.
I read the documentation and, from what I understand, when all the parts of a prepared statement are dynamic, the planning may be less effective, or not effective at all.
What should I do? Should I keep this dynamic syntax? Or does it make no sense, so that I should create multiple prepared statements and use them for different tables?
Thanks

There should be no performance issues here. The "dynamic" part of your SQL is just a string you're passing into the query object, so the only cost to consider is resolving the text property. You're passing your database a fully prepared statement; it's Node.js that resolves the different variables to come up with the query object's text property.
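For illustration, here is a minimal sketch of that idea (the buildQuery helper and the 'fetch-' + table naming scheme are illustrative assumptions, not part of your post): the text is assembled entirely in Node first, and PostgreSQL only ever receives the finished parameterized statement plus its values array.

// Minimal sketch - buildQuery and the naming scheme are assumptions for
// illustration, not part of the original code.
function buildQuery(select, table, where, values) {
  return {
    // assumption: pg caches named statements per connection,
    // so reuse a name only for identical statement text
    name: 'fetch-' + table,
    text: 'SELECT ' + select + ' FROM ' + table + ' WHERE ' + where,
    values: values
  };
}

const query = buildQuery('name, email, age', 'user', 'id=$1 AND gender=$2', [1, 'female']);
// query.text is already the complete string:
// 'SELECT name, email, age FROM user WHERE id=$1 AND gender=$2'
client.query(query)
  .then(res => console.log(res.rows[0]))
  .catch(e => console.error(e.stack));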

Related

Getting "Please rebuild this data combination" on a computer but not on another one

This is my first try at using Power Query... I've built a "dynamic" query in which I can change the retrieved fields as well as the filtering fields and values to be used by the query.
It's working perfectly on my computer, but as soon as I try to execute it on another computer, I get the "Please rebuild this data combination" error. I saw some posts saying I'd have to split my query somehow, but I have not been able to figure it out.
Here is what my two tables look like (screenshot: condition and fields selection), and here is my query with the error (screenshot: query).
This might not be very elegant, but it allows me, through a VBA script, to generate the list of fields to be retrieved and the condition to be used by the SQL.
Any idea why it's not working on the other computers, or how to improve the solution I'm using?
Thank you!
Notes:
Hi, all my Privacy Levels are already set to 'None'.
I've tried to parameterize my code but I can't figure out how. The WHERE condition is dynamic: it could be WHERE Number = "1234", but in another case it might be WHERE Assignee = "xyz".
Here is a simplified example of my code:
let
    Source = Sql.Database("xxxx", "yyyy", [Query=
        "Select network, testid
         from CM3T1M1 "
        & paramConditions[Conditions]{0} &
        " "])
in
    Source
The "rebuild this data combination" error comes from the Formula.Firewall. That's a feature to prevent accidentally leaking data. You can change the privacy level to ignore it.
See also: docs.microsoft/dataprivacyfirewall
Is the dynamic query inserting those cells into the SQL query? Report Parameters are nice for letting the user change variables without having to re-edit the query.
Parameterized native SQL queries
from: https://blog.crossjoin.co.uk/2016/12/11/passing-parameters-to-sql-queries-with-value-nativequery-in-power-query-and-power-bi/
let
    Source = Sql.Database("localhost", "Adventure Works DW"),
    Test = Value.NativeQuery(
        Source,
        "SELECT * FROM DimDate
         WHERE EnglishMonthName=@MonthName AND
         EnglishDayNameOfWeek=@DayName",
        [
            MonthName = "March",
            DayName = "Tuesday"
        ]
    )
in
    Test
Dynamic Power Query version of SQL Query
To dynamically generate this SQL Query
select NUMBER, REQUESTED_BY from SourceTable
where NUMBER = 404115
Table.SelectRows is your WHERE; Table.SelectColumns is your SELECT.
let
    Source = ...,
    filterByNum = 404115,
    columnNames = {"NUMBER", "REQUESTED_BY"},
    removedColumns = Table.SelectColumns(
        Source, columnNames, MissingField.Error
    ),
    // I used 'MissingField.Error' so you know right away
    // if there's a typo or bug
    // assuming you are comparing Source[NUMBER]
    filteredTable = Table.SelectRows(
        Source, each [NUMBER] = filterByNum
    )
in
    filteredTable

Kentico 10 ObjectQuery join multiple tables

I am basically trying to run a query that gives me all the Users that have purchased a product with a particular SKU. Essentially this SQL here:
SELECT u.FirstName, u.LastName, u.Email
FROM COM_OrderItem oi INNER JOIN COM_Order o ON oi.OrderItemOrderID = o.OrderID
INNER JOIN COM_Customer c ON o.OrderCustomerID = c.CustomerID
INNER JOIN CMS_User u ON c.CustomerUserID = u.UserID
WHERE oi.OrderItemSKUID = 1013
I was trying to use the ObjectQuery API to achieve this but have no idea how. The documentation here does not cover the specific scenario I am looking for. I came up with this just to see if it works, but I don't get the three columns I am after in the result:
var test = OrderItemInfoProvider
    .GetOrderItems()
    .Source(orderItems => orderItems.Join<OrderInfo>("OrderItemOrderID", "OrderID"))
    .Source(orders => orders.Join<CustomerInfo>("OrderCustomerID", "CustomerID"))
    .Source(customers => customers.Join<UserInfo>("CustomerUserID", "UserID"))
    .WhereEquals("OrderItemSKUID", 1013)
    .Columns("FirstName", "LastName", "Email")
    .Result;
I know this is definitely wrong and I would like to know the right way to achieve this. Perhaps using ObjectQuery is not the right approach here or maybe I can somehow just use raw SQL. I simply don't know enough about Kentico to understand the best approach here.
Actually, the ObjectQuery you created is correct. I tested it and it is providing the correct results. Are you sure there are indeed orders in the system which contain a product with SKUID 1013? (You can check that in the COM_OrderItem database table.)
Also, how are you accessing the results? Iterating through the results should look like this:
foreach (DataRow row in test.Tables[0].Rows)
{
    string firstName = ValidationHelper.GetString(row["FirstName"], "");
    string lastName = ValidationHelper.GetString(row["LastName"], "");
    string email = ValidationHelper.GetString(row["Email"], "");
}

Flask-AppBuilder equivalent of SQLite WHERE clause to filter column data

I'm new to Flask and have started designing a front end for an inventory management database using Flask-AppBuilder.
I have created several models and have managed to display my SQLite data in tables using Flask-AppBuilder's views.
However, I don't seem to be able to find the equivalent of the SQLite WHERE clause to filter or "restrict" column data. I've been reading a lot about SQLAlchemy, filters, and queries, but this has left me more confused than anything else, and the explanations seem extremely elaborate and complicated for something which is extremely simple.
Assuming we reproduce the following SQLite query in Flask-AppBuilder:
SELECT Field_A
FROM Table_A
WHERE Field_A = 'some text'
with:
result = session.query(Table_A).filter_by(Field_A = 'some text').all()
Where does the above line of code go in my app?
Considering I have the following Class:
class Table_A(Model):
    id = Column(Integer, primary_key=True)
    Field_A = Column(String)

    def __repr__(self):
        return self.Field_A
and View:
class Table_AView(ModelView):
    datamodel = SQLAInterface(Table_A)
    label_columns = {'Field_A': 'A'}
    list_columns = ['Field_A']
After much digging, it turns out Flask-AppBuilder uses its own filter classes to enable you to filter your views.
All the classes are referenced here on GitHub:
Flask Filter Classes List
Also note the difference between FilterEqual and FilterEqualFunction here:
What is the difference between: FilterEqual and FilterEqualFunction?
For other customisation, the first port of call for Flask-AppBuilder is the API Reference, where you'll find a couple of examples of the filter classes in action.
In essence it is extremely simple. In your views.py, within the ModelView class you want to filter, simply add base_filters = [['Field_A', FilterEqual, 'abc']] like so:
class Table_AView(ModelView):
    datamodel = SQLAInterface(Table_A)
    label_columns = {'Field_A': 'A'}
    list_columns = ['Field_A']
    base_filters = [['Field_A', FilterEqual, 'abc']]
This will only show the rows where the Field_A column is equal to abc.
Hope this helps someone as it took me nearly (sigh) two weeks to figure it out...
SQLAlchemy is an ORM (Object-Relational Mapper), which means that you don't have to deal with raw SQL; you call a query that you "build" (by adding filters, in your case). It will transparently generate an SQL query, execute it, and return the result as Python objects.
I would suggest you read the SQLAlchemy documentation about filters closely again, especially filter_by:
http://docs.sqlalchemy.org/en/latest/orm/query.html#sqlalchemy.orm.query.Query.filter_by
It is the easiest way to apply a WHERE clause with SQLAlchemy.
If you have declared the model for Table_A correctly, you should be able to use it like so:
result = session.query(Table_A).filter_by(Field_A = 'some text').all()
Here session.query(Table_A).filter_by(Field_A = 'some text') will generate the SQL, and .all() will execute it.

How to use Lowercase function in Sequelize Postgres

I am trying to use the lowercase function to do string searching in Sequelize.
I managed to do it using ilike.
My question is how to use the lowercase function in this scenario?
The findAll using ilike is as follows:
Db.models.Person.findAll({where: {firstName: {$iLike: 'somename'}}});
How do I change it to lower(firstname) = lower('somename')?
PostgreSQL generally uses case-sensitive collations. This means data that appears in each column is compared literally against your query.
You have several options:
1. Follow Ben's answer, and wrap your columns and search values in a sequelize.fn('lower') call.
Pros: No database changes.
Cons: You need to remember to use it for every query. Foregoes indexes (unless you've already created a functional index) and scans tables sequentially, resulting in slower look-ups with large tables. Quite verbose code.
2. Use ILIKE, to case-insensitively match a pattern
To find the name exactly:
Db.models.Person.findAll({where: {firstName: {$iLike: 'name'}}});
To find a fragment, which may be contained within arbitrary characters:
Db.models.Person.findAll({where: {firstName: {$iLike: '%name%'}}});
Pros: Easy to remember. No Sequelize function wrappers - it's a built-in operator, so syntax is neater. No special indexes or database changes required.
Cons: Slow, unless you start messing with extensions like pg_trgm
3. Define your text columns with the citext type, which implicitly compares lowercase
Defining your column types as 'citext' (instead of text or character varying) has the practical effect of turning this:
select * from people where name = 'DAVID'
to this...
select * from people where LOWER(name) = LOWER('DAVID')
The PostgreSQL documentation shows this as an example of how to create your table with the citext type:
CREATE TABLE users (
    nick CITEXT PRIMARY KEY,
    pass TEXT NOT NULL
);
INSERT INTO users VALUES ( 'larry', md5(random()::text) );
INSERT INTO users VALUES ( 'Tom', md5(random()::text) );
INSERT INTO users VALUES ( 'Damian', md5(random()::text) );
INSERT INTO users VALUES ( 'NEAL', md5(random()::text) );
INSERT INTO users VALUES ( 'Bjørn', md5(random()::text) );
SELECT * FROM users WHERE nick = 'Larry';
TL;DR basically swap out your "text" columns for "citext".
The citext module has come bundled with PostgreSQL since 8.4, so there's no need to install anything extra, but you do need to enable it on each database you use it with, using the following SQL:
CREATE EXTENSION IF NOT EXISTS citext WITH SCHEMA public;
And then in your Sequelize definitions, define a type attribute:
// Assuming `Conn` is a new Sequelize instance
const Person = Conn.define('person', {
  firstName: {
    allowNull: false,
    type: 'citext' // <-- this is the only change
  }
});
Then your searches against that column will always be case-insensitive with regular where = queries (see the sketch after the pros and cons below).
Pros: No need to wrap your queries in ugly sequelize.fn calls. No need to remember to explicitly lowercase. Locale aware, so works across all character sets.
Cons: You need to remember to use it in your Sequelize definitions when first defining your table. Always activated - you need to know when defining your tables that you'll want case-insensitive searching.
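To illustrate the point about plain where = queries, here is a minimal sketch, assuming the Person model defined above with its citext firstName column:

// Minimal sketch, assuming the `Person` model defined above.
// No sequelize.fn('lower') wrapping and no iLike needed: the citext
// column makes the comparison case-insensitive on the server side.
Person.findAll({ where: { firstName: 'DAVID' } })
  .then(people => {
    // matches 'david', 'David', 'DAVID', ...
    console.log(people.map(p => p.firstName));
  });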
You can use native functions in the where clause:
Db.models.Person.findAll({
  where: sequelize.where(
    sequelize.fn('lower', sequelize.col('firstname')),
    sequelize.fn('lower', 'somename')
  )
});
which would translate to
select * from person where lower(firstname) = lower('somename');

Entity Framework 4.1 & existing database

Hi, I have an existing database with a table with 30 fields. I want to split the table into several models so I can retrieve/save only the fields I need, instead of retrieving/saving the whole object from the db every time, using C#.
I think I should be using Code-First. Could someone provide an example or a tutorial link?
thanks,
You don't need to split the table to be able to load or persist a subset of fields. Both operations are available with the whole table mapped to a single entity as well.
For selection you simply have to use projection:
var data = from x in context.HugeEntities
select new { x.Id, x.Name };
You can use either anonymous type in projection or any non-mapped class.
For updates you can simply use:
var data = new HugeEntity { Id = existingId, Name = newName };
context.HugeEntities.Attach(data);
var dataEntry = context.Entry(data);
dataEntry.Property(d => d.Name).IsModified = true; // Only this property will be updated
context.SaveChanges();
Or:
var data = new HugeEntity { Id = existingId };
context.HugeEntities.Attach(data);
data.Name = newName;
context.SaveChanges(); // EF detects the change to the Name property and updates only that column
Mapping multiple entities to a single table must follow very strict rules, and it is possible only with table splitting, where all entities must be related with a one-to-one relation (and there are some problems with more than two entities per split table in Code First), or with table-per-hierarchy inheritance. I don't think you want to use either of them for this case.
