Now, I have some SQL produced by a JSON-to-SQL parse, and its WHERE clause syntax is different from what PostgreSQL expects. I need to convert it with a regex or a string replacement, but I am not familiar with regex.
Input:
SELECT Column_name, Column_name_2, Column_name_3 FROM sample WHERE LIKE(Column_name_2, "text") OR LIKE(Column_name_2, "text_2") OR LIKE(Column_name_2, "text_3") OR LIKE(Column_3, cwcwsd) LIMIT 100;
Output:
SELECT Column_name, Column_name_2, Column_name_3 FROM sample WHERE Column_name_2 LIKE "text" OR Column_name_2 LIKE "text_2" OR Column_name_2 LIKE "text_3" OR Column_3 LIKE cwcwsd LIMIT 100;
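If a one-off conversion is enough, here is a minimal sketch using Python's re module that rewrites the function-call form into the infix form; it assumes neither LIKE argument contains commas or nested parentheses (note that PostgreSQL also reads double-quoted "text" as an identifier, so the string literals may separately need single quotes):
import re

query = ('SELECT Column_name, Column_name_2, Column_name_3 FROM sample '
         'WHERE LIKE(Column_name_2, "text") OR LIKE(Column_name_2, "text_2") '
         'OR LIKE(Column_name_2, "text_3") OR LIKE(Column_3, cwcwsd) LIMIT 100;')

# Rewrite LIKE(col, pattern) as col LIKE pattern.
# Assumes the two arguments contain no commas or nested parentheses.
converted = re.sub(r'LIKE\(\s*([^,()]+?)\s*,\s*([^()]+?)\s*\)', r'\1 LIKE \2', query)

print(converted)
# ... WHERE Column_name_2 LIKE "text" OR ... OR Column_3 LIKE cwcwsd LIMIT 100;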
An alternative approach is to create a PostgreSQL function LIKE(text, text):
create or replace function LIKE(text, text)
returns boolean
language sql
AS $$
SELECT $1 LIKE $2;
$$;
Test it:
SELECT LIKE('success', 'su%');
Returns true.
I have two sets of similar code that give different output. The first example does not return any output, but the second example returns output for the same search input.
First Example:
sql = "SELECT accessionID, title, ISBN, publisher, publicationYear FROM Books WHERE %s LIKE %s";
cursor.execute(sql,(col, "%" + values + "%",))
Second Example:
sql = "SELECT accessionID, title, ISBN, publisher, publicationYear FROM Books WHERE title LIKE %s";
cursor.execute(sql,("%" + values + "%",))
What I am trying to code is a WHERE clause that is dynamic, depending on which text field the user searches on. For example, if a user searches something in the title text box, it should only look at title.
Another way I could think of is to hardcode it with if conditions, but only the first if condition works and the subsequent ones do not.
My question is: how do I make the SQL line dynamic (as in the first example), in the sense that I can use two %s placeholders in the SQL query and still get the same output as the second example?
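For what it's worth (this is not from the thread), placeholders like %s can only stand in for values, never for identifiers, which is why the first example ends up comparing the string 'title' to the pattern instead of the column. A common workaround is to whitelist the column name and interpolate it yourself while still binding the search value; a sketch assuming the same DB-API cursor as in the examples above, with a hypothetical helper name:
ALLOWED_COLUMNS = {"accessionID", "title", "ISBN", "publisher", "publicationYear"}

def search_books(cursor, col, values):
    # Hypothetical helper: the column name is validated against a whitelist
    # before being interpolated; only the search value is bound as a parameter.
    if col not in ALLOWED_COLUMNS:
        raise ValueError("unexpected column: " + col)
    sql = ("SELECT accessionID, title, ISBN, publisher, publicationYear "
           "FROM Books WHERE " + col + " LIKE %s")
    cursor.execute(sql, ("%" + values + "%",))
    return cursor.fetchall()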
I am looking to build a string that takes in some parameters from my cells and use it as my SQL statement, like so:
" select * from table where " & data_from_cells & " group by ...;"
store it as sqlstring
let
mystring = sqlstring,
myQuery = Odbc.Query("driver={Oracle in etc etc etc", mystring)
and I run into this error
Formula.Firewall: Query '...' (step 'myQuery') references other queries or steps, so it may not directly access a data source. Please rebuild this data combination.
Now apparently I can't combine external queries and another query, but I am only using passed-on parameters from Excel for my SQL string. I am hoping to use the WITH keyword as well to make nested queries using parameters, but it doesn't even let me combine values from Excel with an SQL statement.
To be clear, data_from_cells was already transformed and formatted as a string.
When queries that do stuff are called by other queries that do other stuff, sometimes you can get firewall issues.
The way to get around that is for everything to be done in a single query.
The way to get around that without ending up with horrible code is to change your called queries from returning the result to functions that return the result.
For sqlstring:
() => " select * from table where " & data_from_cells & " group by ...;" // returns a function that gets the query string when called
Then your "myQuery" query can be
let
mystring = sqlstring(), //note the parentheses!
myQuery = Odbc.Query("driver={Oracle in etc etc etc", mystring)
I need to create a dynamic query based on two string parameters:
description = "This is the description"
comment = "This is the comment"
query = "insert into case(desc, comm) value(description, comment)"
Note:
there might be single quotes and double quotes in both description and comment.
How do I use formatted %s to generate the query string?
Thank you very much.
UPDATE:
Thanks to Green Cloak Guy (his/her answer has minor issues to be corrected), the right query is:
query = f"insert into case(description, comment) value(\'{description}\', \'{comment}\')"
Use an f-string.
query = f"insert into case({description}, {comment}) value({description}, {comment})"
Don't use any type of string formatting to build actual database queries - that leads to a SQL injection problem. Use a database library instead that properly sanitizes the data.
But if all you need to do is insert some variables into a string, this is more flexible than the % formatting that other languages tend to use (and that's technically still available in Python via "some_string %s %s %s" % (str1, str2, str3)).
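To make that advice concrete, here is a minimal sketch using Python's built-in sqlite3 module (any DB-API driver works the same way, only the placeholder style differs); the table is named cases here because CASE is a reserved word, and the embedded quotes need no escaping because the driver binds the values:
import sqlite3

description = "This is the \"description\" with a ' in it"
comment = 'This is the comment'

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE cases (description TEXT, comment TEXT)")

# Placeholders let the driver handle quoting, so single and double quotes are safe.
conn.execute("INSERT INTO cases (description, comment) VALUES (?, ?)",
             (description, comment))
conn.commit()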
I am stuck trying to figure out which is better between str() and format() in Python.
"SELECT schools.deis_income , schools.school_name,SUM(money.coin_in_amount) AS coinamount, SUM(money.note_in_amount) AS noteamount , SUM(money.coffee_coin_in_amount) AS coffeeamount , SUM(money.coin_out_amount) AS coinoutamount, SUM(money.note_out_amount) AS noteoutamount FROM money_transactions AS money JOIN school_admin_details AS sa on sa.id = money.school_admin_id JOIN schools ON schools.id=sa.school_id WHERE sa.school_id ={school_id} AND money.transaction_time BETWEEN '{start_date}' AND '{end_date}' GROUP BY schools.id".format(school_id=school_id,start_date=start_date,end_date=end_date)
I use the format function here. Can I use str() instead?
Please tell me which one gives a quicker result, str() or format()?
If your question is which of this:
foo = "some text " + str(some_var) + " and some other text"
or this:
foo = "some text {} and some other text".format(var)
is "better", the general consensus is very clear: string formatting is much easier to read and maintain and the one pythonic way to go.
Now for your particular example, the answer is that both are totally wrong - unless you're ok to give full access to your database to even the most inept script kiddie. For SQL queries, the proper solution is to use prepared statements, where your db connector will take care of proper formatting and sanitizing of the values:
# assumes MySQL - for other vendors check your own
# db-api connector's doc for the correct placeholder
query = "SELECT somefield FROM mytable where somedate > %(somedate)s and something_else = %(someval)s"
cursor.execute(query, {"somedate": some_date, "someval": 42})
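Applied to the query from the question, the same idea looks roughly like this (a sketch assuming a MySQL DB-API connector such as PyMySQL, with cursor, school_id, start_date and end_date coming from the surrounding code):
query = """
    SELECT schools.deis_income, schools.school_name,
           SUM(money.coin_in_amount) AS coinamount,
           SUM(money.note_in_amount) AS noteamount,
           SUM(money.coffee_coin_in_amount) AS coffeeamount,
           SUM(money.coin_out_amount) AS coinoutamount,
           SUM(money.note_out_amount) AS noteoutamount
    FROM money_transactions AS money
    JOIN school_admin_details AS sa ON sa.id = money.school_admin_id
    JOIN schools ON schools.id = sa.school_id
    WHERE sa.school_id = %(school_id)s
      AND money.transaction_time BETWEEN %(start_date)s AND %(end_date)s
    GROUP BY schools.id
"""
cursor.execute(query, {"school_id": school_id,
                       "start_date": start_date,
                       "end_date": end_date})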
I have a pl/pgsql script that needs to check whether a word/sentence is in a string, and it must respect word boundaries and be case insensitive.
Example:
String: "my label xx zz yy", Pattern: "my label", MATCH
String: "xx my label zz", Pattern: "my label", MATCH
String: "my labelxx zz", Pattern: "my label", NO MATCH
So the obvious solution is to use a regex, like this:
select _label ~* (E'\\y' || _pattern || E'\\y') into _match;
It works but is slow, compared to a simple
select _label ilike '%' || _pattern || '%' into _match;
This is wrapped in a function that my script calls A LOT (tens of millions of times; I do a lot of recursion), and with this requirement the overall runtime doubled.
Now my question is: is there a faster way to implement this?
Thanks.
EDIT: ended up using this:
if _label ilike '%' || _pattern || '%' then
select _label ~* (E'\\m' || _pattern || E'\\M') into _match;
end if;
and it is significantly faster.
I would consider the full text search capabilities, but from what you're describing, I'd likely implement this using PostgreSQL arrays.
First: define a function that takes a label, lowercases it (or uppercase if you prefer), splits it on word boundaries, and returns an array. Say:
CREATE OR REPLACE FUNCTION label_to_array(text) RETURNS text[] AS $$
SELECT regexp_split_to_array(lower($1), E'\\W');
$$ LANGUAGE sql IMMUTABLE;
$ select label_to_array('my label xx zz yy');
label_to_array
---------------------
{my,label,xx,zz,yy}
Now, create a GIN index over this function:
CREATE INDEX sometable_label_array_key ON sometable
USING GIN ((label_to_array(label)));
From here, PostgreSQL can use this index for many queries involving array operators, such as "contains":
SELECT *
FROM sometable
WHERE label_to_array(label) @> label_to_array('my label');
This query would split 'my label' into {my,label}, and would then use the index to find a list of rows containing my, intersect that with the list of rows containing label, and then return the result. This isn't exactly equivalent to your original query (since it doesn't check their order), but since it uses an index to eliminate most of the rows in the table, adding the original check on the end would work just fine:
SELECT *
FROM sometable
WHERE label_to_array(label) @> label_to_array('my label')
AND label ~* (E'\\y' || 'my label' || E'\\y');