Good afternoon. I'm learning the library for working with PostgreSQL from Python, and its documentation says:
Never, never, NEVER use Python string concatenation (+) or string parameters interpolation (%) to pass variables to a SQL query string. Not even at gunpoint.
I want to select the object and data columns from the reports table.
I tried to make a function like this:
def select(self, column, table):
    with conn.cursor() as cursor:
        stmt = sql.SQL('SELECT {} FROM {}').format(
            sql.Identifier(column),
            sql.Identifier(table))
        cursor.execute(stmt)
        for row in cursor:
            print(row)
But I get an error:
psycopg2.ProgrammingError: column "object, data" does not exist
LINE 1: SELECT "object, data" FROM "object"
I managed to achieve the desired result using the function:
def select(self, column, table):
    with conn.cursor() as cursor:
        cursor.execute("SELECT %s FROM %s" % (column, table))
        return cursor.fetchall()
Can you please tell me how to make a function without using %s?
Instead of passing the columns as a single string, pass a list of the column names. You can then use sql.SQL(', ').join() to join them into a composable SQL object.
def select(self, columns, table):
    with conn.cursor() as cursor:
        stmt = sql.SQL('SELECT {} FROM {}').format(
            sql.SQL(', ').join(sql.Identifier(n) for n in columns),
            sql.Identifier(table))
        cursor.execute(stmt)
        for row in cursor:
            print(row)
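Called this way, each identifier is quoted separately, so the composed statement becomes SELECT "object", "data" FROM "reports". A minimal usage sketch (assuming conn is an open psycopg2 connection and the method lives on a small wrapper class, here called Database):
db = Database()                            # hypothetical wrapper class holding the select() method
db.select(['object', 'data'], 'reports')   # prints each row of the object and data columns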
This connection works, but the result is just the text of the query itself:
Connection = cx_Oracle.connect(user=username, password=password, dsn=dsn, encoding=enc)
query = 'simple select statement'
cursor = Connection.cursor()
cursor.execute(query)
Connection.commit()
cursor.close()
print(query)
The result in the dataframe prints 'SELECT RECV_MBR_ID...' instead of the IDs. What am I missing?
This is not unexpected! You are simply printing the value to which you set that variable! You need to fetch the results. You can do this in one of several ways. I'll show a couple of the more common ones here:
for row in cursor.execute(query):
print(row)
OR
cursor.execute(query)
print(cursor.fetchall())
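Since the question mentions a DataFrame: once the rows are fetched, they can be loaded into pandas. A minimal sketch (assuming pandas is available and cursor and query are as above; the column names are taken from cursor.description):
import pandas as pd

cursor.execute(query)
columns = [desc[0] for desc in cursor.description]  # column names from the cursor metadata
df = pd.DataFrame(cursor.fetchall(), columns=columns)
print(df)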
I have a Python class:
import sqlite3

class Database:
    conn = sqlite3.connect('database.db')
    c = conn.cursor()

    def __init__(self):
        pass
Inside this class I have multiple methods that I will use with my Database class, such as:
def create_table(self, table_name, *args):
    pass

def add_user(self):
    pass

def remove_user(self):
    pass
And so on.
My question is: how do I use *args with my create_table method if I am not sure how many columns I will have? For example, if I know I will have first, last and pay columns, then my function will look like this:
def create_table(self, table_name, *args):
    self.c.execute("""CREATE TABLE '{}' ('{}' text, '{}' text, '{}' integer)"""
                   .format(table_name, args[0], args[1], args[2]))
    self.conn.commit()
So if I want to create a table I can do this:
Item = Database()
Item.create_table('employees', 'First_name', 'Last_name', 100000)
But what if I don’t know how many columns I will have?
Thanks
def create_table(tableName, *args):
    columns = ''
    for i in args:
        columns += i
        columns += ' '
    message = 'CREATE TABLE {} ({})'.format(tableName, columns[:-1])
    return message

print(create_table('employees', 'first', 'text,', 'last', 'text,', 'pay', 'integer'))
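If you want to actually execute it against the class from the question, here is a minimal sketch (assuming the columns arrive as (name, type) pairs, which avoids relying on trailing commas in the arguments):
import sqlite3

class Database:
    conn = sqlite3.connect('database.db')
    c = conn.cursor()

    def create_table(self, table_name, *columns):
        # columns is a sequence of (name, type) pairs, e.g. ('first', 'text')
        defs = ', '.join('{} {}'.format(name, ctype) for name, ctype in columns)
        self.c.execute('CREATE TABLE {} ({})'.format(table_name, defs))
        self.conn.commit()

Item = Database()
Item.create_table('employees', ('first', 'text'), ('last', 'text'), ('pay', 'integer'))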
Not sure how variable your columns are, or over what time frame. It seems you have a base set of definitions and then, later, a new column pops up. So assume your table starts with the columns you mentioned above.
We run the create table first, then loop over the files, updating as we go; if we find a new column, we run an ALTER TABLE tableName ADD column_name datatype (see the sketch below) and then obviously update based on the key.
Or you can run over the data start to finish and create the table all at once, as qafrombayarea suggests. Our JSON files are just not that disciplined.
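A minimal sketch of that ALTER TABLE approach with sqlite3 (the employees table and the department column are hypothetical; identifiers cannot be bound as parameters, so these names must come from your own code, not from user input):
import sqlite3

conn = sqlite3.connect('database.db')
c = conn.cursor()
c.execute("CREATE TABLE IF NOT EXISTS employees (first text, last text, pay integer)")

def existing_columns(table):
    # PRAGMA table_info returns one row per column; index 1 is the column name
    return [row[1] for row in c.execute("PRAGMA table_info({})".format(table))]

# Suppose a later file introduces a new field:
new_column, datatype = 'department', 'text'
if new_column not in existing_columns('employees'):
    c.execute("ALTER TABLE employees ADD COLUMN {} {}".format(new_column, datatype))
conn.commit()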
Sorry for this, but I'm really new to SQLite: I've created a database from an Excel sheet I had, and I can't seem to fetch the values of the column I need.
query = """ SELECT GNCR from table"""
cur.execute(query)
This actually works, but
query = """ SELECT ? from table"""
cur.execute(query, my_tuple)
doesn't
Here's my code:
import sqlite3

def print_col(to_print):
    db = sqlite3.connect('my_database.db')
    cur = db.cursor()
    query = " SELECT ? FROM my_table "
    cur.execute(query, to_print)
    results = cur.fetchall()
    print(results)

print_col(('GNCR',))
The result is:
[('GNCR',), ('GNCR',), ('GNCR',), ('GNCR',), [...]]
instead of the actual values
What's the problem? I can't figure it out.
the "?" character in query is used for parameter substitution. Sqlite will escape the parameter you passed and replace "?" with the send text. So in effect you query after parameter substitution will be SELECT 'GNCR' FROM my_table where GNCR will be treated as text so you will get the text for each row returned by you query instead of the value of that column.
Basically you should use the query parameter where you want to substitute the parameter with escaped string like in where clause. You can't use it for column name.
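One way around this is to validate the requested column against the table's actual columns and then build the query with ordinary string formatting. A minimal sketch (my_database.db, my_table and print_col are from the question; PRAGMA table_info is used to discover the real column names):
import sqlite3

def print_col(column):
    db = sqlite3.connect('my_database.db')
    cur = db.cursor()
    # Discover the real column names; index 1 of each table_info row is the name
    valid = [row[1] for row in cur.execute("PRAGMA table_info(my_table)")]
    if column not in valid:
        raise ValueError("unknown column: {}".format(column))
    cur.execute("SELECT {} FROM my_table".format(column))  # safe: name was whitelisted
    print(cur.fetchall())

print_col('GNCR')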
I can't execute a psycopg2 query (Postgres db) that uses the best-practice %s placeholder for parameters and also contains a LIKE pattern with a literal % sign.
The % sign in the LIKE pattern is interpreted as a placeholder, so 'IndexError: tuple index out of range' is thrown.
I tried escaping the % with a backslash, but that didn't work.
with psycopg2.connect(some_url) as conn:
    with conn.cursor() as cur:
        query = """
            SELECT id
            FROM users
            WHERE surname IN %s AND named LIKE '%john'
        """
        cur.execute(query, (tuple(["smith", "mcnamara"]),))
        data = cur.fetchall()
Try using a placeholder also for the LIKE expression, and then bind a literal with a wildcard to it:
query = """
SELECT id
FROM users
WHERE surname IN %s AND named LIKE %s"""
cur.execute(query, (tuple(["smith", "mcnamara"]), "%John",))
data = cur.fetchall()
try this one:
with psycopg2.connect(some_url) as conn:
    with conn.cursor() as cur:
        query = """
            SELECT id
            FROM users
            WHERE surname IN %s AND named LIKE '%sjohn'
        """
        cur.execute(query, (tuple(["smith", "mcnamara"]), '%'))
        data = cur.fetchall()
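For completeness: psycopg2 also lets you keep the literal pattern inside the query by doubling the percent sign, since %% is the documented way to write a literal % when parameters are passed. A minimal sketch (some_url and the users table are from the question):
import psycopg2

with psycopg2.connect(some_url) as conn:
    with conn.cursor() as cur:
        query = """
            SELECT id
            FROM users
            WHERE surname IN %s AND named LIKE '%%john'
        """
        # Only the IN tuple is a parameter; %% reaches the server as a single literal %
        cur.execute(query, (("smith", "mcnamara"),))
        data = cur.fetchall()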
I have to write, in Python with sqlite3, a function that counts the rows of a table.
The thing is that the user should input the name of that table once the function is executed.
So far I have the following. However, I don't know how to "connect" the variable (table) with the function, once it's executed.
Any help would be great.
Thanks
def RT():
    import sqlite3
    conn = sqlite3.connect("MyDB.db")
    table = input("enter table name: ")
    cur = conn.cursor()
    cur.execute("Select count(*) from ?", [table])
    for row in cur:
        print str(row[0])
    conn.close()
Columns and Tables Can't be Parameterized
As explained in this SO answer, columns and tables can't be parameterized. This fact might not be documented by any authoritative source (I couldn't find one, so if you know of one, please edit this answer and/or the one linked above), but it has been learned through people trying exactly what was attempted in the question.
The only way to dynamically insert a column or table name is through standard python string formatting:
cur.execute("Select count(*) from {0}".format(table))
Unfortunately, this opens you up to the possibility of SQL injection.
Whitelist Acceptable Column/Table Names
This SO answer explains that you should use a whitelist to check against acceptable table names. This is what it would look like for you:
import sqlite3

def RT():
    conn = sqlite3.connect("MyDB.db")
    table = input("enter table name: ")
    cur = conn.cursor()
    if table not in ['user', 'blog', 'comment', ...]:
        raise ...  # Include your own error here
    cur.execute("Select count(*) from {0}".format(table))
    for row in cur:
        print str(row[0])
    conn.close()
The same SO answer cautions against accepting submitted names directly, "because the validation and the actual table could go out of sync, or you could forget the check." Meaning, you should only derive the name of the table yourself. You can do this by making a clear distinction between accepting user input and building the actual query. Here is an example of what you might do.
import sqlite3

acceptable_table_names = ['user', 'blog', 'comment', ...]

def RT():
    """
    Client side logic: prompt the user to enter a table name.
    You could also give a list of names that you associate with ids.
    """
    table = input("enter table name: ")
    if table in acceptable_table_names:
        table_index = acceptable_table_names.index(table)
        RT_index(table_index)

def RT_index(table_index):
    """
    Backend logic: accept a table index instead of querying the user for
    a table name.
    """
    conn = sqlite3.connect("MyDB.db")
    cur = conn.cursor()
    table = acceptable_table_names[table_index]
    cur.execute("Select count(*) from {0}".format(table))
    for row in cur:
        print str(row[0])
    conn.close()
This may seem frivolous, but it keeps the original interface while addressing the potential problem of forgetting to check against a whitelist. The validation and the actual table could still go out of sync; you'll need to write tests to guard against that, along the lines of the sketch below.
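A minimal sketch of such a test (assuming the MyDB.db database and the whitelist from the answer above, written out here without the trailing ellipsis; it checks that every whitelisted name actually exists in sqlite_master, so a dropped or renamed table is caught early):
import sqlite3
import unittest

class TestTableWhitelist(unittest.TestCase):
    def test_whitelist_matches_schema(self):
        conn = sqlite3.connect("MyDB.db")
        cur = conn.cursor()
        cur.execute("SELECT name FROM sqlite_master WHERE type = 'table'")
        existing = {row[0] for row in cur.fetchall()}
        conn.close()
        for name in ['user', 'blog', 'comment']:  # the whitelist from the answer
            self.assertIn(name, existing)

if __name__ == '__main__':
    unittest.main()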