Cannot find postgres table created using python program - python-3.x

I wrote a Python program to create a Postgres table and populate it with some data. Here is the code below:
conn = psycopg2.connect(database="metrics", user="souvik", password="*******", host="localhost", port="5432")
cur = conn.cursor()
cur.execute("CREATE TABLE IF NOT EXISTS device_mse(date date,metric character varying(255),device character varying(255),mse double precision);")
insert_statement = "INSERT INTO device_mse VALUES (%s,%s,%s,%s);"
cur.executemany(insert_statement, result)
The program runs successfully and doesn't throw any error. However, in the postgres command line, when I search for the table in the metrics database using \dt, it doesn't show up.
I even tried \d+ device_mse, but it says no relation found. What is going wrong?

As pointed out by @a_horse_with_no_name, I forgot to add a commit statement to the program, because of which the table was never created.
Updated code:
conn = psycopg2.connect(database="metrics", user="souvik", password="*******", host="localhost", port="5432")
cur = conn.cursor()
cur.execute("CREATE TABLE IF NOT EXISTS device_mse(date date,metric character varying(255),device character varying(255),mse double precision);")
insert_statement = "INSERT INTO device_mse VALUES (%s,%s,%s,%s);"
cur.executemany(insert_statement, result)
conn.commit()  # commit on the connection; psycopg2 cursors have no commit() method
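Alternatively, a psycopg2 connection can be used as a context manager, which commits the transaction when the block exits normally (and rolls back on an exception), so the commit can't be forgotten. A minimal sketch, assuming result is the same list of row tuples as above:
import psycopg2

conn = psycopg2.connect(database="metrics", user="souvik", password="*******", host="localhost", port="5432")

# The with-block commits on success and rolls back on an exception;
# it does not close the connection, so close it explicitly afterwards.
with conn:
    with conn.cursor() as cur:
        cur.execute("CREATE TABLE IF NOT EXISTS device_mse(date date,metric character varying(255),device character varying(255),mse double precision);")
        cur.executemany("INSERT INTO device_mse VALUES (%s,%s,%s,%s);", result)

conn.close()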

Related

Problem with one-to-many relationship sqlite3

I created a one-to-many relationship table and, according to the sqlite3 documentation, I shouldn't be able to insert a value into the child table if the referenced column value does not exist in the parent table.
import sqlite3

class Database:
    def __init__(self, database_name):
        self.database_name = database_name

    def create_table(self, table_name, *columns):
        columns = ", ".join(columns)
        conn = sqlite3.connect(self.database_name)
        cursor = conn.cursor()
        _SQL = f"CREATE TABLE IF NOT EXISTS {table_name}({columns})"
        cursor.execute(_SQL)
        conn.commit()
        cursor.close()
        conn.close()

    def insert_values(self, table_name, values, *columns):
        dynamic_values = ('?, ' * len(columns))[0:-2]
        columns = ", ".join(columns)
        conn = sqlite3.connect(self.database_name)
        cursor = conn.cursor()
        _SQL = f"INSERT INTO {table_name}({columns}) VALUES ({dynamic_values})"
        cursor.execute(_SQL, values)
        conn.commit()
        cursor.close()
        conn.close()

    def view_values(self, table_name, *columns):
        columns = ", ".join(columns)
        conn = sqlite3.connect(self.database_name)
        cursor = conn.cursor()
        _SQL = f"SELECT {columns} FROM {table_name}"
        cursor.execute(_SQL)
        the_data = cursor.fetchall()
        cursor.close()
        conn.close()
        return the_data

data = Database("games.db")
#
# data.create_table("supplier_groups", "group_id integer PRIMARY KEY", "group_name text NOT NULL")
#
data.insert_values("supplier_groups", ("Domestic", ), "group_name")
# data.create_table("suppliers ", "supplier_id INTEGER PRIMARY KEY",
#                   "supplier_name TEXT NOT NULL",
#                   "group_id INTEGER NOT NULL, "
#                   "FOREIGN KEY (group_id) REFERENCES supplier_groups (group_id)")
data.insert_values("suppliers", ('ABC Inc.', 9), "supplier_name", "group_id")
As you can see on this line: data.insert_values("supplier_groups", ("Domestic", ), "group_name") - I'm inserting a value into the supplier_groups table,
and then right here: data.insert_values("suppliers", ('ABC Inc.', 9), "supplier_name", "group_id") - I'm inserting a value into the suppliers table with a group_id that does not exist in the supplier_groups table. Python executes it successfully and adds the row to the database; however, when attempting to execute the same command in the SQLite browser I get this error:
Execution finished with errors. Result: FOREIGN KEY constraint failed - which is what Python should also have done instead of adding the row to the database.
So, could anyone explain to me what's going on here? Am I misunderstanding something? Help would be appreciated.
From Section 2, Enabling Foreign Key Support, in the SQLite docs:
Assuming the library is compiled with foreign key constraints enabled, it must still be enabled by the application at runtime, using the PRAGMA foreign_keys command. For example:
sqlite> PRAGMA foreign_keys = ON;
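Applied to the question's code, this means the pragma has to be issued on each new connection before inserting, since foreign key enforcement is off by default for every connection. A minimal sketch of what insert_values could look like with that change (the rest of the Database class stays the same):
def insert_values(self, table_name, values, *columns):
    # (inside the Database class from the question)
    dynamic_values = ('?, ' * len(columns))[0:-2]
    columns = ", ".join(columns)
    conn = sqlite3.connect(self.database_name)
    conn.execute("PRAGMA foreign_keys = ON")  # enforce FOREIGN KEY constraints on this connection
    cursor = conn.cursor()
    _SQL = f"INSERT INTO {table_name}({columns}) VALUES ({dynamic_values})"
    cursor.execute(_SQL, values)  # now raises sqlite3.IntegrityError for a missing group_id
    conn.commit()
    cursor.close()
    conn.close()
With the pragma on, the second insert_values call (the one with group_id 9) fails with sqlite3.IntegrityError: FOREIGN KEY constraint failed, matching what the SQLite browser reports.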

How to dynamically input the table name and fetch the results in SQLite?

I have just started learning SQLite and am creating a project that has a .sqlite file containing multiple tables. I want to ask the user to input the table_name, and then the program will fetch the columns present in that particular table.
So far I have done this.
app_database.py
import sqlite3

def column_names(table_name):
    conn = sqlite3.connect('northwind_small.sqlite')
    c = conn.cursor()
    c.execute("PRAGMA table_info(table_name)")
    columns = c.fetchall()
    for c in columns:
        print(c[1])
    conn.commit()
    conn.close()
our-app.py
import app_database
table_name = input("Enter the table name = ")
app_database.column_names(table_name)
When I run our-app.py, I don't get any output.
C:\Users\database-project>python our-app.py
Enter the table name = Employee
C:\Users\database-project>
Can anyone tell me how I should proceed?
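The pragma in the snippet above queries the literal name table_name rather than the value of the variable, and PRAGMA table_info returns no rows for a table that doesn't exist, which is why nothing is printed. Table names can't be bound as ? parameters, so the name has to be formatted into the statement, ideally after checking that it really is a table. A sketch of what column_names could look like under that approach (the database file name is taken from the question):
import sqlite3

def column_names(table_name):
    conn = sqlite3.connect('northwind_small.sqlite')
    c = conn.cursor()
    # Validate the user-supplied name against the existing tables before
    # interpolating it into the PRAGMA (PRAGMAs cannot take bound parameters).
    c.execute("SELECT name FROM sqlite_master WHERE type = 'table' AND name = ?", (table_name,))
    if c.fetchone() is None:
        print(f"No such table: {table_name}")
    else:
        c.execute(f"PRAGMA table_info({table_name})")
        for column in c.fetchall():
            print(column[1])  # the second field of each PRAGMA row is the column name
    conn.close()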

An issue with inserting blob data into SQL tables

I'm trying to create a piece of code that inserts an object I've created (to store data in a very specific way) into an SQL table as a blob type, and it keeps giving me an 'sqlite3.InterfaceError: Error binding parameter 1 - probably unsupported type.' error.
Have any of you encountered something similar before? Do you have any ideas on how to deal with it?
conn = sqlite3.connect('my_database.db')
c = conn.cursor()
params = (self.question_id, i) #i is the object in question
c.execute('''
INSERT INTO '''+self.current_test_name+''' VALUES (?, ?)
''',params)
conn.commit()
conn.close()
For starters, this would be a more appropriate execute statement as it is way cleaner:
c.execute("INSERT INTO "+self.current_test_name+" VALUES (?, ?)", (self.question_id, i))
You are also missing the table you are inserting into (or the columns, if self.current_test_name is the table name).
Also, is the column in the database set up to handle the data type of the provided input for self.question_id and i? (Not expecting TEXT when you provided INT?)
Example of a working script to insert into a table that has 2 columns named test and test2:
import sqlite3
conn = sqlite3.connect('my_database.db')
c = conn.cursor()
c.execute("CREATE TABLE IF NOT EXISTS test(test INT, test2 INT)")
conn.commit()
for i in range(10):
    params = (i, i)  # i is the object in question
    c.execute("INSERT INTO test (test, test2) VALUES (?, ?)", params)

conn.commit()
conn.close()
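Note that the original error comes from binding a custom Python object directly: sqlite3 only knows how to bind None, int, float, str, and bytes. If the goal really is to store the object as a BLOB, it has to be serialized to bytes first, for example with pickle. An illustrative sketch only - the Question class and the answers table are made-up names standing in for the question's object and table:
import pickle
import sqlite3

class Question:
    # stand-in for the custom object from the question
    def __init__(self, text):
        self.text = text

conn = sqlite3.connect('my_database.db')
c = conn.cursor()
c.execute("CREATE TABLE IF NOT EXISTS answers(question_id INT, payload BLOB)")

obj = Question("What is a blob?")
blob = pickle.dumps(obj)  # serialize the object to bytes, which sqlite3 stores as a BLOB
c.execute("INSERT INTO answers (question_id, payload) VALUES (?, ?)", (1, blob))
conn.commit()

# Reading it back: the stored bytes can be deserialized with pickle.loads
c.execute("SELECT payload FROM answers WHERE question_id = ?", (1,))
restored = pickle.loads(c.fetchone()[0])
conn.close()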

How to copy one database table data to another database table sqlite3 python

I have two databases: #1 is tripplanner.db and #2 is trip.db. I want to add the data from the 'restaurants' table in trip.db to the 'restaurants' table in tripplanner.db (which is empty right now). I am using the sqlite3 module that is built into Python.
Please help me, and tell me how I can execute this in Python.
import sqlite3
import os
conn = sqlite3.connect('trip.db')
c = conn.cursor()
c.execute("DROP TABLE IF EXISTS things")
c.execute("ATTACH DATABASE ? AS db2", (os.path.join('data', 'db', 'trip_tripplanner.db'),))
c.execute("SELECT things FROM db2.sqlite_master WHERE type='table' AND name='things'")
c.execute(c.fetchone()[0])
c.execute("INSERT INTO trip.things SELECT * FROM db2.things")
conn.commit()
conn.close()
This code is what I have tried so far, based on posts on Stack Overflow, but it is giving me an error because I don't know what 'data' is in os.path.join('data').
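The 'data' and 'db' arguments in that copied snippet are just directory names from the post it was taken from; os.path.join only builds the path to the second database file. Assuming both database files sit in the current directory and both already contain a restaurants table with the same columns, a minimal sketch of the copy could look like this:
import sqlite3

# Open the destination database (the one whose restaurants table is empty)
conn = sqlite3.connect('tripplanner.db')
c = conn.cursor()

# Attach the source database under the alias "src"
c.execute("ATTACH DATABASE ? AS src", ('trip.db',))

# Copy every row from the source table into the destination table
c.execute("INSERT INTO restaurants SELECT * FROM src.restaurants")
conn.commit()

c.execute("DETACH DATABASE src")
conn.close()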

Count number of rows in Pysqlite3

I have to write a function in Python (sqlite3) to count the rows of a table.
The thing is that the user should input the name of that table once the function is executed.
So far I have the following. However, I don't know how to "connect" the variable (table) with the function, once it's executed.
Any help would be great.
Thanks
def RT():
    import sqlite3
    conn = sqlite3.connect("MyDB.db")
    table = input("enter table name: ")
    cur = conn.cursor()
    cur.execute("Select count(*) from ?", [table])
    for row in cur:
        print str(row[0])
    conn.close()
Columns and Tables Can't be Parameterized
As explained in this SO answer, columns and tables can't be parameterized. This fact might not be documented by any authoritative source (I couldn't find one, so if you know of one please edit this answer and/or the one linked above), but instead has been learned by people trying exactly what was attempted in the question.
The only way to dynamically insert a column or table name is through standard Python string formatting:
cur.execute("Select count(*) from {0}".format(table))
Unfortunately, this opens you up to the possibility of SQL injection.
Whitelist Acceptable Column/Table Names
This SO answer explains that you should use a whitelist to check against acceptable table names. This is what it would look like for you:
import sqlite3

def RT():
    conn = sqlite3.connect("MyDB.db")
    table = input("enter table name: ")
    cur = conn.cursor()
    if table not in ['user', 'blog', 'comment', ...]:
        raise ...  # Include your own error here
    cur.execute("Select count(*) from {0}".format(table))
    for row in cur:
        print(str(row[0]))
    conn.close()
The same SO answer cautions against accepting submitted names directly, "because the validation and the actual table could go out of sync, or you could forget the check." Meaning, you should only derive the name of the table yourself. You can do this by making a clear distinction between accepting user input and building the actual query. Here is an example of what you might do.
import sqlite3

acceptable_table_names = ['user', 'blog', 'comment', ...]

def RT():
    """
    Client side logic: prompt the user to enter the table name.
    You could also give a list of names that you associate with ids.
    """
    table = input("enter table name: ")
    if table in acceptable_table_names:
        table_index = acceptable_table_names.index(table)
        RT_index(table_index)

def RT_index(table_index):
    """
    Backend logic: accept a table index instead of querying the user
    for the table name.
    """
    conn = sqlite3.connect("MyDB.db")
    cur = conn.cursor()
    table = acceptable_table_names[table_index]
    cur.execute("Select count(*) from {0}".format(table))
    for row in cur:
        print(str(row[0]))
    conn.close()
This may seem frivolous, but it keeps the original interface while addressing the potential problem of forgetting to check against the whitelist. The validation and the actual tables could still go out of sync; you'll need to write tests to guard against that.
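One way to keep the whitelist from going stale - an illustrative sketch, not part of the linked answers - is to build it from sqlite_master at query time, so the accepted names always match the tables that actually exist. The count_rows helper below is a made-up name:
import sqlite3

def count_rows(db_path, table):
    # Validate the requested table against the tables that actually exist
    # before formatting it into the query.
    conn = sqlite3.connect(db_path)
    cur = conn.cursor()
    cur.execute("SELECT name FROM sqlite_master WHERE type = 'table'")
    existing_tables = {row[0] for row in cur.fetchall()}
    if table not in existing_tables:
        conn.close()
        raise ValueError(f"Unknown table: {table!r}")
    cur.execute("Select count(*) from {0}".format(table))
    count = cur.fetchone()[0]
    conn.close()
    return count

print(count_rows("MyDB.db", input("enter table name: ")))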
