Python PostgreSQL function? - python-3.x

I've got a PostgreSQL function that inserts rows into one public table from another. Here are the CREATE TABLE statements and the function code:
CREATE TABLE IF NOT EXISTS public.a_input
(
    in_text character varying COLLATE pg_catalog."default"
)
TABLESPACE pg_default;

CREATE TABLE IF NOT EXISTS public.tot_input
(
    in_text character varying COLLATE pg_catalog."default"
)
TABLESPACE pg_default;

INSERT INTO public.a_input (SELECT 'a');
INSERT INTO public.a_input (SELECT 'b');
INSERT INTO public.a_input (SELECT 'c');

CREATE FUNCTION public.inputfunct()
    RETURNS void
    LANGUAGE 'plpgsql'
    COST 100
    VOLATILE PARALLEL UNSAFE
AS $BODY$
BEGIN
    INSERT INTO public.tot_input (
        SELECT in_text
        FROM public.a_input);
END;
$BODY$;
So the table public.a_input has 3 entries ('a', 'b', 'c'), and public.inputfunct selects those 3 rows and inserts them into public.tot_input. I've tested this in PostgreSQL and it works as expected.
Then I go over to Python, where I have this code:
#####################
hostname = 'localhost'
user = 'postgres'
password = 'MyPassword'
dbname = 'postgres'
################
import psycopg2
#########
try:
    con = psycopg2.connect(host=hostname, user=user, password=password, dbname=dbname)
except NameError:
    print('error')
else:
    print('connected')

cur = con.cursor()
cur.callproc("public.inputfunct")
con.commit
con.close()
When I run this, the 'connected' message prints, so I know I'm connecting correctly, and no error is raised. But when I select from public.tot_input, there are no rows. It's as if the function runs, but no rows end up in the tot_input table. Any suggestions?
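One thing worth checking in the snippet above: con.commit has no parentheses, so the method is only referenced, never called, and psycopg2 does not autocommit by default, so the function's INSERT is rolled back when the connection closes. A minimal corrected sketch:
import psycopg2

con = psycopg2.connect(host='localhost', user='postgres',
                       password='MyPassword', dbname='postgres')
cur = con.cursor()
cur.callproc("public.inputfunct")
con.commit()   # parentheses matter: con.commit alone is a no-op reference
con.close()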

Related

Index is used in PostgreSQL but not in YugabyteDB

(question from Slack)
The following script uses the index in the last SELECT on PostgreSQL but not on YugabyteDB.
drop table if exists entry2;

CREATE TABLE entry2 (
    comp_id int,
    path varchar,
    index varchar,
    archtype varchar,
    other JSONB,
    PRIMARY KEY (comp_id, path, index)
);

DO $$
BEGIN
  FOR counter IN 1..200000 BY 1 LOOP
    insert into entry2 values (counter, '/content[open XXX- XXX-OBSERVATION.blood_pressure.v1,0]', '0', 'open XXX- XXX-OBSERVATION.blood_pressure.v1', '{"data[at0001]/events[at0006]/data[at0003]/items[at0004]/value/value" :132,"data[at0001]/events[at0006]/data[at0003]/items[at0005]/value/value": 92}');
    insert into entry2 values (counter, '/content[open XXX- XXX-OBSERVATION.blood_pressure.v1,0]', '1', 'open XXX- XXX-OBSERVATION.blood_pressure.v1', ('{"data[at0001]/events[at0006]/data[at0003]/items[at0004]/value/value" :' || (130 + counter) || ',"data[at0001]/events[at0006]/data[at0003]/items[at0005]/value/value": 90}')::jsonb);
    insert into entry2 values (counter, '/content[open XXX- XXX-OBSERVATION.heart_rate-pulse.v1,0]', '0', 'open XXX- XXX-OBSERVATION.heart_rate-pulse.v1', '{"data[at0001]/events[at0006]/data[at0003]/items[at0004]/value/value" :132,"/data[at0002]/events[at0003]/data[at0001]/items[at0004]/value/value": 113}');
  END LOOP;
END; $$;

drop index if exists blood_pr;
create index blood_pr on entry2 (((other ->> 'data[at0001]/events[at0006]/data[at0003]/items[at0004]/value/value')::integer));

explain analyse
select (other ->> 'data[at0001]/events[at0006]/data[at0003]/items[at0004]/value/value')::integer
from entry2
where (other ->> 'data[at0001]/events[at0006]/data[at0003]/items[at0004]/value/value')::integer > 140
order by (other ->> 'data[at0001]/events[at0006]/data[at0003]/items[at0004]/value/value')::integer
limit 10;
PostgreSQL uses the index to avoid a sort and immediately fetch the first 10 rows.
In PostgreSQL the index is used to avoid a sort because the order of the index matches the ORDER BY. Sorted indexes (B-tree) are the default in PostgreSQL but not in YugabyteDB, a distributed SQL database where indexes are hash-sharded by default on the first column. You can create the index as range-sharded instead with ASC or DESC:
create index blood_pr on entry2(((other ->> 'data[at0001]/events[at0006]/data[at0003]/items[at0004]/value/value')::integer ) ASC);
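With the range-sharded (ASC) index, YugabyteDB can read the rows already ordered and stop after the first 10, avoiding the sort just as PostgreSQL does with its default B-tree index.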

Trying to save an SQLite table inside another table using Python

The problem now is that I can only enter one record. No errors are reported; it just takes the first record from one database and puts it in the other. I am trying to create a machine-usable database from the user-interface database, and I will transfer around 100 records once it is working. I would appreciate any comments or suggestions. Thank you!
import sqlite3

sql = 'INSERT INTO heavenStream (scene, cascade, enclosure, sensor, streamer, dither) VALUES (?, ?, ?, ?, ?, ?)'

def dropTable(crs, conn):
    crs.execute("DROP TABLE IF EXISTS heavenStream")

def createTable(crs, conn):
    sql = '''CREATE TABLE heavenStream(
        id INTEGER PRIMARY KEY AUTOINCREMENT,
        scene TEXT,
        cascade TEXT,
        enclosure TEXT,
        sensor TEXT,
        streamer TEXT,
        dither TEXT,
        timeStream TEXT,
        streamTime TEXT
    )'''
    crs.execute(sql)
    print("Table created successfully........")

def insert_one(conn, crs):
    crs.execute("SELECT * FROM animalStream")
    for row in crs:
        scene = row[1]
        cascade = row[2]
        enclosure = row[3]
        sensor = row[4]
        streamer = row[5]
        dither = row[6]
        print(f"{row[1]} {row[2]} {row[3]} {row[4]} {row[5]} {row[6]}")
        try:
            crs.execute(sql, (scene, cascade, enclosure,
                              sensor, streamer, dither))
        except sqlite3.IntegrityError as err:
            print('sqlite error: ', err.args[0])  # column name is not unique
    conn.commit()

def main():
    conn = sqlite3.connect("/home/harry/interface/wildlife.db")
    crs = conn.cursor()
    dropTable(crs, conn)
    createTable(crs, conn)
    insert_one(conn, crs)
    # conn.commit()
    conn.close()
    print('done')

main()
The user interface database has had records deleted. There is one record with an id of 64 and the rest are in the 90's.
The cursor (crs) changes here
crs.execute(sql, (scene, cascade, enclosure, sensor, streamer, dither))
after the first insert. Therefore, there are "no more rows to fetch" in the original crs.
One solution would be to instantiate another cursor for the insert (a sketch follows the snippet below). Another solution would be to fetchall() the rows into a variable and iterate over that variable, as with:
rows = crs.execute("SELECT * FROM animalStream").fetchall()
for row in rows:
    crs.execute(sql, row[1:7])
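The first option, a dedicated second cursor for the inserts, might look like this minimal sketch, reusing the sql INSERT string from the question (ins is a hypothetical cursor name):
ins = conn.cursor()                 # second cursor, used only for the INSERTs
crs.execute("SELECT * FROM animalStream")
for row in crs:                     # crs keeps iterating the SELECT rows...
    ins.execute(sql, row[1:7])      # ...while ins performs the inserts
conn.commit()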

Cannot update existing row on conflict in PostgreSQL with Psycopg2

I have the following function, defined to insert several rows with iteration in Python using Psycopg2 and PostgreSQL 11.
When I receive the same obj (with the same id), I want to update its date.
import psycopg2
import psycopg2.extras
from typing import Any, Dict, Iterator

def insert_execute_values_iterator(
    connection,
    objs: Iterator[Dict[str, Any]],
    page_size: int = 1000,
) -> None:
    with connection.cursor() as cursor:
        try:
            psycopg2.extras.execute_values(cursor, """
                INSERT INTO objs(
                    id,
                    date
                ) VALUES %s
                ON CONFLICT (id)
                DO UPDATE SET (date) = (EXCLUDED.date)
                """, ((
                    obj['id'],
                    obj['date'],
                ) for obj in objs), page_size=page_size)
        except (Exception, psycopg2.Error) as error:
            print("Error while inserting as in database", error)
When a conflict happens on the unique primary key of the table while inserting an element, I get the error:
Error while inserting as in database ON CONFLICT DO UPDATE command
cannot affect row a second time
HINT: Ensure that no rows proposed for insertion within the same command have duplicate constrained values.
FYI, the clause works on PostgreSQL directly but not from the Python code.
Use unique VALUE-combinations in your INSERT statement:
create table foo(id int primary key, date date);
This should work:
INSERT INTO foo(id, date)
VALUES(1,'2021-02-17')
ON CONFLICT(id)
DO UPDATE SET date = excluded.date;
This one won't:
INSERT INTO foo(id, date)
VALUES(1,'2021-02-17') , (1, '2021-02-16') -- 2 conflicting rows
ON CONFLICT(id)
DO UPDATE SET date = excluded.date;
You can fix this by using DISTINCT ON() in a SELECT statement:
INSERT INTO foo(id, date)
SELECT DISTINCT ON(id) id, date
FROM (VALUES(1,CAST('2021-02-17' AS date)) , (1, '2021-02-16')) s(id, date)
ORDER BY id, date ASC
ON CONFLICT(id)
DO UPDATE SET date = excluded.date;
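On the Python side of the original function, the same fix means de-duplicating the batch before passing it to execute_values. A minimal sketch, assuming the last date seen for each id should win (dedupe_by_id is a hypothetical helper):
def dedupe_by_id(objs):
    # Keep only the last obj per id so that no two rows in the
    # same INSERT statement share a constrained value.
    latest = {}
    for obj in objs:
        latest[obj['id']] = obj
    return list(latest.values())

The function above would then be called as insert_execute_values_iterator(connection, dedupe_by_id(objs)).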

Problem with one-to-many relationship sqlite3

I created a one-to-many relationship between tables, and according to the sqlite3 documentation I can't insert a value into the child table if the referenced column value does not exist in the parent table.
import sqlite3

class Database:
    def __init__(self, database_name):
        self.database_name = database_name

    def create_table(self, table_name, *columns):
        columns = ", ".join(columns)
        conn = sqlite3.connect(self.database_name)
        cursor = conn.cursor()
        _SQL = f"CREATE TABLE IF NOT EXISTS {table_name}({columns})"
        cursor.execute(_SQL)
        conn.commit()
        cursor.close()
        conn.close()

    def insert_values(self, table_name, values, *columns):
        dynamic_values = ('?, ' * len(columns))[0:-2]
        columns = ", ".join(columns)
        conn = sqlite3.connect(self.database_name)
        cursor = conn.cursor()
        _SQL = f"INSERT INTO {table_name}({columns}) VALUES ({dynamic_values})"
        cursor.execute(_SQL, values)
        conn.commit()
        cursor.close()
        conn.close()

    def view_values(self, table_name, *columns):
        columns = ", ".join(columns)
        conn = sqlite3.connect(self.database_name)
        cursor = conn.cursor()
        _SQL = f"SELECT {columns} FROM {table_name}"
        cursor.execute(_SQL)
        the_data = cursor.fetchall()
        cursor.close()
        conn.close()
        return the_data

data = Database("games.db")
#
# data.create_table("supplier_groups", "group_id integer PRIMARY KEY", "group_name text NOT NULL")
#
data.insert_values("supplier_groups", ("Domestic", ), "group_name")

# data.create_table("suppliers", "supplier_id INTEGER PRIMARY KEY",
#                   "supplier_name TEXT NOT NULL",
#                   "group_id INTEGER NOT NULL, "
#                   "FOREIGN KEY (group_id) REFERENCES supplier_groups (group_id)")

data.insert_values("suppliers", ('ABC Inc.', 9), "supplier_name", "group_id")
As you can see, on this line: data.insert_values("supplier_groups", ("Domestic", ), "group_name") I'm inserting a value into the supplier_groups table,
and then right here: data.insert_values("suppliers", ('ABC Inc.', 9), "supplier_name", "group_id") I'm inserting a value into the suppliers table with a group_id that does not exist in the supplier_groups table. Python executes it successfully and adds the row to the database; however, when attempting to execute the same command in the SQLite browser I get this error:
Execution finished with errors. Result: FOREIGN KEY constraint failed, which is what Python should also have done instead of adding the row to the database.
So, could anyone explain to me what's going on here? Am I misunderstanding something? Help would be appreciated.
From Section 2. Enabling Foreign Key Support in the sqlite doc:
Assuming the library is compiled with foreign key constraints enabled, it must still be enabled by the application at runtime, using the PRAGMA foreign_keys command. For example:
sqlite> PRAGMA foreign_keys = ON;
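Applied to the class in the question, that means issuing the PRAGMA on every connection the methods open, since the setting is off by default and is per-connection, not per-database. A minimal sketch:
import sqlite3

conn = sqlite3.connect("games.db")
conn.execute("PRAGMA foreign_keys = ON")  # must be re-enabled on each new connection
# With the pragma on, an INSERT that violates the FOREIGN KEY raises
# sqlite3.IntegrityError instead of silently succeeding.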

How to use an INSERT query to avoid duplicate entries in PostgreSQL database tables

Hi, while using the following code I am getting duplicate entries in my table.
Please suggest some method to avoid such duplicate entries.
Is there any other form of INSERT query that achieves duplication-free tables?
import psycopg2

def connect():
    con = psycopg2.connect("dbname='book_store' user='postgres' password='5283' host='localhost' port='5432'")
    cur = con.cursor()
    cur.execute("CREATE TABLE if not exists books(id SERIAL PRIMARY KEY, title TEXT NOT NULL, author TEXT NOT NULL, year integer NOT NULL, isbn integer NOT NULL)")
    con.commit()
    con.close()

def insert(title, author, year, isbn):
    con = psycopg2.connect("dbname='book_store' user='postgres' password='5283' host='localhost' port='5432'")
    cur = con.cursor()
    cur.execute("INSERT INTO books(title,author,year,isbn) VALUES(%s,%s,%s,%s)", (title, author, year, isbn))
    con.commit()
    con.close()
connect()
insert("the sun","helen",1997,23456777)
insert("the sun","helen",1997,23456777)
Here the same entry gets added again, whereas I want my code to reject such duplication.
Ideally there should be a primary key or unique key constraint defined on the table to prevent duplicates, but if you want to insert only when the record does not already exist, you can use an INSERT statement with a SELECT and a WHERE NOT EXISTS clause (the #-prefixed names are placeholders for your values):
INSERT INTO books(title, author, year, isbn)
select #title, #author, #year, #isbn
where not exists (select 1 from books where title=#title and author=#author and year=#year and isbn=#isbn);
The WHERE condition should check primary key or unique key columns rather than all of the columns.
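Translated to psycopg2, the #-placeholders become %s parameters. A minimal sketch of the insert function rewritten this way (insert_if_absent is a hypothetical name; the connection string is the one from the question):
def insert_if_absent(title, author, year, isbn):
    con = psycopg2.connect("dbname='book_store' user='postgres' password='5283' host='localhost' port='5432'")
    cur = con.cursor()
    cur.execute(
        "INSERT INTO books(title, author, year, isbn) "
        "SELECT %s, %s, %s, %s "
        "WHERE NOT EXISTS (SELECT 1 FROM books "
        "WHERE title=%s AND author=%s AND year=%s AND isbn=%s)",
        (title, author, year, isbn, title, author, year, isbn))
    con.commit()
    con.close()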
