NoSuchNodeError when checking whether a table exists in a file - python-3.x

I use PyTables and want to check whether a table has already been created; if not, I create it.
I use the following code:
if handle.__contains__(handle.root.grades) == False:
    handle.create_table('/', 'grades', grades)
When there is no table named "grades", the program reports the error "NoSuchNodeError: group / does not have a child named grades".
Once a table named "grades" exists, the following
handle.__contains__(handle.root.grades)
returns True.
How should I determine whether a certain table exists?

I use the following code to work around the problem:
import tables as tb

try:
    handle.__contains__(handle.root.grades)
    print('contains grades')
except tb.NoSuchNodeError:
    print('there is no grades, will create it')
    handle.create_table('/', 'grades', grades)
In PyTables, the implementation of File.__contains__ is:
def __contains__(self, path):
    try:
        self.get_node(path)
    except NoSuchNodeError:
        return False
    else:
        return True
So, does PyTables have a problem here, or am I not importing tables correctly?
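For what it's worth, the __contains__ shown above expects a path string, and in the original snippet the NoSuchNodeError is raised while evaluating handle.root.grades itself, before __contains__ is ever called. A minimal sketch of a path-based check, reusing the handle and grades objects from the question:
# membership test against the path string, so no attribute lookup on handle.root happens first
if '/grades' not in handle:
    handle.create_table('/', 'grades', grades)
else:
    print('contains grades')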

Related

cx_Oracle fetch crash

So I've queried data from an Oracle database using cursor.execute(). A relatively simple SELECT query. It works.
But when I try to fetch data from it, Python crashes.
The same occurs for fetchall(), fetchmany() and fetchone().
When the query first broke in fetchmany(), I decided to loop through fetchone(), and it worked for the first two rows, then broke at the third.
I'm guessing it is because there's too much data in the third row.
So, is there any way to bypass this issue and pull the data?
(Please ignore the wrong indents; I could not copy properly on my phone.)
EDIT:
I removed four columns with type "ROWID". There was no issue after that. I was easily able to fetch 100 rows in one go.
So to confirm my suspicion, I went ahead and created another copy with only those ROWID columns, and it crashes as expected.
So is there any issue with the ROWID type?
Test table for the same:
Insert into TEST_FOR_CX_ORACLE (Z$OEX0_LINES,Z$OEX0_ORDER_INVOICES,Z$OEX0_ORDERS,Z$ITEM_ROWID) values ('ABoeqvAEyAAB0HOAAM','AAAL0DAEzAAClz7AAN','AAAVeuABHAAA4vdAAH','ABoeo+AIVAAE6dKAAQ');
Insert into TEST_FOR_CX_ORACLE (Z$OEX0_LINES,Z$OEX0_ORDER_INVOICES,Z$OEX0_ORDERS,Z$ITEM_ROWID) values ('ABoeqvABQAABKo6AAI','AAAL0DAEzAAClz7AAO','AAAVeuABHAAA4vdAAH','ABoeo+AIVAAE6dKAAQ');
Insert into TEST_FOR_CX_ORACLE (Z$OEX0_LINES,Z$OEX0_ORDER_INVOICES,Z$OEX0_ORDERS,Z$ITEM_ROWID) values ('ABoeqvABQAABKo6AAG','AAAL0DAEzAAClz7AAP','AAAVeuABHAAA4vdAAH','ABoeo+AHIAAN+OIAAM');
Insert into TEST_FOR_CX_ORACLE (Z$OEX0_LINES,Z$OEX0_ORDER_INVOICES,Z$OEX0_ORDERS,Z$ITEM_ROWID) values ('ABoeqvAEyAAB0HOAAK','AAAL0DAEzAACl0EAAC','AAAVeuABHAAA4vdAAH','ABoeo+AHIAAN+OIAAM');
Script:
from cx_Oracle import makedsn, connect, Cursor
from pandas import read_sql_table, DataFrame, Series
from time import time

def create_conn(host_link, port, service_name, user_name, password):
    dsn = makedsn(host_link, port, service_name=service_name)
    return connect(user=user_name, password=password, dsn=dsn)

def initiate_connection(conn):
    try:
        dbconnection = create_conn(*conn)
        print('Connected to ' + conn[2] + ' !')
    except Exception as e:
        print(e)
        dbconnection = None
    return dbconnection

def execute_query(query, conn):
    dbconnection = initiate_connection(conn)
    try:
        cursor = dbconnection.cursor()
        print('Cursor Created!')
        return cursor.execute(query)
    except Exception as e:
        print(e)
        return None

start_time = time()
query = '''SELECT * FROM test_for_cx_oracle'''
try:
    # ecspat_c holds the connection parameters (host, port, service name, user, password), defined elsewhere
    cx_read_query = execute_query(query, ecspat_c)
    time_after_execute_query = time()
    print('Query Executed')
    columns = [i[0] for i in cx_read_query.description]
    time_after_getting_columns = time()
except Exception as e:
    print(e)
print(time_after_execute_query - start_time, time_after_getting_columns - time_after_execute_query)
Unfortunately, this is a bug in the Oracle Client libraries. You will see it if you attempt to fetch the same rowid value multiple times in consecutive rows. If you avoid that situation all is well. You can also set the environment variable ORA_OCI_NO_OPTIMIZED_FETCH to the value 1 before you run the query to avoid the problem.
This has been reported earlier here: https://github.com/oracle/python-cx_Oracle/issues/120
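For reference, one way to set that environment variable from the script itself is shown below; it only needs to be in the process environment before the query runs, as described above, and exporting it in the shell works just as well:
import os

# workaround from the linked issue: disable the optimized rowid fetch
# before any query is executed
os.environ['ORA_OCI_NO_OPTIMIZED_FETCH'] = '1'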

asyncpg fetch feedback (python)

I have been using psycopg2 to manage items in my PostgreSQL database. Recently someone suggested that I could improve my database transactions by using asyncio and asyncpg in my code. I have looked around Stack Overflow and read through the documentation for examples. I have been able to create tables and insert records, but I haven't been able to get the execution feedback that I desire.
For example in my psycopg2 code, I can verify that a table exists or doesn't exist prior to inserting records.
def table_exists(self, verify_table_existence, name):
    '''Verifies the existence of a table within the PostgreSQL database'''
    try:
        self.cursor.execute(verify_table_existence, name)
        answer = self.cursor.fetchone()[0]
        if answer == True:
            print('The table - {} - exists'.format(name))
            return True
        else:
            print('The table - {} - does NOT exist'.format(name))
            return False
    except Exception as error:
        logger.info('An error has occurred while trying to verify the existence of the table {}'.format(name))
        logger.info('Error message: {}'.format(error))
        sys.exit(1)
I haven't been able to get the same feedback using asyncpg. How do I accomplish this?
import asyncpg
import asyncio

async def main():
    conn = await asyncpg.connect('postgresql://postgres:mypassword@localhost:5432/mydatabase')
    answer = await conn.fetch('''
        SELECT EXISTS (
            SELECT 1
            FROM pg_tables
            WHERE schemaname = 'public'
            AND tablename = 'test01'
        ); ''')
    await conn.close()

    #####################
    # the fetch returns
    # [<Record exists=True>]
    # but prints 'The table does NOT exist'
    #####################
    if answer == True:
        print('The table exists')
    else:
        print('The table does NOT exist')

asyncio.get_event_loop().run_until_complete(main())
You used fetchone()[0] with psycopg2, but just fetch(...) with asyncpg. The former will retrieve the first column of the first row, while the latter will retrieve a whole list of rows. Being a list, it doesn't compare as equal to True.
To fetch a single value from a single row, use something like answer = await conn.fetchval(...).
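A minimal sketch of the fetchval variant of the check above, using the same query and connection as in the question:
# fetchval returns the first column of the first row, i.e. the boolean itself
answer = await conn.fetchval('''
    SELECT EXISTS (
        SELECT 1
        FROM pg_tables
        WHERE schemaname = 'public'
        AND tablename = 'test01'
    ); ''')
if answer:
    print('The table exists')
else:
    print('The table does NOT exist')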

InternalError: current transaction is aborted, commands ignored until end of transaction block, on UNIQUE constraint

I tried to have my except: statement handle the case where an insert violates the UNIQUE constraint, but I ended up with the error above instead.
The PostgreSQL table already contains the row that I insert with
db.insert("The News","AparnaKumar",1995,234569654)
although everything works fine when inserting rows that are not duplicates.
import psycopg2

class database:
    def __init__(self):
        self.con = psycopg2.connect("dbname='book_store' user='postgres' password='5283' host='localhost' port='5432' ")
        self.cur = self.con.cursor()
        self.cur.execute("CREATE TABLE if not exists books(id SERIAL PRIMARY KEY,title TEXT NOT NULL UNIQUE,author TEXT NOT NULL,year integer NOT NULL,isbn integer NOT NULL UNIQUE)")
        self.con.commit()

    def insert(self, title, author, year, isbn):
        try:
            self.cur.execute("INSERT INTO books(title,author,year,isbn) VALUES(%s,%s,%s,%s)", (title, author, year, isbn))
            self.con.commit()
        except:
            print("already exists..")

    def view(self):
        self.cur.execute("SELECT * FROM books")
        rows = self.cur.fetchall()
        print(rows)

    def search(self, title=None, author=None, year=None, isbn=None):
        self.cur.execute("SELECT * FROM books WHERE title=%s or author=%s or year=%s or isbn=%s", (title, author, year, isbn))
        row = self.cur.fetchall()
        print(row)

db = database()
db.insert("The News", "AparnaKumar", 1995, 234569654)
db.view()
db.search(year=1995)
You can either modify your Python function to roll back the transaction, or modify the SQL so that it does not insert a new row if a conflict occurs (PostgreSQL versions 9.5+):
option 1:
def insert(self, title, author, year, isbn):
    try:
        self.cur.execute("INSERT INTO books(title,author,year,isbn) VALUES(%s,%s,%s,%s)", (title, author, year, isbn))
        self.con.commit()
    except:
        self.con.rollback()
        print("already exists..")
option 2 (works for PostgreSQL versions 9.5 or later):
def insert(self, title, author, year, isbn):
    try:
        self.cur.execute("INSERT INTO books(title,author,year,isbn) VALUES(%s,%s,%s,%s) ON CONFLICT DO NOTHING", (title, author, year, isbn))
        self.con.commit()
    except:
        print("already exists..")
Use a SAVEPOINT.
See the second question in the psycopg FAQ list.
If you want to keep a transaction open, you should use a SAVEPOINT and roll back to that SAVEPOINT when the exception occurs. But since you use a catch-all exception (too broad), this will also happen on other errors, so it gets a bit out of control.
Perhaps a very simple solution is to check using a SELECT if the record you're about to insert already exists.
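A minimal sketch of what the SAVEPOINT approach could look like for the insert method from the question, narrowing the catch-all to psycopg2.IntegrityError so unrelated errors still surface:
def insert(self, title, author, year, isbn):
    self.cur.execute("SAVEPOINT before_insert")
    try:
        self.cur.execute("INSERT INTO books(title,author,year,isbn) VALUES(%s,%s,%s,%s)", (title, author, year, isbn))
        self.cur.execute("RELEASE SAVEPOINT before_insert")
    except psycopg2.IntegrityError:
        # undo only the failed INSERT; the enclosing transaction stays usable
        self.cur.execute("ROLLBACK TO SAVEPOINT before_insert")
        print("already exists..")
    self.con.commit()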

Calling a function from another file within an if - Python 3.x

I've found resources on here, but they pertain to locally embedded functions. I have one file called "test" and another called "main". I want test to contain all of my logic, while main will contain a complete list of functions, each of which corresponds to a health insurance policy. There are hundreds of policies, so it would become quite tedious to write an if statement in "test" for each one every time. I'd like to write as few lines as possible to call a function based on what a value states. Something like:
insurance = input()
The end result would not be an input, but for testing/learning purposes it is. The input would always correspond exactly to an insurance policy, if it exists. So in "test" I currently have:
from inspolicy.main import bcbs, uhc, medicare

print('What is the insurance?(bcbs, uhc, medicare)')
insurance = input()

if insurance.lower() == 'bcbs':
    bcbs()
elif insurance.lower() == 'uhc':
    uhc()
elif insurance.lower() == 'medicare':
    medicare()
else:
    print('This policy can not be found in the database. Please set aside.')
With "main" including:
def bcbs():
print('This is BCBS')
def uhc():
print('This is UHC')
def medicare():
print('This is Medicare')
So is there a way to have the input (i.e. insurance) be what is referenced against to call the function from "main"?
Thank you in advance for your time!
The best approach to this is to use a dictionary to map between the name of your insurance policies and the function that deals with them. This could be a hand-built dict in one of your modules, or you could simply use the namespace of the main module (which is implemented using a dictionary):
# in test
import types
import inspolicy.main  # import the module itself, rather than just the functions

insurance = input('What is the insurance?(bcbs, uhc, medicare)')
func = getattr(inspolicy.main, insurance, None)
if isinstance(func, types.FunctionType):
    func()
else:
    print('This policy can not be found in the database. Please set aside.')
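The hand-built dictionary mentioned at the start of this answer could look something like this (a sketch reusing the imports from the question):
from inspolicy.main import bcbs, uhc, medicare

# map each policy name to the function that handles it
dispatch = {'bcbs': bcbs, 'uhc': uhc, 'medicare': medicare}

insurance = input('What is the insurance?(bcbs, uhc, medicare)')
func = dispatch.get(insurance.lower())
if func is not None:
    func()
else:
    print('This policy can not be found in the database. Please set aside.')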
Let's say this is your main.py:
def uhc():
    print("This is UHC")
It is possible to do something like this in test.py:
import main

def unknown_function():
    print('This policy can not be found in the database. Please set aside.')

insurance = input()

try:
    insurance_function = getattr(main, insurance.lower())
except AttributeError:
    insurance_function = unknown_function

insurance_function()
Then, if you type "uhc" as your input, you will get the uhc function from main.py and call it.

How to extract all attributes from an Azure table entity object?

I have code that extracts specific attributes from an Azure Table entity object:
def run(self):
    file_handle = partition_key + '_blob.csv'
    ts = TableService(account_name='dev', account_key='eNiDww==')
    i = 0
    next_pk = None
    next_rk = None
    part_k = "PartitionKey eq '%s'" % (partition_key)
    with open(file_handle, 'a') as fp:
        while True:
            entities = ts.query_entities('Eventsdata', filter=part_k, select='what goes here', next_partition_key=next_pk, next_row_key=next_rk, top=1000)
            i += 1
            json_dict = {}
            for ent in entities:
                if hasattr(ent, 'Day'):
                    day = ent.Day
                else:
                    day = None
                if hasattr(ent, 'EventDetailsJSON'):
                    eventJson = ent.EventDetailsJSON
                else:
                    eventJson = None
                if hasattr(ent, 'EventSubType'):
                    eventSubtype = ent.EventSubType
                else:
                    eventSubtype = None
                print(day, eventJson)
            if hasattr(entities, 'x_ms_continuation'):
                x_ms_continuation = getattr(entities, 'x_ms_continuation')
                next_pk = x_ms_continuation['nextpartitionkey']
                next_rk = x_ms_continuation['nextrowkey']
            else:
                break
I can specify as many fields as I want and extract them.
Is there a way to extract all the attributes without specifying particular fields? In the TableService.query_entities function there is an optional parameter for selecting fields, but I'm not sure whether there is a way to select all of them.
The reason is that if more attributes get added in the future, this code would not capture them.
If you want to get all fields, simply don't specify anything for select.
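A minimal sketch of the query from the run method above with the select argument simply omitted, so every property stored on each entity comes back:
entities = ts.query_entities('Eventsdata', filter=part_k,
                             next_partition_key=next_pk,
                             next_row_key=next_rk, top=1000)
for ent in entities:
    print(ent)  # each entity now carries all of its properties, not a hand-picked subset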
