I'm trying to call a Teradata stored procedure in Python, but it gives the following error.
cursor = session.cursor()
cursor.callproc("usr.RESULTSET", (teradata.InOutParam("today"),
                                  teradata.OutParam("p2")))
output = cursor.fetchone()
print(output)
Error
raise DatabaseError(i[2], u"[{}] {}".format(i[0], msg), i[0])
teradata.api.DatabaseError: (6, '[HY000] [Teradata][ODBC Teradata Driver] (6) Internal Error (Exception).')
INFO:teradata.udaexec: UdaExec exiting. (2019-05-17 10:02:13,350; udaexec.py:68)
I managed to execute the query via a BTEQ file, and now it gives the correct result set. I am not sure why my previous method failed. The BTEQ file contains the stored procedure call and its parameters.
ex: stored_procedure.bteq
CALL dbname.UsrFail(arg1,arg2)
rows = []
with td.TeraSession(td_sys_name, td_cred) as session:
    results = session.execute(file="/tperf/stored_procedure.bteq")
    for result in results.fetchall():
        rows.append(result.values)
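The row-collection loop above can be factored into a small helper. A minimal sketch, runnable without a live Teradata session: `Row` below is a stand-in for the rows the `teradata` module returns (which expose a `.values` attribute, as in the code above); with a real session you would call `collect_rows(results.fetchall())`.

```python
class Row:
    """Stand-in for a teradata result row, which exposes .values."""
    def __init__(self, values):
        self.values = values

def collect_rows(results):
    """Collect the .values of every fetched row into a plain list."""
    rows = []
    for result in results:
        rows.append(result.values)
    return rows

# Stub data standing in for results.fetchall() on a real session.
stub = [Row([1, "a"]), Row([2, "b"])]
print(collect_rows(stub))  # -> [[1, 'a'], [2, 'b']]
```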
I'm getting an error when attempting to insert from a temp table into a table that exists in Synapse, here is the relevant code:
def load_adls_data(self, schema: str, table: str, environment: str, filepath: str, columns: list) -> str:
    if self.exists_schema(schema):
        if self.exists_table(schema, table):
            if environment.lower() == 'prod':
                schema = "lvl0"
            else:
                schema = f"{environment.lower()}_lvl0"
            temp_table = self.generate_temp_create_table(schema, table, columns)
            sql0 = """
            IF OBJECT_ID('tempdb..#CopyDataFromADLS') IS NOT NULL
            BEGIN
                DROP TABLE #CopyDataFromADLS;
            END
            """
            sql1 = """
            {}
            COPY INTO #CopyDataFromADLS FROM
            '{}'
            WITH
            (
                FILE_TYPE = 'CSV',
                FIRSTROW = 1
            )
            INSERT INTO {}.{}
            SELECT *, GETDATE(), '{}' from #CopyDataFromADLS
            """.format(temp_table, filepath, schema, table, Path(filepath).name)
            print(sql1)
            conn = pyodbc.connect(self._synapse_cnx_str)
            conn.autocommit = True
            with conn.cursor() as db:
                db.execute(sql0)
                db.execute(sql1)
If I get rid of the insert statement and just do a select from the temp table in the script:
SELECT * FROM #CopyDataFromADLS
I get the same error in either case:
pyodbc.ProgrammingError: ('42000', '[42000] [Microsoft][ODBC Driver 17 for SQL Server][SQL Server]Not able to validate external location because The remote server returned an error: (409) Conflict. (105215) (SQLExecDirectW)')
I've run the generated SQL for both the insert and the select directly in Synapse and they ran perfectly. Google has no real information on this, so could someone assist? Thanks
pyodbc.ProgrammingError: ('42000', '[42000] [Microsoft][ODBC Driver 17 for SQL Server][SQL Server]Not able to validate external location because The remote server returned an error: (409) Conflict. (105215) (SQLExecDirectW)')
This error occurs mostly because of authentication or access.
Make sure you have Storage Blob Data Contributor access on the storage account.
In the COPY INTO script, add the authentication key for the blob storage, unless it is a public blob storage.
I tried to repro this using a COPY INTO statement without authentication and got the same error.
After adding authentication using a SAS key, the data is copied successfully.
Refer to the Microsoft documentation for the permissions required for bulk load using COPY INTO statements.
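A minimal sketch of adding a SAS credential to the COPY INTO statement, built as a string the way the question's code does. The table name, blob URL, and token below are placeholders, not values from the question:

```python
def build_copy_into(table: str, blob_url: str, sas_token: str) -> str:
    """Build a Synapse COPY INTO statement that authenticates with a SAS key.

    IDENTITY = 'Shared Access Signature' tells Synapse to treat SECRET as a
    SAS token; for public storage the CREDENTIAL clause can be omitted.
    """
    return f"""
    COPY INTO {table} FROM '{blob_url}'
    WITH (
        FILE_TYPE = 'CSV',
        FIRSTROW = 1,
        CREDENTIAL = (IDENTITY = 'Shared Access Signature', SECRET = '{sas_token}')
    )
    """

# Placeholder values for illustration only.
sql = build_copy_into(
    "#CopyDataFromADLS",
    "https://account.blob.core.windows.net/container/file.csv",
    "sv=...",
)
print(sql)
```

The resulting string would then be passed to `db.execute(...)` exactly like `sql1` in the question.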
I have lots of records to insert into SQL Server. I'm using pyodbc with fast_executemany to achieve this, since the regular cursor.execute() inserts records too slowly.
I'm following this article as a reference: https://towardsdatascience.com/how-i-made-inserts-into-sql-server-100x-faster-with-pyodbc-5a0b5afdba5
sample code I'm using:
query = "SELECT * FROM dbo.my_table"
df = pd.read_sql(query, conn)
my_insert_statement = f"INSERT INTO myschema.new_table(colA, colB, colC, colD, colE) values(?,?,?,?,?)"
cursor = conn.cursor()
cursor.fast_executemany = True
cursor.fast_executemany(my_insert_statement,df.values.tolist())
conn.commit()
cursor.close()
conn.close()
But I keep getting the error below, although I don't have any boolean columns.
'bool' object is not callable
I don't know how to get past this error, and I really need to insert records in bulk and quickly into my database table.
Any ideas on why this error occurs and how to solve it?
The second line in the following snippet has to be wrong: you set fast_executemany to True (it is a boolean attribute, not a method) and then try to call it as a function, which is exactly why Python reports 'bool' object is not callable.
cursor.fast_executemany = True
cursor.fast_executemany(my_insert_statement,df.values.tolist())
Looking at the guide you linked, you have to replace the second line of the snippet with:
cursor.executemany(my_insert_statement,df.values.tolist())
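The corrected pattern is: set the flag, then call executemany. A runnable sketch using sqlite3 instead of pyodbc (executemany has the same signature, but no server is needed); the table and rows below are stand-ins, and with pyodbc you would additionally set cursor.fast_executemany = True before the call:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cursor = conn.cursor()
cursor.execute(
    "CREATE TABLE new_table (colA INT, colB INT, colC INT, colD INT, colE INT)"
)

# Stand-in for df.values.tolist() from the question.
rows = [(i, i, i, i, i) for i in range(5)]

# With pyodbc you would set the flag first:  cursor.fast_executemany = True
# Then CALL executemany -- fast_executemany itself is never called.
cursor.executemany(
    "INSERT INTO new_table (colA, colB, colC, colD, colE) VALUES (?, ?, ?, ?, ?)",
    rows,
)
conn.commit()

count = cursor.execute("SELECT COUNT(*) FROM new_table").fetchone()[0]
print(count)  # -> 5
```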
There are only two functions and one import. After running the script, the database has been generated, but the server responds with a 500 error:
Internal Server Error
The server encountered an internal error or misconfiguration and was unable to complete your request.
Please contact the server administrator at service#webmailer.de to inform them of the time this error occurred, and the actions you performed just before this error.
More information about this error may be available in the server error log.
The error log contains only:
23.08.2021 06:03:54 [...] End of script output before headers: cSQL01.py
#!/usr/bin/env python
import sqlite3

def erzeuge_tabellen():
    with sqlite3.connect("strategie.db") as verbindung:
        cursor = verbindung.cursor()
        cursor.execute("DROP TABLE IF EXISTS viren;")
        cursor.execute("DROP TABLE IF EXISTS viren_typ;")
        cursor.execute("DROP TABLE IF EXISTS vorfall;")
        sql = '''CREATE TABLE viren(
            name TEXT PRIMARY KEY, typ INTEGER, status TEXT)'''
        cursor.execute(sql)
        sql = '''CREATE TABLE viren_typ(
            typ INTEGER PRIMARY KEY, groesse INT, signatur TEXT)'''
        cursor.execute(sql)
        sql = '''CREATE TABLE vorfall(
            name TEXT, ort TEXT, vorfall TEXT)'''
        cursor.execute(sql)

def schreibe_daten():
    with sqlite3.connect("strategie.db") as verbindung:
        cursor = verbindung.cursor()
        sql = '''INSERT INTO viren_typ(typ, groesse, signatur)
            VALUES (1, 128, 'ABAABA'), (2, 256, 'ABAABA'),
                   (3, 256, 'BCCBCB');'''
        cursor.execute(sql)
        sql = '''INSERT INTO viren(name, typ, status)
            VALUES ('T800', 1, 'aktiv'), ('T803', 2, 'aktiv'),
                   ('Bit13', 3, 'aktiv'), ('Gorf3', 1, 'aktiv'),
                   ('Gorf7', 2, 'aktiv');'''
        cursor.execute(sql)

erzeuge_tabellen()
schreibe_daten()
The script above called the functions without (or before) this line:
print ("Content-Type: text/html;charset=utf-8\n")
Having now inserted that line before any other output, no error is thrown.
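That matches the log message: "End of script output before headers" means Apache expected a CGI header line, then a blank line, before any body output. A minimal sketch of the required response shape (the HTML body here is a placeholder):

```python
# A CGI script must emit at least one header line followed by a blank line
# before any body output; otherwise Apache logs
# "End of script output before headers" and returns a 500.
header = "Content-Type: text/html;charset=utf-8"
body = "<html><body>ok</body></html>"  # placeholder body

# print(header) with the trailing "\n" in the question's code produces the
# same thing: header, blank line, then body.
response = header + "\n\n" + body
print(response)
```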
I am trying to run a COPY INTO statement for about 10 files of 100 MB each that are stored in the data lake, and it keeps throwing this error:
Msg 110806, Level 11, State 0, Line 7
110806;A distributed query failed: A severe error occurred on the current command. The results, if any, should be discarded.
Operation cancelled by user.
The command I used is:
COPY INTO AAA.AAAA
FROM 'https://AAAA.blob.core.windows.net/data_*.csv'
WITH (
    CREDENTIAL = (IDENTITY = 'MANAGED IDENTITY'),
    FIELDQUOTE = N'"',
    FIELDTERMINATOR = N',',
    FIRSTROW = 2
);
Where did I go wrong? Please advise.
I'm using pytest to create a test for my functions, and I do not know what is wrong with my test. I added the query from my program.
This is on Windows, reading an SQLite3 database. My current code works, but my test for the update function is not responding.
def test_update_record_of_table(capsys):
    connection = sqlite3.connect("Northwind.db", timeout=10)
    cursor_db = connection.cursor()
    cursor_db.execute("select count(*) from test")
    before_count = cursor_db.fetchone()[0]
    cursor_db.execute("INSERT INTO {table_name}({fields}) VALUES ({new_record});".format(table_name='test', fields=tuple('CategoryName'), new_record=1400))
    print("before_count", before_count)
    assign_12_2.update_record_of_table("test", "CategoryName", "asdssf", 3, cursor_db)
    cursor_db.execute("select CategoryName from test")
    a = cursor_db.fetchall()[0]
    connection.commit()
    print(a)
    assert before_count
This is my error:
cursor_db.execute("INSERT INTO {table_name}({fields}) VALUES ({new_record});".format(table_name='test', fields=tuple('CategoryName'), new_record=1400))
E sqlite3.OperationalError: near "(": syntax error
OperationalError
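The syntax error comes from fields=tuple('CategoryName'): calling tuple() on a string yields a tuple of single characters, so str.format inserts ('C', 'a', 't', 'e', ...) into the SQL, parentheses and quotes included, which produces the near "(": syntax error. A minimal sketch of the same insert with the column name passed whole and the value bound as a parameter (the in-memory database below is a stand-in for Northwind.db):

```python
import sqlite3

connection = sqlite3.connect(":memory:")  # stand-in for Northwind.db
cursor_db = connection.cursor()
cursor_db.execute("CREATE TABLE test (CategoryName)")

# tuple('CategoryName') == ('C', 'a', 't', 'e', ...), which str.format renders
# with parentheses -- hence near "(": syntax error. Pass the column list as a
# plain string instead, and bind the value with a ? placeholder:
cursor_db.execute(
    "INSERT INTO {table_name}({fields}) VALUES (?);".format(
        table_name="test", fields="CategoryName"
    ),
    (1400,),
)
connection.commit()

value = cursor_db.execute("SELECT CategoryName FROM test").fetchone()[0]
print(value)  # -> 1400
```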