How to automatically run a daily SQL script (Oracle) using Python?

How to automatically run a daily SQL script (Oracle) using Python?
- Is that possible?
- If possible, how can I export the result of the query to Excel automatically?

You can use python-crontab to execute the script every day; please refer to the documentation: https://pypi.python.org/pypi/python-crontab.
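A minimal sketch of installing such a daily job with python-crontab (the interpreter path, script path, and schedule below are placeholders):

from crontab import CronTab

# Open the current user's crontab.
cron = CronTab(user=True)
# Register the script to run every day at 06:30 (placeholder command and time).
job = cron.new(command='/usr/bin/python3 /path/to/daily_report.py')
job.setall('30 6 * * *')
cron.write()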
Use cx_Oracle in Python to run the query. Here is the documentation for that:
http://www.oracle.com/webfolder/technetwork/tutorials/obe/db/OOW11/python_db/python_db.htm
That link will help you create the script you need.
Once you have the result in the database, you can use either an Oracle Scheduler job or OS-level batch scheduler jobs to export your query result to Excel: https://dba.stackexchange.com/questions/222209/how-to-automate-csv-exports-of-queries-in-pl-sql-developer-oracle
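Alternatively, a minimal sketch of doing the query-and-export step entirely in Python, assuming cx_Oracle, pandas, and openpyxl are installed; the connection details and the daily_sales table are placeholders:

import cx_Oracle
import pandas as pd

# Placeholder credentials and DSN; adjust for your environment.
conn = cx_Oracle.connect('user', 'password', 'host:1521/service_name')
cur = conn.cursor()
cur.execute('SELECT * FROM daily_sales')

# Build a DataFrame from the cursor and write it to an Excel file.
df = pd.DataFrame(cur.fetchall(), columns=[d[0] for d in cur.description])
df.to_excel('daily_sales.xlsx', index=False)

conn.close()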

Related

Databricks Python notebook to execute %sql commandlet based on condition

I have created a Python notebook in Databricks. I have Python logic and need to execute a %sql commandlet.
Say I wanted to execute that commandlet in cmd2 based on a Python variable:
cmd1
EXECUTE_SQL = True
cmd2
if condition:
%sql .....
As mentioned, you can use the following Python code (or Scala) to get behavior similar to the %sql cell:
if condition:
    display(spark.sql("your-query"))
One advantage of this approach is that you can embed variables into the query text.
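For example, a sketch of parameterizing the query with an f-string (this assumes it runs in a Databricks notebook where spark and display are available; the table name and filter value are placeholders, and EXECUTE_SQL is the flag from the question):

table_name = 'sales'
min_amount = 100

if EXECUTE_SQL:
    # The variables are interpolated into the query text before execution.
    display(spark.sql(f'SELECT * FROM {table_name} WHERE amount > {min_amount}'))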
Another alternative, which I used, is to extract the SQL into a different notebook. In my case I don't want any results back; I am also cleaning up the Delta tables and deleting their contents.
/clean-deltatable-notebook (a SQL notebook)
delete from <database>.<table>
Then use dbutils.notebook.run() from the Python notebook:
cmd2
if condition:
    result = dbutils.notebook.run('<path-of-clean-deltatable-notebook>', timeout_seconds=30)
    print(result)
See the Databricks documentation on dbutils.notebook.run().

Embed Python into Looker

Is there a natural way to wrap Python code to display in Looker?
The ideal dataflow for my problem is SQL DB -> Python -> Looker, or alternatively, Looker -> Python -> Looker.
I am hoping to embed a .py file into LookML so that I can automate Python analysis, ready to display in Looker.
You can't call Python code from Looker. Depending on the database you are using, you may want to look into creating a UDF within that database and then calling it using SQL, as sketched below.
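A rough sketch of the UDF route, assuming a PostgreSQL backend and psycopg2; the function, table, and connection details are hypothetical:

import psycopg2

# Placeholder connection details.
conn = psycopg2.connect(host='localhost', dbname='analytics', user='user', password='password')
cur = conn.cursor()

# Create a simple SQL UDF once; Looker can then call it from a derived table or dimension.
cur.execute("""
CREATE OR REPLACE FUNCTION margin_pct(revenue numeric, cost numeric)
RETURNS numeric AS $$
    SELECT CASE WHEN revenue = 0 THEN NULL
                ELSE (revenue - cost) / revenue * 100 END;
$$ LANGUAGE SQL IMMUTABLE;
""")
conn.commit()
conn.close()

Looker would then reference it in SQL, e.g. SELECT margin_pct(revenue, cost) FROM orders.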

How to extract Snowflake table schemas and stored procedures using a Python script?

I'm intermediate with Python and a beginner with Snowflake.
I'm able to connect to Snowflake and fetch table data.
But the main problem is extracting table schemas and stored procedures from Snowflake using a Python script.
Thanks in advance.
Use Snowflake's GET_DDL function, as shown below, to get the table schema,
and use SHOW PROCEDURES to get the procedures.
You can look at the Python package below to see how to run these queries through Python:
https://github.com/Infosys/Snowflake-Python-Development-Framework
Look at the execute_snowquery function.
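A minimal sketch of both calls with snowflake-connector-python (the credentials and object names are placeholders):

import snowflake.connector

# Placeholder credentials.
conn = snowflake.connector.connect(user='USER', password='PASSWORD', account='ACCOUNT')
cur = conn.cursor()

# GET_DDL returns the CREATE statement for the table.
cur.execute("SELECT GET_DDL('TABLE', 'MY_DB.MY_SCHEMA.MY_TABLE')")
print(cur.fetchone()[0])

# SHOW PROCEDURES lists the stored procedures in the schema.
cur.execute('SHOW PROCEDURES IN SCHEMA MY_DB.MY_SCHEMA')
for row in cur.fetchall():
    print(row)

conn.close()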

Crontab and PgAgent to run a Python script

I'm a newbie with Postgres, and I have this Python script which converts my Excel file into a pandas DataFrame. Afterwards, the data is sent to my PostgreSQL database.
.....
engine = create_engine('postgresql+psycopg2://username:password@host:port/database')
df.head(0).to_sql('table_name', engine, if_exists='replace', index=False)  # recreates/truncates the table
conn = engine.raw_connection()
cur = conn.cursor()
# Stream the DataFrame as tab-separated text and bulk-load it with COPY.
output = io.StringIO()
df.to_csv(output, sep='\t', header=False, index=False)
output.seek(0)
cur.copy_from(output, 'table_name', null="")  # null values become ''
conn.commit()
...
I would like the script to run daily via crontab or a PgAgent job. My database is currently on my local machine and will later be moved to a server. What's the best way to schedule tasks so that the setup will still work later on an online server? Also, can I schedule a PgAgent job to run a Python script?
Crontab is a very good tool for scheduling tasks that you want to run repeatedly at specific times or on a restart. crontab -e allows the current user to edit their crontab file. For example,
30 18 * * * python ~/Documents/example.py
will run "example.py" at 18:30 every day, with the privileges of whoever owns the crontab file. Crontab is very easy to use and edit, completely reliable, and what I use personally for scheduling tasks on my own server.
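In practice it is safer to use an absolute interpreter path and capture output in the crontab entry, for example (paths are placeholders):

30 18 * * * /usr/bin/python3 /home/user/Documents/example.py >> /home/user/example.log 2>&1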

Executing Python against a script stored in a database

db: mysql
lang: python
framework: django
Operating System: Linux (Ubuntu)
Hello,
Is there a way to execute Python against the content of a script that is stored in a database? For example, the content of a file is stored in a db text column. Would the only solution be to create a temporary file, dump the content from the db into the file, and then run Python against it via an OS command? I'm assuming the content of the executed script will need to be stored with quotes etc. escaped.
I'm open to suggestions on what database to use to accomplish my goal. MySQL will require additional wrappers before storing the file content, and possibly others to escape quotes/datetimes/etc.
Please advise if additional information is necessary, but in essence I'm looking to store Python script content in a db, retrieve it, and run it with the Python interpreter.
Thank you in advance for any advice.
You can use the compile built-in function:
s = """def f(x):
    return x + x
print(f(22))
"""
code = compile(s, "string", "exec")
exec(code)
# OUT: 44
Although I'm wondering if you couldn't just store a data structure and use that with some pre-defined code. Executing arbitrary code in this way could be dangerous, and a security risk.
This seems very similar to this SO post:
Dynamically loading Python application code from database under Google App Engine
Here is information on exec
http://docs.python.org/release/2.5.2/ref/exec.html
Python Wiki page for Python+MySQL
http://wiki.python.org/moin/MySQL
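Putting those pieces together, a minimal sketch of fetching and running stored code, assuming a hypothetical scripts(name, body) table and the PyMySQL package; connection details are placeholders:

import pymysql

# Placeholder connection details; assumed schema: scripts(name VARCHAR, body TEXT).
conn = pymysql.connect(host='localhost', user='user', password='password', database='mydb')
cur = conn.cursor()
cur.execute('SELECT body FROM scripts WHERE name = %s', ('daily_job',))
row = cur.fetchone()
conn.close()

if row:
    # Compile and run in a fresh namespace; remember that exec of stored code is a security risk.
    code = compile(row[0], 'daily_job', 'exec')
    exec(code, {'__name__': 'daily_job'})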
