Embed Python into Looker - python-3.x

Is there a natural way to wrap Python code so its output can be displayed in Looker?
The ideal dataflow for my problem is SQL DB -> Python -> Looker, or alternatively Looker -> Python -> Looker.
I am hoping to embed a .py file into LookML so that I can automate a Python analysis and have the results ready to display in Looker.

You can't call Python code from Looker. Depending on the database you are using, you may want to look into creating a UDF within that database and then calling it from SQL.
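If the goal is the SQL DB -> Python -> Looker flow, one common workaround is to run the Python analysis on a schedule and write its results back into a table that Looker can model. Below is a minimal sketch, assuming pandas and SQLAlchemy; the connection string, table names, and the analysis itself are hypothetical placeholders.
import pandas as pd
from sqlalchemy import create_engine

# Hypothetical connection string -- replace with your own database URL.
engine = create_engine("postgresql://user:password@host:5432/analytics")

# Pull the raw data that Looker's underlying database already holds.
raw = pd.read_sql("SELECT customer_id, amount FROM orders", engine)

# Stand-in for the real Python analysis (scoring, forecasting, clustering, ...).
summary = raw.groupby("customer_id", as_index=False)["amount"].sum()

# Write the results to a table; a LookML view can then be defined on top of it.
summary.to_sql("customer_amount_summary", engine, if_exists="replace", index=False)
Run the script from cron or another scheduler, then model the resulting table in LookML like any other table.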

Related

How to extract Snowflake table schemas and stored procedures using a Python script?

I'm intermediate with Python and a beginner with Snowflake.
I'm able to connect to Snowflake and fetch table data.
My main problem is extracting table schemas and stored procedures from Snowflake using a Python script.
Thanks in advance.
Use Snowflake's GET_DDL function to get a table's schema, and use SHOW PROCEDURES to list the stored procedures.
The Python package below shows how you can run those queries from Python; look at the execute_snowquery function:
https://github.com/Infosys/Snowflake-Python-Development-Framework
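If you prefer to stay with the plain Snowflake Python connector, the same two queries can be run directly. A minimal sketch, assuming snowflake-connector-python is installed; the account, credentials, and object names are hypothetical.
import snowflake.connector

# Hypothetical credentials -- replace with your own account details.
conn = snowflake.connector.connect(
    account="my_account",
    user="my_user",
    password="my_password",
    database="MY_DB",
    schema="PUBLIC",
)
cur = conn.cursor()

# GET_DDL returns the CREATE statement (i.e. the schema) for a table.
cur.execute("SELECT GET_DDL('TABLE', 'MY_DB.PUBLIC.MY_TABLE')")
print(cur.fetchone()[0])

# SHOW PROCEDURES lists the stored procedures visible in the current schema.
cur.execute("SHOW PROCEDURES")
for row in cur.fetchall():
    print(row)

cur.close()
conn.close()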

Need Jython output in the form of Excel

applist = AdminApp.list().split("\n")
for a in applist:
    print a

I need the output of this script in the form of an Excel sheet.
I'm assuming you want to create and write data to an Excel file with Jython. There are several ways to do it:
You can use Java from Jython and use Apache POI - example
You can simply write to a CSV file, and on Windows, Excel should open it automatically.
You can use a Python library like xlwt or openpyxl.
You'll have to install it, e.g.: easy_install openpyxl.
Then you can follow the tutorial here. You didn't say whether this code runs on your WebSphere instance; if so, it may be a bit harder to install a Python module there, but certainly not impossible.
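Since wsadmin's Jython ships without extra packages, the CSV route is usually the path of least resistance. A minimal sketch, assuming it runs inside the wsadmin Jython environment where AdminApp is available; the output path is a placeholder.
import csv

# AdminApp is provided by the wsadmin environment.
applist = AdminApp.list().split("\n")

# Write one application name per row; Excel opens .csv files directly.
f = open("applications.csv", "wb")
writer = csv.writer(f)
writer.writerow(["Application"])
for a in applist:
    writer.writerow([a])
f.close()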

How to automatically run a daily SQL script (Oracle) using Python?

How can I automatically run a daily SQL script (Oracle) using Python?
- Is that possible to do?
- If possible, how can I automatically export the result of the query to Excel?
You can use python-crontab to schedule the script to run every day, as sketched below; please refer to the documentation: https://pypi.python.org/pypi/python-crontab.
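A minimal scheduling sketch with python-crontab, assuming the script to run lives at /home/user/daily_report.py (a hypothetical path):
from crontab import CronTab

# Open the current user's crontab.
cron = CronTab(user=True)

# Run the (hypothetical) report script every day at 06:00.
job = cron.new(command="python /home/user/daily_report.py")
job.setall("0 6 * * *")

cron.write()
Alternatively, a plain crontab entry or an OS-level scheduler does the same job without any extra Python dependency.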
Use cx_Oracle in Python to run the query itself; this tutorial will help you create the script:
http://www.oracle.com/webfolder/technetwork/tutorials/obe/db/OOW11/python_db/python_db.htm
Once you have the result from the database, you can either use an Oracle scheduler job or an OS-level batch job to export the query result to Excel: https://dba.stackexchange.com/questions/222209/how-to-automate-csv-exports-of-queries-in-pl-sql-developer-oracle
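A minimal sketch of the query-and-export step with cx_Oracle and the standard csv module; the credentials, DSN, query, and output path are hypothetical placeholders, and the resulting .csv opens directly in Excel.
import csv
import cx_Oracle

# Hypothetical connection details -- replace with your own.
conn = cx_Oracle.connect("scott", "tiger", "dbhost:1521/orclpdb")
cur = conn.cursor()

cur.execute("SELECT employee_id, last_name, salary FROM employees")

with open("daily_report.csv", "w", newline="") as f:
    writer = csv.writer(f)
    # Header row from the cursor description, then the data rows.
    writer.writerow([col[0] for col in cur.description])
    writer.writerows(cur.fetchall())

cur.close()
conn.close()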

Writing into a Jupyter Notebook from Python

Is it possible for a Python script to write into an IPython Notebook?
with open("my_notebook.ipynb", "w") as jup:
    jup.write("print(\"Hello there!\")")
If there's some package for doing so, can I also control the way cells are split in the notebook?
I'm designing a software tool (that carries out some optimization) to prepare an iPython notebook that can be run on some server performing scientific computations.
I understand that a related solution is to output to a Python script and load it within an IPython Notebook using %load my_python_script.py. However, that requires the user to type something, which I would ideally like to avoid.
Look at the nbformat repo on Github. The reference implementation is shown there.
From their docs
Jupyter (né IPython) notebook files are simple JSON documents, containing text, source code, rich media output, and metadata. Each segment of the document is stored in a cell.
It also sounds like you want to create the notebook programmatically, so you should use the NotebookNode object.
For the code, something like the following should get you what you need. Use new_code_cell for code cells and new_markdown_cell for plain text cells; text cells use standard Markdown formatting.
import nbformat

notebook = nbformat.v4.new_notebook()
text = """Hello There"""
code = """print("Hello there!")"""
# One markdown cell followed by one code cell.
notebook['cells'] = [nbformat.v4.new_markdown_cell(text),
                     nbformat.v4.new_code_cell(code)]
nbformat.write(notebook, 'filename.ipynb')

Executing python against a script stored in database

DB: MySQL
Language: Python
Framework: Django
Operating System: Linux (Ubuntu)
Hello,
Is there a way to execute Python against the content of a script that is stored in a database? For example, the content of a file is stored in a text column. Would the only solution be to create a temporary file, dump the content from the database into it, and then run Python against it from an OS command? I'm assuming the content of the executed script will need to be stored in a way that escapes quotes, etc.
I'm open to suggestions on what database to use to accomplish my goal. MySQL will require additional wrappers before storing the file content, and possibly others to escape quotes/datetimes/etc.
Please let me know if additional information is necessary, but in essence I'm looking to store Python script content in a database, retrieve it, and run it with the Python interpreter.
Thank you in advance for any advice.
You can use the compile built in function.
s = """def f(x):
return x + x
print(f(22))
"""
code = compile(s, "string", "exec")
exec(code)
# OUT: 44
Although I'm wondering if you couldn't just store a data structure and use that with some pre-defined code. Executing arbitrary code in this way could be dangerous, and a security risk.
This seems very similar to SO post here:
Dynamically loading Python application code from database under Google App Engine
Here is information on exec
http://docs.python.org/release/2.5.2/ref/exec.html
Python Wiki page for Python+MySQL
http://wiki.python.org/moin/MySQL
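Putting the pieces together, a minimal sketch of the fetch-and-execute loop, assuming the MySQLdb driver and a hypothetical scripts table with name and source columns; no temporary file is needed.
import MySQLdb

# Hypothetical connection and table layout -- adjust to your schema.
conn = MySQLdb.connect(host="localhost", user="app", passwd="secret", db="appdb")
cur = conn.cursor()

cur.execute("SELECT source FROM scripts WHERE name = %s", ("daily_job",))
row = cur.fetchone()
conn.close()

if row is not None:
    source = row[0]
    # Compile first so syntax errors surface with a useful pseudo-filename.
    code = compile(source, "daily_job (from db)", "exec")
    exec(code)  # Only do this with trusted script content.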
