I'm a newbie in PostgreSQL, and I have this Python script which converts my Excel file into a pandas DataFrame. Afterwards, the data is sent to my PostgreSQL database.
.....
engine = create_engine('postgresql+psycopg2://username:password@host:port/database')
df.head(0).to_sql('table_name', engine, if_exists='replace',index=False) #truncates the table
conn = engine.raw_connection()
cur = conn.cursor()
output = io.StringIO()
df.to_csv(output, sep='\t', header=False, index=False)
output.seek(0)
contents = output.getvalue()
cur.copy_from(output, 'table_name', null="") # null values become ''
conn.commit()
...
I would like the script to run daily with a crontab or a pgAgent job. My database is currently on my local machine, but it will later be transferred to a server. What's the best way to schedule tasks that I can keep using later on an online server? Also, can I schedule a pgAgent job to run a Python script?
Crontab is a very good tool for scheduling tasks that you want to run repeatedly at specific times or on a restart. crontab -e lets the current user edit their crontab file. For example,
30 18 * * * python ~/Documents/example.py
will run example.py at 18:30 every day. The job runs with the privileges of the user whose crontab it is; the user does not need to be logged in for it to fire. Crontab is very easy to use and edit, completely reliable, and what I personally use for scheduling tasks on my own server.
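One thing worth noting: cron jobs run with a minimal environment, so it is usually safer to give an absolute path to the interpreter and redirect output to a log file so failures are visible. A sketch of such an entry (the paths are placeholders):

30 18 * * * /usr/bin/python3 /home/user/Documents/example.py >> /home/user/example_cron.log 2>&1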
Related
I have been running a web scraper script written in Python. I had to terminate it because of an issue with my internet connection; by that time, the script had been running for almost 2-3 hours. I used a for loop to write the data into a CSV file and called file.close() after the loop to save the file, but since I terminated the program early, those two hours were wasted.
When I tried to delete the newly created CSV file (its size is 0 kB), I got the message 'The action can't be completed because the file is open in Python'. I thought that all the data I extracted is still in RAM (maybe that's why I'm not allowed to delete the 0 kB CSV file?).
So, is there any way to access that data and write it into the above-mentioned CSV file? (Otherwise, I will have to run the same program for another two hours and wait for the results.)
Here's my code!
#! python3.8
import csv
import time

fileCsv = open('newCsv.csv', 'w', newline='')
outputWriter = csv.writer(fileCsv)

for i in range(100_000):  # whatever range
    num, name = 10000, 'hello'  # the data extracted from the website
    outputWriter.writerow([num, name])
    time.sleep(1)

fileCsv.close()  # my program was terminated before this line, inside the for loop
Using with should help here; the file is closed (and its buffer flushed) as soon as the block exits, even if an exception is raised part-way through:
import csv
import time

with open('newCsv.csv', 'w', newline='') as fileCsv:
    wr = csv.writer(fileCsv)
    for i in range(100_000):  # whatever range
        num, name = 10000, 'hello'  # the data extracted from the website
        wr.writerow([num, name])
        time.sleep(1)
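If the concern is losing rows when the process is killed outright rather than by an exception, flushing after each write hands the buffered rows to the operating system as soon as they are produced, so they are not lost if the process dies. A minimal sketch of the same loop with that change (the data values are placeholders as above):

import csv
import time

with open('newCsv.csv', 'w', newline='') as fileCsv:
    wr = csv.writer(fileCsv)
    for i in range(100_000):
        num, name = 10000, 'hello'  # placeholder for the scraped data
        wr.writerow([num, name])
        fileCsv.flush()  # hand buffered rows to the OS after every write
        time.sleep(1)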
I am trying to run a SAS program on Linux using Python. The server has SAS installed (version 9.4), and the Python version I have is 3.7.3.
I learned there is a library, saspy, which I could use; however, my project does not have access to it. So I found another alternative: my intention is to invoke the SAS program from Python and get the return code when it completes successfully. This SAS program usually takes about an hour to complete.
So, I would like to use the return code on success and later notify myself by email. I wrote the code below (sample), but I am unable to make the subprocess work. Any help is appreciated. Thanks.
#!/usr/bin/python
import os
import subprocess
import time

# My function begins here
def Trigger_func(T):
    if T == 1:
        start = time.time()  # want to start a timer to know how long it takes to complete
        # cmd = os.system('BGsas -p path/Sample.sas')  # SAS code is invoked properly if used this way
        sas_script = 'path/Sample.sas'  # just a variable for the SAS code path
        cmd2 = f'BGsas -p {sas_script}'
        result = subprocess.call(cmd2, shell=True)  # with this subprocess call the SAS code is not working; I get a return code <> 0
        if result == 0:
            stop = time.time() - start
            return [1, stop]
        else:
            stop = time.time() - start
            return [0, stop]

When the above process completes, I will use this success return code to notify myself by email.
subprocess.call is an older method of doing this, but it should work; per the documentation, you need to use the returncode attribute to access the return code.
You may be better off using subprocess.run(), which returns a CompletedProcess instance.
Either way, you probably should ensure that your shell command (which looks like a .bat file) actually returns the value from the SAS execution. If it uses call, for example (in Windows batch script), it may actually be running SAS in the background - which is consistent with what you're describing here, and also consistent with the filename (BGsas). You may want to use a different script to launch SAS in the foreground.
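For illustration, here is a rough sketch of the subprocess.run() approach combined with the email notification mentioned in the question; the SMTP host, addresses, and script path are placeholders, and it assumes a mail relay is reachable on localhost:

#!/usr/bin/python
import smtplib
import subprocess
import time
from email.message import EmailMessage

sas_script = 'path/Sample.sas'  # placeholder path
start = time.time()
proc = subprocess.run(f'BGsas -p {sas_script}', shell=True)  # returns a CompletedProcess
elapsed = time.time() - start

msg = EmailMessage()
msg['Subject'] = f'SAS job {"succeeded" if proc.returncode == 0 else "failed"} in {elapsed:.0f}s'
msg['From'] = 'me@example.com'  # placeholder addresses
msg['To'] = 'me@example.com'
msg.set_content(f'Return code: {proc.returncode}')

with smtplib.SMTP('localhost') as smtp:  # assumes a local mail relay
    smtp.send_message(msg)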
I embedded the SAS program in a shell script and then invoked it in Python using p = subprocess.run(['sh', './path/shellscript.sh']), and used p.returncode for further operations.
I see BGsas is not working as intended in my case. Thanks Joe and Tom for your time.
I need to build a solution for a use case, and I am still a bit of a novice with Python 3.9.9's capabilities.
Use Case:
User Billy wants to run a script against a Snowflake database hosted on Azure, call it sandbox, using his own Python script on his local machine.
Billy's Python script, to keep connection settings secure, needs to call a snowflake_conn.py script located in a network folder (\abs\here\is\snowflake_conn.py) and pass arguments for database and schema.
The call will return a Snowflake connection that Billy can use to run his SQL script.
I am envisioning something like:
import pandas as pd
import snowflake_conn # I need to know how to find this in a network folder, not local.
# and then call the custom conn function
sfconn = snowflake_conn.snowflake_connect('database', 'schema')
# where it returns a snowflake.connector connection as sfconn
conn1 = sfconn.cursor()
qry = r'select * from tablename where 1=1'
conn1.execute(qry)
df = conn1.fetch_pandas_all()
I saw something like this, but that was from back in 2016 and likely prior to 3.9.9.
import sys
sys.path.insert(0, "/network/modules/location") # OR "\\abs\here\is\" ??
import snowflake_conn
That snowflake_conn.py file uses configparser.ConfigParser().read() to open a config.ini file in the same folder as the snowflake_conn.py script.
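Conceptually, the shared module looks something like this (a simplified sketch; the section and key names in config.ini are just illustrative):

# snowflake_conn.py (simplified sketch; section/key names are illustrative)
import configparser
import os
import snowflake.connector

def snowflake_connect(database, schema):
    cfg = configparser.ConfigParser()
    # read config.ini that sits next to this module, wherever it lives on the network share
    cfg.read(os.path.join(os.path.dirname(__file__), 'config.ini'))
    return snowflake.connector.connect(
        account=cfg['snowflake']['account'],
        user=cfg['snowflake']['user'],
        password=cfg['snowflake']['password'],
        database=database,
        schema=schema,
    )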
I am following the instructions in another Stack Overflow question (linked below, about 4 years old) to help get the config.ini setup completed.
import my database connection with python
I also found this link, which also seems to point only to a local folder structure, not a network folder.
https://blog.finxter.com/python-how-to-import-modules-from-another-folder/
Eventually I want to encrypt the .ini file to protect its contents for increased security, but I am not sure where to start on that yet.
How can I automatically run a daily SQL script (Oracle) using Python?
- Is it possible to do this?
- If so, how can I export the query result to Excel automatically?
You can use python-crontab to schedule the script to run every day; please refer to the documentation: https://pypi.python.org/pypi/python-crontab.
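A minimal sketch of scheduling with python-crontab (the script path and the time are placeholders):

from crontab import CronTab

cron = CronTab(user=True)  # current user's crontab
job = cron.new(command='python3 /path/to/daily_oracle_report.py')  # placeholder path
job.setall('0 6 * * *')  # every day at 06:00
cron.write()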
Use cx_Oracle in Python. Below is the documentation for that:
http://www.oracle.com/webfolder/technetwork/tutorials/obe/db/OOW11/python_db/python_db.htm
That link will help you create your desired script.
Once you have the result in the database, you can either use an Oracle scheduler job or OS batch scheduler jobs to export your Oracle query result to Excel: https://dba.stackexchange.com/questions/222209/how-to-automate-csv-exports-of-queries-in-pl-sql-developer-oracle
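Alternatively, the query and the Excel export can both be done in the same Python script that cron runs daily; a rough sketch, assuming cx_Oracle, pandas, and openpyxl are installed (connection details, query, and file name are placeholders):

import cx_Oracle
import pandas as pd

# placeholder connection details and query
conn = cx_Oracle.connect('username', 'password', 'host:1521/service_name')
df = pd.read_sql('SELECT * FROM my_table', conn)
df.to_excel('daily_report.xlsx', index=False)  # writing .xlsx requires openpyxl
conn.close()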
db: mysql
lang: python
framework: django
Operating System: Linux (ubuntu)
Hello,
Is there a way to execute Python against the content of a script that is stored in a database? For example, the content of a file is stored in a text column in the DB. Would the only solution be to create a temporary file, dump the content from the DB into it, and then run a Python OS command against it? I'm assuming the content of the executed script will need to be stored in a way that escapes quotes etc.
I'm open to suggestions on what database to use to accomplish my goal. MySQL would require additional wrappers before storing the file content, and possibly others to handle quotes/datetimes/etc.
Please advise if additional information is necessary, but in essence I'm looking to store Python script content in a DB, retrieve it, and run it with the Python interpreter.
Thank you in advance for any advice.
You can use the compile built-in function:
s = """def f(x):
    return x + x

print(f(22))
"""
code = compile(s, "string", "exec")
exec(code)
# OUT: 44
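To tie this to the question, the script text could be fetched from MySQL first and then compiled and executed; a minimal sketch, assuming mysql.connector and a hypothetical scripts table with id and body columns (connection details are placeholders):

import mysql.connector

# placeholder connection details and table/column names
conn = mysql.connector.connect(host='localhost', user='user', password='secret', database='mydb')
cur = conn.cursor()
cur.execute("SELECT body FROM scripts WHERE id = %s", (1,))
(script_text,) = cur.fetchone()
conn.close()

code = compile(script_text, "<db-script>", "exec")
exec(code)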
Although I'm wondering if you couldn't just store a data structure and use that with some pre-defined code. Executing arbitrary code in this way could be dangerous, and a security risk.
This seems very similar to SO post here:
Dynamically loading Python application code from database under Google App Engine
Here is information on exec
http://docs.python.org/release/2.5.2/ref/exec.html
Python Wiki page for Python+MySQL
http://wiki.python.org/moin/MySQL