Dash app deployed on Heroku cannot read CSV file - python-3.x

I'm trying to use Heroku to deploy my Dash app, which is supposed to read data from a local CSV file. The deployment was successful, but if I open the URL of the app, it gives me an Application Error.
I checked the Heroku logs and found a FileNotFoundError telling me the CSV file the app reads from does not exist, even though the app works when I run it locally. The CSV file does exist in my directory, so I want to know if there's another way to go about this.
EDIT: Actually, this is how my app.py code starts. The FileNotFoundError points to the part where I read the CSV file with pandas.
How can I get my app to read the CSV file?
import dash
import dash_core_components as dcc
import dash_html_components as html
import dash_table as table
from dash.dependencies import Input, Output
import plotly as py
import plotly.graph_objs as go
import numpy as np
import pandas as pd
filepath='C:\\Users\\DELL\\Desktop\\EDUCATE\\DATA CSV\\crop_prod_estimates_GH.csv'
data=pd.read_csv(filepath,sep=',',thousands=',')
data.dropna(inplace=True)
data[['REGION','DISTRICT','CROP']]=data[['REGION','DISTRICT','CROP']].astype('category')
data.CROP=data.CROP.str.strip()
data.drop(data.columns[0],axis=1,inplace=True)

Solved it!
I uploaded my CSV data file to my GitHub repository and had app.py read the data from the raw URL, like:
url = 'https://raw.githubusercontent.com/your_account_name/repository_name/master/file.csv'
df = pd.read_csv(url,sep=",")
df.head()

You can store the CSV file in the same location as your app.py and deploy it together with the app.
Change from:
filepath='C:\\Users\\DELL\\Desktop\\EDUCATE\\DATA CSV\\crop_prod_estimates_GH.csv'
To:
filepath='crop_prod_estimates_GH.csv'
It should work.
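If the working directory on the dyno ever differs from the app's folder, a slightly more defensive variant (a sketch, assuming the CSV is committed next to app.py) resolves the path relative to the file itself:
from pathlib import Path

import pandas as pd

# Build the path relative to app.py so the current working directory doesn't matter.
filepath = Path(__file__).parent / 'crop_prod_estimates_GH.csv'
data = pd.read_csv(filepath, sep=',', thousands=',')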

Upload your CSV file to Cloudinary and read it from the returned URL:
urlfile = 'https://res.cloudinary.com/hmmpyq8rf/raw/upload/v1604671300/localisationDigixpress_n8s98k.csv'
df = pd.read_csv(urlfile,sep=",")
df.head()

Related

Why does Pandas not read xlsx files?

I am having trouble reading an xlsx file with Pandas. The same code used to work before but does not anymore. I have tried a lot of ways, to no avail.
Here is my code
import pandas as pd
from io import StringIO
df = pd.read_csv("Muzika.xlsx")
print(df)
Your file is not a .csv, it is an Excel file, so use read_excel() instead: https://pandas.pydata.org/pandas-docs/stable/reference/api/pandas.read_excel.html
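For reference, a minimal sketch (assuming the file sits next to the script; reading .xlsx files requires the openpyxl package to be installed):
import pandas as pd

# read_excel, not read_csv, for .xlsx files; pandas uses the openpyxl engine here.
df = pd.read_excel("Muzika.xlsx", engine="openpyxl")
print(df.head())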

Read a UTF-16LE file directly in a Cloud Function - Python/GCP

I have a CSV file with UTF-16LE encoding, and I tried to open it in a Cloud Function using:
import pandas as pd
from io import StringIO as sio
with open("gs://bucket_name/my_file.csv", "r", encoding="utf16") as f:
    read_all_once = f.read()
read_all_once = read_all_once.replace('"', "")
file_like = sio(read_all_once)
df = pd.read_csv(file_like, sep=";", skiprows=5)
I get an error that the file is not found at the location. What is the issue? When I run the same code locally with a local path, it works.
Also, when the file is in UTF-8 encoding I can read it directly with:
df = pd.read_csv("gs://bucket_name/my_file.csv", delimiter=";", encoding="utf-8", skiprows=0, low_memory=False)
I need to know: can I read the UTF-16 file directly with pd.read_csv()? If not, how do I make open() recognize the gs:// path?
Thanks in advance!
Yes, you can read the UTF-16 CSV file directly with the pd.read_csv() method.
For the method to work, please make sure that the service account attached to your function has access to read the CSV file in the Cloud Storage bucket.
Please also check whether the encoding of the CSV file is "utf-16", "utf-16le", or "utf-16be", and pass the appropriate one to the method.
I used the Python 3.7 runtime. My main.py and requirements.txt files look as below; you can modify main.py according to your use case.
main.py
import pandas as pd
def hello_world(request):
    # please change the file's URI
    data = pd.read_csv('gs://bucket_name/file.csv', encoding='utf-16le')
    print(data)
    return 'check the results in the logs'
requirements.txt
pandas==1.1.0
gcsfs==0.6.2
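If you still want the open()-style access from the question, one option is a sketch like the following (assuming gcsfs is installed, as in the requirements above, and the function's service account can read the bucket), going through fsspec, which understands gs:// paths and applies the encoding:
import fsspec
import pandas as pd

# fsspec (installed alongside gcsfs) resolves the gs:// path; the encoding is handled here.
with fsspec.open("gs://bucket_name/my_file.csv", "r", encoding="utf-16-le") as f:
    df = pd.read_csv(f, sep=";", skiprows=5)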

Is there any way to call the PubChem API in Python?

I have been using the PubChem API to convert chemical SMILES to a structure image, but I still get an error.
Here is my Google Colab notebook, where I tried with a PIL image plus Tkinter:
https://colab.research.google.com/drive/1TE9WxXwaWKSLQzKRQoNlWFqztVSoIxB7
My desired output should be a structure image like this:
https://pubchem.ncbi.nlm.nih.gov/rest/pug/compound/smiles/O=C(N1C=CN=C1)N2C=CN=C2/PNG?record_type=2d&image_size=large
Download and display in a Jupyter Notebook
from urllib.request import urlretrieve
from IPython.display import Image
smiles = 'NC1=NC(C)=C(C2=CC=C(S(=O)(C)=O)C(F)=C2)S1'
urlretrieve('https://pubchem.ncbi.nlm.nih.gov/rest/pug/compound/smiles/'+smiles+'/PNG', 'smi_pic.png')
p = Image(filename='smi_pic.png')
p
Output: the 2D structure image of the compound is displayed in the notebook.
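Since the question mentions Colab and PIL, a hedged alternative sketch is to fetch the PNG with requests and open it with PIL in memory (requests and Pillow are preinstalled in Colab):
from io import BytesIO

import requests
from PIL import Image

smiles = 'NC1=NC(C)=C(C2=CC=C(S(=O)(C)=O)C(F)=C2)S1'
url = 'https://pubchem.ncbi.nlm.nih.gov/rest/pug/compound/smiles/' + smiles + '/PNG'

# Download the rendered structure and open it as a PIL image without touching disk.
resp = requests.get(url)
resp.raise_for_status()
img = Image.open(BytesIO(resp.content))
img  # displays the structure in a notebook cell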

Python Bokeh FileInput Widget

I want to upload a .txt file with two columns of data using the Bokeh FileInput widget and then plot the data interactively. Can someone provide a minimal example?
Thanks a lot.
There is an example of importing files via the server directory structure and PapaParse here:
Upload a CSV file and read it in Bokeh Web app
That example was made a long time ago, before the FileInput widget was officially included in the Bokeh 1.3.0 distribution. It should now work with this new widget; however, I could not find documentation on how to add a server callback to it.
After testing, I came up with this use of the new FileInput widget:
from bokeh.io import curdoc
from bokeh.models.widgets import FileInput

def upload_fit_data(attr, old, new):
    print("fit data upload succeeded")
    print(file_input.value)

file_input = FileInput(accept=".csv,.json,.txt")
file_input.on_change('value', upload_fit_data)

doc = curdoc()
doc.add_root(file_input)
This will give you the file data as a base64-encoded string (file_input.value). I leave it up to you to convert the base64 string to what you want and plot the data.
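As a follow-up, here is a minimal sketch of that last step (the whitespace separator and the two-column layout of the .txt file are assumptions): decode the base64 payload and hand it to pandas inside the callback:
import base64
from io import StringIO

import pandas as pd

def upload_fit_data(attr, old, new):
    # 'new' is the base64-encoded file contents delivered by FileInput
    decoded = base64.b64decode(new).decode("utf-8")
    df = pd.read_csv(StringIO(decoded), sep=r"\s+", header=None, names=["x", "y"])
    print(df.head())  # feed df into a ColumnDataSource and plot from here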

How to upload Cloud files into Python

I usually load Excel files into a pandas DataFrame with pd.ExcelFile when the files are on my local drive.
How can I do the same if the Excel file is in Google Drive or Microsoft OneDrive and I want to connect remotely?
You can use read_csv() on a StringIO object:
from io import StringIO  # StringIO moved to io in Python 3

import pandas as pd
import requests

r = requests.get('Your google drive link')
data = r.text  # r.text is a str suitable for StringIO; r.content would be bytes
df = pd.read_csv(StringIO(data))
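One caveat worth checking against your own link: a plain Google Drive "share" link usually returns an HTML page rather than the raw file. For a file shared as "anyone with the link", a direct-download URL of the form below is commonly used; FILE_ID is a placeholder you copy out of the share link:
from io import StringIO

import pandas as pd
import requests

# FILE_ID is a placeholder; copy it from the Drive share link.
url = 'https://drive.google.com/uc?export=download&id=FILE_ID'
df = pd.read_csv(StringIO(requests.get(url).text))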