GoogleAuth: ModuleNotFoundError: No module named 'google.colab' - python-3.x

I want to import data from Google Drive into a DataFrame, but my code breaks at the 'from google.colab import auth' step with a "google.colab module not found" error.
I tried the suggestions at
How to resolve: ModuleNotFoundError: No module named 'google.colab'
from pydrive.auth import GoogleAuth
from pydrive.drive import GoogleDrive
from google.colab import auth
auth.authenticate_user()
import gspread
from oauth2client.client import GoogleCredentials
gc = gspread.authorize(GoogleCredentials.get_application_default())
Error message:
ModuleNotFoundError: No module named 'google.colab'

If you are running the code on your local machine, try using PyDrive to read from Google Drive to your local machine:
!pip install PyDrive
from pydrive.auth import GoogleAuth
from pydrive.drive import GoogleDrive
gauth = GoogleAuth()
gauth.LocalWebserverAuth()  # opens a browser window for the OAuth consent flow
drive = GoogleDrive(gauth)
# Auto-iterate through all files that match this query
file_list = drive.ListFile({'q': "'root' in parents and trashed=false"}).GetList()
# You can download file content with GetContentFile(filename) after
# initializing a GoogleDriveFile instance with the file's id.
file6 = drive.CreateFile({'id': file_list[0]['id']})
file6.GetContentFile('catlove.png')  # Download the file as 'catlove.png'.
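Once GetContentFile() has written the file locally, loading it into a DataFrame is a single pandas call. A minimal sketch, assuming the download produced a CSV; the sample data below is illustrative and stands in for the downloaded file:

```python
import io

import pandas as pd

# Stand-in for the CSV that GetContentFile() would have downloaded;
# in the real flow this would be pd.read_csv('exported.csv') with
# whatever file name you passed to GetContentFile().
sample_csv = io.StringIO("name,score\nalice,10\nbob,7\n")

df = pd.read_csv(sample_csv)
print(df.shape)  # (2, 2)
```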

That's probably because the package isn't installed. Download and install it with pip:
pip install google-colab

Related

ProQEXAFS: ImportError: cannot import name 'donaich' from 'lmfit.lineshapes'

I'm trying to get ProXAS_v2.43 running for the evaluation of QEXAFS data. I installed the necessary packages listed in the manual, but when I try to start the program I get the following error: ImportError: cannot import name 'donaich' from 'lmfit.lineshapes' (C:\Users\sq0346\Anaconda3\lib\site-packages\lmfit\lineshapes.py)
All required packages listed by conda search should be present.
Mainly: Pandas, Scipy, Numpy-indexed, Xraylarch
Full error:
File
~\Anaconda3\envs\py38\Lib\site-packages\ProQEXAFS-GUI-master\ProXAS-2.43\ProXAS_v2.43.py:9
in
import tkinter, time, os, psutil, subprocess, sys, shutil, ast, codecs, re, larch, gc, peakutils.peak, itertools
File ~\Anaconda3\lib\site-packages\larch\__init__.py:47 in
from . import builtins
File ~\Anaconda3\lib\site-packages\larch\builtins.py:21 in
from . import math
File ~\Anaconda3\lib\site-packages\larch\math\__init__.py:4 in
from .utils import (linregress, realimag, as_ndarray,
File ~\Anaconda3\lib\site-packages\larch\math\utils.py:11 in
from .lineshapes import gaussian, lorentzian, voigt
File ~\Anaconda3\lib\site-packages\larch\math\lineshapes.py:16 in
from lmfit.lineshapes import (gaussian, lorentzian, voigt, pvoigt, moffat,
ImportError: cannot import name 'donaich' from 'lmfit.lineshapes'
(C:\Users\sq0346\Anaconda3\lib\site-packages\lmfit\lineshapes.py)
Updating xraylarch to version 0.9.60 resolved it, but produced a new error:
File
~\Anaconda3\Lib\site-packages\ProQEXAFS-GUI-master\ProXAS-2.43\ProXAS_v2.43.py:9
in
import tkinter, time, os, psutil, subprocess, sys, shutil, ast, codecs, re, larch, gc, peakutils.peak, itertools
File ~\Anaconda3\lib\site-packages\larch\__init__.py:48 in
from .version import date, version, release_version
ImportError: cannot import name 'release_version' from
'larch.version'
(C:\Users\sq0346\Anaconda3\lib\site-packages\larch\version.py)
Update xraylarch to its latest version. That will fix the misspelled import.
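Before upgrading, it can help to confirm which versions are actually installed. A minimal stdlib sketch using importlib.metadata (Python 3.8+); the package names checked here are just the two from the error above:

```python
from importlib.metadata import PackageNotFoundError, version

def installed_version(package):
    """Return the installed version string, or None if the package is absent."""
    try:
        return version(package)
    except PackageNotFoundError:
        return None

# Check the two packages involved in the ImportError.
for pkg in ("lmfit", "xraylarch"):
    print(pkg, installed_version(pkg))
```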

How to handle importing a package if it doesn't exist

import os
import re
import fitz # requires fitz, PyMuPDF
import pdfrw
import subprocess
import os.path
import sys
from PIL import Image
In my case fitz doesn't exist, since it is provided by PyMuPDF. Is there a way to tell Python to download and install dependencies if they don't exist?
I am new to Python and learning a lot. Apologies in advance.
Using Python 3.9.4
Platform that I am developing on is macOS but will be deploying on Windows
Editor is VSCode
Use a try/except block to handle the missing package.
Ex:
import subprocess
import sys

def install(package):
    # Use the current interpreter's pip so the install lands in the right environment
    subprocess.check_call([sys.executable, '-m', 'pip', 'install', package])

try:
    import fitz  # the fitz module is provided by the PyMuPDF package
except ImportError:
    install('PyMuPDF')  # note: install PyMuPDF, not 'fitz'
    import fitz
A better practice would be to handle this before your code is executed. Example: using a requirements.txt file with all the dependent packages. And running the file before code execution.
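A sketch of that approach: keep a requirements.txt next to the script and invoke pip against the current interpreter before anything else imports. The helper below only builds the command so it can be inspected; the file name is whatever your project uses:

```python
import sys

def pip_install_command(requirements_path="requirements.txt"):
    """Build the pip invocation for a requirements file.

    Using `sys.executable -m pip` guarantees the install targets the
    same interpreter that will run the code, which matters in virtualenvs.
    """
    return [sys.executable, "-m", "pip", "install", "-r", requirements_path]

# Run once at startup, before the imports that need the packages:
# subprocess.check_call(pip_install_command())
```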

ffmpeg still not working on heroku [discord.py]

I have tried all the methods I found on Internet. Here is the screenshot of my heroku builds:
Here are the modules I imported:
import asyncio
import functools
import itertools
import math
import random
import discord
import youtube_dl
from async_timeout import timeout
from discord.ext import commands
from dotenv import load_dotenv
import os
This is the requirements.txt file:
discord.py
PyNaCl==1.3.0
dnspython==1.16.0
async-timeout==3.0.1
python-dotenv
youtube_dl
And I get the following in the heroku logs:
2021-01-04T17:29:40.235769+00:00 app[worker.1]: ffmpeg: error while loading shared libraries: libvpx.so.5: cannot open shared object file: No such file or directory
Where is the problem?

Importing Colab modules into Colab

I am new to using Colab and want to create modules that can be imported into other notebooks. I have tried a number of methods, and the best I end up with is this error message:
ModuleNotFoundError Traceback (most recent call last)
in ()
----> 1 import converters
ModuleNotFoundError: No module named 'converters'
Can you import a *.ipynb as a module into Colab?
I have mounted the gdrive and set the root dir - see code below.
from google.colab import drive
drive.mount('/content/gdrive/', force_remount=True)
root_dir = "content/gdrive/My Drive/Colab Notebooks/projects/utilities/"
base_dir = root_dir + "projects/"
I have tried other methods as well, including sharing the notebook. Help. Thanks.
Try:
# (1) Mount your Google Drive in Google Colab
from google.colab import drive
drive.mount('/content/drive')
# (2) Insert the directory using sys
import sys
sys.path.insert(0, '/content/drive/My Drive/Colab Notebooks/projects/utilities/')
# (3) Import your module or file
import my_module
OR
from google.colab import drive
drive.mount('/content/drive')
import sys
sys.path.append('/content/drive/My Drive/Colab Notebooks/projects/utilities/')
!cp -r "/content/drive/My Drive/Colab Notebooks/projects/utilities/my_module.ipynb" '/content/'
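The sys.path mechanism both variants rely on can be demonstrated without Drive at all. A self-contained sketch that writes a throwaway module to a temp directory; the module name my_module mirrors the answer above, and its contents are invented:

```python
import pathlib
import sys
import tempfile

# Write a stand-in module where Drive would normally provide one.
tmp = tempfile.mkdtemp()
pathlib.Path(tmp, "my_module.py").write_text("def greet():\n    return 'hello'\n")

# Same step as the sys.path.insert/append calls in the answer.
sys.path.insert(0, tmp)

import my_module
print(my_module.greet())  # hello
```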

How to import .py file in jupyter notebook from AWS S3

I am working on a jupyter notebook in AWS, I have two files: main.ipynb and utils.py, what I would like to do is to import utils in my jupyter notebook file.
Unfortunately I have tried the following solutions and none of them are working:
import sys
sys.path.append("/home/jovyan/dir1")
%load utils.py
And to directly import after changing the directory
import utils
my file "utils.py":
def hello():
    print("hello")
Problem solved:
the solution is to add these lines:
import os
import boto3

s3 = boto3.resource('s3')
s3.meta.client.download_file(os.environ['S3_BUCKET'], "dir1/utils.py", "utils.py")
import utils
Export PYTHONPATH to the directory from which you are trying to import utils.
Run the command below from your file's path:
export PYTHONPATH=.
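What export PYTHONPATH=. does can be verified from Python itself: directories on PYTHONPATH end up on sys.path of any interpreter started with that environment. A self-contained sketch using a temp directory and a subprocess, with a utils.py similar to the question's:

```python
import os
import pathlib
import subprocess
import sys
import tempfile

# Create a utils.py like the one in the question.
tmp = tempfile.mkdtemp()
pathlib.Path(tmp, "utils.py").write_text("def hello():\n    return 'hello'\n")

# Equivalent of running `export PYTHONPATH=<dir>` before starting Python.
env = dict(os.environ, PYTHONPATH=tmp)
result = subprocess.run(
    [sys.executable, "-c", "import utils; print(utils.hello())"],
    env=env, capture_output=True, text=True,
)
print(result.stdout.strip())  # hello
```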
