I'm trying to analyze customer shopping data using the lifetimes package in Python, but I'm unable to import the estimation module from lifetimes:
from lifetimes.utils import *
from lifetimes.plotting import *
from lifetimes.estimation import *
ModuleNotFoundError: No module named 'lifetimes.estimation'
Install Lifetimes version 0.2.2.2:
https://pypi.org/project/Lifetimes/0.2.2.2/
pip install Lifetimes==0.2.2.2
Thought I'd expand on this a bit in case anyone else faces the same issue (typically caused by following outdated example code).
The lifetimes.estimation module is no longer available within the lifetimes package (as per the latest documentation). All of its functionality was moved to other modules, so you do not need this module anymore. Use the latest version, 0.11.1, and don't import this specific submodule.
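For reference, the fitters that used to be reached through lifetimes.estimation are now exposed at the package top level. A minimal sketch against the 0.11.x API, with a toy transaction log standing in for real shopping data (a real dataset is needed for a meaningful fit):
import pandas as pd
from lifetimes import BetaGeoFitter
from lifetimes.utils import summary_data_from_transaction_data

# Toy purchase history: one row per transaction
transactions = pd.DataFrame({
    'customer_id': [1, 1, 2, 2, 2, 3],
    'date': pd.to_datetime(['2019-01-05', '2019-02-10', '2019-01-20',
                            '2019-03-01', '2019-04-15', '2019-02-28']),
})

# Build the frequency/recency/T summary that the fitters expect
summary = summary_data_from_transaction_data(
    transactions, customer_id_col='customer_id', datetime_col='date'
)

# The BG/NBD model that previously lived under lifetimes.estimation
bgf = BetaGeoFitter(penalizer_coef=0.1)
bgf.fit(summary['frequency'], summary['recency'], summary['T'])
print(bgf.summary)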
Related
If we write a function which depends on some packages/modules, do we need to import all the modules from within the function? Or do we somehow (how?) check for dependencies and raise an error/warning? How can we do this efficiently, given that the function might be called many, many times? And how do we deal with aliases? I mean, if the code that calls the function has imported a required package under an alias, e.g. import numpy as np, how do we access it from within the function?
Looking at your comment, I would suggest you have answered your own question: you understand that you can't expect the user to know about a function's module dependencies.
You say you want to have this function in a module - let’s call it my_module for now.
So you have a file my_module.py:
# Add any imports this module's code relies on
import os
import numpy as np

def my_numpy_func(arg1, arg2):
    # function code using numpy, e.g.
    my_arr = np.array(arg1)
    result = my_arr + np.array(arg2)  # etc. etc.
    return result

def some_other_func():
    # blah blah blah
    pass
Now anyone who wants to use your function can simply write from my_module import my_numpy_func and not have to worry about loading the dependencies.
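For example, a caller's script (the file name is illustrative) only needs:
# client.py: no numpy import needed here; my_module loads its own dependencies
from my_module import my_numpy_func

print(my_numpy_func([1, 2, 3], [4, 5, 6]))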
Note: this doesn't do anything to ensure that a user actually has the required non-standard packages installed on their machine. For example, if they have not installed numpy, they will get an ImportError when they try to import your function.
If you want to distribute your code and make it so that users don’t need to worry about that, then you probably need to make your code a package that makes the dependencies a requirement on installation.
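A minimal sketch of what that could look like with setuptools (the package name and version here are just placeholders):
# setup.py
from setuptools import setup

setup(
    name="my_module",
    version="0.1.0",
    py_modules=["my_module"],
    install_requires=["numpy"],  # pip installs numpy automatically when installing this package
)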
I am trying to import ByteArray from the cryptonite library.
My cabal file has cryptonite in build-depends, and my import statements look like this:
import Crypto.Internal.ByteArray (ByteArray, Bytes)
import qualified Crypto.Internal.ByteArray as B
The error I get is
Could not load module ‘Crypto.Internal.ByteArray’
it is a hidden module in the package ‘cryptonite-0.25’
I have seen other code examples which use this exact import statement, so what am I missing here?
As per the GHC docs, hidden modules "cannot be imported, but they are still subject to the overlapping constraint: no other package in the same program may provide a module of the same name."
I'm trying to do the Udacity mini project and I've got the latest version of the SKLearn library installed (0.20.2).
When I run:
from sklearn.decomposition import RandomizedPCA
I get the error:
ImportError: cannot import name 'RandomizedPCA' from 'sklearn.decomposition' (/Users/kintesh/Documents/udacity_ml/python3/venv/lib/python3.7/site-packages/sklearn/decomposition/__init__.py)
I actually even upgraded the version using:
pip3 install -U scikit-learn
Which upgraded from 0.20.0 to 0.20.2, which also uninstalled and reinstalled... so I'm not sure why it can't initialise sklearn.decomposition.
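For what it's worth, a quick way to confirm which installation the interpreter inside the venv actually resolves:
import sklearn
print(sklearn.__version__)  # should print 0.20.2 after the upgrade
print(sklearn.__file__)     # path to the site-packages copy being used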
Are there any solutions here that might not result in completely uninstalling python3 from my machine?! Would ideally like to avoid that.
Any help would be thoroughly appreciated!
Edit:
I'm doing some digging and trying to fix this, and it appears as though the __init__.py file in the decomposition library on the SKLearn GitHub doesn't reference RandomizedPCA... has it been removed or something?
Link to the GitHub page
As it turns out, RandomizedPCA() was deprecated in an older version of scikit-learn and has since been removed; its behaviour is now available through PCA() with svd_solver='randomized'.
You can fix this by changing the import statement to:
from sklearn.decomposition import PCA as RandomizedPCA
... and then your classifier looks like this:
pca = RandomizedPCA(n_components=n_components, svd_solver='randomized', whiten=True).fit(X_train)
However, if you're here because you're doing the Udacity Machine Learning course's eigenfaces.py mini-project, you'll notice that the PIL library is also a problem (the original PIL is no longer maintained).
Unfortunately I don't have a solution for that one, but here's the GitHub issue page, and here's a kind-hearted soul who used a Jupyter Notebook to solve their mini-project back when these repositories worked.
I hope this helps, and gives enough information for the next person to get into Machine Learning. If I get some time I might take a crack at recoding eigenfaces.py for SKLearn 0.20.2, but for now I'm just going to crack on with the rest of this course.
In addition to what @Aaraeus said, the PIL library has been forked to Pillow.
You can fix the PIL import error using
pip3 install pillow
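Pillow is a drop-in fork, so code written against PIL keeps working after the install. A quick sanity check (the file name is hypothetical):
from PIL import Image  # provided by Pillow

img = Image.open("example_face.jpg")
print(img.size)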
This is probably a total noob question that has something to do with Python module importing, but I can't understand why the following is valid:
> import tensorflow as tf
> f = tf.train.Feature()
> from tensorflow import train
> f = train.Feature()
But the following statement causes an error:
> from tensorflow.train import Feature
ModuleNotFoundError: No module named 'tensorflow.train'
Can somebody please explain why it doesn't work this way? My goal is to use shorter notation in code like this:
example = Example(
    features=Features(feature={
        'x1': Feature(float_list=FloatList(value=feature_x1.ravel())),
        'x2': Feature(float_list=FloatList(value=feature_x2.ravel())),
        'y': Feature(int64_list=Int64List(value=label))
    })
)
tensorflow version is 1.7.0
Solution
Replace
from tensorflow.train import Feature
with
from tensorflow.core.example.feature_pb2 import Feature
Explanation
Remarks about TensorFlow's Aliases
In general, you have to remember that, for example:
from tensorflow import train
is actually an alias for
from tensorflow.python.training import training
You can easily check the real module name by printing the module. For the current example you will get:
from tensorflow import train
print (train)
<module 'tensorflow.python.training.training' from ....
Your Problem
In TensorFlow 1.7, you can't use from tensorflow.train import Feature, because the from clause needs an actual module name (and not an alias). Since train is an alias, you get an ImportError.
By doing
from tensorflow import train
print (train.Feature)
<class 'tensorflow.core.example.feature_pb2.Feature'>
you'll get the complete path of train.Feature. Now you can use that import path, as shown in the solution above.
Note
In TensorFlow 1.9.0, from tensorflow.train import Feature will work, because tensorflow.train is an actual package, which you can therefore import. (This is what I see in my installed Tensorflow 1.9.0, as well as in the documentation, but not in the Github repository. It must be generated somewhere.)
Info about the path of the modules
You can find the complete module path in the docs. Every module has a "Defined in" section; see, for example, Module: tf.train.
I would advise against importing Feature (or any other object) from the non-public API, which is inconvenient (you have to figure out where Feature is actually defined), verbose, and subject to change in future versions.
As an alternative, I would suggest simply defining:
import tensorflow as tf
Feature = tf.train.Feature
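With one alias per name, the short notation from the question works as written. A self-contained sketch under TF 1.x, with dummy arrays standing in for the question's feature_x1, feature_x2 and label:
import numpy as np
import tensorflow as tf

Example = tf.train.Example
Features = tf.train.Features
Feature = tf.train.Feature
FloatList = tf.train.FloatList
Int64List = tf.train.Int64List

# Dummy data in place of the question's arrays
feature_x1 = np.array([[1.0, 2.0], [3.0, 4.0]])
feature_x2 = np.array([[5.0, 6.0], [7.0, 8.0]])
label = [1]

example = Example(
    features=Features(feature={
        'x1': Feature(float_list=FloatList(value=feature_x1.ravel())),
        'x2': Feature(float_list=FloatList(value=feature_x2.ravel())),
        'y': Feature(int64_list=Int64List(value=label)),
    })
)
print(example)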
I have the following structure:
ds/
BST.py
BSTNode.py
When I do
from .BSTNode import BSTNode
inside BST.py and run BST.py from IDLE from within ds, I receive the following error:
SystemError: Parent module '' not loaded, cannot perform relative import
Why am I receiving it? Note that I am not asking how to fix the problem, because it would work if I simply did
from BSTNode import BSTNode
i.e. without the leading dot. I just want to understand better how Python's import system works.
Note that theoretically I should not need any __init__.py within ds because I am using Python 3.5. If I am not clear enough, just ask!
Thank you!!!!
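A minimal way to see what the import system knows when BST.py runs: relative imports are resolved against __package__, which is empty when a file is executed directly (from IDLE or with python BST.py), so the leading dot has no parent package to resolve against.
# At the top of BST.py: shows how the file was loaded
print(__name__)     # '__main__' when run directly
print(__package__)  # '' or None when run directly; 'ds' when run as `python -m ds.BST`

from .BSTNode import BSTNode  # the leading dot is resolved against __package__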