AttributeError: module 'gensim.models.word2vec' has no attribute 'load' - python-3.x

I imported a text file from my desktop to use with a gensim model in a Jupyter notebook. However, it returns:
"AttributeError Traceback (most recent call last)
----> 1 model = word2vec.load(r'C:\Users\qlm\Desktop\globalwarming.txt')
AttributeError: module 'gensim.models.word2vec' has no attribute
'load'"
How can I fix this problem?
import numpy as np
import pandas as pd
import gensim
from matplotlib import pyplot as plt
from gensim.models import word2vec
from collections import defaultdict
from sklearn.cluster import KMeans
model = word2vec.Text8Corpus(r'C:\Users\qlm\Desktop\globalwarming.txt')
model = word2vec.load(r'C:\Users\qlm\Desktop\globalwarming.txt')

There is a module named word2vec, and inside it a class named Word2Vec. Since the Word2Vec class is imported in the __init__.py of gensim.models, you can import it as you tried before:
from gensim.models import Word2Vec
Then you'll have access to the load method. You can also use the full namespace. So:
# Will work as long as models.__init__ keeps it available
from gensim.models import Word2Vec
But:
# Will always work, as long as the namespace is not altered
from gensim.models.word2vec import Word2Vec
I personally prefer the second option.

Related

How to load Keras model from a database (which is essentially not a .h5 file)?

I have created a Python script to pull weight information from a database, and now I want to load it as a Keras model, but the script is generating an error. It works perfectly fine when I use a file instead of extracting the data from the database, i.e. if I use model_weights instead of model_weights_1. In the database the weights are stored as a binary blob.
from keras.models import model_from_json
import tensorflow as tf
import pyodbc
import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
import configparser
from skimage.transform import resize
import os
from io import StringIO, BytesIO
import h5py
model_struct = 'location of json file'
model_weights = 'location of weight file'
model_weights_1, = cursor.execute("select query to get weights").fetchone()
model_struct_str, = cursor.execute("select query to get json string").fetchone()
mdl = model_from_json(model_struct_str)
mdl.load_weights(model_weights_1)
It throws the error:
filepath.endswith(".h5")
TypeError: endswith first arg must be bytes or a tuple of bytes, not str
since load_weights expects a file ending in .h5. I also tried to create an .h5 file, but I am unable to do that from the binary blob. Any help will be appreciated.
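One possible workaround (a sketch; the blob bytes below are a hypothetical stand-in for the value returned by the weights query) is to dump the blob into a temporary file with an .h5 suffix, since load_weights wants a filesystem path rather than raw bytes:

```python
import os
import tempfile

# Hypothetical stand-in for the binary blob returned by the cursor;
# in the real script this would be model_weights_1.
model_weights_blob = b"placeholder HDF5 bytes from the database"

# Write the blob to a temporary file whose name ends in ".h5",
# so Keras's filepath.endswith(".h5") check passes.
with tempfile.NamedTemporaryFile(suffix=".h5", delete=False) as tmp:
    tmp.write(model_weights_blob)
    weights_path = tmp.name

# mdl.load_weights(weights_path)  # load from the path, then clean up
os.remove(weights_path)
```

The blob must of course be a complete HDF5 weights file (e.g. produced by model.save_weights) for the load to succeed.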

Why not use `from tf import keras`

I have import tensorflow as tf. When I want to use tensorflow.keras, why should I write from tensorflow import keras instead of from tf import keras?

DeepImageFeaturizer returing error cannot import name ResNet50 from keras.applications in pyspark

I am trying to implement image classification to extract features from images. I am using DeepImageFeaturizer with the InceptionV3 model, but from sparkdl import DeepImageFeaturizer returns the error:
ImportError: cannot import name ResNet50 from keras.applications in PySpark on Colab.
I think the error is related to the versions. I don't recommend it, but you can try this: open this file:
/home/user/.local/lib/python3.8/site-packages/sparkdl/transformers/keras_applications.py
and change
from keras.applications import resnet50
to
from tensorflow.keras.applications import resnet50

ImportError: cannot import name 'PCA' from 'matplotlib.mlab'

According to this task:
Principal Component Analysis (PCA) in Python
I included this line:
from matplotlib.mlab import PCA
but I get the error message:
cannot import name 'PCA' from 'matplotlib.mlab'
I'm using Python 3.7, and I have no idea how I can use the PCA function from matplotlib. Is PCA deprecated in the new version of matplotlib, or has it moved to another library?
I really don't know if it is too late to reply now, but I will just place it here anyway. matplotlib.mlab no longer provides PCA; use scikit-learn instead:
import numpy as np
from sklearn.decomposition import PCA
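A minimal sketch of the scikit-learn replacement, using a small made-up dataset:

```python
import numpy as np
from sklearn.decomposition import PCA

# Hypothetical data: 5 samples with 3 features each
X = np.array([[2.5, 2.4, 0.5],
              [0.5, 0.7, 1.9],
              [2.2, 2.9, 0.4],
              [1.9, 2.2, 0.6],
              [3.1, 3.0, 0.3]])

pca = PCA(n_components=2)        # keep the two strongest components
X_reduced = pca.fit_transform(X)
print(X_reduced.shape)           # (5, 2)
```

fit_transform centers the data, fits the components, and projects the samples in one call; explained_variance_ratio_ then tells you how much variance each kept component captures.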

AttributeError: sklearn_crfsuite has no attribute 'CRF'

I am getting this error when trying out rasa_nlu with spaCy:
AttributeError: 'sklearn_crfsuite' object has no attribute 'CRF'
rasa_nlu was importing it this way:
import sklearn_crfsuite
So I tried importing it like below before calling rasa_nlu:
from sklearn_crfsuite import CRF
But I get a different error:
cannot import name 'CRF'
Looking for some suggestions.
If you want to write sklearn_crfsuite.CRF, then do import sklearn_crfsuite to import the module. If you import with from sklearn_crfsuite import CRF, then use CRF by itself, without the module prefix.
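The rule is the same for any package; the snippet below illustrates it with the standard-library json module so it is self-contained (sklearn_crfsuite behaves identically):

```python
# Style 1: import the module, then qualify the name
import json
a = json.dumps({"a": 1})   # module.attribute access

# Style 2: import the name directly, then use it unqualified
from json import dumps
b = dumps({"a": 1})        # bare name access

# Mixing the styles is the usual source of these errors:
# after only `from json import dumps`, the name `json` may be unbound,
# just as `from sklearn_crfsuite import CRF` alone does not let you
# write sklearn_crfsuite.CRF.
print(a == b)
```

Pick one style per module and stick to it; the two import forms bind different names in your namespace.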
