Facing error in the final cell of ML code - dense-rank

TypeError: Dense.__init__() missing 1 required positional argument: 'units'
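
This error usually means a Dense layer was constructed without the required units argument (assuming the notebook uses keras.layers.Dense). A minimal sketch of a working call; the layer size here is made up, not from the notebook:

from tensorflow import keras

# Dense() with no arguments raises the TypeError above; pass the number of output units.
layer = keras.layers.Dense(units=64, activation="relu")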

How can I set hidden_units to a list in Vertex AI?

I am following this notebook '02 ML Experimentation with Custom Model'.
When I try vertex_ai.log_params(hyperparams), I get:
TypeError: Value for key hidden_units is of type list but must be one of float, int, str
but the next step, classifier = trainer.train, needs hidden_units to be a list.
(My version of google-cloud-aiplatform is 1.16.0.)
Any help is appreciated.
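
One possible workaround (not from the notebook): serialize the list to a string for logging and keep the original list for training. The hyperparameter values below are made up; vertex_ai and trainer are the objects from the notebook:

import json

hyperparams = {"hidden_units": [64, 32], "learning_rate": 0.001}  # example values

# log_params only accepts float/int/str values, so stringify any lists for logging
loggable = {k: json.dumps(v) if isinstance(v, list) else v for k, v in hyperparams.items()}
vertex_ai.log_params(loggable)

# trainer.train(...) can still be given hyperparams["hidden_units"] as a real list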

I am trying to extract hashtags using the neattext function but I keep getting this error: TypeError: expected string or bytes-like object

import neattext.functions as nfx

df['Tweets']
df['Tweets'].apply(nfx.extract_hashtags)  # raises TypeError if the column holds non-string values (e.g. NaN)
To fix the error, convert the values to strings before extracting the hashtags (the TypeError usually means the column contains non-string values such as NaN), e.g.:
Method 1:
df['Tweets'].apply(lambda x: nfx.extract_hashtags(str(x)))
Method 2: using astype
df['Tweets'].astype(str).apply(nfx.extract_hashtags)

ColumnTransformer object has no attribute shape error

My data file (CSV) contains categorical and non-categorical variables. To perform Cox proportional hazards (CPH) regression I applied OneHotEncoder to two categorical variables (study_category and patient_category). I get the following error on the line where I try to fit the CPH model. I am passing three parameters to the cph.fit() method: the dataframe, the duration column ('Diff_time'), and the event column ('Events'). I googled the error but could not find anything useful. I am using CPH for the first time; any help fixing the issue will be appreciated.
Error:
AttributeError: 'ColumnTransformer' object has no attribute 'shape'
My Python code:
import pandas as pd
from sklearn.compose import make_column_transformer
from sklearn.preprocessing import OneHotEncoder
from lifelines import CoxPHFitter

def meth():
    dataset = pd.read_csv("C:/Users/XYZ/CTR_Project/CPH.csv")
    dataset = dataset.loc[:, ['study_Category', 'patient_Category', 'Diff_time', 'Events']]
    X = dataset.loc[:, ['study_Category', 'patient_Category', 'Diff_time', 'Events']]
    colm_transf = make_column_transformer(
        (OneHotEncoder(), ['study_Category', 'patient_Category']),
        remainder='passthrough')
    colm_transf.fit_transform(X)
    cph = CoxPHFitter()
    cph.fit(colm_transf, duration_col='Diff_time', event_col='Events')  # AttributeError raised here
    cph.print_summary()
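
A possible fix (not from the original post): CoxPHFitter.fit expects a pandas DataFrame, not the ColumnTransformer object itself. A minimal sketch of one way to do the encoding, swapping in pd.get_dummies for the ColumnTransformer; the file path and column names are taken from the question:

import pandas as pd
from lifelines import CoxPHFitter

dataset = pd.read_csv("C:/Users/XYZ/CTR_Project/CPH.csv")
X = dataset.loc[:, ['study_Category', 'patient_Category', 'Diff_time', 'Events']]

# One-hot encode the categorical columns; Diff_time and Events are kept as-is.
X_encoded = pd.get_dummies(X, columns=['study_Category', 'patient_Category'])

cph = CoxPHFitter()
cph.fit(X_encoded, duration_col='Diff_time', event_col='Events')  # pass the DataFrame, not the transformer
cph.print_summary()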

AttributeError: 'float' object has no attribute 'log' / TypeError: ufunc 'log' not supported for the input types

I have a series of fluorescence intensity data in a column ('2.4M'). I tried to create a new column 'ln_2.4M' by taking the ln of column '2.4M', and I got this error:
AttributeError: 'float' object has no attribute 'log'
df["ln_2.4M"] = np.log(df["2.4M"])
I tried using a for loop to apply np.log to each fluorescence value in the column "2.4M":
ln2_4M = []
for x in df["2.4M"]:
    ln2_4M = np.log(x)
    print(ln2_4M)
Although it printed ln2_4M as the log of column "2.4M" correctly, I am unable to use the data because it also raised a TypeError:
ufunc 'log' not supported for the input types, and the inputs could not be safely coerced to any supported types according to the casting rule ''safe''
Not sure why. Any help understanding what is happening and how to fix this problem is appreciated. Thanks.
I then tried using the method below and it worked:
df["2.4M"] = pd.to_numeric(df["2.4M"],errors = 'coerce')
df["ln_24M"] = np.log(df["2.4M"])

I tried to train eigenfaces using KNN; it trained, but when I test with a new image it gives me an error in the findNearest function

I tried to train eigenfaces using KNN. It trained, but when I test it with a new image it gives me this error in the findNearest function:
cv2.error: D:\Build\OpenCV\opencv-3.2.0\modules\ml\src\knearest.cpp:325: error: (-215) test_samples.type() == CV_32F && test_samples.cols == samples.cols in function cv::ml::BruteForceImpl::findNearest
That's my function:
ret, id, neighbours, dist = knn.findNearest(S, cv2.ml.COL_SAMPLE, 7)
It is probably an error in the type of the variable S. I ran into the same error and solved it by putting the test case inside an array, and that worked for me.
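
A hedged sketch of that fix, assuming S is the 1-D projected test vector from the question: findNearest expects a 2-D float32 (CV_32F) array whose column count matches the training samples.

import numpy as np

S32 = np.asarray(S, dtype=np.float32).reshape(1, -1)  # one row per test sample, CV_32F
ret, results, neighbours, dist = knn.findNearest(S32, 7)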
