I am trying to do binary classification using a Keras LSTM. My input data has shape 2340 records * 254 features, and the output is 1*2340. Below is my code. X_res = np.array(X_res) X_res = np.reshape(X_res, (1, 2340, 254)) y_res = np.array(y_res) y_res = np.reshape(y_res, (1, 2340)) y_test = np.array(y_test) y_test = np.reshape(y_test, (1, 314)) model = keras.Sequential() model.add(keras.layers.LSTM(32, input_dim=254, return_sequences=True)) ..
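A likely issue in the question above is the reshape itself: Keras LSTM layers expect input of shape (samples, timesteps, features), and reshaping to (1, 2340, 254) turns the entire dataset into a single training sample. A minimal NumPy sketch of the alternative layout, assuming each of the 2340 records is an independent observation (dummy arrays stand in for the asker's X_res / y_res):

```python
import numpy as np

# Dummy stand-ins with the shapes from the question
X_res = np.random.rand(2340, 254)
y_res = np.random.randint(0, 2, size=2340)

# Reshaping to (1, 2340, 254) leaves only one (X, y) pair to train on.
# If each record is an independent observation, treat it as one sample
# with a single timestep instead:
X_seq = X_res.reshape(2340, 1, 254)   # (samples, timesteps, features)
y_seq = y_res.reshape(2340, 1)        # one label per sample

print(X_seq.shape, y_seq.shape)
```

If the 2340 rows really are one long time series, a sliding-window split into shorter subsequences is the usual middle ground.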
Category : classification
I have a multiclass problem with 10 classes. Using any of the sklearn classifiers with predict_proba, I get an output of shape (n_classes, n_samples, n_classes_probability_1_or_0), in my case (10, 4789, 2). With binary classification I would just do model.predict_proba(X)[:, 1]. I had assumed that: pred = np.array(model.predict_proba(X)) pred = pred.reshape(-1, 10, 2)[:, :, 1] would ..
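The (n_classes, n_samples, 2) shape described above is what multilabel-style estimators (e.g. MultiOutputClassifier) return: a list of one (n_samples, 2) array per class. Assuming that is the case here, stacking already puts the per-class axis first, so no reshape guessing is needed — keep the positive-class column and transpose (dummy probabilities stand in for the model's output):

```python
import numpy as np

n_classes, n_samples = 10, 4789

# Simulate the list-of-arrays output described in the question:
# one (n_samples, 2) probability array per class.
pred = [np.random.dirichlet([1, 1], size=n_samples) for _ in range(n_classes)]

# Stack to (n_classes, n_samples, 2), keep the positive-class column,
# then transpose to the usual (n_samples, n_classes) layout.
proba = np.stack(pred)[:, :, 1].T

print(proba.shape)
```

Note that `pred.reshape(-1, 10, 2)` from the question puts the sample axis first before the data is transposed, which silently scrambles classes and samples.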
I am new to machine learning and started following this tutorial to learn about employee churn. I followed the steps, but my code gets stuck on the cell below, cell 20. The only thing I have changed is removing the idd parameter; the cell above it works, just not this one. ..
I’m currently working on a breast cancer detection model (using a CNN). The dataset totals 690 samples with three labels (Benign, Malignant, and Normal), and the model takes two mammogram views for each patient (side and top). The highest accuracy I got is 0.4960. How do I increase it? class ClassificationModel: ..
Is there a way to apply a generalized linear model to classification problems in sklearn? Since there isn’t a classification class for that, I thought of applying a sigmoid function to the regression results. Is there an easy way to do it with sklearn? I’ve tried stacking, but StackingClassifier does not support regressors as estimators. glm ..
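Worth noting for the question above: sklearn's LogisticRegression already *is* the GLM for binary targets (binomial family, logit link), so usually no manual sigmoid is needed. Still, the "sigmoid on top of a regression" idea can be sketched in plain NumPy; this is a toy illustration on synthetic data, not a substitute for a properly fitted GLM:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = (X @ np.array([1.5, -2.0, 0.5]) > 0).astype(int)

# Ordinary least squares on +/-1 targets, then squash the linear score
# through a sigmoid to get probability-like outputs. This is NOT equivalent
# to logistic regression, which fits the coefficients under the logit link
# directly and gives calibrated probabilities.
Xb = np.hstack([X, np.ones((len(X), 1))])            # add intercept column
coef, *_ = np.linalg.lstsq(Xb, 2 * y - 1, rcond=None)
proba = sigmoid(Xb @ coef)
pred = (proba >= 0.5).astype(int)

print((pred == y).mean())
```

For other GLM families, sklearn's TweedieRegressor/PoissonRegressor exist on the regression side, but for classification the logit-link GLM is the standard answer.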
I have written the following code to perform RandomizedSearchCV on a LightGBM classifier model, but I am getting the following error. ValueError: For early stopping, at least one dataset and eval metric is required for evaluation Code import lightgbm as lgb fit_params={"early_stopping_rounds": 30, "eval_metric": "f1", "eval_set": [(X_val, y_val)], "eval_names": ["valid"], "verbose": 100, # "categorical_feature": "auto" } ..
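One likely cause of the error above: "f1" is not a built-in LightGBM eval metric, so early stopping finds no metric to evaluate. LightGBM's sklearn API accepts a callable of the form (y_true, y_pred) -> (name, value, is_higher_better) instead. A sketch of such a callable and the corrected fit_params, using a hypothetical X_val/y_val split as dummy data (no lightgbm import needed to define the metric itself):

```python
import numpy as np
from sklearn.metrics import f1_score

def lgb_f1(y_true, y_pred):
    # The sklearn API passes positive-class probabilities for binary
    # objectives; binarize at 0.5 before scoring.
    return "f1", f1_score(y_true, (y_pred >= 0.5).astype(int)), True

# Hypothetical validation split standing in for the asker's X_val / y_val:
X_val = np.random.rand(50, 4)
y_val = np.random.randint(0, 2, size=50)

fit_params = {
    "early_stopping_rounds": 30,
    "eval_metric": lgb_f1,          # a callable, not the string "f1"
    "eval_set": [(X_val, y_val)],
    "eval_names": ["valid"],
    "verbose": 100,
}

# These kwargs are then forwarded through the search, e.g.:
# RandomizedSearchCV(lgb.LGBMClassifier(), param_dist).fit(X, y, **fit_params)
print(lgb_f1(y_val, np.random.rand(50))[0])
```

The curly quotes in the original snippet (‘f1’, ‘eval_names’) are also not valid Python string delimiters and would raise a SyntaxError on their own.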
I am using a GAT model for graph node classification with a highly imbalanced dataset. I have 4 classes, and about 80% of the data falls under a single class, so I am unable to train the model properly. I have tried passing class_weight to the loss function, but it isn’t giving good results. Could anyone suggest ..
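For the imbalance described above, a common starting point is inverse-frequency class weights passed to the loss (e.g. as the `weight` tensor of PyTorch's CrossEntropyLoss, since GAT models are typically PyTorch). A NumPy sketch of the weight computation, using toy labels that mirror the stated 80/20 skew:

```python
import numpy as np

# Toy labels mirroring the imbalance described: 4 classes, ~80% in class 0.
labels = np.array([0] * 800 + [1] * 80 + [2] * 70 + [3] * 50)

# Inverse-frequency class weights: n_samples / (n_classes * count_per_class).
# This is the same formula sklearn's compute_class_weight uses with
# class_weight="balanced"; the majority class gets a weight below 1.
classes, counts = np.unique(labels, return_counts=True)
weights = len(labels) / (len(classes) * counts)

print(dict(zip(classes.tolist(), weights.round(3).tolist())))
```

If weighting alone does not help, focal loss or oversampling the minority classes in the training mask are the usual next steps for imbalanced node classification.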
I am running a CatBoost classifier on a survey dataset in which the entire dataset is categorical. The features are already set to the "Category" type, of course, and the output is binary as well, but when using SHAP values I saw that f(x) is -0.516. If I am not wrong, f(x) is the output. So, how ..
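A negative f(x) such as -0.516 need not be an error: for binary classifiers, SHAP typically explains the model's raw margin (log-odds), not the probability. Assuming that is the case here, the sigmoid recovers the probability the explanation refers to:

```python
import numpy as np

# SHAP's f(x) for a binary classifier is usually the raw margin (log-odds),
# so a negative value simply means a predicted probability below 0.5.
fx = -0.516
prob = 1.0 / (1.0 + np.exp(-fx))  # sigmoid: log-odds -> probability

print(round(prob, 3))  # -> 0.374
```

So f(x) = -0.516 corresponds to roughly a 37% predicted probability of the positive class for that sample.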
I’m relatively new to machine learning. I was using a logistic regression model from sklearn to classify whether someone has heart disease based on 13 features. When it came to checking feature importance once the model was trained, I noticed a feature called slope with a feature importance of around 0.45: ..
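One caveat for the question above: raw logistic-regression coefficients depend on each feature's scale, so a coefficient of ~0.45 for slope is not by itself evidence of importance. Standardizing the features first makes the coefficient magnitudes comparable. A sketch on synthetic 13-feature data (the dataset and feature names here are stand-ins, not the asker's heart-disease data):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# 13 features on deliberately mixed scales; the label depends on feature 0.
X = rng.normal(size=(300, 13)) * rng.uniform(0.5, 50.0, size=13)
y = (X[:, 0] / X[:, 0].std() + rng.normal(size=300) > 0).astype(int)

# Standardize, then fit; |coef_| is now a scale-free importance proxy.
model = LogisticRegression(max_iter=1000).fit(
    StandardScaler().fit_transform(X), y
)
importance = np.abs(model.coef_[0])

print(importance.shape)
```

For importance estimates that do not rely on coefficients at all, sklearn's permutation_importance is a common model-agnostic alternative.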
When we opt for a one-vs-all configuration to binarize our multi-class test dataset, doesn’t this configuration leave the dataset unbalanced for computing the AUC? Binarizing increases the number of true negatives, which in turn affects the FPR (false positive rate). For instance, class A and ‘not A’ won’t be the same size. Class ..
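On the concern above: the class-A vs. not-A split is indeed imbalanced, but ROC AUC compares rates (TPR and FPR) computed *within* the positive and negative groups, so per-class AUC is unaffected by the positive/negative size ratio. Where imbalance does matter is in how the per-class AUCs are averaged. A sketch with dummy scores showing both views via sklearn:

```python
import numpy as np
from sklearn.metrics import roc_auc_score
from sklearn.preprocessing import label_binarize

rng = np.random.default_rng(0)
y_true = rng.integers(0, 3, size=500)
y_score = rng.dirichlet(np.ones(3), size=500)  # dummy class probabilities

# One-vs-rest binarization and per-class AUCs: FPR normalizes by the
# (large) negative group, TPR by the (small) positive group, so each
# per-class AUC is prevalence-invariant.
Y = label_binarize(y_true, classes=[0, 1, 2])
per_class = [roc_auc_score(Y[:, k], y_score[:, k]) for k in range(3)]

# "macro" weights every class equally; "weighted" weights by class size,
# which is where the imbalance re-enters the summary number.
macro = roc_auc_score(y_true, y_score, multi_class="ovr", average="macro")

print([round(a, 3) for a in per_class], round(macro, 3))
```

So the binarization itself does not bias AUC; choosing macro vs. weighted averaging is the decision that encodes how to treat the unequal class sizes.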