# Pretty Metrics

## Classifiers

1) SGDClassifier

Pros:

Cons:

Reference links:

2) Logistic Regression

Pros:

Cons:

Reference links:

3) LabelPropagation

Pros:

Cons:

Reference links:

4) Random Forest Classifier

Pros:

Cons:

Reference links:

5) GradientBoostingClassifier

Pros:

Cons:

Reference links:

6) AdaBoostClassifier

Pros:

Cons:

Reference links:

7) BaggingClassifier

Pros:

Cons:

Reference links:

8) BernoulliNB

Pros:

Cons:

Reference links:

9) Linear Discriminant Analysis

Pros:

Cons:

Reference links:

10) GaussianNB

Pros:

Cons:

Reference links:

11) DecisionTreeClassifier

Pros:

Cons:

Reference links:

12) MLP Classifier

PROS:

CONS:

Reference links:

13) SVC

PROS:

CONS:

Reference links:

14) KNeighborsClassifier

PROS:

CONS:

Reference links:

15) LinearSVC

PROS:

CONS:

Reference links:

16) Perceptron

PROS:

CONS:

Reference links:

17) LogisticRegressionCV

PROS:

CONS:

Reference links:

18) PassiveAggressiveClassifier

PROS:

CONS:

Reference links:

19) QuadraticDiscriminantAnalysis

PROS:

CONS:

Reference links:

20) HistGradientBoostingClassifier

PROS:

CONS:

Reference links:

21) CalibratedClassifierCV

PROS:

CONS:

1. Can produce worse-calibrated probabilities if the calibration method's assumptions do not hold.
2. Requires more data points to work well.
3. Calibration adds extra training time and effort.
4. In the non-ensemble setting it uses the pooled cross-validated predictions to fit the calibration function and then refits the base model on the full data set, so both the calibration function and the base model are effectively trained on the full data set (see the sketch below).

Reference links:
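
Below is a minimal sketch of the behaviour described in point 4, assuming scikit-learn's `CalibratedClassifierCV` with `ensemble=False`: the calibration function is fit on pooled cross-validated predictions and the base estimator is then refit once on the full training set. The dataset is synthetic and the hyperparameters are illustrative, not a recommendation.

```python
# Sketch: probability calibration with CalibratedClassifierCV (ensemble=False).
from sklearn.calibration import CalibratedClassifierCV
from sklearn.datasets import make_classification
from sklearn.linear_model import SGDClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# A hinge-loss SGDClassifier has no predict_proba of its own; sigmoid (Platt)
# calibration is what supplies the probabilities here.
base = SGDClassifier(loss="hinge", random_state=0)
calibrated = CalibratedClassifierCV(base, method="sigmoid", cv=5, ensemble=False)

# With ensemble=False, out-of-fold decision values from the 5-fold CV are used
# to fit the sigmoid, and the base model is then refit on all of X_train.
calibrated.fit(X_train, y_train)
print(calibrated.predict_proba(X_test)[:3])
```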

## Regression

1) SVR:

PROS:

CONS:

Reference links:

2) Bagging Regressor:

PROS:

CONS:

Reference links:

3) NuSVR:

Pros:

Cons:

Reference links:

4) RandomForestRegressor:

Pros:

Cons:

Reference links:

5) XGBRegressor:

Pros:

Cons:

Reference links:

6) GradientBoostingRegressor

Pros:

Cons:

Reference links:

7) HistGradientBoostingRegressor

Pros:

Cons:

Reference links:

8) PoissonRegressor

Pros:

Cons:

Reference links:

9) LGBMRegressor

Pros:

Cons:

Reference links:

10) Decision Tree Regression:

Pros:

Cons:

1) Overfitting is a practical difficulty for decision tree models: the learning algorithm keeps growing hypotheses that reduce training-set error at the cost of increasing test-set error. This can be mitigated by pruning and by setting constraints on the model parameters (see the sketch below).
2) Decision trees handle continuous numerical variables less well: splits discretise them, and predictions for a continuous target are piecewise-constant rather than smooth.
3) A small change in the data can produce a very different tree structure, which makes the model unstable.
4) Training can be computationally complex and slow compared to other algorithms, making the model relatively expensive to build.

Reference links:
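
Below is a minimal sketch of the overfitting remedies mentioned in point 1, assuming scikit-learn's `DecisionTreeRegressor`: parameter constraints (`max_depth`, `min_samples_leaf`) plus cost-complexity pruning (`ccp_alpha`). The dataset is synthetic and the specific values are placeholders that would normally be tuned (for example with `cost_complexity_pruning_path` or cross-validation).

```python
# Sketch: curbing decision-tree overfitting with constraints and pruning.
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeRegressor

X, y = make_regression(n_samples=500, n_features=10, noise=10.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Unconstrained tree: fits the training data almost perfectly, usually
# generalises worse.
full = DecisionTreeRegressor(random_state=0).fit(X_train, y_train)

# Constrained and pruned tree: limited depth, larger leaves, ccp_alpha pruning.
pruned = DecisionTreeRegressor(
    max_depth=5, min_samples_leaf=10, ccp_alpha=1.0, random_state=0
).fit(X_train, y_train)

print("unconstrained R^2:", round(full.score(X_test, y_test), 3))
print("constrained/pruned R^2:", round(pruned.score(X_test, y_test), 3))
```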

11) LinearRegression

Pros:

Cons:

Reference links:

12) LARS:

Pros:

Cons:

Reference links:

13) SGD

Pros:

Cons:

Reference links:

14) Lasso Regression

Pros:

Cons:

Reference links:

15) ExtraTreesRegressor

PROS:

CONS:

Reference links:

16) Kernel Ridge

PROS:

CONS:

Reference links:

17) GammaRegressor

PROS:

CONS:

Reference links:

18) SGDRegressor

PROS:

CONS:

Reference links:

19) LabelSpreading

PROS:

CONS:

Reference links:

20) KNeighborsRegressor

PROS:

CONS:

Reference links: