Figure 3.
Schematic outlining the 13 base learner models used by the XGB Super Learner to determine classification into 1 of 9 diagnostic bins. GBM, stochastic gradient boosting; HDDA, high dimensional discriminant analysis; KNN, k-nearest neighbors; MDA, mixture discriminant analysis; MULTINOM, penalized multinomial regression; NB, naïve Bayes; NN, neural network; PAM, nearest shrunken centroids; PDA, penalized discriminant analysis; PLS, partial least squares; RF, random forest; SVMPOLY, support vector machine with polynomial kernel; SVMRAD, support vector machine with radial kernel; XGB, eXtreme gradient boosting.
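
A Super Learner of this kind stacks the cross-validated predictions of its base learners and lets a meta-learner (here, XGBoost) combine them into a final class call. The sketch below is illustrative only, assuming a scikit-learn-style stacking setup with a handful of stand-in base learners (RF, KNN, NB, SVMRAD) and toy data; it is not the authors' 13-model pipeline or their features.

```python
# Illustrative sketch of a stacked "Super Learner" with an XGBoost meta-learner.
# The base learners, hyperparameters, and synthetic data are stand-ins, not the
# configuration used in the figure.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC
from xgboost import XGBClassifier

# Toy multiclass data standing in for the 9 diagnostic bins.
X, y = make_classification(n_samples=500, n_features=20, n_informative=10,
                           n_classes=9, random_state=0)

# A subset of the 13 base learners named in the figure.
base_learners = [
    ("rf", RandomForestClassifier(n_estimators=200, random_state=0)),
    ("knn", KNeighborsClassifier(n_neighbors=5)),
    ("nb", GaussianNB()),
    ("svmrad", SVC(kernel="rbf", probability=True, random_state=0)),
]

# XGBoost acts as the meta-learner, combining the base learners'
# cross-validated class probabilities into the final bin assignment.
super_learner = StackingClassifier(
    estimators=base_learners,
    final_estimator=XGBClassifier(),
    stack_method="predict_proba",
    cv=5,
)
super_learner.fit(X, y)
print(super_learner.predict(X[:5]))
```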
