Module ensemble (2.29.0)

Ensemble models. This module is styled after scikit-learn's ensemble module: https://scikit-learn.org/stable/modules/ensemble.html

Classes

RandomForestClassifier

RandomForestClassifier(
    n_estimators: int = 100,
    *,
    tree_method: typing.Literal["auto", "exact", "approx", "hist"] = "auto",
    min_tree_child_weight: int = 1,
    colsample_bytree: float = 1.0,
    colsample_bylevel: float = 1.0,
    colsample_bynode: float = 0.8,
    gamma: float = 0.0,
    max_depth: int = 15,
    subsample: float = 0.8,
    reg_alpha: float = 0.0,
    reg_lambda: float = 1.0,
    tol: float = 0.01,
    enable_global_explain: bool = False,
    xgboost_version: typing.Literal["0.9", "1.1"] = "0.9"
)

A random forest classifier.

A random forest is a meta estimator that fits a number of decision tree classifiers on various sub-samples of the dataset and uses averaging to improve the predictive accuracy and control over-fitting.
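Example usage, as a minimal sketch: the table name and column names below are hypothetical placeholders, and a configured BigQuery project is assumed.

    import bigframes.pandas as bpd
    from bigframes.ml.ensemble import RandomForestClassifier

    # Hypothetical training table; substitute your own project, dataset, and columns.
    df = bpd.read_gbq("my_project.my_dataset.penguins").dropna()
    X = df[["culmen_length_mm", "culmen_depth_mm", "flipper_length_mm"]]
    y = df[["species"]]

    model = RandomForestClassifier(n_estimators=50, max_depth=10)
    model.fit(X, y)                 # trains a BigQuery ML model in your project

    predictions = model.predict(X)  # DataFrame containing the predicted labels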

RandomForestRegressor

RandomForestRegressor(
    n_estimators: int = 100,
    *,
    tree_method: typing.Literal["auto", "exact", "approx", "hist"] = "auto",
    min_tree_child_weight: int = 1,
    colsample_bytree: float = 1.0,
    colsample_bylevel: float = 1.0,
    colsample_bynode: float = 0.8,
    gamma: float = 0.0,
    max_depth: int = 15,
    subsample: float = 0.8,
    reg_alpha: float = 0.0,
    reg_lambda: float = 1.0,
    tol: float = 0.01,
    enable_global_explain: bool = False,
    xgboost_version: typing.Literal["0.9", "1.1"] = "0.9"
)

A random forest regressor.

A random forest is a meta estimator that fits a number of decision tree regressors on various sub-samples of the dataset and uses averaging to improve the predictive accuracy and control over-fitting.
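Example usage, as a minimal sketch: the table and column names are hypothetical placeholders for your own data.

    import bigframes.pandas as bpd
    from bigframes.ml.ensemble import RandomForestRegressor

    # Hypothetical table with numeric features and a numeric target column.
    df = bpd.read_gbq("my_project.my_dataset.penguins").dropna()
    X = df[["culmen_length_mm", "culmen_depth_mm", "flipper_length_mm"]]
    y = df[["body_mass_g"]]

    model = RandomForestRegressor(n_estimators=100, subsample=0.8)
    model.fit(X, y)

    predictions = model.predict(X)  # DataFrame containing the predicted values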

XGBClassifier

XGBClassifier(
    n_estimators: int = 1,
    *,
    booster: typing.Literal["gbtree", "dart"] = "gbtree",
    dart_normalized_type: typing.Literal["tree", "forest"] = "tree",
    tree_method: typing.Literal["auto", "exact", "approx", "hist"] = "auto",
    min_tree_child_weight: int = 1,
    colsample_bytree: float = 1.0,
    colsample_bylevel: float = 1.0,
    colsample_bynode: float = 1.0,
    gamma: float = 0.0,
    max_depth: int = 6,
    subsample: float = 1.0,
    reg_alpha: float = 0.0,
    reg_lambda: float = 1.0,
    learning_rate: float = 0.3,
    max_iterations: int = 20,
    tol: float = 0.01,
    enable_global_explain: bool = False,
    xgboost_version: typing.Literal["0.9", "1.1"] = "0.9"
)

XGBoost classifier model. Unlike the bagging approach used by random forests, XGBoost builds its ensemble of decision trees sequentially with gradient boosting, where each new tree corrects the residual errors of the trees before it.
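Example usage, as a minimal sketch: the table name and columns are hypothetical, and the parameters shown are only illustrative.

    import bigframes.pandas as bpd
    from bigframes.ml.ensemble import XGBClassifier

    # Hypothetical training table; replace with your own data.
    df = bpd.read_gbq("my_project.my_dataset.penguins").dropna()
    X = df[["culmen_length_mm", "culmen_depth_mm", "flipper_length_mm"]]
    y = df[["species"]]

    model = XGBClassifier(booster="gbtree", max_depth=6, learning_rate=0.3)
    model.fit(X, y)

    predictions = model.predict(X)  # DataFrame containing the predicted labels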

XGBRegressor

XGBRegressor(
    n_estimators: int = 1,
    *,
    booster: typing.Literal["gbtree", "dart"] = "gbtree",
    dart_normalized_type: typing.Literal["tree", "forest"] = "tree",
    tree_method: typing.Literal["auto", "exact", "approx", "hist"] = "auto",
    min_tree_child_weight: int = 1,
    colsample_bytree: float = 1.0,
    colsample_bylevel: float = 1.0,
    colsample_bynode: float = 1.0,
    gamma: float = 0.0,
    max_depth: int = 6,
    subsample: float = 1.0,
    reg_alpha: float = 0.0,
    reg_lambda: float = 1.0,
    learning_rate: float = 0.3,
    max_iterations: int = 20,
    tol: float = 0.01,
    enable_global_explain: bool = False,
    xgboost_version: typing.Literal["0.9", "1.1"] = "0.9"
)

XGBoost regression model. Like XGBClassifier, it trains a gradient-boosted ensemble of decision trees, but predicts continuous target values rather than class labels.
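Example usage, as a minimal sketch: the table, columns, and destination model name are hypothetical placeholders; persisting the model with to_gbq is shown on the assumption that you want to reuse it outside the current session.

    import bigframes.pandas as bpd
    from bigframes.ml.ensemble import XGBRegressor

    # Hypothetical table with numeric features and a numeric target column.
    df = bpd.read_gbq("my_project.my_dataset.penguins").dropna()
    X = df[["culmen_length_mm", "culmen_depth_mm", "flipper_length_mm"]]
    y = df[["body_mass_g"]]

    model = XGBRegressor(max_iterations=50, learning_rate=0.1)
    model.fit(X, y)

    predictions = model.predict(X)  # DataFrame containing the predicted values

    # Optionally persist the trained model back to BigQuery (hypothetical model name).
    model.to_gbq("my_project.my_dataset.penguins_xgb", replace=True)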