Xgb

Purpose

Gradient Boosted Tree (XGBoost) for regression or classification.

Synopsis

model = xgb(x,y,options); %identifies model (calibration step)
pred = xgb(x,model,options); %makes predictions with a new X-block
pred = xgb(x,y,model,options); %performs a "test" call with a new X-block and known y-values
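
A minimal usage sketch of the three calls above (the variable names are illustrative, and omitting the options argument to fall back on defaults is an assumption based on typical PLS_Toolbox behavior, not something stated on this page):

 model = xgb(xcal, ycal);          % calibration: build an XGBoost model from a calibration X-block and y-block
 pred  = xgb(xnew, model);         % prediction: apply the calibrated model to a new X-block
 valid = xgb(xtest, ytest, model); % "test" call: new X-block plus known y-values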

Description

To choose between regression and classification, set the xgbtype option (see the sketch below):

  • regression: xgbtype = 'xgbr'
  • classification: xgbtype = 'xgbc'

It is recommended that classification be done through the xgbda function.
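
A sketch of selecting the mode through the options structure; obtaining defaults with xgb('options') follows the usual PLS_Toolbox convention and is an assumption here, not part of this page:

 options = xgb('options');    % assumed PLS_Toolbox-style call returning the default options structure
 options.xgbtype = 'xgbr';    % regression
 model = xgb(x, y, options);

 options.xgbtype = 'xgbc';    % classification (using xgbda is recommended instead)
 model = xgb(x, y, options);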

Inputs

  • x = X-block (predictor block), class "double" or "dataset"
  • y = Y-block (predicted block), class "double" or "dataset"
  • model = previously generated model (when applying the model to new data)

Outputs

  • model = standard model structure containing the XGBoost model (see Standard Model Structure). Feature scores are contained in model.detail.xgb.featurescores.
  • pred = structure array with predictions for a new X-block
  • valid = structure array with predictions, returned when known y-values are also supplied (the "test" call)
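
The feature scores noted for the model output can be read back directly; the field path comes from this page, while the bar-plot call is only an illustrative way to inspect them:

 scores = model.detail.xgb.featurescores;   % per-variable importance scores reported by XGBoost
 bar(scores);                               % illustrative plot of the scores
 xlabel('Variable'); ylabel('Feature score');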