Xgb

===Purpose===

Gradient Boosted Tree (XGBoost) for regression or classification.

===Synopsis===

:model = xgb(x,y,options);          %identifies model (calibration step)
:pred  = xgb(x,model,options);      %makes predictions with a new X-block
:valid = xgb(x,y,model,options);    %performs a "test" call with a new X-block and known y-values
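
For illustration, a minimal calibrate-then-predict sequence might look like the following sketch, where xcal, ycal, and xnew are placeholder names for your own calibration and new data:

:options = xgb('options');          %default options structure (assumes the standard options call)
:model   = xgb(xcal,ycal,options);  %calibration step
:pred    = xgb(xnew,model,options); %apply the model to a new X-block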

===Description===

To choose between regression and classification, use the xgbtype option:

* regression: xgbtype = 'xgbr'
* classification: xgbtype = 'xgbc'

It is recommended that classification be done through the xgbda function.
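
As an illustrative sketch (assuming the usual convention that xgb('options') returns the default options structure), the model type can be set before calibration:

:opts = xgb('options');     %default options structure (assumed standard options call)
:opts.xgbtype = 'xgbr';     %regression; 'xgbc' selects classification, though xgbda is recommended for that
:model = xgb(x,y,opts);     %calibrate with the chosen model type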

===Inputs===

* x = X-block (predictor block), class "double" or "dataset"
* y = Y-block (predicted block), class "double" or "dataset"
* model = previously generated model (when applying the model to new data)

===Outputs===

* model = standard model structure containing the XGBoost model (see Standard Model Structure). Feature scores are contained in model.detail.xgb.featurescores.
* pred = structure array with predictions for the new X-block
* valid = structure array with predictions from the "test" call (new X-block with known y-values)
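
For example, the feature scores noted above can be read from the calibrated model structure (a sketch using the field documented here):

:model = xgb(x,y,options);                  %calibration step
:fscores = model.detail.xgb.featurescores;  %feature scores for the predictor variables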