TreeBagger parameter tuning for classification

Technical Source
2 min readJul 14, 2022

How can I tune the parameters of a TreeBagger model for classification? I followed the example "Tune Random Forest Using Quantile Error and Bayesian Optimization" (https://fr.mathworks.com/help/stats/tune-random-forest-using-quantile-error-and-bayesian-optimization.html), changing only "regression" to "classification". The following code generated multiple errors:

results = bayesopt(@(params)oobErrRF(params,X),hyperparametersRF,...
'AcquisitionFunctionName','expected-improvement-plus','Verbose',0);

errors:

Error using classreg.learning.internal.table2FitMatrix>resolveName (line 232)
One or more 'ResponseName' parameter values are invalid.
Error in classreg.learning.internal.table2FitMatrix (line 77)
ResponseName = resolveName('ResponseName',ResponseName,FormulaResponseName,false,VarNames);
Error in ClassificationTree.prepareData (line 557)
[X,Y,vrange,wastable,varargin] =
classreg.learning.internal.table2FitMatrix(X,Y,varargin{:},'OrdinalIsCategorical',false);
Error in TreeBagger/init (line 1335)
ClassificationTree.prepareData(x,y,...
Error in TreeBagger (line 615)
bagger = init(bagger,X,Y,makeArgs{:});
Error in oobErrRF2 (line 16)
randomForest = TreeBagger(300,X,'MPG','Method','classification',...
Error in @(params)oobErrRF2(params,trainingDataFeatures)
Error in BayesianOptimization/callObjNormally (line 2184)
Objective = this.ObjectiveFcn(conditionalizeX(this, X));
Error in BayesianOptimization/callObjFcn (line 2145)
= callObjNormally(this, X);
Error in BayesianOptimization/callObjFcn (line 2162)
= callObjFcn(this, X);
Error in BayesianOptimization/performFcnEval (line 2128)
ObjectiveFcnObjectiveEvaluationTime, this] = callObjFcn(this, this.XNext);
Error in BayesianOptimization/run (line 1836)
this = performFcnEval(this);
Error in BayesianOptimization (line 450)
this = run(this);
Error in bayesopt (line 287)
Results = BayesianOptimization(Options);

I would like to know if there is a way to use this method of tuning for classification. If not, how can I tune the parameters of a TreeBagger classifier?


The following works for me in R2018a. It predicts 'Cylinders' (3 classes) and calls oobError to get the misclassification rate of the ensemble. (Note that in your error trace, oobErrRF2 still passes 'MPG' as the response name; for classification the response must be a valid grouping variable, which is what triggers the 'ResponseName' error.)

load carsmall
Cylinders = categorical(Cylinders);
Mfg = categorical(cellstr(Mfg));
Model_Year = categorical(Model_Year);
X = table(Acceleration,Cylinders,Displacement,Horsepower,Mfg,...
Model_Year,Weight,MPG);
rng('default'); % For reproducibility
maxMinLS = 20;
minLS = optimizableVariable('minLS',[1,maxMinLS],'Type','integer');
numPTS = optimizableVariable('numPTS',[1,size(X,2)-1],'Type','integer');
hyperparametersRF = [minLS; numPTS];
results = bayesopt(@(params)oobErrRF(params,X),hyperparametersRF,...
'AcquisitionFunctionName','expected-improvement-plus','Verbose',1);
bestOOBErr = results.MinObjective
bestHyperparameters = results.XAtMinObjective
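The objective function oobErrRF called above is not shown in the snippet. A minimal sketch of what it could look like for classification is below, adapting the documentation example's objective by replacing quantileError with oobError; the function name, the 300-tree ensemble size, and 'Cylinders' as the response follow the code above, but the exact body is an assumption.

```matlab
function oobErr = oobErrRF(params,X)
%oobErrRF Out-of-bag misclassification rate of a TreeBagger classifier.
%   Trains a bagged classification ensemble on the table X using the
%   hyperparameters in params (minLS and numPTS from the optimizable
%   variables), then returns the ensemble's out-of-bag error.
randomForest = TreeBagger(300,X,'Cylinders','Method','classification',...
    'OOBPrediction','on',...
    'MinLeafSize',params.minLS,...
    'NumPredictorsToSample',params.numPTS);
oobErr = oobError(randomForest,'Mode','ensemble');
end
```

Because 'Method' is 'classification', oobError returns the fraction of misclassified out-of-bag observations, so bayesopt minimizes the classification error directly rather than a quantile regression error.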


