Class: Rumale::Ensemble::AdaBoostRegressor
- Inherits: Base::Estimator (Object → Base::Estimator → Rumale::Ensemble::AdaBoostRegressor)
- Includes: Base::Regressor
- Defined in: rumale-ensemble/lib/rumale/ensemble/ada_boost_regressor.rb
Overview
AdaBoostRegressor is a class that implements AdaBoost for regression. This class uses decision trees as weak learners.
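A minimal usage sketch; the data below is made up, and any Numo::DFloat sample and target arrays of matching length can be substituted:

require 'rumale/ensemble/ada_boost_regressor'

# Hypothetical training and test data.
training_samples = Numo::DFloat.new(100, 4).rand
training_values  = training_samples[true, 0] * 2.0
testing_samples  = Numo::DFloat.new(10, 4).rand

estimator = Rumale::Ensemble::AdaBoostRegressor.new(
  n_estimators: 10, criterion: 'mse', max_depth: 3,
  max_leaf_nodes: 10, min_samples_leaf: 5, random_seed: 1
)
estimator.fit(training_samples, training_values)
results = estimator.predict(testing_samples)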
Reference
- Shrestha, D. L., and Solomatine, D. P., “Experiments with AdaBoost.RT, an Improved Boosting Scheme for Regression,” Neural Computation 18 (7), pp. 1678–1710, 2006.
Instance Attribute Summary
- #estimator_weights ⇒ Numo::DFloat (readonly): Return the weight for each weak learner.
- #estimators ⇒ Array<DecisionTreeRegressor> (readonly): Return the set of estimators.
- #feature_importances ⇒ Numo::DFloat (readonly): Return the importance for each feature.
- #rng ⇒ Random (readonly): Return the random generator for random selection of feature index.
Attributes inherited from Base::Estimator
Instance Method Summary
- #fit(x, y) ⇒ AdaBoostRegressor: Fit the model with given training data.
- #initialize(n_estimators: 10, threshold: 0.2, exponent: 1.0, criterion: 'mse', max_depth: nil, max_leaf_nodes: nil, min_samples_leaf: 1, max_features: nil, random_seed: nil) ⇒ AdaBoostRegressor (constructor): Create a new regressor with AdaBoost.
- #predict(x) ⇒ Numo::DFloat: Predict values for samples.
Methods included from Base::Regressor
Constructor Details
#initialize(n_estimators: 10, threshold: 0.2, exponent: 1.0, criterion: 'mse', max_depth: nil, max_leaf_nodes: nil, min_samples_leaf: 1, max_features: nil, random_seed: nil) ⇒ AdaBoostRegressor
Create a new regressor with AdaBoost.
# File 'rumale-ensemble/lib/rumale/ensemble/ada_boost_regressor.rb', line 60

def initialize(n_estimators: 10, threshold: 0.2, exponent: 1.0,
               criterion: 'mse', max_depth: nil, max_leaf_nodes: nil, min_samples_leaf: 1,
               max_features: nil, random_seed: nil)
  super()
  @params = {
    n_estimators: n_estimators,
    threshold: threshold,
    exponent: exponent,
    criterion: criterion,
    max_depth: max_depth,
    max_leaf_nodes: max_leaf_nodes,
    min_samples_leaf: min_samples_leaf,
    max_features: max_features,
    random_seed: random_seed || srand
  }
  @rng = Random.new(@params[:random_seed])
end
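Because an omitted random_seed falls back to srand while a given seed feeds Random.new, fixing the seed makes training reproducible. A small sketch of this, reusing the hypothetical data from the overview example:

a = Rumale::Ensemble::AdaBoostRegressor.new(n_estimators: 5, max_depth: 3, random_seed: 42)
b = Rumale::Ensemble::AdaBoostRegressor.new(n_estimators: 5, max_depth: 3, random_seed: 42)
a.fit(training_samples, training_values)
b.fit(training_samples, training_values)
# Identical seeds draw identical bootstrap samples and tree seeds, so the
# element-wise comparison yields all ones (a Numo::Bit mask).
a.predict(testing_samples).eq(b.predict(testing_samples))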
Instance Attribute Details
#estimator_weights ⇒ Numo::DFloat (readonly)
Return the weight for each weak learner.
# File 'rumale-ensemble/lib/rumale/ensemble/ada_boost_regressor.rb', line 35

def estimator_weights
  @estimator_weights
end
#estimators ⇒ Array<DecisionTreeRegressor> (readonly)
Return the set of estimators.
# File 'rumale-ensemble/lib/rumale/ensemble/ada_boost_regressor.rb', line 31

def estimators
  @estimators
end
#feature_importances ⇒ Numo::DFloat (readonly)
Return the importance for each feature.
# File 'rumale-ensemble/lib/rumale/ensemble/ada_boost_regressor.rb', line 39

def feature_importances
  @feature_importances
end
#rng ⇒ Random (readonly)
Return the random generator for random selection of feature index.
# File 'rumale-ensemble/lib/rumale/ensemble/ada_boost_regressor.rb', line 43

def rng
  @rng
end
Instance Method Details
#fit(x, y) ⇒ AdaBoostRegressor
Fit the model with given training data.
# File 'rumale-ensemble/lib/rumale/ensemble/ada_boost_regressor.rb', line 83

def fit(x, y) # rubocop:disable Metrics/AbcSize, Metrics/MethodLength
  x = ::Rumale::Validation.check_convert_sample_array(x)
  y = ::Rumale::Validation.check_convert_target_value_array(y)
  ::Rumale::Validation.check_sample_size(x, y)
  unless y.ndim == 1
    raise ArgumentError, 'AdaBoostRegressor supports only single-target variable regression; ' \
                         'the target value array is expected to be 1-D'
  end

  # Initialize some variables.
  n_samples, n_features = x.shape
  @params[:max_features] = n_features unless @params[:max_features].is_a?(Integer)
  @params[:max_features] = [[1, @params[:max_features]].max, n_features].min # rubocop:disable Style/ComparableClamp
  observation_weights = Numo::DFloat.zeros(n_samples) + 1.fdiv(n_samples)
  @estimators = []
  @estimator_weights = []
  @feature_importances = Numo::DFloat.zeros(n_features)
  sub_rng = @rng.dup
  # Construct forest.
  @params[:n_estimators].times do |_t|
    # Fit weak learner.
    ids = ::Rumale::Utils.choice_ids(n_samples, observation_weights, sub_rng)
    tree = ::Rumale::Tree::DecisionTreeRegressor.new(
      criterion: @params[:criterion], max_depth: @params[:max_depth],
      max_leaf_nodes: @params[:max_leaf_nodes], min_samples_leaf: @params[:min_samples_leaf],
      max_features: @params[:max_features], random_seed: sub_rng.rand(::Rumale::Ensemble::Value::SEED_BASE)
    )
    tree.fit(x[ids, true], y[ids])
    pred = tree.predict(x)
    # Calculate errors.
    abs_err = ((pred - y) / y).abs
    sum_target = abs_err.gt(@params[:threshold])
    break if sum_target.count.zero?

    err = observation_weights[sum_target].sum
    break if err <= 0.0

    # Calculate weight.
    beta = err**@params[:exponent]
    weight = Math.log(1.fdiv(beta))
    # Store model.
    @estimators.push(tree)
    @estimator_weights.push(weight)
    @feature_importances += weight * tree.feature_importances
    # Update observation weights.
    update = Numo::DFloat.ones(n_samples)
    update_target = abs_err.le(@params[:threshold])
    break if update_target.count.zero?

    update[update_target] = beta
    observation_weights *= update
    observation_weights = observation_weights.clip(1.0e-15, nil)
    sum_observation_weights = observation_weights.sum
    break if sum_observation_weights.zero?

    observation_weights /= sum_observation_weights
  end
  if @estimators.empty?
    warn('Failed to converge, check hyper-parameters of AdaBoostRegressor.')
    self
  end
  @estimator_weights = Numo::DFloat.asarray(@estimator_weights)
  @feature_importances /= @estimator_weights.sum
  self
end
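To make the bookkeeping in #fit concrete, here is a standalone sketch of one boosting round with made-up numbers and the default threshold (0.2) and exponent (1.0); it mirrors the error, beta, and weight computations above:

require 'numo/narray'

y    = Numo::DFloat[1.0, 2.0, 4.0, 8.0]           # true targets (made up)
pred = Numo::DFloat[1.1, 1.5, 4.1, 5.0]           # one tree's predictions (made up)
observation_weights = Numo::DFloat.ones(4) / 4.0  # start uniform

abs_err = ((pred - y) / y).abs                    # relative errors: [0.1, 0.25, 0.025, 0.375]
incorrect = abs_err.gt(0.2)                       # samples over the threshold
err = observation_weights[incorrect].sum          # weighted error rate: 0.5
beta = err**1.0                                   # exponent = 1.0
weight = Math.log(1.fdiv(beta))                   # learner weight: log(2) ≈ 0.693

# Correctly predicted samples are down-weighted by beta and the weights are
# renormalized, so the next round focuses on the samples the tree got wrong.
update = Numo::DFloat.ones(4)
update[abs_err.le(0.2)] = beta
observation_weights *= update
observation_weights /= observation_weights.sum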
#predict(x) ⇒ Numo::DFloat
Predict values for samples.
# File 'rumale-ensemble/lib/rumale/ensemble/ada_boost_regressor.rb', line 154

def predict(x)
  x = ::Rumale::Validation.check_convert_sample_array(x)

  n_samples, = x.shape
  predictions = Numo::DFloat.zeros(n_samples)
  @estimators.size.times do |t|
    predictions += @estimator_weights[t] * @estimators[t].predict(x)
  end
  sum_weight = @estimator_weights.sum
  predictions / sum_weight
end
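The combination is a plain weighted average of the trees' outputs: for a single sample predicted as 3.0 and 5.0 by two hypothetical trees with weights 0.7 and 0.3, the result is (0.7 * 3.0 + 0.3 * 5.0) / (0.7 + 0.3) = 3.6. The same arithmetic as a tiny Numo sketch:

estimator_weights = Numo::DFloat[0.7, 0.3]
tree_outputs      = [Numo::DFloat[3.0], Numo::DFloat[5.0]] # per-tree predictions for one sample (made up)

predictions = Numo::DFloat.zeros(1)
tree_outputs.each_with_index { |p, t| predictions += estimator_weights[t] * p }
predictions / estimator_weights.sum # => Numo::DFloat[3.6]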