Class: Rumale::NeuralNetwork::MLPRegressor
- Inherits: BaseMLP
  - Object
  - Base::Estimator
  - BaseMLP
  - Rumale::NeuralNetwork::MLPRegressor
- Includes:
- Base::Regressor
- Defined in:
- rumale-neural_network/lib/rumale/neural_network/mlp_regressor.rb
Overview
MLPRegressor is a class that implements a regressor based on a multi-layer perceptron. MLPRegressor uses ReLU as the activation function, Adam as the optimization method, and mean squared error as the loss function.
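As a concrete illustration of the activation and loss functions named above, here is a minimal plain-Ruby sketch. This is not Rumale's internal code; `relu` and `mean_squared_error` are hypothetical helpers written with plain arrays for clarity.

```ruby
# ReLU activation: max(0, x) applied element-wise.
def relu(values)
  values.map { |v| v.positive? ? v : 0.0 }
end

# Mean squared error between predictions and targets.
def mean_squared_error(predictions, targets)
  predictions.zip(targets).sum { |p, t| (p - t)**2 } / predictions.size.to_f
end

relu([-1.0, 0.5, 2.0])               # => [0.0, 0.5, 2.0]
mean_squared_error([1.0, 2.0], [0.0, 2.0]) # => 0.5
```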
Instance Attribute Summary collapse
-
#n_iter ⇒ Integer
readonly
Return the number of iterations run for optimization.
-
#network ⇒ Rumale::NeuralNetwork::Model::Sequential
readonly
Return the network.
-
#rng ⇒ Random
readonly
Return the random generator.
Attributes inherited from Base::Estimator
Instance Method Summary collapse
-
#fit(x, y) ⇒ MLPRegressor
Fit the model with given training data.
-
#initialize(hidden_units: [128, 128], dropout_rate: 0.4, learning_rate: 0.001, decay1: 0.9, decay2: 0.999, max_iter: 200, batch_size: 50, tol: 1e-4, verbose: false, random_seed: nil) ⇒ MLPRegressor
constructor
Create a new regressor with multi-layer perceptron.
-
#predict(x) ⇒ Numo::DFloat
Predict values for samples.
Methods included from Base::Regressor
Constructor Details
#initialize(hidden_units: [128, 128], dropout_rate: 0.4, learning_rate: 0.001, decay1: 0.9, decay2: 0.999, max_iter: 200, batch_size: 50, tol: 1e-4, verbose: false, random_seed: nil) ⇒ MLPRegressor
Create a new regressor with multi-layer perceptron.
# File 'rumale-neural_network/lib/rumale/neural_network/mlp_regressor.rb', line 47

def initialize(hidden_units: [128, 128], dropout_rate: 0.4, learning_rate: 0.001,
               decay1: 0.9, decay2: 0.999, max_iter: 200, batch_size: 50,
               tol: 1e-4, verbose: false, random_seed: nil)
  super
end
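The `decay1` and `decay2` keyword arguments are the exponential decay rates for Adam's first- and second-moment estimates, and `learning_rate` scales the resulting step. The following is a self-contained sketch of one Adam update for a single scalar parameter; `adam_step` is a hypothetical helper for illustration, not Rumale's optimizer implementation.

```ruby
# One Adam update step for a scalar parameter (illustrative sketch).
def adam_step(param, grad, m, v, t, learning_rate: 0.001, decay1: 0.9, decay2: 0.999, eps: 1e-8)
  m = decay1 * m + (1 - decay1) * grad     # first-moment (mean) estimate
  v = decay2 * v + (1 - decay2) * grad**2  # second-moment (uncentered variance) estimate
  m_hat = m / (1 - decay1**t)              # bias correction for step t
  v_hat = v / (1 - decay2**t)
  param -= learning_rate * m_hat / (Math.sqrt(v_hat) + eps)
  [param, m, v]
end

param, m, v = 1.0, 0.0, 0.0
param, m, v = adam_step(param, 0.5, m, v, 1)
# param is now approximately 0.999 (one step of size ~learning_rate)
```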
Instance Attribute Details
#n_iter ⇒ Integer (readonly)
Return the number of iterations run for optimization.
# File 'rumale-neural_network/lib/rumale/neural_network/mlp_regressor.rb', line 28

def n_iter
  @n_iter
end
#network ⇒ Rumale::NeuralNetwork::Model::Sequential (readonly)
Return the network.
# File 'rumale-neural_network/lib/rumale/neural_network/mlp_regressor.rb', line 24

def network
  @network
end
#rng ⇒ Random (readonly)
Return the random generator.
# File 'rumale-neural_network/lib/rumale/neural_network/mlp_regressor.rb', line 32

def rng
  @rng
end
Instance Method Details
#fit(x, y) ⇒ MLPRegressor
Fit the model with given training data.
# File 'rumale-neural_network/lib/rumale/neural_network/mlp_regressor.rb', line 57

def fit(x, y)
  x = ::Rumale::Validation.check_convert_sample_array(x)
  y = ::Rumale::Validation.check_convert_target_value_array(y)
  ::Rumale::Validation.check_sample_size(x, y)

  y = y.expand_dims(1) if y.ndim == 1
  n_targets = y.shape[1]
  n_features = x.shape[1]
  sub_rng = @rng.dup

  loss = ::Rumale::NeuralNetwork::Loss::MeanSquaredError.new
  @network = buld_network(n_features, n_targets, sub_rng)
  @network = train(x, y, @network, loss, sub_rng)
  @network.delete_dropout

  self
end
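The target-reshaping step in #fit expands a 1-D target vector into an (n_samples, 1) matrix so that single-target and multi-target regression share one code path. The same idea with plain Ruby arrays instead of Numo::NArray (`ensure_2d` is a hypothetical helper, not part of Rumale):

```ruby
# Expand a flat target vector to one-column rows; leave 2-D targets untouched.
def ensure_2d(targets)
  targets.first.is_a?(Array) ? targets : targets.map { |t| [t] }
end

ensure_2d([1.0, 2.0, 3.0])          # => [[1.0], [2.0], [3.0]]
ensure_2d([[1.0, 2.0], [3.0, 4.0]]) # => unchanged
```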
#predict(x) ⇒ Numo::DFloat
Predict values for samples.
# File 'rumale-neural_network/lib/rumale/neural_network/mlp_regressor.rb', line 79

def predict(x)
  x = ::Rumale::Validation.check_convert_sample_array(x)

  out, = @network.forward(x)
  out = out[true, 0] if out.shape[1] == 1
  out
end
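In #predict, `out[true, 0]` squeezes a single-column output matrix back into a 1-D vector, so single-target callers get a flat array of predictions. The equivalent with plain Ruby arrays (`squeeze_single_column` is a hypothetical helper, not part of Rumale):

```ruby
# Flatten a one-column prediction matrix; return multi-target output unchanged.
def squeeze_single_column(matrix)
  matrix.first.size == 1 ? matrix.map(&:first) : matrix
end

squeeze_single_column([[0.5], [1.5]])           # => [0.5, 1.5]
squeeze_single_column([[0.5, 1.0], [1.5, 2.0]]) # => unchanged
```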