Class: Rumale::NeuralNetwork::BaseMLP
- Inherits: Base::Estimator
  - Object
  - Base::Estimator
  - Rumale::NeuralNetwork::BaseMLP
- Defined in: rumale-neural_network/lib/rumale/neural_network/base_mlp.rb
Overview
BaseMLP is an abstract class for implementing multi-layer perceptron estimators. This class is used internally.
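In practice this class is not instantiated directly; its keyword arguments are reached through a concrete subclass. The following is a minimal usage sketch, assuming the rumale-neural_network gem is installed and that Rumale::NeuralNetwork::MLPClassifier (a concrete subclass with the usual fit/predict interface) is available at the require path shown:

require 'numo/narray'
require 'rumale/neural_network/mlp_classifier'

# Toy dataset: 4 samples, 2 features, binary labels.
x = Numo::DFloat[[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0]]
y = Numo::Int32[0, 1, 1, 0]

# Keyword arguments are forwarded to BaseMLP#initialize.
estimator = Rumale::NeuralNetwork::MLPClassifier.new(hidden_units: [16, 16], max_iter: 500, random_seed: 1)
estimator.fit(x, y)
p estimator.predict(x)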
Direct Known Subclasses
Instance Attribute Summary
Attributes inherited from Base::Estimator
Instance Method Summary
- #initialize(hidden_units: [128, 128], dropout_rate: 0.4, learning_rate: 0.001, decay1: 0.9, decay2: 0.999, max_iter: 200, batch_size: 50, tol: 1e-4, verbose: false, random_seed: nil) ⇒ BaseMLP (constructor)
  Create a multi-layer perceptron estimator.
Constructor Details
#initialize(hidden_units: [128, 128], dropout_rate: 0.4, learning_rate: 0.001, decay1: 0.9, decay2: 0.999, max_iter: 200, batch_size: 50, tol: 1e-4, verbose: false, random_seed: nil) ⇒ BaseMLP
Create a multi-layer perceptron estimator.
# File 'rumale-neural_network/lib/rumale/neural_network/base_mlp.rb', line 230

def initialize(hidden_units: [128, 128], dropout_rate: 0.4, learning_rate: 0.001, decay1: 0.9, decay2: 0.999,
               max_iter: 200, batch_size: 50, tol: 1e-4, verbose: false, random_seed: nil)
  super()
  @params = {
    hidden_units: hidden_units,
    dropout_rate: dropout_rate,
    learning_rate: learning_rate,
    decay1: decay1,
    decay2: decay2,
    max_iter: max_iter,
    batch_size: batch_size,
    tol: tol,
    verbose: verbose,
    random_seed: random_seed || srand
  }
  @rng = Random.new(@params[:random_seed])
end
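As a sketch of how the stored state is consumed by a subclass, the hypothetical class below (TinyMLP, invented here for illustration; it is not part of the gem) forwards its options to super, after which @params and @rng are available, with @params[:random_seed] filled in by srand when no seed was given:

require 'rumale/neural_network/base_mlp'

# Hypothetical subclass, for illustration only.
class TinyMLP < Rumale::NeuralNetwork::BaseMLP
  def initialize(hidden_units: [32], learning_rate: 0.01, random_seed: nil)
    # All keyword arguments end up in @params; @rng is seeded from random_seed.
    super(hidden_units: hidden_units, learning_rate: learning_rate, random_seed: random_seed)
  end

  def fit(_x, _y)
    # A real subclass would build and train the network here, drawing
    # reproducible randomness from @rng (a Random seeded with @params[:random_seed]).
    puts "hidden units: #{@params[:hidden_units]}, seed: #{@params[:random_seed]}"
    self
  end
end

TinyMLP.new(hidden_units: [8, 8], random_seed: 42).fit(nil, nil)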