Class: Rumale::NeuralNetwork::BaseMLP

Inherits:
Base::Estimator
Defined in:
rumale-neural_network/lib/rumale/neural_network/base_mlp.rb

Overview

BaseMLP is an abstract class for the implementation of multi-layer perceptron estimators. This class is used internally.

Direct Known Subclasses

MLPClassifier, MLPRegressor
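BaseMLP is not instantiated directly; usage goes through these subclasses. A minimal sketch of the common fit/predict flow, assuming the public Rumale estimator interface (the data below is illustrative):

require 'rumale/neural_network/mlp_classifier'
require 'numo/narray'

# Illustrative toy data: 100 samples, 4 features, 3 class labels.
x = Numo::DFloat.new(100, 4).rand
y = Numo::Int32.new(100).rand(3)

# Train a concrete subclass and predict on the training data.
classifier = Rumale::NeuralNetwork::MLPClassifier.new(hidden_units: [64, 64], max_iter: 100, random_seed: 1)
classifier.fit(x, y)
predicted = classifier.predict(x)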

Instance Attribute Summary

Attributes inherited from Base::Estimator

#params

Instance Method Summary

Constructor Details

#initialize(hidden_units: [128, 128], dropout_rate: 0.4, learning_rate: 0.001, decay1: 0.9, decay2: 0.999, max_iter: 200, batch_size: 50, tol: 1e-4, verbose: false, random_seed: nil) ⇒ BaseMLP

Create a multi-layer perceptron estimator. (A usage sketch follows the parameter list below.)

Parameters:

  • hidden_units (Array) (defaults to: [128, 128])

    The sizes of the hidden layers; the i-th element gives the number of units in the i-th hidden layer.

  • dropout_rate (Float) (defaults to: 0.4)

    The rate of the units to drop.

  • learning_rate (Float) (defaults to: 0.001)

    The initial learning rate for the Adam optimizer.

  • decay1 (Float) (defaults to: 0.9)

    The smoothing parameter for the first moment in the Adam optimizer.

  • decay2 (Float) (defaults to: 0.999)

    The smoothing parameter for the second moment in the Adam optimizer.

  • max_iter (Integer) (defaults to: 200)

    The maximum number of epochs, i.e., how many times the whole training data is passed through the training process.

  • batch_size (Integer) (defaults to: 50)

    The size of the mini-batches.

  • tol (Float) (defaults to: 1e-4)

    The tolerance of loss for terminating optimization.

  • verbose (Boolean) (defaults to: false)

    The flag indicating whether to output loss during iteration.

  • random_seed (Integer) (defaults to: nil)

    The seed value used to initialize the random generator.
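
A hedged sketch of how these keywords are passed in practice, via a concrete subclass since BaseMLP itself is internal (the values shown are arbitrary):

estimator = Rumale::NeuralNetwork::MLPRegressor.new(
  hidden_units: [128, 128], # two hidden layers of 128 units each
  dropout_rate: 0.4,
  learning_rate: 0.001,
  decay1: 0.9,
  decay2: 0.999,
  max_iter: 200,
  batch_size: 50,
  tol: 1e-4,
  verbose: true,            # print the loss value during training
  random_seed: 42
)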



# File 'rumale-neural_network/lib/rumale/neural_network/base_mlp.rb', line 230

def initialize(hidden_units: [128, 128], dropout_rate: 0.4, learning_rate: 0.001, decay1: 0.9, decay2: 0.999,
               max_iter: 200, batch_size: 50, tol: 1e-4, verbose: false, random_seed: nil)
  super()
  @params = {
    hidden_units: hidden_units,
    dropout_rate: dropout_rate,
    learning_rate: learning_rate,
    decay1: decay1,
    decay2: decay2,
    max_iter: max_iter,
    batch_size: batch_size,
    tol: tol,
    verbose: verbose,
    random_seed: random_seed || srand
  }
  @rng = Random.new(@params[:random_seed])
end
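
For context on learning_rate, decay1, and decay2: they parameterize the standard Adam update. A minimal sketch of that update (names like adam_step and eps are illustrative, not Rumale internals):

require 'numo/narray'

# One Adam step for a weight array, given its gradient and running moments.
# step counts from 1 and drives the bias correction.
def adam_step(weight, grad, moment1, moment2, step,
              learning_rate: 0.001, decay1: 0.9, decay2: 0.999, eps: 1e-8)
  moment1 = decay1 * moment1 + (1 - decay1) * grad     # smoothed first moment (mean of gradients)
  moment2 = decay2 * moment2 + (1 - decay2) * grad**2  # smoothed second moment (mean of squared gradients)
  m_hat = moment1 / (1 - decay1**step)                 # correct the bias toward zero at early steps
  v_hat = moment2 / (1 - decay2**step)
  weight -= learning_rate * m_hat / (Numo::NMath.sqrt(v_hat) + eps)
  [weight, moment1, moment2]
end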