Class: Rumale::NeuralNetwork::MLPClassifier
- Inherits: BaseMLP
  - Object
  - Base::Estimator
  - BaseMLP
  - Rumale::NeuralNetwork::MLPClassifier
- Includes:
- Base::Classifier
- Defined in:
- rumale-neural_network/lib/rumale/neural_network/mlp_classifier.rb
Overview
MLPClassifier is a class that implements a classifier based on a multi-layer perceptron. MLPClassifier uses ReLU as the activation function, Adam as the optimization method, and softmax cross entropy as the loss function.
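As a concrete illustration of the loss named above, here is a minimal pure-Ruby sketch of softmax cross entropy for a single sample. This is an illustration only, not Rumale's actual Loss::SoftmaxCrossEntropy implementation:

```ruby
# Simplified sketch of softmax cross entropy for a single sample.
# logits: raw network outputs; label: index of the true class.
def softmax(logits)
  m = logits.max                      # shift by the max for numerical stability
  exps = logits.map { |v| Math.exp(v - m) }
  sum = exps.sum
  exps.map { |v| v / sum }
end

def softmax_cross_entropy(logits, label)
  probs = softmax(logits)
  -Math.log(probs[label])             # negative log-likelihood of the true class
end

loss = softmax_cross_entropy([2.0, 1.0, 0.1], 0)
```

The loss is small when the network assigns high probability to the true class and grows without bound as that probability approaches zero.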
Instance Attribute Summary collapse
-
#classes ⇒ Numo::Int32
readonly
Return the class labels.
-
#n_iter ⇒ Integer
readonly
Return the number of iterations run for optimization.
-
#network ⇒ Rumale::NeuralNetwork::Model::Sequential
readonly
Return the network.
-
#rng ⇒ Random
readonly
Return the random generator.
Attributes inherited from Base::Estimator
Instance Method Summary collapse
-
#fit(x, y) ⇒ MLPClassifier
Fit the model with given training data.
-
#initialize(hidden_units: [128, 128], dropout_rate: 0.4, learning_rate: 0.001, decay1: 0.9, decay2: 0.999, max_iter: 200, batch_size: 50, tol: 1e-4, verbose: false, random_seed: nil) ⇒ MLPClassifier
constructor
Create a new classifier with multi-layer perceptron.
-
#predict(x) ⇒ Numo::Int32
Predict class labels for samples.
-
#predict_proba(x) ⇒ Numo::DFloat
Predict probability for samples.
Methods included from Base::Classifier
Constructor Details
#initialize(hidden_units: [128, 128], dropout_rate: 0.4, learning_rate: 0.001, decay1: 0.9, decay2: 0.999, max_iter: 200, batch_size: 50, tol: 1e-4, verbose: false, random_seed: nil) ⇒ MLPClassifier
Create a new classifier with multi-layer perceptron.
# File 'rumale-neural_network/lib/rumale/neural_network/mlp_classifier.rb', line 52

def initialize(hidden_units: [128, 128], dropout_rate: 0.4, learning_rate: 0.001,
               decay1: 0.9, decay2: 0.999, max_iter: 200, batch_size: 50,
               tol: 1e-4, verbose: false, random_seed: nil)
  super
end
Instance Attribute Details
#classes ⇒ Numo::Int32 (readonly)
Return the class labels.
# File 'rumale-neural_network/lib/rumale/neural_network/mlp_classifier.rb', line 29

def classes
  @classes
end
#n_iter ⇒ Integer (readonly)
Return the number of iterations run for optimization.
# File 'rumale-neural_network/lib/rumale/neural_network/mlp_classifier.rb', line 33

def n_iter
  @n_iter
end
#network ⇒ Rumale::NeuralNetwork::Model::Sequential (readonly)
Return the network.
# File 'rumale-neural_network/lib/rumale/neural_network/mlp_classifier.rb', line 25

def network
  @network
end
#rng ⇒ Random (readonly)
Return the random generator.
# File 'rumale-neural_network/lib/rumale/neural_network/mlp_classifier.rb', line 37

def rng
  @rng
end
Instance Method Details
#fit(x, y) ⇒ MLPClassifier
Fit the model with given training data.
# File 'rumale-neural_network/lib/rumale/neural_network/mlp_classifier.rb', line 62

def fit(x, y)
  x = ::Rumale::Validation.check_convert_sample_array(x)
  y = ::Rumale::Validation.check_convert_label_array(y)
  ::Rumale::Validation.check_sample_size(x, y)

  @classes = Numo::Int32[*y.to_a.uniq.sort]
  n_labels = @classes.size
  n_features = x.shape[1]
  sub_rng = @rng.dup

  loss = ::Rumale::NeuralNetwork::Loss::SoftmaxCrossEntropy.new
  @network = buld_network(n_features, n_labels, sub_rng)
  @network = train(x, one_hot_encode(y), @network, loss, sub_rng)
  @network.delete_dropout

  self
end
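Before training, fit converts the integer labels into a one-hot representation (via the private one_hot_encode helper) so they match the softmax output layer. A plain-Ruby sketch of the idea, assuming labels have already been mapped to class indices:

```ruby
# Hypothetical sketch of one-hot encoding, with plain arrays
# standing in for Numo narrays.
# indices: per-sample class index in 0...n_classes.
def one_hot(indices, n_classes)
  indices.map do |idx|
    row = Array.new(n_classes, 0.0)
    row[idx] = 1.0
    row
  end
end

one_hot([0, 2, 1], 3)
# => [[1.0, 0.0, 0.0], [0.0, 0.0, 1.0], [0.0, 1.0, 0.0]]
```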
#predict(x) ⇒ Numo::Int32
Predict class labels for samples.
# File 'rumale-neural_network/lib/rumale/neural_network/mlp_classifier.rb', line 84

def predict(x)
  x = ::Rumale::Validation.check_convert_sample_array(x)

  n_samples = x.shape[0]
  decision_values = predict_proba(x)
  predicted = Array.new(n_samples) { |n| @classes[decision_values[n, true].max_index] }
  Numo::Int32.asarray(predicted)
end
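As the source shows, predict takes the per-sample argmax over the probability matrix and maps it back to the class labels stored in #classes. A pure-Ruby sketch of that mapping, with plain arrays standing in for Numo narrays:

```ruby
# classes: sorted class labels learned in #fit.
# probs: one row of class probabilities per sample.
def predict_labels(probs, classes)
  probs.map do |row|
    best_index = row.each_with_index.max_by { |v, _i| v }.last
    classes[best_index]
  end
end

predict_labels([[0.1, 0.7, 0.2], [0.9, 0.05, 0.05]], [3, 5, 7])
# => [5, 3]
```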
#predict_proba(x) ⇒ Numo::DFloat
Predict probability for samples.
# File 'rumale-neural_network/lib/rumale/neural_network/mlp_classifier.rb', line 97

def predict_proba(x)
  x = ::Rumale::Validation.check_convert_sample_array(x)

  out, = @network.forward(x)
  softmax(out)
end
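The returned Numo::DFloat holds one row of class probabilities per sample, each row summing to one. A pure-Ruby sketch of the row-wise softmax applied to the network's raw outputs (a simplified illustration, with nested arrays in place of a Numo matrix):

```ruby
# Normalize each row of raw scores into a probability distribution.
def rowwise_softmax(scores)
  scores.map do |row|
    m = row.max                       # shift by the row max for stability
    exps = row.map { |v| Math.exp(v - m) }
    sum = exps.sum
    exps.map { |v| v / sum }
  end
end

probs = rowwise_softmax([[2.0, 1.0, 0.1], [0.0, 0.0, 0.0]])
```

A row of identical scores yields a uniform distribution, since softmax depends only on the differences between scores.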