Class: Rumale::Manifold::TSNE
- Inherits: Base::Estimator
  - Object
  - Base::Estimator
  - Rumale::Manifold::TSNE
- Includes: Base::Transformer
- Defined in: rumale-manifold/lib/rumale/manifold/tsne.rb
Overview
TSNE is a class that implements t-Distributed Stochastic Neighbor Embedding (t-SNE) with a fixed-point optimization algorithm. The fixed-point algorithm usually converges faster than the gradient descent method and does not need learning parameters such as the learning rate and momentum.
Reference
- van der Maaten, L., and Hinton, G., “Visualizing data using t-SNE,” J. of Machine Learning Research, vol. 9, pp. 2579–2605, 2008.
- Yang, Z., King, I., Xu, Z., and Oja, E., “Heavy-Tailed Symmetric Stochastic Neighbor Embedding,” Proc. NIPS’09, pp. 2169–2177, 2009.
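A minimal usage sketch (not from the library documentation: `samples` is a placeholder for a Numo::DFloat matrix of shape [n_samples, n_features], and the parameter values are only illustrative):

  require 'rumale/manifold/tsne'

  # Embed samples into two dimensions; fixing random_seed makes the result reproducible.
  tsne = Rumale::Manifold::TSNE.new(n_components: 2, perplexity: 40.0, random_seed: 1)
  representations = tsne.fit_transform(samples) # => Numo::DFloat of shape [n_samples, 2]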
Instance Attribute Summary
- #embedding ⇒ Numo::DFloat (readonly)
  Return the data in representation space.
- #kl_divergence ⇒ Float (readonly)
  Return the Kullback-Leibler divergence after optimization.
- #n_iter ⇒ Integer (readonly)
  Return the number of iterations run for optimization.
- #rng ⇒ Random (readonly)
  Return the random generator.
Attributes inherited from Base::Estimator
- #params
Instance Method Summary
- #fit(x) ⇒ TSNE
  Fit the model with given training data.
- #fit_transform(x) ⇒ Numo::DFloat
  Fit the model with training data, and then transform them with the learned model.
- #initialize(n_components: 2, perplexity: 30.0, metric: 'euclidean', init: 'random', max_iter: 500, tol: nil, verbose: false, random_seed: nil) ⇒ TSNE (constructor)
  Create a new transformer with t-SNE.
Constructor Details
#initialize(n_components: 2, perplexity: 30.0, metric: 'euclidean', init: 'random', max_iter: 500, tol: nil, verbose: false, random_seed: nil) ⇒ TSNE
Create a new transformer with t-SNE.
  # File 'rumale-manifold/lib/rumale/manifold/tsne.rb', line 60

  def initialize(n_components: 2, perplexity: 30.0, metric: 'euclidean', init: 'random',
                 max_iter: 500, tol: nil, verbose: false, random_seed: nil)
    super()
    @params = {
      n_components: n_components,
      perplexity: perplexity,
      max_iter: max_iter,
      tol: tol,
      metric: metric,
      init: init,
      verbose: verbose,
      random_seed: random_seed || srand
    }
    @rng = Random.new(@params[:random_seed])
  end
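A hedged construction sketch (the parameter values below are illustrative, not recommendations from the library documentation):

  # Default Euclidean metric on raw samples; a fixed random_seed makes the embedding reproducible.
  tsne = Rumale::Manifold::TSNE.new(n_components: 2, perplexity: 30.0, random_seed: 42)

  # With metric: 'precomputed', #fit expects a square [n_samples, n_samples] distance matrix
  # instead of a raw sample matrix (see the shape check in #fit below).
  tsne_pre = Rumale::Manifold::TSNE.new(metric: 'precomputed', random_seed: 42)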
Instance Attribute Details
#embedding ⇒ Numo::DFloat (readonly)
Return the data in representation space.
  # File 'rumale-manifold/lib/rumale/manifold/tsne.rb', line 31

  def embedding
    @embedding
  end
#kl_divergence ⇒ Float (readonly)
Return the Kullback-Leibler divergence after optimization.
  # File 'rumale-manifold/lib/rumale/manifold/tsne.rb', line 35

  def kl_divergence
    @kl_divergence
  end
#n_iter ⇒ Integer (readonly)
Return the number of iterations run for optimization.
  # File 'rumale-manifold/lib/rumale/manifold/tsne.rb', line 39

  def n_iter
    @n_iter
  end
#rng ⇒ Random (readonly)
Return the random generator.
  # File 'rumale-manifold/lib/rumale/manifold/tsne.rb', line 43

  def rng
    @rng
  end
Instance Method Details
#fit(x) ⇒ TSNE
Fit the model with given training data.
  # File 'rumale-manifold/lib/rumale/manifold/tsne.rb', line 82

  def fit(x, _not_used = nil)
    x = ::Rumale::Validation.check_convert_sample_array(x)
    if @params[:metric] == 'precomputed' && x.shape[0] != x.shape[1]
      raise ArgumentError, 'Expect the input distance matrix to be square.'
    end

    # initialize some variables.
    @n_iter = 0
    distance_mat = @params[:metric] == 'precomputed' ? x**2 : ::Rumale::PairwiseMetric.squared_error(x)
    hi_prob_mat = gaussian_distributed_probability_matrix(distance_mat)
    y = init_embedding(x)
    lo_prob_mat = t_distributed_probability_matrix(y)
    # perform fixed-point optimization.
    one_vec = Numo::DFloat.ones(x.shape[0]).expand_dims(1)
    @params[:max_iter].times do |t|
      break if terminate?(hi_prob_mat, lo_prob_mat)

      a = hi_prob_mat * lo_prob_mat
      b = lo_prob_mat**2
      y = (b.dot(one_vec) * y + (a - b).dot(y)) / a.dot(one_vec)
      lo_prob_mat = t_distributed_probability_matrix(y)
      @n_iter = t + 1
      if @params[:verbose] && (@n_iter % 100).zero?
        puts "[t-SNE] KL divergence after #{@n_iter} iterations: #{cost(hi_prob_mat, lo_prob_mat)}"
      end
    end
    # store results.
    @embedding = y
    @kl_divergence = cost(hi_prob_mat, lo_prob_mat)
    self
  end
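Written out as an equation (a restatement sketched from the loop body above, not taken from the library documentation), with p_ij the entries of hi_prob_mat (high-dimensional Gaussian affinities), q_ij the entries of lo_prob_mat (low-dimensional Student-t affinities), and y_i the embedded point for sample i, each iteration applies the multiplicative update

\[
  y_i \;\leftarrow\; \frac{\left(\sum_j q_{ij}^{2}\right) y_i \;+\; \sum_j \left(p_{ij}\,q_{ij} - q_{ij}^{2}\right) y_j}{\sum_j p_{ij}\,q_{ij}}
\]

which is the fixed-point iteration that replaces the gradient descent update mentioned in the Overview.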
#fit_transform(x) ⇒ Numo::DFloat
Fit the model with training data, and then transform them with the learned model.
  # File 'rumale-manifold/lib/rumale/manifold/tsne.rb', line 120

  def fit_transform(x, _not_used = nil)
    x = ::Rumale::Validation.check_convert_sample_array(x)
    fit(x)
    @embedding.dup
  end
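A short sketch of how the return value relates to the attributes above (hedged: `samples` is again a placeholder for a Numo::DFloat sample matrix):

  tsne = Rumale::Manifold::TSNE.new(perplexity: 30.0, random_seed: 1)
  z = tsne.fit_transform(samples) # => Numo::DFloat of shape [n_samples, 2]; a copy of tsne.embedding
  puts tsne.kl_divergence         # KL divergence reached when the optimization stopped
  puts tsne.n_iter                # number of fixed-point iterations actually run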