Class: Rumale::LinearModel::LogisticRegression

Inherits:
BaseLinearModel
Includes:
Base::Classifier
Defined in:
lib/rumale/linear_model/logistic_regression.rb

Overview

LogisticRegression is a class that implements Logistic Regression with mini-batch stochastic gradient descent optimization. For multiclass classification problems, it uses the one-vs-the-rest strategy.

Reference

    1. S. Shalev-Shwartz, Y. Singer, N. Srebro, and A. Cotter, “Pegasos: Primal Estimated sub-GrAdient SOlver for SVM,” Mathematical Programming, vol. 127 (1), pp. 3–30, 2011.

Examples:

estimator =
  Rumale::LinearModel::LogisticRegression.new(reg_param: 1.0, max_iter: 1000, batch_size: 20, random_seed: 1)
estimator.fit(training_samples, training_labels)
results = estimator.predict(testing_samples)

Instance Attribute Summary

Attributes included from Base::BaseEstimator

#params

Instance Method Summary

Methods included from Base::Classifier

#score

Constructor Details

#initialize(reg_param: 1.0, fit_bias: false, bias_scale: 1.0, max_iter: 1000, batch_size: 20, optimizer: nil, n_jobs: nil, random_seed: nil) ⇒ LogisticRegression

Create a new classifier with Logistic Regression using mini-batch SGD optimization.

Parameters:

  • reg_param (Float) (defaults to: 1.0)

    The regularization parameter.

  • fit_bias (Boolean) (defaults to: false)

    The flag indicating whether to fit the bias term.

  • bias_scale (Float) (defaults to: 1.0)

    The scale of the bias term. If fit_bias is true, the feature vector v becomes [v; bias_scale].

  • max_iter (Integer) (defaults to: 1000)

    The maximum number of iterations.

  • batch_size (Integer) (defaults to: 20)

    The size of the mini batches.

  • optimizer (Optimizer) (defaults to: nil)

    The optimizer to calculate adaptive learning rate. If nil is given, Nadam is used.

  • n_jobs (Integer) (defaults to: nil)

    The number of jobs for running the fit and predict methods in parallel. If nil is given, the methods do not execute in parallel. If zero or less is given, it becomes equal to the number of processors. This parameter is ignored if the Parallel gem is not loaded.

  • random_seed (Integer) (defaults to: nil)

    The seed value used to initialize the random generator.



# File 'lib/rumale/linear_model/logistic_regression.rb', line 54

def initialize(reg_param: 1.0, fit_bias: false, bias_scale: 1.0,
               max_iter: 1000, batch_size: 20, optimizer: nil, n_jobs: nil, random_seed: nil)
  check_params_float(reg_param: reg_param, bias_scale: bias_scale)
  check_params_integer(max_iter: max_iter, batch_size: batch_size)
  check_params_boolean(fit_bias: fit_bias)
  check_params_type_or_nil(Integer, n_jobs: n_jobs, random_seed: random_seed)
  check_params_positive(reg_param: reg_param, bias_scale: bias_scale, max_iter: max_iter, batch_size: batch_size)
  super
  @classes = nil
end
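
The following is a minimal construction sketch (not part of the original documentation) showing the keyword arguments with illustrative values; it assumes the rumale gem has been required.

require 'rumale'

estimator = Rumale::LinearModel::LogisticRegression.new(
  reg_param: 0.1,     # strength of the regularization term
  fit_bias: true,     # append a bias term to each feature vector
  bias_scale: 1.0,    # the feature vector v becomes [v; bias_scale]
  max_iter: 500,      # maximum number of mini-batch SGD iterations
  batch_size: 32,     # samples per mini-batch
  optimizer: nil,     # nil falls back to Nadam
  n_jobs: nil,        # nil disables parallel fitting
  random_seed: 1      # reproducible mini-batch sampling
)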

Instance Attribute Details

#bias_term ⇒ Numo::DFloat (readonly)

Return the bias term (a.k.a. intercept) for Logistic Regression.

Returns:

  • (Numo::DFloat)

    (shape: [n_classes])



# File 'lib/rumale/linear_model/logistic_regression.rb', line 29

def bias_term
  @bias_term
end

#classes ⇒ Numo::Int32 (readonly)

Return the class labels.

Returns:

  • (Numo::Int32)

    (shape: [n_classes])



# File 'lib/rumale/linear_model/logistic_regression.rb', line 33

def classes
  @classes
end

#rng ⇒ Random (readonly)

Return the random generator for performing random sampling.

Returns:

  • (Random)


# File 'lib/rumale/linear_model/logistic_regression.rb', line 37

def rng
  @rng
end

#weight_vec ⇒ Numo::DFloat (readonly)

Return the weight vector for Logistic Regression.

Returns:

  • (Numo::DFloat)

    (shape: [n_classes, n_features])



# File 'lib/rumale/linear_model/logistic_regression.rb', line 25

def weight_vec
  @weight_vec
end

Instance Method Details

#decision_function(x) ⇒ Numo::DFloat

Calculate confidence scores for samples.

Parameters:

  • x (Numo::DFloat)

    (shape: [n_samples, n_features]) The samples to compute the scores.

Returns:

  • (Numo::DFloat)

    (shape: [n_samples, n_classes]) Confidence score per sample.



# File 'lib/rumale/linear_model/logistic_regression.rb', line 109

def decision_function(x)
  check_sample_array(x)
  x.dot(@weight_vec.transpose) + @bias_term
end
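
As an illustration (not from the original documentation), the confidence score is the linear function w · x + b per class, so for a fitted multiclass estimator the following two arrays agree; testing_samples is assumed to be a Numo::DFloat of shape [n_samples, n_features].

scores = estimator.decision_function(testing_samples)
manual = testing_samples.dot(estimator.weight_vec.transpose) + estimator.bias_term
# both hold confidence scores of shape [n_samples, n_classes]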

#fit(x, y) ⇒ LogisticRegression

Fit the model with given training data.

Parameters:

  • x (Numo::DFloat)

    (shape: [n_samples, n_features]) The training data to be used for fitting the model.

  • y (Numo::Int32)

    (shape: [n_samples]) The labels to be used for fitting the model.

Returns:

  • (LogisticRegression)

    The learned classifier itself.


# File 'lib/rumale/linear_model/logistic_regression.rb', line 70

def fit(x, y)
  check_sample_array(x)
  check_label_array(y)
  check_sample_label_size(x, y)

  @classes = Numo::Int32[*y.to_a.uniq.sort]
  n_classes = @classes.size
  n_features = x.shape[1]

  if n_classes > 2
    @weight_vec = Numo::DFloat.zeros(n_classes, n_features)
    @bias_term = Numo::DFloat.zeros(n_classes)
    if enable_parallel?
      # :nocov:
      models = parallel_map(n_classes) do |n|
        bin_y = Numo::Int32.cast(y.eq(@classes[n])) * 2 - 1
        partial_fit(x, bin_y)
      end
      # :nocov:
      n_classes.times { |n| @weight_vec[n, true], @bias_term[n] = models[n] }
    else
      n_classes.times do |n|
        bin_y = Numo::Int32.cast(y.eq(@classes[n])) * 2 - 1
        @weight_vec[n, true], @bias_term[n] = partial_fit(x, bin_y)
      end
    end
  else
    negative_label = y.to_a.uniq.min
    bin_y = Numo::Int32.cast(y.ne(negative_label)) * 2 - 1
    @weight_vec, @bias_term = partial_fit(x, bin_y)
  end

  self
end
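
A toy example (illustrative, not from the original documentation): three class labels trigger the one-vs-the-rest branch, so one weight vector and one bias term are learned per class.

require 'rumale'

x = Numo::DFloat[[0.0, 1.0], [0.2, 0.9], [1.0, 0.1], [0.9, 0.2], [0.5, 0.5], [0.6, 0.4]]
y = Numo::Int32[0, 0, 1, 1, 2, 2]

estimator = Rumale::LinearModel::LogisticRegression.new(max_iter: 100, batch_size: 2, random_seed: 1)
estimator.fit(x, y)
estimator.classes          # => Numo::Int32[0, 1, 2]
estimator.weight_vec.shape # => [3, 2]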

#marshal_dump ⇒ Hash

Dump marshal data.

Returns:

  • (Hash)

    The marshal data about LogisticRegression.



# File 'lib/rumale/linear_model/logistic_regression.rb', line 152

def marshal_dump
  { params: @params,
    weight_vec: @weight_vec,
    bias_term: @bias_term,
    classes: @classes,
    rng: @rng }
end

#marshal_load(obj) ⇒ nil

Load marshal data.

Returns:

  • (nil)


# File 'lib/rumale/linear_model/logistic_regression.rb', line 162

def marshal_load(obj)
  @params = obj[:params]
  @weight_vec = obj[:weight_vec]
  @bias_term = obj[:bias_term]
  @classes = obj[:classes]
  @rng = obj[:rng]
  nil
end
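
Because marshal_dump and marshal_load are defined, a fitted model can be persisted with Ruby's standard Marshal module; the file name below is illustrative.

File.binwrite('logistic_regression.model', Marshal.dump(estimator))
restored = Marshal.load(File.binread('logistic_regression.model'))
restored.predict(testing_samples)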

#predict(x) ⇒ Numo::Int32

Predict class labels for samples.

Parameters:

  • x (Numo::DFloat)

    (shape: [n_samples, n_features]) The samples to predict the labels.

Returns:

  • (Numo::Int32)

    (shape: [n_samples]) Predicted class label per sample.



# File 'lib/rumale/linear_model/logistic_regression.rb', line 118

def predict(x)
  check_sample_array(x)

  return Numo::Int32.cast(predict_proba(x)[true, 1].ge(0.5)) * 2 - 1 if @classes.size <= 2

  n_samples, = x.shape
  decision_values = predict_proba(x)
  predicted = if enable_parallel?
                parallel_map(n_samples) { |n| @classes[decision_values[n, true].max_index] }
              else
                Array.new(n_samples) { |n| @classes[decision_values[n, true].max_index] }
              end
  Numo::Int32.asarray(predicted)
end
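
Illustrative check (assuming a fitted multiclass estimator): the predicted label for each sample is the class with the largest predicted probability.

labels = estimator.predict(testing_samples)
proba  = estimator.predict_proba(testing_samples)
labels[0] == estimator.classes[proba[0, true].max_index] # => true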

#predict_proba(x) ⇒ Numo::DFloat

Predict probability for samples.

Parameters:

  • x (Numo::DFloat)

    (shape: [n_samples, n_features]) The samples to predict the probabilities.

Returns:

  • (Numo::DFloat)

    (shape: [n_samples, n_classes]) Predicted probability of each class per sample.



# File 'lib/rumale/linear_model/logistic_regression.rb', line 137

def predict_proba(x)
  check_sample_array(x)

  proba = 1.0 / (Numo::NMath.exp(-decision_function(x)) + 1.0)
  return (proba.transpose / proba.sum(axis: 1)).transpose if @classes.size > 2

  n_samples, = x.shape
  probs = Numo::DFloat.zeros(n_samples, 2)
  probs[true, 1] = proba
  probs[true, 0] = 1.0 - proba
  probs
end
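
Rough sketch of the computation above for a fitted binary estimator (illustrative, not from the original documentation): the positive-class probability is the logistic sigmoid of the decision value, and the other column is its complement.

scores = estimator.decision_function(testing_samples)
positive = 1.0 / (Numo::NMath.exp(-scores) + 1.0)
probs = estimator.predict_proba(testing_samples)
# probs[true, 1] matches positive and probs[true, 0] equals 1.0 - positive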