Class: SVMKit::PolynomialModel::FactorizationMachineClassifier

Inherits:
Object
Includes:
Base::BaseEstimator, Base::Classifier
Defined in:
lib/svmkit/polynomial_model/factorization_machine_classifier.rb

Overview

FactorizationMachineClassifier is a class that implements Factorization Machine for binary classification with (mini-batch) stochastic gradient descent optimization. Note that this implementation uses hinge loss for the loss function.

Reference

    1. Rendle, “Factorization Machines with libFM,” ACM Transactions on Intelligent Systems and Technology, vol. 3 (3), pp. 57:1–57:22, 2012.

    2. Rendle, “Factorization Machines,” Proceedings of the 10th IEEE International Conference on Data Mining (ICDM’10), pp. 995–1000, 2010.

Examples:

estimator =
  SVMKit::PolynomialModel::FactorizationMachineClassifier.new(
   n_factors: 10, reg_param_bias: 0.001, reg_param_weight: 0.001, reg_param_factor: 0.001,
   max_iter: 5000, batch_size: 50, random_seed: 1)
estimator.fit(training_samples, training_labels)
results = estimator.predict(testing_samples)

Instance Attribute Summary collapse

Attributes included from Base::BaseEstimator

#params

Instance Method Summary collapse

Constructor Details

#initialize(n_factors: 2, reg_param_bias: 1.0, reg_param_weight: 1.0, reg_param_factor: 1.0, init_std: 0.1, max_iter: 1000, batch_size: 10, random_seed: nil) ⇒ FactorizationMachineClassifier

Create a new classifier with Factorization Machine.

Parameters:

  • n_factors (Integer) (defaults to: 2)

    The number of factors of the factor matrix.

  • reg_param_bias (Float) (defaults to: 1.0)

    The regularization parameter for bias term.

  • reg_param_weight (Float) (defaults to: 1.0)

    The regularization parameter for weight vector.

  • reg_param_factor (Float) (defaults to: 1.0)

    The regularization parameter for factor matrix.

  • init_std (Float) (defaults to: 0.1)

    The standard deviation of normal random number for initialization of factor matrix.

  • max_iter (Integer) (defaults to: 1000)

    The maximum number of iterations.

  • batch_size (Integer) (defaults to: 10)

    The size of the mini batches.

  • random_seed (Integer) (defaults to: nil)

    The seed value used to initialize the random generator.



# File 'lib/svmkit/polynomial_model/factorization_machine_classifier.rb', line 53

def initialize(n_factors: 2, reg_param_bias: 1.0, reg_param_weight: 1.0, reg_param_factor: 1.0,
               init_std: 0.1, max_iter: 1000, batch_size: 10, random_seed: nil)
  @params = {}
  @params[:n_factors] = n_factors
  @params[:reg_param_bias] = reg_param_bias
  @params[:reg_param_weight] = reg_param_weight
  @params[:reg_param_factor] = reg_param_factor
  @params[:init_std] = init_std
  @params[:max_iter] = max_iter
  @params[:batch_size] = batch_size
  @params[:random_seed] = random_seed
  @params[:random_seed] ||= srand
  @factor_mat = nil
  @weight_vec = nil
  @bias_term = 0.0
  @rng = Random.new(@params[:random_seed])
end

Instance Attribute Details

#bias_term ⇒ Float (readonly)

Return the bias term for Factorization Machine.

Returns:

  • (Float)


# File 'lib/svmkit/polynomial_model/factorization_machine_classifier.rb', line 37

def bias_term
  @bias_term
end

#factor_mat ⇒ Numo::DFloat (readonly)

Return the factor matrix for Factorization Machine.

Returns:

  • (Numo::DFloat)

    (shape: [n_factors, n_features])



# File 'lib/svmkit/polynomial_model/factorization_machine_classifier.rb', line 29

def factor_mat
  @factor_mat
end

#rng ⇒ Random (readonly)

Return the random generator used for random sampling.

Returns:

  • (Random)


# File 'lib/svmkit/polynomial_model/factorization_machine_classifier.rb', line 41

def rng
  @rng
end

#weight_vec ⇒ Numo::DFloat (readonly)

Return the weight vector for Factorization Machine.

Returns:

  • (Numo::DFloat)

    (shape: [n_features])



# File 'lib/svmkit/polynomial_model/factorization_machine_classifier.rb', line 33

def weight_vec
  @weight_vec
end

Instance Method Details

#decision_function(x) ⇒ Numo::DFloat

Calculate confidence scores for samples.

Parameters:

  • x (Numo::DFloat)

    (shape: [n_samples, n_features]) The samples to compute the scores.

Returns:

  • (Numo::DFloat)

    (shape: [n_samples]) Confidence score per sample.



# File 'lib/svmkit/polynomial_model/factorization_machine_classifier.rb', line 111

def decision_function(x)
  linear_term = @bias_term + x.dot(@weight_vec)
  # Sum the pairwise interaction term over the factor axis (axis 0) so that
  # one score is produced per sample rather than a single scalar.
  factor_term = 0.5 * (@factor_mat.dot(x.transpose)**2 - (@factor_mat**2).dot(x.transpose**2)).sum(0)
  linear_term + factor_term
end
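The factorization machine score computed by decision_function can be written as f(x) = b + w·x + 0.5 Σ_f [ (v_f·x)² − Σ_i v_{f,i}² x_i² ]. The following plain-Ruby sketch (no Numo) evaluates this for a single sample; the names fm_score, bias, weights, and factors are illustrative only and not part of the SVMKit API:

```ruby
# Sketch of the factorization machine decision value for one sample.
#   f(x) = bias + weights.x + 0.5 * sum_f [ (v_f.x)^2 - sum_i (v_f[i] * x[i])^2 ]
def fm_score(x, bias, weights, factors)
  linear = bias + weights.zip(x).sum { |w, xi| w * xi }
  factor = 0.5 * factors.sum do |v|
    s  = v.zip(x).sum { |vi, xi| vi * xi }          # v_f . x
    s2 = v.zip(x).sum { |vi, xi| (vi * xi)**2 }     # sum_i v_f[i]^2 x[i]^2
    s**2 - s2
  end
  linear + factor
end

# Two features, one factor:
# linear = 0.5 + 1*1 + 2*2 = 5.5; factor = 0.5 * (0.5**2 - (0.01 + 0.16)) = 0.04
fm_score([1.0, 2.0], 0.5, [1.0, 2.0], [[0.1, 0.2]])
```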

#fit(x, y) ⇒ FactorizationMachineClassifier

Fit the model with given training data.

Parameters:

  • x (Numo::DFloat)

    (shape: [n_samples, n_features]) The training data to be used for fitting the model.

  • y (Numo::Int32)

    (shape: [n_samples]) The labels to be used for fitting the model.

Returns:

  • (FactorizationMachineClassifier)

    The learned classifier itself.


# File 'lib/svmkit/polynomial_model/factorization_machine_classifier.rb', line 76

def fit(x, y)
  # Generate binary labels.
  negative_label = y.to_a.uniq.sort.shift
  bin_y = y.map { |l| l != negative_label ? 1.0 : -1.0 }
  # Initialize some variables.
  n_samples, n_features = x.shape
  rand_ids = [*0...n_samples].shuffle(random: @rng)
  @factor_mat = rand_normal([@params[:n_factors], n_features], 0, @params[:init_std])
  @weight_vec = Numo::DFloat.zeros(n_features)
  @bias_term = 0.0
  # Start optimization.
  @params[:max_iter].times do |t|
    # Random sampling.
    subset_ids = rand_ids.shift(@params[:batch_size])
    rand_ids.concat(subset_ids)
    data = x[subset_ids, true]
    label = bin_y[subset_ids]
    # Calculate gradients for loss function.
    loss_grad = loss_gradient(data, label)
    next if loss_grad.ne(0.0).count.zero?
    # Update each parameter.
    @bias_term -= learning_rate(@params[:reg_param_bias], t) * bias_gradient(loss_grad)
    @weight_vec -= learning_rate(@params[:reg_param_weight], t) * weight_gradient(loss_grad, data)
    @params[:n_factors].times do |n|
      @factor_mat[n, true] -= learning_rate(@params[:reg_param_factor], t) *
                              factor_gradient(loss_grad, data, @factor_mat[n, true])
    end
  end
  self
end
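The mini-batch sampling inside fit treats a shuffled list of sample indices as a ring: each iteration shifts the next batch_size indices off the front and appends them to the back, so every sample is visited before any repeats. A minimal sketch of that scheme (the names indices, batch_size, and batches are illustrative):

```ruby
# Rotate a shuffled index list to draw non-overlapping mini-batches.
indices = [*0...7].shuffle(random: Random.new(42))
batch_size = 3

batches = 3.times.map do
  batch = indices.shift(batch_size)  # take the next batch from the front
  indices.concat(batch)              # rotate it to the back of the ring
  batch
end
# After ceil(7 / 3.0) = 3 draws, all seven indices have been visited.
```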

#marshal_dumpHash

Dump marshal data.

Returns:

  • (Hash)

    The marshal data about FactorizationMachineClassifier



# File 'lib/svmkit/polynomial_model/factorization_machine_classifier.rb', line 138

def marshal_dump
  { params: @params, factor_mat: @factor_mat, weight_vec: @weight_vec, bias_term: @bias_term, rng: @rng }
end

#marshal_load(obj) ⇒ nil

Load marshal data.

Returns:

  • (nil)


151
# File 'lib/svmkit/polynomial_model/factorization_machine_classifier.rb', line 144

def marshal_load(obj)
  @params = obj[:params]
  @factor_mat = obj[:factor_mat]
  @weight_vec = obj[:weight_vec]
  @bias_term = obj[:bias_term]
  @rng = obj[:rng]
  nil
end
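The marshal_dump / marshal_load pair above is what lets a trained classifier be serialized with Ruby's built-in Marshal module. A minimal stand-in class (Point below is illustrative, not part of SVMKit) shows the same round-trip pattern:

```ruby
# Minimal example of the marshal_dump / marshal_load protocol:
# Marshal.dump calls marshal_dump, Marshal.load allocates a blank
# instance and calls marshal_load with the dumped data.
class Point
  attr_reader :x, :y

  def initialize(x, y)
    @x = x
    @y = y
  end

  def marshal_dump
    { x: @x, y: @y }
  end

  def marshal_load(obj)
    @x = obj[:x]
    @y = obj[:y]
    nil
  end
end

restored = Marshal.load(Marshal.dump(Point.new(1, 2)))
```

A trained estimator can be persisted the same way, e.g. written to disk with `File.binwrite` and restored later with `Marshal.load`.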

#predict(x) ⇒ Numo::Int32

Predict class labels for samples.

Parameters:

  • x (Numo::DFloat)

    (shape: [n_samples, n_features]) The samples to predict the labels.

Returns:

  • (Numo::Int32)

    (shape: [n_samples]) Predicted class label per sample.



# File 'lib/svmkit/polynomial_model/factorization_machine_classifier.rb', line 121

def predict(x)
  Numo::Int32.cast(decision_function(x).map { |v| v >= 0.0 ? 1 : -1 })
end

#score(x, y) ⇒ Float

Calculate the mean accuracy of the given testing data.

Parameters:

  • x (Numo::DFloat)

    (shape: [n_samples, n_features]) Testing data.

  • y (Numo::Int32)

    (shape: [n_samples]) True labels for testing data.

Returns:

  • (Float)

    Mean accuracy of the predictions.



# File 'lib/svmkit/polynomial_model/factorization_machine_classifier.rb', line 130

def score(x, y)
  p = predict(x)
  n_hits = (y.to_a.map.with_index { |l, n| l == p[n] ? 1 : 0 }).inject(:+)
  n_hits / y.size.to_f
end
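The accuracy computation in score reduces to counting label matches and dividing by the sample count. A plain-Ruby sketch (truth and predicted are illustrative arrays, not SVMKit types):

```ruby
# Mean accuracy: fraction of predictions that match the true labels.
truth     = [1, -1, 1, 1, -1]
predicted = [1, -1, -1, 1, -1]

n_hits   = truth.zip(predicted).count { |t, p| t == p }
accuracy = n_hits / truth.size.to_f  # 4 hits out of 5 samples
```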