Class: Rumale::Tree::DecisionTreeRegressor

Inherits:
BaseDecisionTree
Includes:
Base::Regressor
Defined in:
lib/rumale/tree/decision_tree_regressor.rb

Overview

DecisionTreeRegressor is a class that implements a decision tree for regression.

Examples:

estimator =
  Rumale::Tree::DecisionTreeRegressor.new(
    max_depth: 3, max_leaf_nodes: 10, min_samples_leaf: 5, random_seed: 1)
estimator.fit(training_samples, training_values)
results = estimator.predict(testing_samples)

Direct Known Subclasses

ExtraTreeRegressor

Instance Attribute Summary

Attributes included from Base::BaseEstimator

#params

Instance Method Summary

Methods included from Base::Regressor

#score

Methods inherited from BaseDecisionTree

#apply

Constructor Details

#initialize(criterion: 'mse', max_depth: nil, max_leaf_nodes: nil, min_samples_leaf: 1, max_features: nil, random_seed: nil) ⇒ DecisionTreeRegressor

Create a new regressor with decision tree algorithm.

Parameters:

  • criterion (String) (defaults to: 'mse')

    The function used to evaluate the splitting point. Supported criteria are ‘mae’ and ‘mse’.

  • max_depth (Integer) (defaults to: nil)

    The maximum depth of the tree. If nil is given, the decision tree grows without a depth limit.

  • max_leaf_nodes (Integer) (defaults to: nil)

    The maximum number of leaf nodes in the decision tree. If nil is given, the number of leaves is not limited.

  • min_samples_leaf (Integer) (defaults to: 1)

    The minimum number of samples at a leaf node.

  • max_features (Integer) (defaults to: nil)

    The number of features to consider when searching for the optimal split point. If nil is given, the split process considers all features.

  • random_seed (Integer) (defaults to: nil)

    The seed value used to initialize the random generator. It is used to randomly determine the order of features examined when deciding the splitting point.



# File 'lib/rumale/tree/decision_tree_regressor.rb', line 50

def initialize(criterion: 'mse', max_depth: nil, max_leaf_nodes: nil, min_samples_leaf: 1, max_features: nil,
               random_seed: nil)
  check_params_type_or_nil(Integer, max_depth: max_depth, max_leaf_nodes: max_leaf_nodes,
                                    max_features: max_features, random_seed: random_seed)
  check_params_integer(min_samples_leaf: min_samples_leaf)
  check_params_string(criterion: criterion)
  check_params_positive(max_depth: max_depth, max_leaf_nodes: max_leaf_nodes,
                        min_samples_leaf: min_samples_leaf, max_features: max_features)
  super
  @leaf_values = nil
end
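
For example, the splitting criterion can be switched to mean absolute error via the criterion parameter. A minimal sketch with synthetic data (the variable names and sizes here are illustrative, not part of the library):

require 'rumale'

x = Numo::DFloat.new(100, 4).rand   # 100 samples, 4 features
y = Numo::DFloat.new(100).rand      # single-output target values
estimator = Rumale::Tree::DecisionTreeRegressor.new(
  criterion: 'mae', max_depth: 4, min_samples_leaf: 5, random_seed: 1)
estimator.fit(x, y)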

Instance Attribute Details

#feature_importances ⇒ Numo::DFloat (readonly)

Return the importance for each feature.

Returns:

  • (Numo::DFloat)

    (size: n_features)



# File 'lib/rumale/tree/decision_tree_regressor.rb', line 24

def feature_importances
  @feature_importances
end
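
After fitting, the per-feature importance scores can be read from this attribute. A short sketch (it assumes an estimator fitted as in the example above; the output format is illustrative):

estimator.feature_importances.to_a.each_with_index do |score, idx|
  puts format('feature %d: %.4f', idx, score)
end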

#leaf_values ⇒ Numo::DFloat (readonly)

Return the values assigned to each leaf.

Returns:

  • (Numo::DFloat)

    (shape: [n_leaves, n_outputs])



# File 'lib/rumale/tree/decision_tree_regressor.rb', line 36

def leaf_values
  @leaf_values
end

#rng ⇒ Random (readonly)

Return the random generator for random selection of feature index.

Returns:

  • (Random)


# File 'lib/rumale/tree/decision_tree_regressor.rb', line 32

def rng
  @rng
end

#tree ⇒ Node (readonly)

Return the learned tree.

Returns:

  • (Node)

# File 'lib/rumale/tree/decision_tree_regressor.rb', line 28

def tree
  @tree
end

Instance Method Details

#fit(x, y) ⇒ DecisionTreeRegressor

Fit the model with given training data.

Parameters:

  • x (Numo::DFloat)

    (shape: [n_samples, n_features]) The training data to be used for fitting the model.

  • y (Numo::DFloat)

    (shape: [n_samples, n_outputs]) The target values to be used for fitting the model.

Returns:

  • (DecisionTreeRegressor)

    The learned regressor itself.

# File 'lib/rumale/tree/decision_tree_regressor.rb', line 67

def fit(x, y)
  check_sample_array(x)
  check_tvalue_array(y)
  check_sample_tvalue_size(x, y)
  n_samples, n_features = x.shape
  @params[:max_features] = n_features if @params[:max_features].nil?
  @params[:max_features] = [@params[:max_features], n_features].min
  @n_leaves = 0
  @leaf_values = []
  @sub_rng = @rng.dup
  build_tree(x, y)
  eval_importance(n_samples, n_features)
  @leaf_values = Numo::DFloat.cast(@leaf_values)
  @leaf_values = @leaf_values.flatten.dup if @leaf_values.shape[1] == 1
  self
end
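
As the shape of y indicates, multi-output regression is supported by passing one column per output. A hedged sketch with synthetic data (variable names and sizes are illustrative):

x = Numo::DFloat.new(200, 3).rand
y = Numo::DFloat.new(200, 2).rand   # two target values per sample
estimator = Rumale::Tree::DecisionTreeRegressor.new(max_depth: 5, random_seed: 1)
estimator.fit(x, y)
estimator.leaf_values.shape   # expected to be [n_leaves, 2] per the leaf_values attribute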

#marshal_dump ⇒ Hash

Dump marshal data.

Returns:

  • (Hash)

    The marshal data about DecisionTreeRegressor.



# File 'lib/rumale/tree/decision_tree_regressor.rb', line 95

def marshal_dump
  { params: @params,
    tree: @tree,
    feature_importances: @feature_importances,
    leaf_values: @leaf_values,
    rng: @rng }
end

#marshal_load(obj) ⇒ nil

Load marshal data.

Returns:

  • (nil)


# File 'lib/rumale/tree/decision_tree_regressor.rb', line 105

def marshal_load(obj)
  @params = obj[:params]
  @tree = obj[:tree]
  @feature_importances = obj[:feature_importances]
  @leaf_values = obj[:leaf_values]
  @rng = obj[:rng]
  nil
end
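
Because #marshal_dump and #marshal_load are defined, a fitted estimator can be saved and restored with Ruby's standard Marshal module. A brief sketch (the file name is illustrative):

File.binwrite('regressor.dat', Marshal.dump(estimator))
restored = Marshal.load(File.binread('regressor.dat'))
results = restored.predict(testing_samples)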

#predict(x) ⇒ Numo::DFloat

Predict values for samples.

Parameters:

  • x (Numo::DFloat)

    (shape: [n_samples, n_features]) The samples to predict values for.

Returns:

  • (Numo::DFloat)

    (shape: [n_samples, n_outputs]) Predicted values per sample.



# File 'lib/rumale/tree/decision_tree_regressor.rb', line 88

def predict(x)
  check_sample_array(x)
  @leaf_values.shape[1].nil? ? @leaf_values[apply(x)].dup : @leaf_values[apply(x), true].dup
end
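
Predictions can be checked against held-out targets with the #score method mixed in from Base::Regressor, which reports the coefficient of determination. A short sketch, assuming testing_samples and testing_values are Numo::DFloat arrays shaped like the training data:

results = estimator.predict(testing_samples)
r2 = estimator.score(testing_samples, testing_values)
puts "R^2 on the test set: #{r2}"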