Class: OpenApiOpenAIClient::CreateFineTuneRequest

Inherits: Object
Defined in:
lib/openapi_openai/models/create_fine_tune_request.rb

Instance Attribute Summary

Class Method Summary

Instance Method Summary

Constructor Details

#initialize(attributes = {}) ⇒ CreateFineTuneRequest

Initializes the object



# File 'lib/openapi_openai/models/create_fine_tune_request.rb', line 113

def initialize(attributes = {})
  if (!attributes.is_a?(Hash))
    fail ArgumentError, "The input argument (attributes) must be a hash in `OpenApiOpenAIClient::CreateFineTuneRequest` initialize method"
  end

  # check to see if the attribute exists and convert string to symbol for hash key
  attributes = attributes.each_with_object({}) { |(k, v), h|
    if (!self.class.attribute_map.key?(k.to_sym))
      fail ArgumentError, "`#{k}` is not a valid attribute in `OpenApiOpenAIClient::CreateFineTuneRequest`. Please check the name to make sure it's valid. List of attributes: " + self.class.attribute_map.keys.inspect
    end
    h[k.to_sym] = v
  }

  if attributes.key?(:'training_file')
    self.training_file = attributes[:'training_file']
  else
    self.training_file = nil
  end

  if attributes.key?(:'validation_file')
    self.validation_file = attributes[:'validation_file']
  end

  if attributes.key?(:'model')
    self.model = attributes[:'model']
  end

  if attributes.key?(:'n_epochs')
    self.n_epochs = attributes[:'n_epochs']
  else
    self.n_epochs = 4
  end

  if attributes.key?(:'batch_size')
    self.batch_size = attributes[:'batch_size']
  end

  if attributes.key?(:'learning_rate_multiplier')
    self.learning_rate_multiplier = attributes[:'learning_rate_multiplier']
  end

  if attributes.key?(:'prompt_loss_weight')
    self.prompt_loss_weight = attributes[:'prompt_loss_weight']
  else
    self.prompt_loss_weight = 0.01
  end

  if attributes.key?(:'compute_classification_metrics')
    self.compute_classification_metrics = attributes[:'compute_classification_metrics']
  else
    self.compute_classification_metrics = false
  end

  if attributes.key?(:'classification_n_classes')
    self.classification_n_classes = attributes[:'classification_n_classes']
  end

  if attributes.key?(:'classification_positive_class')
    self.classification_positive_class = attributes[:'classification_positive_class']
  end

  if attributes.key?(:'classification_betas')
    if (value = attributes[:'classification_betas']).is_a?(Array)
      self.classification_betas = value
    end
  end

  if attributes.key?(:'suffix')
    self.suffix = attributes[:'suffix']
  end
end
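
A minimal usage sketch (the file ID below is hypothetical); attributes omitted from the hash fall back to the defaults shown above:

request = OpenApiOpenAIClient::CreateFineTuneRequest.new(
  training_file: 'file-abc123',   # hypothetical uploaded file ID
  suffix: 'custom-model-name'
)
request.n_epochs           # => 4    (default)
request.prompt_loss_weight # => 0.01 (default)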

Instance Attribute Details

#batch_size ⇒ Object

The batch size to use for training. The batch size is the number of training examples used to train a single forward and backward pass. By default, the batch size will be dynamically configured to be ~0.2% of the number of examples in the training set, capped at 256 - in general, we’ve found that larger batch sizes tend to work better for larger datasets.



# File 'lib/openapi_openai/models/create_fine_tune_request.rb', line 30

def batch_size
  @batch_size
end

#classification_betas ⇒ Object

If this is provided, we calculate F-beta scores at the specified beta values. The F-beta score is a generalization of F-1 score. This is only used for binary classification. With a beta of 1 (i.e. the F-1 score), precision and recall are given the same weight. A larger beta score puts more weight on recall and less on precision. A smaller beta score puts more weight on precision and less on recall.



# File 'lib/openapi_openai/models/create_fine_tune_request.rb', line 48

def classification_betas
  @classification_betas
end

#classification_n_classes ⇒ Object

The number of classes in a classification task. This parameter is required for multiclass classification.



# File 'lib/openapi_openai/models/create_fine_tune_request.rb', line 42

def classification_n_classes
  @classification_n_classes
end

#classification_positive_class ⇒ Object

The positive class in binary classification. This parameter is needed to generate precision, recall, and F1 metrics when doing binary classification.



# File 'lib/openapi_openai/models/create_fine_tune_request.rb', line 45

def classification_positive_class
  @classification_positive_class
end

#compute_classification_metrics ⇒ Object

If set, we calculate classification-specific metrics such as accuracy and F-1 score using the validation set at the end of every epoch. These metrics can be viewed in the [results file](/docs/guides/fine-tuning/analyzing-your-fine-tuned-model). In order to compute classification metrics, you must provide a `validation_file`. Additionally, you must specify `classification_n_classes` for multiclass classification or `classification_positive_class` for binary classification.



# File 'lib/openapi_openai/models/create_fine_tune_request.rb', line 39

def compute_classification_metrics
  @compute_classification_metrics
end

#learning_rate_multiplier ⇒ Object

The learning rate multiplier to use for training. The fine-tuning learning rate is the original learning rate used for pretraining multiplied by this value. By default, the learning rate multiplier is 0.05, 0.1, or 0.2 depending on the final `batch_size` (larger learning rates tend to perform better with larger batch sizes). We recommend experimenting with values in the range 0.02 to 0.2 to see what produces the best results.



# File 'lib/openapi_openai/models/create_fine_tune_request.rb', line 33

def learning_rate_multiplier
  @learning_rate_multiplier
end

#model ⇒ Object

Returns the value of attribute model.



# File 'lib/openapi_openai/models/create_fine_tune_request.rb', line 24

def model
  @model
end

#n_epochs ⇒ Object

The number of epochs to train the model for. An epoch refers to one full cycle through the training dataset.



# File 'lib/openapi_openai/models/create_fine_tune_request.rb', line 27

def n_epochs
  @n_epochs
end

#prompt_loss_weight ⇒ Object

The weight to use for loss on the prompt tokens. This controls how much the model tries to learn to generate the prompt (as compared to the completion which always has a weight of 1.0), and can add a stabilizing effect to training when completions are short. If prompts are extremely long (relative to completions), it may make sense to reduce this weight so as to avoid over-prioritizing learning the prompt.



# File 'lib/openapi_openai/models/create_fine_tune_request.rb', line 36

def prompt_loss_weight
  @prompt_loss_weight
end

#suffix ⇒ Object

A string of up to 40 characters that will be added to your fine-tuned model name. For example, a `suffix` of "custom-model-name" would produce a model name like `ada:ft-your-org:custom-model-name-2022-02-15-04-21-04`.



# File 'lib/openapi_openai/models/create_fine_tune_request.rb', line 51

def suffix
  @suffix
end

#training_file ⇒ Object

The ID of an uploaded file that contains training data. See [upload file](/docs/api-reference/files/upload) for how to upload a file. Your dataset must be formatted as a JSONL file, where each training example is a JSON object with the keys "prompt" and "completion". Additionally, you must upload your file with the purpose `fine-tune`. See the [fine-tuning guide](/docs/guides/fine-tuning/creating-training-data) for more details.



# File 'lib/openapi_openai/models/create_fine_tune_request.rb', line 19

def training_file
  @training_file
end

#validation_file ⇒ Object

The ID of an uploaded file that contains validation data. If you provide this file, the data is used to generate validation metrics periodically during fine-tuning. These metrics can be viewed in the [fine-tuning results file](/docs/guides/fine-tuning/analyzing-your-fine-tuned-model). Your train and validation data should be mutually exclusive. Your dataset must be formatted as a JSONL file, where each validation example is a JSON object with the keys "prompt" and "completion". Additionally, you must upload your file with the purpose `fine-tune`. See the [fine-tuning guide](/docs/guides/fine-tuning/creating-training-data) for more details.



# File 'lib/openapi_openai/models/create_fine_tune_request.rb', line 22

def validation_file
  @validation_file
end
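
The attributes above are generated straight from the OpenAPI schema. Assuming the generated writer methods, a brief sketch with purely illustrative (not recommended) values:

request = OpenApiOpenAIClient::CreateFineTuneRequest.new(training_file: 'file-abc123') # hypothetical ID
request.batch_size               = 8            # illustrative only
request.learning_rate_multiplier = 0.1          # illustrative only
request.classification_betas     = [0.5, 2.0]   # compute F-0.5 and F-2 scores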

Class Method Details

._deserialize(type, value) ⇒ Object

Deserializes the data based on type



# File 'lib/openapi_openai/models/create_fine_tune_request.rb', line 287

def self._deserialize(type, value)
  case type.to_sym
  when :Time
    Time.parse(value)
  when :Date
    Date.parse(value)
  when :String
    value.to_s
  when :Integer
    value.to_i
  when :Float
    value.to_f
  when :Boolean
    if value.to_s =~ /\A(true|t|yes|y|1)\z/i
      true
    else
      false
    end
  when :Object
    # generic object (usually a Hash), return directly
    value
  when /\AArray<(?<inner_type>.+)>\z/
    inner_type = Regexp.last_match[:inner_type]
    value.map { |v| _deserialize(inner_type, v) }
  when /\AHash<(?<k_type>.+?), (?<v_type>.+)>\z/
    k_type = Regexp.last_match[:k_type]
    v_type = Regexp.last_match[:v_type]
    {}.tap do |hash|
      value.each do |k, v|
        hash[_deserialize(k_type, k)] = _deserialize(v_type, v)
      end
    end
  else # model
    # models (e.g. Pet) or oneOf
    klass = OpenApiOpenAIClient.const_get(type)
    klass.respond_to?(:openapi_any_of) || klass.respond_to?(:openapi_one_of) ? klass.build(value) : klass.build_from_hash(value)
  end
end
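
A sketch of how _deserialize resolves primitive and container type names (input values are illustrative):

klass = OpenApiOpenAIClient::CreateFineTuneRequest
klass._deserialize(:Float, '0.1')                  # => 0.1
klass._deserialize(:Boolean, 'yes')                # => true
klass._deserialize('Array<Float>', ['0.5', '2'])   # => [0.5, 2.0]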

.acceptable_attributes ⇒ Object

Returns all the JSON keys this model knows about



# File 'lib/openapi_openai/models/create_fine_tune_request.rb', line 72

def self.acceptable_attributes
  attribute_map.values
end

.attribute_map ⇒ Object

Attribute mapping from ruby-style variable name to JSON key.



# File 'lib/openapi_openai/models/create_fine_tune_request.rb', line 54

def self.attribute_map
  {
    :'training_file' => :'training_file',
    :'validation_file' => :'validation_file',
    :'model' => :'model',
    :'n_epochs' => :'n_epochs',
    :'batch_size' => :'batch_size',
    :'learning_rate_multiplier' => :'learning_rate_multiplier',
    :'prompt_loss_weight' => :'prompt_loss_weight',
    :'compute_classification_metrics' => :'compute_classification_metrics',
    :'classification_n_classes' => :'classification_n_classes',
    :'classification_positive_class' => :'classification_positive_class',
    :'classification_betas' => :'classification_betas',
    :'suffix' => :'suffix'
  }
end

.build_from_hash(attributes) ⇒ Object

Builds the object from hash



# File 'lib/openapi_openai/models/create_fine_tune_request.rb', line 263

def self.build_from_hash(attributes)
  return nil unless attributes.is_a?(Hash)
  attributes = attributes.transform_keys(&:to_sym)
  transformed_hash = {}
  openapi_types.each_pair do |key, type|
    if attributes.key?(attribute_map[key]) && attributes[attribute_map[key]].nil?
      transformed_hash["#{key}"] = nil
    elsif type =~ /\AArray<(.*)>/i
      # check to ensure the input is an array given that the attribute
      # is documented as an array but the input is not
      if attributes[attribute_map[key]].is_a?(Array)
        transformed_hash["#{key}"] = attributes[attribute_map[key]].map { |v| _deserialize($1, v) }
      end
    elsif !attributes[attribute_map[key]].nil?
      transformed_hash["#{key}"] = _deserialize(type, attributes[attribute_map[key]])
    end
  end
  new(transformed_hash)
end
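
A sketch of building an instance from a JSON-style hash with string keys (the file ID is hypothetical):

payload = {
  'training_file'        => 'file-abc123',
  'classification_betas' => [0.5, 2.0]
}
request = OpenApiOpenAIClient::CreateFineTuneRequest.build_from_hash(payload)
request.classification_betas # => [0.5, 2.0]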

.openapi_nullable ⇒ Object

List of attributes with nullable: true



# File 'lib/openapi_openai/models/create_fine_tune_request.rb', line 95

def self.openapi_nullable
  Set.new([
    :'validation_file',
    :'model',
    :'n_epochs',
    :'batch_size',
    :'learning_rate_multiplier',
    :'prompt_loss_weight',
    :'compute_classification_metrics',
    :'classification_n_classes',
    :'classification_positive_class',
    :'classification_betas',
    :'suffix'
  ])
end

.openapi_types ⇒ Object

Attribute type mapping.



# File 'lib/openapi_openai/models/create_fine_tune_request.rb', line 77

def self.openapi_types
  {
    :'training_file' => :'String',
    :'validation_file' => :'String',
    :'model' => :'CreateFineTuneRequestModel',
    :'n_epochs' => :'Integer',
    :'batch_size' => :'Integer',
    :'learning_rate_multiplier' => :'Float',
    :'prompt_loss_weight' => :'Float',
    :'compute_classification_metrics' => :'Boolean',
    :'classification_n_classes' => :'Integer',
    :'classification_positive_class' => :'String',
    :'classification_betas' => :'Array<Float>',
    :'suffix' => :'String'
  }
end

Instance Method Details

#==(o) ⇒ Object

Checks equality by comparing each attribute.



# File 'lib/openapi_openai/models/create_fine_tune_request.rb', line 231

def ==(o)
  return true if self.equal?(o)
  self.class == o.class &&
      training_file == o.training_file &&
      validation_file == o.validation_file &&
      model == o.model &&
      n_epochs == o.n_epochs &&
      batch_size == o.batch_size &&
      learning_rate_multiplier == o.learning_rate_multiplier &&
      prompt_loss_weight == o.prompt_loss_weight &&
      compute_classification_metrics == o.compute_classification_metrics &&
      classification_n_classes == o.classification_n_classes &&
      classification_positive_class == o.classification_positive_class &&
      classification_betas == o.classification_betas &&
      suffix == o.suffix
end

#_to_hash(value) ⇒ Hash

Outputs non-array value in the form of hash. For object, use to_hash. Otherwise, just return the value.



# File 'lib/openapi_openai/models/create_fine_tune_request.rb', line 358

def _to_hash(value)
  if value.is_a?(Array)
    value.compact.map { |v| _to_hash(v) }
  elsif value.is_a?(Hash)
    {}.tap do |hash|
      value.each { |k, v| hash[k] = _to_hash(v) }
    end
  elsif value.respond_to? :to_hash
    value.to_hash
  else
    value
  end
end

#eql?(o) ⇒ Boolean

See Also:

  • `==` method


# File 'lib/openapi_openai/models/create_fine_tune_request.rb', line 250

def eql?(o)
  self == o
end

#hash ⇒ Integer

Calculates hash code according to all attributes.



# File 'lib/openapi_openai/models/create_fine_tune_request.rb', line 256

def hash
  [training_file, validation_file, model, n_epochs, batch_size, learning_rate_multiplier, prompt_loss_weight, compute_classification_metrics, classification_n_classes, classification_positive_class, classification_betas, suffix].hash
end

#list_invalid_propertiesObject

Show invalid properties with the reasons. Usually used together with valid?



# File 'lib/openapi_openai/models/create_fine_tune_request.rb', line 187

def list_invalid_properties
  warn '[DEPRECATED] the `list_invalid_properties` method is obsolete'
  invalid_properties = Array.new
  if @training_file.nil?
    invalid_properties.push('invalid value for "training_file", training_file cannot be nil.')
  end

  if !@suffix.nil? && @suffix.to_s.length > 40
    invalid_properties.push('invalid value for "suffix", the character length must be smaller than or equal to 40.')
  end

  if !@suffix.nil? && @suffix.to_s.length < 1
    invalid_properties.push('invalid value for "suffix", the character length must be greater than or equal to 1.')
  end

  invalid_properties
end

#to_body ⇒ Hash

to_body is an alias to to_hash (backward compatibility)



# File 'lib/openapi_openai/models/create_fine_tune_request.rb', line 334

def to_body
  to_hash
end

#to_hash ⇒ Hash

Returns the object in the form of hash



# File 'lib/openapi_openai/models/create_fine_tune_request.rb', line 340

def to_hash
  hash = {}
  self.class.attribute_map.each_pair do |attr, param|
    value = self.send(attr)
    if value.nil?
      is_nullable = self.class.openapi_nullable.include?(attr)
      next if !is_nullable || (is_nullable && !instance_variable_defined?(:"@#{attr}"))
    end

    hash[param] = _to_hash(value)
  end
  hash
end
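
A serialization sketch: attributes that were never assigned are omitted, while the required value and assigned defaults appear under their JSON keys (the file ID is hypothetical):

request = OpenApiOpenAIClient::CreateFineTuneRequest.new(training_file: 'file-abc123')
request.to_hash
# => {:training_file=>"file-abc123", :n_epochs=>4, :prompt_loss_weight=>0.01, :compute_classification_metrics=>false}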

#to_s ⇒ String

Returns the string representation of the object



# File 'lib/openapi_openai/models/create_fine_tune_request.rb', line 328

def to_s
  to_hash.to_s
end

#valid? ⇒ Boolean

Check to see if all the properties in the model are valid



# File 'lib/openapi_openai/models/create_fine_tune_request.rb', line 207

def valid?
  warn '[DEPRECATED] the `valid?` method is obsolete'
  return false if @training_file.nil?
  return false if !@suffix.nil? && @suffix.to_s.length > 40
  return false if !@suffix.nil? && @suffix.to_s.length < 1
  true
end
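
A validation sketch (both methods emit deprecation warnings but still return results); the file ID is hypothetical:

request = OpenApiOpenAIClient::CreateFineTuneRequest.new(training_file: 'file-abc123')
request.valid?                  # => true
request.list_invalid_properties # => []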