Class: Akeyless::CreateTokenizer
Inherits: Object
Defined in: lib/akeyless/models/create_tokenizer.rb
Overview
createTokenizer is a command that creates a tokenizer item.
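As a rough usage sketch (not part of the generated docs): the model is typically built with the snake_case attribute names listed below and handed to the SDK's create-tokenizer call. The Akeyless::V2Api class and its create_tokenizer method, as well as the placeholder values, are assumptions based on the usual layout of this generated client; check your installed SDK version.

  require 'akeyless'

  # Illustrative only: V2Api#create_tokenizer is assumed to be the generated
  # API method for this command.
  api  = Akeyless::V2Api.new
  body = Akeyless::CreateTokenizer.new(
    name: 'my-tokenizer',         # required
    template_type: 'Email',       # one of SSN, CreditCard, USPhoneNumber, Email, Regexp
    tokenizer_type: 'vaultless',  # default when omitted
    token: 't-placeholder-token'  # authentication token (see /auth and /configure)
  )
  result = api.create_tokenizer(body)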
Instance Attribute Summary
- #alphabet ⇒ Object: Alphabet to use in regexp vaultless tokenization.
- #decoding_template ⇒ Object: The Decoding output template to use in regexp vaultless tokenization.
- #delete_protection ⇒ Object: Protection from accidental deletion of this item [true/false].
- #description ⇒ Object: Description of the object.
- #encoding_template ⇒ Object: The Encoding output template to use in regexp vaultless tokenization.
- #encryption_key_name ⇒ Object: AES key name to use in vaultless tokenization.
- #json ⇒ Object: Set output format to JSON.
- #metadata ⇒ Object: Deprecated - use description.
- #name ⇒ Object: Tokenizer name.
- #pattern ⇒ Object: Pattern to use in regexp vaultless tokenization.
- #tag ⇒ Object: List of the tags attached to this key.
- #template_type ⇒ Object: Which template type this tokenizer is used for [SSN,CreditCard,USPhoneNumber,Email,Regexp].
- #token ⇒ Object: Authentication token (see `/auth` and `/configure`).
- #tokenizer_type ⇒ Object: Tokenizer type.
- #tweak_type ⇒ Object: The tweak type to use in vaultless tokenization [Supplied, Generated, Internal, Masking].
- #uid_token ⇒ Object: The universal identity token, required only for universal_identity authentication.
Class Method Summary
- .acceptable_attributes ⇒ Object: Returns all the JSON keys this model knows about.
- .attribute_map ⇒ Object: Attribute mapping from ruby-style variable name to JSON key.
- .build_from_hash(attributes) ⇒ Object: Builds the object from a hash.
- .openapi_nullable ⇒ Object: List of attributes with nullable: true.
- .openapi_types ⇒ Object: Attribute type mapping.
Instance Method Summary
- #==(o) ⇒ Object: Checks equality by comparing each attribute.
- #_deserialize(type, value) ⇒ Object: Deserializes the data based on type.
- #_to_hash(value) ⇒ Hash: Outputs a non-array value in the form of a hash; for objects, use to_hash.
- #build_from_hash(attributes) ⇒ Object: Builds the object from a hash.
- #eql?(o) ⇒ Boolean
- #hash ⇒ Integer: Calculates hash code according to all attributes.
- #initialize(attributes = {}) ⇒ CreateTokenizer (constructor): Initializes the object.
- #list_invalid_properties ⇒ Object: Shows invalid properties with the reasons.
- #to_body ⇒ Hash: to_body is an alias to to_hash (backward compatibility).
- #to_hash ⇒ Hash: Returns the object in the form of a hash.
- #to_s ⇒ String: Returns the string representation of the object.
- #valid? ⇒ Boolean: Checks whether all the properties in the model are valid.
Constructor Details
#initialize(attributes = {}) ⇒ CreateTokenizer
Initializes the object
# File 'lib/akeyless/models/create_tokenizer.rb', line 124

def initialize(attributes = {})
  if (!attributes.is_a?(Hash))
    fail ArgumentError, "The input argument (attributes) must be a hash in `Akeyless::CreateTokenizer` initialize method"
  end

  # check to see if the attribute exists and convert string to symbol for hash key
  attributes = attributes.each_with_object({}) { |(k, v), h|
    if (!self.class.attribute_map.key?(k.to_sym))
      fail ArgumentError, "`#{k}` is not a valid attribute in `Akeyless::CreateTokenizer`. Please check the name to make sure it's valid. List of attributes: " + self.class.attribute_map.keys.inspect
    end
    h[k.to_sym] = v
  }

  if attributes.key?(:'alphabet')
    self.alphabet = attributes[:'alphabet']
  end

  if attributes.key?(:'decoding_template')
    self.decoding_template = attributes[:'decoding_template']
  end

  if attributes.key?(:'delete_protection')
    self.delete_protection = attributes[:'delete_protection']
  end

  if attributes.key?(:'description')
    self.description = attributes[:'description']
  end

  if attributes.key?(:'encoding_template')
    self.encoding_template = attributes[:'encoding_template']
  end

  if attributes.key?(:'encryption_key_name')
    self.encryption_key_name = attributes[:'encryption_key_name']
  end

  if attributes.key?(:'json')
    self.json = attributes[:'json']
  else
    self.json = false
  end

  if attributes.key?(:'metadata')
    self.metadata = attributes[:'metadata']
  end

  if attributes.key?(:'name')
    self.name = attributes[:'name']
  end

  if attributes.key?(:'pattern')
    self.pattern = attributes[:'pattern']
  end

  if attributes.key?(:'tag')
    if (value = attributes[:'tag']).is_a?(Array)
      self.tag = value
    end
  end

  if attributes.key?(:'template_type')
    self.template_type = attributes[:'template_type']
  end

  if attributes.key?(:'token')
    self.token = attributes[:'token']
  end

  if attributes.key?(:'tokenizer_type')
    self.tokenizer_type = attributes[:'tokenizer_type']
  else
    self.tokenizer_type = 'vaultless'
  end

  if attributes.key?(:'tweak_type')
    self.tweak_type = attributes[:'tweak_type']
  end

  if attributes.key?(:'uid_token')
    self.uid_token = attributes[:'uid_token']
  end
end
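As the source above shows, the constructor accepts a hash keyed by the ruby-style (snake_case) attribute names, rejects unknown keys with an ArgumentError, and applies two defaults: json becomes false and tokenizer_type becomes 'vaultless'. A short illustration with placeholder values:

  tokenizer = Akeyless::CreateTokenizer.new(
    name: 'cc-tokenizer',
    template_type: 'CreditCard',
    tweak_type: 'Internal'
  )

  tokenizer.json            # => false       (default applied by initialize)
  tokenizer.tokenizer_type  # => "vaultless" (default applied by initialize)

  # Keys outside attribute_map are rejected:
  # Akeyless::CreateTokenizer.new(foo: 'bar')  # raises ArgumentError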
Instance Attribute Details
#alphabet ⇒ Object
Alphabet to use in regexp vaultless tokenization
# File 'lib/akeyless/models/create_tokenizer.rb', line 20

def alphabet
  @alphabet
end
#decoding_template ⇒ Object
The Decoding output template to use in regexp vaultless tokenization
# File 'lib/akeyless/models/create_tokenizer.rb', line 23

def decoding_template
  @decoding_template
end
#delete_protection ⇒ Object
Protection from accidental deletion of this item [true/false]
# File 'lib/akeyless/models/create_tokenizer.rb', line 26

def delete_protection
  @delete_protection
end
#description ⇒ Object
Description of the object
# File 'lib/akeyless/models/create_tokenizer.rb', line 29

def description
  @description
end
#encoding_template ⇒ Object
The Encoding output template to use in regexp vaultless tokenization
# File 'lib/akeyless/models/create_tokenizer.rb', line 32

def encoding_template
  @encoding_template
end
#encryption_key_name ⇒ Object
AES key name to use in vaultless tokenization
# File 'lib/akeyless/models/create_tokenizer.rb', line 35

def encryption_key_name
  @encryption_key_name
end
#json ⇒ Object
Set output format to JSON
# File 'lib/akeyless/models/create_tokenizer.rb', line 38

def json
  @json
end
#metadata ⇒ Object
Deprecated - use description
# File 'lib/akeyless/models/create_tokenizer.rb', line 41

def metadata
  @metadata
end
#name ⇒ Object
Tokenizer name
# File 'lib/akeyless/models/create_tokenizer.rb', line 44

def name
  @name
end
#pattern ⇒ Object
Pattern to use in regexp vaultless tokenization
# File 'lib/akeyless/models/create_tokenizer.rb', line 47

def pattern
  @pattern
end
#tag ⇒ Object
List of the tags attached to this key
# File 'lib/akeyless/models/create_tokenizer.rb', line 50

def tag
  @tag
end
#template_type ⇒ Object
Which template type this tokenizer is used for [SSN,CreditCard,USPhoneNumber,Email,Regexp]
# File 'lib/akeyless/models/create_tokenizer.rb', line 53

def template_type
  @template_type
end
#token ⇒ Object
Authentication token (see `/auth` and `/configure`)
# File 'lib/akeyless/models/create_tokenizer.rb', line 56

def token
  @token
end
#tokenizer_type ⇒ Object
Tokenizer type
# File 'lib/akeyless/models/create_tokenizer.rb', line 59

def tokenizer_type
  @tokenizer_type
end
#tweak_type ⇒ Object
The tweak type to use in vaultless tokenization [Supplied, Generated, Internal, Masking]
# File 'lib/akeyless/models/create_tokenizer.rb', line 62

def tweak_type
  @tweak_type
end
#uid_token ⇒ Object
The universal identity token, required only for universal_identity authentication
# File 'lib/akeyless/models/create_tokenizer.rb', line 65

def uid_token
  @uid_token
end
Class Method Details
.acceptable_attributes ⇒ Object
Returns all the JSON keys this model knows about
# File 'lib/akeyless/models/create_tokenizer.rb', line 90

def self.acceptable_attributes
  attribute_map.values
end
.attribute_map ⇒ Object
Attribute mapping from ruby-style variable name to JSON key.
# File 'lib/akeyless/models/create_tokenizer.rb', line 68

def self.attribute_map
  {
    :'alphabet' => :'alphabet',
    :'decoding_template' => :'decoding-template',
    :'delete_protection' => :'delete_protection',
    :'description' => :'description',
    :'encoding_template' => :'encoding-template',
    :'encryption_key_name' => :'encryption-key-name',
    :'json' => :'json',
    :'metadata' => :'metadata',
    :'name' => :'name',
    :'pattern' => :'pattern',
    :'tag' => :'tag',
    :'template_type' => :'template-type',
    :'token' => :'token',
    :'tokenizer_type' => :'tokenizer-type',
    :'tweak_type' => :'tweak-type',
    :'uid_token' => :'uid-token'
  }
end
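The map above is what translates the snake_case Ruby attribute names into the hyphenated JSON keys used in request bodies; .acceptable_attributes simply returns the JSON-key side of this map. For example:

  Akeyless::CreateTokenizer.attribute_map[:tweak_type]
  # => :"tweak-type"

  Akeyless::CreateTokenizer.acceptable_attributes.first(3)
  # => [:alphabet, :"decoding-template", :delete_protection]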
.build_from_hash(attributes) ⇒ Object
Builds the object from a hash
# File 'lib/akeyless/models/create_tokenizer.rb', line 274

def self.build_from_hash(attributes)
  new.build_from_hash(attributes)
end
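Because the lookup goes through attribute_map, the hash passed here is expected to use the JSON-side (hyphenated) keys, as they appear in a parsed API payload. A small sketch with placeholder values:

  payload = {
    'name'           => 'email-tokenizer',
    'template-type'  => 'Email',
    'tokenizer-type' => 'vaultless',
    'tag'            => ['pii']
  }

  obj = Akeyless::CreateTokenizer.build_from_hash(payload)
  obj.template_type  # => "Email"
  obj.tag            # => ["pii"]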
.openapi_nullable ⇒ Object
List of attributes with nullable: true
# File 'lib/akeyless/models/create_tokenizer.rb', line 117

def self.openapi_nullable
  Set.new([
  ])
end
.openapi_types ⇒ Object
Attribute type mapping.
# File 'lib/akeyless/models/create_tokenizer.rb', line 95

def self.openapi_types
  {
    :'alphabet' => :'String',
    :'decoding_template' => :'String',
    :'delete_protection' => :'String',
    :'description' => :'String',
    :'encoding_template' => :'String',
    :'encryption_key_name' => :'String',
    :'json' => :'Boolean',
    :'metadata' => :'String',
    :'name' => :'String',
    :'pattern' => :'String',
    :'tag' => :'Array<String>',
    :'template_type' => :'String',
    :'token' => :'String',
    :'tokenizer_type' => :'String',
    :'tweak_type' => :'String',
    :'uid_token' => :'String'
  }
end
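These declared types are what build_from_hash and _deserialize use when coercing raw payload values; for instance, tag is deserialized element by element as an Array<String> and json as a Boolean:

  Akeyless::CreateTokenizer.openapi_types[:tag]
  # => :"Array<String>"

  Akeyless::CreateTokenizer.openapi_types[:json]
  # => :Boolean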
Instance Method Details
#==(o) ⇒ Object
Checks equality by comparing each attribute.
# File 'lib/akeyless/models/create_tokenizer.rb', line 238

def ==(o)
  return true if self.equal?(o)
  self.class == o.class &&
      alphabet == o.alphabet &&
      decoding_template == o.decoding_template &&
      delete_protection == o.delete_protection &&
      description == o.description &&
      encoding_template == o.encoding_template &&
      encryption_key_name == o.encryption_key_name &&
      json == o.json &&
      metadata == o.metadata &&
      name == o.name &&
      pattern == o.pattern &&
      tag == o.tag &&
      template_type == o.template_type &&
      token == o.token &&
      tokenizer_type == o.tokenizer_type &&
      tweak_type == o.tweak_type &&
      uid_token == o.uid_token
end
#_deserialize(type, value) ⇒ Object
Deserializes the data based on type
# File 'lib/akeyless/models/create_tokenizer.rb', line 305

def _deserialize(type, value)
  case type.to_sym
  when :Time
    Time.parse(value)
  when :Date
    Date.parse(value)
  when :String
    value.to_s
  when :Integer
    value.to_i
  when :Float
    value.to_f
  when :Boolean
    if value.to_s =~ /\A(true|t|yes|y|1)\z/i
      true
    else
      false
    end
  when :Object
    # generic object (usually a Hash), return directly
    value
  when /\AArray<(?<inner_type>.+)>\z/
    inner_type = Regexp.last_match[:inner_type]
    value.map { |v| _deserialize(inner_type, v) }
  when /\AHash<(?<k_type>.+?), (?<v_type>.+)>\z/
    k_type = Regexp.last_match[:k_type]
    v_type = Regexp.last_match[:v_type]
    {}.tap do |hash|
      value.each do |k, v|
        hash[_deserialize(k_type, k)] = _deserialize(v_type, v)
      end
    end
  else # model
    # models (e.g. Pet) or oneOf
    klass = Akeyless.const_get(type)
    klass.respond_to?(:openapi_one_of) ? klass.build(value) : klass.build_from_hash(value)
  end
end
#_to_hash(value) ⇒ Hash
Outputs a non-array value in the form of a hash. For objects, use to_hash; otherwise, just return the value.
# File 'lib/akeyless/models/create_tokenizer.rb', line 376

def _to_hash(value)
  if value.is_a?(Array)
    value.compact.map { |v| _to_hash(v) }
  elsif value.is_a?(Hash)
    {}.tap do |hash|
      value.each { |k, v| hash[k] = _to_hash(v) }
    end
  elsif value.respond_to? :to_hash
    value.to_hash
  else
    value
  end
end
#build_from_hash(attributes) ⇒ Object
Builds the object from a hash
# File 'lib/akeyless/models/create_tokenizer.rb', line 281

def build_from_hash(attributes)
  return nil unless attributes.is_a?(Hash)
  attributes = attributes.transform_keys(&:to_sym)
  self.class.openapi_types.each_pair do |key, type|
    if attributes[self.class.attribute_map[key]].nil? && self.class.openapi_nullable.include?(key)
      self.send("#{key}=", nil)
    elsif type =~ /\AArray<(.*)>/i
      # check to ensure the input is an array given that the attribute
      # is documented as an array but the input is not
      if attributes[self.class.attribute_map[key]].is_a?(Array)
        self.send("#{key}=", attributes[self.class.attribute_map[key]].map { |v| _deserialize($1, v) })
      end
    elsif !attributes[self.class.attribute_map[key]].nil?
      self.send("#{key}=", _deserialize(type, attributes[self.class.attribute_map[key]]))
    end
  end
  self
end
#eql?(o) ⇒ Boolean
# File 'lib/akeyless/models/create_tokenizer.rb', line 261

def eql?(o)
  self == o
end
#hash ⇒ Integer
Calculates hash code according to all attributes.
# File 'lib/akeyless/models/create_tokenizer.rb', line 267

def hash
  [alphabet, decoding_template, delete_protection, description, encoding_template, encryption_key_name, json, metadata, name, pattern, tag, template_type, token, tokenizer_type, tweak_type, uid_token].hash
end
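Since ==, eql? and hash are all derived from the full attribute list, two instances built with the same attributes compare equal and hash identically, for example:

  a = Akeyless::CreateTokenizer.new(name: 't1', template_type: 'SSN')
  b = Akeyless::CreateTokenizer.new(name: 't1', template_type: 'SSN')

  a == b            # => true
  a.eql?(b)         # => true
  a.hash == b.hash  # => true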
#list_invalid_properties ⇒ Object
Show invalid properties with the reasons. Usually used together with valid?
# File 'lib/akeyless/models/create_tokenizer.rb', line 210

def list_invalid_properties
  invalid_properties = Array.new
  if @name.nil?
    invalid_properties.push('invalid value for "name", name cannot be nil.')
  end

  if @template_type.nil?
    invalid_properties.push('invalid value for "template_type", template_type cannot be nil.')
  end

  if @tokenizer_type.nil?
    invalid_properties.push('invalid value for "tokenizer_type", tokenizer_type cannot be nil.')
  end

  invalid_properties
end
#to_body ⇒ Hash
to_body is an alias to to_hash (backward compatibility)
# File 'lib/akeyless/models/create_tokenizer.rb', line 352

def to_body
  to_hash
end
#to_hash ⇒ Hash
Returns the object in the form of hash
# File 'lib/akeyless/models/create_tokenizer.rb', line 358

def to_hash
  hash = {}
  self.class.attribute_map.each_pair do |attr, param|
    value = self.send(attr)
    if value.nil?
      is_nullable = self.class.openapi_nullable.include?(attr)
      next if !is_nullable || (is_nullable && !instance_variable_defined?(:"@#{attr}"))
    end

    hash[param] = _to_hash(value)
  end
  hash
end
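Note that the resulting hash is keyed by the JSON names from attribute_map and skips attributes that are still nil (none of them are declared nullable); this is the request-body shape ultimately serialized by the client. With placeholder values:

  Akeyless::CreateTokenizer.new(name: 'ssn-tokenizer', template_type: 'SSN').to_hash
  # => {:json=>false, :name=>"ssn-tokenizer", :"template-type"=>"SSN", :"tokenizer-type"=>"vaultless"}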
#to_s ⇒ String
Returns the string representation of the object
# File 'lib/akeyless/models/create_tokenizer.rb', line 346

def to_s
  to_hash.to_s
end
#valid? ⇒ Boolean
Check to see if all the properties in the model are valid
# File 'lib/akeyless/models/create_tokenizer.rb', line 229

def valid?
  return false if @name.nil?
  return false if @template_type.nil?
  return false if @tokenizer_type.nil?
  true
end
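In practice valid? fails exactly when one of the required fields (name, template_type, tokenizer_type) is nil, and list_invalid_properties reports which one; remember that tokenizer_type is defaulted by the constructor. For example, with a placeholder tokenizer name:

  t = Akeyless::CreateTokenizer.new(name: 'my-tokenizer')
  t.valid?
  # => false

  t.list_invalid_properties
  # => ["invalid value for \"template_type\", template_type cannot be nil."]

  t.template_type = 'Email'
  t.valid?
  # => true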