Class: TencentCloud::Ccc::V20200210::ControlAIConversationRequest

Inherits:
  TencentCloud::Common::AbstractModel < Object

Defined in:
  lib/v20200210/models.rb

Overview

ControlAIConversation request parameter structure.

Instance Attribute Summary

Instance Method Summary

Constructor Details

#initialize(sessionid = nil, sdkappid = nil, command = nil, serverpushtext = nil, invokellm = nil) ⇒ ControlAIConversationRequest

Returns a new instance of ControlAIConversationRequest.



# File 'lib/v20200210/models.rb', line 1045

def initialize(sessionid=nil, sdkappid=nil, command=nil, serverpushtext=nil, invokellm=nil)
  @SessionId = sessionid
  @SdkAppId = sdkappid
  @Command = command
  @ServerPushText = serverpushtext
  @InvokeLLM = invokellm
end
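A hedged usage sketch of building the request body the constructor above represents. The field names follow the attributes of this class; the session id and SDK app id values are placeholders, and actually sending the request through a CCC client is assumed from the wider SDK, not shown on this page.

```ruby
require 'json'

# Placeholder values; in real use these come from your CCC application
# and the active AI conversation.
params = {
  'SessionId' => 'your-session-id',
  'SdkAppId'  => 1400000000,
  'Command'   => 'ServerPushText'
}

# The JSON request body the SDK would serialize and send.
puts JSON.generate(params)
```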

Instance Attribute Details

#Command ⇒ Object

  • ServerPushText: the server sends text to the AI bot, and the bot reads the text aloud

  • InvokeLLM: the server sends text to the large language model to trigger a dialogue turn

Parameters:

  • ServerPushText:

    The server-push playback-text command; required when Command is ServerPushText

  • InvokeLLM:

    The server proactively requests the large language model; when Command is InvokeLLM, the content is sent to the model with the header X-Invoke-LLM="1" added



# File 'lib/v20200210/models.rb', line 1043

def Command
  @Command
end

#InvokeLLM ⇒ Object

  • ServerPushText: the server sends text to the AI bot, and the bot reads the text aloud

  • InvokeLLM: the server sends text to the large language model to trigger a dialogue turn

Parameters:

  • ServerPushText:

    The server-push playback-text command; required when Command is ServerPushText

  • InvokeLLM:

    The server proactively requests the large language model; when Command is InvokeLLM, the content is sent to the model with the header X-Invoke-LLM="1" added



# File 'lib/v20200210/models.rb', line 1043

def InvokeLLM
  @InvokeLLM
end

#SdkAppId ⇒ Object

  • ServerPushText: the server sends text to the AI bot, and the bot reads the text aloud

  • InvokeLLM: the server sends text to the large language model to trigger a dialogue turn

Parameters:

  • ServerPushText:

    The server-push playback-text command; required when Command is ServerPushText

  • InvokeLLM:

    The server proactively requests the large language model; when Command is InvokeLLM, the content is sent to the model with the header X-Invoke-LLM="1" added



# File 'lib/v20200210/models.rb', line 1043

def SdkAppId
  @SdkAppId
end

#ServerPushText ⇒ Object

  • ServerPushText: the server sends text to the AI bot, and the bot reads the text aloud

  • InvokeLLM: the server sends text to the large language model to trigger a dialogue turn

Parameters:

  • ServerPushText:

    The server-push playback-text command; required when Command is ServerPushText

  • InvokeLLM:

    The server proactively requests the large language model; when Command is InvokeLLM, the content is sent to the model with the header X-Invoke-LLM="1" added



# File 'lib/v20200210/models.rb', line 1043

def ServerPushText
  @ServerPushText
end

#SessionId ⇒ Object

  • ServerPushText: the server sends text to the AI bot, and the bot reads the text aloud

  • InvokeLLM: the server sends text to the large language model to trigger a dialogue turn

Parameters:

  • ServerPushText:

    The server-push playback-text command; required when Command is ServerPushText

  • InvokeLLM:

    The server proactively requests the large language model; when Command is InvokeLLM, the content is sent to the model with the header X-Invoke-LLM="1" added



# File 'lib/v20200210/models.rb', line 1043

def SessionId
  @SessionId
end

Instance Method Details

#deserialize(params) ⇒ Object



# File 'lib/v20200210/models.rb', line 1053

def deserialize(params)
  @SessionId = params['SessionId']
  @SdkAppId = params['SdkAppId']
  @Command = params['Command']
  unless params['ServerPushText'].nil?
    @ServerPushText = ServerPushText.new
    @ServerPushText.deserialize(params['ServerPushText'])
  end
  unless params['InvokeLLM'].nil?
    @InvokeLLM = InvokeLLM.new
    @InvokeLLM.deserialize(params['InvokeLLM'])
  end
end
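The pattern in deserialize above is that scalar fields (SessionId, SdkAppId, Command) are copied straight from the params hash, while nested fields (ServerPushText, InvokeLLM) get a model object of their own whose deserialize is called recursively. A minimal, self-contained sketch of that pattern, where StubPushText stands in for the real ServerPushText model class (defined elsewhere in models.rb) and its 'Text' field name is an assumption for illustration:

```ruby
# Stand-in for the real ServerPushText model class; the 'Text' field
# name is an assumption, not taken from this page.
class StubPushText
  attr_reader :text
  def deserialize(params)
    @text = params['Text']
  end
end

params = {
  'SessionId'      => 'sess-001',
  'Command'        => 'ServerPushText',
  'ServerPushText' => { 'Text' => 'hello' }
}

session_id = params['SessionId']          # scalar: copied directly
push_text = nil
unless params['ServerPushText'].nil?      # nested: build a model, recurse
  push_text = StubPushText.new
  push_text.deserialize(params['ServerPushText'])
end

puts session_id       # prints "sess-001"
puts push_text.text   # prints "hello"
```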