Class: GRPC::ActiveCall
- Inherits: Object
- Extended by:
- Forwardable
- Includes:
- Core::CallOps, Core::TimeConsts
- Defined in:
- src/ruby/lib/grpc/generic/active_call.rb
Overview
The ActiveCall class provides simple methods for sending marshallable data to a call
Instance Attribute Summary
-
#deadline ⇒ Object
readonly
Returns the value of attribute deadline.
-
#metadata_sent ⇒ Object
readonly
Returns the value of attribute metadata_sent.
-
#metadata_to_send ⇒ Object
readonly
Returns the value of attribute metadata_to_send.
-
#peer ⇒ Object
readonly
Returns the value of attribute peer.
-
#peer_cert ⇒ Object
readonly
Returns the value of attribute peer_cert.
Class Method Summary
-
.client_invoke(call, metadata = {}) ⇒ Object
client_invoke begins a client invocation.
Instance Method Summary
- #attach_peer_cert(peer_cert) ⇒ Object
- #attach_status_results_and_complete_call(recv_status_batch_result) ⇒ Object
-
#bidi_streamer(requests, metadata: {}, &blk) ⇒ Enumerator?
bidi_streamer sends a stream of requests to the GRPC server, and yields a stream of responses.
-
#cancelled? ⇒ Boolean
cancelled indicates if the call was cancelled.
-
#client_streamer(requests, metadata: {}) ⇒ Object
client_streamer sends a stream of requests to a GRPC server, and returns a single response.
-
#each_remote_read ⇒ Enumerator
each_remote_read passes each response to the given block or returns an enumerator of the responses if no block is given.
-
#each_remote_read_then_finish ⇒ Enumerator
each_remote_read_then_finish passes each response to the given block or returns an enumerator of the responses if no block is given.
- #get_message_from_batch_result(recv_message_batch_result) ⇒ Object
-
#initialize(call, marshal, unmarshal, deadline, started: true, metadata_received: false, metadata_to_send: nil) ⇒ ActiveCall
constructor
Creates an ActiveCall.
-
#interceptable ⇒ InterceptableView
Returns a restricted view of this ActiveCall for use in interceptors.
-
#merge_metadata_to_send(new_metadata = {}) ⇒ Object
Add to the metadata that will be sent from the server.
-
#multi_req_view ⇒ Object
multi_req_view provides a restricted view of this ActiveCall for use in a server client-streaming handler.
-
#op_is_done ⇒ Object
Signals that an operation is done.
-
#operation ⇒ Object
operation provides a restricted view of this ActiveCall for use as an Operation.
-
#output_metadata ⇒ Object
output_metadata provides access to a hash that can be used to save metadata to be sent as a trailer.
-
#read_unary_request ⇒ Object
Intended for use on server-side calls when a single request from the client is expected (i.e., unary and server-streaming RPC types).
- #receive_and_check_status ⇒ Object
-
#remote_read ⇒ Object
remote_read reads a response from the remote endpoint.
-
#remote_send(req, marshalled = false) ⇒ Object
remote_send sends a request to the remote endpoint.
-
#request_response(req, metadata: {}) ⇒ Object
request_response sends a request to a GRPC server, and returns the response.
-
#run_server_bidi(mth, interception_ctx) ⇒ Object
run_server_bidi orchestrates a BiDi stream processing on a server.
-
#send_initial_metadata(new_metadata = {}) ⇒ Object
Sends the initial metadata that has yet to be sent.
-
#send_status(code = OK, details = '', assert_finished = false, metadata: {}) ⇒ Object
send_status sends a status to the remote endpoint.
-
#server_streamer(req, metadata: {}) ⇒ Enumerator|nil
server_streamer sends one request to the GRPC server, which yields a stream of responses.
- #server_unary_response(req, trailing_metadata: {}, code: Core::StatusCodes::OK, details: 'OK') ⇒ Object
-
#single_req_view ⇒ Object
single_req_view provides a restricted view of this ActiveCall for use in a server request-response handler.
-
#wait ⇒ Object
Waits till an operation completes.
Methods included from Core::TimeConsts
Constructor Details
#initialize(call, marshal, unmarshal, deadline, started: true, metadata_received: false, metadata_to_send: nil) ⇒ ActiveCall
Creates an ActiveCall.
ActiveCall should only be created after a call is accepted. That means different things on a client and a server. On the client, the call is accepted after calling call.invoke. On the server, this is after call.accept.
#initialize cannot determine if the call is accepted or not; so if a call that’s not accepted is used here, the error won’t be visible until the ActiveCall methods are called.
deadline is the absolute deadline for the call.
# File 'src/ruby/lib/grpc/generic/active_call.rb', line 89

def initialize(call, marshal, unmarshal, deadline, started: true,
               metadata_received: false, metadata_to_send: nil)
  fail(TypeError, '!Core::Call') unless call.is_a? Core::Call
  @call = call
  @deadline = deadline
  @marshal = marshal
  @unmarshal = unmarshal
  @metadata_received = metadata_received
  @metadata_sent = started
  @op_notifier = nil

  fail(ArgumentError, 'Already sent md') if started && metadata_to_send
  @metadata_to_send = metadata_to_send || {} unless started
  @send_initial_md_mutex = Mutex.new

  @output_stream_done = false
  @input_stream_done = false
  @call_finished = false
  @call_finished_mu = Mutex.new

  @client_call_executed = false
  @client_call_executed_mu = Mutex.new

  # set the peer now so that the accessor can still function
  # after the server closes the call
  @peer = call.peer
end
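The marshal and unmarshal arguments are plain callables that convert between objects and wire bytes; protobuf-generated classes would supply their own encode/decode. A minimal sketch using Ruby's built-in Marshal module purely for illustration (real stubs would not use it):

```ruby
# marshal/unmarshal are any callables; Marshal here is an illustrative
# stand-in for a real codec such as a protobuf class's encode/decode
marshal   = proc { |obj| Marshal.dump(obj) }
unmarshal = proc { |bytes| Marshal.load(bytes) }

payload  = marshal.call(a: 1, b: 'two')  # object -> String of bytes
restored = unmarshal.call(payload)       # bytes  -> object
```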
Instance Attribute Details
#deadline ⇒ Object (readonly)
Returns the value of attribute deadline.
# File 'src/ruby/lib/grpc/generic/active_call.rb', line 47

def deadline
  @deadline
end
#metadata_sent ⇒ Object (readonly)
Returns the value of attribute metadata_sent.
# File 'src/ruby/lib/grpc/generic/active_call.rb', line 47

def metadata_sent
  @metadata_sent
end
#metadata_to_send ⇒ Object (readonly)
Returns the value of attribute metadata_to_send.
# File 'src/ruby/lib/grpc/generic/active_call.rb', line 47

def metadata_to_send
  @metadata_to_send
end
#peer ⇒ Object (readonly)
Returns the value of attribute peer.
# File 'src/ruby/lib/grpc/generic/active_call.rb', line 47

def peer
  @peer
end
#peer_cert ⇒ Object (readonly)
Returns the value of attribute peer_cert.
# File 'src/ruby/lib/grpc/generic/active_call.rb', line 47

def peer_cert
  @peer_cert
end
Class Method Details
.client_invoke(call, metadata = {}) ⇒ Object
client_invoke begins a client invocation.
Flow Control note: this blocks until flow control accepts that client request can go ahead.
deadline is the absolute deadline for the call.
Keyword Arguments ==
any keyword arguments are treated as metadata to be sent to the server; if a keyword value is a list, multiple metadata for its key are sent
# File 'src/ruby/lib/grpc/generic/active_call.rb', line 64

def self.client_invoke(call, metadata = {})
  fail(TypeError, '!Core::Call') unless call.is_a? Core::Call
  call.run_batch(SEND_INITIAL_METADATA => metadata)
end
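A metadata hash maps each key to a single value or to a list; a list fans out into multiple metadata entries for the same key. A plain-Ruby sketch of that expansion (the transport does this natively):

```ruby
# list-valued entries produce multiple metadata pairs for the same key
metadata = { 'x-trace-id' => 'abc123', 'x-tag' => %w[a b] }
pairs = metadata.flat_map { |k, v| Array(v).map { |e| [k, e] } }
# pairs => [["x-trace-id", "abc123"], ["x-tag", "a"], ["x-tag", "b"]]
```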
Instance Method Details
#attach_peer_cert(peer_cert) ⇒ Object
# File 'src/ruby/lib/grpc/generic/active_call.rb', line 579

def attach_peer_cert(peer_cert)
  @peer_cert = peer_cert
end
#attach_status_results_and_complete_call(recv_status_batch_result) ⇒ Object
# File 'src/ruby/lib/grpc/generic/active_call.rb', line 173

def attach_status_results_and_complete_call(recv_status_batch_result)
  unless recv_status_batch_result.status.nil?
    @call.trailing_metadata = recv_status_batch_result.status.metadata
  end
  @call.status = recv_status_batch_result.status

  # The RECV_STATUS in run_batch always succeeds
  # Check the status for a bad status or failed run batch
  recv_status_batch_result.check_status
end
#bidi_streamer(requests, metadata: {}, &blk) ⇒ Enumerator?
bidi_streamer sends a stream of requests to the GRPC server, and yields a stream of responses.
This method takes an Enumerable of requests, and returns an enumerable of responses.
requests ==
requests provides an ‘iterable’ of Requests. I.e. it follows Ruby’s #each enumeration protocol. In the simplest case, requests will be an array of marshallable objects; in typical case it will be an Enumerable that allows dynamic construction of the marshallable objects.
responses ==
This is an enumerator of responses. I.e, its #next method blocks waiting for the next response. Also, if at any point the block needs to consume all the remaining responses, this can be done using #each or #collect. Calling #each or #collect should only be done if the_call#writes_done has been called, otherwise the block will loop forever.
If a metadata value is a list, multiple metadata for its key are sent.
# File 'src/ruby/lib/grpc/generic/active_call.rb', line 494

def bidi_streamer(requests, metadata: {}, &blk)
  raise_error_if_already_executed
  # Metadata might have already been sent if this is an operation view
  begin
    send_initial_metadata(metadata)
  rescue GRPC::Core::CallError => e
    batch_result = @call.run_batch(RECV_STATUS_ON_CLIENT => nil)
    set_input_stream_done
    set_output_stream_done
    attach_status_results_and_complete_call(batch_result)
    raise e
  rescue => e
    set_input_stream_done
    set_output_stream_done
    raise e
  end

  bd = BidiCall.new(@call,
                    @marshal,
                    @unmarshal,
                    metadata_received: @metadata_received)

  bd.run_on_client(requests,
                   proc { set_input_stream_done },
                   proc { set_output_stream_done },
                   &blk)
end
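As noted above, the requests argument only needs to respond to #each; an Enumerator makes it easy to produce requests lazily, e.g. as input arrives. A minimal sketch:

```ruby
# requests can be any #each-able object; an Enumerator yields them lazily
requests = Enumerator.new do |y|
  3.times { |i| y << "req-#{i}" }  # could instead block on a queue
end
requests.to_a # => ["req-0", "req-1", "req-2"]
```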
#cancelled? ⇒ Boolean
cancelled indicates if the call was cancelled
# File 'src/ruby/lib/grpc/generic/active_call.rb', line 135

def cancelled?
  !@call.status.nil? && @call.status.code == Core::StatusCodes::CANCELLED
end
#client_streamer(requests, metadata: {}) ⇒ Object
client_streamer sends a stream of requests to a GRPC server, and returns a single response.
requests provides an ‘iterable’ of Requests. I.e. it follows Ruby’s #each enumeration protocol. In the simplest case, requests will be an array of marshallable objects; in typical case it will be an Enumerable that allows dynamic construction of the marshallable objects.
If a metadata value is a list, multiple metadata for its key are sent.
# File 'src/ruby/lib/grpc/generic/active_call.rb', line 393

def client_streamer(requests, metadata: {})
  raise_error_if_already_executed
  begin
    send_initial_metadata(metadata)
    requests.each { |r| @call.run_batch(SEND_MESSAGE => @marshal.call(r)) }
  rescue GRPC::Core::CallError => e
    receive_and_check_status # check for Cancelled
    raise e
  rescue => e
    set_input_stream_done
    raise e
  ensure
    set_output_stream_done
  end

  batch_result = @call.run_batch(
    SEND_CLOSE_FROM_CLIENT => nil,
    RECV_INITIAL_METADATA => nil,
    RECV_MESSAGE => nil,
    RECV_STATUS_ON_CLIENT => nil
  )

  set_input_stream_done

  @call.metadata = batch_result.metadata
  attach_status_results_and_complete_call(batch_result)
  get_message_from_batch_result(batch_result)
end
#each_remote_read ⇒ Enumerator
each_remote_read passes each response to the given block or returns an enumerator of the responses if no block is given. Used to generate the request enumerable for server-side client-streaming RPCs.
Enumerator ==
-
#next blocks until the remote endpoint sends a READ or FINISHED
-
for each read, enumerator#next yields the response
-
on status
* if the status is OK, enumerator#next raises StopIteration
* if the status is not OK, enumerator#next raises a RuntimeError
Block ==
-
if provided it is executed for each response
-
the call blocks until no more responses are provided
# File 'src/ruby/lib/grpc/generic/active_call.rb', line 290

def each_remote_read
  return enum_for(:each_remote_read) unless block_given?
  begin
    loop do
      resp = remote_read
      break if resp.nil? # the last response was received
      yield resp
    end
  ensure
    set_input_stream_done
  end
end
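The `return enum_for(...) unless block_given?` idiom above is plain Ruby, not gRPC-specific. A self-contained sketch of the same pattern with a hypothetical ResponseSource class:

```ruby
class ResponseSource
  def initialize(responses)
    @responses = responses
  end

  # same idiom as each_remote_read: with no block, hand back an Enumerator
  # whose #next lazily pulls the following response
  def each_response
    return enum_for(:each_response) unless block_given?
    @responses.each { |r| yield r }
  end
end

enum = ResponseSource.new(%w[r1 r2]).each_response
enum.next # => "r1"
```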
#each_remote_read_then_finish ⇒ Enumerator
each_remote_read_then_finish passes each response to the given block or returns an enumerator of the responses if no block is given.
It is like each_remote_read, but it blocks on finishing on detecting the final message.
Enumerator ==
-
#next blocks until the remote endpoint sends a READ or FINISHED
-
for each read, enumerator#next yields the response
-
on status
* if the status is OK, enumerator#next raises StopIteration
* if the status is not OK, enumerator#next raises a RuntimeError
Block ==
-
if provided it is executed for each response
-
the call blocks until no more responses are provided
# File 'src/ruby/lib/grpc/generic/active_call.rb', line 323

def each_remote_read_then_finish
  return enum_for(:each_remote_read_then_finish) unless block_given?
  begin
    loop do
      resp =
        begin
          remote_read
        rescue GRPC::Core::CallError => e
          GRPC.logger.warn("In each_remote_read_then_finish: #{e}")
          nil
        end

      break if resp.nil? # the last response was received
      yield resp
    end

    receive_and_check_status
  ensure
    set_input_stream_done
  end
end
#get_message_from_batch_result(recv_message_batch_result) ⇒ Object
# File 'src/ruby/lib/grpc/generic/active_call.rb', line 262

def get_message_from_batch_result(recv_message_batch_result)
  unless recv_message_batch_result.nil? ||
         recv_message_batch_result.message.nil?
    return @unmarshal.call(recv_message_batch_result.message)
  end
  GRPC.logger.debug('found nil; the final response has been sent')
  nil
end
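The nil check above distinguishes a batch that delivered a message from one that carried only the final status. A sketch of the same logic with a hypothetical BatchResult struct (the real object comes from Core::Call#run_batch):

```ruby
# hypothetical stand-in for the run_batch result; message is nil when the
# server sent only a status, with no payload
BatchResult = Struct.new(:message, :metadata)
unmarshal = proc { |bytes| bytes.upcase }

def response_from(result, unmarshal)
  return nil if result.nil? || result.message.nil? # final status, no payload
  unmarshal.call(result.message)
end

response_from(BatchResult.new('ok', {}), unmarshal) # => "OK"
response_from(BatchResult.new(nil, {}), unmarshal)  # => nil
```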
#interceptable ⇒ InterceptableView
Returns a restricted view of this ActiveCall for use in interceptors
# File 'src/ruby/lib/grpc/generic/active_call.rb', line 163

def interceptable
  InterceptableView.new(self)
end
#merge_metadata_to_send(new_metadata = {}) ⇒ Object
Add to the metadata that will be sent from the server. Fails if metadata has already been sent. Unused by client calls.
# File 'src/ruby/lib/grpc/generic/active_call.rb', line 572

def merge_metadata_to_send(new_metadata = {})
  @send_initial_md_mutex.synchronize do
    fail('cant change metadata after already sent') if @metadata_sent
    @metadata_to_send.merge!(new_metadata)
  end
end
#multi_req_view ⇒ Object
multi_req_view provides a restricted view of this ActiveCall for use in a server client-streaming handler.
# File 'src/ruby/lib/grpc/generic/active_call.rb', line 141

def multi_req_view
  MultiReqView.new(self)
end
#op_is_done ⇒ Object
Signals that an operation is done. Only relevant on the client-side (this is a no-op on the server-side)
# File 'src/ruby/lib/grpc/generic/active_call.rb', line 564

def op_is_done
  return if @op_notifier.nil?
  @op_notifier.notify(self)
end
#operation ⇒ Object
operation provides a restricted view of this ActiveCall for use as an Operation.
# File 'src/ruby/lib/grpc/generic/active_call.rb', line 153

def operation
  @op_notifier = Notifier.new
  Operation.new(self)
end
#output_metadata ⇒ Object
output_metadata provides access to a hash that can be used to save metadata to be sent as a trailer.
# File 'src/ruby/lib/grpc/generic/active_call.rb', line 130

def output_metadata
  @output_metadata ||= {}
end
#read_unary_request ⇒ Object
Intended for use on server-side calls when a single request from the client is expected (i.e., unary and server-streaming RPC types).
# File 'src/ruby/lib/grpc/generic/active_call.rb', line 221

def read_unary_request
  req = remote_read
  set_input_stream_done
  req
end
#receive_and_check_status ⇒ Object
# File 'src/ruby/lib/grpc/generic/active_call.rb', line 167

def receive_and_check_status
  batch_result = @call.run_batch(RECV_STATUS_ON_CLIENT => nil)
  set_input_stream_done
  attach_status_results_and_complete_call(batch_result)
end
#remote_read ⇒ Object
remote_read reads a response from the remote endpoint.
It blocks until the remote endpoint replies with a message or status. On receiving a message, it returns the response after unmarshalling it. On receiving a status, it returns nil if the status is OK, otherwise raising BadStatus
# File 'src/ruby/lib/grpc/generic/active_call.rb', line 251

def remote_read
  ops = { RECV_MESSAGE => nil }
  ops[RECV_INITIAL_METADATA] = nil unless @metadata_received
  batch_result = @call.run_batch(ops)
  unless @metadata_received
    @call.metadata = batch_result.metadata
    @metadata_received = true
  end
  get_message_from_batch_result(batch_result)
end
#remote_send(req, marshalled = false) ⇒ Object
remote_send sends a request to the remote endpoint.
It blocks until the remote endpoint accepts the message.
marshalled indicates whether req has already been marshalled.
# File 'src/ruby/lib/grpc/generic/active_call.rb', line 191

def remote_send(req, marshalled = false)
  GRPC.logger.debug("sending #{req}, marshalled? #{marshalled}")
  payload = marshalled ? req : @marshal.call(req)
  @call.run_batch(SEND_MESSAGE => payload)
end
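The marshalled flag lets a caller that already holds wire bytes skip re-encoding. A pure-Ruby sketch of that payload choice, with a trivial stand-in marshal callable:

```ruby
# when marshalled is true the request is sent as-is; otherwise the
# marshal callable encodes it first (to_s is an illustrative stand-in)
marshal = proc { |obj| obj.to_s }
pick_payload = proc { |req, marshalled| marshalled ? req : marshal.call(req) }

pick_payload.call(42, false)   # => "42" (encoded)
pick_payload.call('raw', true) # => "raw" (sent as-is)
```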
#request_response(req, metadata: {}) ⇒ Object
request_response sends a request to a GRPC server, and returns the response.
If a metadata value is a list, multiple metadata for its key are sent.
# File 'src/ruby/lib/grpc/generic/active_call.rb', line 350

def request_response(req, metadata: {})
  raise_error_if_already_executed
  ops = {
    SEND_MESSAGE => @marshal.call(req),
    SEND_CLOSE_FROM_CLIENT => nil,
    RECV_INITIAL_METADATA => nil,
    RECV_MESSAGE => nil,
    RECV_STATUS_ON_CLIENT => nil
  }
  @send_initial_md_mutex.synchronize do
    # Metadata might have already been sent if this is an operation view
    unless @metadata_sent
      ops[SEND_INITIAL_METADATA] = @metadata_to_send.merge!(metadata)
    end
    @metadata_sent = true
  end

  begin
    batch_result = @call.run_batch(ops)
    # no need to check for cancellation after a CallError because this
    # batch contains a RECV_STATUS op
  ensure
    set_input_stream_done
    set_output_stream_done
  end

  @call.metadata = batch_result.metadata
  attach_status_results_and_complete_call(batch_result)
  get_message_from_batch_result(batch_result)
end
#run_server_bidi(mth, interception_ctx) ⇒ Object
run_server_bidi orchestrates a BiDi stream processing on a server.
N.B. gen_each_reply is a func(Enumerable<Requests>)
It takes an enumerable of requests as an arg, in case there is a relationship between the stream of requests and the stream of replies.
This does not mean that there must necessarily be one. E.g., the replies produced by gen_each_reply could ignore the received_msgs.
# File 'src/ruby/lib/grpc/generic/active_call.rb', line 535

def run_server_bidi(mth, interception_ctx)
  view = multi_req_view
  bidi_call = BidiCall.new(
    @call,
    @marshal,
    @unmarshal,
    metadata_received: @metadata_received,
    req_view: view
  )
  requests = bidi_call.read_next_loop(proc { set_input_stream_done }, false)
  interception_ctx.intercept!(
    :bidi_streamer,
    call: view,
    method: mth,
    requests: requests
  ) do
    bidi_call.run_on_server(mth, requests)
  end
end
#send_initial_metadata(new_metadata = {}) ⇒ Object
Sends the initial metadata that has yet to be sent. Does nothing if metadata has already been sent for this call.
# File 'src/ruby/lib/grpc/generic/active_call.rb', line 119

def send_initial_metadata(new_metadata = {})
  @send_initial_md_mutex.synchronize do
    return if @metadata_sent
    @metadata_to_send.merge!(new_metadata)
    ActiveCall.client_invoke(@call, @metadata_to_send)
    @metadata_sent = true
  end
end
#send_status(code = OK, details = '', assert_finished = false, metadata: {}) ⇒ Object
send_status sends a status to the remote endpoint.
assert_finished, when true, waits for FINISHED. If a metadata value is a list, multiple metadata for its key are sent.
# File 'src/ruby/lib/grpc/generic/active_call.rb', line 206

def send_status(code = OK, details = '', assert_finished = false,
                metadata: {})
  ops = {
    SEND_STATUS_FROM_SERVER => Struct::Status.new(code, details, metadata)
  }
  ops[RECV_CLOSE_ON_SERVER] = nil if assert_finished
  @call.run_batch(ops)
  set_output_stream_done

  nil
end
#server_streamer(req, metadata: {}) ⇒ Enumerator|nil
server_streamer sends one request to the GRPC server, which yields a stream of responses.
responses provides an enumerator over the streamed responses, i.e. it follows Ruby’s #each iteration protocol. The enumerator blocks while waiting for each response, stops when the server signals that no further responses will be supplied. If the implicit block is provided, it is executed with each response as the argument and no result is returned.
If a metadata value is a list, multiple metadata for its key are sent.
# File 'src/ruby/lib/grpc/generic/active_call.rb', line 436

def server_streamer(req, metadata: {})
  raise_error_if_already_executed
  ops = {
    SEND_MESSAGE => @marshal.call(req),
    SEND_CLOSE_FROM_CLIENT => nil
  }
  @send_initial_md_mutex.synchronize do
    # Metadata might have already been sent if this is an operation view
    unless @metadata_sent
      ops[SEND_INITIAL_METADATA] = @metadata_to_send.merge!(metadata)
    end
    @metadata_sent = true
  end

  begin
    @call.run_batch(ops)
  rescue GRPC::Core::CallError => e
    receive_and_check_status # checks for Cancelled
    raise e
  rescue => e
    set_input_stream_done
    raise e
  ensure
    set_output_stream_done
  end

  replies = enum_for(:each_remote_read_then_finish)
  return replies unless block_given?
  replies.each { |r| yield r }
end
#server_unary_response(req, trailing_metadata: {}, code: Core::StatusCodes::OK, details: 'OK') ⇒ Object
# File 'src/ruby/lib/grpc/generic/active_call.rb', line 227

def server_unary_response(req, trailing_metadata: {},
                          code: Core::StatusCodes::OK, details: 'OK')
  ops = {}
  @send_initial_md_mutex.synchronize do
    ops[SEND_INITIAL_METADATA] = @metadata_to_send unless @metadata_sent
    @metadata_sent = true
  end

  payload = @marshal.call(req)
  ops[SEND_MESSAGE] = payload
  ops[SEND_STATUS_FROM_SERVER] = Struct::Status.new(
    code, details, trailing_metadata
  )
  ops[RECV_CLOSE_ON_SERVER] = nil

  @call.run_batch(ops)
  set_output_stream_done
end
#single_req_view ⇒ Object
single_req_view provides a restricted view of this ActiveCall for use in a server request-response handler.
# File 'src/ruby/lib/grpc/generic/active_call.rb', line 147

def single_req_view
  SingleReqView.new(self)
end