Prevent pushing a stream into both pending_send + pending_open (#235)
Prevent pushing a stream into both pending_send and pending_open. Clear out variables from buffered streams that get a reset, and skip them when traversing the pending_send queue if they are is_reset(). Add asserts that a stream cannot be in pending_open and pending_send at the same time.
committed by Carl Lerche
parent 200c04f1d3
commit bbed41974b
@@ -398,8 +398,19 @@ impl Prioritize {
             }
         }
 
-        // If data is buffered, then schedule the stream for execution
-        if stream.buffered_send_data > 0 {
+        // If data is buffered and the stream is not pending open, then
+        // schedule the stream for execution
+        //
+        // Why do we not push into pending_send when the stream is in pending_open?
+        //
+        // We allow users to call send_request() which schedules a stream to be pending_open
+        // if there is no room according to the concurrency limit (max_send_streams), and we
+        // also allow data to be buffered for send with send_data() if there is no capacity for
+        // the stream to send the data, which attempts to place the stream in pending_send.
+        // If the stream is not open, we don't want the stream to be scheduled for
+        // execution (pending_send). Note that if the stream is in pending_open, it will be
+        // pushed to pending_send when there is room for an open stream.
+        if stream.buffered_send_data > 0 && !stream.is_pending_open {
             // TODO: This assertion isn't *exactly* correct. There can still be
             // buffered send data while the stream's pending send queue is
             // empty. This can happen when a large data frame is in the process
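The invariant this patch enforces can be sketched in isolation. The types below (`Stream`, `Prioritize`, `schedule_send`, `pop_sendable`) are hypothetical, heavily simplified stand-ins for h2's internals, not its real API; the point is only to show why a stream with buffered data must stay out of pending_send while it is still pending_open, and why reset streams are skipped when draining pending_send.

```rust
use std::collections::VecDeque;

// Hypothetical, simplified stand-ins for h2's per-stream state.
#[derive(Default)]
struct Stream {
    id: u32,
    buffered_send_data: usize,
    is_pending_open: bool,
    is_pending_send: bool,
    reset: bool,
}

impl Stream {
    fn is_reset(&self) -> bool {
        self.reset
    }
}

#[derive(Default)]
struct Prioritize {
    pending_send: VecDeque<u32>,
}

impl Prioritize {
    // Mirrors the patched check: only schedule a stream for sending if it
    // is not still waiting on the concurrency limit (pending_open). A
    // pending_open stream will be moved to pending_send later, once a
    // concurrency slot frees up.
    fn schedule_send(&mut self, stream: &mut Stream) {
        // The invariant the commit asserts: never in both queues at once.
        assert!(!(stream.is_pending_open && stream.is_pending_send));

        if stream.buffered_send_data > 0 && !stream.is_pending_open {
            stream.is_pending_send = true;
            self.pending_send.push_back(stream.id);
        }
    }

    // When traversing pending_send, ignore streams that were reset while
    // their frames were still buffered.
    fn pop_sendable(&mut self, streams: &[Stream]) -> Option<u32> {
        while let Some(id) = self.pending_send.pop_front() {
            let stream = streams.iter().find(|s| s.id == id)?;
            if !stream.is_reset() {
                return Some(id);
            }
        }
        None
    }
}

fn main() {
    let mut prio = Prioritize::default();

    // Stream 1 has buffered data and an open slot: it gets scheduled.
    let mut open = Stream { id: 1, buffered_send_data: 16, ..Default::default() };
    prio.schedule_send(&mut open);

    // Stream 3 is still pending_open, so it must NOT enter pending_send.
    let mut waiting = Stream { id: 3, buffered_send_data: 16, is_pending_open: true, ..Default::default() };
    prio.schedule_send(&mut waiting);

    // Only stream 1 is scheduled for execution.
    println!("{:?}", prio.pending_send);
}
```

Without the `!stream.is_pending_open` guard, stream 3 would land in both queues, and it would later be pushed into pending_send a second time when a slot opened, tripping the new asserts.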