# mirascope.core.openai.stream

The `OpenAIStream` class for convenience around streaming LLM calls.
## Attribute FinishReason

Type: `Choice.__annotations__['finish_reason']`
## Class OpenAIStream

A class for convenience around streaming OpenAI LLM calls.
Example:

```python
from mirascope.core.openai import openai_call


@openai_call("gpt-4o-mini", stream=True)
def recommend_book(genre: str) -> str:
    return f"Recommend a {genre} book"


stream = recommend_book("fantasy")  # returns `OpenAIStream` instance
for chunk, _ in stream:
    print(chunk.content, end="", flush=True)
```
Bases:
`BaseStream[OpenAICallResponse, OpenAICallResponseChunk, ChatCompletionUserMessageParam, ChatCompletionAssistantMessageParam, ChatCompletionToolMessageParam, ChatCompletionMessageParam, OpenAITool, ChatCompletionToolParam, OpenAIDynamicConfig, OpenAICallParams, FinishReason]`

### Attributes
| Name | Type | Description |
| --- | --- | --- |
| `audio_id` | `str \| None` | - |
| `cost_metadata` | `CostMetadata` | - |
## Function construct_call_response

Constructs the call response from a consumed `OpenAIStream`.
### Parameters

| Name | Type | Description |
| --- | --- | --- |
| `self` | `Any` | - |
### Returns

| Type | Description |
| --- | --- |
| `OpenAICallResponse` | - |
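As a usage sketch (assuming a configured OpenAI API key, and reusing the `recommend_book` function from the example above): the stream must be fully consumed before `construct_call_response()` can assemble the aggregate `OpenAICallResponse` from the collected chunks.

```python
from mirascope.core.openai import openai_call


@openai_call("gpt-4o-mini", stream=True)
def recommend_book(genre: str) -> str:
    return f"Recommend a {genre} book"


stream = recommend_book("fantasy")

# Consume the stream; chunk content arrives incrementally.
for chunk, _ in stream:
    print(chunk.content, end="", flush=True)

# After consumption, build a single call response from the
# accumulated chunks, e.g. to inspect the final message content.
call_response = stream.construct_call_response()
print(call_response.content)
```

This is convenient when you want to render tokens live during streaming but still need a complete response object afterwards for logging or downstream processing.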