Remove assumption of one usage per stream
jackmpcollins committed Nov 29, 2024
1 parent 53eef26 commit 8ed3898
Showing 1 changed file with 4 additions and 1 deletion.
src/magentic/chat_model/openai_chat_model.py (4 additions, 1 deletion)

@@ -293,7 +293,10 @@ def update(self, item: ChatCompletionChunk) -> None:
                 tool_call_chunk.index = self._current_tool_call_index
         self._chat_completion_stream_state.handle_chunk(item)
         if item.usage:
-            assert not self.usage_ref  # noqa: S101
+            # Only keep the last usage
+            # Gemini openai-compatible API includes usage in all streamed chunks
+            # but OpenAI only includes this in the last chunk
+            self.usage_ref.clear()
             self.usage_ref.append(
                 Usage(
                     input_tokens=item.usage.prompt_tokens,

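For context, here is a minimal, self-contained sketch of the behavior this change accounts for. It is not the actual magentic implementation; StreamState and Usage below are simplified stand-ins. The point is that providers such as Gemini's OpenAI-compatible API attach usage to every streamed chunk, while OpenAI only attaches it to the final chunk, so the stream state should overwrite any previously stored usage instead of asserting that usage arrives exactly once.

    # Simplified sketch (hypothetical classes, not magentic's real ones) showing
    # why the assert was replaced with usage_ref.clear().
    from dataclasses import dataclass


    @dataclass
    class Usage:
        input_tokens: int
        output_tokens: int


    class StreamState:
        def __init__(self) -> None:
            # A list acts as a mutable reference that callers can hold onto
            # before the final usage value is known.
            self.usage_ref: list[Usage] = []

        def update(self, chunk_usage: Usage | None) -> None:
            if chunk_usage:
                # Keep only the most recent usage: clear and overwrite rather
                # than asserting that usage has not been seen before.
                self.usage_ref.clear()
                self.usage_ref.append(chunk_usage)


    state = StreamState()
    # Gemini-style stream: usage is present in every chunk; the last one wins.
    for usage in [Usage(5, 1), Usage(5, 2), Usage(5, 3)]:
        state.update(usage)
    assert state.usage_ref == [Usage(5, 3)]

With the previous assert, the second chunk carrying usage would have raised an AssertionError; clearing the list instead makes both the OpenAI and Gemini streaming behaviors work.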