fix(ai): buffer streaming preamble text before code fence appears #8911

temrjan wants to merge 3 commits into marimo-team:main
Conversation
When AI models emit conversational preamble before code fences (e.g. "I'll create a cell that..."), the preamble was incorrectly created as a separate Python cell. This happened because `CellCreationStream` called `codeToCells` on the partial buffer before any fence arrived, treating plain text as Python code.

Now `CellCreationStream` buffers incoming chunks until a code fence (```) appears. Once a fence is found, `codeToCells` correctly extracts only the code. If the stream ends without any fence, the buffer is flushed as a cell on `stop()` for backward compatibility.

Note: This fixes the "Generate AI cell" flow. The "Inline AI edit" flow (backend `without_wrapping_backticks`) is a separate issue.

Fixes marimo-team#8880

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
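The buffer-until-fence flow described above can be sketched roughly as follows. This is an illustrative stand-in under assumed names, not marimo's actual `CellCreationStream`; in particular, `codeToCells` here is a simplified stand-in for the real parser in `completion-utils.ts`:

```typescript
type Cell = { language: string; code: string };

// Simplified stand-in for marimo's codeToCells: without a fence, the whole
// buffer is treated as Python code (the behavior that caused the bug).
function codeToCells(code: string): Cell[] {
  if (!code.includes("```")) {
    return [{ language: "python", code }];
  }
  const match = code.match(/```(\w*)\n([\s\S]*?)(?:```|$)/);
  return match ? [{ language: match[1] || "python", code: match[2] }] : [];
}

class BufferingStream {
  private buffer = "";
  private createdCells: Cell[] = [];

  stream(chunk: string): void {
    this.buffer += chunk;
    // Hold plain preamble text until a fence shows up
    // (unless cells were already created from an earlier parse).
    if (this.createdCells.length === 0 && !this.buffer.includes("```")) {
      return;
    }
    this.createdCells = codeToCells(this.buffer);
  }

  stop(): Cell[] {
    // Fence-less streams: flush the whole buffer as one cell
    // for backward compatibility.
    if (this.createdCells.length === 0 && this.buffer.trim() !== "") {
      this.createdCells = [{ language: "python", code: this.buffer }];
    }
    const cells = this.createdCells;
    // Clear all state
    this.buffer = "";
    this.createdCells = [];
    return cells;
  }
}
```

With this shape, a preamble followed by a fenced block yields only the fenced code as a cell, while a fence-less response still becomes a cell at `stop()`.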
manzt left a comment:
Nice. I think buffering until a fence appears makes sense, and the stop() fallback for fence-less responses is a good backward-compat touch.
> `it("should handle delta chunks", () => {`
This test covered the case where the first chunk is a partial fence, which got replaced. Would we add that scenario back?
Added! The new test ("should buffer partial fence and create cell when fence completes") covers the split-fence scenario: the first chunk ends with a partial fence ("``"), and the second chunk completes it, with no premature cell creation.
Cover the scenario where a code fence arrives split across chunks (e.g. first chunk has just "``", second completes the fence). Ensures no premature cell creation from incomplete fences. Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
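The kind of check involved in the split-fence scenario can be illustrated with a tiny hypothetical helper (not marimo code; the name and exact rule are assumptions for illustration):

```typescript
// Hypothetical helper: does the buffer end in an incomplete fence ("`" or
// "``") that the next chunk might complete? If so, parsing should wait.
function endsWithPartialFence(buffer: string): boolean {
  return /`{1,2}$/.test(buffer) && !buffer.endsWith("```");
}
```

With a check like this, a chunk ending in "``" is held back until the next chunk either completes the fence or shows it was plain text.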
Pull request overview
This PR adjusts the frontend AI “staged cells” streaming logic to avoid turning conversational preamble into its own Python cell by delaying parsing until a code fence is observed (or until the stream ends).
Changes:

- Buffer streamed text in `CellCreationStream.stream()` until a triple-backtick code fence appears (unless cells were already created).
- On stream end (`stop()`), flush buffered content into a cell if no cells were created (maintains behavior for models that emit code without fences).
- Add/adjust staged-cells streaming tests for preamble buffering, no-fence flushing, fences in the first chunk, and partial-fence completion.
Reviewed changes
Copilot reviewed 2 out of 2 changed files in this pull request and generated 1 comment.
| File | Description |
|---|---|
| frontend/src/core/ai/staged-cells.ts | Buffers streamed chunks until a code fence appears; flushes buffered content on stop when no cells were created. |
| frontend/src/core/ai/tests/staged-cells.test.ts | Adds test coverage for buffering behavior and stop-time flushing across multiple stream chunk patterns. |
> `// Clear all state`
> `this.buffer = "";`
stop() says it clears all state, but it only resets buffer. createdCells (and hasMarimoImport) remain populated, which can leak state if any additional text-delta chunks arrive after text-end/finish, and also makes the comment inaccurate. Consider resetting createdCells/hasMarimoImport here (and/or nulling the stream ref after stop) so a completed stream cannot update previous cells.
Suggested change:

> `this.buffer = "";`
> `this.createdCells = [];`
> `this.hasMarimoImport = false;`
Good catch — applied in 2d784fc. CellCreationStream is single-use (new instance per text-start), so this is purely defensive, but it makes the comment accurate.
Clear createdCells and hasMarimoImport alongside buffer so the "Clear all state" comment is accurate. Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
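The defensive reset can be sketched as below. `OneShotStream` is an illustrative stand-in for the single-use `CellCreationStream`, with a `stateCleared()` probe added purely for demonstration:

```typescript
// Illustrative single-use stream: stop() resets every piece of state, so
// late text-delta chunks arriving after stop() cannot mutate prior cells.
class OneShotStream {
  private buffer = "";
  private createdCells: string[] = [];
  private hasMarimoImport = false;
  private stopped = false;

  stream(chunk: string): void {
    if (this.stopped) return; // ignore chunks after text-end/finish
    this.buffer += chunk;
    if (chunk.includes("import marimo")) {
      this.hasMarimoImport = true;
    }
  }

  stop(): void {
    // Clear all state (buffer AND createdCells/hasMarimoImport),
    // so the comment matches what actually happens.
    this.buffer = "";
    this.createdCells = [];
    this.hasMarimoImport = false;
    this.stopped = true;
  }

  // Demo-only probe; not part of the real class.
  stateCleared(): boolean {
    return (
      this.buffer === "" &&
      this.createdCells.length === 0 &&
      !this.hasMarimoImport
    );
  }
}
```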
Summary

- Buffer streamed text in `CellCreationStream` until a code fence (```) appears, preventing conversational preamble from becoming a separate Python cell
- Flush the buffer as a cell on `stop()` if the stream ends without any fence (backward compatibility for models that return code without fences)

Root cause
`CellCreationStream.stream()` called `codeToCells(buffer)` on every chunk. Before any fence arrived, `codeToCells` treated the entire buffer as Python code (line 166 of `completion-utils.ts`: ``if (!code.includes("```")) return [{ language: "python", code }]``), creating a cell from preamble text like "I'll create a fibonacci function...".

What this does NOT fix
The "Inline AI edit" flow (problem #2 in the issue) uses a different code path through the backend `without_wrapping_backticks` function. That is a separate issue.

Test plan
Fixes #8880
🤖 Generated with Claude Code