fix(ai): buffer streaming preamble text before code fence appears #8911

Open
temrjan wants to merge 3 commits into marimo-team:main from temrjan:fix/ai-cell-preamble-text

Conversation


@temrjan temrjan commented Mar 28, 2026

Summary

  • Buffer incoming AI stream chunks in CellCreationStream until a code fence (```) appears, preventing conversational preamble from becoming a separate Python cell
  • Flush buffered content as a cell in stop() if the stream ends without any fence (backward compatibility for models that return code without fences)
  • Add tests for: preamble + fence, code without fence, fence from first chunk

Root cause

CellCreationStream.stream() called codeToCells(buffer) on every chunk. Before any fence arrived, codeToCells treated the entire buffer as Python code (line 166 of completion-utils.ts: if (!code.includes("```")) return [{ language: "python", code }]), creating a cell from preamble text like "I'll create a fibonacci function...".
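
The behavior described above can be sketched in a minimal, self-contained model. The names CellCreationStream and codeToCells come from the PR, but the internals here are assumptions for illustration, not marimo's actual implementation:

```typescript
// Minimal model of the buffering fix. codeToCells here only mirrors the
// cited behavior: with no fence, the whole string becomes one Python cell.
function codeToCells(code: string): string[] {
  if (!code.includes("```")) {
    // Fence-less input: treat the entire string as a single cell.
    return [code];
  }
  // Extract only fenced blocks (language tags ignored for brevity;
  // an unclosed fence captures through to the end of the buffer).
  const matches = [...code.matchAll(/```[^\n]*\n([\s\S]*?)(?:```|$)/g)];
  return matches.map((m) => m[1].trimEnd());
}

class CellCreationStream {
  private buffer = "";
  private createdCells: string[] = [];

  stream(chunk: string): void {
    this.buffer += chunk;
    // Key change: do not parse until a fence has appeared (or cells
    // already exist), so conversational preamble never becomes a cell.
    if (this.createdCells.length > 0 || this.buffer.includes("```")) {
      this.createdCells = codeToCells(this.buffer);
    }
  }

  stop(): string[] {
    // Backward compat: a stream that never emitted a fence still
    // yields one cell from the raw buffer.
    if (this.createdCells.length === 0 && this.buffer.trim() !== "") {
      this.createdCells = [this.buffer];
    }
    const cells = this.createdCells;
    // Clear all state
    this.buffer = "";
    this.createdCells = [];
    return cells;
  }
}
```

With this gating, a chunk like "I'll create a fibonacci function..." sits in the buffer untouched; codeToCells only runs once a ``` arrives, and the stop() fallback preserves the old behavior for fence-less responses.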

What this does NOT fix

The "Inline AI edit" flow (problem #2 in the issue) uses a different code path through the backend without_wrapping_backticks function. That is a separate issue.

Test plan

  • All 25 staged-cells tests pass
  • TypeScript clean (no new errors)
  • Test: preamble text buffered, cell created only when fence arrives
  • Test: code without fence → cell created on stream end (backward compat)
  • Test: fence in first chunk → cell created immediately (no delay)

Fixes #8880

🤖 Generated with Claude Code

When AI models emit conversational preamble before code fences
(e.g. "I'll create a cell that..."), the preamble was incorrectly
created as a separate Python cell. This happened because
CellCreationStream called codeToCells on partial buffer before
any fence arrived, treating plain text as Python code.

Now CellCreationStream buffers incoming chunks until a code fence
(```) appears. Once a fence is found, codeToCells correctly extracts
only the code. If the stream ends without any fence, the buffer is
flushed as a cell on stop() for backward compatibility.

Note: This fixes the "Generate AI cell" flow. The "Inline AI edit"
flow (backend without_wrapping_backticks) is a separate issue.

Fixes marimo-team#8880

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>

vercel bot commented Mar 28, 2026

The latest updates on your projects.

Project: marimo-docs | Deployment: Ready | Actions: Preview, Comment | Updated (UTC): Apr 8, 2026 5:27am



github-actions bot commented Mar 28, 2026

All contributors have signed the CLA ✍️ ✅
Posted by the CLA Assistant Lite bot.


temrjan commented Mar 28, 2026

I have read the CLA Document and I hereby sign the CLA


@manzt manzt left a comment


Nice. I think buffering until a fence appears makes sense, and the stop() fallback for fence-less responses is a good backward-compat touch.

});
});

it("should handle delta chunks", () => {
Collaborator


This test covered the case where the first chunk is a partial fence, which got replaced. Would we add that scenario back?

Author


Added! The new test (should buffer partial fence and create cell when fence completes) covers the split-fence scenario: the first chunk has just "``", the second chunk completes the fence, and no cell is created prematurely.

Cover the scenario where a code fence arrives split across chunks
(e.g. first chunk has just "``", second completes the fence).
Ensures no premature cell creation from incomplete fences.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
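
Why the simple includes-check is enough for a split fence can be shown in isolation. This is a self-contained sketch assuming the stream gates parsing on a plain buffer.includes("```") check, as the PR description states:

```typescript
// A fence split across chunks never trips the gate early: the buffered
// text only contains "```" once the final backtick has arrived.
let buffer = "";
const fenceSeen = (): boolean => buffer.includes("```");

buffer += "``"; // first chunk: partial fence
const afterFirstChunk = fenceSeen(); // false: nothing is parsed yet

buffer += "`python\nx = 1\n```"; // second chunk completes the fence
const afterSecondChunk = fenceSeen(); // true: safe to run codeToCells
```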

Copilot AI left a comment


Pull request overview

This PR adjusts the frontend AI “staged cells” streaming logic to avoid turning conversational preamble into its own Python cell by delaying parsing until a code fence is observed (or until the stream ends).

Changes:

  • Buffer streamed text in CellCreationStream.stream() until a triple-backtick code fence appears (unless cells were already created).
  • On stream end (stop()), flush buffered content into a cell if no cells were created (maintains behavior for models that emit code without fences).
  • Add/adjust staged-cells streaming tests for preamble buffering, no-fence flushing, fences in first chunk, and partial-fence completion.

Reviewed changes

Copilot reviewed 2 out of 2 changed files in this pull request and generated 1 comment.

  • frontend/src/core/ai/staged-cells.ts: Buffers streamed chunks until a code fence appears; flushes buffered content on stop when no cells were created.
  • frontend/src/core/ai/tests/staged-cells.test.ts: Adds test coverage for buffering behavior and stop-time flushing across multiple stream chunk patterns.

}
}
// Clear all state
this.buffer = "";

Copilot AI Apr 8, 2026


stop() says it clears all state, but it only resets buffer. createdCells (and hasMarimoImport) remain populated, which can leak state if any additional text-delta chunks arrive after text-end/finish, and also makes the comment inaccurate. Consider resetting createdCells/hasMarimoImport here (and/or nulling the stream ref after stop) so a completed stream cannot update previous cells.

Suggested change:
- this.buffer = "";
+ this.buffer = "";
+ this.createdCells = [];
+ this.hasMarimoImport = false;

Author


Good catch — applied in 2d784fc. CellCreationStream is single-use (new instance per text-start), so this is purely defensive, but it makes the comment accurate.

Clear createdCells and hasMarimoImport alongside buffer so the
"Clear all state" comment is accurate.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>

Labels

bug Something isn't working


Development

Successfully merging this pull request may close these issues.

AI cell generation emits conversational text as Python code

4 participants