Fix(chat): resolve response_complete event/DB inconsistency on attachments #457
Jean-Regis-M wants to merge 1 commit into GenAI-Security-Project:main
Root cause: `stream_response` built `effective_message` with the attachment prefix but passed `user_message` (the original) to the `response_complete` event, diverging from the DB record, which correctly stores `effective_message`.

Solution: replace `user_message` with `effective_message` in the `response_complete` `event_data` dict. `effective_message` equals `user_message` when no attachments are present, so the no-attachment path is unaffected.

Impact: the event log and DB record are now consistent for all sessions. Deterministic. Zero regression risk on attachment-free paths.

Signed-off-by: JEAN REGIS <240509606@firat.edu.tr>
Summary
Fixes #413
Resolves a data consistency bug where the `response_complete` event emitted to Redis contained the original `user_message`, while the DB record saved `effective_message` (the attachment-prefixed version). Any session with attachments therefore produced mismatched records between the event log and the database.
Context
`stream_response` constructs `effective_message` by prepending a FinDrive file reference header when attachments are present.
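The construction itself isn't included in this excerpt; a minimal sketch of the pattern, where the function name and header format are illustrative rather than the actual FinDrive code:

```python
def build_effective_message(user_message: str, attachments: list[str]) -> str:
    """Illustrative stand-in for the prefixing done in stream_response.

    The real header format may differ; the key property is that with no
    attachments the message passes through unchanged.
    """
    if not attachments:
        # No attachments: effective_message is identical to user_message.
        return user_message
    header = "[Attached files: " + ", ".join(attachments) + "]\n"
    return header + user_message
```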
This `effective_message` is what gets saved: `_save_message("user", effective_message)`. However, the `response_complete` event was emitting `user_message`, the raw, unprefixed original, making it diverge from every other record of what was actually processed.
The `message_received` event correctly and intentionally emits `user_message`, to capture what the user literally typed. The `response_complete` event should mirror what was processed, which is `effective_message`.

Root Cause
`effective_message` was already in scope and correctly used everywhere else. This was a straight omission: the wrong variable was referenced.
Fix
`finbot/agents/chat.py`:

```diff
 event_data={
     "response_length": len(full_response),
     "response_content": full_response,
     "duration_ms": duration_ms,
-    "user_message": user_message,
+    "user_message": effective_message,
     "vendor_id": self.session_context.current_vendor_id,
     "llm_model": self._model,
 },
```

`effective_message` is unconditionally assigned before this point:

- Without attachments: `effective_message = user_message` → identical value, zero regression
- With attachments: `effective_message` carries the prefix → event now matches the DB

Impact

- No more event log / DB mismatches on attachment sessions
- The `response_complete` event now records what the LLM processed
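The two paths above can be checked with a small standalone sketch; all names here are illustrative stand-ins, not the real `finbot` code:

```python
def build_effective_message(user_message: str, attachments: list[str]) -> str:
    # Illustrative stand-in for the prefixing done in stream_response.
    if not attachments:
        return user_message
    return "[Attached files: " + ", ".join(attachments) + "]\n" + user_message

def emit_response_complete(effective_message: str, full_response: str) -> dict:
    # After the fix: the event carries effective_message, matching the DB.
    return {
        "response_length": len(full_response),
        "response_content": full_response,
        "user_message": effective_message,
    }

# Path 1: no attachments -> effective_message == user_message, event unchanged.
msg = build_effective_message("hello", [])
assert msg == "hello"

# Path 2: attachments -> event payload now matches what was saved to the DB.
msg = build_effective_message("hello", ["report.pdf"])
db_record = msg                        # what _save_message("user", ...) stores
event = emit_response_complete(msg, "ok")
assert event["user_message"] == db_record
```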
Testing
Regression (no-attachment paths):

```
pytest tests/integration/agents/test_chat_layer3.py -v -k "not qa_002"
```

Tasks
- Identified the wrong variable reference in `stream_response`
- Verified `effective_message` is unconditionally in scope before the `response_complete` emit
- Confirmed the `message_received` event intentionally retains `user_message` (left untouched)
- Verified the no-attachment path is unaffected (`effective_message == user_message` when `attachments` is falsy)
- Updated the `response_complete` event payload
- `test_chat_l3_qa_002` validates the corrected behaviour
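The referenced test is not shown in this PR excerpt; a hedged, self-contained sketch of the assertion it presumably makes, with all names and values below illustrative rather than taken from the real harness:

```python
def test_response_complete_matches_db_record():
    # Illustrative stand-ins; the real test lives in test_chat_layer3.py.
    user_message = "summarize the attached file"
    effective_message = "[Attached files: q3.xlsx]\n" + user_message

    db_record = effective_message                      # _save_message("user", ...)
    event_data = {"user_message": effective_message}   # post-fix emit

    # Event log and DB record must agree on attachment sessions.
    assert event_data["user_message"] == db_record
    # With attachments, the emitted message differs from the raw input.
    assert event_data["user_message"] != user_message

test_response_complete_matches_db_record()
```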