fix: prevent proactive replies from polluting long-term conversation history #7624
Open
AgIzT wants to merge 2 commits into AstrBotDevs:master
Conversation
Code Review
This pull request updates the proactive reply mechanism to avoid chat history pollution by passing a null conversation and tagging requests with a dynamic attribute. Reviewers noted that disabling the conversation entirely might strip the bot of its persona and skills, suggesting a non-persistent conversation as a better alternative. There are also concerns regarding the use of setattr for internal state, which could fail if the request object has strict validation or is copied during processing.
This PR fixes the issue reported in #7622.
When both of the following are enabled:
- `provider_ltm_settings.group_icl_enable = true`
- `provider_ltm_settings.active_reply.enable = true`

one successful proactive reply can cause the temporary chatroom-style exchange to be written back into `conv.history` through the generic persistence flow. This pollutes the real user-to-bot conversation history stored in the database. After that, even normal passive `@bot` messages may appear to "forget" previously remembered facts because they load corrupted history from the session record.

The core problem is not only that `req.contexts` is cleared in memory. The more important issue is that the proactive-reply chatroom prompt and model response are incorrectly persisted into long-term conversation history.
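To make the failure mode concrete, here is a purely illustrative sketch of what the session history can end up holding; the field names and message texts are hypothetical and do not reflect AstrBot's actual storage schema.

```python
# Illustrative only: hypothetical field names and messages, not AstrBot's schema.

# What conv.history should hold: the real user <-> bot exchange.
clean_history = [
    {"role": "user", "content": "Please call me 'Captain' from now on."},
    {"role": "assistant", "content": "Understood, Captain."},
]

# What it can end up holding after one proactive reply: the temporary
# chatroom-style prompt and the model's proactive answer, persisted by the
# generic flow as if they were part of the real conversation.
polluted_history = clean_history + [
    {"role": "user", "content": "[group-chat transcript assembled for the proactive reply]"},
    {"role": "assistant", "content": "[proactive reply generated from that transcript]"},
]
```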
Modifications

This fix only changes two files under `astrbot/builtin_stars/astrbot/`. It does not modify the core persistence pipeline, does not introduce new configuration fields, and does not add any new dependencies.

`astrbot/builtin_stars/astrbot/main.py`

- The proactive-reply path no longer passes `conv` into `request_llm(...)`; it passes `conversation=None` instead, so the generic persistence flow never overwrites `conv.history` with chatroom content.
- A `_ltm_active_reply_trigger` marker is attached to the `ProviderRequest` instance to indicate that the current request was actually triggered by a proactive reply.

`astrbot/builtin_stars/astrbot/long_term_memory.py`

- The `on_req_llm(...)` branch condition is changed from "only check `enable_active_reply`" to "the feature is enabled and the current request is truly triggered by a proactive reply" before `req.contexts` is cleared.
- Normal passive `@bot` requests always preserve `req.contexts` and continue to use the real long-term conversation history.

A hedged sketch of both changes is included after the effects list below.

As a result:

- Proactive replies still keep their original chatroom-style behavior
- Proactive replies no longer pollute the database-backed long-term conversation history
- Passive `@bot` requests can still recall previously established session facts after proactive replies occur

This is NOT a breaking change.
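The following sketch illustrates both changes under simplified assumptions: the function name `build_proactive_request`, the shown signatures of `request_llm(...)` and `on_req_llm(...)`, and the way the enable flag is passed in are stand-ins rather than AstrBot's actual API; only `conversation=None`, the `_ltm_active_reply_trigger` marker, and the clearing of `req.contexts` come from this PR.

```python
# Hedged sketch only. Function names, signatures, and parameter passing below
# are simplified stand-ins for the real AstrBot plugin code, not its actual API.

# --- astrbot/builtin_stars/astrbot/main.py (proactive-reply path) ---
async def build_proactive_request(provider, chatroom_prompt: str):
    # Do not bind the request to the persistent conversation: conversation=None
    # keeps the generic persistence flow from writing the chatroom-style
    # exchange back into conv.history.
    req = await provider.request_llm(
        prompt=chatroom_prompt,
        conversation=None,
    )
    # Mark the request so downstream hooks can tell it was triggered by a
    # proactive reply rather than by a normal passive @bot message.
    setattr(req, "_ltm_active_reply_trigger", True)
    return req


# --- astrbot/builtin_stars/astrbot/long_term_memory.py ---
async def on_req_llm(event, req, enable_active_reply: bool):
    # Old condition: clear req.contexts whenever active reply is enabled.
    # New condition: additionally require that this specific request really
    # was triggered by a proactive reply.
    if enable_active_reply and getattr(req, "_ltm_active_reply_trigger", False):
        req.contexts = []  # the chatroom-style prompt carries its own context
    # Passive @bot requests keep req.contexts and the real long-term history.
```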
Notes
Note: because proactive replies no longer bind to the current conversation, conversation-level persona / skills injection is also skipped for those proactive replies. This is an intentional tradeoff to prevent chatroom exchanges from polluting long-term conversation history.
Screenshots or Test Results
Validation environment:
- v4.23.1
- Docker
- Gemini
- NapCat QQ
- Linux

Verification steps:

1. Set `provider_ltm_settings.group_icl_enable = true`
2. Set `provider_ltm_settings.active_reply.enable = true`
3. Set `group_message_max_cnt = 20`
4. In an `@bot` conversation, let the bot remember a custom nickname or fact
5. After a proactive reply has been triggered, send another `@bot` message
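For reference, the settings used in the steps above are written out here as a Python dict, purely to show the nesting implied by the dotted key paths; the placement of `group_message_max_cnt` is an assumption, and AstrBot's real configuration layout may differ.

```python
# Nesting derived from the dotted paths in the verification steps; the exact
# location of group_message_max_cnt is assumed, not confirmed.
verification_settings = {
    "provider_ltm_settings": {
        "group_icl_enable": True,   # group-chat context awareness
        "active_reply": {
            "enable": True,         # proactive replies
        },
        "group_message_max_cnt": 20,  # assumed to live at this level
    },
}
```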
Result before the fix:

- `@bot` requests can no longer correctly recall previously remembered facts

Result after the fix:

- `conv.history` is no longer polluted with proactive-reply chatroom content
- `@bot` requests can still correctly recall previously established long-term session facts
Checklist
😊 If there are new features added in the PR, I have discussed it with the authors through issues/emails, etc.
/ This PR does not add a new feature; the related problem has already been documented in Issue #7622 ("[Bug] After enabling proactive replies and group-chat context awareness, the session's long-term memory is overwritten by the group-chat context").
👀 My changes have been well-tested, and "Verification Steps" and "Screenshots" have been provided above.
/ This change has been validated, and the verification steps and test results are provided above.
🤓 I have ensured that no new dependencies are introduced, OR if new dependencies are introduced, they have been added to the appropriate locations in
`requirements.txt` and `pyproject.toml`.
/ No new dependencies were introduced by this change.
😮 My changes do not introduce malicious code.
/ This change does not introduce malicious code.
Summary by Sourcery
Prevent proactive group replies from mutating long-term conversation history while preserving their chatroom-style behavior.
Bug Fixes: