
Feat: Parental Consent Flow #172

Open
ToxicBiohazard wants to merge 8 commits into hackthebox:main from ToxicBiohazard:parental-consent

Conversation

@ToxicBiohazard
Contributor

Types of changes

What types of changes does your code introduce?
Put an x in the boxes that apply.

  • Bugfix (non-breaking change which fixes an issue).
  • New feature (non-breaking change which adds functionality).
  • Breaking change (fix or feature that would cause existing functionality not to work as expected).
  • Documentation Update (if none of the other choices applies).

Parental Consent Flow - Feature Documentation

Overview

This feature implements a comprehensive system for handling potentially underage users on the HTB Discord server. When moderators flag a verified user as potentially underage, the system creates a report that authorized reviewers can approve, deny, or check for parental consent.

The flow ensures compliance with platform policies by requiring parental consent for users under 18, with automatic ban management and role assignment.

Key Components

Commands

  • /flag_minor (MOD+): Flag a verified user as potentially underage
  • /minor_reviewers (ADMIN): Manage who can review minor reports
    • add <user>: Add a reviewer
    • remove <user>: Remove a reviewer
    • list: Show all current reviewers
    • seed: Initialize with a default set of reviewers
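
The snippet below is a hypothetical sketch of how the /flag_minor command above could be registered with py-cord (the library implied by ctx.respond elsewhere in this description); the decorator, option types, and setup function are assumptions rather than the PR's actual code.

# Hypothetical sketch of the /flag_minor registration (py-cord style); the exact
# decorators, option types, and permission checks in the PR may differ.
import discord
from discord.ext import commands


class FlagMinorCog(commands.Cog):
    def __init__(self, bot: discord.Bot):
        self.bot = bot

    @commands.slash_command(name="flag_minor", description="Flag a verified user as potentially underage.")
    async def flag_minor(
        self,
        ctx: discord.ApplicationContext,
        user: discord.Option(discord.Member, "User to flag"),
        age: discord.Option(int, "Suspected age", min_value=1, max_value=17),
        evidence: discord.Option(str, "Evidence supporting the flag"),
    ) -> None:
        # Pattern B (see Technical Implementation Details): acknowledge fast, edit later.
        status_message = await ctx.respond(
            "Creating or updating minor report, please wait...", ephemeral=True
        )
        # ... consent check, report creation/update, review-channel message ...
        await status_message.edit(content="Report created and posted to the review channel.")


def setup(bot: discord.Bot) -> None:
    bot.add_cog(FlagMinorCog(bot))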

Database Tables

minor_report

Tracks each flagged user and the report lifecycle.

| Column            | Type        | Description                                                |
|-------------------|-------------|------------------------------------------------------------|
| id                | Integer     | Primary key                                                |
| user_id           | BIGINT(18)  | Discord ID of the flagged user                             |
| reporter_id       | BIGINT(18)  | Discord ID of the moderator who flagged                    |
| suspected_age     | Integer     | Suspected age (1-17)                                       |
| evidence          | TEXT        | Evidence supporting the flag                               |
| report_message_id | BIGINT(20)  | Discord message ID in the review channel                   |
| status            | VARCHAR(32) | Current state: pending, approved, denied, consent_verified |
| reviewer_id       | BIGINT(18)  | Discord ID of the reviewer who took action                 |
| created_at        | TIMESTAMP   | When the report was created                                |
| updated_at        | TIMESTAMP   | When the report was last updated                           |
| associated_ban_id | Integer     | ID of the ban created for this report (if any)             |

minor_review_reviewer

Stores authorized reviewers (configurable at runtime).

| Column     | Type       | Description                          |
|------------|------------|--------------------------------------|
| id         | Integer    | Primary key                          |
| user_id    | BIGINT(18) | Discord ID of the reviewer           |
| added_by   | BIGINT(18) | Discord ID of the admin who added them |
| created_at | TIMESTAMP  | When they were added                 |
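
As a rough illustration of the schema above, a SQLAlchemy mapping might look like the sketch below; the declarative base, nullability, server defaults, and the "ban.id" foreign-key target are assumptions.

# Hypothetical SQLAlchemy mapping for the two tables above; defaults and the
# foreign-key target are assumptions for illustration.
from sqlalchemy import BigInteger, Column, ForeignKey, Integer, String, Text, TIMESTAMP, func
from sqlalchemy.orm import declarative_base

Base = declarative_base()


class MinorReport(Base):
    __tablename__ = "minor_report"

    id = Column(Integer, primary_key=True)
    user_id = Column(BigInteger, nullable=False)            # Discord ID of the flagged user
    reporter_id = Column(BigInteger, nullable=False)         # Discord ID of the moderator who flagged
    suspected_age = Column(Integer, nullable=False)          # 1-17
    evidence = Column(Text, nullable=False)
    report_message_id = Column(BigInteger, nullable=True)    # Message in the review channel
    status = Column(String(32), nullable=False, default="pending")
    reviewer_id = Column(BigInteger, nullable=True)
    created_at = Column(TIMESTAMP, server_default=func.now())
    updated_at = Column(TIMESTAMP, server_default=func.now(), onupdate=func.now())
    associated_ban_id = Column(Integer, ForeignKey("ban.id"), nullable=True)


class MinorReviewReviewer(Base):
    __tablename__ = "minor_review_reviewer"

    id = Column(Integer, primary_key=True)
    user_id = Column(BigInteger, nullable=False)   # Discord ID of the reviewer
    added_by = Column(BigInteger, nullable=False)  # Discord ID of the admin who added them
    created_at = Column(TIMESTAMP, server_default=func.now())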

Configuration

Required environment variables in .env:

# Role IDs
ROLE_VERIFIED=<id>
ROLE_VERIFIED_MINOR=<id>

# Channel IDs
CHANNEL_MINOR_REVIEW=<id>

# Parental Consent Cloud Function
PARENTAL_CONSENT_CHECK_URL=<Cloud Function URL>
PARENTAL_CONSENT_SECRET=<shared-secret>

The Cloud Function checks if a parental consent form exists in Google Drive for a given HTB account UUID. Requests must be signed with HMAC-SHA1.
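
A minimal sketch of what the consent check could look like under those assumptions (JSON body {"file_name": <uuid>}, HMAC-SHA1 over the body sent as X-Signature, and an {"exist": bool} response); the timeout and helper name are illustrative, and the error handling mirrors edge case 7 below.

# Illustrative sketch of the consent check; the exact implementation in the PR may differ.
import asyncio
import hashlib
import hmac
import json
import logging
import os

import aiohttp

logger = logging.getLogger(__name__)


async def check_parental_consent(account_uuid: str) -> bool:
    """Return True if a parental consent form exists for this HTB account UUID."""
    body = json.dumps({"file_name": account_uuid}).encode()
    secret = os.environ["PARENTAL_CONSENT_SECRET"].encode()
    signature = hmac.new(secret, body, hashlib.sha1).hexdigest()
    headers = {"Content-Type": "application/json", "X-Signature": signature}

    try:
        timeout = aiohttp.ClientTimeout(total=10)  # assumed timeout
        async with aiohttp.ClientSession(timeout=timeout) as session:
            async with session.post(os.environ["PARENTAL_CONSENT_CHECK_URL"], data=body, headers=headers) as resp:
                if resp.status != 200:
                    return False
                payload = await resp.json()
                return bool(payload.get("exist", False))
    except aiohttp.ClientError as e:
        logger.warning("Parental consent check request failed: %s", e)
        return False
    except asyncio.TimeoutError as e:
        logger.warning("Parental consent check timed out: %s", e)
        return False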

Flow Sequences

1. Flag Minor - Initial Report Creation

sequenceDiagram
    actor Mod as Moderator
    participant Bot
    participant DB
    participant CF as Cloud Function
    participant Channel as Review Channel
    
    Mod->>Bot: /flag_minor user age evidence
    Bot->>Bot: Validate user has VERIFIED role
    Bot->>DB: Check htb_discord_link for user
    alt No HTB account linked
        Bot->>Mod: Error: User must be verified first
    else Account found
        Bot->>Mod: Processing...
        Bot->>CF: POST /check-file<br/>{file_name: account_uuid}<br/>X-Signature: hmac_sha1
    CF->>Bot: {"exist": true/false}
        alt Consent already exists
            Bot->>Bot: Assign VERIFIED_MINOR role
            Bot->>Mod: Consent on file, role assigned
        else No consent
            Bot->>DB: Create MinorReport (status=pending)
            Bot->>Channel: Send report embed with buttons
            Bot->>Mod: Report created
        end
    end

2. Approve Ban Flow

sequenceDiagram
    actor Rev as Reviewer
    participant View as MinorReportView
    participant Modal as ApproveBanModal
    participant Ban as Ban System
    participant DB
    participant Embed as Report Message
    
    Rev->>View: Click "Approve Ban"
    View->>Rev: Show modal (duration input)
    Rev->>Modal: Submit duration (e.g. "3y")
    Modal->>Ban: ban_member_with_epoch()
    Ban->>DB: Create Ban record
    Ban->>Modal: Ban created/queued for approval
    Modal->>DB: Update MinorReport<br/>status=approved<br/>associated_ban_id=<ban_id>
    Modal->>Rev: Ban submitted, awaiting SR_MOD approval
    Modal->>Embed: Update embed + disable Approve/Deny<br/>Relabel: "Check Consent & Unban"

3. Check Consent & Unban Flow

sequenceDiagram
    actor Rev as Reviewer
    participant View as MinorReportView
    participant DB
    participant CF as Cloud Function
    participant Guild as Discord Guild
    participant Embed as Report Message
    
    Rev->>View: Click "Check Consent & Unban"
    View->>View: Defer response
    View->>DB: Get account_identifier for user
    View->>CF: POST /check-file<br/>{file_name: uuid}<br/>X-Signature: hmac
    CF->>View: {"exist": true/false}
    
    alt Consent found (exist=true)
        View->>DB: Get associated Ban record
        alt User in guild
            View->>Guild: Assign VERIFIED_MINOR role
            View->>Guild: Unban user (if banned)
            View->>Rev: User unbanned, role assigned
        else User not in guild
            View->>Guild: Unban user (by ID)
            View->>Rev: User unbanned, role assigned on rejoin
        end
        View->>DB: Update status=consent_verified
        View->>Embed: Update embed, remove all buttons
    else No consent (exist=false)
        View->>Rev: Consent still not found
        View->>Embed: Add status note, keep buttons
    end

4. Auto-Remove Minor Role at Age 18

sequenceDiagram
    participant Scheduler as Scheduled Task
    participant DB
    participant Guild as Discord Guild
    
    loop Every 1 minute
        Scheduler->>DB: Get all reports (status=approved/consent_verified)
        loop For each report
            Scheduler->>Scheduler: Calculate expires_at<br/>(created_at + years_until_18)
            alt now >= expires_at
                Scheduler->>Guild: Get member by user_id
                alt Member found & has VERIFIED_MINOR
                    Scheduler->>Guild: Remove VERIFIED_MINOR role
                    Scheduler->>Scheduler: Log: User aged out
                end
            end
        end
    end

5. Auto-Assign Minor Role on Rejoin

sequenceDiagram
    actor User
    participant Guild as Discord Guild
    participant Listener as on_member_join
    participant DB
    
    User->>Guild: Joins server
    Guild->>Listener: on_member_join event
    Listener->>DB: Check MinorReport for user<br/>(status=consent_verified)
    alt Report exists
        Listener->>Listener: Calculate expires_at<br/>(created_at + years_until_18)
        alt now < expires_at (still under 18)
            Listener->>Guild: Assign VERIFIED_MINOR role
            Listener->>Listener: Log: Minor role assigned on rejoin
        else Already 18 or older
            Listener->>Listener: Skip (expired)
        end
    end

Edge Cases

1. User leaves server while banned

Scenario: User is flagged, ban approved, then they leave the server before consent is submitted.

Handling:

  • When reviewer clicks "Check Consent & Unban" after consent arrives:
    • Bot attempts to fetch the member → gets None or a User object (not Member)
    • We still call unban_member(guild, discord.Object(id=user_id)) → Discord unbans by ID
    • We skip role assignment (can't assign roles to non-members)
    • Reviewer sees: "User unbanned. Minor role will be assigned when they rejoin."
  • When the user rejoins:
    • on_member_join handler checks for a CONSENT_VERIFIED report
    • If still under 18, assigns VERIFIED_MINOR automatically
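
A minimal sketch of the unban-by-ID path described above; the helper name matches the description, but its body and the reason string here are illustrative.

# Sketch of the unban-by-ID path for users who have left the guild.
import discord


async def unban_member(guild: discord.Guild, user: discord.abc.Snowflake) -> None:
    try:
        # discord.Object(id=user_id) satisfies the Snowflake protocol, so Discord can
        # lift the ban even though the user is no longer a Member of the guild.
        await guild.unban(user, reason="Parental consent verified")
    except discord.NotFound:
        # No active ban for this ID - nothing to do.
        pass


# e.g. await unban_member(guild, discord.Object(id=report.user_id))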

2. Consent arrives before any ban

Scenario: Mod runs /flag_minor, but consent form already exists in Google Drive.

Handling:

  • The command checks consent early (before creating a report)
  • If check_parental_consent returns True:
    • Assigns VERIFIED_MINOR role immediately
    • Returns: "Parental consent already on file. No report created. Role assigned."
    • No report is created in the database

3. Multiple reviewers try to approve the same report

Scenario: Two reviewers click "Approve Ban" at nearly the same time.

Handling:

  • Button callbacks check report.status == PENDING before showing the modal
  • If status is not PENDING, the button replies: "This report is no longer pending."
  • First approver's modal submission sets status=approved
  • Second approver gets the "no longer pending" message
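
A sketch of that status guard as it might appear at the top of a button callback; get_report_by_message_id and ApproveBanModal are names used elsewhere in this description, and the surrounding signature is assumed (py-cord passes the button first, then the interaction).

# Sketch of the pending-status guard in the "Approve Ban" callback.
import discord


async def approve_ban(self, button: discord.ui.Button, interaction: discord.Interaction) -> None:
    report = await get_report_by_message_id(interaction.message.id)  # PR helper, assumed importable
    if report is None or report.status != "pending":
        await interaction.response.send_message("This report is no longer pending.", ephemeral=True)
        return
    await interaction.response.send_modal(ApproveBanModal(report))  # PR modal, assumed signature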

4. Report updated while pending

Scenario: A report already exists for a user, and a moderator runs /flag_minor again with updated info.

Handling:

  • /flag_minor checks for an active report via get_active_minor_report(user_id)
  • If found:
    • Updates suspected_age, evidence, reporter_id, updated_at in the database
    • Fetches the existing report message and edits the embed
    • Returns: "Report updated with new information. Review channel message edited."
  • No new report or message is created

5. User turns 18 while banned

Scenario: User was flagged at age 15, banned for 3 years, consent arrives after 6 months, they're unbanned and assigned the minor role, then 2.5 years pass.

Handling:

  • Scheduled task auto_remove_minor_role runs every minute:
    • Computes: expires_at = report.created_at + timedelta(days=365 * years_until_18(suspected_age))
    • When now >= expires_at:
      • Checks if user is in guild and has VERIFIED_MINOR
      • If yes, removes the role automatically
      • Logs: "Removing minor role from user X (Y) because they have reached 18."
  • The report stays in CONSENT_VERIFIED state (historical record), but the user is no longer treated as a minor

6. Reviewer not in authorized list

Scenario: A moderator (not in minor_review_reviewer table) tries to interact with a report.

Handling:

  • MinorReportView.interaction_check is called before any button callback
  • It checks is_minor_review_moderator(user_id) which queries the DB (with 60-second cache)
  • If not authorized:
    • Returns ephemeral: "You are not authorized to review minor reports."
    • Button callback is never executed

7. Cloud Function timeout or error

Scenario: The parental consent check fails (network error, timeout, or CF returns non-200).

Handling:

  • check_parental_consent wraps the HTTP call in try/except:
    except aiohttp.ClientError as e:
        logger.warning("Parental consent check request failed: %s", e)
        return False
    except asyncio.TimeoutError as e:
        logger.warning("Parental consent check timed out: %s", e)
        return False
  • Returns False (no consent found)
  • Flow continues as "no consent" path (keeps buttons active for retry)

8. Report message deleted or not found

Scenario: The report message in the review channel is manually deleted by a moderator.

Handling:

  • All button callbacks first call get_report_by_message_id(interaction.message.id)
  • If no report is found:
    • interaction_check returns False with message: "Report not found or already resolved."
    • Button callback is not executed
  • When /flag_minor tries to edit an existing report's message:
    try:
        msg = await review_channel.fetch_message(report.report_message_id)
        await msg.edit(embed=embed)
    except (discord.NotFound, discord.HTTPException) as e:
        logger.warning("Could not edit existing report message: %s", e)
    • Logs warning but continues (report still exists in DB)

9. Interaction token expires (>3 seconds)

Scenario: /flag_minor takes too long before responding (DB + HTTP calls).

Handling:

  • Pattern B is implemented: send a fast initial response immediately after validation
    status_message = await ctx.respond("Creating or updating minor report, please wait...", ephemeral=True)
  • Then perform heavy operations (DB queries, Cloud Function calls, sending review message)
  • Finally edit the status message with the result:
    await status_message.edit(content="Report created and posted to the review channel...")
  • This acknowledges the interaction within Discord's 3-second window

10. User has both VERIFIED and VERIFIED_MINOR

Scenario: Through some edge case, a user ends up with both roles.

Handling:

  • /flag_minor validates early:
    if minor_role in target.roles:
        return await ctx.respond("That user already has the verified-minor status. No need to flag.", ephemeral=True)
  • Prevents duplicate reports for users already marked as minors

Complete Flow Diagram

flowchart TD
    Start([Moderator runs /flag_minor]) --> ValidateUser{User has<br/>VERIFIED role?}
    ValidateUser -->|No| ErrorNotVerified[Error: Only verified users can be flagged]
    ValidateUser -->|Yes| HasMinorRole{User already has<br/>VERIFIED_MINOR?}
    
    HasMinorRole -->|Yes| ErrorAlreadyMinor[Error: Already has verified-minor status]
    HasMinorRole -->|No| SendStatus[Send status message:<br/>Processing...]
    
    SendStatus --> GetAccount[Get HTB account identifier from DB]
    GetAccount --> AccountFound{Account found?}
    AccountFound -->|No| ErrorNoAccount[Edit status: Must be verified first]
    AccountFound -->|Yes| CheckConsent[Call Cloud Function:<br/>check_parental_consent]
    
    CheckConsent --> HasConsent{Consent exists?}
    HasConsent -->|Yes| AssignRole[Assign VERIFIED_MINOR role]
    AssignRole --> UpdateStatusEarly[Edit status: Consent on file,<br/>no report created]
    
    HasConsent -->|No| CheckExisting{Active report<br/>exists for user?}
    CheckExisting -->|Yes| UpdateReport[Update existing report:<br/>age, evidence, reporter]
    UpdateReport --> EditMessage[Edit existing review message]
    EditMessage --> NotifyUpdate[Edit status: Report updated]
    
    CheckExisting -->|No| CreateReport[Create MinorReport in DB<br/>status=pending]
    CreateReport --> SendReview[Send embed + view to review channel]
    SendReview --> RefreshReport[Refresh report to get ID]
    RefreshReport --> EditEmbed[Edit embed with Report ID]
    EditEmbed --> NotifyCreated[Edit status: Report created]
    
    NotifyCreated --> ReviewFlow[Review Channel Flow]
    NotifyUpdate --> ReviewFlow
    
    ReviewFlow --> ReviewerAction{Reviewer Action}
    
    ReviewerAction -->|Approve Ban| ShowModal[Show Approve Ban Modal]
    ShowModal --> ValidateDuration[Validate ban duration]
    ValidateDuration --> CreateBan[Create ban via<br/>ban_member_with_epoch]
    CreateBan --> UpdateToApproved[Update status=approved<br/>Save ban_id]
    UpdateToApproved --> DisableButtons[Disable Approve + Deny<br/>Rename Recheck to<br/>Check Consent & Unban]
    DisableButtons --> WaitConsent{Later: Check<br/>Consent & Unban}
    
    ReviewerAction -->|Deny Report| ShowDenyModal[Show Deny Report Modal]
    ShowDenyModal --> SaveDenial[Update status=denied<br/>Add user note]
    SaveDenial --> RemoveButtonsDeny[Remove all buttons]
    RemoveButtonsDeny --> TerminalDenied([TERMINAL: DENIED])
    
    ReviewerAction -->|Recheck Consent| RecheckPending[Check Cloud Function]
    RecheckPending --> ConsentFoundPending{Consent exists?}
    ConsentFoundPending -->|Yes| AssignRolePending[Assign VERIFIED_MINOR<br/>if user in guild]
    AssignRolePending --> SetVerified1[Update status=consent_verified]
    SetVerified1 --> RemoveButtons1[Remove all buttons]
    RemoveButtons1 --> Terminal1([TERMINAL: CONSENT_VERIFIED])
    
    ConsentFoundPending -->|No| AddRecheckNote[Add status note:<br/>Recheck no consent]
    AddRecheckNote --> KeepButtons[Keep buttons active]
    KeepButtons --> ReviewerAction
    
    WaitConsent -->|Consent arrives| CheckConsentUnban[Call Cloud Function]
    CheckConsentUnban --> ConsentFoundApproved{Consent exists?}
    ConsentFoundApproved -->|Yes| UnbanFlow{User in guild?}
    UnbanFlow -->|Yes| UnbanAndRole[Unban user<br/>Assign VERIFIED_MINOR]
    UnbanFlow -->|No| UnbanOnly[Unban by user ID<br/>Role assigned on rejoin]
    UnbanAndRole --> SetVerified2[Update status=consent_verified]
    UnbanOnly --> SetVerified2
    SetVerified2 --> RemoveButtons2[Remove all buttons]
    RemoveButtons2 --> Terminal2([TERMINAL: CONSENT_VERIFIED])
    
    ConsentFoundApproved -->|No| AddRecheckNote2[Add status note:<br/>Recheck no consent]
    AddRecheckNote2 --> KeepButtonsApproved[Keep Check Consent & Unban button]
    KeepButtonsApproved --> WaitConsent

Automated Background Tasks

1. Remove Minor Role at Age 18

Frequency: Every 1 minute (part of ScheduledTasks.all_tasks)

Logic:

for report in reports with status in (approved, consent_verified):
    expires_at = report.created_at + timedelta(days=365 * years_until_18(report.suspected_age))
    if now >= expires_at:
        if user in guild and has VERIFIED_MINOR role:
            remove VERIFIED_MINOR role
            log removal

Purpose: Automatically clean up the minor role when users age out.
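
A hedged sketch of how this task could be wired with discord.ext.tasks; the guild lookup, the get_reports_by_status helper, and the years_until_18 definition (assumed to be 18 minus the suspected age) are illustrative.

# Hypothetical sketch of the age-out task.
from datetime import datetime, timedelta

import discord
from discord.ext import commands, tasks

ROLE_VERIFIED_MINOR = 0  # real ID comes from .env


def years_until_18(suspected_age: int) -> int:
    return max(18 - suspected_age, 0)


class ScheduledTasks(commands.Cog):
    def __init__(self, bot: discord.Bot):
        self.bot = bot
        self.auto_remove_minor_role.start()

    @tasks.loop(minutes=1)
    async def auto_remove_minor_role(self) -> None:
        guild = self.bot.guilds[0]                 # assumes a single-guild bot
        role = guild.get_role(ROLE_VERIFIED_MINOR)
        now = datetime.utcnow()                    # assumes created_at is stored as naive UTC
        reports = await get_reports_by_status("approved", "consent_verified")  # assumed DB helper
        for report in reports:
            expires_at = report.created_at + timedelta(days=365 * years_until_18(report.suspected_age))
            if now < expires_at:
                continue
            member = guild.get_member(report.user_id)
            if member is not None and role in member.roles:
                await member.remove_roles(role, reason="User has reached 18")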

2. Assign Minor Role on Rejoin

Trigger: on_member_join event

Logic:

if MinorReport exists for member.id with status=consent_verified:
    expires_at = report.created_at + timedelta(days=365 * years_until_18(report.suspected_age))
    if now < expires_at:
        assign VERIFIED_MINOR role

Purpose: Assign the minor role to users with verified consent who rejoin the server before turning 18.
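
A sketch of the rejoin listener under the same assumptions; the get_consent_verified_report helper and role lookup are illustrative.

# Sketch of the on_member_join handling for consent-verified minors.
from datetime import datetime, timedelta

import discord
from discord.ext import commands

ROLE_VERIFIED_MINOR = 0  # real ID comes from .env


class MinorRejoinListener(commands.Cog):
    def __init__(self, bot: discord.Bot):
        self.bot = bot

    @commands.Cog.listener()
    async def on_member_join(self, member: discord.Member) -> None:
        report = await get_consent_verified_report(member.id)  # assumed DB helper
        if report is None:
            return
        expires_at = report.created_at + timedelta(days=365 * (18 - report.suspected_age))
        if datetime.utcnow() >= expires_at:
            return  # already 18 or older: skip
        role = member.guild.get_role(ROLE_VERIFIED_MINOR)
        if role is not None:
            await member.add_roles(role, reason="Minor role assigned on rejoin (consent verified)")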

User Journey Examples

Scenario A: Clean path with consent on file

  1. Mod runs /flag_minor @User 15 "evidence"
  2. Bot checks Cloud Function → consent exists
  3. Bot assigns VERIFIED_MINOR role
  4. Mod sees: "Parental consent already on file. No report created. Role assigned."
  5. No report created

Scenario B: Flag, approve, consent arrives later, user in guild

  1. Mod runs /flag_minor @User 15 "evidence"
  2. Bot checks Cloud Function → no consent
  3. Report created in review channel (status: pending)
  4. Reviewer clicks "Approve Ban" → modal → submits "3y"
  5. Ban created, report updated to approved
  6. Buttons updated: Approve/Deny disabled, Recheck → "Check Consent & Unban"
  7. (Later) Parent submits consent form to HTB
  8. Reviewer clicks "Check Consent & Unban"
  9. Bot checks Cloud Function → consent exists ({"exist": true})
  10. Bot unbans user, assigns VERIFIED_MINOR role
  11. Report updated to consent_verified
  12. All buttons removed (terminal state)
  13. In ~3 years, scheduled task removes the minor role when user turns 18

Scenario C: Flag, approve, consent arrives, user not in guild

  1. Mod runs /flag_minor @User 15 "evidence"
  2. Bot checks Cloud Function → no consent
  3. Report created in review channel (status: pending)
  4. Reviewer clicks "Approve Ban" → modal → submits "3y"
  5. Ban created, report updated to approved
  6. User leaves the server (still banned)
  7. (Later) Parent submits consent form to HTB
  8. Reviewer clicks "Check Consent & Unban"
  9. Bot checks Cloud Function → consent exists
  10. Bot unbans user by ID (user not in guild, so no role assignment)
  11. Reviewer sees: "Consent found. User unbanned. Minor role will be assigned when they rejoin."
  12. Report updated to consent_verified, all buttons removed
  13. (Later) User rejoins the server
  14. on_member_join handler detects CONSENT_VERIFIED report, user still under 18
  15. Bot assigns VERIFIED_MINOR role automatically

Scenario D: False positive - reviewer denies

  1. Mod runs /flag_minor @User 16 "evidence"
  2. Report created in review channel (status: pending)
  3. Reviewer investigates, determines user is actually 19
  4. Reviewer clicks "Deny Report" → modal → enters reason "User is 19, verified via ID"
  5. Bot adds user note: "Minor flag denied: User is 19, verified via ID"
  6. Report updated to denied
  7. All buttons removed (terminal state)

Security & Access Control

Reviewer Authorization

  • Only users in the minor_review_reviewer table can interact with report buttons
  • Checked on every button click via is_minor_review_moderator(user_id)
  • Results are cached for 60 seconds to reduce DB load
  • Admins manage the reviewer list via /minor_reviewers add|remove|list

Cloud Function Authentication

  • All requests to the parental consent Cloud Function require HMAC-SHA1 signature
  • Signature computed as: hmac.new(PARENTAL_CONSENT_SECRET, json_body_bytes, hashlib.sha1).hexdigest()
  • Sent as X-Signature header
  • If signature is invalid or missing, CF returns 400 {"error":"Missing signature"} or 403 {"error":"Invalid signature"}
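
For illustration only, a hypothetical Cloud Function handler that would produce those responses; the actual function is not part of this PR and is assumed to use functions_framework (Flask request object).

# Hypothetical server-side counterpart of the signature check.
import hashlib
import hmac
import os

import functions_framework


@functions_framework.http
def check_file(request):
    secret = os.environ["PARENTAL_CONSENT_SECRET"].encode()
    signature = request.headers.get("X-Signature")
    if not signature:
        return {"error": "Missing signature"}, 400

    expected = hmac.new(secret, request.get_data(), hashlib.sha1).hexdigest()
    if not hmac.compare_digest(expected, signature):
        return {"error": "Invalid signature"}, 403

    # ... look up the consent form in Google Drive by file_name ...
    return {"exist": False}, 200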

Command Permissions

  • /flag_minor: Requires ALL_ADMINS or ALL_MODS roles
  • /minor_reviewers: Requires ALL_ADMINS only

Technical Implementation Details

Pattern B for Interaction Handling

To prevent "Unknown interaction" errors when /flag_minor takes >3 seconds:

  1. Send an immediate ephemeral response after quick validation:

    status_message = await ctx.respond("Creating or updating minor report, please wait...", ephemeral=True)
  2. Perform heavy operations (DB queries, Cloud Function calls)

  3. Edit the status message with the final result:

    await status_message.edit(content="Report created and posted to the review channel...")

This acknowledges the interaction immediately while allowing time for async operations.

Persistent Views

  • MinorReportView is registered as a persistent view in the FlagMinorCog.on_ready handler
  • This allows buttons to remain functional across bot restarts
  • Buttons use custom IDs (minor_report_approve, minor_report_deny, minor_report_recheck)
  • Report lookup is done via report_message_id matching interaction.message.id
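
A sketch of the persistent-view wiring; the custom IDs come from the list above, while the button labels, styles, and callback bodies are placeholders.

# Sketch of the persistent MinorReportView and its re-registration on startup.
import discord
from discord.ext import commands


class MinorReportView(discord.ui.View):
    def __init__(self):
        super().__init__(timeout=None)  # timeout=None is required for persistence

    @discord.ui.button(label="Approve Ban", style=discord.ButtonStyle.danger, custom_id="minor_report_approve")
    async def approve(self, button: discord.ui.Button, interaction: discord.Interaction) -> None:
        ...

    @discord.ui.button(label="Deny Report", style=discord.ButtonStyle.secondary, custom_id="minor_report_deny")
    async def deny(self, button: discord.ui.Button, interaction: discord.Interaction) -> None:
        ...

    @discord.ui.button(label="Recheck Consent", style=discord.ButtonStyle.primary, custom_id="minor_report_recheck")
    async def recheck(self, button: discord.ui.Button, interaction: discord.Interaction) -> None:
        ...


class FlagMinorCog(commands.Cog):
    def __init__(self, bot: discord.Bot):
        self.bot = bot

    @commands.Cog.listener()
    async def on_ready(self) -> None:
        # Re-register the view so buttons on existing report messages work after a restart.
        self.bot.add_view(MinorReportView())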

Database Relationships

MinorReport.associated_ban_id → Ban.id (nullable foreign key)

This links a minor report to the ban it created, enabling:

  • Checking if the ban still exists before unbanning
  • Verifying the ban was created by this report (not some other mod action)

Reviewer Cache

To avoid DB queries on every button click:

_reviewer_ids_cache: tuple[int, ...] | None = None
_reviewer_ids_cache_ts: float = 0
REVIEWER_CACHE_TTL_SEC = 60
  • Cached for 60 seconds
  • Invalidated explicitly when /minor_reviewers add|remove is used
  • Falls back to DB query on cache miss
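
A sketch of that cache in code; fetch_reviewer_ids stands in for the real DB query, and the module-level names mirror the snippet above.

# Sketch of the 60-second reviewer cache.
import time

_reviewer_ids_cache: tuple[int, ...] | None = None
_reviewer_ids_cache_ts: float = 0
REVIEWER_CACHE_TTL_SEC = 60


async def is_minor_review_moderator(user_id: int) -> bool:
    global _reviewer_ids_cache, _reviewer_ids_cache_ts
    now = time.monotonic()
    if _reviewer_ids_cache is None or now - _reviewer_ids_cache_ts > REVIEWER_CACHE_TTL_SEC:
        # Cache miss or expired entry: reload the authorized reviewer IDs from the DB.
        _reviewer_ids_cache = tuple(await fetch_reviewer_ids())  # assumed DB helper
        _reviewer_ids_cache_ts = now
    return user_id in _reviewer_ids_cache


def invalidate_reviewer_cache() -> None:
    """Called by /minor_reviewers add|remove so changes take effect immediately."""
    global _reviewer_ids_cache
    _reviewer_ids_cache = None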

Summary

This parental consent flow provides a complete lifecycle for handling potentially underage users:

  • Detection: Moderators flag users via /flag_minor
  • Review: Authorized reviewers approve bans or deny reports
  • Consent verification: Integration with HTB's Cloud Function to check for parental consent forms
  • Ban management: Automatic unbanning when consent arrives
  • Role lifecycle: Assign minor role, maintain it while <18, auto-remove at 18
  • Rejoin handling: Auto-assign role to users who rejoin after consent is verified
  • Clear states: Terminal states (denied, consent_verified) have no buttons, preventing confusion

The implementation handles edge cases like users leaving the server, consent arriving at various points in the flow, interaction timeouts, and authorization checks, making it robust for production use.

@codecov

codecov bot commented Feb 17, 2026

Codecov Report

❌ Patch coverage is 71.15666% with 197 lines in your changes missing coverage. Please review.
✅ Project coverage is 60.88%. Comparing base (05bda14) to head (d142f2e).

Files with missing lines Patch % Lines
src/views/minorreportview.py 55.60% 111 Missing ⚠️
src/cmds/core/flag_minor.py 74.31% 28 Missing ⚠️
src/webhooks/handlers/mp.py 51.11% 22 Missing ⚠️
src/helpers/minor_verification.py 85.96% 16 Missing ⚠️
src/cmds/core/minor_reviewers.py 84.28% 11 Missing ⚠️
src/cmds/automation/scheduled_tasks.py 82.69% 9 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##             main     #172      +/-   ##
==========================================
+ Coverage   59.49%   60.88%   +1.39%     
==========================================
  Files          50       56       +6     
  Lines        2903     3556     +653     
==========================================
+ Hits         1727     2165     +438     
- Misses       1176     1391     +215     
