feat: add MiniMax as LLM provider for Planner & Coder agents#22

Open
octo-patch wants to merge 1 commit into showlab:main from octo-patch:feature/add-minimax-provider

Conversation

@octo-patch

Summary

  • Add MiniMax as an alternative LLM provider for the Planner & Coder agents
  • MiniMax M2.7 model with 1M context window via OpenAI-compatible API (https://api.minimax.io/v1)
  • No new dependencies required — reuses the existing openai SDK
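
Since the PR reuses the existing `openai` SDK, the provider hookup presumably looks something like the sketch below. This is an illustration only: the real `request_minimax()` lives in `src/gpt_request.py` and may differ; the config-file path, key names, and the env-var override behavior are assumptions based on the summary above.

```python
import json
import os

# From the PR summary: MiniMax exposes an OpenAI-compatible API here.
MINIMAX_BASE_URL = "https://api.minimax.io/v1"

def load_minimax_config(path="src/api_config.json"):
    """Read the minimax entry; let MINIMAX_API_KEY override the file (assumed behavior)."""
    with open(path) as f:
        cfg = json.load(f)["minimax"]
    cfg["api_key"] = os.environ.get("MINIMAX_API_KEY", cfg.get("api_key"))
    return cfg

def request_minimax(prompt, cfg):
    """Send one chat request to MiniMax via the OpenAI-compatible endpoint."""
    from openai import OpenAI  # reuses the existing openai SDK, no new deps
    client = OpenAI(base_url=cfg.get("base_url", MINIMAX_BASE_URL),
                    api_key=cfg["api_key"])
    resp = client.chat.completions.create(
        model=cfg["model"],
        messages=[{"role": "user", "content": prompt}],
    )
    return resp.choices[0].message.content
```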

Changes

| File | Change |
| --- | --- |
| src/gpt_request.py | Add `request_minimax()` and `request_minimax_token()` functions |
| src/api_config.json | Add `minimax` config entry (base_url, api_key, model) |
| src/agent.py | Register `"minimax"` in the API mapping and argparse choices |
| README.md | Mention MiniMax in the LLM API section |
| tests/test_minimax_provider.py | 16 unit tests + 3 integration tests |
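
Based on the summary, the `api_config.json` entry is presumably shaped roughly like the following. The exact model identifier string and any keys beyond `base_url`, `api_key`, and `model` are assumptions:

```json
{
  "minimax": {
    "base_url": "https://api.minimax.io/v1",
    "api_key": "YOUR_MINIMAX_API_KEY",
    "model": "MiniMax-M2.7"
  }
}
```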

Usage

```sh
# Set your MiniMax API key in api_config.json or via env var
export MINIMAX_API_KEY=your_key_here

# Run with MiniMax
sh run_agent_single.sh --API minimax --knowledge_point "Linear transformations and matrices"
```

Test plan

  • 16 unit tests pass (config, request functions, agent integration, retry/error handling)
  • 3 live integration tests pass against MiniMax API
  • Run full video generation pipeline with --API minimax
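
One plausible shape for a unit test in this suite is to stub the OpenAI-compatible client so no live MiniMax call is made. This is a hypothetical sketch; the actual tests in `tests/test_minimax_provider.py` may be structured differently, and `fake_request_minimax` stands in for the real request function:

```python
from unittest.mock import MagicMock

def fake_request_minimax(prompt, client):
    # Stand-in for request_minimax(): issue one chat request, return the text.
    resp = client.chat.completions.create(
        model="MiniMax-M2.7",  # model id taken from the PR summary
        messages=[{"role": "user", "content": prompt}],
    )
    return resp.choices[0].message.content

def test_request_returns_message_content():
    # Stubbed client: no network call, fixed response payload.
    client = MagicMock()
    client.chat.completions.create.return_value.choices = [
        MagicMock(message=MagicMock(content="hello"))
    ]
    assert fake_request_minimax("hi", client) == "hello"
    client.chat.completions.create.assert_called_once()
```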

Add MiniMax M2.7 as an alternative LLM provider for the Planner & Coder
agents via OpenAI-compatible API.

Changes:
- Add request_minimax() and request_minimax_token() functions in
  gpt_request.py using OpenAI SDK with MiniMax base URL
- Add minimax config entry in api_config.json
- Register minimax in agent.py API mapping and argparse choices
- Mention MiniMax in README.md LLM API section
- Add 16 unit tests and 3 integration tests
