feat: add MiniMax provider support for prompt enhancement#818

Open
octo-patch wants to merge 1 commit into zai-org:main from octo-patch:feature/add-minimax-provider

Conversation

@octo-patch

Summary

This PR adds MiniMax as an optional LLM provider for the prompt enhancement feature used in the demo scripts (convert_demo.py, gradio_web_demo.py, gradio_composite_demo/app.py).

What changed

  • Added _get_llm_client() helper that auto-selects the LLM backend from environment variables:
    • MINIMAX_API_KEY → uses MiniMax (MiniMax-M2.7 model, OpenAI-compatible API at https://api.minimax.io/v1)
    • OPENAI_API_KEY → existing behaviour (unchanged)
    • MINIMAX_API_KEY takes priority when both are set
    • MINIMAX_BASE_URL allows overriding the default MiniMax endpoint
  • Updated usage instructions in docstrings for all three demo files
  • Added unit tests (tests/test_minimax_provider.py) verifying provider selection logic
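The selection logic described above can be sketched roughly as follows. This is a minimal illustration, not the PR's actual code: `select_llm_backend` and its tuple return shape are hypothetical stand-ins; the real helper is `_get_llm_client()` and presumably constructs an OpenAI-compatible client object.

```python
import os

# Assumed default endpoint, taken from the PR description.
DEFAULT_MINIMAX_BASE_URL = "https://api.minimax.io/v1"
DEFAULT_OPENAI_BASE_URL = "https://api.openai.com/v1"


def select_llm_backend(env=None):
    """Return (provider, api_key, base_url) chosen from environment variables.

    Per the PR description, MINIMAX_API_KEY takes priority over
    OPENAI_API_KEY when both are set, and MINIMAX_BASE_URL overrides
    the default MiniMax endpoint.
    """
    if env is None:
        env = os.environ
    if env.get("MINIMAX_API_KEY"):
        base_url = env.get("MINIMAX_BASE_URL", DEFAULT_MINIMAX_BASE_URL)
        return ("minimax", env["MINIMAX_API_KEY"], base_url)
    if env.get("OPENAI_API_KEY"):
        base_url = env.get("OPENAI_BASE_URL", DEFAULT_OPENAI_BASE_URL)
        return ("openai", env["OPENAI_API_KEY"], base_url)
    raise RuntimeError(
        "Set MINIMAX_API_KEY or OPENAI_API_KEY to enable prompt enhancement"
    )
```

The returned tuple could then be fed into any OpenAI-compatible client constructor, which is what makes a single code path work for both providers.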

Usage

# Use MiniMax for prompt enhancement
MINIMAX_API_KEY=your_minimax_api_key python inference/gradio_web_demo.py

# Use OpenAI (existing behaviour, unchanged)
OPENAI_API_KEY=your_openai_api_key OPENAI_BASE_URL=https://api.openai.com/v1 python inference/gradio_web_demo.py

API References

- Add _get_llm_client() helper to auto-select LLM provider from env vars
- Support MINIMAX_API_KEY to use MiniMax (MiniMax-M2.7) as provider
- MINIMAX_API_KEY takes priority over OPENAI_API_KEY when both are set
- Default base URL: https://api.minimax.io/v1 (overridable via MINIMAX_BASE_URL)
- Update all three demo scripts: convert_demo.py, gradio_web_demo.py, gradio_composite_demo/app.py
- Add unit tests covering provider selection and fallback behaviour
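The unit tests mentioned above could look roughly like this. Purely illustrative: `pick_provider` is a hypothetical stand-in for the selection logic, and the test names do not come from `tests/test_minimax_provider.py` itself.

```python
import unittest


def pick_provider(env):
    # Stand-in mirroring the documented priority: MiniMax first, then OpenAI.
    if env.get("MINIMAX_API_KEY"):
        return "minimax"
    if env.get("OPENAI_API_KEY"):
        return "openai"
    return None


class TestProviderSelection(unittest.TestCase):
    def test_minimax_takes_priority_when_both_set(self):
        env = {"MINIMAX_API_KEY": "a", "OPENAI_API_KEY": "b"}
        self.assertEqual(pick_provider(env), "minimax")

    def test_openai_fallback_when_only_openai_set(self):
        self.assertEqual(pick_provider({"OPENAI_API_KEY": "b"}), "openai")

    def test_no_provider_when_neither_set(self):
        self.assertIsNone(pick_provider({}))
```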