March 7, 2025
We are proud to announce the MiniMax API v1 LLM update, which adds API access to the DeepSeek R1 and MiniMax-Text-01 models.
Up to five parallel requests are supported per individual chat.minimax.io account. If you need more parallel executions, please add additional chat.minimax.io accounts.
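To stay within that limit, you may want to throttle concurrency on the client side. Below is a minimal sketch using an asyncio semaphore; the base URL, auth header, and payload field names are assumptions for illustration, not part of the documented API.

```python
# A minimal sketch of client-side throttling that keeps at most five requests
# in flight, matching the per-account limit. The base URL, auth header, and
# payload field names are assumptions, not the documented API.
import asyncio
import httpx

API_BASE = "https://api.minimax.io"                  # hypothetical base URL
HEADERS = {"Authorization": "Bearer YOUR_API_KEY"}   # hypothetical auth scheme

async def chat(client: httpx.AsyncClient, limiter: asyncio.Semaphore, prompt: str) -> dict:
    # The semaphore guarantees no more than five concurrent requests.
    async with limiter:
        response = await client.post(
            f"{API_BASE}/minimax/llm",
            headers=HEADERS,
            json={"model": "MiniMax-Text-01", "message": prompt},  # illustrative fields
            timeout=120,
        )
        response.raise_for_status()
        return response.json()

async def main() -> None:
    limiter = asyncio.Semaphore(5)   # five parallel requests per account
    prompts = [f"Question {i}" for i in range(20)]
    async with httpx.AsyncClient() as client:
        results = await asyncio.gather(*(chat(client, limiter, p) for p in prompts))
    print(f"Received {len(results)} responses")

asyncio.run(main())
```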
Please visit 🚀 Chat with AI to try a fully-fledged chat demo that showcases the full LLM functionality available via our API, complete with full source code.
Feel free to chat with our 🤖 Ask AI support bot, which is powered by this very API.
The following models are currently supported:
| Model | Context length (tokens) | Notes |
|---|---|---|
| DeepSeek R1 | 64K | Reasoning model |
| MiniMax-Text-01 | 1M | Fast model with concise responses |
💡 Both models support file uploads for processing and can perform real-time web searches.
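Both features might be exercised in a single chat request along these lines. This is only a sketch: the multipart layout and the model, message, and web_search field names are assumptions; consult the endpoint reference for the exact names.

```python
# Sketch: attaching a file and enabling real-time web search in one chat
# request. The multipart layout and the "model", "message", and "web_search"
# field names are assumptions, not confirmed by the release notes.
import requests

API_BASE = "https://api.minimax.io"                  # hypothetical base URL
HEADERS = {"Authorization": "Bearer YOUR_API_KEY"}   # hypothetical auth scheme

with open("report.pdf", "rb") as f:
    response = requests.post(
        f"{API_BASE}/minimax/llm",
        headers=HEADERS,
        data={
            "model": "DeepSeek R1",   # 64K-context reasoning model
            "message": "Summarize this report and check for newer figures online.",
            "web_search": "true",     # hypothetical flag for real-time web search
        },
        files={"file": ("report.pdf", f, "application/pdf")},
        timeout=120,
    )

response.raise_for_status()
print(response.json())
```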
The following LLM endpoints have been added:
- Chat with LLM POST minimax/llm
- Retrieve the list of LLM chats GET minimax/llm
- Retrieve the messages from the LLM chat GET minimax/llm/chatID
- Delete LLM chats and/or messages DELETE minimax/llm
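A quick walkthrough of the four endpoints is sketched below. The base URL, auth header, and the shape of request bodies and responses are assumptions used for illustration only.

```python
# Sketch of the new LLM endpoints listed above. The base URL, auth header,
# and the request/response field names are assumptions for illustration.
import requests

API_BASE = "https://api.minimax.io"                  # hypothetical base URL
HEADERS = {"Authorization": "Bearer YOUR_API_KEY"}   # hypothetical auth scheme

# Chat with LLM: POST minimax/llm
chat = requests.post(
    f"{API_BASE}/minimax/llm",
    headers=HEADERS,
    json={"model": "MiniMax-Text-01", "message": "Hello!"},  # illustrative fields
    timeout=120,
)
chat.raise_for_status()
chat_id = chat.json().get("chatID")   # assumed response field

# Retrieve the list of LLM chats: GET minimax/llm
chats = requests.get(f"{API_BASE}/minimax/llm", headers=HEADERS, timeout=30)
print(chats.json())

# Retrieve the messages from the LLM chat: GET minimax/llm/chatID
messages = requests.get(f"{API_BASE}/minimax/llm/{chat_id}", headers=HEADERS, timeout=30)
print(messages.json())

# Delete LLM chats and/or messages: DELETE minimax/llm
requests.delete(
    f"{API_BASE}/minimax/llm",
    headers=HEADERS,
    json={"chatIDs": [chat_id]},      # assumed body shape
    timeout=30,
).raise_for_status()
```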