ERNIE-4.5-21B-A3B-Thinking is a text-based Mixture of Experts (MoE) post-trained model with 21B total parameters and 3B active parameters per token. It delivers improved performance on reasoning-heavy tasks such as logical reasoning, mathematics, science, and coding, as well as on text generation and academic benchmarks that typically require human expertise. The model also offers efficient tool use and supports a context window of up to 128K tokens for long-context understanding.
Input: Text
Output: Text
Providers
| Provider | Context | Max Output | Input | Output | Cache Read | Cache Write |
|----------|---------|------------|-------|--------|------------|-------------|
| novita | 128K tokens | 8K tokens | $0.070/1M tokens | $0.280/1M tokens | — | — |
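Per-request cost follows directly from the listed per-1M-token rates: tokens divided by one million, times the rate. A minimal sketch of that arithmetic (the example token counts are illustrative, not from the source):

```python
# Estimate request cost from the per-1M-token prices listed above.
INPUT_PRICE_PER_M = 0.070   # USD per 1M input tokens
OUTPUT_PRICE_PER_M = 0.280  # USD per 1M output tokens

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Approximate USD cost of a single request."""
    return (input_tokens / 1_000_000) * INPUT_PRICE_PER_M + \
           (output_tokens / 1_000_000) * OUTPUT_PRICE_PER_M

# Example: 10K input tokens + 2K output tokens ≈ $0.0013
print(f"${request_cost(10_000, 2_000):.4f}")
```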
Quick Start
Use Baidu Ernie 4.5 21B A3B Thinking through Helicone's AI Gateway with automatic logging and monitoring.
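The snippet below is a minimal sketch using the OpenAI Python SDK pointed at the gateway. The base URL and the model slug are assumptions following Helicone's usual AI Gateway conventions, so confirm both in your Helicone dashboard before use:

```python
import os
from openai import OpenAI

# Route requests through Helicone's AI Gateway so they are logged and
# monitored automatically. Base URL and model slug are assumed here;
# verify both against your Helicone dashboard.
client = OpenAI(
    base_url="https://ai-gateway.helicone.ai",  # assumed gateway endpoint
    api_key=os.environ["HELICONE_API_KEY"],
)

response = client.chat.completions.create(
    model="baidu/ernie-4.5-21b-a3b-thinking",  # assumed model slug
    messages=[
        {"role": "user", "content": "Prove that the sum of two odd numbers is even."}
    ],
    max_tokens=2048,  # well under the model's 8K output cap
)
print(response.choices[0].message.content)
```

Because the SDK only needs a different base URL and key, existing OpenAI-compatible code can switch to the gateway without other changes.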