Helicone AI Gateway - Now Available!

June 19, 2025

We're thrilled to announce the launch of Helicone AI Gateway - a powerful open-source solution for routing, caching, and managing your LLM traffic at scale.

🚀 What is Helicone AI Gateway?

The AI Gateway is a high-performance proxy that sits between your application and LLM providers, offering enterprise-grade features:

  • Smart Load Balancing: Distribute requests across multiple providers
  • Intelligent Caching: Reduce costs with semantic caching
  • Automatic Failover: Seamlessly switch providers during outages
  • Rate Limiting: Protect against abuse and control costs
  • Built-in Observability: Full integration with Helicone's analytics
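The automatic-failover behavior above can be pictured as a try-in-order loop over providers. The sketch below is purely illustrative of the concept, not the gateway's actual implementation (the gateway itself is written in Rust), and the provider names and `call_provider` function are hypothetical placeholders:

```python
# Illustrative sketch of automatic failover: try providers in priority order,
# moving to the next one on error. Hypothetical placeholders throughout;
# this is not the gateway's actual code.

PROVIDERS = ["openai", "anthropic", "azure"]  # hypothetical priority order

def call_provider(name: str, prompt: str) -> str:
    """Placeholder for a real provider call; here only 'anthropic' succeeds."""
    if name != "anthropic":
        raise ConnectionError(f"{name} is unavailable")
    return f"response from {name}"

def complete_with_failover(prompt: str) -> str:
    """Try each provider in order, raising only if every one fails."""
    last_error = None
    for name in PROVIDERS:
        try:
            return call_provider(name, prompt)
        except ConnectionError as err:
            last_error = err  # remember the failure and try the next provider
    raise RuntimeError("all providers failed") from last_error

result = complete_with_failover("Hello!")
```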

💻 Get Started

The AI Gateway is available as a separate open-source project:

GitHub Repository: github.com/helicone/ai-gateway

Quick start with Docker:

docker run -p 8080:8080 helicone/ai-gateway
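Once the container is running, your application sends its LLM requests to the gateway instead of directly to a provider. A minimal sketch, assuming the gateway listens on localhost:8080 and accepts an OpenAI-style chat completion payload (the exact route and schema should be confirmed in the repository README):

```python
import json

# Assumed local gateway address; the "/v1/chat/completions" path follows the
# common OpenAI-compatible convention -- confirm the exact route in the README.
GATEWAY_URL = "http://localhost:8080/v1/chat/completions"

def build_chat_request(model: str, prompt: str) -> tuple[str, dict]:
    """Return (url, payload) for an OpenAI-style chat completion via the gateway."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return GATEWAY_URL, payload

url, payload = build_chat_request("gpt-4o-mini", "Hello from the gateway!")
body = json.dumps(payload)  # POST this with any HTTP client
```

Any HTTP client can POST `body` to `url`; pointing an existing OpenAI SDK's base URL at the gateway is the usual pattern for a proxy like this.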

📚 Learn More

🔧 Key Features

  • Multi-Provider Support: OpenAI, Anthropic, Azure, and more
  • Request Routing: Route by model, cost, or custom rules
  • Security: API key management and request validation
  • Performance: Built in Rust for minimal latency overhead
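Routing "by model, cost, or custom rules" can be pictured as a lookup over the incoming request. The toy sketch below illustrates the idea only; the rule table, prefixes, and provider names are hypothetical and not the gateway's configuration format:

```python
# Toy sketch of model-based routing: map a model-name prefix to a provider.
# The rule table below is hypothetical, not the gateway's config format.

ROUTING_RULES = [
    ("gpt-", "openai"),
    ("claude-", "anthropic"),
]
DEFAULT_PROVIDER = "azure"  # hypothetical fallback

def route(model: str) -> str:
    """Pick a provider for a request by matching the model name's prefix."""
    for prefix, provider in ROUTING_RULES:
        if model.startswith(prefix):
            return provider
    return DEFAULT_PROVIDER

provider = route("claude-3-5-sonnet")
```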

Start using the AI Gateway today to take control of your LLM infrastructure!