Introducing Open WebUI Integration
April 22, 2025
Starting today, you can monitor your Open WebUI LLM interactions in Helicone!
With the integration, you can:
- Monitor interactions across Ollama, OpenAI-compatible APIs, and custom LLM setups.
- Get a single, consolidated view of requests across all model types.
- Visualize and replay requests to see prompts and outputs for evaluation.
- Track local LLM performance, including response times and throughput.
- Analyze usage patterns by model in your Open WebUI setup.
We’ve just published a comprehensive guide on how to integrate Helicone with Open WebUI (formerly Ollama WebUI). It walks you through gaining full observability across all your LLM interactions, whether you’re using local Ollama models or cloud LLM APIs.
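As a quick illustration of the pattern, here is a minimal sketch of how any OpenAI-compatible client can route through Helicone's proxy by swapping the base URL and attaching a `Helicone-Auth` header. Open WebUI speaks the same OpenAI-compatible API, so the same idea applies to its connection settings; the model name and environment variable names below are placeholders, and the guide covers the exact Open WebUI configuration steps, including Ollama.

```python
import os

from openai import OpenAI

# Point an OpenAI-compatible client at Helicone's proxy instead of
# calling the upstream API directly. Requests then show up in the
# Helicone dashboard for monitoring and replay.
client = OpenAI(
    base_url="https://oai.helicone.ai/v1",
    api_key=os.environ["OPENAI_API_KEY"],
    default_headers={
        # Helicone authenticates the proxy hop with its own API key.
        "Helicone-Auth": f"Bearer {os.environ['HELICONE_API_KEY']}",
    },
)

# An example request; "gpt-4o-mini" is just a placeholder model name.
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello from Open WebUI!"}],
)
print(response.choices[0].message.content)
```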
Get started: Open WebUI x Helicone docs.