Open Source Observability & Analytics for LLM Apps

Detailed production traces and a granular view on quality, cost and latency

Teams building complex LLM apps rely on Langfuse

Fastgen · Langdock · Alphawatch · Berry · Nucleus

Observability

Debug faster

LLM applications are increasingly complex; Langfuse helps you trace and debug them. Understand how changes to one step impact overall application performance.

Made for agents & LLM chains.
Trace unlimited nested actions and get a detailed view of the entire request.
Exact cost calculation.
Tokenizes prompts and completions of popular models to measure the exact cost of each step of the LLM chain.
Track non-LLM actions.
Database queries, API calls, and other actions that lead to the response can be tracked for optimal visibility into issues.
Open & integrated.
Works with all models and configurations. Native integrations with popular frameworks and libraries.
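The cost bullet above boils down to a tokens-times-price computation summed over the steps of a chain. A minimal sketch, using hypothetical per-token prices (not Langfuse's actual pricing table) and a made-up helper name:

```python
# Illustrative per-1k-token prices in USD (prompt, completion).
# These numbers are placeholders, not Langfuse's pricing data.
PRICES_PER_1K = {
    "gpt-3.5-turbo": (0.0015, 0.002),
    "gpt-4": (0.03, 0.06),
}

def step_cost(model, prompt_tokens, completion_tokens):
    """Cost of one LLM-chain step, given token counts from the tokenizer."""
    prompt_price, completion_price = PRICES_PER_1K[model]
    return (prompt_tokens / 1000) * prompt_price \
        + (completion_tokens / 1000) * completion_price

# The cost of a whole chain is the sum over its steps:
steps = [("gpt-3.5-turbo", 1200, 300), ("gpt-4", 800, 150)]
total = sum(step_cost(m, p, c) for m, p, c in steps)
```

Because every step is tokenized individually, the breakdown shows which part of the chain drives the bill, not just the total.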

Analytics

Prebuilt dashboards

Based on the ingested data, Langfuse provides prebuilt dashboards that surface the most important metrics and make them accessible to the whole team.

Cost.
Track token usage and cost of your LLM application. Drill-down by users, models, or application features.
Quality.
Add scores to each trace. Scores can come from model-based evaluation, user feedback, or manual labeling in the Langfuse UI.
Latency.
Monitor and improve latency by getting breakdowns of the added latency of each step of the LLM chain.
Connected to traces.
All analytics are connected to traces, so you can easily find the root cause of any issue.
Public API.
All data is also accessible via the public API to build your own custom features and dashboards on top of Langfuse.
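Conceptually, a score is a small record attached to a trace by id, whatever its origin (model-based evaluation, user feedback, or manual labeling). The sketch below illustrates that concept; the field names and payload shape are assumptions for illustration, not Langfuse's exact API schema:

```python
import json

def make_score(trace_id, name, value, comment=None):
    """Build a score record for a trace.

    Field names here are assumptions illustrating the concept,
    not Langfuse's exact score schema.
    """
    score = {"traceId": trace_id, "name": name, "value": value}
    if comment is not None:
        score["comment"] = comment
    return score

# Two scores for the same trace: user feedback and a model-based eval.
feedback = make_score("trace-123", "user-feedback", 1.0, "thumbs up")
evaluation = make_score("trace-123", "hallucination", 0.0)
payload = json.dumps([feedback, evaluation])
```

Because both scores reference the same trace id, either one can be followed back to the full trace when hunting for a root cause.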

Integrations

Works with any LLM app

Typed SDKs for Python & JS/TS. Native integrations for popular frameworks and libraries. Missing an integration? Let us know on Discord!

SDKs for Python & JS/TS
Typed SDKs that capture trace data and send it to Langfuse fully asynchronously. You have full control over what is sent to Langfuse. Learn more →
🦜🔗 Langchain integration
Using Langchain? Get full execution traces in 5 minutes by adding the Langfuse Callback Handler to your app. Works for Python and JS projects. Learn more →
Web SDK
Capture user feedback and other quality scores right from the frontend using the Langfuse Web SDK. Learn more →
OpenAI
If you use the OpenAI SDK, use the Langfuse drop-in replacement to get full trace data by just changing the import. Learn more →
API
Need more control? Use the Langfuse API to ingest traces and scores and build your own custom integrations. Learn more →
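For a custom integration, ingesting a trace comes down to an authenticated POST with a JSON batch. A minimal stdlib sketch: the endpoint path, auth scheme, and payload shape below are assumptions for illustration (check the API reference before relying on them), and the request is only constructed here, not sent:

```python
import base64
import json
import urllib.request

# Assumed host and placeholder project API keys.
host = "https://cloud.langfuse.com"
public_key, secret_key = "pk-example", "sk-example"
auth = base64.b64encode(f"{public_key}:{secret_key}".encode()).decode()

# Assumed batch payload shape: one trace-create event.
body = json.dumps({
    "batch": [
        {"type": "trace-create", "body": {"name": "my-trace", "userId": "user-1"}},
    ]
}).encode()

req = urllib.request.Request(
    f"{host}/api/public/ingestion",  # assumed endpoint path
    data=body,
    method="POST",
    headers={
        "Authorization": f"Basic {auth}",
        "Content-Type": "application/json",
    },
)
# urllib.request.urlopen(req) would actually send it.
```

Batching events in one request keeps ingestion cheap even for chatty applications.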
Other open-source projects
Dedicated integrations with Flowise, Langflow, and LiteLLM. Learn more →

Proudly Open Source

We are committed to open source; Langfuse is easy to run locally or self-host.

Pricing

Simple pricing for projects of all sizes

All plans include (fair use)
unlimited projects, unlimited users and unlimited throughput

Hobby

Get started, no credit card required. Great for hobby projects and POCs.

Free

Sign up
  • Unlimited projects, events, and throughput (fair use)
  • 100k observations / month
  • Access last 30 days
  • Basic support
  • Select data region (US or EU)

Pro

For serious projects. Includes access to full history, data governance and support.

Starts at $29/month

Start free trial
  • Unlimited projects, events, and throughput (fair use)
  • 100k observations / month included, additional: $10 / 100k observations
  • Unlimited history
  • Dedicated support channels (Slack or Discord)
  • Custom data retention policies

Team

Dedicated solutions and support for your team. Contact us to learn more.

Starts at $199/month

Talk to founders
  • All Pro features
  • SSO enforcement
  • White-glove onboarding support
  • Single-tenant instances
  • Support SLAs
  • Compliance and security reviews
  • Custom domains, advanced RBAC, and more (soon)