
Langfuse GitHub Banner

Langfuse: Open Source LLM Engineering Platform

LLM Observability, Prompt Management, LLM Evaluations,
Datasets, LLM Metrics, and Prompt Playground

Langfuse uses GitHub Discussions for Support and Feature Requests.
We're hiring. Join us in Product Engineering and Developer Relations.


Langfuse Overview

Langfuse Overview Video

Develop

Monitor

Test

  • Experiments: Track and test app behaviour before deploying a new version

Get started

Langfuse Cloud

Managed deployment by the Langfuse team, generous free tier (hobby plan), no credit card required.

» Langfuse Cloud

Self-Hosting Open Source LLM Observability with Langfuse

Localhost (docker)

# Clone repository
git clone https://github.com/langfuse/langfuse.git
cd langfuse

# Run server and database
docker compose up -d

→ Learn more about deploying locally

Self-host (docker)

Langfuse is simple to self-host and keep updated. It currently requires only a single Docker container and a Postgres database. → Self Hosting Instructions

Templated deployments: Railway, GCP, AWS, Azure, Kubernetes and others

Get Started

API Keys

You need a Langfuse public and secret key to get started. Sign up here and find them in your project settings.
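For illustration, here is a minimal sketch of initializing the Python SDK with these keys (the key values and host below are placeholders; the SDK also reads the LANGFUSE_PUBLIC_KEY, LANGFUSE_SECRET_KEY, and LANGFUSE_HOST environment variables):

```python
from langfuse import Langfuse

# Placeholders: substitute the keys from your project settings.
langfuse = Langfuse(
    public_key="pk-lf-...",
    secret_key="sk-lf-...",
    host="https://cloud.langfuse.com",  # or the URL of your self-hosted instance
)

# Optional sanity check that the credentials and host are valid.
langfuse.auth_check()
```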

Ingesting Data · Instrumenting Your Application · LLM Observability with Langfuse

Note: We recommend using our fully async, typed SDKs that allow you to instrument any LLM application with any underlying model. They are available in Python (Decorators) & JS/TS. The SDKs will always be the most fully featured and stable way to ingest data into Langfuse.

See the → Quickstart to integrate Langfuse.
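As a rough sketch of what decorator-based instrumentation looks like with the Python SDK (v2-style API; `summarize` and `handle_request` are hypothetical functions, and credentials are assumed to be set via environment variables):

```python
from langfuse.decorators import observe, langfuse_context

@observe()  # each call to the decorated function is recorded; nested calls become spans
def summarize(text: str) -> str:
    # Call your LLM of choice here; the return value is captured as the output.
    return text[:100]

@observe()
def handle_request(user_input: str) -> str:
    # Optionally attach metadata to the current trace.
    langfuse_context.update_current_trace(user_id="user-123")
    return summarize(user_input)

handle_request("Explain LLM observability in one sentence.")
langfuse_context.flush()  # make sure queued events are sent before exiting
```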

LLM Observability Integrations

| Integration | Supports | Description |
| --- | --- | --- |
| SDK | Python, JS/TS | Manual instrumentation using the SDKs for full flexibility. |
| OpenAI | Python, JS/TS | Automated instrumentation using drop-in replacement of OpenAI SDK. |
| Langchain | Python, JS/TS | Automated instrumentation by passing callback handler to Langchain application. |
| LlamaIndex | Python | Automated instrumentation via LlamaIndex callback system. |
| Haystack | Python | Automated instrumentation via Haystack content tracing system. |
| LiteLLM | Python, JS/TS (proxy only) | Use any LLM as a drop-in replacement for GPT. Use Azure, OpenAI, Cohere, Anthropic, Ollama, VLLM, Sagemaker, HuggingFace, Replicate (100+ LLMs). |
| Vercel AI SDK | JS/TS | TypeScript toolkit designed to help developers build AI-powered applications with React, Next.js, Vue, Svelte, Node.js. |
| API | | Directly call the public API. OpenAPI spec available. |
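For example, the OpenAI integration is a drop-in import swap. A hedged sketch in Python (assumes `OPENAI_API_KEY` and the `LANGFUSE_*` environment variables are set; the model name is just a placeholder):

```python
# Import the OpenAI module through Langfuse instead of `import openai`;
# chat completion calls are then traced automatically.
from langfuse.openai import openai

completion = openai.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[{"role": "user", "content": "Hello, Langfuse!"}],
)
print(completion.choices[0].message.content)
```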

Packages integrated with Langfuse:

| Name | Description |
| --- | --- |
| Instructor | Library to get structured LLM outputs (JSON, Pydantic). |
| Dify | Open source LLM app development platform with no-code builder. |
| Ollama | Easily run open source LLMs on your own machine. |
| Mirascope | Python toolkit for building LLM applications. |
| Flowise | JS/TS no-code builder for customized LLM flows. |
| Langflow | Python-based UI for LangChain, designed with react-flow to provide an effortless way to experiment and prototype flows. |

Questions and feedback

Ideas and roadmap

Support and feedback

In order of preference, the best ways to communicate with us:

Contributing to Langfuse

  • Vote on Ideas
  • Raise and comment on Issues
  • Open a PR - see CONTRIBUTING.md for details on how to set up a development environment.

License

This repository is MIT licensed, except for the ee folders. See LICENSE and docs for more details.

Misc

GET API to export your data

GET routes to use data in downstream applications (e.g. embedded analytics). You can also access them conveniently via the SDKs (docs).
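As an illustrative sketch (not an exhaustive client), the endpoints can be called with plain HTTP using Basic auth, where the public key is the username and the secret key is the password; the exact routes and parameters are documented in the OpenAPI spec:

```python
import requests

LANGFUSE_HOST = "https://cloud.langfuse.com"  # or your self-hosted URL

# Basic auth: public key as username, secret key as password (placeholders below).
response = requests.get(
    f"{LANGFUSE_HOST}/api/public/traces",
    auth=("pk-lf-...", "sk-lf-..."),
    params={"page": 1, "limit": 50},
)
response.raise_for_status()

for trace in response.json().get("data", []):
    print(trace["id"], trace.get("name"))
```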

Security & Privacy

We take data security and privacy seriously. Please refer to our Security and Privacy page for more information.

Telemetry

By default, Langfuse automatically reports basic usage statistics of self-hosted instances to a centralized server (PostHog).

This helps us to:

  1. Understand how Langfuse is used and improve the most relevant features.
  2. Track overall usage for internal and external (e.g. fundraising) reporting.

None of the data is shared with third parties, and it does not include any sensitive information. We want to be super transparent about this, and you can find the exact data we collect here.

You can opt out by setting TELEMETRY_ENABLED=false.

Star History

Star History Chart

Open Source Projects Using Langfuse

Top open-source Python projects that use Langfuse, ranked by stars (Source):

| Repository | Stars |
| --- | --- |
| langgenius / dify | 54865 |
| open-webui / open-webui | 51531 |
| lobehub / lobe-chat | 49003 |
| langflow-ai / langflow | 39093 |
| run-llama / llama_index | 37368 |
| chatchat-space / Langchain-Chatchat | 32486 |
| FlowiseAI / Flowise | 32448 |
| mindsdb / mindsdb | 26931 |
| twentyhq / twenty | 24195 |
| PostHog / posthog | 22618 |
| BerriAI / litellm | 15151 |
| mediar-ai / screenpipe | 11037 |
| formbricks / formbricks | 9386 |
| anthropics / courses | 8385 |
| GreyDGL / PentestGPT | 7374 |
| superagent-ai / superagent | 5391 |
| promptfoo / promptfoo | 4976 |
| onlook-dev / onlook | 4141 |
| Canner / WrenAI | 2526 |
| pingcap / autoflow | 2061 |
| MLSysOps / MLE-agent | 1161 |
| open-webui / pipelines | 1100 |
| alishobeiri / thread | 1074 |
| topoteretes / cognee | 971 |
| bRAGAI / bRAG-langchain | 823 |
| opslane / opslane | 677 |
| dynamiq-ai / dynamiq | 639 |
| theopenconversationkit / tock | 514 |
| andysingal / llm-course | 394 |
| phospho-app / phospho | 384 |
| sentient-engineering / agent-q | 370 |
| sql-agi / DB-GPT | 324 |
| PostHog / posthog-foss | 305 |
| vespperhq / vespper | 304 |
| block / goose | 295 |
| aorwall / moatless-tools | 291 |
| dmayboroda / minima | 221 |
| RobotecAI / rai | 172 |
| i-am-alice / 3rd-devs | 148 |
| 8090-inc / xrx-sample-apps | 138 |
| babelcloud / LLM-RGB | 135 |
| souzatharsis / tamingLLMs | 129 |
| LibreChat-AI / librechat.ai | 128 |
| deepset-ai / haystack-core-integrations | 126 |