Deliver High-Quality AI, Fast

Building AI products is all about iteration.
MLflow lets you move 10x faster by simplifying how you
debug, test, and evaluate your LLM applications, agents, and models.

Observability

Capture complete traces of your LLM applications and agents to get deep insights into their behavior. Built on OpenTelemetry, MLflow tracing supports any LLM provider and agent framework.

Observability screenshot

Evaluation

Run systematic evaluations, track quality metrics over time, and catch regressions before they reach production. Choose from 50+ built-in metrics and LLM judges, or define your own with highly flexible APIs.

Evaluation screenshot

Prompt Management & Optimization

Version, test, and deploy prompts with full lineage tracking. Automatically optimize prompts with state-of-the-art algorithms to improve performance.

Prompt Management & Optimization screenshot

AI Gateway

A single API gateway for all LLM providers. Route requests, manage rate limits, handle fallbacks, and control costs through one OpenAI-compatible interface.

AI Gateway screenshot
Most Adopted Open-Source MLOps Platform
Backed by the Linux Foundation, MLflow has been fully committed to open source for 5+ years, and is now trusted by thousands of organizations and research teams worldwide.
mlflow/mlflow: 23K GitHub stars
30 million+ package downloads per month
Works With Any LLM Framework
From LLM agent frameworks to traditional ML libraries, MLflow integrates seamlessly with 100+ tools across the AI ecosystem. It supports Python, TypeScript/JavaScript, Java, and R, and integrates natively with OpenTelemetry.
Why Teams Choose MLflow
Focus on building great AI, not managing infrastructure. MLflow handles the complexity so you can ship faster.

Open Source

100% open source under Apache 2.0 license. Forever free, no strings attached.

No Vendor Lock-in

Works with any cloud, framework, or tool you use. Switch vendors anytime.

Production Ready

Battle-tested at scale by Fortune 500 companies and thousands of teams.

Full Visibility

Complete tracking and observability for all your AI applications and agents.

Community

23K+ GitHub stars, 900+ contributors. Join the fastest-growing MLOps community.

Integrations

Works out of the box with LangChain, OpenAI, PyTorch, and 100+ AI frameworks.

Get Started in 3 Simple Steps
From zero to full-stack LLMOps in minutes. No complex setup or major code changes required.
1

Start MLflow Server

One command to get started. Docker setup is also available.

bash
uvx mlflow server
~30 seconds
2

Enable Logging

Add minimal code to start capturing traces, metrics, and parameters.

python
import mlflow
mlflow.set_tracking_uri("http://localhost:5000")
mlflow.openai.autolog()
~30 seconds
3

Run your code

Run your ML/LLM code as usual. MLflow logs traces automatically, and you can explore them in the MLflow UI.

python
from openai import OpenAI

client = OpenAI()
client.responses.create(
    model="gpt-5-mini",
    input="Hello!",
)
~1 minute
Frequently Asked Questions
Everything you need to know about MLflow. Can't find what you're looking for? Join our community.

Is MLflow free for commercial use?

Yes! MLflow is 100% open source under the Apache 2.0 license. You can use it for any purpose, including commercial applications, without licensing fees. The project is backed by the Linux Foundation, ensuring it remains open and community-driven.

GET INVOLVED
Connect with the open source community
Join millions of MLflow users