Looking to learn about significant new releases in MLflow?
Find details of major features, changes, and deprecations below.
MLflow Docs Overhaul
The MLflow docs are getting a facelift with added content, tutorials, and guides. Stay tuned for further improvements to the site!
Spark Connect support
You can now log, save, and load models trained using Spark Connect. Try out Spark 3.5 and the MLflow integration today!
Params support for PyFunc Models
PyFunc models now support passing parameters at inference time. With this new feature, you can define the allowable keys, with default values, for any parameters that you would like consumers of your model to be able to override. This is particularly useful for LLMs, where you might want to let users adjust commonly modified parameters for a model, such as token counts and temperature.
MLflow Serving support added to MLflow AI Gateway
The MLflow AI Gateway now supports defining an MLflow serving endpoint as a provider. With this new feature, you can serve any open-source transformers model that conforms to the completions or embeddings route type definitions.
Try it out today with our end-to-end example.
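A gateway route backed by a local MLflow serving endpoint might be configured along these lines. This is an illustrative sketch only: the route name, model name, and URL are hypothetical, and the exact provider key and config field names are assumptions that should be checked against the gateway configuration reference for your MLflow version.

```yaml
routes:
  - name: my-completions-route          # hypothetical route name
    route_type: llm/v1/completions
    model:
      provider: mlflow-model-serving    # assumed provider key
      name: my-transformers-model       # hypothetical served model name
      config:
        model_server_url: http://127.0.0.1:5000  # local MLflow serving endpoint
```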
Introducing the MLflow AI Gateway
We're excited to announce the newest top-level component in the MLflow ecosystem: The AI Gateway.
With this new feature, you can create a single access point to many of the most popular LLM SaaS services available now, simplifying interfaces, managing credentials, and providing a unified standard set of APIs to reduce the complexity of building products and services around LLMs.
MLflow Evaluate now supports LLMs
You can now use mlflow.evaluate to compare results from your favorite LLMs on a fixed prompt.
With many of the standard LLM evaluation metrics built directly into the API, the supported LLM modeling tasks of text summarization, text classification, question answering, and text generation let you view the results of submitted text across multiple models in a single UI element.