mlflow.openai

The mlflow.openai module provides an API for logging and loading OpenAI models.
Credential management for OpenAI on Databricks

When this flavor logs a model on Databricks, it saves a YAML file with the following contents as openai.yaml if the MLFLOW_OPENAI_SECRET_SCOPE environment variable is set:

```yaml
OPENAI_API_BASE: {scope}:openai_api_base
OPENAI_API_KEY: {scope}:openai_api_key
OPENAI_API_KEY_PATH: {scope}:openai_api_key_path
OPENAI_API_TYPE: {scope}:openai_api_type
OPENAI_ORGANIZATION: {scope}:openai_organization
```

- {scope} is the value of the MLFLOW_OPENAI_SECRET_SCOPE environment variable.
- The keys are the environment variables that the openai-python package uses to configure the API client.
- The values are references to the secrets that store the values of those environment variables.

When the logged model is served on Databricks, each secret is resolved and set as the corresponding environment variable. See https://docs.databricks.com/security/secrets/index.html for how to set up secrets on Databricks.
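For example, a minimal sketch of opting into this behavior when logging from Databricks — the scope name "openai" is hypothetical, and the scope and its secrets must already exist:

```python
import os

import mlflow
import openai

# Hypothetical secret scope; create it and its secrets (openai_api_key, etc.)
# with the Databricks secrets API/CLI beforehand.
os.environ["MLFLOW_OPENAI_SECRET_SCOPE"] = "openai"

# With the variable set, logging also writes openai.yaml containing the
# secret references shown above.
with mlflow.start_run():
    mlflow.openai.log_model(
        model="gpt-3.5-turbo",
        task=openai.ChatCompletion,
        messages=[{"role": "user", "content": "Hello!"}],
        artifact_path="model",
    )
```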
- mlflow.openai.get_default_conda_env() [source]

  Note: Experimental: This function may change or be removed in a future release without warning.

  Returns:
    The default Conda environment for MLflow Models produced by calls to save_model() and log_model().
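As a usage sketch, the returned environment can be inspected directly:

```python
import mlflow.openai

# Print the default conda environment dict this flavor attaches to models.
print(mlflow.openai.get_default_conda_env())
```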
- mlflow.openai.get_default_pip_requirements() [source]

  Note: Experimental: This function may change or be removed in a future release without warning.

  Returns:
    A list of default pip requirements for MLflow Models produced by this flavor. Calls to save_model() and log_model() produce a pip environment that, at minimum, contains these requirements.
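Likewise, a minimal sketch for listing the baseline requirements, e.g. before extending them via extra_pip_requirements:

```python
import mlflow.openai

# Baseline requirement strings (e.g. a pinned openai package version).
for requirement in mlflow.openai.get_default_pip_requirements():
    print(requirement)
```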
- mlflow.openai.load_model(model_uri, dst_path=None) [source]

  Note: Experimental: This function may change or be removed in a future release without warning.

  Load an OpenAI model from a local file or a run.

  Parameters:
    - model_uri – The location, in URI format, of the MLflow model. For example:
      - /Users/me/path/to/local/model
      - relative/path/to/local/model
      - s3://my_bucket/path/to/model
      - runs:/<mlflow_run_id>/run-relative/path/to/model

      For more information about supported URI schemes, see Referencing Artifacts.
    - dst_path – The local filesystem path to which to download the model artifact. This directory must already exist. If unspecified, a local output path will be created.

  Returns:
    A dictionary representing the OpenAI model.
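For instance, a logged model can be loaded back by run URI; the run ID below is a placeholder:

```python
import mlflow.openai

# Substitute a real run ID and the artifact_path used at logging time.
model = mlflow.openai.load_model("runs:/<mlflow_run_id>/model")
print(model)  # a dict describing the logged OpenAI model
```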
- mlflow.openai.log_model(model, task, artifact_path, conda_env=None, code_paths=None, registered_model_name=None, signature: mlflow.models.signature.ModelSignature = None, input_example: Union[pandas.core.frame.DataFrame, numpy.ndarray, dict, list, csr_matrix, csc_matrix, str, bytes] = None, await_registration_for=300, pip_requirements=None, extra_pip_requirements=None, metadata=None, **kwargs) [source]

  Note: Experimental: This function may change or be removed in a future release without warning.

  Log an OpenAI model as an MLflow artifact for the current run.

  Parameters:
    - model – The OpenAI model name or reference instance, e.g., openai.Model.retrieve("gpt-3.5-turbo").
    - task – The task the model is performing, e.g., openai.ChatCompletion or 'chat.completions'.
    - artifact_path – Run-relative artifact path.
    - conda_env – Either a dictionary representation of a Conda environment or the path to a conda environment yaml file. If provided, this describes the environment this model should be run in. At minimum, it should specify the dependencies contained in get_default_conda_env(). If None, a conda environment with pip requirements inferred by mlflow.models.infer_pip_requirements() is added to the model. If the requirement inference fails, it falls back to using get_default_pip_requirements(). pip requirements from conda_env are written to a pip requirements.txt file and the full conda environment is written to conda.yaml. The following is an example dictionary representation of a conda environment:

      ```python
      {
          "name": "mlflow-env",
          "channels": ["conda-forge"],
          "dependencies": [
              "python=3.8.15",
              {
                  "pip": ["openai==x.y.z"],
              },
          ],
      }
      ```

    - code_paths – A list of local filesystem paths to Python file dependencies (or directories containing file dependencies). These files are prepended to the system path when the model is loaded.
    - registered_model_name – If given, create a model version under registered_model_name, also creating a registered model if one with the given name does not exist.
    - signature – ModelSignature describes model input and output Schema. The model signature can be inferred from datasets with valid model input (e.g. the training dataset with target column omitted) and valid model output (e.g. model predictions generated on the training dataset), for example:

      ```python
      from mlflow.models import infer_signature

      train = df.drop_column("target_label")
      predictions = ...  # compute model predictions
      signature = infer_signature(train, predictions)
      ```

    - input_example – Input example provides one or several instances of valid model input. The example can be used as a hint of what data to feed the model. The given example will be converted to a Pandas DataFrame and then serialized to json using the Pandas split-oriented format. Bytes are base64-encoded.
    - await_registration_for – Number of seconds to wait for the model version to finish being created and reach READY status. By default, the function waits for five minutes. Specify 0 or None to skip waiting.
    - pip_requirements – Either an iterable of pip requirement strings (e.g. ["openai", "-r requirements.txt", "-c constraints.txt"]) or the string path to a pip requirements file on the local filesystem (e.g. "requirements.txt"). If provided, this describes the environment this model should be run in. If None, a default list of requirements is inferred by mlflow.models.infer_pip_requirements() from the current software environment. If the requirement inference fails, it falls back to using get_default_pip_requirements(). Both requirements and constraints are automatically parsed and written to requirements.txt and constraints.txt files, respectively, and stored as part of the model. Requirements are also written to the pip section of the model's conda environment (conda.yaml) file.
    - extra_pip_requirements – Either an iterable of pip requirement strings (e.g. ["pandas", "-r requirements.txt", "-c constraints.txt"]) or the string path to a pip requirements file on the local filesystem (e.g. "requirements.txt"). If provided, this describes additional pip requirements that are appended to a default set of pip requirements generated automatically based on the user's current software environment. Both requirements and constraints are automatically parsed and written to requirements.txt and constraints.txt files, respectively, and stored as part of the model. Requirements are also written to the pip section of the model's conda environment (conda.yaml) file.

      Warning: The following arguments can't be specified at the same time: conda_env, pip_requirements, extra_pip_requirements. This example demonstrates how to specify pip requirements using pip_requirements and extra_pip_requirements.
    - metadata – Custom metadata dictionary passed to the model and stored in the MLmodel file.

      Note: Experimental: This parameter may change or be removed in a future release without warning.
    - kwargs – Keyword arguments specific to the OpenAI task, such as the messages (see Supported messages formats for OpenAI chat completion task for more details on this parameter) or top_p value to use for chat completion.

  Returns:
    A ModelInfo instance that contains the metadata of the logged model.

  ```python
  import mlflow
  import openai
  import pandas as pd

  # Chat
  with mlflow.start_run():
      info = mlflow.openai.log_model(
          model="gpt-3.5-turbo",
          task=openai.ChatCompletion,
          messages=[{"role": "user", "content": "Tell me a joke about {animal}."}],
          artifact_path="model",
      )
      model = mlflow.pyfunc.load_model(info.model_uri)
      df = pd.DataFrame({"animal": ["cats", "dogs"]})
      print(model.predict(df))

  # Embeddings
  with mlflow.start_run():
      info = mlflow.openai.log_model(
          model="text-embedding-ada-002",
          task=openai.Embedding,
          artifact_path="embeddings",
      )
      model = mlflow.pyfunc.load_model(info.model_uri)
      print(model.predict(["hello", "world"]))
  ```
- mlflow.openai.save_model(model, task, path, conda_env=None, code_paths=None, mlflow_model=None, signature: mlflow.models.signature.ModelSignature = None, input_example: Union[pandas.core.frame.DataFrame, numpy.ndarray, dict, list, csr_matrix, csc_matrix, str, bytes] = None, pip_requirements=None, extra_pip_requirements=None, metadata=None, **kwargs) [source]

  Note: Experimental: This function may change or be removed in a future release without warning.

  Save an OpenAI model to a path on the local file system.

  Parameters:
    - model – The OpenAI model name or reference instance, e.g., openai.Model.retrieve("gpt-3.5-turbo").
    - task – The task the model is performing, e.g., openai.ChatCompletion or 'chat.completions'.
    - path – Local path where the model is to be saved.
    - conda_env – Either a dictionary representation of a Conda environment or the path to a conda environment yaml file. If provided, this describes the environment this model should be run in. At minimum, it should specify the dependencies contained in get_default_conda_env(). If None, a conda environment with pip requirements inferred by mlflow.models.infer_pip_requirements() is added to the model. If the requirement inference fails, it falls back to using get_default_pip_requirements(). pip requirements from conda_env are written to a pip requirements.txt file and the full conda environment is written to conda.yaml. The following is an example dictionary representation of a conda environment:

      ```python
      {
          "name": "mlflow-env",
          "channels": ["conda-forge"],
          "dependencies": [
              "python=3.8.15",
              {
                  "pip": ["openai==x.y.z"],
              },
          ],
      }
      ```

    - code_paths – A list of local filesystem paths to Python file dependencies (or directories containing file dependencies). These files are prepended to the system path when the model is loaded.
    - mlflow_model – mlflow.models.Model this flavor is being added to.
    - signature – ModelSignature describes model input and output Schema. The model signature can be inferred from datasets with valid model input (e.g. the training dataset with target column omitted) and valid model output (e.g. model predictions generated on the training dataset), for example:

      ```python
      from mlflow.models import infer_signature

      train = df.drop_column("target_label")
      predictions = ...  # compute model predictions
      signature = infer_signature(train, predictions)
      ```

    - input_example – Input example provides one or several instances of valid model input. The example can be used as a hint of what data to feed the model. The given example will be converted to a Pandas DataFrame and then serialized to json using the Pandas split-oriented format. Bytes are base64-encoded.
    - pip_requirements – Either an iterable of pip requirement strings (e.g. ["openai", "-r requirements.txt", "-c constraints.txt"]) or the string path to a pip requirements file on the local filesystem (e.g. "requirements.txt"). If provided, this describes the environment this model should be run in. If None, a default list of requirements is inferred by mlflow.models.infer_pip_requirements() from the current software environment. If the requirement inference fails, it falls back to using get_default_pip_requirements(). Both requirements and constraints are automatically parsed and written to requirements.txt and constraints.txt files, respectively, and stored as part of the model. Requirements are also written to the pip section of the model's conda environment (conda.yaml) file.
    - extra_pip_requirements – Either an iterable of pip requirement strings (e.g. ["pandas", "-r requirements.txt", "-c constraints.txt"]) or the string path to a pip requirements file on the local filesystem (e.g. "requirements.txt"). If provided, this describes additional pip requirements that are appended to a default set of pip requirements generated automatically based on the user's current software environment. Both requirements and constraints are automatically parsed and written to requirements.txt and constraints.txt files, respectively, and stored as part of the model. Requirements are also written to the pip section of the model's conda environment (conda.yaml) file.

      Warning: The following arguments can't be specified at the same time: conda_env, pip_requirements, extra_pip_requirements. This example demonstrates how to specify pip requirements using pip_requirements and extra_pip_requirements.
    - metadata – Custom metadata dictionary passed to the model and stored in the MLmodel file.

      Note: Experimental: This parameter may change or be removed in a future release without warning.
    - kwargs – Keyword arguments specific to the OpenAI task, such as the messages (see Supported messages formats for OpenAI chat completion task for more details on this parameter) or top_p value to use for chat completion.

  ```python
  import mlflow
  import openai

  # Chat
  mlflow.openai.save_model(
      model="gpt-3.5-turbo",
      task=openai.ChatCompletion,
      messages=[{"role": "user", "content": "Tell me a joke."}],
      path="model",
  )

  # Embeddings: saved to a distinct path to avoid colliding with the
  # chat model already saved to "model" above.
  mlflow.openai.save_model(
      model="text-embedding-ada-002",
      task=openai.Embedding,
      path="embeddings",
  )
  ```