
Serving a model

Model deployment is another important step of the ML lifecycle. The MLflow command line interface includes a command for serving models. Models can be deployed with MLflow from the local filesystem, from MLflow Tracking, and from cloud storage such as AWS S3.
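As a sketch of how CLI serving works, the `mlflow models serve` command takes a model URI via `-m`; for a model logged to MLflow Tracking, that URI uses the `runs:/` scheme. The `<run_id>` placeholder and the artifact path `model` below are assumptions; substitute the run ID and artifact path from your own tracking run.

```shell
# Serve a model logged to MLflow Tracking as a local REST endpoint.
# "runs:/<run_id>/model" resolves the artifact named "model" from that run;
# -p sets the port (default 5000). Requires mlflow to be installed and the
# tracking backend that holds the run to be reachable.
mlflow models serve -m "runs:/<run_id>/model" -p 5000
```

Once the server is up, predictions can be requested by POSTing data to its `/invocations` endpoint.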

Which of the following commands serves a model from MLflow Tracking using its run_id?

This exercise is part of the course

Introduction to MLflow

