
Serving a model

Model deployment is another important step of the ML lifecycle. The MLflow command-line interface includes a command for serving models. MLflow can serve models from the local filesystem, from MLflow Tracking, and from cloud storage services such as AWS S3.
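As a sketch of how the serving command looks for each source, here are hedged examples of `mlflow models serve` (the bucket name, run ID placeholder, and local path are illustrative, not from the course):

```shell
# Serve a model saved on the local filesystem:
mlflow models serve -m ./model --port 5000

# Serve a model logged to MLflow Tracking, referenced by its run_id
# (replace <run_id> with an actual run ID):
mlflow models serve -m runs:/<run_id>/model --port 5000

# Serve a model stored in AWS S3 (hypothetical bucket and path):
mlflow models serve -m s3://my-bucket/path/to/model --port 5000
```

In each case the `-m`/`--model-uri` flag selects where the model is loaded from, and the server exposes a REST scoring endpoint on the given port.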

To serve a model from MLflow Tracking using its run_id, which of the following commands is used to serve the model?

This exercise is part of the course

Introduction to MLflow
