Serving a model
Model deployment is another important step in the ML lifecycle. The MLflow command-line interface includes a command for serving models. MLflow can serve models from the local filesystem, from MLflow Tracking, and from several cloud storage providers such as AWS S3.
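As a sketch of the syntax, a model logged to MLflow Tracking can be served by run ID with the `mlflow models serve` command; the run ID, artifact path, and port below are placeholders, not values from this exercise:

```shell
# Serve a model logged to MLflow Tracking, addressed by its run ID.
# "runs:/<run_id>/model" is a model URI: <run_id> is the tracking run,
# and "model" is the artifact path used when the model was logged.
mlflow models serve -m runs:/<run_id>/model --port 5000
```

This starts a local REST scoring server on the given port; the same `-m` flag accepts other URI schemes (a local path, or an `s3://` location) for models stored outside the tracking server.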
Which of the following commands serves a model from MLflow Tracking using its run_id?
This exercise is part of the course
Introduction to MLflow