1. Deploying a model in Databricks
Let's bring it all home! In this video, we will be discussing how Databricks can help you deploy a trained machine learning model into production.

2. Machine Learning Lifecycle
As a quick level-set, here is our machine learning lifecycle diagram again. We have now learned that Databricks can handle the planning and preparation stages as well as the development and evaluation stages of machine learning.

3. Model Deployment and Operations
Now, we will learn how Databricks can help with these final stages in the machine learning process. These stages are all about how we make sure that high-quality models get into production.

4. Concerns with deploying models
In general, when we think about deploying a machine learning model, we have two main concerns: making models available and being able to evaluate those models once they are out in the ecosystem.

From an availability perspective, these are some of the questions we may be asking ourselves. For example, "Where do I need to put my model so it can be accessed?" The whole point of a machine learning model is that it is useful and accessible to our end users or applications. Thus, we must consider how we deploy a model so both are true.

Regarding evaluating our model, these are some of the concerns we will likely have about our deployed models, such as "Is my model still performing well?" One of the big problems we see with deployed models is known as "model drift" or "accuracy drift". The idea here is that our underlying data is always changing, and a stale model will likely no longer be effective at predicting what we trained it for.

5. Model Deployment Process
Both of these concerns are addressed with Databricks and the managed version of MLflow. At a high level, here is the process that a model will go through, from training to deployment.

6. Model Flavors
First, we have to start with the model itself. In MLflow, all trained models are stored in a generalized MLflow Model object. These objects can store models of any framework and keep important context about the configurations and artifacts needed to create that model. MLflow also has the concept of a Model Flavor, which essentially allows us to translate our model into a different framework. This is very useful when deploying a model into an ecosystem with different technological constraints. Here, we have an example where a model could be trained in any number of frameworks. Upon deployment, you can surface the model in two different flavors, depending on what the downstream application needs.

7. Model Registry
In Databricks, we can view all the trained models in our environment within the Model Registry. The Model Registry is a collection of all models that you have trained, including previous versions of those models. We can discover different models and quickly see the version history that a model has gone through.
Within the Model Registry, we can also promote a model through the Staging and Production stages to control which model is available to our consumers, following CI/CD best practices.

11. Model Serving
Once we know a model version is ready for production, we can then deploy that model wherever we need to. There are countless ways to host a model, ranging from local model files to Docker containers on the edge. Databricks has a built-in Model Serving capability that simplifies that portion of the machine learning lifecycle. Users can quickly set up a serverless compute cluster, select which models are served on it, and make those models ready for consumption.
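Once an endpoint is up, applications call it over REST. The sketch below builds a scoring request in the `dataframe_split` format that Databricks serving endpoints accept; the workspace URL, endpoint name, and feature columns are all hypothetical placeholders.

```python
import json

# Hypothetical workspace URL and endpoint name -- substitute your own.
endpoint_url = (
    "https://<databricks-workspace-url>/serving-endpoints/churn-model/invocations"
)

# Served models accept a JSON body; "dataframe_split" is one of the
# supported input formats: column names plus rows of values.
payload = {
    "dataframe_split": {
        "columns": ["age", "tenure_months"],
        "data": [[42, 18], [35, 7]],
    }
}
body = json.dumps(payload)

# Actually sending the request needs a workspace access token, e.g.:
# import requests
# response = requests.post(
#     endpoint_url,
#     headers={"Authorization": f"Bearer {token}",
#              "Content-Type": "application/json"},
#     data=body,
# )
```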
Databricks Model Serving also comes with built-in capabilities for monitoring model consumption and model drift, making it a powerful option for DevOps personas.

15. Let's practice!
We now know how Databricks can help us deploy a model, so let's go practice!