Machine learning is undeniably useful. ML simplifies the computation of complex outputs and extracts value from data at a scale humans simply can’t match. However, just because a machine is doing the heavy lifting doesn’t mean ML models are a “set-it-and-forget-it” type of activity. Like most things in life, data changes over time. Relationships between variables in the data pipeline can also change. These changes prompt what is called Model Drift, sometimes also referred to as Model Decay.
ML models don’t like change. They are trained on the assumption that future data will look like the data used to build them, so Model Drift is something we want to monitor for and prevent. In this post, we’ll describe two types of Model Drift and outline a few ways you can identify it and act before it degrades your model.
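As a taste of what drift monitoring can look like in practice, here is a minimal sketch that compares a feature’s distribution at training time against live production data using the Population Stability Index (PSI), one common drift metric. The function name, bin count, and the conventional “PSI above ~0.25 signals major drift” rule of thumb are illustrative choices, not a prescribed implementation.

```python
import numpy as np

def psi(expected, actual, bins=10):
    """Population Stability Index between a baseline (training) sample
    and a live (production) sample of the same feature."""
    # Bin edges come from the baseline distribution's quantiles
    edges = np.quantile(expected, np.linspace(0, 1, bins + 1))
    edges[0], edges[-1] = -np.inf, np.inf  # catch out-of-range live values
    e_pct = np.histogram(expected, edges)[0] / len(expected)
    a_pct = np.histogram(actual, edges)[0] / len(actual)
    eps = 1e-6  # avoid log(0) and division by zero in empty bins
    return float(np.sum((a_pct - e_pct) * np.log((a_pct + eps) / (e_pct + eps))))

rng = np.random.default_rng(0)
train = rng.normal(0.0, 1.0, 10_000)    # feature at training time
stable = rng.normal(0.0, 1.0, 10_000)   # production data, same distribution
shifted = rng.normal(0.75, 1.0, 10_000) # production data after the mean drifts

print(f"stable PSI:  {psi(train, stable):.3f}")   # near 0: little drift
print(f"shifted PSI: {psi(train, shifted):.3f}")  # well above 0.25: drift
```

Run on a schedule against each model input, a check like this can alert you that incoming data no longer resembles the training set, often before accuracy metrics visibly decay.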