As a consumer with some understanding of business, purchasing, and logistics chains, I assume that supply is planned well in advance when promotions are organized. With artificial intelligence, expected demand can be estimated accurately from past customer behavior. After all, the phenomenon of special offers was not suddenly invented last month. Manufacturers make sure there is inventory, and distributors and carriers schedule deliveries. So where does it go wrong?
Data Observability
Assuming that good planning tools exist by now, the problem must lie in the data on which the schedules are based. The smartest algorithms are worthless if the data they run on is of poor quality. The amount of data available for decision-making is growing rapidly, but its quality remains a problem. The offers at the supermarket are just one example: finance departments, marketing teams, logistics planners, and industrial operators also struggle with poor data in a world where data-driven work is the norm.
Data Observability is an emerging discipline that helps improve data quality by continuously tracking data flows using metadata from the underlying processes. Several tools are available, such as Monte Carlo, Bigeye, Databand, and Datadog, that integrate well with popular data platforms.
Data Observability assumes that tools monitor five aspects (a sketch of such checks follows the list):
Freshness: Check how recent the data is to avoid acting on outdated insights that, for example, lead to inventory problems.
Distribution: Analyze data patterns (such as mean and standard deviation) to detect anomalies. For example, a sudden spike or drop in customer interaction with the website may be a warning sign of a problem with the data collection process or an underlying system error.
Volume: Monitor the amount of data flowing through systems. An unexpected increase or decrease in data flow can signal a problem in processing, such as records being dropped or duplicated upstream.
Schema: Check for changes in data structure to avoid problems later in the pipeline. Changing a data type or renaming a column can cause unexpected errors in linked systems.
Lineage: Trace the origin and transformations of data to detect errors faster. When anomalous values are found, data lineage shows where in the pipeline the anomaly originated.
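To make these checks concrete, here is a minimal, self-contained Python sketch. All names and thresholds (batch_history, avg_basket, the 26-hour freshness window, the three-sigma tolerance) are hypothetical illustrations, not the API of any of the tools mentioned above; lineage is omitted because it requires a metadata graph of the whole pipeline rather than a per-feed check.

```python
from datetime import datetime, timedelta, timezone
from statistics import mean, stdev

# Hypothetical metadata, one record per pipeline run: when the batch was
# loaded, how many rows it contained, and one tracked business metric.
batch_history = [
    {"loaded_at": datetime(2024, 5, 1, 6, 0, tzinfo=timezone.utc),
     "rows": 10_250, "avg_basket": 27.1},
    {"loaded_at": datetime(2024, 5, 2, 6, 0, tzinfo=timezone.utc),
     "rows": 10_480, "avg_basket": 26.8},
    {"loaded_at": datetime(2024, 5, 3, 6, 5, tzinfo=timezone.utc),
     "rows": 3_100, "avg_basket": 27.4},  # suspicious drop in volume
]

# Schema as last observed versus what downstream systems expect.
expected_schema = {"store_id": "int", "product_id": "int", "quantity": "int"}
observed_schema = {"store_id": "int", "product_id": "str", "quantity": "int"}

def check_freshness(history, max_age=timedelta(hours=26)):
    """Freshness: flag the feed when the newest load is older than expected."""
    return datetime.now(timezone.utc) - history[-1]["loaded_at"] <= max_age

def zscore_ok(values, tolerance=3.0):
    """Flag the newest value when it deviates more than `tolerance`
    standard deviations from the historical baseline."""
    baseline, spread = mean(values[:-1]), stdev(values[:-1])
    return abs(values[-1] - baseline) <= tolerance * max(spread, 1e-9)

def check_volume(history):
    """Volume: an unexpected jump or drop in row counts."""
    return zscore_ok([b["rows"] for b in history])

def check_distribution(history):
    """Distribution: a drifting mean in a tracked metric."""
    return zscore_ok([b["avg_basket"] for b in history])

def check_schema(observed, expected):
    """Schema: renamed columns or changed data types."""
    return observed == expected

for name, ok in [("freshness", check_freshness(batch_history)),
                 ("volume", check_volume(batch_history)),
                 ("distribution", check_distribution(batch_history)),
                 ("schema", check_schema(observed_schema, expected_schema))]:
    print(f"{name}: {'OK' if ok else 'ALERT'}")
```

In practice, an observability platform collects this metadata automatically from the data platform rather than requiring hand-coded checks per feed; the sketch only shows the kind of logic running under the hood.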
The crux of Data Observability lies in monitoring data quality continuously and automatically rather than evaluating it periodically. It is important, however, to define in advance what criteria the data must meet, so that monitoring is aligned with business goals. Tools that identify problems are only the beginning of a solution; setting up timely and adequate follow-up is equally essential to get the maximum benefit from them, as the sketch below illustrates.
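Building on the check functions from the sketch above, this is one way "criteria defined in advance" and "follow-up" could look in code. The rule registry, business goals, owners, and the notify stub are all hypothetical; a real setup would route alerts to a ticketing or messaging system.

```python
# Hypothetical rule registry: each data-quality criterion is defined up front,
# tied to the business goal it protects and to an owner who must follow up.
quality_rules = [
    {"check": check_freshness,    "goal": "next-day replenishment planning", "owner": "logistics-team"},
    {"check": check_volume,       "goal": "complete sales reporting",        "owner": "data-engineering"},
    {"check": check_distribution, "goal": "reliable demand forecasts",       "owner": "data-science"},
]

def notify(owner, message):
    # Stub: a real setup would open an incident or send a chat/e-mail alert.
    print(f"[to {owner}] {message}")

def run_checks(history):
    """Run every rule and route failures to the responsible owner."""
    for rule in quality_rules:
        if not rule["check"](history):
            notify(rule["owner"],
                   f"{rule['check'].__name__} failed; at risk: {rule['goal']}")

run_checks(batch_history)
```

The point of the owner field is exactly the follow-up problem: an alert that lands in a shared inbox is easily ignored, while an alert tied to a named team and a named business goal is much more likely to be acted on in time.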
Now let's just hope that enough shelf stackers will soon be available to put the offers, delivered in the right quantities, on the empty shelves. And that they will be done by the time I want to stock up on my favorite product at a bargain price.