What’s the link between quantum physics and data quality? In a recent discussion, Jim Harris suggests that data quality projects are subject to a measurement problem similar to the one in quantum physics. He asks:
“When does a data quality project stop existing as potential success or failure and become one or the other?”
The question makes sense, but I don’t think there is any paradox in it. In my opinion, as long as we do not measure the data quality, nothing can be said about the success or failure of the data quality project. It’s like any probabilistic situation: as long as the throw of a die has not been read, we don’t know the outcome, but there is an outcome. In other words, the status of the data quality project is in an undefined state, but it is certainly not in both states at the same time, as Schrödinger’s cat is.
Now, what could quantum physics and data quality have in common?
I think one thing common to both fields is that a context is required in order to perform the measurement. Data quality is contextual. We cannot speak about quality in an absolute manner (see this discussion). In some contexts the data will be of good quality, whereas in others it could be of bad quality. It depends on the intended use or purpose, as Henrik Liliendahl Sørensen says. But that is not yet sufficient to be quantum. Other features are required before we can speak about quantum quality. I suppose we would need to identify non-commuting contexts in data quality projects. This would mean that the order in which data quality measurements are taken plays a crucial role: measuring quality in context A and then in context B would give a different result from measuring in B and then in A. I am not sure that we can exhibit this kind of behavior. And we probably need to define all the concepts more precisely before we can show that data quality has something quantum about it.
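To make the non-commuting idea concrete, here is a minimal sketch (my illustration, not part of the original argument): in quantum mechanics, two measurements are non-commuting when their operators A and B satisfy AB ≠ BA, so the order of measurement changes the result. The textbook example is the Pauli matrices. Whether anything analogous exists for data quality contexts is, as noted above, an open question.

```python
def matmul(a, b):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

# Pauli matrices sigma_x and sigma_z: the standard example of
# non-commuting observables in quantum mechanics.
sigma_x = [[0, 1], [1, 0]]
sigma_z = [[1, 0], [0, -1]]

xz = matmul(sigma_x, sigma_z)  # measure "z" then "x"
zx = matmul(sigma_z, sigma_x)  # measure "x" then "z"

print(xz)        # [[0, -1], [1, 0]]
print(zx)        # [[0, 1], [-1, 0]]
print(xz != zx)  # True: the two orders give different results
```

If such non-commuting "context operators" could be exhibited for data quality, the order in which quality is assessed would genuinely matter; absent that, the analogy stays metaphorical.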