Just about every time I think I fully understand color science (at least as much as is required to be a good calibrator), another topic comes along that makes me second-guess myself.
Lately I have been switching my LUT verification profiles back and forth between dE2000 and ITP just for comparison. I have seen statements (from Portrait, among others) that ITP should replace dE2000 even for SDR calibrations.
My verification profiles are usually a field of green in dE2000 with no color error over 2dE (and very few points between 1.5 and 2.0dE).
But switching to ITP seems to reverse this, turning most points red and orange, with green errors now in the minority. Color error now goes as high as 6dE (though only at a few points). Most points fall in the 2-5dE range.
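For anyone who wants to poke at the numbers behind this, here is a rough sketch of the ΔE ITP calculation as I understand it from BT.2124 (ICtCp matrices from BT.2100, PQ constants from ST 2084; input is assumed to be linear BT.2020 RGB in cd/m², so treat this as an illustration rather than a reference implementation). The key point is that ITP works in a PQ-encoded opponent space and scales the result by 720, which is one reason the same display state can score very differently than it does in dE2000:

```python
import numpy as np

# PQ (SMPTE ST 2084) inverse-EOTF constants
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_encode(fd):
    """Encode absolute luminance (cd/m^2) to a PQ signal value in [0, 1]."""
    y = np.clip(np.asarray(fd, dtype=float) / 10000.0, 0.0, 1.0)
    return ((C1 + C2 * y**M1) / (1 + C3 * y**M1)) ** M2

# Linear BT.2020 RGB -> LMS, and PQ-encoded LMS -> ICtCp (BT.2100)
RGB_TO_LMS = np.array([[1688, 2146,  262],
                       [ 683, 2951,  462],
                       [  99,  309, 3688]]) / 4096
LMS_TO_ICTCP = np.array([[ 2048,   2048,    0],
                         [ 6610, -13613, 7003],
                         [17933, -17390, -543]]) / 4096

def ictcp(rgb_linear_nits):
    """Linear BT.2020 RGB (cd/m^2) -> ICtCp components."""
    lms = RGB_TO_LMS @ np.asarray(rgb_linear_nits, dtype=float)
    return LMS_TO_ICTCP @ pq_encode(lms)

def delta_e_itp(rgb1, rgb2):
    """BT.2124 difference: 720 * sqrt(dI^2 + dT^2 + dP^2), where T = 0.5*Ct."""
    i1, ct1, cp1 = ictcp(rgb1)
    i2, ct2, cp2 = ictcp(rgb2)
    return 720 * np.sqrt((i1 - i2)**2
                         + (0.5 * (ct1 - ct2))**2
                         + (cp1 - cp2)**2)
```

The 0.5 weighting on Ct and the 720 scale factor come straight out of BT.2124; without them the raw ICtCp distances would be tiny, and with them a "just noticeable" difference lands near 1, which is not the same threshold that dE2000 was tuned to.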
I know we don't calibrate to dE, but I can't imagine doing anything different calibration-wise that would produce an ITP verification that looks as good on the charts as dE2000 does. At least not on a consumer-level display/projector.
But charts and reports are one thing; what does all this mean to the viewer? Is a low-error ITP calibration vastly superior to a dE2000 calibration when watching real content? And what about all the past calibrations where we used dE2000? Were they all "wrong" because ITP would most likely have shown more error? This is where I start to think display calibration is a bit like herding cats.
I'm reaching out to the true experts for any guidance that might help me understand this topic. It would be greatly appreciated!