TY - JOUR
T1 - Separating the Wheat from the Chaff
T2 - Comparative Visual Cues for Transparent Diagnostics of Competing Models
AU - Dasgupta, Aritra
AU - Wang, Hong
AU - O'Brien, Nancy
AU - Burrows, Susannah
N1 - Funding Information:
This work was partially supported by the Laboratory Directed Research and Development Program at PNNL, a multi-program national laboratory operated by Battelle. We would like to thank Feng Wang for developing the initial prototypes, and Phil Rasch, Yun Qian, and Po-Lun Ma for their feedback about MyriadCues. We are also grateful to the anonymous reviewers for their constructive comments, which helped refine the discussions in the paper.
Publisher Copyright:
© 1995-2012 IEEE.
PY - 2020/1
Y1 - 2020/1
AB - Experts in the data and physical sciences regularly grapple with the problem of competing models. Whether the models are analytical or physics-based, a cross-cutting challenge for experts is to reliably diagnose which model outcomes appropriately predict or simulate real-world phenomena. Expert judgment involves reconciling information across many, and often conflicting, criteria that describe the quality of model outcomes. In this paper, through a design study with climate scientists, we develop a deeper understanding of the problem and solution space of model diagnostics, resulting in the following contributions: i) a problem and task characterization with which we map experts' model diagnostic goals to multi-way visual comparison tasks, ii) a design space of comparative visual cues that lets experts quickly understand the degree of disagreement among competing models and gauge the stability of model outputs with respect to alternative criteria, and iii) the design and evaluation of MyriadCues, an interactive visualization interface for exploring alternative hypotheses and insights about good and bad models by leveraging comparative visual cues. We present case studies and subjective feedback from experts, which validate that MyriadCues enables more transparent model diagnostic mechanisms compared to the state of the art.
KW - Model evaluation
KW - Simulation
KW - Transparency
KW - Visual comparison
KW - Visual cues
UR - http://www.scopus.com/inward/record.url?scp=85075640051&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85075640051&partnerID=8YFLogxK
U2 - 10.1109/TVCG.2019.2934540
DO - 10.1109/TVCG.2019.2934540
M3 - Article
C2 - 31478858
AN - SCOPUS:85075640051
SN - 1077-2626
VL - 26
SP - 1043
EP - 1053
JO - IEEE Transactions on Visualization and Computer Graphics
JF - IEEE Transactions on Visualization and Computer Graphics
IS - 1
M1 - 8812989
ER -