Newswise — The death throes of a dying star. Proteins folding into fantastically complex patterns. Expansive clinical trials of potentially lifesaving drugs. These and other scientific findings can change what we know about the natural world, point the way toward treatments for devastating diseases, and shed new light on the workings of the human brain.

However, deciphering the raw data that underlie these discoveries, which are often presented as complex charts and graphs, can be a daunting challenge for students, policymakers, the general public, and others in the scientific community.

In the latest issue of Psychological Science in the Public Interest, a team of researchers explores how well-designed visualizations succeed at conveying information and how poorly designed figures create confusion and misunderstanding, undermining not only comprehension but also trust in science. 

“Thinking and communicating with data visualizations is critical for an educated public,” said Steven L. Franconeri, lead author on the report and professor of psychology at Northwestern University. “On the other hand, ineffective visualization designs lead many viewers to struggle to understand these otherwise powerful thinking tools. Our paper reviews evidence-based guidelines for how to effectively design visualizations that communicate data to students and the general public.” 

Franconeri and his colleagues synthesize guidelines that have otherwise been scattered across multiple research literatures and obscured by jargon.

Their paper describes these guidelines for communicating data with visualizations, provides examples of good and bad figures, and highlights tools and strategies for improving data visualizations.

It recommends that scientists:

  • Visualize data with easily grasped representations like histograms and scatterplots before trusting statistical summaries (illustrated in the brief sketch after this list). 
  • Consider common visual illusions and confusions. For example, starting axes at zero is not always the best option, since a zero baseline can lead viewers to underestimate meaningful differences. 
  • Attempt to use visualizations that audiences are familiar with and respect common associations (e.g., “up” and “darker” mean “more”). 
  • When communicating confidence to a lay audience, avoid error bars and instead show examples of discrete values. 
  • When communicating risk to audiences who may have a lower ability to work with numbers and mathematics, rely on absolute instead of relative rates and convey probabilities (e.g., 3 out of 10) instead of percentages (e.g., 30%). 
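As a rough illustration of the first recommendation, the sketch below uses hypothetical data (not an example from the paper) to show how two samples with nearly identical means and standard deviations can have strikingly different shapes that only a plot reveals. It assumes Python with NumPy and Matplotlib installed.

```python
# A minimal sketch with made-up data: two samples whose summary statistics
# look interchangeable but whose histograms do not.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)

# Sample A: unimodal, roughly bell-shaped around 50.
a = rng.normal(loc=50, scale=10, size=1000)

# Sample B: bimodal -- two clusters whose combined mean and spread
# happen to come out close to Sample A's.
b = np.concatenate([
    rng.normal(loc=40, scale=3, size=500),
    rng.normal(loc=60, scale=3, size=500),
])

# The summary statistics alone suggest the samples are interchangeable.
for name, sample in [("A", a), ("B", b)]:
    print(f"Sample {name}: mean = {sample.mean():.1f}, sd = {sample.std():.1f}")

# The histograms, not the summaries, reveal the difference in shape.
fig, axes = plt.subplots(1, 2, sharex=True, sharey=True, figsize=(8, 3))
axes[0].hist(a, bins=30)
axes[0].set_title("Sample A")
axes[1].hist(b, bins=30)
axes[1].set_title("Sample B")
fig.suptitle("Same mean and standard deviation, very different distributions")
plt.show()
```

Printing the summaries shows both samples centered near 50 with a spread of roughly 10, while the side-by-side histograms make the bimodal structure of the second sample obvious at a glance.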

“Well-designed visualizations amplify our ability to reason about the real-world phenomena data represent, making it possible for us to grasp regularities and variation we might otherwise miss,” said coauthor Jessica Hullman, a professor of computer science at Northwestern University.  

The authors conclude that following the guidelines in this review—aided by an understanding of the evidence that motivates them—should lead to far more effective visualizations across public policy, education, journalism, and research. 

Reference: Franconeri, S. L., Padilla, L., Shah, P., Zacks, J., & Hullman, J. (2021). The science of visual data communication: What works. Psychological Science in the Public Interest. Advance online publication. http://journals.sagepub.com/doi/full/10.1177/152910062096576

Journal Link: Psychological Science in the Public Interest, Dec. 2021