[Image: Stylized heads. Getty Images]

Often, to a scientist committed to action on key issues such as climate change, the facts seem crystal clear—and communicating them to “the other side” can feel like shouting across a chasm. To two U.S. professors of organizational behavior, the “gap” metaphor is indeed an apt one.

In a recently published paper from a late-2017 colloquium on “The Science of Science Communication,” Matthew Cronin of George Mason University and Laurie Weingart of Carnegie Mellon University explore the notion of representational gaps, or “rGaps” (Proc. Natl. Acad. Sci. USA, doi: 10.1073/pnas.1805866116). These gaps are incompatibilities in fundamental assumptions and perspectives that inevitably arise from the different experience and knowledge bases of the individuals involved.

Managing “rGap conflict,” according to Cronin and Weingart, is a prerequisite to effective communication—particularly in a specialized and potentially forbidding area such as “hard science.” And doing so rests on the hard work of building trust and respect.

Birth of an rGap

While the most pressing needs for effective scientific communication often seem to revolve around issues such as stemming global warming or battling the anti-vaccination movement, Cronin and Weingart first root their discussion in a homier example: the decision whether to purchase a regular incandescent bulb or an energy-saving compact-fluorescent model. You may find the decision a no-brainer, based on the projected lifetime and energy cost of the fluorescent bulb. Your partner, however, may reach for the incandescent one—fully understanding the energy trade-off, but motivated by considerations totally unrelated to energy costs, such as light quality. Voilà—an rGap is born.

[Image: Artist's view of people arguing over difficult concepts. Getty Images]

In the best of circumstances, the presence of such a gap in perception would lead to dialogue and discussion, and perhaps to a compromise solution, such as limiting incandescent bulbs to only the areas where they're most needed. (While small in scale, such a solution, the authors point out, is not unlike the carbon-trading schemes sometimes proposed to help control atmospheric greenhouse gases.) But in general, on identifying such a gap in perception, we instead tend simply to assume that the other party lacks knowledge and understanding. And then we attempt to remedy that perceived shortcoming through explanations, lecturing and a data-dump of facts.

rGap conflicts

That’s a snare, Cronin and Weingart suggest, into which scientists, as keepers of sometimes arcane expert information, are particularly apt to fall. And it leads to what they call an “rGap conflict”—when individuals “take incompatible positions in response to information,” and quickly fall into unproductive, sometimes intense personal argument, or its flip side, avoidant behavior and withdrawal.

“Each side believes they have good evidence, belief systems, and values,” Weingart said in a press release accompanying the research. “But rather than explore each other’s evidence, people try to defend their knowledge. As a result, the conversation will escalate into arguments and attacks. It’s very hard to get back to the debate about what is evidence, what is factual.”

Bridging the gap

There are, unfortunately, no shortcuts past this barrier to effective science communication, according to Cronin and Weingart.

Instead, their own research, and their analysis of the copious organizational-behavior literature in this area, suggests that effective communication must begin with managing rGap conflicts. And that, in turn, requires a genuine attempt to understand the other person’s or group’s values and perspectives, and to build trust, respect and empathy—a process that Cronin and Weingart refer to as “affective integration.” Only after this process reduces the sense of personal threat and frustration can “cognitive integration”—the debate and discussion that leads to mutual understanding and new solutions—actually happen.

Developing such understanding, rather than assuming that the other party shares your perception and simply lacks “the facts,” is, Cronin and Weingart acknowledge, difficult work. Particularly in an age of the social-media “zinger” or one-liner, it’s often more immediately satisfying to meet different perspectives with derision than with a difficult effort to “bridge the rGap.”

Yet, for those genuinely interested in advancing public scientific understanding in the service of solving problems, the researchers believe the bridging is essential. “We need to be willing to learn from others,” according to Cronin. “This is why trust and respect matter. We listen to the people we trust and respect even when we disagree. And this must be a two-way street.”