Sunday, June 22, 2014

Modes of Communication/Truth is the Weapon of the Underdog

Last week, during a public Media Lab discussion, Judith Donath remarked that "without some form of deception, society would be completely impossible. Lies of omission allow us to cooperate."
This prompted Joi Ito's elaboration: we tend to make communicative statements (including non-verbal ones, like the choice of clothing) in a very particular context. How can we make a communicative statement work when the context breaks down? In fact, this breakdown of context is almost the norm for communication that is available to non-homogeneous audiences — say, both your peers and your parents following you on Facebook: the same statement may very well mean very different things to different recipients. By careful omission, which amounts to a type of deception, we may be able to realize our communicative intent even if the signs and concepts that make up our message mean different things to individual recipients.

This insight has interesting implications: First of all, there are at least two distinct kinds, or modes, of public communication. One is concerned with the transmission of content (ideas, arguments, data etc.). The other is concerned with the elicitation of social behavior: we want the recipients of the communication to interact in certain ways, adopt a position towards us in relation to their own, and so on.


Scientific communication (which concerns me so much that I tend to overlook other modes) is always content-oriented, and predominantly non-deceptive. (Which of course is not the case for all other kinds of content-oriented communication.) In contrast, in social communication, the transmission of content is not the main goal, but a means to an end: social communication facilitates alignment, in very particular areas. Presenting scientific results to the public creates a particular challenge, because it touches on both communication modes: The gist of the content must be preserved well enough to bear significance to audiences that are familiar with the scientific context, but the message must still work if that context breaks down. The message must avoid elements that carry a specific meaning only in the scientific context, and work equally well for a more general audience. Complicated details or controversial tropes may be powerful distractors from the universal message of scientific public relations: "We are producing results that are relevant to your interests."

Communication ultimately always works by creating a representation in the mind of the recipient. In content-oriented communication, we usually want our ideas or arguments to be represented accurately. Our message must specify which concepts and relations are part of the representation, and which ones should be excluded. Language is partly constructive, i.e. it builds ideas by binding them together from parts, and largely disambiguating: it bisects the space of all possible representations until the remainder is similar enough to what we wanted to express.

To make social communication work, we often put more emphasis on partial disambiguation. For instance, a politician might refer to "freedom" or "responsibility" to defend a policy, while realizing that these words may mean quite opposite things to different audiences. A diplomat might refer to God, without specifying the actual brand of God that is implied, etc. Since these concepts carry a positive connotation regardless of their actual content, they may be helpful in facilitating social alignment, but are largely useless in scientific contexts.

In other words, in social communication across diversified audiences, we might do well to leave the creation of some of the actual content of our message to the recipients, and our statement should just be specific enough to prompt the elicitation of content that supports reaching our goals. If that sounds deceptive and Machiavellian to you, that is probably because it is, but at least it does not open you up to the same dangers as outright lying (even though it might sometimes be just as unethical).

Because most people form their opinions not based on actual content, but on social criteria, even lying can be a successful strategy, which is why governments and large organizations often employ it. Even if individuals know that a particular message carries deceptive content, the message may still succeed in shaping their opinions, provided the speaker carries a large social weight. Social weight is not so much a function of reputation (i.e. attributed integrity) as of the place in the social hierarchy. (Having grown up in Eastern Germany, I can testify to that.)

Conversely, lies are dangerous if your voice is a small one. If you are a member of a socially weaker opposition, your messages may of course be framed as untruthful by the dominant speakers, no matter what their content is. But your success will ultimately depend on finding allies among the opposition and the undecided audience, and being deceptive will damage your reputation, which in turn makes you a liability to your potential allies. If you are an underdog targeting a mainstream audience, you had better stick to the truth. Lies damage reputation, while omissions generally do not.