Deception as Exploitative Social Agency

May 24, 2017 | Author: Radu Umbres | Categories: Trust, Deception / Lying, Cognitive Anthropology, Agency

OUP UNCORRECTED PROOF – FIRSTPROOFS, Fri Sep 16 2016, NEWGEN   243

CHAPTER 25

Deception as Exploitative Social Agency

RADU UMBRES

Distributed Agency, edited by N. J. Enfield and Paul Kockelman. © Oxford University Press 2017. Published 2017 by Oxford University Press.

When we deceive others, are we using them as tools? After all, when we deceive, we use others as means to our ends, irrespective of their intentions and welfare. But I argue that such a view is only superficially valid once we consider the mechanisms of deception as a form of social agency. A more insightful understanding of deception must account for the interlocking of agency between deceiver and deceived at several levels, based on uniquely human meta-representational capacities.

The idea that deception means using humans as tools has widespread intuitive appeal. Often, people deceived by others feel "used" or "played with" or "powerless," attributes better fitting an inanimate object than an active, intentional agent. One of its guises, manipulation, strongly evokes a puppeteer handling the mechanics of a tool-like puppet. In slang, a "tool" is an individual deemed too slow-witted to realize that he is being taken for a fool.

The link between deception and tool use has even more prestigious advocates. Immanuel Kant opposed any form of lying (even when the lie would save an innocent life) because it would contradict not only the first formulation of the categorical imperative (to act only in accordance with the maxim through which you can at the same time will that it become a universal law) but also, and more relevant to our point here, the second formulation, that we are to treat others as ends in themselves and not only as means: "The one who has it in mind to make a lying promise to another will see right away that he wills to make use of another human being merely as means, without the end also being contained in this other.


For the one I want to use for my aims through such a promise cannot possibly be in harmony with my way of conducting myself toward him and thus contain in himself the end of this action" (Kant 1993: 429–430). While Kant is making a moral point about deception, the liar's conduct (and hence his moral failure) toward the other evokes exactly the use of a device for one's own ends, without acknowledging the humanity of the victim.

Intuition and moral metaphysics aside, this chapter explores aspects of deception that clearly distinguish it from mere tool use and bring it conceptually closer to other forms of social agency. It is argued that deception builds upon co-opting several aspects of the victim's agency, whereas no agentive qualities are needed from a tool aside from its affordances. Moreover, although deception feeds on an asymmetry of agency, it is not merely a unilateral process. The target of deception may fend for himself in ways that are unthinkable for an object. Together, these facets reveal the entangled, reciprocally responsive agency of the actors involved in deception and counterdeception.

THE COMPLEXITY OF DECEPTION

The first thing about deception that sets it apart from treating others as mere tools is its constitutively social nature. Few scholars, if any, would deny that deception is a form of social agency, yet it is conspicuously absent from, or undervalued in, major works in the philosophy and psychology of social agency. As illuminating as they are, developments in the burgeoning field of studies and theoretical proposals dealing with shared agency might suggest an unwarranted identity between "shared" and "social" agency, overlooking a vast realm of sociality. Since the point made here is a fairly general one, the umbrella term "shared" covers many forms of social agency that may share only a "family resemblance," including concepts such as "shared," "joint," or "collective intentionality," "joint action," "common ground," "joint commitment," and "group" or "collective agency" (Gilbert 1990; Searle 1990; Tomasello et al. 2005; Bratman 2014 is an exception, but even he tries to demonstrate how types of deception may still count as shared agency). Deception and coercion are exemplary cases of those other forms of agency that are irreducibly social but have features setting them apart from the "shared agency" family. These different forms are ontologically based on asymmetry rather than symmetry, on exploitation rather than mutuality, on opposition rather than convergence, on competition rather than cooperation, on the circulation of incomplete or misleading information and

[ 244 ]  From Cooperation to Deception and Disruption


imperfectly aligned or even conflicting interests. One could hardly consider non-cooperative social agency a rare phenomenon, and, if you doubt its relevance for human societies, consider the role of coercion in the slave-master relationships behind the building of the pyramids, or the spread of totalitarian political regimes in the past century. As for deception, we can only speculate how the crucial invasion of Sicily would have unfolded were it not for Operation Mincemeat, in which the Allied forces tricked Nazi commanders (including Hitler) into believing that the landing target was Greece, staging a body washed up on the shores of Spain that appeared to be a dead spymaster carrying orders carefully faked by the British.

One of the possible reasons why deception stays under the radar of scholars of social agency is the weight given to human cooperative inclinations as the key to our species' capacity for culture and for building and abiding by social institutions. Deception, by contrast, has figured more in the scientific mapping of the cognitive capacities of nonhuman primates and other animals, especially tactical deception, which brings certain primates closer to human capacities for attending to cues such as field of vision or attention, or even reading minds. While some scholars discussed the extent to which chimpanzees deploy a Machiavellian intelligence, with deception as a crucial aspect of competitive interactions (Whiten and Byrne 1997), the uniqueness of human thinking was attributed to cooperative inclinations unparalleled among other primates (Tomasello 2014). The role of cooperation in human social agency can hardly be denied, but this only makes human deception an even more complex phenomenon. Compared to humans, the most Machiavellian of chimps deploys rudimentary ruses, just as limited as its cooperative scope.

Human deception may be the opposite of human cooperation, but it is causally built upon the same cognitive capacities, exploits the same propensities for acting and thinking together, and even mimics or parasitizes cooperation for exploitative purposes. In fact, the cooperative aspect of human agency (in forms of trust, shared symbols, expectations and obligations of mutuality, etc.) opens up vast possibilities for exploitation that are unavailable to less cooperative species. Because the engine of cooperation depends on the mutual good faith of the parties, a deceiver may mimic a bona fide cooperator and allow the other party to (mistakenly) think that the goal of the interaction is common and mutualistic, without having the positive, shared, "we" intentions of true cooperation (Tomasello 2014). In a foundational work, Nicholas Humphrey (1976) argued that human intelligence evolved as a function of social complexity. The presence of cooperation does not just make human sociality more complex in cognitive terms (in contrast with other primates). The dramatic effect is that cooperation extends the capacity of social agency to involve a huge array of flexible and interweaving processes, ranging from full-on cooperation to pure competition, with myriad possibilities in between.

DECEPTION AS SOCIAL AGENCY

If we take agency to be "the relation between a person and a course of action and its effects" (Enfield 2013), the provisional definition of deception that I propose and evoke in the empirical examples is this: the act by which an agent (the deceiver) manipulates information in order to use another agent's agency (that of the deceived) without the victim's awareness or consent. Deception is thus a form of social agency in which two (or more) actors have their individual agencies interlocked in a chain of information manipulation. To understand the relationship further, we may unpack the concept of agency into flexibility and accountability (Kockelman 2007; Enfield 2013) and discover the kinds of asymmetries exploited by deception as social agency.

As a starting point for understanding deception, the subjective interests of the agents involved are not aligned (or not aligned enough for the purposes of the deceiver). This is important in order to rule out cooperation, with its relatively balanced distribution of interests, as a flexible alternative to deception. Unless we think of all deceivers as pathological (which they obviously are not), deception starts from an imbalance between the initiator's and the target's desires that cannot be resolved efficiently through cooperation or other means. Consequently, the deceiving agent pursues a course of events that builds upon, and exploits to his or her own advantage, an asymmetry with the victim at the level of knowledge and beliefs. The epistemic imbalance is clear: the deceiver knows something more than the victim, namely, that an act of deception is under way, that certain signs or symbols are false or misplaced, that the victim will act against his or her own interests, and so on. The deceiver either does nothing to correct the victim's error or, most often, actively creates that error in order to profit from the imbalance.
Despite the imbalance, deception relies on the interlocking of individual agencies, leading to causal effects beyond the scope of any single one of them. A simpler way of putting this is that neither party brings about the deception by itself, just as is true for any communicative or social action. On the deceiver's side, he or she must manipulate something in the cognitive environment of the deceived. On the other side, the deceived would not pursue the course of action by him- or herself were it not for the manipulation. Standing by silently while municipal workers water your plants, mistakenly taking them for public property, may engage someone's agency in your favor by means of misinformation, but you did not deceive anyone: the workers are simply mistaken. Contrast this with the case of, for example, a Nigerian inheritance scam, in which the victim acts on and (falsely) interprets the situation as a cooperative event, apparently acting jointly with the deceiver but in reality playing into his or her hands.

In contrast, tool use may be considered part of an individual's agency. Through its affordances and its embodied history and origin, an object may be part of the distributed agency of an individual or a collective of individuals (Hutchins 1995). The elements of the cockpit form a constitutive part of the pilot's agency in flying a plane, just as a ship may be part of, and mediate, the social agencies of a crew. The tool becomes part of individual or social agency as long as it is engaged in cognitive and physical work by at least one human being. Deception, on the contrary, is essentially social. It takes (at least) two to create a deception (leaving "self-deception" aside), just as it takes two to have a conversation or to jointly carry an object. This creates a discrepancy at the level of flexibility: the tool, unlike the victim of deception, may only be co-opted passively, while the deceived is induced to make choices, comprehend, and communicate. If I push you in front of a trolley (perhaps to save five people, as a good utilitarian), I am not engaged in any social agency but merely use your body as a tool for derailing the trolley. A different event happens if I persuade you that it is safe to lie on the tracks of your own volition. Between victim and perpetrator there is an imbalance of epistemic power, but both are still ultimately cognizing actors, while a tool is not.

From the second perspective on agency, accountability, the proper accountability of tools is usually derived from their users. An Azande granary is the tool used by a sorcerer to kill his victim, but the legal and causal actor is the (invisible) sorcerer.

DEFENSE AND ACCOUNTABILITY IN DECEPTION

The previous section argued that deception is a form of social agency whose flexibility involves an imbalanced relationship, at the level of knowledge, between the agencies of two actors. The epistemic asymmetry is key to the success of deception, as the deceiver controls the event and adjusts his or her inputs according to how he or she anticipates the behavior of the victim. Yet the victim is not a simple tool: he or she contributes his or her own interpretation of facts, his or her own sources of information, and his or her own capacity for choice and control. Moreover, deception is not the automatic outcome of a single agent, the deceiver, since the target is never just a sitting duck.


Sperber and colleagues (2010) have proposed a model of epistemic vigilance that exposes the flexibility in the (potential) victim's agency. The deceiver may try to manipulate information, but the victim is not a passive recipient (a tool, one might say). Sperber et al. argue that the human mind is endowed with a set of mechanisms geared toward filtering information for truth and relevance. Such mechanisms have evolved to guard against both intentional deception and unintentional misinformation from malevolent or incompetent sources. Epistemic vigilance consists of assessing a potential deceiver's reputation and subjective interests (do not trust a man you know to be a robber when he tells you to go left into the woods), as well as checking content for consistency with prior knowledge (do not believe that this hen for sale lays three eggs a day when no hen ever did that). Such defense mechanisms enormously complicate the interlocking of the agencies of deceiver and deceived. Actors deploy layers of meta-representations in deception and counterdeception (I know that she knows that I know that she ...), flexibly adjusting their choices and symbols to the other actor's inputs. Such to-and-fro is unthinkable in the relationship between a tool user and a tool, where flexibility attaches only to the human element.

If the victim is not merely a passive element in the interlocking social agency that characterizes deception, is he or she to blame? Who is accountable for the deceptive act? What are the rights and duties attached to deception? Here, cultural variation can enlighten our understanding of the agency behind deception. A Lebanese village society (Gilsenan 1976) abides by a rule of generalized potential deception. "Everything is kizb," an informant declares, as a justification for a world in which people patiently plan the humiliation of others using lies and the embellishment of facts, where an epistemically vigilant commoner uses a trick to reveal the dishonesty of a visiting sheikh, and where everyone needs, and is expected, to lie at least a little to keep up pretenses of honor. Likewise, in the Romanian village I have studied for the past ten years, deception coming from people outside a circle of trust and reciprocity is an ever-present danger, and a heightened sense of mistrust is a way of life.

What makes such communities special is that the blame for being deceived often falls upon the victim. An ethic of "devil take the hindmost" applies to everyday encounters, including communicative acts, in which one ought to interpret the utterances of others with utmost vigilance and adequate information, lest he or she be deceived by a village competitor. In apportioning (at least part of) the blame to the victim, villagers acknowledge the role played by him or her in the social agency underpinning deception. Had the victim been more competent, more astute, better prepared by his peers, the deceiver would have failed. In a way, the deceiver did what is expected of most people: to further his or her interests even at the expense of others, using whatever means are possible and useful. To attempt a comparison, deception is in the air just as your inbox swells with Nigerian scam letters, and one can hardly avoid feeling that the unfortunate client of such scams is, in this day and age, at least an active party to the deception through inadequate epistemic vigilance. The same apportioning of blame is used in ritual pranks of initiation, in which the incompetence of the victim justifies and legitimates the social and epistemic power of the tricksters (Umbres 2013).

But, to return to our initial problem, why do people feel that deception involves them as mere tools of the deceivers? I conclude with a speculation about the accountability of deception. In modern, centralized societies driven by the universalistic morality of religion, citizenship, or Kantian ethics, people do not expect fellow humans to deceive them, at least not as a rule. The norms of social interaction prescribe (at least minimally) cooperative agents engaged in true and relevant communication. People feel obligated to speak the truth (or at least not to tell a lie knowingly) and feel entitled to the same treatment in return. The social contract of everyday interaction punishes deceivers, either formally or informally, from jail sentences to social ostracism. Not all social settings, of course, require such a norm. Strudler (2009) argues that deception is socially accepted when used as legitimate self-defense. The norm of trust¹ does not apply automatically and exhaustively in legitimate negotiation maneuvers such as communicating a false reservation price when buying a house.
In high-trust societies, the sentiment of reciprocity in truth-telling makes everyone lower their epistemic guard, leading to huge savings in the transaction costs of communication and social interaction and, furthermore, to successful and efficient cooperation (compare Keenan 1976 for a culture in which truth and relevance are not conversational postulates). But the reverse of this generalized amnesty of distrust is ceding part of our agency to others. When we trust someone, we give him or her power over our behavior, over our representations, over the flexibility of our agency. We do this in the expectation that he or she (1) will be benevolent and cooperative and (2) will do the same for us. This is why deception appears, to such enlightened altruists, as a way of using people as tools, as means to private, egoistic ends.

The comparison helps us understand the way deception is accounted for in some societies, but it is misleading as a metaphor or a theoretical inspiration. In deception, as in any form of social agency, it takes more than one individual to act and to be responsible for their actions.


ACKNOWLEDGMENTS

The writing of this chapter began while I was a Fyssen Foundation postdoctoral fellow at the Institut Jean Nicod, Paris. I am grateful to Nick Enfield and Dan Sperber for their comments on draft versions, and I thank the participants of the "Foundations of Social Agency" scientific retreat organized by the Max Planck Institute for Linguistics for their suggestions.

NOTE

1. Strudler also links trust with a yielding of autonomy, and deception as a breach of trust with a compromise of autonomy, lending further support to the idea that deception is a form of intertwining social agency between parties.

REFERENCES

Bratman, M. E. 2014. Shared Agency: A Planning Theory of Acting Together. New York: Oxford University Press.
Enfield, N. J. 2013. Relationship Thinking: Agency, Enchrony, and Human Sociality. Oxford: Oxford University Press.
Gilbert, M. 1990. "Walking Together: A Paradigmatic Social Phenomenon." Midwest Studies in Philosophy 15(1): 1–14.
Gilsenan, M. 1976. "Lying, Honor, and Contradiction." In Transaction and Meaning: Directions in the Anthropology of Exchange and Symbolic Behavior, edited by Bruce Kapferer, 191–219. Philadelphia: Institute for the Study of Human Values.
Humphrey, N. 1976. "The Social Function of Intellect." In Growing Points in Ethology, edited by P. P. G. Bateson and R. A. Hinde, 303–317. Cambridge: Cambridge University Press.
Hutchins, E. 1995. Cognition in the Wild. Cambridge, MA: MIT Press.
Kant, I. 1993. Grounding for the Metaphysics of Morals. Indianapolis: Hackett.
Keenan, E. O. 1976. "The Universality of Conversational Postulates." Language in Society 5(1): 67–80.
Kockelman, P. 2007. "Agency." Current Anthropology 48(3): 375–401.
Searle, J. R. 1990. "Collective Intentions and Actions." In Intentions in Communication, edited by P. R. Cohen, J. L. Morgan, and M. E. Pollack, 401–415. Cambridge, MA: MIT Press.
Sperber, D., F. Clément, C. Heintz, O. Mascaro, H. Mercier, G. Origgi, and D. Wilson. 2010. "Epistemic Vigilance." Mind and Language 25(4): 359–393.
Strudler, A. 2009. "Deception and Trust." In The Philosophy of Deception, edited by C. W. Martin, 139–152. New York: Oxford University Press.
Tomasello, M. 2014. A Natural History of Human Thinking. Cambridge, MA: Harvard University Press.


Tomasello, M., M. Carpenter, J. Call, T. Behne, and H. Moll. 2005. "Understanding and Sharing Intentions: The Origins of Cultural Cognition." Behavioral and Brain Sciences 28(5): 675–691.
Umbres, R. 2013. "Chasse au dahu et vigilance épistémique." Terrain 61: 84–101.
Whiten, A., and R. W. Byrne. 1997. Machiavellian Intelligence II: Extensions and Evaluations. Cambridge: Cambridge University Press.

