Philosophical dimensions of anonymity in group support systems: Ethical implications of social psychological consequences

Computers in Human Behavior 19 (2003) 355–382 www.elsevier.com/locate/comphumbeh

Esther E. Klein a,*, Chalmers C. Clark b, Paul J. Herskovitz c

a Department of Business Computer Information Systems/Quantitative Methods, Zarb School of Business, Hofstra University, 211 Weller Hall, Hempstead, NY 11549, USA
b Department of Philosophy, Union College, Schenectady, NY 12308, USA
c Department of Business (Law), College of Staten Island, City University of New York, 2800 Victory Boulevard, Room #3N-206, Staten Island, NY 10314, USA

Abstract

With the aim of reducing the inefficiencies of group work, group support systems (GSS), also known as groupware, have been designed to facilitate interaction and foster collaboration and decision making within such groups. A key feature available in GSS is anonymous interaction of group members. This paper examines the ethical dimensions of two social psychological consequences of this anonymity that result from the lack of social cues—the absence of gender cues, with the attendant equalization of male–female participation, and deindividuation, with the attendant weakening of social norms, reduction of inner restraints, and loss of evaluation apprehension. Specifically, this paper suggests that the absence of gender cues within GSS-supported decision-making groups promotes the twin values of justice and autonomy, whereas deindividuation results in a “Ring of Gyges scenario” (K. A. Wallace, 1999), wherein anonymity confers immunity against the consequences of bad and disruptive behavior and thereby encourages such behavior. The paper considers the problem of miscommunication—which offsets the advantage of gender neutrality and which stems from the failure of text-based media to convey verbal nuances and emotional content—and provides practical solutions. Moreover, the paper proposes the adoption of Wallace’s electronic “tagging” solution of traceable anonymity in order to avert, or reduce the incidence of, Ring of Gyges scenarios within GSS-supported groups in situations involving intentional or grossly negligent misconduct but not in cases of ordinary negligence. Additionally, the paper discusses liberty-limiting principles as justifications for the restrictions on free speech that are imposed by tagging. © 2003 Elsevier Science Ltd. All rights reserved.

Keywords: Group support systems; Anonymity; Gender cues; Deindividuation; Miscommunication; Electronic monitoring; Creative idea generation

* Corresponding author. Tel.: +1-516-463-4529; fax: +1-516-463-4834. E-mail address: [email protected] (E.E. Klein).

0747-5632/03/$ - see front matter © 2003 Elsevier Science Ltd. All rights reserved. PII: S0747-5632(02)00053-5

1. Introduction

Amid management “talk about the need for collaboration and teamwork” (Hymowitz, 2002, p. B1), much of organizational activity is conducted in small groups such as committees, task forces, project teams, and idea-generating, problem-solving, decision-making, and other task-orientated groups (e.g. see Aiken & Paolillo, 1997; Applegate, 1991). Despite their prevalence and “the critical role they play in communication and decision-making functions” (Anson, Bostrom, & Wynne, 1995, p. 189), group meetings have been found to be inefficient and ineffective, laden with many shortcomings (El-Shinnawy & Vinze, 1997; Klein & Dologite, 2000). For example, group members may be reluctant to express their opinions because of the fear of public speaking or of public comment on their ideas by others, and one or more group members may dominate the discussion, resulting in premature consensus (Nunamaker, Dennis, Valacich, Vogel, & George, 1991, 1993). Additionally, high-status members may unduly influence lesser-status members (e.g. see Walker, Ilardi, McMahon, & Fennell, 1996), and group members may be subject to conformance pressures (Nunamaker et al., 1991, 1993).

With the aim of reducing the inefficiencies of group work, group support systems (GSS),1 also known as groupware (Johansen, 1988; see also Abel, 1990, p. 490), have been designed to facilitate interaction and foster collaboration and decision making within groups. (For recent reviews of the various streams of GSS research, see Fjermestad & Hiltz, 1999, 2000; Kline & McGrath, 1999; see also Nunamaker, Briggs, Mittleman, Vogel, & Balthazard, 1996–1997.) Specifically, GSS is “an interactive computer-based information system that supports and structures group interaction and decision-making, including idea generation and problem solving” (Klein, 2000, p. 94; see also Huber, Valacich, & Jessup, 1993; Poole & DeSanctis, 1990). According to Nunamaker (1997, p. 357), GSS is “a set of techniques, software and technology designed to focus and enhance the communication, deliberations and decision making of groups.” (For the full range of definitions of GSS, see Zigurs & Buckland, 1998, pp. 318–319; see also Ackermann & Eden, 1994; Aiken & Carlisle, 1992; Anson, Fellers, Kelly, & Bostrom, 1996.) As a practical matter, the implementation of GSS involves placing each group member at a computer workstation, which is connected to a network.

1 Group support systems (GSS) has replaced the earlier term group decision support systems (GDSS) to reflect the former’s “broader focus beyond solely decision-related tasks” (Anson et al., 1995, p. 189, note 1).

As far as physical location is concerned,


the computer workstations can be either in the same room or at different locations. In either case, group members participate in a discussion by typing in their messages, which appear on the screens of all other group members without the contributor being identified. By providing parallel communication through its layout, GSS permits group members to input comments simultaneously. (See Jessup & Valacich, 1993.) Moreover, GSS allows same-site or geographically dispersed groups to meet over a network at the same time (synchronous) or at different times (asynchronous). Thus, GSS is available in four temporal/spatial combinations: same-time/same-place, same-time/different-place, different-time/same-place, and different-time/different-place.

A key feature available in GSS is the capability for group members to collaborate and contribute, performing tasks such as idea generation, problem solving, and decision-making, all while remaining anonymous (Klein & Dologite, 2000). The word “anonymity” is derived from the Greek anōnumos, which means “without a name,” and has been defined as “nonidentifiability” (K. A. Wallace, 1999, p. 23; see also Rotenberg, 1993).2 Thus, a critical distinction between a GSS meeting and a traditional face-to-face (FTF) meeting is that in the former there is an option for group members to participate (contribute comments) without revealing their identity (Hayne & Rice, 1997, p. 429). Anonymity shares some characteristics with the broader, related concept of privacy,3 defined as the state of being shielded from observation (Kang, 1998, p. 1203).4 For example, both anonymity and privacy involve an escape from inspection and both promote individual autonomy (see Hirshleifer, 1980, p. 650). (See discussion on autonomy later.)
In this paper, we examine the ethical dimensions of two social psychological consequences of anonymity in GSS-supported decision-making groups that result from the lack of social cues: the absence of gender cues, with the attendant equalization of male–female participation, and deindividuation, with the attendant weakening of social norms, reduction of inner restraints, and loss of evaluation apprehension.5

2 Specifically, for K. A. Wallace (1999), anonymity is a form of nonidentifiability that is defined as “noncoordinatability of traits in a given respect” (p. 24). According to Wallace’s definition, then, “one has anonymity or is anonymous when others are unable to relate a given feature of the person to other characteristics” such that the person could be identified (p. 24). Thus, in GSS-supported groups, each group member is anonymous when he or she is known only as “the maker of such-and-such a comment” or “the proposer of such-and-such an idea” and when that trait cannot be related to other traits of the group member—such as name, home address, telephone number, e-mail address, and social security number—so as to enable identification (see Wallace, p. 24).

3 Most scholars conceptualize anonymity as a component of privacy (e.g. see Kabay, 1998; Rotenberg, 1993).

4 The legal scholar Jack Hirshleifer (1980, p. 649) has defined privacy as “autonomy within society.”

5 Other social psychological consequences of anonymity-featured GSS can also be subjected to an ethical analysis. For example, the absence of social status cues (in the sense of cues of social position in general and not limited to gender cues) promotes the value of justice by ensuring that ideas are judged on the basis of merit and not on the social standing of their proposer. In a similar vein, an ethical analysis of social loafing may conclude that it violates principles of justice by fostering the de facto unequal distribution of the decision-making task. In light of the variety and complexity of the ethical issues involved, the two aforementioned social psychological consequences as well as others deserve detailed and comprehensive philosophical investigation. As such, this paper will limit its ethical analysis to the absence of gender cues and to deindividuation. Such an analysis is representative of the kind of ethical exploration that is central to viewing social psychological consequences through a philosophical lens.


Whereas the social sciences in general and social psychology in particular address issues in objective and “value neutral” terms, our philosophical approach offers a normative lens with which to examine the very phenomena that are of interest to social scientists. The ethical implications of social psychological consequences of anonymity in GSS are worth philosophical investigation because of philosophy’s critical focus on, and heightened sensitivity for, norms and values. Our investigation falls within the tradition of applied ethics, which is concerned with practical or “applied” questions. Since the 1960s applied ethics has constituted a major component of academic research in ethics (Honderich, 1995, p. 42) in topics as varied as bioethics (e.g. see Beauchamp & Walters, 1999; Gert, Culver, & Clouser, 1997; Munson, 2000), business ethics (e.g. see Beauchamp & Bowie, 1997; Donaldson & Werhane, 1998; Ferrell & Fraedrich, 1997), and legal ethics (e.g. see Altman, 2000; Simon, 2000). An aim of this paper is to advance the effort of applied ethics to extend its domain to information technology (IT) by examining the ethical implications of anonymity in GSS-supported groups.

GSS researchers have studied the advantages and disadvantages of anonymous participation. The advantages include an increase in the number of ideas generated (e.g. see Jessup & Valacich, 1993; McLeod, 1992) and decreases in evaluation apprehension, in fear of public speaking, in member domination, in conformance pressure, and in status competition, which result in increased exploration of alternatives (Hayne & Rice, 1997; see also Elam & Mead, 1990). Additionally, research has indicated that interactions in groups that are anonymous are more critical and probing than in groups where the interactions are identified (Connolly, Jessup, & Valacich, 1990; Jessup, Connolly, & Tansik, 1990; Jessup, Tansik, & Laase, 1988). Moreover, and more pertinent to our investigation, studies have suggested that the absence of gender and other status cues eliminates biased devaluation of contributions (Herschel, 1994; Klein, 2000; Klein & Dologite, 2000; Lockheed & Hall, 1976; Meeker & Weitzel-O’Neill, 1977; Sell, 1997; for a recent study on anonymity and gender in computer-mediated group environments, see Flanagin, Tiyaamornwong, O’Connor, & Seibold, 2002).6 Thus, an advantage of anonymity in GSS-supported groups is that it enables group members to propose bold conjectures and ideas of their own and to criticize the conjectures and ideas of others. However, GSS anonymity also has serious disadvantages, including deindividuation as well as social loafing, failure to listen, and poor socialization, all of which have the potential for decreasing group effectiveness and increasing group dissatisfaction (Hayne & Rice, 1997).7 This paper will specifically focus on the absence of gender cues and on the deindividuation that result from the anonymity feature of GSS,8 and will, in essence, recast these consequences in terms of applied ethics.

6 But see Herring (1994, 1996), suggesting that, notwithstanding the absence of nonverbal cues in computer-mediated communication (CMC), readers are able to discern the gender of the sender on the basis of language cues. See Savicki, Lingenfelter, and Kelley (1996), weakly supporting Herring’s notions of “gendered” communication. See also Savicki, Kelley, and Ammon (2002).

7 See generally Zmud (1990, p. 95) on the double-edged sword characteristic of information technology: “If used well, IT can beneficially affect an organization and its members. If used inappropriately, the same technology may very well be associated with negative consequences.”

8 See note 5 earlier.


2. Absence of gender cues

2.1. Social psychological literature

Studies have reported that in FTF mixed gender groups the rate of participation for women is lower than that for men, with women having an inclination toward stifling their ideas (Craig & Sherif, 1986) due to evaluation apprehension (see Meeker & Weitzel-O’Neill, 1977) inter alia. Smith-Lovin and Brody (1989) have found that in FTF mixed gender groups, men interrupt women more frequently than they do other men, a mode of interaction that hinders free expression of ideas and free-flowing discussions (see also Tannen, 1994, pp. 53–83). The unequal participation of women in intellectual teamwork is a cause for concern in light of the statistic that women today comprise half of middle management as well as of the workforce in general (Conlin & Zellner, 1999). “Equal participation [of all group members] has the potential to improve the quality of interaction and perhaps to provide opportunity for more critical discussion of decision alternatives” (Adkins, Shearer, Nunamaker, Romero, & Simcox, 1998, p. 518).

Expectation states theory (Berger, Fisek, Norman, & Zelditch, 1977; see also Berger & Zelditch, 1998) provides a sociological explanation for gender variations in mixed groups, arguing for the notion that these gender imbalances are status induced (e.g. see Carli & Eagly, 1999, pp. 204–206; Lockheed & Hall, 1976; Meeker & Weitzel-O’Neill, 1977; Sell, 1997). Specifically, the theory suggests that group members tend to evaluate other members on the basis of stereotypical performance expectations, which are influenced by external status characteristics, including gender. Thus, according to expectation states theory, because society regards men as having a higher status than women, the task contributions of men will be perceived as being worthier than the contributions of women, even when the male group members are, in fact, poorer performers. In light of the devaluation of their contributions, women may be further reluctant to express their ideas.

GSS with anonymous interaction capability provides an environment free of gender and other status cues, thereby ensuring that the contributions of each group member are evaluated only on merit and not on the contributor’s external characteristics (see Herschel, 1994; Klein, 2000; Klein & Dologite, 2000). In a study on the effect of computer support tools and gender composition on creative idea generation by groups, Klein and Dologite have reported that mixed gender groups using anonymity-featured GSS generated ideas that were as innovative as the ideas generated by all-male or all-female groups using GSS. In particular, mixed gender anonymity-featured GSS-supported groups generated ideas that were as innovative as the ideas generated by same gender anonymity-featured GSS-supported groups on all the measures of innovativeness that were analyzed: novelty, usefulness, feasibility, a novelty–usefulness–feasibility composite item, and an overall creativity item (global measure). Using expectation states theory to explain their results, Klein and Dologite have suggested that the anonymity feature of GSS eliminates gender as a status characteristic and thus equalizes participation by allowing for the evaluation of ideas without gender’s distorting influence.


Other studies have similarly suggested that anonymity-featured GSS can be used to enhance the creativeness of ideas generated by groups (e.g. Bostrom & Nagasundaram, 1998; Connolly et al., 1990; Nagasundaram & Bostrom, 1994–1995; Siau, 1996; see also Hender, Dean, Rodgers, & Nunamaker, 2001; Satzinger, Garfield, & Nagasundaram, 1999, empirical studies where the unit of analysis was the group member rather than the group). Moreover, research has also indicated that anonymity-featured GSS-supported groups produce a greater number of ideas and higher quality ideas than traditional FTF groups without GSS support (e.g. Barki & Pinsonneault, 2001; Gallupe, Cooper, Grise, & Bastianutti, 1994; Gallupe, DeSanctis, & Dickson, 1988). In a pioneering study, Jessup and Tansik (1991) have conducted a laboratory experiment that compared anonymity-featured GSS-supported groups and nonanonymity-featured (identified) GSS-supported groups, finding “that group members working anonymously are more likely to embellish ideas and question solutions, and will generate slightly more comments than those working under conditions of identifiability” (p. 274).

Surprisingly, there have been no empirical studies that have compared anonymity-featured GSS-supported groups, nonanonymity-featured GSS-supported groups, and traditional FTF groups (three-way comparison, at group level of analysis) with respect to the level of creativity of ideas generated or the number of creative ideas generated by the group. Accordingly, no firm conclusions can be drawn for the proposition that in anonymity-featured GSS-supported mixed gender groups, a greater number of creative ideas will be generated because the masking of gender cues, or gender neutrality, will equalize male–female participation rates. However, although a greater number of ideas generated does not necessarily mean a greater number of creative ideas generated (see Bostrom & Nagasundaram, 1998), it may be inferred that the more ideas generated, the more opportunity there is for creative ideas to be among them (see Osborn, 1957, for the “quantity breeds quality” premise in the context of FTF brainstorming groups). Conclusively establishing the relationship between GSS anonymity and the generation of creative ideas in mixed gender groups will require experimental studies, which should be a high priority for future researchers interested in the effects of anonymity on creativity in computer-mediated environments.

2.2. Ethical implications

Having discussed the absence of gender cues as a social psychological consequence of anonymity, we now turn to an ethical inquiry of this consequence, which leads us to an examination of the values of justice and autonomy.

2.2.1. Justice

According to Rawls (1971, p. 3), “Justice is the first virtue of social institutions,” and for Aristotle, it “occupies a key position among the virtues” (MacIntyre, 1988, p. 106). Accordingly, from the perspectives of both moral philosophy and intuitive, ordinary conceptions of fairness, principles of justice should inform all decision-making and other small task-orientated groups in their deliberations. The anonymity
feature of GSS advances justice in that it allows equal participation by all group members irrespective of gender and other external status characteristics, thereby ensuring that the contributions of members will be evaluated solely on the basis of merit (e.g. see Klein & Dologite, 2000; Sell, 1997). The most often cited conception of justice is Aristotle’s principle of formal equality, which holds that equals should be treated equally and that, correspondingly, unequals should be treated unequally (Aristotle, trans. 1990; see also Beauchamp & Childress, 1994, pp. 328–329). As such, in decision-making and other small task-orientated groups, contributions of equal merit should be equally evaluated in order to promote the interests of justice.9

Furthermore, the notion of justice mandates that, in actuality, the equal opportunity to participate be afforded to all group members. Expanding on the work of Rawls (1971, 1999, 2000), for whom “the fundamental idea in the concept of justice is fairness” (Rawls, 1999, p. 47), Daniels (1996) has regarded the “fair equality of opportunity” as a necessary element of the justice principle. According to Daniels (p. 310), the fair equality of opportunity mandates not only the removal of constraints on equality of opportunity based on merit but also the provision of “equality of access—to those with inferior initial competitive positions resulting from . . . biological or social accidents.” It follows, then, that when, in traditional FTF groups, negative gender stereotyping results in the contributions of women being underrated and discouraged, an ethical problem arises. Specifically, to the extent that in these groups equal contributions are evaluated differentially or that the contributions of a group member are inhibited, there is a violation of the justice principle.

By contrast, the deployment of GSS with anonymous interaction capability and the resultant masking of gender cues furthers the value of justice by making possible the “fair equality of opportunity” (see Daniels, 1996) in permitting the merit-based evaluation of ideas and thereby equalizing rates of participation for both genders. Such “social equalization” ensures that group members “offer more ideas and more creative ideas than in standard brainstorming sessions” (Kiesler, 1994, pp. 6–7). GSS, in essence, provides a “veil of ignorance,” whereby group members are deprived of knowledge of the gender of other group members. We borrow the term “veil of ignorance” from Rawls (1971, 2000), who has used it differently to refer to his notion that principles of justice should be arrived at by each individual imagining a lack of knowledge of his or her social position, which results in the setting aside of the individual’s biases. In contrast to Rawls, our usage of the term focuses on the actual ignorance of group members regarding the gender of fellow members, which is made possible by the anonymity feature of GSS. It is this ignorance that eliminates negative stereotyping.

9 Specific criteria for justice—referred to in the philosophy literature as “material principles of justice”—in addition to our criterion of “to each person according to merit,” include such principles as “to each person according to need” and “to each person according to effort” (see Beauchamp & Childress, 1994, pp. 329–331). However, it is the “to each person according to merit” principle that is pertinent to our analysis.


2.3. Autonomy

As GSS specifically encourages interaction, collaboration, and decision making within groups, this situation calls out for a philosophical examination of GSS’s impact on the autonomy of individual group members. As will be shown, the “veil of ignorance” that furthers justice also advances personal autonomy.

A key concept within the domain of moral philosophy, autonomy refers to individual self-determination and includes “personal rule of self that is free from . . . controlling interferences by others” (Beauchamp & Childress, 1994, p. 121). Hill (1991) has tersely defined autonomy as “self-governance” (p. 50), asserting that self-respect is grounded in autonomy (p. 1). Recognizing the importance of autonomy to the individual within the larger society, Hirshleifer (1980, p. 650) has termed autonomy “the bedrock value of that classical liberalism still popular hereabouts.” According to Kant (1785/1997), autonomy is an ethical value because not regarding individuals as self-directed agents is to treat them as merely means to ends and not as ends in themselves, thereby running afoul of the self-legislated, duty-based, universalizable categorical imperative, which demands adherence by all individuals in all circumstances without exception. For Kant, “it is impossible for a creature with a will to regard itself simply as a means to an end” (Lindley, 1986, p. 20). Thus, Kant has viewed autonomy as “the foundation of human dignity and the source of all morality” (Hill, 1991, p. 43). By contrast, the utilitarian John Stuart Mill (1859/1974) has held that autonomy has practical worth in that it promotes self-development. Whereas for Kant autonomy is a value in and of itself, for Mill autonomy is a value only insofar as it contributes to the general welfare of society at large. (For a comprehensive historical account of the concept of autonomy see Schneewind, 1997.)

Gerald Dworkin (1988, p. 7) has spoken in terms of “character[izing]” the concept of autonomy rather than defining it because of its complexity and lists the various related ways the term has been used:

It is apparent that, although not used just as a synonym for qualities that are usually approved of, “autonomy” is used in an exceedingly broad fashion. It is used sometimes as an equivalent of liberty. . ., sometimes as equivalent to self-rule or sovereignty, sometimes as identical with freedom of the will. It is equated with dignity, integrity, individuality, independence, responsibility, and self-knowledge. It is identified with qualities of self-assertion, with critical reflection, with freedom from obligation, with absence of external causation, with knowledge of one’s own interests. (p. 6)

10 This notion that liberty consists of the ability to act without interference by others has been championed by Isaiah Berlin (1998), who has referred to this kind of liberty as “liberty in the negative sense” or “negative freedom” (p. 194). According to Berlin:

Political liberty in this sense is simply the area within which a man can act unobstructed by others. If I am prevented by others from doing what I could otherwise do, I am to that degree unfree; and if this area is contracted by other men beyond a certain minimum, I can be described as being coerced, or, it may be, enslaved. . . . Coercion implies the deliberate interference of other human beings within the area in which I could otherwise act. You lack political liberty or freedom only if you are prevented from attaining a goal by human beings. (p. 194)

There is a near-unanimous consensus by philosophers that autonomy requires both liberty (independence from controlling influences)10 and agency (capacity for


intentional action) (Beauchamp & Childress, 1994, p. 121). According to Beauchamp and Childress (p. 123), an autonomous act is one that is done (a) intentionally, (b) with understanding, and (c) without controlling influences of others. In our analysis of autonomy within anonymity-featured GSS, we will focus on the “absence of controlling influences” aspect of autonomy.

The masking of gender cues, or gender neutrality, in GSS-supported groups with anonymous interaction capability fosters a dual autonomy. To begin with, as discussed earlier in connection with the value of justice, the masking of gender cues, made possible by the anonymity feature of GSS, promotes women’s complete participation by inhibiting the underrating of the contributions of women group members. Thus, through anonymity, women are given voice and are encouraged to fully express their ideas with the expectation that their ideas will be evaluated on their inherent worth and not on the gender of their proposer. Moreover, and just as importantly, GSS anonymity fosters the autonomy of those group members who are judging the contributions of other members by making certain that the assessments are not affected by anti-women prejudices. Hill (1991, p. 50) has noted that “[p]eople are not self-governing, in a sense, when their responses to problems. . . are shaped by prejudices at odds with the noble sentiments they think are moving them.” In short, GSS anonymity liberates group members from the straitjacket of prejudicial judgments. Thus, the masking of gender cues allowed by the anonymity feature of GSS fosters the autonomy of group members who are judging the contributions of other group members as well as the autonomy of the group members whose contributions are being judged.

In this paper, we argue that the masking of gender cues in mixed gender groups supported by anonymity-featured GSS may enhance the creativity of the groups. Viewed through the prism of applied ethics, the argument asserts that the two-fold autonomy made possible by the anonymity feature of GSS encourages creative idea generation in mixed gender groups. Although, as indicated earlier, there have been no empirical studies comparing anonymity-featured GSS-supported groups, nonanonymity-featured GSS-supported groups, and traditional FTF groups on the level of creativity of ideas generated or the number of creative ideas generated, the existing literature strongly suggests that GSS enhances the creativity of groups (e.g. see Connolly et al., 1990; Klein & Dologite, 2000; Siau, 1996).

2.3.1. Miscommunication and its offsetting effect on the advantage of gender neutrality

2.3.1.1. The plight of GSS as a text-based medium.

The advantage of the absence of gender cues in anonymity-featured GSS, with the consequent promotion of the value of autonomy, may be offset by the chronic miscommunication that plagues CMC (see Rainey, 2000). Specifically, miscommunication occurs in CMC primarily because text-based media do not convey the nonverbal nuances and emotional content that are reflected in voice patterns (vocal cues, such as pitch, rate, loudness, and tone), body movement, eye contact, and other physical gestures (Crystal, 2001, p. 36; Hiltz & Turoff, 1993, pp. 76–83; see also Barile & Durso, 2002, pp. 175–176; Drake, Yuthas, & Dillard, 2001, p. 47; Rainey, 2000, p. 27; P. Wallace, 2001, p.
216).11 In light of this shortcoming, CMC, which includes GSS, has been called an “impoverished medium” (see Rainey, 2000, p. 21), “transmit[ting] less of the natural richness and interaction of interpersonal communication than face-to-face interaction” (Rice & Love, 1987, p. 87). According to Savicki and Kelley (2000, p. 817), “[b]ecause [CMC] is text-based and participants have no access to facial expressions, body language, and tone of voice, there is a loss of richness of interpersonal cues.” Without nonverbal cues to accompany and clarify the message in GSS, there is the danger that the message received by the recipient will not be understood in the manner intended by the sender. Thus, the GSS message received will not represent the idea expressed by the sender. To the extent that the ideas of a group member, man or woman, are distorted, the member’s autonomy—and the self-assertion and unconstrained freedom to express one’s self that it implies (see Dworkin, 1988, pp. 6, 24–25)—is diminished.12

A related problem similarly resulting in miscommunication and a diminution of autonomy is the absence of exclamations—reaction signals (e.g. “uh-huh”) and comment clauses (e.g. “you know,” “I see”)—in CMC, which contributes to the ambiguity of the message. GSS messages, like e-mail and chat group interactions, and unlike speech, lack the use of reaction signals and comment clauses, which indicate understanding or agreement (see Crystal, 2001, p. 40). According to some researchers, it is the absence of these features of spoken language that explains “why so many Internet interactions are misperceived as abrupt, cold, distant, and antagonistic” (Crystal, p. 40; see also P. Wallace, 2001, p. 16). The same should hold true for GSS messages. Reaction signals and comment clauses in spoken language give the speaker simultaneous feedback. In contrast, in CMC interactions, “[t]here is no way for a participant to get a sense of how successful a message is, while it is being written—whether it has been understood, or whether it needs repair. There is no technical way . . . of allowing the receiver to send the electronic equivalent of a simultaneous nod, an uh-uh, or any other audio-visual reactions which play such a critical role in face-to-face interaction” (Crystal, 2001, p. 30).13

11 Nonverbal behaviors ‘‘leak’’ information that is not verbally expressed in speech, although some nonverbal behaviors are more controllable than others (Zuckerman, DePaulo, & Rosenthal, 1981). For example, facial expressions can be more effectively consciously manipulated than tone of voice (Zuckerman, Larrance, Spiegel, & Klorman, 1981).

12 We assume that miscommunication is evenly (equally) distributed between male and female members of the group. Hence, both genders are treated equally, and thus the sum total of justice is not diminished. However, by distorting expressed ideas and thereby, in effect, sabotaging self-assertion and freedom of expression, miscommunication reduces the autonomy of both male and female members, consequently diminishing the sum total of autonomy while keeping the sum total of equality constant.

13 Hiltz and Turoff (1978, p. 109) have found less communication of overt agreement, as reflected in the use of ‘‘uh-huh’’ or ‘‘yeah,’’ among members of computer-mediated groups than among members of face-to-face groups. Commenting on those research findings, P. Wallace (2001, p. 16) has explained that ‘‘[t]he simple ‘uh-huhs’ that a person uses to show understanding and agreement with the speaker were far less common in the online meeting.’’ The implication of the Hiltz and Turoff study is that there exists a greater risk of miscommunication among members of computer-mediated groups, including those supported by GSS, because of the sparse use of reaction signals and comment clauses that clarify communication.

Another reason for the ambiguity in GSS messages, along with the resultant distortion of intended meaning and diminution of autonomy, is the text-based medium’s reliance on typed input. Crystal (2001, pp. 57–58) explains: ‘‘Typing, not a natural behaviour, imposes a strong pressure on the sender to be selective in what is said, especially if one is not a very fast or competent typist. And selectivity in expression must lead to all kinds of inclarity.’’

2.3.1.2. Strategies to minimize miscommunication. It is suggested that miscommunication in GSS, as well as in CMC in general, can be reduced by clear, deliberate, and expressive writing, along with the use of imagery, emoticons, and emphatic indicators. GSS participants should be instructed and encouraged to write messages that are unambiguous and complete, messages that clearly state the ideas proposed by the sender and the points the sender wishes to make. GSS participants should be urged to express emotion if such expression would clarify the message. GSS-supported groups may consider adopting the chat group convention of using verbal glosses enclosed within angle brackets to express emotion (Crystal, 2001, pp. 35, 39). Examples provided by Crystal include ‘‘<Hoppy giggles quietly to himself>’’ (p. 35) and ‘‘<Eagle smiles sympathetically at Gunner>’’ (p. 39). Sloppy and truncated writing should be discouraged. GSS participants should be told to use extra words if these would make the sender’s meaning more precise. Clarity, not eloquence, should be the goal. In fact, Nunamaker et al. (1991, p. 49) have observed that electronic media such as GSS ‘‘typically promote more careful and precisely worded communication’’ (see also Daft & Lengel, 1986, p. 560). Asynchronous GSS, like e-mail, is especially conducive to quality writing because it is ‘‘not done in real time as is face-to-face communication, so participants can craft their communication . . . with more care’’ (Rainey, 2000, pp. 23–24).
The quality of GSS messages can be improved by the inclusion of imagery, which refers to the use of vivid language, metaphors, and other figures of speech in order to create mental images or pictures in the mind of the reader (see Hess, 1987).14 These mental images appeal to, or activate, one or more of the five senses—sight, sound, smell, taste, and touch—as well as awareness of bodily processes (e.g. breathing, temperature) and muscular tension (e.g. bodily movement) (see Litlangs, 2000; see also Paivio, 1979, p. 11). According to Litlangs (¶ 3), the images evoked are ‘‘mental representations of sensory experience, a storehouse of devices by which the original senses of nature, society, commerce, etc. could be recreated.’’ Although imagery forms the essence of poetry (see Packard, 1992, pp. 32–34), it can be utilized to great advantage to enhance narrative writing, including GSS messages, by increasing clarity, adding detail, and conveying emotion (see Hess, 1987; Pounds, 2000). The writing in newsmagazines offers examples of effective use of imagery. In an item on Heineken beer and its founder, BusinessWeek has reported: ‘‘He was a visionary who understood that a good brew could travel and proved it by turning a venerable Dutch lager into the best-known beer brand in the world—the sudsy equivalent of Marlboro or Coca-Cola [italics added]’’ (Baker, 2002, p. 56). The Economist (‘‘Treatment of choice,’’ 2002, p. 51), in an article on corporate health plans, has observed: ‘‘In America, paying for health care is like having a nasty recurring fever. Just when it seems that harsh medicine has brought costs under control, the temperature rises and spending shoots up once again [italics added].’’

14 Metaphors ‘‘structure images or concepts in terms of other images or concepts’’ (S. Zuboff, 1988, p. 454, note 11). According to Lakoff and Johnson (1980, p. 5), ‘‘[t]he essence of a metaphor is understanding and experiencing one kind of thing in terms of another.’’ In connection with Lakoff and Johnson, Zuboff has noted that ‘‘[b]ecause human thought processes are largely metaphorical, the use of metaphor in everyday language provides a window onto the conceptual system that people use to structure and interpret experience’’ (p. 454, note 11).

Emoticons, short for emotional icons, also referred to as smileys and graphic accents, are ‘‘playful combinations of punctuation marks designed to show some facial expression,’’ thereby ‘‘enhanc[ing] the socioemotional content of the message’’ (P. Wallace, 2001, p. 18). Emoticons, then, serve as a substitute for the expressive function of nonverbal cues. Crystal (2001, p. 36) has described emoticons as ‘‘combinations of keyboard characters designed to show emotional facial expression: they are typed in sequence on a single line, and placed after the final punctuation mark of a sentence.’’ For example, :-) denotes happiness or satisfaction, while :-( denotes sadness or disappointment. (For a list of conventional emoticons, see Baker & Baker, 2001, Appendix A, p. 256; see also Kay, 2002, p. 42.) Most emoticons are read sideways by turning one’s head to the left. The resultant text–emoticon combination is capable of conveying a more precise and nuanced message than text alone.15 However, emoticons should not be used indiscriminately. Crystal (2001, p. 38) has warned that ‘‘[t]hose who get in the habit of routinely using smileys [emoticons] can also find themselves in the position of having their unmarked utterances misinterpreted precisely because they have no smiley attached to them.’’

15 In a study of Internet and CompuServe (an online subscription service) messages of publicly posted newsgroups and special interest groups, Witmer and Katzman (1997) have reported that women used more emoticons than men in their postings. The results of this lone study raise the interesting issue of whether emoticons are gender markers. An affirmative answer may indicate that emoticons undermine the masking of gender cues in anonymity-featured GSS and the resultant ethical values of justice and autonomy. However, as this study found that only a small percentage of all messages (13.2% of 3000 posts) included emoticons, further empirical research is needed to conclusively establish the relationship between gender and the use of emoticons (see Crystal, 2001, p. 167, note 64).

The use of emphatic indicators can reduce ambiguity and distortions in communication. These signifiers of emphasis are the CMC analogue of ‘‘vocal variations in pitch (intonation), loudness (stress), speed, rhythm, pause, and tone of voice’’ (Crystal, 2001, p. 34). Common emphatic indicators, listed by Crystal (pp. 34–35), include repeated letters (e.g. ‘‘aaaaahhhhh,’’ ‘‘ooops’’), repeated punctuation marks (e.g. ‘‘of course !!!!!’’), all capitals to indicate shouting (e.g. ‘‘I WANT IT DONE TODAY’’), and asterisks to show emphasis (e.g. ‘‘an *old-fashioned* kitchen’’).
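The conventions just described (emoticons, repeated punctuation, all-capitals shouting, asterisk emphasis) are regular enough to be detected mechanically. The following Python sketch is purely illustrative and is our own, not part of any GSS product; the function name and the deliberately minimal emoticon set are assumptions for the example.

```python
import re

# Minimal illustrative set, following the conventions described in the text.
EMOTICONS = {":-)": "happiness/satisfaction", ":-(": "sadness/disappointment"}

def detect_cues(message):
    """Return a list of (cue, interpretation) pairs found in a CMC message."""
    cues = []
    for symbol, meaning in EMOTICONS.items():
        if symbol in message:
            cues.append((symbol, meaning))
    # Repeated punctuation marks signal emphasis (e.g. "of course !!!!!").
    if re.search(r"[!?]{3,}", message):
        cues.append(("repeated punctuation", "emphasis"))
    # A run of two or more capitalized words is conventionally read as shouting.
    if re.search(r"\b[A-Z]{2,}(?:\s+[A-Z]{2,})+\b", message):
        cues.append(("all capitals", "shouting"))
    # Asterisks surrounding a word mark emphasis (e.g. "an *old-fashioned* kitchen").
    if re.search(r"\*\w[\w-]*\*", message):
        cues.append(("asterisks", "emphasis"))
    return cues
```

A GSS facilitator could, in principle, use such a filter to flag messages whose emotional cues might be misread, though the paper's recommendations above are addressed to human writers rather than software.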


3. Deindividuation

3.1. Social psychological literature

Lea and Spears (1991, p. 284) have summarized the literature on the negative social psychological consequences of computer-mediated communication in general thus: ‘‘[V]arious technological features of electronic communication trigger psychological states and processes that result in less normative influences on individuals and groups and more deregulated and extreme behaviour.’’ Deindividuation is one such psychological state. The absence of a social presence made possible by the anonymity offered by GSS may result in deindividuation (Hayne & Rice, 1997; Jessup et al., 1990; see also Coleman, Paternite, & Sherman, 1999; Siegel, Dubrovsky, Kiesler, & McGuire, 1986), a state of diminished self-awareness wherein individuals act as if they were submerged or psychologically absorbed in a group, leading to weakening of social norms, abandonment of inner restraints, and loss of evaluation apprehension (Diener, 1980; Festinger, Pepitone, & Newcomb, 1952).16 Thus, deindividuation refers to ‘‘a loss of identity leading to an antisocial state’’ (Lea & Spears, 1991, p. 285).

16 Although the concept of deindividuation is within the realm of social psychology, the legal literature is especially rich in discussions of deindividuation (e.g. see Brand-Ballard, 1996; Gandy, 2000; Judges, 1999; Kahan, 1997; King, 1996; Perlin, 1994; Rachlinski, 2000; Salamanca, 1999; Schkade, Sunstein, & Kahneman, 2000; Sunstein, 2000).

According to Festinger et al. (1952, pp. 382–383), when individuals are deindividuated:

[they] are not seen or paid attention to as individuals. The members do not feel that they stand out as individuals. Others are not singling a person out for attention nor is the person singling out others. . . . [U]nder conditions where the member is not individuated in the group, there is likely to occur for the member a reduction of inner restraints against doing various things. In other words, many of the behaviors which the individual wants to perform but which are otherwise impossible to do because of the existence, within himself, of restraints, become possible under conditions of de-individuation in a group.

Jessup et al. (1990) have proposed the theory of anonymous interaction, which advances the notion that anonymity in GSS-supported groups undermines external social controls and thus leads to a reduction of internal restraints. According to Jessup et al. (p. 339), ‘‘anonymity will foster deindividuated behavior because the system buffers group members from each other and detaches individuals from their contributions.’’ Anonymity, then, is a two-faced coin. It allows group members to propose bold ideas that they would not put forward if their contributions were identified. But anonymity also encourages antisocial or reckless behavior, such as group members being ‘‘overly caustic in their evaluations’’ of the contributions of other members (Jessup et al., 1990, p. 339). In addition, the dehumanization and the loss of inhibitions that accompany deindividuation may result in flaming, which refers to the making of insulting, hostile, or obscene comments, including name calling, swearing, and interjecting irrelevant remarks (see Kabay, 1998; Kiesler & Sproull, 1992; Riva & Galimberti, 1998; Sproull & Kiesler, 1991; see also Agre, 1998, p. 3; Aiken & Waller, 2000; Reinig, Briggs, & Nunamaker, 1997–1998). Thus, without the social constraints of public visibility and accountability, irrelevant comments may be offered and verbal aggression committed, thereby sabotaging the processes of decision making and other intellectual teamwork by diverting attention from the task at hand and poisoning the atmosphere.

3.2. Ethical implications

3.2.1. Ring of Gyges scenario

The psychological state of deindividuation brings about a ‘‘Ring of Gyges scenario,’’ K. A. Wallace’s (1999, p. 31) characterization of a state of affairs in which immunity against the consequences of ‘‘unethical or criminal action’’ is conferred by virtue of the cloak of anonymity. Wallace uses the Ring of Gyges scenario as a metaphor for this negative aspect of anonymity in cyberspace.17 The term ‘‘Ring of Gyges’’ originates in a myth, recounted in The Republic of Plato (trans. 1991), concerning the law-abiding shepherd Gyges, who finds a ring that can render him invisible at will and enable him to do as he pleases without fear of punishment. Acting on his most wicked ambitions, Gyges enters the service of the king of Lydia, seduces the queen, kills the king, and seizes the throne. According to K. A. Wallace (1999, p. 31), this parable suggests that ‘‘[e]ven when the initial primary purpose [of anonymity] is to protect [the anonymous individual] from harmful actions by others or to promote positively valued activity, anonymity also provides space for action with impunity, and hence, the Ring of Gyges scenario.’’

17 According to Drury (1996, p. 1229), writing within the legal tradition, Plato’s myths are ‘‘intended as fictions depicting profound truths about . . . the human condition.’’

3.2.2. Intentional misconduct

Wallace’s innovative adoption of the Ring of Gyges scenario to refer to situations involving bad behavior within cyberspace, apparently freely chosen or intentional, combined with a loss of constraints and a lack of accountability is in line with its use by ethicists as a device with which to pose the question ‘‘Would any person continue to obey the principles of morality if [one] could break them without being caught and punished?’’ (Arrington, 1998, p. 39; see also Martin, 1992, p. 174).18 The use of the Gyges story by Wallace (1999) also parallels its utilization within legal scholarship, where the expression ‘‘ring of Gyges’’ has been employed in connection with situations that provide individuals with freedom to choose between right and wrong along with ‘‘opportunities to do wrong without suffering the consequences’’ (Drury, 1996, p. 1232; see also Atkinson, 1995, p. 218, note 216; Kaufman, 1988, p. 256; Novak, 2000, p. 589; Nygaard, 1998, p. 381; Rosen, 1989, p. 1256; Wagner, 1994–1995, p. 536; Yeager, 1997, p. 1302; Yoshino, 1998, p. 521, note 147).

18 The use of stories for the purpose of furthering moral development is an ancient device of philosophers. Typical are the dialogues of Plato. For discussions regarding ancient philosophers as storytellers-cum-moral teachers, see law review articles by Atkinson (1995, p. 218) and Drury (1996, pp. 1229–1230).

GSS-supported groups with anonymous interaction capability are at significant risk for Ring of Gyges scenarios. Anonymous and thereby ‘‘completely cushioned from the possibility of being punished’’ (A. Zuboff, 2002, p. 39), members of such groups may be induced to act in an antisocial manner (e.g. flaming) with adverse effects on group productivity. For example, making malicious or distracting comments in group discussions may create conflict or confusion, interrupt the flow of ideas, divert the energies of the group from its appointed task, and undermine decision-making, problem solving, and idea generation. Contributions by other group members may be evaluated without the necessary neutrality and detached perspective (see K. A. Wallace, 1999, p. 32). Remarks of an intimidating nature may constrain group members from making contributions, some of which may be creative and useful.

3.2.3. Negligent misconduct

Our discussion thus far has concerned Ring of Gyges scenarios resulting from the intentional acts of GSS participants. We now consider whether Ring of Gyges scenarios can exist within the context of negligent misconduct. Following the law of torts,19 we define negligence as the failure to exercise reasonable care under the circumstances (Prosser et al., 1984, pp. 164–168). Negligence, then, refers to careless conduct that brings about unintended consequences. In GSS-supported groups, anonymity and the concomitant lack of accountability may encourage poor writing, ambiguous phrasing, incomplete consideration of issues, and inattentiveness to proposals of others, resulting in miscommunication and distortion of the intended message (see earlier). Such carelessness may arise from mere laziness of the unmonitored group member or the informality and relaxed standards that are part and parcel of cyberculture. Although unintended, the damage nonetheless may be serious and may subvert the group’s task. Inarticulate and unclear writing, inattention, and insufficient consideration of matters relating to the group task are practical problems for firms using GSS to assist in intellectual teamwork. We now examine whether such negligent behavior is an ethical problem as well. Specifically, does a negligently prepared GSS message rise to the level of a Ring of Gyges scenario? If we answer the question in the affirmative, then, as a practical matter, any solution we propose to avert, or decrease the incidence of, Ring of Gyges scenarios should apply to such negligent conduct (see later). In considering this issue, we again cross disciplinary boundaries by importing concepts from the law of torts, and we make a distinction between ordinary negligence, on one hand, and what has been variously termed gross negligence, reckless conduct, and willful and wanton misconduct, on the other.

19 Torts refer to civil wrongs for which the law affords a remedy (see Prosser, Keeton, Dobbs, Keeton, & Owen, 1984, pp. 1–7).


The negligence that we have been discussing so far is ordinary negligence, that is, the failure to exercise ordinary care or the care that a reasonable person would exercise under similar circumstances. Ordinary negligence, such as that usually involved in carelessly written GSS messages, does not appear to involve a Ring of Gyges problem, as the latter term has been employed in the legal and philosophy literatures. In these literatures, the term Ring of Gyges has been used in connection with situations in which a criminal act, lying, unscrupulous behavior, or other intentional misconduct—not mere carelessness—occurs because of the absence of public scrutiny and accountability (e.g. see Drury, 1996; Kaufman, 1988; Novak, 2000; see also A. Zuboff, 2001). A negligently prepared GSS message typically lacks the element of moral corruption required to fall within the ambit of a Ring of Gyges scenario. However, we suggest that conduct that is so egregious that it is deemed by the law of torts to fall within the category of gross negligence should come within the Ring of Gyges rubric. At its most basic, gross negligence concerns a much greater degree of inattention or inadvertence than ordinary negligence, a degree of carelessness that is so reckless and wanton that it approaches the intentional. The jurist Learned Hand defined gross negligence in terms of ‘‘some opprobrium or reproach’’ (Conway v. O’Brien, 1940). Thus, for example, writing incoherent or harassing GSS messages after irresponsibly and recklessly drinking an excessive amount of alcohol would be considered gross negligence and, therefore, should fall within the ambit of a Ring of Gyges scenario.

3.2.4. Averting or decreasing the incidence of Ring of Gyges scenarios: practical solutions

How can such Ring of Gyges scenarios be averted or, at least, reduced in number in GSS-supported groups where anonymity is conferred upon group members?
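One answer, developed in the remainder of this section, is limited or traceable anonymity. As a minimal illustrative sketch (the class and method names below are our own invention, not drawn from any actual GSS), such an arrangement can be modeled as an administrator who assigns pseudonyms, alone holds the pseudonym-to-identity mapping, and quietly removes an offender when agreed-upon or publicized norms are violated:

```python
import secrets

class TaggingAdministrator:
    """Illustrative sketch of traceable anonymity: the administrator alone
    holds the pseudonym-to-identity mapping; members see only pseudonyms."""

    def __init__(self, norms):
        self.norms = norms       # agreed-upon or publicized norms of conduct
        self._identities = {}    # pseudonym -> real identity (administrator-only)
        self.active = set()      # pseudonyms currently participating

    def register(self, real_identity):
        """Assign a 'sign-in' pseudonym; only the administrator keeps the link."""
        pseudonym = "member-" + secrets.token_hex(4)
        self._identities[pseudonym] = real_identity
        self.active.add(pseudonym)
        return pseudonym

    def report_violation(self, pseudonym, norm):
        """On a norms violation, remove the offender without revealing the
        identity to the rest of the group (the roster size is never divulged)."""
        if norm in self.norms and pseudonym in self.active:
            self.active.discard(pseudonym)
            return True
        return False
```

The key design point, echoed in the discussion that follows, is that the identity mapping is private to the administrator, so anonymity toward fellow group members is preserved even when a member is traced and removed.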
Discussing college classes that established anonymous ‘‘discussion rooms’’ on a LAN (local area network), K. A. Wallace (1999, p. 32) has suggested a ‘‘tagging’’ system20 ‘‘so that comments [are] traceable if certain agreed upon or publicized norms of conduct [are] violated.’’ Thus, the anonymity that is conferred when such a tagging system is in place is limited rather than absolute. Froomkin (1996, p. 417), writing within the discipline of law, has characterized such limited anonymity as ‘‘traceable anonymity,’’ which is distinguished from the ‘‘untraceable’’ kind.

20 A tagging system is similar to computer-based electronic awareness and performance monitoring systems. Awareness monitoring systems provide information on the presence or behavior of colleagues. For a recent scholarly treatment of awareness monitoring, see Zweig and Webster (2001). Performance monitoring systems ‘‘provide managers with access to their employees’ computer terminals and telephones, allowing managers to determine at any moment throughout the day the pace at which employees are working, their degree of accuracy, log-in and log-off times’’ (Aiello & Kolb, 1995, p. 339; see also Ambrose, Alder, & Noel, 1998, p. 62). For recent studies on performance monitoring, see Alge (2001) and Stanton and Julian (2002).

Applying Wallace’s (1999) tagging system proposal to GSS-supported groups will result in traceable anonymity by virtue of the presence of a system administrator (‘‘sysadmin’’), who is not a group member. This administrator can be either an external facilitator21 (also referred to as a moderator) or a mere technician, who is charged with maintenance of the technology. Such an administrator will have access to the identities of group members in either one of two ways. If all group members are situated in the same room, the administrator will know—by name or some other identifier—who is seated at each computer station. If the group members are geographically dispersed, the administrator may assign a ‘‘sign-in’’ name (a pseudonym) to each member. In either case, anonymity is limited and traceable because the identities are known to the administrator—and only to the administrator. Upon a violation of agreed upon or publicized norms of conduct, the administrator will be able to remove the offending group member, whose identity will not be revealed to the other group members. [Agreed upon norms that are decided by group consensus—not by the system administrator—prior to the GSS meeting are preferable, as giving group members a voice in establishing norms may lead to increased acceptance because of increased perceptions of procedural fairness (see Alder, 1998, p. 737; Alder & Tompkins, 1997, p. 274; Ambrose & Alder, 2000, pp. 198–199; Thibaut & Walker, 1975; see also Brockner, 2002; Fox & Amichai-Hamburger, 2001, pp. 90–91). If publicized norms are used, we suggest that standardized norms adopted by a professional organization—rather than norms drafted by the system administrator—be used, provided they exist, as they may have a proven record of effectiveness and, moreover, because decisions by an impartial third party tend to be viewed as more procedurally fair (see Lind & Tyler, 1988, p. 13; see also Thibaut & Walker, 1975).] The removal of group members from GSS-supported groups will not affect the existence or viability of the group because the number of participants need not be constant. Moreover, the removal of a group member will not be known to the other members because the number of participants in GSS-supported groups is not divulged to the members in the first instance.

21 An external facilitator is either a process facilitator, who directly interacts with the group, or a technical facilitator, who is charged with operating the technology (see Bostrom, Anson, & Clawson, 1993, p. 159).

As indicated earlier, ordinarily negligent behaviors, such as those typically found in carelessly prepared GSS messages (e.g. ambiguous writing), do not rise to the level of Ring of Gyges scenarios. Tagging would be inappropriate in these circumstances because the harm imposed by restricting, or placing a chilling effect upon, freedom of expression outweighs the harm caused by negligently prepared messages. For example, if tagging were to be imposed on situations involving ordinary negligence, there would be the danger that GSS participants would withhold some of their more innovative and complex ideas for fear that the ideas have not been well thought out or will be improperly articulated. For tagging to be effective it must be an extraordinary remedy to be invoked in extraordinary circumstances, specifically, in cases of intentional misconduct (e.g. deliberate flaming) or gross negligence (e.g. alcohol-induced harassing comments), thus substantially preserving freedom of expression. Were tagging to be resorted to in cases of mere carelessness, such a solution to the Ring of Gyges problem would create such a chilling effect as to undermine the advantage of anonymity-featured
GSS: generating ideas without fear of reprisal, ridicule, or other adverse consequences.

3.2.5. Justifications for tagging: liberty-limiting principles

Although a sensible, practical solution to the Ring of Gyges scenario, tagging appears to compromise autonomy and impose a ‘‘tyranny of the majority’’ by restricting, or having a chilling effect upon, freedom of speech (also referred to as freedom of expression, freedom of thought,22 and freedom of public discussion, e.g. see Popper, 1984/1992, p. 157; Strauss, 1952/1988, pp. 22–23). According to Smith (1994, p. 89), ‘‘[T]he tensions between individual freedoms and majoritarian control are as evident in the computer-mediated forum as they are in the more traditional communication contexts.’’

The principle of free speech, affirmed by the United Nations General Assembly in the Universal Declaration of Human Rights in 1948 (see Glendon, 2001; see also Sunstein, 2002), has been regarded as a hallmark of a free society (e.g. see Popper, 1971), wherein a ‘‘carnival of voices’’ (Edmonds & Eidinow, 2001, p. 99) informs a robust public conversation and a free exchange of ideas.23 For example, in the United States, freedom of speech,24 a fundamental principle of Anglo-American jurisprudence, is not only enshrined in the First Amendment to the Constitution and thus embedded in the American political tradition, but also ‘‘has an independent life outside of the courtroom’’ as the principle has a ‘‘large cultural presence [italics in original]’’ (Sunstein, 2001, p. 146; see also Friedman, 1998, p. 209; Laski, 1948, p. 668).25 The social and cultural importance of free speech was recognized by Dostoyevsky, who asked, ‘‘[H]ow can nihilism be fought without freedom of speech?’’ (Ozick, 2000, p. 8). In a more practical vein—and of special relevance to idea generation within groups supported by GSS with anonymous interaction capability—Popper (1984/1992, p. 205) has asserted the sociocultural belief that ‘‘only critical discussion can help us to see an idea from many sides and to judge it fairly.’’ Such a ‘‘cultural understanding’’ (Sunstein, p. 146) and expectation of free speech mandates that when freedom of expression is limited, even in non-legal contexts, it be for some compelling reason that can be ethically and philosophically justified. Accordingly, any justification for tagging must be found in the liberty-limiting principles, or autonomy restrictors, which have their genesis in the law and philosophy literatures. These principles address the issue of when an individual’s autonomy may be overridden. The legal philosopher Joel Feinberg has listed and discussed various liberty-limiting principles that can be appealed to in justifying limitations on freedom and autonomy (Feinberg, 1973, pp. 33–34, 1984, pp. 26–27). Originally articulated in the context of legal theory, especially with respect to criminal law and compulsory taxation, these principles also have been applied to the realm of medical ethics ‘‘in justifying policies and practices of institutions (such as hospitals) and the actions of individuals that affect other people’’ (Munson, 2000, p. 43). We further extend these principles, which are not mutually exclusive (Feinberg, 1973, p. 34), by appealing to them in justifying the limitations upon free speech that tagging imposes.

22 According to Popper (1984/1992, p. 208), ‘‘without free exchange of thought there can be no true freedom of thought.’’ Similarly, in Ashcroft v. Free Speech Coalition (2002), the United States Supreme Court has asserted, ‘‘The right to think is the beginning of freedom, and speech must be protected from the government because speech is the beginning of thought.’’

23 Potts (2002, p. 7) has grasped the essence of the principle of free speech as ‘‘the belief that freedom of expression is an inalienable right, and that the truth will emerge victorious in a ‘free market’ of ideas.’’

24 Although a bedrock principle of Anglo-American law and of the United States Constitution, freedom of speech has never been absolute. For example, there are laws against obscene speech in public as well as ‘‘civil liability for defamatory utterances and for nondefamatory statements that reveal information that is properly private, criminal liability for irresponsible statements that cause panics or riots, laws against incitements to crime, and (more controversially) sedition’’ (Feinberg & Coleman, 2000, p. 256).

25 Cohen (2002, p. 28) has expressed the sociocultural sensibility toward free speech thus: ‘‘Free speech is not just a Constitutional protection; it is a profoundly moral principle: there can be no meaningful human freedom without freedom of thought, and thought requires access to and an exchange of ideas.’’

The harm principle, first posited by John Stuart Mill (1859/1974) in his essay On Liberty, asserts that the liberty of an individual can be overridden or curtailed if doing so will prevent harm to others (Feinberg, 1973, p. 33, 1984, p. 27). Mill has ‘‘warn[ed] against the ‘tyranny of the majority’ that can suppress individuality in the interests of the accepted modes of behavior in society’’ (Arrington, 1998, p. 356). For Feinberg (1973, p. 54), ‘‘[w]hen persons in groups are deprived of what they need they are harmed [italics in original].’’ Freedom from outrageous and harassing behavior in a Ring of Gyges scenario is a necessity for the unhindered generation of ideas and the proper working of the group. Thus, preventing one group member from creating a Ring of Gyges scenario averts harm to the other group members. According to the harm principle, an individual’s right to free speech is subordinate to the stronger considerations of free speech for other group members.
Collective autonomy, then, trumps individual autonomy when the exercise of the latter harmfully interferes with the exercise of the former.

The welfare principle, also referred to as the social welfare principle, holds that an individual’s autonomy may be restricted in order to provide a benefit to others (Feinberg, 1973, p. 33, 1984, p. 27). According to Munson (2000, p. 45), ‘‘an ideal application of the [welfare] principle would be the case in which we give up just a little autonomy to bring about a great deal of benefit to others.’’ The prevention of a Ring of Gyges scenario—and the anarchy, intimidation, and distraction this scenario entails—is a great benefit to members of GSS-supported groups, who will thus be able to conduct discussions and to generate ideas in a calm, professional atmosphere conducive to focusing on the task at hand. Under a tagging system, group members will be able to express themselves without fear of verbal harassment, while identification of members will be allowed only in cases of outrageous behavior by a GSS group member—behavior which itself has a chilling effect on the freedom of speech of the other group members. Thus, tagging is an instance where a little autonomy is given up in order to confer a great benefit on others: the preservation of the free and focused expression of ideas relating to the task for which the group was created.

According to the principle of paternalism, also known as legal paternalism, an individual’s autonomy may be interfered with in order to prevent that individual from harming himself or herself (Feinberg, 1971, p. 105, 1973, p. 33, 1984, p. 27).

E.E. Klein et al. / Computers in Human Behavior 19 (2003) 355–382

The essence of a paternalistic act is that it ‘‘must in some way target the good of another person’’ (Crossley, 1999, p. 297). Coercing persons for their own good accords with both ‘‘common sense and our long established customs and laws’’ (Feinberg, 1971, p. 105). By inhibiting an individual group member from creating a Ring of Gyges scenario, and thereby from sabotaging the group’s work, tagging averts a blemish to the group’s reputation, a blemish that may attach to each individual group member by virtue of belonging to an ineffective or unproductive group, with possible adverse career consequences. Thus, paternalism would justify tagging in order to prevent individual group members from self-inflicted professional harm.

The principle of extreme paternalism, also called strong paternalism, asserts that an individual’s freedom may be restricted in order to benefit the individual himself or herself (Feinberg, 1973, p. 33) and ‘‘guide them, whether they like it or not, toward their own good’’ (Feinberg, 1971, p. 105). This principle, as well as the more moderate paternalism principle (see earlier), is contrary to Mill’s position that ‘‘the individual’s own good is never a sufficient warrant for the exercise of compulsion [italics in original]’’ (Dworkin, 1972, p. 64). For example, by curtailing an individual’s freedom to flame, tagging may increase the likelihood that each group member will make positive contributions to the group, which ultimately may inure to the individual member’s benefit: management may render a favorable performance evaluation [26] or award a bonus to all members of the group for a job well done (see Baker, Jensen, & Murphy, 1988; Fairburn & Malcomson, 2001; see also Milgrom & Roberts, 1992).

The offense principle holds that an individual’s liberty is justifiably restricted to prevent that individual from offending others (Feinberg, 1973, p. 33, 1984, p. 27).
This principle is most persuasively applied to conduct that may be harmless but is nevertheless repugnant (Feinberg, 1985). ‘‘[I]f the offence was extremely repellent to a large segment of the population, was very public, so much so that it could not be avoided by persons who found it offensive, then one could legitimately apply the offence principle’’ (Guth, 1999, ¶ 15). For example, flaming by one group member may be so extremely offensive to other group members (who are, in effect, a captive audience for the flamer, since all messages, including the flame, are automatically displayed on all group members’ computer screens) that such conduct may meet these criteria and thus justify tagging.

Finally, the legal moralism principle may be employed in defense of a tagging system. This principle asserts that a person’s liberty may be restricted to prevent the person from acting immorally even though the person’s conduct ‘‘causes neither harm nor offense to the actor or the others’’ (Feinberg, 1984, p. 27). Ring of Gyges behaviors are inherently immoral. In fact, as indicated above, the term Ring of Gyges has been used as a metaphor by legal scholars and ethicists for immoral conduct and evil behavior occasioned by the loss of constraints and absence of accountability that anonymity affords. By allowing comments to be traced upon a violation of certain agreed-upon or publicized norms (K. A. Wallace, 1999, p. 32), a tagging system can discourage the violation of moral standards.

[26] Consistent favorable performance evaluations of an employee may lead to a promotion within the organization (see Fairburn & Malcomson, 2001; see also Baker et al., 1988; Milgrom & Roberts, 1992).
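Wallace’s (1999) tagging proposal specifies no particular implementation. The following minimal sketch in Python illustrates one way its core mechanism might work: members post under pseudonyms, and the pseudonym-to-identity mapping is held in escrow, to be disclosed only when a moderator has flagged a message as violating the group’s agreed-upon or publicized norms. All names here (TraceableAnonymityBoard, register, post, trace) are our own illustrative choices, not part of any actual GSS product.

```python
import uuid


class TraceableAnonymityBoard:
    """Minimal sketch of a traceable-anonymity ("tagging") scheme.

    Members post under random pseudonyms; the pseudonym-to-identity
    mapping is held in escrow and disclosed only when a designated
    moderator flags a message as violating agreed-upon norms.
    """

    def __init__(self):
        self._escrow = {}    # pseudonym -> real identity (held privately)
        self.messages = []   # (pseudonym, text) pairs visible to the group

    def register(self, real_identity):
        """Issue a fresh pseudonym and record the escrow mapping."""
        pseudonym = uuid.uuid4().hex[:8]
        self._escrow[pseudonym] = real_identity
        return pseudonym

    def post(self, pseudonym, text):
        """Publish a message under a pseudonym only."""
        if pseudonym not in self._escrow:
            raise ValueError("unknown pseudonym")
        self.messages.append((pseudonym, text))

    def trace(self, message_index, moderator_approved):
        """Reveal the author's identity, but only for a flagged violation."""
        if not moderator_approved:
            raise PermissionError("tracing requires moderator approval")
        pseudonym, _ = self.messages[message_index]
        return self._escrow[pseudonym]
```

In a production GSS the escrow mapping would need stronger protection than a private attribute, for example splitting custody between a system administrator and an outside party so that no single actor could unmask a member unilaterally; the sketch shows only the logical structure of traceable anonymity.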


4. Conclusions

In this paper, we have offered an ethical perspective on an aspect of information technology (IT) and computer-supported cooperative work (CSCW): GSS with anonymous interaction capability. In the course of our ethical analysis of two social psychological consequences of the anonymity allowed by GSS (the absence of gender cues and deindividuation), we have trespassed on various disciplinary domains, including social psychology, philosophy, law, creative writing, and linguistics, and we have considered such issues as creative idea generation, miscommunication, electronic tagging, the ordinary negligence–gross negligence distinction, and free speech and its limitations.

The anonymity provided by GSS is a two-sided coin. The absence of gender cues fosters the twin values of justice and autonomy by permitting the evaluation of group members’ contributions exclusively on the ground of merit. Deindividuation, however, results in a ‘‘Ring of Gyges scenario’’ (K. A. Wallace, 1999), in which misbehavior is encouraged because of the absence of public scrutiny and accountability. We have adopted Wallace’s electronic tagging solution of traceable anonymity as a means by which Ring of Gyges scenarios can be averted or minimized. Moreover, we have provided justifications for the restrictions on free speech imposed by tagging by reference to various liberty-limiting principles.

It is suggested that future researchers investigate the ethical dimensions of other social psychological consequences of anonymity-featured GSS, examine from a philosophical point of view the importance of those consequences that are desirable, and discover practical ways of limiting those consequences that are undesirable.

Acknowledgements

The authors would like to thank the anonymous reviewers for their helpful comments.

References

Abel, M. J. (1990). Experiences in exploratory distributed organization. In J. Galegher, R. E. Kraut, & C. Egido (Eds.), Intellectual teamwork: social and technological foundations of cooperative work (pp. 489–510). Hillsdale, NJ: Erlbaum. Ackermann, F., & Eden, C. (1994). Issues in computer and non-computer supported GDSSs. Decision Support Systems, 12, 381–390. Adkins, M., Shearer, R., Nunamaker, J. F. Jr., Romero, J., & Simcox, F. (1998). Experiences using group support systems to improve strategic planning in the Air Force. Proceedings of the Thirty-First Hawaii International Conference on System Sciences, 1, 515–524. Agre, P. E. (1998, 3 July). Yesterday’s tomorrow: The advance of law and order into the utopian wilderness of cyberspace. [Special issue on information technology]. The Times Literary Supplement, pp. 3–4. Aiello, J. R., & Kolb, K. J. (1995). Electronic performance monitoring and social context: impact on productivity and stress. Journal of Applied Psychology, 80, 339–353. Aiken, M., & Carlisle, J. (1992). An automated idea consolidation tool for computer supported cooperative work. Information & Management, 23, 373–382.


Aiken, M., & Paolillo, J. (1997). A longitudinal study of group decision support system use. International Business Schools Computing Quarterly, 8(3), 49–54. Aiken, M., & Waller, B. (2000). Flaming among first-time group support system users. Information & Management, 37, 87–94. Alder, G. S. (1998). Ethical issues in electronic performance monitoring: a consideration of deontological and teleological perspectives. Journal of Business Ethics, 17, 729–743. Alder, G. S., & Tompkins, P. K. (1997). Electronic performance monitoring: an organizational justice and concertive control perspective. Management Communication Quarterly, 10, 259–288. Alge, B. J. (2001). Effects of computer surveillance on perceptions of privacy and procedural justice. Journal of Applied Psychology, 86, 797–804. Altman, A. (2000). Arguing about law (2nd ed). Belmont, CA: Wadsworth. Ambrose, M. L., & Alder, G. S. (2000). Designing, implementing, and utilizing computerized performance monitoring: enhancing organizational justice. In G. R. Ferris (Ed.), Research in personnel and human resources management (Vol. 18) (pp. 187–219). Stamford, CT: JAI Press. Ambrose, M. L., Alder, G. S., & Noel, T. W. (1998). Electronic performance monitoring: a consideration of rights. In M. Schminke (Ed.), Managerial ethics: moral management of people and processes (pp. 61– 80). Mahwah, NJ: Erlbaum. Anson, R., Bostrom, R., & Wynne, B. (1995). An experiment assessing group support system and facilitator effects on meeting outcomes. Management Science, 41, 189–208. Anson, R., Fellers, J., Kelly, G. G., & Bostrom, R. P. (1996). Facilitating research with group support systems. Small Group Research, 27, 179–214. Applegate, L. M. (1991). Technology support for cooperative work: a framework for studying introduction and assimilation in organizations. Journal of Organizational Computing, 1, 11–39. Aristotle (1990). The politics of Aristotle (E. Barker, Trans.). New York: Oxford University Press. Arrington, R. L. (1998). 
Western ethics: an historical introduction. Malden, MA: Blackwell Publishers. Ashcroft v. Free Speech Coalition, No. 00–795, slip op. at 15 US 16 April (2002). Atkinson, R. (1995). How the butler was made to do it: the perverted professionalism of the remains of the day. Yale Law Journal, 105, 177–220. Baker, K., & Baker, S. (2001). How to say it online: everything you need to know to master the new language of cyberspace. Paramus, NJ: Prentice Hall Press. Baker, G. P., Jensen, M. C., & Murphy, K. J. (1988). Compensation and incentives: practice vs. theory. Journal of Finance, 43, 593–616. Baker, S. (with White, C., & Khermouch, G.) (2002, 28 January). Freddy Heineken’s recipe may be scrapped. BusinessWeek, p. 56. Barile, A. L., & Durso, F. T. (2002). Computer-mediated communication in collaborative writing. Computers in Human Behavior, 18, 173–190. Barki, H., & Pinsonneault, A. (2001). Small group brainstorming and idea quality: is electronic brainstorming the most effective approach?. Small Group Research, 32(2), 158–205. Beauchamp, T. L., Bowie, N. E. (Eds.). (1997). Ethical theory and business (5th ed.). Upper Saddle River, NJ: Prentice Hall. Beauchamp, T. L., & Childress, J. F. (1994). Principles of biomedical ethics (4th ed). New York: Oxford University Press. Beauchamp, T. L., & Walters, L. (1999). Contemporary issues in bioethics (5th ed). Belmont, CA: Wadsworth. Berger, J., Fisek, M. H., Norman, R. Z., & Zelditch Jr., M. (1977). Status characteristics and social interactions: an expectation states approach. New York: Elsevier. Berger, J., Zelditch Jr., M. (Eds.). (1998). Status, power and legitimacy. New Brunswick, NJ: Transaction Publishers. Berlin, I. (1998). Two concepts of liberty. In H. Hardy, & R. Hausheer (Eds.), The proper study of mankind: an anthology of essays [by Isaiah Berlin] (pp. 191–242). New York: Farrar, Straus and Giroux. Bostrom, R. P., Anson, R., & Clawson, V. K. (1993). Group facilitation and group support systems. In L. M. 
Jessup, & J. S. Valacich (Eds.), Group support systems: new perspectives (pp. 146–168). New York: Macmillan.


Bostrom, R. P., & Nagasundaram, M. (1998). Research in creativity and GSS. Proceedings of the Thirty-First Annual Hawaii International Conference on System Sciences. Retrieved 29 May 2002. Available: http://www.computer.org/proceedings/hicss/8233/8233toc.htm. Brand-Ballard, J. (1996). Reconstructing MacKinnon: essentialism, humanism, feminism. Southern California Review of Law and Women’s Studies, 6, 89–172. Brockner, J. (2002). Making sense of procedural fairness: how high procedural fairness can reduce or heighten the influence of outcome favorability. The Academy of Management Review, 27(1), 58–76. Carli, L. L., & Eagly, A. H. (1999). Gender effects on social influence and emergent leadership. In G. N. Powell (Ed.), Handbook of gender and work (pp. 203–222). Thousand Oaks, CA: Sage. Cohen, R. (2002, 7 April). Evolving kids [The ethicist]. The New York Times Magazine, pp. 28, 30. Coleman, L. H., Paternite, C. E., & Sherman, R. C. (1999). A reexamination of deindividuation in synchronous computer-mediated communication. Computers in Human Behavior, 15, 51–65. Conlin, M., & Zellner, W. (1999, 22 November). The CEO still wears wingtips: jobs that lead to the top remain overwhelmingly female-free. Business Week, 3656, 82. Connolly, T., Jessup, L. M., & Valacich, J. M. (1990). Effects of anonymity and evaluative tone on idea generation in computer mediated groups. Management Science, 36, 689–703. Conway v. O’Brien, 111 F.2d 611, 612 (2d Cir. 1940), rev’d on other grounds, 312 US 492 (1941). Craig, J. M., & Sherif, C. W. (1986). The effectiveness of men and women in problem-solving groups as a function of group gender composition. Sex Roles, 14, 453–466. Crossley, D. (1999). Paternalism and corporate responsibility. Journal of Business Ethics, 21, 291–302. Crystal, D. (2001). Language and the Internet. Cambridge, England: Cambridge University Press. Daft, R. L., & Lengel, R. H. (1986). Organizational information requirements, media richness and structural design.
Management Science, 32(5), 554–571. Daniels, N. (1996). Justice and justification: reflective equilibrium in theory and practice. Cambridge, England: Cambridge University Press. Diener, E. (1980). Deindividuation: the absence of self-awareness and self-regulation in group members. In P. B. Paulus (Ed.), Psychology of group influence (pp. 209–242). Hillsdale, NJ: Erlbaum. Donaldson, T., & Werhane, P. H. (1998). Ethical issues in business: a philosophical approach (6th ed). Upper Saddle River, NJ: Prentice Hall. Drake, B., Yuthas, K., & Dillard, J. F. (2000). It’s only words—impacts of information technology on moral dialogue. Journal of Business Ethics, 23, 41–59. Drury, S. B. (1996). Lying and politics. University of Cincinnati Law Review, 64, 1227–1235. Dworkin, G. (1988). The theory and practice of autonomy. Cambridge, England: Cambridge University Press. Dworkin, G. (1972). Paternalism. The Monist, 56, 64–83. Edmonds, D., & Eidinow, J. (2001). Wittgenstein’s poker: the story of a ten-minute argument between two great philosophers. New York: HarperCollins. El-Shinnawy, M., & Vinze, A. S. (1997). Technology, culture and persuasiveness: a study of choice-shifts in group settings. International Journal of Human–Computer Studies, 47(3), 473–496. Elam, J. J., & Mead, M. (1990). Can software influence creativity?. Information Systems Research, 1(1), 1– 22. Fairburn, J. A., & Malcomson, J. M. (2001). Performance, promotion, and the Peter Principle. Review of Economic Studies, 68, 45–66. Feinberg, J. (1971). Legal paternalism. Canadian Journal of Philosophy, 1, 105–124. Feinberg, J. (1973). Social philosophy. Upper Saddle River, NJ: Prentice Hall. Feinberg, J. (1984). Harm to others. New York: Oxford University Press. Feinberg, J. (1985). Offense to others. New York: Oxford University Press. Feinberg, J., & Coleman, J. (2000). Philosophy of law (6th ed). Belmont, CA: Wadsworth. Ferrell, O. C., & Fraedrich, J. (1997). 
Business ethics: ethical decision making and cases (3rd ed). Boston: Houghton Mifflin. Festinger, L., Pepitone, A., & Newcomb, T. J. (1952). Some consequences of de-individuation in a group. Journal of Abnormal and Social Psychology, 47, 382–389. Fjermestad, J., & Hiltz, S. R. (1999). An assessment of group support systems research: results. Proceedings of the Thirty-Second Hawaii International Conference on System Sciences. Retrieved 9 September 2002. Available: http://www.computer.org/proceedings/hicss/0001/00011/00011021toc.htm. Fjermestad, J., & Hiltz, S. R. (2000). Case and field studies of group support systems: an empirical assessment. Proceedings of the Thirty-Third Hawaii International Conference on System Sciences. Retrieved 29 May 2002. Available: http://www.computer.org/proceedings/hicss/0493/0493toc.htm. Flanagin, A. J., Tiyaamornwong, V., O’Connor, J., & Seibold, D. R. (2002). Computer-mediated group work: the interaction of member sex and anonymity. Communication Research, 29, 66–93. Fox, S., & Amichai-Hamburger, Y. (2001). The power of emotional appeals in promoting organizational change programs. Academy of Management Executive, 15(4), 84–94. Friedman, L. M. (1998). American law: an introduction (2nd ed). New York: Norton. Froomkin, A. M. (1996). Flood control on the information ocean: living with anonymity, digital cash, and distributed databases. Journal of Law and Commerce, 15, 395–507. Gallupe, R. B., Cooper, W. H., Grise, M. L., & Bastianutti, L. M. (1994). Blocking electronic brainstorms. Journal of Applied Psychology, 79(1), 77–86. Gallupe, R. B., DeSanctis, G., & Dickson, G. W. (1988). Computer-based support for group problem-finding: an experimental investigation. Management Information Systems Quarterly, 12(2), 277–296. Gandy, O. H. Jr. (2000). Exploring identity and identification in cyberspace. Notre Dame Journal of Law, Ethics and Public Policy, 14, 1085–1111. Gert, B., Culver, C. M., & Clouser, K. D. (1997). Bioethics: a return to fundamentals. New York: Oxford University Press. Glendon, M. A. (2001). A world made new: Eleanor Roosevelt and the Universal Declaration of Human Rights. New York: Random House. Guth, F. R. (1999, Winter). ‘‘Liberty-limiting’’ principles. Accreditation Ontario, 3. Retrieved 10 February, 2002. Available: http://acl.on.ca/accont/march99.html. Hayne, S.
C., & Rice, R. E. (1997). Attribution accuracy when using anonymity in group support systems. International Journal of Human–Computer Studies, 47, 429–452. Hender, J. M., Dean, D. L., Rodgers, T. L., & Nunamaker, J. F. Jr. (2001). Improving group creativity: brainstorming versus non-brainstorming techniques in a GSS environment. Proceedings of the Thirty-Fourth Hawaii International Conference on System Sciences. Retrieved 29 May 2002. Available: http://www.computer.org/proceedings/hicss/0981/0981toc.htm. Herring, S. C. (1994, June). Gender differences in computer-mediated communication: bringing familiar baggage to the new frontier. Paper presented at the annual meeting of the American Library Association, Miami, FL. Retrieved 17 February, 2002. Available: http://sun.soci.niu.edu/weblinks/links/herring.txt. Herring, S. C. (1996). Gender and democracy in computer-mediated communication. In R. Kling (Ed.), Computerization and controversy: value conflicts and social choices (2nd ed.) (pp. 476–489). New York: Academic Press. Herschel, R. T. (1994). The impact of varying gender composition on group brainstorming performance in a GSS environment. Computers in Human Behavior, 10, 209–222. Hess, K. K. (1987). Enhancing writing through imagery. New York: Trillium. Hill, T. E. Jr. (1991). Autonomy and self-respect. Cambridge, England: Cambridge University Press. Hiltz, S. R., & Turoff, M. (1978). The network nation: human communication via computer. Reading, MA: Addison-Wesley. Hiltz, S. R., & Turoff, M. (1993). The network nation: human communication via computer (Rev. ed). Cambridge, MA: MIT Press. Hirshleifer, J. (1980). Privacy: its origin, function, and future. Journal of Legal Studies, 9, 649–664. Honderich, T. (Ed.). (1995). The Oxford companion to philosophy. Oxford, England: Oxford University Press. Huber, G. P., Valacich, J. S., & Jessup, L. M. (1993). A theory of the effects of group support systems on an organization’s nature and decisions. In L. M.
Jessup, & J. S. Valacich (Eds.), Group support systems: new perspectives (pp. 255–269). New York: Macmillan. Hymowitz, C. (2002, 9 April). How managers can keep from being ambushed by the boss [In the lead]. The Wall Street Journal, p. B1.


Jessup, L. M., Connolly, T., & Tansik, D. (1990). Toward a theory of automated group work: the deindividuating effects of anonymity. Small Group Research, 21, 333–348. Jessup, L. M., & Tansik, D. A. (1991). Decision making in an automated environment: the effects of anonymity and proximity with a group decision support system. Decision Sciences, 22(2), 266–279. Jessup, L. M., Tansik, D. A., & Laase, T. (1988). Group problem solving in an automated environment: the effects of anonymity and proximity on group process and outcome with a group decision support system. In F. Hoy (Ed.), Proceedings of the Forty-Eighth Annual Meeting of the Academy of Management (pp. 237–241). Mississippi State, MS: Author. Jessup, L., Valacich, J. (Eds.). (1993). Group support systems: new perspectives. New York: Macmillan. Johansen, R. (1988). Groupware: computer support for business. New York: Free Press. Judges, D. P. (1999). Scared to death: capital punishment as authoritarian terror management. University of California Davis Law Review, 33, 155–248. Kabay, M. E. (1998, March). Anonymity and pseudonymity in cyberspace: deindividuation, incivility and lawlessness versus freedom and privacy. Paper presented at the Annual Conference of the European Institute for Computer Anti-virus Research (EICAR), Munich, Germany. Kahan, D. M. (1997). Social influence, social meaning, and deterrence. Virginia Law Review, 83, 349–395. Kang, J. (1998). Information privacy in cyberspace transactions. Stanford Law Review, 50, 1193–1294. Kant, I. (1997). Groundwork of the metaphysics of morals (M. J. Gregor, Trans.). Cambridge, England: Cambridge University Press (Original work published 1785). Kaufman, I. R. (1988). New remedies for the next century of judicial reform: time as the greatest innovator. Fordham Law Review, 57, 253–269. Kay, R. (2002, 14 January). Emoticons and Internet shorthand. Computerworld, 36(3), 42. Kiesler, S. (1994). Working together apart. Retrieved 11 August 2002. 
Available: http://www.educause.edu/ir/library/text/cem9433.txt. Kiesler, S., & Sproull, L. (1992). Group decision making and communication technology. Organizational Behavior & Human Decision Process, 52(1), 96–123. King, N. J. (1996). Nameless justice: the case for the routine use of anonymous juries in criminal trials. Vanderbilt Law Review, 49, 123–159. Klein, E. E. (2000). The impact of information technology on leadership opportunities for women: the leveling of the playing field. Journal of Leadership Studies, 7(3), 88–98. Klein, E. E., & Dologite, D. G. (2000). The role of computer support tools and gender composition in innovative information system idea generation by small groups. Computers in Human Behavior, 16, 111–139. Kline, T. J. B., & McGrath, J. (1999). A review of the groupware literature: theories, methodologies, and a research agenda. Canadian Psychology, 40(3), 265–271. Laski, H. J. (1948). The American democracy: a commentary and an interpretation. New York: Viking Press. Lakoff, G., & Johnson, M. (1980). Metaphors we live by. Chicago: University of Chicago Press. Lea, M., & Spears, R. (1991). Computer-mediated communication, deindividuation and group decisionmaking. International Journal of Man–Machine Studies, 34, 283–301. Lind, E. A., & Tyler, T. R. (1988). The social psychology of procedural justice. New York: Plenum Press. Lindley, R. (1986). Autonomy. Atlantic Highlands, NJ: Humanities Press International. Litlangs (2000). The craft and theory of writing poetry: imagery and metaphor. Retrieved 20 January 2002, Available: http://www.poetrymagic.co.uk/imagery.html. Lockheed, M. E., & Hall, K. P. (1976). Conceptualizing sex as a status characteristic: applications to leadership training strategies. Journal of Social Issues, 32, 111–124. MacIntyre, A. (1988). Whose justice? Which rationality?. Notre Dame, IN: University of Notre Dame Press. Martin, R. M. (1992). 
There are two errors in the the title of this book: a sourcebook of philosophical puzzles, paradoxes and problems. Peterborough, Ontario, Canada: Broadview Press. McLeod, P. (1992). An assessment of the experimental literature on electronic support of group work: results of a meta-analysis. Human–Computer Interaction, 7, 257–278.


Meeker, B. F., & Weitzel-O’Neill, P. A. (1977). Sex roles and interpersonal behavior in task oriented groups. American Sociological Review, 42, 91–105. Milgrom, P., & Roberts, J. (1992). Economics, organization and management. Englewood Cliffs, NJ: Prentice Hall. Mill, J. S. (1974). On liberty. London: Penguin Books (Original work published 1859). Munson, R. (2000). Intervention and reflection: basic issues in medical ethics (6th ed). Belmont, CA: Wadsworth. Nagasundaram, M., & Bostrom, R. P. (1994–1995). The structuring of creative processes using GSS: a framework for research. Journal of Management Information Systems, 11(3), 87–114. Novak, D. (2000). Law: religious or secular?. Virginia Law Review, 86, 569–596. Nunamaker, J. F. Jr. (1997). Future research in group support systems: needs, some questions and possible directions. International Journal of Human–Computer Studies, 47, 357–385. Nunamaker, J. F. Jr., Briggs, R. O., Mittleman, D. D., Vogel, D. R., & Balthazard, P. A. (1996–1997). Lessons learned from a dozen years of group support systems research. Journal of Management Information Systems, 13(3), 163–207. Nunamaker, J. F. Jr., Dennis, A. R., Valacich, J. S., Vogel, D. R., & George, J. F. (1991). Electronic meeting systems to support group work. Communications of the ACM, 34(7), 40–61. Nunamaker, J. F. Jr., Dennis, A. R., Valacich, J. S., Vogel, D. R., & George, J. F. (1993). Group support systems research: experience from the lab and field. In L. M. Jessup, & J. S. Valacich (Eds.), Group support systems: new perspectives (pp. 125–145). New York: Macmillan. Nygaard, R. L. (1998). Crime, pain, and punishment: a skeptic’s view. Dickinson Law Review, 102, 355– 381. Osborn, A. F. (1957). Applied imagination (2nd ed). New York: Scribner. Ozick, C. (2000). Dostoyevsky’s unabomber. In C. Ozick (Ed.), Quarrel & quandary: essays (pp. 3–25). New York: Knopf. Packard, W. (1992). The art of poetry writing. New York: St. Martin’s Press. Paivio, A. (1979). 
Imagery and verbal processes. Hillsdale, NJ: Erlbaum. Perlin, M. L. (1994). Therapeutic jurisprudence: understanding the sanist and pretextual bases of mental disability law. New England Journal on Criminal and Civil Confinement, 20, 369–383. Plato (1991). The republic (A. Bloom, Trans.). New York: Basic Books. Poole, M. S., & DeSanctis, G. (1990). Understanding the use of group decision support systems: the theory of adaptive structuration. In C. W. Steinfeld, & J. Fulk (Eds.), Organizations and communication technology (pp. 173–193). Newbury Park, CA: Sage. Popper, K. R. (1971). The open society and its enemies (Vols. 1–2). Princeton, NJ: Princeton University Press. Popper, K. R. (1992). In search of a better world: lectures and essays from thirty years (L. J. Bennett, Trans.). London: Routledge (Original work published 1984). Potts, R. (2002, 15 February). Injurious to truth? [Review of the book Censorship: a world encyclopedia]. The Times Literary Supplement, pp. 7–8. Pounds, E. (2000). Improving writing through the use of imagery. In D. J. G. Brian, J. Taylor-Pendergrass, & S. Thompson (Eds.), 2000 families first idea book: integrating work skills and basic skills (pp. C23–C26). Knoxville, TN: Center for Literacy Studies, University of Tennessee/Knoxville. Prosser, W. L., Keeton, W. P., Dobbs, D. B., Keeton, R. E., & Owen, D. G. (1984). Prosser and Keeton on the law of torts (5th ed). St. Paul, MN: West. Rachlinski, J. J. (2000). The limits of social norms. Chicago-Kent Law Review, 74, 1537–1566. Rainey, V. P. (2000). The potential for miscommunication using e-mail as a source of communication. Transactions of the SDPS: Journal of Integrated Design and Process Science, 4(4), 21–43. Rawls, J. (1971). A theory of justice. Cambridge, MA: Harvard University Press. Rawls, J. (1999). Justice as fairness. In S. Freeman (Ed.), John Rawls: collected papers (pp. 47–72). Cambridge, MA: Harvard University Press. Rawls, J. (2000). A theory of justice (Rev. ed). 
Cambridge, MA: Harvard University Press. Reinig, B. A., Briggs, R. O., & Nunamaker, J. F., Jr. (1997–1998). Flaming in the electronic classroom. Journal of Management Information Systems, 14(3), 45–59.


Rice, R. E., & Love, G. (1987). Electronic emotion: socioemotional content in a computer-mediated communication network. Communication Research, 14(1), 85–108.
Riva, G., & Galimberti, C. (1998). Computer-mediated communication: identity and social interaction in an electronic environment. Genetic, Social, and General Psychology Monographs, 124(4), 434–447.
Rosen, R. E. (1989). Ethical soap: L.A. Law and the privileging of character. University of Miami Law Review, 43, 1229–1261.
Rotenberg, M. (1993). Communication privacy: implications for network design. Communications of the ACM, 36(8), 61–69.
Salamanca, P. E. (1999). Constitutional protection for conversations between therapists and clients. Missouri Law Review, 64, 77–122.
Satzinger, J. W., Garfield, M. J., & Nagasundaram, M. (1999). The creative process: the effects of group memory on individual idea generation. Journal of Management Information Systems, 15(4), 143–160.
Savicki, V., & Kelley, M. (2000). Computer mediated communication: gender and group composition. CyberPsychology & Behavior, 3, 817–826.
Savicki, V., Kelley, M., & Ammon, B. (2002). Effects of training on computer-mediated communication in single or mixed gender small task groups. Computers in Human Behavior, 18, 257–269.
Savicki, V., Lingenfelter, D., & Kelley, M. (1996). Gender language style and group composition in Internet discussion groups. Journal of Computer-Mediated Communication, 2(3). Retrieved 2 November 2000, Available: http://www.ascusc.org/jcmc/vol2/issue3/savicki.html
Schkade, D., Sunstein, C. R., & Kahneman, D. (2000). Deliberating about dollars: the severity shift. Columbia Law Review, 100, 1139–1175.
Schneewind, J. B. (1997). The invention of autonomy: a history of modern moral philosophy. Cambridge, England: Cambridge University Press.
Sell, J. (1997). Gender, strategies, and contributions to public goods. Social Psychology Quarterly, 60(3), 252–265.
Siau, K. L. (1996). Electronic creativity techniques for organizational innovation. The Journal of Creative Behavior, 30(4), 283–293.
Siegel, J., Dubrovsky, V., Kiesler, S., & McGuire, T. (1986). Group processes in computer-mediated communication. Organizational Behavior and Human Decision Processes, 37, 157–187.
Simon, W. H. (2000). The practice of justice: a theory of lawyers' ethics. Cambridge, MA: Harvard University Press.
Smith, S. A. (1994). Communication and the Constitution in cyberspace. Communication Education, 43, 87–101.
Smith-Lovin, L., & Brody, C. (1989). Interruptions in group discussions. American Sociological Review, 54, 424–435.
Sproull, L. S., & Kiesler, S. (1991). Connections: new ways of working in the networked organization. Cambridge, MA: MIT Press.
Stanton, J. M., & Julian, A. L. (2002). The impact of electronic monitoring on quality and quantity of performance. Computers in Human Behavior, 18, 85–101.
Strauss, L. (1988). Persecution and the art of writing. Chicago: University of Chicago Press. (Original work published 1952)
Sunstein, C. R. (2000). Deliberative trouble? Why groups go to extremes. Yale Law Journal, 110, 71–119.
Sunstein, C. R. (2001). Republic.com. Princeton, NJ: Princeton University Press.
Sunstein, C. R. (2002, 25 February). Rights of passage [Review of the book A world made new: Eleanor Roosevelt and the Universal Declaration of Human Rights]. The New Republic, pp. 37–41.
Tannen, D. (1994). Gender and discourse. New York: Oxford University Press.
Thibaut, J., & Walker, L. (1975). Procedural justice: a psychological analysis. Hillsdale, NJ: Erlbaum.
"Treatment of choice" (2002, 19 January). The Economist, pp. 51–52.
Wagner, W. J. (1994–1995). In search of the market's moral limits: liberalism, perfectionism, and "the bad man" in Christian perspective. The Journal of Law & Religion, 11, 535–545.
Walker, H. A., Ilardi, B. C., McMahon, A. M., & Fennell, M. L. (1996). Gender, interaction, and leadership. Social Psychology Quarterly, 59(3), 255–272.
Wallace, K. A. (1999). Anonymity. Ethics & Information Technology, 1, 23–35.
Wallace, P. (2001). The psychology of the Internet. Cambridge, England: Cambridge University Press.
Witmer, D. F., & Katzman, S. L. (1997). On-line smiles: does gender make a difference in the use of graphic accents? Journal of Computer-Mediated Communication, 2(4). Retrieved 25 September 2001, Available: http://jcmc.huji.ac.il/vol2/issue4/witmer1.html
Yeager, D. (1997). Does privacy really have a problem in the law of criminal procedure? Rutgers Law Review, 49, 1283–1315.
Yoshino, K. (1998). Assimilationist bias in equal protection: the visibility presumption and the case of "don't ask, don't tell". Yale Law Journal, 108, 485–571.
Zigurs, I., & Buckland, B. K. (1998). A theory of task/technology fit and group support systems effectiveness. Management Information Systems Quarterly, 22, 313–334.
Zmud, R. W. (1990). Opportunities for strategic information manipulation through new information technology. In J. Fulk, & C. Steinfield (Eds.), Organizations and communication technology (pp. 95–116). Newbury Park, CA: Sage.
Zuboff, A. (2001, March/April). Why should I care about morality? Philosophy Now, 24–27.
Zuboff, A. (2002, August/September). Morality and hot mud. Philosophy Now, 39–40.
Zuboff, S. (1988). In the age of the smart machine: the future of work and power. New York: Basic Books.
Zuckerman, M., DePaulo, B. M., & Rosenthal, R. (1981). Verbal and nonverbal communication of deception. In L. Berkowitz (Ed.), Advances in experimental social psychology (Vol. 14, pp. 1–59). New York: Academic Press.
Zuckerman, M., Larrance, D. T., Spiegel, N. H., & Klorman, R. (1981). Controlling nonverbal cues: facial expressions and tone of voice. Journal of Experimental Social Psychology, 17, 506–524.
Zweig, D., & Webster, J. (2001, April). Accepting awareness monitoring systems: can technology overcome psychology? In B. J. Alge (Chair), Design considerations in electronic workplace surveillance systems. Symposium conducted at the meeting of the Sixteenth Annual Conference of the Society for Industrial and Organizational Psychology, San Diego, CA.
