Expanding Digital Divides Research: A Critical Political Economy of Social Media


The Communication Review
ISSN: 1071-4421 (Print), 1547-7487 (Online). Journal homepage: http://www.tandfonline.com/loi/gcrv20

Expanding Digital Divides Research: A Critical Political Economy of Social Media
Lincoln Dahlberg

To cite this article: Lincoln Dahlberg (2015). Expanding Digital Divides Research: A Critical Political Economy of Social Media. The Communication Review, 18(4), 271–293. DOI: 10.1080/10714421.2015.1085777
To link to this article: http://dx.doi.org/10.1080/10714421.2015.1085777

Published online: 13 Nov 2015.



The Communication Review, 18:271–293, 2015. Copyright © Taylor & Francis Group, LLC. ISSN: 1071-4421 print / 1547-7487 online. DOI: 10.1080/10714421.2015.1085777

Expanding Digital Divides Research: A Critical Political Economy of Social Media
LINCOLN DAHLBERG


Institute for Advanced Studies of the Humanities, The University of Queensland, Queensland, Australia

This paper highlights, through a critical political economy approach, a number of inequalities, or “divides,” that have been neglected in digital divides research, divides arising from the domination of social media platform ownership by a few for-profit corporations. As a result, the paper calls for an expansion of digital divides research to include a critical examination of the empowerment relations flowing from the contexts of digital media technologies themselves and not just the contexts of users.

Address correspondence to Lincoln Dahlberg, 17 Coronation Street, Belmont, Auckland, 0622, New Zealand. E-mail: [email protected]

INTRODUCTION

Digital divides research has expanded over the last couple of decades from a focus on the question of who does and does not have access to digital media technology, to now also include an interest in the differences in the types of activities that people are participating in once they are connected (van Deursen & van Dijk, 2014; van Dijk, 2012). In this paper I argue for a further expansion of digital divides research because, as I see it, important aspects of digital inequality are currently being overlooked. In short, digital divides research has extensively examined if and how users might be differentially empowered by different levels and types of digital media access and usage, where empowerment here generally refers to individual human actors increasing their capacity to shape their own lives and to participate in the shaping of social life more broadly. This examination has largely involved deploying survey data to measure users’ digital media access and usage, and subsequently comparing these measures with the contexts that each user brings to their access and usage, contexts being operationalized in terms of demographic characteristics, including a user’s social, cultural, political, and economic capital, measured by, among other things, education, gender, income, language, national citizenship, skills, and urbanization (see, for example, Brake, 2014; Hargittai, 2010; Schradie, 2011; Smith, 2013). What is largely overlooked in this focus on the contexts of users is the contexts structuring digital media technologies, by which I mean the social, cultural, political, and economic relations that structure digital media technologies and thereby shape their uses. Here I will explore this contextual effect with the aim of identifying and drawing attention to its impact on digital divides, with specific focus on the political-economic context of social media platforms.1

I concentrate on that subset of digital media, and of the Internet more specifically, popularly referred to as “social media”2 because of the current enthusiasm about it enabling digital inclusion and empowerment (e.g., Ali, 2011; Bard, 2012) and also because this subset offers a way to both narrow my analysis to manageable proportions and, at the same time, examine sociotechnical structures, forms, and practices that are increasingly central to digital media participation in general. And I focus on the political-economic context of these platforms because of the likelihood, I hypothesize, of empowerment divides arising from the current domination of social media platform ownership by a handful of for-profit corporations: the profit-orientation of these dominant platforms driving them in directions that increase not only their power and the power of their owner-executives, but also the power of those who help bring in greater revenues, as against the power of other online actors.
Put as a general research question: what impact is the political-economic context of social media platforms—specifically the domination of ownership by a few for-profit corporations—having on digital divides and, moreover, what, then, are the implications for digital divides research?

1. The concept “platform” has now become the hegemonic description of web applications in which users who accept certain “terms of service” are able to enter a user-friendly computational environment that provides particular structured services and experiences (Gillespie, 2010; van Dijk, 2013, p. 29). The term “platform” has neutral connotations, implying a technical structure which raises up, or empowers, all users equally (Gillespie, 2010, p. 352). However, such connotations act ideologically to obscure the shaping—enabling and constraining—of users’ actions and the differential treatment of users that the technology performs (Gillespie, 2010; van Dijk, 2013, p. 29).

2. Social media here refers to those platforms supporting users in content production, networking, interactivity, and collaboration, or, more precisely, supporting blogging and micro-blogging (e.g., Twitter), social networking (e.g., Facebook), professional networking (e.g., LinkedIn), collaborative production (e.g., Wikipedia), social news reporting and reviewing (e.g., Digg, Reddit, Slashdot), photo and video sharing (e.g., Flickr, Instagram, YouTube), social bookmarking (e.g., Delicious), and interactive gaming and social worlds (e.g., Second Life). I am not including as social media the many sites that restrict user-generated content to a limited part of their content and do not enable user communities to organize themselves. Here I am thinking of sites that include user contributions to product ratings and reviews (e.g., Amazon) and news media websites that draw on user contributions (e.g., GuardianWitness, Huffington Post, OhmyNews).


To examine this question, a critical political economy of communication approach is ideal.3 Defined specifically in terms of the question at hand, the “economy” refers to the ownership structures and relations of social media platforms and how they shape these platforms and subsequent communications. The “political” refers to the contingency of the ownership structures and relations, platforms and communications, and thus to the power, and associated identities and values, involved in their constitution and stabilization, and the resulting inclusionary/exclusionary effects and empowerment divides. The “critical” in “critical political economy” indicates an approach that is not only reflexive about the contingency and thus the value-imbued contextuality of any knowledge-practice, and thus of the constitutive exclusions in any particular research process and findings, but also explicitly embraces contingent values through normative critique with an interest in promoting progressive social change. In the political economy examination of social media platforms to follow, the reflexive orientation is manifested first in the examination proceeding interpretively via reasoned argument aimed toward persuasion—as a discursive reading of “the situation”—rather than positively via measurement aimed at proof, and second as the focus of the digital divides research field is put into question. The normative orientation of the examination is implicit in the interest in understanding empowerment inequalities: I presuppose, as does most digital divides research to some extent, a general guiding normative ideal—itself contingent and vulnerable to critique and revision—that everyone should have an equal chance of being equally empowered through digital media. Although I do not develop an explicit normative critique in this paper, I do imply, given my analysis of the situation set against the presupposed norm just identified, that “things” are currently not nearly ideal. 
And by contrasting the empowerment relations developing from for-profit platforms with those arising from other ownership structures, I encourage reflection on how things could be otherwise. I also, in conclusion, indicate what a practical critique based on the norm—of equal chance of being equally empowered—would entail with respect to this paper’s political economy examination.

While deploying critical political economy as my overarching framework of analysis, I also draw insights from research in what can be broadly referred to as critical software studies, an emerging field that examines the social and political constitution and effects of software systems.4 This work helps me to explore the various sociotechnical systems by which social media platforms and their owner-executives attempt to regulate or exert influence over users and thereby advance their interests. “Critical” here again signifies a reflexive and normative orientation, providing for a shared epistemological and ontological grounding, and thus coherent articulation, between critical software studies and critical political economy.

I begin my critical contextual examination of social media platforms by outlining the domination of platform ownership by a few profit-driven corporations, bracketing out Chinese, Iranian, and Russian social media, given that the most popular social media platforms in these countries differ from those that dominate market share in the rest of the world (I have also bracketed out North Korea, where social media is largely unavailable). I then identify, and sketch the contours of, a number of interrelated divides that flow from this corporate domination but that are not accounted for in digital divides research: first a “control divide,” and subsequently, “surveillance,” “exploitation,” and “visibility” divides. I use the term “divides” to include all forms of inequality—even though “inequalities” is now often preferred in digital divides research—because I find that, with respect to the critical political economy of for-profit social media platforms, stark empowerment divisions exist (although the situation is more complex with respect to visibility, and in the section dealing with visibility I tend to refer to inequalities or stratifications rather than divides). In addition, so as to identify the specific impact of for-profit ownership on empowerment divides and thus show the contingent and political nature of the platforms, at each stage I draw attention to the very different empowerment relations arising from various nonprofit and nonproprietary social media platforms. I conclude the paper by considering the implications of the divides outlined here for digital divides research.

3. See Golding and Murdock (2000) and Mosco (2009) for more extensive discussion of the (critical) political economy of communication approach, and see Andrejevic (2011) and Fuchs (2013a, 2013b) for examples of its application to digital media.

4. Research drawn on in this paper that can be identified with the general area of critical software studies includes Bucher (2012), Gehl (2011, 2012), Gillespie (2010, 2014), Gerlitz and Helmond (2013), Kleiner and Wyrick (2007), and contributions to Lovink and Rasch (2013).

THE CORPORATE DOMINATION OF SOCIAL MEDIA PLATFORM OWNERSHIP

Only a dozen or so social media platforms dominate social media usage or “audience share.” Although a plurality of social media initiatives initially had success from the late-1990s to the mid-2000s, market concentration has been such that particular forms of social media are now dominated by single platforms: for example, LinkedIn in professional networking, Twitter in micro-blogging, Wikipedia in collaborative production, and YouTube (owned by Google) in video. Concentration has been most dramatic in the area of social media often referred to as social networking, with previously globally popular platforms such as Bebo, Friendster, Myspace, Orkut, and Xanga, as well as a range of more nationally focused social networking platforms, rapidly losing out to Facebook. The closest direct competitor to Facebook outside of China, Iran, and Russia is now Google+, although competition is also starting to come from the likes of Twitter as the functionalities of different forms of social media increasingly overlap.5

These platforms that dominate audience share are almost all owned by for-profit corporations—the only exception being Wikipedia. Ownership here means ownership of physical infrastructures (e.g., server farms) and intellectual property (e.g., data analysis and processing systems, (meta-)data, platform software, and user profiles).6 Given that the domination of audience share translates into the domination of the ownership of data and user profiles, and subsequently the domination of social media advertising revenues, which in turn enables ownership of the largest server farms and the most sophisticated software and data analysis, we can conclude, first, that audience share is also a, if not the, key asset owned by for-profit platforms (van Dijk, 2013, p. 36), and second, that a few for-profit corporations dominate social media platform ownership.

With these dominant platforms becoming the taken-for-granted go-to places for individual social media users, other for-profit and nonprofit organizations, including those offering digital media services and applications, are compelled to work alongside or within their framework (van Dijk, 2013, pp. 163–164). Many organizations, as with many individuals, simply find it convenient to use the dominant platforms—in particular Facebook, Twitter, and YouTube—as online bases from which to present themselves and to communicate (see van Dijk, 2013, pp. 163–164). Even universities and public service organizations set up pages on the dominant corporate platforms, and encourage or require their students or publics to use them.
In fact, Facebook, YouTube, and Twitter are now often referred to as though they were simply names for generic communication technologies such as the telephone, radio, television, and e-mail, rather than as particular brands, an articulation that obscures their corporate ownership. Moreover, new users, especially young people and people in the Global South, are increasingly beginning their Internet experiences via major corporate social media platforms, often from mobile devices, bypassing traditional Internet entry points and conventional software, including web browsers and e-mail systems (Lunden, 2013; Manyika, 2012). Facebook, in particular, has encouraged this development by offering—in the name of connectivity and overcoming the digital divide—mobile phone users a number of ways to use Facebook at little or no cost, including via cheap SIM cards and Facebook’s “0” service, both of which enable users with basic mobile devices to connect to a text-based Facebook interface (Leistert, 2013). For users who transition to a smartphone and are already hooked on Facebook, Facebook’s Android app produces a Facebook-structured mobile Internet experience (Talbot, 2013). In these various ways, Facebook is locking new users into its platform. Success for Facebook is evident in the fact that many people in the Global South who have never used e-mail now have a Facebook page (Manyika, 2012). For an increasing number of users the Internet is now synonymous with Facebook, and in fact many Facebook users do not even consider themselves Internet users and do not go beyond the platform’s offerings (Marani, 2015).

Still, there exists a range of nonprofit social media initiatives, such as the decentralized social network platform Diaspora and the federation of self-managed civil society networks that is Lorea.7 Many of these nonprofit projects can be defined as nonproprietary as they put ownership of platform rules and code,8 and even of their servers and data, in the hands of their community of users. However, Wikipedia is currently the only nonprofit and user-participatory social media platform that attracts significant user interest, consistently being ranked by all measures of popularity in the top ten websites globally.

How does this domination of social media platform ownership by a few for-profit corporations play out in terms of empowerment divides? My argument is that this “ownership divide” feeds into a control divide, and this control divide subsequently feeds into a number of other divides that are of an extent and intensity unique to a situation of corporate domination. I will now set out in turn these interlinked divides, and along the way contrast these with empowerment relations arising from nonprofit and nonproprietary platforms—taking Diaspora, Lorea, and Wikipedia as exemplars—so as to identify the specific impact of for-profit corporate ownership, and encourage reflection on whether/how things might be different with different ownership structures.

5. For a visualization of the dominant social media networks by country over time (since June 2009), and the ever-increasing dominance of Facebook with respect to social networking, see Vincos’s “world map of social networks” at http://vincos.it/world-map-of-social-networks/

6. For an outline of the ownership details of some of the most significant social media platforms see van Dijk (2013). (Meta-)data will be referred to simply as “data” from here on to include both data and meta-data.

CONTROL DIVIDE

Despite the seemingly endless possibilities for users to be creative and develop networks through social media platforms, creativity and networking are controlled—regulated and shaped—by platform rules (or terms of service) and code. In the case of for-profit corporations, this control is on the whole kept out of user hands. As such, the ownership divide discussed in the previous section translates into a control divide between a few corporate social media platforms (and their owner-executives) and users, a divide between those with more and those with less means to regulate and shape social media practice. In other words, we are talking about an empowerment divide, control being an aspect of power and its exercise instantiating an(other) empowerment/disempowerment relation.9 I will now explore how the particular deployment of terms of service and code by those few corporate social media platforms that dominate the social media landscape translates into this control divide.

7. See diasporafoundation.org and Lorea.org. See Lovink and Rasch (2013) for discussion of a number of such alternatives to corporate social media.

8. For a summary of the different types of computer code deployed by social media platforms see van Dijk (2013, pp. 30–32).

Corporate social media platform terms of service establish control by setting out both the “rights” of the platform concerned and the “rights” of their users. Rights claimed by corporate platforms in these terms of service “agreements,” which most users never actually read, afford them explicit control over aspects of a platform’s usage. Such rights include, for example, the rights to not only modify their platforms however they like, censor content and applications, and appropriate user creativity and data, as further discussed in the following sections of this paper, but also the right to do so without informing users of when or why. Rights given to users, as opposed to platforms, in these agreements further the control corporate platforms have over usage, doing so by defining what users are allowed to do, often through statements of what they are not allowed to do. For example, terms of service generally prohibit users from writing “false” profiles of themselves, thus permitting them to display only “true” profiles. Enforcement of the terms of service is based on further rights platforms claim for themselves in the terms, including, for example, the right to delete profiles judged to break the rules of what is deemed to be a profile of an authentic individual human subject.10

More subtle than control realized through terms of service is the control—regulation and shaping—instituted via computer code.
Think, for instance, of the way in which communication is shaped by Facebook’s “like” button, including the fostering of affirmative or uncritical interaction. Or think of the regulation and shaping of communication resulting from the various limits on the word length and format of posts that Facebook, Twitter, and other platforms impose on users through code. Or, take the example given earlier of Facebook’s mobile phone services, which impose a particular structure on users’ Internet experiences, one that aims to keep users within Facebook properties for as long as possible. Or, going beyond a platform’s explicit boundaries, think of Google’s and Facebook’s digital identification or “passport” systems, which enable sites throughout the web to verify a user’s (digital) identity before allowing access to content and services, thus conferring on Google and Facebook the status of online governments (Rosendaal, 2012). The deployment and take-up of these digital identification systems illustrates how corporate social media control extends beyond a platform’s pages, following users throughout the web. Moreover, it shows how market concentration allows greater control: sites will generally only employ an external identification system adopted by large numbers of users. More examples of control through code will emerge throughout the next sections of this paper: given that platforms are constructed by code, control through code necessarily operates in every aspect of the architecture of social media platforms and digital media more generally, and thus every online activity of users is framed by code, enabling certain practices and disabling or limiting others.11

9. For more on the relation between control and power see Lukes (2005, pp. 72–92).

10. See Stein (2013) for a more extensive examination of the control exercised by Facebook and YouTube via their rules.

However, platform rules and code do not in themselves bring about a digital control divide of any significance. Even nonprofit and democratically oriented platforms, including those such as Diaspora or Lorea that institute decentralized systems so as to maximize user autonomy, necessarily exert a certain degree of control via rules and code in order to effectively manage their systems. What matters, with respect to control and digital empowerment divides, is how democratic the rules and code are: the extent to which rules and code are open to challenge, debate, input, and redesign by users. Nonprofit social media often explicitly embrace democratic rules and code.
Platforms such as Diaspora, Lorea, and Wikipedia, for example, while relying on some form of management hierarchy for operational reasons, not only embrace but also depend upon a significant degree of user input into their development and management: site rules and decision-making structures are, to various extents, open to user input (in some cases even being based on elections), and code is usually “free,”12 allowing for platform architectures to be modified by any user.13 This is in contrast to corporate social media platforms, where rules are determined by management systems divorced from users—who are not consulted, and sometimes not even informed, when terms of service are altered—and where code is in most cases closed, protected by private property law (Lovink & Rasch, 2013; Stein, 2013; van Dijk, 2013).14

Corporate social media platforms, or more precisely their owner-executives, institute closed systems of control because they see these as ensuring greater revenues—greater ability to capitalize on user data, content, and practices—than more democratic systems. Control is further extended and intensified by corporate ownership concentration: the fewer the number of entities involved, the greater the coordination of control possible. Control through corporate platforms’ closed rules and code does not mean that use is fully determined on corporate platforms by owner-executives, but rather that these owner-executives have significantly greater power than users in regulating and thus influencing practices on their platforms and beyond, and thus significantly greater power to condition “the actual possibilities for participation” (Olsson & Svensson, 2012, p. 49).15 Thus, the ownership divide translates into a control divide between a few for-profit social media corporations (and their dominant owner-executives) and platform users (whether individuals or other sites and applications), the latter being subject to rules and code determined in the last instance by the platform’s drive to maximize profit.

Corporations and their owner-executives are interested in the ownership and control of communications on their platforms so as to monetize user participation: ownership and control enables, among other things, the capture of user data and subsequently the construction of user profiles for sale to marketers for targeted advertising. The capture of data takes place via digital surveillance. As such, corporate ownership and control divides feed into a “surveillance divide,” to which I now turn.

11. For more on control in relation to social media code, see Hands (2011, particularly pp. 78–85). The classic work on computer code as control or “law” is Lessig (1999).

12. Free code is not necessarily free-of-charge but rather is computer software that allows users the freedom to run the software for any purpose as well as to study, modify, and distribute the original software and the adapted versions, as long as any modified code is kept free (see GNU.org). As such, it is also known as nonproprietary code, the basis for nonproprietary platforms.

13. For further details on the democratic form of a range of nonprofit and nonproprietary platforms see various contributions to Lovink and Rasch (2013), and for more on Wikipedia in relation to control see Stein (2013, pp. 364–365).

14. The exclusion of users from the means to contribute to the rules and code framing their social media use puts into question the celebrated notion of users being transformed into “produsers” through social media (see Bruns, 2008). Hence, I continue to refer to social media “users.”

SURVEILLANCE DIVIDE

Surveillance is extensively carried out by social media corporations. Not only do social media corporations require users to provide private information with site registration, but data is also gathered on a multitude of social media practices, including adding links, commenting, downloading, friending, liking, posting, purchasing, rating content, (re)posting, responding, searching, status updating, and tagging. In order to extend tracking and data collection, corporate social media platforms are continually exploring new ways to get users to communicate ever more information about their “real selves.” For instance, Facebook, like other corporate social media platforms, now invites users to refine the profiles that it has constructed about them, under the pretense of giving them more control over the advertising that is targeted at them (Facebook, 2014c). And social media corporations not only track activities on their own platforms and follow their registered users around the web, but also surreptitiously collect data from other Internet users who visit websites that link to their platform (Facebook, 2014c; Gehl, 2012; Gerlitz & Helmond, 2013). Social media corporations even plan to monitor users’ cursor movements (Rosenbush, 2013).16 This extensive surveillance constitutes a surveillance divide between social media corporations (and their owner-executives) and Internet users, a divide that is consolidated by the market dominance of a few platforms, which enables better coordinated and more efficient, systematic, and generalized surveillance.

15. Olsson and Svensson’s (2012) case study research shows how this conditioning of practice operates, revealing the “conscious and strategic work involved in producing Web participation” in a top-down way by a social media company that is itself dependent on, and framed by, Facebook.

The fact that it is these social media corporations’ profit-orientation, and associated targeted advertising revenue model, that drives them to collect massive amounts of data on millions of individuals is shown by contrast with nonprofit—and more specifically noncommercial—platforms such as Diaspora, Lorea, and Wikipedia. Although noncommercial platforms often do collect some minimal usage data, this is not for the purposes of advertising-oriented user profiling, but simply for improving platform functionality, and users are generally kept informed of, and sometimes even consulted about, any data collection (Stein, 2013, p. 365). Moreover, some democratically oriented nonprofit platforms, such as Diaspora and Lorea, have developed decentralized17 and free software systems so as to, among other things, enable users to bypass surveillance, whether from within or without the platform (Cabello et al., 2013; diasporafoundation.org). And this decentralization and freedom is facilitated by various degrees of user ownership and control. Diaspora, for example, allows users to own their own servers, and thus store and control their own data, and its nonproprietary code means that users can reprogram the platform for their own security needs (see diasporafoundation.org). Moreover, nonprofit platforms, including both Diaspora and Wikipedia, often do not insist that their users represent a “true” identity, thus allowing anonymous contributions.
Hence, we can conclude that it is the targeted advertising revenue model embraced by for-profit social media platforms that drives them to carry out surveillance and that leads to a surveillance divide between corporate platforms (and their owner-executives) and Internet users. But Internet surveillance is most often associated with state surveillance. So what then is the relationship of this state surveillance to the corporate social media surveillance divide just described? Corporate social media surveillance not only stands alongside but also feeds into state surveillance. Although corporate social media platforms collect data predominantly for advertising purposes, the data may also become part of state surveillance. Social media corporations now routinely hand over large amounts of private data to the United States and other governments when requested to do so. 16

16. For more in-depth discussion on social media surveillance, see Andrejevic (2013), Fuchs (2011), Gehl (2011), and Gerlitz and Helmond (2013).
17. Decentralization here refers to a system whereby communication does not pass through central servers.


In the first half of 2014, for example, Twitter (2014) received 2,058 requests for information from 54 countries and handed over data 52% of the time, 72% of the time in the case of the U.S. government requests (1,257 requests). Facebook (2014b) and Google (2014b) actually receive significantly more requests for information from governments than Twitter does, and hand over considerably more data. The substance of handed-over data in the “transparency reports” of these social media corporations is not made public, but information leaks have raised worrying signs for political freedom, such as the surrender (if reluctantly) of the private data of Occupy Wall Street protestors to the U.S. government (Williams, 2012). The integration of corporate social media data into U.S. government surveillance became a global news story with Edward Snowden’s monumental leaks of National Security Agency (NSA) documents. The leaks showed, among other revelations, that the U.S. government was spying on an enormous amount of U.S. and global Internet communication with the cooperation of Internet corporations, including social media platforms Facebook, Google+, and YouTube (Algoritmi, 2014). After the Snowden revelations, some of the major Internet corporations, including a few of the dominant social media platforms, lobbied the U.S. government to not only be more transparent about its use of data for spying but also to rein in this spying (Wyatt & Miller, 2013). However, these corporations have largely kept silent about their own surveillance systems that have enabled the NSA’s (and other governments’) spying. Corporate social media platform ownership and control thus contributes to an expanding global surveillance divide, a divide between the watchers and the watched, or more specifically between those who track, collect, store, analyze, and purchase data, or acquire it by power, and those whose online (and offline) interactions generate the data. 
The watchers do not simply “watch,” but control when/how they themselves are watched. On the one hand, when they wish publicity, the watchers strategically display themselves for public consumption, for example, through attracting publicity for their tax-deductible charitable donations (Facebook CEO and controlling shareholder Mark Zuckerberg’s donation in October 2014 to the U.S. Centers for Disease Control for Ebola research was broadcast across all mainstream international news media, online and offline, as can be confirmed from a Google News search). On the other hand, when they wish privacy, the watchers, while subjecting others to extensive surveillance, go to great lengths to conceal their own affairs from public gaze. This concealing is the case for both social media corporations (e.g., the profits of Facebook, Google, etc., are hidden via the use of tax havens and complex company structures) and their individual owner-executives (e.g., Zuckerberg purchased the houses surrounding his home to guard his privacy) (Fuchs, 2014, p. 82). The social media surveillance divide identified here has generally not been acknowledged, let alone investigated, in digital divides research. But, it might be asked, why is this surveillance divide also an empowerment divide?


What power is gained by being able to carry out social media surveillance? The simple answer is that surveillance affords the power that comes with learning things about a person that they have not voluntarily revealed to the watcher, information that the watcher can then (a) deploy as political capital, intervening directly in that person’s social practice, as in censoring and coercing them, the central motivation behind state surveillance; or (b) turn into economic capital, as in selling user profiles to advertisers, the main reason for corporate social media surveillance. The selling of user profiles to advertisers not only expresses the power of surveillance and drives an associated empowerment divide but also, as I will now argue, contributes to an exploitation divide.

EXPLOITATION DIVIDE

The targeted advertising revenue model embraced by social media corporations can be understood as relying upon exploitation—the appropriation of value from the value creators—and upon an empowerment divide between exploiter and exploited. Here I will outline a range of aspects of this appropriation before naming it as exploitation that points to an exploitation divide. The most obvious aspect of the appropriation of value from the value creators, and the only one I have mentioned up to now when referring to the monetization of user participation, is the extraction and subsequent sale of user data. Corporate social media platforms, as seen in the previous section, harvest data from the extensive and varied networking of the Internet users they track. With this data, market-oriented profiles of specific users are constructed, often with the help of data acquired from other big data companies such as Acxiom and Datalogix (Sengupta, 2013). These profiles are then sold, or more accurately rented, to advertisers for targeted marketing (Borgesius, 2012). The result is the commodification of user data. Less obvious in the process of value appropriation is the role of user-generated content (UGC), where UGC refers to interactions, postings, and applications produced by users. UGC, alongside professionally generated content (PGC), attracts users and encourages networking, and thus facilitates the production of data trails and subsequently the construction of user profiles, the number and richness of which have a positive relationship with how much advertising money can be attracted. As a result, UGC produces value for social media corporations, value that is realized in the sale of user profiles and advertising “impressions” (Andrejevic, 2011), and thus the commodification of UGC.
A more direct form of appropriation of value from the UGC value creators is the deployment by corporate social media platforms of a user’s affective networking activities—such as commenting, friending,
following, liking, recommending, tagging, and so on—along with the user’s name and (sometimes) personal profile photo, as product endorsements targeted to the user’s “friends,” thus clearly appropriating and commodifying the activity concerned (Facebook, 2014a; Google, 2014a). As well as extracting value from UGC, corporate social media platforms readily encourage and harness, without payment, user self-management: moderating, adjudicating, and fact checking (van Dijk, 2013). Finally, in terms of appropriation of value, we must consider how advertising targeted at users—the turning back of their data on themselves—extracts value from users through selling not just their data and UGC but their attention, time, emotions, and, if purchases are made, their consumption capacity, to advertisers, thus further commodifying the user.18 Thus, in a range of ways outlined here, corporate social media users’ networking—or net-working—is producing economic value that is subsequently appropriated by the platforms and their owners. As a result, many social media theorists argue that not only commodification but also exploitation is involved here (e.g., Andrejevic, 2011; Fuchs, 2013a, 2013b). Referring to the extraction of value from users’ social media data and creativity as exploitation is open to debate, particularly given that users are not explicitly selling their labor time.19 However, I see the term as apt as it points to a process of appropriation of economic value from the creators of this value (even if relations of exploitation here are not nearly as concrete and oppressive as often found in relation to mineral mining or computer assembly work).
As such, we can talk of an exploiter/exploited divide resulting from the corporate ownership and control of social media platforms, a divide between those being empowered by the extraction of economic value from social networking and those doing the networking, that is, a divide between social media platform owners and users, including those Internet users who are not registered with any social media site. This is a divide that noncommercial social media platforms such as Diaspora, Lorea, and Wikipedia do not contribute to. Although still relying on users’ contributions to content and management, and in some cases data, for platform success, these media can be considered nonexploitative as they do not commodify user networking/data and appropriate the resulting value. Wikipedia, however, is the only noncommercial social media platform that has attracted a large number of users. Exploitation, in the specific sense deployed here, is carried out by all other popular, as well as many not so popular, social media platforms. The resulting exploitation divide is overlooked in digital divides research

18. For an outline of the world of web advertising that is evolving ever more sophisticated methods of data collection and invasive forms of targeted marketing, see Turow (2011). For further discussion on the reduction of social media to the logic of exchange, see Andrejevic (2011), Fuchs (2013a), Gehl (2011, 2012), and Kleiner and Wyrick (2007).
19. For a useful discussion of the debate around digital exploitation, see Andrejevic et al. (2014).


literature. And yet this divide is only likely to widen as the concentration of social media platform ownership and control increases because, as suggested in the previous section with respect to surveillance, the fewer the entities that dominate data collection, the more coordinated, efficient, and systematic exploitation can be. The exploitation divide and surveillance divide are interlinked but not the same. The exploitation referred to here relies on surveillance for the construction of profiles, but does not necessarily follow from surveillance, which might be carried out for reasons other than exploitation. However, exploitation—the appropriation of the economic value created by users—is the central reason for the surveillance carried out by corporate platforms. As such, surveillance and exploitation go hand in hand, and the surveillance and exploitation divides could be considered as together making up a more general empowerment divide, one that is enabled by, and strengthens, control. Moreover, to bring discussion back to the core of political economy concerns, control, surveillance, and exploitation divides are all related to structures and relations of ownership. To move from a position of being controlled, watched, and exploited, to one of controlling, watching, and exploiting, requires the ownership of significant resources, from huge data storage farms to platform and data tracking code to the monetary capital to pay for data mining and analytics. Of course, ownership of these resources is not necessary for simply using social media platforms to post and receive various forms of content, within the framework set by platform rules and code. Users pay for these services with their networking. But such ownership is essential if one is to effectively and systematically engage in the control, surveillance, and exploitation of social media use, and the associated shaping of the “visibility” of users’ voices.
This shaping of visibility leads to the final divide that I will be considering here.

VISIBILITY DIVIDE

Ownership of social media platforms by for-profit corporations is shaping visibility and hence leading to a social media visibility divide, that is, a divide—or more precisely stratification—in the extent to which users’ voices20 are heard, or put in a position to be heard, with preferential treatment being given to those voices that offer more to platform owners in terms of driving revenues. I will consider four ways in which visibility is systematically shaped and subsequently stratified by corporate social media platforms: paid-for content, strategic partnerships with commercial media producers, value-ranking algorithms, and censorship.

20. “Voice” here refers to the claims and stories that human agents seek recognition for.


The first and most obvious way in which corporate social media platforms shape online visibility is through advertising: through the targeted distribution of the voices of those actors—often referred to as “brands”—who pay for this distribution with money rather than via the economic value of their networking. Advertising on social media is both increasing quantitatively and becoming ever more subtly integrated into the stream of everyday digital interaction. Corporate platforms are combining the now-traditional targeted banner advertising with a range of other “native advertising” options that make advertisements less intrusive and less obviously ad-like, reading more like news, information posts, and recommendations (Turow, 2011; van Dijk, 2013). One of the most insidious new advertising products being offered—in one way or another—by corporate social media platforms is “promoted content.” This involves brands paying to have their content circulate more widely or be displayed more prominently, the distribution of non-paid-for content being severely limited so as to encourage the uptake of this product by advertisers (Kilonzo, 2014; Kim, 2012, pp. 56–57; Vahl, 2014; van Dijk, 2013, p. 125). With respect to digital divides, the result of this escalation of advertising is an increasingly clear divide between those who pay for the distribution of their voice through social media and those who do not, and a stratification of the visibility of brands based on the money they pay for distribution. Second, strategic win-win partnerships between some corporate social media platforms—particularly YouTube—and major professional (mostly commercial) media content producers are increasingly leading to the distribution and promotion of professionally generated content (PGC) on these platforms and the sharing of subsequent advertising revenues (Gillespie, 2010; Kim, 2012; Merrin, 2012). The result is the privileging of PGC over user-generated content (UGC).
PGC tends to produce considerably more user attention, and hence advertising interest, than most UGC (Gillespie, 2010, p. 353). As such, it makes commercial sense for PGC to be privileged. Once again, the financial interests of for-profit social media platforms produce a bias toward certain content (here PGC over most UGC), leading to inequality of voice. Third, inequalities in visibility are advanced by corporate social media platform value-ranking algorithms. For-profit platforms deploy complex algorithms to calculate and rank the “value” of every item of UGC, and thus the extent to which any such item ends up being foregrounded or circulated (see Bucher, 2012; Gillespie, 2014). Although the details of how algorithms discriminate are largely kept confidential, it is no secret that—given the advertising-based revenue model of corporate social media platforms—value is associated with, and algorithms programmed in accordance with, what content is expected to gain attention and stimulate interaction, including what is seen as popular and “fresh” (Bucher, 2012, pp. 1167–1168). As a result, value-ranking algorithm recommendations advance the
visibility of some—often those already dominant—voices over others as users as a whole become exposed to certain types (e.g., “popular”) of UGC more than others. Finally, visibility is affected by content censorship. Corporate social media platforms are not often seen as instigators of censorship. They might, in contrast, be read as offering citizens a means to escape state censorship of traditional media, demonstrated very clearly by the use of social media in citizen uprisings against authoritarian regimes that control the traditional broadcast media, as in the “Arab Spring.” However, this is certainly not the whole story. Corporate platforms have not only themselves been subject to state censorship but have also, with commercial considerations in mind, on multiple occasions bowed to both government and public pressure to block or delete content and accounts (Youmans & York, 2012).21 Facebook has been particularly notorious for this, its censorship ranging from blocking anarchist pages to deleting mention of the Armenian genocide and any reference to a Kurdish state (presumably at the request of the Turkish government) to taking down breastfeeding photos for fear of offending users (Dencik, 2014; Sulaiman, 2012). As such, the profit interests of social media corporations lead to the exclusion, or at least marginalization, of certain voices on their networks. Nonprofit platforms do not shape visibility in the ways described above, given that they (generally) do not facilitate paid-for content or make deals with commercial media providers, and certainly do not rank posts according to revenue-earning potential. Moreover, nonprofits might be considered less willing to capitulate to censorship pressures, at least to any state or consumer threats related to being cut off from markets. 
Yet nonprofit platforms also shape visibility in various ways: their aims, norms, and associated technological form necessarily lead to various visibility biases and hence inequalities of voice. This shaping of visibility is particularly evident in Wikipedia’s explicit structuring of contributions toward its ideal of neutrality (van Dijk, 2013, pp. 136–146). However, as with control, the key question here has to do with democratic openness: the extent that the visibility stratification, and the system that shapes it, is not only made explicit but also open to contestation and revision by users. For-profit platforms, as we have seen earlier, close off their systems from user scrutiny or revision. In contrast, nonprofits, and particularly those such as Diaspora, Lorea, and Wikipedia, that embrace a general democratic ethos, are not only transparent about the logic by which they shape visibility but also allow various degrees of user input into the development of this logic.

21. See Youmans and York (2012) for examples of ways in which dominant social media platforms have cooperated with governments in relation to the control of social media content. I do not look further at government control of digital communication here because it has been the focus of much academic attention in recent times (especially with regard to China).


Hence, we can speak of a largely unresearched corporate social media-based visibility divide, or at least stratification, in which certain voices—often commercial ones and/or those already dominant offline—are privileged over others: corporate social media platforms might be enabling users’ voices but they are not doing so equally. This inequality parallels and complexifies inequalities in voice that result from uneven social media access and usage and that have been identified in previous research, as noted in this paper’s introduction. The stratification of visibility, in contrast to the other divides outlined in this paper, at first glance seems to be simply between different types of users (including commercial ones), rather than between users and corporate platforms (and their owner-executives). However, behind this stratification between users is also an empowerment divide between users and corporate platforms based on the power that these platforms have, and users do not, not only to control their own visibility (as noted at the end of the surveillance section) but also to determine the criteria and associated rules and code that shape user visibility. This empowerment divide between users and corporate platforms subsequently feeds into the visibility inequalities between users: the for-profit logic of corporate platforms means that the criteria, rules, and code that shape visibility are such that those voices that bring in more revenue for the platform concerned will be promoted, and empowered, over others. Thus, in agreement with Bucher (2012), social media corporations need to be investigated for how they shape visibility—which I have argued here involves a range of factors including paid-for content, strategic partnerships, value-ranking algorithms, and censorship—and how this shaping empowers unequally.
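The value-ranking logic discussed above, in which algorithmic “value” tracks expected attention and “freshness” (Bucher, 2012, analyzes Facebook’s since-superseded EdgeRank as one such mechanism), can be made concrete with a schematic sketch. The following is a purely hypothetical, EdgeRank-style scoring function; its names, parameters, and weights are illustrative assumptions, not a reconstruction of any platform’s actual, confidential algorithm:

```python
import math
import time

def edge_score(affinity, edge_weight, age_seconds, half_life=86400.0):
    """Score one interaction ("edge") between a viewer and an item.

    Hypothetical EdgeRank-style scoring: value rises with the viewer's
    affinity for the poster and with the platform-assigned weight of the
    interaction type, and decays exponentially as the content ages
    ("freshness"). All names and weights here are illustrative.
    """
    decay = math.exp(-math.log(2) * age_seconds / half_life)
    return affinity * edge_weight * decay

def rank_feed(items, now=None):
    """Order candidate items so 'popular' and 'fresh' content surfaces first.

    Each item is a dict with an "edges" list of interactions; an item's
    score is the sum of its edge scores, and the feed is sorted by score.
    """
    now = now or time.time()
    def item_score(item):
        return sum(
            edge_score(e["affinity"], e["weight"], now - e["timestamp"])
            for e in item["edges"]
        )
    return sorted(items, key=item_score, reverse=True)
```

Even this toy version makes the political-economic point visible: whatever weights are chosen, some voices are systematically foregrounded and others rendered less visible, and those weights are set by the platform owner, not by users.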

IMPLICATIONS FOR DIGITAL DIVIDES RESEARCH

In this paper I have identified and outlined a range of interconnected empowerment divides, or stratifications, associated with the political-economic context of social media platforms, specifically with the domination of social media platform ownership by a few profit-driven corporations (and their owner-executives), divides that have been largely overlooked in both digital divides research and celebrations of social media empowerment. These particular empowerment divides have significant implications for, first, the substantive focus and, second, the normative reach of the field, as I will now outline in turn. First, the divides highlighted in this paper not only indicate the need for their full examination in future research, but also, given the need for such an examination, demand an expansion of the substantive focus of the digital divides research problematic from a focus on inequalities between individual users’ digital media access and usage, inequalities attributed to the contexts of users, to include a focus on inequalities arising from the
contexts of the digital media technology involved. This expansion of the digital divides problematic can and should go further than the questioning of the political-economic context of social media that can be found in this paper. To begin with, researchers should consider possible empowerment divides developing from the political-economic contexts structuring all digital media technologies. For example, the field can and should include consideration of how the empowerment divides in control, surveillance, exploitation, and the shaping of visibility sketched in this paper are extended by the corporate ownership and associated for-profit structuring of search engines, app stores, cloud computing facilities, and other digital systems (Mager, 2012; Mosco, 2014); how a “data divide”—between those producing data and those collecting, analyzing, and accessing data—is a likely outcome of the ever more pervasive systems of largely privately owned and controlled digital sensors that are rapidly spreading throughout advanced capitalist societies (Andrejevic, 2014); and how significant exploitation divides can be found associated with the often precarious and piecemeal labor employed to produce digital technologies, from the mining of raw materials to the assembling of hardware to the scanning and sorting of digital content (Cushing, 2012; Fuchs, 2013b; Huws, 2013). And then there are the more characteristically political factors structuring digital media technologies that should be examined in relation to digital divides. Here I am thinking particularly of divides stemming from state power (which as we have seen in the surveillance section of this paper are often also intertwined with divides associated with economic power). Clearly there are digital control, censorship, surveillance, and visibility divides between states (e.g., the powerful position of the U.S.
government and its allies, vis-à-vis other states, in Internet governance and surveillance), as well as between states and their citizens, and in some cases between citizens as the result of state laws discriminating between users (e.g., in some jurisdictions different users are given different Internet access depending on their relationship to the state). Finally, as well as examining the spectrum of political-economic contexts, digital divides research should take into account how social and cultural contexts may also, to some extent, shape digital technologies so as to feed into digital inequalities. For example, the social and cultural contexts of the programmers of digital media might inform the coding in ways—such as foregrounding certain natural languages—that discriminate in favor of some users over others.22 Thus, the general call here is for digital divides researchers to complement their focus on empowerment inequalities between users arising from differential levels and types of digital media access and usage, differentials attributed to the different contexts of users, with at least an equal

22. See Marwick (2013) for one study that, among other things, reveals various ways in which the cultural context and associated values of programmers become built into the design of social media such as to empower some users more than others.


focus upon the empowerment inequalities resulting from the contexts structuring digital media technologies. These empowerment inequalities include power differentials overlooked in digital divides research between: first, users and the institutions (and some individuals) owning and/or governing the technology; and second, users and other users, inequality here stemming from the contextual structuring of the technology concerned discriminating between users in different ways, as illustrated in the visibility divide section of this paper. Such an expanded, and moreover complexified, digital divides focus—and by implication definition—demands the deployment of research approaches that enable the exploration of the contexts structuring digital media technologies and subsequently shaping uses. I have shown how critical political economy of communication and critical software studies are applicable. Future research is likely to point to other approaches suitable to the expanded research problematic being called for here. Second, turning to the normative implications, the political-economic context outlined in this paper puts into question not only social media empowerment celebrations but also the presupposition of much, if not all, digital divides discussion and research—that digital inclusion and participation are necessarily good. Increasing digital inclusion and participation might also mean, given the domination of social media platform ownership by a few for-profit corporations, increasing control, surveillance, exploitation, and visibility divides. As a result, what it means to “bridge the digital divide” needs more careful consideration, and the extension of normative judgment beyond questioning individual users’ access and usage with respect to user contexts.
And, if we were also to embrace the general normative ideal implicit in much digital divides research, including in this paper—that everyone should have an equal chance to be equally empowered—then an increase in control, surveillance, exploitation, and visibility divides would be given a negative judgment. Moreover, assuming that an explicitly critical approach embracing practical critique was taken up, this judgment would be followed by an exploration of how the identified divides could be mitigated such that digital inclusion and participation lead to greater equality of empowerment. This requires research that goes beyond the aims of this paper, but that this paper establishes the grounds for by illustrating—via the examples given of nonprofit platforms—how empowerment divides or stratifications differ significantly with different ownership structures, thus pointing to where to look for reductions in these inequalities, including to noncommercial and democratic ownership and revenue models (e.g., different types of nonprofit platforms, including public service digital media that promise much, in contrast to many other nonprofits, given that they often start from a more popular reception and have more secure funding) and to the regulation of for-profit platforms (e.g., to allow data ownership and control by users and/or to encourage greater plurality of ownership in contrast to the current concentration). This is all work ahead for those researchers and
practitioners committed to greater equality of empowerment through digital media technologies.

ACKNOWLEDGMENTS

My sincere thanks to the reviewers of The Communication Review for their many insightful comments and to the journal editors for their support and patience.


REFERENCES

Algoritmi, J. (2014, April 24–25). The NSA’s algorithmic citizenship and foreignness. Theorizing the Web conference, New York. Retrieved from http://www.youtube.com/watch?v=tvRpJb4M2N8
Ali, A. H. (2011). The power of social media in developing nations: New tools for closing the global digital divide and beyond. Harvard Human Rights Journal, 24(1), 186–219.
Andrejevic, M. (2011). Exploitation in the data mine. In C. Fuchs, K. Boersma, A. Albrechtslund, & M. Sandoval (Eds.), Internet and surveillance: The challenges of Web 2.0 and social media (pp. 71–88). London, England: Routledge.
Andrejevic, M. (2013). Infoglut: How too much information is changing the way we think and know. New York, NY: Routledge.
Andrejevic, M. (2014). The big data divide. International Journal of Communication, 8, 1673–1689.
Andrejevic, M., Banks, J., Campbell, J., Couldry, N., Fish, A., Hearn, A., & Ouellette, L. (2014). Participations: Dialogues on the participatory promise of contemporary culture and politics, part 2, labour. International Journal of Communication, 8(forum), 1089–1106.
Bard, A. (2012, May 28–30). The Internet revolution. NEXT conference, Berlin, Germany. Retrieved from http://nextberlin.eu/2012/05/alexander-bardthe-internet-revolution/
Borgesius, F. Z. (2012, March 9–11). The ecosystem of online audience buying. UnlikeUs #2 conference: Understanding social media monopolies and their alternatives, Institute of Network Cultures. Retrieved from http://vimeo.com/38840197
Brake, D. R. (2014). Are we all online content creators now? Web 2.0 and digital divides. Journal of Computer-Mediated Communication, 19(3), 591–609. doi:10.1111/jcc4.12042
Bruns, A. (2008). Blogs, Wikipedia, Second Life, and beyond: From production to produsage. New York, NY: Peter Lang Publishing.
Bucher, T. (2012). Want to be on the top? Algorithmic power and the threat of invisibility on Facebook. New Media & Society, 14(7), 1164–1180. doi:10.1177/1461444812440159
Cabello, F., Franco, M. G., & Hache, A. (2013). Towards a free federated social web: Lorea takes the networks! In G. Lovink & M.
Rasch (Eds.), The unlike us reader (pp. 338–346). Amsterdam, The Netherlands: Institute of Network Cultures.
Cushing, E. (2012, August 1). Dawn of the digital sweatshop. East Bay Express. Retrieved from http://www.eastbayexpress.com/oakland/dawn-of-the-digitalsweatshop/Content?oid=3301022
Dencik, L. (2014, January 17). From breastfeeding to politics, Facebook steps up censorship. The Conversation. Retrieved from http://theconversation.com/frombreastfeeding-to-politics-facebook-steps-up-censorship-22098
Facebook. (2014a). About Facebook adverts. Retrieved from https://www.facebook.com/about/ads/
Facebook. (2014b). Government requests reports. Retrieved from https://govtrequests.facebook.com/
Facebook. (2014c, June 12). Making ads better and giving people more control over the ads they see. Retrieved from http://newsroom.fb.com/news/2014/06/making-ads-better-and-giving-people-more-control-over-the-ads-they-see/
Fuchs, C. (2011). Web 2.0, prosumption, and surveillance. Surveillance & Society, 8(3), 288–309.
Fuchs, C. (2013a). Social media and capitalism. In T. Olsson (Ed.), Producing the Internet: Critical perspectives of social media (pp. 25–44). Gothenburg, Sweden: Nordicom.
Fuchs, C. (2013b). Theorising and analysing digital labour: From global value chains to modes of production. The Political Economy of Communication, 1(2), 3–27.
Fuchs, C. (2014). Social media and the public sphere. TripleC, 12(1), 57–101.
Gehl, R. W. (2011). The archive and the processor: The internal logic of Web 2.0. New Media & Society, 13(8), 1228–1244. doi:10.1177/1461444811401735
Gehl, R. W. (2012). Real (software) abstractions: On the rise of Facebook and the fall of Myspace. Social Text, 30(2), 99–119. doi:10.1215/01642472-1541772
Gerlitz, C., & Helmond, A. (2013). The like economy: Social buttons and the data-intensive web. New Media & Society, 15(8), 1348–1365. doi:10.1177/1461444812472322
Gillespie, T. (2010). The politics of “platforms.” New Media & Society, 12(3), 347–364. doi:10.1177/1461444809342738
Gillespie, T. (2014). The relevance of algorithms. In T. Gillespie, P. Boczkowski, & K. Foot (Eds.), Media technologies (pp. 167–194). Cambridge, MA: MIT Press.
Golding, P., & Murdock, G. (2000). Culture, communication, and political economy. In J. Curran & M. Gurevitch (Eds.), Mass media and society (3rd ed., pp. 71–92). London, England: Edward Arnold.
Google. (2014a). How shared endorsements work. Retrieved from https://support.google.com/plus/answer/3403513?hl=en
Google. (2014b). Transparency report. Retrieved from http://www.google.com.au/transparencyreport/
Hands, J. (2011). @ is for activism: Dissent, resistance, and rebellion in a digital culture. London, England: Pluto Press.
Hargittai, E. (2010). Digital na(t)ives? Variation in Internet skills and uses among members of the “Net generation.” Sociological Inquiry, 80(1), 92–113. doi:10.1111/soin.2010.80.issue-1
Huws, U. (Ed.). (2013). Working online, living offline. Work Organization, Labour and Globalization, 7(1), 1–11.

Kilonzo, R. (2014, May 21). How to use Twitter tailored audiences. Social Media Examiner. Retrieved from http://www.socialmediaexaminer.com/twitter-tailored-audiences/

Kim, J. (2012). The institutionalization of YouTube: From user-generated content to professionally generated content. Media, Culture & Society, 34(1), 53–67. doi:10.1177/0163443711427199

Kleiner, D., & Wyrick, B. (2007). Info-enclosure 2.0. Mute: Culture and Politics After the Net, 2(4). Retrieved from http://www.metamute.org/en/InfoEnclosure-2.0

Leistert, O. (2013). Smell the fish: Digital Disneyland and the right to oblivion. First Monday, 18(3–4). doi:10.5210/fm.v18i3.4619

Lessig, L. (1999). Code and other laws of cyberspace. New York, NY: Basic Books.

Lovink, G., & Rasch, M. (Eds.). (2013). Unlike us reader: Social media monopolies and their alternatives. Amsterdam, The Netherlands: Institute of Network Cultures.

Lukes, S. (2005). Power: A radical view. Hampshire, UK: Palgrave Macmillan.

Lunden, I. (2013, December 8). Twitter is taking a "log-out" approach to raise usage, awareness in emerging markets with USSD on mobiles. Techcrunch. Retrieved from Techcrunch.com/2013/12/08

Mager, A. (2012). Algorithmic ideology: How capitalist society shapes search engines. Information, Communication & Society, 15(5), 769–787. doi:10.1080/1369118X.2012.676056

Manyika, J. (2012, November 8). The social economy: Unleashing value and productivity through social technologies. Oxford Internet Institute webcast. Retrieved from http://webcast.oii.ox.ac.uk/?view=Webcast&ID=20121108_470

Mirani, L. (2015). Millions of Facebook users have no idea they're using the Internet. Quartz. Retrieved from http://qz.com/333313/milliions-of-facebook-users-have-no-idea-theyre-using-the-internet/

Marwick, A. E. (2013). Status update: Celebrity, publicity, and branding in the social media age. New Haven, CT: Yale University Press.

Merrin, W. (2012). Still fighting "the Beast": Guerrilla television and the limits of YouTube. Cultural Politics, 8(1), 97–119.

Mosco, V. (2009). The political economy of communication (2nd ed.). London, England: Sage.

Mosco, V. (2014). To the cloud: Big data in a turbulent world. Boulder, CO: Paradigm.

Olsson, T., & Svensson, A. (2012). Producing prod-users: Conditional participation in a Web 2.0 consumer community. Javnost/The Public, 19(3), 41–58.

Rosenbush, S. (2013, October 17). Facebook tests software to track your cursor on screen. The Wall Street Journal. Retrieved from http://blogs.wsj.com/cio/2013/10/30/facebook-considers-vast-increase-in-data-collection/

Rosendaal, A. (2012, March 8–10). Who decides who I am? UnlikeUs #2 conference: Understanding social media monopolies and their alternatives, Institute of Network Cultures, Amsterdam, The Netherlands. Retrieved from http://vimeo.com/38838266
Schradie, J. A. (2011). The digital production gap: The digital divide and Web 2.0 collide. Poetics, 39(2), 145–168. doi:10.1016/j.poetic.2011.02.003

Sengupta, S. (2013, March 26). What you didn't post, Facebook may still know. The New York Times. Retrieved from http://www.cnbc.com/id/100590334

Smith, A. (2013). Civic engagement in the digital age (Pew Internet & American Life Project Report). Retrieved from http://pewinternet.org/Reports/2013/Civic-Engagement/

Stein, L. (2013). Policy and participation on social media: The cases of YouTube, Facebook, and Wikipedia. Communication, Culture & Critique, 6(3), 353–371. doi:10.1111/cccr.12026

Sulaiman, K. (2012, March 6). The Kurdish Facebook scandal. eKurd.net. Retrieved from http://www.ekurd.net/mismas/articles/misc2012/3/turkey3815.htm

Talbot, D. (2013, April 9). Facebook's real "home" may be the developing world. MIT Technology Review. Retrieved from http://www.technologyreview.com/news/513416/facebooks-real-home-may-be-the-developing-world/

Turow, J. (2011). The daily you: How the new advertising industry is defining your identity and your worth. New Haven, CT: Yale University Press.

Twitter. (2014). Information requests January 1 to June 30 2014 (Twitter transparency report). Retrieved from https://transparency.twitter.com/information-requests/2014/jan-jun

Vahl, A. (2014, May 5). Boost posts or promoted posts on Facebook: Which is better? Social Media Examiner. Retrieved from http://www.socialmediaexaminer.com/facebook-boost-posts-promoted-posts/

van Dijck, J. (2013). The culture of connectivity: A critical history of social media. Oxford, UK: Oxford University Press.

van Dijk, J. A. G. M. (2012). The evolution of the digital divide: The digital divide turns to inequality of skills and usage. In J. Bus, M. Crompton, M. Hildebrandt, & G. Metakides (Eds.), Digital enlightenment yearbook (pp. 57–75). Amsterdam, The Netherlands: IOS Press.

Williams, M. (2012, September 15). Twitter complies with prosecutors to surrender Occupy activist's tweets. The Guardian. Retrieved from http://www.guardian.co.uk/technology/2012/sep/14/twitter-complies-occupy-activist-tweets

Wyatt, E., & Miller, C. C. (2013, December 9). Tech giants issue call for limits on government surveillance of users. The New York Times. Retrieved from http://www.nytimes.com/2013/12/09/technology/tech-giants-issue-call-for-limits-on-government-surveillance-of-users.html?pagewanted=all&_r=0

Youmans, W. L., & York, J. C. (2012). Social media and the activist toolkit: User agreements, corporate interests, and the information infrastructure of modern social movements. Journal of Communication, 62(2), 315–329. doi:10.1111/j.1460-2466.2012.01636.x
