Reading response (at least 300 words)

Anonymous
Asked: Oct 22nd, 2018
$9.99

Question Description

Please read the attached article: James Grimmelmann, 2015, "The Virtues of Moderation" (you may skip Section IV, "Lessons for Law"). After you read it, please write a 300-word reading response. Do not plagiarize, and do not use outside sources. The format should be APA, and you should include some in-text citations in the reading response. The response should also relate to a personal example or another interesting example.

Unformatted Attachment Preview

THE VIRTUES OF MODERATION

James Grimmelmann*
17 YALE J.L. & TECH. 42 (2015)

ABSTRACT

TL;DR—On a Friday in 2005, the Los Angeles Times launched an experiment: a "wikitorial" on the Iraq War that any of the paper's readers could edit. By Sunday, the experiment had ended in abject failure: vandals overran it with crude profanity and graphic pornography. The wikitorial took its inspiration and its technology from Wikipedia, but missed something essential about how "the free encyclopedia that anyone can edit" staves off abuse while maintaining its core commitment to open participation. The difference is moderation: the governance mechanisms that structure participation in a community to facilitate cooperation and prevent abuse. Town meetings have moderators, and so do online communities. A community's moderators can promote posts or hide them, honor posters or shame them, recruit users or ban them. Their decisions influence what is seen, what is valued, what is said. They create the conditions under which cooperation is possible. This Article provides a novel taxonomy of moderation in online communities. It breaks down the basic verbs of moderation—exclusion, pricing, organizing, and norm-setting—and shows how they help communities walk the tightrope between the chaos of too much freedom and the sterility of too much control. Scholars studying the commons can learn from moderation, and so can policy-makers debating the regulation of online communities.

* Professor of Law, University of Maryland Francis King Carey School of Law.
My thanks for their comments to Aislinn Black, BJ Ard, Jack Balkin, Shyam Balganesh, Nicholas Bramble, Danielle Citron, Anne Huang, Matt Haughey, Sarah Jeong, Amy Kapczynski, David Krinsky, Chris Riley, Henry Smith, Jessamyn West, Steven Wu, and participants in the 2007 Commons Theory Workshop for Young Scholars at the Max Planck Institute for the Study of Collective Goods, the 2007 Intellectual Property Scholars Conference, the 2007 Telecommunications Policy Research Conference, and the 2014 Free Expression Scholars Conference. This Article may be freely reused under the terms of the Creative Commons Attribution 4.0 International license, https://creativecommons.org/licenses/by/4.0. Attribution under the license should take the form "James Grimmelmann, The Virtues of Moderation, 17 YALE J.L. & TECH. 42 (2015)" or its equivalent in an appropriate citation system.

TABLE OF CONTENTS

Introduction
I. The Problem of Moderation
   A. Definitions
   B. Goals
   C. Commons Problems
   D. Abuse
II. The Grammar of Moderation
   A. Techniques (Verbs)
      1. Excluding
      2. Pricing
      3. Organizing
      4. Norm-Setting
   B. Distinctions (Adverbs)
      1. Automatically / Manually
      2. Transparently / Secretly
      3. Ex Ante / Ex Post
      4. Centrally / Distributedly
   C. Community Characteristics (Adjectives)
      1. Infrastructure Capacity
      2. Community Size
      3. Ownership Concentration
      4. Identity
III. Case Studies
   A. Wikipedia
   B. The Los Angeles Times Wikitorial
   C. MetaFilter
   D. Reddit
IV. Lessons for Law
   A. Communications Decency Act § 230
   B. Copyright Act § 512
V. Conclusion

Building a community is pretty tough; it requires just the right combination of technology and rules and people. And while it's been clear that communities are at the core of many of the most interesting things on the Internet, we're still at the very early stages of understanding what it is that makes them work.
–Aaron Swartz1

INTRODUCTION

If you've never seen the image known as "goatse," trust me—you don't want to.2 But if you have, you understand why it was such a disaster when this notoriously disgusting photograph showed up on the website of the Los Angeles Times on June 19, 2005.3 It wasn't a hack. The newspaper had invited its readers to post whatever they wanted. One of them posted a gaping anus. It had started off innocently enough.
Inspired by Wikipedia, the Times launched a "wikitorial," an editorial that any of the paper's readers could edit.4 At first, readers fought over its position: should it be for or against the Iraq War?5 Then one boiled the argument down to its essence—"Fuck USA"—touching off an edit war of increasingly rapid and radically incompatible changes.6 By the second day, trolls were posting hardcore pornography, designed to shock and disgust.7 The Times pulled the plug entirely in less than forty-eight hours.8 What had started with "Rewrite the editorial yourself"9 ended with the admission that "a few readers were flooding the site with inappropriate material."10

The wikitorial debacle has the air of a parable: the Los Angeles Times hung a "KICK ME" sign on its website, and of course it got kicked. Open up an online community, and of course you'll bring out the spammers, the vandals, and the trolls. That's just how people act on the Internet. But consider this: the Times' model, Wikipedia, is going into its thirteenth year.11 It is the sixth most-visited website on the Internet.12 And despite being a website "that anyone can edit," it remains almost entirely goatse-free.13 Anarchy on the Internet is not inevitable. Spaces can and do flourish where people collaborate and where all are welcome.

What, then, separates the Wikipedias from the wikitorials? Why do some communities thrive while others become ghost towns? The difference is moderation. Just as town meetings and debates have moderators who keep the discussion civil and productive,14 healthy online communities have moderators who facilitate communication. A community's moderators can promote posts or hide them, honor posters or shame them, recruit users or ban them. Their decisions influence what is seen, what is valued, what is said. When they do their job right, they create the conditions under which cooperation is possible. Wikipedia, for all its faults, is moderated in a way that supports an active community of mostly productive editors. The Los Angeles Times, for all its good intentions, moderated the wikitorial in a way that provided few useful defenses against vandals. Wikipedia's moderation keeps its house in order; the Times gave arsonists the run of the place.

This Article is a guided tour of moderation for legal scholars. It synthesizes the accumulated insights of four groups of experts who have given the problem of moderation their careful and sustained attention.

Footnotes:
1. Aaron Swartz, Making More Wikipedias, RAW THOUGHT (Sept. 14, 2006), http://www.aaronsw.com/weblog/morewikipedias [http://perma.cc/U2LR-CDTB].
2. The image, which has circulated on the Internet since 1999, depicts a man exposing himself to the camera in a particularly graphic and unpleasant way. In its heyday, goatse was most often used for its shock value: direct people to a website containing it, and revel in their horror. See Adrian Chen, Finding Goatse: The Mystery Man Behind the Most Disturbing Internet Meme in History, GAWKER, Apr. 10, 2012, http://gawker.com/finding-goatse-the-mystery-man-behind-the-most-disturb-5899787 [http://perma.cc/6RJ8-WVAW].
3. See, e.g., Dan Glaister, LA Times 'Wikitorial' Gives Editors Red Face, THE GUARDIAN, June 21, 2005, http://www.theguardian.com/technology/2005/jun/22/media.pressandpublishing [http://perma.cc/NY5A-3A83].
4. A Wiki for Your Thoughts, L.A. TIMES, June 17, 2005, http://www.latimes.com/news/la-ed-wiki17jun17-story.html [http://perma.cc/4QW8-RH7C].
5. Glaister, supra note 3.
6. Id.
7. Id.
8. James Rainey, 'Wikitorial' Pulled Due to Vandalism, L.A. TIMES, June 21, 2005, http://articles.latimes.com/2005/jun/21/nation/na-wiki21 [http://perma.cc/TJ2J-AD7S].
9. A Wiki for Your Thoughts, supra note 4.
The first is moderators themselves—those who are entrusted with the care and feeding of online communities. They have written at length about helpful interventions and harmful ones, giving guidelines and rules of thumb for nudging users towards collaborative engagement.15 A second group, the software and interface designers who are responsible for the technical substrate on which online communities run, works closely with the first (indeed, they are often the same people). Their own professional literature offers a nuanced understanding of how the technical design of a social space influences the interactions that take place there.16 The third group consists of academics from a wide variety of disciplines—psychology, communications, and computer science, to name just a few—who have turned a scholarly eye on the factors that make communities thrive or wither.17 The fourth is made up of journalists who cover the online beat by embedding themselves in communities (often in moments of high drama).18

The Article draws on these various sources to present a novel taxonomy of moderation. The taxonomy takes the form of a grammar—a set of nouns, verbs, adverbs, and adjectives suitable for describing the vast array of moderation techniques in common use on the Internet. The Article describes these techniques in terms of familiar jurisprudential categories such as ex ante versus ex post and norms versus architecture.

This richer understanding of moderation should be useful to scholars and regulators in two ways. One is theoretical: well-moderated online communities catalyze human cooperation. Studying them can provide insights into the management of common-pool resources and the creation of information goods, two problems moderation must solve simultaneously. Studying online communities is thus like studying fisheries or fan fiction—a way to understand society. The other payoff is practical. Many laws either regulate the activities of online communities or exempt them from regulation. The wisdom of these choices depends on empirical facts about the value and power of moderation.

Footnotes:
10. Rainey, supra note 8.
11. See generally ANDREW LIH, THE WIKIPEDIA REVOLUTION (2009).
12. See Top Sites, ALEXA, http://www.alexa.com/topsites [http://perma.cc/36H3-9STW] (last visited Mar. 30, 2015); see also Wikipedia: Statistics, WIKIPEDIA, http://en.wikipedia.org/wiki/Wikipedia:Statistics#pageviews [http://perma.cc/HW25-U4WS] (last visited Jan. 20, 2015) (reporting 4,841,082 articles in the English-language version).
13. But see goatse.cx, WIKIPEDIA, http://en.wikipedia.org/wiki/Goatse.cx [http://perma.cc/7YQD-EBGH] (last visited Feb. 23, 2015) (telling rather than showing).
14. See, e.g., ALEXANDER MEIKLEJOHN, FREE SPEECH AND ITS RELATION TO SELF-GOVERNMENT 25-26 (1948) ("[A]t a town meeting . . . [n]o competent moderator would tolerate . . . wasting . . . the time available for free discussion," but "no suggestion of policy shall be denied a hearing because it is on one side of the issue rather than another.").
15. See generally JONO BACON, THE ART OF COMMUNITY: BUILDING THE NEW AGE OF PARTICIPATION (2d ed. 2012); AMY JO KIM, COMMUNITY BUILDING ON THE WEB (2000); DEBORAH NG, ONLINE COMMUNITY MANAGEMENT FOR DUMMIES (2011); DEREK POWAZEK, DESIGN FOR COMMUNITY (2001); JENNY PREECE, ONLINE COMMUNITIES: DESIGNING USABILITY, SUPPORTING SOCIABILITY (2000).
16. See generally GAVIN BELL, BUILDING SOCIAL WEB APPLICATIONS (2009); CHRISTIAN CRUMLISH & ERIN MALONE, DESIGNING SOCIAL INTERFACES (2009); F. RANDALL FARMER & BRYCE GLASS, BUILDING WEB REPUTATION SYSTEMS (2010); JENIFER TIDWELL, DESIGNING INTERFACES (2d ed. 2010). A particularly fruitful trend in this literature consists of pattern languages: interlocking networks of design elements that have repeatedly proven their worth. The idea of pattern languages comes from the work of the architectural theorist Christopher Alexander. See, e.g., CHRISTOPHER ALEXANDER, THE TIMELESS WAY OF BUILDING (1979) (presenting a theory of patterns); CHRISTOPHER ALEXANDER ET AL., A PATTERN LANGUAGE: TOWNS, BUILDINGS, CONSTRUCTION (1977) (developing pattern language for architecture). Software designers took his idea of a pattern as "a careful description of a perennial solution to a recurring problem within a building context," Aims & Goals, PATTERNLANGUAGE.COM, http://www.patternlanguage.com/aims/intro.html [http://perma.cc/9BE6-BM4A], and generalized it to technical problems in computer system design. See, e.g., ERICH GAMMA ET AL., DESIGN PATTERNS: ELEMENTS OF REUSABLE OBJECT-ORIENTED SOFTWARE (1994); RICHARD P. GABRIEL, PATTERNS OF SOFTWARE: TALES FROM THE SOFTWARE COMMUNITY (1996). From there, it was only a small step to develop patterns describing how people use software; indeed, these interaction patterns come closest to Alexander's goal of finding patterns that make "towns and buildings . . . able to come alive." ALEXANDER, A PATTERN LANGUAGE, supra, at x. Notable examples of pattern languages for social interactions using software include MEATBALLWIKI, http://meatballwiki.org/wiki [http://perma.cc/9RUZ-YZNK]; YAHOO DESIGN PATTERN LIBRARY, https://developer.yahoo.com/ypatterns/ [https://perma.cc/RAZ6-N4XM]; and ONLINE MODERATION STRATEGIES, https://web.archive.org/web/20070419071423/http://social.itp.nyu.edu/shirky/wiki [https://perma.cc/NWZ2-WM5L]. This Article uses a different analytical structure to describe moderation, but the themes of these pattern languages inform the thinking behind it.
17. For an outstanding synthesis of the literature, see ROBERT E. KRAUT & PAUL RESNICK, BUILDING SUCCESSFUL ONLINE COMMUNITIES: EVIDENCE-BASED SOCIAL DESIGN (2012).
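Grimmelmann's grammar (verbs for techniques, adverbs for distinctions) lends itself to a small data model. The sketch below is my own Python illustration of that taxonomy, not anything from the Article itself; all class and field names are invented for exposition.

```python
from dataclasses import dataclass
from enum import Enum

# The four techniques ("verbs") of moderation named in the Article.
class Verb(Enum):
    EXCLUDING = "excluding"
    PRICING = "pricing"
    ORGANIZING = "organizing"
    NORM_SETTING = "norm-setting"

# The four distinctions ("adverbs"); each is a binary axis.
class Timing(Enum):
    EX_ANTE = "ex ante"   # applied before content appears
    EX_POST = "ex post"   # applied after content appears

class Actor(Enum):
    AUTOMATIC = "automatic"
    MANUAL = "manual"

class Visibility(Enum):
    TRANSPARENT = "transparent"
    SECRET = "secret"

class Locus(Enum):
    CENTRALIZED = "centralized"
    DISTRIBUTED = "distributed"

@dataclass(frozen=True)
class ModerationAction:
    """One concrete act of moderation, described in the grammar's terms."""
    verb: Verb
    timing: Timing
    actor: Actor
    visibility: Visibility
    locus: Locus

# Example: a Wikipedia-style revert of vandalism is ex post, manual,
# transparent, distributed organizing.
revert = ModerationAction(Verb.ORGANIZING, Timing.EX_POST, Actor.MANUAL,
                          Visibility.TRANSPARENT, Locus.DISTRIBUTED)
print(revert.verb.value)  # organizing
```

Classifying any moderation decision as one such tuple is the point of the grammar: the same verb (say, excluding) behaves very differently depending on which adverbs accompany it.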
Regulators cannot properly evaluate these laws without paying close attention to how moderation plays out on the ground.

Part I of the Article provides basic definitions and describes the dual commons problems that online communities confront. Part II supplies the detailed grammar of moderation, liberally annotated with examples. Part III presents four case studies of moderation in action: Wikipedia, the Los Angeles Times wikitorial, MetaFilter, and Reddit. Part IV offers some lessons for regulators by examining the two most important statutes that regulate moderation: § 230 of the Communications Decency Act, and § 512 of the Copyright Act. Part V concludes.

I. The Problem of Moderation

By "moderation," I mean the governance mechanisms that structure participation in a community to facilitate cooperation and prevent abuse. Part II will explain how moderation works; this Part lays the foundation by describing the problems it must solve. Section A supplies some basic definitions and details the motivations of community members; Section B describes the goals of good moderation; Section C explains why moderation must confront not one, but two commons problems; and Section D provides a typology of the abuses against which moderation guards.

A. Definitions

Our object of study is an online community.19 A community can be as small as the handful of people on a private mailing list or as large as the Internet itself. Communities can overlap, as anyone on both Twitter and Facebook knows. Communities can also nest: the comments section at Instapundit is a meaningful community, and so is the conservative blogosphere. There is little point in being overly precise about any given community's boundaries, so long as we can identify three things: the community's members, the content they share with each other, and the infrastructure they use to share it.20

The Internet as a whole is both an agglomeration of numerous communities and a sprawling, loosely knit community in its own right. Its moderation includes both the moderation within its constituent communities and moderation that cannot easily be attributed to any of them. Thus, even though it is not particularly helpful to talk about Google as a community in its own right,21 it and other search engines play an important role in the overall moderation of the Web.22

Members can wear different hats: there are owners of the infrastructure, moderators of the community, and authors and readers of content. For example, on YouTube, Google owns the infrastructure; video uploaders are authors; video viewers are readers; and the moderators include everyone who clicks to flag an inappropriate video,23 the algorithms that collate user reports, and the unlucky YouTube employees who manually review flagged videos.24 Owners occupy a privileged position because their control over infrastructure gives them unappealable control over the community's software-based rules.25 This control lets owners decide who can moderate and how. Moderato ...

Footnotes:
18. Examples will appear throughout the Article, but a good starting point would be Adrian Chen's work. See, e.g., Adrian Chen, The Laborers Who Keep Dick Pics and Beheadings out of Your Facebook Feed, WIRED, Oct. 23, 2014, http://www.wired.com/2014/10/content-moderation [http://perma.cc/FJK6-B9SC].
19. The defined terms that make up the vocabulary of moderation will be written in bolded italics when they make their first appearances in the Article.
20. These are virtual communities, defined by a shared virtual place rather than by shared geography, meaning, or practice. See generally HOWARD RHEINGOLD, THE VIRTUAL COMMUNITY (1993); Quinn Warnick, What We Talk About When We Talk About Talking: Ethos at Work in an Online Community (2010) (unpublished Ph.D. dissertation, Iowa State University), http://lib.dr.iastate.edu/cgi/viewcontent.cgi?article=2480&context=etd [http://perma.cc/P9JK-HY2P]; see also PREECE, supra note 15, at 10-17.
21. The problem is that there is not a close nexus between Google's users, the content it indexes, and the infrastructure in Google's server farms. Most of the websites whose content appears on Google are Google "users" only in a very loose sense, and they bring their own server infrastructure to the table. There is interesting moderation here, but "Google" is the wrong level of generality for identifying the community that the moderation affects.
22. See generally James Grimmelmann, Speech Engines, 98 MINN. L. REV. 868, 893-96 (2014) (discussing the role of search engines in organizing the Internet).
23. See Alistair Barr & Lisa Fleisher, YouTube Enlists 'Trusted Flaggers' to Police Videos, WALL ST. J., Mar. 17, 2014, http://blogs.wsj.com/digits/2014/03/17/youtube-enlists-trusted-flaggers-to-police-videos [http://perma.cc/Z6HY-RYKU].
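The YouTube flagging loop just described (viewers flag a video, algorithms collate the reports, and employees review what the algorithms escalate) can be sketched as a toy pipeline. Everything below is a hypothetical illustration in Python; the class, threshold, and API are invented for exposition and are not YouTube's actual system.

```python
from collections import Counter

class FlagQueue:
    """Collate per-video flags; escalate to human review past a threshold.

    A toy sketch of the flag -> collate -> review loop described in the
    text. The threshold value and method names are invented.
    """
    def __init__(self, review_threshold=3):
        self.review_threshold = review_threshold
        self.flags = Counter()   # video_id -> number of flags received
        self.review_queue = []   # video_ids awaiting a human decision

    def flag(self, video_id):
        """A viewer flags a video as inappropriate."""
        self.flags[video_id] += 1
        # The "algorithmic collation" step: escalate once enough
        # independent viewers have flagged the same video.
        if self.flags[video_id] == self.review_threshold:
            self.review_queue.append(video_id)

    def review(self, decide):
        """A human moderator drains the queue; decide(video_id) -> bool."""
        removed = [v for v in self.review_queue if decide(v)]
        self.review_queue.clear()
        return removed

q = FlagQueue(review_threshold=2)
for viewer_flag in ["vid1", "vid2", "vid1"]:
    q.flag(viewer_flag)
print(q.review(lambda v: v == "vid1"))  # ['vid1']
```

The design choice worth noting is the division of labor the Article highlights: cheap distributed flagging by readers, automatic collation in the middle, and expensive manual judgment reserved for the small set of escalated cases.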

Tutor Answer

toto
School: UT Austin

Attached.

Running Head: READING RESPONSE

Reading Response
Name
Institutional Affiliation

The article "The Virtues of Moderation" by James Grimmelmann (2015) is an informative and insightful piece because it offers tangible guidance on how the power of moderation can help to minimize abuse in online communities. Grimmelmann cites the example of the "wikitorial," which was abused by users, to depict how moderators can play a positive role in pr...

Review

Anonymous
Good stuff. Would use again.
