Liberation Technology
China’s “Networked Authoritarianism”
Rebecca MacKinnon
Rebecca MacKinnon is a Bernard L. Schwartz Senior Fellow at the
New America Foundation. She is cofounder of Global Voices Online
(www.globalvoicesonline.org), a global citizen-media network. This
essay draws on testimony that she gave before the U.S. Congressional-Executive Commission on China (www.cecc.gov) on 24 March 2010.
To mark the twentieth anniversary of the fall of the Berlin Wall, a Ger-
man arts organization launched a website called the “Berlin Twitter
Wall.” Anyone anywhere on the Internet could use Twitter to post a comment into one of the speech bubbles. Within a few days of its launch, the
website was overrun by messages in Chinese. Instead of talking about
the end of the Cold War and the fall of communism in Europe, Chinese
Twitter users accessed the site to protest their own government’s Internet
censorship. One wrote: “My apologies to German people a million times
[for taking over this site]. But I think if Germans learn about our situation,
they would feel sorry for us a million times.” Twitter is blocked in China.
Still, a growing community is so determined to gain access to the widely
used social-networking service and hold uncensored conversations with
people around the world that these Chinese Internet users have acquired
the technical skills to circumvent this censorship system—widely known
as the “Great Firewall of China,” a filtering system that blocks access to overseas websites from domestic Internet connections.
In late January 2010, U.S. secretary of state Hillary Clinton—who
two months earlier had stood at Berlin’s Brandenburg Gate with other
world leaders to celebrate the twentieth anniversary of the fall of the
Wall—gave a 45-minute speech on “Internet Freedom.” She spelled out
why a single, free, and open global Internet is an essential prerequisite for freedom and democracy in the twenty-first century. “A new information curtain is descending across much of the world,” she warned.
“And beyond this partition, viral videos and blog posts are becoming the
samizdat of our day.”1
Journal of Democracy Volume 22, Number 2 April 2011
© 2011 National Endowment for Democracy and The Johns Hopkins University Press
But can we assume that Chinese authoritarianism will crumble just
as the Iron Curtain crumbled two decades ago? It is unwise to assume
that the Internet will lead to rapid democratization in China
or in other repressive regimes. There are difficult issues of government
policy and corporate responsibility that must be resolved in order to ensure that the Internet and mobile technologies can fulfill their potential
to support liberation and empowerment.
When an authoritarian regime embraces and adjusts to the inevitable
changes brought by digital communications, the result is what I call
“networked authoritarianism.” In the networked authoritarian state, the
single ruling party remains in control while a wide range of conversations about the country’s problems nonetheless occurs on websites and
social-networking services. The government follows this online chatter, and sometimes people are able to use the Internet to call attention
to social problems or injustices and even manage to have an impact on
government policies. As a result, the average person with Internet or
mobile access has a much greater sense of freedom—and may feel that
he has the ability to speak and be heard—in ways that were not possible
under classic authoritarianism. At the same time, in the networked authoritarian state, there is no guarantee of individual rights and freedoms.
Those whom the rulers see as threats are jailed; truly competitive, free,
and fair elections are not held; and the courts and the legal system are
tools of the ruling party.
As residents of a networked authoritarian society, China’s more than
four hundred million Internet users are managing to have more fun, feel
more free, and be less fearful of their government than was the case
even a mere decade ago. At the same time, however, the government
has continued to monitor its people and to censor and manipulate online
conversations to such a degree that no one has been able to organize a
viable opposition movement. According to the Dui Hua Foundation, a
human-rights advocacy organization, arrests and indictments on charges
of “endangering state security”—the most common charge used in cases
of political, religious, or ethnic dissent—more than doubled in 2008
for the second time in three years.2 Average Chinese citizens, however,
rarely hear of such trends—an “information gap” which makes it much
less likely that a critical mass of them will see the need for rapid political change. The system does not control all of the people all of the time,
but it is effective enough that even most of China’s best and brightest
are not aware of the extent to which their understanding of their own
country—let alone the broader world—is being blinkered and manipulated. All university students in China’s capital now have high-speed Internet access. But when a documentary crew from U.S. public television
recently went onto Beijing university campuses and showed students the
iconic 1989 photograph of a man standing in front of a tank in Tiananmen Square, most did not recognize the picture at all.
The Chinese experience teaches us a globally applicable lesson: Independent activists and prodemocracy movements may have won some
early skirmishes against censorship, but one cannot assume that their
adversaries will remain weak and unskilled in the navigation and manipulation of digital communications networks. In fact, governments
and others whose power is threatened by digital insurgencies are learning quickly and pouring unprecedented resources into building their
capacity to influence and shape digital communications networks in
direct and indirect ways. As Larry Diamond put it: “It is not technology, but people, organizations, and governments that will determine
who prevails.”3
In the public discourse about the Internet and repressive regimes,
Western policy makers and activists frequently use Cold War–era metaphors in ways that are similar to Clinton’s likening of blogs to Soviet-era samizdat. Such metaphors are strongest in the policy discourse about
the Great Firewall of China. The Hong Kong–based communications
scholar Lokman Tsui has criticized this “Iron Curtain 2.0” lens through
which many in the West seek to understand the Chinese government’s
relationship with the Internet. “Strategies to break down the Great Firewall,” he writes, “are based on the belief that the Internet is a Trojan
Horse (another metaphor!) that eventually will disempower the Chinese state from within and topple the authoritarian government, as the
barbarians in previous times have done for China, and as international
broadcasting has done with regard to ending communism in the Cold
War.” Tsui argues that this framework for understanding the impact of
the Internet on Chinese politics is not consistent with the growing body
of empirical research and is therefore likely to result in failed policy and
activism strategies.4
Guobin Yang, who began researching Chinese online discourse even
before the Internet first became commercially available there in 1995,
has concluded that in spite of China’s increasingly sophisticated system
of censorship and surveillance, the Chinese Internet is nonetheless a
highly “contentious” place where debate is fierce, passionate, and also
playful. After analyzing numerous cases in which Chinese Internet users succeeded in bringing injustices to national attention or managed to
cause genuine changes in local-government policies or official behavior, Yang argues that the Internet has brought about a “social revolution,
because the ordinary people assume an unprecedented role as agents of
change and because new social formations are among its most profound
outcomes.”5 Note that the revolution he describes is being waged mainly
by Chinese people posting and accessing information on websites and
services operated by Chinese companies—in other words, acting inside
the Great Firewall.
In examining the use of information and communications technologies (ICTs) by China’s “have-less” working classes, Jack Linchuan Qiu
documents how Internet and mobile-phone use has spread down to the
“lower strata” of Chinese society. This development has given birth to
a new “working-class network society” that provides China’s less fortunate people with tools for mobility, empowerment, and self-betterment.
Yet he also describes how “working-class ICTs” provide new levers
for government and corporations to organize and control a new class of
“programmable labor.” While Chinese workers have been able to use Internet and mobile technologies to organize strikes and share information
about factory conditions in different parts of the country, Qiu concludes
that “working-class ICTs by themselves do not constitute a sufficient
condition for cultural and political empowerment.”6
Can Online Activism Help Authoritarians?
In his book Technological Empowerment: The Internet, State, and
Society in China, Yongnian Zheng points out that the success or failure
of online activism in China depends on its scope and focus, and that
some online activism—particularly that which is at the local level or
targets specific policy issues over which there are divisions or turf
wars between different parts of the government—can actually serve
to bolster regime legitimacy. The least successful online movements
tend to be those that advocate various forms of political “exit,” including calls for an end to one-party rule by the Chinese Communist Party
(CCP) and greater political autonomy or independence for particular
ethnic or religious groups. “When the regime is threatened by challengers,” Zheng writes, “the soft-liners and hard-liners are likely to
stand on the same side and fight the challengers.” On the other hand,
successful online movements in China are usually characterized by
what Zheng (following Albert O. Hirschman) calls the “voice” option,
or what other political scientists call the “cooperation option.” Such
online insurgencies actually provide ammunition to reformist leaders
or liberal local bureaucrats in their power struggles against hard-line
conservative colleagues. Voice activism helps reduce political risks to
reformist officials, who can point to online sentiment and argue that
without action or policy change there will be more unrest and public
unhappiness.7
Thus, rising levels of online activism in China cannot automatically be interpreted as a sign of impending democratization. One
must examine what kind of online activism is succeeding and what
kind is failing. If voice activism is for the most part succeeding while
exit activism is systematically being stifled and crushed—thanks to
high levels of systematic censorship and surveillance, in addition to
the lack of an independent or impartial judiciary—one can conclude
that the CCP has adapted to the Internet much more successfully than
most Western observers realize. The Iron Curtain 2.0 mentality criticized by Tsui may indeed have blinded many Western policy makers,
human-rights activists, and journalists to what is really happening in
China. In 2005, New York Times columnist Nicholas Kristof wrote
breathlessly: “it’s the Chinese leadership itself that is digging the
Communist Party’s grave, by giving the Chinese people broadband.”8
Zheng’s analysis, however, supports the opposite conclusion: The Internet may actually prolong the CCP’s rule, bolstering its domestic
power and legitimacy while the regime enacts no meaningful political
or legal reforms.
Public-policy discourse and deliberation are not exclusive features
of democracies. Political scientists have identified varying amounts of
public discourse and deliberation in a range of authoritarian states. In
2008, Baogang He and Mark Warren coined the term “authoritarian deliberation” to explain how China’s authoritarian regime uses “deliberative venues” to bolster regime legitimacy. While it is possible that the
deliberation now taking place within Chinese authoritarianism might
bring about eventual democratization, Baogang He and Warren believe
that this is only one of two possibilities. The other is that the deliberative practices embraced by the state could stabilize and extend the
CCP’s authoritarian rule.9
Min Jiang applies the concept of authoritarian deliberation specifically to Chinese cyberspace, identifying four main deliberative spaces:
1) “central propaganda spaces,” meaning websites and forums built and
operated directly by the government; 2) “government-regulated commercial spaces,” meaning websites and other digital platforms that are
owned and operated by private companies but subject to government
regulation, including elaborate requirements for content censorship
and user surveillance; 3) “emergent civic spaces,” meaning websites
run by nongovernmental organizations and noncommercial individuals,
which are censored less systematically than commercial spaces but are
nonetheless subject to registration requirements as well as intimidation,
shutdown, or arrest when authors cross the line or administrators fail
to control community conversations; and 4) “international deliberative
spaces,” meaning websites and services that are hosted beyond Chinese-government jurisdiction—some of which are blocked and require
circumvention tools to access—where content and conversations not
permitted on domestic websites can be found, and where more internationally minded Chinese Internet users seek to conduct conversations
with a broader global public.
It is important to note that the Great Firewall is meant to control
only the fourth category of deliberative space, the one that is located outside China. Yet it is the first two categories, as Jiang points
out, that have the greatest impact on Chinese public opinion. The
state uses much more direct and proactive means to control the first
three deliberative spaces, all of which operate within the jurisdic-
Rebecca MacKinnon
37
tion of the Chinese government. Undesirable or “sensitive” content
is either deleted from the Internet altogether or blocked from being
published.10
The Web as Waterworks
Chinese scholar Li Yonggang has suggested that, instead of using a
“firewall” metaphor, it is more helpful to think of Chinese Internet controls—which include not only censorship but surveillance and manipulation of information—as something like a hydroelectric water-management system. Managers have both routine and crisis-management goals:
managing daily flows and distribution on the one hand and managing
droughts and floods on the other. It is a huge, complex system with
many moving parts, and running it requires flexibility. It is impossible
for the central government to have total control over every detail of water level or pressure at any given time. The system’s managers learn and
innovate as they go along.11
Recent Chinese-government statements show that, like water, the Internet is viewed as simultaneously vital and dangerous. According to the
2010 government white paper “The Internet in China,” rapid, nationwide
expansion of Internet and mobile-device penetration is a strategic priority. The Internet is seen as indispensable for education, poverty alleviation, and the efficient conveyance of government information and services to the public. The development of a vibrant, indigenous Internet and
telecommunications sector is also considered critical for China’s long-term global economic competitiveness.12 Globally, the Internet is rapidly
evolving away from personal computers and toward mobile devices, appliances, and vehicles, with the most rapid rate of growth in Internet and
mobile-phone use taking place in Africa and the Middle East. The Chinese government’s strategy is for Chinese companies to be leaders in mobile Internet innovation, particularly in the developing world. Last year,
Premier Wen Jiabao spoke on multiple occasions about the importance of
“the Internet of things,” encouraging breakthroughs by Chinese companies in what the government has designated as a strategic industry.13
Although the government has direct control over websites run by
state-operated media as well as its own national- and provincial-level
websites, by far the largest portion of the Chinese Internet is run by the
private sector (or “government-regulated commercial spaces” according
to Min Jiang’s taxonomy of Chinese deliberative digital spaces). Chinese networked authoritarianism cannot work without the active cooperation of private companies—regardless of the origin of their financing
or where they are headquartered. Every year a group of Chinese Internet executives is chosen to receive the government’s “China Internet
Self-Discipline Award” for fostering “harmonious and healthy Internet
development.”
In Anglo-European legal parlance, the legal mechanism used to implement such a “self-discipline” system is “intermediary liability.” It
is the mechanism by which Google’s Chinese search engine, Google.
cn, was required to censor itself until Google redirected its simplified
Chinese search engine offshore to Hong Kong. All Internet companies
operating within Chinese jurisdiction—domestic or foreign—are held
liable for everything appearing on their search engines, blogging platforms, and social-networking services. They are also legally responsible
for everything their users discuss or organize through chat clients and
messaging services. In this way, the government hands many censorship
and surveillance tasks to private companies that face license revocations
and forced shutdowns should they fail to comply. Every one of China’s
large Internet companies has a special department full of employees
whose sole job is to police users and censor content.
In 2008, I conducted a comparative study examining how fifteen different Chinese blog-hosting services censored user-created content. The
tests revealed that each company used slightly different methods and
approaches in its censorship. The specific content censored also varied
from service to service. In a number of tests, when I tried to post politically sensitive material such as an article about the parents of students
killed in Tiananmen Square, or a recent clash in a remote town in Western China, internal site software would block publication of the post
entirely. Other posts could be saved as drafts but were “held for moderation” until a company staffer could make a decision about whether they
should be allowed. Other postings simply disappeared within hours of
publication.
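The three outcomes observed in those tests can be summarized as a simple decision sequence. The sketch below is purely illustrative: the keyword lists, function name, and outcome labels are hypothetical assumptions for exposition, not drawn from any actual platform's software.

```python
# Illustrative sketch of the three moderation behaviors observed in the
# 2008 study of Chinese blog-hosting services. All keywords and labels
# here are hypothetical; real platforms used differing, secret lists.

BLOCK_OUTRIGHT = {"tiananmen mothers"}   # publication refused entirely
HOLD_FOR_REVIEW = {"ethnic clash"}       # saved as draft pending staff review
DELETE_AFTER = {"sensitive petition"}    # removed hours after publication

def moderate(post_text: str) -> str:
    """Return which of the three observed outcomes a post would receive."""
    text = post_text.lower()
    if any(k in text for k in BLOCK_OUTRIGHT):
        return "blocked"                 # site software refuses the post
    if any(k in text for k in HOLD_FOR_REVIEW):
        return "held"                    # a company staffer decides later
    if any(k in text for k in DELETE_AFTER):
        return "published-then-deleted"  # appears briefly, then vanishes
    return "published"

print(moderate("Remembering the Tiananmen Mothers"))  # blocked
```

The point of the sketch is that each company implemented its own variant of this logic, which is why the study found the specific censored content varying from service to service.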
Lifting the Veil
In June 2010, a report giving Internet users a peek behind the veil of
secrecy surrounding corporate complicity in Chinese Internet censorship appeared on the popular Chinese website Sina.com for a few hours
before, ironically, being censored. It quoted Chen Tong, the editor of Sina’s Twitter-like microblogging service, who described his company’s
censorship system in some detail: round-the-clock policing; constant
coordination between the editorial department and the “monitoring department”; daily meetings to discuss the latest government orders listing
new topics and sensitive keywords that must either be monitored or deleted depending on the level of sensitivity; and finally, systems through
which both editors and users report problematic content and bring it
to the attention of company censors.14 In April 2009, an employee of
Baidu, China’s leading search engine, which also runs user-generated
content services, leaked a set of detailed documents from Baidu’s internal monitoring and censorship department confirming the company’s
longstanding reputation as an industry leader not only as a search engine
and online-services company, but also in censoring both search-engine
results and user-generated content. The documents included censorship
guidelines; lists of specific topics and words to be censored; guidelines
on how to search for information that needs to be deleted, blocked, or
banned; and other internal information from November 2008 through
March 2009.15
In its efforts to manage what the Chinese people can learn, discuss,
and organize online, the government deploys a range of other tactics as
well. They include:
Cyber-attacks: The sophisticated, military-grade cyber-attacks launched
against Google in late 2009 were targeted specifically at the Gmail accounts of human-rights activists who are either from China or work on
China-related issues. Websites run by Chinese exiles, dissidents, and human-rights defenders (most of whom lack the training or resources to protect themselves) have been the victims of increasingly aggressive cyberattacks over the past few years—in some cases, compromising activists’
computer networks and e-mail accounts. Domestic and foreign journalists
who report on politically sensitive issues and academics whose research
includes human-rights problems have also found themselves under aggressive attack in China, with efforts to expose their sources, making it much
more risky to work on politically sensitive topics.
Device and network controls: In May 2009, the Ministry of Industry
and Information Technology (MIIT) mandated that by July 1 of that
year a specific software product called Green Dam Youth Escort was
to be preinstalled on all computers sold in China. While Green Dam
was ostensibly aimed at protecting children from inappropriate content,
researchers outside and within China quickly discovered that it not only
censored political and religious content but also logged user activity and
sent this information back to a central computer server belonging to the
software developer’s company. The software had other problems that
created opposition to it within U.S. companies. It contained serious programming flaws that increased the user’s vulnerability to cyber-attack.
It also violated the intellectual property rights of a U.S. company’s filtering product. Faced with uniform opposition from the U.S. computer
industry and strong protests from the U.S. government, the MIIT backed
down on the eve of its deadline, making the installation of Green Dam
voluntary instead of mandatory.
The defeat of Green Dam, however, did not diminish other efforts
to control and track Internet-user behavior at more localized levels—
schools, universities, apartment blocks, and citywide Internet Service
Providers (ISPs). In September 2009, news reports circulated that local
governments were mandating the use of censorship and surveillance
products with names such as “Blue Shield” and “Huadun.” The pur-
40
Journal of Democracy
pose of these products appeared similar to Green Dam’s, though they
involved neither the end user nor foreign companies.16 Unlike Green
Dam, the implementation of these systems has received little attention
from foreign media, governments, or human-rights groups.
Domain-name controls: In December 2009, the government-affiliated China Internet Network Information Center (CNNIC) announced
that it would no longer allow individuals to register Internet domain
names ending in “.cn.” Only companies or organizations would be able
to use the .cn domain. While authorities explained that this measure
was aimed at cleaning up pornography, fraud, and spam, a group of
Chinese webmasters protested that it also violated individual rights.
Authorities announced that more than 130,000 websites had been shut
down in the cleanup. In January 2010, a Chinese newspaper reported that self-employed individuals and freelancers conducting online
business had been badly hurt by the measure.17 In February, CNNIC
backtracked somewhat, announcing that individuals would once again
be allowed to register .cn domains, but all applicants would have to
appear in person to confirm their registration, show a government ID,
and submit a photo of themselves with their application. This eliminated the possibility of anonymous domain-name registration under
.cn and has made it easier for authorities to warn or intimidate website
operators when “objectionable” content appears.
Localized disconnection and restriction: In times of crisis, when
the government wants to ensure that people cannot use the Internet or
mobile phones to organize protests, connections are shut down entirely
or heavily restricted in specific locations. The most extreme case is in
the far-northwestern province of Xinjiang, a traditionally Muslim region
that borders Pakistan, Kazakhstan, and Afghanistan. After ethnic riots
took place in July 2009, the Internet was cut off in the entire province
for six months, along with most mobile text messaging and international
phone service. No one in Xinjiang could send e-mail or access any website—domestic or foreign. Business people had to travel to the bordering
province of Gansu to communicate with customers. Internet access and
phone service have since been restored, but with severe limitations on
the number of text messages that people can send on their mobile phones
per day, no access to overseas websites, and very limited access even
to domestic Chinese websites. Xinjiang-based Internet users can only
access watered-down versions of official Chinese news and information sites, with many of the functions such as blogging or comments
disabled.18
Surveillance: Surveillance of Internet and mobile users is conducted
in a variety of ways, contributing to an atmosphere of self-censorship.
Surveillance enables authorities to warn and harass Internet users either via electronic communications or in person when individuals are
deemed to have transgressed certain standards. Detention, arrest, or imprisonment of selected individuals serves as an effective warning to others that they are being watched. Surveillance techniques include:
“Classic” monitoring: While surveillance measures are justified to
the public as antiterrorism measures, they are also broadly used to identify and harass or imprison peaceful critics of the regime. Cybercafés—
the cheap and popular option for students and the less affluent—are required to monitor users in multiple ways, including identity registration
upon entry to the café or upon login, surveillance cameras, and monitoring software installed on computers.
“Law-enforcement compliance”: In China, where “crime” is defined
broadly to include political dissent, companies with in-country operations and user data stored locally can easily find themselves complicit
in the surveillance and jailing of political dissidents. The most notorious example of law-enforcement compliance gone wrong was when
Yahoo’s local Beijing staff gave Chinese police account information of
activist Wang Xiaoning in 2002 and journalist Shi Tao in 2004, leading
to their imprisonment. In 2006, Skype partnered with a Chinese company to provide a localized version of its Internet-based phone-calling
service, then found itself being used by Chinese authorities to track and
log politically sensitive chat sessions by users inside China. Skype had
delegated law-enforcement compliance to its local partner without sufficient attention to how the compliance was being carried out.19
“Astroturfing” and public outreach: The government increasingly
combines censorship and surveillance measures with proactive efforts
to steer online conversations. In 2008, the Hong Kong–based researcher David Bandurski determined that at least 280,000 people had been
hired at various levels of government to work as “online commentators.” Known derisively in the Chinese blogosphere as the “fifty-cent
party,” these people are paid to write posts that show their employers in
a favorable light in online chatrooms, social-networking services, blogs,
and comments sections of news websites.20 Many more people do similar work as volunteers—recruited from the ranks of retired officials as
well as college students in the Communist Youth League who aspire
to become Party members. This approach is similar to a tactic known
as “astroturfing” in U.S. parlance, now commonly used by commercial
advertising firms, public-relations companies, and election campaigns
around the world in order to simulate grassroots enthusiasm for a product or candidate. In many Chinese provinces, it is now also standard
practice for government officials—particularly at the city and county
level—to coopt and influence independent online writers by inviting
them to special conferences and press events.
The central government has also adopted a strategy of using official interactive portals and blogs, which are cited as evidence both
at home and abroad that China is liberalizing. In September 2010, the
CCP launched an online bulletin board called “Direct to Zhongnanhai,” through which the public was invited to send messages to China’s
top leaders. Since 2008, President Hu Jintao and Premier Wen Jiabao
have held annual “web chats” with China’s “netizens.” An official
“E-Parliament” website, on which citizens are invited to post policy
suggestions to the National People’s Congress, was launched in 2009.
The 2010 official government white paper lists a variety of ways in
which the Chinese government solicits public feedback through the
Internet. It states: “According to a sample survey, over 60 percent of
netizens have a positive opinion of the fact that the government gives
wide scope to the Internet’s role in supervision, and consider it a
manifestation of China’s socialist democracy and progress.”21
All of this is taking place in the context of the Chinese government’s
broader policies on information and news control. In December 2009,
the Committee to Protect Journalists listed China as the world’s worst
jailer of journalists. In recent testimony before the U.S. Congress, Joshua Rosenzweig of the Dui Hua Foundation presented an array of statistics to support a grim conclusion:
Over the past two-and-a-half years in particular, roughly since the beginning of 2008, there has been a palpable sense that earlier progress towards
rule of law in China has stalled, or even suffered a reversal, and there is
mounting evidence that a crackdown is underway, one particularly targeting members of ethnic minorities, government critics, and rights defenders.22
Thus online public discourse is indeed expanding—with government
encouragement. The government is creating and promoting the impression both at home and abroad that China is moving in the direction of
greater democracy. At the same time, the Chinese people’s ability to
engage in serious political dissent or to organize political movements
that might effectively challenge the CCP’s legitimacy has actually diminished, and the consequences for attempting such activities are more
dire than they were ten years ago.
Networked Authoritarianism Beyond China
In their most recent book surveying Internet censorship and control around the world, Ron Deibert and Rafal Rohozinski warn that
“the center of gravity of practices aimed at managing cyberspace has
shifted subtly from policies and practices aimed at denying access to
content to methods that seek to normalize control and the exercise
of power in cyberspace through a variety of means.” This article has
described a range of ways in which China is near the forefront of this
trend. Deibert and Rohozinski divide the techniques used by governments for Internet censorship and control into three “generations”: The
“first generation” of techniques focuses on “Chinese-style” Internet
filtering and Internet-café surveillance. “Second-generation” techniques include the construction of a legal environment legitimizing
information control, authorities’ informal requests to companies for
removal of information, technical shutdowns of websites, and computer-network attacks. “Third-generation” techniques include warrantless
surveillance, the creation of “national cyber-zones,” state-sponsored
information campaigns, and direct physical action to silence individuals or groups.23
While Deibert and Rohozinski characterize Chinese cyber-controls
as being largely first generation, the Chinese government aggressively
uses all the second- and third-generation techniques and has been doing
so for quite some time. Indeed, the second- and third-generation techniques are essential because the Great Firewall alone is ineffective and
permeable.
Deibert and Rohozinski point out that a number of governments, particularly those in Russia and several former Soviet republics, have bypassed the first-generation controls almost completely and instead are
concentrating their energies on second- and third-generation controls,
most of which (with the jarring exception of “direct physical action to
silence individuals or groups”) are more subtle, more difficult to detect,
and more compatible with democratic or pseudodemocratic institutions.
The Russian-language Internet, known by its denizens as “RUNET,” is thus on the cutting edge of techniques aimed at controlling online speech with little or no direct filtering.24
Research in the Middle East and North Africa shows that while Internet filtering is more common and pervasive throughout that region,
governments are increasing the use of second- and third-generation
techniques. Many governments in the region have cracked down on
online dissent through the skillful use of family-safety measures and
antiterrorism laws. At the same time, they have made substantial investments in Internet and telecommunications infrastructure, recognizing that connectivity is essential for economic success.25
Some second- and third-generation controls are also used by democratically elected governments, including those of South Korea and
India.26 Intermediary censorship is deployed in a range of political
systems to silence antiregime speech, fight crime, or protect children.
The concept of holding service providers liable has become increasingly popular among lawmakers around the world, including in Western
Europe—where the main goals are to combat intellectual-property theft
and protect children. In the United States, activists are concerned about
the weakening of due process, which has allowed government access
to networks owned and run by corporations, all in the name of combating cyber-crime and cyber-warfare. Even the Chinese government has adopted a very similar language of cyber-security to justify its Internet-control structures and procedures. Deibert and Rohozinski are right to
warn that “many of the legal mechanisms that legitimate control over
cyberspace, and its militarization, are led by the advanced democratic
countries of Europe and North America.”27
Chinese authoritarianism has adapted to the Internet Age not merely through the deployment of Internet filtering, but also through the
skilled use of second- and third-generation controls. China’s brand of
networked authoritarianism serves as a model for other regimes, such
as the one in Iran, that seek to maintain power and legitimacy in the Internet Age. In Russia and elsewhere there is a further, disturbing trend:
Strong governments in weak or new democracies are using second- and
third-generation Internet controls in ways that contribute to the erosion
of democracy and slippage back toward authoritarianism. This situation
is enabled by a weak rule of law, lack of an independent judiciary, weak
guarantees for freedom of speech and other human-rights protections,
heavy or untransparent regulation of industry (particularly the telecommunications sector), and weak political opposition that is rendered even
weaker by clever manipulation of the media, legal system, and commercial-regulatory system.
It is clear that simply helping activists to circumvent first-generation censorship and training them in the use of new technologies
for digital activism without also addressing the second- and third-generation controls deployed by their governments is insufficient,
sometimes counterproductive, and potentially dangerous for the individuals involved. Weak rule of law and lack of accountability and
transparency in the regulation of privately owned and operated Internet platforms and telecommunications networks facilitate the use
of second- and third-generation controls, which pose a great threat
to activists. Therefore, strong advocacy work at the policy and legislative level aimed at improving rule of law, transparency, and accountability—in government as well as the private sector—is more
important than ever.
The business and regulatory environment for telecommunications
and Internet services must become a new and important focus of human-rights activism and policy. Free and democratic political discourse
requires Internet and telecommunications regulation and policy making that are transparent, accountable, and open to reform both through
independent courts and the political system. Without such baseline
conditions, opposition, dissent, and reform movements will face an increasingly uphill battle against progressively more innovative forms of
censorship and surveillance.
NOTES
1. Hillary Rodham Clinton, “Remarks on Internet Freedom,” Washington, D.C., 21
January 2010; available at www.state.gov/secretary/rm/2010/01/135519.htm.
2. “Chinese State Security Arrests, Indictments Doubled in 2008,” Dui Hua Human
Rights Journal, 25 March 2009; available at www.duihua.org/hrjournal/2009/03/chinese-state-security-arrests.html.
3. Larry Diamond, “Liberation Technology,” Journal of Democracy 21 (July 2010): 82.
4. Lokman Tsui, “The Great Firewall as Iron Curtain 2.0: The Implications of China’s
Internet Most Dominant Metaphor for U.S. Foreign Policy,” paper presented at the sixth
annual Chinese Internet Research Conference, Hong Kong University, 13–14 June 2008;
available at http://jmsc.hku.hk/blogs/circ/files/2008/06/tsui_lokman.pdf.
5. Guobin Yang, The Power of the Internet in China: Citizen Activism Online (New
York: Columbia University Press, 2009), 213.
6. Jack Linchuan Qiu, Working-Class Network Society: Communication Technology
and the Information Have-Less in Urban China (Cambridge: MIT Press, 2009), 243.
7. Yongnian Zheng, Technological Empowerment: The Internet, State, and Society in
China (Stanford: Stanford University Press, 2008), 164–65.
8. Nicholas D. Kristof, “Death by a Thousand Blogs,” New York Times, 24 May 2005;
available at www.nytimes.com/2005/05/24/opinion/24kristoff.html.
9. Baogang He and Mark Warren, “Authoritarian Deliberation: The Deliberative Turn
in Chinese Political Development,” paper presented at the Annual Meeting of the American Political Science Association, Boston, 28–31 August 2008; forthcoming, Perspectives
on Politics, June 2011.
10. Min Jiang, “Authoritarian Deliberation on Chinese Internet,” Electronic Journal of
Communication 20 (2010); available at http://papers.ssrn.com/sol3/papers.cfm?abstract_
id=1439354.
11. Rebecca MacKinnon, “Chinese Internet Research Conference: Getting Beyond
‘Iron Curtain 2.0,’” RConversation, 18 June 2008; available at http://rconversation.blogs.
com/rconversation/2008/06/chinese-inter-1.html.
12. “The Internet in China,” Information Office of the State Council of the People’s
Republic of China (SCIO), 8 June 2010; available at http://china.org.cn/government/
whitepaper/node_7093508.htm.
13. Robert McManus, “Chinese Premier Talks Up Internet of Things,” ReadWriteWeb,
19 January 2010; available at www.readwriteweb.com/archives/chinese_premier_internet_of_things.php.
14. Jonathan Ansfield, “China Tests New Controls on Twitter-Style Services,” New
York Times, 16 July 2010; available at www.nytimes.com/2010/07/17/world/asia/17beijing.
html. The full Chinese-language text of the report (which was deleted by censors from the
original source) was reproduced by Radio France Internationale at www.chinese.rfi.fr.
15. Xiao Qiang, “Baidu’s Internal Monitoring and Censorship Document Leaked,”
China Digital Times, 30 April 2009; available at http://chinadigitaltimes.net/2009/04/
baidus-internal-monitoring-and-censorship-document-leaked/.
16. Owen Fletcher, “China Clamps Down on Internet Ahead of 60th Anniversary,”
IDG News Service, 25 September 2009; available at www.pcworld.com/article/172627/
china_clamps_down_on_internet_ahead_of_60th_anniversary.html; and Oiwan Lam,
“China: Blue Dam Activated,” Global Voices Advocacy, 13 September 2009; available at
http://advocacy.globalvoicesonline.org/2009/09/13/china-blue-dam-activated.
17. Oiwan Lam, “China: More than 100 Thousand Websites Shut Down,” Global
Voices Advocacy, 3 February 2010; available at http://advocacy.globalvoicesonline.
org/2010/02/03/china-more-than-100-thousand-websites-shut-down.
18. Josh Karamay, “Blogger Describes Xinjiang as an ‘Internet Prison,’” BBC News, 3
February 2010; available at http://news.bbc.co.uk/2/hi/asia-pacific/8492224.stm.
19. Nart Villeneuve, “Breaching Trust: An Analysis of Surveillance and Security Practices on China’s TOM-Skype Platform,” Open Net Initiative and Information Warfare
Monitor, October 2008; available at: www.nartv.org/mirror/breachingtrust.pdf.
20. David Bandurski, “China’s Guerilla War for the Web,” Far Eastern Economic
Review, July 2008.
21. SCIO, “The Internet in China.”
22. Joshua Rosenzweig, “Political Prisoners in China: Trends and Implications for
U.S. Policy,” Testimony to the Congressional-Executive Committee on China, 3 August
2010; available at www.cecc.gov/pages/hearings/2010/20100803/statement5.php.
23. Ronald Deibert and Rafal Rohozinski, “Control and Subversion in Russian Cyberspace,” in Ronald Deibert et al., eds., Access Controlled: The Shaping of Power, Rights,
and Rule in Cyberspace (Cambridge: MIT Press, 2010), 23.
24. Deibert and Rohozinski, “Control and Subversion in Russian Cyberspace,” in Access Controlled, 15–34.
25. “MENA Overview,” Access Controlled, 523–35.
26. Michael Fitzpatrick, “South Korea Wants to Gag the Noisy Internet Rabble,”
Guardian.co.uk, 9 October 2008; available at www.guardian.co.uk/technology/2008/
oct/09/news.internet; and John Ribeiro, “India’s New IT Law Increases Surveillance
Powers,” IDG News Service, 27 October 2009; available at www.networkworld.com/
news/2009/102709-indias-new-it-law-increases.html.
27. Deibert and Rohozinski, “Beyond Denial: Introducing Next-Generation Information Access Controls,” 6.
Regional Variation in Chinese Internet Filtering
Joss Wright
Oxford Internet Institute,
University of Oxford
joss.wright@oii.ox.ac.uk
September 11, 2012
Abstract
Internet filtering in China is a pervasive and well-reported phenomenon and, as arguably the most
extensive filtering regime in the world today, has been studied by a number of authors. Existing studies,
however, have considered both the filtering infrastructure and the nation itself as largely homogeneous in
this respect. In order to gain a deeper understanding of Chinese internet filtering, its practical effects and its
social implications, it is crucial to understand in detail the extent to which filtering varies across China, and
amongst its hundreds of millions of internet users.
This work investigates regional variation in filtering across China through direct access to internet services distributed across the country. This is achieved through use of the Domain Name Service, or DNS, which
provides a mapping between human-readable names and machine-routable IP addresses on the internet, and
which is thus a critical component of internet-based communications. Due to the key role that this service
plays, its manipulation is a common mechanism used by states and institutions to hamper access to internet
services that have been deemed undesirable.
Through access to a range of publicly available DNS servers located across China, obtained from the
Asia-Pacific Network Information Centre (APNIC), we query data concerning known blocked websites.
The results of these queries are compared against canonical results from unfiltered connections, allowing
the detection of any tampering or blocking. By combining these results with the geographical location of
each server, the nature of DNS-based filtering experienced in various parts of China can be analysed and
mapped.
These experiments demonstrate that filtering varies widely across China, and occurs in several forms. Queries concerning blocked websites may result in valid responses, in faked responses that redirect to other servers, or in no response at all. We observe that individual servers do not typically respond in the same way for all blocked requests; nor, despite significant variation, is there a discernible overall geographic pattern to the nature or extent of filtering.
Our results support the hypothesis that, despite typically being considered a monolithic entity, the Golden
Shield is better understood as a decentralised and semi-privatised operation in which low-level filtering
decisions are left to local authorities and organisations. As such, the filtering experienced by a given citizen
may vary wildly in the blocking technique employed, the ease by which the block can be bypassed, the level
of collateral blocking that occurs, and the specific sites blocked at a given time.
This article provides a first step in understanding how filtering affects populations at a fine-grained level, and moves toward a more subtle understanding of internet filtering than one based on the broad criterion of nationality. The techniques employed in this work, while here applied to geographic criteria, provide an
approach by which filtering can be analysed according to a range of social, economic and political factors in
order to more fully understand the role that internet filtering plays in China, and around the world.
1 Introduction
Many nations around the globe participate in some form of internet filtering (Deibert et al. 2008). Whilst
filtering and censorship can, to an extent, be open and transparent, their nature tends towards secrecy. In
order to understand the extent and nature of filtering around the world, we desire the ability to experience
directly the limitations imposed on these internet connections.
Mapping filtering at the national level, however, captures only its crudest form. Whilst many states have
national filtering policies, there is some evidence that the specific implementation of these may vary from
region to region, from ISP to ISP and even from computer to computer. In order to fully understand
filtering and its role in the globally networked world, it is extremely useful to explore connectivity at a more
geographically and organisationally fine-grained level.
To this end, it is desirable to experience the Internet as viewed by a computer in a location of interest.
There are a number of existing services specifically designed to allow this: virtual private network, or VPN,
software and other proxy services allow remote computers to route their connections through a given remote
network, and the well-known Tor (Dingledine et al. 2004) anonymising network provides a similar service
specifically aimed at bypassing national-level filtering.
For the purposes of wide-scale research, however, many of these services are relatively rare and require
explicit access. Further, many of these services are employed directly to avoid filtering and thus to allow
filtered users to access unfiltered connections. Clearly, such a service is less likely to exist on heavily filtered
connections. In deliberately investigating filtered connections, it may be necessary also to explore other forms
of information.
2 Motivation
There are a number of technical approaches to internet filtering employed around the world, varying in their severity and extent. The most well-known filtering regime is almost certainly China’s Golden Shield
Project (金盾工程 jīndùn gōngchéng) or ‘Great Firewall’, which represents arguably the largest and most
technologically advanced filtering mechanism in use today at the national level.
Despite the technological sophistication of the Chinese national firewall, it is subject to a number of
limitations. With a total population of roughly 1.3 billion and an internet population estimated at 513 million in 2011 (CNNIC n.d.), the number of Chinese internet users is comparable to the entire population of the European Union. At such a scale, economies must be made in the mechanisms of filtering in order
to limit the required computational and human resources to a manageable level. An excellent study of some of the major techniques underlying the Chinese national firewall was presented by Clayton et al. (2006).
Many other countries, however, perform internet filtering with significantly lower resources and technical
investment. Technologies range from crude blocking of large portions of the internet, to sophisticated and
subtle blocking of specific content. A global view of internet filtering has been comprehensively presented in
Deibert et al. (2008). This work is notable not just for its scope, but for its focus on the sociological as well as
technical aspects of filtering, covering the nature of filtered topics and the levels of state transparency in the
filtering process.
At a national level, however, filtering beyond crude mechanisms is often considered infeasible due not only to the computational but also to the organisational requirements of such systems; even if sufficient technological
resources are available, the dynamic nature of the internet imposes a significant administrative burden in
maintaining up-to-date filtering rules.
In solving this second problem, states may choose to provide broader filtering guidelines to be implemented by local authorities or individual service providers, resulting in potential differences between the filtering experienced by users in different geographical locations or those using different providers. It is also
possible, and has been observed in a number of cases, that a state may deliberately choose to restrict internet
services to a greater or lesser extent in certain locations as a result of unrest or disaster.
To understand the technologies employed by states in filtering the internet, and the decisions behind
this filtering, we therefore see great interest in studying the extent and nature of filtering at a regional and
organisational, rather than national, level. We believe that this will provide a much more sophisticated picture
of filtering around the globe, and provide a valuable source of information for internet researchers.
3 Filtering Technologies
The development of the internet was neither carefully planned, nor accurately predicted. It has expanded
through the accretion of protocols, services and applications that have been extended and improved far beyond
their original purpose. As such, many of the protocols provide opportunities both for filtering technologies,
and for attempts to bypass or study those technologies.
There are a number of methods applied to filter internet connections at a national level. These have been usefully categorised by Murdoch & Anderson (2008) as follows:
• TCP/IP Header Filtering: IP, the Internet Protocol, is the fundamental protocol by which traffic
passes across the internet, encoded in IP packets. Filtering may occur via inspection of the header of
an IP packet, which details the numerical address of the packet’s destination. Packets may therefore be
filtered according to lists of banned destination IP addresses. This method is simple and effective, but difficult to maintain due to the potential for services to change, or to have multiple, IP addresses. This approach may also incur significant collateral damage in the case of services that share IP addresses,
causing multiple innocent services to be blocked along with the desired target.
• TCP/IP Content Filtering: Rather than inspecting the header, a filter may search the content of traffic for banned terms. This is a far more flexible approach to filtering, allowing packets to be blocked only if they include banned keywords or match the traffic patterns of particular applications. This approach is also known as deep packet inspection, and is known to be employed to some extent by the Chinese national firewall. Deep packet inspection can be partially defeated by using encrypted connections; however, filters may choose simply to block all encrypted connections in response, or to block traffic according to identifying traffic signatures that can occur even in encrypted protocols. The most significant limitation of this approach is that inspection of traffic content comes at a significant computational cost.
• DNS Tampering: The DNS protocol maps human-readable names to IP addresses on the internet, and is thus critical for most user-focused services such as the web. By altering DNS responses, returning either empty or false results, a filter can simply and cheaply block or redirect requests. This mechanism is simple to employ and maintain, but limits filters to blocking entire websites, and can be relatively easy to bypass for technical users. This approach is typically employed as a first step toward web-based filtering, due to its low resource requirements and in spite of its ease of bypass; however, it has been noted that states typically graduate to more sophisticated filtering techniques over time (Deibert et al. 2008).
• HTTP Proxy Filtering: A more sophisticated approach is to pass all internet traffic through an intermediary proxy server that fetches and, typically, caches information for users. This is a common internet service that can be used to speed up internet connections and reduce traffic. A suitably enabled proxy can, however, employ sophisticated filtering on certain destinations, whilst leaving other connections alone. This approach can, by ignoring the majority of traffic, be efficient on a national scale while still allowing for detailed filtering similar to TCP/IP content filtering.
• Other Approaches: A variety of other means can be taken to regulate content on the internet. States can request that websites are removed from the internet, either by taking down their servers or by removing their names from the global DNS records. A state may also choose not to block a connection entirely, but to slow any connection to that site to unusable levels. At a less technical level, legal and social constraints can be imposed to make accessing certain services illegal or socially unacceptable.
It has been noted in Deibert et al. (2008) that many states begin by employing IP header filtering before
moving on to more sophisticated methods as citizens protest the limiting of their connections. In the case of
sophisticated national-level connections it is likely that a combination of these methods will be employed in
order to meet the various constraints of large-scale filtering.
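The taxonomy above suggests a simple decision procedure for interpreting what a measurement point observes. The sketch below is an illustrative simplification, not drawn from the paper: the function name, category labels, and inputs are invented for this example. It compares a resolver's answers against a known-unfiltered reference, plus the result of a direct connection attempt, to distinguish DNS tampering from IP-level blocking.

```python
# Classify an observed response pattern into (a subset of) the filtering
# categories described by Murdoch & Anderson. Illustrative sketch only:
# the categories and decision rules are simplified assumptions.

def classify_block(dns_answers, reference_answers, tcp_connected):
    """Guess which filtering technique explains an observation.

    dns_answers       -- IP strings returned by the resolver under test
    reference_answers -- IP strings from a known-unfiltered resolver
    tcp_connected     -- whether a direct TCP connection to a reference
                         address succeeded from the same vantage point
    """
    if not dns_answers:
        # Empty or withheld DNS response: consistent with DNS tampering.
        return "dns-tampering (no answer)"
    if not set(dns_answers) & set(reference_answers):
        # Answers disagree entirely with the unfiltered view:
        # consistent with a faked, redirecting DNS response.
        return "dns-tampering (faked answer)"
    if not tcp_connected:
        # The name resolves correctly but packets to the real address
        # fail: consistent with TCP/IP header filtering.
        return "ip-blocking"
    return "no filtering observed"
```

A real tool would of course need repeated trials and deep-packet-inspection checks before drawing conclusions; this only shows how the categories partition the observation space.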
4 Mapping Filtering
A number of projects exist that provide insight into internet censorship around the world, both from the perspective of discovering filtered sites and keywords, and from the practical concern of bypassing filtering. The most thorough study of global internet filtering is from Deibert et al. (2008), who present an in-depth global
study of tools and techniques of filtering. The related Herdict project (The Herdict Project n.d.) allows users to report apparently blocked websites, via a browser plugin, to build up a global map of filtered sites. The Alkasir project (Al-Saqaf n.d.) combines user-based reporting of blocked content with an anti-censorship
tool that attempts to penetrate such filtering.
In bypassing internet filtering, perhaps the most well-known technology internationally is the Tor project (Dingledine
et al. 2004), which allows users to reroute their connections through a global network of volunteer-run
anonymising proxy servers. This network, originally designed to preserve the connection-level privacy of
users, was found to be an excellent tool for bypassing national filtering and now invests significant resources
in supporting this use. Similar tools include Psiphon (Psiphon Inc. n.d.) as well as numerous Virtual Private
Network (VPN) servers that allow users to evade national filters. All of these services work in a similar
manner: by rerouting a connection through a server located in a different country, the user experiences the
internet as if their connection originated in that country. Thus, a user from Saudi Arabia is able to route their connection through a US computer and bypass all filtering imposed by their state or organization, at the cost of some slowing of their connection and of becoming subject to any filtering or surveillance imposed by the US or the provider of the proxy.
From these examples, we can observe two major possibilities for studying internet filtering. The first is to ask users in a given country to report their experience, as exemplified by the Herdict project; the second is to
make use of an available service, such as a Tor node, in that country to experience the filtering directly. Both
of these approaches have limitations that we explore in detail below.
Fundamentally, both of the aforementioned approaches suffer from a lack of availability that we see no
easy way to avoid. In requesting users to directly report their experiences, Herdict relies on reaching interested
and informed users. Tor relies on technically knowledgeable users to set up relays that require both significant
resources and a willingness to face potentially serious legal issues (The Electronic Frontier Foundation n.d.). In particular, at the time of writing the Tor network does not report any publicly available servers in China [1].
The advantage of using a system such as Tor, Psiphon or VPN services is that they allow a researcher directly to control the flow of traffic. Sites of interest and even specific patterns of traffic can be directly sent and examined. This allows for a much more detailed examination of the technical measures employed on a given network. The approach taken by Herdict, however, cannot currently reproduce this level of sophistication. In the absence of a large network of experienced and technically capable users, user-level reporting only establishes that a site appears to be unavailable, without reference to the conditions that cause the unavailability [2].
In order to achieve the fine-grained mapping of filtering that we desire, there are two major points of
interest beyond those commonly considered by the most well-known current mapping projects. The first of these is the precise geographical location of a particular computer. The ability to determine the originating
country of an IP address is relatively well known, and location to the level of an individual city can be achieved
with some accuracy. Recent results from Wang et al. (2011) have proposed mechanisms that achieve a median
accuracy of 690 metres, albeit within the US. This simple extension, we propose, would provide a valuable
source of data on the applications of filtering. In many cases it is also possible to determine which organisation
has been allocated any particular IP address, to the level of an ISP or major company. Both of these pieces of
information can be used to build up a much more detailed view of filtering.
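As a sketch of how such location data could feed a more detailed view, the following aggregates per-server observations into per-location blocking rates. The locations and measurements are invented for illustration; a real study would derive them from a city-level geolocation database of the kind discussed above.

```python
from collections import defaultdict

# Aggregate per-location filtering observations into block rates.
# Hypothetical helper for illustration, not part of the paper's tooling.

def block_rates(observations):
    """observations: iterable of (location, was_blocked) pairs."""
    totals = defaultdict(int)
    blocked = defaultdict(int)
    for location, was_blocked in observations:
        totals[location] += 1
        if was_blocked:
            blocked[location] += 1
    # Fraction of probed requests that appeared blocked, per location.
    return {loc: blocked[loc] / totals[loc] for loc in totals}

rates = block_rates([
    ("Beijing", True), ("Beijing", True),
    ("Shanghai", True), ("Shanghai", False),
])
```

The same grouping could equally be keyed by ISP or allocated organisation rather than city, matching the organisational granularity proposed in the text.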
The second point of interest is to study, in detail, the technical nature of the filtering that is imposed on a
given connection in a given location. While work has been conducted into specific methods, as in the work
of Clayton et al. (2006) relating to the Chinese national filter, most large-scale projects appear to be focused
more on the existence of filtering than on the details of its implementation.
4.1 Extending Reporting Approaches
The approach taken by the Herdict project, which relies on volunteer participation to gather data, can be highly effective if sufficient volunteers can be found. Herdict currently provides a webpage that attempts to direct a user’s browser to load a random potentially-blocked site, and to report their experience. The project also makes available a web browser plugin that allows users to report sites that appear blocked. By focusing on the web browser environment, Herdict greatly reduces the effort required for user participation. The importance of this approach to usability, and the trust implicitly gained through the familiarity of the web browser, should not be overlooked.

[1] Specifically, there are no announced exit nodes, which would be the most feasible way to examine network filtering, reported as located on the Chinese mainland.

[2] The Herdict project does allow a user to express their opinion as to the cause of the blocking, but in the absence of direct experimentation this data has significant limitations.
This volunteer approach could naturally be extended to the use of more sophisticated tools to detect the presence of filtering automatically and, where possible, test the mechanisms employed. The detection of DNS
filtering, IP blocking and even deep packet inspection is often simple enough in itself, particularly when the
results of requests can be compared against reference requests made in other countries. It is, however, much
more difficult to discover specifics of filtering mechanisms without direct, interactive access to the filtered
network connection.
5 Ethics and Legality
While many technical approaches, and challenges, exist for mapping global filtering, there are a number of serious legal and ethical issues to be faced when performing this research.
Deliberate misuse of a network service, for the purposes of detecting internet filtering, may be illegal in
many jurisdictions; such misuse without a user’s consent may well be unethical. Even when using openly
available and general purpose services, however, there are serious considerations when attempting to access
blocked content via a third party.
In many situations, a user is unlikely to face repercussions for being seen to be attempting to access blocked content. The scale of internet use, even in smaller countries with low internet penetration rates, is
simply too high for there to be serious policing of users who request filtered content. It is likely that, in the
vast majority of cases, such attempts may not be logged at all. However, users in specific contexts may be put
at risk.
The legality of attempting to access filtered content is also a concern. Many nations have somewhat loosely defined computer crime laws, and often prefer to prosecute crimes involving computers under existing legislation rather than through the creation of new laws (Lessig 2006). The legal status of attempting to access blocked content, however, and of attempting to bypass such blocks, is not something a researcher can afford to ignore.
From the point of view of a researcher, these concerns are exacerbated by two factors: the concentrated attempts to access filtered content that are caused by a detection tool, and the wide variety of laws and social
conventions that are encompassed by researching a global phenomenon such as censorship.
By their nature, the filtering detection mechanisms that we have discussed, and any that we can feasibly
imagine, detect filtering through attempts to access filtered content: websites or IP addresses that are known,
or are believed or likely, to be banned. As we have stated above, it would be largely impractical for a state to
take note of every blocking action taken by their filter; it is possible, however, that sufficiently high-volume
requests for banned content may be considered worthy of further action. A user innocently aiding a researcher
in mapping censorship, resulting in their computer suddenly attempting to connect to all forms of banned
content, may find themselves under highly unwelcome scrutiny.
It is also of great concern that a researcher not cause a user to unwittingly break the law with respect to the content that the researcher directs them to access. With the wide global variance in law, great care must be taken that a censorship-detection tool not attempt to access content that is directly illegal. Pornography, particularly with
respect to those under the local age of legal consent, lèse majesté and insults to religion are all sensitive issues
that vary widely between cultures.
Volunteers who participate in research of this nature by running a filtering detection tool must do so
having been fully informed as to the nature of the tool and the potential risks involved. This places a
significant added burden on the researcher to explain to the participant, who may well not have any
significant level of technical expertise, what the tool will do and what particular risks they run.
In the case of relay services, such as Tor or Psiphon, consideration must be given to the safety and security
of the user operating the service. Due to their nature these services are frequently abused, and their operators
must be prepared to defend their operation of the service. The Tor Project, in particular, invests
significant effort in education both for operators and for users. This does not, however, reduce the burden
on a researcher taking advantage of such a service to ensure that they do not harm or endanger the operator
through their actions.
6 Experiments
To conduct a direct investigation of regional variations in censorship across China, access through the DNS
service was selected as the most appropriate mechanism. DNS provides a number of attractive features for
technical as well as legal and ethical reasons.
At a technical level, due to their crucial role in resolving names to IP addresses, DNS servers are both
common and widespread, providing the desired level of coverage across the country to an extent that is
difficult to match with other approaches. DNS servers are also often openly accessible, meaning that there is
no technical restriction on making requests to these remote systems.
DNS servers are also attractive because they are typically run either by internet service providers for the benefit
of their customers, or by large organizations that run their own networks. As a result, the responses
returned by a given DNS server typically reflect the view of the internet, at the level of DNS, of a reasonably
large class of users.
From a legal and ethical point of view, DNS servers have the advantage of functioning, at an extremely
simple level, as a database of mappings between domain names[3] and IP addresses. As such, requesting
information regarding a given mapping between a domain and an address does not cause any direct access to
potentially sensitive resources on behalf of a third party[4], as would be the case for the proxy services mentioned
above.
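At the wire level, such a request is a small UDP datagram sent to port 53. The following is an illustrative sketch only, not code from the experiments themselves: it constructs a minimal RFC 1035 A-record query, with a function name and default transaction ID of our own choosing.

```python
import struct

def build_dns_query(hostname, txid=0x1234):
    """Construct a minimal DNS query packet (RFC 1035 wire format)
    for an A record, suitable for sending over UDP to port 53."""
    # Header: ID, flags (recursion desired), QDCOUNT=1, AN/NS/ARCOUNT=0
    header = struct.pack(">HHHHHH", txid, 0x0100, 1, 0, 0, 0)
    # Question name: length-prefixed labels, terminated by a zero byte
    qname = b"".join(
        bytes([len(label)]) + label.encode("ascii")
        for label in hostname.split(".")
    ) + b"\x00"
    # QTYPE=1 (A record), QCLASS=1 (IN)
    question = qname + struct.pack(">HH", 1, 1)
    return header + question
```

Sending this payload to an open resolver and parsing the reply is all that a DNS-based measurement requires; no connection to the (potentially sensitive) website itself is ever made.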
To obtain a useful sample for investigating DNS censorship across China, a list of DNS servers was
retrieved from the Asia Pacific Network Information Centre (APNIC), the Regional Internet Registry responsible
for allocating IP addresses and Autonomous System (AS) numbers across the Asia Pacific region. This
organization maintains a database, known as a WHOIS database, that stores information regarding registered
domain names in its region, including the authoritative DNS servers for each domain. From the WHOIS
records, a list of 278 DNS servers apparently located in China, according to our geolocation service, was
retrieved, of which 187 were found to be available and responsive to remote queries.
In order to incorporate geographical information in our results we make use of the freely available MaxMind
GeoIP database (MaxMind Inc. n.d.) to resolve IP addresses to the city level with a tolerable level of accuracy.
This allows us to identify the location of almost all DNS servers in our test set. It is worth noting, as we
shall discuss later, that this does not represent the location of the users of the service; these users make DNS
requests from their home network connections, and could potentially be located in almost any geographical
location, but will in practice almost certainly be within China.
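The mechanism behind such city-level lookups can be sketched briefly: GeoIP-style databases are, in essence, a binary search over sorted address ranges. The ranges and cities below are invented for illustration (chosen to resemble servers discussed later) and are not drawn from the MaxMind data.

```python
import bisect
import ipaddress

# Invented example entries; the real MaxMind database holds millions.
# Each entry: (first IP of a range, as an integer, and its city).
_RANGES = sorted([
    (int(ipaddress.IPv4Address("58.20.0.0")), "Changsha"),
    (int(ipaddress.IPv4Address("121.101.208.0")), "Chaoyang"),
    (int(ipaddress.IPv4Address("202.95.0.0")), "Beijing"),
])
_STARTS = [start for start, _ in _RANGES]

def city_for_ip(ip):
    """Resolve an IP to a city by binary search over sorted range
    starts, the lookup mechanism GeoIP-style databases rely on."""
    addr = int(ipaddress.IPv4Address(ip))
    idx = bisect.bisect_right(_STARTS, addr) - 1
    if idx < 0:
        return None  # address falls before every known range
    return _RANGES[idx][1]
```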
We made use of the Herdict project (The Herdict Project n.d.), operated by the Harvard Berkman Centre,
to retrieve a list of domain names that had been reported as blocked in China. The Herdict project uses
crowdsourced reporting to maintain a list of websites for which users have experienced some form of
filtering or censorship, sorted according to country, and thus provides a useful source of data on potentially
blocked domain names.
The Herdict project lists the most frequently reported blocked websites for each country, each list comprising
the top 80 reported domains. In addition to these, we included five popular Chinese websites that, presumably,
would not be blocked in mainland China. A full list of tested domains is given in Appendix A.
To learn the scope and scale of blocking, each potentially blocked domain name in the list retrieved from
Herdict was requested from each DNS server retrieved from the APNIC WHOIS database. These results
were recorded and analysed according to the nature of the DNS response received in each case.
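In outline, the survey is a simple cross product of servers and domains. A minimal sketch, with a hypothetical `query_fn` standing in for the actual DNS lookup used in the experiments:

```python
from itertools import product

def run_survey(servers, domains, query_fn, rounds=1):
    """Query every domain against every DNS server and record the raw
    outcomes. `query_fn(server, domain)` is a stand-in for the real
    lookup; it is assumed to return a response-type string."""
    results = {}
    for server, domain in product(servers, domains):
        results[(server, domain)] = [
            query_fn(server, domain) for _ in range(rounds)
        ]
    return results
```

With 187 servers and 85 domains, this loop produces the 15,895 (server, domain) pairs analysed in the results below.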
In order to determine whether the results returned were genuine, an equivalent query was conducted on a
self-managed DNS server located in a country that does not perform extensive internet filtering[5]. The results
of the remote query were compared heuristically[6] with the local result, and any differences were noted.

[3] Strictly speaking, DNS servers return the IP for a particular hostname, many of which may fall under a given domain. For the
purposes of this article, the two may be considered functionally equivalent, as we will not request multiple hosts for a single domain.
[4] For completeness, it should be mentioned that DNS servers function in a hierarchy, and may request information for unknown
domain names from more authoritative servers. This normal function of the service would not, however, implicate any third party, and
would in fact be directly traceable to the computer used in our experiments.
[5] In the case of the experiments detailed here, this was the United Kingdom. Whilst the United Kingdom certainly does engage in
national-scale internet filtering, care was taken that such filtering would not affect the results of these experiments.
In order to minimise the impact of genuine short-term network errors, the sequence of requests was repeated six times
at one-hour intervals. The results from the different experimental sets were combined in such a way that
timeouts, which could represent genuinely poor connectivity, were eliminated unless they were seen to be
consistent across all result sets.
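The comparison and combination steps might be sketched as follows. The function names are ours; the prefix rule follows the heuristic described in the accompanying footnote (compare the first two dotted quads), and the merge rule paraphrases the text above.

```python
def same_prefix(ip_a, ip_b):
    """Heuristic comparison: two IPs are treated as equivalent if
    their first two dotted quads (/16 prefix) match."""
    return ip_a.split(".")[:2] == ip_b.split(".")[:2]

def merge_rounds(rounds):
    """Combine repeated query rounds: a timeout is only retained if it
    occurred consistently in every round; otherwise the first
    non-timeout observation is kept."""
    non_timeouts = [r for r in rounds if r != "timeout"]
    if not non_timeouts:
        return "timeout"
    return non_timeouts[0]
```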
6.1 DNS Response Types
The categorization presented in §3 identifies four major filtering techniques, of which DNS poisoning is
one. Manipulation of DNS can, however, take a number of forms, some of which could represent either
censorship or genuine errors. The most important behaviours of a DNS server, for the purposes of this article,
are discussed here.
• Invalid Server Errors
A DNS server, on receiving a given query, may respond with an indication that it is not, in fact, a DNS
server. In this case, the requesting party will not receive a mapping from the requested name to an
IP address, and thus cannot proceed with making a connection. Such a response could, of course, also
indicate that the requested party genuinely is not a DNS server.
• Timeout Errors
A simpler behaviour, and one that is harder to categorize unambiguously as censorship, is for a DNS
server to accept requests but not to respond in any way for blocked domains. Eventually, the requesting
party will exceed a given time threshold and abandon the query. This again prevents the client from
learning the IP address of the requested host, and could be ascribed to a genuine network error. A
secondary effect of this approach is that the requesting party does not receive an immediate response,
which may cause internet requests to blocked sites to pause until the timeout threshold is reached.
• Unknown Domain Errors
The simplest form of direct DNS censorship is for the DNS server to deny the existence of the requested
website, causing the requesting party to receive an error. For domains known to exist, this response is
easily identifiable as malicious behaviour on the part of the server.
• Misleading Results
A more subtle approach to censorship is for requests for blocked websites to generate a valid DNS
response, providing the client with an IP address for the requested hostname, but to provide false
information in the form of an incorrect IP address.
This approach has several potential implications, which will be discussed further below. One potential
outcome is that the requesting party may be directed to a host that logs all attempts
to access banned websites, allowing a level of surveillance or monitoring of such requests.
• Genuine Results
The final possibility that we consider is that the DNS server returns the IP address that corresponds to
the requested hostname. While this particular piece of information may be accurate, censorship may, of
course, still occur through one of the other techniques discussed in §3.
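These categories suggest a simple classification step when processing raw responses. A hypothetical sketch follows; the dictionary field names are ours, not those of any particular DNS library, and the prefix rule is the /16 comparison heuristic used in our analysis.

```python
def classify_response(response):
    """Map a parsed DNS outcome to the response categories above.
    `response` is None for a timeout, or a dict sketching a reply."""
    if response is None:
        return "timeout"                # no reply before the threshold
    if response.get("not_a_nameserver"):
        return "invalid server"         # server denies being a resolver
    if response.get("nxdomain"):
        return "unknown domain"         # server claims domain not to exist
    ip = response.get("ip")
    if ip is None:
        return "no answer"              # valid reply, but no A record
    expected = response.get("expected_ip")
    if expected and ip.split(".")[:2] != expected.split(".")[:2]:
        return "misleading result"      # IP disagrees with trusted lookup
    return "genuine result"
```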
7 Results
We discuss here the results of querying DNS servers across China for reportedly banned domain names, outline
the general trends in the responses, and isolate a number of particularly unusual observed behaviours.
[6] Specifically, the first two dotted quads of the IP addresses returned by the remote and the local DNS server were compared. If these
differed, the response was marked as incorrect. In order to accommodate internet services that use content distribution networks
employing a wider range of IP addresses, the results were manually examined to detect any such networks, which were in turn whitelisted.
We accept that this automated approach allows a small chance of introducing both false positives and false negatives with respect to
the existence of misleading DNS results.
Domain               No Domain   No Answer   No Nameserver   Timeout   True IP   False IP
twitter.com                  0           1              12         6         3        165
www.backchina.com            0           0              13         7         5        162
www.youtube.com              0           0              14         7         6        160
www.ntdtv.com                0           0              23         7         0        157
www.open.com.hk              0           1              20         7         3        156
www.torproject.org           0           2              24         7         1        153
www.tibet.net                0           2              22         7         3        153
www.peacehall.com            0           1              20         7         6        153
www.facebook.com             0           1              26         7         0        153
www.6park.com                0           0              26         7         2        152

Figure 1: Ten most misdirected domains from experiments, showing DNS error result counts for each domain.
7.1 Broad Trends
Overall, experiments were conducted on 187 DNS servers across China; 178 of these servers answered at least
one query with a valid, but not necessarily truthful, IP address. Of the responding servers, 79 answered at
least one query with a response that appeared to be accurate, meaning that 99 servers returned only invalid
results for the requested domains.
A small number of servers were clearly either misconfigured or deliberately providing invalid results to
our requests. Five servers consistently timed out on DNS requests, despite an allowance for an artificially
long timeout period of 60 seconds. One server consistently produced an invalid nameserver error, despite
apparently accepting DNS requests.
We discuss below the general trends observed in the overall result sets, before moving on to specific
examples.
7.1.1 Widespread DNS Poisoning
Our experiments provide evidence of widespread manipulation of DNS results, occurring in all the forms
discussed in the previous section. Interestingly, individual DNS servers do not, in general, display consistent
blocking behaviour across all domains, but may instead return an incorrect IP address for one domain, claim
that a second domain does not exist, and refuse to respond to requests for a third.
Figure 1 shows the ten most widely misdirected domains observed in our experiments. These domains
were thus almost universally blocked across China. It should be noted that, in addition to the overwhelming
majority of misleading results for each domain, the remaining servers were likely either to time out or to claim
not to be a valid nameserver for the domain.
Figure 2 lists the ten domains most often claimed not to exist by the tested DNS servers. As can
be seen, claiming a domain to be non-existent is far less common than providing an inaccurate IP address.
It is worth noting that the domains listed in Figure 2 receive large numbers of timed-out requests, as
well as both accurate and inaccurate IP responses.
These results suggest that approaches to DNS poisoning favour misdirection of domains over claims that
the domain does not exist, and that allowing a request to time out by not responding, as opposed to generating
an error, is also a common approach.
7.1.2 Timeout Responses
The prevalence of timeouts in our results could potentially be explained by filtering occurring not directly at
the DNS servers, but instead at other points in the network.

The DNS protocol, unusually, makes use of an underlying internet transport protocol known as UDP,
as opposed to the more common TCP protocol employed by the majority of internet services such as the
world-wide web. This protocol has the advantage of higher speeds and lower transmission overheads, but
does not guarantee that data will be delivered, nor does it allow for confirmation of message delivery.
If DNS requests for particular domains were blocked or dropped in the network, it would therefore be
difficult to detect this fact; the result would be observed simply as a timeout.

Domain                     No Domain   No Answer   No Nameserver   Timeout   True IP   False IP
www.ahrchk.net                     4          17              64        40        60          2
killerjo.net                       4          17              65        37        62          2
www.x365x.com                      3          17              65        41        59          2
www.websitepulse.com               3          18              65        36        63          2
www.voanews.com                    3          17              64        38        63          2
www.tumblr.com                     3          17              64        38        37         28
www.steves-digicams.com            3          17              65        36        64          2
www.scribd.com                     3          17              65        36        38         28
www.pinyinannotator.com            3          18              67        36        61          2
www.newgrounds.com                 3          16              64        36        66          2

Figure 2: Ten domains most often claimed non-existent.
Another alternative is that the DNS servers in question, upon receiving a request for a blocked domain,
simply ignore the request. From the experiments detailed here, it is difficult to verify either of these
explanations. It seems likely, however, that filtering of DNS traffic in transit would be more complex and
costly, and would result in a more homogeneous and extensive pattern of timeouts than was observed. As
such, filtering at the server level, or some combination of the two, appears most likely.
7.1.3 Common Misleading IP Addresses
An examination of the results returned from the experiments shows that, when a DNS server returns
an IP address that does not correspond to the requested domain, the returned IP address is drawn from a
comparatively small pool of possible responses; misleading IP addresses are neither random nor chosen on
a per-server basis.
Our experiments made requests for 85 domains to 187 DNS servers, a total of 15,895 requests.
Of these, 6,658 gave a response that pointed to an IP address, 2,258 of which were judged
by our analysis to be misleading. These 2,258 misleading results each pointed to one of only 84 IP addresses,
meaning that there is significant correlation between the misleading IP addresses returned by DNS servers
across the country.
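Measuring the size of this pool is straightforward once misleading responses have been identified. A sketch (the function name is ours; the toy data in the test is invented, not the real result set):

```python
from collections import Counter

def misleading_pool(results):
    """Given (server, domain, returned_ip) triples for responses judged
    misleading, return the number of distinct IPs involved and their
    frequency ranking."""
    pool = Counter(ip for _, _, ip in results)
    return len(pool), pool.most_common()
```

Applied to the full result set, this kind of tally is what reveals that 2,258 misleading responses collapse onto only 84 distinct addresses.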
Two possible explanations exist for this result, assuming that our analysis of misleading IP addresses is
correct. The first is that a centralized list exists that provides specific DNS poisoning instructions, including
IP addresses, to DNS server operators. The second possibility is that the DNS responses observed in our
experiments, being conducted outside of China and therefore travelling across the nation's border routers,
which are known to engage in substantial filtering, were manipulated in transit. Investigation of these two
possibilities is a subject for future work.
7.2 Domain Statistics
We discuss here a number of results that exemplify observed filtering behaviour according to the particular
domains queried in these experiments.
7.2.1 Poisoning of Uncensored Domains
In addition to the list of 80 domains obtained from the Herdict project, our experiments incorporated
domain names for five popular Chinese internet services, with the expectation that these would be unfiltered.
Surprisingly, in several cases our experiments appeared to return misleading results for a number of these domains.
Server 202.95.0.10 (China, Beijing):
    renren.com.  900  IN  A  123.125.38.2
    renren.com.  900  IN  A  123.125.38.3
    renren.com.  900  IN  A  123.125.38.239
    renren.com.  900  IN  A  123.125.38.240
    renren.com.  900  IN  A  123.125.38.241

Server 121.101.208.41 (China, Chaoyang):
    renren.com.  900  IN  A  123.125.38.2
    renren.com.  900  IN  A  123.125.38.3
    renren.com.  900  IN  A  123.125.38.239
    renren.com.  900  IN  A  123.125.38.240
    renren.com.  900  IN  A  123.125.38.241

Figure 3: Inaccurate results for renren.com. Distinct servers show identical incorrect results.
We have not determined whether this represents misconfiguration of DNS servers, deliberately invalid results
returned because our requests originated outside of China, or some other cause.
An illustrative example is that of renren.com, a popular Chinese social network. In at least two cases,
invalid IP addresses were returned for this service, as shown in Figure 3. In this example, the two servers are
located in different cities and are apparently operated by separate companies; both are certainly logically
and physically distinct. Despite this, both servers return the same list of IP addresses, none of which appear
to belong to servers of renren.com.
On directly querying the addresses in question, a number of them appear to be running an unconfigured
webserver. It is not known what significance, if any, these addresses may have.
7.2.2 Purposeful Misdirection of torproject.org
The Tor Project produces a number of tools that aim to provide anonymous and untraceable internet
communications, as well as to bypass censorship. As such, both the tool and the project website are commonly
blocked in countries with extensive internet censorship.

On querying servers for the Tor Project's website, eighteen apparently unrelated servers instead returned
results for an entirely separate domain: tonycastro.com. This domain is not in any discernible way linked
to the Tor Project, except that both domain names begin with the letters 'to' and are ten characters in total,
suggesting that some form of substitution had occurred in the DNS records. While this could be pure
coincidence, the number of results from disparate servers all pointing to the same domain strongly implies
some broader connection.

The Tor Project's website appears to be the only domain on the tested list that suffers from this particularly
curious form of filtering.
7.3 Server Statistics
We examine here a number of statistics concerning the behaviour of servers across the range of queried
domains.
7.3.1 Exemplar Misleading Results
The majority of servers queried in our experiments returned a mix of result types, with varying degrees of
misleading results. A small number of DNS servers, however, demonstrated an unusually extreme range of
negative responses, and thus serve as demonstrative examples of the invalid responses given.
Server           Location             Domain of Returned IP Address
58.20.127.238    China, Changsha      tonycastro.com
61.135.238.2     China, Beijing       tonycastro.net
122.102.0.10     China, Chaoyang      tonycastro.com
159.226.161.126  China, Beijing       tonycastro.com
159.226.8.6      China, Beijing       tonycastro.net
159.226.8.6      China, Beijing       tonycastro.org.ez-site.net
202.101.98.54    (Unknown Location)   tonycastro.org.ez-site.net
202.102.134.69   China, Jinan         tonycastro.net
202.103.64.139   China, Changsha      tonycastro.net
202.103.96.66    China, Changsha      tonycastro.net
202.38.128.10    China, Beijing       tonycastro.net
202.38.64.56     China, Hefei         tonycastro.org.ez-site.net
202.95.0.10      China, Beijing       tonycastro.com
202.99.224.200   China, Baotou        tonycastro.org.ez-site.net
203.207.119.8    China, Beijing       tonycastro.com
210.45.224.1     China, Hefei         tonycastro.org.ez-site.net
220.250.64.18    China, Beijing       tonycastro.org.ez-site.net
221.5.203.99     China, Chongqing     tonycastro.org.ez-site.net
222.66.238.4     China, Shanghai      tonycastro.com

Figure 4: torproject.org requests resolving to an alternative domain.

IP Address: 113.11.192.25
This server is apparently located in Beijing. Over the course of 85 domain name requests, this server
responded with a 'no answer' reply 68 times. This included the five valid services in our test set, including
Baidu and RenRen, and may indicate discrimination against our requests due to their originating outside of
China.
A further 13 requests resulted in the return of a valid IP address. On examination, all of these IP addresses
were found to be unassociated with the requested domain. The list of domains and associated IP addresses
can be found in Figure 5.
Of interest, rather than simply the fact that a misleading IP address was returned, is the nature of the
IP addresses in question. There was no discernible pattern in these results; they point to seemingly random
hosts corresponding to domains and organisations that do not appear to have any connection with each other
or with the originally requested domain. It is notable, however, that certain of the blocked domains point to
the same IP addresses, even though those IP addresses are not related to the domain in question. As can be
seen from Figure 5, both YouTube and Facebook redirect to the same IP address, as do peacehall.com and
wujie.net, and backchina.com, boxun.com, and open.com.hk.
The remaining four domains requested from this server resulted in a claim that no such domain existed.
IP Address: 202.99.224.203
This server is apparently located in Baotou. For the majority of the 85 domains, the server responded that
it was not valid for answering DNS requests. In total, requests for 14 domains resulted in one or more IP
addresses being reported, none of which led to the appropriate servers. This behaviour, appearing invalid for
some domains while returning fake results for others, is particularly strange.
It could once more be observed that, although invalid IP addresses were returned, these were not purely
random, but instead were consistent for each domain and drawn from a small pool of IP addresses that
were used multiple times for different domains.
7.3.2 Localhost Redirection
An interesting choice of address to return when providing inaccurate IP addresses is to point the request back
to the computer from which it originated. This can be achieved through use of the special 'reserved' IP address
127.0.0.1, which also has the DNS designation 'localhost'.
Domain                  Returned IP
www.hotspotshield.com   8.7.198.45
www.tibet.net           159.106.121.75
www.boxun.com           46.82.174.68
wezhiyong.org           8.7.198.45
www.backchina.com       46.82.174.68
www.ntdtv.com           8.7.198.45
www.peacehall.com       59.24.3.173
www.youtube.com         203.98.7.65
www.facebook.com        203.98.7.65
twitter.com             159.106.121.75
www.wujie.net           59.24.3.173
www.6park.com           159.106.121.75
www.open.com.hk         46.82.174.68

Figure 5: Misleading IP addresses from a Beijing-based DNS server.
Local redirection has the advantage of not requiring a genuine IP address to be selected from the internet,
which can lead to undesirable behaviour. It also minimises traffic passing over the internet, as any further
requests made to this address remain on the user's computer without travelling over the general internet.
Despite this, the use of redirection to the localhost was not particularly widespread amongst the queried
servers. Of the 187 servers queried, only six returned results pointing to the localhost, of which four
consistently returned the localhost for any DNS query. This could represent either a misconfigured DNS
server or a blanket policy for unauthorized or non-Chinese requests.
Two servers, however, with addresses 202.99.224.200 and 202.99.224.223, returned 127.0.0.1 for the
majority of requests, but also produced an invalid nameserver error for seven domains. In 13 cases, however,
an IP response was given that again appears random, resolving to hosts owned in Azerbaijan, Ireland, the
US, Italy, and New Zealand.
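Distinguishing blanket localhost redirection from partial use is a simple check over each server's returned addresses. A hypothetical sketch (the function and category names are ours):

```python
def localhost_behaviour(returned_ips):
    """Classify a server's use of 127.0.0.1 across all queried domains:
    'blanket' if every returned IP is localhost, 'partial' if only some
    are, and 'none' otherwise."""
    hits = [ip == "127.0.0.1" for ip in returned_ips]
    if hits and all(hits):
        return "blanket"
    if any(hits):
        return "partial"
    return "none"
```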
8 Geographical Distribution
Figure 6 presents an overview of the variations in filtering observed across the various cities covered by our
experiments. Darker grey markers represent a greater percentage of misleading DNS responses compared
to accurate responses. As results were obtained for potentially many servers within a given city, the median
percentage of all results observed for all servers in the city is represented. To indicate cities with a
larger number of DNS servers, markers are scaled according to the number of servers tested, ranging from a
single server in cities such as Dongguan and Harbin, to 72 servers in Beijing.
Whilst we do not observe any overall pattern to the filtering experienced in different cities, there is clear
heterogeneity across the country. This supports the view that high-level controls over filtering are relatively
loose in terms of implementation, with the technical details of blocking being decided at the local
rather than regional or national level.
9 Experimental Limitations
There are a number of limitations to the experimental methodology that we employed in this work, which
we detail briefly here.
The first and most obvious is that the experiments relied on a restricted list of DNS servers obtained
from the APNIC WHOIS database. Whilst the set of servers used provided reasonable geographical coverage
of China, albeit with a notable bias towards the east of the country, there was a great disparity
between the number of servers observed in each city. This figure ranged from 72 servers in Beijing to only
one server in several of the smaller cities. While this will, to some extent, reflect the realities of DNS server
placement in China, it appears insufficient for a genuine analysis of the relative experience of internet users.

Figure 6: DNS queries across China, showing the median percentage of misleading results for queried domains,
with darker points representing a higher percentage of misleading results. Circle size represents the relative
number of servers queried in each city.
A more fundamental limitation is that DNS servers are not necessarily, or even usually, located in the same
geographical area as a user. A DNS server is typically operated and managed by an ISP, and made available to
its users automatically. It is therefore likely that a given ISP's customers, who may be widely dispersed, all use
the same DNS server. As such, the results presented here arguably represent organizational variation rather
than geographical variation.
Ongoing research is being conducted into means of extending the approach presented here to allow direct
remote IP scanning to detect reachability between two remote systems. This approach, by revealing actual
traffic flows rather than the simple resolution information returned by DNS queries, allows a much deeper and
more effective analysis of filtering behaviour. In particular, recently discovered forms of remote scanning (Ensafi
et al. 2010) provide a promising avenue for this work, and would largely eliminate the previously mentioned
limitations.
Further, the results in this article represent a snapshot taken in mid-2012, and as such cannot reflect the
changing patterns of censorship. Given the automated nature of these tests, however, and the relatively short
time required to conduct them, the gathering of time-series data is a relatively small step, and has the potential
to reveal useful patterns of censorship over time.
The final major limitation of this work is that it provides a purely technical view of one form of filtering
occurring in China. These results provide a window into the limitations imposed on users' internet connections,
but can provide little data with respect to the effects of censorship on users' browsing behaviour, social
attitudes to various forms of content, choice of forums, modes and means of communication, and access
to news sources. As such, the experiments detailed here provide only a limited first step in understanding the
wider phenomenon of internet censorship.
10 Conclusions
We have proposed that it is, in general, false to consider internet filtering a homogeneous phenomenon
across a country, and that the practicalities of implementing a filtering regime are likely to result in geographical
and organisational differentiation in the filtering experienced by users.
We believe that the study of these differences is of great interest in understanding both the technologies
and the motivations behind filtering, and have proposed a number of mechanisms that could be employed
to gain this understanding.
Despite the existence of a number of technological and social avenues to aid in this research, we see
a number of serious legal and ethical concerns that must be thoroughly considered before undertaking
broad-scale research of this nature. Beyond the more obvious pitfalls of misusing third-party services in an
attempt to conduct this research, there are more subtle issues. The necessity of attempting to access blocked
content, and the legality and ethics of doing so via a third-party volunteer or service operator, are all
worthy of serious discussion by researchers in this field.
In response to the technical and ethical challenges of censorship research, we have conducted a nation-wide
remote survey of the apparent filtering experienced by Chinese internet users, with specific reference to
blocking that occurs through the Domain Name System (DNS). These experiments have revealed
widespread poisoning of DNS results, including invalid server responses, valid domains claimed to be
non-existent, and the return of IP addresses that do not correspond to the requested domain.
Our analysis of these results has revealed a number of trends in this filtering, most notably the prevalence
of misleading responses for domains over claims that domains do not exist. Further, although the extent
of filtering varies geographically, we have identified correlations in the misleading IP addresses returned by
different servers in response to requests for blocked domains, implying some level of top-down involvement
in the behaviour of the servers.
Despite the concerns raised in ethically researching internet censorship, and the technical hurdles to
gaining a detailed picture of global internet filtering, we consider that research into this subject presents a
number of interesting problems, and can provide insight into the development of the internet and its ongoing
social and political role at both the national and international level.
A Tested Domain Names
The following domain names were tested against the list of available DNS servers:
A.1 Reported Blocked
These domains were retrieved from the Herdict Project, and have thus been reported multiple times as blocked
within China.
www.torproject.org - www.google.com - mail.live.com - www.blogger.com - dropbox.com - www.wretch.cc
- vimeo.com - www.scribd.com - anchorfree.com - developer.android.com - www.gmail.com - www.Demonoid.com
- www.bing.com - thepiratebay.org - piratebay.org - www.hotspotshield.com - www.box.net - mail.google.com
- chinagfw.org - blogspot.com - wikileaks.org - www.tibet.net - www.boxun.com - www.bbc.co.uk - wezhiyong.org
- www.bullogger.com - www.rfa.org - wikileaks.com - www.backchina.com - huffingtonpost.com - www.ntdtv.com
- www.rthk.org.hk - www.aboluowang.com - www.voanews.com - www.wenxuecity.com - www.dw-world.de
- zh.wikipedia.org - www.danwei.org - news.bbc.co.uk - www.peacehall.com - www.youtube.com - www.facebook.com
- twitter.com - www.wujie.net - www.6park.com - www.steves-digicams.com - www.hotmail.com - www.x365x.com
- www.wenku.com - picasaweb.google.com - www.camfrog.com - www.tumblr.com - www.foursquare.com - www.imdb.com - flickr.com - t.co - www.livejournal.com - www.twitzap.com - killerjo.net - www.paltalk.com
- www.pinyinannotator.com - www.python.org - www.midwest-itc.org - www.cafepress.com - tar.weatherson.org
- secure.wikimedia.org - theviennawilsons.net - www.gamebase.com.tw - www.newgrounds.com - angrychineseblogger.blog-city.com
- www.open.com.hk - bbs.sina.com - www.mitbbs.com - www.parantezbaz.com - www.aixin119.com - english.rti.org.tw
- www.ahrchk.net - mashable.com - WWW.SIQO.COM - www.websitepulse.com
A.2 Common Services
These domains represent popular Chinese services that were anticipated not to be blocked within China.
baidu.com - qq.com - caixin.com - renren.com - chinaview.cn
References
Al-Saqaf, W. (n.d.), ‘Alkasir for Mapping and Circumventing Cyber-Censorship’, http://www.alkasir.com/. Accessed August 8th, 2012.
Clayton, R., Murdoch, S. J. & Watson, R. N. M. (2006), Ignoring the Great Firewall of China, in ‘6th
Workshop on Privacy Enhancing Technologies’, Springer.
CNNIC (n.d.), ‘China Internet Network Information Center Homepage’, http://www.cnnic.cn.
Accessed August 31st, 2012.
Deibert, R. J., Palfrey, J. G., Rohozinski, R. & Zittrain, J. (2008), Access Denied: The Practice and Policy of
Global Internet Filtering (Information Revolution and Global Politics), MIT Press.
Dingledine, R., Mathewson, N. & Syverson, P. (2004), Tor: The Second-Generation Onion Router, in
‘Proceedings of the 13th USENIX Security Symposium’.
Ensafi, R., Park, J. C., Kapur, D. & Crandall, J. R. (2010), Idle port scanning and non-interference analysis of
network protocol stacks using model checking, in ‘USENIX Security Symposium’, USENIX Association,
pp. 257–272.
Lessig, L. (2006), Code: And Other Laws of Cyberspace, Version 2.0, Basic Books.
URL: http://codev2.cc/
MaxMind Inc. (n.d.), ‘MaxMind GeoIP City Database’, http://www.maxmind.com/app/city. Accessed
August 8th, 2012.
Murdoch, S. & Anderson, R. (2008), Tools and Technology of Internet Filtering, in R. Deibert, ed., ‘Access
Denied: The Practice and Policy of Global Internet Filtering (Information Revolution and Global Politics
Series)’, 2 edn, MIT Press, chapter 3, pp. 57–72.
URL: http://www.worldcat.org/is...