The SAGE Encyclopedia of the Internet
Internet Origins and History
By: Barney Warf
Edited by: Barney Warf
Book Title: The SAGE Encyclopedia of the Internet
Chapter Title: "Internet Origins and History"
Pub. Date: 2018
Publishing Company: SAGE Publications, Inc.
City: Thousand Oaks
Print ISBN: 9781473926615
Online ISBN: 9781473960367
DOI: https://dx.doi.org/10.4135/9781473960367.n155
Print pages: 543-553
© 2018 SAGE Publications, Inc. All Rights Reserved.
The Internet reflects a long history of technological innovation in computing and communications. Some
might argue that the roots extend to the 19th century with the invention of the telegraph in 1844 or Herman
Hollerith’s punched cards to store data in the 1880s. Certainly, the history of computing stretches at least as
far back as World War II, when the first computers were deployed to assist in the creation of nuclear weapons.
Norbert Wiener played a key role in the invention of cybernetics, born out of studies of missile technology,
which focused on the feedback loops between humans and electronic systems; his book, Cybernetics: Or
Control and Communication in the Animal and the Machine, was published in 1948. Wide area networks
emerged as early as the 1950s. This early stage was followed by rapid increases in computing power
and memory, the miniaturization of electronics, and the microelectronics revolution. It is vital to note that
the Internet emerged through a process of incremental creep rather than long-term planning. Many current
applications, such as blogs, adware, Skype, and social media, could not have been imagined in the Internet’s
early stages; indeed, the invention and soaring popularity of these phenomena point to the web’s trajectory
as a palimpsest of unintended consequences.
This entry approaches the development and evolution of the Internet in three stages. First, it examines
the birth of the Internet in the 1960s and its development through the 1980s, including the rise and fall
of ARPANET (Advanced Research Projects Agency Network) and NSFNET (National Science Foundation
Network), the creation of standard protocols, and the creation of the World Wide Web. Second, it turns to the
1990s, when the Internet became privatized, commercialized, and globalized, and exploded in size. The third
section focuses on the 21st century and the numerous changes that accompanied the continued growth of
the Internet, including Web 2.0, the mobile Internet, social media, the digital divide, the Chinese web, and
mounting censorship.
Early Days (1960s–1980s)
The birth of the Internet may be traced back to the early 1960s to the U.S. Defense Department’s Advanced
Research Projects Agency (ARPA, later Defense Advanced Research Projects Agency, or DARPA), which
was founded by President Dwight D. Eisenhower in 1958 in the wake of the then Soviet Union’s launch of the
first satellite, Sputnik. ARPA’s mission was to propose innovative research ideas with significant technological
impacts. Massachusetts Institute of Technology (MIT) scientist Joseph Licklider, who in 1962 became the first head of ARPA's computer research program, proposed a "galactic network" of computers that could communicate with one another; program manager Lawrence Roberts later led the effort to build such a network. DARPA awarded numerous contracts for packet-switching networks. In 1965, a connection was established between Lincoln Lab at MIT and the System Development Corporation lab in Santa Monica, California, using a dedicated phone line with a speed of 1.2 Kbps, what Michael Banks (2008) calls "a proof of concept" of a wide area network (p. 4).
It is widely believed that this effort was designed with the aim of allowing computers to communicate with one
another in the event of a nuclear attack—the principal focus was on the Pentagon and its computing center at
Cheyenne Mountain, Colorado—although a 2009 article by Barry M. Leiner and colleagues disputes the idea
that this was part of the original intent. Another motivation was to allow computers, which were expensive
at the time, to share resources. Time sharing allowed researchers at one institution to avail themselves of
computers at another institution.
Much of the durability of the current system is due to the enormous amounts of federal dollars dedicated
toward research in this area. The initial military goals were soon supplemented by civilian ones. DARPA
grouped together several young, ambitious computer scientists, who created a series of related innovations
such as neural networks, queuing theory, adaptive routing, and File Transfer Protocols. File Transfer Protocol
was invented in 1973. However, Martin Campbell-Kelly and Daniel Garcia-Swartz have argued that this
conventional story of the Internet’s origins is too simple and that multiple networks were under way at the
time, such as the U.S. military’s Semi-Automatic Ground Environment defense system in 1962.
Drawing on a well-received 1961 paper by MIT’s Leonard Kleinrock (sometimes called the father of modern
networking) on packet switching titled “Information Flow in Large Communication Nets,” Paul Baran
(1926–2011) of the RAND Corporation formalized what he called “message blocks,” in which individual
messages may be decomposed, the constituent parts transmitted by various channels, and then
reassembled, virtually instantaneously, at their destination. At Britain's National Physical Laboratory, Donald Davies independently conceived of packet switching, which became the preferred term, and proposed a national data network. Both message blocks and packet switching divided messages into pieces that could be independently routed. Similar efforts included Telenet and the Michigan Educational Research Information Triad (Merit), the latter begun in 1966.
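The basic mechanics that Baran and Davies described can be illustrated with a brief, hypothetical sketch in Python (the message and packet size here are arbitrary): a message is broken into numbered pieces that may travel by different routes and arrive out of order, and the destination reassembles them using the sequence numbers.

    import random

    def packetize(message, size=8):
        """Split a message into (sequence_number, payload) packets."""
        return [(i // size, message[i:i + size]) for i in range(0, len(message), size)]

    def reassemble(packets):
        """Rebuild the message at the destination, whatever the arrival order."""
        return "".join(payload for _, payload in sorted(packets))

    packets = packetize("Information Flow in Large Communication Nets")
    random.shuffle(packets)  # simulate packets traveling by different channels
    assert reassemble(packets) == "Information Flow in Large Communication Nets"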
On August 30, 1969, Bolt, Beranek and Newman delivered the first Interface Message Processor to
Kleinrock’s Network Measurements Center at the University of California, Los Angeles (UCLA). The UCLA
team responsible for installing the Interface Message Processor and creating the first ARPANET node
included graduate students Vinton Cerf, Steve Crocker, Bill Naylor, Jon Postel, and Mike Wingfield. Two
months later, on October 29, 1969, Charley Kline, a student in the Network Measurement Center, sent the
first-ever message from one computer to another on the ARPANET.
In the 1970s, packet-switching standards were developed under the auspices of the International Telecommunication
Union. France experimented with the Cyclades project. In 1978, Western Union, Tymnet, and the British
Post Office collaborated to develop an international packet-switching network, which expanded to include
North America, Hong Kong, and Australia. In the 1970s, the Transmission Control Protocol/Internet Protocol
(TCP/IP), codeveloped by Robert Kahn of DARPA and Vinton Cerf at Stanford University, was introduced,
and in 1982, it became the standard networking protocol, making it possible to join almost any set of
networks together (although there was some resistance in Europe). The term Internet became shorthand for
“internetworking.”
ARPA gave birth to a network quite different from the centralized telephone system of the time, which relied
on analogue information: Rather, digitization facilitated a decentralized, and then distributed, network. From
the beginning, it adopted open architecture networking, a revolutionary concept at the time, meaning that
a particular network design could be chosen by users and made to interface with other networks through
a higher level internetworking architecture. What would become ARPANET developed new protocols for
internetworking and became the first wide-area packet-switching network. In 1969, ARPANET sent its first
message, a “node-to-node” communication, between two computers. The early ARPANET used a Network
Control Program rather than TCP/IP, but switched on January 1, 1983, after careful planning. TCP/IP remains
the standard protocol.
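The client-server exchanges that TCP/IP made routine can be sketched with a short, hypothetical example using Python's standard socket library; the loopback address, port choice, and message are illustrative only and are not ARPANET software.

    import socket
    import threading

    def echo_once(server_sock):
        """Accept one connection and echo back whatever arrives."""
        conn, _ = server_sock.accept()
        with conn:
            conn.sendall(conn.recv(1024))

    # A tiny TCP "server" on the loopback interface; port 0 lets the OS pick a free port.
    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.bind(("127.0.0.1", 0))
    server.listen(1)
    threading.Thread(target=echo_once, args=(server,), daemon=True).start()

    # A TCP "client" connects, sends a message, and reads the echoed reply.
    client = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    client.connect(server.getsockname())
    client.sendall(b"node-to-node message")
    print(client.recv(1024))  # b'node-to-node message'
    client.close()
    server.close()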
The power of networked systems soon became clear. By 1969, ARPANET connected four universities: (1)
UCLA (the first node in ARPANET), (2) the Stanford Research Institute, (3) the University of California, Santa
Barbara, and (4) the University of Utah. In 1971, it added the University of Hawaii’s ALOHAnet. By the end of
1971, 23 hosts were connected to the network. In 1972, Robert Kahn made the first public demonstration
of ARPANET at the International Computer Communication Conference. The European Network spread
ARPANET through the continent’s universities and research centers. In 1973, the first international connection
was established when ARPANET was linked to University College London in England and to NORSAR in Norway. In 1976, Queen Elizabeth sent the first royal email message during a visit to a military base, but it
would be several years before computer-to-computer communications routinely included messages among
people. The roots of email may be traced to the Compatible Time Sharing System at MIT, whose MAIL command was written by Tom Van Vleck and Noel Morris in 1965. In 1972, Ray Tomlinson, an engineer at the U.S. computer science research firm
Bolt, Beranek and Newman (which played a key role in ARPANET), wrote the basic software for sending
and receiving email, and he invented the use of the @ symbol in addresses, thus adapting computers for
communication among humans. Soon thereafter, experiments began in Internet telephony, file sharing, and
video games.
In 1975, the Defense Communications Agency assumed control of ARPANET. Other government agencies began to set up similar networks, such as the Department of Energy's MFENET (Magnetic Fusion Energy
Network) and NASA’s (National Aeronautics and Space Administration) Space Physics Analysis Network.
AT&T's dissemination of the UNIX operating system gave rise to Usenet. In 1981, Greydon Freeman and Ira
Fuchs devised BITNET (“Because It’s There Network,” and later “Because It’s Time Network”) to connect
academic computers, first connecting Yale University and the City University of New York.
Because ARPA was a public agency, its applications were noncommercial in nature (military, research, and educational). The private sector, however, was eager to get into the new market: In 1974, Telenet, arguably the first ISP (Internet service provider), began operations serving universities. In 1979, CompuServe became the first company to offer retail email services, including a chat program. Meanwhile, some of the
world’s first cybercommunities began to take root, such as the WELL (Whole Earth ’Lectronic Link) in San
Francisco, which started in 1985. Similar bulletin board systems proliferated.
The first personal computers appeared, giving rise to (unfounded) fears that TCP/IP was too complex for them
to handle. By 1980, Usenet (conceived by Jim Ellis and Tom Truscott) began to form its first communities,
which posted notices in a bulletin board system but without a dedicated administrator.
In 1981, the NSF funded the Computer Science Network (CSNET) to link university computer science departments that lacked access to ARPANET. Because DARPA's responsibility was to develop cutting-edge technologies, not administer a communications network, administration of the Internet was transferred to the NSF in 1984. In 1983, ARPANET was split in two: (1) MILNET (military network), for military purposes, and (2) a smaller civilian ARPANET for research. MILNET, officially known as the Defense Data Network and operated by the Defense Information Systems Agency, carried unclassified military traffic from its birth in 1983 until 1995. It provided Internet connectivity to Defense
Department offices and installations in the United States and to military bases overseas.
NSF funded five supercomputer centers at (1) Princeton University; (2) the University of Illinois; (3) Cornell
University; (4) the University of California, San Diego; and (5) the Pittsburgh Supercomputing Center (including
Carnegie Mellon University and the University of Pittsburgh) and connected these centers and the National
Center for Atmospheric Research in Boulder, Colorado. From the original six nodes, the network began
to expand to other academic centers. By 1991, it expanded to 14 nodes, including the universities of
Utah, Washington, and Delaware, and Stanford University, and had added international connections as well.
NSFNET consisted of backbone links among 200 academic supercomputers, mid-level networks (campus-based or supercomputer consortia), and campus networks. NSFNET also had connections to networks at
NASA and the Department of Energy. In partnership with IBM and the Michigan Educational Research
Information Triad, a consortium of universities in Michigan, NSF upgraded its backbones in 1988 to DS-1 fiber
lines (1.544 megabits per second, or Mbps) as client demands increased and the older lines were overloaded.
As the number of Internet hosts rapidly multiplied, hosts were assigned names so as to eliminate the need
to remember numeric addresses. As more independently managed networks were joined together, Paul
Mockapetris, Jon Postel (1943–1998), and Craig Partridge invented the Domain Name System in 1983, which
allowed for a hierarchy of locations within an Internet address. ARPA initially created six domains: (1) edu
(education), (2) mil (military), (3) gov (government), (4) com (commercial), (5) org (organizations), and (6) net
(networks). (Postel was also active in the creation of Internet standards and edited the Request for Comments
for many years.) The number of Internet hosts multiplied like rabbits: By 1987, there were 10,000; by 1989,
100,000; and by 1990, 300,000. Also in 1990, Archie, the first search engine, was created by a university student.
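What the Domain Name System provides can be seen in a brief sketch using Python's standard socket module; the hostname here is only an example, and the numeric address returned will vary.

    import socket

    # A DNS lookup: the hierarchical, human-readable name (read right to left:
    # org -> example -> www) is resolved to the numeric address routers use.
    print(socket.gethostbyname("www.example.org"))  # e.g., '93.184.216.34'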
Soon thereafter, in keeping with the broader deregulation of telecommunications, the government began to
withdraw its support for the Internet, handing its governance and provision of services to the private sector.
As David Mowery and Timothy Simcoe (2002) argue, “Adoption of the Internet in the US was encouraged by
antitrust and regulatory policies that weakened the market power of established telecommunications firms and
aided the emergence of a domestic ISP (Internet Service Provider) industry” (p. 1370). Telecommunications
providers were eager to sell connectivity. ARPANET was decommissioned in 1990, and the market was
charged with providing backbone services. Similarly, the NSFNET backbone was decommissioned in April 1995, when NSF defunded it after spending US$200 million on it over 11 years. Subsequently, control over Internet
backbones became privatized.
By 1995, the Internet had grown to encompass 50,000 networks, half of which were located in the United
States. With the end of restrictions on commercial traffic, today almost all of the Internet's infrastructure is privately owned and managed. Notably, the private users of the Internet benefited heavily from billions
of dollars of public investments. Unlike the telephone industry, the privatization of Internet backbones was
accomplished with little regulatory oversight (e.g., nondiscriminatory regulations for carrying traffic).
Malicious use of the Internet grew in tandem with the number of users. In 1971, the first virus, Creeper, made
its debut. In 1978, the first spam message crept through the network. In 1988, the first worm, Morris, was
created. In 2000, the ILOVEYOU virus infected 50 million computers in 10 days. In 2010, Stuxnet was used with devastating effect against the centrifuges of Iran's uranium enrichment program.
In the late 1980s, researchers at the European Particle Physics Laboratory (CERN) combined hypertext with URLs (Uniform Resource Locators), the system of addresses used on what would become the World Wide Web. CERN
drew on a long history concerning hypertext (a term coined by Ted Nelson in 1965), reaching back to Doug
Engelbart’s prototype “oNLine System”; Engelbart also invented the mouse. Similarly, Andy van Dam and
others built the Hypertext Editing System in 1967.
The British physicist Sir Tim Berners-Lee (knighted in 2004) envisioned a global, hyperlinked information
system. In 1990, with his colleague Robert Cailliau, he proposed a hypertext project called the
WorldWideWeb (initially used as one word) in which documents could be viewed using a “browser” and
a client-server architecture. (Others had proposed similar systems earlier but had not followed through;
Vannevar Bush envisioned just such a system called Memex in The Atlantic Monthly in 1945.) Berners-Lee,
who is often called the father of the World Wide Web, created the first web server (CERN httpd) and the first web browser (named WorldWideWeb) while employed at CERN in Geneva; he put the first website online on December 20, 1990, and opened it to the wider Internet on August 6, 1991. Essentially, he successfully intertwined hypertext and the Internet,
making the network as flexible and decentralized as possible.
Berners-Lee’s contributions to the web are enormous: the first webpage, the Hypertext Transfer Protocol (HTTP) and Hypertext Markup Language (HTML), and the World Wide Web Consortium, which Berners-Lee founded in 1994 after
he left CERN to join MIT. CERN, inspired by computer geek culture hostile to copyright protection (e.g., the
Free Software movement), announced that the newly born World Wide Web would be free to everyone. (In
part, this was a reaction to the University of Minnesota’s decision to charge license fees for the Gopher
protocol, which had been released in 1991.) Although many people use the terms Internet and World Wide
Web interchangeably, technically, the web consists of hyperlinked documents identified through their URLs
and is part of the broader configuration known as the Internet. Together, the Internet’s founders developed a
multitiered system that ranged from computers at the bottom to successive layers of networks and browsers.
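The way a URL identifies a document on the web can be illustrated with a small sketch using Python's standard urllib; the URL itself is hypothetical.

    from urllib.parse import urlparse

    parts = urlparse("https://www.example.org/history/internet.html?section=origins")
    print(parts.scheme)  # 'https' -> the protocol used to fetch the document
    print(parts.netloc)  # 'www.example.org' -> the host, resolved through DNS
    print(parts.path)    # '/history/internet.html' -> the document on that host
    print(parts.query)   # 'section=origins' -> optional parameters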
The Internet’s expansion spurred concerns over standards and protocols and led to the rise of several
oversight agencies. In 1979, DARPA created the Internet Configuration Control Board, which was later
renamed as the Internet Advisory Board in 1984, then the Internet Activities Board in 1986, and finally the
Internet Architecture Board in 1992. In 1986, the U.S. government established the Internet Engineering Task Force to promote Internet standards. ARPANET’s standards were governed by the Internet Assigned Numbers Authority (IANA), founded in 1988. IANA was charged with root zone management in the global Domain Name System. In 1998, oversight of the IANA functions was transferred to the Internet Corporation for Assigned Names and Numbers (ICANN), a nonprofit organization based in Los Angeles, California. The Internet Society, which was
formed in 1992, supports the development of Internet-related standards and policy and provides education
and training about the Internet.
By the early 1990s, webpages had become increasingly normalized and visible, even though early ones were
read-only documents that did not allow for interaction with users. In 1993, both the White House and the
United Nations went online, and President Bill Clinton was the first U.S. president to send an email from the
White House. Shortly thereafter, the Encyclopedia Britannica went online, later to give up its paper version
altogether.
The 1990s: The Internet Goes Global and Commercial
The mid- to late 1990s saw a vast expansion of the Internet’s size and number of users, as multiple
communities came online. In 1993 alone, the number of web servers jumped from 50 to 500. The Internet
rapidly spread beyond the research community and soon was used by households, nonprofits, and
corporations, among others. Hypertext became wildly popular among authors, programmers, scientists,
librarians, and others. Several factors contributed to the explosive growth of the Internet.
Graphical interfaces greatly simplified the use of the Internet and opened the World Wide Web to a mass audience. In 1993, Lou Montulli released a text-based browser, Lynx. The same year, a team at the University of Illinois led by Marc Andreessen created Mosaic, the first widely used graphical browser (its developers went on to found Netscape in 1994), greatly enhancing popular access to web content and freeing users from Microsoft DOS. Research for Mosaic was funded by the High Performance Computing and Communication Act of 1991, an initiative of Senator Al Gore (D-TN). Soon thereafter, new web browsers sprouted like mushrooms, including Microsoft’s Internet Explorer (1995), Apple’s Safari (2003), Firefox (2004), Google Chrome (2008), and, in 2015, Microsoft Edge, displacing older systems such as Gopher. America Online launched its Instant
Messenger service in 1997.
Another factor was the enormous decrease in the cost of computers and the exponential increase in their power
and memory. Personal computers have become increasingly affordable, and relatively fast, low-end machines
are readily available for relatively modest sums. In 1975, the first computer for home use, the Altair 8800,
became available. In 1976, Steve Jobs and Steve Wozniak founded Apple Computer, which launched its first
personal computer, the Apple II, in 1977; in 1984, it launched the Macintosh.
Simultaneously, several high-speed fiber-optics networks were laid down, first across the Atlantic Ocean
(in 1988) and then, in the 1990s, across the Pacific and Indian oceans. With much higher bandwidth, the
adoption of graphics, including video, and sound became feasible. The first video-sharing site went online
in 1997. Fiber optics greatly enabled the transformation of the Internet from a communication system to a
commercial system, accelerating the pace of customer orders, procurement, production, and product delivery.
In the 1990s, telecommunications giants such as British Telecom, France Telecom, Deutsche Telekom, and
many others initiated Internet services.
Cheap computers, easy-to-use browsers, and fast connections made the Internet wildly popular. In 1994, the
number of Internet users jumped by more than 300,000%. By 1998, 100 million people were using the web.
This rate of increase conforms to Metcalfe’s Law, which holds that the value of a network is proportional to
the square of the number of users.
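Metcalfe’s Law can be restated as a simple calculation: n users can form roughly n(n − 1)/2 distinct pairwise connections, so value grows roughly with the square of membership. A back-of-the-envelope sketch in Python, with hypothetical user counts, shows the effect.

    def pairwise_connections(n):
        """Number of distinct user-to-user links possible among n users."""
        return n * (n - 1) // 2

    for users in (10, 100, 1_000, 10_000):
        print(users, pairwise_connections(users))
    # 10 -> 45, 100 -> 4,950, 1,000 -> 499,500, 10,000 -> 49,995,000:
    # a 1,000-fold increase in users yields roughly a million-fold increase in links.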
Given capitalism’s relentless pressure to commodify everything, it is no surprise that corporations soon took
to the Internet, leading to a proliferation of .com websites, which constitute the majority of sites on the web
today. The market rapidly shifted from providing networking services to offering a wide array of goods and
services. The commercialization of the Internet, and thus the rise of e-tailing, e-banking, and business-to-business e-commerce, was propelled by the numerous vendors who incorporated TCP/IP.
In 1988, the first Interop trade show was held, and it is now held in seven places across the globe, drawing
one quarter of a million people annually. Subsequently, online vendors began to sprout, some becoming
enormous. In 1994, Pizza Hut opened the first online store. The same year, Jeff Bezos founded Amazon.com,
originally to sell books online but soon to become the behemoth of Internet e-tailing. eBay emerged as an
electronic marketplace in 1995 under the name AuctionWeb. In 1996, Hotmail, founded by Sabeer Bhatia and Jack Smith, became one of the first free web-based email services.
In 1998, Sergey Brin and Larry Page founded Google, a search engine to rival Yahoo (founded in 1994 by
Jerry Yang and David Filo). Google soon became the king of search engines, responsible for 70% of Internet
searches worldwide as of 2017. In 1999, a college student, Shawn Fanning, started Napster, an online music file sharing service (which was shut down in 2001). Also in 1999, Zappos began selling shoes exclusively online. Internet gambling took off around this time. Newspapers launched online editions, which soon surpassed print in the number of readers as print circulation dropped quickly.
Schools and libraries began to adopt the Internet with gusto. The Clinton administration established the
Schools and Libraries Program of the Universal Service Fund, informally known as E-rate, as part of
the Telecommunications Act of 1996. E-rate generated US$2.25 billion in funds designed to subsidize
telecommunications services for schools in impoverished communities, and it was widely credited with raising
the proportion of public schools with Internet access from 14% in 1996 to 98% in 2010. However, E-rate failed
to provide funding for computer hardware, software programs, staff or teacher training, or broadband services,
all of which are essential for effective Internet access. In addition, the Clinton administration created the E-Corps, 750 AmeriCorps volunteers working to facilitate Internet access in low-income areas through a network
of federally subsidized Community Technology Centers. By 2010, 99.1% of all U.S. public libraries offered
free Internet use, due in part to funding from the Bill and Melinda Gates Foundation, which saw libraries as a
means of reducing the digital divide.
The Internet also began to weave its way into the fabric of the economy. What is often described as the first
banner web ad was sold by Wired magazine to AT&T in 1994. Firms began to use the web for recruiting,
advertising, and marketing; today, Internet advertising revenues exceed those of television. Other uses
included telecommuting and managing inventories, particularly as just-in-time inventory systems became
widespread. Supply chain management became more accurate. Business-to-business e-commerce took
off, including digital contracts and electronic signatures. Government-to-business e-commerce followed suit,
including online auctions and procurement. So great was the number of Internet start-ups fueled by Silicon
Valley venture capitalists in the late 1990s that the market soon readjusted with the devastating dot-com crash
of 2001.
As ever-larger numbers of people became comfortable with the Internet, the term cyberspace, coined by
science fiction writer William Gibson in his 1984 novel Neuromancer, became popular. “Cyber” became a
common prefix to designate many Internet-related activities, such as, to take but a few examples, cyberwar,
cybersex, cyberstalking, cyberterrorism, cyberactivism, cybercafés, cybercrime, cybercommunity,
cyberfeminism, cyberculture, cyberheroes, cyberjournalist, cyberthief, cybermarriage, and cybersecurity.
Spurred by declining costs, deregulation, and an increasingly tech-savvy public, the growth of the Internet
has been phenomenal; indeed, it is arguably the second-most rapidly diffusing technology in world history,
second only to the mobile phone. With rapid declines in the cost of computer technology, a glut of fiber optics
that led to dramatic falls in communications prices, easy-to-use graphical interfaces, and the clear potential
for all sorts of as-yet unheard of applications, it is no wonder that Internet usage worldwide began to grow
exponentially. The number of users soared from roughly 10 million in 1990 to 1 billion in 2001, to 3.73 billion
in March 2017 (about half of the planet’s population), an average rate of increase of almost 14% per year
(Figure 1). By 2014, the website Internet Live Stats estimated that there were more than 1 billion websites,
although the number has fluctuated since then. As of late 2017, the indexed web was estimated to contain
more than 4.5 billion webpages.
Figure 1 Growth in Global Internet Users, 1990–2016
Source: Author, using data from Internet World Stats (internetworldstats.com).
As the number of users and Internet applications multiplied, and as graphical content made the volume of
data grow exponentially, broadband became vital to Internet connectivity. Although broadband technology
has existed since the 1950s, it was not commercially feasible until the deployment of vast quantities of fiber-optic lines in the 1990s allowed enormous amounts of data to be transferred across the Internet at high speeds. The Federal Communications Commission defined broadband in 2010 as a minimum speed of 4 Mbps for downloading and 1 Mbps for uploading, although much higher speeds are necessary for advanced capabilities. Many now use a benchmark of 25 Mbps download and 3 Mbps upload speed.
The slow speeds of modems and telephone lines gave way to fast and then to superfast connections (as fast
as 2.4 gigabits per second). Today, the vast majority of netizens in the developed world use broadband.
Broadband made possible innovations such as Internet television (e.g., Netflix, founded in 1997) and Voice
over Internet Protocol telephony. Arguably, having or not having broadband is quickly becoming the dominant
face of the digital divide. In 2016, 75% of the U.S. population used broadband technologies at home, reducing
dial-up services to a mere 5%. There are important rural-urban differences in the broadband digital divide, in
which urban areas are often overserved, while their rural counterparts lack easy access.
Numerous new applications arose in the face of increased Internet accessibility. Voice over Internet Protocol
telephony, for example, which has roots dating back to the 1960s, became popular, putting downward
pressure on the prices charged by telecommunications providers. Skype, founded in 2003, extended this
process to include video calls. Similarly, television adapted to the Internet as companies such as Netflix and
Hulu first became content deliverers, and then content producers. Streaming music services such as Pandora
and Spotify arose. The number of blogs (a term proposed by Jorn Barger in 1997) grew dramatically, and
the blogosphere became an important aspect of politics and personal expression. The world saw roughly 180
million blogs in 2014, with roughly 30,000 more added daily.
In the late 1990s, email became the most common form of communication on the planet: Today, the world
sends 150 billion email messages each day. Instantaneous, free (or almost so), and asynchronous email
allows both one-to-one and one-to-many modes of communication. Email is widely used in the business
world, and in addition to being an inescapable tool for professional and personal success, it is also a prime
source of information overload. (According to The Radicati Group, a technology market research group, the
average person who uses email for work sent and received 110 messages per day in 2012.) On the heels of
the email revolution, spam—unwanted commercial email that comprises a large share of Internet traffic—also
grew in direct proportion. (Although the volume of spam grew as more people began using email, it is believed
to have originated far earlier, with a marketing message sent to ARPANET users in 1978.)
Current Status of the Internet (2000 Onward)
In the 21st century, the Internet has grown and changed in multiple ways, with a vast new array of applications
as well as social impacts. Chief among these changes are (a) the rise of Web 2.0, (b) the rise of the Internet
of Things, (c) the mobile and wearable Internet, (d) the explosion in social media, (e) mounting concerns over
the digital divide, (f) Internet censorship, and (g) the rise of the Chinese web.
One of the most important developments in the Internet’s recent history was the rise of Web 2.0, a term coined by Darcy DiNucci in 1999 and popularized in 2004 by technology publisher Tim O’Reilly of O’Reilly Media and Dale Dougherty, an O’Reilly vice president. Web 2.0 refers to the idea that by the early 2000s the web had become more interactive, among other ways by allowing individuals to contribute material to webpages. Traditionally, the Internet (retroactively labeled Web 1.0) was a platform in which users were passive recipients. With the development of asynchronous JavaScript and XML (AJAX) and application programming interfaces, websites allow users the freedom to upload
content and enjoy instantaneous interactions, making them active creators and participants who contribute
to a site’s contents, giving rise to what George Ritzer and Nathan Jurgenson call “prosumers” and what Axel
Bruns labels “produsage.”
In a sense, Web 2.0 has blurred the boundaries between the production and the consumption of information:
With YouTube and blogs, everyone could become a filmmaker and publisher. Arguably, as Web 2.0 has
produced large numbers of prosumers, it has generated a historically unprecedented democratization of
knowledge. Web 2.0 had enormous repercussions, allowing, for example, the development of social
networking sites (e.g., Facebook), video-sharing sites (e.g., YouTube, launched in 2005 by three former
PayPal employees, Chad Hurley, Steve Chen, and Jawed Karim), wikis (e.g., Wikipedia, founded in 2001),
reader comments (e.g., comments on blogs or newspapers), clickable advertising, and smartphone
applications. However, this change has its costs as well: it can allow information consumers to select only those sources that confirm their presuppositions and beliefs, never challenging them with alternative ideas.
Widespread adoption of broadband also gave birth to the Internet of Things (IoT), a term variously attributed
to Peter Lewis in 1985 or to Kevin Ashton in 1999. Sometimes called the “infrastructure of the information
society,” the IoT allows Internet-connected devices to collect information, make decisions, or recommend
actions to assist users. Examples include smart buildings, smart cities, smart cars, smart grids, smart homes
(e.g., with smart thermostats), intelligent transportation systems, embedded biochips in people and animals,
radio frequency identification chips, and even copying machines, air conditioners, washers and dryers, and
refrigerators.
The rise of the IoT reflects a long history of control systems, wireless sensors, and ubiquitous computing. The
IoT is in its infancy, and some predict that it will include tens of billions of devices by 2020. The connectivity
of these smart objects vastly accelerates the automation of many tasks and illustrates the deep integration of
the physical and virtual worlds.
The latest chapter in the history of the Internet is the use of smartphones, giving rise to the mobile or
wireless Internet. Since the 1990s, the mobile or cellular telephone has become the most widely used form of
telecommunications in the world. In 2012, the World Bank reported that around three quarters of the planet’s
population had access to a cell phone.
Rapid decreases in the cost of mobile phones, and the minimal infrastructure necessary for their operation
(i.e., cell towers), have made them affordable for vast numbers of people, including for those in the developing
world. By 2017, there were more than five times as many mobile phones as landlines; in some countries,
there are more mobile phones than people. For many people who cannot afford a personal computer, or
even a cybercafé, mobile phones are the major means of connecting to the Internet. Mobile Internet access
is particularly important in the developing world, where many countries never established effective landline
telephone systems.
Worldwide, the young tend to adopt the mobile Internet more than their elders. The mobile Internet greatly
enhances the network’s attraction, adding flexibility and convenience and facilitating the growth and use
of location-based services. Text messaging has become the preferred means of communicating for vast
numbers of people, and Twitter added a whole new dimension to social media. The rise of the mobile Internet
also raises new problems, such as issues of interoperability, slow upload and download speeds, and concerns
about security and privacy.
The portable Internet, broadband, and the IoT intersect in the form of the wearable Internet. Examples include
(a) smart watches that monitor heartbeats and other biometric data (e.g., Fitbit and Jawbone); (b) smart
clothing and jewelry; (c) body-worn computers; (d) digital shoes that count footsteps; (e) wearable panic
buttons and cameras; (f) thermal bracelets; (g) Google Glass, which immerses users in augmented reality; (h)
“hearables,” small devices worn in the ear to provide users with real-time information; and (i) even wearables
for pets (e.g., FitBark, designed to reduce unruly and destructive dog behavior).
Yet another recent change is the gradual rise of the Semantic Web, sometimes called Web 3.0, proposed by
Berners-Lee and the W3C in 2001. The Semantic Web would greatly expedite automated searches by making
web data more machine readable. The basic idea behind Web 3.0 is to define the structure of data and link
data sets in order to promote more effective automation, integration, and reuse across various applications.
The Semantic Web is expected to augment, not replace, the World Wide Web, making it more efficient by
harnessing the power of artificial intelligence.
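The underlying idea of the Semantic Web, data expressed as machine-readable statements that programs can query and link, can be sketched with hypothetical Python code; real deployments use standards such as RDF and SPARQL, for which plain tuples stand in here.

    # Each statement is a (subject, predicate, object) triple, the core data
    # model of the Semantic Web; the facts below are drawn from this entry.
    triples = [
        ("TimBernersLee", "proposed", "WorldWideWeb"),
        ("WorldWideWeb", "proposedIn", "1990"),
        ("TimBernersLee", "founded", "W3C"),
    ]

    def query(subject=None, predicate=None, obj=None):
        """Return every triple matching the pattern; None acts as a wildcard."""
        return [t for t in triples
                if (subject is None or t[0] == subject)
                and (predicate is None or t[1] == predicate)
                and (obj is None or t[2] == obj)]

    print(query(subject="TimBernersLee"))  # everything asserted about Berners-Lee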
Perhaps the most far-reaching set of changes in the recent evolution of the Internet has been the rise of social
media. Networking sites such as Friendster, LinkedIn (launched in 2003), Myspace (founded in 2003 by Tom
Anderson and Chris DeWolfe), and Facebook began in the early 2000s. Facebook (started in 2004 by Mark
Zuckerberg) is by far the most popular networking site in the world, with more than 1.8 billion users in late 2016, or roughly one quarter of the planet’s population: If Facebook were a country, it would be the most populous in the world.
Other social networking companies include Reddit (2005), Twitter (2006), and Instagram (2010). In Russia
and neighboring states, the VKontakte network is popular. China has promoted its homegrown Qzone system,
while in Vietnam, Zing Me reigns supreme. Dozens of other smaller sites also exist, such as Maktoob (in the Arab
world), hi5.com (Mongolia), and Habbo (Finland). Some social media networks are aimed at finding romantic
partners (e.g., Match.com), while others, such as YouTube, allow sharing of digital content with everyone,
including like-minded strangers. Some argue that social media has changed what it means to be human:
Rather than the autonomous individual central to Western conceptions of the human subject, the Internet has
created the networked self, which refers to being so immersed in digital social media that one’s online identity,
persona, and reputation are as central to one’s life as their off-line counterparts.
Networked selves tend to live at the center of webs of digital ties that bind space in complex and often
unpredictable ways so that conventional geographies of everyday life have limited relevance to understanding
their outlook and behavior. The enormous enhancement of human extensibility offered by telemediated ties
has led to a far-reaching redefinition of the self. As social networks increasingly shift from a series of one-to-one ties to webs of one-to-many connections, weak ties have risen in importance. As a consequence, the
structure and rhythms of everyday life have become greatly complicated.
The growth in Internet users worldwide raised mounting concerns over the digital divide, or social and spatial
disparities in Internet access. As the uses and applications of the Internet have multiplied, the costs sustained
by those denied access rise accordingly. At precisely the historical moment that contemporary capitalism
has come to rely on digital technologies to an unprecedented extent, large pools of the economically
disenfranchised are shut off from cyberspace. For many economically and politically marginalized
populations—the familiar litany of the poor, elderly, undereducated, and ethnic minorities—cyberspace is
simply inaccessible.
Access and use are admittedly vague terms, and they embrace a range of meanings, including the ability to
log on at home, at work, at school, or at a cybercafé. It may be more useful to discuss a gradation of levels
of access rather than simply discussing those with and without access, although it can be difficult to find the
detailed data that would allow this type of analysis. Class markers such as income and education are strongly
correlated with Internet access and use, and age plays a key role: Elderly persons are the least likely to adopt
the Internet. In many places, gender is important, too: In North America the gendered divide has disappeared,
but in Europe it persists, and in the developing world it is pronounced.
Most recent studies of the digital divide have included the roles of language, technical proficiency, disability,
and age. Whereas the vast bulk of the populations in economically developed countries use the Internet,
including near-universal rates in Scandinavia, penetration rates are markedly lower in the developing world
(but growing by leaps and bounds). Internet penetration rates (percentage of people with access) among the
world’s major regions range from as little as 1% in parts of Africa to as high as 98% in Scandinavia. In addition
to international discrepancies in access, Internet usage also reflects rural-urban differences within countries.
Everywhere, large urban centers tend to exhibit higher rates of connectivity than do rural areas. It appears
that the digital divide may exacerbate, not simply reflect, social inequalities.
Another chapter in the recent history of the Internet concerns censorship. The Internet has grown sufficiently
that it represents a real threat to dictatorial states, as it can allow individuals to circumvent tight government
controls on media. Essentially, censorship involves control over the access, functionality, and content of the
Internet. Motivations for Internet censorship include repression of criticism of the state, limiting the spread
of ideas considered heretical or sacrilegious, protecting intellectual property, and the oppression of ethnic
minorities or sexual minorities. An array of motivations may be at work, particularly when censorship is carried out by multiple government authorities executing a variety of strategies.
Government censorship varies in terms of its scope (or range of topics) and depth (or degree of intervention),
with some governments allowing nearly unfettered flows of information and others imposing tight restrictions
on what can be viewed.
In the 21st century, the rise of the Chinese Internet is another significant dimension. In mid-2017, China had
more than 738 million netizens, the largest such population in the world. Given the rapid growth in the size
and wealth of the Chinese market, services such as Baidu, Alibaba, and Tencent have appeared. Moreover,
the Chinese state has implemented a formidable censorship system known informally as the Great Firewall.
Finally, recent years have also seen changes to the regulatory structure of the Internet. In 2006, the World
Summit on the Information Society established the Internet Governance Forum, a multistakeholder approach
to the governance of cyberspace. In 2016, the Obama administration transferred control of IANA to the private
sector. Simultaneously, the Chinese began to assert influence over Internet standards, seeking to displace
the United States and to assert a Sino-centric model more tolerant of censorship.
Consequences of the Internet
Given that creative destruction is a long and ongoing feature of capitalism, it is worth remembering that
the Internet has not only created new phenomena but also annihilated others. For example, travel agents
have suffered mightily, as have print journalism, letter writing, music stores, telephone directories, public pay
phones, photo albums, and, some would argue, free time, face-to-face contact, and civility. Equally disturbing,
the Internet may generate subtle but pervasive changes in brain structure, including shortened attention
spans. Viruses, spyware, ransomware, and other types of malware have grown. Cryptocurrencies such as
Bitcoin (introduced in 2008) may destabilize financial systems. Finally, there is also the “dark web,” including
online fraud, identity theft, and hate speech.
The Internet is simultaneously a technological, social, political, cultural, and geographic phenomenon, with
world-changing repercussions. As it has grown, it has changed, for example, by adding sound and video,
becoming commercialized and adapted to mobile phones and tablets, and adding ever-larger quantities of
content in languages other than English. It has enormously expedited the ability to search for and disseminate
information, reducing uncertainty in numerous domains of social life. As even more people find their way
online, as technological change occurs unabated, and as more, now-unforeseen, applications arise, there is
every reason to think that the history of the Internet is unfinished.
See also ARPANET; BITNET; Broadband Internet; Cyclades; Domain Name System; Email; Ethernet; Fiber
Optics and the Internet; File Transfer Protocol; Gopher; Hyperlink; Hypertext Transfer Protocol; Internet
Assigned Numbers Authority; Internet Censorship; Internet Engineering Task Force; Internet Governance;
Internet of Things; Internet Society; Military Internet; Mobile Internet; NSFNET; Packet Switching;
Smartphones; TCP/IP; Usenet; Web Browsers; Website and Webpage; WELL, The; World Wide Web; World
Wide Web Consortium
Barney Warf
http://dx.doi.org/10.4135/9781473960367.n155
Further Readings
Abbate, J. (2000). Inventing the Internet. Cambridge, MA: MIT Press.
Arminen, I. (2007). Review essay: Mobile communication society. Acta Sociologica, 50, 431–437.
Banks, M. (2008). On the way to the web: The secret history of the Internet and its founders. New York, NY:
Springer-Verlag.
Berners-Lee, T. (2000). Weaving the web: The original design and ultimate destiny of the World Wide Web.
New York, NY: HarperBusiness.
Bruns, A. (2008). Blogs, Wikipedia, Second Life, and beyond: From production to produsage. New York, NY:
Peter Lang.
Campbell-Kelly, M., & Garcia-Swartz, D. (2013). The history of the Internet: The missing narratives. Journal
of Information Technology, 28(1), 18–33.
Carr, N. (2010). The shallows: What the Internet is doing to our brains. New York, NY: W. W. Norton.
Cohen-Almagor, R. (2011). Internet history. International Journal of Technoethics, 2(2), 45–64.
Dijck, J. van. (2013). The culture of connectivity: A critical history of social media. New York, NY: Oxford
University Press.
Hafner, K., & Lyon, M. (1996). Where wizards stay up late: The origins of the Internet. New York, NY: Simon
& Schuster.
Internet Society. (1997). Brief history of the Internet. Retrieved from http://www.internetsociety.org/internet/
what-internet/history-internet/brief-history-internet
Jamalipour, A. (2003). The wireless mobile Internet: Architectures, protocols, and services. New York, NY:
Wiley.
Kellerman, A. (2002). The Internet on earth: A geography of information. London, UK: Wiley.
Kellerman, A. (2010). Mobile broadband services and the availability of instant access to cyberspace.
Environment and Planning A, 42, 2990–3005.
Leiner, B., Cerf, V., Clark, D., Kahn, R., Kleinrock, L., Lynch, D., … Wolff, S. (2009). A brief history
of the Internet. ACM SIGCOMM Computer Communication Review, 39(5), 22–31. Retrieved from
http://www.cs.ucsb.edu/~almeroth/classes/F10.176A/papers/internet-history-09.pdf
Mowery, D., & Simcoe, T. (2002). Is the Internet a U.S. invention? An economic and technological history of
computer networking. Research Policy, 31, 1369–1387.
Mueller, M. L. (2002). Ruling the root: Internet governance and the taming of cyberspace. Cambridge, MA: MIT
Press.
Mueller, M. L. (2010). Networks and states: The global politics of Internet governance. Cambridge, MA: MIT
Press.
Murphy, B. (2002). A critical history of the Internet. In G. Elmer (Ed.), Critical perspectives on the Internet (pp.
27–45). Lanham, MD: Rowman & Littlefield.
New Media Institute. (n.d.). History of the Internet. Retrieved from http://newmedia.org/history-of-the-internet.html?page=1
Ritzer, G., & Jurgenson, N. (2010). Production, consumption, prosumption: The nature of capitalism in the
age of the digital “prosumer.” Journal of Consumer Culture, 10(1), 13–26.
Rogers, J. (1998). Internetworking and the politics of science: NSFNET in Internet history. Information
Society, 14(3), 213–228.
Ryan, J. (2013). The history of the Internet and the digital future. New York, NY: Reaktion Books.
Salus, P. (1995). Casting the net: From ARPANET to Internet and beyond. Reading, MA: Addison-Wesley.
Shah, R., & Kesan, J. (2007). The privatization of the Internet’s backbone network. Journal of Broadcasting &
Electronic Media, 51(1), 93–109.
Waldrop, M. (2008). DARPA and the Internet revolution: 50 years of bridging the gap. DARPA. Retrieved from http://www.darpa.mil
Warf, B. (2013). Global geographies of the Internet (SpringerBriefs in Geography). Dordrecht, Netherlands:
Springer.
Warf, B. (2014). Spaces of telemediated sociability. In P. C. Adams, J. Craine, & J. Dittmer (Eds.), The Ashgate research companion to media geography (pp. 291–310). Abingdon, UK: Routledge.
Warf, B. (2016). e-Government in Asia: Origins, politics, impacts, geographies. Cambridge, MA: Chandos.
Encyclopedia of Transportation: Social Science and Policy
Historic Transportation Facilities
Edited by: Mark Garrett
Book Title: Encyclopedia of Transportation: Social Science and Policy
Chapter Title: "Historic Transportation Facilities"
Pub. Date: 2014
Publishing Company: SAGE Publications, Inc.
City: Thousand Oaks
Print ISBN: 9781452267791
Online ISBN: 9781483346526
DOI: https://dx.doi.org/10.4135/9781483346526.n270
Print pages: 759-761
© 2014 SAGE Publications, Inc. All Rights Reserved.
One of America's earliest transportation images is Paul Revere riding his horse to warn patriots that the
British were coming, while 21st-century transportation imagery boasts superhighways, supersonic jets, and
the prospect of commercial space travel. Although the nation has progressed light-years from the era of
horses, canals, steam engines, Model Ts, and monoplanes, artifacts from the transportation past still dot the
landscape, some recycled and some repurposed. Barges still ply a few of the old canal networks, and pioneer
roadways such as the Lincoln Highway and Route 66 have been incorporated into existing highway systems.
Abandoned railroad beds have been repurposed into recreational trails. Rockets leave vapor trails in the sky
as they speed toward the stratosphere, while airplanes belonging to Wilbur and Orville Wright and Charles
Lindbergh are displayed in the National Air and Space Museum.
Canals
Canals have played an important part in American transportation, and thousands of them still operate as part
of the U.S. maritime grid. Built between 1817 and 1825, the Erie Canal runs about 363 miles from the Hudson
River in Albany, New York, to Lake Erie at Buffalo, New York. Containing 36 locks, the Erie Canal provides a
navigable water route from the Atlantic Ocean to the Great Lakes.
The Erie Canal was the first transportation system between New York City and the Great Lakes that did
not require portage; using the waterway instead of wagons to move freight cut transport costs by about 95
percent. The Erie Canal stimulated a population explosion in western New York State, opened midwestern
regions to settlement, and elevated New York City to the chief port in the United States. Between 1834 and
1862, the Erie Canal was enlarged, but in 1918, the larger New York State Barge Canal replaced it. Today,
the Erie Canal is part of the New York State Canal System.
In 2000 the U.S. Congress designated the Erie Canalway a National Heritage Corridor to recognize its
national significance and its place as one of the most important civil engineering and construction
accomplishments in North America. Commercial traffic increased on the Erie Canal in 2008, but recreational
boats still provide most of its traffic load.
In 1823, the state of Illinois created a Canal Commission to oversee the design and construction of the
Illinois and Michigan Canal, finished in 1848 at a cost of $6.5 million. The Illinois and Michigan
Canal furnished the first complete water route from the East Coast to the Gulf of Mexico by connecting Lake
Michigan to the Mississippi River through the Illinois River. The canal begins at the Chicago River's south
branch at Bridgeport and extends 96 miles to the Illinois River at LaSalle. Mules and horses toiling along the
tow paths pulled the barges that composed the canal's commercial traffic.
With the completion of the Illinois Waterway in 1933, the Illinois and Michigan Canal was closed to navigation
but was soon developed for recreation. Visitors can follow the Illinois and Michigan Canal State Trail along
the old tow path from Rockdale to LaSalle along 61.5 miles of scenic views of the Des Plaines and Illinois
Rivers. Numerous state parks, historical sites, and abundant wildlife as well as distinctive landscapes line its
banks. Interpretive programs and information centers are located at key places along the trail.
Railroads
As the major corporations of the 19th century, railroads brought sweeping social, economic, and political
change to the United States. They sparked a building boom of bridges, depots, and towns and allowed
ruthless rail magnates and magnificent steam locomotives to rule the country.
Railroad names like the Baltimore & Ohio, the South Carolina Canal and Rail Road Company, and the
Mohawk & Hudson Railroad became familiar to Americans. One of the greatest technological advances of
the time occurred at Promontory Summit in the Utah Territory, when the Central Pacific Railroad, building eastward from Sacramento, California, and the Union Pacific, building westward from Omaha, Nebraska, met there on May 10, 1869, linking the eastern and western coasts with slender steel rails.
an efficient transportation network that crisscrossed the country, but the advent of cars, buses, trucks, and
aircraft provided cutthroat and ultimately decimating competition. The amount of track in the United States
peaked at 254,261 miles in 1916, but by 2007 it had declined to 140,695 miles.
Steam Locomotives
Steam locomotives evolved along with railroads. Originally used to haul coal and other materials, railroads
quickly adapted steam engines to haul passenger trains. Steam engines became larger, more powerful, and
faster, their two-cylinder design expanding to four cylinders. Companies added gears to locomotives that
operated on industrial, mining, quarry, or logging railways. From the 1930s to the 1950s, diesel and electric
trains gradually replaced steam locomotives.
A marker stands at the site of the western terminus of the Lincoln Highway in
San Francisco's Lincoln Park. A street sign also marks the highway's starting
point in Times Square in New York.
In the 21st century, steam locomotives are used for historical, educational, or entertainment functions.
Railroad museums feature exhibits built around the history of steam engines, including the preserved and
operational steam locomotives themselves. Many museums and railroad societies offer steam engine trips,
including Steamtown in Scranton, Pennsylvania, and the Pacific Southwest Railway Museum in Campo,
California.
Rails-to-Trails
A rails-to-trails movement that began in the 1960s to convert abandoned or unused rail corridors into public trails has grown into a social and ecological network. Once unused tracks disappeared, people started walking the old grades, socializing and discovering the countryside; in the winter, they skied or snowshoed the corridors. Half a century later, some 15,000 miles of rail trails serving over 100 million users a year contribute to the national trails system.
The 237-mile Katy Trail stretches across most of Missouri, with over half of it following the path of Lewis and Clark along the Missouri River's towering bluffs. It is America's longest rail-to-trail route, following the former Missouri-Kansas-Texas (Katy) rail line. After it leaves the Missouri River, the trail winds through farmland and small towns, providing ideal terrain for hiking, running, cycling, and, on the Tebbetts-Portland section, horseback riding.
Moving Down the Highway
Henry Ford did not create automobile technology, but he did perfect and popularize it by introducing the
assembly line in automobile manufacturing and offering a $5-a-day wage that he hoped would increase
worker productivity and encourage those workers to use part of their earnings to buy a new car. These
innovations helped make the automobile accessible to the American public. By the end of the 1920s the
number of registered drivers had reached 23 million, even though Ford's Model T could be ordered only in black, at a cost of $490. In 2007 the U.S. Department of Transportation estimated that there were 254.4
million registered passenger vehicles in the United States.
When Henry Ford commercialized the automobile in the early 20th century, the United States already
possessed a network of auto trails often marked and maintained by organizations of private citizens. The
National Road, the Lincoln Highway, the Dixie Highway, and Route 66 are just a few of America's vintage
roads that connect the present with the past.
In 1806 Congress passed an act creating the National Road and President Thomas Jefferson signed the act
into law, allowing for a road connecting the Atlantic Ocean and the Ohio River. Historically known by various
names, including the Cumberland Road, National Pike, and Western Pike, the National Road eventually ran
approximately 800 miles from Baltimore, Maryland, through Pennsylvania, West Virginia, Ohio, and Indiana,
officially ending at the old statehouse in Vandalia, Illinois. With the advent of the Federal Highway System in
the late 1920s, the government folded the National Road into its design for U.S. Route 40. In the era of the
transcontinental and interstate highway systems much of Route 40 disappeared into state highway systems,
with Interstate Highways 70 and 68 paralleling the National Road, although careful scrutiny of the roadway
still reveals traces of earlier paths.
The Lincoln Highway Association maintains the Lincoln Highway, which stretches from Times Square in
New York City to Lincoln Park in San Francisco, and at times in its history passed through 14 states, 128
counties, and more than 700 cities, towns, and villages. Formally dedicated October 31, 1913, the Lincoln
Highway is a national memorial to President Abraham Lincoln and one of the first automobile roads across
America, bringing economic prosperity to the places it touched. It wends its way through historic places,
several national parks, and a few major cities.
Although the Lincoln Highway has been reconfigured over its 100-year life span and parts of it have been incorporated into state and interstate highway systems, much of it as U.S. Route 30, remnants of the route still exist across America. In 1992, the Lincoln Highway Association re-formed with the mission of identifying,
preserving, and improving access to the Lincoln Highway. The Lincoln Highway is on the National Register
of Historic Places, and motorists from all over the world drove vintage automobiles along it to celebrate its
centennial in June 2013.
The Lincoln Highway inspired the slightly younger Dixie Highway. In 1914, the Dixie Highway Association
planned for this road to be part of the National Auto Trail system connecting the U.S. Midwest with the
southern United States. Constructed and expanded from 1915 to 1927, the Dixie Highway evolved into a
network of connected paved roads instead of a single highway. In 1927, the Dixie Highway Association
disbanded and the Dixie Highway became part of the U.S. Route system, with some portions converted to state roads. Several states continue to identify sections of the route as the "Dixie" or "Old Dixie" Highway.
Congress passed the Federal Highway Act of 1921 to establish a coordinated network of roads, and by the late 1920s a system of numbered U.S. highways had replaced the auto trails. U.S. Route 66, affectionately known
as the “Main Street of America,” was one of the original numbered highways. Established on November 11,
1926, Route 66 originally ran 2,448 miles from Chicago through Missouri, Kansas, Oklahoma, Texas, New
Mexico, and Arizona to its destination in Los Angeles, California.
Communities along Route 66 prospered as the highway grew in popularity, and the route underwent many improvements and realignments until June 27, 1985, when the U.S. government removed it from the U.S. highway system after newer roadways in the Interstate Highway System had replaced it. The U.S. Department of Transportation later designated Route 66 a National Scenic Byway, renaming it Historic Route 66, and several states have incorporated bypassed sections of the old highway into their road networks as State Route 66. Preservationists have restored gas stations, motels, and other businesses along the old route, and every year, maps in hand to make the right connections, travelers from across the United States and the world motor along the original Route 66.
Airplanes and Spaceships
On December 17, 1903, at Kitty Hawk, North Carolina, Wilbur and Orville Wright flew the first heavier-than-air
machine in a controlled, sustained flight with a pilot aboard. Designed and built by the Wright Brothers, who
solved the basic problems of mechanical flight (lift, propulsion, and control), the Wright Flyer's longest flight
measured 852 feet and lasted 59 seconds.
World War I aviation technology, Colonel Billy Mitchell, and Charles Lindbergh advanced the cause of aviation
during the 1920s and 1930s. Air power dominated World War II, and technological advances in aviation after
World War II ushered in the jet and space ages. The Yankee Air Museum at Willow Run in Belleville, Michigan,
houses a Willow Run bomber and other World War II airplanes, and the Smithsonian Institution's National
Air and Space Museum in Washington, D.C., displays the Wright Flyer and Charles Lindbergh's Spirit of St.
Louis.
The 21st-century space age features National Aeronautics and Space Administration (NASA) solar system missions, and NASA's Voyager 1 space probe, now more than 11 billion miles from Earth, has crossed into interstellar space. Back on Earth, companies like the California-based Space Exploration Technologies Corporation (SpaceX) are researching the parameters of commercial space travel, and the U.S. Astronaut Hall of Fame in Titusville, Florida, displays astronaut Gus Grissom's Mercury-Redstone 4 (MR-4) space suit as a reminder of the price of the transportation revolution.
• transportation
• canals
Kathy Warnes, Independent Scholar
http://dx.doi.org/10.4135/9781483346526.n270
See Also:
• Automobile Culture
• Cable Cars
• Ferries
• Funicular Railways
• Recreational Travel
• Scenic Roads and Parkways
• Space Travel, Commercial
• Train Stations, Economics of
Further Readings
Grant, R. G. The Complete History of Flight. New York: DK Publishing, 2007.
McCalley, Bruce. Model T Ford: The Car That Changed the World. Iola, WI: Krause Publications, 1994.
Wiatrowski, Claude. Railroads Across North America: An Illustrated History. Minneapolis, MN: Voyageur Press, 2007.
Technology and Event Introduction
The seamless integration of technology into human existence accounts for society's increasing dependence on it. Technology influences a wide range of social, cultural, and global events, which in turn affect coexistence in contemporary digital society. One important social event that involves technology is the MeToo movement.
The MeToo movement is a landmark social event that marked a turning point in the response to gender-based violence in the United States. The movement stemmed from protests against rape culture and the sexual abuse that women experience. Tarana Burke first used the phrase "Me Too" on Myspace in 2006 after surviving a sexual assault, and that moment sparked the use of the phrase as a way to empower survivors on social media (Loney-Howes et al., 2021). Despite the movement's relevance, its benefits were unevenly distributed because of the digital divide between rural and urban populations (Vogels, 2021). This gap was most pronounced among individuals in rural America with limited access to technology (McGivern, n.d.).
Social media technology was the driving force behind the event, as shown by the viral spread of the hashtag across social media in 2017 (Loney-Howes et al., 2021). Prominent public figures also supported the cause online, building further momentum to end sexual violence, and platforms such as Twitter were vital in extending the movement's global reach. Technology supported the event in unexpected ways as well, with broadcast media amplifying information about it. In this way, the event ultimately involved people of all backgrounds, even though the violence it addresses falls disproportionately on women.
Technology significantly affects the lives of those touched by sexual violence by enabling their participation in social media empowerment. It has promoted communication about the prevalence of sexual violence in society, allowing survivors to share their trauma and receive help (Loney-Howes et al., 2021). Technology also offers survivors the option of anonymity, which fosters positive help-seeking behavior among victims of sexual assault.
The social sciences lens is the most practical lens through which to analyze the influence of technology on the MeToo movement. This general education interdisciplinary lens focuses on societies and the relationships among the people within them (Turner & Baker, 2019). Given the social basis of this event, the lens provides more appropriate insights than history, the natural and applied sciences, or the humanities. The social sciences lens also draws on cultural artifacts to deepen understanding of human interaction and social harmony. Selecting it was not straightforward, however, because it intersects with other lenses such as history and the humanities; there is considerable overlap between the study of culture in the humanities and the study of human interaction and behavior in the social sciences.
My thesis statement would read roughly as follows: Analyzing the role of technology in the progress of the MeToo movement through the lens of the social sciences is essential to understanding how a supportive social media culture can foster the mental wellness of survivors.
References
Loney-Howes, R., Mendes, K., Fernández Romero, D., Fileborn, B., & Núñez Puente, S. (2021). Digital footprints of #MeToo. Feminist Media Studies, 1-18. https://doi.org/10.1080/14680777.2021.1886142
McGivern, R. (n.d.). Chapter 8: Media and technology. In Introduction to Sociology (2nd Canadian ed.).
Turner, J. R., & Baker, R. M. (2019). Complexity theory: An overview with potential applications for the social sciences. Systems, 7(1), 4. https://doi.org/10.3390/systems7010004
Vogels, E. A. (2021). Some digital divides persist between rural, urban, and suburban America. Pew Research Center.