STUDENT INSTRUCTIONS for Week 2
Here are your instructions for week 2:
Step 1. Watch the “Week 2 Introduction” video from Prof. Ruberg.
Step 2. Read “History of Interactive Digital Media” by Julia Griffey (in the Files section on Canvas). While you
read, pay attention to:
How did the machines that would become computers evolve over time? What historical forces
were driving their advancement?
What were some key changes to the computer that made it like the computers you know today?
Step 3. Watch this week’s screening, The Imitation Game (2014). Please find this film and watch it on
your own. While you watch, consider:
How does the film choose to represent this key moment in the history of computing? What are
other ways it could have chosen to tell this story?
How is the figure of Alan Turing represented? How does the film feel about him being “different”?
Step 4. Read “A Queer History of Computing (Part 1)” by Jacob Gaboury. Note: there is a link to part 2 at
the end of the webpage but you only need to read part 1. While you read, pay attention to:
What are some differences between how Gaboury tells the story of Turing’s life and work and the
way his story is told in The Imitation Game?
Why does Gaboury feel it is important to understand Turing as a queer (LGBTQ) figure?
Step 5. Watch the guest lecture video by Mar Hicks, “Programmed Inequality.” While you watch, pay
attention to:
In what ways do earlier ideas about coding as “unskilled,” low-value women’s work differ from
how we think about coding today?
How does the idea of institutional discrimination play out in the story Hicks tells?
Step 6. Watch the “Week 2 Lecture” video from Prof. Ruberg.
Step 7. Write and submit your weekly reflection (due Saturday, April 11 by 11:59 pm). This should be an
original piece of writing, 500 to 700 words in length. Select 2 of the 3 prompts below and write 1
paragraph for each, for a total of 2 paragraphs. Please write in full sentences and proofread:
Prompt 1 (readings): In “History of Interactive Digital Media,” Julia Griffey describes some of the
major changes that have shaped digital media and the internet since the start of the 2010s (see
the pages marked 42 - 44 in the PDF). Pick one of the phenomena that Griffey describes,
paraphrase it in your own words, and explain how and why it has been an important part of your
own life experiences with digital media.
Prompt 2 (guest lecture): In “Programmed Inequality,” Mar Hicks explains the history of
computing in ways that might surprise people in the present-day. What is one thing that Hicks
talked about that you didn’t know before? For example, think about the kinds of people who did
computer work and how that work was imagined. Why is your new perspective important?
Prompt 3 (screening): How does The Imitation Game represent the relationship between Turing’s
sexuality and the history of computing? For example, does the film seem to take the perspective
that Turing’s being gay did, or did not, matter for his innovative work on technology?
Illustrate your point with reference to specific scenes, shots, lines, etc. from the film.
Step 8. Read and respond to one of your peers’ reflections. Each response should be 100 - 200 words in
length. Consider commenting on the following:
What points did your peer make that you hadn’t thought of? What makes them interesting?
Is there something in your peer’s response you disagree with? Be respectful and constructive.
Does your peer’s response make you think of any new topics that connect to the course material?
Interactive Digital Media
Concept and Practice
First published 2020
by Routledge
52 Vanderbilt Avenue, New York, NY 10017
and by Routledge
2 Park Square, Milton Park, Abingdon, Oxon, OX14 4RN
Routledge is an imprint of the Taylor & Francis Group, an informa business
© 2020 Taylor & Francis
The right of Julia Griffey to be identified as author of this work has
been asserted by her in accordance with sections 77 and 78 of the
Copyright, Designs and Patents Act 1988.
All rights reserved. No part of this book may be reprinted or
reproduced or utilised in any form or by any electronic,
mechanical, or other means, now known or hereafter invented,
including photocopying and recording, or in any information
storage or retrieval system, without permission in writing
from the publishers.
Trademark notice: Product or corporate names may be
trademarks or registered trademarks, and are used only
for identification and explanation without intent to infringe.
Library of Congress Cataloging-in-Publication Data
A catalog record for this title has been requested
ISBN: 978-0-367-14862-1 (hbk)
ISBN: 978-0-367-14863-8 (pbk)
ISBN: 978-0-429-05365-8 (ebk)
Typeset in Univers
by Integra Software Services Pvt. Ltd.
History of Interactive Digital Media
Interactive media applications have become ubiquitous and impact many aspects
of life today. We are so reliant on these technologies, it is hard to believe that
they have only been around for a few decades. Within this time, ideas such as
user-controlled interlinked media that once seemed far-fetched have become
technically feasible. To make it all happen, however, took impressive creative
thinking, collaboration and advances in computer hardware and networking.
ORIGINS OF THE COMPUTER
The computer is the most essential development and delivery tool for all forms
of interactive media. While much of our interactive media is now consumed on
handheld devices, these are still technically computers, albeit very small ones.
The evolution of the computer was not driven by a desire to create and deliver
interactive media. Instead, it was conceived as a timesaving device for making
difficult mathematical calculations.
The analytical machine weaves algebraic patterns just as the Jacquard loom
weaves flowers and leaves.
Ada Lovelace, 1843 (Evans, 2018, p. 20)
Countess Ada Lovelace was a well-educated, mathematically minded,
Victorian-era British woman who first learned of mathematician and inventor
Charles Babbage and his amazing machines by visiting one of his weekly salons.
Babbage routinely threw grand social events and showed off his latest inventions
and acquisitions as entertainment. Babbage’s most notable accomplishment
was building a machine called the Difference Engine, which tabulated polynomial
functions using the method of finite differences. (The Jacquard weaving machines
that received instructions from punch cards and created woven designs as
prescribed would later inspire the punch-card programs of his Analytical Engine.)
Comprising at least 8,000 individual components, the Difference Engine was
“programmed” by setting initial values on the wheels and then turning a crank
to get the results of the mathematical equation. The machine was far ahead of
its time. Beyond its
function as a great party trick, it impressed the British government enough
that it offered Babbage funds to further develop the machine.
Lovelace became enraptured with Babbage and his machines, and the two
began a correspondence. She started making notes for Babbage, who had moved
on from refining the far-from-perfect Difference Engine and had begun trying to
build a bigger and better machine that he called the “Analytical Engine.” In poetic
fashion, Ada described both what the Analytical Engine could do and the further
potential it held. Some believe that Ada understood the significance of Babbage’s
invention more than Babbage himself. In her writing, Ada described the Analytical
Engine as a machine that could “be programmed and reprogrammed to do a
limitless and changeable array of tasks” (Isaacson, 2014, p. 25). She explained
that “anything that could be represented symbolically—numbers, logic, and even
music—could pass through the machine and do wondrous things” (Evans, 2018,
p. 20). Essentially, she described a modern computer in the year 1843.
Computer technology would stay at a standstill for almost another century
until a visionary British mathematician named Alan Turing began writing about
a device that could do so much more than make calculations. In fact, he described
a machine that looked and acted a whole lot like a modern-day computer,
using language even more precise than that of Lovelace. For example, Turing
described a “read/write head” on a machine that would operate much like the
head of a hard drive on a modern computer. And he did so somewhat
unintentionally.
In 1937, while Turing was a graduate student, he published an article in the Proceedings of the London Mathematical Society titled: “On
Computable Numbers, with an Application to the Entscheidungsproblem.” The
Entscheidungsproblem (the decision problem) was a question posed by mathematical
genius David Hilbert in 1928, which asked: “was there some procedure
that could determine whether a particular statement was provable?” (Isaacson,
2014, p. 43). Turing argued that there was not. To make his argument, Turing
described a machine that could carry out any step-by-step procedure. If the
machine came to a solution, it would halt; otherwise it would loop forever and
never reach a solution. Crucially, Turing showed that no general procedure can
decide in advance which of the two will happen.
While Turing’s proof was ingenious, the greater significance of his paper was that
he “put into circulation the first really usable model of a computing machine”
(Leavitt, 2006, p. 105). His visionary, theoretical machine became known as the
“Turing Machine,” and Alan Turing, “the father of the modern-day computer.”
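Turing’s halt-or-loop machine can be made concrete with a short simulation. The following is an illustrative sketch only: the simulator and its bit-flipping rule table are invented for this example and are not drawn from Griffey’s text or Turing’s paper.

```python
# A minimal Turing machine simulator (illustrative sketch, not Turing's own
# formalism). The example machine below flips every bit on the tape and
# halts when it reaches a blank cell.

def run_turing_machine(tape, rules, state="start", max_steps=1000):
    """Run a one-tape Turing machine; return (tape_dict, halted?)."""
    tape = dict(enumerate(tape))      # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            return tape, True
        symbol = tape.get(head, "_")  # "_" stands for a blank cell
        new_symbol, move, state = rules[(state, symbol)]
        tape[head] = new_symbol       # the "write" of the read/write head
        head += 1 if move == "R" else -1
    return tape, False                # never halted within the step budget

# Each rule: (state, symbol read) -> (symbol to write, head move, next state)
flip_bits = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),   # blank cell reached: stop
}

tape, halted = run_turing_machine("1011", flip_bits)  # halts with tape "0100"
```

This machine always halts, but for other rule tables the loop may never terminate, which is exactly the behavior Turing’s argument turns on.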
While Turing and Babbage never built their visionary machines, their work
inspired the next generation of inventors. In 1937, a Harvard University physics
doctoral student named Howard Aiken discovered a demonstration model of
Babbage’s Difference Engine in the attic of a building on campus and thought
that it would be the perfect tool to help him make complex calculations required
for his research. He successfully lobbied Harvard to further develop Babbage’s
Difference Engine in conjunction with IBM, resulting in a new machine called
the “Mark I,” which made its debut in 1944. The Mark I was a functional, fully
automatic “five-ton calculator” that could receive instructions from a paper tape
(McCartney, 1999, p. 26). However, it was not electronic, only electromechanical.
In the early 1940s, a few other independent inventors around the world
developed computing machines, each with different strengths, weaknesses and
levels of functionality. The focus, primarily, for these machines was to make mathematical calculations which were of utmost importance to the Allies in the midst
of World War II. At that time, it could take a week or more to calculate a single
ballistic trajectory. Computational power was lagging behind firepower. With so
many American men overseas, mathematically minded women were heavily
recruited into “computing” positions to help make these calculations. These
women were themselves referred to as “computers”; the machines that followed
inherited the term.
In 1943, the War was not going particularly well for the Allies; Hitler had
taken over much of Europe and Japan was holding its ground. In desperate need
of more computational power, representatives of the U.S. Army contracted with
the Moore School of Engineering at the University of Pennsylvania for assistance
in making these calculations. The University of Pennsylvania was sought after
as a computational powerhouse due to its possession of a Differential Analyzer
(another mechanical, analog computer designed to solve differential equations
and invented by MIT electrical engineering faculty member, Vannevar Bush). The
Moore School was also the employer of John Mauchly, a new professor with an
idea to build an “electronic calculator that could take the place of all the [human]
computers” (McCartney, 1999, p. 55). When the Army heard about his idea in
1943, they readily supplied the funding to bring Mauchly’s idea to fruition.
At the Moore school, Mauchly partnered with Presper Eckert, a highly
intelligent recent graduate of the electrical engineering program, and a team
of engineers to build the Electronic Numerical Integrator and Computer, which
became known as the ENIAC. While aspects of their design borrowed ideas
from other machines Mauchly had seen (which later resulted in legal disputes),
it differed from them all in that it was completely electronic and general purpose,
meaning that it could be programmed to do a variety of tasks. Ironically, the
ENIAC was ready to go to work in 1945, just after the war had ended.
While the ENIAC could run different types of programs, loading a new program involved rewiring the machine by hand. ENIAC engineers needed to figure
out how to make the computer perform a variety of tasks, so they recruited
a team of the finest computers (six women: Frances Bilas Spence, Elizabeth
Jennings, Ruth Lichterman, Kathleen McNulty, Elizabeth Snyder Holberton, and
Marlyn Wescoff Meltzer) to figure out how to program the ENIAC (Isaacson, 2014,
p. 87; McCartney, 1999, p. 95). These women essentially invented the field of
computer programming, coining terms still in use today such as “subroutine” for
a reusable block of code and “bug” for a glitch in a program. They even pulled off
a major programming victory by getting the ENIAC to calculate a ballistic trajectory just in time for its public debut. The ENIAC is significant in that it was the
world’s first general-purpose, electronic computer.
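The “subroutine” those programmers named is still the basic unit of code reuse today: a block of steps written once and invoked many times with different inputs. A minimal sketch in modern Python (a stand-in for illustration; the simplified trajectory formula is a textbook example, not the ENIAC’s actual ballistics math):

```python
# A subroutine is a reusable block of code: define the steps once, then
# invoke them repeatedly. (Python as a modern stand-in; the ENIAC team
# had to wire such sequences by hand.)

def trajectory_time(velocity_mps, gravity=9.81):
    """Seconds for a projectile fired straight up to return to the ground."""
    return 2 * velocity_mps / gravity

# The same block of steps, reused for three different launch velocities:
times = [trajectory_time(v) for v in (100, 200, 300)]
```

Rewiring the ENIAC by hand made this kind of reuse laborious, which is part of why naming and isolating repeatable sequences was such a significant invention.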
EVOLUTION OF THE COMPUTER
Throughout the 1950s computers were enormous beasts that could fill an entire
room. Their size was dictated by their contents: thousands of vacuum tubes (a
glass tube without air that contains electrodes for controlling electron flow). The
ENIAC, for example, filled a 30' × 50' room, weighed 30 tons, and was
made up of 17,468 vacuum tubes (McCartney, 1999, pp. 101–102).
The 1947 invention of the transistor by William Shockley, John Bardeen and
Walter Brattain at Bell Labs was the first step in allowing computers to get much
smaller. A transistor is a solid-state electronic switch that can be used as an alternative to a vacuum tube. It was an exciting invention because it was physically
much smaller and used a lot less power than the vacuum tube. It also generated
less heat, which kept it from failing as quickly (Watkins, n.d.).
The invention and integration of the microchip condensed the size of
computers much further. A microchip is a set of electronic circuits on one small,
flat piece of semiconductor material (normally silicon), replacing bulky individual
electronic components such as transistors, resistors and capacitors. The microchip
was conceived independently by two different people at two different locations:
Jack Kilby at Texas Instruments and Robert Noyce at Fairchild Semiconductor
(and later Intel). Their competing claims to the invention led to legal disputes;
however, both were able to prosper from their invention (iProgrammer, 2010).
Intel became a leader in microchip (and later microprocessor) technologies,
improving the speed and function of its products while making them smaller
and less expensive. Gordon Moore, Intel co-founder, made a bold prediction in
1965 after noticing this trend. He surmised that the number of components that
could fit on a chip would double every year, with computing power rising and
cost per component falling accordingly. His idea became known as “Moore’s
Law,” which proved remarkably accurate for many years to come.
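Moore’s prediction is a statement about exponential growth: n successive doublings multiply capacity by 2 to the nth power. A quick back-of-the-envelope sketch (the starting figure of 64 components is invented for the example, not taken from the text):

```python
# Illustrative sketch of exponential doubling; the starting count of 64
# components is a made-up example value, not a figure from the text.

def components_after(start, years, doubling_period_years=1):
    """Component count after repeated doubling (whole periods only)."""
    return start * 2 ** (years // doubling_period_years)

# Ten yearly doublings multiply capacity by 2**10 = 1,024:
print(components_after(64, 10))  # 64 * 1024 = 65536
```

The same function shows why the doubling period matters: stretch it to two years and the same decade yields only five doublings, a factor of 32 instead of 1,024.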
The advancement of the computer was not just about improved performance
in a smaller package; it was also about an evolving mode of interaction. Like
the Jacquard loom described by Ada Lovelace, early computers received
instructions on punch cards, which the computer would read, interpret and
respond to.
If there was an error in a program on a punch card, the card would need to be
corrected and fed back into the computer.

[Figure: A dumb terminal from the late 1970s]

By the mid-1960s, dumb terminals
(a computer station that relied on a mainframe for processing power) emerged
which changed the way in which a user could interact with a computer. This
new mode of human-computer interaction facilitated a real-time conversation
between the computer and the user. However, because the mainframe computer was being accessed by many users simultaneously, computer time was
limited and therefore precious and expensive. The lucky few who had access to
computers got a leg up on the competition in the growing computing industry.
In fact, access to computer time is one of the reasons why Bill Gates got so
much coding experience. The forward-thinking mothers’ club at his high school
raised money to pay for it.
Despite the fact that computers were still fairly inaccessible to the lay
person in the 1960s and early 1970s, a few radicals dared to dream of having
computing power in their own hands instead of having to share it among several
users. The idea of individuals possessing their own computers went hand-in-hand
with the prevailing “power to the people” philosophy of the 1960s. It
is not a coincidence that the California Bay Area was the mecca for both
counterculture and computer technology. In the Bay Area, hippies intermixed
with techies and formed groups such as the Palo Alto, California-based
Homebrew Computer Club, frequented by Steve Jobs and Steve Wozniak. This
mishmash of hippies and nerds was excited about getting computing technology
into the hands of the people. And it would take only a few years for that dream
to become a reality.
THE COMPUTER GETS PERSONAL
By the early 1970s, new technology companies such as Intel and Atari were
establishing roots in the valley south of San Francisco thanks to an investment
by Stanford University in a corporate park not far from the university. The area
became known as “Silicon Valley” after reporter Don Hoefler coined the term in
a newspaper article about the local computer chip companies emerging in the
area. However, despite the fact that so much high-tech energy was concentrated
south of San Francisco, the first personal computer emerged 1,000 miles away
in an Albuquerque, New Mexico strip mall (Bernard, 2017).
Ed Roberts was a serial entrepreneur who (out of a desperate need to
rejuvenate his faltering calculator business) came up with the idea to build a
computer that hobbyists could assemble themselves. Roberts introduced the
first personal computer to the world: the Altair 8800, named after a star
mentioned in the Star Trek series and after the Intel microprocessor inside it.
Expecting to sell a few hundred units, Roberts’ company (called MITS) sold
hundreds each day and struggled
to keep up with the demand. The Altair was featured on the January 1975 cover
of the magazine Popular Electronics, which drove up demand even further and
became a rallying cry for computer geeks on the sidelines who dreamt of getting
in on the personal computer revolution.
[Figure: The Altair 8800]
When the issue of Popular Electronics hit newsstands, Bill Gates was a
junior at Harvard University. Already a proficient coder, Gates took on projects
with his older, more mature business partner, Paul Allen, and the duo realized
that this was their moment to capitalize on the personal computer revolution. ...