Creativity, Inc. Book Chapters 5-9


Description

Read Creativity, Inc. from chapter 5 through chapter 9 and write a paper of 3 to 3.5 pages (no longer than 4 pages), in Times New Roman font, size 12, double-spaced. Follow my professor’s instructions below:

"Pixar seems one of the few companies that consistently delivers fresh creativity. At the beginning of your paper, list one or more core competences that you believe Pixar has.

Next, identify some resources, actions, and policies in Creativity, Inc. that you think were particularly significant in creating and supporting these core competences and producing successes. You should then choose one of these resources, actions, or policies, identify a chapter where it is discussed, and write a paragraph explaining why you think this resource, action, or policy was important.

Finally, write a discussion of whether the behaviors you have discussed would be important in industries where creativity is less central (such as furniture retailing). If you believe that such creativity-producing behavior is important to such companies, discuss one way in which it could help IKEA or another firm you are familiar with. If you believe such creativity-oriented behavior is not important to such companies, discuss what you believe is important to companies like IKEA that is not so central to Pixar. Remember, this is a Strategic Management class; you should apply what you have studied in it to your paper as well."

I will provide the book PDF below. Please read it and write the paper carefully, because my professor is a hard grader. If you use sources, please cite them in the paper. Thank you.

Unformatted Attachment Preview

Copyright © 2014 by Edwin Catmull. All rights reserved. Published in the United States by Random House, an imprint and division of Random House LLC, a Penguin Random House Company, New York. RANDOM HOUSE and the HOUSE colophon are registered trademarks of Random House LLC.

LIBRARY OF CONGRESS CATALOGING-IN-PUBLICATION DATA
Catmull, Edwin E.
Creativity, Inc.: overcoming the unseen forces that stand in the way of true inspiration / Ed Catmull with Amy Wallace.
pages cm
ISBN 978-0-8129-9301-1
eBook ISBN 978-0-679-64450-7
1. Creative ability in business. 2. Corporate culture. 3. Organizational effectiveness. 4. Pixar (Firm) I. Wallace, Amy. II. Title.
HD53.C394 2014 658.4′0714—dc23 2013036026
www.atrandom.com
Jacket design: Andy Dreyfus
Jacket illustration: © Disney • Pixar
v3.1

CONTENTS

Cover
Title Page
Copyright
Introduction: Lost and Found

PART I: GETTING STARTED
Chapter 1: Animated
Chapter 2: Pixar Is Born
Chapter 3: A Defining Goal
Chapter 4: Establishing Pixar’s Identity

PART II: PROTECTING THE NEW
Chapter 5: Honesty and Candor
Chapter 6: Fear and Failure
Chapter 7: The Hungry Beast and the Ugly Baby
Chapter 8: Change and Randomness
Chapter 9: The Hidden

PART III: BUILDING AND SUSTAINING
Chapter 10: Broadening Our View
Chapter 11: The Unmade Future

PART IV: TESTING WHAT WE KNOW
Chapter 12: A New Challenge
Chapter 13: Notes Day

Afterword: The Steve We Knew
Starting Points: Thoughts for Managing a Creative Culture
Photo Insert
Dedication
Acknowledgments
About the Authors

INTRODUCTION: LOST AND FOUND

Every morning, as I walk into Pixar Animation Studios—past the twenty-foot-high sculpture of Luxo Jr., our friendly desk lamp mascot, through the double doors and into a spectacular glass-ceilinged atrium where a man-sized Buzz Lightyear and Woody, made entirely of Lego bricks, stand at attention, up the stairs past sketches and paintings of the characters that have populated our fourteen films—I am struck by the unique culture that defines this place. Although I’ve made this walk thousands of times, it never gets old.

Built on the site of a former cannery, Pixar’s fifteen-acre campus, just over the Bay Bridge from San Francisco, was designed, inside and out, by Steve Jobs. (Its name, in fact, is The Steve Jobs Building.) It has well-thought-out patterns of entry and egress that encourage people to mingle, meet, and communicate. Outside, there is a soccer field, a volleyball court, a swimming pool, and a six-hundred-seat amphitheater. Sometimes visitors misunderstand the place, thinking it’s fancy for fancy’s sake. What they miss is that the unifying idea for this building isn’t luxury but community. Steve wanted the building to support our work by enhancing our ability to collaborate.

The animators who work here are free to—no, encouraged to—decorate their work spaces in whatever style they wish. They spend their days inside pink dollhouses whose ceilings are hung with miniature chandeliers, tiki huts made of real bamboo, and castles whose meticulously painted, fifteen-foot-high styrofoam turrets appear to be carved from stone. Annual company traditions include “Pixarpalooza,” where our in-house rock bands battle for dominance, shredding their hearts out on stages we erect on our front lawn. The point is, we value self-expression here.
This tends to make a big impression on visitors, who often tell me that the experience of walking into Pixar leaves them feeling a little wistful, like something is missing in their work lives—a palpable energy, a feeling of collaboration and unfettered creativity, a sense, not to be corny, of possibility. I respond by telling them that the feeling they are picking up on—call it exuberance or irreverence, even whimsy—is integral to our success. But it’s not what makes Pixar special. What makes Pixar special is that we acknowledge we will always have problems, many of them hidden from our view; that we work hard to uncover these problems, even if doing so means making ourselves uncomfortable; and that, when we come across a problem, we marshal all of our energies to solve it. This, more than any elaborate party or turreted workstation, is why I love coming to work in the morning. It is what motivates me and gives me a definite sense of mission. There was a time, however, when my purpose here felt a lot less clear to me. And it might surprise you when I tell you when.

On November 22, 1995, Toy Story debuted in America’s theaters and became the largest Thanksgiving opening in history. Critics heralded it as “inventive” (Time), “brilliant” and “exultantly witty” (The New York Times), and “visionary” (Chicago Sun-Times). To find a movie worthy of comparison, wrote The Washington Post, one had to go back to 1939, to The Wizard of Oz.

The making of Toy Story—the first feature film to be animated entirely on a computer—had required every ounce of our tenacity, artistry, technical wizardry, and endurance. The hundred or so men and women who produced it had weathered countless ups and downs as well as the ever-present, hair-raising knowledge that our survival depended on this 80-minute experiment. For five straight years, we’d fought to do Toy Story our way. We’d resisted the advice of Disney executives who believed that since they’d had such success with musicals, we too should fill our movie with songs. We’d rebooted the story completely, more than once, to make sure it rang true. We’d worked nights, weekends, and holidays—mostly without complaint. Despite being novice filmmakers at a fledgling studio in dire financial straits, we had put our faith in a simple idea: If we made something that we wanted to see, others would want to see it, too. For so long, it felt like we had been pushing that rock up the hill, trying to do the impossible. There were plenty of moments when the future of Pixar was in doubt. Now, we were suddenly being held up as an example of what could happen when artists trusted their guts.

Toy Story went on to become the top-grossing film of the year and would earn $358 million worldwide. But it wasn’t just the numbers that made us proud; money, after all, is just one measure of a thriving company and usually not the most meaningful one. No, what I found gratifying was what we’d created. Review after review focused on the film’s moving plotline and its rich, three-dimensional characters—only briefly mentioning, almost as an aside, that it had been made on a computer. While there was much innovation that enabled our work, we had not let the technology overwhelm our real purpose: making a great film.

On a personal level, Toy Story represented the fulfillment of a goal I had pursued for more than two decades and had dreamed about since I was a boy. Growing up in the 1950s, I had yearned to be a Disney animator but had no idea how to go about it.
Instinctively, I realize now, I embraced computer graphics—then a new field—as a means of pursuing that dream. If I couldn’t animate by hand, there had to be another way. In graduate school, I’d quietly set a goal of making the first computer-animated feature film, and I’d worked tirelessly for twenty years to accomplish it. Now, the goal that had been a driving force in my life had been reached, and there was an enormous sense of relief and exhilaration—at least at first.

In the wake of Toy Story’s release, we took the company public, raising the kind of money that would ensure our future as an independent production house, and began work on two new feature-length projects, A Bug’s Life and Toy Story 2. Everything was going our way, and yet I felt adrift. In fulfilling a goal, I had lost some essential framework. Is this really what I want to do? I began asking myself. The doubts surprised and confused me, and I kept them to myself. I had served as Pixar’s president for most of the company’s existence. I loved the place and everything that it stood for. Still, I couldn’t deny that achieving the goal that had defined my professional life had left me without one. Is this all there is? I wondered. Is it time for a new challenge?

It wasn’t that I thought Pixar had “arrived” or that my work was done. I knew there were major obstacles in front of us. The company was growing quickly, with lots of shareholders to please, and we were racing to put two new films into production. There was, in short, plenty to occupy my working hours. But my internal sense of purpose—the thing that had led me to sleep on the floor of the computer lab in graduate school just to get more hours on the mainframe, that kept me awake at night, as a kid, solving puzzles in my head, that fueled my every workday—had gone missing. I’d spent two decades building a train and laying its track. Now, the thought of merely driving it struck me as a far less interesting task. Was making one film after another enough to engage me? I wondered. What would be my organizing principle now? It would take a full year for the answer to emerge.

From the start, my professional life seemed destined to have one foot in Silicon Valley and the other in Hollywood. I first got into the film business in 1979 when, flush from the success of Star Wars, George Lucas hired me to help him bring high technology into the film industry. But he wasn’t based in Los Angeles. Instead, he’d founded his company, Lucasfilm, at the north end of the San Francisco Bay. Our offices were located in San Rafael, about an hour’s drive from Palo Alto, the heart of Silicon Valley—a moniker that was just gaining traction then, as the semiconductor and computer industries took off. That proximity gave me a front-row seat from which to observe the many emerging hardware and software companies—not to mention the growing venture capital industry—that, in the course of a few years, would come to dominate Silicon Valley from its perch on Sand Hill Road. I couldn’t have arrived at a more dynamic and volatile time. I watched as many startups burned bright with success—and then flamed out. My mandate at Lucasfilm—to merge moviemaking with technology—meant that I rubbed shoulders with the leaders of places like Sun Microsystems and Silicon Graphics and Cray Computer, several of whom I came to know well. I was first and foremost a scientist then, not a manager, so I watched these guys closely, hoping to learn from the trajectories their companies followed.
Gradually, a pattern began to emerge: Someone had a creative idea, obtained funding, brought on a lot of smart people, and developed and sold a product that got a boatload of attention. That initial success begat more success, luring the best engineers and attracting customers who had interesting and high-profile problems to solve. As these companies grew, much was written about their paradigm-shifting approaches, and when their CEOs inevitably landed on the cover of Fortune magazine, they were heralded as “Titans of the New.” I especially remember the confidence. The leaders of these companies radiated supreme confidence. Surely, they could only have reached this apex by being very, very good. But then those companies did something stupid—not just stupid-in-retrospect, but obvious-at-the-time stupid. I wanted to understand why. What was causing smart people to make decisions that sent their companies off the rails? I didn’t doubt that they believed they were doing the right thing, but something was blinding them—and keeping them from seeing the problems that threatened to upend them. As a result, their companies expanded like bubbles, then burst. What interested me was not that companies rose and fell or that the landscape continually shifted as technology changed but that the leaders of these companies seemed so focused on the competition that they never developed any deep introspection about other destructive forces that were at work.

Over the years, as Pixar struggled to find its way—first selling hardware, then software, then making animated short films and advertisements—I asked myself: If Pixar is ever successful, will we do something stupid, too? Can paying careful attention to the missteps of others help us be more alert to our own? Or is there something about becoming a leader that makes you blind to the things that threaten the well-being of your enterprise? Clearly, something was causing a dangerous disconnect at many smart, creative companies. What, exactly, was a mystery—and one I was determined to figure out.

In the difficult year after Toy Story’s debut, I came to realize that trying to solve this mystery would be my next challenge. My desire to protect Pixar from the forces that ruin so many businesses gave me renewed focus. I began to see my role as a leader more clearly. I would devote myself to learning how to build not just a successful company but a sustainable creative culture. As I turned my attention from solving technical problems to engaging with the philosophy of sound management, I was excited once again—and sure that our second act could be as exhilarating as our first.

It has always been my goal to create a culture at Pixar that will outlast its founding leaders—Steve, John Lasseter, and me. But it is also my goal to share our underlying philosophies with other leaders and, frankly, with anyone who wrestles with the competing—but necessarily complementary—forces of art and commerce. What you’re holding in your hands, then, is an attempt to put down on paper my best ideas about how we built the culture that is the bedrock of this place. This book isn’t just for Pixar people, entertainment executives, or animators. It is for anyone who wants to work in an environment that fosters creativity and problem solving. My belief is that good leadership can help creative people stay on the path to excellence no matter what business they’re in.
My aim at Pixar—and at Disney Animation, which my longtime partner John Lasseter and I have also led since the Walt Disney Company acquired Pixar in 2006—has been to enable our people to do their best work. We start from the presumption that our people are talented and want to contribute. We accept that, without meaning to, our company is stifling that talent in myriad unseen ways. Finally, we try to identify those impediments and fix them. I’ve spent nearly forty years thinking about how to help smart, ambitious people work effectively with one another. The way I see it, my job as a manager is to create a fertile environment, keep it healthy, and watch for the things that undermine it. I believe, to my core, that everybody has the potential to be creative—whatever form that creativity takes—and that to encourage such development is a noble thing. More interesting to me, though, are the blocks that get in the way, often without us noticing, and hinder the creativity that resides within any thriving company. The thesis of this book is that there are many blocks to creativity, but there are active steps we can take to protect the creative process. In the coming pages, I will discuss many of the steps we follow at Pixar, but the most compelling mechanisms to me are those that deal with uncertainty, instability, lack of candor, and the things we cannot see. I believe the best managers acknowledge and make room for what they do not know—not just because humility is a virtue but because until one adopts that mindset, the most striking breakthroughs cannot occur. I believe that managers must loosen the controls, not tighten them. They must accept risk; they must trust the people they work with and strive to clear the path for them; and always, they must pay attention to and engage with anything that creates fear. Moreover, successful leaders embrace the reality that their models may be wrong or incomplete. Only when we admit what we don’t know can we ever hope to learn it.

This book is organized into four sections—Getting Started, Protecting the New, Building and Sustaining, and Testing What We Know. It is no memoir, but in order to understand the mistakes we made, the lessons we learned, and the ways we learned from them, it necessarily delves at times into my own history and that of Pixar. I have much to say about enabling groups to create something meaningful together and then protecting them from the destructive forces that loom even in the strongest companies. My hope is that by relating my search for the sources of confusion and delusion within Pixar and Disney Animation, I can help others avoid the pitfalls that impede and sometimes ruin businesses of all kinds. The key for me—what has kept me motivated in the nineteen years since Toy Story debuted—has been the realization that identifying these destructive forces isn’t merely a philosophical exercise. It is a crucial, central mission. In the wake of our earliest success, Pixar needed its leaders to sit up and pay attention. And that need for vigilance never goes away. This book, then, is about the ongoing work of paying attention—of leading by being self-aware, as managers and as companies. It is an expression of the ideas that I believe make the best in us possible.

PART I: GETTING STARTED

CHAPTER 1: ANIMATED

For thirteen years we had a table in the large conference room at Pixar that we call West One. Though it was beautiful, I grew to hate this table.
It was long and skinny, like one of those things you’d see in a comedy sketch about an old wealthy couple that sits down for dinner—one person at either end, a candelabra in the middle—and has to shout to make conversation. The table had been chosen by a designer Steve Jobs liked, and it was elegant, all right—but it impeded our work. We’d hold regular meetings about our movies around that table—thirty of us facing off in two long lines, often with more people seated along the walls—and everyone was so spread out that it was difficult to communicate. For those unlucky enough to be seated at the far ends, ideas didn’t flow because it was nearly impossible to make eye contact without craning your neck. Moreover, because it was important that the director and producer of the film in question be able to hear what everyone was saying, they had to be placed at the center of the table. So did Pixar’s creative leaders: John Lasseter, Pixar’s chief creative officer, and me, and a handful of our most experienced directors, producers, and writers. To ensure that these people were always seated together, someone began making place cards. We might as well have been at a formal dinner party.

When it comes to creative inspiration, job titles and hierarchy are meaningless. That’s what I believe. But unwittingly, we were allowing this table—and the resulting place card ritual—to send a different message. The closer you were seated to the middle of the table, it implied, the more important—the more central—you must be. And the farther away, the less likely you were to speak up—your distance from the heart of the conversation made participating feel intrusive. If the table was crowded, as it often was, still more people would sit in chairs around the edges of the room, creating yet a third tier of participants (those at the center of the table, those at the ends, and those not at the table at all). Without intending to, we’d created an obstacle that discouraged people from jumping in. Over the course of a decade, we held countless meetings around this table in this way—completely unaware of how doing so undermined our own core principles. Why were we blind to this? Because the seating arrangements and place cards were designed for the convenience of the leaders, including me. Sincerely believing that we were in an inclusive meeting, we saw nothing amiss because we didn’t feel excluded. Those not sitting at the center of the table, meanwhile, saw quite clearly how it established a pecking order but presumed that we—the leaders—had intended that outcome. Who were they, then, to complain?

It wasn’t until we happened to have a meeting in a smaller room with a square table that John and I realized what was wrong. Sitting around that table, the interplay was better, the exchange of ideas more free-flowing, the eye contact automatic. Every person there, no matter their job title, felt free to speak up. This was not only what we wanted, it was a fundamental Pixar belief: Unhindered communication was key, no matter what your position. At our long, skinny table, comfortable in our middle seats, we had utterly failed to recognize that we were behaving contrary to that basic tenet. Over time, we’d fallen into a trap. Even though we were conscious that a room’s dynamics are critical to any good discussion, even though we believed that we were constantly on the lookout for problems, our vantage point blinded us to what was right before our eyes. Emboldened by this new insight, I went to our facilities department.
“Please,” I said, “I don’t care how you do it, but get that table out of there.” I wanted something that could be arranged into a more intimate square, so people could address each other directly and not feel like they didn’t matter. A few days later, as a critical meeting on an upcoming movie approached, our new table was installed, solving the problem.

Still, interestingly, there were remnants of that problem that did not immediately vanish just because we’d solved it. For example, the next time I walked into West One, I saw the brand-new table, arranged—as requested—in a more intimate square that made it possible for more people to interact at once. But the table was adorned with the same old place cards! While we’d fixed the key problem that had made place cards seem necessary, the cards themselves had become a tradition that would continue until we specifically dismantled it. This wasn’t as troubling an issue as the table itself, but it was something we had to address because cards implied hierarchy, and that was precisely what we were trying to avoid. When Andrew Stanton, one of our directors, entered the meeting room that morning, he grabbed several place cards and began randomly moving them around, narrating as he went. “We don’t need these anymore!” he said in a way that everyone in the room grasped. Only then did we succeed in eliminating this ancillary problem.

This is the nature of management. Decisions are made, usually for good reasons, which in turn prompt other decisions. So when problems arise—and they always do—disentangling them is not as simple as correcting the original error. Often, finding a solution is a multi-step endeavor. There is the problem you know you are trying to solve—think of that as an oak tree—and then there are all the other problems—think of these as saplings—that sprouted from the acorns that fell around it. And these problems remain after you cut the oak tree down. Even after all these years, I’m often surprised to find problems that have existed right in front of me, in plain sight. For me, the key to solving these problems is finding ways to see what’s working and what isn’t, which sounds a lot simpler than it is. Pixar today is managed according to this principle, but in a way I’ve been searching all my life for better ways of seeing. It began decades before Pixar even existed.

When I was a kid, I used to plunk myself down on the living room floor of my family’s modest Salt Lake City home a few minutes before 7 P.M. every Sunday and wait for Walt Disney. Specifically, I’d wait for him to appear on our black-and-white RCA with its tiny 12-inch screen. Even from a dozen feet away—the accepted wisdom at the time was that viewers should put one foot between them and the TV for every inch of screen—I was transfixed by what I saw.

Each week, Walt Disney himself opened the broadcast of The Wonderful World of Disney. Standing before me in suit and tie, like a kindly neighbor, he would demystify the Disney magic. He’d explain the use of synchronized sound in Steamboat Willie or talk about the importance of music in Fantasia. He always went out of his way to give credit to his forebears, the men—and, at this point, they were all men—who’d done the pioneering work upon which he was building his empire. He’d introduce the television audience to trailblazers such as Max Fleischer, of Koko the Clown and Betty Boop fame, and Winsor McCay, who made Gertie the Dinosaur—the first animated film to feature a character that expressed emotion—in 1914.
He’d gather a group of his animators, colorists, and storyboard artists to explain how they made Mickey Mouse and Donald Duck come to life. Each week, Disney created a made-up world, used cutting-edge technology to enable it, and then told us how he’d done it.

Walt Disney was one of my two boyhood idols. The other was Albert Einstein. To me, even at a young age, they represented the two poles of creativity. Disney was all about inventing the new. He brought things into being—both artistically and technologically—that did not exist before. Einstein, by contrast, was a master of explaining that which already was. I read every Einstein biography I could get my hands on as well as a little book he wrote on his theory of relativity. I loved how the concepts he developed forced people to change their approach to physics and matter, to view the universe from a different perspective. Wild-haired and iconic, Einstein dared to bend the implications of what we thought we knew. He solved the biggest puzzles of all and, in doing so, changed our understanding of reality.

Both Einstein and Disney inspired me, but Disney affected me more because of his weekly visits to my family’s living room. “When you wish upon a star, makes no difference who you are,” his TV show’s theme song would announce as a baritone-voiced narrator promised: “Each week, as you enter this timeless land, one of these many worlds will open to you.…” Then the narrator would tick them off: Frontierland (“tall tales and true from the legendary past”), Tomorrowland (“the promise of things to come”), Adventureland (“the wonder world of nature’s own realm”), and Fantasyland (“the happiest kingdom of them all”). I loved the idea that animation could take me places I’d never been. But the land I most wanted to learn about was the one occupied by the innovators at Disney who made these animated films.

Between 1950 and 1955, Disney made three movies we consider classics today: Cinderella, Peter Pan, and Lady and the Tramp. More than half a century later, we all remember the glass slipper, the Island of Lost Boys, and that scene where the cocker spaniel and the mutt slurp spaghetti. But few grasp how technically sophisticated these movies were. Disney’s animators were at the forefront of applied technology; instead of merely using existing methods, they were inventing ones of their own. They had to develop the tools to perfect sound and color, to use blue screen matting and multi-plane cameras and xerography. Every time some technological breakthrough occurred, Walt Disney incorporated it and then talked about it on his show in a way that highlighted the relationship between technology and art. I was too young to realize such a synergy was groundbreaking. To me, it just made sense that they belonged together.

Watching Disney one Sunday evening in April of 1956, I experienced something that would define my professional life. What exactly it was is difficult to describe except to say that I felt something fall into place inside my head. That night’s episode was called “Where Do the Stories Come From?” and Disney kicked it off by praising his animators’ knack for turning everyday occurrences into cartoons. That night, though, it wasn’t Disney’s explanation that pulled me in but what was happening on the screen as he spoke. An artist was drawing Donald Duck, giving him a jaunty costume and a bouquet of flowers and a box of candy with which to woo Daisy.
Then, as the artist’s pencil moved around the page, Donald came to life, putting up his dukes to square off with the pencil lead, then raising his chin to allow the artist to give him a bow tie.

The definition of superb animation is that each character on the screen makes you believe it is a thinking being. Whether it’s a T-Rex or a slinky dog or a desk lamp, if viewers sense not just movement but intention—or, put another way, emotion—then the animator has done his or her job. It’s not just lines on paper anymore; it’s a living, feeling entity. This is what I experienced that night, for the first time, as I watched Donald leap off the page. The transformation from a static line drawing to a fully dimensional, animated image was sleight of hand, nothing more, but the mystery of how it was done—not just the technical process but the way the art was imbued with such emotion—was the most interesting problem I’d ever considered. I wanted to climb through the TV screen and be part of this world.

The mid-1950s and early 1960s were, of course, a time of great prosperity and industry in the United States. Growing up in Utah in a tight-knit Mormon community, my four younger brothers and sisters and I felt that anything was possible. Because the adults we knew had all lived through the Depression, World War II, and then the Korean War, this period felt to them like the calm after a thunderstorm. I remember the optimistic energy—an eagerness to move forward that was enabled and supported by a wealth of emerging technologies. It was boom time in America, with manufacturing and home construction at an all-time high. Banks were offering loans and credit, which meant more and more people could own a new TV, house, or Cadillac. There were amazing new appliances like disposals that ate your garbage and machines that washed your dishes, although I certainly did my share of cleaning them by hand. The first organ transplants were performed in 1954; the first polio vaccine came a year later; in 1956, the term artificial intelligence entered the lexicon. The future, it seemed, was already here.

Then, when I was twelve, the Soviets launched the first artificial satellite—Sputnik 1—into earth’s orbit. This was huge news, not just in the scientific and political realms but in my sixth grade classroom at school, where the morning routine was interrupted by a visit from the principal, whose grim expression told us that our lives had changed forever. Since we’d been taught that the Communists were the enemy and that nuclear war could be waged at the touch of a button, the fact that they’d beaten us into space seemed pretty scary—proof that they had the upper hand.

The United States government’s response to being bested was to create something called ARPA, or the Advanced Research Projects Agency. Though it was housed within the Defense Department, its mission was ostensibly peaceful: to support scientific researchers in America’s universities in the hopes of preventing what it termed “technological surprise.” By sponsoring our best minds, the architects of ARPA believed, we’d come up with better answers. Looking back, I still admire that enlightened reaction to a serious threat: We’ll just have to get smarter. ARPA would have a profound effect on America, leading directly to the computer revolution and the Internet, among countless other innovations. There was a sense that big things were happening in America, with much more to come. Life was full of possibility.
Still, while my family was middle-class, our outlook was shaped by my father’s upbringing. Not that he talked about it much. Earl Catmull, the son of an Idaho dirt farmer, was one of fourteen kids, five of whom had died as infants. His mother, raised by Mormon pioneers who made a meager living panning for gold in the Snake River in Idaho, didn’t attend school until she was eleven. My father was the first in his family ever to go to college, paying his own way by working several jobs. During my childhood, he taught math during the school year and built houses during the summers. He built our house from the ground up. While he never explicitly said that education was paramount, my siblings and I all knew we were expected to study hard and go to college.

I was a quiet, focused student in high school. An art teacher once told my parents I would often become so lost in my work that I wouldn’t hear the bell ring at the end of class; I’d be sitting there, at my desk, staring at an object—a vase, say, or a chair. Something about the act of committing that object to paper was completely engrossing—the way it necessitated seeing only what was there and shutting out the distraction of my ideas about chairs or vases and what they were supposed to look like. At home, I sent away for Jon Gnagy’s Learn to Draw art kits—which were advertised in the back of comic books—and the 1948 classic Animation, written and drawn by Preston Blair, the animator of the dancing hippos in Disney’s Fantasia. I bought a platen—the flat metal plate artists use to press paper against ink—and even built a plywood animation stand with a light under it. I made flipbooks—one was of a man whose legs turned into a unicycle—while nursing my first crush, Tinker Bell, who had won my heart in Peter Pan.

Nevertheless, it soon became clear to me that I would never be talented enough to join Disney Animation’s vaunted ranks. What’s more, I had no idea how one actually became an animator. There was no school for it that I knew of. As I finished high school, I realized I had a far better understanding of how one became a scientist. The route seemed easier to discern. Throughout my life, people have always smiled when I told them I switched from art to physics because it seems, to them, like such an incongruous leap. But my decision to pursue physics, and not art, would lead me, indirectly, to my true calling.

Four years later, in 1969, I graduated from the University of Utah with two degrees, one in physics and the other in the emerging field of computer science. Applying to graduate school, my intention was to learn how to design computer languages. But soon after I matriculated, also at the U of U, I met a man who would encourage me to change course: one of the pioneers of interactive computer graphics, Ivan Sutherland. The field of computer graphics—in essence, the making of digital pictures out of numbers, or data, that can be manipulated by a machine—was in its infancy then, but Professor Sutherland was already a legend. Early in his career, he had devised something called Sketchpad, an ingenious computer program that allowed figures to be drawn, copied, moved, rotated, or resized, all while retaining their basic properties. In 1968, he’d co-created what is widely believed to be the first virtual reality head-mounted display system.
(The device was named The Sword of Damocles, after the Greek myth, because it was so heavy that in order to be worn by the person using it, it had to be suspended from a mechanical arm bolted to the ceiling.)

Sutherland and Dave Evans, who was chair of the university’s computer science department, were magnets for bright students with diverse interests, and they led us with a light touch. Basically, they welcomed us to the program, gave us workspace and access to computers, and then let us pursue whatever turned us on. The result was a collaborative, supportive community so inspiring that I would later seek to replicate it at Pixar. One of my classmates, Jim Clark, would go on to found Silicon Graphics and Netscape. Another, John Warnock, would co-found Adobe, known for Photoshop and the PDF file format, among other things. Still another, Alan Kay, would lead on a number of fronts, from object-oriented programming to “windowing” graphical user interfaces. In many respects, my fellow students were the most inspirational part of my university experience; this collegial, collaborative atmosphere was vital not just to my enjoyment of the program but also to the quality of the work that I did.

This tension between the individual’s personal creative contribution and the leverage of the group is a dynamic that exists in all creative environments, but this would be my first taste of it. On one end of the spectrum, I noticed, we had the genius who seemed to do amazing work on his or her own; on the other end, we had the group that excelled precisely because of its multiplicity of views. How, then, should we balance these two extremes, I wondered. I didn’t yet have a good mental model that would help me answer that, but I was developing a fierce desire to find one.

Much of the research being done at the U of U’s computer science department was funded by ARPA. As I’ve said, ARPA had been created in response to Sputnik, and one of its key organizing principles was that collaboration could lead to excellence. In fact, one of ARPA’s proudest achievements was linking universities with something they called “ARPANET,” which would eventually evolve into the Internet. The first four nodes on the ARPANET were at the Stanford Research Institute, UCLA, UC Santa Barbara, and the U of U, so I had a ringside seat from which to observe this grand experiment, and what I saw influenced me profoundly.

ARPA’s mandate—to support smart people in a variety of areas—was carried out based on the unwavering presumption that researchers would try to do the right thing and, in ARPA’s view, overmanaging them was counterproductive. ARPA’s administrators did not hover over the shoulders of those of us working on the projects they funded, nor did they demand that our work have direct military applications. They simply trusted us to innovate. This kind of trust gave me the freedom to tackle all sorts of complex problems, and I did so with gusto. Not only did I often sleep on the floor of the computer rooms to maximize time on the computer, but so did many of my fellow graduate students. We were young, driven by the sense that we were inventing the field from scratch—and that was exciting beyond words. For the first time, I saw a way to simultaneously create art and develop a technical understanding of how to create a new kind of imagery. Making pictures with a computer spoke to both sides of my brain.
To be sure, the pictures that could be rendered on a computer were very crude in 1969, but the act of inventing new algorithms and seeing better pictures as a result was thrilling to me. In its own way, my childhood dream was reasserting itself. At the age of twenty-six, I set a new goal: to develop a way to animate, not with a pencil but with a computer, and to make the images compelling and beautiful enough to use in the movies. Perhaps, I thought, I could become an animator after all.

In the spring of 1972, I spent ten weeks making my first short animated film—a digitized model of my left hand. My process combined old and new; again, like everyone in this fast-changing field, I was helping to invent the language. First I plunged my hand into a tub of plaster of Paris (forgetting, unfortunately, to coat it in Vaseline first, which meant I had to yank out every tiny hair on the back of my hand to get it free); then, once I had the mold, I filled it with more plaster to make a model of my hand; then, I took that model and covered it with 350 tiny interlocking triangles and polygons to create what looked like a net of black lines on its “skin.” You may not think that a curved surface could be built out of such flat, angular elements, but when you make them small enough, you can get pretty close. I’d chosen this project because I was interested in rendering complex objects and curved surfaces—and I was looking for a challenge. At that time, computers weren’t great at showing flat objects, let alone curved ones. The mathematics of curved surfaces was not well developed, and computers had limited memory capability. At the U of U’s computer graphics department, where every one of us yearned to make computer-generated images look as if they were photographs of real objects, we had three driving goals: speed, realism, and the ability to depict curved surfaces. My film sought to address the latter two. The human hand doesn’t have a single flat plane. And unlike a simpler curved surface—a ball, for example—it has many parts that act in opposition to one another, with a seemingly infinite number of resulting movements. The hand is an incredibly complex “object” to try to capture and translate into arrays of numbers. Given that most computer animation at the time consisted of rendering simple polygonal objects (cubes, pyramids), I had my work cut out for me. Once I had drawn the triangles and polygons on my model, I measured the coordinates of each of their corners, then entered that data into a 3D animation program I’d written. That enabled me to display the many triangles and polygons that made up my virtual hand on a monitor. In its first incarnation, sharp edges could be seen at the seams where the polygons joined together. But later, thanks to “smooth shading”—a technique, developed by another graduate student, that diminished the appearance of those edges—the hand became more lifelike. The real challenge, though, was making it move.

Hand, which debuted at a computer science conference in 1973, caused a bit of a stir because no one had ever seen anything like it before. In it, my hand, which appears at first to be covered in a white net of polygons, begins to open and close, as if trying to make a fist. Then my hand’s surface becomes smoother, more like the real thing.
There is a moment when my hand points directly at the viewer as if to say, “Yes, I’m talking to you.” Then, the camera goes inside the hand and takes a look around, aiming its lens inside the palm and up into each finger, a tricky bit of perspective that I liked because it could be depicted only via computer. Those four minutes of film had taken me more than sixty thousand minutes to complete. Together with a digitized film that my friend Fred Parke made of his wife’s face around the same time, Hand represented the state-of-the-art in computer animation for years after it was made. Snippets of both Fred’s and my films would be featured in the 1976 movie Futureworld, which—though mostly forgotten by moviegoers today—is still remembered by aficionados as the first full-length feature to use computer-generated animation.

Professor Sutherland used to say that he loved his graduate students at Utah because we didn’t know what was impossible. Neither, apparently, did he: He was among the first to believe that Hollywood movie execs would care a fig about what was happening in academia. To that end, he sought to create a formal exchange program with Disney, wherein the studio would send one of its animators to Utah to learn about new technologies in computer rendering, and the university would send a student to Disney Animation to learn more about how to tell stories. In the spring of 1973, he sent me to Burbank to try to sell this idea to the Disney executives. It was a thrill for me to drive through the red brick gates and onto the Disney lot on my way to the original Animation Building, built in 1940 with a “Double H” floor plan personally supervised by Walt himself to ensure that as many rooms as possible had windows to let in natural light. While I’d studied this place—or what I could glimpse of it on our 12-inch RCA—walking into it was a little like stepping into the Parthenon for the first time. There, I met Frank Thomas and Ollie Johnston, two of Walt’s “Nine Old Men,” the group of legendary animators who had created so many of the characters in the Disney movies I loved, from Pinocchio to Peter Pan. At one point I was taken into the archives where all the original paper drawings from all the animated films were kept, with rack after rack after rack of the images that had fueled my imagination. I’d entered the Promised Land.

One thing was immediately clear. The people I met at Disney—one of whom, I swear, was named Donald Duckwall—had zero interest in Sutherland’s exchange program. The technically adventuresome Walt Disney was long gone. My enthusiastic descriptions were met with blank stares. To them, computers and animation simply didn’t mix. How did they know this? Because the one time they had turned to computers for help—to render images of millions of bubbles in their 1971 live-action movie Bedknobs and Broomsticks—the computers had apparently let them down. The state of the technology at the time was so poor, particularly for curved images, that bubbles were beyond the computers’ reach. Unfortunately, this didn’t help my cause. “Well,” more than one Disney executive told me that day, “until computer animation can do bubbles, then it will not have arrived.” Instead, they tried to tempt me into taking a job with what is now called Disney Imagineering, the division that designs the theme parks. It may sound odd, given how large Walt Disney had always loomed in my life, but I turned the offer down without hesitation.
The theme park job felt like a diversion that would lead me down a path I didn’t want to be on. I didn’t want to design rides for a living. I wanted to animate with a computer.

Just as Walt Disney and the pioneers of hand-drawn animation had done decades before, those of us who sought to make pictures with computers were trying to create something new. When one of my colleagues at the U of U invented something, the rest of us would immediately piggyback on it, pushing that new idea forward. There were setbacks, too, of course. But the overriding feeling was one of progress, of moving steadily toward a distant goal.

Long before I’d heard about Disney’s bubble problem, what kept me and many of my fellow graduate students up at night was the need to continue to hone our methods for creating smoothly curved surfaces with the computer—as well as to figure out how to add richness and complexity to the images we were creating. My dissertation, “A Subdivision Algorithm for Computer Display of Curved Surfaces,” offered a solution to that problem. Much of what I spent every waking moment thinking about then was extremely technical and difficult to explain, but I’ll give it a try. The idea behind what I called “subdivision surfaces” was that instead of setting out to depict the whole surface of a shiny, red bottle, for example, we could divide that surface into many smaller pieces. It was easier to figure out how to color and display each tiny piece—which we could then put together to create our shiny, red bottle. (As I’ve noted, computer memory capacity was quite small in those days, so we put a lot of energy into developing tricks to overcome that limitation. This was one of those tricks.)

But what if you wanted that shiny, red bottle to be zebra-striped? In my dissertation, I figured out a way that I could take a zebra-print or woodgrain pattern, say, and wrap it around any object. “Texture mapping,” as I called it, was like having stretchable wrapping paper that you could apply to a curved surface so that it fit snugly. The first texture map I made involved projecting an image of Mickey Mouse onto an undulating surface. I also used Winnie the Pooh and Tigger to illustrate my points. I may not have been ready to work at Disney, but their characters were still the touchstones I referenced.

At the U of U, we were inventing a new language. One of us would contribute a verb, another a noun, then a third person would figure out ways to string the elements together to actually say something. My invention of something called the “Z-buffer” was a good example of this, in that it built on others’ work. The Z-buffer was designed to address the problem of what happens when one computer-animated object is hidden, or partially hidden, behind another one. Even though the data that describes every aspect of the hidden object is in the computer’s memory (meaning that you could see it, if need be), the desired spatial relationships mean that it should not be fully seen. The challenge was to figure out a way to tell the computer to meet that goal. For example, if a sphere were in front of a cube, partially blocking it, the sphere’s surface should be visible on the screen, as should the parts of the cube that are not blocked by the sphere. The Z-buffer accomplished that by assigning a depth to every object in three-dimensional space, then telling the computer to match each of the screen’s pixels to whatever object was the closest.
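That description translates almost directly into code. The sketch below is a minimal illustration of the idea in Python, written for this preview rather than taken from Catmull’s work; the toy scene, the function names, and the constant per-object depths are all invented for the example. Each pixel remembers the depth of the nearest surface drawn so far, so the closest object wins no matter what order things are drawn in:

```python
# Minimal sketch of the Z-buffer idea (illustrative only, not Catmull's code).
# Every pixel stores the depth of the nearest surface seen so far; a new
# surface claims a pixel only if it is closer to the camera.

WIDTH, HEIGHT = 32, 16
FAR = float("inf")

depth_buffer = [[FAR] * WIDTH for _ in range(HEIGHT)]   # one depth per pixel
frame_buffer = [["."] * WIDTH for _ in range(HEIGHT)]   # one "color" per pixel

def draw(color, covers, depth_at):
    """Rasterize one object: covers(x, y) says whether the object touches a
    pixel, and depth_at(x, y) gives its distance from the camera there."""
    for y in range(HEIGHT):
        for x in range(WIDTH):
            if covers(x, y):
                z = depth_at(x, y)
                if z < depth_buffer[y][x]:      # closer than anything so far?
                    depth_buffer[y][x] = z
                    frame_buffer[y][x] = color

# The book's example: a cube partially blocked by a nearer sphere. Because of
# the per-pixel depth test, the result is the same whichever is drawn first.
draw("c", lambda x, y: 8 <= x < 24 and 4 <= y < 12, lambda x, y: 10.0)              # cube, farther
draw("s", lambda x, y: (x - 16) ** 2 + (2 * (y - 8)) ** 2 <= 36, lambda x, y: 5.0)  # sphere, nearer

for row in frame_buffer:
    print("".join(row))
```

A real renderer interpolates depth across each polygon rather than using one constant per object, but the per-pixel comparison is the same. Note also that the buffer stores one depth value for every pixel, which is precisely the memory cost the next paragraph alludes to.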
Computer memory was so limited—as I’ve said—that this wasn’t a practical solution, but I had found a new way of solving the problem. Although it sounds simple, it is anything but. Today, there is a Z-buffer in every game and PC chip manufactured on earth.

After receiving my Ph.D. in 1974, I left Utah with a nice little list of innovations under my belt, but I was keenly aware that I’d only done all this in the service of a larger mutual goal. Like my classmates’, the work I’d championed had taken hold largely because of the protective, eclectic, intensely challenging environment I’d been in. The leaders of my department understood that to create a fertile laboratory, they had to assemble different kinds of thinkers and then encourage their autonomy. They had to offer feedback when needed but also had to be willing to stand back and give us room. I felt instinctively that this kind of environment was rare and worth reaching for. I knew that the most valuable thing I was taking away from the U of U was the model my teachers had provided for how to lead and inspire other creative thinkers. The question for me, then, was how to get myself into another environment like this—or how to build one of my own.

I walked away from Utah with a clearer sense of my goal, and I was prepared to devote my life to it: making the first computer-animated film. But getting to that point would not be easy. There were, I guessed, at least another ten years of development needed to figure out how to model and animate characters and render them in complex environments before we could even begin to conceive of making a short—let alone a feature—film. I also didn’t yet know that my self-assigned mission was about much more than technology. To pull it off, we’d have to be creative not only technically but also in the ways that we worked together.

Back then, no other company or university shared my goal of making a computer-generated film; in fact, each time I expressed that goal in job interviews at universities, it seemed to cast a pall over the room. “But we want you to teach computer science,” my interviewers would say. What I was proposing to do looked, to most academics, like a pipe dream, an expensive fantasy. Then, in November 1974, I received a mysterious call from a woman who said she worked at something called the New York Institute of Technology. She said she was the secretary to the institute’s president, and she was calling to book my airplane ticket. I didn’t know what she was talking about, and I told her so. What was the name of the institute again? I asked. Why did she want me to fly to New York? There was an awkward silence. “I’m sorry,” she said. “Someone else was supposed to call you before I did.” And with that, she hung up. The next phone call I received would change my life.

CHAPTER 2: PIXAR IS BORN

What does it mean to manage well? As a young man, I certainly had no idea, but I was about to begin figuring it out by taking a series of jobs—working for three iconoclastic men with very different styles—that would provide me with a crash course in leadership. In the next decade, I would learn much about what managers should and shouldn’t do, about vision and delusion, about confidence and arrogance, about what encourages creativity and what snuffs it out. As I gained experience, I was asking questions that intrigued me even as they confused me. Even now, forty years later, I’ve never stopped questioning.
I want to start with my first boss, Alex Schure—the man whose secretary called me out of the blue that day in 1974 to book me an airplane ticket and then, realizing her mistake, slammed down the receiver. When the phone rang again, a few minutes later, an unfamiliar voice—this time, a man who said he worked for Alex—filled me in: Alex was starting a research lab on Long Island’s North Shore whose mission was to bring computers into the animation process. Money was not a problem, he assured me—Alex was a multimillionaire. What they needed was someone to run the place. Was I interested in talking? Within weeks I was moving into my new office at the New York Institute of Technology.

Alex, a former college chancellor, had zero expertise in the field of computer science. At the time, that wasn’t unusual, but Alex himself certainly was. He naïvely thought that computers would soon replace people, and leading that charge was what excited him. (We knew this was a misconception, if a common one at that point, but we were grateful for his eagerness to fund our work.) He had a bizarre way of speaking that mixed bluster, non sequiturs, and even snippets of rhyming verse into a sort of Mad Hatter–ish patois—or “word salad,” as one of my colleagues called it. (“Our vision will speed up time,” he would say, “eventually deleting it.”) Those of us who worked with him often had trouble understanding what he meant.

Alex had a secret ambition—well, not so secret. He said almost every day that he didn’t want to be the next Walt Disney, which only made us all think that he did. When I arrived, he was in the process of directing a hand-drawn animated movie called Tubby the Tuba. Really, the thing never had a chance—no one at NYIT had the training or the story sensibility to make a film, and when it was finally released, it vanished without a trace.

Deluded though he may have been about his own skills, Alex was a visionary. He was incredibly prescient about the role computers would someday play in animation, and he was willing to spend a lot of his own money to push that vision forward. His unwavering commitment to what many labeled a pipe dream—the melding of technology and this hand-drawn art form—enabled much groundbreaking work to be done. Once Alex brought me in, he left it to me to assemble a team. I have to give him that: He had total confidence in the people he hired. This was something I admired and, later, sought to do myself.

One of the first people I interviewed was Alvy Ray Smith, a charismatic Texan with a Ph.D. in computer science and a sparkling resume that included teaching stints at New York University and UC Berkeley and a gig at Xerox PARC, the distinguished R&D lab in Palo Alto. I had conflicting feelings when I met Alvy because, frankly, he seemed more qualified to lead the lab than I was. I can still remember the uneasiness in my gut, that instinctual twinge spurred by a potential threat: This, I thought, could be the guy who takes my job one day. I hired him anyway.

Some might have seen hiring Alvy as a confident move. The truth is, as a twenty-nine-year-old who’d been focused on research for four years and had never had an assistant, let alone hired and managed a staff, I was feeling anything but confident. I could see, however, that NYIT was a place where I could explore what I’d set out to do as a graduate student. To ensure that it succeeded, I needed to attract the sharpest minds; to attract the sharpest minds, I needed to put my own insecurities away.
The lesson of ARPA had lodged in my brain: When faced with a challenge, get smarter. So we did. Alvy would become one of my closest friends and most trusted collaborators. And ever since, I’ve made a policy of trying to hire people who are smarter than I am. The obvious payoffs of exceptional people are that they innovate, excel, and generally make your company—and, by extension, you—look good. But there is another, less obvious, payoff that only occurred to me in retrospect. The act of hiring Alvy changed me as a manager: By ignoring my fear, I learned that the fear was groundless. Over the years, I have met people who took what seemed the safer path and were the lesser for it. By hiring Alvy, I had taken a risk, and that risk yielded the highest reward—a brilliant, committed teammate. I had wondered in graduate school how I could ever replicate the singular environment of the U of U. Now, suddenly, I saw the way. Always take a chance on better, even if it seems threatening. At NYIT, we focused on a single goal: pushing the boundaries of what computers could do in animation and graphics. And as word of our mission spread, we began to attract the top people in the field. The bigger my staff became, the more urgent it was that I figure out how to manage them. I created a flat organizational structure, much like I’d experienced in academia, largely because I naïvely thought that if I put together a hierarchical structure— assigning a bunch of managers to report to me—I would have to spend too much time managing and not enough time on my own work. This structure—in which I entrusted everybody to drive their own projects forward, at their own pace—had its limits, but the fact is, giving a ton of freedom to highly self-motivated people enabled us to make some significant technological leaps in a short time. Together, we did groundbreaking work, much of which was aimed at figuring out how to integrate the computer with hand-drawn animation. In 1977, for example, I wrote a 2D animation program called Tween, which performed what’s known as “automatic in-betweening”—filling in frames of motion between key shots, an otherwise expensive and labor-intensive process. Another technical challenge that occupied us was the need for something called “motion blur.” With animation in general and computer animation in particular, the images created are in perfect focus. That may sound like a good thing, but in fact, human beings react negatively to it. When moving objects are in perfect focus, theatergoers experience an unpleasant, strobe-like sensation, which they describe as “jerky.” When watching live-action movies, we don’t perceive this problem because traditional film cameras capture a slight blur in the direction an object is moving. The blur keeps our brains from noticing the sharp edges, and our brains regard this blur as natural. Without motion blur, our brains think something is wrong. So the question for us was how to simulate the blur for animation. If the human eye couldn’t accept computer animation, the field would have no future. Among the handful of companies that were trying to solve these problems, most embraced a culture of strictly enforced, even CIA-like secrecy. We were in a race, after all, to be the first to make a computer-animated feature film, so many who were pursuing this technology held their discoveries close to their vests. After talking about it, however, Alvy and I decided to do the opposite—to share our work with the outside world. 
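A technical aside on two of the problems described above, automatic in-betweening and motion blur: both reduce to simple core ideas, even though production-quality versions are far harder. The Python sketch below is an illustrative reconstruction, not the actual Tween program or any NYIT code; the function names and parameters are invented. In-betweening is shown as plain linear interpolation of corresponding control points between two key poses, and motion blur is approximated by averaging samples of a moving point over the interval a virtual shutter stays open, one standard way to simulate the smear a film camera records.

    def inbetween(pose_a, pose_b, n):
        # Generate n intermediate poses between two key poses, where a pose
        # is a list of (x, y) control points. Real in-betweening must also
        # handle timing, arcs, and point correspondence; this is the linear core.
        frames = []
        for i in range(1, n + 1):
            t = i / (n + 1)
            frames.append([(ax + t * (bx - ax), ay + t * (by - ay))
                           for (ax, ay), (bx, by) in zip(pose_a, pose_b)])
        return frames

    def blurred_position(position_at, frame_time, shutter=0.5, samples=8):
        # Approximate motion blur by averaging a moving point's position
        # across the fraction of a frame the shutter is open; rendering each
        # sample and averaging the images yields the streak our eyes expect.
        xs = ys = 0.0
        for k in range(samples):
            t = frame_time + shutter * k / (samples - 1)
            x, y = position_at(t)
            xs += x
            ys += y
        return xs / samples, ys / samples

    # A point moving one unit per frame, sampled at frame 10.
    print(inbetween([(0.0, 0.0)], [(1.0, 0.0)], 3))   # three evenly spaced in-betweens
    print(blurred_position(lambda t: (t, 0.0), 10.0))  # averages to x = 10.25

Averaging point positions is the crudest version; the approach the field eventually settled on distributes samples in time per ray, but the principle is the same: integrate over the shutter interval instead of freezing a single instant.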
My view was that we were all so far from achieving our goal that to hoard ideas only impeded our ability to get to the finish line. Instead, NYIT engaged with the computer graphics community, publishing everything we discovered, participating in committees to review papers written by all manner of researchers, and taking active roles at all the major academic conferences. The benefit of this transparency was not immediately felt (and, notably, when we decided upon it, we weren’t even counting on a payoff; it just seemed like the right thing to do). But the relationships and connections we formed, over time, proved far more valuable than we could have imagined, fueling our technical innovation and our understanding of creativity in general. For all the good work we were doing, however, I found myself in a quandary at NYIT. Thanks to Alex, we were fortunate to have the funds to buy the equipment and hire the people necessary to innovate in the world of computer animation, but we didn’t have anyone who knew anything about filmmaking. As we developed the ability to tell a story with a computer, we still didn’t have storytellers among us, and we were the poorer for it. So aware were Alvy and I of this limitation that we began making quiet overtures to Disney and other studios, trying to gauge their interest in investing in our tools. If we found an interested suitor, Alvy and I were prepared to leave NYIT and move our team to Los Angeles to partner with proven filmmakers and storytellers. But it was not to be. One by one, they demurred. It’s hard to imagine now, but in 1976, the idea of incorporating high technology into Hollywood filmmaking wasn’t just a low priority; it wasn’t even on the radar. But one man was about to change that, with a movie called Star Wars. On May 25, 1977, Star Wars opened in theaters across America. The film’s mastery of visual effects—and its record-shattering popularity at the box office—would change the industry forever. And thirty-two-year-old writer-director George Lucas was only getting started. His company, Lucasfilm, and its ascendant Industrial Light & Magic studio had already taken the lead developing new tools in visual effects and sound design. Now, while no one else in the movie industry evinced even the slightest desire to invest in such things, George resolved in July 1979 to launch a computer division. Thanks to Luke Skywalker, he had the resources to do it right. To run this division, he wanted someone who not only knew computers; he wanted someone who loved film and believed that the two could not only coexist but enhance one another. Eventually, that led George to me. One of his key people, Richard Edlund, who was a pioneer of special effects, came to see me one afternoon in my office at NYIT wearing a belt with an enormous buckle that read, in huge letters, “Star Wars.” This was worrisome, given that I was trying to keep his visit a secret from Alex Schure. Somehow, though, Alex didn’t catch on. George’s emissary was apparently pleased with what I showed him, because a few weeks after he left, I was on my way to Lucasfilm in California for a formal interview. My first meeting there was with a man named Bob Gindy, who ran George’s personal construction projects—not exactly the qualifications you’d expect for a guy spearheading the search for a new computer executive. The first thing he asked me was, “Who else should Lucasfilm be considering for this job?” Meaning, the job I was there to interview for. 
Without hesitation, I rattled off the names of several people who were doing impressive work in a variety of technical areas. My willingness to do this reflected my world-view, forged in academia, that any hard problem should have many good minds simultaneously trying to solve it. Not to acknowledge that seemed silly. Only later would I learn that the guys at Lucasfilm had already interviewed all the people I listed and had asked them, in turn, to make similar recommendations—and not one of them had suggested any other names! To be sure, working for George Lucas was a plum job that you’d have to be crazy not to want. But to go mute, as my rivals did, when asked to evaluate the field signaled not just intense competitiveness but also lack of confidence. Soon I’d landed an interview with George himself. On my way to meet him, I remember feeling nervous in a way I rarely had before. Even before Star Wars, George had proved himself as a successful writer-director-producer with American Graffiti. I was a computer guy with an expensive dream. Still, when I arrived at the shooting stage in Los Angeles where he was working, he and I seemed pretty similar: Skinny and bearded, in our early thirties, we both wore glasses, worked with a blinders-on intensity, and had a tendency to talk only when we had something to say. But what struck me immediately was George’s relentless practicality. He wasn’t some hobbyist trying to bring technology into filmmaking for the heck of it. His interest in computers began and ended with their potential to add value to the filmmaking process—be it through digital optical printing, digital audio, digital non-linear editing, or computer graphics. I was certain that they could, and I told him so. In the intervening years, George has said that he hired me because of my honesty, my “clarity of vision,” and my steadfast belief in what computers could do. Not long after we met, he offered me the job. By the time I moved into the two-story building in San Anselmo that would serve as the temporary headquarters of Lucasfilm’s new computer division, I had given myself an assignment: to rethink how I managed people. What George wanted to create was a far more ambitious enterprise than the one I oversaw at NYIT, with a higher profile, a bigger budget, and, given his ambitions in Hollywood, the promise of much greater impact. I wanted to make sure that I was enabling my team to make the most of that. At NYIT, I’d created a flat structure much like I’d seen at the U of U, giving my colleagues a lot of running room and little oversight, and I’d been relatively pleased with the results. But now I had to admit that our team there behaved a lot like a collection of grad students—independent thinkers with individual projects—rather than a team with a common goal. A research lab is not a university, and the structure didn’t scale well. At Lucasfilm, then, I decided to hire managers to run the graphics, video, and audio groups; they would then report to me. I knew I had to put some sort of hierarchy in place, but I also worried that hierarchy would lead to problems. So I edged in slowly, feeling suspicious of it at first, yet knowing that some part of it was necessary. The Bay Area in 1979 could not have provided a more fertile environment for our work. In Silicon Valley, the number of computer companies was growing so fast that no one’s Rolodex (yes, we had Rolodexes back then) was ever up to date. 
Also growing exponentially was the number of tasks that computers were being assigned to tackle. Not long after I got to California, Microsoft’s Bill Gates agreed to create an operating system for the new IBM personal computer—which would, of course, go on to transform the way Americans worked. A year later, Atari’s in-home game console meant that its popular arcade games like Space Invaders and Pac-Man could be played in living rooms across America, opening up a market that now accounts for more than $65 billion in global sales. To get a sense of how quickly things were changing, consider that when I was a graduate student, in 1970, we’d used huge computers made by IBM and seven other mainframe companies (a group that was nicknamed “IBM and the Seven Dwarves”). Picture a room filled with racks and racks of equipment measuring six feet tall, two feet wide, and two and a half feet deep. Five years later, when I arrived at NYIT, the minicomputer—which was about the size of an armoire—was on the rise, with Digital Equipment in Massachusetts being the most significant player. By the time I got to Lucasfilm in 1979, the momentum was swinging to workstation computers such as those made by Silicon Valley upstarts Sun Microsystems and Silicon Graphics, as well as IBM, but by that time, everyone could see that workstations were only another stop on the way to the personal desktop computer. The swiftness of this evolution created seemingly endless opportunities for those who were willing and able to innovate. The allure of getting rich was a magnet for bright, ambitious people, and the resulting competition was intense—as were the risks. The old business models were undergoing continual disruptive change. Lucasfilm was based in Marin County, one hour north of Silicon Valley by car and one hour from Hollywood by plane. This was no accident. George saw himself, first and foremost, as a filmmaker, so Silicon Valley wasn’t for him. But he also had no desire to be too close to Los Angeles, because he thought there was something a bit unseemly and inbred about it. Thus, he created his own island, a community that embraced films and computers but pledged allegiance to neither of the prevailing cultures that defined those businesses. The resulting environment felt as protected as an academic institution—an idea that would stay with me and help shape what I would later try to build at Pixar. Experimentation was highly valued, but the urgency of a for-profit enterprise was definitely in the air. In other words, we felt like we were solving problems for a reason. I put Alvy in charge of our graphics group, which was dedicated initially to creating a digital approach to blue-screen matting—the process by which one image (say, a man on a surfboard) can be dropped into a separate image (say, a 100-foot wave). Before digital, this effect was accomplished on film with the use of sophisticated optical devices, and the special effects wizards at the time had no interest in leaving that painstaking method behind. Our job was to convince them otherwise. Alvy’s team set out to design a highly specialized standalone computer that had the resolution and processing power to scan film, combine special-effects images with live-action footage, and then record the final result back onto film. It took us roughly four years, but our engineers built just such a device, which we named the Pixar Image Computer. Why “Pixar”?
The name emerged from a back-and-forth between Alvy and another of our colleagues, Loren Carpenter. Alvy, who spent much of his childhood in Texas and New Mexico, had a fondness for the Spanish language, and he was intrigued by how certain nouns in English looked like Spanish verbs—words like “laser,” for example. So Alvy lobbied for “Pixer,” which he imagined to be a (fake) Spanish verb meaning “to make pictures.” Loren countered with “Radar,” which he thought sounded more high-tech. That’s when it hit them: Pixer + radar = Pixar! It stuck. Within Lucasfilm, the special effects experts were relatively indifferent to our computer graphics technology. Their film editor colleagues, however, were outright opposed. This was driven home when, at George’s request, we developed a video-editing system that would enable editors to do their work on the computer. George envisioned a program that would allow shots to be banked and filed easily and cuts to be made far more quickly than they were on film. Ralph Guggenheim, a computer programmer with a degree in filmmaking from Carnegie Mellon whom I’d lured away from NYIT, took the lead on this project, which was so ahead of its time that the hardware needed to support it didn’t even exist yet. (In order to approximate it, Ralph had to mock up an elaborate makeshift system using laser disks.) But as challenging as that problem proved to be, it paled in comparison to the bigger, and eternal, impediment to our progress: the human resistance to change. While George wanted this new video-editing system in place, the film editors at Lucasfilm did not. They were perfectly happy with the system they had already mastered, which involved actually cutting film into snippets with razor blades and then pasting them back together. They couldn’t have been less interested in making changes that would slow them down in the short term. They took comfort in their familiar ways, and change meant being uncomfortable. So when it came time to test our work, the editors refused to participate. Our certainty that video editing would revolutionize the process didn’t matter, and neither did George’s backing. Because the people our new system was intended to serve were resistant to it, progress screeched to a halt. What to do? If left up to the editors, no new tool would ever be designed and no improvements would be possible. They saw no advantage to change and couldn’t imagine how using a computer would make their work easier or better. But if we designed the new system in a vacuum, moving ahead without the editors’ input, we would end up with a tool that didn’t address their needs. Being confident about the value of our innovation was not enough. We needed buy-in from the community we were trying to serve. Without it, we were forced to abandon our plans. Clearly, it wasn’t enough for managers to have good ideas—they had to be able to engender support for those ideas among the people who’d be charged with employing them. I took that lesson to heart. During the Lucasfilm years, I definitely had my periods of feeling overwhelmed as a manager, periods when I wondered about my own abilities and asked myself if I should try to adopt a more forceful, alpha male management style. I’d put my version of hierarchy in place by delegating to other managers, but I was also part of a chain of command in the greater Lucasfilm empire.
I remember going home at night, exhausted, feeling like I was balancing on the backs of a herd of horses—only some of the horses were thoroughbreds, some were completely wild, and some were ponies who were struggling to keep up. I found it hard enough to hold on, let alone steer. Simply put, managing was hard. No one took me aside to give me tips. The books I read that promised insight on the topic were mostly devoid of content. So I looked to George to see how he did it. I saw that his way seemed to reflect some of the philosophy he had put into Yoda. Just as Yoda said things like, “Do, or do not. There is no try,” George had a fondness for folksy analogies that sought to describe, neatly, the mess of life. He would compare the often arduous process of developing his 4,700-acre Skywalker Ranch compound (a mini-city of residences and production facilities) to a ship going downriver … that had been cut in half … and whose captain had been thrown overboard. “We’re still going to get there,” he would say. “Grab the paddles and let’s keep going!” Another of his favorite analogies was that building a company was like being on a wagon train headed west. On the long journey to the land of plenty, the pioneers would be full of purpose and united by the goal of reaching their destination. Once they arrived, he’d say, people would come and go, and that was as it should be. But the process of moving toward something—of having not yet arrived—was what he idealized. Whether evoking wagons or ships, George thought in terms of a long view; he believed in the future and his ability to shape it. The story has been told and retold about how, as a young filmmaker, in the wake of American Graffiti’s success, he was advised to demand a higher salary on his next movie, Star Wars. That would be the expected move in Hollywood: Bump up your quote. Not for George, though. He skipped the raise altogether and asked instead to retain ownership of licensing and merchandising rights to Star Wars. The studio that was distributing the film, 20th Century Fox, readily agreed to his request, thinking it was not giving up much. George would prove them wrong, setting the stage for major changes in the industry he loved. He bet on himself—and won. Lucasfilm, in those post–Star Wars days, was a magnet for big names. Famous directors, from Steven Spielberg to Martin Scorsese, were always stopping by to see what we were working on and what new effects or innovations they might use in their films. But more than the drop-ins from A-listers, the visit that would stick with me most was from a group of Disney animators who came for a tour just after Valentine’s Day, 1983. As I showed them around, I noted that one of them—a kid in baggy jeans named John—seemed particularly excited about what we were up to. In fact, the first thing I noticed was his curiosity. When I showed everyone a computer-animated image that we were so proud of we’d given it a name—“The Road to Point Reyes”—he just stood there, transfixed. I told him we’d created the image of a gently curving road overlooking the Pacific Ocean using a software program we’d developed called Reyes (for Renders Everything You Ever Saw), and the pun was intended: Point Reyes, California, is a seaside village on Route 1, not far from Lucasfilm. Reyes represented the cutting edge of computer graphics at the time. And it bowled this John guy over. Soon, I learned why.
He had an idea, he told me, for a film called The Brave Little Toaster about a toaster, a blanket, a lamp, a radio, and a vacuum cleaner who journey to the city to find their master after being abandoned in their cabin in the woods. He told me that his film, which he was about to pitch to his bosses at Disney Animation, would be the first to place hand-drawn characters inside computer-generated backgrounds, much like the one I’d just shown him. He wanted to know if we could work together to make this happen. That animator was John Lasseter. Unbeknownst to me, soon after our meeting at Lucasfilm, he would lose his job at Disney. Apparently, his supervisors felt that The Brave Little Toaster was—like him—a little too avant-garde. They listened to his pitch and, immediately afterward, fired him. A few months later, I ran into John again on the Queen Mary, of all places. The historic Long Beach hotel, which also happens to be a docked ocean liner, was the site of the annual Pratt Institute Symposium on Computer Graphics. Not knowing of his newly unemployed status, I asked if there was any way he could come up to Lucasfilm and help us make our first short film. He said yes without hesitation. I remember thinking it was almost as if Professor Sutherland’s exchange program idea was finally getting its moment. To have a Disney animator on our team, even temporarily, would be a huge leap forward. For the first time, a true storyteller would be joining our ranks. John was a born dreamer. As a boy, he lived mostly in his head and in the tree houses and tunnels and spaceships he drew in his sketchbook. His dad was the parts manager at the local Chevrolet dealership in Whittier, California—instilling in John a lifelong obsession with cars—and his mom was a high school art teacher. Like me, John remembers discovering that there were people who made animation for a living and thinking he’d found his place in the world. For him, as for me, that realization was Disney-related; it came when he stumbled upon a well-worn copy of The Art of Animation, Bob Thomas’s history of the Disney Studios, in his high school library. By the time I met John, he was as connected to Walt Disney as any twenty-six-year-old on earth. He had graduated from CalArts, the legendary art school founded by Walt, where he’d learned from some of the greatest artists of Disney’s Golden Age; he’d worked as a river guide on the Jungle Cruise at Disneyland; and he’d won a Student Academy Award in 1979 for his short film The Lady and the Lamp—an homage to Disney’s Lady and the Tramp—whose main character, a white desk lamp, would later evolve into our Pixar logo. What John hadn’t realized when he joined Disney Animation, however, was that the studio was going through a rough, fallow period. The animation there had plateaued much earlier—no significant technical advances had been made since 1961’s 101 Dalmatians, and many of its young, talented animators had left the studio, reacting in part to an increasingly hierarchical culture that didn’t value their ideas. When John arrived in 1979, Frank Thomas, Ollie Johnston, and the rest of the Nine Old Men were getting up in years—the youngest was 65—and had stepped away from the day-to-day business of moviemaking, leaving the studio in the hands of a group of lesser artists who had been waiting in the wings for decades.
These men felt that it was their turn to be in charge but were so insecure about their standing within the company that they clung to their newfound status by stifling—not encouraging—younger talents. Not only were they uninterested in the ideas of their fledgling animators; they also exercised a sort of punitive power. They were seemingly determined that those beneath them not rise in the ranks any faster than they already had. John was almost immediately unhappy in this noncollaborative environment, though it was still a shock when he got fired. No wonder he was so eager to join us at Lucasfilm. The project we enlisted John’s help on was originally going to be called My Breakfast with André, an homage of sorts to a 1981 movie we all loved called My Dinner with André. The idea was simple: an android named André was supposed to wake up, yawn, and stretch as the sun rose, revealing a lush, computer-rendered world. Alvy had drawn the first storyboards and was taking the lead on the project, which was a way for us to test some of the new animation technology we’d developed, and he was thrilled that John was coming aboard to help. John was an effusive presence with a knack for bringing out the best in others. His energy would enliven the film. “Do you mind if I say a couple of things?” John asked Alvy after being shown the early storyboards. “Of course not,” Alvy responded. “That’s why you’re here.” As Alvy tells it, John then “proceeded to save the piece. I’d foolishly thought I’d be the animator, but frankly, I didn’t have the magic. I could make things move very nicely, but not think, emote and have consciousness. That’s John.” John made some suggestions about the look of the main character, a simple, human-like figure with a sphere for a head and another sphere for a nose. But his most brilliant stroke was adding a second character for André to interact with: a bumblebee named Wally B. (Wally, by the way, was named for Wallace Shawn, who’d starred in the movie that inspired our short.) The film was renamed The Adventures of André and Wally B., and it opened with André on his back, asleep in the forest, waking to find Wally B. hovering just above his face. Frightened, he flees as Wally B. gives chase, buzzing right behind him. That is the entire plot, if you can call it that—frankly, we weren’t as focused on story as we were on showing what was possible to render with a computer. John’s genius was in creating an emotional tension, even in this briefest of formats. The movie was designed to run two minutes, but we were still racing against time to complete it. It wasn’t just that the animation process was labor-intensive, though it surely was; it was that we were inventing the animation process as we went along. Adding to the stress was the fact that we’d left ourselves so little time to get it all done. We had a self-imposed deadline of July 1984—just eight months after John came aboard—because that was when the annual SIGGRAPH Conference would be held in Minneapolis. This week-long computer graphics summit was a great place to find out what everyone in the field was up to, the one time every year that academics, educators, artists, hardware salesmen, graduate students, and programmers all came together under one roof. According to tradition, the Tuesday of conference week was reserved for “movie night,” with a showing of the most exciting visual work produced in the field that year.
Up until then, that had meant mostly fifteen-second snippets of flying network news logos (think spinning globes and rippling American flags) and scientific visualization (everything from NASA’s Voyager 2 fly-by of Saturn to illustrations of Contac time-release cold capsules). Wally B. would be the first computerized character animation ever shown at SIGGRAPH. As the deadline approached, however, we realized that we weren’t going to make it. We’d worked so hard to create images that were better and clearer, and, to make things really hard, we’d set the movie in a forest (whose foliage tested the limits of our animation chops at the time). But we hadn’t accounted for how much computer power those images would require to render and how long that process would take. We could complete a rough version of the film in time, but portions of it would be unfinished, appearing as wire-frame images—mock-ups, made from grid polygons, of the finished characters—instead of fully colored images. The night of our premiere, we watched, mortified, as these segments appeared on the screen, but something surprising happened. Despite our worries, the majority of the people I talked to after the screening said that they hadn’t even noticed that the movie had switched from full color to black-and-white wire frames! They were so caught up in the emotion of the story that they hadn’t noticed its flaws. This was my first encounter with a phenomenon I would notice again and again, throughout my career: For all the care you put into artistry, visual polish frequently doesn’t matter if you are getting the story right. In 1983, George and his wife Marcia split up, and the settlement would significantly affect the cash position of Lucasfilm. George hadn’t lost an ounce of his ambition, but the new financial realities meant that he had to streamline his business. At the same time, I was coming to realize that while we in the computer division wanted more than anything to make an animated feature film, George didn’t share our dream. He had always been most interested in what computers could do to enhance live-action films. For a while our goals, though disparate, had overlapped and pushed each other forward. But now, under pressure to consolidate his investments, George decided to sell us. The computer division’s primary asset was the business we’d created around the Pixar Image Computer. Although we originally designed it to handle frames of film, it had proven to have multiple applications, including everything from medical imaging to design prototyping to image processing for the many three-letter agencies around Washington, D.C. The next year would be one of the most stressful of my life. A management team brought in by George to restructure Lucasfilm seemed concerned mostly with cash flow, and as time went on, they became openly skeptical that our division would ever attract a buyer. This team was headed by two men with the same first name, whom Alvy and I nicknamed “the Dweebs” because they didn’t understand a thing about the business we were in. Those two guys threw around management consulting terms (they loved to tout their “corporate intuition” and constantly urged us to make “strategic alliances”), but they didn’t seem at all insightful about how to make us attractive to buyers or about which buyers to pursue. At one point, they called us into an office, sat us down, and said that to cut costs, we should lay off all our employees until after our division was sold—at which point we could discuss rehiring them.
In addition to the emotional toll we knew this would take, what bugged us about this suggestion was that our real selling point—the thing that had attracted potential suitors thus far—was the talent we’d gathered. Without that, we had nothing. So, when our two like-minded overlords demanded a list of names of people to lay off, Alvy and I gave them two: his and mine. That temporarily halted that plan, but as we headed into 1985, I was keenly aware that if we weren’t sold off, and fast, we could be shut down at any moment. Lucasfilm wanted to walk away from the deal with $15 million in cash, but there was a hitch: Our computer division came with a business plan that required an additional infusion of $15 million to take us from prototype to product and ensure that we’d be able to stand on our own. This structure did not sit well with the venture capitalists they hoped would buy us, who didn’t typically make such significant cash commitments when they acquired companies. We were shopped to twenty prospective buyers, none of whom bit. When that list was exhausted, a string of manufacturing companies stopped in to kick our tires. Again, no luck. At long last, our group reached an agreement with General Motors and Philips, the Dutch electronics and engineering conglomerate. Philips was interested because, with our Pixar Image Computer, we had developed the foundational technology for rendering volumes of data, such as you get from CT scans or MRIs. General Motors was intrigued because we were leading the way in the modeling of objects, which they felt could be used in car design. We were within one week of signing the deal when it fell apart. At this point, I remember feeling a mixture of despair and relief. We’d known from the outset that entering into a relationship with GM and Philips would likely put an end to our dream of making the first animated feature film, but that was a risk no matter who we joined up with: Each investor was going to have its own agenda, and that was the price of our survival. To this day, I am thankful that the deal went south. Because it paved the way for Steve Jobs. I first met Steve in February of 1985, when he was the chairman of Apple Computer, Inc. Our meeting had been arranged by Apple’s chief scientist, Alan Kay, who knew that Alvy and I were looking for investors to take our graphics division off George’s hands. Alan had been at the U of U with me and at Xerox PARC with Alvy, and he told Steve that he should visit us if he wanted to see the cutting edge in computer graphics. We met in a conference room with a whiteboard and a large table surrounded by chairs—not that Steve stayed seated for very long. Within minutes, he was standing at the whiteboard, drawing us a chart of Apple’s revenues. I remember his assertiveness. There was no small talk. Instead, there were questions. Lots of questions. What do you want? Steve asked. Where are you heading? What are your long-term goals? He used the phrase “insanely great products” to explain what he believed in. Clearly, he was the sort of person who didn’t let presentations happen to him, and it wasn’t long before he was talking about making a deal. To be honest, I was uneasy about Steve. He had a forceful personality, whereas I do not, and I felt threatened by him. For all of my talk about the importance of surrounding myself with people smarter than myself, his intensity was at such a different level, I didn’t know how to interpret it.
It put me in mind of an ad campaign that the Maxell cassette tape company released around this time, featuring what would become an iconic image: a guy sitting low in a leather-and-chrome Le Corbusier chair, his long hair literally blown back by the sound from the stereophonic speaker in front of him. That’s what it was like to be with Steve. He was the speaker. Everyone else was the guy. For nearly two months after that initial meeting, we heard nothing. Total silence. We were perplexed, given how intent Steve had been in our meetings. We finally learned why when, in late May, we read in the papers of Steve’s blowup with Apple CEO John Sculley. Sculley had persuaded Apple’s board of directors to remove Steve from his duties as head of the company’s Macintosh division after rumors surfaced that Steve was trying to stage a boardroom coup. When the dust settled, Steve sought us out again. He wanted a new challenge and thought maybe we were it. He came to Lucasfilm one afternoon for a tour of our hardware lab. Again, he pushed and prodded and poked. What can the Pixar Image Computer do that other machines on the market can’t? Who do you envision using it? What’s your long-term plan? His aim didn’t seem to be to absorb the intricacies of our technology as much as to hone his own argument, to temper it by sparring with us. Steve’s domineering nature could take one’s breath away. At one point he turned to me and calmly explained that he wanted my job. Once he took my place at the helm, he said, I would learn so much from him that in just two years I would be able to run the enterprise all by myself. I was, of course, already running the enterprise by myself, but I marveled at his chutzpah. He not only planned to displace me in the day-to-day management of the company, he expected me to think it was a great idea! Steve was hard-charging—relentless, even—but a conversation with him took you places you didn’t expect. It forced you not just to defend but also to engage. And that in itself, I came to believe, had value. The next day, several of us drove out to meet with Steve at his place in Woodside, a lovely town near Menlo Park. The house was almost empty but for a motorcycle, a grand piano, and two personal chefs who had once worked at Chez Panisse. Sitting on the grass looking out over his seven-acre lawn, he formally proposed that he buy the graphics group from Lucasfilm and showed us a proposed organizational chart for the new company. As he spoke, it became clear to us that his goal was not to build an animation studio; his goal was to build the next generation of home computers to compete with Apple. This wasn’t merely a deviation from our vision; it was the total abandonment of it, so we politely declined. We returned to the task of trying to find a buyer. Time was running out. Months passed. As we approached the one-year anniversary of our unveiling of The Adventures of André and Wally B., our anxiety—the kind that builds when survival is at stake and saviors are in short supply—was showing on our faces. Still, we had fortune on our side—or, at least, geography. The 1985 SIGGRAPH conference was being held in San Francisco, right up the 101 freeway from Silicon Valley. We had a booth on the trade show floor where we showcased our Pixar Image Computer. Steve Jobs dropped by on the first afternoon. Immediately, I sensed a change. Since I’d last seen him, Steve had founded a personal computer company, NeXT.
I think that gave him the ability to approach us with a different mindset. He had less to prove. Now, he looked around our booth and proclaimed our machine the most interesting thing in the room. “Let’s go for a walk,” he said, and we set off on a stroll around the hall. “How are things going?” “Not great,” I confessed. We were still hoping to find an outside investor, but we were nearly out of options. It was then that Steve raised the idea of resuming our talks. “Maybe we can work something out,” he said. As we talked, we came upon Bill Joy, one of the founders of Sun Microsystems. Bill, like Steve, was an extraordinarily bright, competitive, articulate, and opinionated person. I don’t remember what they talked about as we stood there, but I’ll never forget the way they talked: standing nose to nose, their arms behind their backs, swaying from side to side—in perfect sync—completely oblivious to anything going on around them. This went on for quite a while, until Steve had to break off to go meet someone. After Steve left, Bill turned to me and said, “Boy, is he arrogant.” When Steve came by our booth again later, he walked up to me and said of Bill: “Boy, is he arrogant.” I remember being struck by this clash-of-the-titans moment. I was amused by the fact that each man could see ego in the other but not in himself. It took another few months, but on the third day of January, 1986, Steve said he was ready to make a deal and addressed, right off, the issue that had concerned me most—his previous insistence on controlling and running the company. He was willing to back off on that, he said, and not only that, he was open to letting us explore making a business out of the nexus of computers and graphics. By the end of the meeting, Alvy and I felt comfortable with his proposal—and his intentions. The only wild card was what he was going to be like as a partner. We were well aware of his reputation for being difficult. Only time would tell whether he would live up to it. At one point in this period, I met with Steve and gently asked him how things got resolved when people disagreed with him. He seemed unaware that what I was really asking him was how things would get resolved if we worked together and I disagreed with him, for he gave a more general answer. He said, “When I don’t see eye to eye with somebody, I just take the time to explain it better, so they understand the way it should be.” Later, when I relayed this to my colleagues at Lucasfilm, they laughed. Nervously. I remember one of Steve’s attorneys telling us that if we were acquired by his client, we had better be ready to “get on the Steve Jobs roller coaster.” Given our dire straits, this was a ride Alvy and I were ready to board. The acquisition process was complicated by the fact that the negotiators for Lucasfilm weren’t very good. The chief financial officer, in particular, underestimated Steve, assuming he was just another rich kid in over his head. This CFO told me that the way to establish his authority in the room was to arrive last. His thinking, which he articulated out loud to me, was that this would establish him as the “most powerful player,” since he and only he could afford to keep everyone else waiting. All that it ended up establishing, however, was that he’d never met anyone like Steve Jobs. The morning of the big negotiating session, all of us but the CFO were on time—Steve and his attorney; me, Alvy, and our attorney; Lucasfilm’s attorneys; and an investment banker.
At precisely 10 A.M., Steve looked around and, finding the CFO missing, started the meeting without him! In one swift move, Steve had not only foiled the CFO’s attempt to place himself atop the pecking order, but he had grabbed control of the meeting. This would be the kind of strategic, aggressive play that would define Steve’s stewardship of Pixar for years to come—once we joined forces, he became our protector, as fierce on our behalf as he was on his own. In the end, Steve paid $5 million to spin Pixar off of Lucasfilm—and then, after the sale, he agreed to pay another $5 million to fund the company, with 70 percent of the stock going to Steve and 30 percent to the employees. The closing took place on a Monday morning in February 1986, and the mood in the room was decidedly muted because everyone was so worn out by the negotiations. After we signed our names, Steve pulled Alvy and me aside, put his arms around us and said, “Whatever happens, we have to be loyal to each other.” I took that as an expression of his still-bruised feelings in the wake of his ouster from Apple, but I never forgot it. The gestation had been trying, but the feisty little company called Pixar had been born.

CHAPTER 3

A DEFINING GOAL

There is nothing quite like ignorance combined with a driving need to succeed to force rapid learning. I know this from firsthand experience. In 1986, I became the president of a new hardware company whose main business was selling the Pixar Image Computer. The only problem was, I had no idea what I was doing. From the outside, Pixar probably looked like your typical Silicon Valley startup. On the inside, however, we were anything but. Steve Jobs had never manufactured or marketed a high-end machine befor...
Explanation & Answer


Name:
Institutional Affiliation:
Date:

Creativity Inc
Introduction
Pixar today stands out as one of the most successful companies in the entertainment industry, widely known for superior films such as Toy Story and Finding Nemo. For years the studio has produced animated features that attract audiences across the globe. This success rests on a set of components and capabilities that drive the company's value chain and support customer retention. Factors such as a superior management team and workforce play a critical role in boosting Pixar's success and productivity, and the company's strong recognition in the market has lifted its overall sales. A sustained focus on animation and creativity keeps it ahead of rivals in the same industry and marketplace. To succeed against those competitors, the company must operate under a set of superior values and principles, among other critical assets, that will see it successfully undertake its d...

