Two Assignments


Computer Science

Description

Question 1: Read the paper carefully and summarize it in your own words using the Paper Summary Template.

Question 2: Answer the Review Questions. The file is uploaded.

Attachment Preview

Zhang, X., Nickels, D. W., & Stafford, T. F. (2010). Understanding the organizational impact of radio frequency identification technology: A holistic view. Pacific Asia Journal of the Association for Information Systems, 2(2), 1-17.

Purpose (What are the objectives for writing the paper?):
Provide an overview of RFID technology and its uses, and propose its effects on an organization's IT infrastructure, business intelligence, and decision making.

Design / Methodology / Approach (How are the objectives achieved? Include the main methods used for the research and the approach to the topic.):
• Provide a comprehensive overview of RFID and its current uses.
• Propose general effects of using RFID on IT infrastructure, business intelligence, and decision making in the form of propositions.
• Propose a theory dubbed the "IT Decision Chain" that employs and links the propositions.

Main Points / Findings / Conclusions (What are the main points? What was found in the course of the work, and what are the major conclusions? This will refer to analysis, discussion, or results.):
• RFID is a diverse architecture with unlimited uses for tracking objects.
• RFID presents challenges to existing IT systems during design and implementation.
• RFID can increase the level and efficiency of real-time data in a retail supply chain.
• Assertions (propositions) can be made regarding the effects of an RFID-based system on IT infrastructure, business intelligence, and decision making.
• The effects (propositions) on a business using an RFID-based system can be linked into a decision chain.
• The creative use of the data from an RFID system is the real value, not the technology itself.

Implications to Practice and Knowledge (What outcomes and implications for practice and knowledge as well as applications and consequences are identified?):
The contribution to knowledge consists of a comprehensive review of the literature on RFID technology and its uses. That review is used as the basis for propositions about the effects that an RFID infrastructure and the resulting data can have on a firm's overall IT infrastructure, business intelligence, and decision making. The propositions are then linked into an information technology decision chain that establishes a hypothesis for understanding the organizational impacts of employing RFID.

Critique (Which parts of the paper you like, and which parts of the paper you don't like? Why?):
The article overall read well and flowed nicely through three major phases: the literature review, the establishment of the propositions, and the IT Decision Chain hypothesis. One key area for IT that appeared somewhat overlooked in the literature review was the use of RFID for securing assets outside of the supply chain. For large companies, transferring and moving capital assets unrelated to supply chain operations could benefit greatly from an RFID infrastructure, a secondary benefit to which value could be attributed. Proposition 1 includes data abstraction at the data source, but this is fairly well established already: the basic tag carries just a simple number, while the back-end data system returns detailed information about the tagged item (a minimal illustration of this lookup pattern follows below).
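To make the data-abstraction point concrete, here is a small sketch of the lookup pattern in Python. It is illustrative only: the tag ID, the record fields, and the resolve_tag function are hypothetical and are not taken from the paper.

    # Hypothetical sketch of RFID data abstraction: the tag itself carries
    # only an identifier; all descriptive data lives in a back-end system
    # keyed by that identifier.

    # Toy back-end table mapping tag IDs to item records (made-up data).
    ITEM_DB = {
        "3000E2003412DC03": {
            "sku": "JKT-0042",
            "desc": "Denim jacket, size M",
            "location": "DC-7, aisle 12",
        },
    }

    def resolve_tag(tag_id):
        """Return the detailed record behind a bare tag ID, or a stub if unknown."""
        return ITEM_DB.get(tag_id, {"sku": None, "desc": "unregistered tag"})

    # A reader reports only the number; the richness comes from the lookup.
    print(resolve_tag("3000E2003412DC03"))

This is exactly why the tag number alone reveals little: the value comes from the data system behind it.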
The paper and hypothesis also seem to be directed specifically at retail supply chains, although that is made explicit only at times. It does appear that retail supply chains would benefit most from current RFID infrastructure, while tracking of small parts and raw resources for manufacturing would benefit only at the container level. I did like the assertion of the propositions. They progressed logically from the establishment of the infrastructure and data gathering up through the high-level decision-making process. In particular, Proposition 4, regarding the increased use of data mining, appeared to be the most important proposition established. Data mining, especially for customer relationship management, could benefit greatly from more real-time data management. This aspect of the data gathered by the infrastructure could be the most valuable, as now more than ever companies try to establish a competitive edge through more up-to-date customer data. I am not that versed in business intelligence, so I had to take Propositions 5 and 6 at face value, though they do make logical sense. Overall, I thought the logical progression through the eight propositions helped establish the benefits of moving to an RFID infrastructure, as value is established by the propositions at all three levels: infrastructure, business intelligence, and decision making.

Paper Summary Template

Provide the article's citation information in APA style here

Purpose (What are the objectives for writing the paper?):

Design / Methodology / Approach (How are the objectives achieved? Include the main methods used for the research and the approach to the topic.):

Main Points / Findings / Conclusions (What are the main points? What was found in the course of the work, and what are the major conclusions? This will refer to analysis, discussion, or results.):

Implications to Practice and Knowledge (What outcomes and implications for practice and knowledge as well as applications and consequences are identified?):

Critique (Which parts of the paper you like, and which parts of the paper you don't like? Why?):

The End of Corporate Computing
Nicholas G. Carr
MIT Sloan Management Review, Spring 2005, Vol. 46, No. 3

Something happened in the first years of the 20th century that would have seemed unthinkable just a few decades earlier: Manufacturers began to shut down and dismantle their water wheels, steam engines and electric generators. Since the beginning of the Industrial Age, power generation had been a seemingly intrinsic part of doing business, and mills and factories had had no choice but to maintain private power plants to run their machinery. As the new century dawned, however, an alternative started to emerge. Dozens of fledgling electricity producers began to erect central generating stations and use a network of wires to distribute their power to distant customers. Manufacturers no longer had to run their own dynamos; they could simply buy the electricity they needed, as needed, from the new suppliers. Power generation was being transformed from a corporate function to a utility.

Almost exactly a century later, history is repeating itself. The most important commercial development of the last 50 years — information technology — is undergoing a similar transformation. It, too, is beginning an inexorable shift from being an asset that companies own in the form of computers, software and myriad related components to being a service that they purchase from utility providers.
Few in the business world have contemplated the full magnitude of this change or its far-reaching consequences. To date, popular discussions of utility computing have rarely progressed beyond a recitation of IT vendors' marketing slogans, laden with opaque terms like "autonomic systems," "server virtualization" and "service-oriented architecture."1 Rather than illuminate the future, such gobbledygook has only obscured it. The prevailing rhetoric is, moreover, too conservative. It assumes that the existing model of IT supply and use will endure, as will the corporate data center that lies at its core. But that view is perilously shortsighted. The traditional model's economic foundation already is crumbling and is unlikely to survive in the long run. As the earlier transformation of electricity supply suggests, IT's shift from a fragmented capital asset to a centralized utility service will be momentous. It will overturn strategic and operating assumptions, alter industrial economics, upset markets and pose daunting challenges to every user and vendor. The history of the commercial application of information technology has been characterized by astounding leaps, but nothing that has come before — not even the introduction of the personal computer or the opening of the Internet — will match the upheaval that lies just over the horizon.

After pouring millions of dollars into in-house data centers, companies may soon find that it's time to start shutting them down. IT is shifting from being an asset companies own to a service they purchase.

Nicholas G. Carr is the author of Does IT Matter? Information Technology and the Corrosion of Competitive Advantage (Harvard Business School Press, 2004) and former executive editor of the Harvard Business Review. He can be reached at ncarr@nicholasgcarr.com.

From Asset to Expense

Information technology, like steam power and electricity before it, is what economists call a general-purpose technology.2 It is used by all sorts of companies to do all kinds of things, and it brings widespread and fundamental changes to commerce and society. Because of its broad application, a general-purpose technology offers the potential for considerable economies of scale if its supply can be consolidated. But those economies can take a long time to be fully appreciated and even longer to be comprehensively exploited.

During the early stages in the development of a general-purpose technology, when there are few technical standards and no broad distribution network, the technology is impossible to furnish centrally. By necessity, its supply is fragmented. Individual companies must purchase the various components required to use the technology, house those parts on site, meld them into a working system and hire a staff of specialists to maintain them. Such fragmentation of supply is inherently wasteful: It forces large capital investments and heavy fixed costs on firms and leads to redundant expenditures and high levels of overcapacity, both in the technology itself and in the labor force operating it. The situation is ideal for the suppliers of the components of the technology, since they reap the benefits of overinvestment, but it is ultimately unsustainable. As the technology matures and central distribution becomes possible, large-scale utility suppliers arise to displace the private providers.
Although companies may take years to abandon their proprietary supply operations and all the sunk costs they represent, the savings offered by utilities eventually become too compelling to resist, even for the largest enterprises. Abandoning the old model becomes a competitive necessity.

The evolution of electricity supply provides a clear model of this process. When the commercial production of electricity became possible around 1880, many small utility suppliers quickly popped up in urban areas. These were largely mom-and-pop operations that used tiny coal-fired dynamos to generate modest amounts of power. The electricity they produced was in the form of direct current, which could not be transmitted very far, so their service distance was limited to about a mile. And their high-cost operations forced them to charge steep prices, so their customers were generally restricted to prosperous stores and offices, wealthy homeowners and municipal agencies, all of which used the electricity mainly for lighting.

Relying on these small central stations was not an option for large industrial concerns. To produce the great quantities of reliable electricity needed to run their plants, these companies had no choice but to build their own dynamos. They contracted with electrical supply houses like General Electric and Westinghouse to provide the components of on-site generators as well as the expertise and personnel needed to construct them, and they hired electrical engineers and other specialists to operate the complex equipment and meld it with their production processes. During the early years of electrification, privately owned dynamos quickly came to dominate. By 1902, 50,000 private generating plants had been built in the United States, far outstripping the 3,600 central stations run by utilities.3 By 1907, factories were producing about 60% of all the electricity used in the country.4

But even as big manufacturers rushed to set up in-house generators, some small industrial concerns, such as urban printing shops, were taking a different route. They couldn't afford to build generators and hire workers to maintain them, so they had to rely on nearby central stations, even if that meant paying high per-kilowatt rates and enduring frequent disruptions in supply. At the time, these small manufacturers must have felt like laggards in the race to electrification, forced to adopt a seemingly inferior supply model in order to tap into the productivity gains of electric power. As it turned out, they were the vanguard. Soon, even their largest counterparts would be following their lead, drawn by the increasingly obvious advantages of purchasing electricity from outside suppliers.

A series of technical advances set the stage for this shift. First, massive thermal turbines were developed, offering the potential for much greater economies of scale. Second, the introduction of alternating current allowed power to be transmitted over great distances, expanding the sets of customers that central plants could serve. Third, converters were created that enabled utilities to switch between different forms of current, allowing old equipment to be incorporated into the new system.
Finally, electric motors capable of operating on alternating current were invented, enabling factories to tap into the emerging electric grid to run their machines. As early as 1900, all the technological pieces were in place to centralize the supply of power to manufacturers and render obsolete their isolated power plants.5

Technical progress was not enough, however. To overturn the status quo, a business visionary was needed, someone able to see how the combination of technological, market and economic trends could lead to an entirely new model of utility supply. That person arrived in the form of a bespectacled English bookkeeper named Samuel Insull. Infatuated by electricity, Insull emigrated to New York in 1880 and soon became Thomas Edison's most trusted advisor, helping the famous inventor expand his business empire. But Insull's greatest achievement came after he left Edison's employ in 1892 and moved to Chicago, where he assumed the presidency of a small, independent power producer with three central stations and just 5,000 customers. In less than 25 years, he would turn that little company into one of the country's largest enterprises, a giant monopolistic utility named Commonwealth Edison.

Insull was the first to realize that, by capitalizing on new technologies to consolidate generating capacity, centralized utilities could fulfill the power demands of even the largest factories. Moreover, utilities' superior economies of scale, combined with their ability to spread demand across many users and thus achieve higher capacity-utilization rates, would enable them to provide much cheaper electricity than manufacturers could achieve with their private, subscale dynamos. Insull acted aggressively on his insight, buying up small utilities throughout Chicago and installing mammoth 5,000-kilowatt generators in his own plants. Equally important, he pioneered electricity metering and variable pricing, which enabled him to slash the rates charged to big users and further smooth demand.
Finally, he launched an elaborate marketing campaign to convince manufacturers that they would be better off shutting down their generators and buying electricity from his utility.6 As Chicago manufacturers flocked to his company, Insull's vision became reality. In 1908, a reporter for Electrical World and Engineer noted, "although isolated plants are still numerous in Chicago, they were never so hard pressed by central station service as now. … The Commonwealth Edison Company has among its customers establishments formerly run by some of the largest isolated plants in the city."7 The tipping point had arrived.

Although many manufacturers would continue to produce their own electricity for years to come, the transition from private to utility power was under way. Between 1907 and 1920, utilities' share of total U.S. electricity production jumped from 40% to 70%; by 1930, it had reached 80%.8 By changing their view of electricity from a complex asset to a routine variable expense, manufacturers reduced their fixed costs and freed up capital for more productive purposes. At the same time, they were able to trim their corporate staffs, temper the risk of technology obsolescence and malfunction and relieve their managers of a major source of distraction. Once unimaginable, the broad adoption of utility power had become inevitable. The private power plant was obsolete.

Sidebar: So Long, PC

If there's a perfect symbol of corporate IT today, it's the personal computer. Not only is the PC ubiquitous in modern companies, dominating the desks of most office workers, it is also a microcosm of the overall state of computing resources at the typical corporation: fragmented, redundant and increasingly underutilized. The invention of the PC was a great advance, one of the most important in recent business history. It dispersed the power of computing to individuals, spurred ingenuity, increased personal productivity and undoubtedly sped the development of networks, including the Internet and World Wide Web. But the rise of robust, high-capacity networks has also made the desktop PC less essential; computing resources can increasingly be provisioned to users from afar. And while the capacity of PCs has exploded, the needs of users have failed to keep pace. Few workers employ more than a tiny fraction of the computing horsepower at their disposal, and the multigigabyte hard drives of modern PCs tend to be either empty or filled with nonessential files. Some have argued that PCs are now so cheap that it doesn't matter that they're largely wasted. But that doesn't account for the considerable costs of maintaining and updating huge fleets of PCs and their associated software. It also overlooks the fact that PCs often represent the biggest security hole in today's companies, a gateway for hackers and a repository of ready evidence for the litigious.

In the late 1990s, Oracle CEO Larry Ellison was roundly criticized for predicting that the PC, which he called "a ridiculous device," would be supplanted by so-called thin clients — terminals and other stripped-down devices connected to centralized computers.i If Ellison's timing was off, however, his assessment was not. The case for keeping desktop computers in companies will steadily weaken as utility computing becomes widespread. Unlike in the home, where the PC is the engine of computing, in business it is just a cog, and an increasingly unnecessary one at that.

i. K. Girard, "Ellison Resurrects Network Computer," Nov. 16, 1999, http://news.com.com/Ellison+resurrects+network+computer/2100-1001_3-233137.html.

IT's Transformation Begins

Of course, all historical analogies have their limits, and information technology differs from electricity in many important ways. IT, for instance, incorporates software, which is a product of human creativity that is protected by intellectual property rights. But there are deep similarities as well — similarities that are easy for modern-day observers to overlook. Today, people see electricity as a "simple" utility, a standardized and unremarkable current that comes safely and predictably through sockets in walls. The innumerable applications of electric power, from table lamps in homes to machine tools on assembly lines, have become so commonplace that we no longer consider them to be elements of the underlying technology — they've taken on separate, familiar lives of their own. But it wasn't always so. When electrification began, it was a complex, unpredictable and largely untamed force that changed almost everything it touched. Its application layer, to borrow a modern term, was as much a part of the technology as the dynamos, the power lines and the current itself. All companies had to figure out how to apply electricity to their own businesses, often making sweeping changes to long-standing practices, work flows and organizational structures.
As the technology advanced, they had to struggle with old and often incompatible equipment — the "legacy systems" that can impede progress.

As a business resource, or input, information technology today certainly looks a lot like electric power did at the start of the last century. Companies go to vendors to purchase various components, such as computers, storage drives, network switches and all sorts of software, and cobble them together into complex information-processing plants, or data centers, that they house within their own walls. They hire specialists to maintain the plants and often bring in outside consultants to solve particularly thorny problems. Their executives are routinely sidetracked from their real business — manufacturing automobiles, for instance, and selling them at a profit — by the need to keep their company's private IT infrastructure running smoothly.

The creation of tens of thousands of independent data centers, all using virtually the same hardware and for the most part running similar software, has imposed severe penalties on individual firms as well as on the broader economy.9 It has led to the overbuilding of IT assets, resulting in extraordinarily low levels of capacity utilization. One recent study of six corporate data centers revealed that most of their 1,000 servers were using just 10% to 35% of their available processing power.10 Desktop computers fare even worse, with IBM Corp. estimating average capacity utilization rates of just 5%.11 (See the sidebar "So Long, PC.") Gartner Inc., the research consultancy based in Stamford, Connecticut, suggests that between 50% and 60% of a typical company's data storage capacity is wasted.12 And overcapacity is by no means limited to hardware. Because software applications are highly scalable — in other words, able to serve additional users at little or no incremental cost — installations of identical or similar programs at thousands of different sites create acute diseconomies in both upfront expenditures and ongoing costs and fees. The replication from company to company of IT departments that share many of the same technical skills represents an overinvestment in labor as well. According to a 2003 survey, about 60% of the average U.S. company's IT staffing budget goes to routine support and maintenance functions.13

When overcapacity is combined with redundant functionality, the conditions are ripe for a shift to centralized supply. Yet companies continue to invest large sums in maintaining and even expanding their private, subscale data centers. Why? For the same reason that manufacturers continued to install private electric generators during the early decades of the 20th century: because of the lack of a viable, large-scale utility model. But such a model is now emerging. Rudimentary forms of utility computing are proliferating, and many companies are moving quickly to capitalize on them. Some are using the vast data centers maintained by vendors like IBM, Hewlett-Packard and Electronic Data Systems to supplement or provide an emergency backup to their own hardware. Others are tapping into applications that run on the computers of distant software suppliers.
Such hosted programs, which include systems for procurement, transportation management, financial accounting, customer service, sales-force management and many other functions, demonstrate that even very complex applications can be supplied as utility services over the Internet. (See the sidebar "The Pathbreakers.")

What these early efforts don't show is the full extent and power of a true utility model. Today's piecemeal utility services exist as inputs into traditional data centers; individual companies still must connect them with their old hardware and software. Indeed, firms often forgo otherwise attractive utility services or run into problems with outsourcing arrangements because the required integration with their legacy systems is so difficult. True utility computing will have arrived only when an outside supplier takes responsibility for delivering all of a company's IT requirements, from data processing to storage to applications.

The utility model requires that ownership of the assets that have traditionally resided inside widely dispersed data centers be consolidated and transferred to utilities. That process will take years to unfold, but the technological building blocks are already moving into place. Three advances — virtualization, grid computing and Web services — are of particular importance, although their significance has often been obscured by the arcane terminology used to describe them. In different ways, these three technologies play a role similar to that of the early current converters: They enable a large, tightly integrated system to be constructed out of heterogeneous and previously incompatible components. Virtualization erases the differences between proprietary computing platforms, enabling applications designed to run on one operating system to be deployed elsewhere. Grid computing allows large numbers of hardware components, such as servers or disk drives, to effectively act as a single device, pooling their capacity and allocating it automatically to different jobs. Web services standardize the interfaces between applications, turning them into modules that can be assembled and disassembled easily.

Individually all these technologies are interesting, but combined they become truly revolutionary. Together with high-capacity, fiber-optic communication networks, they can turn a fragmented, unwieldy set of hardware and software components into a single, flexible infrastructure that numerous companies can share, each deploying it in a different way. And as the number of users served by a system goes up, its demand load becomes more balanced, its capacity utilization rate rises and its economies of scale expand. (A toy calculation after this paragraph illustrates the pooling effect.) Given that these technologies will evolve and advance while new and related ones emerge, the ability to provide IT as a utility — and the economic incentives for doing so — will only continue to grow.
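The pooling argument is easy to see with numbers. The following toy calculation uses made-up workloads (none of these figures come from the article) and compares the capacity needed when each of three companies provisions for its own peak against the capacity one shared utility needs for the combined peak. Because the peaks do not coincide, the pooled system needs less hardware and runs at higher utilization.

    # Toy illustration of demand pooling (hypothetical numbers, not from the article).
    # Each row is one company's load over eight periods, in arbitrary compute units.
    loads = [
        [2, 2, 3, 8, 9, 4, 2, 1],  # company A peaks mid-morning
        [1, 1, 2, 2, 3, 7, 9, 5],  # company B peaks in the evening
        [6, 8, 5, 2, 1, 1, 2, 3],  # company C peaks overnight
    ]

    # Isolated model: each company buys capacity equal to its own peak.
    isolated_capacity = sum(max(row) for row in loads)  # 9 + 9 + 8 = 26

    # Utility model: one provider buys capacity for the combined peak.
    combined = [sum(period) for period in zip(*loads)]
    pooled_capacity = max(combined)  # 13

    total_demand = sum(combined)
    periods = len(combined)
    print("isolated capacity:", isolated_capacity)
    print("pooled capacity:  ", pooled_capacity)
    print("isolated utilization: %.0f%%" % (100 * total_demand / (isolated_capacity * periods)))  # ~43%
    print("pooled utilization:   %.0f%%" % (100 * total_demand / (pooled_capacity * periods)))    # ~86%

The same total work gets done with half the installed capacity, which is the economy of scale the article describes.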
The biggest impediment to utility computing will not be technological but attitudinal. As in the shift to centralized electrical power, the prime obstacle will be entrenched management assumptions and the traditional practices and past investments on which they are founded. Large companies will pull the plug on their data centers only after the reliability, stability and benefits of IT utilities have been clearly established. For that to occur, a modern-day Samuel Insull needs to arrive with a clear vision of how the IT utility business will operate, as well as with the imagination and wherewithal to make it happen. Like his predecessor, this visionary will build highly efficient, large-scale IT plants, weave together sophisticated metering and pricing systems and offer attractive and flexible sets of services tailored to diverse clients.14 And he will make a compelling marketing case to corporate executives, demonstrating that centralizing the management of previously dispersed resources not only cuts costs and frees up capital but also improves security, enhances flexibility and reduces risk. He will, in short, invent an industry.

Sidebar: The Pathbreakers

When businesses began to turn to utilities for their electricity supply, smaller organizations led the way. Lacking the cash to build their own power plants, they had little choice but to buy power from outside suppliers. The most aggressive early adopters of utility computing also have tended to be capital-constrained organizations: small and medium-sized businesses, government agencies and nonprofits. The Commonwealth of Pennsylvania, for instance, began to move toward utility computing nearly a decade ago. After several years of planning, the state began closing down the data centers operated by 17 government agencies in the fall of 1999, consolidating their hardware and software in a new facility that is now run by a consortium of suppliers led by Unisys. Similarly, Lincoln Center for the Performing Arts Inc. in New York City has adopted a utility model. It no longer maintains the application and database servers required to sell tickets and perform related functions, instead paying a simple monthly fee to use hardware owned and maintained by IBM.

However, some big corporations are also beginning to embrace utility computing on a large scale. The Australian firm Qantas Airways Ltd., for example, began disassembling its data center in 2004, moving hundreds of servers and mainframe computers to a supplier's facility. It will now pay a variable fee based on its actual usage of computing capacity. The airline has even outsourced its reservation and ticketing system, the very nexus of its operations, to Amadeus Global Travel Distribution SA, a technology provider headquartered in Madrid, Spain. According to Qantas CIO Fiona Balfour, the percentage of the airline's data-center budget that is allocated to fixed costs has been cut from 70% to 30% as a result of this shift to utility supply.i

Many other large companies are setting up their own internal "utilities" to supply computing resources throughout their organizations. They are consolidating previously dispersed computing, storage and networking hardware, imposing stricter software standards and using new technologies like virtualization and Web services to provide business units and corporate departments with services tailored to their particular needs. DHL, the shipping company, recently consolidated its eight North American data centers into a single facility in Arizona. The U.S. arm of Bayer AG, the chemical and drug company, centralized its IT operations by combining 42 data centers into two facilities and halving the number of its servers. The resulting savings: approximately $100 million. Such moves represent a first step toward a broader consolidation of IT resources as large-scale utilities emerge.

i. M. Levinson, "Host With the Most," CIO, July 12, 2004, http://cio.idg.com.au/index.php?taxid=14&id=661732037.

The Shape of a New Industry

Exactly what that industry will look like remains to be seen, but it's possible to envision its contours.
It will likely have three major components. At the center will be the IT utilities themselves — big companies that will maintain core computing resources in central plants and distribute them to end users. Serving the utilities will be a diverse array of component suppliers — the makers of computers, storage units, networking gear, operating and utility software, and applications. Finally, large network operators will maintain the ultra-high-capacity data communication lines needed for the system to work. Some companies no doubt will try to operate simultaneously in more than one of these categories.

What's particularly striking about this model is that it reveals the unique characteristics that make IT especially well suited to becoming a utility service. With electricity, only the basic generation function can be centralized; because the applications are delivered physically through motors, light bulbs and various electronic devices, they have to be provisioned locally, at the user's site. With IT, the immediate applications take the form of software, which can be run remotely by a utility or one of its suppliers. Even applications customized for a single customer can be housed at a supplier's site. The end user needs to maintain only various input and output devices, such as monitors, printers, keyboards, scanners, portable devices, sensors and the like, that are necessary to receive, transmit and manipulate data and, as necessary, to reconfigure the package of services received.15 Although some customers may well choose to run certain applications locally, utilities will be able to own and operate the bulk of the hardware and software, further magnifying their scale advantages.

Which companies will emerge as the new IT utilities? At least four possibilities exist. First are the big traditional makers of enterprise computing hardware that have deep experience in setting up and running complex business systems — companies like IBM, Hewlett-Packard and Sun Microsystems, all of which, not surprisingly, have already been aggressively positioning themselves as suppliers of utility services. Sun, in fact, not only rents processing and storage capacity for a fixed per-unit fee but is also setting up an online auction to sell excess computing power. Second are various specialized hosting operations, like VeriCenter Inc., based in Houston, Texas, or Virginia-based MCI's Digex service, that even today are running the entire data centers of some small and midsized companies. These specialized firms, which struggled to survive after the dot-com collapse, are beginning to resemble the operators of the original central stations during the early stages of electrification. Third are Internet innovators like Google and Amazon.com Inc. that are building extensive, sophisticated computing networks that theoretically could be adapted to much broader uses.16 Finally are the as-yet-unknown startups that could emerge with ingenious new strategies. Because the utility industry will be scale driven and capital intensive, size and focus will be critical to success. Any company will find it difficult to dominate while also pursuing other business goals.
To date, utility computing seems to be following the pattern of disruptive innovation defined by Clayton Christensen of the Harvard Business School: initially gaining traction at the low end of the market before ultimately emerging as the dominant supply model.17 As such, it may pose a grave threat to some of today's most successful component suppliers, particularly companies like Microsoft, Dell, Oracle and SAP that have thrived by selling directly to corporations. The utility model promises to isolate these vendors from the end users and force them to sell their products and services to or through big, centralized utilities, which will have significantly greater bargaining power. Most of the broadly used components, from computers to operating systems to complex "enterprise applications" that automate common business processes, will likely be purchased as cheap, generic commodities.18

Of course, today's leading component suppliers have considerable market power and management savvy, and they have time to adapt their strategies as the utility model evolves. Some may end up trying to forward-integrate into the utility business itself, a move that has good precedent. When manufacturers began to purchase electricity from utilities, the two largest vendors of generators and associated components, General Electric and Westinghouse, expanded aggressively into that business, buying ownership stakes in many electric utilities. As early as 1895, GE had investments totaling more than $59 million in utilities across the United States and Europe.19

But that precedent also reveals the dangers of such consolidation moves for buyers and sellers alike. As the U.S. electricity business became increasingly concentrated in the hands of a few companies, the government, fearful of private monopoly control over such a critical resource, stepped in to impose greater restrictions on the industry. The components of IT are more diverse, but the possibility remains that a few companies will seize excessive control over the infrastructure. Not only would monopolization lead to higher costs for end users, it might also retard the pace of innovation, to the detriment of many. Clearly, maintaining a strong degree of competition among both utilities and component suppliers will be essential to a healthy and productive IT sector in the coming years.

The View From the Future

Any prediction about the future, particularly one involving the pace and direction of technological progress, is speculative, and the scenario laid out here is no exception. But if technological advances are often unforeseeable, the economic and market forces that guide the evolution of business generally play out in logical and consistent ways. The history of commerce has repeatedly shown that redundant investment and fragmented capacity provide strong incentives for centralizing supply. And advances in computing and networking have allowed information technology to operate in an increasingly "virtual" fashion, with ever greater distances between the site of the underlying technological assets and the point at which people access, interpret and manipulate the information. Given this trend, radical changes in corporate IT appear all but inevitable.

Sometimes, the biggest business transformations seem inconceivable even as they are occurring.
Today when people look back at the supply of power in business, they see an evolution that unfolded with a clear and inevitable logic. It's easy to discern that the practice of individual companies building and maintaining proprietary power plants was a transitory phenomenon, an artifact of necessity that never made much sense economically. From the viewpoint of the present, electricity had to become a utility. But what seems obvious now must have seemed far-fetched, even ludicrous, to the factory owners and managers that had for decades maintained their own sources of power.

Now imagine what future generations will see when they look back at the current time a hundred years hence. Won't the private data center seem just as transitory a phenomenon — just as much a stop-gap measure — as the private dynamo? Won't the rise of IT utilities seem both natural and necessary? And won't the way corporate computing is practiced today appear fundamentally illogical — and inherently doomed?

REFERENCES
1. There are notable exceptions. See, for example, M.A. Rappa, "The Utility Business Model and the Future of Computing Services," IBM Systems Journal 43, no. 1 (2004): 32-42; and L. Siegele, "At Your Service," Economist, May 8, 2003 (a survey of the IT industry).
2. The term was introduced in a 1992 paper by T.F. Bresnahan and M. Trajtenberg, later published as "General Purpose Technologies: 'Engines of Growth'?" Journal of Econometrics 65, no. 1 (1995): 83-108. See also E. Helpman, ed., "General Purpose Technologies and Economic Growth" (Cambridge, Massachusetts: MIT Press, 1998).
3. A. Friedlander, "Power and Light: Electricity in the U.S. Energy Infrastructure, 1870-1940" (Reston, Virginia: Corporation for National Research Initiatives, 1996), 51.
4. D.E. Nye, "Electrifying America: Social Meanings of a New Technology" (Cambridge, Massachusetts: MIT Press, 1990), 236.
5. T.P. Hughes, "Networks of Power: Electrification in Western Society, 1880-1930" (Baltimore, Maryland: Johns Hopkins University Press, 1983), 106-139; and R.B. DuBoff, "Electric Power in American Manufacturing, 1889-1958" (New York: Arno Press, 1979), 42-45.
6. For more on Insull's career and accomplishments, see Hughes, "Networks," 201-226; and H. Evans, "They Made America: From the Steam Engine to the Search Engine: Two Centuries of Innovators" (New York: Little, Brown, 2004), 318-333.
7. "The Systems and Operating Practice of the Commonwealth Edison Company of Chicago," Electrical World and Engineer 51 (1908): 1023, as quoted in Hughes, "Networks," 223.
8. DuBoff, "Electric Power," 40.
9. For a discussion of the homogenization of information technology in business, see N.G. Carr, "Does IT Matter? Information Technology and the Corrosion of Competitive Advantage" (Boston: Harvard Business School Press, 2004).
10. A. Andrzejak, M. Arlitt and J. Rolia, "Bounding the Resource Savings of Utility Computing Models," working paper HPL-2002-339, Hewlett-Packard Laboratories, Palo Alto, California, Nov. 27, 2002.
11. V. Berstis, "Fundamentals of Grid Computing," IBM Redbooks Paper, Austin, Texas, Nov. 11, 2002; www.redbooks.ibm.com/redpapers/pdfs/redp3613.pdf.
12. C. Hildebrand, "Why Squirrels Manage Storage Better Than You Do," Darwin, April 2002, www.darwinmag.com/read/040102/squirrels.html.
13. B. Gomolski, "Gartner 2003 IT Spending and Staffing Survey Results" (Gartner Research, Stamford, Connecticut, Oct. 2, 2003).
14. Effective and standardized metering systems will be as crucial to the formation of large-scale IT utilities as they were to electric utilities, and work in this area is progressing rapidly. See, for example, V. Albaugh and H. Madduri, "The Utility Metering Service of the Universal Management Infrastructure," IBM Systems Journal 43, no. 1 (2004): 179-189.
15. Although the shift to utility supply will reduce the need for in-house IT staff, companies will likely maintain groups of professionals with both technical and business skills to ensure that the purchased IT services are properly configured to support in-house processes and vice versa.
16. Google and Amazon.com already provide utility IT services. Companies draw on Google's data centers and software to distribute advertisements over the Internet and add search functions to their corporate Web sites. Amazon, in addition to running its own online store, rents its sophisticated retailing platform to other merchants such as Target, JCPenney and Borders.
17. C.M. Christensen, "The Innovator's Dilemma: When New Technologies Cause Great Firms to Fail" (Boston: Harvard Business School Press, 1997).
18. It's telling that today's vendors of utility IT services such as hosted applications and remote data centers have been among the most aggressive adopters of open-source software and other commodity components.
19. Nye, "Electrifying America," 170-174.

Copyright © Massachusetts Institute of Technology, 2005. All rights reserved.

Review Questions for Chapter 7: Telecommunications, the Internet, and Wireless Technology

1. Describe the features of a simple network and the network infrastructure for a large company.
2. Name and describe the principal technologies and trends that have shaped contemporary telecommunications systems.
3. Compare Web 2.0 and Web 3.0.
4. Define Bluetooth, Wi-Fi, WiMax, 3G, and 4G networks.
5. Define RFID, explain how it works, and describe how it provides value to business.