Center for Coordination Science @ MIT



CCS Home  |  Working Papers   |  What's New?  |  People  |  Links  |  Site Map  |  Comments

NOTE: If an electronic version of a paper is not available here, or you are unable to successfully download the paper that you want in the formats available, please contact MIT Document Services at (617) 253-5668, DOCS@MIT.EDU. Please, do not contact the authors for copies of papers.

No. 101

Object Sharing in a Multi-User Hypertext System.

Melvina H. Tarazi

June, 1989

Object-oriented databases and hypertext systems are emerging in response to a demand by application developers for increased, less structured modeling power in database systems. Such systems succeed in providing the necessary tools and facilities for building cooperative work applications but are limited in providing the appropriate object sharing environment.

We propose to address the object sharing requirements for one such system, Object Lens. Object Lens integrates features of hypertext systems, object-oriented systems and rule-based agents.

We evaluate various approaches to object sharing (including message passing, centralized object server and distributed object servers) and various schemes for concurrency control and update propagation (locking, timestamping) with respect to the characteristics desired in Object Lens (e.g. specialization hierarchy, user-interface, object linking, version control and combination of long interactive transactions and short automatic transactions).

We propose a new scheme for initiating object sharing through the exchange of electronic mail messages. Object protection is achieved by a hybrid scheme of access control lists and capability systems.

Analysis of the transactions in Object Lens reveals two sets of transactions: (1) interactive transactions that have relatively weak consistency requirements and allow relatively flexible concurrency control, and (2) automatic transactions that require strict concurrency control. We propose a hybrid locking and version control concurrency control scheme that accommodates the two types of transactions.


No. 102

What Good Are Semistructured Objects? Adding Semiformal Structure to Hypertext.

Thomas W. Malone, Keh-Chiang Yu, and Jintae Lee

June, 1989

This paper considers two questions: (1) What benefits (if any) can additional semiformal structure provide for hypertext users? and (2) Are these benefits worth their cost in additional complexity for users? These questions are investigated in the context of early experience with the Object Lens system, a general tool for cooperative work and information management.

First, the issues are illustrated by showing how a series of progressively more powerful applications can be easily constructed in Object Lens. The series begins with simple hypertext, then adds semistructured fields to individual nodes, and finally culminates with an advanced system for representing argumentation (similar to the gIBIS system).

Based on these and other examples, the paper concludes that adding semiformal structure to hypertext nodes can provide significant benefits for (a) summarizing the contents of objects and their relationships and (b) automatically searching and manipulating objects. The additional complexity required for these benefits is reduced when (a) the structure in the objects is "soft" (i.e., the objects are only semi-structured), and (b) the additional features are incremental (i.e., users who don't use them don't need to know about them).


No. 103

Superseded by No. 111


No. 104

Software Hot Lines: A Preliminary Description.

Brian T. Pentland

October, 1989

Technical support is an integral part of most complex software products. This paper uses data from a set of 13 software support hot lines to create a preliminary description and analysis of the basic process of providing technical support. Some key managerial issues are identified and the relationship between learning and coordination is discussed briefly. The results reported here are part of my ongoing research into the relationship between learning and coordination in organizations.


No. 105

The Duality of Technology: Rethinking the Concept of Technology in Organizations.

Wanda J. Orlikowski

October, 1989

Technology has long been portrayed either as an objective element in organizational processes, or as a socially constructed product of human action. This paper suggests that either view is incomplete. Instead, technology is posited as having a dual nature, being both constructed and enacted by human agents, as well as a material force that shapes human action and social practices. The implications of reformulating the technology concept are explored, and a structurational model of technology is proposed as a means to understand the essentially dialectical relationship between technology and organizations. Findings from a field study of information technology are presented to illustrate the workings of the structurational model of technology. The discussion suggests that to fully understand the interaction of technology and organizations, technology needs to be understood as an element in both the institutional order of an organization, and the social interaction of organizational members.


No. 106


Superseded by No. 123


No. 107

Sibyl: A Qualitative Decision Management System

Jintae Lee

January, 1990

What should you buy? From whom? Each time questions like these are decided, the decision-making effort should become a piece of history that others can benefit from later on. Yet decisions of essentially the same kind are made over and over and the participants in the decision making go over the same tedious ground already plowed by thousands of others.

In this chapter, Lee shows that what we need is a language for describing how decisions are made in terms that make goals and arguments explicit. With such a language, all sorts of questions can be answered that traditional decision science can deal with only obliquely. To what extent, for example, can a past decision serve as a precedent for a current decision? How would a particular decision turn out if cost were not important? What would person X decide to do? What are the risks involved in choice Y? What would be the effect of product Z, if announced?

Thus Lee's language makes a new kind of what-if exercise possible for decision makers. Importantly, Lee's language also makes what-happened exercises vastly easier too. Just keeping track of what happened in big government system-integration contracts, so as to provide an audit trail later on, can consume half of the total cost. Were decisions made in the way envisioned by Lee, the audit trail would be a byproduct of decision making, not an add-on activity.


No. 108 - Superseded by No. 164


No. 109


Back To The Drawing Board?: Computer-Mediated Communication Tools for Engineers.

Mark J. Jakiela and Wanda Orlikowski

March, 1990

This paper describes a research study that attempted to determine the nature of computer-mediated information that would be exchanged by engineers engaged in a process of product design and manufacture. The results suggest that existing computer-mediated communication tools, which support asynchronous, written modes of information exchange, have little utility for engineers engaged in design and manufacture. Substantial augmentation of these tools will be necessary before their widespread adoption and use by engineers can be expected.


No. 110


Superseded by No. 122


No. 111


Partially Shared Views: A Scheme For Communicating Among Groups That Use Different Type Hierarchies.

Jintae Lee and Thomas Malone

April, 1990

Many computer systems are based on various types of messages, forms, or other objects. When users of such systems need to communicate with people who use different object types, some kind of translation is necessary. In this paper, we explore the space of general solutions to this translation problem and propose a scheme that synthesizes these solutions. After first illustrating the problem in the Object Lens system, we identify two partly conflicting objectives that any translation scheme should satisfy: preservation of meaning and autonomous evolution of group languages. Then we partition the space of possible solutions to this problem in terms of the set-theoretic relations between group languages and a common language. This leads to five primary solution classes and we illustrate and evaluate each one. Finally, we describe a composite scheme, called Partially Shared Views, that combines many of the best features of the other schemes. A key insight of the analysis is that partially shared type hierarchies allow "foreign" object types to be automatically translated into their nearest common "ancestor" types. The partial interoperability attained in this way makes possible flexible standards where people can benefit from whatever agreements they do have without having to agree on everything. Even though our examples deal primarily with extensions to the Object Lens system, the analysis also suggests how other kinds of systems, such as EDI applications, might exploit specialization hierarchies of object types to simplify the translation problem.
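The "nearest common ancestor" idea described in this abstract can be sketched in a few lines. The type names and hierarchy below are invented for illustration; they are not taken from Object Lens itself.

```python
# Hypothetical sketch: translate a "foreign" object type into the nearest
# ancestor type that the receiving group's type hierarchy also contains.

def ancestors(type_name, parent):
    """Return the chain from a type up to the root, nearest first."""
    chain = []
    while type_name is not None:
        chain.append(type_name)
        type_name = parent.get(type_name)
    return chain

def translate(foreign_type, known_types, parent):
    """Map a foreign type to its nearest ancestor known to the receiver."""
    for candidate in ancestors(foreign_type, parent):
        if candidate in known_types:
            return candidate
    return None  # no shared ancestor: translation fails

# Example hierarchy: BugReport specializes Message, which specializes Thing.
parent = {"BugReport": "Message", "Message": "Thing", "Thing": None}
# A group that knows only Thing and Message receives a BugReport:
print(translate("BugReport", {"Thing", "Message"}, parent))  # prints "Message"
```

A receiver that shares only the root type would see the object degraded to that root, which is the "flexible standards" property the abstract points to: partial agreement still yields partial interoperability.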


No. 112


What is Coordination Theory and How Can It Help Design Cooperative Work Systems?

Thomas W. Malone and Kevin Crowston

July, 1990

This paper describes a new perspective--the interdisciplinary study of coordination--that can help us understand how people work together now and how they might do so differently with new information technologies. This perspective helps identify fruitful connections among different disciplines--such as computer science, sociology, political science, management science, economics, linguistics, and psychology--that have all dealt with fundamental questions about coordination.

The paper begins by defining coordination and coordination theory. Then examples are given of how previous research on computer-supported cooperative work has benefited from the kind of interdisciplinary transfers this perspective is intended to facilitate.

In the final section of the paper, a framework for developing this perspective further is proposed. The framework includes taxonomies of (1) the components of coordination (goals, activities, actors, and interdependencies), (2) kinds of interdependencies possible (such as prerequisites and shared resources), and (3) the processes underlying coordination (group decision-making, communication, and the perception of common objects).


No. 113


SIBYL: A Tool For Managing Group Decision Rationale.

Jintae Lee

August, 1990

We describe SIBYL, a system that supports group decision making by representing and managing the qualitative aspects of decision making processes, such as the alternatives, the goals to be satisfied, and the arguments evaluating the alternatives with respect to these goals. We use an example session with SIBYL to illustrate the language, called DRL, that SIBYL uses for representing these qualitative aspects, and the set of services that SIBYL provides using this language. We also compare SIBYL to other systems with similar objectives and discuss the additional benefits that SIBYL provides. In particular, we compare SIBYL to gIBIS, a well known "tool for exploratory policy discussion," and claim that SIBYL is mainly a knowledge-based system which uses a semi-formal representation, whereas gIBIS is mainly a hypertext system with semantic types. We conclude with a design heuristic, drawn from our experience with SIBYL, for systems whose goal includes eliciting knowledge from people.


No. 114


The Rainbow Pages.

Paul Resnick and Mel King

September, 1990

Telecommunication networks have the power to bring people together. Unfortunately, text-based bulletin boards require personal computers as front-ends and that restricts access to only those individuals and organizations that can afford to purchase computers. A voice bulletin board, on the other hand, uses any touch-tone telephone as a front end, and most people in this country already own or can afford to purchase a touch-tone telephone. This paper describes the rationale for and design of a free, public-access voice bulletin board that attempts to bring the power of computing to the people of one neighborhood in Boston.


No. 115


Modelling Coordination In Organizations.

Kevin Crowston

December, 1990

The end goal of my research is a more principled definition of coordination and coordination work. To do so, however, requires the development of better analysis techniques. In this paper, I describe a two-stage modelling technique based on ideas from distributed artificial intelligence and illustrate its use in a field study of the engineering change processes in three large manufacturing companies. The paper concludes by discussing possible uses of these models.


No. 116


Supporting Collaborative Planning: The Plan Integration Problem.

David Rosenblitt

February, 1991

When different members of a work group develop their own individual plans, or sets of tasks to achieve desired goals, there may be conflicting and synergistic interactions among these plans. Conflicts may arise when one task negates the effect of another task, or two tasks compete for the same resource. Synergies may arise if the desired effects of some tasks are also accomplished by other tasks, allowing some of the tasks to be deleted. In many organizations, plans are often poorly integrated: conflict detection and resolution are performed late in the planning cycle, resulting in costly revisions and delays, and potential synergies are overlooked and unexploited, resulting in wasted resources.

This paper details a framework for solving the plan integration problem, and shows how this capability can support an important aspect of cooperative work: collaborative planning. The utility of plan integration in supporting collaborative planning is illustrated in a construction planning scenario, based on an actual project. The planning framework is domain-independent and provably correct. Unlike previous work in AI planning theory, it includes a general mechanism for reasoning about resources. The planning algorithms are implemented in Synapse, a prototype collaborative planning tool.


No. 117


Planning With Resources.

David Rosenblitt

February, 1991

The STRIPS-based (Fikes, 1971) propositional planning framework is extended for operators that consume and produce quantities of resources, which are ubiquitous in many real-world domains. In this framework, step preconditions must specify, in addition to a set of propositions, the amounts of resources that must be available in order for a step to be performed. Similarly, effects must specify the amounts of resources that are consumed or produced by a step. McAllester's formally precise, provably correct nonlinear propositional planner (McAllester, 1991) is extended for resource-manipulating operators, and is proved correct. The nonlinear planning with resources algorithm is implemented in a prototype system, Synapse.
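The extension described in this abstract can be sketched concretely: a step is applicable only if its propositional preconditions hold and enough of each consumed resource is available, and applying it adjusts resource quantities as well as propositions. The data structures below are assumptions for illustration, not Synapse's actual representation.

```python
# Minimal sketch of STRIPS-style steps extended with resource quantities.
# A step carries propositional preconds/adds/deletes plus dictionaries of
# resource amounts it consumes and produces.

def applicable(step, state, resources):
    """True if preconditions hold and enough of each resource is on hand."""
    return (step["preconds"] <= state and
            all(resources.get(r, 0) >= amt
                for r, amt in step["consumes"].items()))

def apply_step(step, state, resources):
    """Return the new proposition set and resource levels after the step."""
    state = (state - step["deletes"]) | step["adds"]
    resources = dict(resources)
    for r, amt in step["consumes"].items():
        resources[r] -= amt
    for r, amt in step["produces"].items():
        resources[r] = resources.get(r, 0) + amt
    return state, resources

# Illustrative construction step: pouring walls needs a foundation and
# consumes 10 units of concrete.
pour = {"preconds": {"have_foundation"}, "adds": {"have_walls"},
        "deletes": set(), "consumes": {"concrete": 10}, "produces": {}}
state, res = {"have_foundation"}, {"concrete": 12}
assert applicable(pour, state, res)
state, res = apply_step(pour, state, res)
assert "have_walls" in state and res["concrete"] == 2
```

A resource-aware planner would use `applicable` in place of the purely propositional precondition test when ordering and checking steps in a nonlinear plan.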


No. 118


Recent Applications Of Economic Theory In Information Technology Research.

J. Yannis Bakos, and Chris F. Kemerer

February, 1991

Academicians and practitioners are becoming increasingly interested in the economics of Information Technology (IT). In part, this interest stems from the increased role that IT now plays in the strategic thinking of most large organizations, and from the significant dollar costs expended by these organizations on IT. Naturally enough, researchers are turning to economics as a reference discipline in their attempt to answer questions concerning both the value added by IT and the true cost of providing IT resources.

This increased interest in the economics of IT is manifested in the application of a number of aspects of economic theory in recent information systems research, leading to results that have appeared in a wide variety of publication outlets. This paper reviews this work and provides a systematic categorization as a first step in establishing a common research tradition, and to serve as an introduction for researchers beginning work in this area. Six areas are covered, including economic models of organizational performance, industrial organization, institutional economics (agency theory and transaction cost theory), and macroeconomic studies of IT impact. For each of these areas, recent work is reviewed and suggestions for future research are provided.


No. 119


An Approach To Modeling Organizational Knowledge.

Brian T. Pentland

January, 1991

Theories of practice offer a promising basis for models of organizational knowledge. From this perspective, organizational knowledge can be thought of as sequences of "moves" embedded in organizational structure. This paper uses ethnographic data from two technical service organizations to illustrate this approach.


No. 120


Toward An Interdisciplinary Theory Of Coordination

Thomas W. Malone and Kevin Crowston

April, 1991

This paper characterizes a new research area, called coordination theory, that focuses on the interdisciplinary study of coordination. Research in this area uses and extends ideas about coordination from disciplines such as computer science, organization theory, operations research, economics, linguistics, and psychology.

In the framework presented here, coordination is analyzed in terms of actors performing interdependent activities that achieve goals. A variety of processes are analyzed from this perspective and commonalities across disciplines are identified. Processes analyzed include goal decomposition, resource allocation, synchronization, group decision-making, communication, and the perception of common objects.

A major section of the paper summarizes recent applications of coordination theory in three different domains: (1) understanding the effects of information technology on human organizations and markets, (2) designing cooperative work tools, and (3) designing distributed and parallel processing computer systems. In the final section of the paper, elements of a research agenda in this new area are briefly outlined.


No. 121


A Comparative Analysis of Design Rationale Representations

Jintae Lee and Kum-Yew Lai

May, 1991

A few representations have been used for capturing design rationale. It is important to know in what ways they are adequate or limited so that we know how to improve them. In this paper, we develop a framework for evaluating design rationale representations based on a set of generic design tasks. We build the framework by progressively differentiating the elements of design rationale that, when made explicit, support an increasing number of the design tasks. With this framework, we evaluate the expressiveness of the existing representations. We also present a language, DRL, that we believe is the most expressive of the existing representations without being too complex for human users. We also discuss the limitations of DRL as open problems for further research.


No. 122


Genres of Organizational Communication: An Approach to Studying Communication and Media

JoAnne Yates and Wanda J. Orlikowski

August, 1991

The traditional dichotomy between studies of media choice and studies of media effects ignores the reciprocal interaction between media and communicative behavior. Drawing on rhetorical theory and structuration, this paper proposes genres of organizational communication as a construct distinct from communication media. This approach allows us to study communication not as the result of isolated rational actions but as embedded in social process. We demonstrate the usefulness of our approach in an extended historical example, and then draw implications for future research.

No. 123

Does Information Technology Lead to Smaller Firms?

Erik J. Brynjolfsson, Thomas Malone, Vijay Gurbaxani and Ajit Kambil

September, 1991

Among the many recent changes in the organization of work in the United States, the decline in the average size of firms, as measured by employment, has been particularly well-documented. The primary goal of this paper is to assess the hypothesis that the rapid growth of information technology is at least partially responsible for this shift to smaller firms. We use industry-level data on information technology capital and four measures of firm size, including employees per firm, from different sources to examine this hypothesis. We find broad evidence that investment in information technology is significantly associated with subsequent decreases in the average size of firms. We also find that the effects of information technology on organizations are most pronounced after a lag of two to three years.

No. 124

Performance Evaluation Metrics for Information Systems Development: A Principal-Agent Model

Rajiv D. Banker and Chris F. Kemerer

October, 1991

The information systems (IS) development activity in large organizations is a source of increasing cost and concern to management. IS development projects are often over-budget, late, costly to maintain, and not done to the satisfaction of the requesting user. These problems exist, in part, due to the organization of the IS development process, where information systems development is typically assigned by the user (principal) to a systems developer (agent). These two parties do not have perfectly congruent goals, and therefore a contract is developed to specify their relationship. An inability to directly monitor the agent requires the use of performance measures, or metrics, to represent the agent's actions to the principal. The use of multiple measures is necessary given the multi-dimensional nature of successful systems development. In practice such contracts are difficult to satisfactorily develop, due in part to an inability to specify appropriate metrics.

This paper develops a principal-agent model (based on information economics) that provides a set of decision criteria for the principal to use to develop an incentive compatible contract for the agent. These criteria include the sensitivity and the precision of the performance metric. After presenting the formal model, some current software development metrics are discussed to illustrate how the model can be used to provide a theoretical foundation and a formal vocabulary for performance metric analysis. The model is also used in a positive (descriptive) manner to show how current practice emphasizes metrics that possess relatively high levels of sensitivity and precision. Finally, some suggestions are made for the improvement of current metrics based upon these criteria.

No. 125

(Superseded by No. 130)

No. 126

An Incomplete Contracts Theory of Information, Technology and Organization

Erik Brynjolfsson

December, 1991

Although there is good reason to expect that the growth of information work and information technology will significantly affect the trade-offs inherent in different structures for organizing work, the theoretical basis for these changes remains poorly understood. This paper seeks to address this gap by analyzing the incentive effects of different ownership arrangements in the spirit of the Grossman-Hart-Moore (GHM) incomplete contracts theory of the firm. A key departure from earlier approaches is the inclusion of a role for an "information asset", analogous to the GHM treatment of property. This approach highlights the organizational significance of information ownership and information technology. For instance, using this framework, one can determine when 1) informed workers are more likely to be owners than employees of firms, 2) increased flexibility of assets will facilitate decentralization, and 3) the need for centralized coordination will lead to centralized ownership. The framework developed sheds light on some of the empirical findings regarding the relationship between information technology and firm size and clarifies the relationship between coordination mechanisms and the optimal distribution of asset ownership. While many implications are still unexplored and untested, building on the incomplete contracts approach appears to be a promising avenue for the careful, methodical analysis of human organizations and the impact of new technologies.

No. 127

Answer Garden and the Organization of Expertise.

Mark Ackerman

January, 1992

Answer Garden facilitates the building of an organizational memory for commonly asked questions and their answers. The system includes an easy-to-use set of information retrieval engines, including a branching network of diagnostic questions. If the answer to the user's need is not present in the database, the system automatically routes the question to the appropriate human expert, and the answer is returned to the user as well as inserted into the branching network and database.

This research paper postulates that the major organizational and social innovations with Answer Garden include changing the information seeking behavior in an organization, building an organizational memory, and allowing firms to better coordinate and manage their intellectual assets. In addition, this paper presents a brief summary of the technical architecture. This paper supplements, and does not replace, working paper 108.
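The mechanism this abstract describes, a branching network of diagnostic questions that falls back to a human expert and grows as answers come in, can be sketched as follows. The node structure and routing function are invented for illustration; they are not Answer Garden's actual implementation.

```python
# Hypothetical sketch of the Answer Garden flow: traverse a branching
# network of diagnostic questions; return a stored answer if one exists,
# otherwise route the question to an expert and insert the answer into
# the network so the next asker finds it (the "organizational memory").

class Node:
    def __init__(self, question=None, answer=None):
        self.question = question
        self.answer = answer
        self.children = {}  # user response -> child Node

def resolve(node, responses, ask_expert):
    # Follow the user's responses down the diagnostic branches.
    for r in responses:
        if r in node.children:
            node = node.children[r]
    if node.answer is not None:
        return node.answer          # answer already in the database
    node.answer = ask_expert(node)  # route to a human expert...
    return node.answer              # ...and store the answer for reuse

root = Node("Printer problem?")
root.children["yes"] = Node("Paper jam?")
first = resolve(root, ["yes"], lambda n: "Open tray B and clear the jam.")
# A second asker now gets the stored answer; the expert is not consulted.
assert resolve(root, ["yes"], lambda n: "unused") == first
```

The key organizational point survives even in this toy form: each expert answer is captured at the point in the diagnostic network where the question arose, so the cost of answering is paid once.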

No. 128

Towards a Coordination Cookbook

Kevin Ghen Crowston

February, 1992

This thesis presents the first steps towards a theory of coordination in the form of what I call a coordination cookbook. My goal in this research is hypothesis generation rather than hypothesis testing: I attempt to develop a theory of coordination grounded in detailed empirical observation. I am especially interested in using this theory to identify ways of coordinating that may become more desirable when information technology is used to perform some of the coordination.

I address the following question: how can we represent what people do to coordinate their actions when they work together on common goals, in a way that reveals alternative approaches to achieving those goals? To answer this question, I study groups of people making engineering changes to complex products as an example of a coordination-intensive task. I performed detailed case studies of the change process in three organizations: an automobile manufacturer, a commercial aircraft manufacturer and a computer system software developer.

To analyze these cases, I develop a technique for describing the behaviour of the members of an organization, based on research in distributed artificial intelligence (DAI). I first develop a data-flow model of the change process to identify what information was used and how it was processed by the different members of the organization. Then, using ideas from DAI, I model what each individual must have known about the task and the rest of the organization to act as observed.

To develop a theory of coordination, I generalize from these specific individuals to the kinds of tasks they performed. I develop a typology of interdependencies between organizational tasks and objects in the world (including resources and products). This typology includes four categories of coordination needs, due to interdependencies between: (1) different tasks, (2) tasks and subtasks, (3) tasks and objects in the world and (4) different objects.

I then re-examine the cases to identify the coordination methods used to address these needs. (These coordination methods are similar in spirit to the weak problem-solving methods of cognitive science.) I represent each method by a set of what I call coordination recipes that identify the goals, capabilities and knowledge of the individuals involved. In some cases, consideration of the possible distributions of these elements suggests approaches other than those actually observed. This framework allows an analyst to abstract from a description of how a particular organization performs a task to a description of the coordination needs of that task and a set of alternative coordination methods that could be used to address those needs.

The results of my thesis should be useful in several ways. A better understanding of how individuals work together may provide a more principled approach for designing new computer applications, for analyzing the way organizations are currently coordinated and for explaining perceived problems with existing approaches to coordination. By systematically exploring the space of possible coordination strategies, we may be able to discover new kinds of organizations - organizations in which humans and computers work together in as yet unimagined ways.

**NOTE: This working paper was originally Kevin Ghen Crowston's Doctoral Thesis, and is quite large. The charge for this paper is $17.00, or $25.00 outside the US.

No. 129

Skip and Scan: Cleaning up Telephone Interfaces

Paul Resnick and Robert A. Virzi

February, 1992

The current generation of telephone interfaces is frustrating to use, in part because callers have to wait through the recitation of long prompts in order to find the options that interest them. In a visual medium, users would shift their gaze in order to skip uninteresting prompts and scan through large pieces of text. We present skip and scan, a new telephone interface style in which callers issue explicit commands to accomplish these same skipping and scanning activities. In a laboratory experiment, subjects made selections using skip and scan menus more quickly than using traditional, numbered menus, and preferred the skip and scan menus in subjective ratings. In a field test of a skip and scan interface, the general public successfully added and retrieved information without using any written instructions.

KEYWORDS: phone-based interface, semi-structure, audiotext, telephone form, menu, interactive voice response.

No. 130

The Productivity of Information Technology: Review and Assessment

Erik Brynjolfsson

April, 1992

Productivity is the bottom line for any investment. The quandary of information technology (IT) is that, despite astonishing improvements in the underlying capabilities of the computer, its productivity has proven almost impossible to assess. There is an increasing perception that IT has not lived up to its promise, fueled in part by the fact that the existing empirical literature on IT productivity generally has not identified significant productivity improvements. However, a careful review, whether at the level of the economy as a whole, among information workers, or in specific manufacturing and service industries, indicates that the evidence must still be considered inconclusive. It is premature to surmise that computers have been a paradoxically unwise investment. A puzzle remains in the inability of both academics and managers to document unambiguously the performance effects of IT. Four possible explanations are reviewed in turn: mismeasurement, lags, redistribution and mismanagement. The paper concludes with recommendations for investigating each of these explanations using traditional methodologies, while also proposing alternative, broader metrics of welfare that ultimately may be required to assess, and enhance, the benefits of IT.

KEYWORDS: Productivity, Computers, Performance measurement, Economic value, Investment justification.

No. 131

(Superseded by No. 134)

No. 132

(Superseded by No. 181)

No. 133

HyperVoice: A Phone-Based CSCW Platform

Paul Resnick

August 1992

A major shift is underway in how we think about telephones. For decades, they were used solely for one-to-one, synchronous communication. The increasing use of answering machines and voice messaging, however, is shifting the public perception of telephones, thus opening a space for more innovative applications. Five years from now, some of the most interesting and popular cooperative work applications will probably use telephones as the primary means of access. This paper presents evidence that there are practical phone-based cooperative work applications and describes a set of software tools that facilitate the development of such applications.

No. 134

Learning from NOTES: Organizational Issues in Groupware Implementation

Wanda Orlikowski

August, 1992

This paper explores the introduction of groupware into an organization to understand the changes in work practices and social interaction facilitated by the technology. The results suggest that people's mental models and organizations' structure and culture significantly influence how groupware is implemented and used. Specifically, in the absence of mental models that stressed its collaborative nature, groupware was interpreted in terms of familiar personal, stand-alone technologies such as spreadsheets. Further, the culture and structure provided few incentives or norms for cooperating or sharing expertise, hence the groupware on its own was unlikely to engender collaboration. Recognizing the central influence of these cognitive and organizational elements is critical for developers, researchers, and practitioners of groupware.

No. 135

Why Information Technology Hasn't Increased the Optimal Number of Suppliers

J. Yannis Bakos, Erik Brynjolfsson

August, 1992

Information technology has generally reduced search and coordination costs. Ceteris paribus, this should lead firms to increase the number of suppliers with which they do business. However, there is little evidence of an increase in the number of suppliers used in the past few years. On the contrary, in many industries, leading firms are working with fewer suppliers. This suggests that other forces must be accounted for in a more complete model of buyer-supplier relationships.

This paper presents a model that shows how a buyer can increase its suppliers' incentives to invest in quality by decreasing their number. This makes it more difficult for the buyer to threaten to switch to alternative sources and thereby expropriate the supplier's share of the value created. As a result, suppliers are more willing to make "non-contractible" investments in quality. Thus, we argue that because information technology often increases the importance of quality, it can lead firms to employ fewer suppliers, and that this will be true even when search and coordination costs are very low. Evidence from several empirical studies of buyer-supplier relationships appears to be consistent with this explanation.

No. 136

A Decision Rationale Management System: Capturing, Reusing, and Managing the Reasons for Decisions

Jintae Lee

December, 1992

This thesis identifies the needs for capturing and managing decision rationales, articulates the concept of a decision rationale management system that meets these needs, and presents a computer system that implements the concept.

Capturing and managing decision rationales, i.e., the deliberations leading to decisions, can bring about many benefits. The rationales can then be used to support decision making, can be shared among decision makers, and can be reused for similar decisions. A decision rationale management system, i.e., a system that captures and manages decision rationales to provide these benefits, requires three major components: a language for representing elements of rationales, a method of using the language to capture the rationales, and a set of services that use the captured rationales to support decision making.

This thesis articulates a model of decision rationales and uses it to develop DRL, a language for representing the elements in this model. The thesis also presents SIBYL, a computer system that helps people to capture rationales in DRL by providing a number of interface features intended to reduce the overhead associated with explicit representations. Using the rationales captured in DRL, SIBYL provides computational decision services, such as retrieving useful rationales from past decisions, maintaining dependencies among the various elements of rationales, and keeping track of multiple decision states. These services realize the benefits of a structured representation of rationales, and provide further motivation for capturing rationales in DRL.
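The dependency-maintenance service described above can be pictured with a toy data structure. The following sketch is hypothetical (the names, scoring rule, and structure are invented for illustration; DRL's actual vocabulary and SIBYL's evaluation services are considerably richer): alternatives are linked to goals through claims that support or deny them, and a naive tally stands in for evaluation.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a DRL-style rationale graph; not DRL's
# actual representation.
@dataclass
class Claim:
    text: str
    supports: bool  # True if the claim supports the alternative on a goal

@dataclass
class Alternative:
    name: str
    claims_by_goal: dict = field(default_factory=dict)  # goal -> [Claim]

    def add_claim(self, goal: str, claim: Claim) -> None:
        self.claims_by_goal.setdefault(goal, []).append(claim)

    def score(self) -> int:
        # Naive tally: +1 per supporting claim, -1 per denying claim.
        return sum(+1 if c.supports else -1
                   for cs in self.claims_by_goal.values() for c in cs)

a = Alternative("use version control")
a.add_claim("traceability", Claim("history is preserved", True))
a.add_claim("overhead", Claim("requires training", False))
print(a.name, a.score())
```

Because claims stay attached to the goals they address, adding or retracting a claim immediately changes an alternative's evaluation, which is the kind of dependency maintenance the thesis describes.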

No. 137

Improving Performance in Product Deployment By Improving Coordination

Arthur L. Keigler II

December, 1992

The coordination practices used in a product deployment business unit were observed as they presently occur. Improved coordination practices based on Quality Function Deployment (QFD) were then introduced both at the product team level and at the business unit management level. The successful implementation of these methods was studied to identify general characteristics of introducing and using improved coordination practices in a rapidly changing and competitive product domain. This understanding of coordination was then applied to projected future use of Computer Supported Cooperative Work (CSCW) in product deployment; four prototype applications using the OVAL software language were developed and are presented herein.

Coordination is bringing into harmony diverse relations and actions. This is shown to be a primary activity of professionals in the product deployment group studied. Using case examples from the field work, coordination practices are analyzed as a user-interface by decomposing them into several components: interpersonal dynamics; communication media; cycles of exploration, clarification and closure; and relationship type and complexity. Observed practices are interpreted to show that rather than barriers to improvement there are reinforcing dynamics which maintain stability. By understanding and working with these dynamics a more appropriate and successful introduction of coordination practices like QFD and CSCW is achieved.

In product deployment, actions and relations must be coordinated in at least three dimensions: across time, across functions, and across hierarchical levels. Coordination across functions is necessary when there is a high degree of interdependence between decisions and tasks. Traditionally it is difficult to coordinate such cross functional decisions and activities because people in different functions work with others of their own discipline more readily than with people from other disciplines. This behavior is reasonable since the disciplines use different languages, focus people's attention through different types of lenses and models, and require that people spend much of their time working on problems and designs with members of their own discipline. Successful work with product teams showed that coordination in this dimension must provide a forum where different languages can be spoken, interpreted, and mutually understood. Coordination across levels is necessary when the business operates across broad temporal, geographic, or technical ranges. Inter-level coordination assumes additional importance in most organizations due to the strong leverage high level managers have on learning of new practices throughout the organization. It is also shown to be difficult due to the different nature of coordination work at different levels. Successful use of QFD methods for developing group strategy showed that coordination in this dimension must ensure fluid meshing of the different problem solving and design activities germane to different levels.

Coordination across time is necessary in the short run to accomplish intended objectives and in the long run to learn about and adapt to the environment. Planning and control is traditionally considered the aspect of coordination most crucial for management. Recently, in response to rapid evolution of the world economy, attention has expanded beyond adjustment of operating activities and efforts to encompass improvement of the policies, decision rules, and values through which the operating plans were developed; in other words, to learn how to design better plans. Successful use of a wall-chart meeting user-interface showed that coordination in this dimension can ensure both attainment of budgeted objectives and ongoing individual and organizational learning.

No. 138

A Strategic Analysis of Electronic Marketplaces

J. Yannis Bakos

December, 1992

Information systems can serve as intermediaries between the buyers and the sellers in a vertical market, thus creating an "electronic marketplace." A major impact of these electronic market systems is that they typically reduce the search costs buyers must pay to obtain information about the prices and product offerings available in the market. Economic theory suggests that this reduction in search costs plays a major role in determining the implications of these systems for market efficiency and competitive behavior. This article draws on economic models of search and examines how prices, seller profits and buyer welfare are affected by reducing search costs in commodity and differentiated markets. This reduction results in direct efficiency gains from reduced intermediation costs and in indirect but possibly larger gains in allocational efficiency from better-informed buyers. Because electronic market systems generally reduce buyers' search costs, they ultimately increase the efficiency of interorganizational transactions, in the process affecting the market power of buyers and sellers. The economic characteristics of electronic markets, in addition to their ability to reduce search costs, create numerous possibilities for the strategic use of these systems.

No. 139

Superseded by No. 166

No. 140

Superseded by No. 154

No. 141

Tools for inventing organizations: Toward a handbook of organizational processes

Thomas W. Malone, Kevin Crowston, Jintae Lee, Brian Pentland

May, 1993

This paper describes a new project intended to provide a firmer theoretical and empirical foundation for such tasks as enterprise modeling, enterprise integration, and process re-engineering.

The project includes (1) collecting examples of how different organizations perform similar processes, and (2) representing these examples in an on-line "process handbook" which includes the relative advantages of the alternatives. The handbook is intended to help (a) redesign existing organizational processes, (b) invent new organizational processes that take advantage of information technology, and perhaps (c) automatically generate software to support organizational processes.

A key element of the work is a novel approach to representing processes at various levels of abstraction. This approach uses ideas from computer science about inheritance and from coordination theory about managing dependencies. Its primary advantage is that it allows users to explicitly represent the similarities (and differences) among related processes and to easily find or generate sensible alternatives for how a given process could be performed.
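The specialization idea can be sketched in a few lines. This is a minimal illustration under assumed names (the processes, step names, and dictionary representation are invented here; the handbook's actual representation is richer): a generic process defines named steps, and a specialized process inherits them, overriding only the steps where it differs, so similarities and differences between related processes are both explicit.

```python
# Hypothetical sketch of process specialization; process and step names
# are invented for illustration.
class Sell:
    steps = {
        "identify": "identify prospects",
        "inform": "inform prospects",
        "order": "obtain order",
        "deliver": "deliver product",
    }

class SellByMailOrder(Sell):
    # Shares every step with the generic Sell process except how
    # prospects are informed.
    steps = {**Sell.steps, "inform": "mail catalogs to prospects"}

print(SellByMailOrder.steps["inform"])   # the specialized step
print(SellByMailOrder.steps["deliver"])  # inherited unchanged
```

Generating sensible alternatives for a process then amounts to enumerating the known specializations of its generic parent, which is the retrieval pattern the handbook is meant to support.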

No. 142

Ownership Principles for Distributed Database Design

Marshall Van Alstyne, Erik Brynjolfsson, Stuart Madnick

May, 1993

This research addresses the issues of database ownership and incentives and their impact on information sharing and system performance. Existing research has identified the benefits of centralized control and has formalized the importance of a vested authority setting standards, working towards user transparency, and reducing organization wide data inconsistencies. In practice, however, many centralization and standardization efforts have failed, typically because departments lacked incentives or needed greater local autonomy. Unfortunately, motivational factors have typically eluded formal characterization. Using an "incomplete contracts" approach from economics, it is possible to model the costs and benefits of decentralization, including critical intangible factors. This paper presents normative principles of database decentralization; it derives formulas that give the principles a theoretical underpinning; and it illustrates the application of each principle in actual practice.


No. 143

Superseded by No. 162

No. 144

Information Technology and the Re-Organization of Work: Theory and Evidence

Erik Brynjolfsson

June, 1993

The technological base and the organizational superstructure of American industry are undergoing significant changes. Information technology capital now accounts for over 40% of all capital spending, while new forms of economic organization, both within and among firms, are emerging. These basic trends have long been anticipated, but only recently have we had the data and the theoretical tools needed to formally document and model them. This thesis consists of three essays on the impact of information technology on organizational structure. The essays are self-contained and can be read independently, but they also build on and reinforce one another.

The first essay examines the relationship between information technology capital and firm size using industry data for the entire U.S. economy. The results indicate that increases in information technology are associated with significant decreases in firm size (as measured by employment) and vertical integration (as measured by the ratio of value added to sales). The effects are most pronounced after a lag of two years, and are robust to a variety of specifications. The evidence suggests that information technology is associated with a shift from hierarchies to markets for coordinating economic activity.

The second essay models the "new managerial work", which features a reliance on delegated decision-making and performance pay, as an optimal organizational response to advances in information technology. The technology has automated many routine tasks while accentuating the "information explosion" that increasingly overwhelms top management. The model predicts an increase in decentralization and performance pay in two steps: (1) the growth of inexpensive, but economically valuable information both enables and necessitates the decentralization of organizational information processing and decision-making, and (2) effective use of distributed information processing requires that employees face more outcome-based incentives and fewer rewards based on pre-specified behavior. The analysis also indicates that there is a technological basis to the growth in the number of "knowledge workers" and the decline in the use of rigid work rules.

The third essay extends the results of the second essay to explain declines in firm size and vertical integration. I relax the assumption that all contracts between the principal and agent are comprehensive and shift to a property rights approach to the theory of the firm. It is shown that when information technology leads to an increase in the use of outcome-based incentives, agents will be more likely to independently own the assets they use, thereby leading to the deintegration of large firms. This is because it is impossible to provide optimal incentives to agents who have important non-contractible actions without also giving them ownership rights, which guarantee them a claim to the non-verifiable benefits they generate. This approach also makes it possible to examine the role of information in a variety of organizational forms and determine when 1) informed workers are more likely to own their own firms, 2) increased flexibility of assets will facilitate decentralization, and 3) the need for centralized coordination will lead to centralized ownership. The framework developed sheds light on the empirical findings regarding the relationship between information technology and firm size identified in essay 1 and clarifies the relationship between coordination mechanisms and optimal distribution of asset ownership.

No. 145


Co-evolution of Information Processing Technology and Use: Interaction between the Life Insurance and Tabulating Industries

JoAnne Yates

June, 1993

Punched-card tabulators were an important immediate predecessor of the computer used for processing large amounts of data in many business firms during the first half of the twentieth century. The life insurance business was an information-intensive business dependent on its ability to manage large quantities of data. This paper examines both the role that tabulating machinery played in shaping insurance firms' business processes and, simultaneously, the role that life insurance as a user industry played in shaping the development of tabulating technology, between 1890 and 1950. During that period life insurance used various mechanisms (the market, user-based innovation, etc.) to influence developments in tabulating technology designed to link information processing with document production. At the same time, life insurance firms gradually reconfigured processes to take better advantage of the new capabilities of the technology. The on-going interaction between the life insurance and tabulating industries shaped both industries in significant ways, setting the stage for continued interaction between the two industries during the transition to computers beginning at mid-century.

No. 146

Computerized Loan Origination Systems: A Case Study of the Electronic Markets Hypothesis

Christopher M. Hess, Chris F. Kemerer

June, 1993

Much has been written in recent years about the changes in corporate strategies and industry structures associated with electronic coordination of market activities. This paper considers the advent of electronic market coordination in the home mortgage industry, focusing on Computerized Loan Origination systems (CLOs). Initially developed over a decade ago, CLOs give home buyers automated access to mortgage information and application services from several lenders, allowing them to compare, select, apply for, and close mortgages at the point of sale of the property to be mortgaged.

Case studies of five CLOs (First Boston's Shelternet, PRC's Loan Express, American Financial Network's Rennie Mae, Prudential's CLOS, and Citicorp's Mortgage Power plus) reveal a range of system functionalities. A three-level categorization, from "Loan Listing Service" to "Transformed Market" is proposed, and the five case studies are mapped to this categorization.

Predictions from the Electronic Markets Hypothesis (EMH) are tested against the empirical results of the five case studies. The results were that, as suggested by the EMH, coordination technology has reduced the time and effort required to select and secure a mortgage. In addition, financial intermediaries have been threatened by the introduction of CLOs, and in some cases opposition has been mounted against the systems.

On the other hand, despite the availability of the technology and mortgages' favorable characteristics as an electronically-mediated market product, the industry has not yet been fundamentally changed by the introduction of these systems. Of the two case studies that could be characterized as electronic markets, neither continues to exist in that form today. And the system currently handling the largest dollar volume of mortgages is best characterized as a pure electronic hierarchy.

These results suggest that either the effects predicted by the EMH require a longer gestation period, or the underlying model requires some augmentation to fully explain the results in the home mortgage market. Possible barriers to the advent of full electronic markets in the home mortgage industry are suggested as directions for future research in continuing the validation of the EMH.

No. 147

Shaping Electronic Communication: The Administration of Computer Conferencing in a Japanese R & D Group

Kazuo Okamura, Masayo Fujimoto, Wanda J. Orlikowski, JoAnne Yates

June, 1993

This paper reports on a study into the administrative activities surrounding a computer conferencing system used by a product development group within a large Japanese manufacturing firm. Drawing on interview and textual data, the study investigated the goals, deliberations, responsibilities, and outcomes of administrators of the group's conferencing system. From this analysis, a conceptual framework is developed that outlines the administrative processes and policies that are associated with managing the effective introduction and ongoing use of computer conferencing systems. The framework should be useful to researchers attempting to understand the use of computer conferencing systems, and to practitioners seeking guidelines on how to create effective policies and processes that manage the communication media from an organizational perspective, while also being responsive to specific user feedback.

No. 148

Phone-Based CSCW: Tools and Trials

Paul Resnick

July, 1993

Telephones are the most ubiquitous, best-networked, and simplest computer terminals available today. They have been used for voice mail but largely overlooked as a platform for asynchronous cooperative work applications such as event calendars, issue discussions, and question and answer gathering. HyperVoice is a software toolkit for constructing such applications. Its building blocks are high-level presentation formats for collections of structured voice messages. The presentation formats can themselves be presented and manipulated, enabling significant customization of applications by phone. Results of two field trials suggest social context factors that will influence the success or failure of phone-based cooperative work applications in particular settings.

No. 149

Incident tracking at Infocorp: Case study of a pilot NOTES implementation

Michael Gallivan, Cheng Hian Goh, Lorin M. Hitt, George Wyner

July, 1993

Software to support communications, coordination and collaboration within a work group, commonly referred to as "groupware", has been attracting increasing interest in both the business and research communities. While some observers are convinced that such software has the potential to significantly enhance the ability of groups to collaborate effectively, other researchers have found that organizational factors are often important and that these factors often lead to unexpected or undesired outcomes.

This study investigates the introduction of a groupware product, Lotus Notes, into the technical support department of an information and software products company. Using interviews and observations, we examine how the Notes application was developed, accepted, and used, and how characteristics of the work environment and the individuals in the work group relate to the implementation outcome.

Overall, we found enthusiastic acceptance of the Notes application, which we attribute to three main factors: a focus on first order (incremental) change, the developer's implementation strategy, and the cooperative culture of the organization. In addition, our results provide further support for the claim that organizational issues play a significant role in groupware implementations.

No. 150

Knee-jerk Anti-LOOPism and other E-mail Phenomena: Oral, Written, and Electronic Patterns in Computer-Mediated Communication

JoAnne Yates, Wanda J. Orlikowski

July, 1993

This paper reports on an empirical investigation into the on-going electronic interaction of a natural distributed group. Prior organizational research into use of electronic media has focused primarily on usage patterns and only occasionally on a few linguistic features, while linguistics researchers have looked more closely at certain technical aspects of language use in electronic communication. Interested in a broader range of linguistic and textual features that might be exhibited in the electronic mail medium, we conducted an exploratory study of the electronic communication of a task-oriented group over a 27-month period. Using qualitative and quantitative techniques, we found that the electronic mail messages displayed features normally associated with both speech and written discourse, as well as features that seem new to the electronic medium. The use of all three patterns was influenced by characteristics of the medium, the group, and its task.

No. 151

A Flexible Change Request Management System Based on Plan Integration

Kazuo Okamura

September, 1993

In cooperative software development, each programmer has their own plans, and conflicts or redundancies inevitably arise among them. We are concerned with two main problems: first, controlling changes without sacrificing programmers' flexibility, and, second, guiding change activities to conform to project policies. Traditional methods of change request management focus on a management process structure based on project policies, while cooperative development methodologies are concerned mainly with resolving conflicts among individual changes. In this paper, we describe an architecture for handling proposed changes. Based on plan integration, it seamlessly supports both change coordination through negotiation and a change management process that helps changes converge until they meet the project goals.

No. 152

Technology Mediation: An Organizational Mechanism For Contextualizing Technologies in Use

Kazuo Okamura, Masayo Fujimoto, Wanda J. Orlikowski, JoAnne Yates

September, 1993

We suggest that a mechanism for facilitating the context-specificity of technologies in use already exists in organizations, but is under-appreciated, under-utilized, and under-managed. We studied the use of a conferencing technology and found that its effectiveness was significantly shaped by a set of actors--technology mediators--who actively guided and manipulated the technology and its use over time. These technology mediators contextualized the technology in use, while also adapting it over time to keep it current with changing circumstances. We argue that well-managed technology mediation is a useful mechanism for facilitating the context-specificity of technologies in use, and that it complements the existing approaches of involving users in design and empowering users to continue design in use.

No. 153


From Tabulators to Early Computers in the U.S. Life Insurance Industry: Co-evolution and Continuities

JoAnne Yates

October, 1993

This paper takes an initial look at early interactions between insurance as a user industry and vendors of computing equipment during the period from the end of the war into the mid-1950s, when first generation computers were adopted by many insurance firms. The transition of life insurance from tabulating to computing technology illustrates two forces evident at many points of technological change: co-evolution and continuity. The technology and its use in life insurance co-evolved, shaping each other in their interactions over the decade; at the same time, this major user industry and the nascent vendor industry similarly exerted influence on each other. In addition, relationships established and choices made during the tabulator era affected events and choices in the early computer era. In its core technology, the computer may have marked a point of discontinuity with what came before, but it clearly demonstrated continuities in many other areas, including market relations, the punched card as storage and input-output medium, and application areas in insurance.

No. 154

From Vendors to Partners: Information Technology and Incomplete Contracts in Buyer-Supplier Relationships

J. Yannis Bakos, Erik Brynjolfsson

January, 1997


As search costs and other coordination costs decline, theory predicts that firms should optimally increase the number of suppliers with which they do business. Despite recent declines in these costs due to information technology, there is little evidence of an increase in the number of suppliers used. On the contrary, in many industries, firms are working with fewer suppliers. This suggests that other forces must be accounted for in a more complete model of buyer-supplier relationships.

This article uses the theory of incomplete contracts to illustrate that incentive considerations can motivate a buyer to limit the number of employed suppliers. To induce suppliers to make investments that cannot be specified and enforced in a satisfactory manner via a contractual mechanism, the buyer must commit not to expropriate the ex post surplus from such investments. Under reasonable bargaining mechanisms, such a commitment will be more credible if the buyer can choose from fewer alternative suppliers. Information technology increases the importance of noncontractible investments by suppliers, such as quality, responsiveness, and innovation; it is shown that when such investments are particularly important, firms will employ fewer suppliers, and this will be true even when search and transaction costs are very low.


No. 155

Superseded by No. 167


No. 156

Technological Frames: Making Sense of Information Technology in Organizations

Wanda J. Orlikowski, Debra C. Gash

October, 1993

In this paper, we build on and extend research into users' and designers' cognitions and values by proposing a systematic approach to examining the underlying assumptions, expectations, and knowledge that people have about technology. Such interpretations of technology (which we label technological frames) are central to understanding technological development, use, and change in organizations as they critically influence the way people act around technology. We suggest that where the technological frames of key groups in organizations--such as managers, technologists, and users--are significantly different, difficulties and conflict around the development, use, and change of technology may result. We use the findings of an empirical study to illustrate how the nature, value, and use of a groupware technology were interpreted differently by various organizational stakeholders, resulting in outcomes that deviated from those expected. We argue that technological frames offer an interesting and useful analytic perspective for explaining and anticipating actions and meanings around information technology that are not easily obtained with other theoretical lenses.


No. 157

The Interdisciplinary Study of Coordination

Thomas W. Malone and Kevin Crowston

November, 1993

This paper characterizes an emerging research area, sometimes called coordination theory, that focuses on the interdisciplinary study of coordination. Research in this area uses and extends ideas about coordination from disciplines such as computer science, organization theory, operations research, economics, linguistics, and psychology.

A key insight of the framework presented here is that coordination can be seen as the process of managing dependencies between activities. Further progress, therefore, should be possible by characterizing different kinds of dependencies and identifying the coordination processes that can be used to manage them. A variety of processes are analyzed from this perspective and commonalities across disciplines are identified. Processes analyzed include those for managing shared resources, producer/consumer relationships, simultaneity constraints, and task/subtask dependencies.
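The mapping from dependency types to candidate coordination processes can be sketched as a lookup table. The entries below are an illustrative paraphrase, not the paper's full taxonomy:

```python
# Illustrative table: each kind of dependency between activities can be
# managed by any of several alternative coordination processes.
COORDINATION_PROCESSES = {
    "shared resource": ["first come/first served", "priority order",
                        "budgets", "managerial decision"],
    "producer/consumer": ["notification", "inventory/buffering",
                          "standardization"],
    "simultaneity": ["scheduling", "synchronization"],
    "task/subtask": ["goal decomposition", "task assignment"],
}

def alternatives(dependency: str) -> list[str]:
    """Candidate coordination processes for managing a dependency type."""
    return COORDINATION_PROCESSES.get(dependency, [])

print(alternatives("shared resource"))
```

Characterizing a situation by the dependencies it contains, then choosing among the listed processes, is the analysis pattern the framework proposes.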

A major section of the paper summarizes ways of applying a coordination perspective in three different domains: (1) understanding the effects of information technology on human organizations and markets, (2) designing cooperative work tools, and (3) designing distributed and parallel processing computer systems. In the final section of the paper, elements of a research agenda in this new area are briefly outlined.


No. 158

Network Externalities in Microcomputer Software: An Econometric Analysis of the Spreadsheet Market

Erik Brynjolfsson, Chris F. Kemerer

November, 1993

As an economic good, software has a number of interesting properties. In addition to the value of intrinsic features, the creation of or conformance to industry standards may be critical to the success of a product. This research builds and evaluates econometric models to determine which product features are important in the purchase and pricing decisions for microcomputer software. A special emphasis is to identify the effects of standards and network externalities.

Four main results were found for the microcomputer spreadsheet market for the time period 1987-1992.


  • Hedonic regression techniques can provide sensible estimates of the value users place on intrinsic features such as the ability to sort the data or to embed charts.
  • Network externalities measurably influence the value of products. Each one percent increase in a product's installed base enables the product to command an additional $3.94 in price.
  • Purchasers place significant value on adherence to standards. Products compatible with the Lotus menu tree interface earned a premium of approximately 30% of the average price in the sample.
  • Shifts in technology platforms substantially change vendor premiums. Products manufactured by Lotus Development Corporation commanded a premium of $272 on the DOS platform, but only $65 on non-DOS platforms.

The results of this research and the general model proposed can be used to estimate the relative values of software package features, adherence to standards, and increased market share. It also quantifies the opportunities created by changes in technology architecture. Finally, the results offer guidance into current public policy issues such as the value of intellectual property embodied in software.


No. 159

Superseded by No. 184


No. 160

Reducing Buyer Search Costs: Implications for Electronic Markets

J. Yannis Bakos

Revised, July 1993

Information systems can serve as intermediaries between buyers and sellers in a consumer or industrial market, creating an "electronic market" and lowering buyers' costs of acquiring information about seller prices and product offerings. As a result, these electronic market systems are likely to reduce the inefficiencies caused by buyer search costs, in the process reducing the ability of sellers to extract monopolistic profits while increasing the ability of markets to optimally allocate productive resources. This article models the role of buyer search costs in commodity and differentiated product markets. The impact of reducing search costs is explored in the context of introducing an electronic market system, and the allocational efficiencies such a reduction can bring to a differentiated market are formalized. Finally, the resulting potential of information systems to affect the market power of buyers and sellers and the efficiency of interorganizational transactions is used to highlight certain aspects of their strategic significance.

No. 161

Some Estimates of the Contribution of Information Technology to Consumer Welfare

Erik Brynjolfsson

January 1994

Over the past decade, American businesses have invested heavily in information technology (IT) hardware. Unfortunately, it has been difficult to assess the benefits that have resulted. One reason is that managers often buy IT to enhance customer value in ways that are largely ignored in conventional output statistics. Furthermore, because of competition, firms may be unable to capture the full benefits of the value they create. This undermines researchers' attempts to determine IT value by estimating its contribution to industry productivity or to company profits and revenues.

An alternative approach is to estimate the consumer surplus from IT investments by integrating the area under the demand curve for IT. This methodology does not directly address the question of whether managers and consumers are purchasing the optimal quantity of IT, but rather assumes their revealed willingness-to-pay for IT is an accurate indicator of their preferences. Using data from the U.S. Bureau of Economic Analysis, we estimate four measures of consumer welfare, including Marshallian surplus, exact surplus based on compensated (Hicksian) demand curves, a non-parametric estimate, and a value based on the theory of index numbers. Interestingly, all four estimates indicate that in our base year of 1987, IT spending generated approximately $50 billion to $70 billion in net value in the U.S. Our estimates imply that the value created for consumers from spending on IT is about three times as large as the amount paid to producers of IT equipment, providing a new perspective on the IT value debate.
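The idea of "integrating the area under the demand curve" can be illustrated with a toy linear demand curve; all numbers below are hypothetical, not the paper's estimates. Marshallian surplus is the area between the demand curve and the observed price, up to the quantity actually purchased.

```python
# Toy consumer-surplus calculation (hypothetical demand, not the paper's).
# Inverse demand: p(q) = p_max - slope * q; observed price p0.
p_max, slope, p0 = 10.0, 0.5, 4.0
q0 = (p_max - p0) / slope                 # quantity demanded at price p0

# Midpoint-rule numeric integration of (p(q) - p0) from 0 to q0.
n = 100_000
dq = q0 / n
surplus = sum((p_max - slope * (i + 0.5) * dq - p0) * dq for i in range(n))
# matches the analytic triangle area 0.5 * (p_max - p0) * q0 = 36.0
```

The paper's Hicksian and index-number variants differ in which demand curve is integrated, not in this basic mechanics.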


No. 162

Paradox Lost? Firm-level Evidence of High Returns to Information Systems Spending

Erik Brynjolfsson and Lorin Hitt

January 1997

The "productivity paradox" of information systems (IS) is that, despite enormous improvements in the underlying technology, the benefits of IS spending have not been found in aggregate output statistics. One explanation is that IS spending may lead to increases in product quality or variety which tend to be overlooked in aggregate output statistics, even if they increase sales at the firm-level. Furthermore, the restructuring and cost-cutting that are often necessary to realize the potential benefits of IS have only recently been undertaken in many firms.

Our study uses new firm-level data on several components of IS spending for 1987-1991. The dataset includes 367 large firms which generated approximately $1.8 trillion in output in 1991. We supplemented the IS data with data on other inputs, output, and price deflators from other sources. As a result, we could assess several econometric models of the contribution of IS to firm-level productivity.

Our results indicate that IS have made a substantial and statistically significant contribution to firm output. We find that between 1987 and 1991, gross return on investment (ROI) for computer capital averaged 81% for the firms in our sample. We find that the ROI for computer capital is greater than the return to other types of capital investment and that IS labor spending generates several times as much output as spending on non-IS labor and expenses. Because the models we applied were essentially the same as those that have been previously used to assess the contribution of IS and other factors of production, we attribute the different results to the fact that our data set is more current and larger than others explored. We conclude that the "productivity paradox" disappeared by 1991, at least in our sample of firms.


No. 163

Combining Local Negotiation and Global Planning in Cooperative Software Development Projects

Kazuo Okamura

August 1993

In cooperative software development, each programmer has their own plans, and conflicts or redundancies inevitably arise among them. We are concerned with two main problems: first, controlling changes without sacrificing programmers' flexibility, and, second, guiding change activities to conform to project policies. Traditional methods of change request management focus on a management process structure based on project policies, while cooperative development methodologies are concerned mainly with resolving conflicts among individual changes. In this paper, we describe an architecture for handling proposed changes. Based on plan integration, it seamlessly supports both change coordination through negotiation and a change management process that lets changes converge until they meet the project goals.


No. 164

Answer Garden: A Tool for Growing Organizational Memory

Mark S. Ackerman

January 1994

Answer Garden allows organizations to develop databases of commonly asked questions that grow "naturally" as new questions arise and are answered. It is designed to help in situations (such as customer "hot lines" or help desks) where there is a continuing stream of questions, many of which occur over and over, but some of which the organization has never seen before. Answer Garden includes a branching network of diagnostic questions, as well as additional information retrieval methods, that help users find the answers they want. If the answer is not present, the system automatically routes the question to the appropriate expert, and the answer is returned to the user as well as inserted into the information database. Experts can also modify this network in response to users' problems. Through their normal interactions, users and experts build an organizational memory.

The thesis examines organizational memory and Answer Garden from three perspectives: organizational memory at an organizational level, information seeking at an individual level, and software systems at a technical level. It is asserted that information technology can support organizational memory in two ways, either by making recorded knowledge retrievable or by making individuals with knowledge accessible. The thesis also describes two additional organizational memory applications, the ASSIST and LiveDoc, and details the Answer Garden Substrate system underlying all three applications. Finally, the thesis reports a field study of software engineers' use of Answer Garden.

No. 165

GroupLens: An Open Architecture for Collaborative Filtering of Netnews

Paul Resnick, Neophytos Iacovou, Mitesh Suchak, Peter Bergstrom, John Riedl

March 1994

Collaborative filters help people make choices based on the opinions of other people. GroupLens is a system for collaborative filtering of netnews, to help people find articles they will like in the huge stream of available articles. News reader clients display predicted scores and make it easy for users to rate articles after they read them. Rating servers, called Better Bit Bureaus, gather and disseminate the ratings. The rating servers predict scores based on the heuristic that people who agreed in the past will probably agree again. Users can protect their privacy by entering ratings under a pseudonym, without reducing the effectiveness of the score prediction. The entire architecture is open: alternative software for news clients and Better Bit Bureaus can be developed independently and can interoperate with the components we have developed.


No. 166

Genre Repertoire: Norms and Forms for Work and Interaction

Wanda J. Orlikowski, JoAnne Yates

March 1994


Using the genre perspective, we studied the electronic communication of knowledge workers collaborating on a multi-year project and found that their work and interactions were mediated by the use of four genres (or shared types) of communication. Drawing on these findings, we develop the concept of genre repertoire to designate the set of genres enacted by groups, organizations, or communities to accomplish their work. We show that the establishment of a community's genre repertoire, which typically occurs at its formation, is a process that is largely implicit and rooted in members' prior experiences of working and interacting. Once established, a genre repertoire serves as a powerful social template for shaping how, why, and with what effect members of a community interact to get their work done. While serving to institutionalize norms and forms of work and interaction, genre repertoires can and do change over time through members' response to project events, task demands, media capabilities, time pressures, and converging community norms. The concept of genre repertoire offers organizational research a powerful way of understanding mediated work practices and interaction norms, and hence how communication technologies may be associated with changes in the work and interaction of groups, organizations, or communities.


No. 167

Shaping Electronic Communication: The Metastructuring of Technology in Use

Wanda J. Orlikowski and JoAnne Yates, Kazuo Okamura, Masayo Fujimoto

April 1994

In this paper we suggest that the use of computer-mediated communication technologies in new and fluid organizations can be facilitated by the explicit and ongoing adapting of those technologies to changing contexts of use. In an exploratory study on the use of a computer conferencing system in an R&D setting, we found that the new medium's effectiveness was significantly influenced by the intervention of a few individuals who took on a role we label technology-use mediation. These mediators shaped everyday use of the conferencing technology, modifying the technology as well as the context of use to promote effective electronic communication. Drawing on the insights of this empirical study, we develop a theoretical framework that views technology-use mediation as influencing how users structure their communication technologies, and hence as one form of metastructuring. We believe that the role of technology-use mediation constitutes a valuable mechanism for providing the ongoing attention and resources needed to contextualize what are often generic computer-mediated communication technologies to the shifting conditions of dynamic organizational forms.


No. 168

Coalition, Cryptography, and Stability: Mechanisms for Coalition Formation in Task Oriented Domains

Gilad Zlotkin and Jeffrey S. Rosenschein

July, 1994

(To appear in the proceedings of the National Conference on Artificial Intelligence, Seattle, Washington, July 1994.)


Negotiation among multiple agents remains an important topic of research in Distributed Artificial Intelligence (DAI). Most previous work on this subject, however, has focused on bilateral negotiation, deals that are reached between two agents. There has also been research on n-agent agreement that considers "consensus mechanisms" (such as voting), which allow the full group to coordinate itself. These group decision-making techniques, however, assume that the entire group will (or has to) coordinate its actions. Sub-groups cannot make sub-agreements that exclude other members of the group.

In some domains, however, it may be possible for beneficial agreements to be reached among sub-groups of agents, who might be individually motivated to work together to the exclusion of others outside the group. This paper considers this more general case of n-agent coalition formation. We present a simple coalition formation mechanism that uses cryptographic techniques for subadditive Task Oriented Domains. The mechanism is efficient, symmetric, and individually rational. When the domain is also concave, the mechanism also satisfies coalition rationality.


No. 169

Consenting Agents: Designing Conventions for Automated Negotiation

Jeffrey S. Rosenschein and Gilad Zlotkin

(To appear in the AI Magazine, Vol 15, Fall 1994, No. 3.)

As distributed systems of computers play an increasingly important role in society, it will be necessary to consider ways in which these machines can be made to interact effectively. We are concerned with heterogeneous, distributed systems made up of machines that have been programmed by different entities to pursue different goals.

Adjusting the rules of public behavior (the rules of the game) by which the programs must interact can influence the private strategies that designers set up in their machines. These rules can shape the design choices of the machines' programmers, and thus the run-time behavior of their creations. Certain kinds of desirable social behavior can thus be caused to emerge through the careful design of interaction rules. Formal tools and analysis can help in the appropriate design of these rules.

We here consider how concepts from fields such as decision theory and game theory can provide standards to be used in the design of appropriate negotiation and interaction environments. This design is highly sensitive to the domain in which the interaction is taking place.


No. 170

Meet Your Destiny: A Non-manipulable Scheduler

Eithan Ephrati, Gilad Zlotkin and Jeffrey S. Rosenschein

(To appear in the proceedings of the Conference on Computer Supported Cooperative Work, North Carolina, October, 1994).

In this paper we present three scheduling mechanisms that are manipulation-proof for closed systems. The amount of information that each user must encode in the mechanism increases with the complexity of the mechanism. On the other hand, the more complex the mechanism is, the more it maintains the privacy of the users.

The first mechanism is a centralized, calendar-oriented one. It is the least computationally complex of the three, but does not maintain user privacy. The second is a distributed meeting-oriented mechanism that maintains user privacy, but at the cost of greater computational complexity. The third mechanism, while being the most complex, maintains user privacy (for the most part) and allows users to have the greatest influence on the resulting schedule.


No. 171

Helping CSCW Applications Succeed: The Role of Mediators in the Context of Use.

Kazuo Okamura, Masayo Fujimoto, Wanda J. Orlikowski, JoAnne Yates

(To appear in the proceedings of the Conference on Computer Supported Cooperative Work, North Carolina, October, 1994).

August 1994

This study found that the use of a computer conferencing system in an R&D lab was significantly shaped by a set of intervening actors--mediators--who actively guided and manipulated the technology and its use over time. These mediators adapted the technology to its initial context and shaped user interaction with it; over time, they continued to modify the technology and influence use patterns to respond to changing circumstances. We argue that well-managed mediation may be a useful mechanism for shaping technologies to evolving contexts of use, and that it extends our understanding of the powerful role that intervenors can play in helping CSCW applications succeed.


No. 172

Computers and Economic Growth: Firm-Level Evidence.

Erik Brynjolfsson, Lorin Hitt

August 1994

In advanced economies, computers are a promising source of output growth. This paper assesses the value added by computer equipment and information systems labor by estimating several production functions that also include ordinary capital, ordinary labor and R&D capital. Our study employs recent firm-level data for 367 large firms which generated approximately $1.8 trillion in output per year for the period 1988 to 1992.

We find evidence that computers are correlated with significantly higher output at the firm level, although simultaneity makes it difficult to prove a causal relationship. Because of their rapid growth, computers contributed more to output growth in the sample period than all other types of capital combined, despite the fact that they accounted for less than 2% of the total capital stock.


No. 173

Information Technology as a Factor of Production: The Role of Differences Among Firms.

Erik Brynjolfsson, Lorin Hitt

August 1994

Despite evidence that information technology (IT) has recently become a productive investment for a large cross-section of firms, a number of questions remain. Some of these issues can be addressed by extending the basic production function approach that was applied in earlier work. Specifically, in this short paper we 1) control for individual firm differences in productivity by employing a "firm effects" specification, 2) consider the more flexible translog specification instead of only the Cobb-Douglas specification, and 3) allow all parameters to vary between various subsectors of the economy.

We find that while "firm effects" may account for as much as half of the productivity benefits imputed to IT in earlier studies, the elasticity of IT remains positive and statistically significant. We also find that the estimates of IT elasticity and marginal product are little changed when the less restrictive translog production function is employed. Finally, we find only limited evidence of differences in IT's marginal product between manufacturing and services and between the "measurable" and "unmeasurable" sectors of the economy. Surprisingly, we find that the marginal product of IT is at least as high in firms that did not grow during the 1988-1992 sample period as it is in firms that grew.


No. 174

A Taxonomy of Organizational Dependencies and Coordination Mechanisms.

Kevin Crowston

August 1994

Interdependency and coordination have been perennial topics in organization studies. The two are related because coordination is seen as a response to problems caused by dependencies. Past studies, however, describe dependencies and coordination mechanisms only in general terms, without characterizing in detail the differences between dependencies, the problems dependencies create, or how the proposed coordination mechanisms address those problems. This vagueness makes it difficult or impossible to determine what alternative coordination mechanisms might be useful in a given circumstance or to directly translate these alternative designs into specifications of individual activities.

In this paper I develop a taxonomy of dependency types by considering possible combinations of activities using resources. The taxonomy includes task-resource dependencies and three types of task-task dependencies: shared resources, producer-consumer and common output. For each type of dependency, alternative coordination mechanisms are described. I conclude by discussing how the taxonomy helps to analyze organizational processes and suggest alternative processes.


No. 175

Electronic communication and new organizational forms: A coordination theory approach.

Kevin Crowston

August 1994

Describing and categorizing organizational forms remains a central problem in organization theory. Unfortunately, defining organizational form poses numerous difficulties. Rather than attempting to categorize entire organizations, researchers have instead suggested focusing on how particular tasks are performed, i.e., adopting the process as the unit of analysis. An important practical problem then is to identify processes that would be suitable for performing a desired task, especially processes that are enabled by the use of new electronic media and other forms of information technology. Coordination theory provides an approach to the study of processes. In this view, the form a process takes depends on the coordination mechanisms chosen to manage dependences among tasks and resources involved in the process. These mechanisms are primarily information-processing, and so the use of new media will particularly affect their cost, perhaps changing which are preferred.

In this paper, I use coordination theory to analyze the software change process of a large mini-computer manufacturer and suggest alternative ways the dependences involved could be managed and thus alternative forms the process could take. Mechanisms analyzed include those for task assignment, resource sharing and managing dependences between modules of code. The organization studied assigned problem reports to engineers based on the module which appeared to be in error; engineers specialized in particular modules. The framework suggests alternative mechanisms including assignment to generalists based on workload or based on market-like bids. Modules of code were not shared, but rather "owned" by one engineer, thus reducing the need for coordination; a more elaborate code management system would be required if multiple engineers needed to work on the same modules. Finally, engineers managed dependences between modules informally, based on their personal knowledge of which other engineers used their code; alternatives include formally defining the interfaces between modules and tracking their users.

Software bug fixing provides a microcosm of coordination problems and solutions. Similar coordination problems arise in most processes and are managed by a similar range of mechanisms. For example, diagnosing bug reports and assigning them to engineers may have interesting parallels to diagnosing patients and assigning them to specialists.

While the case presented does not formally test coordination theory, it does illustrate the potential of coordination theory for exploring the space of organizational forms. Future work includes developing more rigorous techniques for such analyses, applying the techniques to a broader range of processes, identifying additional coordination problems and mechanisms and developing tools for collecting and comparing processes and perhaps automatically suggesting potential alternatives.


No. 176

Grammatical Models of Organizational Processes.

Brian T. Pentland

August 1994

Grammar has been used metaphorically to describe organizational processes, but the metaphor has never been systematically developed so that it can be applied in empirical research. This paper develops the grammatical metaphor into a rigorous model for describing and theorizing about organizational work processes, defined here as sequences of actions that occur in the context of enabling and constraining structures. A grammatical model starts with a lexicon of elementary actions (called moves) and specifies the ways in which they can be combined to create a process. Unlike other sequential data analysis techniques, grammatical models provide a natural way of describing the layering and nesting of actions that typifies organizational processes. The example of a simple retail sales transaction is used to illustrate the underlying concepts. The paper also examines some methodological considerations involved in using process grammars and proposes an agenda for research, including: (1) creating descriptive taxonomies of organizational processes; (2) creating disconfirmable theories about the relationship between processes and the structures that enable and constrain them; (3) explaining the distribution of observed processes and predicting new processes that have not yet been observed; and (4) designing new organizational processes.

KEYWORDS: Grammars; Process models; Business processes; Sequential analysis.
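The lexicon-plus-rules idea can be sketched for the retail sales example. The moves and production rules below are hypothetical illustrations, not the paper's actual grammar; the nested TRANSACT rule shows how grammars capture the layering and repetition of moves.

```python
# Toy process grammar (hypothetical rules): nonterminals are uppercase,
# terminal moves are lowercase. A nested rule lets a sale contain any
# number of request/deliver/pay cycles.
GRAMMAR = {
    "SALE":     [["GREET", "TRANSACT", "CLOSE"]],
    "TRANSACT": [["request", "deliver", "pay"],
                 ["request", "deliver", "pay", "TRANSACT"]],
    "GREET":    [["greet"]],
    "CLOSE":    [["thank", "farewell"]],
}

def expand(symbol, depth=5):
    """Yield every move sequence derivable from symbol (bounded depth)."""
    if symbol not in GRAMMAR:          # terminal move
        yield [symbol]
        return
    if depth == 0:
        return
    for production in GRAMMAR[symbol]:
        seqs = [[]]
        for part in production:
            options = list(expand(part, depth - 1))
            seqs = [s + o for s in seqs for o in options]
        yield from seqs

language = {tuple(s) for s in expand("SALE")}
# One- and two-item purchases are both well-formed SALE performances;
# a sequence like ("greet", "pay") is not generated by the grammar.
```

Checking an observed sequence of actions against such a language is what distinguishes a well-formed performance of a process from an ill-formed one.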


No. 177

Grammatical Model of Organizational Routines in a Technical Service Organization.

Brian T. Pentland

August 1994

This paper explores the sequential structure of work processes in a task unit whose work involves high numbers of exceptions, low analyzability of search, frequent interruptions and extensive deliberation, and cannot be characterized as routine under any traditional definition. Yet a detailed analysis of the sequential pattern of action in a sample of 168 service interactions reveals that most calls follow a repetitive, functionally similar pattern. This apparent contradiction presents a challenge to our theoretical understanding of routines: how can apparently non-routine work display such a high degree of regularity? To answer this question, we propose a new definition of organizational routines as a set of functionally similar patterns and illustrate a new methodology for studying the sequential structure of work processes using rule-based grammatical models. This approach to organizational routines juxtaposes the structural features of the organization against the reflective agency of organizational members. Members enact specific performances from among a constrained (but potentially large) set of possibilities that can be described by a grammar, giving rise to the regular patterns of action we label routines.


No. 178

Process Grammars: A Generative Approach to Process Redesign.

Brian T. Pentland

August 1994

Organizations are under increasing pressure to redesign core organizational processes. This paper describes the ways in which the Process Handbook (Malone, Crowston, Lee, and Pentland, 1993) can be used to describe and redesign business processes. The Process Handbook is an electronic database of process descriptions and analysis tools. When completed, it will embody a large lexicon of process steps and constraints on the ways in which they can be combined. The Process Handbook can therefore be viewed as a kind of grammar for generating alternative process configurations. The example of a supply chain is used to illustrate the concepts.



No. 179

Roles for Electronic Brokers

Paul Resnick, Richard Zeckhauser and Chris Avery

The information superhighway directly connects millions of people, each both a consumer of information and a potential provider. If their exchanges are to be efficient, yet protected on matters of privacy, sophisticated mediators will be required. Electronic brokers can play this important role by organizing markets that promote the efficient production and consumption of information. One possible role for brokers would be to collect and redistribute product evaluations. We discuss issues of privacy, censorship, and incentives for participation that would arise in such a shared evaluation service. We then argue for the separation of the services of information provision and brokerage.


No. 180

The PIF Process Interchange Format and Framework

Jintae Lee, Gregg Yost and the PIF Working Group

December 1994

This document describes the first version of the Process Interchange Format (PIF, version 1.0). The goal of this work is to develop an interchange format to help automatically exchange process descriptions among a wide variety of business process modeling and support systems such as: workflow software, flow charting tools, process simulation systems, and process repositories.

Instead of having to write ad hoc translators for each pair of such systems, each system will only need to have a single translator for converting process descriptions in that system into and out of the common PIF format. Then any system will be able to automatically exchange basic process descriptions with any other system.
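The hub-and-spoke arithmetic above can be illustrated with toy translators. The object fields below are invented placeholders, not the actual PIF 1.0 schema: the point is only that neither tool knows about the other, yet descriptions still flow between them through the common format.

```python
# Toy hub-and-spoke translation (field names are invented, not real PIF
# syntax). Each tool implements one translator to/from the common format.
def flowchart_to_common(node):
    """Tool A's single outbound translator."""
    return {"type": "ACTIVITY", "name": node["label"]}

def common_to_workflow(obj):
    """Tool B's single inbound translator."""
    return {"step": obj["name"], "kind": obj["type"].lower()}

# A flowchart node reaches the workflow tool with no direct A->B translator.
step = common_to_workflow(flowchart_to_common({"label": "Approve order"}))

# With N tools: 2*N translators via the common format,
# versus N*(N-1) pairwise translators.
```

The extension framework described below fits the same picture: a translator that does not know an extension still handles the core fields it shares with everyone else.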

The current PIF format includes a core set of object types (such as activities, actors, and prerequisite relations) that can be used to describe the basic elements of any process. The PIF format also includes a framework for extending the core set of object types to include additional information needed in specific applications. These extended descriptions are exchanged in such a way that the common elements are interpretable by any PIF translator and the additional elements are interpretable by any translator that knows about the extensions.

The PIF format was developed by a working group including representatives from several universities and companies and has been used for experimental automatic translations among systems developed independently at three of these sites.

This document is being distributed in the hopes that other groups will comment upon the interchange format proposed here and that this format (or future versions of it) may be useful to other groups as well.


No. 181

Experiments with Oval: A Radically Tailorable Tool for Cooperative Work

Thomas W. Malone, Kum-Yew Lai, and Christopher Fry

December 1994

This paper describes a series of tests of the generality of a "radically tailorable" tool for cooperative work. Users of this system can create applications by combining and modifying four kinds of building blocks: objects, views, agents, and links. We found that user-level tailoring of these primitives can provide most of the functionality found in well-known cooperative work systems such as gIBIS, Coordinator, Lotus Notes, and Information Lens. These primitives, therefore, appear to provide an elementary "tailoring language" out of which a wide variety of integrated information management and collaboration applications can be constructed by end users.

No. 182

The Impact of Group Context on Patterns of Groupware Use: A Study of Computer Conferencing as a Medium of Work Group Communication and Coordination

Paul Cole

January, 1995

This paper describes a study of the use of computer conferencing for work group communication and coordination. The goal of the study was to examine the relationship between group context and technology utilization, i.e. the social factors which influence a group's use of groupware. This paper describes three work groups, identifies patterns of computer conferencing use for each group, and examines the relationship between use patterns and group context. The findings provide considerations for groupware introduction.


No. 183

Creating Value and Destroying Profits? Three Measures of Information Technology's Contributions

Lorin Hitt and Erik Brynjolfsson

January 1995

The business value of information technology (IT) has been debated for a number of years. Some authors have found large productivity improvements attributable to computers, as well as evidence that IT has generated substantial benefits for consumers. However, others continue to question whether computers have had any bottom line impact on business performance. In this paper, we focus on the fact that productivity, consumer value and business performance are separate questions and that the empirical results on IT value depend heavily on which question is being addressed and what data are being used. Applying methods based on economic theory, we are able to examine the relevant hypotheses for each of these three questions, using recent firm-level data on IT spending by 367 large firms. Our findings indicate that computers have led to higher productivity and created substantial value for consumers, but that these benefits have not resulted in measurable improvements in business performance. We conclude that while modeling techniques need to be improved, these results are consistent with economic theory, and thus there is no inherent contradiction between increased productivity, increased consumer value and unchanged business performance.


No. 184

Relief from the Audio Interface Blues: Expanding the Spectrum of Menu, List, and Form Styles

Paul Resnick and Robert A. Virzi

January 1995

Menus, lists, and forms are the workhorse dialogue structures in telephone-based interactive voice response applications. Despite diversity in applications, there is a surprising homogeneity in the menu, list, and form styles commonly employed. There are, however, many alternatives, and no single style fits every prospective application and user population. A design space for each dialogue structure organizes the alternatives and provides a framework for analyzing their benefits and drawbacks. In addition to phone based interactions, the design spaces apply to any limited bandwidth, temporally constrained display devices, including small screen devices such as Personal Digital Assistants (PDAs) and screen phones.


No. 185

Evolving Novel Organizational Forms

Kevin Crowston

April 1995

A key problem in organization theory is to suggest new organizational forms. In this paper, I suggest the use of genetic algorithms to search for novel organizational forms by reproducing some of the mechanics of organizational evolution. Issues in using genetic algorithms include identification of the unit of selection, development of a representation and determination of a method for calculating organizational fitness. As an example of the approach, I test a proposition of Thompson's about how interdependent positions should be assigned to groups. Representing an organization as a collection of routines might be more general and still amenable to evolution with a genetic algorithm. I conclude by discussing possible objections to the application of this technique.
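The search procedure described in this abstract can be sketched as a small genetic algorithm. The encoding, the fitness function, and the reading of Thompson's proposition (that interdependent positions are best assigned to the same group) used below are illustrative assumptions of this sketch, not the paper's actual representation:

```python
import random

# Hypothetical sketch: organizations are bit-vectors assigning each of
# POSITIONS positions to one of GROUPS groups; fitness rewards keeping
# interdependent position pairs inside the same group.
POSITIONS = 8
GROUPS = 2
INTERDEPENDENCIES = [(0, 1), (1, 2), (3, 4), (5, 6), (6, 7)]

def fitness(assignment):
    """Count interdependent pairs placed in the same group (higher is better)."""
    return sum(assignment[a] == assignment[b] for a, b in INTERDEPENDENCIES)

def evolve(pop_size=30, generations=60, mutation_rate=0.1, seed=0):
    rng = random.Random(seed)
    pop = [[rng.randrange(GROUPS) for _ in range(POSITIONS)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]           # truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            mom, dad = rng.sample(survivors, 2)
            cut = rng.randrange(1, POSITIONS)      # one-point crossover
            child = mom[:cut] + dad[cut:]
            for i in range(POSITIONS):             # point mutation
                if rng.random() < mutation_rate:
                    child[i] = rng.randrange(GROUPS)
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

best = evolve()
```

In this toy setting the unit of selection is the whole assignment; a routine-based representation, as the abstract suggests, would replace the bit-vector with a richer genome.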


No. 186

Evolving with Notes: Organizational Change around Groupware Technology

Wanda Orlikowski

June, 1995

This paper examines the use of a groupware technology--Lotus Development Corporation's Notes® --in the context of customer support to understand how the technology was used to enable organizational changes over time. Building on its successful implementation of the technology two years ago, the customer support department underwent a number of organizational changes that altered the nature and distribution of work, forms of collaboration, utilization and dissemination of knowledge, and coordination with internal and external units. These changes were enacted through a series of intended as well as opportunistic modifications to both the technology and the organization. The effectiveness of this change process suggests a strategy of implementing and using groupware technology that focuses first on enacting some initial planned organizational changes, and then builds on these to enact emergent changes in response to the opportunities and conditions occasioned by the planned changes. Because groupware technologies are largely open-ended and adaptable, this process of evolving organizationally with the technology over time may be a particularly useful way of implementing organizational change around groupware.


No. 187

Applying Specialization to Process Models

George Wyner and Jintae Lee

September, 1995

Object-oriented analysis and design methodologies take full advantage of the object approach when it comes to modeling the objects in a system. However, system behavior continues to be modeled using essentially the same tools as in traditional systems analysis: state diagrams and dataflow diagrams. In this paper we extend the notion of specialization to these process representations and identify a set of transformations which, when applied to a process description, always result in specialization. We analyze specific examples in detail and demonstrate that such a use of specialization is not only theoretically possible, but shows promise as a method for categorizing and analyzing processes. We identify a number of apparent inconsistencies between process specialization and the object specialization which is part of the object-oriented approach. We demonstrate that these apparent inconsistencies are superficial and that the approach we take is compatible with the traditional notion of specialization.

No. 188

Explicit and Implicit Structuring of Genres: Electronic Communication in a Japanese R&D Organization

JoAnne Yates, Wanda J. Orlikowski and Kazuo Okamura

February 1996

A study of a Japanese R&D group using a new electronic medium identified two contrasting patterns of media use: one involving explicit structuring of community genre norms, and one involving implicit structuring of local genre norms. These patterns provide initial explanations for how people begin to use new electronic media and how their use changes over time. We believe that the two patterns can serve as initial and suggestive archetypes for helping researchers and practitioners in their design, introduction, and ongoing management of new communication media.

No. 189

The Matrix of Change: A Tool for Business Process Reengineering

Erik Brynjolfsson, Amy Austin Renshaw and Marshall van Alstyne

January 1996

Business process reengineering efforts suffer from low success rates, due in part to a lack of tools for managing the change process. The Matrix of Change can help managers identify critical interactions among processes. In particular, this tool helps managers deal with issues such as how quickly change should proceed, the order in which changes should take place, whether to start at a new site, and whether the proposed systems are stable and coherent. When applied at a medical products manufacturer, the Matrix of Change provided unique and useful guidelines for change management.


No. 190

Productivity without Profit?: Three Measures of Information Technology's Value

Lorin Hitt and Erik Brynjolfsson

April 1996

The business value of information technology (IT) has been debated for a number of years. While some authors have attributed large productivity improvements and substantial consumer benefits to IT, others report that IT has not had any bottom line impact on business profitability. In this paper, we focus on the fact that while productivity, consumer value and business profitability are related, they are ultimately separate questions. Accordingly, the empirical results on IT value depend heavily on which question is being addressed and what data are being used. Applying methods based on economic theory, we are able to define and examine the relevant hypotheses for each of these three questions, using recent firm-level data on IT spending by 370 large firms. Our findings indicate that IT has increased productivity and created substantial value for consumers. However, these benefits have not resulted in supranormal business profitability. We conclude that while modeling techniques need to be improved, these results are consistent with economic theory. Thus, there is no inherent contradiction between increased productivity, increased consumer value and unchanged business profitability.


No. 191

An Improvisational Model of Change Management: The Case of Groupware Technologies

Wanda J. Orlikowski and J. Debra Hofman

February 1996

In this paper, we present an alternative way of thinking about technological change in organizations. This alternative approach is motivated by a recognition that traditional models for managing technological change - in which the major steps of the change are defined in advance and the organization then strives to implement these changes as planned in a specified period of time - are not particularly useful given the more turbulent, flexible, and uncertain organizational situations that many companies face today. Traditional models are also not particularly useful for helping the implementation of technologies such as groupware, whose unprecedented, open-ended, and context-specific nature makes it difficult to predefine the exact changes to be realized and to predict their likely organizational impact.

We suggest an alternative model of managing technological change, one that reflects the dynamic and variable nature of contemporary organizations and technologies, and which accommodates iterative experimentation, use, and learning over time. We label such a model of managing technological change "improvisational," and suggest that it may enable organizations to take advantage of the evolving capabilities, emerging practices, and unanticipated outcomes that accompany use of new technologies in contemporary organizations.


No. 192

The State Of Network Organization: A Survey In Three Frameworks

Marshall Van Alstyne

February 1996

This article reviews the literature on network organizations and interprets explanations for their behaviors in terms of established analytical principles. Tools from computer science, economics, and sociology give three markedly different interpretations of the network organization's core attributes, but they also settle on a handful of common themes. The proposed benefits are a clarification of what it means for an organization to be network structured, a few insights into its origins, and a suggestion of where the boundaries to some of its different forms might lie.


No. 193

A Coordination Perspective on Software Architecture: Towards a Design Handbook for Integrating Software Components

Chrysanthos Nicholas Dellarocas

February 1996

This thesis argues that many of the difficulties associated with building software applications by integrating existing components are related to a failure of current programming languages to recognize component interconnection as a separate design problem, orthogonal to the specification and implementation of a component's core function.

It proposes SYNOPSIS, an architectural description language which supports two orthogonal abstractions: activities, for representing the functional pieces of an application, and dependencies, for describing their interconnection relationships. Coordination processes, defined as an attribute of dependencies, describe implementations of interconnection protocols. Executable systems can be generated from SYNOPSIS descriptions by successively replacing activities with more specialized versions and managing dependencies with coordination processes, until all elements of the description are specific enough for code generation to take place.

Furthermore, it proposes a "design handbook", consisting of a vocabulary of common dependency types and a design space of associated coordination processes. The handbook is based on the insight that many software interconnection relationships can be described using a relatively narrow set of concepts orthogonal to the problem domain of most applications, such as resource flows, resource sharing, and timing dependencies.
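The activity/dependency split might be sketched as follows. The class names, the flow-dependency example, and the "buffered-pipe" protocol label are this sketch's own inventions, not the thesis's actual SYNOPSIS syntax:

```python
from dataclasses import dataclass, field

# Hypothetical sketch of the two orthogonal abstractions: activities model
# functional pieces; dependencies model their interconnections, and a
# coordination process (here just a label) manages each dependency.
@dataclass
class Activity:
    name: str

@dataclass
class Dependency:
    kind: str                        # e.g. "flow", "sharing", "timing"
    producer: Activity
    consumer: Activity
    coordination: str = "unmanaged"  # protocol chosen to manage the dependency

@dataclass
class Application:
    activities: list = field(default_factory=list)
    dependencies: list = field(default_factory=list)

    def manage(self, dep, protocol):
        """Specialize a dependency by attaching a coordination process."""
        dep.coordination = protocol

src = Activity("ReadSensor")
sink = Activity("LogData")
flow = Dependency("flow", src, sink)
app = Application([src, sink], [flow])
app.manage(flow, "buffered-pipe")    # manage the flow with a pipe protocol
```

Successively replacing activities with specialized versions and attaching coordination processes to dependencies, as the thesis describes, would refine such a description until code generation is possible.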

A prototype component-based application development tool called SYNTHESIS was developed. SYNTHESIS maintains a repository of increasingly specialized dependency types, based on the proposed design handbook. It assists the generation of executable applications by successive semi-automatic transformations of their SYNOPSIS descriptions.

A set of four experiments is described. Each experiment consisted of specifying a test application as a SYNOPSIS diagram, associating application activities with components exhibiting various mismatches, and using SYNTHESIS to assemble these components into executable systems. SYNTHESIS was able to exploit its dependencies repository in order to resolve a wide range of interoperability and architectural mismatches and integrate independently developed components into the test applications, with minimal or no need for additional manually-written code. It was able to reuse a single SYNOPSIS architectural description in order to generate versions of a test application for two different execution environments. Finally, it was able to suggest various alternative architectures for integrating each component set into its corresponding application.


No. 194

The PIF Process Interchange Format and Framework

Jintae Lee, Michael Gruninger, Yan Jin, Thomas Malone, Austin Tate, Gregg Yost & other members of the PIF Working Group

May 1996

This document provides the specification of the Process Interchange Format (PIF) version 1.1. The goal of this work is to develop an interchange format to help automatically exchange process descriptions among a wide variety of business process modeling and support systems such as workflow software, flow charting tools, planners, process simulation systems, and process repositories. Instead of having to write ad hoc translators for each pair of such systems, each system will only need to have a single translator for converting process descriptions in that system into and out of the common PIF format. Then any system will be able to automatically exchange basic process descriptions with any other system.

This document describes the PIF-CORE 1.1, i.e. the core set of object types (such as activities, agents, and prerequisite relations) that can be used to describe the basic elements of any process. The document also describes a framework for extending the core set of object types to include additional information needed in specific applications. These extended descriptions are exchanged in such a way that the common elements are interpretable by any PIF translator and the additional elements are interpretable by any translator that knows about the extensions.
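A frame-style description of the kind PIF exchanges might be generated as below. This is illustrative only: the emitted text is loosely modeled on PIF's KIF-like frame syntax, and the exact PIF 1.1 grammar is defined in the specification itself, to which this sketch does not claim to conform:

```python
# Hypothetical helper: render an entity (e.g. an ACTIVITY) as a
# PIF-style frame with an Instance-Of slot plus caller-supplied slots.
def to_pif_like(entity_id, entity_type, slots):
    lines = [f"(define-frame {entity_id}",
             "  :own-slots",
             f"   ((Instance-Of {entity_type})"]
    lines += [f"    ({slot} {value})" for slot, value in slots]
    lines[-1] += "))"                 # close the slot list and the frame
    return "\n".join(lines)

approve = to_pif_like("ACTIVITY-1", "ACTIVITY", [("Name", '"Approve Order"')])
```

A real translator, as the document describes, would read and write such frames for a specific tool's process model, falling back to the PIF-CORE object types for elements it does not understand.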


No. 195

Lexical and Sequential Variety in Organizational Processes: Some Preliminary Findings and Propositions

Brian T. Pentland, Malu Roldan, Ahmed A. Shabana, Louise L. Soe, and Sidne G. Ward

August 1996

Routineness is a central concept in organizational theory and design and is widely understood as a product of low task variety and high task analyzability. Standardized scales to measure these dimensions have been developed and shown to be reliable, but preliminary results reported here suggest the possibility that these scales may measure routineness in the content of a task unit's work but not variety in the process. Comparing results from the standard measures and detailed observational studies in three task units, we discovered that work processes in the most "routinized" task units (as measured by the standard scales) are more varied than in the less "routinized" task unit. To help explain these findings, we introduce and operationalize the concepts of lexical and sequential variety and use them to formulate testable propositions. We also discuss the implications of this alternative view of routines and routineness for issues such as organizational learning, process redesign, and mass customization.


No. 196

Early Interactions between the Life Insurance and Computer Industries

JoAnne Yates

July 1996

This paper is a study of how representatives of one commercial user industry, life insurance, interacted with some players in the newly forming computer industry in the years after World War II but before the sale of any computers for commercial purposes. In particular, this interaction shows how pioneers in life insurance, including the Prudential's early computer expert and proselytizer Edmund C. Berkeley and a broader industry effort by a special committee of the Society of Actuaries, viewed computer technology and their potential use of it, as well as the ways in which they influenced its development. Both sets of actors played important roles in educating their own firms and the life insurance industry as a whole about the potential uses of computers for insurance, as well as communicating that industry's needs, especially in the areas of rapid input/output and verification needed for routine transactions processing, to computer vendors. These interactions suggest that the theme of co-evolution of information technology and its use in life insurance, previously established in a study of the tabulator era, continues into the early computer era. This paper reinforces the notion that users can and do shape information technology, just as information technology has a shaping influence on the way users do work.

No. 197

The Self-Governing Internet: Coordination by Design

Sharon Eisner Gillett and Mitchell Kapor

January 1997

Contrary to its popular portrayal as anarchy, the Internet is actually managed, though not by a manager in the traditional sense of the word. This paper explains how the decentralized Internet is coordinated into a unified system. It draws an analogy to an organizational style in which a manager sets up a system that allows 99% of day-to-day functions to be handled by empowered employees, leaving the manager free to deal with the 1% of exceptional issues. Within that framework, it discusses:

  • how the Internet's technical design and cultural understandings serve as the system that automates 99% of Internet coordination;
  • what the 1% of exceptional issues are in today's Internet, how they are handled by multiple authorities, and where the stresses lie in the current structure;
  • and the differences in mindset that distinguish the Internet's self-governance from the management of more traditional communication systems.

No. 198

Tools for inventing organizations: Toward a handbook of organizational processes

Thomas W. Malone, Kevin Crowston, Jintae Lee, Brian Pentland, Chrysanthos Dellarocas, George Wyner, John Quimby, Charley Osborne, and Abraham Bernstein

January 1997

This paper describes a novel theoretical and empirical approach to tasks such as business process redesign, enterprise modeling, and software development. The project involves collecting examples of how different organizations perform similar processes, and organizing these examples in an on-line "process handbook". The handbook is intended to help people: (1) redesign existing organizational processes, (2) invent new organizational processes (especially ones that take advantage of information technology), (3) learn about organizations, and (4) automatically generate software to support organizational processes. A key element of the work is an approach to analyzing processes at various levels of abstraction, thus capturing both the details of specific processes as well as the "deep structure" of their similarities. This approach uses ideas from computer science about inheritance and from coordination theory about managing dependencies. A primary advantage of the approach is that it allows people to explicitly represent the similarities (and differences) among related processes and to easily find or generate sensible alternatives for how a given process could be performed. In addition to describing this new approach, the work reported here demonstrates the basic technical feasibility of these ideas.

No. 199

Bundling Information Goods: Pricing, Profits and Efficiency

Yannis Bakos and Erik Brynjolfsson

January 1997

We analyze pricing strategies for digital information goods, such as those increasingly available via the Internet. Because perfect copies of such goods can be created and distributed almost costlessly, any single positive price for copies is likely to be socially inefficient. However, we show that, under certain conditions, a monopolist selling information goods in large bundles instead of individually may nearly eliminate this inefficiency. In addition, the bundling strategy can extract as profits an arbitrarily large fraction of the area under the demand curve for the individual goods while commensurately reducing consumers' surplus. The bundling strategy is particularly attractive when the marginal costs of the goods are very low, when the correlation in the demand for different goods is low, and when consumer valuations for the individual goods are of comparable magnitude. We also describe the optimal pricing strategies when these conditions do not hold; show how private incentives for bundling can diverge from social incentives; and describe a mechanism to recover information about the underlying demand for each individual good. The predictions of our analysis appear to be consistent with empirical observations of the markets for Internet and on-line content, cable television programming, and copyrighted music.
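The core claim above can be illustrated with a small simulation. The parameters here (20 goods, i.i.d. uniform valuations, a simple price grid) are this sketch's own choices, not the paper's model in full; the point is only that a large bundle sold at one price outperforms per-good pricing when marginal cost is zero:

```python
import random

# Illustrative Monte Carlo: compare optimal revenue from selling each good
# separately versus selling all goods as a single bundle.
rng = random.Random(0)
GOODS, CONSUMERS = 20, 2000
# valuations[c][g] ~ Uniform(0, 1): consumer c's value for good g
valuations = [[rng.random() for _ in range(GOODS)] for _ in range(CONSUMERS)]

def revenue_separate(price):
    """Each good sold on its own; a consumer buys good g iff v_g >= price."""
    sold = sum(v >= price for row in valuations for v in row)
    return price * sold

def revenue_bundle(price):
    """All goods sold as one bundle; a consumer buys iff total value >= price."""
    sold = sum(sum(row) >= price for row in valuations)
    return price * sold

# Grid-search the best single price for each strategy.
best_sep = max(revenue_separate(p / 100) for p in range(1, 100))
best_bun = max(revenue_bundle(p * GOODS / 100) for p in range(1, 100))
```

Because the sum of many independent valuations concentrates near its mean, a bundle price just below that mean sells to almost every consumer, which is the intuition behind the paper's near-elimination of the inefficiency.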

No. 200

Information Systems and the Organization of Modern Enterprise

Erik Brynjolfsson and Haim Mendelson

September 1997

In this short paper we briefly discuss the newly emerging organizational paradigms and their relationship to the prevailing trends in information technology (IT). We argue that IT is an important driver of this transformation. Finally, we place the studies selected for the special issue of the Journal of Organizational Computing within this context.

No. 201

Information Technology as a Factor of Production: The Role of Differences Among Firms

Erik Brynjolfsson and Lorin Hitt

September 1997

Despite evidence that information technology (IT) has recently become a productive investment for a large cross-section of firms, a number of questions remain. Some of these issues can be addressed by extending the basic production function approach that was applied in earlier work. Specifically, in this short paper we 1) control for individual firm differences in productivity by employing a "firm effects" specification, 2) consider the more flexible translog specification instead of only the Cobb-Douglas specification, and 3) allow all parameters to vary between various subsectors of the economy.

We find that while "firm effects" may account for as much as half of the productivity benefits imputed to IT in earlier studies, the elasticity of IT remains positive and statistically significant. We also find that the estimates of IT elasticity and marginal product are little-changed when the less restrictive translog production function is employed. Finally, we find only limited evidence of differences in IT's marginal product between manufacturing and services and between the "measurable" and "unmeasurable" sectors of the economy. Surprisingly, we find that the marginal product of IT is at least as high in firms that did not grow during 1988-1992 sample period as it is in firms that grew.

No. 202

Information Technology and Productivity: A Review of the Literature

Erik Brynjolfsson and Shinkyu Yang

September 1997

In recent years, the relationship between information technology (IT) and productivity has become a source of debate. In the 1980s and early 1990s, empirical research generally did not find significant productivity improvements associated with IT investments. More recently, as new data are identified and new methodologies are applied, several researchers have found evidence that IT is associated not only with improvements in productivity, but also with gains in intermediate measures, consumer surplus, and economic growth. Nonetheless, new questions emerge even as old puzzles fade. This survey reviews the literature, identifies remaining questions, and concludes with recommendations for applications of traditional methodologies to new data sources, as well as alternative, broader metrics of welfare to assess and enhance the benefits of IT.

No. 203

A Knowledge-Based Approach to Exception Handling in Workflow Systems

Mark Klein

April 1998

This paper describes a novel knowledge-based approach for helping workflow process designers and participants better manage the exceptions (deviations from an ideal collaborative work process caused by errors, failures, resource or requirements changes etc.) that can occur during the enactment of a workflow. This approach is based on exploiting a generic and reusable body of knowledge concerning what kinds of exceptions can occur in collaborative work processes, how these exceptions can be detected, and how they can be resolved. This work builds upon previous efforts from the MIT Process Handbook project and from research on conflict management in collaborative design.

No. 204

A Coordination Theory Approach to Process Description and Redesign

Kevin Crowston

July 1998

Managers must understand, influence, and redesign organizational processes to improve business performance. In this paper we present a technique for documenting a business process. The technique has six steps: defining process boundaries, collecting data, determining actors and resources, determining activities, determining dependencies, and verifying the model. While similar to other process-mapping techniques, our approach is novel in incorporating ideas from coordination theory, and thus in its attention to dependencies. As a result, the technique is useful both for documenting a process and for suggesting ways in which the process could be redesigned. We present an extended illustration with the hope that the technique can be used by readers of this article.

No. 205

Genre Systems: Structuring Interaction through Communicative Norms

Wanda Orlikowski and JoAnne Yates

July 1998

In this paper, we demonstrate that teams may use genre systems -- sequences of interrelated communicative actions -- strategically or habitually to structure their collaboration. Using data from three teams' use of a collaborative electronic technology, Team Room, over an eight month period, we illustrate that genre systems are a means of structuring six aspects of communicative interaction: purpose (why), content (what), form (how), participants (who/m), time (when), and place (where). We suggest that CSCW researchers, designers, implementors, and users may benefit from an explicit recognition of the role genre systems can play in collaboration.

No. 206

Toward Dialogue Documents as Creative Conversational Tools

Manabu Ueda

July 1998

In this paper I propose the creation of a dialogue document for making available the knowledge contained within a creative conversation process. I discuss three main issues: the role of ordinary documents, the need to better represent conversation processes, and the costs of editing conversation. I look at the reasons why we rarely see the knowledge from the conversation process recorded in documents, even though this knowledge is in some cases as important as the result of the conversation. The dialogue documents I propose are documents based on edited transcripts of actual conversations, prepared for readers. My argument is that such documents, in dialogue form, are the most effective way to provide access to the knowledge included in the conversation process, because dialogue documents allow readers to become virtual audiences to the conversation. This means that dialogue documents convey not only explicit knowledge but also allow access to some tacit knowledge by relying on the reader's active formulation of the experience. Perhaps this is the essential value of dialogue. To crystallize my notion of the dialogue document, I discuss its features in contrast with those of a transcript of conversation as well as an ordinary document. I analyze the dialogue document from the perspectives of 'production costs and benefits' and 'message quality and editing time'. Finally, I consider the possibility of IT support for the dialogue document production process and discuss the implications of both the technological and social aspects of dialogue document production and use.

No. 207

Inventing Organizations of the 21st Century: Producing Knowledge Through Collaboration

Nina Kruschwitz and George Roth

March 1999

This manuscript examines a Process Handbook (PH) special project using a learning history form. A learning history is an assessment-for-learning, designed such that its value is derived when read and discussed by teams interested in similar issues. Its contents come from the people who initiated, implemented, and participated in the documented efforts as well as non-participants who were affected by it. A learning history presents the experiences and understandings of people who have gone through a learning effort in their own words, in a way that helps others move forward without having to "re-invent" what the original group of learners discovered. The content of the learning history creates a context for conversation that teams within organizations wouldn't be able to have otherwise.

This learning history, and the PH project it describes, raises issues around knowledge creation and team structures by looking at how a project team of individuals from university, business, and consulting organizations was effective in creating new knowledge. The team members held different predispositions toward theory development, producing business outcomes, and developing capacity for action. Their complementary, and at times conflicting, interests provided a robust structure for knowledge creation. Knowledge created through this team structure is also multidimensional, having theoretical, methodological, and practical components.

No. 208

Useful Descriptions of Organizational Processes: Collecting Data for the Process Handbook

Brian T. Pentland, Charles S. Osborn, George Wyner, Fred Luconi

August 1999

This paper describes a data collection methodology for business process analysis. Unlike static objects, business processes are semi-repetitive sequences of events that are often widely distributed in time and space, with ambiguous boundaries. To redesign or even just describe a business process requires an approach that is sensitive to these aspects of the phenomena.

The method described here is intended to generate semi-formal process representations suitable for inclusion in a "handbook" of organizational processes. Using basic techniques of ethnographic interviewing and observation, the method helps users map decomposition, specialization, and dependency relationships at an intermediate level of abstraction meaningful to participants. By connecting new process descriptions to an existing taxonomy of similar descriptions in the Handbook, this method helps build a common vocabulary for process description and analysis.

No. 209

Genre Taxonomy:  A Knowledge Repository of Communicative Actions

Takeshi Yoshioka and George Herman

October 1999

In this paper, we propose a genre taxonomy as a knowledge repository of communicative structures or "typified actions" enacted by organizational members. The Genre taxonomy aims at helping people to make sense of diverse types of communicative actions, and has three features to achieve this objective. First, the genre taxonomy represents the elements of both genres and genre systems, sequences of interrelated genres, as embedded in a social context considering the "5W1H" questions (Why, What, Who/Whom, When, Where, and How). In other words, the genre taxonomy represents the elements of both genres and genre systems in terms of purpose, contents, participants, timing of use, place of communicative action, and form including media, structuring devices and linguistic elements. Second, the genre taxonomy represents both widely recognized genres such as a report and specific genres such as a technical report used in a specific company, because the difference between a widely recognized genre and a specific variant based on the more general genre sheds light on the context of genre use. Third, the genre taxonomy represents use and evolution of genre over time to help people to understand how a genre is relevant to a community where the genre is enacted and changed.

We have constructed a prototype of such a genre taxonomy using the Process Handbook, a process knowledge repository developed at MIT. We have included both widely recognized genres such as the memo and specific genres such as those used in the Process Handbook itself. We suggest that this genre taxonomy may be useful in the innovation of new document templates or methods for communication because it helps to clarify different possible uses of similar genres and explicates how genres play a coordination role among people and between people and their tasks.

No. 210

Towards a Systematic Repository of Knowledge About Managing Collaborative Design Conflicts

Mark Klein, PhD

October 1999

Increasingly, complex artifacts such as cars, planes and even software are designed using large-scale and often highly distributed collaborative processes. A key factor in the effectiveness of these processes concerns how well conflicts are managed. Better approaches need to be developed and adopted, but the lack of systematization and dissemination of the knowledge in this field has been a big barrier to the cumulativeness of research in this area as well as to incorporating these ideas into design practice. This paper describes a growing repository of conflict management expertise, built as an augmentation of the MIT Process Handbook, that is designed to address these challenges.

CCS No. 211, Sloan No. 4115

Domain-Independent Exception Handling Services That Increase Robustness in Open Multi-Agent Systems

Mark Klein and Chrysanthos Dellarocas

May 2000

A critical challenge to creating effective multi-agent systems is allowing them to operate effectively in environments where failures ('exceptions') can occur. This paper describes the motivation, progress and plans for work being pursued in this area by the MIT Adaptive Systems and Evolutionary Software research group.

CCS No. 212, Sloan No. 4116

An Experimental Evaluation of Domain-Independent Fault Handling Services in Open Multi-Agent Systems

Chrysanthos Dellarocas and Mark Klein

May 2000

A critical challenge to creating effective open multi-agent systems is allowing them to operate effectively in the face of potential failures. In this paper we present an experimental evaluation of a set of domain-independent services designed to handle the failure modes ("exceptions") that can occur in such environments, applied to the well-known "Contract Net" multi-agent system coordination protocol. We show that these services can produce substantially more effective fault handling behavior than standard existing techniques, while allowing simpler agent implementations.

CCS No. 213, Sloan No. 4124

Community-based Interpretive Schemes: Exploring the Use of Cyber Meetings within a Global Organization

Takeshi Yoshioka, JoAnne Yates, Wanda Orlikowski

July 2000

This paper explores the challenges of adopting a personal-computer-based meeting technology in several geographically dispersed units of a global organization. We use community-based interpretive schemes as an analytic lens for examining community assumptions and expectations about genre, technology and culture, and how they shaped use of the technology over time.

CCS No. 214, Sloan No. 4127


Coordinating Information Using Genres

Takeshi Yoshioka & George Herman

August 2000

In this paper, we demonstrate how a community may use genres for coordinating information. Genres help coordinate information related to resources, place, and time, since members of the community have enacted genres in the past and have expectations of the socially recognized information that genres bring. Using the HICSS website, we illustrate how genres are used for coordinating information, addressing aspects of coordination mechanisms (such as divisibility, concurrency, accessibility, and timing) that help people improve the coordination of work processes. We model these aspects using the Process Handbook, a process knowledge repository developed at MIT, and suggest that system designers and users may benefit from an explicit recognition of the coordination provided by using genres and by exploration of similar coordination through the use of this repository.


CCS No. 215, Sloan No. 4144

Grammatical Approach to Organizational Design

Jintae Lee and Brian Pentland

December 2000

This paper introduces a grammatical approach to the design and redesign of work processes.  Process grammar allows an explicit representation of the space of possible processes within a given domain, making it possible to search for alternatives more effectively than traditional techniques. Unlike some applications of formal grammar, where complete representation is required, process grammars can be useful even when the representation of the domain is incomplete. We argue that grammatical analysis provides a complementary approach to process redesign, rather than a substitute for existing approaches.

CCS No. 216, Sloan No. 4159

Defining Specialization for Process Models

George M. Wyner and Jintae Lee

January 2001

Object-oriented analysis and design methods take full advantage of the object specialization hierarchy when it comes to modeling the objects in a system.  When modeling system behavior, however, system analysts continue to rely on traditional tools such as state diagrams and dataflow diagrams.  While such diagrams capture important aspects of the processes they model, they offer limited guidance as to the ways in which a process can be improved.  In this paper we extend the notion of specialization to process representations and identify a set of transformations which, when applied to a process description, always result in specialization.  We analyze specific examples in detail and demonstrate that such a use of specialization is not only theoretically possible, but shows promise as a method for categorizing and analyzing processes.  This paper makes two contributions: first, it articulates a formal definition of process specialization which is compatible with object specialization but allows us to reason specifically in terms of process representations; second, it develops the concept of the "specializing transformation" as a means for systematically generating and exploring process alternatives.  We illustrate these results by applying them to two commonly used representations: the state diagram and the dataflow diagram.  We identify a number of apparent inconsistencies between process specialization and the object specialization that is part of the object-oriented approach, and we demonstrate that these apparent inconsistencies are superficial and that our approach is compatible with the traditional notion of specialization.

CCS No. 217, Sloan No. 4244-02

Peer to Peer File Sharing Systems: What Matters to the End Users?

Jintae Lee

April 2002

Peer-to-peer systems have received much attention recently. However, few studies have examined what makes them successful from the user's point of view. For example, how important is the interface to the success of a peer-to-peer system? How serious is the free-loading problem for the end user? This article reports a study examining end-user perceptions of the features of peer-to-peer file sharing systems. First, it discusses the motivation for the study. Section 2 then describes the details of the study, including the data collection and analysis methods used. In particular, it identifies twenty-six features of peer-to-peer file sharing systems and examines how these features are perceived by the end user in terms of similarity and importance. Section 3 presents the results, interpretations, and an overall picture relating the system features to traditional software requirement categories. The final section explores potential implications.

CCS No. 218, Sloan No. 4362-02

A Coordination-Theory Approach to Exploring Process Alternatives for Designing Differentiated Products

Naoki Hayashi and George Herman


This paper describes a new systematic method for exploring and evaluating alternatives of a product design process for differentiated products - those that share some elements but also have differentiating features. Based on coordination theory, the method clarifies the opportunities and risks of process alternatives. The method consists of three steps:
1) finding applicable differentiation approaches, 2) finding applicable patterns of process coordination, and 3) evaluating total costs of the process alternatives.
We categorize the differentiation approaches as a taxonomy of design processes; the taxonomy includes approaches of adding or removing differentiating elements or sorting results. We also categorize how these approaches are limited by the type of interim resource in a design process. We outline three patterns of process coordination and show how they interact with the choice of product differentiation approach. We show how the process alternatives vary in the success rate of coordination and how this probability affects the total cost of executing a design process. This analysis raises awareness of the importance of managing dependencies between activities, which many process analyses do not focus on.

CCS No. 219, Sloan No. 4251-02

Information Technology Fashion: Building on the Theory of Management Fashions

Jintae Lee and Emilio Collar

June 2002

Recent studies of management fashions have used discourse data that contributed to our understanding of the forces underlying the rise and fall in popularity of new management techniques. Like management fashions, there are many IT (information technology) fashions. Testing the extent to which the theory of management fashions applies to IT fashions helps us better understand not only IT fashions but also what is generic to the fashion phenomenon and what is unique to particular fashions such as IT or management fashions. This study takes a step toward that goal by postulating and testing three hypotheses concerning the similarities and differences between IT and management fashion lifecycles: that IT fashions depend more heavily than management fashions on exogenous factors, and that, as a result, the duration of their ascent period is (1) shortening over time, (2) shortening at a rate faster than that for management fashions, (3) but still longer in absolute magnitude than that for management fashions. A bibliometric study yields partial confirmation, illustrating the usefulness of the theoretical framework provided by the theory of management fashions in the study of IT fashions while revealing unique characteristics of IT fashions that deserve further investigation.

CCS No. 220, Sloan No. 4323-02

Temporary assignments and a permanent home: A case study in the transition to project-based organizational practices

Robert Laubacher and Thomas W. Malone

December 2002


CCS No. 221, Sloan No. 4430-03

What Is In the Process Handbook? This chapter appears in: Malone, T. W., Crowston, K. G., & Herman, G. (Eds.) Organizing Business Knowledge: The MIT Process Handbook. Cambridge, MA: MIT Press, September 2003

George Herman and Thomas Malone

September 2003

What kinds of things are included in the Process Handbook? How are they organized? And why did we choose to organize them in this way? This chapter gives our answers to these questions.

CCS No. 222, Sloan No. 4444-03

IT/Automation Cost Reduction in Intel’s Manufacturing Environment

Brian Subirana

July 2003

Intel manufacturing relied heavily on IT and Factory Automation in its manufacturing processes. At Intel, everything from scheduling products on the floor and product delivery systems to statistical process control was done through automation systems.

Shortly after an Intel meeting described in the document, a new position - Computing Cost Reduction Manager - was created to lead a team within Factory Automation to drive cost reduction efforts, which were a top priority for Intel in 2003. The computing cost reduction team's task was to come up with specific recommendations on how to achieve the established cost goals and to report on a strategy within the following two weeks. In the document, the organization and business processes are examined and enough information is given to provide recommendations for cost reduction.

CCS No. 223, Sloan No. 4450-03

Measuring the Impact of Information Technology on Value and Productivity using a Process-Based Approach: The case for RFID Technology

Brian Subirana , Chad Eckes, George Herman, Sanjay Sarma, Michael Barrett

December 2003

There has been a great deal of research addressing the relationship between Information Technology (IT) investments and productivity. Most of this work has been based on firm-level metrics such as total IT investment. We present what we believe is one of the first attempts to create a systematic methodology to assess the impact of IT on business process performance metrics. Our approach builds on the MIT Process Handbook as a basis both to guide the analysis and to capture the resulting knowledge for future use. We present preliminary results on how to use this methodology to analyze the impact of a given IT technology, namely RFID (radio frequency identification), on the performance metrics of a consumer packaged goods (CPG) company. We are interested in how IT may affect performance metrics such as productivity, cost, and value. We believe our methodology can help CPG companies prioritize their investments. We show how the specialization features of the MIT Process Handbook can incorporate performance metrics to help assess such investments in RFID.

