By Kurt Conrad
Financial derivatives are like information technologies. In both arenas, many executives have turned decision making, and implicitly policy making, over to technologists. The results can be disastrous: in the case of derivatives, to short-term financial health; in the case of information technologies, to long-term competitiveness.
Reducing and managing the risk associated with large information technology projects is critical if emerging technologies like SGML and HyTime are to gain acceptance. For most organizations, the adoption of these standards means significant changes to tools, processes, responsibilities, and even the way people think about information. Few organizations are prepared at a strategic level to identify, understand, and manage risks of this magnitude.
Organic Information Management Models (OIMMs), in contrast to strictly engineered approaches, offer potential solutions to the risks associated with these transitions. Fundamentally, OIMMs shift the focus from technology to its impact on and relationship to human behaviors. By helping individuals and organizations adapt to environmental changes, OIMMs increase the value of information technology investments and protect those investments through time.
SGML and HyTime developers should understand the principles of OIMMs, both to protect their own projects from political disruption, and because much of the importance of SGML and HyTime results from their ability to balance engineered and organic definitions of value.
The authors are testing these concepts by applying them to the AtStake process. AtStake is a stakeholder-focused strategic planning process designed for use at the organizational and multi-organizational levels. It comprises a set of integrated information models, facilitation tools, and validation rules that operate against the relationships among the information models.
AtStake embodies many of the principles of OIMMs and can be used to resolve issues of policy and reduce the political risks associated with SGML and HyTime implementations.
The financial derivative market is getting a lot of attention. Kodak, Orange County, Procter & Gamble, Metallgesellschaft, and other high-profile organizations have suffered major losses or become bankrupt as a result of speculating on derivatives. As the Harvard Business Review points out: "...if not managed correctly, [derivatives] create bigger problems than the ones that they purport to solve." [Tufano, page 35]
The problem results, in large part, from complexity. Derivatives have evolved so quickly and become so complex that few can explain or claim to understand exactly how many of these investments will behave in the market. Understanding, and thus much of the decision making, has shifted to a small group of technicians. As a result, many senior managers have effectively abdicated their policy-making roles.
After first singing the praises of the new breed of financial analysts dubbed "Rocket Scientists", organizations that choose to trade in derivatives are starting to realize that "Financial decisions that were previously designed and implemented by specialists need to be monitored more closely from the very top of organizations." ["Using Derivatives," page 33]
A similar situation exists in the information technology arena. There too, senior managers have largely turned decision making over to technologists. By selecting and implementing technologies in a relative vacuum, these technologists also implicitly define corporate policies. This is unfortunate, because the stakes of bad information technology investments are just as high, if not higher, than bad financial investments. By some estimates, information and decision making are more valuable than capital, which is becoming "just another commodity" in the international markets.
Unlike their financial counterparts, however, the impact of poor information management policies and investments isn't as observable or as measurable. The harm done by inappropriate solutions can remain unrealized for years or decades, hampering organizational performance and competitiveness long after the important actors are gone. CIO turnover, outsourcing, lost opportunities, and downsizing are just a few of the results of poor information investment strategies. In addition, an increasing number of organizations and analysts are failing to find any relationship between IT investments and organizational success. Cost savings and improved productivity can no longer be assumed, as these promises routinely fail to be realized.
The solution will not be found in volumes of information technology plans and strategy documents.
"Computer staff can safely deliver superior results only if they are executing an unambiguous information policy. Fuzzy objectives and ill-defined means lead to confused execution. Under such conditions, it may be a matter of luck by blundering into success if what is delivered matches what management alleges that they always wanted." [Strassman, page 6]
The world is beginning to realize that technology alone does not guarantee success. It is within this context that SGML investments are being considered.
If information management is generally ignored at the policy level, what does it mean for information standards that derive much of their benefit from the ability to support long-range policy objectives? If the risks associated with traditional information technologies are beyond the management abilities of most organizations, where does that put SGML and HyTime?
For most organizations, the transition from proprietary, page-based document production and management architectures to SGML and HyTime (hereafter assumed to be included within references to SGML, unless otherwise noted) will be one of the most destabilizing efforts ever attempted. The adoption of these standards means significant changes to tools, processes, responsibilities, and even the way people think about information. To its credit, most of the SGML community is sensitive to these issues, but with the appearance of SGML tools from mainstream software vendors, many more individuals and organizations will choose to "go it alone."
Put yourself in the position of a manager who doesn't really understand "all that computer stuff," having effectively abdicated your decision-making role to some "expert": an in-house technology manager or developer, a vendor's sales rep, a consultant, or even the popular computing magazines. When confronted about a decision to use SGML, will you stop to educate yourself on the policy issues, or will you hand over virtually all learning and decision making to someone else? Will your organization make wise investments? How will you know?
Some notable examples of failed SGML initiatives have already occurred. How many more are likely to happen in the future? What will the impact be when magazines like CIO and InformationWeek publicize some failures as high-profile case studies?
Sure, similar articles have appeared about failed client-server projects (remember the Denver airport), but there is an army of hundreds, if not thousands, of database developers, account managers, and other technical personnel who exist to sing the praises of client-server database technologies and protect their jobs. The SGML industry is less mature and more vulnerable to a few splashy setbacks.
SGML is a powerful standard. Its flexibility allows organizations to do things that could not be done as easily any other way. That same flexibility also gives organizations the ability to do things that are not in their best interests. Like derivatives, SGML is a strategic technology that allows organizations to make big gains and big mistakes.
Senior managers are now beginning to learn that they need to provide oversight and define corporate policies for the use of derivatives to reduce and manage the risk associated with these investments. These same managers need to provide oversight of and policy-making for their information investments, especially investments involving SGML.
SGML technology suppliers and their clients need proven ways to lessen both real and perceived risks. As SGML moves more towards the mainstream, the industry will benefit by helping organizations to apply SGML and other information technologies more intelligently.
This paper provides some new approaches for managing risk and improving the returns on information technology investments.
The first section describes many of the principles associated with Organic Information Management Models (OIMMs). OIMMs provide a framework for dramatically lowering the risks associated with major information technology initiatives, including those involving SGML and HyTime. In addition, SGML and HyTime are fundamentally enabling technologies that can simplify the implementation of OIMMs.
The second section describes the application of these information management concepts to AtStake, a stakeholder-focused strategic planning process. AtStake is a proven tool that reduces risk by helping to ensure that a project's strategies and tactics are politically valid. It is also an example of an OIMM, where the emphasis is shifted away from engineering complicated, comprehensive data structures toward simpler structures that serve as a framework for identifying and documenting important relationships.
The need for organizational flexibility and resiliency is driving the demand for OIMMs. In contrast to traditional information management approaches where a software system is engineered to support a relatively static process, many organizations now need extraordinarily modular, flexible, and adaptable information systems. OIMMs allow organizations to respond to environmental changes more quickly and reduce the risks associated with information technology initiatives.
This section introduces many of the basic concepts of OIMMs, how they differ from engineered approaches, their benefits, some important guidelines for implementing OIMMs, and the roles of SGML and HyTime in OIMMs.
"During the first twenty years of computerization the approach to designing the flow of information came from Frederick Taylor's ideas of almost a century ago about factory efficiency. Accordingly, to overcome the chaos of error-prone paperwork, computer analysts saw their jobs not much differently than industrial engineers. They broke up paperwork steps into programmable sequences of transactions. Their objective in designing the computer system was to monitor the flow of the data. Whenever possible, the computer captured as many of the sequential processing steps as possible."
-Paul Strassman, The Politics of Information Management, page 179.
Today, however, accelerating economic and political change is forcing many organizations to adapt and change faster than their information infrastructures allow them to do. Many of these organizations have begun to realize that software isn't very soft. "The system won't let me do that" is an increasingly common complaint. Strategic business initiatives are often delayed or hampered by trying to adapt or modify legacy information management systems that weren't designed for such changes.
David M. Upton provides some examples of this phenomenon in his Harvard Business Review article, "What Really Makes Factories Flexible?" Upton studied the relationship between computer-integrated manufacturing (CIM) and operational flexibility within the fine paper industry. His conclusions may be surprising to some: "...there was little direct correlation between the degree of computer integration and the degree of operational flexibility." [Upton, page 75]
Even worse, Upton found that in most cases, CIM actually reduced flexibility. The CIM systems operated the plants "more conservatively" than human operators in order to reduce paper breakage and associated downtime. This not only slowed changeovers but actually produced more paper breaks than the human operators had. In addition, operator skills atrophied from lack of use after CIM systems were installed. The biggest detriment to flexibility, however, involved management behavior. By dumping the problem on technologists, the managers used CIM to dodge their responsibilities for defining flexibility objectives and managing the organization to meet those performance goals.
Unlike "traditional" information management, which is built around the concepts of engineered systems, Organic Information Management Models are designed around biological concepts, such as learning, adaption, competition, and chaos. Organic information management concerns itself less with mechanical efficiency and more with how organizations, groups, and individuals learn to respond to environmental changes.
OIMMs are built on the assumption that the tools and techniques used today will be replaced at an ever accelerating pace, evolving and adapting through time. They abandon the idea that analysts can identify and formalize every possible relationship between pieces of information. Instead, they allocate decision-making in such a way as to assure maximum flexibility to individuals and workgroups, within the context of enterprise and institutional requirements for information to be accessible and usable throughout its life cycle.
OIMMs reflect the way that people actually produce, deliver, and use information to effect change. They allow individual workgroups to work with their stakeholders to structure their information holdings based on process-specific requirements and provide a framework for individuals and groups to describe and document relationships as they are uncovered (or learned).
In reality, an effective computer system is a balance between engineered and organic elements. Numerous information management components will need to be designed and engineered very carefully for this approach to work, but OIMMs are characterized by intentionally limiting such engineering to the component and subsystem levels. This leaves much of the final integration to be done by humans in the form of defining relationships between components or executing some of the behaviors that would be automated in more fully engineered solutions. Such an approach is more expensive than fully automated solutions when comparing individual transactions, but is imperative to maintain flexibility and resiliency.
In Out of Control: The Rise of Neo-Biological Civilization, Kevin Kelly describes the impact of organic approaches on science, technology, and organizational theory. He summarizes his findings in what he calls "The Nine Laws of God": distribute being; control from the bottom up; cultivate increasing returns; grow by chunking; maximize the fringes; honor your errors; pursue no optima, have multiple goals; seek persistent disequilibrium; and change changes itself. [Kelly, page 648]
The following sections summarize a few of these concepts, as they apply to the development of the knowledge-based systems that are needed for organizations to adapt to and flourish in the face of organizational uncertainty. First, politics cannot be avoided, especially when dealing with information standards. When properly channeled, politics is the primary mechanism for assuring that technical solutions meet individual and organizational needs. Second, comprehensive, ideal solutions are almost impossible to realize, and are probably detrimental if they are actually achieved. Third, the relationships between information components are becoming more important than the components themselves, as they reflect the value-added human understanding and knowledge that provides competitive advantage.
Information technologists tend to look upon politics with disdain, as the enemy of rational decision making. But at its core, information management is about politics. OIMMs embrace politics as a critical success factor, perhaps even the singular critical success factor.
As DeMarco and Lister observe in Peopleware: Productive Projects and Teams, projects that require no real technical innovation (like accounts receivable systems) are failing. Technology is not to blame in the vast majority of these failed projects. Instead, "politics" is the perceived culprit.
In The Politics of Information Management, Strassman shares similar frustrations, recounting his experiences from the late 1950s:
"...I was never sure how my technology plans would turn out because of the problems that emerged every time I started installing well conceived and formally approved computerization programs. It always came as a surprise that well-laid plans could go awry for no apparently good reason except for something I kept dismissing as mere company politics." [Strassman, page 172]
Today, Strassman has concluded that "Managing information systems is primarily a matter of politics and only secondarily a matter of technology." [Strassman, page xxv] He goes on to trace the evolution of computing as a political history, where management and power have shifted from centralized to decentralized institutions and powerful companies have lost their markets in the process. "Mainframe theocracies" gave way to "minicomputer hierarchies," which in turn were displaced by the "microcomputer revolution" and today's "age of cooperative alliances."
Just as western culture has seen a steady progression of governance models, Strassman argues that organizations today are poised to enter an "age of information democracy." Accordingly, decision making must be formally distributed throughout the organization using the federalist principles developed by Jefferson, Madison, and Adams to balance power among a central authority and a number of constituent political units.
"The simplicity of the U.S. Constitution has much to offer as a template for information policy guidance. It represents a point of view that addresses the governance of complexity by concentrating only on the fundamentals, while leaving everything else for resolution by means of a due process wherever that is appropriate." [Strassman, page 45]
Strassman recommends convening a constitutional convention, where "the most senior executives, as a group, should conceive of and preferably author the fundamental statements of the information policy themselves. Only after the top executive councils reach agreement about the fundamentals should these statements pass to others for elaboration." [Strassman, page 78] In addition, local constitutional initiatives can help promote innovative approaches to resolving corporate issues, just as state constitutions proved to be valuable templates for the U.S. Constitution.
In keeping with the idea of separation of powers, Strassman further recommends the establishment of an information management policy board which has sole authority over policy decisions. This board functions like a legislature to define policies involving standards, technologies, and implementation practices. Executive powers are split among a variety of enterprise, process, business, application, and local units. An audit organization is vested with judicial powers. In all cases, the "representatives of the governance processes are all managers, because all management is information management." [Strassman, page 82]
Perhaps the most sensitive issues involve rebellions, conflict resolution, and enforcement. Formalizing information management policies is of no value if they can be ignored, misinterpreted, or used for political gain. Although clear lines of accountability will resolve many of the conflicts that exist today, swift and competent interpretations are necessary to develop the precedents on which future policy changes will be based. Strassman also cautions that it may take many years to develop a comprehensive governance model. In the meantime, political conflicts must be addressed if innovative information management initiatives are to have a reasonable chance of success.
"...I cannot conceive how top executives can align their information systems with business plans unless they have also put into place a framework by which individuals and organizational units can cooperate while minimizing the debilitating effects of contentious politics." [Strassman, page 88]
Software engineering, especially as reflected in the "data dictionary" and "enterprise architecture planning" schools of thought, is close to death. Relational database nirvana, the all-encompassing relational database architecture, will not be reached. The main reason is time. Most organizations do not have time to wait. By the time comprehensive database architectures are designed, there's a good chance that the organization's understanding of its needs will have changed. By the time the system is implemented, the organization may not even exist.
This is exactly the situation that one of the authors faced in 1994, in an effort to integrate both processes and technologies for an organization called Business and Information Services (BIS). BIS comprised graphics, word processing, editing, photography, video, records management, and a few other production services. A team of project and technology planning personnel attempted to lead the organization through an enterprise architecture planning effort. Before the effort had been completed, a reorganization had taken place and many of the core processes had been reassigned or reengineered.
A parallel effort to develop an integrated tracking and billing system also failed. Nine months and $100,000 later, we had a requirements document that did not reflect the then-current understanding of the organization's needs. In addition, the requirements definition process did little to help the participants develop a better understanding of their long-range business needs, focusing instead on inventorying and formalizing current practices.
Risk increases dramatically with the size of the effort and can be minimized by keeping projects and design teams small. While this promotes suboptimization and violates many life-cycle information management principles, chances are that the solutions would contain mistakes regardless of how well they were designed and engineered. The idea here is not to avoid mistakes, but to make as many mistakes as quickly as possible. This facilitates the learning and understanding that are necessary precursors to successful solutions.
OIMMs do not abandon engineering altogether, but shift the emphasis from comprehensively engineered systems to well-engineered components that can be integrated in a number of useful ways. This integration is performed by the users of the system, not the system designer.
Integrating information management components as needed through time reduces risk and improves organizational flexibility. Risk is decreased because the system designer is not challenged to identify all possible uses for the information objects and every needed software behavior during development. Likewise, component engineering is smaller in scope and shorter in duration than system engineering, and that also lessens risk.
Organizational flexibility is improved when humans do what they do best: interpret situations, weigh alternatives, and apply information and tools to meet task objectives. OIMMs do not lock individuals into predefined behaviors, but give them the information tools that they need to solve unanticipated business problems and preserve their value to the organization. Both individual and organizational flexibility are enhanced when "...the system [is] designed from the outset to help workers make better decisions rather than to cut them out of the decision-making process." [Upton, page 80]
Allowing individuals and groups to define their own interfaces and link them to important data elements is another attribute of OIMMs. Allowing individuals to preserve and document the relationships that they find useful also begins the transition to knowledge-based systems, where both the system and the human become more knowledgeable the more they interact.
Shifting from formalized structures to user-defined relationships helps to preserve investments, especially when the information technology initiatives have strategic value and are designed to create destabilizing change, either to the way that the business operates or to the way that people think.
As Lao Tzu wrote in the Tao Teh Jing in 600 BC:
SGML and HyTime developers should understand the principles of OIMMs, both to protect their own projects from political disruption, and because much of the importance of SGML and HyTime results from their ability to balance engineered and organic definitions of value.
Just as traditional information management approaches have been based on engineering concepts, the classic view of SGML has been as an interchange standard. This use of SGML promotes mechanical efficiency, minimizes data transformations, and emphasizes resource optimization. Value is measured in terms of cost, time, and fixed measures of quality. Strategic value results from improving the efficiency of existing processes and beating existing competitors.
Viewed from an engineering perspective, information life cycles tend to be more expensive than they need to be, because the desire for optimization in one arena has often resulted in overall suboptimization. SGML can reduce overall information life-cycle costs by formally defining information structures that meet the needs of a wider variety of information producers and consumers, thus reducing the barriers and costs associated with suboptimization. SGML can reduce the need for duplicate systems and costly data conversions and make it easier to retrieve, recycle, repurpose, and reformat information.
The dynamics of an engineering-oriented SGML project are to pick a document type, stake out as much of the information life cycle as possible, and try to get everyone to agree on a single acceptable structure. On the systems side, information producers and consumers will use the resulting SGML DTD as an interface specification, and build or adapt information technologies to understand, produce, and interact with conforming documents. Because of the magnitude of the associated investments, these interchange DTDs are meant to be stable artifacts that are used as fixed points of reference for years, if not decades.
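To make the contrast concrete, here is a minimal sketch of what an interchange DTD fragment of this kind might look like (the element names are invented for illustration, not taken from any actual interchange standard):

    <!-- One agreed-upon structure, treated as a fixed interface
         specification by every producer and consumer. -->
    <!ELEMENT report   - - (title, abstract?, section+)>
    <!ELEMENT abstract - - (para+)>
    <!ELEMENT section  - - (title, para+)>
    <!ELEMENT title    - - (#PCDATA)>
    <!ELEMENT para     - O (#PCDATA)>

Every party must map its tools to this one content model; the stability that makes the DTD a reliable interface is the same property that makes it slow to change.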
Unfortunately, such engineering efforts are susceptible to the same risks as other engineered information technology initiatives. Developers can protect themselves from risk by adopting organic approaches.
First, they can clearly differentiate policy from technical issues to help promote the active involvement of senior management. Second, they can perform political assessments to identify and map stakeholder interests. For organizations that do not distribute decision making along the lines that Strassman describes, the AtStake process can provide a framework for the required political analysis. Third, developers can be sensitive to the differences between engineered and organic definitions of value to help ensure that the solutions will meet the organization's needs for flexibility.
To support requirements for local flexibility, "organic" DTDs are likely to differ from interchange DTDs in a number of ways:
Organic DTDs are built with the expectation that they will change. Change will become necessary as the organization learns and adapts to new environments.
Organic DTDs may avoid the use of standardized generic identifiers. Different workgroups may need to retain their own vocabularies. The architectural forms construct allows these workgroup-specific vocabularies to be mapped back to corporate and industry standards, while retaining local flexibility.
Organic DTDs support a wide variety of compatible structures. Just as receptor sites accept a variety of compatible chemicals, organic DTDs contain a mixture of standard and non-standard structures. The standard structures are designed to "bond" with standard systems and non-standard structures are tuned to exploit localized systems. Such an approach is critical to "maximize the fringes" [Kelly, page 470] and promote experimentation.
Organic DTDs allow a wide variety of metadata models to be developed. As the quality of the metadata largely determines the usefulness of the information, information objects effectively "compete" for attention based on the quality of their metadata. Metadata requirements are expected to change through time, and this evolutionary process, too, reflects organizational learning. High-quality metadata models support the shift from information management to knowledge management.
Organic DTDs rely on "self-organization." Instead of organizing information through the centralized design of complex structures, the decentralized interaction of information providers, consumers, and software systems creates and maintains order. Hypermedia links are a primary mechanism for organizing and integrating large bodies of information, allowing DTDs to become smaller, cheaper, and more quickly developed. Comparatively simple chunks of information are used as shells from which to hang HyTime links. Like metadata models, sets of links (webs) will compete for resources based on their utility.
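To illustrate several of these characteristics at once, here is a hypothetical workgroup DTD fragment (the names are ours, purely illustrative): local generic identifiers are retained, #FIXED attributes map the local vocabulary to shared architectural forms, and a HyTime clink element carries the links described above:

    <!-- Local vocabulary, mapped to corporate forms via attributes. -->
    <!ELEMENT workorder - - (jobtitle, step+)>
    <!ATTLIST workorder
              corparch NAME #FIXED "report" -- maps to corporate form -->
    <!ELEMENT jobtitle  - - (#PCDATA)>
    <!ATTLIST jobtitle
              corparch NAME #FIXED "title">
    <!ELEMENT step      - O (#PCDATA | xref)*>
    <!-- A simple hyperlink declared as a HyTime contextual link. -->
    <!ELEMENT xref      - - (#PCDATA)>
    <!ATTLIST xref
              HyTime  NAME  #FIXED "clink"
              linkend IDREF #REQUIRED>

The workgroup keeps its own names and structures, while standard processing systems recognize the documents through the architectural mapping and the link forms.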
The challenge to system designers is to provide a framework where local experimentation and variety do not become barriers to information exchange and where investment decisions can be made at a tactical level and still meet strategic requirements. SGML is both flexible enough to meet local requirements and standardized enough to preserve long-term value by assuring that information is maintained in a form that is usable outside of the specific workgroups and applications that created it.
SGML and HyTime are important precisely because they allow local, tactical solutions to be integrated within a common information management architecture.
The ability to both standardize and customize is likely to become increasingly important as workgroups and organizations are pressured to quickly adapt to changing circumstances and are driven to pursue divergent information management strategies. By supporting both engineering and organic approaches, SGML and HyTime provide a framework for developing shared solutions that balance the competing drivers and help ensure organizational health in the face of uncertainty. SGML allows organizations and workgroups to emphasize either engineering or organic approaches to realize immediate benefits but does not preclude evolving to a more balanced approach in the future.
AtStake is a stakeholder-focused strategic planning process designed for use at the organizational and multi-organizational levels. It comprises a set of integrated information models, facilitation tools, and validation rules that operate against the relationships among the information models.
AtStake has been used successfully by over 100 groups within the past few years on high-conflict-potential projects and to align diverse and contentious groups of stakeholders on environmental issues. It has also been used to identify and define new missions for international governmental agencies and to address sensitive national security issues by structuring treaty negotiations between antagonistic governments.
AtStake relates to OIMMs in a number of important ways. AtStake can be used to identify and address the political issues that place technology initiatives at risk. AtStake is also an example of an OIMM, where simple data models are used as a foundation for humans to identify and document important relationships. The set of information collected and organized by using AtStake has long-term value, which would be enhanced by integrating it into an enterprise-level information management architecture that supports user-defined links to other planning and performance management systems. Finally, AtStake is based on human decision-making models and can help to add organic awareness to a variety of applications.
The authors have been evaluating approaches for using SGML and HyTime to formalize the data models and relationships on which AtStake is based. This section describes the conceptual origins of AtStake, examines the relationship between AtStake and OIMMs, and explores potential applications.
AtStake was developed by one of the authors to address the political issues associated with the development and implementation of environmental policy. The author also had extensive experience developing political strategies within the executive and legislative branches of the federal government, and these experiences helped to shape the facilitation process.
He developed AtStake as a response to the weaknesses of traditional strategic planning and its inability to appropriately utilize external and internal stakeholder input. In addition, traditional strategic planning methods routinely neglect to identify short-term and long-term strategies, implementation responsibilities, and success criteria. Neglecting to identify success criteria at the time of planning can be fatal since strategic planning is a long-term process and new participants, not knowing the original baseline, become responsible for its continuance.
AtStake is based on a pair of simple yet powerful conceptual foundations. First, charting a path, like solving a maze, is easier if you start at the destination. Second, values are more important than facts when making choices, especially when trying to select desirable or acceptable future states.
As Strassman points out, "Politics is the art of the feasible." [Strassman, page 6] AtStake uses a predictability model called VIE to translate values, interests, and expectations into a range of acceptable options that defines the conceptual space for politically viable solutions. Each individual brings a set of unacceptable solutions. When those are combined at the group level, the resulting range of tolerance is smaller than for any given individual.
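In set terms (our paraphrase; the authors give VIE no formal notation), if participant $i$ finds the options in the set $A_i$ acceptable, the group's range of tolerance is

    $T = A_1 \cap A_2 \cap \cdots \cap A_n \subseteq A_j$ for every participant $j$,

so the shared range can never be larger than any one individual's, and it tends to shrink as participants are added.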
When VIE analysis is applied to a diverse group of stakeholders in a facilitated session, the resulting range of tolerance is broader than would be expected in political models that promote polarization. AtStake provides a collaborative framework for negotiating a shared consensus of how individual values should be weighed to identify a mutually desirable future state.
After achieving investment by the stakeholders in a shared vision, AtStake begins to identify mutual drivers and constraints, which are "marked" as key issues for navigating the development of future strategies. This is similar to marking land mines in order to navigate around them.
AtStake relates to OIMMs in four ways:
First and foremost, AtStake reduces risk. It can be used to address many of the political issues associated with the allocation of decision making, the formulation of corporate information policy, and other shared decisions.
Second, AtStake is an example of an OIMM. It uses very simple structures to focus the participants' attention on the relationships between information components. These relationships are more complex and less easily formalized than the basic data structures, but they do follow well-defined patterns.
Third, AtStake is expected to serve an important role in the development of OIMM-based performance management systems. The data and strategies which are collected and developed during the application of AtStake have long-term value and should be linked to other project management and implementation modules.
Fourth, because VIE is based on human decision-making models, it can help to add organic awareness to a variety of applications.
Policy is the codification of organizational values that specifically emphasizes the desired balance between competing values. Thus, the process of defining policies is a political process that pits the values, interests, and desires of groups and individuals against each other. Because AtStake is, at its core, a political process, it is extraordinarily well suited to identifying and resolving many of the policy-making and political issues that have been discussed in this paper. Dispute resolution occurs by focusing on mutual long-term agreements rather than fixating on disagreements.
Not only can AtStake help address political issues prior to the establishment of the constitutional framework that Strassman recommends, it can also speed the learning and negotiation processes that senior executives must go through during their constitutional convention.
Even where it is not used directly to develop policy, AtStake integrates the diverse views of multiple stakeholders into the strategic planning activity and leaves decisions to the true decision makers, not planners or technologists. This helps to assure that projects reflect realized requirements and have supportive constituencies.
AtStake forces the participants to identify short-term and long-term strategies and, for each strategy, who is responsible, when it is to be implemented, and specific, tangible success criteria. AtStake also quickly identifies situations where a planning process isn't needed, for example, where change will be driven by factors that are out of the group's control. Where these situations have been encountered, AtStake identified the realities of the situation before significant investments had been made.
AtStake uses a very simple data model that contains a fairly small number of #PCDATA components: values, interests, expectations, vision, mission, achievements, goals, objectives, assumptions, capabilities, strengths, weaknesses, opportunities, threats, strategies, initiatives, and activities. In addition, structural relationships are intentionally limited to allow the participants fairly wide latitude in how information is entered into the model.
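A minimal sketch of what such a model might look like in SGML follows; this is our assumption about its shape, not the authors' formalized DTD, which was still under development:

    <!-- Flat #PCDATA components, repeatable in almost any order,
         so that structure constrains the participants as little
         as possible. -->
    <!ELEMENT atstake - - (vision?, mission?,
                          (value | interest | expectation | achievement |
                           goal | objective | assumption | capability |
                           strength | weakness | opportunity | threat |
                           strategy | initiative | activity)*)>
    <!ELEMENT (vision | mission | value | interest | expectation |
               achievement | goal | objective | assumption | capability |
               strength | weakness | opportunity | threat | strategy |
               initiative | activity) - O (#PCDATA)>

Almost all of the analytical value lives in the relationships among these components rather than in the content models themselves.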
Insight, learning, and creativity result not from complicated structural relationships and structural validation, but from the ability to identify and describe non-structural relationships: Does the vision accurately reflect the stated goals of the represented organizations? Will the achievements, goals, and objectives accomplish the vision and conform to the mission and values? Are strategies within the scope of the mission and the values to protect the integrity of the plan?
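One way such a relationship might be recorded (an illustrative instance fragment, again using invented names) is as a link rather than as a content-model constraint:

    <goal id="g1">Cut information life-cycle costs</goal>
    <strategy id="s1">Adopt SGML for new technical manuals.
      <supports linkend="g1">This strategy supports goal g1.</supports>
    </strategy>

Here "supports" would be declared as an application of the HyTime clink form, so the judgment that a strategy serves a goal is captured by a human and preserved for later validation, not enforced by the DTD.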
One of the more important sets of relationships involves the congruence of values, interests, and expectations. The comparison of these three points allows positions to be evaluated for consistency and predicts likely future positions and decision-making parameters.
Too often, strategic planning is a paper exercise with little or no relationship to the actual workings of the organization. To be of more use, strategic plans need to be tied to budgetary and project management systems.
The set of information collected and organized by using AtStake has long-term value, which would be enhanced by integrating it into an enterprise-level information management architecture that supports user-defined links to other planning and performance management systems, such as financial, program planning and control, and performance tracking systems.
Past experience has demonstrated that most choices are based on core values, not the "facts" that one would expect to drive rational or scientific decision making. VIE makes no judgement about whether decisions and choices should be based on values or facts. Instead, by clearly identifying both the values and facts that apply, VIE helps individuals and organizations to understand why particular decisions are made.
Once the formalized SGML representation of AtStake is complete, the authors anticipate a variety of possible applications.
Risk management is one of the critical issues in the information technology industry and will become more important to the SGML industry as SGML moves more into the mainstream. The ability to differentiate engineered from organic approaches is important to help manage risk and provide solutions that will meet customer needs through time.
Politics is one of the most dangerous organic elements of any information technology initiative. Without proven strategies for harnessing political forces, the risks of failure increase. Organizations can protect themselves from this risk by formally decentralizing decision making along constitutional principles.
"Conceiving, discussing, announcing, and working under the guidance of an information constitution is not only desirable but essential. The systems capabilities that an organization acquires in this way will shape the benefits of computerization more than any other influence. Governance will always surpass technology as the most critical success factor." [Strassman, page 88]
Proactive stakeholder involvement also lessens political risks. AtStake is a proven tool that reduces risk by helping to identify dangerous political issues early enough in the project to develop strategies for avoiding their disruptive effects. It is also an example of an OIMM, where the emphasis is shifted away from engineering complicated, comprehensive data structures toward simpler structures that serve as a framework for identifying and documenting important relationships.
Instead of being monolithic, organic systems consist of multiple, small components, and decision making should be distributed among those components. While this appears chaotic, it is the only way that large systems can adapt to changing circumstances. Decentralized experimentation is also necessary to help incubate new approaches. The more adaptive a system is, the less optimized it will be. Reduced optimization is relatively expensive, but complexity is created not by efficient design but by combining essentially autonomous modules.
SGML and HyTime are powerful standards. Properly applied they offer unparalleled capabilities for balancing and integrating engineered and organic approaches and definitions of value. But like financial derivatives, their implementation can disrupt operations and damage organizational performance if improperly managed.
Information management suppliers and their clients need to apply proven ways to lessen both real and perceived risks. As SGML and HyTime move more towards the mainstream, SGML professionals will benefit by helping organizations to make safe and rewarding information technology investments that help them cope with complex and rapidly changing environments.
Davenport, Thomas H., "Saving IT's Soul: Human-Centered Information Management," Harvard Business Review, March-April 1994, page 119.
Davis, Stan and Botkin, Jim, "The Coming of Knowledge-Based Business," Harvard Business Review, September-October 1994, page 165.
DeMarco, Tom and Lister, Timothy, Peopleware: Productive Projects and Teams, Dorset House Publishing, New York, 1987.
Galitz, Wilbert O., Humanizing Office Automation: The Impact of Ergonomics on Productivity, QED Information Sciences, Inc., Wellesley, MA, 1984.
GAO, Executive Guide: Improving Mission Performance Through Strategic Information Management and Technology - Learning From Leading Organizations, GAO/AIMD-94-115, May 1994.
Hammer, Michael and Champy, James, Reengineering the Corporation, HarperCollins Publishers, Inc., New York, 1993.
Kelly, Kevin, Out of Control: The Rise of Neo-Biological Civilization, Addison-Wesley Publishing Company, 1994.
Kuhn, Thomas S., The Structure of Scientific Revolutions, The University of Chicago Press, Chicago, 1970.
Levidow, Les and Robins, Kevin, editors, Cyborg Worlds: The Military Information Society, Free Association Books, London, 1989.
Mintzberg, Henry, "The Fall and Rise of Strategic Planning," Harvard Business Review, January-February 1994.
Petroski, Henry, To Engineer Is Human: The Role of Failure in Successful Design, Vintage Books, New York, 1992.
Strassman, Paul, The Politics of Information Management, The Information Economics Press, New Canaan, CT, 1995.
Taylor, James C. and Felten, David F., Performance by Design: Sociotechnical Systems in North America, Prentice Hall, Englewood Cliffs, New Jersey, 1993.
Tufano, Peter, "Using Derivatives: What Senior Managers Must Know," Harvard Business Review, January-February 1995, page 35.
Upton, David M., "What Really Makes Factories Flexible?," Harvard Business Review, July-August 1995, page 74.
"Using Derivatives: What Senior Managers Must Know," Harvard Business Review, January-February 1995, page 33.
Zachman, John, "A Framework for Information Systems Architecture," IBM Systems Journal, Vol. 26, No. 3, 1987.
Copyright, The Sagebrush Group, 1995
This article was prepared for and presented at The 2nd International Conference on the Application of HyTime, August 16-17, 1995.