[Mirrored from: http://www.3-cities.com/~conrad/nondoc.htm]
By Kurt Conrad
Many people mistakenly believe that SGML is only useful or appropriate for documents. SGML can also be used to manage administrative and financial data sets to support project planning and milestone tracking, fiscal year work plans, activity-based management, process improvement and reengineering efforts, and forms automation. When used for these applications, SGML can balance efficiency and flexibility, thereby enabling organizational adaptability and health.
The Department of Energy's Hanford Site is facing cataclysmic, all-consuming change. The latest evidence is seen in a recent issue of the Hanford Reach, the Site's weekly newsletter. The tone of this special issue was best summarized in the opening article:
"For the foreseeable future, perhaps the only constant at Hanford will be change. Even the site's geography will be transformed as cleanup progresses and Hanford land is released for other uses.... The need to fundamentally change our focus, the way we organize ourselves and the way we do our work is rooted in many complex, interrelated forces." [DOE-RL, page 1]
In the context of all of this change, DOE has also directed contractors to implement SGML-based life-cycle information management and delivery processes for scientific and technical information. Westinghouse Hanford Company (WHC) and Boeing Computer Services Richland (BCSR)--which functions as the information resources management department for WHC--have a rich legacy of document production standards. The associated cultural expectations are driving a Site-wide infrastructural approach to implementing SGML (Standard Generalized Markup Language), but given the changes described above, the costs associated with such an implementation strategy are increasingly difficult to justify.
The need to build an expansive business case for SGML has resulted in a quest to develop an understanding of and to identify opportunities for strategic value. While a technology-driven approach may seem inappropriate, it is actually a reflection of the inductive thinking process described by Hammer and Champy in Reengineering the Corporation. It is also consistent with SGML industry trends, where externally imposed requirements are translated into opportunities to improve internal information management processes. The GAO's Executive Guide Improving Mission Performance through Strategic Information Management and Technology and Paul Strassman's The Business Value of Computers have been very helpful in identifying strategic opportunities. These and other sources have led to viewing SGML not only as a document production or information management standard, but as a tool for improving organizational health.
This paper summarizes a few of the major conclusions: (1) Management systems can provide more strategic value than document production processes. (2) Information flows can be engineered to improve mechanical efficiency and cost effectiveness. (3) Emerging "organic" information management models are likely to improve organizational learning and adaptation. (4) SGML-based approaches can help balance the mechanical and organic information management approaches.
It is hard to find an organization that isn't facing tremendous competition, change, and uncertainty. This is definitely the case for DOE contractors, both at Hanford and at other sites. With the end of the Cold War, the emphasis has shifted from the production of nuclear weapons to environmental restoration. The national labs are searching for new customers. Routine budget cuts are increasing competition between individuals and organizations for scarce resources.
These changes are not unique to the Federal system. Organizations outside of the government face similar pressures.
Organizations get sick when they can't adapt to change. In The Stress of Life, Hans Selye used the phrase General Adaptation Syndrome to describe an organism's response to stress. Among the observable signs of stress he listed were hyperexcitation, impulsive behavior, inability to concentrate, hypermotility, and accident proneness. At Hanford, accelerating change is placing increasing stress on both individuals and management systems. Accident rates are a major concern. Information and management overload are becoming commonplace. Planning horizons are shrinking. Contract reform, outsourcing, activity-based management, and process management are some of the initiatives that cascade down on contractors with increasing frequency.
The more common use of SGML--to produce and deliver technical documents--would do little to improve the overall performance of organizations at Hanford. The document production process is important, but it is not the root cause of this organizational stress. Instead, the effectiveness and flexibility of Hanford's performance management systems are a greater concern. The processes and supporting information systems that direct, control, and monitor work have had difficulty adapting to the pace of change. Many of the existing processes and technologies are not well suited to the new demands. This is where true strategic value can be found.
In his 1990 book, The Business Value of Computers, Paul Strassman talks about the therapeutic effects of computers and their ability to improve an organization's well-being. The specific therapy should vary with the condition of the patient and the cause of the illness. Mechanical and organic approaches to information technology (IT) each have value, but both can be misapplied. The following sections describe these two views and explore how SGML can be used to integrate and balance what appears to be conflicting approaches.
The engineering metaphor is based on the concepts of mechanical efficiency, minimization of transformation costs, and an emphasis on resource optimization. Value is measured primarily in terms of the inputs consumed: materiel, labor, dollars, time. Stated simply, information life cycles tend to be more expensive than they need to be because the desire for optimization in one arena has often resulted in overall suboptimization.
Suboptimization occurs because computer applications almost always target a subset of the information life cycle and have a narrow organizational focus. In addition, most commercial applications retain, at their core, a paper paradigm, where the computer functions like a big, powerful pen. They "draw" very fast, can handle very complex images, and allow printing instructions to be revised, but they do little to manage the information content. In many organizations, the resulting emphasis on personal productivity threatens overall organizational performance.
Efficiency is also hampered by barriers to exchange. Reliance on proprietary technologies (hardware and software) and encoding schemes creates pockets of information that cannot be easily integrated. Duplicate systems also increase development and maintenance costs and raise serious data integrity concerns. The inability to reuse information increases costs as information itself is recreated and reconciled or must undergo expensive, labor-intensive conversions.
The use of formalized structures reflects an engineering approach, and can increase the value of information in a number of ways. First, formal structure can provide a conceptual framework for focusing attention on content. By contrast, repeated refinements of the printed appearance rarely improve the value of the information product. Second, reliability and reusability can be increased. Formalized structures can function as interface specifications between dissimilar systems. This improves communication and is critical to collaborative computing and many other automation approaches (e.g., electronic mail). Taken to its extreme, a formalized structure can act as a common denominator throughout the information life cycle, allowing the same information objects to be used for creation, intermediate processing and archival. Third, well-defined structures can improve retrievability by organizing information and the relationships between information components.
Rules-based formatting reduces cost. Life-cycle labor costs can be dramatically reduced as authoring and editing for appearance disappear. In most cases, the labor savings more than offsets the costs of developing formatting rule sets, and can even produce a net savings during information creation. Consistency and quality are often improved, as are cycle times. Rules-based formatting dramatically improves flexibility and accounts for much of the reusability and repackaging described above. By reducing delivery costs, rules-based formatting of SGML data can also make it cost-effective to support a wider array of delivery vehicles and customize presentation for specific customers.
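The division of labor behind rules-based formatting can be illustrated with a small, hypothetical fragment. The instance below carries only content and structure; every aspect of appearance would be supplied later by a separate formatting rule set (a FOSI or DSSSL-style specification). All element names here are invented for illustration:

```sgml
<!-- Structure-only markup: no fonts, margins, or page breaks appear -->
<!-- in the data itself; rule sets supply all presentation.          -->
<report>
  <title>Tank Farm Surveillance Summary</title>
  <section>
    <heading>Scope</heading>
    <para>This report summarizes surveillance activities for FY94.</para>
  </section>
</report>
```

Because the instance never changes, the same data can be driven through different rule sets to produce paper, CD-ROM, or online renditions, which is the source of the repackaging flexibility described above.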
While the SGML community has traditionally focused on how to improve mechanical efficiency during the production of documents, many of the same issues exist for other, non-document forms of information. Some local examples are described below. They include graphics applications (technical illustrations, spatial modeling, CAD), chemical analysis data packages, a data dictionary, electronic forms, and online phone directories.
Graphics software has historically taken an appearance-centric approach, designed to produce paper deliverables. This has resulted in an expanding variety of encoding schemes, which, in turn, limit reusability and make text and graphics integration difficult. The fixation on visual appearance is also a barrier to integrating the information content. The need to better manage the information content (as opposed to the printed appearance) of images is leading to new approaches. Some companies, for example, are exploring the development of SGML-based graphics encoding standards. SGML encoding would make it easier to attach metadata and "intelligence" to drawings, thereby increasing the value of the information objects throughout more of the information life cycle and facilitating logical links to related information objects.
Spatial modeling represents a shift away from "drawing maps" to a focus on managing a more complex set of information that can be used to drive any number of specific visual representations, including virtual reality interfaces. The need to make spatial modeling data more readily available is driving discussions about possible SGML and HTML (Hypertext Markup Language) encoding of the data to support online delivery. In addition, there is interest in linking or integrating spatial data with traditional document applications to enhance navigation and retrieval.
In the CAD arena, a company in Richland, Washington, has developed software to store engineering data in the form of components and relationships, not lines and symbols. Translators take an appropriate subset of the engineering data and generate drawings (schematics, piping diagrams) "on the fly." By integrating the data for individual components (such as a valve) instead of spreading it over multiple drawings, configuration control is simplified and reliability is improved. This processing model is very SGML-like and the company is starting to look into ways of supporting or incorporating SGML and HyTime capabilities.
A group within WHC is exploring the development of SGML standards for chemical analysis data packages. The group procures analytical services and must devote substantial labor to validating that all of the requested information is present, before actually checking the correctness of the content. One engineer estimates that the automated validation of SGML-based data packages could cut labor costs by 90%, in addition to improving the usability of the information by downstream processes.
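The kind of completeness check the group has in mind can be sketched as a DTD fragment; every element name below is invented for illustration. A validating SGML parser would refuse any data package that omits a required section, which is precisely the labor-intensive presence check now performed manually:

```sgml
<!-- Hypothetical DTD fragment for a chemical analysis data package. -->
<!-- The content model makes each section mandatory, so a validating -->
<!-- parser automates the completeness check before human review.    -->
<!ELEMENT datapkg  - - (cover, chain, method+, result+, qc)>
<!ELEMENT cover    - - (#PCDATA)>  <!-- cover sheet                -->
<!ELEMENT chain    - - (#PCDATA)>  <!-- chain-of-custody record    -->
<!ELEMENT method   - - (#PCDATA)>  <!-- analytical method used     -->
<!ELEMENT result   - - (#PCDATA)>  <!-- one analytical result      -->
<!ELEMENT qc       - - (#PCDATA)>  <!-- quality-control data       -->
```

Parsing would catch a missing quality-control section immediately, leaving human effort for the harder question of whether the reported values are correct.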
In FY94, some work was done to develop SGML DTDs (Document Type Definitions) for a Hanford Site data dictionary. A significant part of the justification was mechanical efficiency. The existing data dictionary resided on a mainframe. Vendor- and system-neutral encoding structures for data dictionary information were expected to improve cost effectiveness, in large part through the use of PC-based software tools. It was also envisioned that SGML-based standards for software development documents could be developed to collect and feed information into the Site data dictionary at much lower incremental cost.
BCSR has also started studying the feasibility of encoding electronic forms in SGML. In the past, considerable labor has been required to recreate many of the forms when the commercial software was upgraded. If feasibility can be demonstrated, it is expected that SGML encoding will reduce redevelopment costs and simplify integration with databases and workflow management systems.
BCSR is also piloting an SGML-based phone directory. The current system uses custom software to display and navigate an ASCII text file containing over 20,000 records (nearly 3 MB). The use of a commercial browser should provide additional functionality: better searching (string matching against full records and within specific fields), integration of photos and office location maps, and navigation of the organizational hierarchy via HyTime links.
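A directory record under such a scheme might look like the following sketch; the element names and values are invented:

```sgml
<!-- Hypothetical directory entry; field names are illustrative only -->
<entry id="e04217">
  <name>Doe, Jane</name>
  <phone>509-555-0142</phone>
  <org>BCSR Information Services</org>
  <msa>B3-15</msa>  <!-- mail stop / office location -->
</entry>
```

Because each field is explicitly tagged, a browser can limit a search to a single field (for example, only org) instead of string-matching across the whole flat record.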
Many of the same life-cycle information management issues exist for the data required to manage organizational performance. The information that constitutes performance management systems is routinely spread across a wide variety of applications and processes. These include planning, project management, budgetary, financial, billing, quality, scheduling, calendaring, and tracking systems. With the increasing interest in business process reengineering (BPR), commercial software systems are starting to appear to collect and store BPR data sets. In some cases, these are even being linked to computer-aided software engineering (CASE) tools.
The individual software systems are often implemented in redundant, suboptimized ways, with unsynchronized parallel information streams and an over-reliance on paper-based information exchange. Many of the commercial software products rely on fairly closed architectures and proprietary database and encoding technologies. So even when they attempt to integrate a number of these processes, they risk creating islands of information that aren't easily integrated with the organization's business processes.
The value of integrating performance management processes and data sets is described by the GAO. In addition to improving overall organizational health, the GAO has determined that such integration also improves information management processes:
One of the more accepted approaches for integrating these performance management systems is to create a single set of integrated process and data models that describe the organization's behaviors and the associated data flows. SGML encoding of these information models can add value in a number of ways. SGML DTDs can be used to define interface specifications and facilitate the exchange of information between dissimilar systems. SGML encoding can also simplify the formatting of multiple output streams, especially for online and electronic delivery. Through the use of HyTime and other addressing mechanisms, data sets may not even need to be translated into SGML to be integrated with SGML data streams.
Redesigning fragmented performance management systems is an important strategy for improving organizational effectiveness and resiliency. The same characteristics that enable SGML to integrate document production processes can be applied to other information life cycles that have more impact on organizational health. Improving the mechanics of information creation and management may not, however, be sufficient to significantly change organizational behavior. The following section describes other information management approaches that attempt to address some of these cultural and behavioral issues.
While mechanical efficiency is important, overly rigid information management systems can hamper organizational flexibility. As organizations evolve from "lumbering bureaucracies" to "fluid, interdependent groups of problem solvers" [Barrett], biological models are being applied to organizations and information systems. SGML is well suited to providing the additional flexibility required for organic approaches.
While the engineering of information management life cycles can provide tremendous value, a narrow focus on mechanical efficiency can be limiting when trying to nurture organizational change. The goal of reducing an organization and the individuals that make it up to a finite set of entities and relationships is idealized, at best.
Even more fundamentally, this quest for mechanical efficiency in human systems is based on idealized views of what science and engineering are all about. As Thomas S. Kuhn and Henry Petroski write--in The Structure of Scientific Revolutions and To Engineer is Human, respectively--change in both disciplines is anything but straightforward. The evolution of scientific thought is not a smooth progression of new ideas, but is paced by competition, territoriality, perceptions, and paradigms. Fundamental change in engineering results from learning from mistakes, some of them with disastrous consequences.
This idealized view of engineering has also corrupted the concept of reengineering. The definition of reengineering quoted above is not consistent with the process described in Reengineering the Corporation, which might explain why 70% of all reengineering efforts fail. As originally described by Hammer and Champy, reengineering is a very organic process, relying on inductive thinking and the breaking of rules to "create new ways of working." The institutionalization of inductive thinking, in turn, is an ongoing effort of study and learning, aimed at identifying promising technologies and mapping out their possible applications long before they reach commercial viability.
Reengineering, like most organic processes, is not designed for optimization and predictability. "A passion for learning is a passion for screwing up." [Peters, page 37] "To reengineer a company is to take a journey from the familiar into the unknown." [Hammer and Champy, page 101].
Organic systems are not engineered, they evolve. Accordingly, biological metaphors are a far more useful way to describe the challenges facing organizations today than mechanical metaphors. "...our strongest terms of change are rooted in the organic: grow, develop, evolve, mutate, learn, metamorphose, adapt. Nature is the realm of ordered change." [Kelly, page 353].
Instead of being monolithic, organic systems consist of multiple small components, with decision making distributed among them. While this appears chaotic, it is the only way that large systems can adapt to changing circumstances. Decentralized experimentation is necessary to incubate new approaches. The more adaptive a system is, the less optimized it will be; that loss of optimization is a real cost, but complexity is created not by efficient design but by combining essentially autonomous modules.
In Out of Control - The Rise of Neo-Biological Civilization, Kevin Kelly describes the impact of organic approaches on science, technology, and organizational theory. He summarizes his findings in what he calls "The Nine Laws of God:" [Kelly, page 648]
This increasing understanding of organic systems has direct implications on our attempts to develop IT strategies to improve organizational performance. Information architectures have traditionally attempted to maximize the mechanical efficiency of computer systems and databases, structuring information sharing and minimizing redundant data. But such formalizations are difficult to align with the dynamics of most organizations and the time frames in which change must occur.
Thomas H. Davenport writes in the Harvard Business Review that "...a human-centered approach assumes information is complex, ever-expanding, and impossible to control completely. The natural world is a more apt metaphor for the information age than architecture. From this holistic perspective, all information doesn't have to be common; some disorder and even redundancy may be desirable." [Davenport, page 122]
To be effective, information architectures need to reflect the messy ways that people really produce, deliver, and use information to effect change. "...managers get two-thirds of their information from face-to-face or telephone conversations; they acquire the remaining third from documents, most of which come from outside the organization and aren't on the computer system." [Davenport, page 121] People are a critical delivery vehicle because their interpretations increase the value of information and add context.
Perhaps the quintessential example of organic information management systems is the Internet. The statistics are immense, the chaos is unimaginable, and the attention is becoming frenzied. No one owns it. No one controls it. But as TCP/IP and the World Wide Web demonstrate, it is becoming an increasingly important laboratory for international computing standards.
SGML is well suited to supporting the "human-centered" approach to information management that Davenport advocates. SGML is a user-driven standard and the industry has evolved with a strong emphasis on usability and interchange. SGML was designed to support multiple representations, allowing interfaces to be customized for different sets of users. Together with the HyTime standard, SGML can dramatically expand the types of information that are organized and referenced, well beyond what is normally "computerized."
Perhaps even more important, however, are the conventions that have evolved for the modeling of information (document analysis) and their formalization as DTDs. Davenport places particular emphasis on having individuals design their own information environments, because that participation directly influences the willingness to use the resulting conventions.
The need for participation brings with it a whole new set of issues. First, non-technologists must be kept engaged in the modeling process. After its Fall 1993 conference and exposition, Seybold Publications reported seeing SGML used for a number of "database" applications. Compared with relational database technology, SGML makes it easier to construct and interact with complex hierarchical information models, such as those found in dictionary entries.
Second, Davenport observed that "The more a company knows and cares about its core business area, the less likely employees will be to agree on a common definition of it." [Davenport, page 122] DTD development is often described as a contact sport because of the conflict that results when a varied group of information producers and consumers attempt to develop common definitions. The increasing number of SGML applications attests to the flexibility of the standard and the ability of facilitation processes to successfully negotiate these definitions.
In the context of internal information architectures, however, such uniformity may be counterproductive. "All information doesn't have to be common; an element of flexibility and disorder is desirable." [Davenport, page 122] In addition, it is important to "assume transience of solutions," "assume multiple meanings of terms," and "build point-specific structures." [Davenport, page 123]
To support these additional requirements, "organic" DTDs are likely to differ from interchange DTDs in a number of ways:
Organic DTDs are built with the expectation that they will change. Change will become necessary as the organization learns and adapts to new environments.
Organic DTDs may never use standardized generic identifiers. Different workgroups may need to retain their own vocabularies. The architectural forms construct allows these workgroup-specific vocabularies to be mapped back to corporate and industry standards, while retaining local flexibility.
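This mapping can be sketched as follows. A workgroup keeps its own generic identifier (here the invented "wrkpkg") while a fixed attribute declares its conformance to an equally hypothetical corporate-standard form ("activity"); this is a deliberately simplified rendering of the full architectural forms machinery:

```sgml
<!-- Local vocabulary: the workgroup's own element and content model -->
<!ELEMENT wrkpkg  - - (title, owner, miledate+)>

<!-- The fixed attribute maps every wrkpkg to the corporate          -->
<!-- "activity" form, so corporate-level tools can process the data  -->
<!-- without knowing the local element name.                         -->
<!ATTLIST wrkpkg
          corpform  NAME  #FIXED  "activity">
```

Authors never see or type the mapping; it rides along in the DTD, letting local vocabulary and corporate standards coexist.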
Organic DTDs support a wide variety of compatible structures. Just as receptor sites accept a variety of compatible chemicals, organic DTDs contain a mixture of standard and non-standard structures. The standard structures are designed to "bond" with standard systems and non-standard structures are tuned to exploit localized systems. Such an approach is critical to "maximize the fringes" [Kelly, page 470] and promote experimentation.
Organic DTDs allow a wide variety of metadata models to be developed. As the quality of the metadata largely determines the usefulness of the information, information objects effectively "compete" for attention based on the quality of their metadata. Metadata requirements are expected to change through time, and this evolutionary process, too, reflects organizational learning. High-quality metadata models support the shift from information management to knowledge management.
Organic DTDs rely on "self-organization." Instead of organizing information through the centralized design of complex structures, the decentralized interaction of information providers, consumers, and software systems create and maintain order. Hypermedia links are a primary mechanism for organizing and integrating large bodies of information, allowing DTDs to become smaller, cheaper and more quickly developed. Comparatively simple chunks of information are used as shells to hang HyTime links. Like metadata models, sets of links (webs) will compete for resources based on their utility.
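A minimal illustration of this linking style uses the HyTime contextual link (clink) form; the element, attribute default, and ID names below are invented:

```sgml
<!-- Declare a local element as a HyTime contextual link (clink). -->
<!-- The links are data, living apart from the chunks they organize. -->
<!ELEMENT xref  - O EMPTY>
<!ATTLIST xref
          HyTime  NAME   #FIXED  "clink"
          linkend IDREF  #REQUIRED>

<!-- Usage: a simple chunk of content serves as the anchor shell -->
<para id="p-baseline">The FY95 baseline differs from the FY94 plan
as explained in <xref linkend="p-rationale">.</para>
```

Because the links themselves are just marked-up data, alternative webs over the same chunks can be built, compared, and discarded without redesigning the DTD.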
How do organic information management approaches relate to performance management systems? Most organizations are in the midst of a massive learning process. Structuring and integrating the information from related systems is a powerful tool for both individual and organizational learning. Within Federal agencies and their contractors, the Government Performance and Results Act (GPRA) and its mandate of activity-based management (ABM) are driving new ways of thinking about financial systems. In addition, competitive pressures are changing accepted definitions of organizational performance and measurement approaches.
These learning curves are reflected at the workgroup level. Individual groups have different levels of experience and understanding of emerging performance measurement approaches. In addition, different organizations have been collecting performance metrics of varying quality. Many organizations have ongoing process definition or process change activities. Also, a wide variety of tools are in use that, again, vary at the workgroup level.
Taken together, it is unlikely that a single, monolithic performance management "system" will meet the needs of all organizations at the Hanford Site. It is also unlikely, given the Site culture of disparate software applications, that a single system would be culturally acceptable. "Some managers have always been distrustful of the information systems approaches of their companies, largely because they didn't understand them. In many cases, they were right to feel uneasy. As the diverse company experiences suggest, grand IT schemes that don't match what rank-and-file users want simply won't work." [Davenport, page 131]
Accordingly, a decentralized, evolutionary approach may be required. Instead of designing and building a monolithic Site-wide system, SGML-based performance management could be built around individual information components, which would be modeled one dataset at a time. Initially, planning datasets are being explored. A strategic planning facilitation and information model is being analyzed and the conversion of Site Support Program Planning (SSPP) sheets to SGML is being discussed.
The SSPP sheets are used to create the baseline fiscal year work plans. There is increasing demand to publish the work plans electronically. Ideally, the data would be updated periodically to document the budgetary "draw down." But more importantly, BCSR is still learning how it wants this process to work. Information and processing requirements are relatively volatile, as are the potential relationships between SSPP sheets and other performance measurement datasets. By converting the SSPP sheets to SGML, it is expected that they could be accessed using a commercial SGML viewer. In addition, HyTime-compliant links and annotations would allow reconciliation and other analyses to be documented on an ad hoc basis, while preserving the integrity of the original database. This approach would increase the short-term value of the information and help facilitate the requirements definition process at relatively low incremental expense.
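To make the discussion concrete, a converted SSPP sheet might be modeled along the following lines; every element name and value here is invented, and the actual model would emerge from the analysis described above:

```sgml
<!-- Hypothetical SSPP sheet instance; names and values illustrative -->
<sspp id="sspp-0042" fy="1995">
  <activity>Records holding area consolidation</activity>
  <budget unit="k$">350</budget>
  <milestone due="19950331">Complete inventory baseline</milestone>
  <drawdown period="Q1" unit="k$">85</drawdown>
</sspp>
```

Reconciliations and annotations would live in separate documents that point into sheets like this one through HyTime links, preserving the integrity of the original data.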
The SSPP application illustrates some of the key concepts behind an organic approach to developing performance management systems. Investment decisions are retained at the workgroup level. Subject matter experts contribute to analysis and modeling, but the evolution of the application is anticipated and even encouraged. SGML encoding helps remove barriers to information exchange. HyTime links capture relationships and reduce the need for up-front analysis. Successes and failures are "fed back" into the design process to refine the data models and, in time, drive local customizations. Formalized conceptual models speed the communication and learning of new approaches, but the evolutionary nature of the design reduces the risk of hindering future adaptation.
In addition, information models are being identified to collect and manage the information required for ABM. These include customer profiles, product and services directories, cost-driver descriptions, and performance metrics. The ABM information models will also need to incorporate a fairly rich set of process data, as the origin of the data is an important element of the planning process. Organizations that are collecting detailed operational data have noticeably different attitudes about reengineering and process improvements than those groups that rely on facilitated sessions to develop estimates.
The learning process is a core adaptation mechanism. Information management architectures that do not allow, and even encourage, change hamper this learning process. SGML is flexible enough to allow localized variations, and its strong emphasis on usability and interchange makes it an effective mechanism for focusing attention on an organization's core information management issues. The last section describes SGML's ability to enable a holistic approach to information management, balancing engineering and organic models.
The critical challenge of any information management system is to balance the benefits of mechanical efficiency with the need to facilitate the learning and flexibility necessary for timely adaptation. In most organizations today, both mechanical and organic approaches exist, but they are not balanced.
Formalized entity-relationship models, structured databases, and standardized interfaces are in widespread use. Likewise, localized adaptation and redundancy are the norm. By some estimates, over 200 different commitment tracking systems are in use at Hanford. These numerous variations do not add value; they are carefully engineered suboptimizations that become barriers to information exchange and forgo the economies of scale a shared solution could achieve. At the same time, most existing applications lack the flexibility necessary to support expected rates of organizational change. The resulting stress literally makes organizations sick as they lose their ability to adapt and compete.
SGML can add functionality to an engineered information management architecture by helping to integrate and augment a wide variety of information technologies and by protecting information from technology changes. In addition, the formal validation process allows individual sets of information to be tested for conformance. The standard is flexible enough to allow individual organizations to determine how stringent or forgiving the conformance testing should be. Rules-based formatting maximizes flexibility during information delivery, while limiting redundancy and duplication.
SGML also allows, and perhaps encourages, local variations, as the DTD can be customized within an individual instance. This has the potential to be quite chaotic, much in keeping with the organic models of decentralized autonomy. Together with the HyTime standard, however, SGML provides mechanisms to embrace that chaos: because the variations must be explicitly declared, receiving systems can more easily identify them and determine whether an individual instance contains enough of the required elements to be usable.
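One common mechanism for this kind of controlled local variation is a parameter entity "hook" in the shared DTD, which an individual instance can override in its internal declaration subset (SGML reads the internal subset before the external DTD, so the instance's declaration takes precedence). The names below are again hypothetical, continuing the commitment-tracking sketch:

```sgml
<!-- In the shared DTD: an empty hook reserved for local extensions -->
<!ENTITY % local.status "">
<!ELEMENT status   - - (approved | pending %local.status;)>
<!ELEMENT approved - - (#PCDATA)>
<!ELEMENT pending  - - (#PCDATA)>
```

```sgml
<!-- In an individual instance: the internal subset widens the model -->
<!DOCTYPE commitment SYSTEM "commit.dtd" [
  <!ENTITY % local.status "| deferred">
  <!ELEMENT deferred - - (#PCDATA)>
]>
```

Because the extension is declared rather than improvised, a receiving system can see exactly how this instance diverges from the shared model and decide whether it still contains enough of the required elements to be usable.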
As Lao Tzu wrote in the Tao Teh Jing in 600 BC:
The ability to both standardize and customize is likely to become ever more important as workgroups and organizations come under mounting pressure to adapt quickly to changing circumstances and face divergent information management strategies. By supporting both engineering and organic approaches, SGML provides a framework for developing shared solutions that balance these competing drivers and help ensure organizational health in the face of uncertainty. For organizations that are overstressed, SGML allows workgroups to emphasize either the engineering or the organic approach to realize immediate benefits, without precluding evolution toward a more balanced approach in the future.
A focus on non-document applications, especially those related to performance management, is expected to provide greater opportunities for strategic value. This, in turn, should help justify a broad-based implementation strategy that balances the DOE directive to use SGML for the production and management of scientific and technical information with local requirements for cost effectiveness and ease of use.
Barrett, Tom, Cyberspace: First Steps, quoted by Tom Peters in "A Paean to Self-Organization," Forbes ASAP, October 10, 1994, page 156.
Brynjolfsson, Erik, "Technology's True Payoff," InformationWeek, October 10, 1994, page 34.
Davenport, Thomas H., "Saving IT's Soul: Human Centered Information Management," Harvard Business Review, March-April, 1994, page 119.
Davis and Botkin (Stan and Jim), "The Coming of Knowledge-Based Business," Harvard Business Review, September-October, 1994, page 165.
DOE-RL (U.S. Department of Energy's Richland Operations Office), Hanford Reach - Special Issue: Changing Hanford, October 3, 1994.
GAO, Executive Guide: Improving Mission Performance Through Strategic Information Management and Technology - Learning From Leading Organizations, GAO/AIMD-94-115, May, 1994.
GPRA, Government Performance and Results Act of 1993, Public Law 103-62.
Hammer and Champy (Michael and James), Reengineering the Corporation, HarperCollins Publishers, Inc., 1993.
Kelly, Kevin, Out of Control - The Rise of Neo-Biological Civilization, Addison Wesley Publishing Company, 1994.
Kuhn, Thomas S., The Structure of Scientific Revolutions, The University of Chicago Press, Chicago, 1970.
Mintzberg, Henry, "The Fall and Rise of Strategic Planning," Harvard Business Review, January-February, 1994.
Peters, Tom, "If it Sounds Crazy, Try It," InformationWeek, September 5, 1994, page 32.
Petroski, Henry, To Engineer is Human - The Role of Failure in Successful Design, Vintage Books, 1992.
Selye, Hans, M.D., The Stress of Life, McGraw-Hill Book Co., New York, 1976.
Strassman, Paul, The Business Value of Computers, The Information Economics Press, New Canaan, CT, 1990.
Wheatley, Meg, Leadership and the New Science, quoted by Tom Peters in "A Paean to Self-Organization," Forbes ASAP, October 10, 1994, page 154.
"Where SGML and Databases Meet," Seybold Special Report, Volume 2, Number 3, November 29, 1993, page 20.
Zachman, John, "A Framework for Information Systems Architecture," IBM Systems Journal, Vol. 26, No. 3, 1987.
Copyright, The Sagebrush Group, 1995
This article is a revision of work previously prepared by the author while employed by Boeing Computer Services, Richland. It was published November, 1994, in The Proceedings of SGML '94 and was presented at SGML '94, Vienna, Virginia, November 6-11, 1994.
It was prepared for the U.S. Department of Energy Office of Environmental Restoration and Waste Management and Westinghouse Hanford Company, the Hanford Operations and Engineering Contractor for the U.S. Department of Energy under Contract DE-AC06-87RL10930. It was assigned document number WHC-SA-2717-FP and approved for public release.