Federal CIO Council XML Working Group
Meeting Minutes
January 17, 2001
American Institute of Architects
Board Room
Co-chairs Owen Ambur and Marion Royal convened the meeting at 9:30 a.m. at the American Institute of Architects. Attendees introduced themselves. The co-chairs asked for comments on or corrections to the November 15 meeting minutes; none were offered, and the minutes were approved.
Announcements
Presentation: "XML Messaging: Universal Data and Universal Data Interchange vs. Linear Development"
Mr. Richard Campbell of the FDIC briefed the WG on the agency's "XML Message Bus" initiative and related institutional issues.
The XML Message Bus initiative was inspired by similar work ongoing in private industry.
Mr. Campbell pointed out that while messaging has been around for quite some time, XML messaging could bring new improvements. Because XML messaging is nonproprietary, applications that use different formats can use it to speak to one another. This connects many different kinds of formerly separate applications, creating a convergence similar to Ethernet or TCP/IP, both standards used so widely that people take them for granted.
There are two sides to XML: data and the interchange of messages. The W3C has just introduced XP (XML Protocol), a standard envelope for XML messages. XML messaging is currently hindered by the lack of mechanisms for universal data interchange with XML: many organizations lack basic XML infrastructure, and there are few messaging standards.
Current message interchanges involve transformations, data mapping, and metadata look-ups, all of which create a very linear process. This linear process makes it very hard for messages to cross applications. For instance, update messages must pass through a series of applications one at a time. In an XML environment, all applications can read the update message at the same time. XML does not require such a linear logic, as all applications could have equal access to messages.
Additionally, many messages are currently sent in EDI format. However, EDI requires the capability to read the X12 standards. Hard-coding these standards into a database that can interact with the system in which messages will be shared can cost a great deal of money. XML can eliminate much of this expense.
XML Message Bus provides a multipurpose engine for interchanging data, information, and processes that resides within a system as middleware. It de-couples applications from the interchange process, thereby facilitating cooperation. All applications in a system have equal access to the messages residing in the Message Bus, and they can subscribe to all messages and retrieve from the messages the pieces of information they need.
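A minimal sketch of the publish/subscribe arrangement described above might look like the following. This is written in Python with hypothetical message types and handlers, not the FDIC's actual design; it only illustrates how subscribing applications can all read the same XML message rather than processing it in a linear chain.

```python
# Minimal publish/subscribe sketch of a message bus (illustrative only).
# Applications register interest in message types; the bus delivers each
# XML message to every subscriber at once, so no application depends on
# another having processed the message first.
from collections import defaultdict
import xml.etree.ElementTree as ET


class MessageBus:
    def __init__(self):
        self._subscribers = defaultdict(list)  # message type -> handlers

    def subscribe(self, message_type, handler):
        """Register a handler for a specific kind of message."""
        self._subscribers[message_type].append(handler)

    def publish(self, xml_message):
        """Deliver an XML message to every subscriber of its type."""
        root = ET.fromstring(xml_message)
        for handler in self._subscribers[root.tag]:
            handler(root)  # each application reads the same message


# Hypothetical applications subscribing to a customer-update message.
bus = MessageBus()
bus.subscribe("CustomerUpdate", lambda m: print("Billing saw", m.findtext("Name")))
bus.subscribe("CustomerUpdate", lambda m: print("Reporting saw", m.findtext("Name")))

bus.publish("<CustomerUpdate><Name>First Example Bank</Name></CustomerUpdate>")
```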
Mr. Christian asked what kinds of functions the FDIC planned to use the XML Message Bus for (e.g., e-mail, broadcasting, etc.).
Mr. Campbell replied that XML Message Bus could be used for many functions, as it can interface with any application.
Mr. Christian asked if any application existing on an organization’s Ethernet could sniff any message that passes through the Message Bus.
Mr. Campbell replied that it could do so, but rather than directing an application to examine any incoming message, the user would program it to subscribe to separate, specific kinds of messages.
Mr. Christian pointed out that this is a messaging format that is independent of any specific type of interaction.
Mr. Campbell agreed, saying that this format is independent of state. The goal of the Message Bus is to let any application talk to any other application, or let any business partner talk to any other business partner. Additionally, the FDIC hopes that Message Bus will facilitate nonlinear development, allow for simultaneous data updates, permit application replacement (taking legacy applications off-line for good), leverage efficiencies of components, and lower maintenance costs (which the FDIC is particularly concerned with).
XML Message Bus is nonproprietary; the Message Bus is open and all messages are in open XML. The Bus is the common interface with which all applications are connected and can receive and send messages.
Robert Benedict (NASA) asked if XML Message Bus is available in any COTS/GOTS packages.
Mr. Campbell replied that there are a variety of packages available from companies such as Vitria Technology. Those packages that are available probably have to be customized for each system, as each company must write its own interface with the Message Bus. The envelope for each message is in the W3C’s XML Protocol (XP).
Mr. Christian pointed out that the XML Protocol Working Group has not officially started working yet.
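As an illustration only, since the XML Protocol specification did not yet exist in final form, a generic envelope of the kind described might be built along the following lines. The element names are hypothetical rather than drawn from any published standard.

```python
# Illustrative only: wrap a payload in a generic envelope of the kind the
# XML Protocol effort was expected to standardize. Element names here are
# hypothetical, not taken from any published specification.
import xml.etree.ElementTree as ET


def wrap_in_envelope(payload_xml, sender, recipient):
    envelope = ET.Element("Envelope")
    header = ET.SubElement(envelope, "Header")
    ET.SubElement(header, "From").text = sender
    ET.SubElement(header, "To").text = recipient
    body = ET.SubElement(envelope, "Body")
    body.append(ET.fromstring(payload_xml))
    return ET.tostring(envelope, encoding="unicode")


print(wrap_in_envelope("<Invoice><Amount>125.00</Amount></Invoice>",
                       "accounts-payable", "general-ledger"))
```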
Mr. Campbell then discussed the FDIC’s next steps in the XML Message Bus initiative. They must get people to change from a linear application development mindset to a non-linear one. The FDIC will operate XML Message Bus in a non-linear format. The FDIC must also develop a standard data format for exchanging information, to avoid having multiple names for the same data element. The FDIC will develop a pilot to address many of these remaining issues.
With regard to its electronic invoice initiative, the FDIC is looking for vendors capable of generating XML messages. It also has been hoping to use electronic invoices in the government travel system. However, the FDIC deals with small companies that are hard to convince to use XML (e.g., law firms). The primary advantage of converting these electronic invoices to XML would be faster payment of bills.
When asked if the FDIC intends to promote the use of Message Bus throughout the federal government, Mr. Campbell replied that currently the FDIC is focusing on internal use of the Message Bus. However, it would like the other regulatory agencies with which it exchanges data to eventually adopt the XML Message Bus. It must first address issues of security, though.
Mr. Christian asked if the purpose of having open messages is defeated if the messages must be encrypted.
Mr. Campbell replied that that was true, but that the Message Bus was not designed for rapid queries.
Marion Royal (GSA) asked if there are any online references to this initiative that might provide more information.
Mr. Campbell replied that there currently are none.
Mr. Royal observed that it sounded as if users would subscribe to the Message Bus.
Mr. Campbell responded that there would be common schema at work within the Message Bus to which organizations would have to subscribe.
Mr. Royal asked if the Message Bus would be the schema.
Mr. Campbell replied that the FDIC is looking to involve in their work a company that is currently working on XML messaging to assist in development of the Message Bus.
Mr. Ambur asked if the FDIC has put out a request for information.
Mr. Campbell replied that it has and has received about 10 responses so far.
Don Egan (LMI) observed that in old messaging processes, messages would pass through applications sequentially, and each application would add value to the messages traveling through it. He asked whether applications connected to the Message Bus will add value to messages that can then be picked up by other applications.
Mr. Campbell responded that that is an issue the middleware will have to address.
Theresa Yee (LMI) pointed out that this brings the process back to the problem of functioning in a linear format.
Mr. Campbell disagreed. He stated that the Message Bus will break message processing down into chunks and each application will work with different chunks of the message. The processing logic will reside within the middleware. This substitutes the Bus architecture for a linear order. The goal of Message Bus is to break the middleware out so it will be available to more than one application.
Mr. Egan observed that this would put all of the FDIC’s internal exchanges into a standard format.
Susan Turnbull (GSA) asked if open architecture was deemed a high priority in the eBanking initiative.
Mr. Campbell responded that the eBanking initiative deals with external exchanges, and Message Bus deals with internal ones.
Ms. Turnbull asked if the two initiatives were at all related. Does the work on eBanking affect the work on internal messaging?
Mr. Campbell replied that he did not foresee competitive banks approving the idea of moving data exchange into a standard format. One of the major obstacles to bank mergers is the difficulty of merging the different data formats used by different banks. If all banks used the same schema, it would be difficult to keep gigantic mergers from occurring. The best the FDIC could do is recommend, not enforce, a particular format.
For more information on XML Message Bus, please contact Richard Campbell at rcampbell@fdic.gov or 703-516-1135.
This presentation is available at the XML.gov website: http://xml.gov/xmlciowg/index.htm
Presentation: "XML for Information and Application Integration."
Brand Niemann (EPA) and Bill Donellan (NextPage, Inc.) presented the use of XML on FedStats.gov (http://www.fedstats.gov/) and FedStats.net (http://www.fedstats.net/index.htm).
Mr. Niemann briefed the group on the background of FedStats.gov and FedStats.net. OMB created and chairs the Interagency Council on Statistical Policy (ICSP). The ICSP created the FedStats Task Force to create a one-stop web portal for federal statistics (FedStats.gov). The ICSP then partnered with the Digital Government Consortium to pioneer the FedStats.net site to serve as a test-bed for new informational products, serve as a collaborative tool, and provide integrated information services. FedStats.net was funded about six months ago, and the website has been up for the past two to three months. The site is serving as a test site for two new XML-based standards: the Content Network Protocol (CNP) and the eXtensible Indexing Language (XIL).
Presentations of these new standards were made at the XML2000 conference in December and are available online at http://www.conferencenetwork.com/. Conference attendees can view the presentations for free and non-attendees can view them for a fee. Mr. Niemann recommended that all WG members subscribe to this service.
The FedStats.net group considered using schemas and implementations such as ebXML and BizTalk, but decided it needed to develop its own schema to support its person-to-person network. It thus developed CNP. CNP creates XML messages between servers for integrated tables of contents and searching. It also allows search requests to be simultaneously distributed to all servers in a content network.
XIL is based on the XSLT/XPath standards and can separate search fields from table-of-contents structure, allowing many sites to be brought together into one. Additionally, it can be used to present user-unfriendly tags in language that is easy to understand.
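As a rough illustration of the kind of XSLT transformation XIL builds on, the sketch below re-presents terse tags under readable labels. The tag names and the mapping are hypothetical, the third-party lxml package is assumed to be available, and this is not the XIL syntax itself.

```python
# Illustrative use of XSLT (the standard XIL builds on) to present
# user-unfriendly tags with readable labels. Tag names are hypothetical;
# requires the third-party lxml package.
from lxml import etree

source = etree.XML("<rec><ctry>Canada</ctry><pop>30007094</pop></rec>")

stylesheet = etree.XML("""
<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:template match="/rec">
    <record>
      <Country><xsl:value-of select="ctry"/></Country>
      <Population><xsl:value-of select="pop"/></Population>
    </record>
  </xsl:template>
</xsl:stylesheet>
""")

transform = etree.XSLT(stylesheet)
print(str(transform(source)))  # same data, readable element names
```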
Mr. Niemann then performed a demonstration of FileMaker Pro, which allows users to send XML messages to the FileMaker database. The database then sends a return message in XML.
Mr. Niemann also demonstrated NXT 3, an application that allows a user to search or retrieve information in one of three ways: via a custom search form, conventional links in a homepage format, or a series of expanding hierarchical folders that appear in the left frame of the user's browser. Each folder in this hierarchical system can be hosted on a different server. The links between the table of contents and the index all use XML.
Mr. Niemann demonstrated the use of NXT 3 in retrieving data from the CIA's Country Profiles. The information files have been re-purposed into XML documents. The value added by this action is that it creates a hierarchical document that is "mixed" (contains both numeric and textual data). This allows tables to be generated and sorted by column.
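The column sorting this enables can be sketched in a few lines of Python; the element names below are illustrative stand-ins, not the actual Country Profiles markup.

```python
# Illustrative only: once narrative files are re-purposed as XML with both
# textual and numeric fields, rows can be pulled out and sorted by column.
# Element names are hypothetical, not the actual Country Profiles markup.
import xml.etree.ElementTree as ET

doc = ET.fromstring("""
<profiles>
  <country><name>Andorra</name><area>468</area></country>
  <country><name>Belize</name><area>22966</area></country>
  <country><name>Cyprus</name><area>9251</area></country>
</profiles>
""")

rows = [(c.findtext("name"), int(c.findtext("area")))
        for c in doc.findall("country")]
for name, area in sorted(rows, key=lambda row: row[1], reverse=True):
    print(f"{name:10s} {area:8d}")
```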
Mr. Niemann then demonstrated StatServer, an application that allows the browser to run statistical analyses on the website.
Mr. Donellan then briefed the group on the details of NextPage’s NXT 3. NextPage built NXT 3 because it saw a trend towards the development of content networks in the marketplace. A content network is a network that gives the end user of a system the look and feel of being directly linked to all servers within a network.
Mr. Donellan provided an overview of the NXT 3 platform. Diagrams of the NXT 3 architecture are available online in Mr. Donellan's presentation at the XML.gov website: http://xml.gov/FedStats%20for%20XMLGOV/index.htm
Mr. Christian asked if all of the data stored in NXT 3 was assumed to be read-only.
Mr. Donellan replied that the application user could limit the data flow to read-only or could allow the flow to be bi-directional. This is all based on how the security service node of the platform is set up. NXT 3 contains a "Managed Content" mechanism that allows one to update the database.
Joe Carmel (House of Representatives) asked how link management was being handled with NXT 3.
Mr. Donellan responded that NXT 3 has a link validation process that can be used, or the agency can integrate its own link management system. All of the individual components of the NXT 3 platform can be modified.
Mr. Niemann added that there is a tool that allows the user to remotely administer and monitor all of the components.
Mr. Donellan then explained the uses of XML in NXT 3. XML tags are used for data fields, to index native XML files, and to create style sheets for presentation control and per-user customization.
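A rough sketch of indexing native XML files by a tagged data field is shown below. The directory layout and field name are hypothetical, and the code is not NextPage's actual index format; it only illustrates the general idea of answering searches from an index rather than by opening each file.

```python
# Rough sketch of indexing native XML files by a tagged data field, so that
# searches can be answered from the index rather than by opening each file.
# Directory layout and field names are hypothetical.
import xml.etree.ElementTree as ET
from collections import defaultdict
from pathlib import Path


def build_field_index(content_dir, field):
    """Map each value of `field` to the XML files that contain it."""
    index = defaultdict(set)
    for path in Path(content_dir).glob("*.xml"):
        root = ET.parse(path).getroot()
        for element in root.iter(field):
            index[(element.text or "").strip()].add(path.name)
    return index


# Example (hypothetical content directory and field):
#   index = build_field_index("content", "author")
#   index["Smith"] -> {"report1.xml", "memo7.xml"}
```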
Bill LaPlant (Census) wondered how much control a contributing database or data source has over how its data is used in NXT 3.
Mr. Donellan replied that at some levels, the source has total control.
Mr. Niemann remarked that the FedStats.net group is currently dealing with this issue. As it stands now, agencies provide data to FedStats.net personnel, who display it in the format that the latter selects. The agencies then provide the FedStats.net group with feedback. The FedStats.net people hope that eventually, each agency will become responsible for its own data.
Mr. LaPlant pointed out that different agencies are subject to different laws for accessing data and may have different internal policies regarding data access and distribution.
Mr. Niemann stated that the FedStats.net collaboration exists because various agency heads wanted the data displayed on it to exist outside of individual agency firewalls.
Mr. Donellan then continued explaining the uses of XML in NXT 3. XML is used for navigating and searching both structured and unstructured data, and for integrating content and searches across multiple servers.
For more information about NXT 3, Mr. Donellan directed WG members to several NextPage webpages:
http://www.nextpage.com/products/nxt3/econtent/econtent_home.asp (For demonstrations)
http://www.nextpage.com/services/techsupport/toolbox.html
There are several NXT 3 success stories available at the NextPage website, but they deal primarily with the banking, insurance, and publishing industries.
Mr. Royal asked if NXT 3 indexes on the relationships of content data.
Mr. Donellan replied that NXT 3 uses both a textual index and a contextual index.
Bruce Hoglund (DLA) asked if this is an object database.
Mr. Donellan responded that the index is created and knows where documents are located, so that all interactions with documents occur through the index.
Mr. Hoglund then asked if the index is generated upon a user query or offline.
Mr. Donellan replied that indexes are generally generated and maintained offline. However, the user can customize his update preferences.
Ms. Turnbull asked if NextPage is in step with the XML-based eBook standard due to come out.
Mr. Donellan replied that he would assume so, if it were an XML-based standard. Most of the publishers NextPage works with are large companies that have made a sizeable investment in SGML.
Mr. Carmel asked if FedStats is working on indexing in XML.
Mr. Niemann replied that XIL deals with that.
Mr. Carmel then asked if they were integrating with any XML editors.
Mr. Donellan responded that they are not at this time. However, they are working with a content delivery tool from Interwoven.
This presentation is available at the XML.gov website: http://xml.gov/FedStats%20for%20XMLGOV/index.htm
As there were no further questions, the WG broke for 15 minutes.
Presentation: "Human Resources Digital Network (HRDN)"
After the break, Elizabeth Martin (OPM) briefed the WG on the HRDN project. This project is supported by the Human Resources Technology Council (HRTC), senior human resources personnel, and CIO officials from across the federal government. HRDN is an interagency project sponsored by OPM in its function as the co-chair of the HRTC, and the HRDN project reports to the HRTC.
Ms. Martin then provided the group with several handouts that explained the HRDN project’s concept of operations, goals, and proposed methods. The project will host an open house on January 26th.
OPM wants to consolidate and automate its current manual data feeds to create a human resources digital network. Every federal employee is required to have a personnel file, the Official Personnel File (OPF). Much of the data contained in this file appears in many different forms, all of which must be filled out individually. The HRDN project hopes to automate and compile this information electronically, creating a new file called the "Official Personnel Record," replacing the OPF.
Ms. Martin acknowledged that there are many privacy and security issues related to this initiative that the HRDN project must address.
Ms. Martin stated that the project does not want to replace individual agencies' human resources systems, merely to complement them. The HRDN will pull information from these individual systems.
The advantages of the HRDN include reducing time spent filing paper documents. For instance, when an employee transfers to another agency, he must once again fill out many of the same forms he filed with the former agency. With the HRDN, employees will only have to validate data that has already been entered into their Personnel Records. This streamlines activity for employees. Additionally, the HRDN hopes to make personnel forms available for employees to fill out online.
Ms. Martin added that the HRDN should work closely with the XML WG, as the latter hopes to develop DTDs for standard federal forms, and many of the forms in the OPF are standard forms. The HRDN does not simply want to place all personnel forms online in their current state; it wants to discover and ask for the data that is truly important.
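As a minimal sketch of what a DTD for such a standard form might look like, the following uses hypothetical element names and validates a sample form with the third-party lxml package; it is not an actual OPM or standard-form definition.

```python
# Minimal sketch: a DTD for a hypothetical personnel form and a validation
# check with the third-party lxml package. Element names are illustrative,
# not an actual OPM or standard-form definition.
from io import StringIO
from lxml import etree

FORM_DTD = StringIO("""
<!ELEMENT PersonnelAction (EmployeeName, Agency, EffectiveDate)>
<!ELEMENT EmployeeName (#PCDATA)>
<!ELEMENT Agency (#PCDATA)>
<!ELEMENT EffectiveDate (#PCDATA)>
""")

dtd = etree.DTD(FORM_DTD)
form = etree.XML(
    "<PersonnelAction>"
    "<EmployeeName>Jane Doe</EmployeeName>"
    "<Agency>OPM</Agency>"
    "<EffectiveDate>2001-01-17</EffectiveDate>"
    "</PersonnelAction>"
)
print(dtd.validate(form))  # True if the form matches the DTD
```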
The HRDN project is not physically located at OPM; it is at 1120 20th St. Many members of the project staff have been detailed from other agencies. The staff has begun to form several work groups.
Currently, one group is trying to discover what data OPM requires. To this end, the group is examining the information OPM currently receives, either on paper or through automated means. By identifying the truly important data, the group can also identify opportunities for process improvement in OPM.
However, Ms. Martin pointed out, there will be several legal barriers to be overcome if personnel forms are to be available online. Several personnel forms cannot, by law, be signed with an electronic signature. The group has resolved that if it cannot use e-sigs, it will simply re-engineer processes in a different way.
Mr. LaPlant observed that the HRDN project staff will also have to consider issues regarding accessibility. Section 508 requires that agencies ensure that people with disabilities can access federal websites as easily as people without disabilities. In fact, in four months, employees and citizens will be able to sue agencies that do not comply with Section 508.
Ms. Martin replied that the staff is aware of the accessibility issue. However, many other federal agencies will have to deal with it before the HRDN does, and so the HRDN staff can learn from the experiences of others.
Mr. LaPlant suggested that the HRDN deal with the accessibility issue from the very start of its work, for it will affect the very architecture of any personnel form website.
Mr. Royal remarked that the WG has asked LMI to standardize common federal data elements and rationalize common data. Anyone interested in discussing or working on standardizing forms should e-mail him at marion.royal@gsa.gov.
Mr. Royal then asked Ms. Martin if OPM currently uses Social Security Numbers to identify employees.
Ms. Martin responded that it does, but many agencies are moving away from using the SSN and towards using a unique identifier. In OPM’s retirement system, employee identification numbers are based on names and birth dates.
Mr. Royal asked if OPM has contemplated a government-wide employee identification system. If each agency develops its own method of identifying such things as employee e-signatures and PKIs, then there will be massive compatibility problems across the government. OPM could take the lead in providing a standard for employee identification, thereby avoiding such problems.
Ms. Martin replied that she was unaware of any such initiatives.
Mr. Royal added that OPM could also take the lead in issuing government-wide smart cards for universal access to systems and buildings throughout the government.
Mr. LaPlant observed that creating such a system while maintaining secure access to systems and buildings would require new equipment.
Ms. Martin said that she would make a note of these ideas and discuss them with the HRDN.
Ms. Martin then told the group that the HRDN project is also addressing the issue of what to do with historical data. The lifecycle of an OPF is 150 years. As a result, almost every single OPF that has ever existed is still on file or stored in OPM’s vault in Boyers, Pennsylvania. Creating an electronic personnel record will significantly reduce the amount of space required to keep these records.
Two agencies, the DoD Education Agency and the Air Force, have created electronic OPFs. The former images the files and the latter flows approximately seven documents into its system for imaging. The HRDN staff has studied both of these systems and decided that it does not want to do comparable amounts of imaging. The HRDN will reduce paper usage, but will never be able to eliminate it.
Mr. Benedict asked if the HRDN staff plans on involving ERP vendors in their work, as they are the people from whom the agencies buy their systems.
Ms. Martin replied that several vendors, including Oracle and Empower, have expressed an interest in the HRDN. Staff members are trying to get in touch with PeopleSoft's federal users network.
Mr. Ambur observed that XML could be a powerful tool for facilitating transactions and record keeping in the HRDN. He has heard that the CIO Council’s Architecture Working Group has identified human resources as an architectural segment it would like to study more closely.
For more information about the HRDN project, please visit its website at: http://www.opm.gov/hrdn
Discussion of Future Meeting Agendas
Mr. Ambur announced that the XML.gov site is up, but is currently only a shadow of what the website staff hopes it will become. All WG members should feel free to submit content for the website. At some point in the future, he would like to issue an RFI for the second generation of the site. He hopes that the site will become a demonstration as well as a discussion of XML. However, XML.gov will only be as good as the content provided by the WG members. Please take a look at it and feel free to make suggestions.
Regarding upcoming agendas: the co-chairs are trying to line up a speaker to discuss XGM. IFRI has requested an opportunity to use an OMB circular to demonstrate to the WG how they can refurbish forms.
In March, DLIS and Mary Mitchell (GSA) will brief the group on RosettaNet. Additionally, the co-chairs are trying to secure a speaker to discuss XML schema. BroadVision has also requested an appearance before the WG.
Mr. Ambur suggested that the WG extend its meeting time so that it meets from 9 a.m. until noon. Additionally, it should advertise more broadly now that it is meeting in a more spacious location.
Regarding meeting presentations, Mr. Ambur suggested that the WG pair a technological briefing with an agency-specific briefing describing how that agency plans to use XML.
Mr. Royal added that XML Solutions has offered to provide the WG with a W3C update in February and an ebXML update in March.
Ms. Turnbull suggested that the WG consider finding a speaker from the Library of Congress to discuss how it plans to implement the XML-based eBook standard.
Next Meeting: February 14.
XML Working Group Attendance List
January 17, 2001
Owen Ambur | DOI-FWS
Nathan Baldwin | DISA
Robert Benedict | NASA
Richard Campbell | FDIC
Joe Carmel | U.S. House of Representatives
Dusty Cernak | Forest Service
Eliot Christian | USGS
Jim Disbrow | DOE
Bill Donellan | NextPage, Inc.
Donald Egan | LMI
Kathy Flitter | USN
Terrie Franks | FDIC
Laura Green | LMI
Scott Hoffman | Extensibility
Bruce Hoglund | DLA (Pinkerton Computer Consultants)
Jim Hunt | GSA
Randy Kaplan | Government Reform Committee, U.S. House of Rep.
Joan Kimmel-Franz | DOI-NBC
Dolores Knight | DTIC
Bill LaPlant | Census
Ginny Loiacona | Treasury CIO
Jack Marshall | Census
Elizabeth Martin | OPM
Amie Milan | KPMG
Brand Niemann | EPA
Nat Obey | DLA
Michael Palmer | KPMG/XBRL
Donald Roberts | Mitretek Systems
Marion Royal | GSA
Bruce Troutman | 8020Data
Susan Turnbull | GSA
Cedric Vessell | DoD
Theresa Yee | LMI
Andy Yocke | DOE