Automated Forms: Putting the Customer First Through Intelligent Object-Oriented Chunking of Information and Technology


Owen Ambur, May 1997




Tragedy of the Commons | ITMRA and Information Sharing | Modular Contracting and Chunking | Object Oriented Programming | Document Objects | Structure and Formality of Information | Reverse Engineer People and Processes or Data and Databases? | Forms as Base Class People Objects | Forms Automation Features and Functionality | Forms Automation Vendors | Who Is the Client, Who is the Server? | Conclusion | References | End Notes




Tragedy of the Commons


In the traditional systems development life cycle (SDLC), implementation or delivery to the customers is near the end of the cycle.(1) Depending upon institutional inertia and system complexity, lengthy and detailed analysis may be conducted before any action is taken. Moreover, if the system is based upon database technology, the traditional model considers user applications to be dependent upon specific database technology, with application design to be undertaken late in the process, followed by delivery to the users. (Elmasri, p. 41)


In short, accepted practice tends to put technicians in control -- those who develop and administer particular database technology. Within the context of the chosen technology, which is often selected in advance of the identification of common enterprisewide business and information requirements, narrow oligarchies of users are treated as supreme but ignorant royalty, reigning over the systems which represent their own particular kingdoms.


Turban (1996, p. 369) expresses the problem as follows:


If selection and scheduling of information systems projects is based only on proposals submitted by users, the projects will reflect existing computer-user biases in the organization, managers' aggressiveness in submitting proposals, and various aspects of organizational power struggles, rather than the overall needs and priorities of the organization.


Meanwhile, the teeming masses -- the commoners who are the "average" end-users in the organization -- are treated as an afterthought, if at all, notwithstanding the fact that they are expected to interact with several, if not many, different IT systems. Little real attention beyond lip service is paid to the slogan "the customer is king." Even when serious efforts are made to address customer needs, the focus is usually on narrow sets of requirements specified by various cabals, rather than on any coherent model of the desktop of the average user. Thus, highly specialized requirements become the golden fleece of which the emperor's clothes are made.


Turbin points out that "IT architecture should be viewed from an enterprisewide computing perspective..." (p. 344) and "[d]ata should be viewed as a corporate asset and managed as such" (p. 345). However, "many organizations have allowed their IT architecture to evolve over time without a systematic and explicit blueprint... [resulting in] a wide mixture of IT components which do not fit well together and do not fit well with the needs of the business." (p. 343)(2)


Under the circumstances, it should come as no surprise that a large proportion of all projects either fall short of meeting expectations or fail entirely.(3) While the fundamental concept of the business unit and the enterprise is in a state of flux, the trend is pronounced. And as organizational structures are decomposed and "flattened," the need to treat teeming masses as teaming masses is becoming more evident. At some point, it will become clear that plotting (pun intended) out turfdoms is no longer an acceptable way to conduct IT planning.(4)


Meanwhile, oblivious to any plan, meek or grandiose, business goes on via whatever "legacy" systems -- large or small, centralized or decentralized, paper-based or electronic, automated or manual -- are already available to employees to accomplish their work.(5) In many cases, such systems do not readily share data with others that require some or all of the same data.(6) That is why they are called "stovepipes." They burn up excessive amounts of energy and resources generating information that goes up in smoke without being appropriately shared. Stovepipes beget other stovepipes, as much of the same information is generated and regenerated over and over again. Instead of adding value and innovating, people end up replicating data, duplicating documents and paper, and generally reinventing the wheel. Perhaps the worst type of legacy, stovepipe systems are those that require the use of paper, since information captured (documented) on paper is so cumbersome to manage, manipulate, and share.(7)


Addressing the need for an overall organizational IT systems architecture plan, Turban (p. 390) says:


By clearly defining the intersection of information and subsystems, an organization will not build separate, redundant information systems for different organizational processes. (emphasis added)(8)... When an organization decides to improve ... one process, other processes ... can be taken into consideration. This avoids building separate information systems for each subsystem.


Kerr (p. 7) summed up the problem and its potential solution in similar terms, as follows:


... organizational boundaries that act to inhibit the creation of integrated systems must be broken down. Systems ... must be built independently of how the firm is organized. Firms must recognize that a single application that meets a multitude of requirements is better than many smaller applications ... (emphases added)


Many business data users today suffer from "data ego"... the belief that a user owns all the data he or she uses. Many business professionals believe that since they own the data, they don't have to share it with anyone.(9)



We must begin to promote the idea that all data assets are owned by the corporation and that users are merely custodians who keep them current.


That is especially true with respect to public employees and public information, for which the taxpayers have paid and which they, rather than their employees, quite literally own.


ITMRA and Information Sharing


Indeed, the Information Technology Management Reform Act of 1996 [ITMRA, section 5113(b)(3)] wholeheartedly embraces the notion of shared data and systems, and extends it beyond individual departments and agencies -- potentially to treat the entire Federal Government as a single enterprise for purposes of interagency information technology (IT) investments!(10) Moreover, section 5122(b)(4) requires each agency to think beyond the Federal domain to identify "... information systems investments that would result in shared benefits or costs for ... State or local governments" as well.


At the same time, the concept of the Internet is evolving not only through the establishment of organization-specific "intranets" but also to address the need to share information with selected external suppliers and customers. As Yogi (Berra) would say, "extranet" is the latest buzzword currently in vogue.(11) In concert with this thrust, through the Electronic Freedom of Information Act (E-FOIA, P.L. 104-231), Congress has mandated that agencies make available by electronic means all public records that are of interest to more than a few people [5 USC 552(a)(2)(E)]. (Ambur, 1997)


In addition, legislation has been introduced in Congress "... to minimize the burden of Federal paperwork demands upon small businesses, educational and nonprofit institutions, Federal contractors, State and local governments, and other persons through the sponsorship and use of alternative information technologies." (Talent, 1997)(12) Discussing the provisions of the bill, Rep. Myrick said:


The Paperwork Elimination Act will decrease the burden of Federal paperwork by requiring all Federal agencies to give small businesses, educational and nonprofit organizations, State and local governments the option of filing required information by means of electronic submission, such as e-mail, fax, and other means... As a small business owner myself, I can say that too much time is spent filling out forms ..."


Rep. Pryce concurred and asserted:


... meeting the Government's paperwork demands has a dollar value roughly equivalent to 9 percent of the Nation's gross domestic product. Congress must lighten this load. By enabling the Federal Government to take advantage of the information age, this legislation will enable small business owners across America to utilize smart technology available today to reduce those costs and to eliminate barriers to job creation and economic productivity. (emphasis added) That means less time spent filing forms ...


Rep. LoBiondo cited Small Business Administration (SBA) estimates that small business owners spend at least 1 billion hours a year in filling out government forms, at an annual cost of $100 billion.


Rep. McCarthy joined the discussion:


... I do not foresee a day in my lifetime when we will eliminate paperwork. Nor do I foresee the day when we will altogether eliminate regulations. What we can do, however, and what this bill does, is take advantage of existing technology capabilities and ease the regulatory burden on small businesses by reducing the amount of paper they must fill out, mail, and file.


Rep. Pryce's reference to the use of "smart technology" is instructive. As pointed out by Rep. Myrick, facsimile machines and E-mail can be used to submit information, and should be available as an option for those who do not have more efficient and effective means. However, for highly structured data, the truly intelligent alternative is forms automation software.(13)


Since the bill has passed the House unanimously not once but twice, it seems likely to be enacted during the current (105th) Congress. In any event, it is clear that not only Congress but also the public expects Federal agencies to begin to make the process of supplying information to the Government more efficient and less costly.(14)


A mainframe or database approach to the problem would be doomed to failure. It would be far too complex and time-consuming to endeavor to design the "mother of all databases" for any major Department or agency, much less for the Federal Government as a whole. Nor would Congress or the public stand for any attempt to require individuals, businesses, and organizations all to establish a connection to the designated database, via an interface application "of Government, by the Government, and for the Government."(15) Moreover, even if that were politically possible, it would not be technically feasible because myriad elements would change far more rapidly than the database could be altered and maintained.


So what is a diligent public-service oriented agency to do?


Modular Contracting and Chunking


ITMRA [Sec. 5202(b)] charges Federal agencies "... to the maximum extent practicable, [to] use modular contracting ..." in the acquisition of information technology for major systems. "Modular contracting" is described as: "... successive acquisitions of interoperable increments. Each increment complies with common or commercially accepted standards ... so that the increments are compatible ..."(16)


In implementation guidance, OMB Director Raines expressed the concept more colloquially: "[systems] should be implemented in phased, successive chunks as narrow in scope and brief in duration as practicable, each of which solves a specific part of an overall mission problem and delivers a measurable net benefit independent of future chunks ..."(17)


And how are agencies to respond to the twin challenges set forth in ITMRA, which seem to be at odds with each other -- to implement projects in the smallest possible chunks, while at the same time reducing the number of different systems and sharing their functionality as widely as possible?


Object Oriented Programming


Object-oriented programming (OOP) provides a pertinent model to meet the challenge. OOP is based upon the encapsulation of properties, events, and methods together in logical, interoperating sets -- called "objects" -- that convey meaning and/or "act" appropriately when called upon to do so.(18) In Raines' and OOP terms, objects are chunks of larger systems in which some form of "intelligence" is embedded. Such intelligence consists of properties (or attributes) and methods (or procedures) that are called (executed and/or displayed in comprehensible form) by certain events (messages from other objects, which may include humans).(19)
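

To make the concept concrete, the following is a minimal sketch, in Python, of such encapsulation. The class, the field, and the date format are purely illustrative assumptions, not drawn from any product discussed in this paper.

```python
# A hypothetical form-field "object": its properties, its methods, and the
# event it responds to are encapsulated together in a single chunk.

class DateField:
    def __init__(self, label, value=""):
        self.label = label                 # property (attribute)
        self.value = value                 # property (attribute)

    def validate(self):                    # method (procedure)
        """Return True if the value looks like MM/DD/YYYY."""
        parts = self.value.split("/")
        return len(parts) == 3 and all(p.isdigit() for p in parts)

    def on_exit(self):                     # event: fired when the user leaves the field
        if not self.validate():
            print(f"{self.label}: please enter a date as MM/DD/YYYY")

field = DateField("Date of hire", "May 15, 1997")
field.on_exit()   # the object "acts" appropriately when the event occurs
```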


Speaking of the development stage of the SDLC, Turban (p. 454) says, "... the emerging ... approaches emphasize development speed, inherent self-documentation, high-level programming languages, reusable software components, and an 'open' (or common, standardized) systems orientation for compatibility in multivendor environments." (emphases added) Turban was addressing systems development generically and his points are equally applicable to any development environment or approach, as well as to the IT systems which are developed. The systems themselves should provide for self-documentation and reusability of the data and information processed within them.(20)


Addressing OOP specifically, Turban (p. 458) highlights two tracks. The first is "... oriented toward discovering requirements, modeling objects, developing the object architecture, developing prototypes, and extending the system over time." The second is "... assembly oriented; developers select and 'install' appropriate objects and components." The components used in current IT systems development projects come from prior software developments or the commercial marketplace.(21)


While OOP has gained great credence among IT specialists, the concept of "objects" is still too technical and ill-defined to capture much mindshare among average users of information systems.(22) The average user continues to think in terms of, and to work with, documents, forms, and perhaps data. In some cases, users have made the transition to electronic representations of documents, forms, and data. However, few, if any, would feel comfortable with the thought that they work with "objects" (much less that they themselves are "objects").(23) Thus, if a customer-focused approach is to be applied, it would be appropriate for systems developers to think in terms of documents, rather than objects or perhaps even data.


Document Objects


Consider the following definitions of the word "document":(24)


          An organized view of information. (PC DOCS, 1995)


          The unit of work within a defined business process. (Black Forest Group, 1995)


          A collection of related material, regardless of media, that conveys information. Documents can include paper, microfilm/fiche, word processing documents, spreadsheets, electronic mail, digitized images, videos, voice mail and so on. (Cronin, 1995)


          Information structured for human consumption. (Xerox, 1997)


          Data in context. (Unknown)


Using Raines' term, documents are chunks of information that have been lumped together in such a way as to be meaningful to someone. In OOP terminology, documents encapsulate a logical set of information, which can be alphanumeric text, simple graphics, or more complex images. Documents may be as short or long, large or small, as appropriate to facilitate the necessary communication between the supplier and the customer.(25)


Forms are a particular kind of document used to gather highly structured information (i.e., "data") in a highly structured way. Forms facilitate the sharing of information among people and systems. The structure ensures that the data can be efficiently and effectively gathered, processed, shared, and used, with a minimum of confusion and little or no additional processing in terms of data interpretation, conversion, or reformatting.(26)
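

In object-oriented terms, that relationship can be sketched as a base class of documents, distinguishable as a class only by their metadata, with forms as a highly structured subclass. This is a hedged illustration; the class and attribute names are assumptions.

```python
# Documents as a base class of information objects; forms as a structured
# subclass. All names are hypothetical.

class Document:
    """As a base class, documents are distinguished only by metadata."""
    def __init__(self, title, media, doc_type):
        self.title = title
        self.media = media           # paper, image, word processing file, etc.
        self.doc_type = doc_type     # type classification (metadata)

class Form(Document):
    """A form adds structure: named fields with expected data types."""
    def __init__(self, title, fields):
        super().__init__(title, media="electronic", doc_type="form")
        self.fields = fields         # e.g., {"employee_id": int}

voucher = Form("Travel Voucher", {"employee_id": int, "amount": float})
```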


Structure and Formality of Information


Turban points out that processes are a more fundamental aspect of business than are departments or other organizational arrangements (p. 378). He also highlights that detailed data elements can be grouped into classes, and that data classes are the underpinnings of business processes (p. 379). Conceptually, at a high level of technical abstraction but in terms readily understood by the average person, all types of information can be circumscribed by only three categories:(27)


          informal, unstructured


          formal, relatively unstructured


          formal, highly structured


Different means are applicable to the processing of information in each category, i.e., different applications software is best suited to each.(28) Informal, unstructured information is most appropriately processed and shared via E-mail, voice mail, and real-time voice communications, including one-on-one conversations and group meetings. Formal, highly structured information -- commonly called "data" -- is most appropriately gathered and shared via forms.
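

That routing can be summarized in a simple mapping, pairing each category with the class of application named here and in the following paragraphs (the document management pairing for the middle category follows from end note 13). The sketch is illustrative only.

```python
# Each category of information maps to the class of application
# best suited to processing and sharing it.

ROUTING = {
    ("informal", "unstructured"):          "E-mail / voice mail / conversation",
    ("formal", "relatively unstructured"): "document management (EDMS)",
    ("formal", "highly structured"):       "forms automation (E-forms)",
}

print(ROUTING[("formal", "highly structured")])   # -> forms automation (E-forms)
```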


All documents, including forms, are processed and shared in document management systems, broadly defined, which may or may not be electronic, may or may not be well organized and managed, and may or may not be automated in some fashion. As individual classes of objects, some types of documents may be highly structured. For example, in addition to individual forms used for data collection purposes, particular point-in-time "views" of data that are produced as reports from databases are highly structured documents as well. Some kinds of text documents (e.g., letters, memoranda, contracts, and subpoenas) may also adhere to particular formats.


However, as a base class of information "objects" to be managed, "documents" are indistinguishable from each other without reference to the application in which they were created, the type classification to which they have been assigned, and/or other metadata associated with them. Thus, as a class, they are logically unstructured.


Reverse Engineer People and Processes or Data and Databases?


Kroenke (1995, p. 46) explains, "A database is a model of the users' model of their business activities... the development team must become thoroughly familiar with the users' model... [and] familiarity must be obtained early in the development process..." He highlights that there are two general strategies for developing databases -- top-down and bottom-up. The top-down approach proceeds from the general to the specific, and the bottom-up approach operates in the reverse order of abstraction, beginning with the need to develop a specific system. He suggests (p. 47), "... the entity-relationship approach is particularly effective with top-down development, and the semantic object approach is particularly effective with bottom-up development."


Unfortunately, the reality is that neither a top-down nor bottom-up strategy will be effective in large, bureaucratic organizations. The top-down approach inevitably fails to capture the complexity and richness that is vital to the efficient and effective conduct of business in the bowels of the organization. Databases developed from the top down may fail to extend to the desktops of the majority of average users. Insofar as they do succeed at some level in the organization, they may be stovepipes that fail to share data effectively and thus impose needless burdens on employees.


The myriad processes in a large organization are simply too numerous and complex to capture in a single database. Even the most massive reengineering and streamlining efforts, supported by the most cooperative workforce and enforced by a benevolent dictator, are likely to fail. More time will be spent studying and analyzing than implementing, and IT developments that do make it to the implementation stage will often be overtaken by events.(29) Top-down systems generally fail to reflect the vital view of the average users.(30) Consequently, users may develop their own "cuff" records systems while humoring, if not sabotaging, top-down systems.


Conversely, the bottom-up approach cannot hope to achieve the breadth of vision required for an effective enterprisewide information management system. The bottom-up approach inevitably leads to stovepipes as well. Since they are developed by and for specific groups of users, such systems may reflect a narrow view of a slice of the organization's overall business processes very well. However, efficiency and effectiveness end at the organizational boundaries of each group, as the data must be converted and transmitted repeatedly among systems and back again. As compared to top-down systems, bottom-up stovepipes are smaller and less wasteful individually. However, cumulatively, they may squander more resources in redundant and conflicting ways than top-down systems do.


Regardless of whether they start at the top or the bottom, database-focused IT developments do not lend themselves readily to chunking in a way that, cumulatively, will lead to the relatively few applications needed to meet the business and information requirements of the common user, at their desktop, while serving the overall interests of the organization as a whole.(31) Inevitably, viewing applications as an outgrowth of databases leads to needless programming, reprogramming, and re-reprogramming of applications software. It is as if people, the forms that are their data surrogates, and the means (applications) by which their data are processed are to be reverse engineered from databases.(32) Unfortunately, it is easy for database designer/administrators (DBD/DBAs) to lose sight of the fact that databases are a means to an end, not an end unto themselves.


Forms as Base Class People Objects


It is important to distinguish between software applications and the properties of individual forms used to collect highly structured data. Individuals and groups within any large enterprise will need access to different sets of data, and thus to different database elements on the forms that they use. However, all will be called upon to a greater or lesser extent to supply data required to support the business and administrative processes of the organization. For highly structured data, the single application needed is electronic forms automation (E-forms).(33)


E-forms make access to databases seamless to the user. E-forms encapsulate much of the intelligence necessary for people to supply and process data efficiently and effectively, thereby minimizing the need for guidance and training.(34)


From an IT systems development standpoint, the beauty of the intelligent forms paradigm is that only the form itself needs to be changed to address new functional and technical requirements. Conceptually, in OOP terms, it makes the end users the base class of "objects" -- rather than establishing the databases as the super class, beneath which people are treated as virtual (pun intended) stepchildren.(35) The technical specifications required to write data to the databases become properties of the forms, rather than the forms being properties of any particular database. Thus, users are more likely to end up owning the databases, rather than the reverse. Just as databases break the program/data dependence of file processing systems, E-forms break the dependence of applications (and the people who are their users) upon particular database technology.
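

A minimal sketch of that inversion follows, with hypothetical form numbers, databases, tables, and columns. The point is that the storage specification is a property of the form, so only the form need change when requirements change.

```python
# The technical specification for writing each field to a database is
# carried by the form itself; applications need know nothing about it.

FORM_BINDINGS = {
    "FORM-0001": {   # hypothetical form number
        "employee_id": ("hr_db", "employees", "emp_id"),
        "amount":      ("finance_db", "vouchers", "amount"),
    }
}

def submit(form_number, data):
    """Route each field's value to the database.table.column bound to it."""
    for field, value in data.items():
        db, table, column = FORM_BINDINGS[form_number][field]
        print(f"write {value!r} to {db}.{table}.{column}")

submit("FORM-0001", {"employee_id": 1234, "amount": 56.78})
```

Rebind a field to a different database, and every copy of the form follows suit; no application code has to be rewritten.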


Notwithstanding the general grumbling that occurs, too many forms and too much "guidance" (including procedural guidance) are not the problem. Current efforts under the National Performance Review (NPR) to streamline guidance are somewhat akin to the Cultural Revolution in China. By destroying the intelligence and intelligentsia, everyone will be equal -- albeit ignorant. If there is a need for a form or for guidance, it should be made readily available to those who need it. Eliminating the guidance or the form does not eliminate the need for it, any more than eliminating taxes obviates the need for public services. However, to the greatest degree possible, necessary and helpful guidance should be built into the forms and systems that people use to do their work, as intelligence that reduces, if not eliminates, the need for training and oral guidance, as well as the need to read and remember written guidance. In technospeak, it's called "infomating" our systems.(36) It is the essence of intelligent, automated forms.


Forms Automation Features and Functionality


E-forms are highly structured documents that are generated and displayed electronically, into which information can be entered electronically in specified fields, and from which information may be written directly to a database or multiple databases according to specified technical requirements for such databases.(37) Various kinds of intelligence -- such as lookup and validation tables as well as calculations and other data manipulations -- can be incorporated into electronic forms to assist and guide users in providing the requested information in appropriate format and substance. Depending upon network performance and database maintenance issues, lookup/validation tables can be encapsulated in the form itself or fields on the form can reference local or remote databases. Indeed, different databases can be called by different fields on the same form. Each form is in effect an expert system, an intelligent agent or surrogate designed to help its users do their work.
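

The kinds of field intelligence described above can be sketched as follows. The lookup table, validation rule, and calculation are illustrative assumptions, encapsulated in the form rather than in any application.

```python
# Field "intelligence" carried by the form: a lookup table, a validation
# rule, and a calculated field. All values are hypothetical.

STATE_CODES = {"MD", "VA", "DC"}          # lookup table encapsulated in the form

def validate_state(value):                # validation rule for a state field
    return value.upper() in STATE_CODES

def line_total(quantity, unit_price):     # calculation embedded in the form
    return round(quantity * unit_price, 2)

assert validate_state("md")
assert line_total(3, 9.99) == 29.97
```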


When appropriate and necessary, electronic forms may be used to gather data electronically for printed output in the specified format. However, to the greatest degree possible, the needless use of paper should be eliminated.(38) On the other hand, existing paper forms should be viewed as prototypes for more intelligent E-forms and streamlined processes.(39) If a data element is not already represented on a form somewhere in the organization, the need for it may be questionable. Indeed, perhaps the greatest frustration with forms stems from the fact that the very same data must be re-supplied so many different times. An enterprisewide E-forms approach can help to identify and minimize, if not eliminate, such redundancies, in a way that enterprisewide, much less stovepipe, database development efforts are unlikely to accomplish.


Forms automation software can also be used as a CASE tool.(40) In fact, by using a single forms automation client and forms design software, the entire SDLC can be turned on its head: the forms themselves are used to automate the design of the database(s). In terms of customer focus and rapid development, contrast that with the traditional approach of leaving application development until the end of the development cycle and using database-specific software to design the forms and work processing applications.(41)
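

A hedged sketch of that idea follows: deriving the database structure from the form definition, rather than the reverse. The type mapping and SQL dialect are assumptions chosen for illustration.

```python
# Generate the database table directly from the form's field definitions,
# using the form as a CASE tool. Field names and types are hypothetical.

FIELDS = {"employee_id": "INTEGER", "start_date": "DATE", "amount": "NUMERIC(10,2)"}

def create_table_sql(form_name, fields):
    columns = ",\n  ".join(f"{name} {sqltype}" for name, sqltype in fields.items())
    return f"CREATE TABLE {form_name} (\n  {columns}\n);"

print(create_table_sql("travel_voucher", FIELDS))
```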


The forms automation approach can also help to instill in users, in a non-confrontational way, the necessary discipline and incentive to participate in establishing and maintaining enterprisewide data dictionaries. Each time the need for the collection of highly structured data occurs, the E-forms system would be available to facilitate the effort, by and for anyone in the organization. Those charged with conducting the data collection activity would need only to determine the appropriate elements of the collection, with which the form could quickly and easily be compiled.(42) At the same time, those asked to supply the data would merely fill out the appropriate form and process it in the prescribed fashion, which might be automated within the form itself. In any event, the very same E-forms software could be used for all such purposes.


If the collection is only for localized use, there would be no need to worry about broader organizational concerns. Once the form itself is designed, it could automatically create the necessary database structure for receipt of the data. However, forms designers would be encouraged to consider fully not only the potential uses of the data by others, but also the possibility that others have already established the appropriate elements in another database(s). In the event there is a need for more than localized use of the data and/or the forms designers would like to take advantage of elements already designed by others, they would be encouraged to consult the enterprisewide data dictionary.(43)


If the appropriate elements are already available, they could simply be incorporated into the new form. If not, subject to appropriate review and clearance procedures, the new elements could be entered into the data dictionary from the new form. In either case, the storage parameters (e.g., the data type and size, as well as the storage location) for each element could be tailored to the needs of the suppliers and customers for the data in each form. Depending upon network performance and user needs, data for each element may be stored in a distributed fashion throughout the enterprise. Indeed, a single form could be used to develop and populate a new client/server database while at the same time writing to a legacy database, either indefinitely or in temporary parallel until the new system is stabilized and proven.(44)
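

The parallel-write scenario can be sketched as follows; the two in-memory lists merely stand in for the new client/server database and the legacy database.

```python
# One form writes to both stores until the new system is proven.

new_db, legacy_db = [], []

def submit_form(record, parallel=True):
    new_db.append(record)            # new client/server database
    if parallel:
        legacy_db.append(record)     # temporary parallel write to the legacy database

submit_form({"employee_id": 1234, "amount": 56.78})
assert new_db == legacy_db           # the stores stay in step during the parallel run
```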


While the customers for the data would be free to decide where to store and maintain it, their decision would not impose any additional burden on the suppliers to log onto or become familiar with yet another stovepipe system.(45) In fact, the customers would be encouraged and enabled by E-forms software to consider alternatives to draw data from existing databases, either automatically and/or via validation/lookup tables, to be combined in the new data set required by the new form. Once compiled in the new form, the new and recombined data could be written to one or more additional databases.(46)


Forms Automation Vendors


Information on the members of the Business Forms Management Association is available on BFMA's home page (1997). Two of the market-leading forms automation vendors are JetForm, which recently acquired Delrina FormFlow from Symantec, and Metastorm, which recently acquired from Novell the rights to develop and market InForms. (See Novell, 1996, December 19.)


According to its Web site (About Us), Elite Federal Forms, Inc., is a systems integrator specializing in development and integration of electronic forms and workflow optimization for the Federal Government. The company maintains a large library of electronic forms, which it licenses and customizes for various Government agencies.(47) Elite says:


These forms provide fool-proof authentication and security and operate interchangeably on DOS, Windows, and Macintosh platforms.(48) Elite is the only licensed partner of Novell InForms for the Federal Government. This unique partnership with Novell, provides Elite with enhancements to the InForms product well in advance of the general public. These enhancements are passed on to Elite's clients immediately. Often these enhancements are initiated at the request of Elite and are crucial to the implementation of complex workflow designs. Elite's software development and design team provides Novell's development team with critical feedback. This feedback is necessary for InForms to facilitate a successful transition from a paper based system to a complete electronic office.


Elite, which is a subsidiary of Metastorm, claims to be "... the leading provider of electronic forms and workflow automation solutions for the U.S. Federal government." (Elite, About Us)


In its corporate profile on its Web site, JetForm says:


Forms are the international language of commerce and the basis for most business processes. They are used to collect business information in a structured way and to move work through an organization.


JetForm is a world leader in forms-based workflow automation... JetForm's core technology is based on a client /server, open system architecture that can be seamlessly integrated into the multi-platform computing environments typical of most large organizations. Our products are designed to interoperate with many leading business applications, allowing users to tailor our powerful forms automation solutions to their unique business requirements.


With reference to the company's Web-centric alternative, a press release (1997, March 10) summarizes some of the benefits of the use of JetForm's product:


The integrated solution responds to growing demands from organizations looking to extend existing applications to the Web or automate existing manual processes. JetForm solutions use intelligent forms to:


                      Initiate and manage a workflow approval process


                      Capture information from the user


                      Perform local, intelligent validation and handling on that information


                      Execute back-end forms processing, including printing, faxing, e-mailing and database updating


In announcing a new release of its E-forms software, EZX Corporation (1997) emphasizes object-orientation and boasts:


EZ-Forms PRO "fills" an organization's needs for a eforms processor, eforms designer, and eforms filler software for virtual paper forms. ...Forms "objects" can be linked into any program that is OLE enabled... EZ-Forms PRO can re-create an electronic version of any form for local or enterprise-wide use... EZ-Forms PRO's Designer was designed from the ground-up as a forms processor and focuses on form creation/design/modification, fill-in (scanned or internally generated), fill-out (pre-printed), calculation, validation, printing, etc. of any form(s)... The unique object oriented tools are specifically suited to forms design automation... with ... object linking and embedding .., robust mathematical/logical calculations, field validation, field help, field pick lists, auto-number/date/time, the ability to open multiple forms, [and] support for any installed Windows printer (including FAX) and paper size... EZ-Forms PRO provides a complete visual eforms solution for the desktop, laptop, enterprise, Intranet and Internet.


Although Lotus has touted Notes as a document management and forms automation solution in the past, there are signs that they may finally start to deliver the applications that organizations really need to conduct business, instead of relying upon third-party vendors to develop them.(49) Indeed, in mortal combat with Microsoft, McCready (1997) suggests they may have no other choice:


Inevitably, as Lotus seeks to differentiate Notes from Exchange, Lotus has no alternative but to move up the application curve and offer capabilities such as forms, imaging, document management and, perhaps in the future, even real workflow! After all, the alternative is for Lotus to compete on price with Microsoft -- and that clearly is something that one cannot do. However, if Lotus, and to a lesser extent Microsoft, begin to move the infrastructure up the application curve, where does that leave the imaging, workflow, document management, forms software and other specialists?(50)


If IBM/Lotus and Microsoft do indeed begin to deliver the forms automation and document management solutions that organizations need, third-party vendors will need to find higher-level means of adding value.(51) While that may be increasingly difficult for third-party software vendors to do, having the COTS applications needed to conduct business will be a boon to organizations. More particularly, it will be a boon to Federal agencies, who should not be in the software development business to begin with, and it will be a boon to taxpayers, who should not be forced to subsidize inefficient, redundant, ineffective, and costly software application development efforts.(52)


Another hopeful sign is that companies like PeopleSoft, which specializes in adding value through "Federal business applications for finance, materials management, distribution, and human resources," are beginning to recognize the imperative for distributed forms automation. An advertisement in Federal Computer Week (1997, March 31, p. 7) asserts:


PeopleSoft's workflow is open, so it can integrate with a variety of third-party products. You can use email for notifications, and electronic forms for turning around approvals. Or use internet forms and interactive voice response systems to communicate with PeopleSoft applications. And, unlike some solutions, PeopleSoft's are flexible enough for you to define your own processes and procedures. In other words, PeopleSoft adapts to the way you work, not the other way around. (emphasis added)


Who Is the Client, Who is the Server?


As implied by the PeopleSoft ad, there are two paths to the middle and it is possible to start at either end. However, except for the hype surrounding the Web, many if not most other factors argue for starting at the desktop rather than at the mainframe.(53) Some of those factors include: the causes that have led to the mandate for chunking; the wisdom of paying more than lip service to customer service and customer focus; the need to overcome the stovepipe mentality, to foster teamwork, and to work toward a true enterprisewide information architecture to support common business processes, rather than simply a technology architecture that appeals to the IT specialists; not to mention the problems with the mainframe architecture that led to client/server technology in the first place.(54)


No doubt Web browsers will evolve to include the functionality that organizations need to conduct business in a businesslike way -- particularly document management and forms automation features that enable users to efficiently and effectively process both highly structured data and relatively unstructured documents.(55) Meanwhile, virtually all of the more traditional client/server applications are being Internet enabled. At the points of convergence, the issue of browsers versus "fat" clients becomes a matter of semantics, a distinction without a difference.(56) Calling such clients "browsers" is semantically confusing, but the misapplication of the term should not be permitted to distort the appropriate application of the functionality. The browser will need only to look in the mirror to see its girth, and in its image it will see the client.(57)


Moreover, at least in theory, host-focused systems can -- through effective direction, efficient planning, and disciplined oversight -- be integrated to overcome the stovepipe problem with data sharing. However, if anything, Web technology increases the likelihood of stovepipes, because they are so easy to develop and multiply. Hypertext links may be quick and easy to use when they are relatively few in number, but as they multiply, they become yet another set of data or documents to be managed, both at the client as well as the host. Who will exercise effective coordination and leadership to ensure that a rational information architecture results, rather than an organizational Tower of Babel?


As required of OMB by ITMRA, Director Raines is endeavoring to fulfill that role for federal Executive Branch agencies. Neither he nor OMB can manage IT projects for other agencies. Indeed, ITMRA explicitly empowers agency heads with authority and makes them responsible and accountable for their own projects. However, the chunking principle that Raines has set forth is pertinent and insightful guidance. To reiterate and reemphasize, that principle specifies that projects should:


Be implemented in phased, successive chunks as narrow in scope and brief in duration as practicable, each of which solves a specific part of an overall mission problem and delivers a measurable net benefit independent of future chunks...


Multiple browser-to-host links backed by myriad databases and mainframes could each be interpreted to be a "chunk" of a larger system, and cumulatively they could add up to a rational enterprisewide system in which the needs of all are adequately addressed. However, it is difficult to see how such a multifaceted, hydra-headed, top-down approach is likely to meet the needs of the average user, much less the overall needs of the organization as a whole. Besides the fact that it is virtually antithetical to a customer-focused approach, it envisions and requires the specification, development, and maintenance of myriad user interfaces (UIs) at the hosts.(58)


Addressing the topic of human-computer interaction (HCI), Downey (1997) said that UIs now comprise about 60 percent of total programming lines of code (LOC). That statistic highlights the dubiousness of allowing DBDs to control the development of UIs tailored to their own particular database. It also calls into question why different UIs are needed for basic forms automation (and document management) requirements that are common across all organizations.


Downey pointed out that 75 percent of software life cycle costs occur during maintenance and that 80 percent of maintenance costs are due to unmet or unforeseen user requirements. (Only 20 percent are bugs or reliability problems.) With so many DBDs all attempting to reinvent the wheel in terms of forms automation (and document management) applications, there is little wonder that common user requirements are so often unmet and that so much time and money is spent trying to address such deficiencies while developing and maintaining home-grown, proprietary applications.


Isn't it time to ask the question, who is the client and who is the server? And, ultimately, who is the customer and who is the supplier? More directly speaking, do people exist to meet the needs of databases or do databases exist to meet the needs of people?


Conclusion


The answer to these questions is obvious and so too is the need for a new approach to IT systems development. There is a better alternative to the traditional database-focused SDLC approach, which so inevitably leads to stovepipe systems. There is an approach that is far more customer-oriented and, at the same time, far more likely to lead to a rational enterprisewide information management solution. That approach is to deliver to the desktop of the average user the basic set of applications they need to process all three categories of information -- informal/unstructured, formal/relatively unstructured, and formal/highly structured.


For the collection and processing of highly structured data, the single application needed by the average user is a standards-compliant COTS forms automation client with which myriad forms can be processed -- each as an incremental, self-contained object, chunk, and business process.(59)


Automated forms are the intelligent way to put the customer first... Isn't that the object?




References


Alsup, M. (1997, March/April). "Imagineers: Imaging, Document Management, and the Future." Document Management. pp. 18 & 19.


Ambur, O. (August 3, 1995). "Functional, Technical, and Resource Requirements for the Servicewide Document Management System: Findings of the Requirements Analysis Team, Together with Recommendations and Alternatives." U.S. Fish and Wildlife Service.


Ambur, O. (May 9, 1996). "Critical Success Factors for a Collaborative Database in a Large, Geographically Dispersed Organization." University of Maryland University College.


Ambur, O. (1997). "Some Provisions of Law Relating to Access to Public Information." Available at: http://www.fws.gov/laws/infolaw.html


Berra, Y. (1997). Some Yogi-isms are available on the official Yogi Berra Web site at: http://www.imall.com/stores/yogiberra/yogiberra12.html


Black Forest Group. (January 1995). "Requirements for an Enterprise Document Management System." Silver Spring, Maryland: Association for Information and Image Management.


Business Forms Management Association (May 29, 1997). Home page available at: http://www.bfma.org/~bfma/


Cronin, J.L. (May 1, 1995). "An Introduction to Electronic Document Management." Wang Federal, Inc.


Cross, S., Director, Software Engineering Institute, Carnegie Mellon University. (March 3, 1997). Remarks in the keynote address at a symposium on software usability engineering at the National Institutes of Standards and Technology.


Davis, S. and Davidson, B. (1991) 2020 Vision. New York: Simon & Schuster.


Downey, L. (1997, March 3). Remarks at symposium on software usability engineering at the National Institutes of Standards and Technology.


Elite Federal Forms, Inc. "The Information Processing and Forms Automation Specialists." Home page available at: http://www.elitefedforms.com/


Elite Federal Forms, Inc. "About Us." Available at: http://www.elitefedforms.com/html/about_us.html


Elmasri, R., and Navathe, S. (1994). Fundamentals of Database Systems. New York: Addison-Wesley.


EZX Corporation. (1997, March 21). Press release entitled "EZ-FormsTM PRO v97D the Ultimate Universal 'Enterprise Solution' Electronic Forms (Eforms) Processor Introduced." Available at: http://ezx.com/pressrelease.htm


Federal Computer Week. (1997, March 31, p. 7). PeopleSoft advertisement.


Francis, R. Department of the Interior Forms Managers Web site. Available at: http://www.ios.doi.gov/oirm/oirm/forms.html


Gagnon, J. (1997, April 7). Quoted in "The Buzz" column by Bruce Hoard. Imaging World. p. 6.


Gellman, R. (1997, March 31). "Electronic signatures bring new identity crisis." Government Computer News. p. 25.


Gross, J. (1995, December). "Motorola Makes Electronic Transition." Reprinted from Forms Automation Technology Report. Available at: http://www.f3forms.com/formstech.html


ITMRA, Information Technology Management Reform Act of 1996 (also known as the Clinger-Cohen Act). Division E, P.L. 104-106. Signed February 10, 1996.


JetForm, Inc. (Home Page). Corporate/product information and virtual forms warehouse available at: http://www.jetform.com/


JetForm, Inc. (1997, January 23). Press release entitled "Novell Senior Executive to Head JetForm® Corporation U.S. Sales: Ian Fraser Assumes Vice President, U.S. Sales Post." Available at: http://www.jetform.com/fraser.html


JetForm, Inc. (1997, March 10). Press release entitled "JetForm Provides End-to-End Solutions for Intranet/Web Applications: Forms Provide Ideal Metaphor for Browser/Server-Style Computing." Available at: http://www.jetform.com/iwreleas.html


JetForm, Inc. (1997, March 11). Press release entitled "JetForm and Microsoft Collaborate: Future Strategy Set for Web-Based Workflow." Available at: http://www.jetform.com/msexch4.html


Keating, W. (1997, April 19). "The Wide World Web: A Stake in the Heart of Client/Server?" Center Spotlight. American Management Systems (AMS). Available at: http://www.amsinc.com/amscat/centspo2.htm


Kerr, J.M. (1991). The IRM Imperative: Strategies for Managing Information Resources. New York: Wiley.


Koulopoulos, T. (1997, April 7). "How the Internet is driving 'wide area workflow'." Imaging World. pp. 40 & 49.


Kroenke, D. (1995). Database Processing: Fundamentals, Design, and Implementation. Englewood Cliffs, New Jersey: Prentice Hall.


May, T. (1997, April 7). "Evolution in thinking about process redesign." Imaging World. p. 38.


McCready, S.C., and Murray, G.J. (1997, April 7). "ISVs compete for desktop with Microsoft and Lotus." Imaging World. pp. 1 & 105.


Metcalfe, R.M. (1997). Interview in "Expanding Intranet Technology to the Second Generation: Extranets." Promotional flyer by Internet Commerce Enterprises (ICE), an International Data Group (IDG) Company, for a conference to be held in Washington, D.C., on June 23-25, 1997. Additional information available at: http://www.idg.com/ice


Metastorm, Inc. "Optimizing business processes using workflow, electronic forms, and information processing tools." Home page available at: http://www.metastorm.com/


Novell (1996, December 19). Press release entitled "Novell and Metastorm Sign Partnership to Continue Development of the InForms Electronic Forms Solution." Available at: http://www.novell.co.za/News/archive/np-00289.html


PC DOCS, unidentified spokesperson. (February 1995). National users conference. Orlando, Florida.


PeopleSoft. Home page available at: http://www.fed.peoplesoft.com


Raines Rules. (1997). Summary of guidelines to be applied to IT investments included in the President's budget. Available at: http://www.fws.gov/laws/


Ramakrishnan, R. (1997, March/April). "Integrate Digital Signatures Into Your Document Management Product." Document Management. pp. 14 - 16.


Safdie, E. (1997, April 7). "Caution: technology convergence ahead." Imaging World. pp. 33, 36, 39, 41, 43 & 45.


Turban, E., McLean, E., and Wetherbe, J. (1996). Information Technology for Management: Improving Quality and Productivity. New York: Wiley.


Turner, R.M. (1996, September 6). "The Tragedy of the Commons and Distributed AI Systems." Department of Computer Science, University of New Hampshire. Available at: http://cdps.umcs.maine.edu/Papers/1993/TofCommons/TR.html


Varon, E. (1997, March 31). "AMS revamps financial package." Federal Computer Week. pp. 36 & 41.


Varon, E. (1997, April 14, p. 25). "DOT cuts paperwork; allows grant applicants to use Web, Agency will be the first to create an interface that allows filing for any federal grant program." Federal Computer Week.


Varon, E. (1997, April 14, p. 10). "Top priority: governmentwide architecture." Federal Computer Week.


Walker, F. (1997, March 18). In a personal communication via E-mail, Walker indicated that U.S. Fish and Wildlife forms are identified in the agency manual, 281 FW 3, Appendix 1. They are numbered from 1 to 2146 but are not renumbered when a form is abolished. Thus, more than 2000 have been officially registered, but how many are in current use is uncertain.


Walker, F. (1997). U.S. Fish and Wildlife Forms Server. Available at: http://www.fws.gov/~r9pdm/FORMS/


Wood, D. (1997, April 21). "Netting new gains from earlier document capture." Imaging World. pp. 18-20.


Workflow Management Coalition (WfMC). Home page available at: http://www.aiai.ed.ac.uk:80/WfMC/


Xerox. (1997, February 23). In Safdie, E. "From Data to Information Warehousing." Imaging World. p. 29.




End Notes


1. Turban (pp. 409-410) depicts implementation as the sixth of seven steps, followed only by evaluation and maintenance, in a "waterfall" model. In Kerr's (p. 61) model of the traditional development life cycle, delivery is the last of 10 steps.


2. Turban (p. 417) defines "systems analysis" as "the process of separating a whole into its parts to allow examination of the parts." He says, "This leads to an understanding of their nature, functions, and interrelationships." While it is theoretically possible through a top-down approach to define an entire organization in terms of a database schema, in practice the task proves to be insurmountable for large organizations. An object-oriented approach, considering each form currently used within the organization, would seem to hold greater potential for success.


3. According to Cross (1997), 69 percent of the projects developed at the Department of Defense don't meet user requirements and one-third are canceled.


4. Turner (1996) offers an interesting discussion of the tragedy of the commons with respect to distributed artificial intelligence systems.


5. Koulopoulos (1997) observes, "The Web's widely held promise for connecting scattered workers and geographically dispersed workgroups has been obscured by security issues and unmet expectations... most business processes continue to rely on a variety of legacy and client-server applications."


6. Turban (p. 369) identifies integration of various databases as one of four problems of IS planning. The other three are: integrating IS planning with the overall strategies and objectives of the organization, allocating resources among competing applications, and completing projects on time and within budget. Implementation of a single forms automation client for all highly structured data collection and processing needs can help to overcome all four of these problems.


7. Wood (1997) provides an insightful commentary on the benefits of capturing documents early in their life cycle, indeed as part of the process by which they are created. Among the most pertinent points are: "Now the creator can log into a Web page, fill out an electronic form and complete the document submission with one keystroke." (p. 19) "... Although the benefits are large, the technical hurdles also are because the sending organization has to keep track of the same information the receiving organization does -- usually with a different accounting system... it require[s] ... a middleman [which] can manipulate the formats [of the data] to match the creation and destination formats... EDI has slowed ... because the process of setting up EDI is too cumbersome, support-intensive and expensive ..." (p. 20) Although Wood seems to be referring to a third-party organization, the "middleman" function can be performed by E-forms software. Likewise, the accounting/record-keeping function argues for: 1) an E-forms client in lieu of a simple browser-to-host connection, and 2) an EDMS to store and manage the completed forms (as files, not images) in lieu of simply relying upon the trustworthiness of whatever ends up in the host database. (Based upon discussion in the trade press, the "trojan horse" problem seems to be pretty intractable.)
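

The "middleman" role that this note assigns to E-forms software can be sketched as a simple format translation; the field names on both sides are hypothetical.

```python
# Map a record from the sending organization's format to the receiver's.

SENDER_TO_RECEIVER = {"vendor_no": "supplier_id", "amt": "invoice_amount"}

def translate(record):
    return {SENDER_TO_RECEIVER[field]: value for field, value in record.items()}

print(translate({"vendor_no": "V-17", "amt": 125.00}))
```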


8. It is ironic, as well as indicative of the sad state of affairs, that it is largely database designers (DBDs) and administrators (DBAs) who perpetuate the tragedy of the commons in IT systems. One of the most basic principles of database design is that many-to-many (M:N) relationships are resolved by constructing an intersecting entity. The third entity enables each of the many instances in each of the other two to relate effectively and efficiently to each instance in the other. No one should understand this relationship better than the DBD/DBAs. Yet few seem to recognize forms automation software as the appropriate intersecting entity for all highly structured data collection activities, and those who do seem precluded by institutional inertia or constraints from implementing E-forms software enterprisewide.


9. With respect to documentation and data compiled with Federal funding, not only is a proprietary, egocentric attitude inappropriate, it is also illegal. For a discussion of Federal information access laws, see Ambur, 1997.


10. Varon (1997, April 14) reports that a Governmentwide systems architecture that will let agencies share information and services is the top priority for the Chief Information Officers (CIO) Council.


11. Metcalfe (1997) discussed the difference between intranets and extranets as follows: "...Intranet is the use of TCP/IP and Web technology to implement some of a company's information systems for use internally by its employees... Intranets are mostly found ... on private networks ... because of the Internet's current poor security, reliability, and performance. Extranets have ... been mostly on the public Internet. [But] private Extranets are now appearing because of the Internet's current shortcomings." With respect to whether Inter/intra/extranet technology means the death of client/server systems, Metcalfe says, "Let's not get carried away with ... buzzwords... [T]hin clients, with browser plug-ins and Java applets, are getting thicker again." In short, if the focus is on the functionality rather than the technology by which it is delivered, this issue becomes a distinction without a difference. He concludes: "My hope is that the public Internet can be upgraded rapidly enough -- security, reliability, and performance -- so that private Extranets do not become the norm... we're all better off ... being connected, rather than doing our own disconnected things." The speed with which the Net's shortcomings can be overcome remains to be seen, and business must be carried out with other tools in the meantime. Nevertheless, his point is well taken. It is another reflection of the tragedy of the commons.


12. The Paperwork Elimination Act (H.R. 852) passed the U.S. House of Representatives on March 13, 1997, by a vote of 395 to 0, and is pending action in the Senate. The provisions of the bill are summarized in H.Rpt. 105-7, issued by the House Committee on Small Business. The quoted remarks were made during the debate on House passage of the bill and appear on pages H989 - H1000 of the Congressional Record.


13. Whereas E-forms are the intelligent technology for sharing and processing highly structured data, electronic document management systems (EDMS) software is the intelligent technology for processing and sharing relatively unstructured documents, including E-forms as one of many logical classes of documents.


Safdie (1997, pp. 39 & 41) provides a real-world example, from his own experience, of an important relationship between highly structured data and documents. Upon returning from a trip to Europe, he discovered an erroneous charge on his credit card billing statement and submitted documents proving that the charge could not be accurate. When the company failed to correct the problem based upon his correspondence, he called repeatedly. However, the credit card company's information system contained only the (inaccurate) highly structured data, while the documents that disproved it "... lay in the back room where, dutifully filmed and cataloged, [they] were not available to the operator to resolve the dispute." As this example demonstrates, in many cases it is not sufficient merely to have transactional data in a database. Supporting documentation is needed as well, including perhaps an original copy of the E-form by which the data was entered into the database. Without such documentation, it is impossible to audit the data and correct mistakes.


Dale (1997) provides an interesting commentary on the relationship between documents and data with respect to EDI and forms. He says:


EDI transaction sets define the contents of what previously had been form-based documents... The most important aspect of EDI is that data and documents converge, thus minimizing distinctions between data, documents and records... tremendous cost savings [can be realized] by hosting internal forms on a Web server (intranet)... The user ID (under control of the information technology folks) authenticates the creator/submitter of such forms... The database will form the single repository and be the focus of all structured business information... Structured documents/records (transactions) will exist as fields in a database management system.


At the same time, he points out:


Traditional databases and tools ... cannot satisfy the unstructured aspects of information. Estimates have placed the amount of unstructured information in organizations at 65% or greater -- meaning that relational databases only accommodate one-third of an organization's informational assets.


Dale suggests that "universal server" OO database technology will be the means by which complex documents will be integrated for "structured access." He acknowledges that such an approach may ignore a more pressing problem -- information retrieval. However, there may be an even more fundamental issue. Even if the security/auditability concerns can be adequately addressed (a highly dubious assumption), such an approach remains top-down and is likely to fail for the same reason that most mother-of-all-database projects do. Attempting to lump large, complex documents together with simple, highly structured data elements in an OO database does nothing to resolve the problems associated with designing and implementing large, complex databases. In fact, it aggravates those problems.


14. Government Computer News (GCN, March 3, 1997, p. 49) contains an article entitled "Clinton unveils new online plan: Electronic funds transfer by '99 is one of the administration's many goals." A table in the article highlights five goals, four of which are EFT, IT acquisition, IT sharing, and public access. However, a couple of paragraphs are devoted to the fifth goal -- the Global Criminal Justice Information Network -- and its juxtaposition with EFT highlights that both are to a large extent forms automation functions (with a healthy dose of document management).


Greg Woods of the National Performance Review (NPR) is quoted as expressing surprise at "... all the ... people working on the same issue," but there is no reason for surprise that everyone is working on solving problems that are common to every organization. What is surprising is that it is so hard for people to see (or at least to acknowledge) that there is nothing unique about their forms automation and document management software application needs. Of course, the database structure -- beyond the core elements -- will differ for different purposes. However, many if not most of the software application requirements are essentially the same in all offices and organizations. What is difficult to understand is why managers allow database administrators to control and tailor the development of so many different user applications, instead of insisting upon a small set of standard COTS applications (including E-forms and EDMSs) that can meet the bulk of the requirements that are common to everyone.


The article indicates that the plan calls for "... federal, state and local police organizations to form an advisory group by May and define core network requirements... By September, the group should identify the funding, standards and leadership hurdles that must be overcome before field testing network requirements using commercial products in December." Hopefully, they will avoid squandering the taxpayers' money reinventing stovepipe database systems for user applications that would be better served by using COTS forms automation, document management, and database "report-writing" software.


15. May (1997) says, "Too frequently, redesign initiatives are structured as projects that are done to or done for users vs. projects that are done with them. Such top-down mono-dimensional programs typically fail." However, his solution -- to involve everyone and then to let everyone design their own processes -- is a little bit like apple pie and motherhood. Everybody is for them, but it takes more than that to make them. To gather highly structured data efficiently, the solution is not to have everyone design their own E-forms software. The solution is to make COTS E-forms software readily available to everyone who needs it, and then let them design their own forms to gather the specific data that they need. To the extent that data needs to be shared among different databases, the necessary linkages become properties of the E-form. The form, rather than any database or set of databases, is the logical object that can be understood, adapted, and used by individuals and groups in a fashion that is consistent with the needs of the organization as a whole.


Indeed, while the world's biggest generator of forms, the Internal Revenue Service, foundered on the shoals of its gigantic, ill-conceived, top-down Tax System Modernization (TSM) project, commercial vendors like Intuit and Kiplinger have delivered the forms automation client software, TurboTax and Tax Cut, needed to make compilation of tax returns somewhat less taxing. Using these clients, not only do taxpayers benefit from a great deal of intelligence built into the software, but they can even transmit their returns electronically to IRS. The question is whether IRS is prepared to accept them, or whether they will persist in pursuing mainframe, stovepipe, and paper-based solutions. (The fact that IRS continues to require electronic filers to sign and file a paper form has been identified as a significant impediment to efficient on-line filing.) One wonders if it would not actually save money if IRS were to get out of the software development business altogether and simply purchase and give every taxpayer in America COTS client software for tax filing purposes. (For additional background on TSM and an estimate of the cost of filling Federal tax forms, see end note 52.)


16. ITMRA, also known as the Clinger-Cohen Act, was enacted as Division V of the National Defense Authorization Act for Fiscal Year 1996. The cited provisions are contained in subsection 5202(a) of the Act and are codified in subsections (a) and (b) of section 35 of the Office of Federal Procurement Policy Act (41 U.S.C. 401 et seq.).


17. This is the seventh of eight policy guidelines issued by Director Raines in a memorandum dated October 25, 1996, to department and agency heads. Those guidelines have come to be known as "Raines' Rules". Conceptually, IT project chunking adheres to the evolutionary, "continuous improvement" paradigm of Total Quality Management (TQM) rather than the revolutionary paradigm espoused by the proponents of reengineering. Each paradigm has its place, and reengineering is particularly appropriate in the highly competitive regime of the private enterprise marketplace, where "creative destruction" is the key both to survival and profitability of the enterprise, as well as to rising standards of living and serving the best economic interests of consumers in the long run. However, short of revolution in the streets or political upheaval in the election booth, the continuous improvement paradigm is clearly more appropriate for public action under a democratic form of government, which is characterized by compromise, consensus, and incremental change. In some circumstances, governmental leaders may be in a position to exercise enlightened dictatorship. However, as a general governmental IT policy, chunking and continuous improvement are clearly the superior, if not the only realistic, alternative.


18. Kerr (p. 128) characterizes OOP as follows: "Object oriented programming focuses on building systems that reapply existing code chunks and program designs." In that sense, E-forms software itself can be thought of as a software application object, comprised of the commonly required properties and methods for processing highly structured data. Once the E-forms object has been made available for one highly structured data collection and processing purpose within an organization, it can also be used for all other such purposes. Kerr (p. 136) also notes as one of the benefits of OOP:


The combination of inheritance and encapsulation helps systems evolve gracefully. The "freeze and fix" mentality associated with conventional languages (i.e., the system in need of maintenance is taken from the user while it is being repaired) is replaced with a real-time systems maintenance initiative.


The same is true of the application of E-forms software, in contrast to database-oriented solutions.
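
As a minimal sketch of the point, assuming hypothetical class and field names, a base E-forms class can encapsulate the commonly required properties and methods once, with each new form inheriting them rather than re-implementing them:

    # A minimal sketch of E-forms software as a base class object.
    # Class and field names are hypothetical.
    import json

    class EForm:
        fields = {}                           # field name -> validator function

        def __init__(self):
            self.data = {}

        def fill(self, name, value):
            if not self.fields[name](value):  # common validation method
                raise ValueError(f"invalid value for {name!r}: {value!r}")
            self.data[name] = value

        def submit(self):                     # common submission method
            missing = [f for f in self.fields if f not in self.data]
            if missing:
                raise ValueError(f"missing fields: {missing}")
            return json.dumps(self.data)      # e.g., hand off to a database

    class TravelVoucher(EForm):
        # A derived form declares only its own fields; fill() and submit()
        # are inherited and can evolve in the base class without taking
        # the derived forms away from users for maintenance.
        fields = {
            "traveler": lambda v: isinstance(v, str) and v.strip() != "",
            "amount":   lambda v: isinstance(v, (int, float)) and v >= 0,
        }

    voucher = TravelVoucher()
    voucher.fill("traveler", "Smith")
    voucher.fill("amount", 142.50)
    print(voucher.submit())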


19. At a symposium on software usability engineering at the National Institutes of Health on March 3, 1997, objection was raised to the thought of considering humans as mere components (or objects) of information systems.


20. Self-documentation and reuse can be fostered through the use of electronic document management systems (EDMS) and well-integrated forms automation (E-forms) applications.


21. With the advances in COTS software applications, a very real question is whether Federal agencies need to be in the software development business at all. Indeed, Raines' Rules No. 2 and 3 (1997) specify that Federal IT projects should only:


2. Be undertaken because no alternative private sector or governmental source can efficiently support the function.


3. Support work processes that have been simplified or otherwise redesigned to reduce costs, improve effectiveness, and make maximum use of commercial off-the-shelf (COTS) technology.


22. If a word association experiment were conducted, even in this era of political correctness, the most frequent response to "object" might still be "sex". Among the more sensitive, it might be "affection". Among the bon vivant, it might be "art". Of nearly 50 synonyms covering both pronunciations of the word in the WordPerfect thesaurus, the author's personal favorite is "doohickey" -- as in doohickey-oriented programming (DOP). Those who are somewhat more intimidated by the concept may prefer the synonyms "victim" or even "scapegoat" -- in which case the acronyms would be VOP or SOP, respectively, for victim-oriented or scapegoat-oriented programming. (However, SOP would be confusing because it could also stand for sex-oriented programming, which some folks might equate with "standard operating procedures.")


23. Perhaps a first step toward wider acceptance of OOP would be for IT systems developers to recognize E-forms software as a base class software applications object -- comprised of the feature set commonly required for the collection and processing of highly structured data. If software developers cannot understand the concept of reuse of features that are commonly required, there is no reason to think that the average user of IT systems should.


24. Several of these definitions are cited in Ambur (August 3, 1995, p. 1).


25. To stretch the concept further than the author would prefer, a document "object" might be considered to be the encapsulation of the message whose transmission from the customer "object" to the supplier "object" (and/or vice versa) is the event that triggers the desired method or procedures with respect to other objects (e.g., delivery of goods, payment for services rendered, or preparation of further documentation).
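
Stretched into code, the note's reading might be sketched as follows; all of the class and message names are hypothetical:

    # A speculative sketch of the note's OO reading: the document object
    # encapsulates a message, and its transmission to the supplier object
    # is the event that triggers the desired method.
    class Document:
        def __init__(self, message):
            self.message = message              # the encapsulated message

    class Supplier:
        def receive(self, document):            # transmission = triggering event
            if document.message == "purchase order":
                return self.deliver_goods()
            if document.message == "payment":
                return self.prepare_receipt()   # further documentation
            return "no method bound to this message"

        def deliver_goods(self):
            return "goods delivered"

        def prepare_receipt(self):
            return "receipt prepared"

    print(Supplier().receive(Document("purchase order")))   # -> goods delivered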


26. Forms and "paperwork" in general are widely derided as needlessly burdensome. However, many of the complaints are the functional equivalent of shooting the messenger. No doubt, many information and data collection and processing activities are needlessly complex and redundant, but informed discussion should focus on the roots of the problem -- which are the use of inadequate and inappropriate information systems, as well as the lack of data sharing. If the need for highly structured data is justified and the data cannot be more efficiently derived from existing systems, then a form certainly should be used to gather it from the people best qualified to supply it. Likewise, if information is truly needed and cannot be described by the customer in a highly structured format in advance, then it should be provided in relatively unstructured text and graphics that best represent the supplier's understanding of the customer's needs.


27. This classification was first set forth by Ambur in September 1995 for consideration by the Systems Architecture Team for the U.S. Department of the Interior and was reiterated in Ambur (1996, pp. 2 & 3).


28. Turban (p. 387) points out that one of the main objectives of organizational information requirements analysis (OIRA) is to avoid fragmented, non-integrated systems. The first step of OIRA is to define the underlying processes that are fundamental to the operation of the organization. Few, if any, analysts would suggest that organizations should develop their own E-mail software, which is most appropriately used for informal, non-business-quality communications. Thus, it is curious that proprietary, non-integrated systems are so often recommended for E-forms and EDMS purposes. This appears to be evidence of the need for organizations and analysts to move up the Capability Maturity Model (CMM) in terms of their ability to understand and to meet requirements associated with highly structured data and relatively unstructured documents, which together with E-mail circumscribe most organizational processes, to the degree that they can be reflected in IT systems.


29. The aim of planning is to define a course of action and the purpose of database planning is to impose structure on the data gathered and processed by the organization. However, the database approach fails to recognize that the data required by the organization is already defined in the forms used by its people. In one Federal agency alone, the U.S. Fish and Wildlife Service, more than 2000 forms have been officially registered and enumerated. (Walker, 1997) Simply by automating their forms, organizations can shorten the critical path to an efficient and effective enterprisewide information management system for highly structured data.


30. Regarding business process reengineering (BPR), May (p. 48) says, "The focus of redesign is no longer to engineer costs out of existing processes but rather to design value into new processes. The game has migrated from big-bang 'fix-it-all' to fix a process at a time." (emphasis added) E-forms address each highly structured data collection activity in exactly that fashion, while the use of a single forms automation client eliminates the need for redundant applications software programming.


31. Recall Kerr's (p. 7) previously cited argument that a single application that meets a multitude of requirements is better than many smaller applications, and that systems must be built independently of how the firm is organized.


32. Kerr (p. 160) says, "Reverse engineering ... is aimed at converting old applications into new integrated database systems." Taking a broader view, the issue is whether people are the "old applications" or whether the databases are -- whether people should be "reverse engineered" or whether the databases should be. An approach which accommodates the needs of the databases to those of people would seem to have a significantly greater chance of success, and the needs of people would seem to be much better served by a single forms automation client than by myriad connections to myriad stovepipe database applications.


33. EZX Corporation (1997) defines an E-form as follows:


Simply it is the on-screen, computerized manifestation of a form that is typically filled out on a piece of paper. Since filling the form is done within a computer program, many features unavailable with a typewriter or pen are available, including: Background/image protection, automatic fill-out field advance, calculations, validations, field specific help, non-printable on-screen clues, etc. The Eform is usually designed to emulate the paper form it replaces exactly. The user of an Eform simply fills it out according to the guidelines and rules set down by the Eforms designer. After the typical Eform is filled out/in the user can save it to file, send it via email, etc. or print a hard copy to blank paper or pre-printed forms. When printing to pre-printed forms, only the data entered is printed. In summary, an Eform is somewhat similar to Email, but an Eforms processor is quite different from a word processor, spreadsheet, etc. ...it is uniquely suited to processing forms.
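
A minimal sketch of three of the enumerated features -- automatic field advance, validation, and calculation -- might look like the following; the field names and rules are hypothetical:

    # A minimal sketch of automatic field advance, validation, and
    # calculation in an E-form. Field names and rules are hypothetical.
    FIELDS = [
        ("quantity",  lambda v: v.isdigit() and int(v) > 0),
        ("unit_cost", lambda v: v.replace(".", "", 1).isdigit()),
    ]

    def fill_out(values):
        data = {}
        # zip() advances from field to field automatically as each value
        # passes validation.
        for (name, is_valid), value in zip(FIELDS, values):
            if not is_valid(value):
                raise ValueError(f"{name}: {value!r} rejected")
            data[name] = value
        # Calculation: a derived field the form filler never has to type.
        data["total"] = int(data["quantity"]) * float(data["unit_cost"])
        return data

    print(fill_out(["3", "19.95"]))   # total calculated as roughly 59.85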


34. In OO terms, training might be said to be the encapsulation of methods (or procedures) in the crania of people-objects, so that such procedures are carried out by the people-objects upon receipt of the appropriate messages (or stimuli) from other objects in their environment, e.g., the boss, a customer or colleague, the telephone, a computer screen, etc. With respect to a leading technology company's E-forms effort, Gross (1995) quoted a Motorola spokesperson as saying, "Any time you have to do extensive training ..., you're failing. You don't need that for paper forms; why should you for electronic ones?" Likewise, organizations should not have to train users how to supply data for several or many database applications when one E-forms application can be used with all of them.


35. Electronic slaves to many masters would be another apt, albeit less object-oriented, analogy for the average user who is expected to interact with many databases through many different applications.


Turban (p. 355) defines "end-user computing" as "the use and/or development of information systems by the principal users of the systems' outputs ..." However, if those who are required to spend the most time and effort on the system are those who supply, rather than "use," the data, how can it truly be said that the "customers" rather than the suppliers are in fact the "end-users"? Such a definition betrays a DBD/DBA bias. In fact, the real, primary users of database applications are those who are called upon to supply the data, and the application should be designed with their needs and interests primarily in mind.


36. For discussion of "infomating" and the "informationalization" of systems, see Davis and Davidson (p. 17 and first end-note to chapter 1).


37. In the context of database processing, Kroenke (p. 218) defines "form" as "... a screen display that is most often used for data entry and edit. Forms can also be used to report data, but in most cases when developers speak of forms, they mean those used for data entry and edit." For a form to seem natural and easy to use, he emphasizes that the structure of the form must reflect the structure of the object that it materializes, and he discusses some of the ways to accomplish that objective. (pp. 219 - 228)


38. Discussing continuing advances in PC capacity with reference to imaging applications, Alsup (1997) says:


One irony of all this new performance is that desktop document imaging and document management didn't really need much more PC performance... [However] additional PC performance will enable applications to do more with images. For example, forms processing is a standard application that needs configuration for each document type. Faster processors will directly contribute to faster recognition and the interpretation of data on source documents for machine entry. Arguably, this could lead to the elimination of much source document data entry based upon configuring standard PC forms processing software modules to recognize and interpret the data on the forms.


39. Even as existing paper forms should be considered prototypes for E-forms, all IT systems are prototypical depictions of the real world. There is no DBD nirvana. As the saying goes, "it is a journey, not a destination." Continuous improvement is the only realistic objective, and distributed E-forms technology lends itself much better to that objective than stovepipe, mainframe databases do.


Kerr (p. 109) suggests that the way to carry out data modeling is first to consider "each form, report, and inquiry" used in the business process as a "nominated entity." Then each "component of a document" is transformed into "a list of data items." In other words, the business data model is already reflected in the forms and documents used by the organization. While it might be desirable to reengineer those forms and documents, as well as the means by which they are processed, it may not be realistic to think that can be done in a timely fashion. In any event, the lack of reengineering should not be taken as an excuse or justification for failing to use E-forms (and EDMS) software to automate the existing business model.


Kerr (p. 68) also suggests, "It's important to note that DSSs, unlike traditional transaction-driven systems, are never considered complete. The DSS evolves after it is delivered." His point is well-taken. However, the distinction that he draws between decision-support and transactional systems is too narrow. Indeed, transactional systems are DSSs too, albeit at a lower level of data abstraction. While some of the traditional systems may have been fairly static for some period of time, that may be more a reflection of the difficulty in changing legacy systems than of any lack of need to continuously improve them.


Finally, Kerr (p. 170) says of the prototyper's life cycle:


It's no wonder that the prototyper's life cycle is void of a logical design phase: Its focus is the rapid development of applications. Systems can be built quickly when a project team begins its effort by defining a model of the physical system (i.e., prototyping) instead of analyzing requirements and modeling data needs.


Indeed, since physical systems are already in place for handling forms and documents, since those forms and documents already reflect the business and information requirements of the organization, and since there is no such thing as the perfect solution, what can be the excuse for not using E-forms (and EDMS) software immediately to process existing forms, while continuously prototyping them toward the more efficient collection and use of data in the future?


40. For information on computer-aided software engineering (CASE), see Turban (pp. 215-217) and Kerr (pp. 146-166). In his discussion of CASE tools, Kerr (p. 151) unintentionally but quite directly makes the case for a forms automation approach to systems development when he says:


Very large projects are often developed by decomposing them into smaller, implementable components. The integration of these components can become a nightmare without a framework in place that helps coordinate each development effort.


Intelligent, automated forms are the components from which very large systems can be composed, and the forms automation client and designer software constitute the frameworks that can turn the nightmare into a living dream. In effect, Kerr (p. 165) says as much:


The effective use of CASE technology can make the impossible happen. Automated project development allows small teams of people to build large integrated applications.


What he does not say, but which is nonetheless true, is that many different teams can work on many different aspects of a large enterprisewide system, while the forms automation client and designer software serve as the emulsifiers -- the agents that blend the ingredients together into a consistent whole without becoming unduly restrictive, binding, or resistant to evolution and change.


41. The prevailing view seems to be that work process reengineering should be accomplished prior to implementation of IT automation. Even with reengineered processes, workflow automation can be a highly complex and arduous task. While process improvements for which there is consensus (and/or effective leadership) should obviously be considered in any IT design process, seldom if ever are such changes obvious in the sense of being a matter of consensus -- for the same reason that a single database cannot be designed that represents everyone's view of the world. Thus, like the database approach to systems design, the reengineering approach is a prescription for failure in many, if not most, instances. Moreover, even in those rare instances in which both reengineering and workflow automation do have sufficient support, technical challenges and practical considerations are likely to draw out the development process for an inordinate amount of time, if not lead to outright failure.


While there are many problems with workflow automation, one hopeful development has been the formation of the Workflow Management Coalition (WfMC) to develop standards for interoperability among workflow automation systems. When the standards and the technology are mature, it will be possible to plug in and use different workflow automation tools interchangeably with different standards-compliant EDMS and E-forms systems. Until then, workflow automation should be approached with caution and only after more basic EDMS, E-forms, and E-mail capabilities have been mastered and stabilized.


42. Of Motorola's experience with forms automation, Gross (1995) noted:


It is important not to over-analyze each form ... On simple forms, the philosophy was to do them cheaply and revise quickly. Enabling print-on-demand capability for high-use forms can be done by almost anyone; you don't need a software engineer and the result is an immediate drop in printing costs. Those forms can later be transitioned to fully automated ones, if necessary.


Indeed, for systems involving the collection of highly structured data, proceeding form-by-form and process-by-process is the essence of chunking as required by Raines' Rule No. 7. Even as it is a mistake to try to build the "mother-of-all-databases" in any large organization, it is also a mistake to try to automate all forms at once or even to worry about fully automating the processes inherent in any particular form.


43. ITMRA, section 5122, requires Federal agencies to take into account potential benefits and costs to other Federal, State, and local agencies when designing IT systems. Section 5113(b)(3) requires OMB to establish an effective planning process that includes consideration of common needs that should be served by interagency or Governmentwide systems. While it would be inappropriate for OMB or anyone else to try to dictate that all Federal agencies implement any particular software product, from a functional standpoint, all agencies have common needs to share informal/unstructured information (E-mail) and to manage and share formal/relatively unstructured documents and formal/highly structured data. Thus, it would seem to be in the best interests both of the taxpayers and the agencies themselves if OMB were to direct planning toward Governmentwide E-mail, EDMS, and E-forms capabilities based upon open-systems standards and COTS software.


It would also seem logical to start with a small set of data elements that are common to all agencies, including a core set of document metadata. OMB should delegate authority to an agency such as GSA to act as the central registry, i.e., to maintain the data dictionary. Once the core elements have been validated, implemented, and established as Governmentwide standards, affinity groups should be encouraged to extend the basic set for more detailed and specific purposes. Individual agencies and offices should not be prohibited from establishing their own data elements. However, as required by ITMRA, they should be expected and required to take into account cross-cutting needs and opportunities with other agencies and offices. Practically speaking, that means they should check the Governmentwide data dictionary before establishing their own elements, and when they do establish new elements, those elements should be registered in the dictionary. Of course, like all other highly structured data collection activities, new data elements should be proposed and processed in a distributed E-forms system.
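
The registry discipline described above might be sketched as follows; the simple dictionary standing in for the Governmentwide data dictionary, and the element names, are hypothetical:

    # A minimal sketch of the registry discipline: check the data
    # dictionary before establishing a new element, and register
    # anything genuinely new. Names are hypothetical.
    registry = {
        "doc_title":  "Title of the document",
        "doc_author": "Person or office responsible for the document",
        "doc_date":   "Date the document was created",
    }

    def establish_element(name, definition):
        if name in registry:                   # reuse; do not reinvent
            print(f"{name!r} already registered: {registry[name]}")
            return registry[name]
        registry[name] = definition            # extend the common core
        print(f"{name!r} registered as a new element")
        return definition

    establish_element("doc_title", "Title")                  # reused
    establish_element("permit_no", "Agency permit number")   # new, registered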


44. Turban (p. 349) discusses the distinction between "decentralized" and "distributed" computing. He explains, "decentralized computing breaks centralized computing into functionally equal parts, with each part essentially a smaller, centralized subsystem." By contrast, "distributed computing ... breaks centralized computing into many computers that may not be (and usually are not) functionally equal."


45. In discussing DSSs, Kerr (p. 62) suggests that a form should be developed for use in determining IS development priorities for decision support. However, he notes (p. 67) that the DSS team may have to wait months for the "feeder systems custodians" to provide the data transfer capability necessary to support the systems. His suggested solution is: "Keep management aware during each step of the way! Make the interface requirements a management priority." Aside from the fact that management should determine its own priorities and it is doubtful that IT staff can drive them, seeking additional guidance is no substitute for providing the technology and the tools appropriate to the task. To the extent that the information involved is highly structured data, E-forms software is the single UI required. If management does not understand the need for E-forms, they should be educated. However, the first step is for IT professionals to recognize that multiple interfaces to myriad databases are no substitute for a single E-forms application encompassing all of the commonly needed functionalities.


46. Reasons for recording data redundantly in different databases include such things as security and access rights as well as database and network performance issues. Even within a single database, there may be reasons to "denormalize" the database structure, in which case the task of maintaining referential integrity falls to the applications software. E-forms software can be used to maintain referential integrity of data not only within a single database but across multiple databases.
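
As a minimal sketch of the point, assuming hypothetical database and field names, the form itself can perform a cross-database check that neither database management system could enforce on its own:

    # A minimal sketch of a form maintaining referential integrity across
    # two separate databases, using sqlite3. Database and field names are
    # hypothetical.
    import sqlite3

    personnel = sqlite3.connect(":memory:")
    personnel.execute("CREATE TABLE employee (emp_id INTEGER PRIMARY KEY, name TEXT)")
    personnel.execute("INSERT INTO employee VALUES (1, 'Smith')")

    payroll = sqlite3.connect(":memory:")
    payroll.execute("CREATE TABLE pay_record (emp_id INTEGER, amount REAL)")

    def submit_pay_form(emp_id, amount):
        # Cross-database check that no single DBMS can enforce by itself.
        found = personnel.execute(
            "SELECT 1 FROM employee WHERE emp_id = ?", (emp_id,)).fetchone()
        if found is None:
            raise ValueError(f"employee {emp_id} not in personnel database")
        payroll.execute("INSERT INTO pay_record VALUES (?, ?)", (emp_id, amount))

    submit_pay_form(1, 1200.00)        # accepted: employee 1 exists
    try:
        submit_pay_form(99, 1200.00)   # rejected at the form, not the database
    except ValueError as err:
        print(err)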


47. In addition to its existing library of Federal forms, Elite offers to create additional forms for agencies for as little as $75.00 per surface. In other cases, for competitive advantage, Elite has agreed to convert forms from other software formats at no charge.


48. Ramakrishnan (1997) provides a very good summary of the incorporation of digital signature capabilities into an EDMS. While digital signature capabilities can be addressed through E-forms, E-mail, or even Web browser software, as a cross-cutting functionality, it is logical to provide such capabilities via EDMS software. E-forms data transmissions certainly warrant digital signatures, but it is doubtful that most E-mail messages or "browsing" activities do. If the functionality were free and technically unchallenging, perhaps there would be no reason for it not to be ubiquitous in all software applications. However, that does not seem to be the case. To the extent that EDMS technology is best suited to all formal information management needs, including the management of E-forms as a class of documents, it would seem to be the most appropriate tool in which to address digital signature requirements in a common, all-encompassing fashion.


49. To date, Lotus's marketing strategy has been based largely upon two principles -- replication of data across multiple databases and support for rapid applications development. The company has relied upon third-party developers to deliver the actual applications that people and organizations need. In addition, it should be noted that Notes is a proprietary database. As McCready (1997, p. 60) puts it, "The Notes sales model is to sell Notes as an enterprisewide messaging environment that has great enhancement functionality and 10,000+ business partners eager to develop custom solutions." Whether a "messaging" system is the appropriate platform for formal documents is questionable, regardless of whether the documents are relatively unstructured or highly structured (forms). And Raines' Rule No. 6 explicitly directs Federal agencies to "avoid custom-designed" IT components.


50. McCready (1997, p. 60 & 61) answers his own question with respect to what will become of independent software vendors (ISVs) if Microsoft and IBM/Lotus actually begin to deliver the desktop functionalities, such as forms automation and document management, that people need. He points out that the choice for organizations between Notes and Microsoft Exchange is a matter of infrastructure, and he argues:


... if they raise the functionality too much, the software becomes too complicated and time-consuming to deploy and maintain. Sales cycles become protracted and third parties have less and less value-add. Furthermore, the price will become too high to deploy as an infrastructure technology.


Economics is on the ISV's side. Infrastructure is expensive and rarely yields an appropriate return. For example, replacing a mainframe E-mail system with Exchange or Notes is likely to give you a 40% three-year ROI and a five-plus years' payback. In comparison, a more discrete investment in document management will yield savings of better than 400% and a payback period of less than 12 months.


Distinguishing between infrastructure and applications, he observes, "Infrastructure costs, applications pay." He says that ISVs despair that Notes and Exchange are commonly sold in lots of 10,000 seats or more. However, he suggests that such deals are merely "readying the market for the applications that really pay..." In any event, the question is how many different desktop applications people and organizations really need for forms automation and document management purposes, regardless of the means or vendors by which they are delivered.


51. With reference to its EDMsuite, Gagnon (1997) asserts that IBM/Lotus still has a long way to go. He says, "They took a whole bunch of products, kludged them together and called it a suite. It's not integrated, and the products previously didn't have a very good position in the marketplace, so why would they now?" Of groupware on the Web, he argues, "You don't need the better functionality like that which Notes gives you, such as better security, better replication, better management of the discussion thread. That doesn't work with a bunch of people who work with each other infrequently." Indeed, some organizations have prematurely or inappropriately embarked upon the technically and institutionally challenging task of automating workflow -- often to discover that most of their work is ad hoc, in which case the workflow automation system becomes needlessly restrictive while the administrative burden of supporting it becomes counterproductive overhead.


52. The second of Raines' Rules is that Federal agencies should only engage in those activities for which they and they alone are best qualified. In few, if any instances would that include the development of IT systems and software.


Perhaps the world's biggest developer of forms is the Internal Revenue Service. Pejman (1997) paraphrases Rep. Armey as saying that Americans spend 5.4 billion hours each year preparing tax forms and pay $200 billion to file those forms. Since IRS is such a big developer of forms, one might imagine that the agency is pretty good at designing and automating them. To the contrary, IRS's experience is indicative both of the reason that Federal agencies should not be in the software development business and of the problem with the database approach -- regardless of whether it is based on top-down or bottom-up planning. Crenshaw (1997) reports:


In many ways, the agency's efforts to upgrade its computers stand as an allegory for all of its difficulties -- and the solution to many of them. The IRS has tried one grand solution after another, conceived and largely carried out entirely within the agency, only to see each one come to grief.


The Deputy Secretary of the Department of the Treasury is quoted as observing, "A crucial problem is that people try to build the Taj Mahal and then specifications change." In the meantime, Crenshaw continues:


... the agency patched together an array of other highly complex stand-alone systems. Today the agency still conducts its business on a mixture of 1960s mainframe computers and these [new] separate, "stove pipe" systems. Such systems take data in at one end, process it, and put the results out the other, but cannot communicate easily with each other.


Thus, it is evident that neither the grand, top-down approach nor a more segmented, bottom-up approach has worked for IRS. Critics say there are two principal reasons for the agency's failure: first, a lack of clear goals and a plan for achieving them; second, an insular internal culture that led officials to try things in-house that the agency could not accomplish. As a result, Crenshaw reports that "... IRS canceled $36 million worth of technology projects last year because there wasn't an adequate 'business case' to continue them." He paraphrases the chief computer scientist at the General Accounting Office (GAO) as observing that "all government computer projects run into trouble" and that "the government must learn to cut projects into manageable pieces and begin a new segment only after previous ones have shown they work ..." (emphasis added)


53. Varon (1997, March 31) cites a year-old report by the Chief Financial Officers (CFO) Council that concluded client/server technology would be the architecture of choice for future financial systems. The report indicated such systems would be less costly to modify and easier to integrate with other management applications. According to Varon, the financial management system developed by American Management Systems, Inc. (AMS) will allow users to create custom screens and reports from their desktops. It will also provide document routing capabilities and allow users to link to other applications, such as word processors and spreadsheets.


54. At a March 5, 1997, seminar sponsored by SCO and IBM, entitled "The Internet Way of Computing," an SCO spokesperson was asked why they didn't just call it what it is -- the mainframe way of computing. Her response was frank and to the point: "Because you all would not have shown up" for the seminar. SCO and IBM have an economic imperative to capitalize on the hype, but organizations should not be swayed by it except to the degree that it can be harnessed to foster progress toward meeting business information needs in a fashion that serves the interests of the average user and the organization as a whole. The last thing that most organizations need is a new-fashioned means of developing mainframe stovepipes that become myriad and scattered islands of information that users are expected to visit in order to accomplish their work.


55. Varon (1997, April 14) reports that the Department of Transportation has developed an Electronic Grants Pilot Project, which "would let grant applicants use the World Wide Web to file standard forms requesting aid from a variety of federal government programs." However, "systems developers have to solve a[n] ... urgent problem -- securing transmissions over the Web -- before the pilot can be expanded." Were it not for the hype surrounding the Web, one might think that security would be considered to be an essential requirement in order to avoid the dreaded "vaporware" moniker. At the very least, it is fair to say that a lot of folks are betting on the come. To the extent that Federal agencies may be allocating substantial resources to such applications, they may be violating Raines' Rules No. 6 and 8, respectively, which specify the use of "fully tested pilots" and avoidance of undue risk to the Government.


56. In an informative discussion of the Web versus more traditional client/server technology, Keating (1997) of American Management Systems (AMS) suggests: "The key for us will be an integrated architecture that is built on TCP/IP as the common communications protocol. It would be great if a single middleware product provided all the intermediate services we need -- such as security, directory, system management and transaction management -- but we're resigned to working with a combination of products for some time to come. For us, the Web will be an important element of our architecture, but it certainly won't be all things to all people." Keating treads lightly on the current superstition of the Web, perhaps not wishing to be stoned for blasphemy. Mythology and motivation aside, for highly structured data collection and processing needs, E-forms software is the appropriate middleware. From a functional standpoint, whether it runs on TCP/IP or some other network protocol is immaterial.


Turban (p. 354) says:



Middleware can translate client requests into a form that certain servers can better understand and translate server responses into a form a client can better understand... In effect, middleware is in a gray area between doing work that a client or server should do and work that some sort of system "overseer" should do.


In effect, paper forms have always served as middleware between the suppliers and customers of highly structured data. Likewise, E-forms software performs a middleware function between client workstations and databases. Turban (p. 354) also discusses "frontware" -- as "tools that allow developers to create appealing graphical user interfaces for existing mainframe programs..." The question is why different tools should be used for interfaces to different mainframe databases, when in fact what users need is a single UI to all databases.
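
A minimal sketch of that middleware function, with hypothetical formats and names, is the following: one form-level interface, with per-host adapters translating the filled-in form into whatever each host "can better understand":

    # A minimal sketch of E-forms software as middleware: a single
    # form-level interface, plus per-host adapters. Formats and names
    # are hypothetical; the SQL adapter is naive, for illustration only.
    import json

    def to_sql_insert(data):                   # relational host
        cols = ", ".join(data)
        vals = ", ".join(repr(v) for v in data.values())
        return f"INSERT INTO requests ({cols}) VALUES ({vals});"

    def to_fixed_width(data):                  # legacy mainframe host
        return data["name"].ljust(20) + str(data["amount"]).rjust(10)

    def submit(data, translators):
        # The same filled-in form is handed to every interested host,
        # each in its own format.
        return [translate(data) for translate in translators]

    form_data = {"name": "Smith", "amount": 42}
    for output in submit(form_data, [to_sql_insert, to_fixed_width, json.dumps]):
        print(output)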


Concerning mainframe applications, Turban (p. 348) points out that "... businesses are increasingly choosing either to shelve ... or rewrite them for smaller, less expensive hardware like desktop PCs." Certainly, browser-to-host technology is far less expensive than traditional mainframe architectures and the marketplace endorsement of a standard communications protocol (TCP/IP) is a tremendous step in the right direction. Nevertheless, host-based Web applications have many of the characteristics of older-mainframe applications, albeit with a prettier, more graphical face.


Significantly, from the perspective of users, they may now be faced with a plethora of Web-based applications, whereas in the past they may have had to contend with only one, two, or a few mainframe applications. Moreover, from the organizational perspective, a multiplicity of browser-to-host connections may not sum to an efficient and logical whole. Worse yet, the question remains as to why common functional applications software should be programmed and reprogrammed repeatedly at multiple hosts when what the users really need is a single application at their desktop. The only apparent reason is the egocentricity of the DBDs and DBAs who think that they, rather than the users, own not only the data but also the application software. Such are the shades of the mainframe environment.


57. Turban (p. 351) points out: "A client is generally agreed to be any system or process that can request and make use of data, services, or access to other systems provided by a server... A server is generally agreed to be any system or process that provides data, services, or access to other systems for clients, most often for multiple clients simultaneously (as a shared resource)." By definition, a browser is a client. In the relatively early stages of their life cycle, browsers have been considered to be "thin" clients, with application functionalities being provided at the host. For some applications, that may be appropriate. However, there is no reason to believe that it should be the modus operandi for all or even most business applications.


58. Hypertext links look suspiciously like old DOS menus. Buttons and clickable images spice up the presentation, but it is hard to see how they add up to the kind of standardized interface that the average user within an organization should have to do their business.


59. Depending upon the formality that each form requires, it may be routed and shared in E-mail (informal/unstructured) or managed and processed in an EDMS (formal/relatively unstructured). If highly structured processing is required and justified, a workflow automation component may be added. However, unless forms are the only type of document warranting such treatment, the workflow component should be implemented as an adjunct to the EDMS, rather than to the forms automation client. Moreover, since workflow automation may be highly complex and is technically challenging, as suggested by Raines' rule on chunking, it should be implemented separately, after the more basic EDMS and E-forms capabilities have been mastered.
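
The routing rule described in this note might be sketched as follows; the categories come from the note, while the function and form names are hypothetical:

    # A minimal sketch of routing each form by the formality of
    # treatment it requires. Function and form names are hypothetical.
    def route_form(form_name, formality):
        if formality == "informal":
            return f"{form_name}: route and share via E-mail"
        if formality == "formal":
            return f"{form_name}: manage and process in the EDMS"
        if formality == "formal+workflow":
            # Workflow is an adjunct to the EDMS, added separately and
            # only after the more basic capabilities are mastered.
            return f"{form_name}: EDMS, with a workflow component attached"
        raise ValueError(f"unknown formality: {formality}")

    print(route_form("lunch signup", "informal"))
    print(route_form("travel voucher", "formal"))
    print(route_form("procurement request", "formal+workflow"))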