CHI 97 Electronic Publications: Design Briefings

Evolution of a User Interface Design: NCR's Management Discovery Tool (MDT)™

James F. Knutson
(work completed while employed by NCR)
Gateway 2000
610 Gateway Drive W-29
North Sioux City , SD
605-232-2000
knutsjim@gw2k.com

Tej Anand and Richard L. Henneman
NCR Human Interface Technology Center
Promenade 1, Room 8147
1200 Peachtree Street
Atlanta, Georgia
404-810-7145, 404-810-7083
(tej.anand, dick.henneman) @atlantaga.ncr.com

ABSTRACT

Many companies are developing large data warehouses to understand their customers and business trends better; however, tools to analyze these data typically require significant expertise. Because of this, NCR has developed the Management Discovery Tool (MDT) for the typical manager who wants answers to business questions without having to know SQL or database table and column names. We provide an overview of the user-centered design process used to design one part of the MDT (the "Folders View" dialog) and give rationale for design decisions.

Keywords

User-centered design, iterative design, prototypes, mockups, data mining, data analysis, data retrieval.

© Copyright ACM 1997



INTRODUCTION

To understand customers and business trends better, businesses are increasingly storing and analyzing large amounts of detailed data from their business transactions. Once these data have been organized into a consistent framework and placed in a relational database, they are commonly referred to as a "data warehouse". Retailers, financial institutions, insurance companies, transportation companies and many other types of businesses hope to exploit the huge storage and computational capabilities of parallel processing computers to realize new planning and operational control efficiencies. It is estimated that 95 percent of FORTUNE 500 companies have launched data warehousing initiatives [1]. A key problem in exploiting the benefits of a data warehouse, however, is generating meaningful information from raw data; for example, masses of raw data are not particularly useful to a time-pressured retail buyer who is trying to determine next season's product mix. Current analysis tools such as SQL database queries are not useful to the "typical" user.

Thus, NCR Corporation built the Management Discovery Tool (MDT) to allow manager/executive users to access and analyze data in their data warehouse from a personal computer without having to know the SQL database query language or anything about the structure of their data. MDT is a strategically important product for NCR in that it significantly increases the accessibility of data stored on NCR parallel processing computers.

MDT works as follows. First, a user decides what type of analysis needs to be performed on the available data. The user then defines a software agent (called an "analyst") using a wizard-style dialog. A typical analyst contains a measure (e.g., Sales), a segment (e.g., shirts), and a time period. Analysts can be run at any time, be scheduled, or be triggered when a certain event occurs (e.g., Sales > $25,000). When an analyst is executed, MDT creates a hypertext document called an InfoFrame™, which is returned to a folder on the user's personal computer [2]. Alert Messages are InfoFrames that report that a certain measure went over a threshold.
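As a rough illustration of the pieces an analyst carries (a measure, a segment, a time period, and an optional schedule or trigger), a minimal sketch might look like the following. All field names and the trigger representation are assumptions for illustration, not MDT's actual data model.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class Analyst:
    """Hypothetical sketch of an MDT 'analyst' definition."""
    name: str
    measure: str                                 # e.g. "Sales"
    segment: str                                 # e.g. "Shirts"
    time_period: str                             # e.g. "This Month"
    schedule: Optional[str] = None               # e.g. "weekly"; None = run on demand
    trigger: Optional[Tuple[str, float]] = None  # e.g. ("Sales", 25000.0)

    def should_fire(self, current: dict) -> bool:
        """True when the watched measure exceeds its threshold, i.e. when
        an Alert Message InfoFrame would be generated."""
        if self.trigger is None:
            return False
        measure, threshold = self.trigger
        return current.get(measure, 0.0) > threshold

shirt_watch = Analyst("Shirt sales watch", "Sales", "Shirts", "This Month",
                      trigger=("Sales", 25000.0))
print(shirt_watch.should_fire({"Sales": 31000.0}))  # True
```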

The paper proceeds as follows. First, we provide background information concerning typical MDT tasks, users, and environment of use. Second, we describe the user-centered development process. Third, we discuss a series of successively refined user interface mockups (developed in Microsoft® Visual Basic™), including the evaluation methods and design recommendations. (Only four designs are discussed, although there were numerous other designs along the way.) Finally, we note some general lessons learned. Throughout, we describe the two-year design process used to design one important part of MDT, the "Folders View" user interface. The Folders View provides a storage place in MDT for InfoFrames, Alert Messages, and Analysts; users access InfoFrames and initiate analyses through it.

THE MANAGEMENT DISCOVERY TOOL

Four types of user interact with MDT: manager/executive, knowledge worker, administrator, and NCR Professional Services user. Only the manager/executive and knowledge worker interact with the "Folders View" interface. Manager/executive users will primarily use MDT to view Analysts and InfoFrames; they typically know how to use a spreadsheet or word processor on a personal computer but have no expertise in data analysis or data mining tools. The knowledge worker has expertise in data analysis and mining tools. The administrator has in-depth knowledge of databases and database tools and would most likely be the company's database administrator. The NCR Professional Services user has expertise in data warehousing and databases and will assist the administrator with MDT installation and setup.

Tasks and Environment

MDT will likely be used in an office setting. The primary tasks of the "Folders View" user are to: (1) view and run analysts; (2) view InfoFrames and Alert Messages; and (3) create, rename, and move folders and folder contents.

MDT is a three-tier client-server application. The client application runs on a Windows NT or '95 personal computer. The application server runs either UNIX or Windows NT. The client sends a request to the application server, which translates the request into SQL statements and queries the data warehouse. After using some intelligence to elaborate on interesting findings, the application server creates an InfoFrame and sends it back to the client.
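The application server's translation step can be sketched as a mapping from the user's business vocabulary (measures, segments, time periods) to warehouse tables and columns. This is a hypothetical illustration: the table, column, and mapping names below are invented, not taken from MDT.

```python
def to_sql(measure: str, segment: str, period: tuple) -> str:
    """Translate a user's analyst request into SQL, hiding table and
    column names from the end user. The mappings are invented examples
    of the kind of metadata a real application server would maintain."""
    measure_columns = {"Sales": "SUM(f.sales_amount)"}
    segment_filters = {"Shirts": "p.category = 'Shirts'"}
    start, end = period
    return (
        f"SELECT {measure_columns[measure]} AS result "
        "FROM sales_fact f "
        "JOIN product_dim p ON f.product_id = p.product_id "
        f"WHERE {segment_filters[segment]} "
        f"AND f.sale_date BETWEEN '{start}' AND '{end}'"
    )

print(to_sql("Sales", "Shirts", ("1996-07-01", "1996-09-30")))
```

The user never sees the generated SQL; the server runs it against the warehouse, elaborates on interesting findings, and returns the result as an InfoFrame.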

THE DEVELOPMENT PROCESS

The Team

The Management Discovery Tool (MDT) was developed jointly by developers at three separate locations: the NCR Human Interface Technology Center (HITC) in Atlanta, Georgia; NCR Parallel Systems Scalable Data Warehouse and Client Software (CSG) in San Diego, California; and AT&T Bell Labs in Murray Hill, New Jersey (NCR was previously AT&T Global Information Solutions). Because the HITC has core competencies in user interface design and database mining, it was responsible for the user interface and the utilities for storing, retrieving, managing, and manipulating InfoFrames. Bell Labs was responsible for the intelligent middleware for translating user requests into queries, manipulating and analyzing the returned data, and generating the InfoFrames. CSG was responsible for the interface between the intelligent middleware and the data warehouse. Fifteen people were on the development team, including one cognitive engineer, two intelligent systems experts, and twelve development engineers.

The Process

Early in the development of MDT it was realized that user acceptance would be key to creating a successful product. The HITC (which was responsible for the user interface) has made User Centered Design a cornerstone of its development process [3]. The process and methods of user-centered design were used to define and refine the user interface specification, which was used to derive the underlying system requirements.

The Schedule

The initial project meeting was in February 1995, and the product was released in February 1997. The mockup of the user interface was updated continually. The first user interface iteration was completed in about 3 months and refined thereafter; the internal design specifications took about 4 months to write, with refinements thereafter; and development took about 13 months.

ANALYSIS

Several methods were used to understand the usability of the prototypes, the users, their tasks, the environment of use and underlying technologies. These are presented next.

User Interviews

Interviews were conducted one-on-one by the first author. Each session started with an idea session in which the participant was asked to give requirements for the optimal data retrieval and reporting tool. After the participant had sketched what such a tool would look like, a version of the user interface mockup was shown to him/her. Participants were then asked how the design compared to what they had envisioned, what its strengths and weaknesses were, and how it could be improved. Five formal interviews were conducted.

Focus Groups

The format for the focus groups was the same as for the user interviews. The two focus group sessions (including 16 participants) were moderated by the first author.

Literature and Current Tools Search

To gain a better understanding of how the tasks of data retrieval, analysis, and presentation were currently performed, the first author evaluated commercially available tools, evaluated in-house concept prototypes, and read trade articles. Existing products reviewed for interface ideas included electronic mail applications (e.g., Lotus cc:Mail and Microsoft Mail) and data mining tools (e.g., Business Objects, Metacube by Stanford Technology Group Inc. (acquired by Informix), PowerPlay by Cognos, ESPERANT by Software AG, BrioQuery by Brio Technology, and Visual Warehouse by IBM).

Demonstrations of the mockup user interface

A script was written to demonstrate the mockup, and the mockup was distributed and shown to many potential users through different channels (e.g., trade shows, customer demonstrations, marketing briefs). The mockup was also available on a web site accessible to NCR associates. It created excitement for the future product and provided a mechanism for receiving feedback on the user interface. Later, the mockup was used to demonstrate the user interface design to the manual and online help writers and to the NCR Professional Services organizations that would set up and install MDT at customer sites. The first and second authors gave approximately 100 demonstrations of the mockup; others throughout NCR collectively gave about 100 more.

Writing Development Documents

All development team members (including the cognitive engineer) reviewed the design specifications. The cognitive engineer wrote the Product Description and Product User Interface Design Specification.

Development Team Design Reviews

Early review sessions were less formal, centering around discussion of the latest mockup. Later reviews were more formal. Each time a specification revision was released, indicating large changes to the user interface, the development team reviewed the document and emailed feedback to the cognitive engineer. Within a week, the screens that had been changed were formally reviewed in a design review. Although these discussions often became very heated, they provided outstanding design feedback. They also helped to solidify the commitment to implement exactly what was in the user interface specification. Six formal reviews and many informal reviews were held.

Usability Test

The test consisted of a brief introduction, after which participants performed tasks with the mockup. No training was given because we wanted to assess how easy the mockup was to learn and to use. Participants followed the "think aloud" method, in which the participant verbalizes everything he/she is doing and thinking while performing the task [4]. A progressive usability format was used: before each new participant was run, the user interface was changed to "correct" usability problems encountered by the previous participant. Problem areas typically had longer execution times and elevated error rates. When a participant had problems with a certain task or part of the user interface, he/she was asked to explain what he/she was doing and how he/she expected the user interface to work. After that participant had finished, the user interface was changed to accommodate the participant's expectations, and those changes were evaluated with the next participant. Four users participated in the test.

Figure 1: Initial Design

ITERATION I: THE INITIAL DESIGN

Design

Figure 1 shows the initial user interface design of the "Folders View" user interface. The first iteration was based upon the "Retail Store Assistant", a similar software concept developed at the HITC for grocery chains in the retail industry that allowed a store manager to set a threshold for a certain measure (e.g., Sales > $2500) and receive a report back when that threshold had been exceeded [5]. Findings from a literature review and current tools search influenced the initial design as well.

InfoFrames (called Smart Reports at that time), were stored in a modal "Inbox" dialog. Analysts were called "Templates" and were stored in a separate dialog very similar to the "Inbox" dialog with two tabs labeled "Smart Report Templates" and "Alert Message Templates". To view an InfoFrame, the user double-clicked on the name of an InfoFrame or selected the "View" button.

Evaluation Methods

One interview was conducted by a cognitive engineer with an insurance industry authority employed by NCR. Another interview was conducted with an NCR Product Marketing person who had visited many customers concerning data retrieval, analysis, and presentation. Design reviews were conducted informally with other cognitive engineers and with the development team to provide feedback on the design. Below are the key findings:

1. Alert Messages, InfoFrames, and Analysts (Templates) should be grouped under the same folder. Users in interviews wanted InfoFrames and Alert Messages in the same place as the Analysts (Templates) that created them.

2. Users wanted to have the "Folders View" readily available, not in a modal dialog.

3. The concept of a "template" was confusing. Most people likened the templates to intelligent agents. It seemed strange to users that a report template would run on a schedule or be triggered based upon a change in a measure (e.g., Sales > 25,000); for an agent, however, being scheduled or triggered seemed quite natural.

4. The user interview revealed that one could logically group "Segments" and "Measures" as "Business Information". Segments are categories of data such as "Atlanta customers". A Measure is a number that one tracks such as "Sales".

5. Users expressed a need to quickly and easily create and delete folders of Analysts, InfoFrames and Alert Messages.

Figure 2: Iteration II

ITERATION II

Design

The second iteration (Figure 2) brought the lists of Analysts, InfoFrames and Alert Messages together into the same folder. Folders were represented as tabs and users could dynamically create, delete or rename folders (tabs).

Evaluation Methods

Two interviews were performed for the insurance industry with financial consultants. The first interview was with a user who had been the CEO of 2 large life insurance companies. The second interview was with a former vice president of a property and casualty insurance company.

Two focus groups were conducted: one with employees of a large American clothing retailer and one with employees of a transportation company (a railroad). Participants included inventory replenishment managers, buyers, database administrators, data engineers, human factors engineers, associate systems engineers, business systems consultants, and decision technology professionals.

Besides development team design reviews, Administrator and NCR Professional Services users were interviewed to gather requirements for the setup user interface. Following are the key findings:

1. Users liked and understood the concept of an "agent". One person in a focus group stated: "I gravitated to (the idea of an agent) very quickly. That makes sense to me... I visually understood the little spy there with his briefcase... I set him up with... and how often I want him to report back... That was very clear to me."

2. A description of button functions is needed. In a focus group a user said: "You might also put a status bar at the bottom that as you roam the mouse over buttons or over text fields that you get a sense of what it is or what you can do."

3. Three views of agents and InfoFrames were needed. In the focus groups, users wanted a view listing just analysts (agents), a view listing just InfoFrames (Smart Reports), and a summary view with both.

4. Users wanted the same analyst (agent) to be allowed in two different folders. One user stated: "Can I have the same...agent in two (folders)?...Okay. I like that."

5. To edit and run analysts, users wanted to drag and drop.

6. Template/analyst reuse was important. Focus group participants wanted the ability to reuse MDT InfoFrames: to grab someone's report, change it, and then run it.

7. There was a need to "edit" (cut, copy, paste) to and from InfoFrames. From a focus group and a user interview came the requirement that text generated in an InfoFrame should be editable and reusable. This capability was added later.

8. Users indicated that they might have as many as ten or more folders, which created all of the problems associated with multiple rows of tabs or scrolling rows of tabs.

9. One concern voiced at focus groups and interviews was that there should be a second chance to save something that was deleted before actually getting rid of it.

10. Users were concerned that the folders view would get buried in a design such as in Iteration II.

11. Users wanted to know immediately if an Alert Report (Alert Message) had been returned. One interviewee said: "I used to say to the president of the company, it is 6:00 tonight, do you know what your exposure is? And he didn't. If he had a blinking light that went on and said, wait a minute, your million dollar exposure just went to 2 million dollars, you ought to do something about it."

12. The concept of an "agent" worked well; however, the name "agent" was inappropriate. In the insurance industry the term is automatically associated with "insurance agent". One interviewee stated: "(Agent) is a very important term to insurance companies. You might think of a better name than agent..."

13. Placing labels on toolbar buttons creates internationalization problems, including having to accommodate long button names, having to redo the icon bitmaps for every language, and screen real estate constraints.

14. Tabs were a very limiting metaphor for a folder. This design used tabs in an unconventional way, which turned out to be a problem because (1) focus groups revealed that users would typically have between ten and fifteen folders, not three or four as had originally been thought, and (2) one could not nest folders within folders as in most file management applications.

Figure 3: Iteration III

ITERATION III

Design

The third iteration (Figure 3) replaced button labels with pop-up labels that become visible when the mouse is moved over a button. This avoided some internationalization problems (e.g., long label names, new bitmaps for each language) and took up less screen real estate.
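The internationalization benefit can be sketched as a locale-keyed string table: the icon bitmaps stay fixed, and only the pop-up label text is swapped per language. This is a minimal illustration; the button identifiers and translations are invented, and the actual product used Visual C++ resource files rather than anything like the code below.

```python
# Hypothetical locale-keyed tooltip strings. Only text varies by language;
# the toolbar icons themselves never need to be redrawn.
TOOLTIPS = {
    "en": {"run_analyst": "Run analyst", "new_folder": "New folder"},
    "de": {"run_analyst": "Analyst ausführen", "new_folder": "Neuer Ordner"},
}

def tooltip(button_id: str, locale: str, default_locale: str = "en") -> str:
    """Return the pop-up label for a button, falling back to the
    default locale when a translation is missing."""
    table = TOOLTIPS.get(locale, TOOLTIPS[default_locale])
    return table.get(button_id, TOOLTIPS[default_locale][button_id])

print(tooltip("new_folder", "de"))  # Neuer Ordner
```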

Folders were redesigned from tabs to a design similar to Microsoft's Windows Explorer. This eliminated the problem of having multiple rows of tabs (or a scrolling row of tabs). In addition, it accommodated a "trash bin" folder, a requirement that resulted from one of the focus groups.

Evaluation Methods

A formal usability test was run with two food retail store managers and two merchandise retail managers. These users matched the target user population. The development team reviewed changes made to the user interface. Every effort was made to design a product that met users' needs, was easy to use, and could be implemented given the development tools available.

Many customer and in-house demonstrations were given of the user interface mockup. Following are the key findings:

1. The function of the "Window" menu item had changed from displaying windows to viewing different screens. The items under this menu were changed again to include conventional Windows content such as "Tile Windows", "Cascade Windows", and a list of open windows.

2. "Metauser" seemed too lofty a term for the administrator and professional services users.

3. In usability testing, users wanted a quick method to run an analyst. Several users suggested dragging an analyst to a picture of a rocket to "launch" it.

4. Users found it awkward to go into the menu to define Segments and Measures.

5. Users wanted a quick way to know how many InfoFrames were finished and pending. During the usability test they continually looked into each of the folders to see if InfoFrames were completed/returned.

6. "Garbage" was an inappropriate term. European users did not typically use it and found it slightly offensive.

7. The icons for the toolbar buttons did not look professional or even of a consistent style.

8. There was no way to quickly distinguish between read and unread InfoFrames and Alert Messages.

9. Users wanted to know when the analyst would run next and did not care what type of schedule (Time or Event) triggered the analyst to run.

Figure 4: Iteration IV

Figure 5: Released Product

ITERATION IV AND THE RELEASED PRODUCT

In iteration IV (Figure 4) the addition of more toolbar buttons gave the user quick access to frequently used functions. By using different icons to represent different types of objects in the lists, one could more readily distinguish between read and unread InfoFrames and between InfoFrames and Alert Messages.

Implementation

The implementation of the user interface design required creating some dialogs from scratch and reusing some dialogs from the Visual Basic mockup. Storc Gold [6] was used to translate the Visual Basic code into resource files that could be used in Visual C++ (the development tool). This saved development time and ensured that the implemented user interface was more consistent with the mockup design.

Because of the continual crunch of the development cycles, the cognitive engineer helped out with the development process by editing the resource files in the Visual C++ resource editor. This freed up development resources and also ensured that the look and feel of the screens conformed to the user interface specification. Because the cognitive engineer was an integral part of the development team, last minute design changes were discussed with him and they worked together to resolve the issues.

CONCLUSION

Organizational Issues

Coordination and communication were critical for making a project with three geographically separated teams work well. To hold the team together on a common goal, many design documents were written and shared, a mockup of the user interface was continually updated and distributed, status meetings took place weekly via teleconference and periodic team meetings were held in one of the host cities (Atlanta, Murray Hill, San Diego). The development of an internal World Wide Web site for the MDT project allowed the design team to place all of the design documents, presentations, mockups and other information in a central place.

Process Issues

Although each of the methods used to evaluate the user interface had merit, different types of information were gleaned from each. Interviews, focus groups, and demonstrations were good for getting high-level design ideas and qualitative feedback on how much users liked the user interface and the concepts therein; however, these methods said little about how easy the user interface was to use. Only the usability test directly evaluated usability. The progressive usability test, in which the user interface was updated between participants to address the problems found by the previous participant, was very efficient at finding remedies for large usability problems. However, usability tests of greater rigor are planned for future releases of the product.

Design Issues

The final version of the user interface varies little from the final version of the mockup because:

1. The cognitive engineer wrote the product description and user interface specification and worked as a part of the development team. This even included editing resource files in the C++ resource editor so that "minor" cosmetic (and not-so-cosmetic) aspects of the designs would be implemented as specified. These aspects included keyboard accelerators, button sizes, icons, list lengths, drop-down list widths and lengths and placement of controls.

2. The development team was committed to realizing the design and took an active part in critiquing it and making it work given the constraints of the intended software platforms.

3. Early design of the mockup user interfaces caused a push to implement what was designed.

4. The mockup served as a communication tool to orient the development team, product management and marketing as to what it was that we were building.

5. The mockup was widely shown to potential customers under non-disclosure agreements and to NCR company associates long before any code was written.

Given more time and resources, we would have run more usability tests and used more graphic artist resources.

ACKNOWLEDGEMENTS

We thank everyone on the MDT development team (Kevin Copas, Scott Coulter, Yih-Shiuan Hu, Soheila Taheri, Alice Lei, Peter Selfridge, Glenn Wikle, Steve Churg, Mike Georgantos, Drew Lettington, Marshall Lindsay, Chrisjan Matser, Ken O'Flaherty, Al Meyer, Rich Schubert, Jeff Ludwig) and participants of the interviews, focus groups and usability tests.

REFERENCES

1. Data Warehousing: The Road to Knowledge Production, Fortune, November, 1996, S-1 to S-9.

2. Anand, T. Opportunity Explorer: Navigating Large Databases Using Knowledge Discovery Templates, In Journal of Intelligent Information Systems, 5(1): 23-35, 1995.

3. Henneman, R. & MacTavish, T. The NCR Human Interface Technology Center. This volume.

4. Nielsen, J. Thinking Aloud, in Usability Engineering (pp. 195-198), Cambridge, MA: AP Professional, 1993.

5. Knutson, J., Umberger, T., and Tinney, B. Report of the Informal Usability Test and Usability Walkthrough Analysis of the Retail Store Assistant Prototype, AT&T Global Information Solutions, 1994.

6. Storc Gold v2.0 User Manual, PractiSys: Agoura, CA, 1992.

