Abstract
As technology becomes more ubiquitous in our world, so too grows the desire for more natural, intuitive interactions between man and machine. Yet the intermediary between man and machine is often an artificial mapping from key interfaces to procedural commands, a legacy of earlier attempts at automation. These mappings, such as the buttons in an elevator, limit the potential value of a system, placing an arbitrary bound on how the user can communicate his or her goals.
When a user steps into an elevator, the goal is not, for the most part, to go to a new floor level. The goal of the user is, quite often, to go to a particular destination on that floor level, whether it is a colleague’s office, a meeting in the conference room, or the lab. Buttons for floor levels serve to provide a simplified mapping between the user’s goal and an elevator command. Dialog systems, however, provide the means to rise above such considerations.
In the context of a spoken dialog system for an elevator application, the goal is twofold: first, to eliminate artificial button or key interfaces in favor of a more intuitive form of communication, namely spoken natural language; and second, to develop a system that can intelligently handle user needs that go beyond perfunctory floor requests.
In the first case, button-to-floor mappings are admittedly a straightforward approach to a functional elevator; this system, however, provides the means to reach a floor through multiple forms of reference, whether the name of a colleague, a room, or an office number.
In the second case, button interfaces serve users who already know the mapping between their destination and a floor of the building; guests, by contrast, may need to consult a building directory (if one exists) to find that floor. An intelligent dialog-driven system, however, could capitalize on this knowledge to facilitate the flow of information among its users. The elevator could know, for example, that experiments take place in the Eye-tracking lab, and that the lab is located on the fourth floor, down the hall to the right. From this it might also infer that anyone wishing to participate in the latest experiment needs to go to the fourth floor.
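To make the idea concrete, the following is a minimal sketch, not the system described in this work, of how a spoken destination reference might be resolved to a floor using a small building knowledge base. All names, rooms, directions, and floor assignments here are hypothetical placeholders.

```python
# Hypothetical building knowledge base: destination reference -> floor info.
# Entries and floor numbers are illustrative only.
BUILDING_KNOWLEDGE = {
    "alice":             {"floor": 4, "directions": "down the hall to the right"},
    "eye-tracking lab":  {"floor": 4, "directions": "down the hall to the right"},
    "conference room":   {"floor": 2, "directions": "first door on the left"},
    # Inferred entry: the latest experiment takes place in the Eye-tracking lab.
    "latest experiment": {"floor": 4, "directions": "down the hall to the right"},
}

def resolve_destination(utterance: str):
    """Return (destination, info) for the first known destination mentioned."""
    text = utterance.lower()
    for destination, info in BUILDING_KNOWLEDGE.items():
        if destination in text:
            return destination, info
    return None, None

if __name__ == "__main__":
    dest, info = resolve_destination("I'd like to take part in the latest experiment")
    if info:
        print(f"Floor {info['floor']} for the {dest}, {info['directions']}.")
    else:
        print("Which floor would you like?")
```

In a full dialog system this lookup would of course sit behind speech recognition and language understanding components; the sketch only illustrates how one knowledge base can answer requests phrased as a person, a room, or an event rather than a floor number.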
I present a spoken dialog system that enables such interactions in an elevator. Designed to operate as an intelligent receptionist, it is able to communicate in three different languages, recognize familiar speakers, interact with users through spoken dialog, and, importantly, aid the user in arriving at his or her desired destination.