

Abstract User Interfaces - include FLUID in this chapter

Chapter Summary

Note the following:


URC Standards Out for Final Committee Draft Vote

The latest drafts of the ISO/IEC standard on a universal remote console have recently been distributed to the members of the pertaining subcommittee, ISO/IEC JTC1 SC35 (user interfaces), for final committee draft voting. If successful, this vote (closing in August 2006) will likely be the last vote at the subcommittee level. At that point, the draft standard will be passed on to JTC1 for final voting and publication as an ISO/IEC International Standard.

The pertaining schema files (for XML and RDF) have been posted on this site, as follows:

XML schema for UI Socket Description (http://myurc.org/ns/uisocketdesc) - there is also a more illustrative and table-like description of the schema (http://myurc.org/ns/uisocketdesc/uisocketdesc.html).

XML schema for Presentation Template (http://myurc.org/ns/pret) - there is also a more illustrative and table-like description of the schema (http://myurc.org/ns/pret/pret.html).

XML schema for Target Description (http://myurc.org/ns/targetdesc) - there is also a more illustrative and table-like description of the schema (http://myurc.org/ns/targetdesc/targetdesc.html).

RDF schema for Resource Description (http://myurc.org/ns/res) - note that this file won’t show in the browser since it has a MIME type of “application/rdf+xml,” so you need to save it and open it manually with a text editor or browser.

------------------------------------

Jason's paper...

Introduction

The URC represents a level of abstraction that demonstrates the state of the art of accessibility principles in 2004. A URC can be used with a range of devices, in a range of languages, and with a variety of accessibility features. It is, in fact, no more than a platform onto which intelligence is loaded in real time for the benefit of users confronted by other devices. The type or brand of device is not important provided the URC protocol is observed: each device can have skins and information specific to its needs while complying with the generic URC specifications for that type of device.

URC compliance is thus about metadata standards: describing devices, user needs, and commands in URC-specified ways creates a common language that a URC can use at any time, in any context, for any user.

Use case written by LN for INCITS V2:

An INCITS/V2 scenario:

Imagine Maria driving a wheelchair to a shopping centre.

She needs to get money before she can do her shopping, so first she needs to find an ATM she can use. Attached to her wheelchair is the device she uses for accessing such things as ATMs, the washing machine and so on.

Generally, her chair device, known as the universal remote console (URC), is not online in the sense of being connected to the Internet but it does make wireless connections to local devices. The URC is now within wireless reach of an ATM that might be suitable. Maria uses the discovery metadata from the ATM to determine if it will be suitable for her. This information is provided by the ATM, upon request from her URC, and can be of several forms, minimal or rich, depending upon what the ATM has to offer. In fact, the ATM has only basic information - what it can do in terms of interface features and its location and physical features. This is enough for Maria to determine that she can use her URC to interact with the ATM and get money for her groceries. Maria's URC uses a secure connection to 'talk' to the ATM, telling it that she needs extra time for entering her PIN and similar details, and asking for the cash she wants.

As Maria has difficulty remembering how much money she should take out on any particular occasion, she relies on the special 'skin' that the ATM offers to users of smart URCs. This 'skin' helps Maria focus on ATM functionality that helps her choose how much money to take out etc. (She has seen her friend Rose using the same ATM with a different skin that helps her by using her voice input technology.)

Maria has to do some grocery shopping, but first she wants to treat herself to a new microwave oven. She enters the 'white goods' shop and is able to see how well the various makes of microwave ovens respond to her URC. Some immediately provide her with custom skins that show their features and how to use them; others have only generic skins, so she gets basic information that is not set out in any specially friendly way; and some make no response to the URC at all. Maria chooses the one she likes best in terms of how it communicates with her via the URC and how it will look in her kitchen.

Paying for the new oven is so easy, she wonders how she did it in the past - today she just locates the shop's credit payment system, types in the amount, and presses the 'pay' button - done! It's so good to not have to struggle with writing her signature. And it is very satisfying to give the delivery instructions to the shop as part of the payment process, instead of having to write out those boring details. Pleased with her purchase, Maria goes off to the supermarket.

Inserted here is a paper published by NIST... see http://ois.nist.gov/nistpubs/technipubs/recent/search.cfm?dbibid=16375

Wireless communication technologies make it feasible to control devices and services from virtually any mobile or stationary device. A Universal Remote Console (URC) is a combination of hardware and software that allows a user to control and view displays of any (compatible) electronic and information technology device or service (which we call a “target”) in a way that is accessible and convenient to the user.   We expect users to have a variety of controller technologies, such as phones, Personal Digital Assistants (PDAs), and computers.   Manufacturers will need to define abstracted user interfaces for their products so that product functionality can be instantiated and presented in different ways and modalities. There is, however, no standard available today that supports this in an interoperable way.   Such a standard will also facilitate usability, natural language agents, internationalization, and accessibility.

A URC is, typically, a device that a user carries, such as a PDA, a high-end cell phone, a specialized wristwatch, a Braille-based note-taker, an Alternative and Augmentative Communication (AAC) console, or other computer-based assistive technology. Users interact with URCs in a wide variety of ways, including touching touch sensitive screens, pressing single large buttons, using breath controlled switches, using speech-based technology, and the usual means provided by visual displays, keyboards, and mice. URCs show users options, target states, help, and other information using a variety of techniques including visual displays; tactile, including Braille, displays; and generated speech. Disabled people are the most obvious beneficiaries of this technology; but people, in general, will want a more convenient way to control things in their environment using any of a variety of interface modalities, thereby increasing the potential audience for a standardized URC.

Possible targets in the home environment include: TVs, Video Cassette Recorders (VCRs), stereos, thermostats, microwave ovens, lights, and home security systems; and in the public and work environments include: information kiosks, Automated Teller Machines (ATMs), electronic directories, elevators, and copy machines; as well as Web services such as online travel agencies, or world time services. Figure 1 shows the standing user using a voice-controlled URC and the seated user employing a touch-controlled URC [6].

A Need for a URC

A stable URC standard would allow a target manufacturer to author a single user interface (UI) that would be compatible with all existing and forthcoming URC platforms. Similarly, a URC provider would need to develop only one product that would interact with all existing and forthcoming targets that implement the URC standard. Users would then be free to choose any URC that fits their preferences, abilities, and use-contexts to control any URC-compliant targets in their environment.

The INCITS V2 Activity

Giving users the ability to control the intelligent part of consumer electronics, environmental systems, and devices that provide various public services can be challenging. This is especially true for products and services that are controlled remotely. It is expensive and confusing to have unique user interfaces for each different product or service. This is especially true for users with disabilities. Because disabled people comprise a relatively small but diversified market, few products or services are tailored to their needs or are specifically adapted to the requirements of their assistive technologies.

This problem is being addressed by INCITS/V2 [2]. INCITS/V2 is a technical committee working on standards for the InterNational Committee for Information Technology Standards (INCITS) in the area of Information Technology Access Interfaces. V2 is developing standards for an Alternative Interface Access Protocol (AIAP). The AIAP is a specification for providing a target's abstract user interface description. It specifies a flexible way of automatically generating a target's user interface on a specific personal device. Use of the AIAP allows personal devices such as hand-held devices, PDAs, small laptop computers, and cell phones to be used as a URC to control a variety of AIAP-compliant target devices. The AIAP-URC specifications are intended to include specialized assistive technology devices used by people with disabilities so that the awkward use of an ATM shown in Figure 2 can be avoided.


Overview of URC Architecture

The device or service to be accessed is referred to as a target. A user interface for a target is described by a user interface socket, a User Interface Implementation Description (UIID), and a set of supplemental resources. The user interface socket is a low-level description of a target, specifying the user input the target accepts or expects for control, and the output from the target that reports status or other information to the user or requests control or other input from the user. It describes the functionality and state of the target as a set of typed data points and commands. The data points must include all of the data manipulated by or presented to a user. The commands must include all the target functions that users can activate.
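To make the socket idea concrete, the following is a minimal sketch, in Python rather than the standard's XML schema, of how a target might be modelled as typed data points and commands. All names here (UISocket, currentFloor, callElevator, and so on) are invented for illustration and are not drawn from the AIAP specification.

```python
from dataclasses import dataclass, field
from typing import Any, Callable, Dict

@dataclass
class DataPoint:
    """A typed value the target exposes to the user (status or input)."""
    name: str
    datatype: type
    value: Any
    writable: bool  # True if the user may set it, False if status only

@dataclass
class Command:
    """A target function the user can activate."""
    name: str
    action: Callable[[], None]

@dataclass
class UISocket:
    """Low-level, modality-independent description of a target."""
    target_id: str
    data_points: Dict[str, DataPoint] = field(default_factory=dict)
    commands: Dict[str, Command] = field(default_factory=dict)

# Hypothetical socket for an elevator controller (names are illustrative only).
elevator_socket = UISocket(
    target_id="urn:example:elevator-3",
    data_points={
        "currentFloor": DataPoint("currentFloor", int, 1, writable=False),
        "destinationFloor": DataPoint("destinationFloor", int, 1, writable=True),
        "doorOpen": DataPoint("doorOpen", bool, False, writable=False),
    },
    commands={
        "callElevator": Command("callElevator", lambda: print("elevator called")),
        "closeDoor": Command("closeDoor", lambda: print("door closing")),
    },
)
```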

A UIID is a user-oriented representation of a target that maps some or all of the user interface socket elements to interaction mechanisms. It provides a structure into which the elements of the presentation are embedded. The presentation mechanisms may be either modality-independent or specifically designed for a given class of user devices. Any number of UIIDs could be defined for a single target.

Interface text and other interface resources can be stored independently of the UIIDs as supplemental resources, referenced by UIIDs. These resources may include labels, help text, graphics, or other multimedia elements. They may also include translations into different languages.
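As a rough illustration of how a UIID and its supplemental resources relate, the sketch below (again in Python rather than the standard's formats, with invented names) binds socket elements to presentation widgets and draws labels from a language-keyed resource store.

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class Resource:
    """A supplemental resource (label, help text, etc.), keyed by language."""
    resource_id: str
    texts: Dict[str, str]  # language code -> text

    def text(self, lang: str) -> str:
        return self.texts.get(lang, self.texts.get("en", self.resource_id))

@dataclass
class InteractionElement:
    """One UIID entry: a socket element bound to a widget and a label resource."""
    socket_element: str   # name of a data point or command in the UI socket
    widget: str           # e.g. "button", "spinner" (modality hint, illustrative)
    label_resource: str   # id of the Resource holding the label

# Hypothetical supplemental resources for an elevator target.
resources = {
    "lbl.callElevator": Resource("lbl.callElevator",
                                 {"fr": "Appeler l'ascenseur", "en": "Call elevator"}),
    "lbl.destinationFloor": Resource("lbl.destinationFloor",
                                     {"fr": "Étage de destination", "en": "Destination floor"}),
}

# A (tiny) UIID: the structure into which presentation elements are embedded.
uiid: List[InteractionElement] = [
    InteractionElement("callElevator", "button", "lbl.callElevator"),
    InteractionElement("destinationFloor", "spinner", "lbl.destinationFloor"),
]

for element in uiid:
    print(element.widget, "->", resources[element.label_resource].text("en"))
```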

Figure 3 shows how the three classes of components of the URC were conceptualized in the early phases of the project [4]. The development of prototypes and further work has already suggested a reorganization of some aspects of this diagram, along with some simplifications that are expected to improve the effectiveness and ease of use of the URC.

 

Figure 3: URC Architecture Overview. (Note: this image was taken from http://www.incits.org/tc_home/v2htm/docs/V2/02-0086/v2020086.htm on 21/12/2004 to replace the printed version from the paper.)

4. Where Metadata Fits

Metadata is "data about data" and, more specifically, “structured data about data”. Metadata is often divided into three conceptual types, although there is some overlap among them. They are as follows:

- descriptive metadata, used to find and identify a resource;
- structural metadata, describing how the parts of a resource relate to one another; and
- administrative metadata, supporting the management, storage and migration of a resource.

Dublin Core views the above conceptual categories as four classes of resource information. The descriptive metadata category is separated into two resource classes: identity and instantiation.

The identity resource class is typically the metadata that Dublin Core uses to match a resource to its resource description. Once identified, the resource is fetched and displayed. The action of fetching and displaying is the instantiation class of Dublin Core's metadata.

Dublin Core's third class of resource information, called intellectual property, is similar to the conceptual metadata type called “structural metadata.” Dublin Core uses this class as a means of relating resources that are of different types. In the case of INCITS/V2, for example, a device may be matched to a digital document-like object and a service. What passes from one to the other might well be thought of as metadata, and from this metadata are generated the instantiations that are thought of as data.

Dublin Core's fourth class of resource information, called core metadata for administration, is similar to the conceptual metadata type called “administrative metadata.” Thus, there is a need for AIAP metadata that can be used to support the discovery, use, storage, and migration of resources used in the creation of user interfaces on URCs. The definition of metadata for these digital operations is an important part of the creation of specifications such as the AIAP.

We are using the Dublin Core Metadata Element Set (DCMES) to describe and find the additional resources that may be needed by a URC using the AIAP.   The metadata for the AIAP defines a set of attributes for specifying resources. Text labels, translation services, and help items are examples of such resources. The metadata also defines the content model needed to interface with suppliers of such resource services.

In phase one of the development of the AIAP, a metadata profile has been developed that defines what a resource can be “used for” and “where it is” (function and location). The developers of the AIAP-URC specification have identified several classes (i.e., category, subcategory, isAbbreviation, reference, content, conformsTo) that are outside the domain that is typically served by DCMES. It is not unusual, however, to want to describe functions, and some metadata developers choose to use elements such as dc:subject to do so, while others decide not to use the DCMES at all for that purpose.

In phase two, we plan to use metadata to define classes of services that may be needed by an end-user; to modify these classes to define user interface components; and to specify end-user needs, preferences and roles. We would like to use the DCMES for these purposes, or at least, DCMES-type metadata. Again, we question how well the DCMES, designed to describe intellectual content type resources, describes our objects. People and services, we observe, have somewhat different characteristics from intellectual content resources, including needs in terms of display and presentation.

5. Metadata Defined in Multiple Phases

The AIAP metadata is being defined in multiple phases, two of which have been identified. The first phase deals with the identification of resources so that they can be found and used. Phase 2 involves establishing metadata for identifying targets (devices or services), classes of interfaces and user preferences. Taxonomies will be identified or developed for classifying values for each of these major areas.

Phase 1

Metadata has been defined that consists of a minimum set of attributes for specifying resources. The Metadata Application Profile for the AIAP so far includes the following: Identifier, type, subject, relation (conformsTo, isReplacedBy, hasVersions, isVersionOf, and reference), language, creator, publisher, contributor, date, contextOfUse, and audience. These are the types of metadata elements needed to identify core and external resources used within the AIAP framework [3].
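Purely by way of illustration, a record for an external resource described with this profile might look like the sketch below: a Python dictionary stands in for the actual encoding, and the values (and the "translation" subject) are invented for the example rather than taken from the V2 documents.

```python
# Hypothetical metadata record for an external resource (a translation service),
# using the element names of the phase-1 AIAP application profile.
translation_service_record = {
    "identifier": "urn:example:translation-service-001",  # coding scheme still to be specified
    "type": "service",
    "subject": "translation",          # value would come from the planned taxonomy
    "relation": {
        "conformsTo": "AIAP resource description, phase 1",
    },
    "language": ["fr", "en"],          # languages the service can translate between
    "creator": "Example Translation Co.",
    "publisher": "Example Translation Co.",
    "contributor": [],
    "date": "2003-04-04",
    "contextOfUse": "on-demand translation of user interface labels",
    "audience": "URC user agents",
}
```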

There are certain terms that need additional work in order to be fully specified. For example, the term ‘identifier’ needs a coding scheme that uniquely identifies each external resource. This might be accomplished through an international registry for resource identification.

Another term needing additional work is subject. Our intended application of this term is to define general classes of resources needed to support a particular user. Subject will be defined by a taxonomy or ontology, a “controlled vocabulary”, which will allow us to establish a hierarchical relation among classes of services that may be provided by resources or by targets. It is important to establish such a hierarchy so that users can request services generically. Without such a hierarchical definition, either the URC or the user has to know ahead of time the designation or location of every potential target or resource providing a service.
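The kind of hierarchy intended here can be sketched as follows; the class names are purely illustrative, since the actual controlled vocabulary has yet to be chosen or developed. The point is that a URC asking generically for, say, "banking" should match any target whose subject falls under that node.

```python
# A toy service taxonomy: child class -> parent class. Purely illustrative.
SERVICE_TAXONOMY = {
    "ATM": "banking",
    "banking": "financial services",
    "elevator": "building transport",
    "building transport": "building services",
}

def matches(target_subject: str, requested_class: str) -> bool:
    """True if the target's subject falls under the requested service class."""
    node = target_subject
    while node is not None:
        if node == requested_class:
            return True
        node = SERVICE_TAXONOMY.get(node)
    return False

# A generic request for "banking" finds an ATM without knowing its designation.
print(matches("ATM", "banking"))        # True
print(matches("elevator", "banking"))   # False
```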

One or more hierarchical taxonomies or ontologies may already exist defining such resources. If we do not find an appropriate encoding, we will need to develop one or find an appropriate organization to do the work.

Phase 2

INCITS/V2 will reuse the structure developed in phase 1 for defining services provided by resources to also define, in phase 2, the services provided by targets. The differences will be in the controlled vocabularies developed to define the unique designation and the kind of services (subject) provided by the target. The user metadata is of three types: preferences, needs, and roles. It will probably reside initially on the URC, which might possibly be a smart card. The smart card might also be an individual’s identification device. It is of paramount importance that metadata be used to assist in the organization and management of user interface components, including the provision of a mechanism for users to specify their needs, preferences, and roles. This type of work is already being done by others (e.g., the Smart Card Alliance, the Smart Card Group, and INCITS/B10) and may be suitable for adoption.
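Purely as an illustration of the shape such user metadata might take (the element names and values below are invented, not drawn from phase 2 work):

```python
# Hypothetical phase-2 user metadata, carried on the URC or on a smart card.
user_profile = {
    "preferences": {"language": "en", "inputModality": "touch", "extraInputTime": "60s"},
    "needs": {"largeTargets": True, "noHandwriting": True},
    "roles": ["customer"],   # roles might unlock different views of a target
}
```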

6. How the AIAP Uses Metadata

The metadata to be used is determined by the requirements of the end-user of the URC to accomplish some task. In this section, we will look at an example of how the user's needs, skills, and preferences, together with the nature of the task, the URC and the target, determine the metadata used.

For the purposes of this example, we are assuming there is already a substantial implementation of the standard. By that, we mean that:

Example: Steve uses an Elevator in France

Steve is in the lobby of the hotel on the first floor and wants to go to the fifth floor. Steve uses a powered wheelchair. Steve has a URC that uses a small touch screen and an RF wireless network. He uses his URC to find the location of the nearest elevators by browsing the hierarchy of services. What is happening here is that a request for a class of targets (“elevators”) is sent from the URC to the wireless network that his URC is connected to. This is part of what we call “discovery”.
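What "discovery" amounts to here can be sketched roughly as below; the message shapes are hypothetical stand-ins, not the actual AIAP discovery protocol. The URC asks for a class of targets and each reachable target answers with its discovery metadata.

```python
# Hypothetical discovery metadata advertised by reachable targets.
REACHABLE_TARGETS = [
    {"id": "urn:example:elevator-bank-A", "serviceClass": "elevator",
     "location": "main lobby, east corridor"},
    {"id": "urn:example:atm-07", "serviceClass": "ATM",
     "location": "main lobby, near entrance"},
]

def discover(service_class: str):
    """Return discovery metadata for every reachable target of the given class."""
    return [t for t in REACHABLE_TARGETS if t["serviceClass"] == service_class]

for target in discover("elevator"):
    print(target["id"], "-", target["location"])
```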

Steve is in France, but he speaks only English, and the URC knows this. The elevator controller sends its abstract user interface description and the directions for how to reach the elevator lobby (both tagged as being in French) to the URC. The URC issues a query, asking if the elevator controller can provide the directions and user interface labels in English.

The elevator controller responds that it cannot provide the directions or labels in English. Using the Internet, the URC issues a request seeking a translation service, using the DCMES. In this case, the URC will interact with an available search engine to seek a service that can do the translation. A translation service is located. The URC contacts that service and requests translations of both the directions and the user interface labels into English. The translation service transmits the directions and user interface labels in English back to the URC. Following the directions provided, Steve approaches the elevator lobby. Although simply stated, the complexities of providing directions within a building can be great; the benefits of implementing the AIAP standard are greater still.

The elevator controller recognizes that Steve has arrived in the elevator lobby (the elevator controller uses the same mechanism that was used in the beginning of this exercise to identify where Steve is) and asks Steve if he wants to call an elevator. Note that we are assuming that Steve’s URC has the capability of caching the translated user interface labels. Thus, further translation requests for elevator user interface labels will be unnecessary.
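The label handling described in the last two paragraphs (ask the target first, fall back to an external translation service, and cache the results for reuse) might look roughly like the sketch below; translate_via_service is a stand-in for whatever translation service the URC locates, not a real API.

```python
translation_cache = {}  # (label_id, language) -> translated text

def translate_via_service(text: str, target_lang: str) -> str:
    """Stand-in for a remote translation service located via discovery/DCMES."""
    canned = {"Appeler l'ascenseur": "Call elevator", "Alarme": "Alarm"}
    return canned.get(text, text)

def label_for_user(label_id: str, source_text: str, source_lang: str,
                   user_lang: str, target_supplies_user_lang: bool) -> str:
    """Return a label in the user's language, caching translations for reuse."""
    if source_lang == user_lang or target_supplies_user_lang:
        return source_text                       # target already provides it
    key = (label_id, user_lang)
    if key not in translation_cache:             # translate once, then reuse
        translation_cache[key] = translate_via_service(source_text, user_lang)
    return translation_cache[key]

print(label_for_user("lbl.call", "Appeler l'ascenseur", "fr", "en", False))
print(label_for_user("lbl.call", "Appeler l'ascenseur", "fr", "en", False))  # served from cache
```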

Steve selects “yes” on his URC.

The user interface is displayed on the URC, based on the UI socket, the resources received from the elevator and the translation service, and the Presentation-Independent Template (PIT), which is a special form of UIID that includes all of the elements of the user interface socket. This UIID is self-contained and does not refer to supplemental resources. Simply stated, in our example there is no platform-specific UIID available for the URC Steve is using.
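The choice the URC makes here, preferring a platform-specific UIID and otherwise falling back to the PIT, can be sketched as a simple selection rule; the field names below are invented for the example.

```python
# Hypothetical UIIDs on offer from a target, each tagged with the platform it serves.
available_uiids = [
    {"name": "elevator-voice-ui", "platform": "voice", "self_contained": False},
    {"name": "elevator-PIT", "platform": "any", "self_contained": True},
]

def select_uiid(urc_platform: str) -> dict:
    """Prefer a UIID designed for this URC's platform; otherwise use the PIT."""
    for uiid in available_uiids:
        if uiid["platform"] == urc_platform:
            return uiid
    return next(u for u in available_uiids if u["platform"] == "any")

print(select_uiid("touchscreen")["name"])   # no touchscreen UIID -> falls back to the PIT
```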

The URC asks Steve to select the floor he is on and the floor he is going to. Steve can also select a longer waiting time on his URC, so he selects one minute. The elevator decides, based on the signal strength received from the URC, that Steve is probably on the first floor; if he were not, he could easily tap a different floor number. Steve selects floor 5 as his destination.

The URC transmits a command to the elevator to request that an elevator come to the first floor. The elevator tells the URC which elevator is arriving for Steve, and the URC tells Steve which elevator to go to. The elevator arrives and opens.

Steve drives his wheelchair into the elevator, which is staying open for a minute to allow him extra time to enter. Once inside the elevator, Steve selects ‘door close’ on the URC. Since this URC is the one that requested the extra time, the elevator closes the door.

The elevator controller sends a message to the URC that there is a new UI Socket and PIT available, and transmits them to the URC (once inside the metal elevator, the only elevator the URC can communicate with is the one Steve is in).

The URC displays the new UI, in this case containing floor numbers, elevator location, open door, close door, and alarm. The labels were properly tagged with metadata that allows the URC to reuse the previously requested translations; only new labels, such as alarm, have to be sent to the translation service to be translated into English.

If another user selects a floor, a floor update command is sent from the elevator to the URC. As the elevator travels, floor status is transmitted to the URC. As the elevator approaches the 5th floor, the elevator transmits that information to the URC, which alerts Steve that his floor is next.

The elevator doors open, and Steve exits the elevator. The elevator controller knows this is the floor Steve is exiting on, so it leaves the doors open for one minute to allow him plenty of time to exit. As a courtesy to the other riders, once he has cleared the door, he selects close door on his URC.

The URC sends the close door command to elevator #3, since it knows that was the elevator Steve was on. The elevator control session ends, now that Steve has reached his destination.

7. Partnership with the Dublin Core Metadata Initiative

The Dublin Core Metadata Initiative is an open forum engaged in the development of interoperable online metadata standards that support a broad range of purposes and business models. DCMI's activities include consensus-driven working groups, global workshops, conferences, standards liaison, and educational efforts to promote widespread acceptance of metadata standards and practices.

INCITS/V2 is not a metadata standards body. It is a body concerned with the development of a comprehensive standard that uses metadata. INCITS/V2 prefers to work with communities that are already experienced in the development of relevant standards, including those for metadata. Rather than develop a completed profile for presentation to the Dublin Core community, INCITS/V2 prefers to present the problems encountered and the architecture developed so far, in order to work collaboratively with the Dublin Core community to complete the work.

In some cases what V2 needs is standard DC-type metadata; in other cases what it needs has already been developed and used in other communities. V2 offers the DC community the challenge of looking more broadly at the whole metadata arena and supporting its already global standard for intellectual resources with metadata profiles more appropriately defined for people and their needs, and for devices and services. Hopefully this does not mean rebuilding DC so much as growing the range of utility of DC from the initial resource area into these others, building on the work already done elsewhere to develop full-fledged profiles for those other areas. DC versions might, as has been the case for resources, be a lightweight, interoperable set of profiles, tuned to the context or object being discovered, used, or altered.

INCITS/V2 has found that many of the technologies needed to complete its task have been developed by W3C and as such can be considered standards that have undergone rigorous examination for interoperability, internationalization and accessibility. As all three of these qualities are essential to the V2 work, collaboration with W3C has been most successful. Our collaboration is accomplished through joint memberships. This has enabled V2 and various W3C interest areas to ensure that our mutual concerns are addressed. We anticipate that the same will be true for work undertaken in collaboration with DCMI.

8. Conclusion

This paper is a call for active collaboration between DCMI and V2 in extending the areas of application of the DCMES. For INCITS/V2, the process of working toward a standard has been a voyage of innovation and collaboration. Many organizations have been involved in the conceptual development of the AIAP over a six-year period. The technology that will use the AIAP standard has been prototyped by several organizations. Notably, the TRACE R&D Center, under the leadership of Gottfried Zimmermann, has taken the lead in developing implementations of the core software. The TRACE prototype has demonstrated controlling devices such as a TV, a table lamp, and a fan using several different URCs, including a laptop computer using voice recognition, a Compaq iPAQ using a palm-size touch screen, and a BrailleNote using a Braille display and keypad.

For the Dublin Core community, the AIAP represents yet another context in which the Dublin Core model clearly shows potential utility and economy. It is open now to the Dublin Core community to extend its reach to encompass more than intellectual property resources if it is to support leverage of the technologies already in place and increase opportunities for interoperability. Such an approach would also admirably support the work of INCITS/V2.

We are interested in how broadly the concept of intellectual property resources applies. Do the concepts defined under “subject” include services provided by Internet resources? What is the difference between a Web-based resource that provides a service and any other intellectual property resource, and does that difference demand an additional element in the DCMES? The core question is: what is different about a service compared with any other content available from a resource? We feel collaboration with DCMI will help answer these questions.

References

[1] Dublin Core Metadata Initiative. Dublin Core Metadata Element Set (http://www.dublincore.org/documents/dces/).

[2] INCITS/V2. Technical Committee on Information Technology Access for the InterNational Committee for Information Technology Standards (http://www.incits.org/tc_home/v2.htm).

[3] Roucoux, Perrine. Metadata Terms used to Describe V2 External Resources. [Working Paper submitted to INCITS/V2], April 4, 2003.

[4] Trewin, Shari [editor]. Architecture of the Universal Remote Console Specification (AIAP-URC) of the Alternate Interface Access Protocol (AIAP) [Draft V2 Working Document]; Version 1.4, 11/21/2002 (http://www.incits.org/tc_home/v2htm/docs/V2/02-0086/v2020086.htm).

[5] Zimmermann, Gottfried; Vanderheiden, Gregg; Gilman, Al. “Universal Remote Console - Prototyping for the Alternate Interface Access Standard.” [To be published in Carbonell, N. and Stephanidis, C. (eds.); Universal Access: Theoretical Perspectives, Practice and Experience - 7th ERCIM UI4ALL Workshop, Oct. 2002, Paris, France - Selected Papers. Lecture Notes in Computer Science, Springer-Verlag.] (http://trace.wisc.edu/docs/ui4all2002/index.htm).

[6] TRACE. Images generously contributed by HREOC (http://www.hreoc.gov.au/disability_rights/inquiries/ecom/atmpic2.jpg), TRACE (www.trace.wisc.edu), and INCITS/V2 (www.incits.org/tc_home/v2.htm).


See also http://www.myurc.com/ and http://www.incits.org/tc_home/v2.htm

 
