XR Accessibility User Requirements
W3C First Public Working Draft
- This version:
- https://www.w3.org/TR/2020/WD-xaur-20200213/
- Latest published version:
- https://www.w3.org/TR/xaur/
- Latest editor's draft:
- https://w3c.github.io/apa/xaur/
- Editors:
- Joshue O Connor (W3C)
- Janina Sajka
- Jason White
- Michael Cooper (W3C)
Copyright © 2020 W3C® (MIT, ERCIM, Keio, Beihang). W3C liability, trademark and permissive document license rules apply.
Abstract
This document lists user needs and requirements for people with disabilities when using virtual reality or immersive environments, augmented or mixed reality and other related technologies (XR). It first introduces a definition of XR as used throughout the document, then briefly outlines some uses of XR. It outlines the complexity of understanding XR, introduces some accessibility challenges, and discusses multimodal support for a range of input and output devices and the importance of customization. Based on this information, it outlines accessibility user needs for XR and their related requirements, followed by information about related work that may be helpful for understanding the complex technical architecture and processes behind how XR environments are built and what may form the basis of a robust accessibility architecture for XR.
This document is most explicitly not a collection of baseline requirements. It is also important to note that some of the requirements may be implemented at a system or platform level, and some may be authoring requirements.
Status of This Document
This section describes the status of this document at the time of its publication. Other documents may supersede this document. A list of current W3C publications and the latest revision of this technical report can be found in the W3C technical reports index at https://www.w3.org/TR/.
This is a First Public Working Draft of XR Accessibility User Requirements by the Accessible Platform Architectures Working Group. It is developed by the Research Questions Task Force (RQTF) who work to identify accessibility knowledge gaps and barriers in emerging and future web technologies. The requirements outlined here come from research into user needs that then provide the basis for any technical requirements. This version is published to collect public feedback on the requirements prior to finalization as a Working Group Note.
To comment, file an issue in the W3C APA GitHub repository. If this is not feasible, send email to public-apa@w3.org (archives). Comments are requested by 7 April 2020. In-progress updates to the document may be viewed in the publicly visible editors' draft.
This document was published by the Accessible Platform Architectures Working Group as a First Public Working Draft.
Publication as a First Public Working Draft does not imply endorsement by the W3C Membership. This is a draft document and may be updated, replaced or obsoleted by other documents at any time. It is inappropriate to cite this document as other than work in progress.
This document was produced by a group operating under the W3C Patent Policy. The group does not expect this document to become a W3C Recommendation. W3C maintains a public list of any patent disclosures made in connection with the deliverables of the group; that page also includes instructions for disclosing a patent. An individual who has actual knowledge of a patent which the individual believes contains Essential Claim(s) must disclose the information in accordance with section 6 of the W3C Patent Policy.
This document is governed by the 1 March 2019 W3C Process Document.
1. Introduction
XR is an acronym used to refer to the spectrum of hardware, applications, and techniques used for virtual reality or immersive environments, augmented or mixed reality and other related technologies. This document is developed as part of a discovery into accessibility related user needs and requirements for XR. This document does not represent a formal working group position, nor does it currently represent a set of technical requirements that a developer or designer must strictly follow. It aims to outline for the reader some of the diversity of current accessibility related user needs in XR and what potential requirements to meet those needs may be.
1.1 What does the term 'XR' mean?
As with the WebXR API spec and as indicated in the related WebXR explainer, this document uses the acronym XR to refer to the spectrum of hardware, applications, and techniques used for virtual reality or immersive environments, augmented or mixed reality and other related technologies. Examples include, but are not limited to:
- Immersive or augmented environments used for education, gaming, multimedia, 360° content and other applications.
- Head mounted displays, whether they are opaque, transparent, or utilise video passthrough.
- Mobile devices with positional tracking.
- Fixed displays with head tracking capabilities.
The important commonality between them is that they all offer some degree of spatial tracking with which to simulate a view of virtual content, as well as navigation and interaction with the objects within these environments.
Terms like "XR Device", "XR Application", etc. are generally understood to apply to any of the above. Portions of this document that only apply to a subset of these devices will be indicated as appropriate.
1.2 Definitions of virtual reality and immersive environments
Virtual reality and immersive environment definitions vary but converge on the notion of immersive computer-mediated experiences. They involve interaction with objects, people and environments using a range of controls. These experiences are often multi-sensory and may be used for educational, therapeutic or entertainment purposes.
1.3 Definitions of augmented and mixed reality
Augmented and mixed reality definitions vary but converge on the notion of computer-mediated interactions involving overlays on the real world. These may be informational, or interactive depending on the application.
2. What is XR used for?
XR has an extensive range of purposes, from education and gaming to multimedia and immersive communication, among many others. It is currently evolving at a very fast rate and is not yet mainstream. This will change: as computing power increases and as hardware and the quality of user experience improve, XR will more commonly be used for the performance of everyday practical tasks, for therapeutic uses, for education and for entertainment.
3. Understanding XR and Accessibility Challenges
Understanding XR itself presents various challenges that are very technical. They include technical issues with a range of hardware, software and authoring tools, as well as a need to understand interaction design principles, accessibility semantics and more. These all represent 'basic' technical complexities that are in themselves substantial. To add to this, many designers and authors may neither know nor have access to people with disabilities for user testing. Nor may they have a practical way of understanding user needs from which they can build a solid set of requirements. In short, they may simply not understand what user needs they are trying to meet when making XR accessible.
Some of the issues in XR, for example in gaming, for people with disabilities include:
- Over-emphasis on motion controls. There are many motion controllers that emphasise using your body to control the experience. Some games with XR components may lock out traditional control methods when a VR headset is being used; the user should always be able to use a range of input mechanisms.
- VR headsets need the user to be in a particular physical position to play. The user should not have to be in a particular physical position to play a game or perform some action, or there should be the ability to remap these 'physical positions' to other controls (such as by using WalkinVRDriver).
- Games and hardware being locked to certain manufacturers. Consoles should allow full button remapping on standard game controllers, including to different types of assistive technologies such as switches. These remapping preferences should be mobile and transportable across a range of hardware devices and software.
- Gamification of VR forces game dynamics on the user. Some users may wish to just explore an immersive environment without the 'game', or any particular challenge.
- Audio design lacks spatial accuracy. Sound design needs particular attention and can be critical for a good user experience for people with disabilities. Indeed, the auditory experience of a game or other immersive environment may 'be' the experience [able-gamers].
There are also a range of other disabilities that will need to be considered in making XR accessible. It is beyond the scope of this document to describe them all in detail. General categories or types of disabilities are:
- Auditory disabilities
- Cognitive disabilities
- Neurological disabilities
- Physical disabilities
- Speech disabilities
- Visual disabilities
A person may have one of these disabilities or a combination of several. Each of these 'types' will be presented as a user need that should be met, and understanding these needs is crucial in rising to the range of interesting challenges XR designers and authors will have when supporting accessibility and multimodality in XR environments.
These challenges may include:
- Understanding specific diverse user needs and how they relate to XR.
- Successfully identifying modality needs that are not obvious - but still need to be supported.
- Having suitable authoring tools for designers that support accessibility requirements in XR.
- Using languages, platforms and engines that support accessibility semantics.
- Successfully abstracting XR applications by providing accessible alternatives for content and interaction.
- The provision of specific commands within the VR environment (e.g., to go directly to a specified location or to follow another user) which assist with non-visual navigation.
- The use of virtual assistive technologies (e.g., white cane via a haptic device) to provide non-visual feedback. The research identified that if the same audio cues associated with a real-world infrared white cane were used in an immersive environment, users were able to effectively centre themselves in the middle of pathways and walk successfully through virtual doorways based on the same audio feedback as used in the equivalent real-world device [maidenbaum-amendi].
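As a rough illustration of the audio feedback described in the last point, the following sketch uses the Web Audio API to play a short spatialised 'ping' at a point in the scene. The coordinates and the doorway example are illustrative only; the cited study used its own purpose-built system.

    const audioCtx = new AudioContext();
    // Play a brief tone spatialised at scene coordinates (x, y, z).
    function pingAt(x: number, y: number, z: number): void {
      const osc = audioCtx.createOscillator();
      const panner = audioCtx.createPanner();
      panner.panningModel = 'HRTF'; // binaural spatialisation over headphones
      panner.positionX.value = x;
      panner.positionY.value = y;
      panner.positionZ.value = z;
      osc.connect(panner).connect(audioCtx.destination);
      osc.start();
      osc.stop(audioCtx.currentTime + 0.1); // 100 ms ping
    }
    pingAt(0, 1.5, -2); // e.g. the centre of a doorway two metres ahead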
3.1 Immersive Environment challenges
Some of the many challenges with immersive environment accessibility (and also gaming) include: the use of extremely complex input devices; control schemes that require a high degree of precision, timing and simultaneous action; the ability to distinguish subtle differences in busy visual and audio information; and having to juggle multiple complex goals and objectives [web-adapt].
There are also currently very useful accessibility guidelines available that are specific to gaming [game-a11y].
3.2 XR and supporting multimodality
Modality relates to modes of sense perception such as sight, hearing, touch and so on. Accessibility can be thought of as supporting multi-modal requirements and the transformation of content or aspects of a user interface from one mode to another that will support various user needs.
Considering various modality requirements in the foundation of XR means these platforms will be better able to support accessibility related user needs. There will be many modality aspects for the developer and/or content author to consider. XR authors and content designers will also need access to tools that support the multimodal requirements listed below.
The following Inputs and Outputs can be considered modalities that should be supported in XR environments.
3.3 Various input modalities
The following are examples of some of the diverse input methods used by people with disabilities. In many real-world applications these input methods may be combined.
- Speech - this is where a user's voice is the main input. Using a range of voice commands, a user should be able to navigate in an XR environment and interact with the objects in that environment using their voice alone (a rough sketch follows this list).
- Keyboard - this is where the keyboard alone is the user's main input. A user should be able to navigate in an XR environment and interact with the objects in that environment using the keyboard alone.
- Switch - this is where a single-button switch alone is the user's main input. A user should be able to navigate in an XR environment and interact with the objects in that environment using a switch alone. This switch may be used in conjunction with an assistive technology scanning application within the XR environment that allows the user to select directions for navigation, and macros for communication and interaction.
- Gesture - this is where gesture-based controllers are the main input, used to navigate in an XR environment, interact with the objects in that environment and make selections using gestures alone.
- Eye Tracking - this is where eye tracking applications are the main input. Using a range of commands, a user should be able to navigate in an XR environment and interact with the objects in that environment using these eye tracking applications.
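On the web platform, the speech input described above could be approximated with the Web Speech API, as in the rough sketch below. SpeechRecognition is non-standard but widely shipped; the command phrases and the move()/select() handlers are hypothetical application code.

    declare function move(direction: string): void; // hypothetical handler
    declare function select(): void;                // hypothetical handler

    const Recognition = (window as any).SpeechRecognition
      || (window as any).webkitSpeechRecognition;
    const recognizer = new Recognition();
    recognizer.continuous = true; // keep listening across multiple commands
    recognizer.onresult = (event: any) => {
      const latest = event.results[event.results.length - 1];
      const phrase = latest[0].transcript.trim().toLowerCase();
      if (phrase === 'move forward') move('forward');
      else if (phrase === 'select') select();
    };
    recognizer.start();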
3.4 Various output modalities
The following are a list of outputs that can be available to a user to help them understand, interact with and 'sense' feedback from an XR application. Some of these are in common use on the Web, while others (such as Olfactory and Gustatory) are exploratory.
- Tactile - this is using the sense of touch, commonly referred to as haptics (a rough sketch follows this list).
- Visual - this is using the sense of sight, such as 2D and 3D graphics.
- Auditory - this is using the sense of sound, such as rich spatial audio, surround sound.
- Olfactory - this is the sense of smell.
- Gustatory - this is the sense of taste.
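For the tactile output above, one concrete (if platform-specific) web mechanism is gamepad vibration. A minimal sketch, assuming a controller that exposes vibrationActuator, which is a Chromium extension to the Gamepad API rather than a settled standard:

    // Send a short haptic pulse to the first connected gamepad, if any.
    function buzz(): void {
      const pad = navigator.getGamepads()[0];
      const actuator = pad && (pad as any).vibrationActuator;
      if (actuator) {
        actuator.playEffect('dual-rumble', {
          duration: 200,        // milliseconds
          strongMagnitude: 1.0, // low-frequency rumble
          weakMagnitude: 0.5,   // high-frequency rumble
        });
      }
    }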
3.5 XR controller challenges
As mentioned there are a range of input devices that may be used. Supporting these controllers requires an understanding of what they are and how they work. There are a variety of alternative gaming controls that may be very useful in XR environments and applications. For example the Xbox Adaptive Controller.
While XR is the experience, the controller is king, and plays a critical part in overcoming some complexity as well as mediating issues that may relate to other challenges around usability and helping the user understand sensory substitution devices.
Controllers such as the Xbox Adaptive Controller and other switch-type inputs allow the user to remap keyboard inputs to control virtual environments. These powerful customizations may allow the user to "do that thing that is difficult" for them with ease. In conjunction with such a controller, for example, users with limited mobility can also simulate actions in the XR environment that they would not be able to physically perform. WalkinVRDriver is a good example of this, where motion range, position and orientation can be set to the user's ability.
3.6 Customization of control inputs
Give the user the ability to modify their input preferences or use a variety of input devices. The remapping of keys used to control movement or interaction in virtual environments is not currently required by WCAG; it is nevertheless noted in the literature as desirable.
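A minimal sketch of such remapping follows: a key-to-action table the user can edit, persisted (here via localStorage) so preferences can travel with them. The action names and the perform() dispatcher are hypothetical.

    type Action = 'moveForward' | 'interact' | 'openMenu';
    declare function perform(action: Action): void; // hypothetical dispatcher

    const bindings = new Map<string, Action>([
      ['KeyW', 'moveForward'],
      ['KeyE', 'interact'],
      ['Escape', 'openMenu'],
    ]);
    // Let the user rebind a key and persist the preference.
    function remap(code: string, action: Action): void {
      bindings.set(code, action);
      localStorage.setItem('xr-bindings', JSON.stringify([...bindings]));
    }
    window.addEventListener('keydown', (e) => {
      const action = bindings.get(e.code);
      if (action) perform(action);
    });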
3.7 Using multiple diverse inputs simultaneously
A user with a disability may have several input devices. A user may switch 'mode' of interaction, or the tools used, and should be able to do so without the experience degrading into a poor one where they lose focus on a task and cannot return to it, or make unforced errors, accidental inputs and so on.
3.8 Consistent tracking with multiple inputs
There may be tracking issues when switching input devices. A tracking issue is where the user's focus may be lost or modified in unpredictable or unwanted ways, potentially pushing the user to make unwanted inputs or choices.
Outputs sent to multiple devices will need to be synchronised.
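A minimal sketch of preserving focus across a device switch, using the WebXR session's inputsourceschange event; the focusedObject state and restoreFocus() are hypothetical application code, and session is assumed to be an active XRSession:

    declare const session: XRSession;                     // assumed active session
    declare function restoreFocus(target: unknown): void; // hypothetical
    let focusedObject: unknown = null;

    session.addEventListener('inputsourceschange', () => {
      // A controller was added or removed; re-assert focus so the
      // user's place in the task is not lost.
      if (focusedObject) restoreFocus(focusedObject);
    });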
3.9 Usability and affordances in XR
An XR application should have a high level of usability for someone with a disability who is using assistive technology. Some challenges in translating interaction models may be:
- How can a user understand the affordance models used in XR interactions? Or can this be mediated by their own interaction preferences and controllers?
- What interactions are allowed or not allowed?
- How can an accessibility-abstracted XR experience, focussed on supporting a different modality, successfully interact with another?
4. XR User Needs and Requirements
4.1 User needs definition
This document outlines various accessibility related user needs for XR. These user needs should drive accessibility requirements for XR and its related architecture. These come from people with disabilities who use assistive technologies and wish to see the features described available within XR enabled applications.
User needs and requirements are often dependent on context of use. The following outline some accessibility user needs and requirements that may be applicable in immersive environments, augmented reality and 360° applications.
The following are neither exhaustive nor definitive, but are presented in order to help orientate the reader towards understanding some broad user needs and how to meet them.
4.2 Immersive semantics and customization
- User Need 1: A user of assistive technology wants to navigate, identify locations, objects and interact within an immersive environment.
- REQ 1a: Navigation mechanisms must be intuitive with robust affordances. Navigation, location and object descriptions must be semantically accurate and identified in a way that is understood by assistive technology (an illustrative sketch follows this list).
- REQ 1b: Controls need to support alternative mapping, rearranging of position, resizing and sensitivity.
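There is not yet a standard vocabulary for the semantics REQ 1a calls for. Purely as an illustration, the following sketch attaches ARIA-like name/role/description metadata to a scene object; the interface and all values are hypothetical.

    // Illustrative only: ARIA-like semantics for a 3D scene object.
    interface XRObjectSemantics {
      role: string;         // e.g. 'button', 'door', 'landmark'
      name: string;         // short accessible name
      description?: string; // longer description for assistive technology
    }
    const exitDoor: XRObjectSemantics = {
      role: 'door',
      name: 'Exit to lobby',
      description: 'Double door, two metres ahead, currently closed.',
    };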
4.3 Motion agnostic interactions
- User Need 2: A person with a physical disability may want to interact with items in an immersive environment in a way that doesn't require particular bodily movement to perform any given action.
- REQ 2a: Allow the user to perform an action in the environment, in a device-independent way, without having to do so physically.
- REQ 2b: Ensure that all areas of the user interface can be accessed using the same input method.
- REQ 2c: Allow multiple input methods to be used at the same time.
4.4 Immersive personalisation
- User Need 3: Users with cognitive and learning disabilities may need to personalise the immersive experience in various ways.
- REQ 3a: Support symbol sets so they can be used to communicate and be layered over objects and items to convey affordances or other needed information in a way that can be understood according to user preference.
- REQ 3b: Allow the user to turn off or 'mute' non-critical environmental content such as animations, visual or audio content, or non-critical messaging.
4.5 Interaction and target customization
- User Need 4: A user with limited mobility may need to be able to hit a larger 'Target size' for a button or other controls in immersive environments.
- REQ 4a: Ensure fine motion control is not needed to activate an input.
- REQ 4b: Ensure hit targets are large enough with suitable spacing around them.
- REQ 4c: Ensure multiple actions or gestures are not required at the same time to perform any action.
4.6 Voice commands
- User Need 5: A user with limited mobility may want to be able to use Voice Commands within the immersive environment, to navigate, interact and communicate with others in XR environments.
- REQ 5a: Ensure Navigation and interaction can be controlled by Voice Activation.
4.7 Color changes
- User Need 6: Color blind users may need to be able to customise the colors used in the immersive environment. This will help with understanding affordances on various controls or where color is used to signify danger or permission.
- REQ 6a: Provide customised high contrast skins for the environment to suit their particular luminosity and color contrast requirements.
4.8 Magnification context and resetting
- User Need 7: Screen magnification users may need to be able to check the context of their view in immersive environment.
- REQ 7a: Allow the screen magnification user to check the context of their view and track/reset focus as needed.
4.9 Critical messaging and alerts
- User Need 8: Screen magnification users may need to be made aware of critical messaging and alerts in immersive environments often without losing focus. They may also need to route these messages to 'second screens' (see REQ 1).
- REQ 8a: Ensure that critical messaging, or alerts have priority roles that can be understood and flagged to AT, without moving focus.
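For XR content hosted in a browser, one way to approximate REQ 8a today is to mirror critical messages into an ARIA live region, so existing screen readers announce them without any focus change. A minimal sketch:

    // A live region; role="alert" implies an assertive announcement.
    const alertRegion = document.createElement('div');
    alertRegion.setAttribute('role', 'alert');
    alertRegion.setAttribute('aria-live', 'assertive');
    document.body.appendChild(alertRegion);

    function announceCritical(message: string): void {
      alertRegion.textContent = message; // announced by AT; XR focus untouched
    }
    announceCritical('Session ends in one minute');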
4.10 Gestural interfaces and interactions
- User Need 9: A blind user may wish to interact with a gestural interface, such as a virtual menu system, using gloves and a head set.
- REQ 9a: Using a virtual menu system, enable a self-voicing option and have each category or item description spoken to the user as it receives focus via a gesture or other input, as the blind user gestures to trigger both movement and interaction. The user may get more detail about items that are closer to them; if navigating a virtual store, the user must be allowed to query and interrogate items and make selections.
4.11 Text description transformation
- User Need 10: A deaf or hard of hearing person, for whom English or any other written language may not be their first language, may have a preference for signing of text alternatives or equivalents.
- REQ 10a: Allow object or item text descriptions to be presented to the user via a signing avatar.
4.12 Safe harbour controls
- User Need 11: People with Cognitive Impairments may be easily overwhelmed in Immersive Environments.
- REQ 11a: Allow the user to set a 'safe place' - quick key, shortcut or macro.
4.13 Immersive time limits
- User Need 12: Users with cognitive impairments may be adversely affected by spending too much time in any immersive environment or experience, or may lose track of time.
- REQ 12a: Allow the user to set a time limit for any immersive session.
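A minimal sketch of REQ 12a for a WebXR session, using the XRSession.end() method; the minutes value would come from a user preference, and warnUser() is a hypothetical notification hook:

    declare function warnUser(message: string): void; // hypothetical

    function applyTimeLimit(session: XRSession, minutes: number): void {
      // Warn shortly before the limit, then end the immersive session.
      setTimeout(() => warnUser('One minute of immersive time remaining'),
                 Math.max(0, (minutes - 1) * 60_000));
      setTimeout(() => session.end(), minutes * 60_000);
    }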
4.14 Reset focus and orientation
- User Need 13: A screen magnification user or user with a cognitive disability or learning impairment may easily lose focus and be disorientated in immersive environments.
- REQ 13a: Ensure the user can reset and calibrate their orientation/view in a device-independent way.
- REQ 13b: Ensure fields of view in immersive environments are appropriate and can be personalised, so users are not disorientated.
4.15 Routing to second screens
- User Need 14: A deaf-blind user communicating via a RTC application in XR may have sophisticated 'routing' requirements for various inputs and outputs.
- REQ 14a: Allow the user to route text output, alerts, environment sounds and audio to braille or other devices.
4.16 Interaction speed
- User Need 15: Users with physical disabilities or cognitive and learning disabilities may find some interactions too fast to keep up with or maintain.
- REQ 15a: Allow users to change the speed at which they travel through an immersive environment, or at which they perform interactions.
- REQ 15b: Allow timings for interactions or critical inputs to be modified or extended.
- REQ 15c: Provide an XR angel or helper for the user with a cognitive or learning disability.
- REQ 15d: Provide clear start and stop mechanisms.
4.17 Avoiding sickness triggers
- User Need 16: Users with vestibular disorders, epilepsy or photosensitivity may find that some interactions trigger motion sickness and other effects.
- REQ 16a: Avoid interactions that trigger epilepsy or motion sickness and provide alternatives.
- REQ 16b: Ensure flickering images are kept to a minimum, will not trigger seizures (by flashing more than three times a second), or can be turned off or reduced.
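As a rough illustration of REQ 16b, a renderer could rate-limit flashing effects so they never exceed three flashes in any one second. A minimal sketch; the doFlash callback stands in for hypothetical application code:

    const MIN_FLASH_INTERVAL_MS = 1000 / 3; // at most three flashes per second
    let lastFlash = 0;

    function safeFlash(doFlash: () => void): void {
      const now = performance.now();
      // Drop flash requests that arrive too soon after the previous one.
      if (now - lastFlash >= MIN_FLASH_INTERVAL_MS) {
        lastFlash = now;
        doFlash();
      }
    }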
4.18 Binaural audio track alternatives
- User Need 17: Deaf and hard of hearing users may need binaural recordings of audio content in order to perceive it.
- REQ 17a: Provide alternative binaural audio recording tracks to emulate 3 dimensional sound forms in immersive environments.
4.19 Subtitling customization
- User Need 18: Users with vision impairments may need to customise subtitles and other text in immersive environments.
- REQ 18a: Allow customisable context sensitive reflow of text and subtitled content in XR spaces. The suitable subtitling area may be smaller than what is required currently for television [inclusive-seattle].
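A minimal sketch of such customisation for DOM-rendered subtitles follows; the preference names are hypothetical, and an engine-rendered caption panel could consume the same values.

    interface SubtitlePrefs {
      fontSizePx: number; // user-chosen text size
      maxWidthCh: number; // a narrower measure than TV captions may suit XR
      background: string; // e.g. 'rgba(0, 0, 0, 0.8)' for contrast
    }
    function applySubtitlePrefs(el: HTMLElement, prefs: SubtitlePrefs): void {
      el.style.fontSize = prefs.fontSizePx + 'px';
      el.style.maxWidth = prefs.maxWidthCh + 'ch'; // text reflows to this width
      el.style.background = prefs.background;
    }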
5. Related Documents
Other documents that relate to this and represent current work in the RQTF/APA are:
- XR Semantics Module - this document outlines proposed accessibility requirements that may be used in a modular way in immersive, augmented or mixed reality (XR). A modular approach may help us to define clear accessibility requirements that support XR accessibility user needs, as they relate to the immersive environment, objects, movement, and interaction accessibility. Such a modular approach may help the development of clear semantics, designed to describe specific parts of the immersive eco-system. In immersive environments it is imperative that the user can understand what objects are and understand their purpose, as well as other qualities and properties including interaction affordance, size, form, shape, and other inherent properties or attributes.
- WebXR Standards and Accessibility Architecture Issues - this document is informative and aims to outline some of the challenges in understanding the complex technical architecture and processes behind how XR environments are currently rendered. To make these environments accessible and provide a quality user experience it is important to also understand the nuances and complexity of accessible user interface design and development for the 2D web. Any attempt to make XR accessible needs to be based on meeting the practical user needs of people with disabilities.
A. Acknowledgements
A.1 Participants of the APA Working Group Active in the Development of This Document
- Matthew Tylee Atkinson, The Paciello Group
- Judy Brewer, W3C
- Michael Cooper, W3C
- Markku Hakkinen, Educational Testing Service
- Charles Hall, Invited Expert
- Scott Hollier, Invited Expert
- Charles LaPierre, Benetech
- Melina Möhlne, IRT
- Joshue O Connor, W3C
- Janina Sajka, Invited Expert
- Léonie Watson, TetraLogical
- Jason White, Educational Testing Service
A.2 Previously Active Participants, Commenters, and Other Contributors
Ian Hamilton
A.3 Enabling Funders
This work is supported by the EC-funded WAI-Guide Project.
B. References
B.1 Informative references
- [able-gamers]
- Thoughts On Accessibility and VR. AJ Ryan. March, 2017. URL: https://ablegamers.org/thoughts-on-accessibility-and-vr/
- [game-a11y]
- Game Accessibility Guidelines. Barrie Ellis; Ian Hamilton; Gareth Ford-Williams; Lynsey Graham; Dimitris Grammenos; Ed Lee; Jake Manion; Thomas Westin. 2019. URL: https://gameaccessibilityguidelines.com
- [inclusive-seattle]
- W3C Workshop on Inclusive XR Seattle. W3C; Pluto VR. W3C. Nov 2019. URL: https://www.w3.org/2019/08/inclusive-xr-workshop/
- [maidenbaum-amendi]
- Non-visual virtual interaction: Can Sensory Substitution generically increase the accessibility of Graphical virtual reality to the blind?. Maidenbaum, S.; Amedi, A. In Virtual and Augmented Assistive Technology (VAAT), 2015 3rd IEEE VR International Workshop on (pp. 15-17). IEEE. 2015.
- [web-adapt]
- W3C Workshop on Web Games Position Paper: Adaptive Accessibility. Matthew Tylee Atkinson; Ian Hamilton; Joe Humbert; Kit Wessendorf. W3C. Dec 2018. URL: https://www.w3.org/2018/12/games-workshop/papers/web-games-adaptive-accessibility.html