Internet Multimodal Access to Graphical Exploration (IMAGE)
Making internet graphics accessible to blind users through audio and touch.
Wearable Haptics
Sensor-embedded footwear and clothing for VR, training, and rehabilitation.
Next-Generation Avionics Interfaces
Developing flight-deck displays to support Required Time of Arrival (RTA) operations.
Observing ATC
Studying workflows of air traffic controllers and pilots to improve communication.
Enhancing Shopping Accessibility
Helping blind and low-vision individuals shop independently with vision-language reasoning.
Improved Information Display in High-Consequence Environments
Investigating augmented reality displays for monitoring patient vital signs.
AI-Digital Nurse Avatar (ADiNA)
A conversational avatar for interaction with older adults.
Multimodal Guidance for Future Taxi Operations
Researching visual and haptic cues to help pilots during surface trajectory-based operations (STBO).
Accessible Interpretation of Complex Charts
Using sonification to help blind users interpret stock data.
Voice-Based Interaction with Autonomous Co-Pilots
Exploring spoken interaction with an autonomous co-pilot to support shared decision-making.
Bionic Ear
Building a multimodal model of audio attention to steer intelligent listening devices.
Ultra-Videoconferencing
Flexible, low-latency IP transport system for audio, video, and vibrosensory data.
About Us
The Shared Reality Lab applies its engineering skills and human-centered design experience to problems of importance to society. We work with audio, video, and haptic technologies, mixed reality, machine learning, and mobile computing, building systems that facilitate and enrich both human-computer and computer-mediated human-human interaction. Active projects include audio and audio-haptic rendering of graphics for blind users, development of conversational avatars for healthcare applications, design of the flight deck of the future, and multimodal delivery of information to users in high-consequence environments. For questions about the lab, please contact Prof. Jeremy Cooperstock.
The Shared Reality Lab is currently funded by grants and contracts from the Natural Sciences and Engineering Research Council, Healthy Brains, Healthy Lives, CRIAQ, and HumanWare. Past funding sources include the Fonds Nature et technologies, Sécurité publique Québec, Canadian Internet Registration Authority, Networks of Centres of Excellence, Ministère du Développement économique, de l’Innovation et de l’Exportation, Secrétariat du Conseil du trésor, CANARIE, and Innovation, Science and Economic Development Canada, as well as industrial support from HP Labs, Google Research, Mozilla, InterDigital Corporation, Haply Robotics, and iMD Research.