CS student’s Microsoft dissertation grant supports her vision disability research

Computer science graduate student Haley A. Adams has been awarded a 2021 Microsoft Research Dissertation Grant. She is one of 10 recipients in the United States and Canada who are underrepresented in the field of computing and whose research aligns with areas pursued by Microsoft researchers.

Microsoft aims to increase the pipeline of diverse talent receiving advanced degrees in computer-related fields by providing research funding opportunities for doctoral students. Recipients receive up to $25,000 for the 2021-22 academic year to help them complete research toward their doctoral theses. They are also invited to attend the company's two-day PhD Summit this fall to meet Microsoft researchers and share their research.


Adams combines her expertise in computer graphics, perceptual psychology, and human-computer interaction to understand how virtual and augmented reality affect the way people interact with their surroundings. Her dissertation work focuses on improving the accessibility of immersive technology for people with vision impairments, a population of more than 14 million people in the United States alone.

Adams is a researcher in the School of Engineering's Learning in Virtual Environments (LiVE) Laboratory, which is led by her adviser, Bobby Bodenheimer, professor of computer science and electrical engineering.

“Many people with vision impairments are unable to use immersive head-mounted displays (HMDs)—like Microsoft’s HoloLens or Facebook’s Oculus Quest—due to their inaccessible designs. Worse yet, best practices for accessibility in extended reality (XR) are still a nascent area of research,” she said. Adams is developing rendering techniques that improve perception in XR displays for both normally sighted and visually impaired users.

In her dissertation, "Exposing Blind Spots in XR Accessibility With Simulated Vision Impairments," Adams introduces a visual impairment simulation that is both eye-tracked and data-driven. The testbed will be validated by analyzing behavioral responses across both visually impaired and normally sighted users.
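To illustrate the general idea of a gaze-contingent (eye-tracked) impairment simulation, here is a minimal sketch. It is not drawn from Adams' dissertation: the function name, the use of a Gaussian blur, and the scotoma radius and strength are all illustrative assumptions, chosen only to show how vision can be degraded around the tracked gaze point, as a central scotoma (such as in macular degeneration) would degrade it.

```python
# Illustrative sketch only: a gaze-contingent central-scotoma filter.
# This is NOT code from Adams' dissertation; all parameters are
# assumptions chosen to demonstrate the concept.
import numpy as np
from scipy.ndimage import gaussian_filter

def simulate_central_scotoma(frame, gaze_xy, radius_px=80, sigma=12):
    """Blur the region around the current gaze point to mimic a
    central scotoma (loss of central vision).

    frame   -- H x W x 3 uint8 image (one rendered eye-buffer frame)
    gaze_xy -- (x, y) gaze position in pixels from an eye tracker
    """
    h, w = frame.shape[:2]
    # Heavily blurred copy of the frame (the degraded-vision layer).
    blurred = gaussian_filter(frame.astype(np.float32), sigma=(sigma, sigma, 0))
    # Radial mask: 1.0 inside the scotoma, falling smoothly to 0 outside it.
    ys, xs = np.mgrid[0:h, 0:w]
    dist = np.hypot(xs - gaze_xy[0], ys - gaze_xy[1])
    mask = np.clip(1.0 - (dist - radius_px) / radius_px, 0.0, 1.0)[..., None]
    # Composite: degraded vision at the gaze point, normal vision elsewhere.
    out = mask * blurred + (1.0 - mask) * frame.astype(np.float32)
    return out.astype(np.uint8)
```

In an actual head-mounted display pipeline, a filter like this would run on every rendered frame, with the gaze position updated each frame by the headset's eye tracker so the simulated impairment stays locked to where the user is looking.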

“My testbed will allow normally sighted individuals to better understand and design for people with visual impairments by allowing them to experience visual impairments firsthand,” she said. The results of Adams’ research may be used to ensure that immersive technology is accessible to the widest audience possible.

Contact: Brenda Ellis, 615 343-6314
brenda.ellis@vanderbilt.edu