WEBCAST FEB 8: A11yVR – Accessible Multimodal Input in Augmented Reality Training Applications

On Tuesday, February 8, 2022 at 19:00 EST (Wednesday, February 9 at 00:00 UTC / 09:00 JST), the Accessibility VR Meetup hosts "Accessible Multimodal Input in Augmented Reality Training Applications" online. Ashley Coffey will share recent PEAT research on making immersive hybrid workplaces more inclusive and accessible, reflecting on themes explored in an "Inclusive XR in the Workplace" white paper co-authored with the XR Association. Tim Stutts will share his design experience with multimodal inputs (hardware controls, gesture, voice) and sensory feedback (graphics, LEDs, sound, haptics) for augmented reality head-mounted displays. He will touch briefly on his previous work on Magic Leap's Lumin OS, symbolic input, and the surrounding accessibility efforts, before diving into the current Vuforia Work Instructions applications for HoloLens, RealWear, and, most recently, Magic Leap, with the debut of the Capture and Vantage applications on that platform.

The session will be webcast live via a partnership with the Internet Society New York Chapter Accessibility SIG (A11ySIG).

LIVESTREAM: https://livestream.com/internetsociety/a11yvr22 (open captions)

PARTICIPATE VIA HUBS: https://www.meetup.com/a11yvr/events/282792497/ (open captions)

REAL TIME TEXT: https://www.streamtext.net/player?event=a11yvr

TWITTER: @a11yvr @ashleycoffey @PEATWorks @timstutts @Vuforia @PTC @magicleap #a11y #AR #XR #VR @IAAPOrg @MozillaHubs #captioned

SIMULCAST (open captions): https://youtu.be/HERoRreHNAQ (monitored for comments)

ARCHIVE: https://archive.org/details/a11yvr22