Today, as we observe Global Accessibility Awareness Day, it’s an opportune moment to review the strides Meta is making to ensure its products are accessible to a wider audience. Meta’s commitment to accessibility is central to making sure that technology benefits everyone. Here, we delve into some of the latest advancements Meta has made in this field.
### Enabling Hands-Free Navigation with Ray-Ban Meta Glasses
Meta has introduced an innovative product, the Ray-Ban Meta glasses, which offer a truly hands-free experience. These glasses are integrated with Meta’s artificial intelligence, allowing users to perform everyday tasks more conveniently. While these glasses cater to everyone, they are particularly beneficial for individuals who are blind or have low vision.
With Ray-Ban Meta glasses, users can effortlessly capture and share photos, send text or voice messages, make phone calls, conduct video calls, listen to music, translate speech in real time, and interact with Meta AI for instant assistance. Since launch, millions of people have used the glasses to share moments with loved ones, and the glasses continue to become available in more regions around the world.
Starting today, Meta is rolling out a new feature that lets users customize Meta AI to offer detailed responses based on their surroundings. The feature will deploy in the U.S. and Canada over the coming weeks, with a broader rollout to follow. To activate it, users simply need to open Device settings in the Meta AI app and toggle on detailed responses under Accessibility.
Moreover, Meta is excited to announce the launch of the “Call a Volunteer” feature, developed in collaboration with Be My Eyes. This feature will soon be available across all 18 countries where Meta AI is supported. “Call a Volunteer” connects individuals who are blind or have low vision with sighted volunteers, facilitating real-time assistance for everyday tasks.
### Advancing Human-Computer Interaction with Wristband Devices
Meta is pushing the boundaries of human-computer interaction (HCI) through the development of wristband devices. These devices are particularly beneficial for individuals with diverse physical abilities, including those experiencing hand paralysis or tremors. Using surface electromyography (sEMG), the wristbands read the electrical signals generated by muscle activity at the wrist and translate them into commands for computing systems, providing an accessible interface for people who may not be able to make large movements.
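To make the idea concrete, here is a minimal sketch of how an sEMG decoder might turn a window of muscle signals into a gesture. This is purely illustrative and not Meta’s pipeline: the channel count, sampling rate, window size, and the linear scoring model are all assumptions standing in for trained neural networks.

```python
# Illustrative sketch only; not Meta's actual pipeline. The channel
# count, sampling rate, window size, and the linear scoring weights
# are all hypothetical stand-ins for trained models.
import numpy as np

SAMPLE_RATE_HZ = 2000   # sEMG is typically sampled in the kHz range
WINDOW_MS = 100         # short windows keep interaction latency low
CHANNELS = 16           # hypothetical electrode count around the wrist

def rms_features(window: np.ndarray) -> np.ndarray:
    """Root-mean-square amplitude per channel, a common sEMG feature."""
    return np.sqrt(np.mean(window ** 2, axis=0))

def decode_gesture(window: np.ndarray, weights: np.ndarray,
                   labels: list[str]) -> str:
    """Score each candidate gesture as a linear function of features."""
    features = rms_features(window)   # shape: (CHANNELS,)
    scores = weights @ features       # shape: (len(labels),)
    return labels[int(np.argmax(scores))]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    samples = SAMPLE_RATE_HZ * WINDOW_MS // 1000
    window = rng.normal(size=(samples, CHANNELS))   # fake signal window
    weights = rng.normal(size=(3, CHANNELS))        # stand-in parameters
    print(decode_gesture(window, weights, ["pinch", "swipe", "rest"]))
```

A real decoder would replace the linear scoring with a trained network and stream windows continuously, but the overall shape of the computation (features in, gesture label out) is the same.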
The sEMG wristband used in the Orion AR glasses prototype represents a significant step forward in this technology. Meta is investing in collaborative research focused on accessibility use cases, ensuring that these products are usable by a diverse range of people.
In April, Meta completed data collection with a clinical research organization to assess the potential of sEMG-based models for users with hand tremors due to conditions such as Parkinson’s disease and essential tremor. Additionally, Meta has an active research collaboration with Carnegie Mellon University aimed at enabling individuals with hand paralysis to use sEMG-based controls for HCI. This research is promising: it demonstrates that even minimal motor signals can support HCI, allowing users to begin interacting with systems from day one of use.
### Breaking Down Communication Barriers in the Metaverse
Meta is also working to make the metaverse more accessible by implementing live captions and live speech features in its extended reality products. Live captions convert spoken words into text in real time, allowing users to read content as it is delivered. The feature is available at several levels: system-wide on Quest, in Meta Horizon calls, and in Meta Horizon Worlds.
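As an illustration of the captioning loop itself, the sketch below uses the third-party SpeechRecognition package to transcribe short chunks of microphone audio as they arrive. It is an assumption-laden stand-in, not the engine Quest devices actually use.

```python
# Illustrative captioning loop using the third-party SpeechRecognition
# package; not the engine Quest devices use.
# Requires: pip install SpeechRecognition pyaudio
import speech_recognition as sr

def caption_loop() -> None:
    recognizer = sr.Recognizer()
    with sr.Microphone() as source:
        recognizer.adjust_for_ambient_noise(source)  # calibrate once
        print("Listening... (Ctrl+C to stop)")
        while True:
            # Short phrase windows approximate "real time": each chunk
            # of speech is transcribed and displayed as it arrives.
            audio = recognizer.listen(source, phrase_time_limit=5)
            try:
                print(recognizer.recognize_google(audio))
            except sr.UnknownValueError:
                pass  # no intelligible speech in this chunk
            except sr.RequestError as err:
                print(f"[caption service unavailable: {err}]")

if __name__ == "__main__":
    caption_loop()
```

A production captioning system would use a streaming recognizer that emits partial hypotheses as the user speaks, rather than chunked final results, but the loop structure is similar.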
The live speech feature converts text into synthetic audio, providing an alternative communication method for individuals who may have difficulty with verbal interactions or prefer not to use their voices. Since its introduction, live speech has seen high retention rates, prompting Meta to enhance it further by allowing users to personalize and save frequently used messages.
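A minimal live-speech sketch looks like the reverse of captioning: text goes in, synthetic audio comes out, and frequently used phrases can be saved for quick access. The version below uses the offline pyttsx3 package and a hypothetical phrase store; it is not Meta’s implementation.

```python
# Illustrative sketch; not Meta's implementation. Uses the offline
# pyttsx3 text-to-speech package (pip install pyttsx3). The saved
# phrases are a hypothetical example of personalization.
import pyttsx3

SAVED_PHRASES = {
    "1": "Hi, I use text-to-speech to communicate.",
    "2": "One moment, please.",
}

def speak(engine, text: str) -> None:
    """Queue a message and block until playback finishes."""
    engine.say(text)
    engine.runAndWait()

if __name__ == "__main__":
    engine = pyttsx3.init()
    choice = input("Pick a saved phrase (1/2) or type a message: ")
    speak(engine, SAVED_PHRASES.get(choice, choice))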
Meta’s work with Llama, a collection of open-source AI models, further underscores its commitment to accessibility. Developers at Sign-Speak have used Llama’s capabilities to create a WhatsApp chatbot that translates American Sign Language (ASL), facilitating communication between Deaf and hearing individuals. The software lets a Deaf person sign in ASL to a device’s camera; the signing is translated into English text for the hearing person. The hearing person can respond by voice or text, and the software signs the reply back to the Deaf person through an avatar.
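The paragraph above describes a three-stage pipeline: sign recognition, language translation, and avatar rendering. The sketch below shows that shape with stub functions; every name and return value is hypothetical, since Sign-Speak’s models and APIs are not reproduced here.

```python
# Hypothetical pipeline sketch of the ASL relay described above.
# Every function is a stub standing in for a real model; Sign-Speak's
# actual models and APIs are not reproduced here.
from dataclasses import dataclass, field

@dataclass
class VideoClip:
    frames: list = field(default_factory=list)  # camera frames of the signer

def recognize_asl(clip: VideoClip) -> str:
    """Stand-in for a vision model that maps signing to ASL glosses."""
    return "HELLO HOW YOU"

def gloss_to_english(glosses: str) -> str:
    """Stand-in for a language model (e.g. Llama) that renders ASL
    glosses as fluent English for the hearing participant."""
    return "Hello, how are you?"

def english_to_avatar(text: str) -> str:
    """Stand-in for the reverse path: English rendered as signed ASL
    by an animated avatar."""
    return f"<avatar signs: {text}>"

if __name__ == "__main__":
    # Deaf-to-hearing direction
    print("To hearing user:", gloss_to_english(recognize_asl(VideoClip())))
    # Hearing-to-Deaf direction
    print("To Deaf user:", english_to_avatar("I'm doing well, thanks!"))
```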
### Commitment to a More Accessible Future
Meta’s ongoing investment in accessibility features and products underscores its dedication to making connections easier for everyone. As technology continues to evolve, Meta is committed to addressing the diverse needs of billions of people worldwide who rely on its products. By fostering innovation and inclusivity, Meta is paving the way for a future where technology is accessible to all.
For more information on Meta’s efforts and developments in accessibility, you can visit their official channels.