
Will Artificial Intelligence Redefine Accessibility in the Workplace?

I suffer from a degenerative corneal disease known as “keratoconus,” a gradual thinning of the cornea that causes blurred vision and halos around lights as the condition progresses. While I can do most things normally, I rely on various solutions to help. Among them is an application called “Be My Eyes.” First launched in 2015, the app connects visually impaired users with sighted volunteers through a video call. Once connected, the volunteer guides the user through whatever task they need help with. I have used the app to locate my gate in airports and to read small print. Sometimes I would use “Be My Eyes” in restaurants, where a volunteer would read out the beer names from an overhead chalkboard, adding humorous commentary on the choices. We would discuss the delights of different IPAs.

There’s something truly special about using the app, especially the first few times. Connecting blind and low-vision users with sighted volunteers offers a real opportunity for empathy and a better understanding of the daily realities of blindness. On a practical level, however, “Be My Eyes” depends on volunteers being available and on their ability to guide users through the task at hand. The app can feel more like a quirky, fun experience than a reliable tool. With a new update, that is about to change.

The “Be My Eyes” app introduces a digital assistant powered by AI

This winter, the app is launching an AI-powered digital assistant called “Be My AI.” Users open the app and point their phone at whatever is in front of them. The app then provides a detailed spoken description of the scene and other relevant details, covering things like furniture, vehicles, people, and appliances. Users can ask follow-up questions if they need more detail or answers to specific questions.

Powered by GPT-4, OpenAI’s large multimodal model, “Be My AI” can perform the same function as a human volunteer for most basic tasks while offering better privacy and a more consistent experience. It is not a substitute for a cane or a guide dog, but for people with vision loss it offers a level of independence and self-sufficiency that once seemed impossible. It is currently available to users in beta.

AI advancements breaking down accessibility barriers in the workplace

Assistive technology has always been essential for people with vision loss: screen magnifiers, text-to-speech functions, dark mode, and Braille displays. But advances in artificial intelligence have helped break down accessibility barriers in the workplace, especially when it comes to remote work. Virtual assistants like Siri and Alexa can help users navigate computers and phones, while programs like Microsoft Copilot can describe in detail what is happening in an image, which is particularly useful when the image lacks alternative text.

Many people with accessibility needs have been using AI tools in new and innovative ways. Take broadcaster Stephen Scott, co-host of Access Tech Live. His weekly program tackles technology from an accessibility perspective, offering insight into how people with disabilities use various tools. It’s a smart and entertaining viewpoint on a rapidly evolving field from someone who knows it well. Scott uses the tools himself, integrating AI into his preparatory work for the show.

While sighted people can quickly skim articles online, Scott relies on a screen reader to get through the news, which can be a slow process. A sighted person can scan an article in seconds to pull out the pertinent information; listening to entire stories read aloud can take hours. Instead, Scott feeds links into ChatGPT to get concise summaries of articles, then digs deeper into a topic if he needs more information. ChatGPT lets him draft text much faster, an innovation born out of necessity.


Scott also uses AI-generated summaries to cut through the current internet noise. With every site bombarding users with pop-up ads – subscribe to our newsletter! Click our sponsors! Give us your firstborn! – navigating those spaces can be a real hurdle.

Successful integration requires team collaboration and a general willingness to learn

Successful integration requires collaboration among team members and a general willingness to learn. “There is a responsibility on all of us to address our issues,” Scott says. “I acknowledge the challenges and work to overcome them, but we can only do this together as a community,” he adds.

According to the Department of Labor, nearly 1.8 million more people with disabilities have entered the U.S. labor force since before the pandemic, a 28% increase. One of the main drivers is remote work. It’s an encouraging statistic, especially considering that people with disabilities face a much higher unemployment rate than people without. New AI-powered tools will help sustain this growth and point toward a more equitable future. But more importantly, I think we need to remember that just because someone may need to do something differently (or may need assistance to accomplish it) does not mean they cannot do it at all. Remote work enables people with disabilities to access good jobs, but we all need to create the right environment for employees to succeed. With tools like “Be My AI” and other emerging assistive technologies, that is becoming easier and easier to achieve.

Author

Graham Isador is a writer based in Toronto. He was a contributing editor at VICE, and his work regularly appears in The Globe and Mail and GQ.

Source: https://blog.dropbox.com/topics/work-culture/will-ai-redefine-workplace-accessibility

