Microsoft's 14th Ability Summit puts AI at center stage in the effort to close the disability divide, with promises of accessible features for Windows Copilot, Azure AI, and other Microsoft applications. The annual summit is Microsoft's one-day event celebrating people with disabilities, their allies, and accessibility leaders, alongside educational sessions.

Headlining the summit's accessible AI push are new Copilot features supporting live captions and a narrator option, reportedly arriving as soon as late March 2024.

(Photo: JOSEP LAGO/AFP via Getty Images)
People visit the US technology company Microsoft's stand during the Mobile World Congress (MWC), the telecom industry's biggest annual gathering, in Barcelona on February 26, 2024.

Copilot's natural language processing features were also highlighted in a separate blog post as making it easier for users to request or create adjustments tailored to their needs, such as helping people with various disabilities navigate color-coded charts or simplifying difficult material.

A plethora of upgrades for Microsoft apps was also announced at the event. One such update is Accessibility Assistant, a set of tools that helps content creators produce accessible material. It is already available in the Word Insider preview and will come soon to PowerPoint and Outlook.


Azure AI's Accessibility

Azure AI was presented as the backbone of a more accessible Microsoft environment, expected to power a slew of applications that help close the disability divide, such as Seeing AI, a mobile app created with and for the blind community.

Seeing AI helps with everyday tasks such as understanding one's surroundings, reading mail, and identifying products. Users can also draw on its natural language capabilities to converse with the app and ask questions about a picture or document.

Additionally, Azure AI will support AI-powered audio descriptions, a new accessibility-focused tool that opens up many possibilities for people who are blind or have low vision. More detailed and easily understandable video descriptions are now possible thanks to improved computer vision capabilities.

Alongside these new AI applications and features, Azure will also power more personalized communication through an open-source picture-board communication app and mental health support chatbots.

Accessible Technology Developed by People with Disabilities

Most significantly, Microsoft has reportedly promised to make AI development accessible to developers of all skill levels, enabling developers with disabilities to participate in the field.

This is expected to pave the way for the next generation of AI-driven accessibility solutions, built by people with lived experience and capable of helping even more people. In the post, technology was also positioned as a way to advance long-term goals, such as finding a cure for ALS (motor neuron disease).

Microsoft is reportedly pleased to be helping Answer ALS and the ALS Therapy Development Institute (TDI) nearly double the amount of clinical and genomic data available for study by utilizing Azure.

Answer ALS made its research publicly accessible in 2021 via the Neuromine Azure Data Portal, and that data has reportedly enabled more than 300 independent research projects worldwide since then. By incorporating data from ALS TDI into the ongoing ALS Research Collaborative (ARC) study, researchers can move closer to a cure.

