By S. Dent
The Speech Accessibility Project aims to help people with Down syndrome, Parkinson's and more.
The University of Illinois Urbana-Champaign (UIUC) has partnered with Amazon, Apple, Google, Meta, Microsoft and nonprofits on the Speech Accessibility Project. The aim is to improve voice recognition for communities with disabilities and diverse speech patterns often not considered by AI algorithms. That includes people with Lou Gehrig's disease (ALS), Parkinson's, cerebral palsy, Down syndrome and other conditions that affect speech.
“Speech interfaces should be available to everybody, and that includes people with disabilities,” UIUC professor Mark Hasegawa-Johnson said. “This task has been difficult because it requires a lot of infrastructure, ideally the kind that can be supported by leading technology companies, so we’ve created a uniquely interdisciplinary team with expertise in linguistics, speech, AI, security and privacy.”
To include communities of people with disabilities like Parkinson's, the Speech Accessibility Project will collect speech samples from individuals representing a diversity of speech patterns. UIUC will recruit paid volunteers to contribute voice samples and help create a "private, de-identified" dataset that can be used to train machine learning models. The group will focus on American English at the start.