AI and Disability: Benefits and Risks for Students and Instructors
In February, I attended a panel at axe-con called "Human-Centered AI and Accessibility" that combined two topics I often think about separately, but rarely together: AI and the disability community. The webinar is free, so I don't want to spoil it too much, but the speakers shared a variety of ways AI is already being applied to accessibility technology (and could be applied further), as well as the very real risks of applying it without due thought or caution.
The talk was interesting and informative, but it mostly got me thinking: every major technological advance I can think of has had both positive and negative effects on people with disabilities. The internet opened the floodgates of information and community for many people, and mobile devices make apps like BeMyEyes and Speak For Yourself possible – but inaccessible technology also leaves a lot of people behind. Just ask the many blind users who can look up a restaurant website, but not read the menu.
So: as AI takes over the world (hopefully not literally), I wanted to learn more about how it can also help or hinder students and instructors with disabilities.
Naturally, I started my research by asking ChatGPT, and after some back-and-forth conversation and a couple of iterations, it gave me the following benefits and risks:
Benefits of AI for Instructors and Students with Disabilities:
- Personalized Support and Adaptive Tools: AI offers customized learning experiences for students and adaptive teaching resources for instructors, accommodating unique needs and enhancing academic success for both.
- Assistive Communication and Engagement: AI-powered communication aids and immersive technologies improve classroom interactions and create engaging, dynamic learning environments, benefiting students and instructors with disabilities.
- Enhanced Mobility and Independence: AI technologies like robotics and exoskeletons assist with physical navigation and independence, supporting both students and instructors in accessing and participating fully in educational activities.
Risks of AI for Instructors and Students with Disabilities:
- Dependence on Technology: Over-reliance on AI tools may reduce the development of essential skills and coping strategies for both students and instructors, potentially impacting effectiveness and independence.
- Bias and Inequity: AI systems may perpetuate biases, leading to unequal access to resources and opportunities, and failing to adequately address the diverse needs of students and instructors with disabilities.
- Privacy Concerns: The collection of sensitive data about disabilities raises privacy and security concerns, posing ethical and legal challenges for both students and instructors in managing and protecting this information.
- Technical Barriers: The effectiveness of AI tools is limited by technical issues, insufficient resources, and lack of training, which can hinder the benefits for both students and instructors with disabilities.
- Exclusion from Mainstream Activities: Poor integration of AI solutions can lead to feelings of isolation and exclusion from general classroom activities, affecting social inclusion and participation for both students and instructors with disabilities.
Not wanting to rely solely on AI, I asked Macmillan Learning’s AVID (Awareness of Visible and Invisible Disabilities) employee resource group the same question in our monthly meeting. What do my fellow disabled employees think are the actual benefits and risks AI presents them with? The responses I got only proved that the human element remains invaluable – ChatGPT could never have predicted some of what was said.
Of course, some of the ideas put forward by ChatGPT were echoed by our community members. We talked about the potential promise of virtual tools that could help with organization or communication, and some further added that they already use ChatGPT to help lessen anxiety around things like brainstorming ideas or writing emails. Community members also agreed with ChatGPT's concerns around AI causing job displacement, and some even shared real-world examples they'd witnessed. Others recounted more disturbing encounters, ranging from Google Search AI's now-infamous recommendation to drink urine for kidney stones to actual medical misinformation showing up in AI tools used by their doctors' practices. In researching this post, I landed on a story about parents with disabilities being possibly flagged by AI for presumed neglect.
Overall, AVID members were most concerned about losing the human touch. People with disabilities often have specific needs or circumstances that may require out-of-the-box thinking; our current crop of AI tools, by definition, can't think too far outside of the box. One of my own chronic illnesses was ultimately diagnosed because a doctor thought to ask a question that no one else had thought to ask; if she had relied too much on AI, would I still be undiagnosed today? Will workers with disabilities who don't fit certain norms find themselves on the wrong end of an employer AI tool's decision-making? Will things like AI accessibility tools and AI-powered design make it easier for people with disabilities to navigate the world, or will they lead to less human oversight? Without enough buy-in and/or stringent checks and balances, these are very real concerns.
Ultimately, as one AVID member put it: we keep learning about AI tools and uses we didn't know about, for better and for worse. As AI continues to surge forward, how can we make sure that for people with disabilities, especially students and instructors, it's better more often than not?
Resources:
- Human-Centered AI and Accessibility – a free webinar from axe-con
- Generative AI holds great potential for those with disabilities – but it needs policy to shape it
- Trained AI models exhibit learned disability bias
- Why AI fairness conversations must include disabled people
- Designing Generative AI to Work for People with Disabilities