AI Is Not (Necessarily) Accessible

Guest Post

This guest post was written by Dr. Mary Rice of the University of New Mexico, who serves as an NCADEMI National Advisory Council member.

Many people became very excited about artificial intelligence (AI) at the end of 2022. AI existed previously, but mainly engineers and computer scientists knew about it. Different types of AI programs have different designs and do different jobs. The kind of AI that got people excited in 2022 was called a Large Language Model (LLM). It uses language that already exists in places like the internet. LLMs study language using math to learn patterns, then make new paragraphs and sentences. Besides LLMs that gather, study, and then make new language, other forms of AI exist. Most forms of AI involve learning from something existing, like photos or weather patterns.  
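To give a sense of what "studying language using math to learn patterns" means, the toy Python sketch below counts which word tends to follow which in a tiny text sample, then predicts the most likely next word. This is a deliberate simplification: real LLMs learn from vastly more text using neural networks, but the core idea of learning word-order probabilities is the same. The sample text and function names here are purely illustrative.

```python
from collections import Counter, defaultdict

# A tiny text sample standing in for the vast amount of existing
# language an LLM learns from.
sample = "the cat sat on the mat and the cat slept on the mat".split()

# Count which word follows each word -- a "bigram" model, a very
# simplified stand-in for the patterns an LLM learns.
following = defaultdict(Counter)
for current_word, next_word in zip(sample, sample[1:]):
    following[current_word][next_word] += 1

def most_likely_next(word):
    """Predict the word most likely to come after `word`,
    based only on the counted patterns."""
    return following[word].most_common(1)[0][0]

print(most_likely_next("on"))  # prints "the", because "the" always follows "on" here
```

Note that the program has no idea what a cat or a mat is; it only knows which words tended to appear after which. That is also why, as discussed later in this post, such programs cannot tell whether the language they produce is true.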

Magnifying glass focused on a student working on a laptop
Image created using ChatGPT 4o, 6/23/25. OpenAI.

Some educators and policy makers hoped AI would make learning on devices, such as mobile phones, tablets, or laptops, more accessible for students with disabilities. Accessibility happens when students with disabilities can use the same materials in the same manner and at the same time as students without disabilities. For example, when students watch videos in class, the videos need captions so deaf students can learn. Since AI uses existing language to make more language, its programs can make captions for video. While captions made with AI programs might be better than having no captions at all, there can be limitations. For example, captions made with AI might not identify people’s names correctly. Also, because AI-generated captions might not recognize variations in how people speak English, they might not be understandable. Finally, when a system that uses AI thinks it is doing the right thing, it continues to do it without changing. Therefore, errors can be difficult to correct.

In addition, many educators are excited about chatbots that respond when asked questions. In school, many questions are about how to do tasks, such as “How do I find the slope of a line?” in math. Other tasks might be social or emotional, such as “How can I deal with stress?” When students need something explained more than once, a chatbot can repeat the explanation as many times as needed. These features of chatbots can support educators with personalizing learning and providing more immediate supports for students with disabilities.

Using AI at school has some potential to help students. However, it is important to remember that a product or service is not accessible just because it uses AI. The technology may not be able to interpret the words of people who speak differently from those whose language was used to build the program. For example, AI might not recognize the words of a student with a speech impairment. Some AI programs interfere with assistive technologies like screen readers by blocking their use. Some AI programs cover the buttons that people use to customize accessibility settings, such as color contrast. This makes the materials inaccessible.

The programs that use AI do not know whether the language they make is true or not. All the program can do is determine whether words are likely to occur in a specific order. LLMs can generate sentences that are grammatically correct and seem to be true, but they may not really be so. This can be a barrier to accessibility because some students with disabilities may need more time than students without disabilities to learn from the less-natural language made by AI. Schools that adopt AI programs for student use must be aware of such barriers and ensure accommodations are provided for students with disabilities.  

Two student using technology, one using a cane - with icons representing AI and accessibility
Image created using ChatGPT 4o, 6/23/25. OpenAI.

What’s more, AI does not always check for truth and fairness. For example, if the language a program learned from contains lies, misrepresentations, or unkind statements about people who have disabilities, those will be reflected in the language the program makes. This is an accessibility barrier because it costs time when students must confront and dismiss stereotypes to keep working.

Finally, many types of AI used in schools need new language all the time to keep functioning. One source of language for AI is the population of students who use it. AI programs use and store everything they are given. The programs keep personal information about family or geographic location, as well as information about individuals’ disabilities. Information is shared intentionally when a student says to a chatbot, “I have dyslexia.” Sharing also happens unintentionally. For example, a dyslexic student could enter language into a program, and the characteristics of dyslexia may show up in what they write. Schools do not always know what information is being collected and where it is going.

Disability advocates have warned that people with disabilities can be restricted in where they can work or live when information about them is made publicly available. Decision makers who access the information might discriminate. For example, an employer might be able to ask an AI program for information about a prospective job candidate, and a specific disability identification might be disclosed. Colleges or other education and training programs might also be provided information by AI programs that have been used by individuals with differences and disabilities. That could happen if AI programs were trained with the social media posts, medical records, educational records, and/or previous employment records of a person with a disability. These disclosures might include direct naming of conditions and diagnoses or information that would help someone make an inference. A landlord might also access this type of information and not allow a person to rent a house. A bank might use it to deny a loan.

Educators need to ask questions about AI prior to using this technology with students with and without disabilities. Here are some examples of questions about accessibility and AI in schools: 

  1. How do educators expect that AI will support the learning of all students?
  2. How will AI affect the means of learning? The manner of learning? The time needed to learn?
  3. What information will be given to students about what AI is and what it does (and cannot do)?
  4. What steps are school leaders taking to protect students’ personal information and privacy?
  5. What alternatives do students have when they do not want to use AI programs at all, on a specific day, or for a specific task?

What are your questions about AI and accessibility for students with disabilities? Have you experienced any of the above examples? Please share your thoughts in the comments.