March 1, 2026 | Vol. 83, No. 6

Reimagining Accessible Literacy

AI-enabled tools are reshaping reading and writing supports for learners with diverse needs.

Accessibility & Inclusive Learning | Artificial Intelligence
A young girl wearing wireless earbuds writes on a tablet, with digital learning icons — including a virtual classroom, certificate, and video screen — floating above the device.
Credit: Family Stock / Shutterstock
For generations, educators have worked to make reading and writing accessible to all students. They recognize that classrooms hold stories waiting to be told—stories that live in hesitant voices as they find confidence or in handwritten notes revised and rewritten—and they work to support every student in sharing their stories, even if learners face obstacles along the way. Students with dyslexia and dysgraphia, learners who are blind or have low vision, and those learning English are among the many students who may encounter barriers to fully experiencing the power of words. In today’s classrooms, accessibility tools enhanced by artificial intelligence (AI) are helping educators and students discover new ways to read, reflect, and respond, while expanding what it means to be literate in a digital world.

AI as an Ally

It’s easy to view accessibility as something extra, like an add-on feature to the learning process. We’ve seen standalone tools such as screen readers, closed-captioning software, or speech-to-text programs designed to help learners catch up or keep pace. Too often, these tools are introduced only after a struggle is revealed. The cycle is familiar: A student shows signs of difficulty, assessments are conducted, meetings are held, and eventually, accommodations are made. How much time passes between concern and commitment? What is the cost to student confidence, participation, and belonging in the meantime?
When accessibility tools are paired with AI, they shift from reactive helpers to proactive partners. Consider Amira, an AI reading assistant programmed to measure fluency and vocabulary. Instead of waiting for an identified need, the platform can be designed to perform the following real-time tasks:
  • Adapt pace and presentation: Monitor a student’s reading rhythm and adjust speed, line spacing, or highlighting to support focus and fluency.
  • Detect cognitive load: Recognize when a reader hesitates or rereads and offer instant scaffolds such as glossaries, visuals, or brief summaries without interrupting flow.
  • Personalize supports: Provide audio narration, multilingual overlays, or simplified explanations based on learner engagement.
  • Provide actionable insights: Flag emerging patterns that can help teachers intervene early with targeted strategies.
Tools like these show how AI can be positioned as an ally, a partner that helps accessibility tools recognize patterns faster, respond sooner, and support more learners. Isabelle Hau, executive director of Stanford Accelerator for Learning and author of Learn to Love: The Transformative Power of Care and Connection in Early Education (PublicAffairs, 2025), envisions these AI-based interactions as partnerships where “educators, students, and accessibility tools would feel like a circle of connection—where every voice is heard, every learner is seen, and technology quietly strengthens the bonds between them.” Hau continues:
AI would not replace relationships but deepen them—helping teachers notice what might otherwise be missed, and giving each child new ways to express curiosity, creativity, and care. When accessibility tools become instruments of connection, they don’t just level the playing field; they expand the possibilities of human learning for all. (personal communication, October 31, 2025)
This partnership between AI and accessibility tools hints at a larger evolution: a chance to move beyond assisting readers to reimagining the very experience of reading.

AI as an Amplifier

What does it mean to read? Merriam-Webster defines “read” as “to take in the sense of (letters, symbols, etc.) especially by sight or touch.” This broad definition differs slightly from how we experience the act of reading, which traditionally refers to the activity of comprehending meaning through text (National Council of Teachers of English, 2019). Moreover, many of us think of reading as opening a book and looking at the words on a page.
Given these distinctions, it’s not surprising there are disagreements on the best methods and supports to teach students how to read (Petscher et al., 2020). For learners who are blind or have low vision, traditional supports include large print and braille. Learners with dyslexia may rely on friendly fonts, text-to-speech, or features such as highlighting. Other available tools include audiobooks and devices that can scan text and read it aloud.
Despite various supports, students are still struggling to read (National Assessment of Educational Progress, 2024). Reports point to multiple causes for the decline, ranging from gaps in foundational skills to the lingering effects of the COVID-19 school closures (Schwartz, 2025). But the challenge runs deeper than mechanics. Reading today competes with new modes of information, such as videos, images, and generative media, which reshape how learners encounter and interpret text. As a result, we must consider how we design reading experiences that meet learners where they are. Here, AI can act as an accessibility amplifier. For students with diverse learning needs, AI enhances how learners interact with text and meaning.
  • For students with dyslexia or ADHD: AI-driven tools (e.g., Microsoft Immersive Reader) now use natural, expressive voices and synchronized highlighting to guide the reader’s attention line by line, reducing cognitive overload and improving comprehension.
  • For students who are blind or have low vision: AI vision models (e.g., Be My AI) can interpret images, graphs, and even handwritten notes, converting them into spoken or tactile formats so learners can access all parts of a text.
  • For multilingual learners: AI translation models such as DeepL can preserve nuance, idioms, and tone, helping learners understand content across languages while building literacy, instead of forcing them to switch between separate translation and learning tools.
Personalization further expands accessibility for all students. With just a few prompts, generative AI tools like ChatGPT, Gemini, and Nano Banana can create text and images based on a learner’s needs and interests, helping them better connect to the learning material. Built-in multimodal features, such as text-to-speech and interactive visuals, allow learners to choose how they read and respond. Learners are more likely to stay engaged when what they read is relevant to their lives and experiences and when they can take greater ownership over their learning.


AI as a Co-Creator

If reading is the way we take in ideas, writing is the way we let them out. In ancient times, humans expressed ideas through images and symbols carved into rocks. Centuries later, we started to use quills and ink. The printing press transformed those private expressions into shared knowledge, while the typewriter gave words rhythm and speed. The computer invited collaboration, allowing text to flow. Today, writing continues to evolve through digital tools and AI, where words can be dictated, translated, and refined in real time. At its core, writing has always been about the connection between thought and expression—and new technologies are helping more people find their voice.
Kalantzis and Cope (2024) define generative AI as a technology of writing. The researchers compare the multimodal expressions of Indigenous peoples (image, song, and dance) to the capabilities of AI, such as writing text, reading images, and generating images from text-based prompts. In this way, they suggest that AI’s multimodal capabilities function as contemporary counterparts to Indigenous expressive modalities, with each expanding the range of how ideas can be communicated beyond alphabetic text. Just as AI tools and AI-related systems can make reading more accessible for learners, the technology can also open more opportunities for expressing ideas through writing.
  • Speech-to-text and dictation tools (e.g., Otter.ai): Convert spoken ideas into written form, supporting students with dysgraphia, mobility challenges, or those who think best aloud.
  • Differentiated graphic organizers (e.g., Eduaide.ai): Educators can generate digital organizers with varied scaffolds, including visual cues, simplified text, or structured prompts, to make writing tasks more adaptable.
  • Predictive text and grammar support (e.g., Microsoft Word): Suggest words as people type, helping multilingual learners or emerging writers.
  • Multimodal composition platforms (e.g., Gemini Storybook, NotebookLM): Combine text, audio, and visuals, providing opportunities to transform traditional essays into creative narratives.
Kalantzis and Cope (2024) suggest alternative ways of writing, including using emojis with emerging writers to reflect how graphics have become a growing form of expression. As writing becomes more accessible through technology, the act of expression itself is being redefined. More learners can shape their stories, share their experiences, and discover that writing, in many modalities, can be a bridge between thought and connection.

Educators in the AI Era

Tomorrow’s students may not distinguish between reading a printed page and interacting with a responsive digital text. Our job as educators is to ensure that our definition of literacy evolves alongside our learners as they seek out ways to make meaning, express ideas, and engage with their technology-rich world. Ethical and pedagogical questions remain regarding how student data is used, how AI systems are trained, and how to mitigate bias. As AI tools become more integrated into classrooms, educators must lead with purpose and be intentional about how they shape AI’s role in supporting accessibility. To begin that work, educators can take three initial steps:
  • Become familiar with the capabilities of existing tools. Your search for accessible tools starts in your existing learning environment. Some of the tools you already use may have AI-related accessibility features.
  • Stay up to date on emerging technologies. Accessibility is evolving. Organizations and information centers such as CAST, Closing the Gap, Innovations in Special Education Technology, and the Assistive Technology Industry Association can help you stay informed.
  • Discuss accessibility with colleagues. There’s an adage about the teacher down the hall being a wealth of knowledge. Leverage your connection with colleagues for insights on how they incorporate accessibility into their literacy lessons.
Tiffani Martin, award-winning CEO of VisionTech.co, who is blind, emphasizes the importance of accessibility and the power of AI:
I can complete tasks that once required help from others because the technology both augments and fills in the gaps. It amplifies the ingenuity that disability demands and turns my problem solving into an opportunity. The automation in AI gives me back my time, energy, and dignity that used to get lost in the process. (personal communication, October 30, 2025)
Martin’s statement reflects the impact accessibility can have for learners. Students stop being categorized by what they can or cannot do. They start being known by what they create, question, and contribute.

Reflect & Discuss

  • The author recommends starting by examining AI-related accessibility features in your existing learning environment. What tools are you already using, and what hidden capabilities might you be overlooking?

  • How might your school shift from viewing accessibility as a service to seeing it as a shared responsibility?

References

Kalantzis, M., & Cope, B. (2024). Literacy in the time of artificial intelligence. Reading Research Quarterly, 60(1), e591.

National Assessment of Educational Progress. (2024). The nation’s report card.

National Council of Teachers of English. (2019). The act of reading: Instructional foundations and policy guidelines.

Petscher, Y., Cabell, S. Q., Catts, H. W., Compton, D., Foorman, B. R., Hart, S. A., et al. (2020). How the science of reading informs 21st-century education. Reading Research Quarterly, 55(Suppl 1), S267–S282.

Schwartz, S. (2025, January 29). Why are reading scores still falling on the Nation’s Report Card? Education Week.

Dr. Nneka McGee taught middle school mathematics and served as a chief academic officer in a south Texas school district. As part of her doctoral studies, she completed research on the impact of artificial intelligence in K–12 learning environments. She advises educational institutions and organizations on issues related to emerging technologies.

From our issue: Literacy in the Age of AI