
Week 4 Reflection: Accessibility and AI in Professional Technology Use

Inquiry Connection

In my previous reflections, I explored how AI tools are used in both academic and professional settings, and how responsible use requires critical evaluation and accountability. Building on this, I wanted to examine how AI also plays a role in accessibility, particularly in professional environments where tools like Microsoft Copilot are integrated into everyday workflows.


What?

This week, a discussion during lab briefly shifted toward accessibility in technology, which made me reflect on how AI tools are used in practice to support accessibility. During my internship last summer, I saw several examples of how AI-powered tools were used to improve communication and efficiency in the workplace.

For example, Microsoft Copilot and related features were used to generate live captions during meetings, provide transcripts of recorded discussions, and summarize large amounts of text. In some cases, AI also helped automate repetitive tasks or quickly identify key variables within large documents, making complex information easier to navigate.

These uses of AI were not always explicitly framed as accessibility features, but they clearly made information more understandable and easier to work with.


So What?

These examples highlight how AI can improve accessibility by reducing barriers to understanding and participation. Features like captions and transcripts can support individuals who are hard of hearing, non-native speakers, or anyone who benefits from reviewing information in multiple formats. Similarly, summarization and information extraction can help reduce cognitive load, making it easier to process large amounts of information efficiently.

At the same time, these benefits are not evenly distributed. Access to tools like Microsoft Copilot often depends on organizational resources, subscriptions, or technical familiarity. This raises questions about equity, as not all students or workers may have access to the same tools or know how to use them effectively.

There are also limitations to consider, which can be summarized as follows:

| AI Feature | Accessibility Benefit | Limitation |
| --- | --- | --- |
| Captions | Supports hearing accessibility and real-time understanding | May be inaccurate or miss context |
| Transcripts | Allows users to review and revisit information | Can omit tone |
| Summarization | Reduces cognitive load and improves clarity | May oversimplify key details |
| Automation | Reduces repetitive effort and increases efficiency | May reduce deeper engagement or understanding |

While AI can support accessibility, it does not replace the need for intentional design and inclusive practices. If these tools are relied on without verification, they can introduce new barriers instead of removing them.

Now What?

Moving forward, I want to be more aware of how AI tools can be used to support accessibility while also recognizing their limitations. This includes thinking critically about who benefits from these tools, how accurate and reliable they are, and whether they are being used in ways that genuinely support inclusion.

In professional settings, this means not only using tools like Copilot for efficiency, but also considering how they can improve communication and access for others. At the same time, it reinforces the importance of not relying entirely on AI outputs, especially when accessibility features such as captions or summaries may contain errors.

This perspective adds another layer to my inquiry into responsible AI use. Beyond efficiency and productivity, responsible use also involves considering accessibility, equity, and the broader impact of technology on different users.

Featured photo by Nastuh Abootalebi on Unsplash.