AI and Indigenous Knowledge: Ethical Technology and Data Sovereignty

—By K. Quinn Smith

An elder woman dressed in a purple shirt and adorned with a beaded medallion looks down.

The rise of artificial intelligence (AI) has captured imaginations worldwide.

It promises faster decisions, streamlined systems, and sweeping changes across industries. As these tools become more embedded in our everyday lives, a critical question is emerging: How does AI interact with Indigenous knowledge systems?

This is more than just a technical question — it’s also an ethical one. For communities who have long fought to protect and reclaim their stories, languages, and ways of knowing, AI presents both exciting opportunities and real risks. We must be willing to ask: Who benefits from these tools? Who has a say in how they’re used? And how can Indigenous communities lead the way in shaping digital futures that honour tradition?

What Is at Stake?

AI tools like ChatGPT, image generators, and predictive algorithms are trained on huge datasets. These datasets are often compiled from the open internet or institutional archives, usually without the consent of the individuals or communities whose stories and knowledge are being used. This means that oral histories, language resources, and even digitized cultural artifacts can be absorbed and reproduced by AI systems, often out of context and without honouring their creators and origins.

For Indigenous communities, this can be problematic. AI can strip nuance from knowledge, propagate stereotypes, or turn sacred stories into content. Worse, it can become part of extractive systems that take without giving back.

That’s why data sovereignty — the right of communities to govern how their data is collected, used, and shared — is a key issue in the AI conversation. Equally important is the question of governance within a community: who has the authority to give permission for the use of collective knowledge?

Opportunities for Empowerment

Despite the risks, many Indigenous leaders and technologists are exploring how to harness AI in ways that reflect community values.

  • Language revitalization: AI models can help generate new language learning tools, translate between dialects, and preserve rare recordings. Initiatives like the Indigenous Languages Technology Project are already doing this with care and consent.
  • Community mapping: AI can enhance visual storytelling and land-use planning by helping communities organize, interpret, and present spatial and cultural knowledge in meaningful ways. With tools like satellite imagery analysis and pattern recognition, AI can detect environmental changes such as shifting water levels, deforestation, or erosion over time. This information can be visualized through maps that are layered with traditional place names, harvesting areas, seasonal knowledge, and oral histories. AI can also assist in translating narrative data — like interview transcripts or archival texts — into visual formats by identifying themes and linking them to specific locations. These combined methods allow communities to tell their own land-based stories, support decision-making, and safeguard traditions using accessible, interactive formats.
  • Ethical research: Tools like AI-powered transcription or text analysis can assist community researchers, provided they're used transparently and with permission.

At Narratives, we’ve seen how emerging tech, when guided by Indigenous ethics and leadership, can amplify stories rather than overwrite them. In our Research & Learning and Capacity Facilitating work, we often support communities in shaping their own data practices. This can include anything from qualitative coding and archival analysis to digital storytelling.

A person holding a drum and notebook with both hands.

Grounded in Consent, Guided by Care

There’s no one-size-fits-all answer to whether a community should adopt AI tools. Some may choose to make these tools their own, using them in ways that reflect community goals and cultural practices. Others may resist these technologies altogether, focusing instead on place-based practices and intergenerational knowledge transfer.

What’s most important is that these choices are community-led.

Here are a few key principles to consider when approaching AI in Indigenous contexts:

  1. Consent comes first. Just because knowledge exists in digital form does not mean it is unilaterally available for AI use. Ask: With what intent was this information shared in the first place? Digitized content is often shared within cultural or relational boundaries, so using it outside those contexts can be exploitative and even harmful.

  2. Context matters. AI can strip away nuance. Centre human storytellers so the richness of the context stays intact.

  3. Invest in relationships, not just tools. Long-term partnerships with community researchers and Elders can guide ethical tech use.

  4. Support local governance. Data governance frameworks like OCAP® (Ownership, Control, Access, and Possession) are essential.

Where Do We Go from Here?

The conversation about AI and Indigenous knowledge is still unfolding. But one thing is clear: technology must serve people, not the other way around.

As researchers, designers, and facilitators, we have a role to play in ensuring that the digital tools we use reflect the values of care, accountability, and self-determination. At Narratives, that means staying grounded in relationships, listening deeply, and always asking what justice looks like, in both our outcomes and, perhaps more importantly, our methods.

The future of AI doesn't have to be extractive. With Indigenous leadership, it can become something entirely different: a future where tradition and technology can walk together.

K. Quinn Smith, Qualitative Data Analyst